Master's thesis work: a microscope simulator that streams the images the simulated microscope produces.
The simulator can either be run on its own or connected to Kafka; running it with Kafka requires a Kafka server.
cv2, numpy, PIL, Flask, kafka-python, scikit-image
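The listed packages can be installed with pip; note that the distribution names differ for two of them (cv2 ships as opencv-python and PIL as Pillow — mapping them this way is an assumption about how the project was set up):

```shell
pip3 install opencv-python numpy Pillow Flask kafka-python scikit-image
```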
To stream to Harmonic IO, the stream connector is needed (with container support): https://github.com/HASTE-project/HarmonicIO (see 'Install the Streaming Connector only')
For installation in development mode:
A step-by-step series of examples that show you how to get a development environment running:
git clone https://github.com/HASTE-project/microscope-simulator.git
cd microscope-simulator
git pull https://github.com/HASTE-project/microscope-simulator.git
python3 simulator.py
For a single test image:
$ python3 run_prod_pipeline_dummy_set.py
For all the images in one of the test AZ datasets (500 images, takes 2-3 mins):
$ python3 run_prod_pipeline_from_volume.py
For all the green images:
$ python3 run_prod_pipeline_from_volume_filtered.py
Or use it from your own application:
import simulator_no_flask
simulator_no_flask.get_files(file_path, period, binning, color_channel, connect_kafka)
# period: period time in seconds
# color_channel: list with up to 5 channels, e.g. ["1", "2", "5"]
# connect_kafka: set to "yes" to connect to a Kafka server
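To illustrate what a color_channel list like ["1", "2", "5"] selects, the filtering logic can be sketched as below. The filename convention (a channel digit immediately before the extension) and the helper name filter_by_channel are assumptions for illustration, not part of the simulator's API:

```python
import re

def filter_by_channel(filenames, color_channels):
    """Keep only files whose channel digit (assumed to be the last digit
    before the extension, e.g. '..._C5.tif') is in color_channels."""
    selected = []
    for name in filenames:
        match = re.search(r'(\d)\.\w+$', name)
        if match and match.group(1) in color_channels:
            selected.append(name)
    return selected

# Keep channels 1, 2 and 5, as in color_channel = ["1", "2", "5"]
files = ["img_C1.tif", "img_C3.tif", "img_C5.tif"]
print(filter_by_channel(files, ["1", "2", "5"]))  # ['img_C1.tif', 'img_C5.tif']
```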
You should see output like this:
simulator: list of files to stream:
['dummy_image_0.png']
simulator: stream ID is: 2018_01_08__10_35_55_dummy_set
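The stream ID in the output above follows a timestamp-plus-dataset pattern. A minimal sketch of how such an ID could be built (make_stream_id is a hypothetical helper for illustration, not the simulator's actual function):

```python
from datetime import datetime

def make_stream_id(dataset_name, now=None):
    # Format matches the example ID: 2018_01_08__10_35_55_dummy_set
    now = now or datetime.now()
    return now.strftime("%Y_%m_%d__%H_%M_%S") + "_" + dataset_name

print(make_stream_id("dummy_set", datetime(2018, 1, 8, 10, 35, 55)))
# 2018_01_08__10_35_55_dummy_set
```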
simulator: initialized streaming target
file: ./test-images/dummy_image_0.png has size: 130450
Sending request..
http://192.168.1.24:8080/streamRequest?token=None&c_name=benblamey/haste-example:latest&c_os=ubuntu&priority=0&source=node1&digest=8571979be2b85fe2f64a26f67e8221e1
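The digest query parameter in the request above looks like an MD5 hex digest. A hedged sketch of computing one over a file's contents (whether HarmonicIO derives the digest exactly this way is an assumption):

```python
import hashlib

def file_digest(path):
    # MD5 hex digest of the file's bytes, computed in chunks
    # so large image files are not read into memory at once.
    md5 = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            md5.update(chunk)
    return md5.hexdigest()
```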
[OUT: Push data to worker (192.168.1.24:9001>benblamey/haste-example:latest) successful.]
simulator: all files streamed
import profiling
profiling.timer_kafka("file_path_to_images.tif", "to_time")
# to_time can be "p": producer, "p2": producer already connected to Kafka, or "g": simulator
profiling.time_kafka_consumer() # time Kafka consumer
profiling.timer_kafka_100bytes() # time Kafka producer when sending small messages
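The profiling helpers above time the producer and consumer paths. The generic timing pattern they rely on can be sketched as follows; time_call is a hypothetical helper for illustration, not part of the profiling module:

```python
import time

def time_call(func, *args, **kwargs):
    # Return (result, elapsed seconds) for a single call, in the same
    # spirit as timing the Kafka producer/consumer paths.
    start = time.perf_counter()
    result = func(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed

result, elapsed = time_call(sum, range(1000))
print(f"sum took {elapsed:.6f} s, result={result}")
```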