Closed martinschorb closed 2 years ago
OK, I found it in the `images.json`.
Do I need to assign these things manually or is there some tool?
If there is a tool then @constantinpape would have created it. I could only do it manually.
Yes, I am cleaning this up right now, but I am working on putting it into mobie-utils-python
to have it in a central place. I will let you know when it's there.
@martinschorb I have updated the master now so you can add segmentations with this script: https://github.com/mobie-org/covid-em-datasets/blob/master/add_seg_to_dataset.py
Note that I moved the core functionality for adding data to https://github.com/mobie-org/mobie-utils-python. I installed it in the conda env on the emcf share, so you should be able to use the new script.
However, I did not pull the current master into the covid-em repository on the emcf share, because there were some local incompatible changes with the bookmarks.
Hi @constantinpape ,
thanks for that. I cannot pull either to the group share due to permissions issues. It seems that git is not really made for handling local repos that are collaboratively edited by different users with different permissions. Essentially everything should be newest on github. We could basically discard the group share folder, except for new data uploads. But I don't know if there will be more...
It seems that git is not really made for handling local repos that are collaboratively edited by different users with different permissions.
Yes, true that's a problem. I wonder if we can set different default umask values for folders on the group share. I will write to IT now.
Essentially everything should be newest on github. We could basically discard the group share folder, except for new data uploads. But I don't know if there will be more...
Ok, let's see what IT has to say and then clean up this situation.
@martinschorb given the discussion with IT, I think it would be best to make a new repo on the emcf share, running `umask g+w` beforehand, and then copy the data over.
If you want I can set this up.
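As an aside, the effect of a group-writable umask can be sketched in Python (a toy demonstration only, not part of any repo setup; the file name is arbitrary):

```python
import os
import stat

# With umask 0o002 (the "g+w" idea: do not mask out the group-write bit),
# newly created files stay writable for the whole group, so collaborators
# can modify files in the shared checkout.
old_mask = os.umask(0o002)
try:
    path = "umask_demo.txt"  # arbitrary demo file
    fd = os.open(path, os.O_CREAT | os.O_WRONLY, 0o666)
    os.close(fd)
    mode = stat.S_IMODE(os.stat(path).st_mode)  # 0o666 & ~0o002 == 0o664
finally:
    os.umask(old_mask)  # restore the previous umask
    os.remove(path)
```

With the usual default umask 0o022, the same file would come out as 0o644 and only the owner could write to it, which is exactly the permissions problem described above.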
Got some trouble here as well...
$ python add_seg_to_dataset.py /g/emcf/common/5792_Sars-Cov-2/Exp_070420/FIB-SEM/segmentation/upscale/20-04-23_S4_area2_Sam/seg_dmv_scale1.0/run_200518_00_dmv_full_dataset.h5 scale_01 Covid19-S4-Area2 S4-Area2_DMV
DEBUG: Checking if DownscalingWorkflow(tmp_folder=tmp_S4-Area2_DMV, max_jobs=16, config_dir=tmp_S4-Area2_DMV/configs, target=local, dependency=DummyTask, input_path=/g/emcf/common/5792_Sars-Cov-2/Exp_070420/FIB-SEM/segmentation/upscale/20-04-23_S4_area2_Sam/seg_dmv_scale1.0/run_200518_00_dmv_full_dataset.h5, input_key=scale_01, scale_factors=[[2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]], halos=[[2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]], metadata_format=bdv.n5, metadata_dict={"resolution": null, "unit": "micrometer"}, output_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, output_key_prefix=, force_copy=False, skip_existing_levels=False, scale_offset=0) is complete
/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/miniconda/envs/covid-em-dev/lib/python3.7/site-packages/luigi/parameter.py:279: UserWarning: Parameter "dtype" with value "None" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/miniconda/envs/covid-em-dev/lib/python3.7/site-packages/luigi/parameter.py:279: UserWarning: Parameter "scale_factor" with value "(2, 2, 2)" is not of type string.
warnings.warn('Parameter "{}" with value "{}" is not of type string.'.format(param_name, param_value))
DEBUG: Checking if WriteDownscalingMetadata(tmp_folder=tmp_S4-Area2_DMV, output_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, scale_factors=[[2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]], dependency=DownscalingLocal, metadata_format=bdv.n5, metadata_dict={"resolution": null, "unit": "micrometer"}, output_key_prefix=, scale_offset=0, prefix=downscaling) is complete
INFO: Informed scheduler that task DownscalingWorkflow_tmp_S4_Area2_DMV_DummyTask_False_cff887c535 has status PENDING
DEBUG: Checking if DownscalingLocal(tmp_folder=tmp_S4-Area2_DMV, max_jobs=16, config_dir=tmp_S4-Area2_DMV/configs, input_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, input_key=setup0/timepoint0/s5, output_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, output_key=setup0/timepoint0/s6, scale_factor=(2, 2, 2), scale_prefix=s6, halo=[2, 2, 2], effective_scale_factor=[64, 64, 64], dependency=DownscalingLocal) is complete
INFO: Informed scheduler that task WriteDownscalingMetadata_DownscalingLocal___resolution___n_bdv_n5_28203bf66a has status PENDING
INFO: Informed scheduler that task DownscalingLocal_tmp_S4_Area2_DMV_DownscalingLocal__64__64__64__428a2b9e15 has status DONE
INFO: Done scheduling tasks
INFO: Running Worker with 1 processes
DEBUG: Asking scheduler for work...
DEBUG: Pending tasks: 2
INFO: [pid 111961] Worker Worker(salt=667644300, workers=1, host=vm-schwab-02.embl.de, username=schorb, pid=111961) running WriteDownscalingMetadata(tmp_folder=tmp_S4-Area2_DMV, output_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, scale_factors=[[2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]], dependency=DownscalingLocal, metadata_format=bdv.n5, metadata_dict={"resolution": null, "unit": "micrometer"}, output_key_prefix=, scale_offset=0, prefix=downscaling)
ERROR: [pid 111961] Worker Worker(salt=667644300, workers=1, host=vm-schwab-02.embl.de, username=schorb, pid=111961) failed WriteDownscalingMetadata(tmp_folder=tmp_S4-Area2_DMV, output_path=./data/Covid19-S4-Area2/images/local/S4-Area2_DMV.n5, scale_factors=[[2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2], [2, 2, 2]], dependency=DownscalingLocal, metadata_format=bdv.n5, metadata_dict={"resolution": null, "unit": "micrometer"}, output_key_prefix=, scale_offset=0, prefix=downscaling)
Traceback (most recent call last):
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/miniconda/envs/covid-em-dev/lib/python3.7/site-packages/luigi/worker.py", line 191, in run
new_deps = self._run_get_new_deps()
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/miniconda/envs/covid-em-dev/lib/python3.7/site-packages/luigi/worker.py", line 133, in _run_get_new_deps
task_gen = self.task.run()
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/cluster_tools/cluster_tools/downscaling/downscaling_workflow.py", line 95, in run
self._bdv_metadata()
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/cluster_tools/cluster_tools/downscaling/downscaling_workflow.py", line 84, in _bdv_metadata
overwrite_data=False, enforce_consistency=False)
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/pybdv/pybdv/metadata.py", line 266, in write_xml_metadata
overwrite, overwrite_data, enforce_consistency)
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/pybdv/pybdv/metadata.py", line 37, in _require_view_setup
dz, dy, dx = resolution
TypeError: cannot unpack non-iterable NoneType object
DEBUG: 1 running tasks, waiting for next task to finish
INFO: Informed scheduler that task WriteDownscalingMetadata_DownscalingLocal___resolution___n_bdv_n5_28203bf66a has status FAILED
DEBUG: Asking scheduler for work...
DEBUG: Done
DEBUG: There are no more tasks to run at this time
DEBUG: There are 2 pending tasks possibly being run by other workers
DEBUG: There are 2 pending tasks unique to this worker
DEBUG: There are 2 pending tasks last scheduled by this worker
INFO: Worker Worker(salt=667644300, workers=1, host=vm-schwab-02.embl.de, username=schorb, pid=111961) was stopped. Shutting down Keep-Alive thread
INFO:
===== Luigi Execution Summary =====
Scheduled 3 tasks of which:
* 1 complete ones were encountered:
- 1 DownscalingLocal(...)
* 1 failed:
- 1 WriteDownscalingMetadata(...)
* 1 were left pending, among these:
* 1 had failed dependencies:
- 1 DownscalingWorkflow(...)
This progress looks :( because there were failed tasks
===== Luigi Execution Summary =====
Traceback (most recent call last):
File "add_seg_to_dataset.py", line 44, in <module>
args.resolution, args.chunks, args.target, args.max_jobs)
File "add_seg_to_dataset.py", line 21, in add_seg_to_dataset
add_default_table=True)
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/segmentation.py", line 48, in add_segmentation
max_jobs=max_jobs)
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/segmentation.py", line 95, in import_segmentation
raise RuntimeError("Importing segmentation failed")
RuntimeError: Importing segmentation failed
Indeed, there was an issue with the resolution command line parameter. I have fixed it in 1935c63fcfd3526b3b6047cf21b3474dfeb30b25.
Now, the resolution will be `[0.008, 0.008, 0.008]` by default. It can also be specified as a JSON-encoded string, e.g.:
`python add_segmentation.py --resolution "[0.005,0.005,0.005]"`
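Parsing such a JSON-encoded resolution flag can be done along these lines (a sketch only; the actual argument handling in the script may differ):

```python
import argparse
import json

parser = argparse.ArgumentParser()
# json.loads turns '[0.005,0.005,0.005]' into a Python list; the default
# matches the [0.008, 0.008, 0.008] mentioned above.
parser.add_argument("--resolution", type=json.loads,
                    default=[0.008, 0.008, 0.008])

args = parser.parse_args(["--resolution", "[0.005,0.005,0.005]"])
# The unpacking below is what raised the TypeError when resolution was None.
dz, dy, dx = args.resolution
```

Having a list (rather than `None`) reach `pybdv.metadata` is what the fix above ensures.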
Hi @constantinpape ,
is the script assuming binary volumes? I have HDF5s with labels (0-6) in there.
is the script assuming binary volumes? I have HDF5s with labels (0-6) in there.
No, it assumes segmentation with arbitrary indices for objects in the segmentation and should accept all integer data types.
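A quick sanity check of such a label volume (with hypothetical toy data standing in for the HDF5 dataset) would look like:

```python
import numpy as np

# Toy stand-in for the segmentation: integer ids, 0 as background,
# a handful of material/organelle labels (here 3 and 6).
seg = np.zeros((4, 4, 4), dtype="uint8")
seg[1:3, 1:3, 1:3] = 3
seg[0, 0, 0] = 6

assert np.issubdtype(seg.dtype, np.integer)  # any integer dtype is accepted
labels = np.unique(seg)  # the distinct label values present in the volume
```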
Ok, cool,
in this case the different labels do not represent the different objects but different materials/organelles. How do I assign those correctly?
ping @jhennies
in this case the different labels do not represent the different objects but different materials/organelles. How do I assign those correctly?
There are different options and what to do depends on how you / your collaborators want to interact with the data.
1. add the segmentation as it is and then add the names for the organelles as a column to the table that will be generated in `.../<DATASET_FOLDER>/tables/<SEGMENTATION_NAME>/default.csv`. This way the organelles can be identified by looking at the table that will be loaded with the segmentation.

Normally, I would prefer #3.

Viktoriia is still working on the proper stitching, until that is finished #1 sounds good to me
Hi @constantinpape the chunks (at 64^3) for the segmentations are super-small (<1kB) due to compression.
Would you leave the size as for the raw dataset or would you increase it to get a suitable file size (minimum 256^3)?
Would you leave the size as for the raw dataset or would you increase it to get a suitable file size (minimum 256^3)?
I think you could try 256^3
Yeah, I think you can go for 256^3, or 128^3 if the performance is degraded with 256^3.
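For intuition, the uncompressed chunk sizes work out as follows (assuming uint8 labels; on disk, compression of mostly-uniform label blocks is what shrinks 64^3 chunks below 1 kB):

```python
bytes_per_voxel = 1  # uint8 segmentation labels

# Uncompressed size per chunk for the candidate edge lengths.
chunk_mib = {edge: edge ** 3 * bytes_per_voxel / 1024 ** 2
             for edge in (64, 128, 256)}
# 64^3 -> 0.25 MiB, 128^3 -> 2 MiB, 256^3 -> 16 MiB before compression
```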
1. add the segmentation as it is and then add the names for the organelles as a column to the table that will be generated in `.../<DATASET_FOLDER>/tables/<SEGMENTATION_NAME>/default.csv`. This way the organelles can be identified by looking at the table that will be loaded with the segmentation.
Does it have to be any special column name, or will it just display the whole table in MoBIE?
@tischi Are segmentations disabled for this repo? I have pushed and uploaded everything. MoBIE however, does not offer any.
Is it because there are only segmentations available for a fraction of the datasets? Does the default dataset need to have segmentations for the UI to display them?
Does it have to be any special column name, or will it just display the whole table in MoBIE?
It will display the whole table in MoBIE, so the column name doesn't matter.
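Adding the organelle names as an extra column (option 1 above) could then be sketched like this; the label-to-name mapping and the output path are made up for illustration, and note that MoBIE default tables are typically tab-separated despite the .csv suffix:

```python
import pandas as pd

# Hypothetical mapping from label ids to organelle/material names.
names = {1: "dmv", 2: "mitochondrion", 3: "golgi"}

table = pd.DataFrame({"label_id": [1, 2, 3]})
table["organelle"] = table["label_id"].map(names)  # column name is free
table.to_csv("default.csv", sep="\t", index=False)
```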
Are segmentations disabled for this repo? I have pushed and uploaded everything. MoBIE however, does not offer any.
The problem is that the segmentation has not been added to the images.json file.
If you have used the `add_seg_to_dataset` script, it should have been added to it automatically.
Maybe you just forgot to add it to git?
Hmm, I ran precisely that script. I only added the new files to the commit but did not stage the changed file...
Here it is, BUT:
Adding source: s4_area2_segmentation...
Exception in thread "AWT-EventQueue-0" java.lang.RuntimeException: Could not open URL: https://raw.githubusercontent.com/mobie/covid-em-datasets/master/data/Covid19-S4-Area2/../../default.csv
at de.embl.cba.tables.Tables.getReader(Tables.java:150)
at de.embl.cba.tables.FileUtils.isRelativePath(FileUtils.java:43)
at de.embl.cba.tables.FileUtils.resolveTableURL(FileUtils.java:32)
at de.embl.cba.mobie.utils.Utils.createAnnotatedImageSegmentsFromTableFile(Utils.java:292)
at de.embl.cba.mobie.viewer.SourcesPanel.showAnnotatedLabelsSource(SourcesPanel.java:441)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToViewer(SourcesPanel.java:353)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToPanelAndViewer(SourcesPanel.java:337)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToPanelAndViewer(SourcesPanel.java:294)
at de.embl.cba.mobie.viewer.ActionPanel.lambda$null$4(ActionPanel.java:403)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:758)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:728)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:205)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Where does this `../../` come from...?
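For illustration, normalizing the failing URL shows why the lookup breaks: even after collapsing the `..` segments, the path points two levels above the dataset folder instead of into its tables directory (a sketch, not the actual MoBIE code):

```python
import posixpath
from urllib.parse import urlsplit, urlunsplit

url = ("https://raw.githubusercontent.com/mobie/covid-em-datasets/master/"
       "data/Covid19-S4-Area2/../../default.csv")

# Collapse the ".." segments in the path component only.
parts = urlsplit(url)
normalized = urlunsplit(parts._replace(path=posixpath.normpath(parts.path)))
# -> .../master/default.csv, i.e. outside data/Covid19-S4-Area2 entirely,
# so the relative table path itself must have been computed wrongly.
```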
@martinschorb yes, I know that issue. It's something I have fixed already, but it looks like something is not up-to-date on your end. I will fix it.
Should be fixed by 90e55d5. @martinschorb can you try again?
now I got
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException
at de.embl.cba.mobie.viewer.MoBIEViewer.close(MoBIEViewer.java:275)
at de.embl.cba.mobie.viewer.ActionPanel.switchDataset(ActionPanel.java:549)
at de.embl.cba.mobie.viewer.ActionPanel.lambda$addDatasetSelectionUI$13(ActionPanel.java:536)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(BasicButtonListener.java:252)
at java.awt.Component.processMouseEvent(Component.java:6539)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3324)
at java.awt.Component.processEvent(Component.java:6304)
at java.awt.Container.processEvent(Container.java:2239)
at java.awt.Component.dispatchEventImpl(Component.java:4889)
at java.awt.Container.dispatchEventImpl(Container.java:2297)
at java.awt.Component.dispatchEvent(Component.java:4711)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4904)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4535)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4476)
at java.awt.Container.dispatchEventImpl(Container.java:2283)
at java.awt.Window.dispatchEventImpl(Window.java:2746)
at java.awt.Component.dispatchEvent(Component.java:4711)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:760)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:84)
at java.awt.EventQueue$4.run(EventQueue.java:733)
at java.awt.EventQueue$4.run(EventQueue.java:731)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:730)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:205)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
when switching the dataset, I don't know if that's related or the old switching problem.
The segmentation is there. YAY!
when switching the dataset, I don't know if that's related or the old switching problem.
Ok, I think that one is unrelated to the segmentation.
The segmentation is there. YAY!
Nice! I have also updated the version of mobie-utils-python on the emcf share, so the next time you call the `add_seg_to_dataset` script the issue with the table should not occur.
eh, looks like this has broken something...
$ python add_seg_to_dataset.py /scratch/jhennies/project_corona/upscale/s5_merged_segmentations.h5 data/ Covid19-S5-mock-Cell1-2 s5_mock_segmentation --chunks 256 256 256 --target slurm --max_jobs 150
Traceback (most recent call last):
File "add_seg_to_dataset.py", line 5, in <module>
from mobie import add_segmentation
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/__init__.py", line 1, in <module>
from .image_data import add_image_data
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/image_data.py", line 6, in <module>
from mobie.import_data import import_raw_volume
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/__init__.py", line 4, in <module>
from .registration import apply_registration
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/registration/__init__.py", line 1, in <module>
from .apply_registration import apply_registration
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/registration/apply_registration.py", line 11, in <module>
from elf.transformation import elastix_parser
ImportError: cannot import name 'elastix_parser' from 'elf.transformation' (/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/elf/elf/transformation/__init__.py)
eh, looks like this has broken something...
True, I forgot to update the dependency. Should be fixed now.
not yet... another one:
$ python add_seg_to_dataset.py /scratch/jhennies/project_corona/upscale/s5_merged_segmentations.h5 data/ Covid19-S5-mock-Cell1-2 s5_mock_segmentation --chunks 256 256 256 --target slurm --max_jobs 150
Traceback (most recent call last):
File "add_seg_to_dataset.py", line 5, in <module>
from mobie import add_segmentation
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/__init__.py", line 1, in <module>
from .image_data import add_image_data
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/image_data.py", line 6, in <module>
from mobie.import_data import import_raw_volume
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/__init__.py", line 4, in <module>
from .registration import apply_registration
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/registration/__init__.py", line 1, in <module>
from .apply_registration import apply_registration
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/registration/apply_registration.py", line 14, in <module>
from .registration_impl import (registration_affine,
File "/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/mobie-utils-python/mobie/import_data/registration/registration_impl.py", line 6, in <module>
from cluster_tools.transformations import AffineTransformationWorkflow, TransformixTransformationWorkflow
ImportError: cannot import name 'AffineTransformationWorkflow' from 'cluster_tools.transformations' (/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/cluster_tools/cluster_tools/transformations/__init__.py)
Ok, also need to update this repo:
/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/software/cluster_tools
I don't have permissions there though.
Could you run `git pull origin master` in there? That should fix it.
Yes, that's it. Thanks!
now something went wrong with the next one... Maybe because of parallel commits to the repo...?
Adding source: s5_mock_segmentation...
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException
at bdv.img.n5.N5GenericImageLoader.createSetupImgLoader(N5GenericImageLoader.java:166)
at bdv.img.n5.N5GenericImageLoader.open(N5GenericImageLoader.java:114)
at bdv.img.n5.N5GenericImageLoader.getSetupImgLoader(N5GenericImageLoader.java:158)
at bdv.img.n5.N5GenericImageLoader.getSetupImgLoader(N5GenericImageLoader.java:75)
at bdv.BigDataViewer.initSetupNumericType(BigDataViewer.java:260)
at bdv.BigDataViewer.initSetups(BigDataViewer.java:292)
at de.embl.cba.bdv.utils.sources.LazySpimSource.initSpimData(LazySpimSource.java:64)
at de.embl.cba.bdv.utils.sources.LazySpimSource.wrappedVolatileSource(LazySpimSource.java:38)
at de.embl.cba.bdv.utils.sources.LazySpimSource.getNumMipmapLevels(LazySpimSource.java:127)
at de.embl.cba.bdv.utils.sources.ARGBConvertedRealSource.getNumMipmapLevels(ARGBConvertedRealSource.java:106)
at bdv.viewer.render.DefaultMipmapOrdering.<init>(DefaultMipmapOrdering.java:88)
at bdv.viewer.render.DefaultMipmapOrdering.<init>(DefaultMipmapOrdering.java:101)
at bdv.tools.transformation.TransformedSource.<init>(TransformedSource.java:116)
at bdv.tools.transformation.TransformedSource.<init>(TransformedSource.java:94)
at bdv.BigDataViewer.wrapWithTransformedSource(BigDataViewer.java:242)
at bdv.util.BdvFunctions.addSourceToListsNumericType(BdvFunctions.java:637)
at bdv.util.BdvFunctions.addSourceToListsGenericType(BdvFunctions.java:608)
at bdv.util.BdvFunctions.addSource(BdvFunctions.java:575)
at bdv.util.BdvFunctions.show(BdvFunctions.java:196)
at de.embl.cba.tables.view.SegmentsBdvView.showSource(SegmentsBdvView.java:364)
at de.embl.cba.tables.view.SegmentsBdvView.showSourceSet(SegmentsBdvView.java:294)
at de.embl.cba.tables.view.SegmentsBdvView.showInitialSources(SegmentsBdvView.java:181)
at de.embl.cba.tables.view.SegmentsBdvView.<init>(SegmentsBdvView.java:124)
at de.embl.cba.tables.view.combined.SegmentsTableBdvAnd3dViews.show(SegmentsTableBdvAnd3dViews.java:60)
at de.embl.cba.tables.view.combined.SegmentsTableBdvAnd3dViews.<init>(SegmentsTableBdvAnd3dViews.java:46)
at de.embl.cba.mobie.viewer.SourcesPanel.showAnnotatedLabelsSource(SourcesPanel.java:452)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToViewer(SourcesPanel.java:353)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToPanelAndViewer(SourcesPanel.java:337)
at de.embl.cba.mobie.viewer.SourcesPanel.addSourceToPanelAndViewer(SourcesPanel.java:294)
at de.embl.cba.mobie.viewer.ActionPanel.lambda$null$4(ActionPanel.java:403)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:758)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:728)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:205)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Are you trying to open it locally or via s3? Because I can't find it in the s3 bucket:
pape@gpu6:/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/data$ mc ls embl/covid-fib-sem/Covid19-S5-mock-Cell1-2/images/local
[2020-06-04 17:34:40 CEST] 1.0KiB fibsem-raw.xml
[2020-08-07 14:55:07 CEST] 0B fibsem-raw.n5/
whereas it looks correct for the segmentation that you added before:
pape@gpu6:/g/emcf/common/5792_Sars-Cov-2/covid-em-datasets/data$ mc ls embl/covid-fib-sem/Covid19-S4-Area2/images/local
[2020-08-07 14:55:21 CEST] 0B s4_area2_segmentation.n5/
[2020-08-07 14:55:21 CEST] 0B sbem-6dpf-1-whole-raw.n5/
Aaah, I copied your suggested command from the script.
However, for me the endpoint for `mc` is called `EMBL`, not `embl`. And the stupid thing `mc cp` does if it is not pointing to a remote s3 directory: it just creates a local directory without any feedback...
Stupid but true.
Now it works.
Hi,
Julian @jhennies has the segmentations almost ready. What would be the workflow to add those? I saw that it is basically additional datasets (xml). But how does MoBIE know that it is a segmentation?