How to start from labeled images (tiff file) #127

Open
BoyuLyu opened this issue Mar 1, 2022 · 11 comments

@BoyuLyu

BoyuLyu commented Mar 1, 2022

I am having trouble getting started. I am trying to visualize a small labeled volume in Neuroglancer. The volume is currently a TIFF file. How can I transform it into the precomputed format?

I guess I need to create the info file first. Can I do that using igneous?

@william-silversmith
Contributor

Hi, you can use CloudVolume.create_new_info to make a new info file. You can then place it in the directory you'd like.

After that, you can follow this guide:
https://github.com/seung-lab/cloud-volume/wiki/Example-Single-Machine-Dataset-Upload

I haven't figured out a uniform way to help people create new datasets since everyone organizes their initial files differently. It might be possible to create imports/exports from other systems at some point, but raw off-the-microscope data is probably always going to be challenging.
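
As a rough sketch (the resolution, chunk size, data type, and volume size below are placeholders; swap in the values for your data):

from cloudvolume import CloudVolume

info = CloudVolume.create_new_info(
    num_channels=1,
    layer_type='segmentation',
    data_type='uint32',             # match the dtype of your label volume
    encoding='raw',
    resolution=[8, 8, 40],          # nm per voxel (placeholder)
    voxel_offset=[0, 0, 0],
    chunk_size=[100, 100, 40],      # placeholder
    volume_size=[1000, 1000, 400],  # full extent of the volume (placeholder)
)
vol = CloudVolume('file://local/path/to/segmentation', info=info)
vol.commit_info()  # writes the info file into the layer directory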

@BoyuLyu
Author

BoyuLyu commented Mar 1, 2022

Thank you so much for the prompt reply!
I tried using create_new_info and I can obtain the info file. But when I run downsampling using igneous, it reports a cloudvolume.exceptions.EmptyVolumeException error.
I am wondering how the segmentation files should be stored. I understand that for a file with resolution (8nm, 8nm, 40nm) and chunk size (100, 100, 40), I should store the first chunk in the folder 8_8_40/0-100_0-100_0-40, but how should I store the chunk? Should I save it as z slices? And how should I name them?

@william-silversmith
Contributor

It's not necessary to store them as z slices. Theoretically, CloudVolume should be doing all the work of figuring out the file names for you. Just make sure that you're uploading the right numpy array to the right (chunk-aligned) location. You can do non-aligned uploads, but that requires careful planning or single-core uploads to avoid conflicts.
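
For example, a minimal upload sketch (assuming the TIFF is read with tifffile, which is not part of CloudVolume; CloudVolume indexes in x, y, z order, so a stack that loads as z, y, x needs a transpose):

import numpy as np
import tifffile
from cloudvolume import CloudVolume

labels = tifffile.imread('labels.tif')     # placeholder filename; loads as (z, y, x)
labels = np.transpose(labels, (2, 1, 0))   # reorder to (x, y, z) for CloudVolume

vol = CloudVolume('file://local/path/to/segmentation')
vol[:, :, :] = labels.astype(vol.dtype)    # CloudVolume writes the chunk files for you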

Check the location with the EmptyVolumeException and see where the missing area lies. That should give you a clue. You can also try downsampling with fill_missing enabled to get an overview of the volume and see where chunks are missing. Then you can fix them and re-run it.

When fixing, see if you can disable caching on cloud storage when you upload otherwise you may have to wait an hour to see the results.
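
If it helps, the Python route for the fill_missing downsample looks roughly like this (the parallelism value is arbitrary):

from taskqueue import LocalTaskQueue
import igneous.task_creation as tc

tq = LocalTaskQueue(parallel=4)
tasks = tc.create_downsampling_tasks(
    'file://local/path/to/segmentation',
    fill_missing=True,  # treat missing chunks as background instead of raising EmptyVolumeException
)
tq.insert(tasks)
tq.execute()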

@BoyuLyu
Author

BoyuLyu commented Mar 1, 2022

That works! Now I can generate the precomputed files, and I am trying to visualize them using vol.viewer(). But it returns an error at http://localhost:1337.
vol = CloudVolume('file://local/path/to/segmentation')
vol.viewer()

Error response
Error code: 404
Message: /: Not Found.

@william-silversmith
Contributor

Don't worry about the error. I should probably add an index.html page with instructions. You can enter the URL into Neuroglancer and it will start working.

Alternatively, just run igneous view /local/path/to/segmentation and it will open your browser to the right location.
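
Roughly, the flow is (the source URL below is what I'd expect with the default port; adjust if your setup differs):

from cloudvolume import CloudVolume

vol = CloudVolume('file://local/path/to/segmentation')
vol.viewer()  # serves the layer at http://localhost:1337

# In Neuroglancer, add a layer whose source points at the local server, e.g.:
#   precomputed://http://localhost:1337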

@BoyuLyu
Author

BoyuLyu commented Mar 2, 2022

It works now. But I am wondering if there is a way I can input the path/to/local/segmentation into Neuroglancer directly? I tried inputting a URL like precomputed://file://local/path/to/segmentation, but it does not work.

@william-silversmith
Contributor

william-silversmith commented Mar 2, 2022 via email

@BoyuLyu
Author

BoyuLyu commented Mar 2, 2022

Thanks! That clears up my confusion.

@BoyuLyu
Author

BoyuLyu commented Mar 2, 2022

Given this restriction, can I visualize multiple layers? Currently, I can visualize the segmentation layer as well as the mesh, but if I want to visualize the raw data (e.g. the segmentation overlaid on the raw EM data), how can I achieve that?

@william-silversmith
Contributor

william-silversmith commented Mar 2, 2022 via email

@BoyuLyu
Author

BoyuLyu commented Mar 2, 2022

Thanks! I tried starting multiple servers on different ports and it worked well.
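
In case it helps anyone else, this is roughly what I ran (paths are placeholders, and each viewer() call blocks, so each goes in its own terminal/process):

from cloudvolume import CloudVolume

# terminal 1: serve the raw EM layer
CloudVolume('file://local/path/to/em_image').viewer(port=1337)

# terminal 2: serve the segmentation layer
CloudVolume('file://local/path/to/segmentation').viewer(port=1338)

# then add each as a precomputed:// source in Neuroglancer, one per port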
