Script to generate Allen mouse brain atlas with barrel annotations #313
Conversation
Thanks @abisi - I'll have a look next week!
Thanks a lot @abisi - this is looking good. With some minor hacks, I managed to reproduce a nice-looking barrel-enhanced BrainGlobe atlas 🎉 I've done an initial top-level review for now, just some things I've noticed - I hope it's helpful 😃 Feel free to ask for clarification on my comments, and to disagree with them as much as you like! You can post your responses either here, or message us on zulip. My only major worry is how to install `atlas-enhancement`.

Am I right in thinking your script currently doesn't create meshes for the new regions you introduce? You should be able to do this with our helper function.
Just to double-check, what do you mean by manually here? It looks to me like you've done this programmatically from your code, which I think is more than reasonable?
I have access now, thanks to @alTeska 🙏
pyproject.toml
```
@@ -53,6 +53,10 @@ dev = [
    "tox",
]
allenmouse = ["allensdk"]
allenmouse_barrels = [
    "allensdk",
    "atlas-enhancement"]
```
IIUC you will need to publish `atlas-enhancement` as a package on PyPI for this to work. I needed to clone `atlas-enhancement` and install it from the requirements file to get the script to work.
It also looks like the required versions of `numpy` (and possibly of `scipy` and `xarray`?) are incompatible between `atlas-enhancement` and `allensdk`? Could you advise on how you've installed this locally? I did (in a clean conda environment with Python=3.11, in the root of the cloned `atlas-enhancement` repo):

```
pip install -e .[dev, allenmouse, atlasgen]
pip install -r requirements
```

Maybe we don't have to be so strict in pinning versions in `atlas-enhancement`, but unsure - there may be good reason!
I have asked the reason for a specific numpy version, awaiting response.
Indeed I am not sure one needs to be strict about this.
Edit: no particular reason to use this numpy version.
These would be the quick instructions to install atlas-enhancement and run the script:

- Clone the fork of the atlas-enhancement repository (https://github.com/abisi/atlas-enhancement) into BrainGlobe's working directory.
- Download the annotation files in `mouse_enhanced_cortex.zip` from Zenodo: https://zenodo.org/records/11218079
- Move the 'hierarchy.json' and 'annotation_barrels.nrrd' files into a new `data` folder in `atlas-enhancement\barrel-annotations\data`.
- Run the script `allen_mouse_bluebrain_barrels.py`.
```python
## TODO: import file
# sys.run("python transplant_barrels_nrrd.py --annotation_barrels.nrrd --annotation_10.nrrd --hierarchy.json")
# if resolution == 10:
#     annotation_file = 'annotation_barrels_10.nrrd'
# elif resolution == 25:
#     annotation_file = 'annotation_barrels_25.nrrd'
# else:
#     raise ValueError("Resolution not supported.")

# Load annotated volume:
annotation_dir_path = Path(
    r"C:\Users\bisi\Github\atlas-enhancement\barrel-annotations\data\atlas"
)
```
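For reference, a runnable sketch of the commented-out resolution check above (the helper name `annotation_filename` is hypothetical, not part of the script):

```python
def annotation_filename(resolution):
    # Hypothetical helper mirroring the commented-out logic above:
    # map a supported resolution (in microns) to its annotation file name.
    if resolution == 10:
        return "annotation_barrels_10.nrrd"
    elif resolution == 25:
        return "annotation_barrels_25.nrrd"
    raise ValueError("Resolution not supported.")
```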
I guess ultimately it'd be cleaner to use the above-mentioned code to regenerate these files from scratch? This would require using the `transplant_barrels_nrrd.py` script (BlueBrain/atlas-enhancement@main/barrel-annotations/transplant_barrels_nrrd.py).

Yes, I think so - amongst other good reasons for this, it should allow us to avoid hard-coding a local path here.
This should now run with a separate script that I have PR'ed in atlas-enhancement, one that accepts arguments.
```python
# Generated atlas path:
bg_root_dir = (
    Path.home()
    / "Desktop"
```
There is a loose standard (we should document, sorry) that this should be `~/brainglobe_workingdir`.
Yes, sorry, I did this to test locally!
/ "brainglobe_workingdir" | ||
/ "allen_mouse_bluebrain_barrels" | ||
) | ||
bg_root_dir.mkdir(exist_ok=True) |
Suggested change:

```diff
-bg_root_dir.mkdir(exist_ok=True)
+bg_root_dir.mkdir(parents=True, exist_ok=True)
```

Needed to work on machines where `~/brainglobe_workingdir` doesn't exist yet.
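A small sketch (using a temporary directory rather than the real working directory) of why `parents=True` is needed when the intermediate directory doesn't exist yet:

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp) / "brainglobe_workingdir" / "allen_mouse_bluebrain_barrels"
    try:
        root.mkdir(exist_ok=True)  # fails: the parent directory is missing
    except FileNotFoundError:
        pass
    root.mkdir(parents=True, exist_ok=True)  # creates the whole chain
    created = root.is_dir()
```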
```python
    annotation_file = "annotation_barrels_10.nrrd"
elif resolution == 25:
    annotation_dir_path = annotation_dir_path / "atlas_25um"
    annotation_file = "annotation_barrels_25.nrrd"
```
Suggested change:

```diff
-    annotation_file = "annotation_barrels_25.nrrd"
+    annotation_file = "annotations_barrels_25.nrrd"
```

or make file names on GDrive consistent? ("annotations" plural for 25um, but singular for 10um at the moment)
Yes, the file names were inconsistent in the Google Drive. Given that the files used are now from Zenodo or the ccf_2017, the issue is not there anymore. Also, only the 10um resolution will be supported.
Just a quick additional note that I've run the atlas through the validation script (a modified local version of the checks implemented in validate_atlases.py, so that validation runs just on one atlas). Validation passes up to some missing meshes, which presumably links to my comment above that we should add meshes for the new regions. Out of curiosity, how did you choose the barrel-related ID numbers?

Validation failure details:

Structures with IDs [545, 614454385, 614454386, 614454389, 614454390, 614454391, 614454392, 614454384, 614454394, 614454395, 614454398, 614454399, 614454400, 614454401, 614454393, 614454403, 614454404, 614454407, 614454408, 614454409, 614454410, 614454402, 614454412, 614454413, 614454416, 614454417, 614454418, 614454419, 614454411, 614454421, 614454422, 614454425, 614454426, 614454427, 614454428, 614454420, 614454430, 614454431, 614454434, 614454435, 614454436, 614454437, 614454429, 614454439, 614454440, 614454443, 614454444, 614454445, 614454446, 614454438, 614454448, 614454449, 614454452, 614454453, 614454454, 614454455, 614454447, 614454457, 614454458, 614454461, 614454462, 614454463, 614454464, 614454456, 614454466, 614454467, 614454470, 614454471, 614454472, 614454473, 614454465, 614454475, 614454476, 614454479, 614454480, 614454481, 614454482, 614454474, 614454484, 614454485, 614454488, 614454489, 614454490, 614454491, 614454483, 614454493, 614454494, 614454497, 614454498, 614454499, 614454500, 614454492, 614454502, 614454503, 614454506, 614454507, 614454508, 614454509, 614454501, 614454511, 614454512, 614454515, 614454516, 614454517, 614454518, 614454510, 614454520, 614454521, 614454524, 614454525, 614454526, 614454527, 614454519, 614454529, 614454530, 614454533, 614454534, 614454535, 614454536, 614454528, 614454538, 614454539, 614454542, 614454543, 614454544, 614454545, 614454537, 614454547, 614454548, 614454551, 614454552, 614454553, 614454554, 614454546, 614454556, 614454557, 614454560, 614454561, 614454562, 614454563, 614454555, 614454565, 614454566, 614454569, 614454570, 614454571, 614454572, 614454564, 614454574, 614454575, 614454578, 614454579, 614454580, 614454581, 614454573, 614454583, 614454584, 614454587, 614454588, 614454589, 614454590, 614454582, 614454592, 614454593, 614454596, 614454597, 614454598, 614454599, 614454591, 614454601, 614454602, 614454605, 614454606, 614454607, 614454608, 614454600, 614454610, 614454611, 614454614, 614454615, 614454616, 614454617, 614454609, 614454619, 614454620, 614454623, 614454624, 614454625, 614454626, 614454618, 614454628, 614454629, 614454632, 614454633, 614454634, 614454635, 614454627, 614454637, 614454638, 614454641, 614454642, 614454643, 614454644, 614454636, 614454646, 614454647, 614454650, 614454651, 614454652, 614454653, 614454645, 614454655, 614454656, 614454659, 614454660, 614454661, 614454662, 614454654, 614454664, 614454665, 614454668, 614454669, 614454670, 614454671, 614454663, 614454673, 614454674, 614454677, 614454678, 614454679, 614454680, 614454672] are in the atlas, but don't have a corresponding mesh file.

(The fact that region 545 is missing is a known issue; I think the other IDs are barrel-related.)
Hello, great - thank you for looking into this so quickly! I am at a conference all week, so I may only update you in the middle of next week! Axel
Hello, to answer your questions:

- I don't think atlas-enhancement will be on pip, so it is necessary to remove the dependency, as you pointed out. One thing I have modified is which resolutions can be used. I wanted 10 and 25, however only the 10um resolution file for the barrel annotation is pregenerated on Zenodo. Creating a new file for 25um takes time, and I am not sure it's a gain to run the entire barrel-annotation pipeline when creating a new atlas in BrainGlobe. So for simplicity I guess it's easier to ask users to download the pregenerated 10um file from Zenodo, then use it for the new atlas.
- I have added this for the new structures; it runs, but I'd need to inspect visually that the meshes look ok in brainrender.
- I have used simple rules related to the tree organization to select structures of interest, yes.
- I have asked about this, awaiting answer. Edit: the IDs were generated using the logic of https://github.com/BlueBrain/atlas-splitter/blob/main/atlas_splitter/utils.py

I guess I'd need your input on how to best organize the BlueBrain files into this before running the atlas generation script.

Best,
```
@@ -0,0 +1,311 @@
__version__ = "2"
```
This should be 0. This is the atlas/minor version, that will get combined with the current version of the API to create the final version of the packaged atlas. If this is set to 0, then the final packaged atlas will be 1.0. It can be incremented each time the script is updated. If the atlas API is updated, it will jump to 2.0.
Suggested change:

```diff
-__version__ = "2"
+__version__ = "0"
```
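As a hedged illustration of the versioning scheme described above (the exact composition is an assumption based on this comment, not taken from the codebase):

```python
# Assumed scheme: final atlas version = <API major version>.<atlas minor version>
atlas_minor_version = "0"  # __version__ in the atlas script
api_major_version = "1"    # hypothetical current major version of the atlas API
final_version = f"{api_major_version}.{atlas_minor_version}"
# With the script version set to 0, the packaged atlas would be versioned 1.0;
# incrementing the script gives 1.1, and an API update would jump to 2.x.
```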
This was fixed
Hi @abisi,

The script is looking good. I can run the 10um version locally, but the 25um version errors as the wrapup script can't find a mesh.

The main issue is that the "new" meshes don't overlay with the existing barrel meshes. They are in the wrong place, and are too small. I assume it's because they aren't scaled properly. The meshes should be in "real" (i.e. microns) units. There may need to be some logic to manually scale these meshes, and not the ones from the Allen.

These two issues may be related - it's possible that at 25um, some meshes are so small they cannot be extracted?
brainglobe_atlasapi/atlas_generation/atlas_scripts/allen_mouse_bluebrain_barrels.py
```python
root_id = 997
closing_n_iters = 2
decimate_fraction = 0.3
smooth = True
```
Could these be defined at the top of the file?
I think the issue with the meshes is as follows:

I think that this problem can be solved by scaling the "new" meshes by the resolution before they are passed to the wrap up script (and not scaling them again in the wrap up script). This code should work: `brainglobe_atlasapi/atlas_generation/wrapup.py`, lines 169 to 174 at commit 3ed159d.

It's possible that the meshes may need reorienting, but this will be the first step.
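A minimal sketch of the scaling idea, assuming mesh vertices come out of mesh extraction in voxel-index space and need multiplying by the resolution to land in micron units (plain NumPy here for illustration; the real meshes are handled by the atlas-generation tooling):

```python
import numpy as np

resolution_um = 10.0  # microns per voxel; the 10um atlas is assumed here
# Hypothetical vertices of a "new" barrel mesh, in voxel coordinates:
verts_voxel = np.array([[0.0, 0.0, 0.0],
                        [120.0, 80.0, 55.0]])
# Scale into "real" (micron) units so the mesh overlays the Allen meshes,
# which are already in microns:
verts_um = verts_voxel * resolution_um
```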
Ok! I am currently testing this.
Hello :) So it seems that setting this to True doesn't change anything.
Sorry I wasn't clear. Assuming you're talking about the
Ah, I see. Do you have an idea how to achieve this? Thanks!
You should be able to use the code that's linked above (from the wrap up function), but in your atlas generation script. Use it to scale each of the barrel meshes in a loop. Hopefully all meshes will then be in the same space before they're passed to the wrap up function (where they won't be scaled again).
I tested with the new files, and the atlas + rescaled meshes are now created as they should be!
Do any new files need to be added to GIN then?
Hi! Yes, I sent a drive link on zulip!
I think this is finally done!
Hello!
I added a script to generate a new BrainGlobe Atlas object: the Allen Mouse Brain atlas with barrel annotations derived from work at the Blue Brain Project, EPFL. The resolutions are either 10 or 25 microns.
Github repo for barrel annotations: https://github.com/BlueBrain/atlas-enhancement/tree/main/barrel-annotations
Preprint: https://www.biorxiv.org/content/10.1101/2023.08.24.554204v2
These annotations refine the SSp-bfd structure with additional children, which are the barrel columns, themselves containing children for each cortical layer. The end goal in my case is to more accurately track Neuropixels probe locations within the SSp-bfd region. However, these annotations may be useful for a larger community as well as for other applications.
To add these annotations, I used pregenerated annotation volumes in lieu of the current annotations, shared by the author of the code @alTeska.
I have manually added the structures in the hierarchy, as the .json files were organized differently (dict of dicts in BlueBrain code, vs. list of dicts in Allen `structures.json`). In particular, I retained the original SSp-bfd cortical layers (1-2/3-4-5-6a-6b) as children of SSp-bfd. Then I added the columns, e.g. SSp-bfd-A1, and their children for cortical layers, e.g. SSp-bfd-A1-1, etc. Note: the BlueBrain hierarchy contains layers 2 and 3 as children of 2/3. For clarity and ease of use I have unilaterally decided to exclude the former, as often layers 2/3 are analyzed jointly. 😄

Because these pregenerated files are not hosted online, I need to share them with you so you may test.
Can you access this link? https://drive.google.com/drive/folders/1xVEEdhZIYKsDNDimiPO850Pwz4ZCdsg2?usp=sharing
I guess ultimately it'd be cleaner to use the above-mentioned code to regenerate these files from scratch? This would require using the `transplant_barrels_nrrd.py` script (https://github.com/BlueBrain/atlas-enhancement/blob/main/barrel-annotations/transplant_barrels_nrrd.py).

Let me know if there are any issues or questions!
Thank you for your help!
Axel