threshold-based segmentation, coiled for N4/thresholding, BIDS app CLI, gubra and marmoset templates #18

Open: wants to merge 44 commits into base: main

Commits (44)
397c25a
reading from cloud storage
akhanf Oct 14, 2024
641f727
fixes
akhanf Oct 14, 2024
c693e95
updates to get deform_to_template working
akhanf Oct 15, 2024
9b6336e
update cvpl-tools
akhanf Oct 15, 2024
38a6488
update deps
akhanf Oct 15, 2024
fdf75ad
fix workflow bug
akhanf Oct 17, 2024
49d1b27
remove print
akhanf Oct 17, 2024
751178d
add wip MIP example with coiled
akhanf Oct 23, 2024
9bb6e98
add WIP sandbox notebook on various analyses
akhanf Oct 29, 2024
3789f01
some more updates -- couldn't get upsampling to work!
akhanf Oct 29, 2024
e917f05
updates to get n4 applied by seb
akhanf Oct 29, 2024
3d76a74
nearly completed workflow with n4, threshold, density
akhanf Nov 1, 2024
0b72cd0
recent changes
akhanf Nov 1, 2024
ae7f537
add antspyx dependency
akhanf Nov 1, 2024
15f992f
WIP - adding n4 and coiled otsu to workflow
akhanf Nov 2, 2024
2820cc5
working up to otsu mask
akhanf Nov 3, 2024
6c9939f
use updated ZarrNii downsampling upon construction
akhanf Nov 4, 2024
b298735
updates before rerunning on all up to n4
akhanf Nov 4, 2024
0638e93
fix to get fieldfrac working, uses updated zarrnii
akhanf Nov 5, 2024
cf63339
updated params, use coiled resource
akhanf Nov 6, 2024
73bb440
remove extra
akhanf Nov 7, 2024
74467ea
updates for oldBlaze
akhanf Nov 7, 2024
064c126
fix typo, extend idle timeout
akhanf Nov 7, 2024
c6d1705
WIP analysis script
akhanf Nov 7, 2024
57c5bc7
wip analysis notebook, add template-warped outputs
akhanf Nov 7, 2024
adaa1b7
add default profile for rerun triggers
akhanf Nov 7, 2024
03d73d5
add PI deformed
akhanf Nov 7, 2024
ceab3ab
don't add group neg mask yet
akhanf Nov 7, 2024
dbf9b83
add group avg
akhanf Nov 7, 2024
6a08e41
create negative mask from group subset and apply
akhanf Nov 8, 2024
1f8154f
change default target, add avg maskedfieldfrac
akhanf Nov 8, 2024
6f41cd6
add groupavg roi maps
akhanf Nov 8, 2024
5856696
add gubra template hopefully to improve registration
akhanf Nov 9, 2024
feff631
updated scripts, apply both negative masks
akhanf Nov 11, 2024
b67df9e
update path to tsv
akhanf Nov 11, 2024
bc519f4
marked template resource files ancient()
akhanf Nov 20, 2024
5d7382d
updates to use negative_group_mask from config
akhanf Nov 21, 2024
08a8f24
uses masked image to get threshold, should be more robust
akhanf Nov 21, 2024
8d78d14
update local pybids and snakebids
akhanf Nov 21, 2024
cd84722
more updates for CLI execution
akhanf Nov 21, 2024
380ac39
add marm template
akhanf Nov 21, 2024
e340bb8
make priors_template same as template
akhanf Nov 21, 2024
492d585
add MBMv3 template
akhanf Nov 21, 2024
cc7a85f
update zarrnii version
akhanf Nov 22, 2024
669 changes: 669 additions & 0 deletions examples/notebooks/MIP_coiled.ipynb

Large diffs are not rendered by default.

6,499 changes: 6,499 additions & 0 deletions examples/notebooks/coiled_sandbox.ipynb

Large diffs are not rendered by default.

2,694 changes: 2,039 additions & 655 deletions poetry.lock

Large diffs are not rendered by default.

10 changes: 6 additions & 4 deletions pyproject.toml
@@ -15,7 +15,7 @@ classifiers = [
[tool.poetry.dependencies]
python = ">=3.11,<3.12"
snakemake = ">=8.0.0"
snakebids = ">=0.12.0"
snakebids = {path = "/local/scratch/snakebids", develop = true }
pulp = "<2.8.0"
pandas = [
{ version = "<=2.0.3", python = "<3.9" },
@@ -28,11 +28,13 @@ scipy = "^1.12.0"
scikit-image = "^0.22.0"
dask-image = "^2023.8.1"
ome-zarr = "^0.9.0"
pybids = "^0.16.5"
sparse = "^0.15.1"
bokeh = "^3.4.1"
zarrnii = "0.1.3a1"
cvpl_tools = "^0.6.3"
zarrnii = "^0.1.4a1"
cvpl_tools = "^0.7.0"
gcsfs = "^2024.9.0.post1"
coiled = "^1.56.1"
antspyx = "^0.5.4"

[tool.poetry.scripts]
spimquant = "spimquant.run:app.run"
90 changes: 77 additions & 13 deletions spimquant/config/snakebids.yml
@@ -1,6 +1,14 @@
bids_dir: '../tests/data'
output_dir: '.'
#bids_dir: 'gcs://khanlab-lightsheet/data/mouse_appmaptapoe/bids'
#bids_dir: 'gcs://khanlab-lightsheet/data/mouse_appmaptapoe/bids_oldBlaze'
#output_dir: '.'

#root: '/cifs/trident/projects/mouse_appmaptapoe/lightsheet/derivatives/SPIMquant/v0.1.1-alpha'
#work: '/cifs/trident/.work'
#root: 'results'
#work: 'work'

#root_coiled: 'gcs://khanlab-lightsheet/data/mouse_appmaptapoe/bids/derivatives/SPIMquant_v0.1.1-alpha'
#work_coiled: 'gcs://khanlab-scratch/spimquant_work'

#list of analysis levels in the bids app
analysis_levels: &analysis_levels
@@ -24,6 +32,7 @@ pybids_inputs:
filters:
suffix: 'SPIM'
extension: 'ome.zarr'
sample: brain
wildcards:
- subject
- sample
@@ -34,6 +43,44 @@ pybids_inputs:
#configuration for the command-line parameters to make available
# passed on the argparse add_argument()
parse_args:
bids_dir_or_uri:
help: |
Local path or a remote cloud uri (e.g. gcs:// ) for BIDS dataset
type: str
--work:
help: "Local path to use for temporary files\n"
type: Path
default: /cifs/trident/.work
--root_coiled:
help: "Remote uri to use for finalized cloud outputs from coiled rules\n"
type: str
default:
gcs://khanlab-lightsheet/data/mouse_appmaptapoe/bids/derivatives/SPIMquant_v0.1.1-alpha
--work_coiled:
help: "Remote uri to use for temporary cloud outputs from coiled rules\n"
type: str
default: gcs://khanlab-scratch/spimquant_work

--template:
help: "Template to use for registration\n"
default: ABAv3
choices:
- ABAv3
- gubra
- MBMv3


--template_negative_mask:
help: "Negative mask, in the template space, to highlight regions to avoid\n"
type: Path
default: /cifs/trident/projects/mouse_appmaptapoe/lightsheet/derivatives/SPIMquant/v0.1.1-alpha/analysis_appmaptapoe_20241111/etc/negative_mask.nii


--remote_creds:
help: "Google cloud default credentials to use for coiled"
default: ~/.config/gcloud/application_default_credentials.json


--skip-bids-validation:
help: |
Skip validation of BIDS dataset. BIDS validation is performed by
@@ -52,9 +99,7 @@ containers:
ants: 'docker://kaczmarj/ants:2.3.4'
itksnap: 'docker://khanlab/itksnap:latest'

root: 'results'
work: 'work'

remote_creds: '~/.config/gcloud/application_default_credentials.json' #this is needed so we can pass creds to container

ome_zarr:
max_downsampling_layers: 4 # e.g. 4 levels: { 0: orig, 1: ds2, 2: ds4, 3: ds8, 4: ds16}
@@ -65,8 +110,9 @@ ome_zarr:
scaling_method: 'local_mean' #can be nearest, gaussian, local_mean, zoom (zoom uses spline interp)
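The `max_downsampling_layers` comment above implies each pyramid level downsamples by a factor of 2 per axis (level 0 = original resolution). A minimal sketch of that relationship, for illustration only:

```python
def downsample_factor(level: int) -> int:
    """Per-axis scale factor at a given OME-Zarr pyramid level (level 0 = original)."""
    return 2 ** level

# Matches the config comment: {0: orig, 1: ds2, 2: ds4, 3: ds8, 4: ds16}
print({level: downsample_factor(level) for level in range(5)})
# {0: 1, 1: 2, 2: 4, 3: 8, 4: 16}
```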


template: ABAv3 #template to use
#template: ABAv3 #template to use

#template_negative_mask: /cifs/trident/projects/mouse_appmaptapoe/lightsheet/derivatives/SPIMquant/v0.1.1-alpha/analysis_appmaptapoe_20241111/etc/negative_mask.nii


templates:
@@ -76,8 +122,8 @@ templates:
lut: '{workflow.basedir}/../resources/ABAv3/labelmapper_ABAv3_to_all.json'
segs:
all:
dseg: 'results/tpl-ABAv3/tpl-ABAv3_desc-LR_dseg.nii.gz'
tsv: 'results/tpl-ABAv3/tpl-ABAv3_desc-LR_dseg.tsv'
dseg: 'tpl-ABAv3/tpl-ABAv3_desc-LR_dseg.nii.gz'
tsv: 'tpl-ABAv3/tpl-ABAv3_desc-LR_dseg.tsv'
roi22:
dseg: '{workflow.basedir}/../resources/ABAv3/eed_labels/P56_annotation_22_R_L.nii.gz'
csv: '{workflow.basedir}/../resources/ABAv3/eed_labels/P56_annotation_22_R_L.csv'
@@ -87,22 +133,35 @@ templates:


gubra:
anat: '{workflow.basedir}/../resources/gubra/gubra_template_olf_affine_padABAv3.nii.gz'
dseg: '{workflow.basedir}/../resources/gubra/gubra_ano_olf_affine_padABAv3_remap.nii.gz'
anat: '{workflow.basedir}/../resources/gubra/gubra_template_olf_spacing_reslice.nii.gz'
dseg: '{workflow.basedir}/../resources/gubra/gubra_ano_olf_spacing_remap_reslice.nii.gz'
lut: '{workflow.basedir}/../resources/ABAv3/labelmapper_ABAv3_to_all.json'
segs:
all:
dseg: 'tpl-gubra/tpl-gubra_desc-LR_dseg.nii.gz'
tsv: 'tpl-gubra/tpl-gubra_desc-LR_dseg.tsv'

MBMv3:
anat: '{workflow.basedir}/../resources/MBMv3/template_T2w_brain.nii.gz'
dseg: '{workflow.basedir}/../resources/MBMv3/segmentation_three_types_seg.nii.gz' #this is used for brainmasking
segs:
paxinos:
dseg: '{workflow.basedir}/../resources/MBMv3/atlas_MBM_cortex_vPaxinos.nii.gz'
tsv: '{workflow.basedir}/../resources/MBMv3/atlas_MBM_cortex_vPaxinos.txt'


stains_for_reg: #ordered by priority
- PI
- autof
- AutoF

stains_for_blobdetect:
stains_for_segment:
- abeta
- Abeta
- BetaAmyloid
- AlphaSynuclein



templatereg:
level: 5
@@ -113,13 +172,18 @@ templatereg:

masking:
level: 5
priors_template: ABAv3
gmm_k: 9
gmm_bg_class: 1
pre_atropos_downsampling: '50%'

blobdetect:
level: 4 #downsampling level to use for blob detection
level: 5 #downsampling level to use for blob detection
dseg_level: 5 # downsampling level to use for template when assigning labels to blobs
dseg_template: ABAv3

segment:
otsu_level: 0
n4_ds_level: 4
fieldfrac_ds_level: 5


1 change: 1 addition & 0 deletions spimquant/profiles/default/config.yaml
@@ -0,0 +1 @@
rerun-triggers: mtime
Binary file not shown.
Binary file not shown.
136 changes: 136 additions & 0 deletions spimquant/resources/MBMv3/atlas_MBM_cortex_vPaxinos.txt
@@ -0,0 +1,136 @@
0 background 150 150 150 0
1 A1/A2 7 8 64 0
2 A10 179 100 217 0
3 A11 165 95 172 0
4 A13L 14 228 236 0
5 A13M 51 172 123 0
6 A13a 100 103 118 0
7 A13b 28 132 190 0
8 A14C 238 200 155 0
9 A14R 225 60 213 0
10 A19DI 146 204 208 0
11 A19M 99 49 13 0
12 A23V 0 191 175 0
13 A23a 13 211 56 0
14 A23b 2 229 144 0
15 A23c 132 131 187 0
16 A24a 189 196 167 0
17 A24b 37 108 71 0
18 A24c 237 99 183 0
19 A24d 27 241 237 0
20 A25 58 184 104 0
21 A29a-c 216 74 146 0
22 A29d 160 100 160 0
23 A30 174 115 171 0
24 A31 195 172 232 0
25 A32 60 4 178 0
26 A32V 167 133 133 0
27 A35 122 26 55 0
28 A36 8 95 46 0
29 A3a 70 225 112 0
30 A3b 231 133 195 0
31 A45 42 139 15 0
32 A46D 59 75 44 0
33 A46V 185 153 120 0
34 A47L 186 52 116 0
35 A47M 133 127 171 0
36 A47O 190 250 213 0
37 A4ab 23 140 132 0
38 A4c 65 137 171 0
39 A6DC 102 237 98 0
40 A6DR 122 178 222 0
41 A6M 161 197 114 0
42 A6Va 55 199 103 0
43 A6Vb 11 134 164 0
44 A8C 69 165 51 0
45 A8aD 189 185 209 0
46 A8Av 166 192 88 0
47 A8b 36 165 102 0
48 A9 187 87 130 0
49 dummylabel49 230 195 13 0
50 AI 154 249 209 0
51 AIP 231 42 21 0
52 Apri 148 131 133 0
53 dummylabel53 82 111 230 0
54 AuA1 0 107 19 0
55 AuAL 20 186 75 0
56 AuCL 18 243 45 0
57 AuCM 241 117 122 0
58 AuCPB 96 130 21 0
59 AuML 152 110 37 0
60 AuR 245 32 45 0
61 AuRM 20 97 209 0
62 AuRPB 205 53 17 0
63 AuRT 238 11 43 0
64 AuRTL 238 224 176 0
65 AuRTM 187 159 152 0
66 dummylabel66 34 230 119 0
67 DI 89 187 234 0
68 dummylabel68 51 179 40 0
69 dummylabel69 145 207 212 0
70 Ent 240 249 98 0
71 FST 215 5 25 0
72 GI 157 124 190 0
73 Gu 147 21 113 0
74 dummylabel74 145 217 128 0
75 Ipro 104 62 82 0
76 LIP 49 52 55 0
77 dummylabel77 71 22 174 0
78 dummylabel78 202 220 134 0
79 MIP 111 207 71 0
80 dummylabel80 176 32 98 0
81 MST 34 167 186 0
82 dummylabel82 216 207 134 0
83 OPAl 216 58 65 0
84 OPro 72 168 10 0
85 OPt 187 205 219 0
86 dummylabel86 89 85 95 0
87 PE 195 13 121 0
88 PEC 51 98 229 0
89 PF 140 58 19 0
90 PFG 135 166 22 0
91 PG 53 78 129 0
92 PGM 100 205 225 0
93 Pga-IPa 191 200 241 0
94 dummylabel94 112 207 155 0
95 dummylabel95 45 105 196 0
96 PaIL 214 128 134 0
97 PaIM 252 19 159 0
98 dummylabel98 53 82 119 0
99 dummylabel99 133 72 20 0
100 Pir 194 103 140 0
101 dummylabel101 33 232 134 0
102 dummylabel102 14 92 248 0
103 ProM 242 93 177 0
104 ProSt 49 177 96 0
105 ReI 39 126 55 0
106 S2E 216 214 74 0
107 S2I 251 158 185 0
108 S2PR 99 211 181 0
109 S2PV 179 18 249 0
110 dummylabel110 66 134 238 0
111 STR 123 101 9 0
112 TE1 170 183 113 0
113 TE2 204 197 166 0
114 TE3 51 68 239 0
115 TEO 152 110 38 0
116 TF 237 30 183 0
117 TFO 137 158 71 0
118 TH 7 8 200 0
119 TL 134 230 110 0
120 TLO 100 68 175 0
121 TPO 71 122 174 0
122 TPPro 128 130 190 0
123 Tpro 213 211 56 0
124 TPt 14 128 136 0
125 dummylabel125 110 113 118 0
126 V1 122 126 155 0
127 V2 12 209 104 0
128 V3 112 109 4 0
129 V3A 155 99 103 0
130 V4 170 105 171 0
131 V4T 30 100 119 0
132 V5 100 104 208 0
133 V6 11 34 64 0
134 V6A 130 190 113 0
135 VIP 27 18 64 0
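The table above follows an ITK-SNAP-style label layout (label id, name, R, G, B, flag). A minimal parsing sketch into an id-to-name lookup; the column layout is assumed from the rows above, and label names are assumed to contain no spaces (as in this file):

```python
def parse_label_table(text: str) -> dict[int, str]:
    """Parse 'id name R G B flag' rows into an {id: name} lookup."""
    labels = {}
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 6:  # skip blank or malformed rows
            labels[int(parts[0])] = parts[1]
    return labels

sample = "0 background 150 150 150 0\n1 A1/A2 7 8 64 0"
print(parse_label_table(sample))  # {0: 'background', 1: 'A1/A2'}
```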
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
11 changes: 11 additions & 0 deletions spimquant/resources/gubra/README.md
@@ -0,0 +1,11 @@
Gubra AutoF template
Modified from: https://github.com/Gubra-ApS/LSFM-mouse-brain-atlas

The posted files did not have correct header information (voxel size), were not in a physical space similar to ABAv3, and used a different label mapping. This is corrected with the c3d commands below for spacing and reslicing, plus a Python script to remap the labels. Note: the rigid transform was found using ITK-SNAP (align by center, then MI rigid registration).

e.g.:
```
c3d gubra_template_olf.nii.gz -spacing 0.025x0.025x0.025mm -o gubra_template_olf_spacing.nii.gz
c3d ../../ABAv3/P56_Atlas.nii.gz gubra_template_olf_spacing.nii.gz -reslice-itk ../../gubra/gubra_withspacing_to_aba_rigid_itk.txt -o ../../gubra/gubra_template_olf_spacing_reslice.nii.gz
c3d -int 0 gubra_template_olf_spacing_reslice.nii.gz gubra_ano_olf_spacing_remap.nii.gz -reslice-itk gubra_withspacing_to_aba_rigid_itk.txt -o gubra_ano_olf_spacing_remap_reslice.nii.gz
```
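The `-spacing 0.025x0.025x0.025mm` step is equivalent to writing a header whose affine scales voxel indices by 0.025 mm per axis. A minimal numpy sketch of that affine, assuming axis-aligned orientation and zero origin (illustrative only, not the full header fix):

```python
import numpy as np

# Affine implied by 0.025 mm isotropic spacing: voxel index -> physical mm.
spacing = (0.025, 0.025, 0.025)
affine = np.diag([*spacing, 1.0])
print(affine[0, 0])  # 0.025
```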
Binary file not shown.
@@ -0,0 +1,5 @@
#Insight Transform File V1.0
#Transform 0
Transform: MatrixOffsetTransformBase_double_3_3
Parameters: 0.9999998057710713 -0.0002197796696958088 -0.0005256684084527258 0.00021575284727777427 0.9999707352656585 -0.007648191524472518 0.0005273339125645778 0.007648076704510192 0.9999705800196018 235.7528966315556 316.94457485358885 -159.1717420834618
FixedParameters: 0 0 0
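In the ITK `MatrixOffsetTransformBase` convention, the `Parameters` line holds the 9 row-major entries of the 3x3 matrix followed by the 3 translation components. A sketch of assembling them into a 4x4 affine (ignoring `FixedParameters`, the rotation center, which is zero here anyway):

```python
import numpy as np

def itk_params_to_affine(params: list[float]) -> np.ndarray:
    """Build a 4x4 affine from 12 ITK MatrixOffsetTransformBase parameters
    (9 row-major matrix entries + 3 translation components)."""
    affine = np.eye(4)
    affine[:3, :3] = np.asarray(params[:9]).reshape(3, 3)
    affine[:3, 3] = params[9:12]
    return affine

# Near-identity rotation plus a large translation, as in the file above.
params = [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0, 235.75, 316.94, -159.17]
print(itk_params_to_affine(params)[0, 3])  # 235.75
```

Note that ITK transforms operate in LPS physical coordinates and map fixed-space points to moving-space points; converting to a NIfTI-style RAS affine would need an extra sign flip on the first two axes.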
48 changes: 48 additions & 0 deletions spimquant/resources/gubra/remap_labels.py
@@ -0,0 +1,48 @@
import nibabel as nib
import pandas as pd
import json
import numpy as np

# Paths to input files
nifti_path = "../LSFM-mouse-brain-atlas/LSFM_atlas_files/gubra_ano_olf_spacing.nii.gz" # Path to the input NIfTI file
tsv_path = "../LSFM-mouse-brain-atlas/LSFM_atlas_files/ARA2_annotation_info.csv" # Path to the CSV file with label intensities and names
json_path = "../ABAv3/labelmapper_ABAv3_to_all.json" # Path to the JSON file with target mappings
output_path = "gubra_ano_olf_spacing_remap.nii.gz" # Path to save the output NIfTI file

# Load the NIfTI image
nifti_img = nib.load(nifti_path)
nifti_data = nifti_img.get_fdata()

# Load the CSV file into a DataFrame
tsv_data = pd.read_csv(tsv_path)
id_to_name = dict(zip(tsv_data['id'], tsv_data['name']))

# Load the target mapping from the JSON file
with open(json_path, "r") as f:
target_map = json.load(f)

# Create a mapping from source intensities to target intensities
intensity_mapping = {}
for entry in target_map:
output_intensity = entry[0]
target_name = entry[3]

# Find the source ID in the TSV file that corresponds to the target name in the JSON
source_id = next((k for k, v in id_to_name.items() if v == target_name), None)
if source_id is not None:
intensity_mapping[source_id] = output_intensity
print(f'mapping {source_id} to {output_intensity} for {target_name}')
else:
print(f"Warning: No match found in TSV for target name '{target_name}' in JSON")

# Remap the intensities in the NIfTI data
remapped_data = np.copy(nifti_data)
for src_intensity, tgt_intensity in intensity_mapping.items():
remapped_data[nifti_data == src_intensity] = tgt_intensity

# Save the remapped data as a new NIfTI image
remapped_img = nib.Nifti1Image(remapped_data, nifti_img.affine, nifti_img.header)
remapped_img.to_filename(output_path)

print(f"Remapped NIfTI image saved as '{output_path}'")
