A principal challenge in the analysis of tissue imaging data is cell segmentation: the task of identifying the precise boundary of every cell in an image. To address this challenge we constructed TissueNet, a dataset for training segmentation models that contains more than 1 million manually labeled cells, an order of magnitude more than all previously published segmentation training datasets. We used TissueNet to train Mesmer, a deep-learning-enabled segmentation algorithm. We demonstrated that Mesmer is more accurate than previous methods, generalizes to the full diversity of tissue types and imaging platforms in TissueNet, and achieves human-level performance. Mesmer enabled the automated extraction of key cellular features, such as subcellular localization of protein signal, which was challenging with previous approaches. We then adapted Mesmer to harness cell lineage information in highly multiplexed datasets and used this enhanced version to quantify cell morphology changes during human gestation. All code, data and models are released as a community resource.
The TissueNet dataset is available at https://datasets.deepcell.org/ for noncommercial use.
All software for dataset construction, model training, deployment and analysis is available on our GitHub page https://github.com/vanvalenlab/intro-to-deepcell. All code to generate the figures in this paper is available at https://github.com/vanvalenlab/publication-figures/tree/master/2021-Greenwald_Miller_et_al-Mesmer
Giesen, C. et al. Highly multiplexed imaging of tumor tissues with subcellular resolution by mass cytometry. Nat. Methods 11, 417–422 (2014).
Keren, L. et al. MIBI-TOF: a multiplexed imaging platform relates cellular phenotypes and tissue structure. Sci. Adv. 5, eaax5851 (2019).
Huang, W., Hennrick, K. & Drew, S. A colorful future of quantitative pathology: validation of Vectra technology using chromogenic multiplexed immunohistochemistry and prostate tissue microarrays. Hum. Pathol. 44, 29–38 (2013).
Lin, J.-R. et al. Highly multiplexed immunofluorescence imaging of human tissues and tumors using t-CyCIF and conventional optical microscopes. eLife 7, e31657 (2018).
Gerdes, M. J. et al. Highly multiplexed single-cell analysis of formalin-fixed, paraffin-embedded cancer tissue. Proc. Natl Acad. Sci. USA 110, 11982–11987 (2013).
Goltsev, Y. et al. Deep profiling of mouse splenic architecture with CODEX multiplexed imaging. Cell 174, 968–981.e15 (2018).
Chen, K. H., Boettiger, A. N., Moffitt, J. R., Wang, S. & Zhuang, X. Spatially resolved, highly multiplexed RNA profiling in single cells. Science 348, aaa6090 (2015).
Lee, J. H. et al. Highly multiplexed subcellular RNA sequencing in situ. Science 343, 1360–1363 (2014).
Moffitt, J. R. et al. Molecular, spatial and functional single-cell profiling of the hypothalamic preoptic region. Science 362, eaau5324 (2018).
Wang, X. et al. Three-dimensional intact-tissue sequencing of single-cell transcriptional states. Science 361, eaat5691 (2018).
Lubeck, E., Coskun, A. F., Zhiyentayev, T., Ahmad, M. & Cai, L. Single-cell in situ RNA profiling by sequential hybridization. Nat. Methods 11, 360–361 (2014).
Eng, C.-H. L. et al. Transcriptome-scale super-resolved imaging in tissues by RNA seqFISH+. Nature 568, 235–239 (2019).
Rozenblatt-Rosen, O. et al. The Human Tumor Atlas Network: charting tumor transitions across space and time at single-cell resolution. Cell 181, 236–249 (2020).
Snyder, M. P. et al. The human body at cellular resolution: the NIH Human Biomolecular Atlas Program. Nature 574, 187–192 (2019).
Regev, A. et al. The human cell atlas white paper. Preprint at https://arxiv.org/abs/1810.05192v1 (2018).
Keren, L. et al. A structured tumor-immune microenvironment in triple negative breast cancer revealed by multiplexed ion beam imaging. Cell 174, 1373–1387.e19 (2018).
Milo, R. & Phillips, R. Cell Biology by the Numbers 1st edn (Garland Science, 2015).
Mescher, A. Junqueira’s Basic Histology: Text and Atlas 13th edn (McGraw Hill, 2013).
McQuin, C. et al. CellProfiler 3.0: next-generation image processing for biology. PLoS Biol. 16, e2005970 (2018).
Schindelin, J. et al. Fiji: an open-source platform for biological-image analysis. Nat. Methods 9, 676–682 (2012).
Schneider, C. A., Rasband, W. S. & Eliceiri, K. W. NIH Image to ImageJ: 25 years of image analysis. Nat. Methods 9, 671–675 (2012).
Berg, S. et al. ilastik: interactive machine learning for (bio)image analysis. Nat. Methods 16, 1226–1232 (2019).
de Chaumont, F. et al. Icy: an open bioimage informatics platform for extended reproducible research. Nat. Methods 9, 690–696 (2012).
Belevich, I., Joensuu, M., Kumar, D., Vihinen, H. & Jokitalo, E. Microscopy Image Browser: a platform for segmentation and analysis of multidimensional datasets. PLoS Biol. 14, e1002340 (2016).
Ronneberger, O., Fischer, P. & Brox, T. in Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015 (eds Navab, N. et al.) 234–241 (Lecture Notes in Computer Science 9351, Springer, 2015).
Van Valen, D. A. et al. Deep learning automates the quantitative analysis of individual cells in live-cell imaging experiments. PLoS Comput. Biol. 12, e1005177 (2016).
Caicedo, J. C. et al. Nucleus segmentation across imaging experiments: the 2018 Data Science Bowl. Nat. Methods 16, 1247–1253 (2019).
Stringer, C., Wang, T., Michaelos, M. & Pachitariu, M. Cellpose: a generalist algorithm for cellular segmentation. Nat. Methods 18, 100–106 (2021).
Hollandi, R. et al. nucleAIzer: a parameter-free deep learning framework for nucleus segmentation using image style transfer. Cell Syst. 10, 453–458.e6 (2020).
Koyuncu, C. F., Gunesli, G. N., Cetin-Atalay, R. & Gunduz-Demir, C. DeepDistance: a multi-task deep regression model for cell detection in inverted microscopy images. Med. Image Anal. 63, 101720 (2020).
Yang, L. et al. NuSeT: a deep learning tool for reliably separating and analyzing crowded cells. PLoS Comput. Biol. 16, e1008193 (2020).
Yu, W. et al. CCDB:6843, Mus musculus, Neuroblastoma. CIL. Dataset. https://doi.org/10.7295/W9CCDB6843
Koyuncu, C. F., Cetin-Atalay, R. & Gunduz-Demir, C. Object-oriented segmentation of cell nuclei in fluorescence microscopy images. Cytometry A 93, 1019–1028 (2018).
Ljosa, V., Sokolnicki, K. L. & Carpenter, A. E. Annotated high-throughput microscopy image sets for validation. Nat. Methods 9, 637 (2012).
Kumar, N. et al. A multi-organ nucleus segmentation challenge. IEEE Trans. Med. Imaging 39, 1380–1391 (2020).
Verma, R. et al. MoNuSAC2020: a multi-organ nuclei segmentation and classification challenge. IEEE Trans. Med. Imaging https://doi.org/10.1109/TMI.2021.3085712 (2021).
Moen, E. et al. Accurate cell tracking and lineage construction in live-cell imaging experiments with deep learning. Preprint at bioRxiv https://doi.org/10.1101/803205 (2019).
Gamper, J. et al. PanNuke dataset extension, insights and baselines. Preprint at https://arxiv.org/abs/2003.10778v7 (2020).
Bannon, D. et al. DeepCell Kiosk: scaling deep learning–enabled cellular image analysis with Kubernetes. Nat. Methods 18, 43–45 (2021).
Haberl, M. G. et al. CDeep3M: plug-and-play cloud-based deep learning for image segmentation. Nat. Methods 15, 677–680 (2018).
Ouyang, W., Mueller, F., Hjelmare, M., Lundberg, E. & Zimmer, C. ImJoy: an open-source computational platform for the deep learning era. Nat. Methods 16, 1199–1200 (2019).
von Chamier, L. et al. Democratising deep learning for microscopy with ZeroCostDL4Mic. Nat. Commun. 12, 2276 (2021).
Hughes, A. J. et al. Quanti.us: a tool for rapid, flexible, crowd-based annotation of images. Nat. Methods 15, 587–590 (2018).
Ouyang, W., Le, T., Xu, H. & Lundberg, E. Interactive biomedical segmentation tool powered by deep learning and ImJoy. F1000Research 10, 142 (2021).
Wolny, A. et al. Accurate and versatile 3D segmentation of plant tissues at cellular resolution. eLife 9, e57613 (2020).
DeepCell Label: https://github.com/vanvalenlab/deepcell-label
Lin, T.-Y. et al. Feature pyramid networks for object detection. In Proc. IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2117–2125 (IEEE, 2017).
Tan, M., Pang, R. & Le, Q. V. EfficientDet: scalable and efficient object detection. In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 10778–10787 (IEEE, 2020).
He, K., Zhang, X., Ren, S. & Sun, J. Deep residual learning for image recognition. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 770–778 (IEEE, 2016).
Zuiderveld, K. in Graphics Gems (ed. Heckbert, P. S.) Ch. VIII.5 (Academic Press, 1994).
Chevalier, G. Make smooth predictions by blending image patches, such as for image segmentation. https://github.com/Vooban/Smoothly-Blend-Image-Patches
Meyer, F. & Beucher, S. Morphological segmentation. J. Vis. Commun. Image Represent. 1, 21–46 (1990).
Weigert, M., Schmidt, U., Haase, R., Sugawara, K. & Myers, G. Star-convex polyhedra for 3D object detection and segmentation in microscopy. In IEEE Winter Conference on Applications of Computer Vision (WACV) 3655–3662 (IEEE, 2020).
Fu, C.-Y., Shvets, M. & Berg, A. C. RetinaMask: learning to predict masks improves state-of-the-art single-shot detection for free. Preprint at https://arxiv.org/abs/1901.03353v1 (2019).
Schürch, C. M. et al. Coordinated cellular neighborhoods orchestrate antitumoral immunity at the colorectal cancer invasive front. Cell 182, 1341–1359.e19 (2020).
Ali, H. R. et al. Imaging mass cytometry and multiplatform genomics define the phenogenomic landscape of breast cancer. Nat. Cancer 1, 163–175 (2020).
Gaglia, G. et al. HSF1 phase transition mediates stress adaptation and cell fate decisions. Nat. Cell Biol. 22, 151–158 (2020).
Nelson, D. E. et al. Oscillations in NF-κB signaling control the dynamics of gene expression. Science 306, 704–708 (2004).
Kumar, K. P., McBride, K. M., Weaver, B. K., Dingwall, C. & Reich, N. C. Regulated nuclear-cytoplasmic localization of interferon regulatory factor 3, a subunit of double-stranded RNA-activated factor 1. Mol. Cell. Biol. 20, 4159–4168 (2000).
Wolff, A. C. et al. Recommendations for human epidermal growth factor receptor 2 testing in breast cancer: American Society of Clinical Oncology/College of American Pathologists clinical practice guideline update. J. Clin. Oncol. 31, 3997–4013 (2013).
Risom, T. et al. Transition to invasive breast cancer is associated with progressive changes in the structure and composition of tumor stroma. Preprint at bioRxiv https://doi.org/10.1101/2021.01.05.425362 (2021).
Ark Analysis. https://github.com/angelolab/ark-analysis
Koss, L. G. Diagnostic Cytology and Its Histopathologic Bases (J.B. Lippincott Company, 1979).
Erlebacher, A. Immunology of the maternal-fetal interface. Annu. Rev. Immunol. 31, 387–411 (2013).
Greenbaum, S. et al. Spatio-temporal coordination at the maternal-fetal interface promotes trophoblast invasion and vascular remodeling in the first half of human pregnancy. Preprint at bioRxiv https://doi.org/10.1101/2021.09.08.459490 (2021).
Garrido-Gomez, T. et al. Defective decidualization during and after severe preeclampsia reveals a possible maternal contribution to the etiology. Proc. Natl Acad. Sci. USA 114, E8468–E8477 (2017).
Deep Cell Core Library. Deep learning for single-cell analysis. https://github.com/vanvalenlab/deepcell-tf
Bankhead, P. et al. QuPath: open source software for digital pathology image analysis. Sci. Rep. 7, 16878 (2017).
Graham, S. et al. Hover-Net: simultaneous segmentation and classification of nuclei in multi-tissue histology images. Med. Image Anal. 58, 101563 (2019).
Tsai, H.-F., Gajda, J., Sloan, T. F. W., Rares, A. & Shen, A. Q. Usiigaci: instance-aware cell tracking in stain-free phase contrast microscopy enabled by machine learning. SoftwareX 9, 230–237 (2019).
Kiemen, A. et al. In situ characterization of the 3D microanatomy of the pancreas and pancreatic cancer at single cell resolution. Preprint at bioRxiv https://doi.org/10.1101/2020.12.08.416909 (2020).
Cao, J. et al. Establishment of a morphological atlas of the Caenorhabditis elegans embryo using deep-learning-based 4D segmentation. Nat. Commun. 11, 6254 (2020).
Schulz, D. et al. Simultaneous multiplexed imaging of mRNA and proteins with subcellular resolution in breast cancer tissue samples by mass cytometry. Cell Syst. 6, 531 (2018).
McKinley, E. T. et al. Optimized multiplex immunofluorescence single-cell analysis reveals tuft cell heterogeneity. JCI Insight 2, e93487 (2017).
Patel, S. S. et al. The microenvironmental niche in classic Hodgkin lymphoma is enriched for CTLA-4-positive T cells that are PD-1-negative. Blood 134, 2059–2069 (2019).
Jackson, H. W. et al. The single-cell pathology landscape of breast cancer. Nature 578, 615–620 (2020).
Rashid, R. et al. Highly multiplexed immunofluorescence images and single-cell data of immune markers in tonsil and lung cancer. Sci. Data 6, 323 (2019).
McCaffrey, E. F. et al. Multiplexed imaging of human tuberculosis granulomas reveals immunoregulatory features conserved across tissue and blood. Preprint at bioRxiv https://doi.org/10.1101/2020.06.08.140426 (2020).
van der Walt, S. et al. scikit-image: image processing in Python. PeerJ 2, e453 (2014).
Kingma, D. P. & Ba, J. Adam: a method for stochastic optimization. Preprint at https://arxiv.org/abs/1412.6980v9 (2014).
Kluyver, T. et al. in Positioning and Power in Academic Publishing: Players, Agents and Agendas (eds Schmidt, B. & Loizides, F.) (IOS Press, 2016).
Chollet, F. et al. Keras. https://keras.io (2015).
Hunter, J. D. Matplotlib: a 2D graphics environment. Comput. Sci. Eng. 9, 90–95 (2007).
Harris, C. R. et al. Array programming with NumPy. Nature 585, 357–362 (2020).
Reback, J. et al. pandas-dev/pandas: Pandas 1.1.3. https://doi.org/10.5281/zenodo.3509134 (2020).
Pedregosa, F. et al. Scikit-learn: machine learning in Python. Preprint at https://arxiv.org/abs/1201.0490v4 (2012).
Waskom, M. et al. mwaskom/seaborn. https://doi.org/10.5281/zenodo.592845 (2020).
Abadi, M. et al. TensorFlow: large-scale machine learning on heterogeneous distributed systems. Preprint at https://arxiv.org/abs/1603.04467v2 (2016).
Hoyer, S. & Hamman, J. xarray: N-D labeled arrays and datasets in Python. J. Open Res. Softw. 5, 10 (2017).
We thank K. Borner, L. Cai, M. Covert, A. Karpathy, S. Quake and M. Thomson for interesting discussions; D. Glass and E. McCaffrey for feedback on the manuscript; T. Vora for copy editing; R. Angoshtari, G. Barlow, B. Bodenmiller, C. Carey, R. Coffey, A. Delmastro, C. Egelston, M. Hoppe, H. Jackson, A. Jeyasekharan, S. Jiang, Y. Kim, E. McCaffrey, E. McKinley, M. Nelson, S.-B. Ng, G. Nolan, S. Patel, Y. Peng, D. Philips, R. Rashid, S. Rodig, S. Santagata, C. Schuerch, D. Schulz, Di. Simons, P. Sorger, J. Weirather and Y. Yuan for providing imaging data for TissueNet; the crowd annotators who powered our human-in-the-loop pipeline; and all patients who contributed samples for this study. This work was supported by grants from the Shurl and Kay Curci Foundation, the Rita Allen Foundation, the Susan E. Riley Foundation, the Pew Heritage Trust, the Alexander and Margaret Stewart Trust, the Heritage Medical Research Institute, the Paul Allen Family Foundation through the Allen Discovery Centers at Stanford and Caltech, the Rosen Center for Bioengineering at Caltech and the Center for Environmental and Microbial Interactions at Caltech (D.V.V.). This work was also supported by 5U54CA20997105, 5DP5OD01982205, 1R01CA24063801A1, 5R01AG06827902, 5UH3CA24663303, 5R01CA22952904, 1U24CA22430901, 5R01AG05791504 and 5R01AG05628705 from the NIH, W81XWH2110143 from the DOD, and other funding from the Bill and Melinda Gates Foundation, the Cancer Research Institute, the Parker Center for Cancer Immunotherapy and the Breast Cancer Research Foundation (M.A.). N.F.G. was supported by NCI CA246880-01 and the Stanford Graduate Fellowship. B.J.M. was supported by the Stanford Graduate Fellowship and Stanford Interdisciplinary Graduate Fellowship. T.D. was supported by the Schmidt Academy for Software Engineering at Caltech.
M.A. is an inventor on patent US20150287578A1. M.A. is a board member and shareholder in IonPath Inc. T.R. has previously consulted for IonPath Inc. D.V.V. and E.M. have filed a provisional patent for this work. The remaining authors declare no competing interests.
Peer review information Nature Biotechnology thanks the anonymous reviewers for their contribution to the peer review of this work.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
a, How multichannel images are represented and edited in DeepCell Label. b, Scalable backend for DeepCell Label that dynamically adjusts the required resources based on usage, allowing concurrent annotators to work in parallel. c, Human-in-the-loop workflow diagram. Images are uploaded to the server, run through Mesmer to generate predictions, and cropped to facilitate error correction. These crops are sent to the crowd to be corrected, stitched back together, passed through quality control to ensure accuracy, and used to train an updated model.
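The crop-and-stitch step of this workflow can be illustrated with a minimal sketch. This is not the released DeepCell Label pipeline: the function names and the simple count-based averaging of overlaps are hypothetical, written only to show how overlapping crops can be cut out and reassembled without seams.

```python
import numpy as np

def crop_into_tiles(img, tile, overlap):
    """Split a 2D image into overlapping square tiles.

    Returns the list of tiles and the (row, col) origin of each tile.
    Tiles near the right/bottom edges are shifted inward so every tile
    has the full `tile` size.
    """
    step = tile - overlap
    tiles, origins = [], []
    for y in range(0, max(img.shape[0] - overlap, 1), step):
        for x in range(0, max(img.shape[1] - overlap, 1), step):
            y0 = min(y, img.shape[0] - tile)
            x0 = min(x, img.shape[1] - tile)
            tiles.append(img[y0:y0 + tile, x0:x0 + tile].copy())
            origins.append((y0, x0))
    return tiles, origins

def stitch_tiles(tiles, origins, shape):
    """Reassemble tiles into one image, averaging values where tiles overlap."""
    acc = np.zeros(shape, dtype=float)
    cnt = np.zeros(shape, dtype=float)
    for t, (y0, x0) in zip(tiles, origins):
        acc[y0:y0 + t.shape[0], x0:x0 + t.shape[1]] += t
        cnt[y0:y0 + t.shape[0], x0:x0 + t.shape[1]] += 1
    return acc / cnt
```

Because the crops overlap, corrections made near a crop boundary by one annotator can be reconciled with the neighboring crop during stitching; the cited Smoothly-Blend-Image-Patches repository uses a weighted (windowed) blend rather than the plain average shown here.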
a, PanopticNet architecture. Images are fed into a ResNet50 backbone coupled to a feature pyramid network. Two semantic heads produce pixel-level predictions. The first head predicts whether each pixel belongs to the interior, border, or background of a cell, while the second head predicts the center of each cell. b, Relative proportion of preprocessing, inference, and postprocessing time in the PanopticNet architecture. c, Evaluation of precision, recall, and Jaccard index for Mesmer and previously published models (right) and models trained on TissueNet (left). d, Summary of TissueNet accuracy for Mesmer and selected models to facilitate future benchmarking efforts. e, f, Breakdown of the most prevalent error types (e) and less common error types (f) for Mesmer and previously published models demonstrates Mesmer’s advantages over prior approaches. g, Comparison of the size distribution of prediction errors for Mesmer (left) with nuclear segmentation followed by expansion (right) shows that Mesmer’s predictions are unbiased.
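The object-level precision and recall reported in panel c can be illustrated with a simplified sketch: match each ground-truth cell to at most one predicted cell when their intersection-over-union (IoU) exceeds a threshold, then count matches as true positives. This greedy matcher is written purely for illustration and is not taken from the released benchmarking code, which may use a more sophisticated assignment strategy.

```python
import numpy as np

def object_scores(y_true, y_pred, iou_thresh=0.5):
    """Object-level (precision, recall, F1) for two integer label masks.

    y_true / y_pred: 2D arrays where 0 is background and each positive
    integer labels one cell instance. A true and predicted object are
    matched when their IoU is at least `iou_thresh`.
    """
    true_ids = [i for i in np.unique(y_true) if i != 0]
    pred_ids = [j for j in np.unique(y_pred) if j != 0]
    matched_true, matched_pred = set(), set()
    for i in true_ids:
        t = y_true == i
        for j in pred_ids:
            if j in matched_pred:
                continue  # each prediction may match only one true object
            p = y_pred == j
            inter = np.logical_and(t, p).sum()
            union = np.logical_or(t, p).sum()
            if union and inter / union >= iou_thresh:
                matched_true.add(i)
                matched_pred.add(j)
                break
    tp = len(matched_true)
    precision = tp / len(pred_ids) if pred_ids else 0.0
    recall = tp / len(true_ids) if true_ids else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return precision, recall, f1
```

A missed cell lowers recall, a spurious prediction lowers precision, and merge/split errors (panels e, f) show up as unmatched objects on one side or the other.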
a, Accuracy of specialist models trained on each platform type (rows) and evaluated on data from other platform types (columns) indicates good agreement within immunofluorescence- and mass spectrometry-based approaches, but not across distinct modalities. b, Accuracy of specialist models trained on each tissue type (rows) and evaluated on data from other tissue types (columns) demonstrates that models trained on only a single tissue type do not generalize as well to other tissue types. c, Quantification of F1 score as a function of training dataset size. d–h, Quantification of individual error types as a function of training dataset size. i, Representative images where Mesmer accuracy was poor, as determined by the image-specific F1 score. j, Impact of image blurring on model accuracy. k, Impact of image downsampling followed by upsampling on model accuracy. l, Impact of adding random noise to the image on model accuracy. All scale bars are 50 μm.
Proof of concept for using Mesmer’s segmentation predictions to generate 3D segmentations. A z-stack of 3D data is fed to Mesmer, which produces separate 2D predictions for each slice. We computationally link the segmentation predictions from each slice to form 3D objects. This approach can form the basis for human-in-the-loop construction of training data for 3D models.
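The slice-linking idea can be sketched as a greedy overlap-based relabeling: each 2D object inherits the 3D label of the object in the previous slice that it overlaps most, provided the overlap is large enough relative to the smaller of the two objects. The function below is a hypothetical illustration under those assumptions; the paper's actual linking procedure may differ in detail.

```python
import numpy as np

def link_slices(stack, min_overlap=0.5):
    """Relabel a z-stack of 2D instance masks into consistent 3D objects.

    stack: (z, y, x) integer array, 0 = background, per-slice labels arbitrary.
    An object is linked to the previous slice when the shared pixel count is
    at least `min_overlap` of the smaller of the two objects; otherwise it
    starts a new 3D object.
    """
    linked = np.zeros_like(stack)
    next_id = 1
    # assign fresh global ids to the first slice
    for cid in np.unique(stack[0]):
        if cid == 0:
            continue
        linked[0][stack[0] == cid] = next_id
        next_id += 1
    for z in range(1, stack.shape[0]):
        for cid in np.unique(stack[z]):
            if cid == 0:
                continue
            cur = stack[z] == cid
            # find the previous-slice label with the largest overlap
            prev_labels, counts = np.unique(linked[z - 1][cur], return_counts=True)
            best, best_n = 0, 0
            for lab, n in zip(prev_labels, counts):
                if lab != 0 and n > best_n:
                    best, best_n = lab, n
            prev_size = (linked[z - 1] == best).sum() if best else 0
            smaller = min(cur.sum(), prev_size) if best else 0
            if best and smaller and best_n / smaller >= min_overlap:
                linked[z][cur] = best      # continue the existing 3D object
            else:
                linked[z][cur] = next_id   # start a new 3D object
                next_id += 1
    return linked
```

Greedy slice linking like this handles roughly convex cells well but cannot recover objects that split and remerge between slices, which is one reason the legend frames it as a starting point for human-in-the-loop 3D training data rather than a finished 3D model.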
About this article
Cite this article
Greenwald, N.F., Miller, G., Moen, E. et al. Whole-cell segmentation of tissue images with human-level performance using large-scale data annotation and deep learning.
Nat. Biotechnol. (2021). https://doi.org/10.1038/s41587-021-01094-0