Preprint Highlight: Learning orientation-invariant representations enables accurate and robust morphologic profiling of cells and organelles

Highlighted By: Assaf Zaritsky, Ben-Gurion University of the Negev

Preprint DOI: https://doi.org/10.1101/2022.12.08.519671

Significance Statement:

  • One application of deep learning in the analysis of cell biological microscopy data is developing meaningful quantitative representations of cellular and molecular phenotypic signatures. Because image orientation is irrelevant to shape and morphology, encoding orientation within such representations confounds downstream analyses (see the sketch after this list).

  • This study presents O2-VAE, a neural network method for learning orientation-invariant, image-based shape representations. The authors demonstrate that O2-VAE is insensitive to image orientation in cell and organelle shape phenotyping. Specifically, O2-VAE was validated on diverse experimental systems, ranging from simulations to human induced pluripotent stem cells, and on downstream analyses including clustering, dimensionality reduction, and outlier detection.

  • This article opens the door to the design and evaluation of orientation-invariant representations that may enable more effective deep learning–driven phenotyping.
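
To make the invariance argument concrete, here is a minimal, hypothetical Python sketch (not the O2-VAE model itself). It uses a classical hand-crafted invariant, a radial intensity profile, purely to illustrate the point: the raw pixel representation of a shape changes substantially under rotation, while an orientation-invariant descriptor of the same shape barely moves. The helper name `radial_profile` and the toy ellipse image are assumptions introduced for illustration.

```python
import numpy as np
from scipy import ndimage


def radial_profile(img, n_bins=16):
    """Rotation-invariant descriptor: mean intensity per radial bin
    around the image center. Rotating the image about its center only
    moves pixels within each annulus, leaving the profile unchanged
    (up to interpolation error)."""
    h, w = img.shape
    yy, xx = np.indices(img.shape)
    r = np.hypot(yy - (h - 1) / 2, xx - (w - 1) / 2)
    bins = np.minimum((r / r.max() * n_bins).astype(int), n_bins - 1)
    return np.array([img[bins == b].mean() for b in range(n_bins)])


# Toy "cell": a binary ellipse centered in a 64x64 image.
img = np.zeros((64, 64))
yy, xx = np.indices(img.shape)
img[((yy - 32) / 8) ** 2 + ((xx - 32) / 20) ** 2 <= 1] = 1.0

# The same shape, rotated 45 degrees about the image center.
rot = ndimage.rotate(img, 45, reshape=False, order=1)

# Pixel-space distance is large even though the shape is identical...
print("pixel distance:    ", np.linalg.norm(img - rot))
# ...while the orientation-invariant descriptor is nearly unchanged.
print("invariant distance:", np.linalg.norm(radial_profile(img) - radial_profile(rot)))
```

A hand-crafted profile like this achieves invariance by discarding angular structure; the appeal of a learned approach such as O2-VAE is that it aims to keep representations orientation-invariant while still retaining discriminative shape information for profiling.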

Read the Preprint:

Learning orientation-invariant representations enables accurate and robust morphologic profiling of cells and organelles
James Burgess, Jeffrey J. Nirschl, Maria-Clara Zanellati, Sarah Cohen, Serena Yeung
bioRxiv 2022.12.08.519671; doi: https://doi.org/10.1101/2022.12.08.519671


This Preprint Highlight was previously published in Molecular Biology of the Cell on April 20, 2023: https://doi.org/10.1091/mbc.P23-04-0013


MBoC's Preprint Highlights are commentaries written by Early-Career Editors on recent preprints of interest. They do not constitute peer review or imply publication of the original preprint by MBoC. For more information, visit https://www.molbiolcell.org/curation-tools.