Deep-learning-powered data analysis in plankton ecology on arXiv

Segmentation of two plankton species using deep learning (N. scintillans in blue, D. tertiolecta in green). (Image by H. Bachimanchi.)
Deep-learning-powered data analysis in plankton ecology
Harshith Bachimanchi, Matthew I. M. Pinder, Chloé Robert, Pierre De Wit, Jonathan Havenhand, Alexandra Kinnby, Daniel Midtvedt, Erik Selander, Giovanni Volpe
arXiv: 2309.08500

The implementation of deep learning algorithms has brought new perspectives to plankton ecology. Emerging as an alternative approach to established methods, deep learning offers objective schemes to investigate plankton organisms in diverse environments. We provide an overview of deep-learning-based methods, including detection and classification of phyto- and zooplankton images, analysis of foraging and swimming behaviour, and ecological modelling. Deep learning has the potential to speed up analysis and reduce human experimental bias, thus enabling data acquisition at relevant temporal and spatial scales with improved reproducibility. We also discuss shortcomings and show how deep learning architectures have evolved to mitigate imprecise readouts. Finally, we suggest opportunities where deep learning is particularly likely to catalyze plankton research. The examples are accompanied by detailed tutorials and code samples that allow readers to apply the methods described in this review to their own data.
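As a flavour of the kind of pipeline covered by such tutorials, the snippet below is a minimal, illustrative sketch of training a small convolutional classifier on plankton image crops with PyTorch. It is not the code accompanying the review; the folder name plankton_crops, the image size, and the training settings are placeholders.

# Minimal, illustrative plankton image classifier (PyTorch).
# Folder name, class layout, image size, and settings are placeholders.
import torch
import torch.nn as nn
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Grayscale(),            # plankton micrographs are often single-channel
    transforms.Resize((64, 64)),
    transforms.ToTensor(),
])

# Expects one subfolder per species, e.g. plankton_crops/noctiluca/, plankton_crops/dunaliella/
dataset = datasets.ImageFolder("plankton_crops", transform=transform)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, len(dataset.classes)),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

The same structure extends naturally to detection and behaviour analysis by swapping the classification head and loss for task-specific ones.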

Roadmap on Deep Learning for Microscopy on arXiv

Spatio-temporal spectrum diagram of microscopy techniques and their applications. (Image by the Authors of the manuscript.)
Roadmap on Deep Learning for Microscopy
Giovanni Volpe, Carolina Wählby, Lei Tian, Michael Hecht, Artur Yakimovich, Kristina Monakhova, Laura Waller, Ivo F. Sbalzarini, Christopher A. Metzler, Mingyang Xie, Kevin Zhang, Isaac C.D. Lenton, Halina Rubinsztein-Dunlop, Daniel Brunner, Bijie Bai, Aydogan Ozcan, Daniel Midtvedt, Hao Wang, Nataša Sladoje, Joakim Lindblad, Jason T. Smith, Marien Ochoa, Margarida Barroso, Xavier Intes, Tong Qiu, Li-Yu Yu, Sixian You, Yongtao Liu, Maxim A. Ziatdinov, Sergei V. Kalinin, Arlo Sheridan, Uri Manor, Elias Nehme, Ofri Goldenberg, Yoav Shechtman, Henrik K. Moberg, Christoph Langhammer, Barbora Špačková, Saga Helgadottir, Benjamin Midtvedt, Aykut Argun, Tobias Thalheim, Frank Cichos, Stefano Bo, Lars Hubatsch, Jesus Pineda, Carlo Manzo, Harshith Bachimanchi, Erik Selander, Antoni Homs-Corbera, Martin Fränzl, Kevin de Haan, Yair Rivenson, Zofia Korczak, Caroline Beck Adiels, Mite Mijalkov, Dániel Veréb, Yu-Wei Chang, Joana B. Pereira, Damian Matuszewski, Gustaf Kylberg, Ida-Maria Sintorn, Juan C. Caicedo, Beth A Cimini, Muyinatu A. Lediju Bell, Bruno M. Saraiva, Guillaume Jacquemet, Ricardo Henriques, Wei Ouyang, Trang Le, Estibaliz Gómez-de-Mariscal, Daniel Sage, Arrate Muñoz-Barrutia, Ebba Josefson Lindqvist, Johanna Bergman
arXiv: 2303.03793

Through digital imaging, microscopy has evolved from primarily being a means for visual observation of life at the micro- and nano-scale, to a quantitative tool with ever-increasing resolution and throughput. Artificial intelligence, deep neural networks, and machine learning are all niche terms describing computational methods that have gained a pivotal role in microscopy-based research over the past decade. This Roadmap is written collectively by prominent researchers and encompasses selected aspects of how machine learning is applied to microscopy image data, with the aim of gaining scientific knowledge by improved image quality, automated detection, segmentation, classification and tracking of objects, and efficient merging of information from multiple imaging modalities. We aim to give the reader an overview of the key developments and an understanding of possibilities and limitations of machine learning for microscopy. It will be of interest to a wide cross-disciplinary audience in the physical sciences and life sciences.

Single-shot self-supervised object detection in microscopy published in Nature Communications

LodeSTAR tracks the plankton Noctiluca scintillans. (Image by the Authors of the manuscript.)
Single-shot self-supervised object detection in microscopy
Benjamin Midtvedt, Jesús Pineda, Fredrik Skärberg, Erik Olsén, Harshith Bachimanchi, Emelie Wesén, Elin K. Esbjörner, Erik Selander, Fredrik Höök, Daniel Midtvedt, Giovanni Volpe
Nature Communications 13, 7492 (2022)
arXiv: 2202.13546
doi: 10.1038/s41467-022-35004-y

Object detection is a fundamental task in digital microscopy, where machine learning has made great strides in overcoming the limitations of classical approaches. The training of state-of-the-art machine-learning methods almost universally relies on vast amounts of labeled experimental data or the ability to numerically simulate realistic datasets. However, experimental data are often challenging to label and cannot be easily reproduced numerically. Here, we propose a deep-learning method, named LodeSTAR (Localization and detection from Symmetries, Translations And Rotations), that learns to detect microscopic objects with sub-pixel accuracy from a single unlabeled experimental image by exploiting the inherent roto-translational symmetries of this task. We demonstrate that LodeSTAR outperforms traditional methods in terms of accuracy, even when analyzing challenging experimental data containing densely packed cells or noisy backgrounds. Furthermore, by exploiting additional symmetries we show that LodeSTAR can measure other properties, e.g., vertical position and polarizability in holographic microscopy.
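The LodeSTAR implementation is available through the authors' DeepTrack2 Python package; the snippet below is only a conceptual sketch, in plain PyTorch, of the underlying training principle: a network trained on shifted copies of a single unlabeled crop is penalized whenever its predicted position does not shift consistently with the input. The architecture, crop, and training settings are placeholders, and the full method additionally exploits rotations and mirror symmetries and predicts dense, weighted position maps.

# Conceptual sketch of the LodeSTAR training principle (not the authors' implementation):
# enforce translation consistency of predicted positions on shifted copies of ONE unlabeled crop.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Detector(nn.Module):
    """Predicts a single (x, y) position from an image crop."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, 2),
        )

    def forward(self, x):
        return self.backbone(x)

def random_shift(crop, max_shift=8):
    """Translate the crop by a random integer pixel offset and return (image, [dx, dy])."""
    dy, dx = torch.randint(-max_shift, max_shift + 1, (2,))
    shifted = torch.roll(crop, shifts=(int(dy), int(dx)), dims=(-2, -1))
    return shifted, torch.tensor([float(dx), float(dy)])

model = Detector()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
crop = torch.rand(1, 1, 64, 64)  # the single unlabeled training image (placeholder)

for step in range(1000):
    shifted_a, offset_a = random_shift(crop)
    shifted_b, offset_b = random_shift(crop)
    pred_a = model(shifted_a)
    pred_b = model(shifted_b)
    # Translation equivariance: predictions must differ by the difference in applied shifts.
    loss = F.mse_loss(pred_a - pred_b, (offset_a - offset_b).unsqueeze(0))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()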

Recent eLife article on plankton tracking gets featured on Swedish national radio

Plankton imaged under a holographic microscope. (Illustration by J. Heuschele.)
The article Microplankton life histories revealed by holographic microscopy and deep learning was featured on Vetenskapsradion Nyheter (the science news programme operated by Sveriges Radio, the Swedish national radio) on November 7, 2022.

The short audio feature, Hologram hjälper forskare att förstå plankton (Holograms help researchers understand plankton), which highlights the key results of the paper in Swedish, is now available for public listening.

Vetenskapsradion Nyheter airs daily news, reports, and in-depth discussions about the latest research.

Press release on Microplankton life histories revealed by holographic microscopy and deep learning

Plankton imaged under a holographic microscope. (Illustration by J. Heuschele.)
The article Microplankton life histories revealed by holographic microscopy and deep learning has been featured in the news of the University of Gothenburg (in English and Swedish) and in the press release of eLife (in English).

The study, now published in eLife and co-written by researchers at the Soft Matter Lab of the Department of Physics at the University of Gothenburg, demonstrates how the combination of holographic microscopy and deep learning provides a strong complementary tool in marine microbial ecology. The approach allows quantitative assessment of microplankton feeding behaviours and of biomass increase throughout the cell cycle, from generation to generation.

The study is also featured in the eLife digest.

Here are the links to the press releases:
Researchers combine microscopy with AI to characterise marine microbial food web (eLife, English)
Holographic microscopy provides insights into the life of microplankton (GU, English)
Hologram ger insyn i planktonens liv (GU, Swedish)
The secret lives of microbes (eLife digest)

Microplankton life histories revealed by holographic microscopy and deep learning published in eLife

Tracking of microplankton by holographic optical microscopy and deep learning. (Image by H. Bachimanchi.)
Microplankton life histories revealed by holographic microscopy and deep learning
Harshith Bachimanchi, Benjamin Midtvedt, Daniel Midtvedt, Erik Selander, and Giovanni Volpe
eLife 11, e79760 (2022)
arXiv: 2202.09046
doi: 10.7554/eLife.79760

The marine microbial food web plays a central role in the global carbon cycle. Our mechanistic understanding of the ocean, however, is biased towards its larger constituents, while rates and biomass fluxes in the microbial food web are mainly inferred from indirect measurements and ensemble averages. Yet, resolution at the level of the individual microplankton is required to advance our understanding of the oceanic food web. Here, we demonstrate that, by combining holographic microscopy with deep learning, we can follow microplanktons throughout their lifespan, continuously measuring their three-dimensional position and dry mass. The deep learning algorithms circumvent the computationally intensive processing of holographic data and allow rapid measurements over extended time periods. This permits us to reliably estimate growth rates, both in terms of dry mass increase and cell divisions, as well as to measure trophic interactions between species such as predation events. The individual resolution provides information about selectivity, individual feeding rates, and handling times for individual microplanktons. This method is particularly useful to explore the flux of carbon through micro-zooplankton, the most important and least known group of primary consumers in the global oceans. We exemplify this by detailed descriptions of micro-zooplankton feeding events, cell divisions, and long-term monitoring of single cells from division to division.
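In quantitative phase and holographic imaging, dry mass is commonly estimated from the integrated phase shift via m = (λ / 2πα) ∫ Δφ dA, where α ≈ 0.18 µm³/pg is the specific refractive increment of typical cellular dry matter. The snippet below sketches this standard relation in Python; the wavelength, pixel size, and test image are placeholders, and the paper's own pipeline infers such quantities with deep learning directly from the holograms rather than with this explicit formula.

# Minimal sketch: dry mass from an unwrapped phase image via the standard
# quantitative-phase relation m = (wavelength / (2 * pi * alpha)) * sum(phase) * pixel_area.
# Values are placeholders; this illustrates the general relation, not the paper's pipeline.
import numpy as np

def dry_mass_pg(phase, pixel_size_um, wavelength_um=0.633, alpha_um3_per_pg=0.18):
    """Estimate dry mass (pg) from an unwrapped phase image (radians).

    phase            : 2D array of phase shifts caused by the cell (radians)
    pixel_size_um    : lateral pixel size in micrometres
    wavelength_um    : illumination wavelength in micrometres
    alpha_um3_per_pg : specific refractive increment (~0.18 um^3/pg for cellular dry matter)
    """
    optical_volume = (wavelength_um / (2 * np.pi)) * phase.sum() * pixel_size_um**2
    return optical_volume / alpha_um3_per_pg

# Example with a synthetic Gaussian phase bump standing in for a cell:
y, x = np.mgrid[-32:32, -32:32]
phase = 1.5 * np.exp(-(x**2 + y**2) / (2 * 8**2))  # radians
print(f"Estimated dry mass: {dry_mass_pg(phase, pixel_size_um=0.25):.2f} pg")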