Presentation by Saga Helgadottir at the CECAM Workshop “Active Matter and Artificial Intelligence”, Lausanne, Switzerland, 30 September 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir, Aykut Argun & Giovanni Volpe
CECAM Workshop “Active Matter and Artificial Intelligence”, Lausanne, Switzerland
30 September 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, even at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack

03:40 PM–04:00 PM, Monday, September 30, 2019
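
The actual DeepTrack architecture and training pipeline are described in the paper and in the GitHub repository. Purely as a hedged illustration of the underlying idea (a convolutional neural network trained on simulated images to regress particle positions), a minimal Keras sketch could look as follows; all layer sizes, parameter values and function names are illustrative assumptions, not DeepTrack's own.

import numpy as np
from tensorflow.keras import layers, models

def simulate_particle_image(size=51, noise=0.1):
    # Toy stand-in for simulated training images: a noisy Gaussian spot
    # at a random position, returned together with its centre coordinates.
    x0, y0 = np.random.uniform(10, size - 10, 2)
    yy, xx = np.mgrid[0:size, 0:size]
    image = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * 3.0 ** 2))
    image += noise * np.random.randn(size, size)
    return image[..., None].astype("float32"), np.array([x0, y0], dtype="float32")

def make_dataset(n):
    images, positions = zip(*(simulate_particle_image() for _ in range(n)))
    return np.stack(images), np.stack(positions)

# Small convolutional network regressing the (x, y) particle position in pixels.
model = models.Sequential([
    layers.Input(shape=(51, 51, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(2),
])
model.compile(optimizer="adam", loss="mse")

x_train, y_train = make_dataset(2000)
model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)

x_test, y_test = make_dataset(10)
print("mean localization error [px]:", np.abs(model.predict(x_test) - y_test).mean())

Trained in this way on simulated data, such a network can then be applied frame by frame to experimental videos, which is the workflow that DeepTrack automates.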

Presentation by Saga Helgadottir at the AI for Health and Healthy AI conference, Gothenburg, Sweden, 30 August 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir, Aykut Argun & Giovanni Volpe
AI for Health and Healthy AI conference, Gothenburg, Sweden
30 August 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, even at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Friday, August 30, 2019

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack

Seminar by G. Volpe at MTL BrainHack School 2019, Montreal, Canada, 22 August 2019

Be friendly to your users:
Add comments and tutorials to your code
Giovanni Volpe
MTL BrainHack School 2019, Montreal, 22 August 2019
https://brainhackmtl.github.io/school2019/

When releasing a software package, it is critical to provide potential users with all the information they need to help them use it.
Using the example of Braph, a software package we recently developed to study brain connectivity (http://braph.org/), I’ll illustrate how we have commented the code, created a website and offline documentation, and recorded video tutorials.
I’ll conclude with some practical advice and best practices.
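
As a toy illustration of the kind of in-code documentation the talk advocates (not code from Braph itself), a Python function with a self-explanatory docstring and a built-in usage example might look like this:

def degree(adjacency):
    """Return the degree of each node of an undirected, unweighted graph.

    Parameters
    ----------
    adjacency : list[list[int]]
        Square 0/1 adjacency matrix; adjacency[i][j] == 1 if nodes i and j
        are connected.

    Returns
    -------
    list[int]
        Number of connections of each node.

    Example
    -------
    >>> degree([[0, 1, 1], [1, 0, 0], [1, 0, 0]])
    [2, 1, 1]
    """
    return [sum(row) for row in adjacency]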

Talk by G. Volpe at SPIE OTOM XVI, San Diego, 14 Aug 2019

FORMA: a high-performance algorithm for the calibration of optical tweezers
Laura Pérez-García, Alejandro V. Arzola, Jaime Donlucas Pérez, Giorgio Volpe & Giovanni Volpe
SPIE Nanoscience + Engineering, Optical Trapping and Optical Manipulation XVI, San Diego (CA), USA
11-15 August 2019

We introduce a powerful algorithm (FORMA) for the calibration of optical tweezers. FORMA accurately estimates the conservative and non-conservative components of the force field and offers important advantages over established techniques: it is parameter-free, requires ten-fold less data, and executes orders of magnitude faster. We demonstrate FORMA’s performance using optical tweezers, showing how, outperforming other available techniques, it can identify and characterise stable and unstable equilibrium points in generic force fields.

Reference: Pérez-García et al., Nature Communications 9, 5166 (2018)
doi: 10.1038/s41467-018-07437-x
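
As a hedged, one-dimensional toy sketch of the core idea behind FORMA (linearly regressing the friction-scaled drift against position, here to recover a trap stiffness), the following simulation of an optically trapped Brownian particle could be used; all parameter values are illustrative and the full algorithm is described in the reference above.

import numpy as np

kB_T = 4.11e-21                    # thermal energy at room temperature [J]
gamma = 6 * np.pi * 1e-3 * 1e-6    # Stokes drag of a 1-um-radius sphere in water [kg/s]
k_true = 1e-6                      # trap stiffness [N/m]
dt = 1e-4                          # sampling time [s]
n_steps = 100_000

# Simulate an overdamped Brownian particle in a harmonic optical trap.
x = np.zeros(n_steps)
noise = np.sqrt(2 * kB_T * dt / gamma) * np.random.randn(n_steps - 1)
for i in range(n_steps - 1):
    x[i + 1] = x[i] - (k_true / gamma) * x[i] * dt + noise[i]

# FORMA-style estimate: fit the friction-scaled drift linearly against position;
# the negative of the slope estimates the stiffness.
drift_force = gamma * np.diff(x) / dt
slope, intercept = np.polyfit(x[:-1], drift_force, 1)
print(f"true stiffness:      {k_true:.3e} N/m")
print(f"estimated stiffness: {-slope:.3e} N/m")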

Plenary Presentation by G. Volpe at SPIE Nanoscience + Engineering, San Diego, 12 Aug 2019

Optical forces go smart
Giovanni Volpe
Plenary Presentation
SPIE Nanoscience + Engineering, San Diego (CA), USA
11-15 August 2019

Optical forces have revolutionized nanotechnology. In particular, optical forces have been used to measure and exert femtonewton forces on nanoscopic objects. This has provided the essential tools to develop nanothermodynamics, to explore nanoscopic interactions such as critical Casimir forces, and to realize microscopic devices capable of autonomous operation. The future of optical forces now lies in the development of smarter experimental setups and data-analysis algorithms, partially empowered by the machine-learning revolution. This will open unprecedented possibilities, such as the study of energy and information flows in nanothermodynamic systems, the design of novel forms of interaction between nanoparticles, and the realization of smart microscopic devices.

Invited talk by G. Volpe at MPI-PKS Workshop, Dresden, Germany, 23 July 2019

Deep Learning Applications in Photonics and Active Matter
Giovanni Volpe
Invited talk at the “Microscale Motion and Light” MPI-PKS Workshop, Dresden, Germany, 22-26 July 2019
https://www.pks.mpg.de/mml19/

After a brief overview of artificial intelligence, machine learning and deep learning, I will present a series of recent works in which we have employed deep learning for applications in photonics and active matter. In particular, I will explain how we employed deep learning to enhance digital video microscopy [1], to estimate the properties of anomalous diffusion [2], and to improve the calculation of optical forces. Finally, I will provide an outlook for the application of deep learning in photonics and active matter.

References

[1] S. Helgadottir, A. Argun and G. Volpe, Digital video microscopy enhanced by deep learning. Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506

[2] S. Bo, F. Schmidt, R. Eichhorn and G. Volpe, Measurement of anomalous diffusion using recurrent neural networks. arXiv: 1905.02038
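
Reference [2] uses recurrent neural networks to infer the anomalous-diffusion exponent from single trajectories. The sketch below is only a toy illustration of that idea, not the architecture or training scheme of the paper: it generates fractional-Gaussian-noise trajectories by the Cholesky method and trains a small LSTM to regress the exponent alpha = 2H from the increments; all sizes and names are assumptions.

import numpy as np
from tensorflow.keras import layers, models

def fractional_gaussian_noise(hurst, n):
    # Cholesky method: covariance matrix of fractional-Gaussian-noise increments.
    k = np.arange(n)
    lags = k[:, None] - k[None, :]
    cov = 0.5 * (np.abs(lags + 1) ** (2 * hurst)
                 + np.abs(lags - 1) ** (2 * hurst)
                 - 2 * np.abs(lags) ** (2 * hurst))
    return np.linalg.cholesky(cov) @ np.random.randn(n)

def make_dataset(n_traj, n_steps=100):
    hursts = np.random.uniform(0.1, 0.9, n_traj)
    increments = np.stack([fractional_gaussian_noise(h, n_steps) for h in hursts])
    return increments[..., None].astype("float32"), (2 * hursts).astype("float32")

model = models.Sequential([
    layers.Input(shape=(100, 1)),
    layers.LSTM(32),
    layers.Dense(1),   # predicted anomalous-diffusion exponent alpha
])
model.compile(optimizer="adam", loss="mse")

x_train, y_train = make_dataset(2000)
model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)

x_test, y_test = make_dataset(10)
print("mean error on alpha:", np.abs(model.predict(x_test)[:, 0] - y_test).mean())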

Falko Schmidt presented his PhD half-time seminar

About mid-way through his PhD, Falko Schmidt presented his past research activities and gave an outlook on his future projects. The topics range from miniaturised machines and light-activated self-assembled active molecules to machine-learning techniques for better characterising the dynamical behaviour of microscopic systems.

The seminar was held at the Department of Physics at Gothenburg University on June 10th, 2019, starting at 12:15 p.m.

Invited talk by G. Volpe at Interface Dynamics and Dissipation Across the Time and Length-Scales, Tel Aviv, 22 May 2019

Emergent Complex Behaviour in Active Matter across Time- and Length Scales
Giovanni Volpe
Invited talk at “Interface Dynamics and Dissipation Across the Time- and Length-Scales”
CECAM Israel Workshop
Tel Aviv University, Tel Aviv, Israel
21-23 May 2019

After a brief introduction to active particles, I’ll present some recent advances in the study of active particles in complex and crowded environments.
First, I’ll show that active particles can work as microswimmers and microengines powered by critical fluctuations and controlled by light.
Then, I’ll discuss some examples of behavior of active particles in crowded environments: a few active particles alter the overall dynamics of a system; active particles create metastable clusters and channels; active matter leads to non-Boltzmann distributions and alternative non-equilibrium relations; and active colloidal molecules can be created and controlled by light.
Finally, I’ll present some examples of the behavior of active particles in complex environments: active particles often perform 2D active Brownian motion; active particles at liquid-liquid interfaces behave as active interstitials or as active atoms; and the environment alters the optimal search strategy for active particles in complex topologies.

https://www3.tau.ac.il/cecam/index.php/events/eventdetail/28/-/interface-dynamics-and-dissipation-across-the-time-and-length-scales#Program
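
As a hedged reminder of the 2D active Brownian motion mentioned above (the standard minimal model of an active particle), a simulation can be written in a few lines; all parameter values below are illustrative.

import numpy as np

dt = 0.01        # time step [s]
n_steps = 10_000
v = 3.0          # self-propulsion speed [um/s]
D_T = 0.2        # translational diffusion coefficient [um^2/s]
D_R = 0.5        # rotational diffusion coefficient [1/s]

x = np.zeros(n_steps); y = np.zeros(n_steps); theta = np.zeros(n_steps)
for i in range(n_steps - 1):
    # The orientation diffuses; the position follows the orientation plus thermal noise.
    theta[i + 1] = theta[i] + np.sqrt(2 * D_R * dt) * np.random.randn()
    x[i + 1] = x[i] + v * np.cos(theta[i]) * dt + np.sqrt(2 * D_T * dt) * np.random.randn()
    y[i + 1] = y[i] + v * np.sin(theta[i]) * dt + np.sqrt(2 * D_T * dt) * np.random.randn()

# Mean-squared displacement at a single lag: ballistic at short times,
# enhanced-diffusive at times longer than the rotational time 1/D_R.
lag = 100
msd = np.mean((x[lag:] - x[:-lag]) ** 2 + (y[lag:] - y[:-lag]) ** 2)
print(f"MSD at t = {lag * dt:.2f} s: {msd:.2f} um^2")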

Invited Seminar by Saga Helgadottir at the Max Planck Institute for the Science of Light, 10 May 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir
Sandoghdar Division, Max Planck Institute for the Science of Light, Erlangen, Germany
10 May 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, even at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack