Presentation by S. Helgadottir at the Gothenburg Science Festival, 2 October 2020

Logo of the Gothenburg Science Festival.

Saga Helgadottir will give a presentation at the Gothenburg Science Festival 2020.

The International Science Festival Gothenburg is one of Europe’s leading popular science events. Its first edition dates back to 1997, and it is held every year in spring.
This year the festival takes place in autumn, 28 September to 4 October. Due to the current situation, the festival will be held as a digital event, available online throughout the festival week.

Saga Helgadottir's contribution will be presented according to the following schedule:

Saga Helgadottir
Deep Learning for Object Recognition
Deep Learning is a machine learning technique that teaches computers to do what comes naturally to humans: learn by example. In this talk, I will show how Deep Learning can be used to identify objects in images, in particular microscopic particles.
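
As a rough illustration of what such an object-recognition pipeline can look like in practice (this is not the talk's own code), the sketch below classifies the content of an image with a pretrained convolutional network from Keras; the image path "example.jpg" is a placeholder.

```python
# Illustrative sketch only: identify objects in an image with a pretrained
# convolutional neural network (MobileNetV2 trained on ImageNet).
import numpy as np
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, preprocess_input, decode_predictions)
from tensorflow.keras.preprocessing import image

model = MobileNetV2(weights="imagenet")

# "example.jpg" is a placeholder path for the image to be classified.
img = image.load_img("example.jpg", target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

# Print the three most likely object classes with their probabilities.
for _, label, prob in decode_predictions(model.predict(x), top=3)[0]:
    print(f"{label}: {prob:.2f}")
```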

Date: 2 October 2020
Time: 18:08
Duration: 17′
Link: Deep Learning for Object Recognition

Links:
Vetenskapsfestivalen Göteborg (in Swedish)
The International Science Festival Gothenburg (in English)
Full Program

Diagnosis of a genetic disease improves with machine learning, a summary in Swedish published in Fysikaktuellt

Neural networks consist of a series of connected layers of neurons, whose connection weights are adjusted to learn how to determine the diagnosis from the input data.
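
As a minimal, hypothetical sketch of such a network (the published model's architecture and training data are not reproduced here), the Keras snippet below builds a small fully connected classifier whose connection weights are adjusted during training to map input features to a diagnosis; the input size and data are placeholders.

```python
# Minimal illustrative sketch, not the published model: a small fully connected
# neural network mapping input features to a binary diagnosis probability.
import numpy as np
from tensorflow import keras

n_features = 10  # hypothetical number of input variables per patient

model = keras.Sequential([
    keras.layers.Input(shape=(n_features,)),
    keras.layers.Dense(32, activation="relu"),    # hidden layers whose connection
    keras.layers.Dense(32, activation="relu"),    # weights are adjusted during training
    keras.layers.Dense(1, activation="sigmoid"),  # probability of a positive diagnosis
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in data; the real model is trained on clinical records.
X = np.random.rand(1000, n_features)
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)  # toy labelling rule for illustration
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```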

A summary in Swedish of our previously published article “Virtual genetic diagnosis for familial hypercholesterolemia powered by machine learning” has been published in Fysikaktuellt, the journal of the Swedish Physical Society (Svenska fysikersamfundet).

Article: “Diagnostisering av sjukdomar förbättras med maskininlärning”, Saga Helgadottir, Giovanni Volpe and Stefano Romeo (in Swedish)

Original article: Virtual genetic diagnosis for familial hypercholesterolemia powered by machine learning

Press release: 
Algoritm lär sig diagnostisera genetisk sjukdom (in Swedish)
An algorithm that learns to diagnose genetic disease (in English)

Saga Helgadottir interviewed by Curie, a magazine issued by the Swedish Research Council

Saga Helgadottir discussed her research with Curie, a magazine issued by the Swedish Research Council. The article gives examples of how AI is used in many research disciplines. The article is available on Curie's webpage.

Presentation by Saga Helgadottir at the CECAM Workshop “Active Matter and Artificial Intelligence”, Lausanne, Switzerland, 30 September 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir, Aykut Argun & Giovanni Volpe
CECAM Workshop “Active Matter and Artificial Intelligence”, Lausanne, Switzerland
30 September 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, also at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.
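
As a rough sketch of the data-driven idea described above (this is not the DeepTrack implementation; the actual package is linked below), the snippet trains a small convolutional network to regress the (x, y) centroid of a particle from an image; the image size and training data are placeholders.

```python
# Illustrative sketch only: a small convolutional network that regresses the
# (x, y) centre of a single particle from a grayscale image. The actual
# DeepTrack package (see the GitHub link below) differs in architecture,
# training-data generation and API.
import numpy as np
from tensorflow import keras

img_size = 51  # hypothetical image side length in pixels

model = keras.Sequential([
    keras.layers.Input(shape=(img_size, img_size, 1)),
    keras.layers.Conv2D(16, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(32, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(2),  # predicted particle centre (x, y) in pixels
])
model.compile(optimizer="adam", loss="mse")

# Training would use simulated particle images with known centres;
# random stand-ins here only illustrate the expected array shapes.
images = np.random.rand(100, img_size, img_size, 1)
centres = np.random.rand(100, 2) * img_size
model.fit(images, centres, epochs=2, verbose=0)
```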

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack

03:40 PM–04:00 PM, Monday, September 30, 2019

Presentation by Saga Helgadottir at the AI for Health and Healthy AI conference, Gothenburg, Sweden, 30 August 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir, Aykut Argun & Giovanni Volpe
AI for Health and Healthy AI conference, Gothenburg, Sweden
30 August 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, also at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Friday, August 30, 2019

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack

Invited Seminar by Saga Helgadottir at the Max Planck Institute for the Science of Light, 10 May 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir
Sandoghdar Division, Max Planck Institute for the Science of Light, Erlangen, Germany
10 May 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, also at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Saga Helgadottir, Aykut Argun & Giovanni Volpe, Optica 6(4), 506–513 (2019)
doi: 10.1364/OPTICA.6.000506
arXiv: 1812.02653
GitHub: DeepTrack

Presentation by Saga Helgadottir at the OSA Biophotonics Congress, Tucson, 16 April 2019

Digital video microscopy enhanced by deep learning

Saga Helgadottir, Aykut Argun & Giovanni Volpe
OSA Biophotonics Congress, Tucson (AZ), USA
16 April 2019

Single particle tracking is essential in many branches of science and technology, from the measurement of biomolecular forces to the study of colloidal crystals. Standard methods rely on algorithmic approaches; by fine-tuning several user-defined parameters, these methods can be highly successful at tracking a well-defined kind of particle under low-noise conditions with constant and homogeneous illumination. Here, we introduce an alternative data-driven approach based on a convolutional neural network, which we name DeepTrack. We show that DeepTrack outperforms algorithmic approaches, especially in the presence of noise and under poor illumination conditions. We use DeepTrack to track an optically trapped particle under very noisy and unsteady illumination conditions, where standard algorithmic approaches fail. We then demonstrate how DeepTrack can also be used to track multiple particles and non-spherical objects such as bacteria, also at very low signal-to-noise ratios. In order to make DeepTrack readily available for other users, we provide a Python software package, which can be easily personalized and optimized for specific applications.

Session: Biological Applications
10:30 AM–12:00 PM, Tuesday, April 16, 2019

More information is available at: https://www.osapublishing.org/abstract.cfm?uri=OMA-2019-AT2E.5