README.md (41 changes: 17 additions & 24 deletions)

<p align="center">
<a href="/LICENSE" alt="licence"><img src="https://img.shields.io/github/license/softmatterlab/DeepTrack-2.0"></a>
<a href="https://badge.fury.io/py/deeptrack"><img src="https://badge.fury.io/py/deeptrack.svg" alt="PyPI version"></a>
<a href="https://softmatterlab.github.io/DeepTrack2/deeptrack.html"><img src="https://img.shields.io/badge/docs-passing-green" alt="PyPI version"></a>
<a href="https://deeptrackai.github.io/DeepTrack2"><img src="https://img.shields.io/badge/docs-passing-green" alt="PyPI version"></a>
<a href="https://badge.fury.io/py/deeptrack"><img src="https://img.shields.io/badge/python-3.8%20%7C%203.9%20%7C%203.10%20%7C%203.11%20%7C%203.12-blue" alt="Python version"></a>
<a href="https://doi.org/10.1063/5.0034891" alt="DOI">
<img src="https://img.shields.io/badge/DOI-10.1063%2F5.0034891-blue">
</a>
</p>
<p align="center">
<a href="#installation">Installation</a> •
<a href="#using-deeptrack">Examples</a> •
<a href="#learning-deeptrack-21">Basics</a> •
<a href="#examples-of-applications-using-deeptrack"> Examples</a> •
<a href="#getting-started-guides">Basics</a> •
<a href="#cite-us">Cite us</a> •
<a href="/LICENSE">License</a>
</p>
Everybody learns in different ways! Depending on your preferences, and what you want to achieve, you may want to start with one or more of the following resources.

## Getting-started guides

We have a set of four notebooks that aim to teach you everything you need to know to use DeepTrack to its fullest, with a focus on applications. (A minimal pipeline sketch follows the list below.)

1. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/get-started/01.%20deeptrack_introduction_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> deeptrack_introduction_tutorial </a> gives an overview of how to use DeepTrack 2.1.
2. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/tracking_particle_cnn_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> tracking_particle_cnn_tutorial </a> demonstrates how to track a point particle with a convolutional neural network (CNN).
3. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorial/tracking_multiple_particles_unet_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> tracking_particle_cnn_tutorial </a> demonstrates how to track multiple particles using a U-net.
4. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/characterizing_aberrations_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> characterizing_aberrations_tutorial </a> demonstrates how to add and characterize aberrations of an optical device.
5. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/distinguishing_particles_in_brightfield_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> distinguishing_particles_in_brightfield_tutorial </a> demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
6. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/analyzing_video_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> analyzing_video_tutorial </a> demonstrates how to create videos and how to train a neural network to analyze them.
1. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/get-started/01.%20deeptrack_introduction_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> deeptrack_introduction_tutorial </a> Gives an overview of how to use DeepTrack 2.1.
2. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/01.%20tracking_particle_cnn_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> tracking_particle_cnn_tutorial </a> Demonstrates how to track a point particle with a convolutional neural network (CNN).
3. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorial/02.%20tracking_multiple_particles_unet_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> tracking_multiple_particles_unet_tutorial </a> Demonstrates how to track multiple particles using a U-net.
4. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/tutorials/03.%20distinguishing_particles_in_brightfield_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> distinguishing_particles_in_brightfield_tutorial </a> Demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.


A second series of notebooks focuses on individual topics, introducing them in a natural order.

1. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/get-started/01.%20deeptrack_introduction_tutorial.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> Introducing how to create simulation pipelines and train models. </a>
2. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/get-started/02.%20using_deeptrack_generators.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> Demonstrating data generators. </a>
3. <a href="https://colab.research.google.com/github/DeepTrackAI/DeepTrack2/blob/develop/examples/get-started/03.%20customizing_deeptrack_models.ipynb"> <img src="https://colab.research.google.com/assets/colab-badge.svg"> Demonstrating how to customize models using layer-blocks. </a>

## DeepTrack 2.1 in action

Additionally, we have six more case studies that are less documented but give additional insight into how to use DeepTrack with real datasets:

1. [Single Particle Tracking](examples/paper-examples/2-single_particle_tracking.ipynb) Tracks experimentally captured videos of a single particle. (Requires opencv-python compiled with ffmpeg to open and read a video.)
2. [Multi-Particle Tracking](examples/paper-examples/4-multi-molecule-tracking.ipynb) Detects quantum dots in a low SNR image.
3. [Particle Feature Extraction](examples/paper-examples/3-particle_sizing.ipynb) Extracts the radius and refractive index of particles (see the sketch after this list).
4. [Cell Counting](examples/paper-examples/6-cell_counting.ipynb) Counts the number of cells in fluorescence images.
5. [3D Multi-Particle Tracking](examples/paper-examples/5-inline_holography_3d_tracking.ipynb) Tracks particles in three dimensions.
6. [GAN Image Generation](examples/paper-examples/7-GAN_image_generation.ipynb) Uses a GAN to create cell images from masks.
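
In the spirit of the particle feature extraction case study, here is a hedged sketch of how a simulated brightfield image and its ground-truth properties (the targets a sizing network is trained on) can be generated; the class and property names are our assumptions about the DeepTrack API.

```python
import numpy as np
import deeptrack as dt

# A spherical scatterer with a random radius and refractive index.
sphere = dt.Sphere(
    position=(32, 32),
    radius=lambda: np.random.uniform(0.5e-6, 1.5e-6),
    refractive_index=lambda: np.random.uniform(1.35, 1.55),
)

# Image it through a simulated brightfield microscope.
image = dt.Brightfield(output_region=(0, 0, 64, 64))(sphere).update().resolve()

# Read back the ground-truth properties stored on the resolved image.
radius = image.get_property("radius")
refractive_index = image.get_property("refractive_index")
print(radius, refractive_index)
```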

## Model-specific examples

We also have examples that are specific to certain models. These include:
- [*LodeSTAR*](examples/LodeSTAR) for label-free particle tracking.
- [*MAGIK*](examples/MAGIK) for graph-based particle linking and trace characterization.

## Documentation
The detailed documentation of DeepTrack 2.1 is available at the following link: [https://deeptrackai.github.io/DeepTrack2](https://deeptrackai.github.io/DeepTrack2)

## Video Tutorials

examples/MAGIK/readme.md (66 changes: 66 additions & 0 deletions, new file)

# MAGIK

MAGIK is a geometric deep learning approach for the analysis of dynamical properties from time-lapse microscopy.
Here we provide the code as well as instructions to train models and to analyze experimental data.

# Getting started

## Installation from PyPI

MAGIK requires Python 3.6 or later.

To install MAGIK, you must install the [DeepTrack](https://github.com/softmatterlab/DeepTrack-2.0) framework. Open a terminal or command prompt and run:

pip install deeptrack
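
To check that the installation worked, try importing the package (a minimal sketch; the `__version__` attribute is an assumption, and a successful import is the main point):

```python
import deeptrack as dt  # should import without errors
print(dt.__version__)
```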

## Software requirements

### OS Requirements

MAGIK has been tested on the following systems:

- macOS: Monterey (12.2.1)
- Windows: 10 (64-bit)

### Python Dependencies

```
tensorflow
numpy
matplotlib
scipy
Sphinx==2.2.0
pydata-sphinx-theme
numpydoc
scikit-image
tensorflow-probability
pint
pandas
```

If you have a very recent version of Python, you may need to install NumPy _before_ DeepTrack (i.e., `pip install numpy` first, then `pip install deeptrack`). This is a known issue with scikit-image.

## It's a kind of MAGIK...

To see MAGIK in action, we provide an [example](https://github.com/softmatterlab/DeepTrack-2.0/blob/develop/examples/MAGIK/) based on live-cell migration experiments. Data courtesy of Sergi Masó Orriols, [the QuBI lab](https://mon.uvic.cat/qubilab/).

## Cite us!

If you use MAGIK in your project, please cite our article:

```
Jesús Pineda, Benjamin Midtvedt, Harshith Bachimanchi, Sergio Noé, Daniel Midtvedt, Giovanni Volpe, and Carlo Manzo
"Geometric deep learning reveals the spatiotemporal fingerprint of microscopic motion."
arXiv:2202.06355 (2022).
https://arxiv.org/pdf/2202.06355.pdf
```

## Funding

This work was supported by FEDER/Ministerio de Ciencia, Innovación y Universidades – Agencia Estatal de Investigación
through the “Ramón y Cajal” program 2015 (Grant No. RYC-2015-17896) and the “Programa Estatal de I+D+i Orientada a los Retos de la Sociedad” (Grant No. BFU2017-85693-R); the Generalitat de Catalunya (AGAUR Grant No. 2017SGR940); the ERC Starting Grant ComplexSwimmers (Grant No. 677511); the ERC Starting Grant MAPEI (Grant No. 101001267); and the Knut and Alice Wallenberg Foundation (Grant No. 2019.0079).

## License

This project is covered under the **MIT License**.