From f82ab112f4d6917f6bb241ce883a1b5bad50b46e Mon Sep 17 00:00:00 2001
From: Alex <95913221+Pwhsky@users.noreply.github.com>
Date: Sat, 7 Dec 2024 18:18:15 +0100
Subject: [PATCH 1/5] Update README.md

MAGIK link now points to the MAGIK examples and not the source code for the GNNs.
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 00b6a7208..f31218270 100644
--- a/README.md
+++ b/README.md
@@ -123,7 +123,7 @@ Additionally, we have seven more case studies which are less documented, but giv
We also have examples that are specific for certain models. This includes
- [*LodeSTAR*](examples/LodeSTAR) for label-free particle tracking.
-- [*MAGIK*](deeptrack/models/gnns/) for graph-based particle linking and trace characterization.
+- [*MAGIK*](examples/MAGIK) for graph-based particle linking and trace characterization.

## Documentation
The detailed documentation of DeepTrack 2.1 is available at the following link: https://softmatterlab.github.io/DeepTrack2/deeptrack.html

From 95741140a003faa222d4ddffa15191bcac813fc2 Mon Sep 17 00:00:00 2001
From: Alex <95913221+Pwhsky@users.noreply.github.com>
Date: Sat, 7 Dec 2024 18:22:21 +0100
Subject: [PATCH 2/5] Move the MAGIK readme to examples
---
 examples/MAGIK/readme.md | 66 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 66 insertions(+)
 create mode 100644 examples/MAGIK/readme.md

diff --git a/examples/MAGIK/readme.md b/examples/MAGIK/readme.md
new file mode 100644
index 000000000..cbd6f1612
--- /dev/null
+++ b/examples/MAGIK/readme.md
@@ -0,0 +1,66 @@
# MAGIK

MAGIK is a geometric deep learning approach for the analysis of dynamical properties from time-lapse microscopy.
Here we provide the code as well as instructions to train models and to analyze experimental data.

# Getting started

## Installation from PyPI

MAGIK requires at least Python 3.6.

To install MAGIK, install the [DeepTrack](https://github.com/softmatterlab/DeepTrack-2.0) framework. Open a terminal or command prompt and run:

    pip install deeptrack

## Software requirements

### OS requirements

MAGIK has been tested on the following systems:

- macOS: Monterey (12.2.1)
- Windows: 10 (64-bit)

### Python dependencies

```
tensorflow
numpy
matplotlib
scipy
Sphinx==2.2.0
pydata-sphinx-theme
numpydoc
scikit-image
tensorflow-probability
pint
pandas
```

If you have a very recent version of Python, you may need to install numpy _before_ DeepTrack. This is a known issue with scikit-image.

## It's a kind of MAGIK...

To see MAGIK in action, we provide an [example](//github.com/softmatterlab/DeepTrack-2.0/blob/develop/examples/MAGIK/) based on live-cell migration experiments. Data courtesy of Sergi Masó Orriols, [the QuBI lab](https://mon.uvic.cat/qubilab/).

## Cite us!

If you use MAGIK in your project, please cite our article:

```
Jesús Pineda, Benjamin Midtvedt, Harshith Bachimanchi, Sergio Noé, Daniel Midtvedt, Giovanni Volpe, and Carlo Manzo
"Geometric deep learning reveals the spatiotemporal fingerprint of microscopic motion."
arXiv 2202.06355 (2022).
https://arxiv.org/pdf/2202.06355.pdf
```

## Funding

This work was supported by FEDER/Ministerio de Ciencia, Innovación y Universidades – Agencia Estatal de Investigación through the "Ramón y Cajal" program 2015 (Grant No. RYC-2015-17896) and the "Programa Estatal de I+D+i Orientada a los Retos de la Sociedad" (Grant No. BFU2017-85693-R); the Generalitat de Catalunya (AGAUR Grant No. 2017SGR940); the ERC Starting Grant ComplexSwimmers (Grant No. 677511); the ERC Starting Grant MAPEI (101001267); and the Knut and Alice Wallenberg Foundation (Grant No. 2019.0079).

## License

This project is covered under the **MIT License**.

From 6a586ff7218de5c09a7acceac1756c4afb23b10a Mon Sep 17 00:00:00 2001
From: Alex <95913221+Pwhsky@users.noreply.github.com>
Date: Sat, 7 Dec 2024 18:58:06 +0100
Subject: [PATCH 3/5] Valid Colab links now work. Removed Colab links without notebooks in the repository.
---
 README.md | 31 ++++++++++++-------------------
 1 file changed, 12 insertions(+), 19 deletions(-)

diff --git a/README.md b/README.md
index f31218270..366a1d2e6 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@
Installation •
- Examples •
+ Examples •
Basics •
Cite us •
License
@@ -91,33 +91,26 @@ Everybody learns in different ways! Depending on your preferences, and what you
## Getting-started guides
-We have two separate series of notebooks which aims to teach you all you need to know to use DeepTrack to its fullest. The first is a set of six notebooks with a focus on the application.
+We have two separate series of notebooks which aim to teach you all you need to know to use DeepTrack to its fullest. The first is a set of four notebooks with a focus on applications.
1. deeptrack_introduction_tutorial gives an overview of how to use DeepTrack 2.1.
-2. tracking_particle_cnn_tutorial demonstrates how to track a point particle with a convolutional neural network (CNN).
-3. tracking_particle_cnn_tutorial demonstrates how to track multiple particles using a U-net.
-4. characterizing_aberrations_tutorial demonstrates how to add and characterize aberrations of an optical device.
-5. distinguishing_particles_in_brightfield_tutorial demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
-6. analyzing_video_tutorial demonstrates how to create videos and how to train a neural network to analyze them.
+2. tracking_particle_cnn_tutorial demonstrates how to track a point particle with a convolutional neural network (CNN).
+3. tracking_multiple_particles_unet_tutorial demonstrates how to track multiple particles using a U-net.
+4. distinguishing_particles_in_brightfield_tutorial demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
-The second series focuses on individual topics, introducing them in a natural order.
-1. Introducing how to create simulation pipelines and train models.
-2. Demonstrating data generators.
-3. Demonstrating how to customize models using layer-blocks.
## DeepTrack 2.1 in action
-Additionally, we have seven more case studies which are less documented, but gives additional insight in how to use DeepTrack with real datasets
+Additionally, we have six more case studies which are less documented, but give additional insight into how to use DeepTrack with real datasets:
-1. [MNIST](examples/paper-examples/1-MNIST.ipynb) classifies handwritted digits.
-2. [single particle tracking](examples/paper-examples/2-single_particle_tracking.ipynb) tracks experimentally captured videos of a single particle. (Requires opencv-python compiled with ffmpeg to open and read a video.)
-3. [single particle sizing](examples/paper-examples/3-particle_sizing.ipynb) extracts the radius and refractive index of particles.
-4. [multi-particle tracking](examples/paper-examples/4-multi-molecule-tracking.ipynb) detects quantum dots in a low SNR image.
-5. [3-dimensional tracking](examples/paper-examples/5-inline_holography_3d_tracking.ipynb) tracks particles in three dimensions.
-6. [cell counting](examples/paper-examples/6-cell_counting.ipynb) counts the number of cells in fluorescence images.
-7. [GAN image generation](examples/paper-examples/7-GAN_image_generation.ipynb) uses a GAN to create cell image from masks.
+1. [Single Particle Tracking](examples/paper-examples/2-single_particle_tracking.ipynb) Tracks experimental videos of a single particle. (Requires opencv-python compiled with ffmpeg.)
+2. [Multi-Particle Tracking](examples/paper-examples/4-multi-molecule-tracking.ipynb) Detects quantum dots in a low-SNR image.
+3. [Particle Feature Extraction](examples/paper-examples/3-particle_sizing.ipynb) Extracts the radius and refractive index of particles.
+4. [Cell Counting](examples/paper-examples/6-cell_counting.ipynb) Counts the number of cells in fluorescence images.
+5. [3D Multi-Particle Tracking](examples/paper-examples/5-inline_holography_3d_tracking.ipynb) Tracks particles in three dimensions.
+6. [GAN Image Generation](examples/paper-examples/7-GAN_image_generation.ipynb) Uses a GAN to create cell images from masks.
## Model-specific examples
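For orientation, the tutorials and case studies in the hunk above all follow the same pattern: compose a simulation pipeline, then resolve images (and their ground-truth properties) from it to train a network. A minimal sketch, assuming the standard `deeptrack` feature API (`PointParticle`, `Fluorescence`, `update`/`resolve`); parameter values are purely illustrative:

```python
import numpy as np
import deeptrack as dt

# A fluorescent point emitter with a random position in a 64 x 64 field of view.
particle = dt.PointParticle(
    intensity=100,
    position=lambda: np.random.uniform(8, 56, size=2),
)

# A simulated fluorescence microscope that images the particle.
# The optics parameters below are illustrative, not tuned values.
optics = dt.Fluorescence(
    NA=0.7,
    wavelength=680e-9,
    magnification=10,
    resolution=1e-6,
    output_region=(0, 0, 64, 64),
)

imaged_particle = optics(particle)

# Each update() resamples the random properties, so resolve() yields a new image.
image = imaged_particle.update().resolve()
position_label = image.get_property("position")  # ground truth for training a CNN
```

The getting-started notebooks build on pipelines of this kind to generate training data for the CNN and U-net models they train.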
From 5f6922954ff94a4a71cfe8d059899c08b5cf50e3 Mon Sep 17 00:00:00 2001
From: Alex <95913221+Pwhsky@users.noreply.github.com>
Date: Sat, 7 Dec 2024 19:00:32 +0100
Subject: [PATCH 4/5] headline links now point to the correct headlines
---
README.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/README.md b/README.md
index 366a1d2e6..fe38be095 100644
--- a/README.md
+++ b/README.md
@@ -15,7 +15,7 @@
 Installation •
 Examples •
- Basics •
+ Basics •
 Cite us •
 License
From f9e040f962b4cdbd9cd2287d7d8009396a7c0e03 Mon Sep 17 00:00:00 2001
From: Alex <95913221+Pwhsky@users.noreply.github.com>
Date: Sat, 7 Dec 2024 19:08:53 +0100
Subject: [PATCH 5/5] Links to documentation and rephrased Getting-started guides
---
 README.md | 14 +++++++-------
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index fe38be095..2fbcc3b96 100644
--- a/README.md
+++ b/README.md
@@ -6,7 +6,7 @@
-
+
@@ -91,12 +91,12 @@ Everybody learns in different ways! Depending on your preferences, and what you
## Getting-started guides
-We have two separate series of notebooks which aim to teach you all you need to know to use DeepTrack to its fullest. The first is a set of four notebooks with a focus on applications.
+We have a set of four notebooks that aim to teach you all you need to know to use DeepTrack to its fullest, with a focus on applications.
-1. deeptrack_introduction_tutorial gives an overview of how to use DeepTrack 2.1.
-2. tracking_particle_cnn_tutorial demonstrates how to track a point particle with a convolutional neural network (CNN).
-3. tracking_multiple_particles_unet_tutorial demonstrates how to track multiple particles using a U-net.
-4. distinguishing_particles_in_brightfield_tutorial demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
+1. deeptrack_introduction_tutorial Gives an overview of how to use DeepTrack 2.1.
+2. tracking_particle_cnn_tutorial Demonstrates how to track a point particle with a convolutional neural network (CNN).
+3. tracking_multiple_particles_unet_tutorial Demonstrates how to track multiple particles using a U-net.
+4. distinguishing_particles_in_brightfield_tutorial Demonstrates how to use a U-net to track and distinguish particles of different sizes in brightfield microscopy.
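The U-net referred to in items 3 and 4 above is an encoder-decoder network with skip connections that maps an input image to a per-pixel particle mask. DeepTrack ships its own model constructors in `deeptrack.models`; the block below is only a minimal plain `tf.keras` stand-in with illustrative layer sizes, not the implementation used in the notebooks:

```python
import tensorflow as tf
from tensorflow.keras import layers


def tiny_unet(input_shape=(64, 64, 1)):
    """A deliberately small U-Net: one encoder/decoder level plus a bottleneck."""
    inputs = tf.keras.Input(shape=input_shape)

    # Encoder
    c1 = layers.Conv2D(16, 3, padding="same", activation="relu")(inputs)
    c1 = layers.Conv2D(16, 3, padding="same", activation="relu")(c1)
    p1 = layers.MaxPooling2D()(c1)

    # Bottleneck
    b = layers.Conv2D(32, 3, padding="same", activation="relu")(p1)

    # Decoder with a skip connection from the encoder
    u1 = layers.Conv2DTranspose(16, 2, strides=2, padding="same")(b)
    u1 = layers.Concatenate()([u1, c1])
    c2 = layers.Conv2D(16, 3, padding="same", activation="relu")(u1)

    # One output channel: per-pixel probability that a particle is present
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(c2)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model


model = tiny_unet()
model.summary()
```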
@@ -119,7 +119,7 @@ We also have examples that are specific for certain models. This includes
- [*MAGIK*](examples/MAGIK) for graph-based particle linking and trace characterization.
## Documentation
-The detailed documentation of DeepTrack 2.1 is available at the following link: https://softmatterlab.github.io/DeepTrack2/deeptrack.html
+The detailed documentation of DeepTrack 2.1 is available at the following link: [https://deeptrackai.github.io/DeepTrack2](https://deeptrackai.github.io/DeepTrack2)
## Video Tutorials
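MAGIK, linked in the hunk above and documented in the readme added by PATCH 2, treats particle linking as a graph problem: detections are nodes, candidate edges connect detections that are close in space and time, and the graph neural network classifies which edges are true links before they are stitched into trajectories. A schematic sketch of that input representation in plain numpy, with illustrative thresholds (this is not MAGIK's own data loader):

```python
import numpy as np

# Detections as (frame, x, y) triples, e.g. from an upstream segmentation step.
detections = np.array([
    [0, 10.0, 12.0],
    [0, 40.0, 41.0],
    [1, 11.5, 12.8],
    [1, 39.0, 42.5],
])

MAX_FRAME_GAP = 1    # connect detections at most this many frames apart
MAX_DISTANCE = 5.0   # ...and at most this far apart (pixels); illustrative value

nodes = detections[:, 1:]   # node features: positions
edges = []                  # candidate links (pairs of node indices)
edge_features = []          # e.g. spatial distance and frame gap

for i in range(len(detections)):
    for j in range(i + 1, len(detections)):
        frame_gap = detections[j, 0] - detections[i, 0]
        distance = np.linalg.norm(detections[j, 1:] - detections[i, 1:])
        if 0 < frame_gap <= MAX_FRAME_GAP and distance <= MAX_DISTANCE:
            edges.append((i, j))
            edge_features.append((distance, frame_gap))

# A GNN such as MAGIK scores each candidate edge; edges classified as true
# links are then stitched into trajectories for trace characterization.
print(edges)          # [(0, 2), (1, 3)]
print(edge_features)
```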