Particle Tracking for Soft Matter offers a modular set of Jupyter notebooks to help experimentalists and researchers analyze microscopy data with minimal coding effort. The tutorials are designed to be accessible, with well-structured utility modules and clear workflows.
This repository supports the full pipeline for particle tracking:
- Detection – Identify and localize particles using both classical and deep learning methods.
- Linking – Associate particles across time frames to reconstruct trajectories.
- Simulation – Generate realistic datasets for training and evaluation (see the sketch after this list).
- Evaluation – Compare predictions to ground truth using tracking metrics.
- Visualization – Animate and inspect results interactively.
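For a flavor of the simulation stage, here is a minimal NumPy-only sketch that renders Brownian particles as Gaussian spots and returns both the frames and the ground-truth positions. The tutorials build richer training data (deeptrack is among the dependencies), so the function name and its parameters are illustrative placeholders rather than part of the toolkit.

```python
import numpy as np

def simulate_frames(n_particles=10, n_frames=50, size=128, sigma=2.0,
                    diffusion=1.0, noise=0.05, seed=0):
    """Toy simulator: Brownian particles rendered as Gaussian spots.

    Returns (frames, positions) with shapes (n_frames, size, size) and
    (n_frames, n_particles, 2). Placeholder example, not the toolkit's API.
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, size, size=(n_particles, 2))
    yy, xx = np.mgrid[0:size, 0:size]

    frames, positions = [], []
    for _ in range(n_frames):
        pos = pos + rng.normal(0, diffusion, size=pos.shape)  # Brownian step
        pos = np.clip(pos, 0, size - 1)                       # keep particles in frame
        img = np.zeros((size, size))
        for y, x in pos:
            img += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma ** 2))
        img += rng.normal(0, noise, img.shape)                # camera noise
        frames.append(img)
        positions.append(pos.copy())
    return np.array(frames), np.array(positions)

frames, ground_truth = simulate_frames()
```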
The tutorials are organized into two main parts: Detection & Localization and Linking.
You will apply and compare several detection strategies:
- Thresholding & Connected Components – Simple and fast.
- Crocker–Grier (TrackPy) – Classical approach to particle tracking.
- U-Net – Supervised deep learning for segmentation.
- LodeSTAR – Unsupervised deep learning for subpixel localization.
Each method is benchmarked on simulated data and then applied to experimental datasets.
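To make the first two methods concrete, here is a minimal sketch: Otsu thresholding plus connected-component labelling with scikit-image, and the Crocker–Grier locator from TrackPy. The synthetic frame, `diameter`, and `minmass` values are placeholders; the tutorials tune these per dataset.

```python
import numpy as np
import trackpy as tp
from skimage import filters, measure

# Tiny synthetic frame: two Gaussian spots on a weak noisy background.
yy, xx = np.mgrid[0:64, 0:64]
frame = (np.exp(-((yy - 20) ** 2 + (xx - 30) ** 2) / 8)
         + np.exp(-((yy - 45) ** 2 + (xx - 12) ** 2) / 8)
         + 0.02 * np.random.default_rng(0).normal(size=(64, 64)))

# 1) Thresholding + connected components (scikit-image).
mask = frame > filters.threshold_otsu(frame)
labels = measure.label(mask)
centroids_cc = np.array([r.centroid for r in measure.regionprops(labels)])  # (row, col)

# 2) Crocker–Grier via TrackPy; `diameter` must be odd and roughly the spot size.
features = tp.locate(frame, diameter=7, minmass=1)  # DataFrame with 'x', 'y', 'mass', ...
centroids_cg = features[["y", "x"]].to_numpy()
```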
Tutorials:
Associate localized particles across frames to reconstruct trajectories using the following approaches (a minimal linking sketch appears after the list):
- Nearest-neighbor linking (TrackPy).
- Linear Assignment Problem (LAP) using the Hungarian algorithm (LapTrack).
- Graph-based deep learning linker (MAGIK, via deeplay).
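As a hedged illustration of the first two linkers, the snippet below links a toy detection table with TrackPy and with LapTrack. The search range, memory, and cost cutoff are placeholder values, and the LapTrack call follows its documented `predict_dataframe` interface at the time of writing; check the docs for your installed version. The MAGIK/deeplay linker needs a trained model and is left to the tutorial.

```python
import pandas as pd
import trackpy as tp
from laptrack import LapTrack

# Toy detections: two particles drifting over three frames.
detections = pd.DataFrame({
    "frame": [0, 0, 1, 1, 2, 2],
    "x":     [10.0, 40.0, 11.0, 41.5, 12.2, 42.8],
    "y":     [10.0, 30.0, 10.5, 29.0, 11.1, 28.3],
})

# TrackPy: nearest-neighbor linking within a search radius, with short memory
# for particles that briefly disappear; adds a 'particle' (trajectory) column.
linked_tp = tp.link(detections, search_range=5, memory=1)

# LapTrack: frame-to-frame Linear Assignment Problem solved with the Hungarian
# algorithm; the cutoff is a squared distance (placeholder value).
lt = LapTrack(track_cost_cutoff=5 ** 2)
track_df, split_df, merge_df = lt.predict_dataframe(
    detections, coordinate_cols=["x", "y"]
)
```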
Tutorial:
- Clone the repository:

```bash
git clone https://github.com/softmatterlab/ParticleTracking.git
cd ParticleTracking
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Launch the tutorials:

```bash
jupyter lab  # or jupyter notebook
```

The tutorials rely on the utility modules stored in the `utils/` folder.
To run the tutorials on Google Colab, clone the repository from a notebook cell:

```python
!git clone https://github.com/softmatterlab/ParticleTracking.git
%cd ParticleTracking
```

Upload a zipped copy of `utils/`:

```python
from google.colab import files
uploaded = files.upload()  # Upload utils.zip
```

Unzip it:

```python
!unzip utils.zip -d .
```

Then import as usual:

```python
from utils import detection_utils, tracking_utils, video_utils
```

Core libraries:
`numpy`, `scipy`, `matplotlib`, `scikit-image`, `torch`, `trackpy`, `laptrack`, `deeptrack`, `deeplay`
See requirements.txt for full details.
This project is licensed under the MIT License.
If you use this toolkit for your research, please cite:
(BibTeX and citation information coming soon)