🛰️ PRESTO: Predictive Residual Error Spatio-Temporal Orchestrator


PRESTO is a novel deep transfer learning framework designed to forecast satellite ephemeris and clock errors for the NavIC (Navigation with Indian Constellation) system.

Built to solve the challenge of extreme data scarcity (training on only 7 days of history), PRESTO successfully generates stable 24-hour forecasts where the error residuals conform to a near-perfect normal distribution—the gold standard for GNSS error modeling.

Note: This project was developed as a solution to SIH 2025 PS ID SIH25176.


🎯 The Core Objective: The Quest for Normality

In satellite navigation, minimizing the Mean Squared Error (MSE) isn't enough. The ultimate goal is to strip away all systematic, predictable error components (trends, seasonality, orbital perturbations) until only pure, random noise remains.

Our Success Metric: A prediction distribution that is statistically indistinguishable from a Gaussian Normal Distribution.

| The Goal | The Result (PRESTO-GEO) |
| --- | --- |
| Eliminate systematic bias | $\mu \approx 0.00$ (Zero Mean) |
| Normalize residuals | Shapiro-Wilk $p > 0.05$ (Passed Normality Test) |

*Distribution Plot: the distribution of PRESTO's out-of-sample predictions (blue) aligns perfectly with the theoretical normal curve (black dashed).*
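As a minimal sketch of how this success metric can be checked (the `residuals` array below is synthetic illustration data standing in for PRESTO's actual prediction residuals):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in residuals: pure Gaussian noise, i.e. what PRESTO aims to leave behind
residuals = rng.normal(loc=0.0, scale=0.05, size=500)

mean_bias = residuals.mean()              # ~0 if systematic bias is eliminated
stat, p_value = stats.shapiro(residuals)  # H0: the sample is normally distributed

print(f"mean = {mean_bias:.4f}, Shapiro-Wilk p = {p_value:.3f}")
```

A $p$-value above 0.05 means we fail to reject normality, which is the success criterion in the table above.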


🧠 The Architecture

PRESTO is not a monolithic model. It is a hybrid "divide and conquer" system designed to handle specific aspects of the satellite signal:

  1. Spatio-Temporal Extraction (GNN): We model the 4 error channels (x, y, z, clock) as nodes in a graph. A Graph Attention Network (GAT) learns the latent physical dependencies (e.g., how orbital drag affects clock drift).
  2. Semiparametric Decomposition: Satellite errors are a composite of slow drifts and fast noise. We use a quadratic model to isolate and subtract the long-term trend.
  3. Residual Forecasting (Autoformer): The remaining high-frequency signal is complex and chaotic. We use an Autoformer with Auto-Correlation mechanisms to discover periodicity and forecast these residuals.
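The decomposition step (2) can be sketched as follows, under the assumption that the slow drift is quadratic in time; the signal here is synthetic and the variable names are illustrative, not PRESTO's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(672)                    # 7 days of 15-min samples
drift = 1e-5 * t**2 - 3e-3 * t + 0.2  # slow quadratic drift (synthetic)
noise = rng.normal(0, 0.05, t.size)   # fast stochastic component
error = drift + noise                 # composite error channel (e.g. clock)

# Fit and subtract the quadratic trend; only the residual is
# passed on to the Autoformer for forecasting
coeffs = np.polyfit(t, error, deg=2)
trend = np.polyval(coeffs, t)
residual = error - trend

print(f"residual mean = {residual.mean():.4f}, std = {residual.std():.4f}")
```

Because the fit includes a constant term, the residual mean is driven to zero by construction; any remaining structure is periodic or stochastic, which is exactly what the Auto-Correlation mechanism is built to exploit.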

🚀 The 3-Stage Training Strategy

Training a Transformer from scratch on only 145 real data points is infeasible. We solved this with a novel three-stage transfer learning pipeline:

  1. Synthetic Data Generation: We fine-tuned a Large Time-series Model (TimeR-XL) on the cleaned ISRO data to generate a synthetic dataset 100x larger than the original, capturing the unique statistical "dialect" of the satellites.
  2. Pre-Training: PRESTO is trained from scratch on this massive synthetic dataset, learning generalized error dynamics without overfitting.
  3. Fine-Tuning: The model is finally refined on the real 7-day dataset using K-Fold Cross-Validation.
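Stage 3's K-fold loop can be sketched as below. The "model" here is a trivial placeholder (predicting the training-set mean), not PRESTO itself; it only illustrates the cross-validation mechanics on a 145-sample dataset:

```python
import numpy as np
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.normal(size=(145, 4))  # 7-day real dataset: 4 error channels
y = rng.normal(size=145)

kf = KFold(n_splits=5, shuffle=False)  # no shuffle: preserve temporal order
fold_scores = []
for train_idx, val_idx in kf.split(X):
    # Placeholder "model": predict the training-set mean for every sample
    y_hat = np.full(val_idx.size, y[train_idx].mean())
    mse = np.mean((y[val_idx] - y_hat) ** 2)
    fold_scores.append(mse)

print(f"mean CV MSE across {len(fold_scores)} folds: {np.mean(fold_scores):.3f}")
```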

📂 Repository Structure

```
PRESTO/
├── Datasets/
│   ├── Cleaned_Datasets/      # Preprocessed 15-min interval data
│   ├── Synthetic_Datasets/    # 100x augmented data generated by TimeR-XL
│   └── Forecasted_Datasets/   # Final 8th-day predictions
├── Notebooks/
│   ├── Data_Preprocessing.ipynb        # Resampling, interpolation, outlier fix
│   ├── Synthetic_Data_Generation.ipynb # Fine-tuning TimeR-XL
│   ├── PRESTO_Pre_Train.ipynb          # Training on synthetic data
│   └── PRESTO_Fine_Tune.ipynb          # Transfer learning to real ISRO data
├── Plots/                     # Visualizations of results and analysis
├── PRESTO_Weights/            # Saved model states (.pth)
└── PRESTO_Technical_Deep_Dive.pdf   # Full technical report
```

📊 Performance Visuals

1. Data Cleaning

Raw satellite data is noisy and leptokurtic (heavy-tailed). Our preprocessing pipeline regularizes this without destroying signal integrity.
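A hedged sketch of the kind of cleaning the `Data_Preprocessing.ipynb` notebook describes (15-min resampling, interpolation, and an outlier fix); the IQR clipping rule and all data below are illustrative assumptions, and the notebook's exact logic may differ:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
idx = pd.date_range("2025-01-01", periods=200, freq="10min")
s = pd.Series(rng.normal(0, 0.05, idx.size), index=idx)
s.iloc[50] = 5.0          # inject a heavy-tailed outlier
s.iloc[120:123] = np.nan  # and a short gap

# Clip outliers beyond 1.5*IQR, resample to 15-min, interpolate gaps
q1, q3 = s.quantile([0.25, 0.75])
iqr = q3 - q1
clean = s.clip(lower=q1 - 1.5 * iqr, upper=q3 + 1.5 * iqr)
clean = clean.resample("15min").mean().interpolate(method="time")

print(f"{clean.isna().sum()} missing values after cleaning")
```

Clipping rather than dropping outliers keeps the time grid regular, which matters because both the GNN and the Autoformer assume evenly spaced inputs.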

2. 8th Day Forecasting

PRESTO generates physically plausible, stable forecasts for the unseen 8th day with well-calibrated confidence intervals.


💻 Usage & Requirements

Note on Dependencies:
This project does not use a strictly pinned requirements.txt. Instead, all necessary libraries are installed directly within the first cells of the Jupyter Notebooks using !pip install.

Core Libraries Used:

  • torch (PyTorch)
  • torch_geometric (GNN layers)
  • OpenLTM (for TimeR-XL)
  • scikit-learn, pandas, numpy, scipy

How to Run:

  1. Clone the repository.
  2. Navigate to the Notebooks/ folder.
  3. Run the notebooks in sequential order (Data_Preprocessing -> Synthetic_Data_Generation -> PRESTO_Pre_Train -> PRESTO_Fine_Tune).

👥 Authors

This project was a collaborative research effort.

  • Satyam Jaiswal$^*$

  • Vishisht Mishra$^*$

*These authors contributed equally to this work.


📜 License

This project is licensed under the Creative Commons Attribution-NonCommercial 4.0 International License.

See the LICENSE file for details.
