Welcome to the AI Lab Assistant repository of Tinkerers' Lab, PREC Loni! 🚀 This project is a Spring Boot-based RAG (Retrieval-Augmented Generation) LLM model, designed to assist students and researchers by providing contextual responses based on academic materials, research papers, and lab resources.
- RAG-Powered AI: Combines retrieval-based search with a powerful LLM for accurate and contextual responses.
- Spring Boot Backend: A robust and scalable backend built using Java Spring Boot.
- Vector Database Integration: Uses `pgvector` with PostgreSQL to store and retrieve embeddings efficiently.
- Local Embedding Generation: Utilizes `nomic-embed-text` to generate embeddings without relying on external API keys.
- PDF Processing: Scans and processes PDFs to extract relevant knowledge for AI-assisted answers.
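As a quick illustration of the local embedding step, an embedding can be requested directly from a locally running Ollama instance. A minimal sketch, assuming Ollama's default port 11434 (the application itself calls Ollama from the backend rather than through this exact request):

```bash
# Ask the local Ollama server to embed a test sentence with nomic-embed-text;
# the JSON response contains an "embedding" array of floats.
curl http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "What is retrieval-augmented generation?"}'
```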
├── src/main/java/com/TinkerersLab/LabAssistant   # Backend Spring Boot code
│   ├── config/                                   # Configuration files
│   ├── controller/                               # REST API controllers
│   ├── service/                                  # Business logic
│   ├── model/                                    # Data models
│   ├── util/                                     # Utility functions
│   ├── LabAssistantApplication.java              # Application entry point
│
├── data/                                         # PDF and document storage
├── Dockerfile                                    # Docker setup for containerization
├── README.md                                     # Project documentation
├── compose.yaml                                  # Docker Compose service configuration
├── pom.xml                                       # Project dependency configuration
└── .env                                          # Environment variables
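The `.env` file above holds environment-specific settings such as the database credentials used later during setup. A minimal sketch with illustrative values (the variable names below are assumptions; check the application configuration for the keys it actually reads):

```bash
# .env — example values only; the variable names are assumptions
DB_HOST=localhost
DB_PORT=5432
DB_NAME=labmate
DB_USERNAME=labmate
DB_PASSWORD=change-me
```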
- Java 17+
- Spring Boot 3+
- PostgreSQL with `pgvector`
- Docker (optional, for containerized deployment)
- Ollama
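Because embeddings are generated locally through Ollama, the `nomic-embed-text` model must be available on the machine. A minimal sketch, assuming a default local Ollama installation:

```bash
# Download the local embedding model referenced in the features above
ollama pull nomic-embed-text

# Confirm it appears in the local model list
ollama list
```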
- Clone the Repository
  - `git clone https://github.com/tinkererslab/LabMate-AI.git`
  - `cd LabMate-AI`
- Set Up the Database
  - Install PostgreSQL and enable the `pgvector` extension.
  - Create a new database and update `.env` with the database credentials.
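  A minimal sketch of this step using `psql`, assuming a local PostgreSQL install administered via the `postgres` superuser (the database and user names are illustrative assumptions, not project requirements):

  ```bash
  # Create a dedicated user and database (names are illustrative)
  sudo -u postgres psql -c "CREATE USER labmate WITH PASSWORD 'change-me';"
  sudo -u postgres psql -c "CREATE DATABASE labmate OWNER labmate;"

  # Enable the pgvector extension inside that database
  sudo -u postgres psql -d labmate -c "CREATE EXTENSION IF NOT EXISTS vector;"
  ```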
- Run the Application
  - `./mvnw spring-boot:run` (from inside the LabMate-AI directory), or
  - `docker compose up`
- Add PDF files for context
  - `cp ~/input-context.pdf /opt/docs/.`, or
  - `cp data/input.zip /opt/docs; cd /opt/docs; unzip input.zip`
  - (You can have multiple PDF files at /opt/docs.)
- Access the API
  - The API runs at `http://localhost:8080/api/v1/llm/ask` and `http://localhost:8080/api/v1/llm/askAndStream?query=this+is+my+query`.
  - Add a request body such as `{ "query": "your query" }`.
- Access the UI
  - Visit `http://localhost:8080` in your web browser.
- Fine-tuning the LLM for improved contextual accuracy.
- Integration with Chat UI for a seamless user experience.
- Multi-Document Support to process various academic file formats.
This AI Lab Assistant can be used as a personal AI assistant for any purpose! Simply replace the contents of the input PDF files in the data/ directory with your own documents. The system will process and retrieve information based on your specific files, making it a versatile tool for various applications.
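For example, to point the assistant at your own material (a sketch, assuming PDFs placed in `data/` are picked up on the next run, as described above):

```bash
# Replace the bundled documents with your own PDFs
rm data/*.pdf
cp ~/my-documents/*.pdf data/
```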
We welcome contributions from the community! Feel free to fork this repository, submit pull requests, or open issues.
This project is open-source and available under the MIT License.
🚀 Built with passion at Tinkerers' Lab, PREC Loni. Let's innovate together! 🔥
For any inquiries or feedback, please reach out to us at:
- Email: tl@pravaraengg.org.in
- Maintainer: email, github
- GitHub: Tinkerer's Lab PREC