AI · Automotive · ML · LLM
Undergraduate Co-op Thesis - LLM-Based In-Cabin Comfort System
Scalable local LLM-based reasoning system for real-time in-cabin comfort prediction and entertainment suggestions using multi-modal sensor data.

The Problem
Traditional automotive systems lack intelligent, context-aware recommendations for passenger comfort and entertainment.
The Solution
Designed and integrated an agentic Retrieval-Augmented Generation (RAG) pipeline in Python with Ollama, enabling on-device inference that generates actionable, context-aware recommendations.
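A minimal sketch of what such a local RAG query loop can look like, assuming the `ollama` Python client and a hypothetical `retrieve_context` helper standing in for a vector-database lookup; the model name, prompts, and helper are illustrative, not the thesis implementation.

```python
# Illustrative local RAG loop: retrieve context, then ask a locally served model.
import ollama


def retrieve_context(query: str, top_k: int = 3) -> list[str]:
    """Hypothetical retriever: a real system would embed the query and search
    a local vector database for relevant comfort/entertainment snippets."""
    return ["Cabin temperature above 26 C is typically rated as uncomfortable."]


def recommend(sensor_summary: str) -> str:
    context = "\n".join(retrieve_context(sensor_summary))
    response = ollama.chat(
        model="llama3",  # any locally pulled Ollama model
        messages=[
            {
                "role": "system",
                "content": "You suggest in-cabin comfort and entertainment actions. "
                           "Use only the provided context.",
            },
            {
                "role": "user",
                "content": f"Context:\n{context}\n\nSensor summary: {sensor_summary}",
            },
        ],
    )
    return response["message"]["content"]


if __name__ == "__main__":
    print(recommend("cabin temp 28 C, humidity 60%, driver heart rate elevated"))
```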
Impact & Results
Deployed on the Cruden driving simulator with context-aware memory and vector databases for scalable backend performance (a toy sketch of the context-memory idea follows this list)
Real-time comfort prediction from multi-modal sensor data
Agentic RAG for on-device inference
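To illustrate the context-aware memory idea, here is a small self-contained sketch: past sensor snapshots and recommendations are embedded (with a toy hashing embedder standing in for a real embedding model) and recalled by cosine similarity. The class and function names are hypothetical; the actual system uses a proper vector database and embedding model.

```python
# Toy in-memory "vector store" for context-aware memory (illustrative only).
import hashlib
import numpy as np


def toy_embed(text: str, dim: int = 64) -> np.ndarray:
    """Stand-in embedder: hashes character trigrams into a fixed-size vector.
    A real system would call an embedding model instead."""
    vec = np.zeros(dim)
    for i in range(len(text) - 2):
        h = int(hashlib.md5(text[i:i + 3].encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


class ContextMemory:
    """Stores past sensor snapshots / recommendations and retrieves the most
    similar ones to condition the next LLM prompt."""

    def __init__(self) -> None:
        self.entries: list[tuple[str, np.ndarray]] = []

    def add(self, text: str) -> None:
        self.entries.append((text, toy_embed(text)))

    def recall(self, query: str, top_k: int = 2) -> list[str]:
        q = toy_embed(query)
        scored = sorted(self.entries, key=lambda e: -float(q @ e[1]))
        return [text for text, _ in scored[:top_k]]


memory = ContextMemory()
memory.add("Driver preferred a cooler cabin and ambient music on the evening commute.")
memory.add("Passenger asked for podcast suggestions during highway driving.")
print(memory.recall("evening drive, cabin feels warm"))
```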
Tech Stack
Python · Ollama · RAG · Vector Databases · Multi-modal Sensors · Agno · LangChain · Hugging Face