This repo contains a pre-architected, production-ready LLM application that can be easily deployed to the cloud, since the frontend, backend, and Ollama LLM service each run in independent containers.
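As a rough illustration of that three-container layout, a `docker-compose` file might look like the sketch below. All service names, ports, paths, and environment variables here are illustrative assumptions, not the repo's actual configuration:

```yaml
# Hypothetical compose sketch of the three independent containers.
# Names, ports, and build paths are assumptions for illustration only.
services:
  frontend:
    build: ./frontend        # UI container, talks to the backend API
    ports:
      - "3000:3000"
    depends_on:
      - backend
  backend:
    build: ./backend         # API layer that forwards prompts to Ollama
    ports:
      - "8000:8000"
    environment:
      - OLLAMA_HOST=http://ollama:11434
    depends_on:
      - ollama
  ollama:
    image: ollama/ollama     # official Ollama image; serves models on 11434
    ports:
      - "11434:11434"
    volumes:
      - ollama_data:/root/.ollama   # persist pulled models across restarts
volumes:
  ollama_data:
```

Because each service is its own container, any one of them can be scaled, replaced, or redeployed to a cloud host without touching the other two.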