The most no-nonsense, locally hosted or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot, but completely free and 100% private.
A single-file tkinter-based Ollama GUI project with no external dependencies.
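As a rough illustration of how such a dependency-free GUI can talk to a locally running Ollama server, here is a minimal Python sketch that uses only the standard library (tkinter, urllib, json); the endpoint and the "llama3" model name are assumptions, not details taken from that project.

```python
import json
import tkinter as tk
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # default local Ollama endpoint
MODEL = "llama3"  # assumed model name; use any model you have pulled locally

def ask(prompt: str) -> str:
    # send a single-turn chat request to the local Ollama server
    payload = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    req = request.Request(OLLAMA_URL, data=payload,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

root = tk.Tk()
root.title("Ollama chat")
entry = tk.Entry(root, width=60)
entry.pack(padx=8, pady=4)
output = tk.Text(root, width=80, height=20)
output.pack(padx=8, pady=4)

def on_send():
    prompt = entry.get()
    entry.delete(0, tk.END)
    output.insert(tk.END, f"You: {prompt}\n")
    output.insert(tk.END, f"Model: {ask(prompt)}\n\n")

tk.Button(root, text="Send", command=on_send).pack(pady=4)
root.mainloop()
```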
Chat with your PDF using a local LLM through the Ollama client.
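A crude sketch of that idea, assuming the pypdf and ollama Python packages and a running local Ollama server: extract the PDF text and pass it to the model in a single prompt. Real projects of this kind usually chunk the document and retrieve only relevant passages; the file and model names below are placeholders.

```python
# pip install pypdf ollama  (and run a local Ollama server)
from pypdf import PdfReader
import ollama

def chat_with_pdf(pdf_path: str, question: str, model: str = "llama3") -> str:
    # extract raw text from every page of the PDF
    reader = PdfReader(pdf_path)
    text = "\n".join(page.extract_text() or "" for page in reader.pages)

    # naive approach: pass the whole document as context in one prompt
    response = ollama.chat(
        model=model,
        messages=[
            {"role": "system",
             "content": "Answer questions using only the provided document."},
            {"role": "user",
             "content": f"Document:\n{text}\n\nQuestion: {question}"},
        ],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    print(chat_with_pdf("paper.pdf", "What is the main conclusion?"))
```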
Ollama with Let's Encrypt Using Docker Compose
A Streamlit chatbot application that integrates multiple language models through the Ollama API, featuring a model management system with an intuitive user interface.
Streamlit chatbot using open-source LLMs served by Ollama.
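The two Streamlit entries above follow a common pattern: keep the running conversation in st.session_state and forward it to Ollama on every turn. A minimal sketch of that pattern, assuming the streamlit and ollama packages and a locally pulled "llama3" model (the listed projects layer model management and other features on top):

```python
# pip install streamlit ollama   # then: streamlit run app.py
import streamlit as st
import ollama

st.title("Ollama chat")

# keep the conversation across Streamlit reruns
if "messages" not in st.session_state:
    st.session_state.messages = []

# replay the history so it stays visible after each rerun
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)

    # send the full history to the local model and show the reply
    reply = ollama.chat(model="llama3",
                        messages=st.session_state.messages)["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)
```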
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
"A simple and lightweight client-server program for interfacing with local LLMs using ollama, and LLMs in groq using groq api."