The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
A single-file tkinter-based Ollama GUI project with no external dependencies.
Ollama load-balancing server | A high-performance, easy-to-configure open-source load balancer optimized for Ollama. It helps improve your application's availability and response times while making efficient use of system resources.
Your AI mate in your favourite terminal.
TalkNexus: Ollama Chatbot Multi-Model & RAG Interface
A simple HTML Ollama chatbot that is easy to install: just copy the HTML file onto your computer and open it in a browser.
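Single-file chatbots like this one typically talk to Ollama's local HTTP API directly from the page. A minimal sketch, assuming Ollama is running on its default port (11434) and a model such as `llama3` has been pulled — both the model name and the endpoint URL below are illustrative, not taken from the linked repository:

```typescript
// Build the JSON body for Ollama's /api/generate endpoint.
// stream: false asks for one complete response instead of a token stream.
function buildGenerateRequest(model: string, prompt: string): string {
  return JSON.stringify({ model, prompt, stream: false });
}

// Send a prompt to a locally running Ollama server and return the reply.
// Assumes Ollama is listening on localhost:11434 (its default).
async function askOllama(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildGenerateRequest("llama3", prompt),
  });
  const data = await res.json();
  return data.response; // the generated text
}
```

Because the endpoint serves plain JSON over HTTP, a static HTML file with a small script like this is enough — no build step or server-side code is required.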
Chat with your PDF using your local LLM via the Ollama client.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension to chat with local LLMs via Ollama, LM Studio, and llama.cpp. Supports streaming, stop/regenerate, RAG, and easy model switching — all without cloud APIs or data leaks.
PuPu is a lightweight, cross-platform desktop AI client that works with both local and cloud-hosted models. Whether you prefer running models on your own machine or connecting to providers like OpenAI and Anthropic, PuPu gives you a unified, elegant interface — your AI, your rules.
Ollama with Let's Encrypt Using Docker Compose
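One common shape for a setup like the one above is a reverse proxy that obtains Let's Encrypt certificates automatically (here Caddy) in front of the Ollama container. A sketch only — the domain, volume names, and choice of proxy are assumptions, and the linked repository's actual Compose file may differ:

```yaml
# Caddy terminates TLS with automatic Let's Encrypt certificates
# and proxies HTTPS traffic to the Ollama container.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-data:/root/.ollama   # persists downloaded models

  caddy:
    image: caddy:2
    # "ollama.example.com" is a placeholder; its DNS must point at this host.
    command: caddy reverse-proxy --from ollama.example.com --to ollama:11434
    ports:
      - "80:80"     # HTTP, used for the ACME challenge
      - "443:443"   # HTTPS
    volumes:
      - caddy-data:/data   # persists issued certificates across restarts

volumes:
  ollama-data:
  caddy-data:
```

Caddy's `reverse-proxy` subcommand requests and renews certificates on its own, so no separate ACME client container is needed in this variant.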
Ollama client for Android.
Ollama GUI for the desktop.
Check whether your password has ever been cracked, assess its strength, and generate passwords using a specialized AI model (StrengthX-Dildo:V1), the Dynamic Intelligent Lock & Defense Operator.
Ollama Web UI is a simple yet powerful web-based interface for interacting with large language models. It offers chat history, voice commands, voice output, model download and management, conversation saving, terminal access, multi-model chat, and more—all in one streamlined platform.
A VS Code extension that lets users select their locally downloaded Ollama models and use them as personal coding agents.
generate content | http://researchusai.com | $10 access | BYOK
Minimal Ollama chat UI - no login, no heavy features.
AI model deployment on Synology NAS and macOS 🧠🐳