Overview:
Khoj is an open-source personal AI application designed to extend user capabilities through a local or cloud AI assistant. It connects users to various online and local large language models (LLMs) for chat and can retrieve answers from both the internet and personal documents. Its features are accessible via a browser, desktop app, mobile phone, or platforms such as Obsidian and Emacs, making Khoj suitable for individuals or teams seeking a customizable, self-hostable AI that handles research, document search, and automated notifications. It is positioned as a scalable tool that runs anywhere from a private computer to an enterprise deployment.
Core Features:
Multi-LLM chat: Connects with local and online LLMs, including Llama 3, GPT, Gemini, Claude, and Mistral.
Document search and Q&A: Performs semantic search across user documents (PDF, markdown, Notion, Word, org-mode) to answer queries.
Agent creation: Users can create custom AI agents with specific knowledge, identity, chat models, and tools.
Research automation: Automates repetitive research and can deliver personalized news and smart notifications to email.
Multi-platform access: Available via browser, Obsidian, Emacs, desktop, phone, and WhatsApp.
Image generation and voice: Supports generating images, voice output, and playing messages.
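The document search feature above rests on a standard retrieval loop: embed documents and the query as vectors, then rank documents by similarity to the query. A toy sketch of that loop follows, using bag-of-words count vectors as a stand-in for the learned sentence embeddings a system like Khoj actually uses; all names and sample documents here are illustrative.

```python
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy embedding: a sparse bag-of-words count vector.
    Real semantic search uses learned dense embeddings instead."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
        math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def search(query: str, docs: list[str]) -> str:
    """Return the document ranked most similar to the query."""
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))


docs = [
    "Meeting notes from the March planning session",
    "Recipe for sourdough bread with a long fermentation",
    "Tax documents and receipts for the 2023 filing",
]
print(search("bread recipe", docs))
# The sourdough document shares the most query terms, so it ranks first.
```

Swapping the count vectors for dense embeddings from a sentence-encoder model turns this same ranking loop into true semantic search, where paraphrases match even without shared words.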
Use Cases:
Personal research assistant: Using an AI agent to automate repetitive research and receive relevant newsletters directly in email.
Document retrieval: Finding relevant information from a personal collection of PDFs, Markdown files, and other documents using advanced semantic search.
On-device AI: Running a private, self-hosted AI on a personal computer for chatting and document interaction without cloud dependency.
Enterprise AI deployment: Deploying the AI assistant on-premises or as a hybrid solution for organizational use.
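For the self-hosted use cases above, a minimal setup sketch might look like the following. The PyPI package name and the `--anonymous-mode` flag are assumptions based on the project's published docs; check the official Khoj documentation for the current install path before relying on them.

```shell
# Install the Khoj server (package name assumed; verify against the official docs)
pip install khoj

# Start the server locally, skipping account setup for a single-user machine
# (flag assumed from the project's self-hosting docs)
khoj --anonymous-mode
```

The web interface is then served on localhost, where documents can be indexed and chat models configured; a Docker-based setup is the usual alternative for server deployments.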
Why It Matters:
Khoj provides a practical, open-source option for individuals and organizations that want a personal AI assistant with self-hosting capabilities. It avoids reliance on a single cloud provider by supporting local and online LLMs from multiple developers. Its flexible design allows operation on a personal computer, a cloud service, or an on-premises server, offering choice without locking users into a single ecosystem or vendor.
