
OpenDeepWiki

中文 | English

OpenDeepWiki Logo

AI-Driven Code Knowledge Base

Project Introduction

OpenDeepWiki is an open-source project inspired by DeepWiki, developed using .NET 9 and Semantic Kernel. It aims to help developers better understand and utilize codebases by providing features such as code analysis, documentation generation, and knowledge graph creation.

  • Analyze code structure
  • Understand core concepts of repositories
  • Generate code documentation
  • Automatically create a README.md for the codebase

MCP Support

OpenDeepWiki supports MCP (Model Context Protocol)

  • Each repository can be exposed as its own MCP server, allowing clients to analyze that single repository.

Usage: The following is an example configuration for Cursor:

{
  "mcpServers": {
    "OpenDeepWiki":{
      "url": "http://Your OpenDeepWiki service IP:port/sse?owner=AIDotNet&name=OpenDeepWiki"
    }
  }
}
  • owner: the name of the organization or owner of the repository.
  • name: the name of the repository.
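For illustration, the SSE URL above can also be assembled programmatically. This is a minimal Python sketch, not part of OpenDeepWiki itself; the host and port are placeholders, and only the `owner` and `name` query parameters come from the configuration above:

```python
from urllib.parse import urlencode, urlunparse

def mcp_sse_url(host: str, port: int, owner: str, name: str) -> str:
    """Build the OpenDeepWiki MCP SSE endpoint URL for one repository."""
    query = urlencode({"owner": owner, "name": name})
    return urlunparse(("http", f"{host}:{port}", "/sse", "", query, ""))

print(mcp_sse_url("localhost", 8090, "AIDotNet", "OpenDeepWiki"))
# http://localhost:8090/sse?owner=AIDotNet&name=OpenDeepWiki
```

Using `urlencode` rather than string concatenation ensures owner and repository names containing special characters are escaped correctly.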

After adding the repository, test it by asking a question (note: the repository must be processed first), for example: "What is OpenDeepWiki?"

In this way, OpenDeepWiki can serve as an MCP server that other AI models call on demand, making it easier to analyze and understand an open-source project.

Features

  • Quick Generation: Convert any code repository from GitHub, GitLab, Gitee, Gitea, and similar platforms into a knowledge base in just a few minutes.
  • Multi-language Support: Supports code analysis and documentation generation for all programming languages.
  • Code Structure: Automatically generate Mermaid charts to understand code structure.
  • Custom Models: Supports custom models and custom APIs for extension as needed.
  • AI Intelligent Analysis: AI-based code analysis and understanding of code relationships.
  • Easy SEO: Generate SEO-friendly documents and knowledge bases using Next.js for better search engine indexing.
  • Conversational Interaction: Support for conversational interaction with AI to obtain detailed information and usage methods for code, enabling deeper understanding.

🚀 Quick Start

  1. Clone the repository
git clone https://github.com/AIDotNet/OpenDeepWiki.git
cd OpenDeepWiki
  2. Open the docker-compose.yml file and modify the following environment variables:

OpenAI:

services:
  koalawiki:
    environment:
      - KOALAWIKI_REPOSITORIES=/repositories
      - TASK_MAX_SIZE_PER_USER=5 # Maximum number of parallel AI document generation tasks per user
      - CHAT_MODEL=DeepSeek-V3 # Model must support function calling
      - ANALYSIS_MODEL= # Analysis model used to generate the repository directory structure
      - CHAT_API_KEY= # Your API key
      - LANGUAGE= # Default language for generated documents, e.g. "Chinese"
      - ENDPOINT=https://api.token-ai.cn/v1
      - DB_TYPE=sqlite
      - MODEL_PROVIDER=OpenAI # Model provider; default is OpenAI, also supports AzureOpenAI and Anthropic
      - DB_CONNECTION_STRING=Data Source=/data/KoalaWiki.db
      - EnableSmartFilter=true # Whether intelligent filtering is enabled; affects how the AI obtains the repository's file directory
      - UPDATE_INTERVAL # Incremental repository update interval, in days
      - PARALLEL_COUNT=1 # Number of repositories processed in parallel

AzureOpenAI:

services:
  koalawiki:
    environment:
      - KOALAWIKI_REPOSITORIES=/repositories
      - TASK_MAX_SIZE_PER_USER=5 # Maximum number of parallel AI document generation tasks per user
      - CHAT_MODEL=DeepSeek-V3 # Model must support function calling
      - ANALYSIS_MODEL= # Analysis model used to generate the repository directory structure
      - CHAT_API_KEY= # Your API key
      - LANGUAGE= # Default language for generated documents, e.g. "Chinese"
      - ENDPOINT=https://your-azure-address.openai.azure.com/
      - DB_TYPE=sqlite
      - MODEL_PROVIDER=AzureOpenAI # Model provider; default is OpenAI, also supports AzureOpenAI and Anthropic
      - DB_CONNECTION_STRING=Data Source=/data/KoalaWiki.db
      - EnableSmartFilter=true # Whether intelligent filtering is enabled; affects how the AI obtains the repository's file directory
      - UPDATE_INTERVAL # Incremental repository update interval, in days
      - PARALLEL_COUNT=1 # Number of repositories processed in parallel

Anthropic:

services:
  koalawiki:
    environment:
      - KOALAWIKI_REPOSITORIES=/repositories
      - TASK_MAX_SIZE_PER_USER=5 # Maximum number of parallel AI document generation tasks per user
      - CHAT_MODEL=DeepSeek-V3 # Model must support function calling
      - ANALYSIS_MODEL= # Analysis model used to generate the repository directory structure
      - CHAT_API_KEY= # Your API key
      - LANGUAGE= # Default language for generated documents, e.g. "Chinese"
      - ENDPOINT=https://api.anthropic.com/
      - DB_TYPE=sqlite
      - MODEL_PROVIDER=Anthropic # Model provider; default is OpenAI, also supports AzureOpenAI and Anthropic
      - DB_CONNECTION_STRING=Data Source=/data/KoalaWiki.db
      - EnableSmartFilter=true # Whether intelligent filtering is enabled; affects how the AI obtains the repository's file directory
      - UPDATE_INTERVAL # Incremental repository update interval, in days
      - PARALLEL_COUNT=1 # Number of repositories processed in parallel
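A missing CHAT_API_KEY or a mistyped MODEL_PROVIDER is an easy misconfiguration to make across these three variants. The sketch below is an illustrative pre-flight check, not part of OpenDeepWiki; the variable names match the compose files above, and the set of required variables is an assumption:

```python
import os

# Variables assumed required for any provider (per the compose files above)
REQUIRED = ["KOALAWIKI_REPOSITORIES", "CHAT_MODEL", "CHAT_API_KEY", "ENDPOINT"]
PROVIDERS = {"OpenAI", "AzureOpenAI", "Anthropic"}

def check_env(env: dict) -> list:
    """Return a list of problems found in the environment mapping."""
    problems = [f"missing {key}" for key in REQUIRED if not env.get(key)]
    provider = env.get("MODEL_PROVIDER", "OpenAI")  # OpenAI is the documented default
    if provider not in PROVIDERS:
        problems.append(f"unknown MODEL_PROVIDER: {provider}")
    return problems

print(check_env(dict(os.environ)))
```

Running such a check before `docker-compose up` surfaces configuration gaps immediately instead of as AI request failures at runtime.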

💡 How to get an API Key: obtain one from the console of your chosen model provider (OpenAI, Azure OpenAI, or Anthropic).

  3. Start the service

You can use the provided Makefile commands to easily manage the application:

# Build all Docker images
make build

# Start all services in background mode
make up

# Or start in development mode (with logs visible)
make dev

Then visit http://localhost:8090 to access the knowledge base.

For more commands:

make help

For Windows Users (without make)

If you're using Windows and don't have make available, you can use these Docker Compose commands directly:

# Build all Docker images
docker-compose build

# Start all services in background mode
docker-compose up -d

# Start in development mode (with logs visible)
docker-compose up

# Stop all services
docker-compose down

# View logs
docker-compose logs -f

For building specific architectures or services, use:

# Build only backend
docker-compose build koalawiki

# Build only frontend
docker-compose build koalawiki-web

# Build with architecture parameters
docker-compose build --build-arg ARCH=arm64
docker-compose build --build-arg ARCH=amd64

🔍 How It Works

OpenDeepWiki uses AI to:

  • Clone the code repository locally
  • Analyze the repository's README.md
  • Analyze the code structure and read code files as needed, then generate directory JSON data
  • Process tasks according to the directory; each task produces one document
  • Read and analyze code files, generate code documentation, and create Mermaid charts representing code structure dependencies
  • Generate the final knowledge base document
  • Analyze the repository through conversational interaction and respond to user inquiries

graph TD
    A[Clone code repository] --> B[Analyze README.md]
    B --> C[Analyze code structure]
    C --> D[Generate directory json data]
    D --> E[Process multiple tasks]
    E --> F[Read code files]
    F --> G[Analyze code files]
    G --> H[Generate code documentation]
    H --> I[Create Mermaid charts]
    I --> J[Generate knowledge base document]
    J --> K[Conversational interaction]
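The pipeline above can also be sketched as an ordered sequence of stages. The function below is an illustrative stand-in for OpenDeepWiki's internal processing, not its actual code; only the stage order comes from the workflow described above:

```python
def pipeline_stages() -> list:
    """Return the documentation pipeline stages in execution order."""
    stages = [
        "clone repository",
        "analyze README.md",
        "analyze code structure",
        "generate directory JSON data",
        "process per-directory tasks",
        "read and analyze code files",
        "generate documentation and Mermaid charts",
        "assemble knowledge base document",
    ]
    # In the real system each stage invokes the configured AI model;
    # here we only record the order of operations.
    return [f"{i + 1}. {stage}" for i, stage in enumerate(stages)]

for line in pipeline_stages():
    print(line)
```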

Advanced Configuration

Environment Variables

  • KOALAWIKI_REPOSITORIES Path for storing cloned repositories
  • TASK_MAX_SIZE_PER_USER Maximum number of parallel AI document generation tasks per user
  • CHAT_MODEL Chat model; must support function calling
  • ENDPOINT API endpoint
  • ANALYSIS_MODEL Analysis model used to generate the repository directory structure
  • CHAT_API_KEY Your API key
  • LANGUAGE Language of the generated documents
  • DB_TYPE Database type; default is sqlite
  • MODEL_PROVIDER Model provider; default is OpenAI, also supports AzureOpenAI and Anthropic
  • DB_CONNECTION_STRING Database connection string
  • EnableSmartFilter Whether intelligent filtering is enabled; affects how the AI obtains the repository's file directory
  • UPDATE_INTERVAL Incremental repository update interval, in days
  • PARALLEL_COUNT Number of repositories processed in parallel
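Several of these variables are numeric and optional, so any consumer must fall back to a default when they are unset or malformed. An illustrative Python sketch of that parsing (the PARALLEL_COUNT default of 1 comes from the compose files above; the UPDATE_INTERVAL fallback of 7 days is an assumption, not a documented default):

```python
import os

def int_env(env: dict, key: str, default: int) -> int:
    """Read an integer variable, falling back to `default` when missing or invalid."""
    raw = env.get(key, "").strip()
    try:
        return int(raw)
    except ValueError:
        return default

env = dict(os.environ)
parallel_count = int_env(env, "PARALLEL_COUNT", 1)    # repositories processed in parallel
update_interval = int_env(env, "UPDATE_INTERVAL", 7)  # assumed fallback, in days
print(parallel_count, update_interval)
```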

Build for Different Architectures

The Makefile provides commands to build for different CPU architectures:

# Build for ARM architecture
make build-arm

# Build for AMD architecture
make build-amd

# Build only backend for ARM
make build-backend-arm

# Build only frontend for AMD
make build-frontend-amd

WeChat

(QR code image)

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

Star History

Star History Chart
