Conversational RAG interface for openBIS docs. Uses LangGraph for memory, Ollama for chat models, and integrates processed data from ragBIS. Includes CLI and web UI for natural-language queries with persistent context.

openBIS chatbot - Conversational RAG Interface for openBIS Documentation

The openBIS chatbot is a standalone Python package that provides a conversational interface for querying openBIS documentation using Retrieval-Augmented Generation (RAG), with conversation memory powered by LangGraph.

Features

  • Conversational Interface: Chat with openBIS documentation using natural language
  • Memory Persistence: Maintains conversation context across multiple interactions
  • Multiple Interfaces: Both CLI and web interfaces available
  • RAG Integration: Uses processed data from ragBIS for accurate responses
  • LangGraph Architecture: Advanced conversation flow with state management
  • Web Interface: Modern, responsive web UI with markdown rendering

Prerequisites

  • Python 3.8 or higher
  • Ollama installed and running
  • The qwen3 model installed in Ollama (or another compatible model)
  • Processed data from ragBIS

Installing Ollama and Required Models

  1. Install Ollama from https://ollama.ai/
  2. Pull the required chat model:
    ollama pull qwen3

Getting Processed Data

The openBIS chatbot requires processed data generated by ragBIS. You have two options:

  1. Run ragBIS first (recommended):

    # In ragBIS project directory
    python -m ragbis
    # Then copy the data directory to the openBIS chatbot project
  2. Point to existing data: Use the --data-dir option to specify the location of processed data

Installation

From Source

  1. Clone or download this project
  2. Navigate to the project directory
  3. Create a virtual environment (recommended):
    python -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  4. Install dependencies:
    pip install -r requirements.txt

Using pip (if published)

pip install chatbis

Usage

Quick Start

  1. Ensure you have processed data in ./data/processed/ (from ragBIS)
  2. Run the CLI interface:
    python -m chatbis
  3. Or run the web interface:
    python -m chatbis --web

Command Line Interface (CLI)

Start an interactive chat session:

python -m chatbis

The CLI supports these commands:

  • Type your questions naturally
  • clear - Clear conversation history
  • quit, exit, or bye - Exit the application

Web Interface

Start the web server:

python -m chatbis --web

Then open your browser to http://localhost:5000

Command Line Options

python -m chatbis --help

Available options:

  • --data-dir DIR: Directory containing processed data (default: ./data/processed)
  • --model MODEL: Ollama model to use (default: qwen3)
  • --web: Run web interface instead of CLI
  • --host HOST: Host for web interface (default: 127.0.0.1)
  • --port PORT: Port for web interface (default: 5000)
  • --debug: Enable debug mode for web interface
  • --top-k N: Number of chunks to retrieve for CLI (default: 5)
  • --verbose: Enable verbose logging
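The option list above could be declared with argparse roughly as follows. This is a hypothetical sketch of the parser, not the actual chatbis source; only the flag names and defaults are taken from the list above.

```python
import argparse

# Hypothetical reconstruction of the chatbis argument parser;
# flags and defaults mirror the documented options.
parser = argparse.ArgumentParser(prog="chatbis")
parser.add_argument("--data-dir", default="./data/processed")
parser.add_argument("--model", default="qwen3")
parser.add_argument("--web", action="store_true")
parser.add_argument("--host", default="127.0.0.1")
parser.add_argument("--port", type=int, default=5000)
parser.add_argument("--debug", action="store_true")
parser.add_argument("--top-k", type=int, default=5)
parser.add_argument("--verbose", action="store_true")

# Parse the "web interface on port 8080" example from the next section.
args = parser.parse_args(["--web", "--port", "8080"])
print(args.web, args.port, args.model)
```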

Examples

Run with custom data directory:

python -m chatbis --data-dir /path/to/processed/data

Run web interface on all interfaces:

python -m chatbis --web --host 0.0.0.0 --port 8080

Use a different Ollama model:

python -m chatbis --model llama2

Run with verbose logging:

python -m chatbis --verbose

Data Requirements

The openBIS chatbot expects the following file structure in the data directory:

data/processed/
├── chunks.json           # Required: Main data file with embeddings
└── conversation_memory.db # Created automatically for conversation memory

The chunks.json file should contain processed chunks from ragBIS with the following structure:

[
  {
    "title": "Page Title",
    "url": "https://...",
    "content": "Chunk content...",
    "embedding": [0.1, 0.2, ...],
    "chunk_id": "unique_id"
  }
]
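As a quick sanity check on a chunks.json file, you could verify that every chunk carries the required fields. validate_chunks is a hypothetical helper written for illustration, not part of the package.

```python
import json

# Required fields for each chunk, per the structure documented above.
REQUIRED_KEYS = {"title", "url", "content", "embedding", "chunk_id"}

def validate_chunks(chunks):
    """Return (index, missing_keys) pairs for malformed chunks."""
    problems = []
    for i, chunk in enumerate(chunks):
        missing = REQUIRED_KEYS - chunk.keys()
        if missing:
            problems.append((i, sorted(missing)))
    return problems

# A minimal well-formed example in the documented shape.
sample = json.loads("""
[
  {"title": "Page Title", "url": "https://example.org",
   "content": "Chunk content...", "embedding": [0.1, 0.2],
   "chunk_id": "unique_id"}
]
""")
print(validate_chunks(sample))  # an empty list means the file is well-formed
```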

Features

Conversation Memory

The openBIS chatbot maintains conversation context using LangGraph's SQLite checkpointer:

  • Remembers previous questions and answers
  • Maintains context across multiple interactions
  • Supports multiple concurrent sessions (web interface)
  • Persistent storage across application restarts
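The persistence itself is handled internally by LangGraph's SQLite checkpointer. The stdlib sqlite3 sketch below only illustrates the underlying idea — per-session conversation turns surviving in a database — and does not use the LangGraph API; the table name and helper functions are invented for illustration.

```python
import sqlite3

# Conceptual sketch only: shows how turns can persist per session.
# The chatbot itself uses a file-backed DB (conversation_memory.db).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS turns (session_id TEXT, role TEXT, text TEXT)"
)

def save_turn(session_id, role, text):
    conn.execute("INSERT INTO turns VALUES (?, ?, ?)", (session_id, role, text))
    conn.commit()

def history(session_id):
    rows = conn.execute(
        "SELECT role, text FROM turns WHERE session_id = ? ORDER BY rowid",
        (session_id,),
    )
    return list(rows)

save_turn("s1", "user", "What is openBIS?")
save_turn("s1", "assistant", "An open-source data management platform.")
print(history("s1"))  # turns for session s1, in order; other sessions stay separate
```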

Web Interface Features

  • Responsive Design: Works on desktop and mobile devices
  • Markdown Rendering: Properly formats responses with headers, lists, code blocks
  • Session Management: Maintains conversation state in browser
  • Real-time Chat: Instant responses with loading indicators
  • Clear History: Option to clear conversation history

CLI Interface Features

  • Interactive Chat: Natural conversation flow
  • Command Support: Built-in commands for managing conversations
  • Session Persistence: Maintains context during the session
  • Keyboard Shortcuts: Ctrl+C to exit gracefully

Configuration

Environment Variables

  • OLLAMA_HOST: Ollama server host (default: localhost)
  • OLLAMA_PORT: Ollama server port (default: 11434)
  • SECRET_KEY: Flask secret key for web interface (auto-generated if not set)
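A sketch of how the Ollama variables might resolve to their documented defaults when unset; ollama_endpoint is a hypothetical helper, not a chatbis function.

```python
import os

# Variable names and defaults are from the list above; the helper
# itself is invented for illustration.
def ollama_endpoint():
    host = os.environ.get("OLLAMA_HOST", "localhost")
    port = os.environ.get("OLLAMA_PORT", "11434")
    return f"http://{host}:{port}"

# Resolves to http://localhost:11434 when neither variable is set.
print(ollama_endpoint())
```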

Customizing the Assistant

You can modify the assistant's behavior by editing the system message in the conversation engine:

  • Adjust response style and tone
  • Add domain-specific instructions
  • Modify conversation guidelines

Troubleshooting

Common Issues

  1. "Processed data not found" Error

    • Ensure you have run ragBIS first
    • Check that chunks.json exists in the data directory
    • Use --data-dir to specify the correct path
  2. Ollama Connection Error

    • Ensure Ollama is running: ollama serve
    • Check if the model is installed: ollama list
    • Install the model if missing: ollama pull qwen3
  3. Web Interface Not Loading

    • Check if the port is already in use
    • Try a different port: --port 8080
    • Check firewall settings
  4. Memory Issues

    • Conversation memory is stored in SQLite
    • Database file is created automatically
    • Clear memory by deleting the .db file

Logging

Enable verbose logging to debug issues:

python -m chatbis --verbose

For web interface debugging:

python -m chatbis --web --debug

API Reference

Web API Endpoints

  • GET /: Main chat interface
  • POST /api/chat: Send a message and get response
  • GET /api/chat/history/<session_id>: Get conversation history
  • POST /api/chat/clear/<session_id>: Clear conversation history

Request/Response Format

POST /api/chat

{
  "query": "Your question here",
  "session_id": "optional-session-id"
}

Response

{
  "success": true,
  "answer": "Assistant response",
  "session_id": "session-id",
  "metadata": {
    "token_count": 150,
    "rag_chunks_used": 3,
    "conversation_length": 4
  }
}
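The request and response bodies above can be handled with the standard json module. The query text here is an invented example, and the response string is the sample from this section rather than live server output.

```python
import json

# Build the request body for POST /api/chat ("session_id" is optional).
request_body = json.dumps({"query": "How do I create a sample in openBIS?"})

# Parse a response matching the documented format (sample values, not
# output from a running server).
response_text = """
{"success": true,
 "answer": "Assistant response",
 "session_id": "session-id",
 "metadata": {"token_count": 150, "rag_chunks_used": 3,
              "conversation_length": 4}}
"""
response = json.loads(response_text)
if response["success"]:
    print(response["answer"])
    print("chunks used:", response["metadata"]["rag_chunks_used"])
```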

Development

Running Tests

pytest

Code Formatting

black src/

Type Checking

mypy src/

Integration with ragBIS

The openBIS chatbot is designed to work seamlessly with ragBIS:

  1. Run ragBIS to generate processed data
  2. Copy or link the data directory to the openBIS chatbot
  3. Start the openBIS chatbot with the processed data

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request

Support

For issues and questions:

  • Create an issue on GitHub
  • Check the troubleshooting section above
  • Ensure ragBIS data is properly generated
  • Verify Ollama is properly configured
