Devamm007/TDS-P1-2025-09-24F2000828

Self-Hosted-LLM-DevOps-Engine 🤖

Python 3.11+ FastAPI GitHub API License: MIT

📑 Table of Contents

  1. 🎯 Overview
  2. 📑 TDS-Project-1
  3. 🛠️ Prerequisites
  4. 🔑 Configuration & Setup
  5. 🛠️ Installation and Execution Locally
  6. 🚀 Deployment and How to Use
  7. 📜 License

🎯 Overview

If you copy this project as-is, without understanding and debugging it, you will miss the learning experience, reduce your chances of being employed or self-employed, and hand an advantage to your friendly competition.

The Self-Hosted-LLM-DevOps-Engine is an API-driven backend service designed to automate the process of turning a simple task brief into a working code repository. Users send a JSON payload to the running API, which then orchestrates the entire development and deployment flow: LLM code generation (simulated here) → GitHub repository creation → GitHub Pages deployment.

This engine is built for self-hosting, allowing users to secure their own API keys and secrets for full control over the process.

📑 TDS-Project-1

Some of the code is specific to my project submission and evaluation; a separate version will be available for users to run themselves with their own self-hosted API endpoint. [https://github.com/Devamm007/Self-Hosted-LLM-DevOps-Engine Coming Soon!]

My evaluators will:

  • Publish a Google Form where I will be submitting my API URL, secret, and this GitHub repo URL.
  • For each submission, create a unique task request.
  • POST the request to the student's latest API URL. If the response is not HTTP 200, retry up to 3 times over 3-24 hours, then fail.
  • Accept POST requests on the evaluation_url, add them to a queue for evaluation, and return an HTTP 200 response.
  • Evaluate the repo based on task-specific as well as common checks, and log these:
    • Repo-level rule-based checks (e.g. LICENSE is MIT)
    • LLM-based static checks (e.g. code quality, README.md quality)
    • Dynamic checks (e.g. use Playwright to load your page, run and test your app)
  • Save the results in a results table.
  • For all {"round": 1} requests, generate and POST a unique round 2 task request (even if checks failed).
  • Publish the database after the deadline.

My evaluators may, at their discretion, send up to 3 such tasks.

This is the structure of the payload to POST to the evaluation_url:

{
  # Copy these from the request
  "email": "...",
  "task": "captcha-solver-...",
  "round": 1,
  "nonce": "ab12-...",
  # Send these based on your GitHub repo and commit
  "repo_url": "https://github.com/user/repo",
  "commit_sha": "abc123",
  "pages_url": "https://user.github.io/repo/"
}
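The fields above split into two groups: four copied verbatim from the incoming task request and three produced by the engine. A minimal sketch (hypothetical helper names, using only the standard library) of assembling and sending that payload:

```python
import json
import urllib.request

def build_notification(request_fields: dict, repo_url: str,
                       commit_sha: str, pages_url: str) -> dict:
    """Combine fields copied from the task request with the
    generated repository details into the evaluation payload."""
    return {
        # Copied from the incoming task request
        "email": request_fields["email"],
        "task": request_fields["task"],
        "round": request_fields["round"],
        "nonce": request_fields["nonce"],
        # Produced by the engine after pushing code
        "repo_url": repo_url,
        "commit_sha": commit_sha,
        "pages_url": pages_url,
    }

def post_notification(evaluation_url: str, payload: dict) -> int:
    """POST the payload as JSON; returns the HTTP status code."""
    req = urllib.request.Request(
        evaluation_url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```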

🛠️ Prerequisites

Before you begin, ensure you have the following ready:

  1. Python 3.11+ installed.
  2. A GitHub Account for repository creation.
  3. An OpenAI API Key (or another LLM service key) if you intend to replace the simulated LLM function.

🔑 Configuration & Setup

1. Generate Required Tokens

You need to set up three critical environment variables to run the application:

| Variable | Purpose | How to Get It |
| --- | --- | --- |
| GITHUB_TOKEN | Used to create repositories and push code. | Generate a Personal Access Token (PAT): Settings > Developer Settings > Personal Access Tokens > Fine-grained tokens. Create a token with Repository access "All repositories" and Permissions: Administration, Contents, Pages, Workflows (read and write), and Metadata. |
| LLM_API_KEY | Used for communication with the LLM for code generation. | Obtain this key from any platform of your choice. |
| SECRET | A private, custom string used to validate incoming requests, so only holders of the secret can use your API endpoint if it is exposed. | Choose any long, random, and secure string (e.g. generated by a password manager). |
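Since all three variables are critical, it helps to fail fast at startup if any are unset. A small sketch of such a check (a hypothetical helper, not the project's actual code):

```python
import os

REQUIRED_VARS = ("GITHUB_TOKEN", "LLM_API_KEY", "SECRET")

def load_config() -> dict:
    """Read the required variables from the environment and
    raise immediately if any are missing or empty."""
    missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in REQUIRED_VARS}
```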

2. Configure the .env File

Create a file named .env in your main application directory and populate it with the keys you generated in the previous step:

# A unique secret key for API request validation
SECRET="your_long_random_and_secure_secret_key_here"

# Your Personal Access Token for GitHub API access with required settings
GITHUB_TOKEN="ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

# Your API key for LLM integration
LLM_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

🛠️ Installation and Execution Locally

You have a few options for installing the requirements and running the server.

Option 1: Using UV (Recommended)

The uv tool is a modern, fast Python package installer and executor.

  1. Install UV:

    pip install uv
  2. Run the Application: Since main.py includes a script header with dependencies, uv can install them into a temporary virtual environment and execute the file in one step.

    uv run main.py
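The script header that uv reads is PEP 723 inline metadata at the top of main.py. A sketch of what such a header looks like (the actual dependency list in main.py may differ):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "fastapi[standard]",
#     "uvicorn[standard]",
#     "requests",
# ]
# ///
```

When you run `uv run main.py`, uv parses this comment block, resolves the listed dependencies into a temporary virtual environment, and executes the script inside it.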

Option 2: Using requirements

  1. Install Dependencies: Using a virtual environment is highly recommended, although you can also install the dependencies globally.

    # Create and activate a virtual environment
    python3 -m venv venv
    source venv/bin/activate  # On Linux/macOS
    # venv\Scripts\activate   # On Windows
    
    # Install the dependencies
    pip install -r requirements.txt
  2. Run the Application:

    python main.py
    # OR
    python3 main.py

    (Remember to keep the virtual environment active, or use uv run main.py to handle the environment automatically.)


🚀 Deployment and How to Use

Deployment on Render (Recommended)

The application is best deployed as a Web Service on a platform like Render, which supports running the Python server (uvicorn).

Steps

  1. Repository Setup: Ensure your main.py and a requirements.txt file (listing fastapi[standard], uvicorn[standard], etc.) are in your GitHub repository.
  2. Render Setup:
    • Sign up/Log in to Render and create a New Web Service.
    • Connect your GitHub account and select this repository.
    • Choose Python 3 as the runtime.
  3. Environment Variables: Add your secret keys as Environment Variables in Render's configuration (either manually or by uploading a .env file); these take the place of the local .env file:
    • GITHUB_TOKEN: (Your GitHub PAT)
    • LLM_API_KEY: (Your LLM service key)
    • SECRET: (Your strong secret validation key)
  4. Start Command: Set the Start Command to run your application using uvicorn:
    uvicorn main:app --host 0.0.0.0 --port $PORT
  5. Deploy: Select the Free plan and click Create Web Service. Render will provide a public URL for your API endpoint.
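Once deployed, you can smoke-test that the service is reachable before the evaluators do. A small sketch using only the standard library (the /docs path is FastAPI's built-in documentation route):

```python
import urllib.request

def service_is_up(base_url: str, timeout: float = 10.0) -> bool:
    """Return True if the FastAPI docs page responds with HTTP 200."""
    try:
        with urllib.request.urlopen(base_url.rstrip("/") + "/docs",
                                    timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, timeout, or HTTP error
        return False
```

Call it with your public URL, e.g. `service_is_up("https://YOUR-RENDER-SERVICE-NAME.onrender.com")`.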

Deployment on Hugging Face Spaces (Using Docker)

Since this is a standard FastAPI API, the Docker SDK must be used for deployment on Hugging Face Spaces.

Steps

Repository Setup: Ensure the following files are pushed to your Git repository or directly add/create them on Hugging Face Space:

  • main.py
  • requirements.txt
  • Dockerfile (using an image like python:3.12-slim and binding to port 7860).
# 1. Base Image: Start from an official Python image
FROM python:3.12-slim

# 2. Set Environment Variables
# The default port for Spaces is 7860, which the app must bind to.
ENV PORT 7860
ENV PYTHONUNBUFFERED 1

# 3. Set Working Directory to root
WORKDIR /

# 4. Copy Dependencies File to root
COPY requirements.txt .

# 5. Install Dependencies
# This layer only gets rebuilt if requirements.txt changes
RUN pip install --no-cache-dir -r requirements.txt

# 6. Copy Application Code and Templates to root
# This copies everything from your local repo root to the container's root (/)
COPY . .

# 7. Define the Command to Run the Application
# Assumes 'main.py' is in the root directory and contains the 'app' object
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "7860"]
  • README.md must start with the Spaces configuration front matter (the license line is optional):
   ---
   title: {{title}}
   emoji: {{emoji}}
   colorFrom: {{colorFrom}}
   colorTo: {{colorTo}}
   sdk: docker
   license: mit
   app_file: main.py
   pinned: false
   ---

Spaces Setup:

  • Go to Hugging Face Spaces and create a New Space.
  • Select the Docker SDK.
  • Choose the hardware (CPU Basic is often sufficient) and upload main.py, requirements.txt, Dockerfile, templates folder to Hugging Face Space.
  • Secrets Management: Navigate to the Settings tab of your Space and add the GITHUB_TOKEN, LLM_API_KEY, SECRET

Deploy: The Space will automatically build the Docker image and deploy the service on port 7860. The public URL will be https://{your-username}-{your-space-name}.hf.space


How to Use the API

The core functionality is exposed via a single POST endpoint.

🎯 Endpoint

  • URL: [YOUR_RENDER_URL]/handle_task
  • Method: POST
  • Purpose: Triggers the full workflow (LLM generation, GitHub repo creation, code push, Pages activation).
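Before any work starts, the request's secret is validated against the SECRET environment variable. A framework-agnostic sketch of that check (a hypothetical helper, not the project's actual handler code), using a constant-time comparison to avoid leaking information through response timing:

```python
import hmac
import os

def is_authorized(payload: dict) -> bool:
    """Compare the request's "secret" field against the SECRET
    environment variable in constant time."""
    expected = os.environ.get("SECRET", "")
    supplied = str(payload.get("secret", ""))
    return bool(expected) and hmac.compare_digest(supplied, expected)
```

In the actual endpoint, a failed check would short-circuit into an HTTP 401/403 response before the LLM and GitHub steps run.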

📖 Testing via Swagger UI

To test the API interactively, open the automatic documentation provided by FastAPI by appending /docs to your deployment URL:

  • Render Swagger URL: https://YOUR-RENDER-SERVICE-NAME.onrender.com/docs
  • Hugging Face Swagger URL: https://{your-username}-{your-space-name}.hf.space/docs

Then:

  1. Open the Swagger UI in your browser.
  2. Expand the POST /handle_task endpoint.
  3. Click Try it out.
  4. Input a valid JSON payload (see below for structure) into the Request body field. Ensure the "secret" field exactly matches the SECRET environment variable you set during deployment.
  5. Click Execute to send the request.

Alternatively, POST directly to https://{your-username}-{your-space-name}.hf.space/handle_task.

βœ‰οΈ Request Payload

The endpoint expects a JSON payload containing the task details:

| Field | Type | Description |
| --- | --- | --- |
| email | string | User's email (for tracking). |
| secret | string | Must match the SECRET environment variable. |
| task | string | A short name for the task (used in the repository name). |
| round | integer | 1 for initial creation, 2 for updates/refactoring. |
| nonce | string | A unique identifier (used in the repository name to ensure uniqueness). |
| brief | string | Detailed instruction for the LLM. |
| checks | array | List of requirements the generated code must meet. |
| evaluation_url | string | URL to notify upon completion. |
| attachments | array | Optional list of objects (name, url) pointing to files for context. |

πŸ› οΈ Example Usage (send_task.py)

You can test the endpoint using the provided send_task.py script. Remember to update the API_URL in the script with your specific Render domain.

Example send_task.py Call (Round 1):

import requests

def send_task():
    # ⚠️ UPDATE THIS URL with your Render domain!
    API_URL = "https://YOUR-RENDER-SERVICE-NAME.onrender.com/handle_task"
    
    payload = {
        "email": "dummy@example.com",
        "secret": "%Br6n887uih8g78Bbo", # Must match your SECRET env var
        "task": "github_user_created_date",
        "round": 1,
        "nonce": "abcd",
        "brief": "Publish a Bootstrap page that fetches a GitHub username...",
        "checks": ["Repo has MIT license", "README.md is professional..."],  # ...
        "evaluation_url": "https://example.com/notify",
        "attachments": [{"name": None, "url": None}]
    }
    
    response = requests.post(API_URL, json=payload, timeout=30)
    print("Response JSON:", response.json())

if __name__ == "__main__":
    send_task()

🔔 Notification Payload (Received on evaluation_url)

Once the task is complete and code has been pushed (and Pages is confirmed live/built), the application sends a POST request to the provided evaluation_url with the following JSON structure:

| Field | Type | Source | Description |
| --- | --- | --- | --- |
| email | string | Request | Copied from the initial request payload. |
| task | string | Request | Copied from the initial request payload. |
| round | integer | Request | Copied from the initial request payload. |
| nonce | string | Request | Copied from the initial request payload. |
| repo_url | string | Generated | The public URL of the newly created/updated GitHub repository. |
| commit_sha | string | Generated | The SHA hash of the latest commit applied by the bot. |
| pages_url | string | Generated | The public URL for the live GitHub Pages site. |

Example Notification Payload:

{
  "email": "dummy@example.com",
  "task": "<reponame>",
  "round": 1,
  "nonce": "abcd",
  "repo_url": "https://github.com/<username>/<reponame>",
  "commit_sha": "f10d7a6e7c9b0a3d4f5e6b7c8a9d0e1f2b3c4d5e",
  "pages_url": "https://<username>.github.io/<reponame>/"
}

📜 License

This project is licensed under the MIT License.
