This repository hosts the backend microservice for the Writing Assistant, an AI-powered tool designed to help users draft, refine, and improve their written content. The service uses large language models (such as Anthropic Claude via AWS Bedrock) to provide real-time writing suggestions, content completions, and productivity tools for writers, editors, and content creators.
This backend is structured as a modular microservice, focused on scalable, maintainable, and extensible AI-driven writing assistance. Each component is designed for independent development and integration with other services or frontends.
- File: `backend/main.py`
  - Role:
    - Serves as the main entrypoint for the backend service.
    - Exposes REST API endpoints for writing assistance, chat completions, and tool-based content operations.
    - Handles request routing and integrates with FastAPI for high-performance, async API handling.
- Directory: `backend/bedrock/`
  - Key Files:
    - `anthropic_chat_completions.py`: Handles chat completion requests to Anthropic Claude via AWS Bedrock, including prompt formatting and response parsing.
    - `client.py`: Manages the setup and secure communication with AWS Bedrock and Anthropic Claude, abstracting away authentication and API details.
- Directory: `backend/writing_assistant/`
  - Key Files:
    - `assistant.py`: Orchestrates the workflow for writing assistance, including input validation, tool selection, and response aggregation.
    - `tools.py`: Provides specialized utility functions and tools for content editing, rewriting, grammar checking, and enhancement.
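The tool-selection pattern described above can be sketched as a simple registry that maps tool names to functions. The names and the toy tool below are hypothetical, not taken from `tools.py`:

```python
from typing import Callable, Dict

# Hypothetical registry; the actual tool names in tools.py may differ.
TOOLS: Dict[str, Callable[[str], str]] = {}


def tool(name: str):
    """Decorator that registers a writing tool under a given name."""
    def wrap(fn: Callable[[str], str]) -> Callable[[str], str]:
        TOOLS[name] = fn
        return fn
    return wrap


@tool("uppercase_headings")
def uppercase_headings(text: str) -> str:
    """Toy example tool: upper-case lines that look like markdown headings."""
    return "\n".join(
        line.upper() if line.startswith("#") else line
        for line in text.splitlines()
    )


def run_tool(name: str, text: str) -> str:
    """Dispatch a request to the selected tool, as assistant.py might."""
    if name not in TOOLS:
        raise ValueError(f"Unknown tool: {name}")
    return TOOLS[name](text)
```

A registry like this keeps tools independently testable and lets `assistant.py` pick one by name from the incoming request.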
- Files:
  - `Dockerfile.backend`: Defines the Docker image for the backend service.
  - `docker-compose.yml`: Orchestrates multi-container deployment (if needed).
  - `makefile`: Provides build and clean commands for Dockerized workflows.
- Files:
  - `.env` (to be placed in `/backend`): Stores environment variables for database, AWS, and service configuration.
  - `pyproject.toml`, `poetry.lock`: Manage Python dependencies and project metadata.
- Database: MongoDB (Atlas)
- Integration:
  - Managed via environment variables and `pymongo` in the backend.
  - Collections: `userProfiles`. In this microservice, we use only `userProfiles` to provide personalised writing assistance.
- Example documents for all collections of this demo are stored in JSON format under `backend/collections/`. You can also explore our main microservice to learn more about the other backend services and how the collections are used throughout the demo.
- AWS Bedrock:
- Provides access to Anthropic Claude for LLM-powered completions and suggestions.
- (Optional) Other APIs:
- The codebase is structured for easy extension to other AI or data services.
- A short summary of the components used in our microservice, for easier understanding:
| Layer/Component | Code Location(s) | Responsibility |
|---|---|---|
| API Layer | backend/main.py | Expose REST endpoints, route requests |
| LLM Integration | backend/bedrock/ | Communicate with AWS Bedrock & Anthropic Claude |
| Writing Tools | backend/writing_assistant/ | Content editing, enhancement, workflow orchestration |
| Containerization | Dockerfile.backend, docker-compose.yml, makefile | Deployment, build, orchestration |
| Config & Env | .env, pyproject.toml, poetry.lock | Environment, secrets, dependencies |
| Data Storage | MongoDB (Atlas), via pymongo | Store user data, drafts, suggestions, etc. |
| External Services | AWS Bedrock, Anthropic Claude | LLM completions, AI-powered suggestions |
- Real-time AI-powered writing suggestions and completions
- Modular toolset for content editing, rewriting, and enhancement
- Integration with Anthropic Claude and AWS Bedrock for advanced language capabilities
- API endpoints for seamless frontend integration
- Scalable, containerized deployment with Docker
- User Input: Users submit a writing prompt, draft, or editing request via the frontend interface.
- Request Handling: The backend receives the request and routes it to the appropriate writing assistant tool or AI model.
- AI Processing: The service interacts with Anthropic Claude (via AWS Bedrock) to generate completions, suggestions, or edits based on the user’s input.
- Tool-Based Enhancement: Additional tools may process the AI output for grammar checking, style improvement, or formatting.
- Response Delivery: The content or suggestions are then returned to the user through the API.
In summary:
The Writing Assistant backend combines user input, advanced AI models, and specialized tools to deliver high-quality writing assistance in real time.
- pymongo for MongoDB connectivity and operations.
- boto3 for AWS SDK integration and Bedrock API access.
- botocore for low-level AWS service operations.
- Anthropic Claude via AWS Bedrock for text generation and content analysis.
- Docker for containerized deployment.
- docker-compose for multi-service orchestration.
- python-dotenv for environment variable management.
- Claude 3 Haiku for the writing assistant.
- `anthropic_chat_completions.py`: Handles chat completion requests to Anthropic Claude via AWS Bedrock.
- `client.py`: Manages API client setup and secure communication with external AI services.
- `assistant.py`: Core logic for orchestrating writing assistance workflows.
- `tools.py`: Utility functions and specialized tools for content editing and enhancement.
- Serves as the main API gateway, routing requests to the appropriate modules and tools.
Before you begin, ensure you have met the following requirements:
- MongoDB Atlas account - Register Here
- Python 3.10 or higher
- Poetry – Install Here
- AWS CLI configured with appropriate credentials – Installation Guide
- AWS Account with Bedrock access enabled – Sign up Here
- Docker (optional, for containerized deployment) – Install Here
1. Fork the Repository
   - Visit the GitHub repository page and click the Fork button in the top right corner to create your own copy of the repository under your GitHub account.
2. Clone Your Fork
   - Open your terminal and run:

     ```shell
     git clone https://github.com/<your-username>/ist-media-internship-be2.git
     cd ist-media-internship-be2
     ```

3. (Optional) Set Up Upstream Remote
   - To keep your fork up to date with the original repository, add the upstream remote:

     ```shell
     git remote add upstream https://github.com/<original-owner>/ist-media-internship-be2.git
     ```
Follow MongoDB's guide to create a user with readWrite access to the contentlab database.
Important
Create a `.env` file in the `/backend` directory with the following content:

```env
MONGODB_URI=your_mongodb_uri
DATABASE_NAME=dbname
APP_NAME=appname
USER_PROFILES_COLLECTION=userProfiles
DRAFTS_COLLECTION=drafts
AWS_REGION=regionname
```

- Open a terminal in the project root directory.
- Run the following commands:

  ```shell
  make poetry_start
  make poetry_install
  ```

- Verify that the `.venv` folder has been generated within the `/backend` directory.
To start the backend service, run:

```shell
poetry run uvicorn main:app --host 0.0.0.0 --port 8001
```

The default port is `8001` for this microservice. Modify the `--port` flag if needed.
Run the following command in the root directory:

```shell
make build
```

To remove the container and image:

```shell
make clean
```

You can access the API documentation by visiting the following URL:

`http://localhost:<PORT_NUMBER>/docs`

E.g. `http://localhost:8001/docs`
Note
Make sure to replace <PORT_NUMBER> with the port number you are using and ensure the backend is running.
Important
Check that you've created a `.env` file containing the required environment variables.
See LICENSE file for details.