
Added Gemini & X.AI Support, Command Mode, and File Operations #53


Open · wants to merge 5 commits into master

Conversation

@lecheel commented Feb 28, 2025

  1. Integration with multiple LLMs: Support was added for both the Gemini and X.AI large language models, allowing users to interact with these AI services (a minimal sketch of the shared backend trait follows this list).
  2. Command Mode and File Operations: A new "command mode" was implemented, enabling users to load messages from files, save responses, clear the chat history, and quit the application using commands (e.g., :o, :w, :clear, :save, :quit), plus an 'm' command that loads messages from a fixed file.
  3. Screenshots for Command Mode: Visual aids (screenshots) were added to demonstrate the newly implemented command mode features.
  4. Configuration and Output Files: The `Chat` struct now takes a config, and there is support for writing output to a file. LLM answer handling was refactored to support the output file.
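
For review context, here is a minimal sketch of how a backend such as `Xai` might plug into the shared `LLM` trait. Only the `Xai`/`Gemini` struct names, the `LLM` trait name, and the missing-API-key error handling come from the PR; the trait shape, the error type, and the environment-variable key source are assumptions for illustration.

```rust
use std::env;

// Illustrative error type; the PR's actual error handling may differ.
#[derive(Debug)]
pub enum LlmError {
    MissingApiKey(&'static str),
}

// Assumed shape of the `LLM` trait that both `Gemini` and `Xai` implement.
pub trait LLM {
    fn ask(&self, prompt: &str) -> Result<String, LlmError>;
}

pub struct Xai {
    api_key: String,
}

impl Xai {
    // Fails early when the key is absent, mirroring the PR's
    // "error handling for missing API key". The env var name is a guess.
    pub fn new() -> Result<Self, LlmError> {
        let api_key = env::var("XAI_API_KEY")
            .map_err(|_| LlmError::MissingApiKey("XAI_API_KEY"))?;
        Ok(Self { api_key })
    }
}

impl LLM for Xai {
    fn ask(&self, _prompt: &str) -> Result<String, LlmError> {
        // Placeholder: the real backend streams chunks from the X.AI API
        // and supports terminating an in-flight response.
        let _ = &self.api_key;
        Ok(String::new())
    }
}
```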

Screenshots: feat01, feat02

- Implements the `Xai` struct and `LLM` trait for interacting with the X.AI API.
- Includes error handling for missing API key.
- Supports streaming responses and termination.
- Implements the `Gemini` struct and `LLM` trait for interacting with the Gemini API.
- Includes error handling for missing API key.
- Supports streaming responses and termination.
- Adds command mode with `:o`, `:w`, `:clear`, `:save`, `:quit` commands (see the dispatcher sketch after this list)
- Implements file loading, response saving, chat clearing, and chat history saving
- Updates prompt rendering to display mode and cursor position
- Adds an LLM status indicator
- Adds an 'm' command to load messages from a fixed file
- Adds screenshots demonstrating the new features
- Passes config to `Chat::new`
- Adds optional output file writing (see the sketch after this list)
- Refactors LLM answer handling to support file writing
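
As a rough illustration of the command mode, a hypothetical dispatcher might look like this. The command names (`:o`, `:w`, `:clear`, `:save`, `:quit`, `m`) come from the PR; the `Command` enum and the parsing logic are illustrative only.

```rust
// Hypothetical command-mode dispatcher; not the PR's actual code.
#[derive(Debug)]
enum Command {
    LoadFile(String), // :o <path>
    WriteResponse,    // :w
    ClearChat,        // :clear
    SaveHistory,      // :save
    Quit,             // :quit
    LoadFixed,        // m (load messages from a fixed file)
}

fn parse_command(input: &str) -> Option<Command> {
    // Split into the command word and an optional single argument.
    let mut parts = input.trim().splitn(2, ' ');
    match parts.next()? {
        ":o" => Some(Command::LoadFile(parts.next()?.trim().to_string())),
        ":w" => Some(Command::WriteResponse),
        ":clear" => Some(Command::ClearChat),
        ":save" => Some(Command::SaveHistory),
        ":quit" => Some(Command::Quit),
        "m" => Some(Command::LoadFixed),
        _ => None,
    }
}
```

For example, `parse_command(":o notes.txt")` would yield `Command::LoadFile("notes.txt")`, while unrecognized input falls through to `None` so ordinary chat input is unaffected.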
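
And a minimal sketch of the config/output-file change, assuming a hypothetical `output_file` config field; only the facts that `Chat::new` takes a config and that output-file writing is optional come from the PR.

```rust
use std::fs::OpenOptions;
use std::io::{self, Write};

struct Config {
    // Hypothetical option mirroring the PR's "optional output file".
    output_file: Option<String>,
}

struct Chat {
    config: Config,
}

impl Chat {
    fn new(config: Config) -> Self {
        Self { config }
    }

    // Called once per LLM answer; writes to the file only when configured.
    fn record_answer(&self, answer: &str) -> io::Result<()> {
        if let Some(path) = &self.config.output_file {
            let mut file = OpenOptions::new().create(true).append(true).open(path)?;
            writeln!(file, "{answer}")?;
        }
        Ok(())
    }
}
```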
@pythops (Owner) commented Feb 28, 2025

Thanks for the PR 🙏
Would you mind breaking it down into multiple PRs instead, one for each backend and a separate one for command mode, just to make it easier to review?

@lecheel (Author) commented Mar 1, 2025

Yes, the changes have been broken down into multiple PRs.
