Add support for conversation-template argument for openai endpoint #1294


Closed · wants to merge 6 commits

Conversation


@Tostino Tostino commented Oct 9, 2023

Here is a PR adding support for conversation templates defined as JSON files, which can be specified for a model when starting the API. Just create your template and pass its path with the --conversation-template my_template.json argument.

There was nowhere for me to add support for this in the vLLM API, so I only added it to the OpenAI API.

Added an example and updated the quickstart section of the README (the only place that documented the other arguments).
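The PR text does not show the template format itself, so the following is only a hypothetical sketch of what such a JSON conversation template might look like; every field name here (system, roles, separator, stop) is an assumption, not the schema the PR actually used.

```json
{
  "system": "You are a helpful assistant.",
  "roles": {
    "user": "USER",
    "assistant": "ASSISTANT"
  },
  "separator": "\n",
  "stop": ["USER:"]
}
```

Assuming the flag works as described, the server would then be started with something like `--conversation-template my_template.json` alongside the usual model arguments.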


Tostino commented Oct 16, 2023

Going to cancel this PR and work on another that properly implements the HF chat templates within the tokenizer.
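For context on the approach the author switched to: an HF chat template is a Jinja string stored with the tokenizer that renders a list of role/content messages into a single prompt. The plain-Python sketch below only illustrates the idea with a ChatML-like layout; the token markers and function are illustrative assumptions, not vLLM's or Hugging Face's actual implementation.

```python
# Illustrative stand-in for what tokenizer.apply_chat_template does:
# turn structured chat messages into one prompt string.
def render_chat(messages, add_generation_prompt=True):
    """Render messages in a ChatML-like layout (format chosen for illustration)."""
    parts = []
    for m in messages:
        # Each message is wrapped in role markers.
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = render_chat([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

In the real Hugging Face API, the equivalent call is `tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)`, with the template shipped in the tokenizer config rather than hard-coded.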
