+| llm_prompt(str string, optionalModel string) string | Query an OpenAI LLM with the provided text prompt and return the result as a string (requires an API token in the environment variable `OPENAI_API_KEY`). Defaults to GPT-3.5; the optional second argument selects the model: o1-mini, o1-mini-2024-09-12, o1-preview, o1-preview-2024-09-12, o1, o1-2024-12-17, o3-mini, o3-mini-2025-01-31, gpt-4-32k-0613, gpt-4-32k-0314, gpt-4-32k, gpt-4-0613, gpt-4-0314, gpt-4o, gpt-4o-2024-05-13, gpt-4o-2024-08-06, gpt-4o-2024-11-20, chatgpt-4o-latest, gpt-4o-mini, gpt-4o-mini-2024-07-18, gpt-4-turbo, gpt-4-turbo-2024-04-09, gpt-4-0125-preview, gpt-4-1106-preview, gpt-4-turbo-preview, gpt-4-vision-preview, gpt-4, gpt-3.5-turbo-0125, gpt-3.5-turbo-1106, gpt-3.5-turbo-0613, gpt-3.5-turbo-0301, gpt-3.5-turbo-16k, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo, gpt-3.5-turbo-instruct | `llm_prompt("produce a generic json")` | `{'a':'b'}` |
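A minimal usage sketch based on the signature above: the second argument is optional and, when given, should be one of the supported model names listed in the table (gpt-4o is used here purely as an example; the `//` comments are illustrative, not necessarily valid DSL syntax).

```
llm_prompt("produce a generic json")              // default model (GPT-3.5)
llm_prompt("produce a generic json", "gpt-4o")    // explicit model via the optional second argument
```

Either form requires the `OPENAI_API_KEY` environment variable to be set before the expression is evaluated.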