This module contains runnable Kotlin examples organized by feature:
- `example/` - SDK examples (ListRuns, Dataset, PromptManagement, RecordExperiment, E2eEval)
- `example/otel/` - OpenTelemetry tracing examples
All examples require a LangSmith API key, passed as a system property:

```shell
./gradlew :langsmith-java-example:run -Pexample=ExampleName -Dlangchain.langsmithApiKey=your_api_key
```

Alternatively, you can use environment variables:

```shell
export LANGSMITH_API_KEY=your_api_key
```

The `langchain.baseUrl` system property (or `LANGSMITH_ENDPOINT` environment variable) is optional and defaults to `https://api.smith.langchain.com/` if not set.
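To make the precedence concrete, here is a small sketch of the lookup order described above: system property first, then environment variable, then an optional default. The helper name `resolveSetting` is ours, not the SDK's — the SDK resolves these settings internally.

```kotlin
// Hypothetical helper (the SDK does this internally); shown only to
// illustrate the lookup order: -D system property, then environment
// variable, then an optional default.
fun resolveSetting(property: String, envVar: String, default: String? = null): String? =
    System.getProperty(property)
        ?: System.getenv(envVar)
        ?: default

fun main() {
    val apiKey = resolveSetting("langchain.langsmithApiKey", "LANGSMITH_API_KEY")
        ?: error("No LangSmith API key configured")
    val baseUrl = resolveSetting(
        "langchain.baseUrl", "LANGSMITH_ENDPOINT",
        default = "https://api.smith.langchain.com/",
    )
    println("endpoint=$baseUrl key=${apiKey.take(4)}***")
}
```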
Located in `src/main/kotlin/com/langchain/smith/example/otel/`
Make actual OpenAI API calls with automatic tracing to LangSmith.
```shell
./gradlew :langsmith-java-example:run -Pexample=OtelOpenAI \
  -Dlangchain.langsmithApiKey=your_api_key \
  -DOPENAI_API_KEY=your_openai_key \
  -DLANGSMITH_PROJECT=my-project  # optional, defaults to "default"
```

View traces at https://smith.langchain.com
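The example wires tracing up automatically, but for context, a manually created span using the standard OpenTelemetry API looks roughly like this. This is a sketch, not code from the example: it assumes `opentelemetry-api` is on the classpath with an SDK configured to export to LangSmith, the span name and attribute values are placeholders, and `callOpenAi` is a stand-in stub for a real OpenAI call.

```kotlin
import io.opentelemetry.api.GlobalOpenTelemetry
import io.opentelemetry.api.trace.StatusCode

// Stand-in for a real OpenAI call so the sketch is self-contained.
fun callOpenAi(prompt: String): String = "stubbed response for: $prompt"

// Sketch only: creates a span around an LLM call, records the error on
// failure, and always ends the span.
fun tracedChatCall(prompt: String): String {
    val tracer = GlobalOpenTelemetry.getTracer("langsmith-example")
    val span = tracer.spanBuilder("openai.chat").startSpan()
    return try {
        span.makeCurrent().use {
            // GenAI semantic-convention attribute names; values are placeholders.
            span.setAttribute("gen_ai.system", "openai")
            span.setAttribute("gen_ai.request.model", "gpt-4o-mini")
            callOpenAi(prompt)
        }
    } catch (e: Exception) {
        span.recordException(e)
        span.setStatus(StatusCode.ERROR)
        throw e
    } finally {
        span.end()
    }
}
```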
Send mock traces to LangSmith without external API calls.
```shell
./gradlew :langsmith-java-example:run -Pexample=OtelLangSmith \
  -Dlangchain.langsmithApiKey=your_api_key \
  -DLANGSMITH_PROJECT=my-project  # optional, defaults to "default"
```

View traces at https://smith.langchain.com
REST API with OpenTelemetry traces sent to LangSmith.
```shell
# Start server
./gradlew :langsmith-java-example:run -Pexample=SpringBootLangSmith \
  -Dlangchain.langsmithApiKey=your_api_key \
  -DLANGSMITH_PROJECT=my-project  # optional

# In another terminal, test endpoints:
curl -X POST http://localhost:8080/api/chat \
  -H "Content-Type: application/json" \
  -d '{"message": "Hello!"}'

curl "http://localhost:8080/api/analyze?text=This%20is%20great"
```

Located in `src/main/kotlin/com/langchain/smith/example/`
RECOMMENDED - Clean, simple example following the same pattern as the Dataset example.
```shell
./gradlew :langsmith-java-example:run -Pexample=PromptManagement \
  -Dlangchain.langsmithApiKey=your_api_key
```

Features demonstrated:

- Create prompt repositories using `client.repos().create()`
- Add prompt content with variables using `client.commits().update()`
- List and filter prompts using `client.repos().list()`
- Retrieve prompt content using `client.commits().retrieve()`
- View prompts in the LangSmith UI
This example follows the LangSmith Prompt Management docs and uses the SDK directly.
**Additional features:**
- Use prompts with OpenAI to generate responses
- Update prompt metadata and descriptions
- Pull specific commit versions
- System + user message prompts
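As a rough illustration of how prompt variables can be applied before a model call, here is a minimal substitution helper. This is a hypothetical sketch, not SDK code — it assumes f-string-style `{variable}` placeholders, and the `fillTemplate` name is ours; see the example source for how the SDK actually formats prompts.

```kotlin
// Hypothetical helper, not part of the SDK: fills f-string-style {name}
// placeholders in a prompt template before sending it to a model.
fun fillTemplate(template: String, variables: Map<String, String>): String =
    variables.entries.fold(template) { acc, (name, value) ->
        acc.replace("{$name}", value)
    }

fun main() {
    // A system + user message pair, each with its own variables.
    val systemPrompt = "You are a helpful assistant that answers in {language}."
    val userPrompt = "Summarize this text: {text}"
    println(fillTemplate(systemPrompt, mapOf("language" to "French")))
    println(fillTemplate(userPrompt, mapOf("text" to "LangSmith traces LLM calls.")))
}
```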