This is a simple Spring AI chatbot REST application that uses the OpenAI API to answer questions. It demonstrates how to monitor a Spring AI chatbot with OpenTelemetry and the OPIK server for observability and tracing.
- Prerequisites
- Installation
- Configuration
- Running the Application
- Manual Testing
- API Endpoints
- Monitoring and Observability
- Troubleshooting
Before running this application, ensure you have the following installed:
- Java 21 or higher
- Maven 3.6+ for dependency management and building
- OpenAI API Key - Sign up at OpenAI Platform
- OPIK API Key - Sign up at Comet OPIK
git clone git@github.com:comet-ml/opik-springai-demo.git
cd opik-springai-demo

Ensure you have Java 21 or higher and Maven installed:

java --version
mvn --version

Build the project:

mvn clean install

For additional details, refer to the OPIK documentation.
The application requires the following environment variables to be set:
- OPENAI_API_KEY: Your OpenAI API key
- OTEL_EXPORTER_OTLP_ENDPOINT: OPIK OpenTelemetry endpoint
- OTEL_EXPORTER_OTLP_HEADERS: Authorization headers for OPIK
On macOS/Linux:
export OPENAI_API_KEY="sk-your-openai-api-key-here"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://www.comet.com/opik/api/v1/private/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>"

On Windows (Command Prompt):
set OPENAI_API_KEY=sk-your-openai-api-key-here
set OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
set OTEL_EXPORTER_OTLP_HEADERS=Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>

On Windows (PowerShell):
$env:OPENAI_API_KEY="sk-your-openai-api-key-here"
$env:OTEL_EXPORTER_OTLP_ENDPOINT="https://www.comet.com/opik/api/v1/private/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>"

If you're running OPIK locally, use these environment variables instead (the examples below assume the default local endpoint on port 5173):
On macOS/Linux:
export OPENAI_API_KEY="sk-your-openai-api-key-here"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:5173/api/v1/private/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Comet-Workspace=default,projectName=<your-project-name>"

On Windows (Command Prompt):
set OPENAI_API_KEY=sk-your-openai-api-key-here
set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
set OTEL_EXPORTER_OTLP_HEADERS=Comet-Workspace=default,projectName=<your-project-name>

On Windows (PowerShell):
$env:OPENAI_API_KEY="sk-your-openai-api-key-here"
$env:OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:5173/api/v1/private/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS="Comet-Workspace=default,projectName=<your-project-name>"

Note: When using a local OPIK server, you don't need the Authorization header in OTEL_EXPORTER_OTLP_HEADERS.
The application is configured via src/main/resources/application.yml (a configuration sketch follows the list below):
- Server Port: 8085 (customizable)
- OpenAI Model: gpt-4o (customizable)
- Temperature: 0.7 (controls response creativity)
- Tracing: All requests are traced (100% sampling)
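A minimal application.yml sketch for these settings, assuming standard Spring Boot and Spring AI property names (the actual file in the repository may differ):

server:
  port: 8085                   # REST API port

spring:
  ai:
    openai:
      chat:
        options:
          model: gpt-4o        # OpenAI chat model
          temperature: 0.7     # higher values produce more creative answers

management:
  tracing:
    sampling:
      probability: 1.0         # trace 100% of requests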
Run the application with Maven:

mvn spring-boot:run

Or build the jar and run it directly:

mvn clean package
java -jar target/spring-ai-demo-opik-0.0.1-SNAPSHOT.jar

To run with automatic restart during development:

mvn spring-boot:run -Dspring-boot.run.jvmArguments="-Dspring.devtools.restart.enabled=true"

The application will start on http://localhost:8085.
Basic question:
curl "http://localhost:8085/api/chat/ask-me?question=What is Spring AI?"Complex question with URL encoding:
curl --get --data-urlencode "question=How to integrate Spring AI with OpenAI for building chatbots?" http://localhost:8085/api/chat/ask-meDefault question (if no parameter provided):
curl "http://localhost:8085/api/chat/ask-me"Simple POST:
curl -X POST \
-H "Content-Type: text/plain" \
-d "Explain the benefits of using OpenTelemetry for monitoring" \
http://localhost:8085/api/chat/ask

POST with JSON (if needed):
curl -X POST \
-H "Content-Type: application/json" \
-d '"What are the key features of Spring Boot 3.4?"' \
http://localhost:8085/api/chat/ask

Enhanced POST with tags and metadata (JSON):
curl -X POST \
-H "Content-Type: application/json" \
-d '{
"question": "What are the benefits of using Spring AI?",
"tags": ["spring", "ai", "tutorial"],
"metadata": {
"userId": "user123",
"sessionId": "session456",
"category": "educational"
}
}' \
http://localhost:8085/api/chat/ask-enhanced

Enhanced POST with parameters:
curl -X POST \
"http://localhost:8085/api/chat/ask-with-params?question=What%20is%20OpenTelemetry?&tags=monitoring,observability&metadata=userId:123,sessionId:abc"- Method: GET
- URL:
http://localhost:8085/api/chat/ask-me - Query Parameters:
- Key:
question - Value:
How to integrate Spring AI with OpenAI?
- Key:
- Method: POST
- URL:
http://localhost:8085/api/chat/ask - Headers:
Content-Type: text/plain - Body:
What is the difference between Spring AI and LangChain?
- Method: POST
- URL:
http://localhost:8085/api/chat/ask-enhanced - Headers:
Content-Type: application/json - Body:
{ "question": "What are the benefits of using Spring AI?", "tags": ["spring", "ai", "tutorial"], "metadata": { "userId": "user123", "sessionId": "session456", "category": "educational" } }
- Method: POST
- URL:
http://localhost:8085/api/chat/ask-with-params - Query Parameters:
- Key:
question, Value:What is OpenTelemetry? - Key:
tags, Value:monitoring,observability,tracing - Key:
metadata, Value:userId:123,sessionId:abc,environment:dev
- Key:
http GET localhost:8085/api/chat/ask-me question=="What is machine learning?"Open your browser and navigate to:
http://localhost:8085/api/chat/ask-me?question=Tell me about Spring Framework
GET /api/chat/ask-me
- Description: Ask a question using a query parameter
- Parameters:
  - question (optional): Your question (defaults to "Tell me a joke")
- Example: /api/chat/ask-me?question=What is AI?
POST /api/chat/ask
- Description: Ask a question using the request body
- Content-Type: text/plain
- Body: Your question as plain text
- Example:
  POST /api/chat/ask
  Content-Type: text/plain

  What is Spring AI?
POST /api/chat/ask-enhanced
- Description: Ask a question with tags and metadata for enhanced tracing
- Content-Type: application/json
- Body: JSON object with question, tags, and metadata
- Request Format:
  { "question": "Your question here", "tags": ["tag1", "tag2", "tag3"], "metadata": { "key1": "value1", "key2": "value2" } }
- Example:
  POST /api/chat/ask-enhanced
  Content-Type: application/json

  { "question": "What are the benefits of using Spring AI?", "tags": ["spring", "ai", "tutorial"], "metadata": { "userId": "user123", "sessionId": "session456", "category": "educational" } }
POST /api/chat/ask-with-params
- Description: Ask a question with tags and metadata using URL parameters
- Parameters:
  - question (required): Your question
  - tags (optional): Comma-separated list of tags
  - metadata (optional): Key-value pairs in the format key1:value1,key2:value2
- Example: /api/chat/ask-with-params?question=What is OpenTelemetry?&tags=monitoring,observability&metadata=userId:123,sessionId:abc
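To make the endpoint descriptions above more concrete, here is a minimal, hypothetical controller sketch built on the Spring AI ChatClient fluent API. The class and record names (ChatController, EnhancedChatRequest) are illustrative assumptions rather than the actual classes in this repository, and the /ask-with-params variant is omitted for brevity:

import java.util.List;
import java.util.Map;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/chat")
class ChatController {

    private final ChatClient chatClient;

    // Spring AI auto-configures a ChatClient.Builder backed by the configured OpenAI model
    ChatController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    // GET /api/chat/ask-me?question=...
    @GetMapping("/ask-me")
    String askMe(@RequestParam(defaultValue = "Tell me a joke") String question) {
        return chatClient.prompt().user(question).call().content();
    }

    // POST /api/chat/ask with a plain-text body
    @PostMapping("/ask")
    String ask(@RequestBody String question) {
        return chatClient.prompt().user(question).call().content();
    }

    // POST /api/chat/ask-enhanced with a JSON body carrying tags and metadata
    @PostMapping("/ask-enhanced")
    String askEnhanced(@RequestBody EnhancedChatRequest request) {
        // tags and metadata would be attached to the current trace/span here
        return chatClient.prompt().user(request.question()).call().content();
    }

    // Hypothetical request body for the enhanced endpoint
    record EnhancedChatRequest(String question, List<String> tags, Map<String, String> metadata) {}
}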
The application automatically captures:
- HTTP requests and responses
- OpenAI API calls and responses
- Application metrics and traces
- Custom spans for business logic (see the sketch after this list)
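Custom spans for business logic are typically created with Micrometer's Observation API, which Spring Boot bridges to OpenTelemetry when tracing is enabled. The following sketch is an illustrative assumption (ChatService and the observation name chat.answer are not taken from this repository):

import io.micrometer.observation.Observation;
import io.micrometer.observation.ObservationRegistry;
import org.springframework.stereotype.Service;

@Service
class ChatService {

    private final ObservationRegistry observationRegistry;

    ChatService(ObservationRegistry observationRegistry) {
        this.observationRegistry = observationRegistry;
    }

    String answer(String question) {
        // Wrap the business logic in a custom observation; with tracing enabled,
        // it appears as an extra span in OPIK alongside the HTTP and OpenAI spans.
        return Observation.createNotStarted("chat.answer", observationRegistry)
                .lowCardinalityKeyValue("chat.endpoint", "ask-me")
                .observe(() -> callModel(question));
    }

    private String callModel(String question) {
        // ...delegate to the Spring AI ChatClient here...
        return "response";
    }
}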
To view the traces in OPIK:
- Navigate to the Comet OPIK Dashboard
- Select your workspace and project
- View real-time traces and metrics
- Analyze performance and debugging information
Check application health:
curl http://localhost:8085/actuator/health

Error: Failed to configure a DataSource
- Solution: This shouldn't occur with this application as it doesn't use a database
Error: OpenAI API key not found
- Solution: Ensure the OPENAI_API_KEY environment variable is set correctly
Error: Failed to export telemetry data
- Solution: Check your OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS configuration
- Verify: Your OPIK API key is valid and has proper permissions
Error: Port 8085 is already in use
- Solution: Change the port in application.yml:
  server:
    port: 8086
Error: Rate limit exceeded
- Solution: Check your OpenAI usage limits and billing
Error: Invalid API key
- Solution: Verify your OpenAI API key is correct and active
To enable debug logging, add to application.yml:
logging:
  level:
    com.comet.opik.examples: DEBUG
    org.springframework.ai: DEBUG

To run without telemetry export, set:

export OTEL_EXPORTER_OTLP_ENDPOINT=""

Key dependencies:
- Spring Boot 3.4.3
- Spring AI 1.0.0
- OpenTelemetry Instrumentation
- Micrometer Tracing
- Spring Boot Actuator
For development purposes, you can modify the OpenAI model and parameters in application.yml:
spring:
  ai:
    openai:
      chat:
        options:
          model: gpt-3.5-turbo   # or gpt-4, gpt-4o-mini
          temperature: 0.3       # Lower for more deterministic responses