comet-ml/opik-springai-demo

Spring AI Chatbot REST Application

This is a simple Spring AI Chatbot REST application that uses the OpenAI API to answer questions. It demonstrates how to monitor a Spring AI chatbot using OpenTelemetry and an OPIK server for observability and tracing.

Table of Contents

  • Prerequisites
  • Installation
  • Configuration
  • Running the Application
  • Manual Testing
  • API Endpoints
  • Monitoring and Observability
  • Troubleshooting
  • Additional Information

Prerequisites

Before running this application, ensure you have the following installed:

  • Java 21 or higher
  • Maven 3.6+ for dependency management and building
  • OpenAI API Key - Sign up at OpenAI Platform
  • OPIK API Key - Sign up at Comet OPIK

Installation

1. Clone the Repository

git clone [email protected]:comet-ml/opik-springai-demo.git
cd opik-springai-demo

2. Verify Java Installation

java --version

Ensure you have Java 21 or higher installed.

3. Verify Maven Installation

mvn --version

4. Install Dependencies

mvn clean install

Configuration

For additional details, refer to the OPIK documentation.

Environment Variables

The application requires the following environment variables to be set:

Required Variables

  • OPENAI_API_KEY: Your OpenAI API key
  • OTEL_EXPORTER_OTLP_ENDPOINT: OPIK OpenTelemetry endpoint
  • OTEL_EXPORTER_OTLP_HEADERS: Authorization headers for OPIK

Setting Environment Variables: Using Cloud OPIK (Comet)

On macOS/Linux:

export OPENAI_API_KEY="sk-your-openai-api-key-here"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://www.comet.com/opik/api/v1/private/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>"

On Windows (Command Prompt):

set OPENAI_API_KEY=sk-your-openai-api-key-here
set OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
set OTEL_EXPORTER_OTLP_HEADERS=Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>

On Windows (PowerShell):

$env:OPENAI_API_KEY="sk-your-openai-api-key-here"
$env:OTEL_EXPORTER_OTLP_ENDPOINT="https://www.comet.com/opik/api/v1/private/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS="Authorization=<your-opik-api-key>,Comet-Workspace=default,projectName=<your-project-name>"

Setting Environment Variables: Local OPIK Server

If you're running OPIK locally (the local OPIK frontend listens on port 5173 by default), use these environment variables instead:

On macOS/Linux:

export OPENAI_API_KEY="sk-your-openai-api-key-here"
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:5173/api/v1/private/otel"
export OTEL_EXPORTER_OTLP_HEADERS="Comet-Workspace=default,projectName=<your-project-name>"

On Windows (Command Prompt):

set OPENAI_API_KEY=sk-your-openai-api-key-here
set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel
set OTEL_EXPORTER_OTLP_HEADERS=Comet-Workspace=default,projectName=<your-project-name>

On Windows (PowerShell):

$env:OPENAI_API_KEY="sk-your-openai-api-key-here"
$env:OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:5173/api/v1/private/otel"
$env:OTEL_EXPORTER_OTLP_HEADERS="Comet-Workspace=default,projectName=<your-project-name>"

Note: When using a local OPIK server, you don't need the Authorization header in OTEL_EXPORTER_OTLP_HEADERS.

Application Configuration

The application is configured via src/main/resources/application.yml:

  • Server Port: 8085 (customizable)
  • OpenAI Model: gpt-4o (customizable)
  • Temperature: 0.7 (controls response creativity)
  • Tracing: All requests are traced (100% sampling)
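These defaults would correspond to something like the following application.yml fragment. This is a sketch using standard Spring Boot and Spring AI property names; the repository's actual file is authoritative:

```yaml
server:
  port: 8085

spring:
  ai:
    openai:
      chat:
        options:
          model: gpt-4o
          temperature: 0.7

management:
  tracing:
    sampling:
      probability: 1.0  # trace 100% of requests
```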

Running the Application

Method 1: Using Maven Spring Boot Plugin

mvn spring-boot:run

Method 2: Using JAR File

mvn clean package
java -jar target/spring-ai-demo-opik-0.0.1-SNAPSHOT.jar

Method 3: Development Mode with Auto-reload

mvn spring-boot:run -Dspring-boot.run.jvmArguments="-Dspring.devtools.restart.enabled=true"

The application will start on http://localhost:8085

Manual Testing

1. Test with curl (GET Request)

Basic question:

curl "http://localhost:8085/api/chat/ask-me?question=What%20is%20Spring%20AI%3F"

Complex question with URL encoding:

curl --get --data-urlencode "question=How to integrate Spring AI with OpenAI for building chatbots?" http://localhost:8085/api/chat/ask-me

Default question (if no parameter provided):

curl "http://localhost:8085/api/chat/ask-me"

2. Test with curl (POST Request)

Simple POST:

curl -X POST \
  -H "Content-Type: text/plain" \
  -d "Explain the benefits of using OpenTelemetry for monitoring" \
  http://localhost:8085/api/chat/ask

POST with JSON (if needed):

curl -X POST \
  -H "Content-Type: application/json" \
  -d '"What are the key features of Spring Boot 3.4?"' \
  http://localhost:8085/api/chat/ask

Enhanced POST with tags and metadata (JSON):

curl -X POST \
  -H "Content-Type: application/json" \
  -d '{
    "question": "What are the benefits of using Spring AI?",
    "tags": ["spring", "ai", "tutorial"],
    "metadata": {
      "userId": "user123",
      "sessionId": "session456",
      "category": "educational"
    }
  }' \
  http://localhost:8085/api/chat/ask-enhanced

Enhanced POST with parameters:

curl -X POST \
  "http://localhost:8085/api/chat/ask-with-params?question=What%20is%20OpenTelemetry?&tags=monitoring,observability&metadata=userId:123,sessionId:abc"

3. Test with Postman

GET Request:

  • Method: GET
  • URL: http://localhost:8085/api/chat/ask-me
  • Query Parameters:
    • Key: question
    • Value: How to integrate Spring AI with OpenAI?

POST Request:

  • Method: POST
  • URL: http://localhost:8085/api/chat/ask
  • Headers: Content-Type: text/plain
  • Body: What is the difference between Spring AI and LangChain?

Enhanced POST Request (JSON with tags and metadata):

  • Method: POST
  • URL: http://localhost:8085/api/chat/ask-enhanced
  • Headers: Content-Type: application/json
  • Body:
    {
      "question": "What are the benefits of using Spring AI?",
      "tags": ["spring", "ai", "tutorial"],
      "metadata": {
        "userId": "user123",
        "sessionId": "session456",
        "category": "educational"
      }
    }

Enhanced POST Request (URL parameters):

  • Method: POST
  • URL: http://localhost:8085/api/chat/ask-with-params
  • Query Parameters:
    • Key: question, Value: What is OpenTelemetry?
    • Key: tags, Value: monitoring,observability,tracing
    • Key: metadata, Value: userId:123,sessionId:abc,environment:dev

4. Test with HTTPie

http GET localhost:8085/api/chat/ask-me question=="What is machine learning?"

5. Test with Browser

Open your browser and navigate to:

http://localhost:8085/api/chat/ask-me?question=Tell me about Spring Framework

API Endpoints

GET /api/chat/ask-me

  • Description: Ask a question using query parameter
  • Parameters:
    • question (optional): Your question (defaults to "Tell me a joke")
  • Example: /api/chat/ask-me?question=What is AI?

POST /api/chat/ask

  • Description: Ask a question using request body
  • Content-Type: text/plain
  • Body: Your question as plain text
  • Example:
    POST /api/chat/ask
    Content-Type: text/plain
    
    What is Spring AI?
    

POST /api/chat/ask-enhanced

  • Description: Ask a question with tags and metadata for enhanced tracing
  • Content-Type: application/json
  • Body: JSON object with question, tags, and metadata
  • Request Format:
    {
      "question": "Your question here",
      "tags": ["tag1", "tag2", "tag3"],
      "metadata": {
        "key1": "value1",
        "key2": "value2"
      }
    }
  • Example:
    POST /api/chat/ask-enhanced
    Content-Type: application/json
    
    {
      "question": "What are the benefits of using Spring AI?",
      "tags": ["spring", "ai", "tutorial"],
      "metadata": {
        "userId": "user123",
        "sessionId": "session456",
        "category": "educational"
      }
    }
    

POST /api/chat/ask-with-params

  • Description: Ask a question with tags and metadata using URL parameters
  • Parameters:
    • question (required): Your question
    • tags (optional): Comma-separated list of tags
    • metadata (optional): Key-value pairs in format key1:value1,key2:value2
  • Example: /api/chat/ask-with-params?question=What is OpenTelemetry?&tags=monitoring,observability&metadata=userId:123,sessionId:abc
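The metadata format above (key1:value1,key2:value2) can be parsed into a map along the lines of the following sketch. This is illustrative only — the class name MetadataParser and its parse method are hypothetical, not the application's actual code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class MetadataParser {

    // Splits "key1:value1,key2:value2" into an ordered map.
    // Malformed pairs (no colon) are skipped rather than rejected.
    static Map<String, String> parse(String raw) {
        Map<String, String> out = new LinkedHashMap<>();
        if (raw == null || raw.isBlank()) {
            return out;
        }
        for (String pair : raw.split(",")) {
            String[] kv = pair.split(":", 2); // limit of 2 keeps any colons inside values intact
            if (kv.length == 2) {
                out.put(kv[0].trim(), kv[1].trim());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> m = parse("userId:123,sessionId:abc,environment:dev");
        System.out.println(m); // prints {userId=123, sessionId=abc, environment=dev}
    }
}
```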

Monitoring and Observability

OpenTelemetry Integration

The application automatically captures:

  • HTTP requests and responses
  • OpenAI API calls and responses
  • Application metrics and traces
  • Custom spans for business logic

Viewing Traces in OPIK

  1. Navigate to Comet OPIK Dashboard
  2. Select your workspace and project
  3. View real-time traces and metrics
  4. Analyze performance and debugging information

Health Check

Check application health:

curl http://localhost:8085/actuator/health

Troubleshooting

Common Issues

1. Application Won't Start

Error: Failed to configure a DataSource

  • Solution: This shouldn't occur with this application as it doesn't use a database

Error: OpenAI API key not found

  • Solution: Ensure OPENAI_API_KEY environment variable is set correctly

2. OpenTelemetry Issues

Error: Failed to export telemetry data

  • Solution: Check your OTEL_EXPORTER_OTLP_ENDPOINT and OTEL_EXPORTER_OTLP_HEADERS configuration
  • Verify: Your OPIK API key is valid and has proper permissions

3. Port Already in Use

Error: Port 8085 is already in use

  • Solution: Change the port in application.yml:
    server:
      port: 8086

4. OpenAI API Errors

Error: Rate limit exceeded

  • Solution: Check your OpenAI usage limits and billing

Error: Invalid API key

  • Solution: Verify your OpenAI API key is correct and active

Logs

To enable debug logging, add to application.yml:

logging:
  level:
    com.comet.opik.examples: DEBUG
    org.springframework.ai: DEBUG

Testing Without OpenTelemetry

To run without telemetry export, set:

export OTEL_EXPORTER_OTLP_ENDPOINT=""

Additional Information

Dependencies Used

  • Spring Boot 3.4.3
  • Spring AI 1.0.0
  • OpenTelemetry Instrumentation
  • Micrometer Tracing
  • Spring Boot Actuator

Development

For development purposes, you can modify the OpenAI model and parameters in application.yml:

spring:
  ai:
    openai:
      chat:
        options:
          model: gpt-3.5-turbo  # or gpt-4, gpt-4o-mini
          temperature: 0.3      # Lower for more deterministic responses
