A conversational AI agent built with Bun that runs in the terminal. This project uses LangGraph to manage conversation state, Groq for ultra-fast LLM inference, and Tavily to give the AI access to real-time information via web search.
## Features

- Stateful Conversation: Remembers previous interactions within the session using `MemorySaver`.
- Web Search Capability: Automatically decides when to search the web using the Tavily API to answer questions about current events or specific topics.
- High-Performance LLM: Powered by `openai/gpt-oss-120b` via the Groq API.
- Agentic Workflow: Uses a cyclical graph structure (Agent ↔ Tools) to refine answers.
## Prerequisites

- Bun (version 1.0+ recommended)
- API keys:
  - A Groq API key
  - A Tavily API key
## Installation

1. Clone the repository (or create your project folder):

   ```bash
   mkdir langgraph-agent
   cd langgraph-agent
   ```

2. Initialize the project:

   ```bash
   bun init
   ```

3. Install dependencies:

   ```bash
   bun add @langchain/groq @langchain/langgraph @langchain/tavily
   ```
## Configuration

Set up your environment variables. Bun automatically loads environment variables from `.env` files, so no extra configuration code is needed.

Create a file named `.env` in the root directory:

```env
GROQ_API_KEY=your_groq_api_key_here
TAVILY_API_KEY=your_tavily_api_key_here
```
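Since Bun exposes `.env` values on `process.env`, a small guard can fail fast with a clear message when a key is missing instead of letting an API call error out later. This is a minimal sketch; the helper name `requireEnv` is ours, and only the key names come from the `.env` file above.

```javascript
// Hypothetical helper: read a required key from process.env (populated by
// Bun from .env) and throw immediately if it is absent or empty.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Usage: `const groqKey = requireEnv("GROQ_API_KEY");` near the top of `index.js` surfaces a misconfigured environment before any model call is made.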
## Usage

Start the agent:

```bash
bun run index.js
```

- Type your prompt: the AI will respond. If it needs to look up information (e.g., "What is the weather in Tokyo?"), it will trigger the Tavily tool automatically.
- Exit: type `/bye` to close the application.
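The exit check the terminal loop performs on each prompt can be sketched as below. The `/bye` command comes from the source; the helper name and the trim/lowercase normalization are our assumptions about how a forgiving REPL might treat input.

```javascript
// Hypothetical exit check for the terminal loop: accept "/bye" regardless
// of surrounding whitespace or letter case.
function isExitCommand(input) {
  return input.trim().toLowerCase() === "/bye";
}
```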
## Architecture

This project uses LangGraph to define a state machine for the agent:

```mermaid
graph LR
    Start --> Agent
    Agent -- tool_calls present --> Tools
    Tools --> Agent
    Agent -- no tool_calls --> End
```
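The conditional edge in the diagram above boils down to one decision: route to the tools node when the last AI message carries tool calls, otherwise finish. A sketch of that routing function (the name `routeAfterAgent` is illustrative; LangGraph also ships prebuilt helpers for this):

```javascript
// Decide the next node after the agent runs: "tools" if the model asked to
// call a tool, otherwise the graph's end marker.
function routeAfterAgent(lastMessage) {
  const hasToolCalls =
    Array.isArray(lastMessage.tool_calls) && lastMessage.tool_calls.length > 0;
  return hasToolCalls ? "tools" : "__end__";
}
```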
- Agent Node: Calls the LLM (Groq) with the current conversation history.
- Conditional Logic:
  - If the LLM generates a tool call (e.g., it wants to search), the flow moves to the `tools` node.
  - If the LLM generates a final answer, the flow stops and the response is printed to the user.
- Tool Node: Executes the Tavily search and feeds the results back to the Agent.
- Checkpointer: A `MemorySaver` persists the conversation state (using `thread_id: '1'`), allowing the bot to remember context from previous turns.
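What the checkpointer provides can be illustrated with a simplified stand-in (this is not the real `MemorySaver`, just a sketch of the idea): state is keyed by `thread_id`, so turns sent under the same id share one history while other threads stay isolated.

```javascript
// Toy checkpointer: a map from thread_id to that thread's message history.
const checkpoints = new Map();

function appendTurn(threadId, message) {
  const history = checkpoints.get(threadId) ?? [];
  history.push(message);
  checkpoints.set(threadId, history);
  return history;
}
```

In the real graph, passing `{ configurable: { thread_id: "1" } }` when invoking the compiled graph plays the same role: every turn with that id resumes from the saved state.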
## Dependencies

- `@langchain/groq`: Integration for Groq models.
- `@langchain/langgraph`: Library for building stateful, multi-actor applications with LLMs.
- `@langchain/tavily`: Search tool integration.