AI Search Agent with LangGraph, Groq, and Tavily

A conversational AI agent built with Bun that runs in the terminal. This project uses LangGraph to manage conversation state, Groq for ultra-fast LLM inference, and Tavily to give the AI access to real-time information via web search.

🚀 Features

  • Stateful Conversation: Remembers previous interactions within the session using MemorySaver.
  • Web Search Capability: Automatically decides when to search the web using the Tavily API to answer questions about current events or specific topics.
  • High-Performance LLM: Powered by openai/gpt-oss-120b via the Groq API.
  • Agentic Workflow: Uses a cyclical graph structure (Agent ↔ Tools) to refine answers.

📋 Prerequisites

  • Bun installed on your machine
  • A Groq API key (for LLM inference)
  • A Tavily API key (for web search)

🛠️ Installation

  1. Clone the repository (or create your project folder):

    mkdir langgraph-agent
    cd langgraph-agent
  2. Initialize the project:

    bun init
  3. Install dependencies:

    bun add @langchain/groq @langchain/langgraph @langchain/tavily

⚙️ Configuration

Set up your environment variables. Bun automatically loads environment variables from .env files, so no extra configuration code is needed.

  1. Create a file named .env in the root directory:
    GROQ_API_KEY=your_groq_api_key_here
    TAVILY_API_KEY=your_tavily_api_key_here
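Since Bun populates process.env from .env on startup, it can help to fail fast if a key is missing. The helper below is a hypothetical sketch, not part of the project code:

```javascript
// Hypothetical helper: throw immediately if a required environment
// variable is missing, instead of failing later on an API call.
// Bun loads .env automatically, so process.env is already populated.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage (illustrative):
// const groqKey = requireEnv("GROQ_API_KEY");
// const tavilyKey = requireEnv("TAVILY_API_KEY");
```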

🏃 Usage

bun run index.js

Interaction

  • Type your prompt: The AI will respond. If it needs to look up information (e.g., "What is the weather in Tokyo?"), it will trigger the Tavily tool automatically.
  • Exit: Type /bye to close the application.

🧠 How It Works (The Graph)

This project uses LangGraph to define a state machine for the agent:

graph LR
    Start --> Agent
    Agent -- tool_calls present --> Tools
    Tools --> Agent
    Agent -- no tool_calls --> End
  1. Agent Node: Calls the LLM (Groq) with the current conversation history.
  2. Conditional Logic:
    • If the LLM generates a Tool Call (e.g., it wants to search), the flow moves to the tools node.
    • If the LLM generates a Final Answer, the flow stops and prints to the user.
  3. Tool Node: Executes the Tavily search and feeds the results back to the Agent.
  4. Checkpointer: A MemorySaver persists the conversation state (using thread_id: '1'), allowing the bot to remember context from previous turns in the session.

📦 Dependencies

  • @langchain/groq: Integration for Groq models.
  • @langchain/langgraph: Library for building stateful, multi-actor applications with LLMs.
  • @langchain/tavily: Search tool integration.

📄 License

MIT
