Feature Request: Add awaitingUserInput hook type #1128

@xaqrox

Describe the feature or problem you'd like to solve

Currently, the userPromptSubmitted hook fires after the user submits input, but there's no hook that fires when the CLI is waiting for user input. This creates a gap for use cases that need to trigger actions when the agent is ready for interaction.

Proposed solution

Add a new hook type, awaitingUserInput, that fires when:

  • The CLI is waiting for user input
  • The agent has finished generating its response text
  • Control is returned to the user

This would complement the existing userPromptSubmitted hook:

  • userPromptSubmitted - fires after the user hits Enter
  • awaitingUserInput (new) - fires when the CLI awaits input

The hook should receive JSON input indicating the context:

{
  "inputType": "normal" | "ask_user",
  "sessionId": "string",
  "timestamp": "ISO8601"
}

This allows hooks to distinguish between:

  • Normal conversation flow (after response generation completes)
  • Interactive prompts (when the ask_user tool displays choices or a question)

Benefits:

  • Enables audio/visual notifications when the agent needs attention
  • Improves accessibility for users relying on non-visual feedback
  • Allows different handling for normal vs. interactive (ask_user) inputs
  • Supports session automation and logging

Example prompts or workflows

Workflow 1: Audio accessibility notifications

User sets up hook to play audio cues:

  • Subtle sound when normal input is ready
  • Urgent sound when ask_user requires immediate response
  • The hook distinguishes between these using the inputType field

Workflow 2: Window focus management

User sets up hook to automatically focus terminal window when agent awaits input, particularly for ask_user interactions that require immediate decision.
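
A minimal sketch of such a hook on macOS, assuming the proposed JSON payload arrives on stdin (the osascript call and the Terminal application name are macOS-specific assumptions):

#!/bin/bash
# Hypothetical focus hook (macOS): bring the terminal window to the front.
INPUT=$(cat)
INPUT_TYPE=$(echo "$INPUT" | jq -r '.inputType')

# Only steal focus for interactive ask_user questions; stay passive otherwise.
if [[ "$INPUT_TYPE" == "ask_user" ]]; then
  osascript -e 'tell application "Terminal" to activate'
fi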

Workflow 3: Session metrics logging

User tracks time spent in different states (see the sketch after this list):

  • Response generation time (postToolUse → awaitingUserInput)
  • User thinking time (awaitingUserInput → userPromptSubmitted)
  • Interactive vs normal input patterns
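
A rough sketch of such a logging hook, assuming the proposed payload fields; the CSV log path under $HOME/.copilot is illustrative:

#!/bin/bash
# Hypothetical metrics hook: append one CSV row per awaitingUserInput event.
INPUT=$(cat)
SESSION_ID=$(echo "$INPUT" | jq -r '.sessionId')
INPUT_TYPE=$(echo "$INPUT" | jq -r '.inputType')
TIMESTAMP=$(echo "$INPUT" | jq -r '.timestamp')

# Pairing these rows with rows logged from userPromptSubmitted yields
# per-session "user thinking time"; gaps from postToolUse rows give
# response generation time.
echo "$TIMESTAMP,awaitingUserInput,$SESSION_ID,$INPUT_TYPE" >> "$HOME/.copilot/session-metrics.csv"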

Workflow 4: IDE integration

IDE extension listens for awaitingUserInput hook to update status bar, show notifications, or adjust UI state when Copilot CLI needs attention.

Workflow 5: Terminal bell notifications

Simple terminal bell when agent is idle and awaiting input, useful for multitasking users who want passive notifications.
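
This one barely needs a script. A sketch, writing to /dev/tty on the assumption that the CLI may capture hook stdout:

#!/bin/bash
# Hypothetical bell hook: ring the terminal bell when the agent goes idle.
printf '\a' > /dev/tty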

Example configuration

{
  "version": 1,
  "hooks": {
    "awaitingUserInput": [
      {
        "type": "command",
        "bash": "./notify-ready.sh",
        "cwd": "$HOME/.copilot/hooks",
        "timeoutSec": 2
      }
    ]
  }
}

Example hook script:

#!/bin/bash
INPUT=$(cat)
INPUT_TYPE=$(echo "$INPUT" | jq -r '.inputType')

if [[ "$INPUT_TYPE" == "ask_user" ]]; then
  # Play urgent sound for interactive questions
  afplay /System/Library/Sounds/Sosumi.aiff &
else
  # Play subtle sound for normal input
  afplay /System/Library/Sounds/Ping.aiff &
fi

Additional context

The gap this addresses:

  1. Tool completes → sound plays (via postToolUse hook) ✅
  2. Agent generates response text → no hook available ❌
  3. Input awaited → no hook available ❌
