Developer toolkit for building shell-style commands in Go
The yupsh framework is a Go package for creating composable, pipeline-ready commands that mimic Unix shell utilities. Build commands that work seamlessly with pipes, handle I/O automatically, and compose into powerful data processing pipelines.
The framework handles all the plumbing so you can focus on your command's logic:
- ✅ Automatic I/O routing - stdin vs files handled transparently
- ✅ Pipeline composition - Commands connect naturally with Pipe()
- ✅ Context cancellation - Built-in timeout and cancellation support
- ✅ Type-safe flags - Strongly-typed options, no magic strings
- ✅ Zero boilerplate - Focus on processing logic, not I/O plumbing
```sh
go get github.com/gloo-foo/framework
```

Important: yupsh uses intentional naming where packages and their primary functions share the same name (e.g., grep.Grep(), sort.Sort(), cat.Cat()). This allows for two import styles:
```go
// Standard import - explicit package names
import "github.com/yupsh/grep"
cmd := grep.Grep("pattern")

// Dot import - cleaner for heavy usage
import . "github.com/yupsh/grep"
cmd := Grep("pattern") // Still clear it's the grep command
```

The naming is deliberate: even with dot imports, Grep() clearly indicates the grep command. This balances brevity with clarity.
Every command is its own independent Go module. This architectural choice enables:
- ✅ Extensibility - Add new commands without modifying the framework
- ✅ Individual iteration - Update commands independently, no monolith
- ✅ Third-party commands - Anyone can publish commands that work seamlessly
- ✅ Minimal dependencies - Only install the commands you actually use
- ✅ Version independence - Each command can evolve at its own pace
```sh
# Install only what you need
go get github.com/yupsh/grep
go get github.com/yupsh/sort
go get your-org/custom-command  # Third-party commands work the same way!
```

Even Pipe() is just another command - it composes commands together, but it's implemented using the same gloo.Command interface. This means pipes can be composed into other pipes, passed as parameters, and used anywhere a command can be used.
```go
// Pipe is just a command that composes other commands
subPipeline := gloo.Pipe(grep.Grep("ERROR"), cut.Cut(cut.Fields(1)))
mainPipeline := gloo.Pipe(cat.Cat("*.log"), subPipeline, sort.Sort())
```

Here's a simplified grep command to illustrate the key concepts:
```go
package mygrep

import (
    "strings"

    gloo "github.com/gloo-foo/framework"
)

type flags struct {
    IgnoreCase bool
}

type command gloo.Inputs[gloo.File, flags]

func Grep(pattern string, parameters ...any) gloo.Command {
    cmd := command(gloo.Initialize[gloo.File, flags](append(parameters, pattern)...))
    return cmd
}

func (c command) Executor() gloo.CommandExecutor {
    return gloo.Inputs[gloo.File, flags](c).Wrap(
        gloo.LineTransform(func(line string) (string, bool) {
            pattern := c.Positional[0]
            haystack := line
            if c.Flags.IgnoreCase {
                // Case-fold only the comparison; emit the original line
                haystack = strings.ToLower(line)
                pattern = strings.ToLower(pattern)
            }
            return line, strings.Contains(haystack, pattern)
        }).Executor(),
    )
}

func IgnoreCase(f *flags) { f.IgnoreCase = true }
```

See full implementation: grep/command.go
```go
// Read from stdin
cmd := mygrep.Grep("ERROR")
gloo.Run(cmd)

// Read from files
cmd := mygrep.Grep("ERROR", gloo.File("log.txt"), mygrep.IgnoreCase)
gloo.Run(cmd)

// Use in pipelines
pipeline := gloo.Pipe(
    cat.Cat("*.log"),
    mygrep.Grep("ERROR", mygrep.IgnoreCase),
    sort.Sort(),
)
gloo.MustRun(pipeline)
```

The framework uses types to define behavior:
```go
gloo.File      // Files to read (automatically opened)
io.Reader      // Direct readers (caller manages lifecycle)
// Custom types: commands can define their own (DirPath, Pattern, etc.)
```

Example:
```go
// File-reading command (cat, grep, sort, wc)
type command gloo.Inputs[gloo.File, flags]

// Reader-based command (library API)
type command gloo.Inputs[io.Reader, flags]

// Directory command with custom type (find, ls)
type DirPath string
type command gloo.Inputs[DirPath, flags]
```

One function handles all initialization:
```go
func Sort(parameters ...any) gloo.Command {
    // Initialize automatically:
    // - Opens files when T is gloo.File
    // - Parses flags from parameters
    // - Sets up stdin if no files provided
    cmd := command(gloo.Initialize[gloo.File, flags](parameters...))

    // Optional: set defaults
    if cmd.Flags.Delimiter == "" {
        cmd.Flags.Delimiter = " "
    }
    return cmd
}
```

Wrap() routes input automatically:
```go
func (c command) Executor() gloo.CommandExecutor {
    return gloo.Inputs[gloo.File, flags](c).Wrap(
        // Your executor - framework handles stdin vs files
        gloo.LineTransform(c.processLine).Executor(),
    )
}
```

What Wrap() does:
- Files provided → reads from those files
- Readers provided → reads from those readers
- Otherwise → reads from stdin parameter
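For example, the same command can be driven from each of these sources. This sketch reuses the simplified mygrep package from above; the module path and file name are placeholders:

```go
package main

import (
    gloo "github.com/gloo-foo/framework"
    "github.com/yupsh/cat"

    mygrep "example.com/mygrep" // the simplified grep shown earlier; placeholder module path
)

func main() {
    // A gloo.File positional was given, so Wrap() reads from the opened file.
    gloo.MustRun(mygrep.Grep("ERROR", gloo.File("app.log")))

    // Inside a pipeline, Wrap() reads from the upstream command's output.
    gloo.MustRun(gloo.Pipe(cat.Cat("app.log"), mygrep.Grep("ERROR")))

    // No files or readers were provided, so Wrap() falls back to stdin.
    gloo.MustRun(mygrep.Grep("ERROR"))
}
```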
Common processing patterns built-in:
```go
// Line-by-line transformation (grep, cut, tr)
gloo.LineTransform(func(line string) (string, bool))

// Line transformation with state (nl, head, uniq)
gloo.StatefulLineTransform(func(lineNum int64, line string) (string, bool))

// Accumulate all lines, then process (sort, tac, shuf)
gloo.AccumulateAndProcess(func(lines []string) []string)

// Accumulate with custom output (wc)
gloo.AccumulateAndOutput(func(lines []string, stdout io.Writer) error)
```

Commands that process files line-by-line use gloo.File as their input type. The framework handles file opening/closing automatically.
Pattern:
- Use gloo.Inputs[gloo.File, flags] as the command type
- Initialize opens files automatically
- Wrap() routes input from files or stdin
- Focus on processing logic only

Examples:
- sort/command.go - Accumulates and sorts lines
- grep/command.go - Line-by-line filtering
- head/command.go - Stateful line processing
Commands that traverse directories define custom types for their inputs instead of using gloo.File. This gives the command full control over how to handle directory paths.
Pattern:
- Define a custom type like type DirPath string
- Use gloo.Inputs[DirPath, flags] as the command type
- Initialize parses arguments (no automatic file opening)
- Implement custom traversal logic in Executor()

Examples:
- find/command.go - Directory traversal with filtering
- ls/command.go - Directory listing
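To make the pattern concrete, here is a minimal sketch of a directory-style command. It assumes the CommandExecutor signature shown in the testing examples (ctx, stdin, stdout, stderr) and that the DirPath arguments are available through the Positional field, the way the pattern string is in the grep example; the package and the Walk/FilesOnly names are illustrative, not part of any yupsh module.

```go
package mywalk

import (
    "context"
    "fmt"
    "io"
    "io/fs"
    "path/filepath"

    gloo "github.com/gloo-foo/framework"
)

// DirPath is a command-specific positional type; Initialize parses it
// but does not try to open it the way it opens gloo.File values.
type DirPath string

type flags struct {
    FilesOnly bool
}

type command gloo.Inputs[DirPath, flags]

func Walk(parameters ...any) gloo.Command {
    return command(gloo.Initialize[DirPath, flags](parameters...))
}

func FilesOnly(f *flags) { f.FilesOnly = true }

func (c command) Executor() gloo.CommandExecutor {
    // No Wrap() here: nothing needs routing from stdin or files,
    // the command implements its own traversal.
    return func(ctx context.Context, stdin io.Reader, stdout, stderr io.Writer) error {
        roots := c.Positional // assumed to hold the DirPath arguments
        if len(roots) == 0 {
            roots = append(roots, ".")
        }
        for _, root := range roots {
            err := filepath.WalkDir(string(root), func(path string, d fs.DirEntry, err error) error {
                if err != nil {
                    return err
                }
                if ctx.Err() != nil { // honor cancellation during long walks
                    return ctx.Err()
                }
                if c.Flags.FilesOnly && d.IsDir() {
                    return nil
                }
                fmt.Fprintln(stdout, path)
                return nil
            })
            if err != nil {
                return err
            }
        }
        return nil
    }
}
```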
Reader-based commands accept io.Reader directly for library use. Unlike gloo.File, readers are managed by the caller.
Pattern:
- Use gloo.Inputs[io.Reader, flags] as the command type
- Initialize wraps provided readers
- Caller manages reader lifecycle

Example:
- cat/command.go - Can accept files or readers
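A brief usage sketch, assuming readers are passed as positional parameters (which is how Initialize's io.Reader handling is described above):

```go
// The caller creates the reader and controls its lifetime;
// the framework only reads from it.
r := strings.NewReader("three\none\ntwo\n")
gloo.MustRun(gloo.Pipe(cat.Cat(r), sort.Sort()))
```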
Pipelines are built by composing commands with gloo.Pipe():

```go
// Simple two-command pipeline
pipeline := gloo.Pipe(
    grep.Grep("ERROR"),
    wc.Wc(wc.Lines),
)

// Multi-stage pipeline
pipeline := gloo.Pipe(
    find.Find(".", find.Name("*.log")),
    grep.Grep("ERROR", grep.IgnoreCase),
    cut.Cut(cut.Delimiter(":"), cut.Fields(1, 3)),
    sort.Sort(sort.Unique),
    head.Head(head.Lines(10)),
)
```

```go
// Execute with standard I/O
err := gloo.Run(pipeline)

// Execute with context for timeouts/cancellation
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
err := gloo.RunWithContext(ctx, pipeline)

// For tests/examples - panic on error
gloo.MustRun(pipeline)

// Custom I/O (for command developers/testing)
ctx := context.Background()
var buf bytes.Buffer
err := pipeline.Executor()(ctx, os.Stdin, &buf, os.Stderr)
result := buf.String()
```

gloo.Initialize parses parameters and sets up command state:
```go
// T is the positional type (gloo.File, io.Reader, custom types, etc.)
// O is the flags struct type
cmd := gloo.Initialize[gloo.File, flags](parameters...)
```

Behavior based on T:
- gloo.File → Opens files, or uses stdin if none
- io.Reader → Wraps provided readers
- Custom types → Just parses (commands define their own types as needed)
Wrap() wraps an executor to automatically route input:
```go
func (c command) Executor() gloo.CommandExecutor {
    return gloo.Inputs[gloo.File, flags](c).Wrap(executor)
}
```

Routing logic:
- If files opened → read from files
- If readers provided → read from readers
- Otherwise → read from stdin parameter
gloo.LineTransform processes each line independently:
```go
executor := gloo.LineTransform(func(line string) (string, bool) {
    // Transform line
    processed := transform(line)
    return processed, true // emit=true includes in output
}).Executor()
```

gloo.StatefulLineTransform processes lines with access to the line number:
```go
executor := gloo.StatefulLineTransform(func(lineNum int64, line string) (string, bool) {
    if lineNum <= 10 { // Only first 10 lines
        return line, true
    }
    return "", false
}).Executor()
```

gloo.AccumulateAndProcess reads all lines, then processes them:
```go
executor := gloo.AccumulateAndProcess(func(lines []string) []string {
    sort.Strings(lines) // Process all at once
    return lines
}).Executor()
```

gloo.AccumulateAndOutput reads all lines and writes custom output:
```go
executor := gloo.AccumulateAndOutput(func(lines []string, stdout io.Writer) error {
    count := len(lines)
    fmt.Fprintf(stdout, "Total: %d\n", count)
    return nil
}).Executor()
```

Flags are defined as a plain struct:

```go
type flags struct {
    Count      bool
    IgnoreCase bool
    MaxLines   int
    Delimiter  string
}
```

Flag constructors are exported functions that set fields on that struct:

```go
// Boolean flag
func Count(f *flags) { f.Count = true }

// Parameterized flag
func MaxLines(n int) func(*flags) {
    return func(f *flags) { f.MaxLines = n }
}

func Delimiter(d string) func(*flags) {
    return func(f *flags) { f.Delimiter = d }
}
```

```go
// Command usage
cmd := grep.Grep("pattern",
    grep.IgnoreCase,       // Boolean flag
    grep.MaxLines(100),    // Parameterized flag
    grep.Delimiter(","),   // String parameter
    gloo.File("data.txt"), // Positional argument
)

// Access in executor
func (c command) Executor() gloo.CommandExecutor {
    return gloo.Inputs[gloo.File, flags](c).Wrap(
        gloo.LineTransform(func(line string) (string, bool) {
            if c.Flags.IgnoreCase {
                line = strings.ToLower(line)
            }
            // ... process line
            return line, true
        }).Executor(),
    )
}
```

Example tests double as documentation:

```go
func ExampleGrep() {
    cmd := grep.Grep("error", gloo.File("input.txt"))
    gloo.MustRun(cmd)
    // Output: error line
}
```

```go
func TestGrepBasic(t *testing.T) {
    // For testing with custom I/O, use Executor() directly
    ctx := context.Background()
    input := strings.NewReader("foo\nbar\nbaz")
    var output bytes.Buffer

    cmd := grep.Grep("bar")
    err := cmd.Executor()(ctx, input, &output, io.Discard)
    if err != nil {
        t.Fatalf("unexpected error: %v", err)
    }

    expected := "bar\n"
    if output.String() != expected {
        t.Errorf("got %q, want %q", output.String(), expected)
    }
}
```

```go
func TestGrepFiles(t *testing.T) {
    // Create temp file
    tmpfile, _ := os.CreateTemp("", "test")
    defer os.Remove(tmpfile.Name())
    tmpfile.WriteString("line1\nERROR\nline3")
    tmpfile.Close()

    // For testing with custom output, use Executor() directly
    ctx := context.Background()
    var output bytes.Buffer

    cmd := grep.Grep("ERROR", gloo.File(tmpfile.Name()))
    err := cmd.Executor()(ctx, nil, &output, io.Discard)
    if err != nil {
        t.Fatalf("unexpected error: %v", err)
    }

    if !strings.Contains(output.String(), "ERROR") {
        t.Error("expected to find ERROR in output")
    }
}
```

```go
func TestGrepCancellation(t *testing.T) {
    ctx, cancel := context.WithCancel(context.Background())
    // Cancel immediately
    cancel()

    input := strings.NewReader("line1\nline2")
    cmd := grep.Grep("line")

    err := cmd.Executor()(ctx, input, io.Discard, io.Discard)
    if err != context.Canceled {
        t.Errorf("expected context.Canceled, got %v", err)
    }
}
```

Each command is its own Go module with its own repository, version, and release cycle:
```
github.com/yupsh/grep        # Independent module
github.com/yupsh/sort        # Independent module
github.com/yupsh/cat         # Independent module
github.com/yourname/custom   # Third-party module - works the same!
```
Benefits:
- Extensibility without modification - Add commands without touching the framework
- Decentralized development - Anyone can publish compatible commands
- Minimal dependencies - Users install only what they need
- Independent versioning - Commands evolve at different rates
- No monolith - Update one command without rebuilding everything
The gloo.Pipe() function creates a command by composing other commands. It's not special - it implements the same gloo.Command interface:
```go
// Pipe composes commands
pipeline := gloo.Pipe(grep.Grep("ERROR"), sort.Sort())

// Pipes can be nested (pipe is just another command)
subPipe := gloo.Pipe(grep.Grep("WARN"), cut.Cut(cut.Fields(1)))
mainPipe := gloo.Pipe(cat.Cat("*.log"), subPipe, head.Head(head.Lines(10)))

// Pipes can be passed around like any command
func processLogs(filter gloo.Command) gloo.Command {
    return gloo.Pipe(cat.Cat("*.log"), filter, sort.Sort())
}
```

This uniformity means anything you can do with a command, you can do with a pipeline.
Framework handles:
- I/O routing (stdin vs files)
- File lifecycle (open/close)
- Pipeline plumbing
- Context cancellation
Commands handle:
- Data processing logic only
- Flag interpretation
- Business logic
Type determines behavior - no manual plumbing needed:
```
gloo.File   → "Open this for reading"
io.Reader   → "Direct reader access"
CustomType  → "Command-specific semantics"
```

Everything is explicit:
```go
// Explicit types
cmd := command(gloo.Initialize[gloo.File, flags](parameters...))

// Explicit wrapping
return gloo.Inputs[gloo.File, flags](c).Wrap(executor)

// Clear what happens when
```

Commands work the same as CLI tools or library functions:
```go
// Simple execution
cmd := grep.Grep("ERROR", gloo.File("log.txt"))
gloo.Run(cmd)

// With context and timeout
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
defer cancel()
gloo.RunWithContext(ctx, cmd)

// In tests/examples - panic on error
gloo.MustRun(cmd)
```

For complete, real-world command implementations, see:
- grep/command.go - Pattern matching with line transforms
- sort/command.go - Accumulate and process pattern
- uniq/command.go - Stateful line processing with flags
- wc/command.go - Custom output formatting
- find/command.go - Custom types for directory traversal
- All command examples - Complete reference implementations
- Command Examples - Reference implementations
- Contributing Guide - How to contribute
- yupsh Project - Main repository
Do:
- Use semantic types (gloo.File or custom types like DirPath)
- Let Wrap() handle I/O routing
- Use helper executors for common patterns
- Support context cancellation
- Write example tests for documentation
- Keep processing logic pure (no I/O in business logic) - see the sketch below
- Use type-safe flag constructors
Don't:
- Manually check stdin vs files (use Wrap())
- Ignore context cancellation
- Load entire files into memory (use streaming)
- Use magic strings for flags
- Mix I/O and processing logic
- Forget to test edge cases
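As a sketch of the "keep processing logic pure" guideline, the line logic can live in a plain function and be unit tested without any framework plumbing. The redact command and its test are hypothetical, reusing the command/flags shape from the mygrep example:

```go
// redact is pure string logic: no I/O, no framework types, trivially testable.
func redact(line string) (string, bool) {
    return strings.ReplaceAll(line, "secret", "[REDACTED]"), true
}

// The executor only wires the pure function into the framework.
func (c command) Executor() gloo.CommandExecutor {
    return gloo.Inputs[gloo.File, flags](c).Wrap(
        gloo.LineTransform(redact).Executor(),
    )
}

// In the package's _test.go file: the business logic is tested directly.
func TestRedact(t *testing.T) {
    if got, _ := redact("a secret token"); got != "a [REDACTED] token" {
        t.Errorf("got %q", got)
    }
}
```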
MIT License - see LICENSE file for details.
Built for developers who want shell power with Go reliability.