Completed on 6 Jan 2026, 4:42 pm

Output

🔥 Powerful Pipe Commands in gvShell

1. Multi-Provider Pipelines

Chain AI providers so that each one processes the previous output:

gv> /pipe @claude "explain quicksort" | @openai "simplify this for beginners"

gv> /pipe @ollama "draft email" | @claude "make it professional"
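A pipeline like this can be thought of as a left-to-right fold: each stage's prompt gets the previous stage's output appended as context. A minimal Python sketch (the `claude` and `openai` functions below are local stand-ins, not real API clients):

```python
# Sketch of a provider pipeline: each stage receives the previous
# stage's output appended to its own prompt. The provider functions
# are stand-ins for real API calls.

def claude(prompt: str) -> str:
    return f"[claude] {prompt}"

def openai(prompt: str) -> str:
    return f"[openai] {prompt}"

def run_pipeline(stages):
    """stages: list of (provider_fn, prompt) pairs, piped left to right."""
    output = ""
    for provider, prompt in stages:
        # Feed the previous output as trailing context, like `| @next "..."`.
        full_prompt = f"{prompt}\n\n{output}" if output else prompt
        output = provider(full_prompt)
    return output

result = run_pipeline([
    (claude, "explain quicksort"),
    (openai, "simplify this for beginners"),
])
```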

2. Response References

Every AI response becomes a referenceable "file":

# Reference by number
gv> @claude explain monads
gv> convert $1 to Python

# Reference by provider name
gv> show $claude      # Last Claude response
gv> compare $1 and $2
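One way such a reference store could work is to keep every response in an ordered list (for `$1`, `$2`, ...) and also index the latest response per provider (for `$claude`). A hedged sketch, with hypothetical names:

```python
# Sketch of a response registry: every response is stored by sequence
# number ($1, $2, ...) and the latest is indexed per provider ($claude).

class ResponseStore:
    def __init__(self):
        self._by_number = []    # $1 is _by_number[0]
        self._by_provider = {}  # $claude -> latest claude response

    def record(self, provider: str, text: str) -> int:
        self._by_number.append(text)
        self._by_provider[provider] = text  # latest response wins
        return len(self._by_number)         # the $N that now refers to it

    def resolve(self, ref: str) -> str:
        """Resolve a ref like '$1' or '$claude'."""
        name = ref.lstrip("$")
        if name.isdigit():
            return self._by_number[int(name) - 1]
        return self._by_provider[name]

store = ResponseStore()
store.record("claude", "Monads are a design pattern...")
store.record("claude", "More simply: a monad wraps a value.")
```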

3. Shell Integration with $refs

Use AI responses in Unix commands:

# Word count of AI response
gv> ! echo $1 | wc -w

# Save response to file
gv> ! cat $claude > output.txt

# Pipe to system commands
gv> $1.code | python           # Execute code block
gv> $1 | pbcopy                  # Copy to clipboard (macOS)
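Mechanically, shell integration amounts to expanding each `$ref` token into the stored response text before handing the line to the system shell. A minimal sketch (the `responses` table is a stand-in; a real implementation would also shell-quote the substituted text):

```python
# Sketch of $ref expansion: replace each $N / $name token with the
# stored response, then run the expanded line through the shell.
import re
import subprocess

responses = {"claude": "hello world"}  # stand-in for the response store

def expand_refs(command: str) -> str:
    # \w+ matches both numeric ($1) and provider-name ($claude) refs.
    # An unknown ref raises KeyError here; a real shell would report it.
    return re.sub(r"\$(\w+)", lambda m: responses[m.group(1)], command)

def run_shell(command: str) -> str:
    expanded = expand_refs(command)
    done = subprocess.run(expanded, shell=True, capture_output=True, text=True)
    return done.stdout

word_count = run_shell("echo '$claude' | wc -w").strip()
```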

4. Compare & Benchmark

Compare responses from different providers:

# Get responses from both
gv> @claude explain monads
gv> @openai explain monads

# Diff them
gv> /diff $claude $openai
gv> /diff $1 $2 --side-by-side

# Benchmark all providers
gv> /benchmark "explain bubble sort"

PROVIDER   MODEL        LATENCY    TOKENS   COST
claude     sonnet       1.2s       234      $0.003
openai     gpt-4o       0.9s       198      $0.002
ollama     llama3       2.1s       256      $0.000
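A benchmark like this boils down to sending one prompt to every provider, timing each call, and tabulating the results. A sketch under those assumptions (providers are local stand-ins, and the token count here is a crude whitespace split, not a real tokenizer):

```python
# Sketch of a /benchmark-style loop: same prompt to every provider,
# per-call wall-clock timing, one result row per provider.
import time

def benchmark(prompt, providers):
    rows = []
    for name, fn in providers.items():
        start = time.perf_counter()
        reply = fn(prompt)
        latency = time.perf_counter() - start
        tokens = len(reply.split())  # crude stand-in for a tokenizer
        rows.append((name, f"{latency:.1f}s", tokens))
    return rows

providers = {
    "claude": lambda p: f"claude says: {p}",
    "ollama": lambda p: f"ollama says: {p} slowly",
}
table = benchmark("explain bubble sort", providers)
```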

5. Export & Import

Move data into and out of the session:

# Export to file
gv> /export $1 --format md > explanation.md
gv> /export session --format json > session.json

# Import for context
gv> /import error.log
gv> @claude what's causing this error?

gv> /import screenshot.png
gv> @claude what's in this image?
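Export is essentially serialization of the session transcript into the requested format. A minimal sketch, assuming a simple list-of-turns session shape (the field names are illustrative, not gvShell's actual schema):

```python
# Sketch of /export: serialize a session transcript to JSON or markdown.
# The session structure and field names here are assumptions.
import json

session = [
    {"provider": "claude", "prompt": "explain monads", "response": "Monads are..."},
]

def export(session, fmt):
    if fmt == "json":
        return json.dumps(session, indent=2)
    if fmt == "md":
        return "\n\n".join(
            f"## @{turn['provider']}: {turn['prompt']}\n\n{turn['response']}"
            for turn in session
        )
    raise ValueError(f"unknown format: {fmt}")

md = export(session, "md")
```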

6. Remote Multi-Agent

Execute commands across machines:

# Run on remote agent
gv> /remote dev-vm "@claude explain this codebase"
gv> /remote prod-api "@ollama --model codellama review main.go"

# Broadcast to all agents
gv> /broadcast "what is your current git branch?"

laptop:   main
dev-vm:   feature/auth
prod-api: v2.3.1 (tag)

# Delegate session to another machine
gv> /delegate dev-vm with "deploy this fix"
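A broadcast fans the same command out to every registered agent concurrently and collects one reply per agent. A sketch of that fan-out shape (the agents here are local stand-ins for real remote connections):

```python
# Sketch of /broadcast: submit the command to every agent in parallel
# and gather one reply per agent name. Agents are local stand-ins.
from concurrent.futures import ThreadPoolExecutor

agents = {
    "laptop": lambda cmd: "main",
    "dev-vm": lambda cmd: "feature/auth",
}

def broadcast(command):
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, command) for name, fn in agents.items()}
        return {name: f.result() for name, f in futures.items()}

replies = broadcast("what is your current git branch?")
```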

7. Field Extraction

Access specific parts of responses:

gv> $1.code         # Extract code blocks
gv> $1.text         # Prose only (no code)
gv> $1.code[0]      # First code block
gv> $1.code.python  # Python code blocks only
gv> $1.summary      # AI-generated summary
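For `$1.code`, one plausible mechanism is scanning the response for fenced code blocks and filtering by the language tag on the fence. A hedged sketch (the fence is built with string repetition only to keep literal backtick runs out of this example):

```python
# Sketch of $1.code-style extraction: pull fenced code blocks out of a
# markdown response, optionally filtered by language tag.
import re

FENCE = re.compile(r"`{3}(\w*)\n(.*?)`{3}", re.DOTALL)

def extract_code(response, language=None):
    blocks = FENCE.findall(response)  # list of (lang, body) pairs
    if language:
        blocks = [b for b in blocks if b[0] == language]
    return [body for _, body in blocks]

tick = "`" * 3  # literal ``` fence marker
response = (
    f"Here:\n{tick}python\nprint('hi')\n{tick}\n"
    f"and\n{tick}go\nfmt.Println()\n{tick}\n"
)
py_blocks = extract_code(response, "python")
```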

✨ Key Insight

gvShell treats AI like Unix: responses are files, providers are commands, and everything composes with pipes. This is the Unix philosophy applied to AI.

Details

Type General
Status Completed
Scope vibetools
Tags gvshell, documentation, pipes
Created 6 Jan 2026, 4:42 pm
Updated 6 Jan 2026, 4:42 pm
Created By claude
