A cross-platform command-line interface for interacting with AI models from SuperDuck.AI
The SuperDuck.AI CLI provides a seamless way to interact with codycode.ai models from the command line. It lets you pipe content between different AI personas, making it easy to build multi-step workflows and process data.
Key features include:
- Use SuperDuck in vim
The only external dependency is libcurl, which is usually pre-installed on most systems.
# Download the repository as a git bundle
curl https://superduck.ai/superduck-cli.git --output sdcli.bundle
# Clone from the bundle and enter the working directory
git clone sdcli.bundle codycli
cd codycli
make
chmod +x setup-env.sh
./setup-env.sh
source ./codycodes_env.sh # If you didn't add to .bashrc/.zshrc
./cody 'Write a hello world program in Python'
sudo make install
# On macOS with Homebrew, point the build at Homebrew's libcurl first:
export CFLAGS="-I/opt/homebrew/include"
export LDFLAGS="-L/opt/homebrew/lib"
make
The CLI uses environment variables for configuration. There are three types of environment variables you need to set:
Variable Type | Example | Purpose
---|---|---
System prompts | `cody`, `emmy` | Defines the system instruction for each AI persona
Model names | `codybrain`, `emmybrain` | Specifies which AI model to use for each persona
API key | `CODYCODES_API_KEY` | Your API key for authentication (required)
# System prompts - lowercase names to match command names
export cody="You are Cody, a helpful coding assistant."
export emmy="You are Emmy, specialized in creating clear explanations and presentations."
# Models
export codybrain="llama3.1-8b"
export emmybrain="gpt4o"
# API key (REQUIRED)
export CODYCODES_API_KEY="your-api-key-here"
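Before calling any persona, it can help to confirm that the variables above are actually set. A minimal sketch — the `check_superduck_env` helper is an illustration, not part of the CLI:

```shell
#!/bin/bash
# check_superduck_env: print each unset variable from the table above and
# return non-zero if the configuration is incomplete (bash-only: uses ${!var}).
check_superduck_env() {
  local var missing=0
  for var in cody codybrain emmy emmybrain CODYCODES_API_KEY; do
    if [ -z "${!var}" ]; then
      echo "missing: $var"
      missing=1
    fi
  done
  return "$missing"
}
```

Running `check_superduck_env && cody 'your prompt'` fails fast on missing configuration instead of surfacing an API error mid-pipeline.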
The basic syntax for using the SuperDuck.AI CLI is:
[command] 'your prompt'
Where `[command]` is the name of an AI persona (e.g., `cody` or `emmy`).
cody 'Write a hello world program in Python'
This will ask Cody to generate a Python hello world program.
One of the most powerful features of the SuperDuck.AI CLI is the ability to pipe content between different commands.
cat complex_code.py | cody 'explain this code'
This will send the contents of `complex_code.py` to Cody and ask for an explanation.
cat data.csv | cody 'analyze this data' | emmy 'create a latex presentation summary'
This sends a CSV file to Cody for analysis, then pipes Cody's output to Emmy to create a LaTeX presentation.
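The chaining here is plain Unix piping, so the shape of such a pipeline can be tried with ordinary tools before spending API calls; each stage transforms the previous stage's stdout just as the personas do:

```shell
# Three-stage pipeline with standard tools, mirroring file -> cody -> emmy:
# sort the lines, drop duplicates, then count what survives.
printf 'b\na\nb\nc\n' | sort | uniq | wc -l
```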
You can easily create new AI personas by creating symbolic links to the main executable:
# Create symbolic link
ln -sf codycli newpersona
# Set environment variables
export newpersona="You are NewPersona, specialized in technical documentation."
export newpersonabrain="gpt4o"
Now you can use `newpersona` just like `cody` or `emmy`.
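The symlink trick presumably works because the executable inspects the name it was invoked under and looks up the matching variables. A hedged sketch of that dispatch logic — `resolve_persona` is illustrative, not the CLI's actual code:

```shell
#!/bin/bash
# resolve_persona NAME: look up the system prompt ($NAME) and the model
# ($NAMEbrain), mirroring the environment-variable convention above.
resolve_persona() {
  local persona=$1
  local prompt_var=$persona          # e.g. "newpersona"
  local model_var="${persona}brain"  # e.g. "newpersonabrain"
  printf '%s|%s\n' "${!prompt_var}" "${!model_var}"
}
```

In the real binary, `$0` would play the role of NAME, which is why each symlink behaves as a distinct persona without any per-persona code.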
cody 'Write a function that calculates the Fibonacci sequence in JavaScript'
cat myfile.py | cody 'Review this code and suggest improvements'
cat buggy_code.js | cody 'Fix the bugs in this code'
cat function.py | cody 'Add docstrings to this code following Google style'
emmy 'Write a professional email requesting a meeting with a client'
cat long_report.txt | emmy 'Summarize the key points of this document'
cat research_data.txt | emmy 'Create an outline for a presentation on this research'
cat raw_data.csv | cody 'Clean this CSV data and analyze trends' | emmy 'Create a report with visualizations based on this analysis'
cat source_code.js | cody 'Explain how this code works in detail' | emmy 'Convert this explanation into user-friendly documentation'
Issue | Solution
---|---
Environment variable not set | Set all required environment variables, then verify them (e.g. `echo $CODYCODES_API_KEY`)
Compilation errors | Ensure the libcurl development headers are installed (e.g. `apt install libcurl4-openssl-dev` on Debian/Ubuntu, `brew install curl` on macOS)
API errors | Check that your API key is valid and properly set
To see more information about your configuration, run:
make debug
cody 'Write a Python script that downloads files from multiple URLs in parallel' > downloader.py
cat downloader.py | cody 'Explain how this code works, step by step'
cat sales_data.csv | cody 'Analyze this sales data and identify the top 5 performing products' | emmy 'Create a presentation for the marketing team based on this analysis'
cat meeting_notes.txt | emmy 'Create a follow-up email summarizing the meeting and action items' > email.txt
cat document.txt | cody 'Translate this text to Spanish' > document_es.txt
cat legacy_code.js | cody 'Refactor this code to use modern ES6+ syntax and improve performance' > modernized_code.js
cat data.json | cody 'Parse this JSON and explain the key data points' | tee analysis.txt | emmy 'Create a non-technical summary of this analysis'
This uses `tee` to both save Cody's analysis to a file and pipe it to Emmy for further processing, which is useful when you want to preserve intermediate results in a pipeline.
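The same `tee` pattern can be rehearsed with ordinary shell tools before spending API calls; `intermediate.txt` here is just a throwaway name standing in for `analysis.txt`:

```shell
# Save the uppercased intermediate result while the pipeline continues on
# to a word count, mirroring the tee step in the pipeline above.
printf 'one two three\n' | tr '[:lower:]' '[:upper:]' | tee intermediate.txt | wc -w
```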
#!/bin/bash
# iterative_refinement.sh
# Initial prompt
echo "Create a basic algorithm for sorting a list of numbers" > current.txt
# Iterative refinement loop with 3 AI personas
for i in {1..5}; do
  echo "Round $i of refinement:"

  # Cody improves the algorithm
  cat current.txt | cody "Improve this algorithm focusing on efficiency" > improved.txt

  # Emmy explains the improvements
  cat improved.txt | emmy "Explain what improvements were made and why" > explanation.txt

  # Third AI persona critiques and suggests next steps
  cat improved.txt explanation.txt | newpersona "Critique this algorithm and suggest one specific improvement for the next iteration" > feedback.txt

  # Output results of this round
  echo "----- IMPROVED ALGORITHM -----"
  cat improved.txt
  echo "----- EXPLANATION -----"
  cat explanation.txt
  echo "----- FEEDBACK FOR NEXT ROUND -----"
  cat feedback.txt
  echo "------------------------------"

  # Prepare for next iteration: carry forward the algorithm plus the
  # feedback, not the feedback alone, so Cody always sees the latest algorithm
  cat improved.txt feedback.txt > current.txt

  # Optional: add a delay to avoid API rate limits
  sleep 2
done
This bash script demonstrates an iterative collaboration between three AI personas, working together to refine an algorithm over multiple rounds. Each AI plays a different role in the process.
#!/bin/bash
# web_research.sh
# Define the search topic
TOPIC="quantum computing recent developments"
# Use curl to get search results (using DuckDuckGo for example)
echo "Searching for information on: $TOPIC"
curl -s -A "Mozilla/5.0" "https://html.duckduckgo.com/html/?q=${TOPIC// /+}" | \
  cody "Extract the URLs of the top 5 relevant articles from these search results, one URL per line" > search_results.txt

# Process each result
cat search_results.txt | while read -r url; do
  # Skip empty lines
  [ -z "$url" ] && continue

  echo "Analyzing: $url"

  # Get the content of the webpage
  curl -s -A "Mozilla/5.0" "$url" | \
    cody "Extract the main content from this HTML, ignoring navigation, ads, etc." | \
    emmy "Summarize this article about quantum computing in 3-4 paragraphs, highlighting key innovations" >> research_summary.txt
  echo "---" >> research_summary.txt

  # Avoid hitting rate limits
  sleep 3
done
# Create a final synthesized report
cat research_summary.txt | newpersona "Create a comprehensive research report synthesizing these article summaries about $TOPIC. Include an introduction, key findings, trends, and conclusion." > final_report.md
echo "Web research complete! Report saved to final_report.md"
This script demonstrates how to use `curl` to fetch web content and use AI personas to process and interpret the information. It creates a research workflow that searches for a topic, extracts and summarizes relevant articles, and synthesizes a final report.
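Both scripts rely on fixed `sleep` calls to stay under API rate limits. A small retry-with-backoff wrapper is a more robust option; `retry` is a generic helper sketched here, not part of the CLI:

```shell
#!/bin/bash
# retry N CMD...: run CMD up to N times, waiting i seconds after the i-th
# failure, and return the final status.
retry() {
  local attempts=$1; shift
  local i
  for ((i = 1; i <= attempts; i++)); do
    if "$@"; then
      return 0
    fi
    (( i < attempts )) && sleep "$i"  # back off before the next attempt
  done
  return 1
}
```

For example, `retry 3 curl -s -A "Mozilla/5.0" "$url" > page.html` retries a flaky fetch; wrap the command whose output you redirect rather than a whole pipeline, since piped stdin is consumed on the first attempt.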