Since the start of modern AI in late 2022, we’ve seen an extraordinary number of AI applications for accomplishing tasks. There are thousands of websites, chat-bots, mobile apps, and other interfaces for using all the different AI out there.
It’s all really exciting and powerful, but it’s not easy to integrate this functionality into our lives.
Fabric was created to address this by creating and organizing the fundamental units of AI—the prompts themselves!
Fabric organizes prompts by real-world task, allowing people to create, collect, and organize their most important AI solutions in a single place for use in their favorite tools. And if you’re command-line focused, you can use Fabric itself as the interface!
Dear Users,
We’ve been doing so many exciting things here at Fabric that I wanted to share a quick summary to give you a sense of our development velocity!
Below are the new features and capabilities we’ve added (newest first):
- Swagger documentation at /swagger/index.html with comprehensive REST API documentation, enhanced developer guides, and improved endpoint discoverability for easier integration
- A create_conceptmap pattern for visual knowledge representation using Vis.js, a new WELLNESS category with psychological analysis patterns, and an upgrade to Claude Sonnet 4.5
- The openai-go/azure SDK with improved authentication and default API version support
- New --transcribe-file, --transcribe-model, and --split-media-file flags
- A readme_updates Python script
- A --search flag

These features represent our commitment to making Fabric the most powerful and flexible AI augmentation framework available!
Keep in mind that many of these were recorded when Fabric was Python-based, so remember to use the current install instructions below.
Fabric is evolving rapidly.
Stay current with the latest features by reviewing the CHANGELOG for all recent changes.
AI isn’t a thing; it’s a magnifier of a thing. And that thing is human creativity.
We believe the purpose of technology is to help humans flourish, so when we talk about AI we start with the human problems we want to solve.
Our approach is to break problems into individual pieces and then apply AI to them one at a time. See below for some examples.
Prompts are good for this, but the biggest challenge I faced in 2023 (and which still exists today) is the sheer number of AI prompts out there. We all have prompts that are useful, but it’s hard to discover new ones, know whether they are good, and manage different versions of the ones we like.
One of Fabric’s primary features is helping people collect and integrate prompts, which we call Patterns, into various parts of their lives.
Fabric has Patterns for all sorts of life and work activities, including:
Unix/Linux/macOS:
curl -fsSL https://raw.githubusercontent.com/danielmiessler/fabric/main/scripts/installer/install.sh | bash
Windows PowerShell:
iwr -useb https://raw.githubusercontent.com/danielmiessler/fabric/main/scripts/installer/install.ps1 | iex
See scripts/installer/README.md for custom installation options and troubleshooting.
The latest release binary archives and their expected SHA256 hashes can be found at https://github.com/danielmiessler/fabric/releases/latest
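If you download a release archive manually, you can check it against the published hash before installing. A minimal sketch (the file name here is a stand-in, not a real release artifact):

```shell
# Illustrative check: this stand-in file plays the role of a downloaded
# release archive; compare the printed hash to the value published on
# the releases page.
printf 'demo' > fabric_release.tar.gz
sha256sum fabric_release.tar.gz
```

If the printed hash does not match the published SHA256, discard the archive and download it again.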
NOTE: using Homebrew or the Arch Linux package managers makes fabric available as fabric-ai, so add
the following alias to your shell startup files to account for this:
alias fabric='fabric-ai'
brew install fabric-ai
yay -S fabric-ai
Use the official Microsoft-supported Winget tool:
winget install danielmiessler.Fabric
To install Fabric, make sure Go is installed, and then run the following command.
# Install Fabric directly from the repo
go install github.com/danielmiessler/fabric/cmd/fabric@latest
Run Fabric using pre-built Docker images:
# Use latest image from Docker Hub
docker run --rm -it kayvan/fabric:latest --version
# Use specific version from GHCR
docker run --rm -it ghcr.io/ksylvan/fabric:v1.4.305 --version
# Run setup (first time)
mkdir -p $HOME/.fabric-config
docker run --rm -it -v $HOME/.fabric-config:/root/.config/fabric kayvan/fabric:latest --setup
# Use Fabric with your patterns
docker run --rm -it -v $HOME/.fabric-config:/root/.config/fabric kayvan/fabric:latest -p summarize
# Run the REST API server (see REST API Server section)
docker run --rm -it -p 8080:8080 -v $HOME/.fabric-config:/root/.config/fabric kayvan/fabric:latest --serve
Images available at:
See scripts/docker/README.md for building custom images and advanced configuration.
You may need to set some environment variables in your ~/.bashrc file on Linux or ~/.zshrc file on macOS to be able to run the fabric command. Here is an example of what you can add:
For Intel-based Macs or Linux:
# Golang environment variables
export GOROOT=/usr/local/go
export GOPATH=$HOME/go
# Update PATH to include GOPATH and GOROOT binaries
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
For Apple Silicon-based Macs:
# Golang environment variables
export GOROOT=$(brew --prefix go)/libexec
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$GOROOT/bin:$HOME/.local/bin:$PATH
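After sourcing either variant, a quick sanity check confirms the Go bin directory made it onto your PATH. A minimal sketch:

```shell
# Confirm $GOPATH/bin is on PATH after the exports above
export GOPATH=$HOME/go
export PATH=$GOPATH/bin:$PATH
echo "$PATH" | tr ':' '\n' | grep -qx "$HOME/go/bin" && echo "GOPATH bin is on PATH"
```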
Now run the following command:
# Run the setup to set up your directories and keys
fabric --setup
If everything works, you are good to go.
You can configure specific models for individual patterns using environment variables of the form FABRIC_MODEL_PATTERN_NAME=vendor|model.
This makes it easy to maintain per-pattern model mappings in your shell startup files.
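For example, assuming the pattern name is upper-cased in the variable name (the vendor and model names here are illustrative, not a recommendation):

```shell
# Pin the summarize pattern to a specific vendor|model pair
export FABRIC_MODEL_SUMMARIZE='openai|gpt-4o'

# The value splits on '|' into a vendor and a model
vendor="${FABRIC_MODEL_SUMMARIZE%%|*}"
model="${FABRIC_MODEL_SUMMARIZE#*|}"
echo "$vendor / $model"   # openai / gpt-4o
```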
To add aliases for all your patterns and use them directly as commands (for example, summarize instead of fabric --pattern summarize), add the following to your .zshrc or .bashrc file. You can also optionally set the FABRIC_ALIAS_PREFIX environment variable beforehand if you’d prefer all the Fabric aliases to start with the same prefix.
# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in $HOME/.config/fabric/patterns/*; do
# Get the base name of the file (i.e., remove the directory path)
pattern_name="$(basename "$pattern_file")"
alias_name="${FABRIC_ALIAS_PREFIX:-}${pattern_name}"
# Create an alias in the form: alias pattern_name="fabric --pattern pattern_name"
alias_command="alias $alias_name='fabric --pattern $pattern_name'"
# Evaluate the alias command to add it to the current shell
eval "$alias_command"
done
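For example, with a prefix set, each pattern gets a prefixed alias. A small sketch of the naming logic from the loop above (the fb_ prefix is illustrative):

```shell
# Demonstrate how the loop above builds each alias name
FABRIC_ALIAS_PREFIX="fb_"
pattern_name="summarize"
alias_name="${FABRIC_ALIAS_PREFIX:-}${pattern_name}"
echo "$alias_name"   # fb_summarize
```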
yt() {
if [ "$#" -eq 0 ] || [ "$#" -gt 2 ]; then
echo "Usage: yt [-t | --timestamps] youtube-link"
echo "Use the '-t' flag to get the transcript with timestamps."
return 1
fi
transcript_flag="--transcript"
if [ "$1" = "-t" ] || [ "$1" = "--timestamps" ]; then
transcript_flag="--transcript-with-timestamps"
shift
fi
local video_link="$1"
fabric -y "$video_link" $transcript_flag
}
You can add the below code for the equivalent aliases inside PowerShell by running notepad $PROFILE inside a PowerShell window:
# Path to the patterns directory
$patternsPath = Join-Path $HOME ".config/fabric/patterns"
foreach ($patternDir in Get-ChildItem -Path $patternsPath -Directory) {
# Prepend FABRIC_ALIAS_PREFIX if set; otherwise use empty string
$prefix = $env:FABRIC_ALIAS_PREFIX ?? ''
$patternName = "$($patternDir.Name)"
$aliasName = "$prefix$patternName"
# Dynamically define a function for each pattern
$functionDefinition = @"
function $aliasName {
[CmdletBinding()]
param(
[Parameter(ValueFromPipeline = `$true)]
[string] `$InputObject,
[Parameter(ValueFromRemainingArguments = `$true)]
[String[]] `$patternArgs
)
begin {
# Initialize an array to collect pipeline input
`$collector = @()
}
process {
# Collect pipeline input objects
if (`$InputObject) {
`$collector += `$InputObject
}
}
end {
# Join all pipeline input into a single string, separated by newlines
`$pipelineContent = `$collector -join "`n"
# If there's pipeline input, include it in the call to fabric
if (`$pipelineContent) {
`$pipelineContent | fabric --pattern $patternName `$patternArgs
} else {
# No pipeline input; just call fabric with the additional args
fabric --pattern $patternName `$patternArgs
}
}
}
"@
# Add the function to the current session
Invoke-Expression $functionDefinition
}
# Define the 'yt' function as well
function yt {
[CmdletBinding()]
param(
[Parameter()]
[Alias("timestamps")]
[switch]$t,
[Parameter(Position = 0, ValueFromPipeline = $true)]
[string]$videoLink
)
begin {
$transcriptFlag = "--transcript"
if ($t) {
$transcriptFlag = "--transcript-with-timestamps"
}
}
process {
if (-not $videoLink) {
Write-Error "Usage: yt [-t | --timestamps] youtube-link"
return
}
}
end {
if ($videoLink) {
# Execute and allow output to flow through the pipeline
fabric -y $videoLink $transcriptFlag
}
}
}
This also creates a yt alias that allows you to use yt https://www.youtube.com/watch?v=4b0iet22VIk to get transcripts, comments, and metadata.
If, in addition to the above aliases, you would like the option to save the output to your favorite Markdown note vault (such as Obsidian), then add the following to your .zshrc or .bashrc file instead:
# Define the base directory for Obsidian notes
obsidian_base="/path/to/obsidian"
# Loop through all files in the ~/.config/fabric/patterns directory
for pattern_file in ~/.config/fabric/patterns/*; do
# Get the base name of the file (i.e., remove the directory path)
pattern_name=$(basename "$pattern_file")
# Remove any existing alias with the same name
unalias "$pattern_name" 2>/dev/null
# Define a function dynamically for each pattern
eval "
$pattern_name() {
local title=\$1
local date_stamp=\$(date +'%Y-%m-%d')
local output_path=\"\$obsidian_base/\${date_stamp}-\${title}.md\"
# Check if a title was provided
if [ -n \"\$title\" ]; then
# If a title is provided, use the output path
fabric --pattern \"$pattern_name\" -o \"\$output_path\"
else
# If no title is provided, use --stream
fabric --pattern \"$pattern_name\" --stream
fi
}
"
done
This lets you use the patterns as aliases, as above (for example, summarize instead of fabric --pattern summarize --stream). However, if you pass an extra argument, such as summarize "my_article_title", the output is saved to the destination you set in obsidian_base="/path/to/obsidian" as YYYY-MM-DD-my_article_title.md, with the date generated for you.
You can adjust the date format by changing the date_stamp format string.
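For example, to append the time of day to the filename stamp (the format strings follow date(1)):

```shell
# Default stamp used above, e.g. 2024-01-31
date_stamp=$(date +'%Y-%m-%d')

# Variant with the time appended, e.g. 2024-01-31-1542
date_stamp=$(date +'%Y-%m-%d-%H%M')
echo "$date_stamp"
```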
If you have the Legacy (Python) version installed and want to migrate to the Go version, here’s how you do it. It’s basically two steps: 1) uninstall the Python version, and 2) install the Go version.
# Uninstall Legacy Fabric
pipx uninstall fabric
# Clear any old Fabric aliases
(check your .bashrc, .zshrc, etc.)
# Install the Go version
go install github.com/danielmiessler/fabric/cmd/fabric@latest
# Run setup for the new version. Important because things have changed
fabric --setup
Then set your environment variables as shown above.
The great thing about Go is that it’s super easy to upgrade. Just run the same command you used to install it in the first place and you’ll always get the latest version.
go install github.com/danielmiessler/fabric/cmd/fabric@latest
Fabric provides shell completion scripts for Zsh, Bash, and Fish shells, making it easier to use the CLI by providing tab completion for commands and options.
You can install completions directly via a one-liner:
curl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh | sh
Optional variants:
# Dry-run (see actions without changing your system)
curl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh | sh -s -- --dry-run
# Override the download source (advanced)
FABRIC_COMPLETIONS_BASE_URL="https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions" \
sh -c "$(curl -fsSL https://raw.githubusercontent.com/danielmiessler/Fabric/refs/heads/main/completions/setup-completions.sh)"
To enable Zsh completion:
# Copy the completion file to a directory in your $fpath
mkdir -p ~/.zsh/completions
cp completions/_fabric ~/.zsh/completions/
# Add the directory to fpath in your .zshrc before compinit
echo 'fpath=(~/.zsh/completions $fpath)' >> ~/.zshrc
echo 'autoload -Uz compinit && compinit' >> ~/.zshrc
To enable Bash completion:
# Source the completion script in your .bashrc
echo 'source /path/to/fabric/completions/fabric.bash' >> ~/.bashrc
# Or copy to the system-wide bash completion directory
sudo cp completions/fabric.bash /etc/bash_completion.d/
To enable Fish completion:
# Copy the completion file to the fish completions directory
mkdir -p ~/.config/fish/completions
cp completions/fabric.fish ~/.config/fish/completions/
Once you have it all set up, here’s how to use it.
fabric -h
Usage:
fabric [OPTIONS]
Application Options:
-p, --pattern= Choose a pattern from the available patterns
-v, --variable= Values for pattern variables, e.g. -v=#role:expert -v=#points:30
-C, --context= Choose a context from the available contexts
--session= Choose a session from the available sessions
-a, --attachment= Attachment path or URL (e.g. for OpenAI image recognition messages)
-S, --setup Run setup for all reconfigurable parts of fabric
-t, --temperature= Set temperature (default: 0.7)
-T, --topp= Set top P (default: 0.9)
-s, --stream Stream
-P, --presencepenalty= Set presence penalty (default: 0.0)
-r, --raw Use the defaults of the model without sending chat options
(temperature, top_p, etc.). Only affects OpenAI-compatible providers.
Anthropic models always use smart parameter selection to comply with
model-specific requirements.
-F, --frequencypenalty= Set frequency penalty (default: 0.0)
-l, --listpatterns List all patterns
-L, --listmodels List all available models
-x, --listcontexts List all contexts
-X, --listsessions List all sessions
-U, --updatepatterns Update patterns
-c, --copy Copy to clipboard
-m, --model= Choose model
-V, --vendor= Specify vendor for chosen model (e.g., -V "LM Studio" -m openai/gpt-oss-20b)
--modelContextLength= Model context length (only affects ollama)
-o, --output= Output to file
--output-session Output the entire session (also a temporary one) to the output file
-n, --latest= Number of latest patterns to list (default: 0)
-d, --changeDefaultModel Change default model
-y, --youtube= YouTube video or playlist "URL" to grab transcript, comments from it
and send to chat or print it out to the console and store it in the
output file
--playlist Prefer playlist over video if both ids are present in the URL
--transcript Grab transcript from YouTube video and send to chat (used by
default).
--transcript-with-timestamps Grab transcript from YouTube video with timestamps and send to chat
--comments Grab comments from YouTube video and send to chat
--metadata Output video metadata
-g, --language= Specify the Language Code for the chat, e.g. -g=en -g=zh
-u, --scrape_url= Scrape website URL to markdown using Jina AI
-q, --scrape_question= Search question using Jina AI
-e, --seed= Seed to be used for LLM generation
-w, --wipecontext= Wipe context
-W, --wipesession= Wipe session
--printcontext= Print context
--printsession= Print session
--readability Convert HTML input into a clean, readable view
--input-has-vars Apply variables to user input
--no-variable-replacement Disable pattern variable replacement
--dry-run Show what would be sent to the model without actually sending it
--serve Serve the Fabric Rest API
--serveOllama Serve the Fabric Rest API with ollama endpoints
--address= The address to bind the REST API (default: :8080)
--api-key= API key used to secure server routes
--config= Path to YAML config file
--version Print current version
--listextensions List all registered extensions
--addextension= Register a new extension from config file path
--rmextension= Remove a registered extension by name
--strategy= Choose a strategy from the available strategies
--liststrategies List all strategies
--listvendors List all vendors
--shell-complete-list Output raw list without headers/formatting (for shell completion)
--search Enable web search tool for supported models (Anthropic, OpenAI, Gemini)
--search-location= Set location for web search results (e.g., 'America/Los_Angeles')
--image-file= Save generated image to specified file path (e.g., 'output.png')
--image-size= Image dimensions: 1024x1024, 1536x1024, 1024x1536, auto (default: auto)
--image-quality= Image quality: low, medium, high, auto (default: auto)
--image-compression= Compression level 0-100 for JPEG/WebP formats (default: not set)
--image-background= Background type: opaque, transparent (default: opaque, only for
PNG/WebP)
--suppress-think Suppress text enclosed in thinking tags
--think-start-tag= Start tag for thinking sections (default: <think>)
--think-end-tag= End tag for thinking sections (default: </think>)
--disable-responses-api Disable OpenAI Responses API (default: false)
--voice= TTS voice name for supported models (e.g., Kore, Charon, Puck)
(default: Kore)
--list-gemini-voices List all available Gemini TTS voices
--notification Send desktop notification when command completes
--notification-command= Custom command to run for notifications (overrides built-in
notifications)
--yt-dlp-args= Additional arguments to pass to yt-dlp (e.g. '--cookies-from-browser brave')
--thinking= Set reasoning/thinking level (e.g., off, low, medium, high, or
numeric tokens for Anthropic or Google Gemini)
--show-metadata Print metadata (input/output tokens) to stderr
--debug= Set debug level (0: off, 1: basic, 2: detailed, 3: trace)
Help Options:
-h, --help Show this help message
Use the --debug flag to control runtime logging:
0: off (default)
1: basic debug info
2: detailed debugging
3: trace level

Fabric supports extensions that can be called within patterns. See the Extension Guide for complete documentation.
Important: Extensions only work within pattern files, not via direct stdin. See the guide for details and examples.
Fabric includes a built-in REST API server that exposes all core functionality over HTTP. Start the server with:
fabric --serve
The server provides endpoints for:
For complete endpoint documentation, authentication setup, and usage examples, see REST API Documentation.
Fabric Patterns are different from most prompts you’ll see.
First, we use Markdown to help ensure maximum readability and editability. This not only helps the creator make a good Pattern, but also anyone who wants to deeply understand what it does. Importantly, this also includes the AI you’re sending it to!

Here’s an example of a Fabric Pattern:
https://github.com/danielmiessler/Fabric/blob/main/data/patterns/extract_wisdom/system.md
Next, we are extremely clear in our instructions, and we use the Markdown structure to emphasize what we want the AI to do, and in what order.
And finally, we tend to use the System section of the prompt almost exclusively. In over a year of being heads-down with this stuff, we’ve just seen more efficacy from doing that. If that changes, or we’re shown data that says otherwise, we will adjust.
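As a sketch, a minimal Pattern file in this style might look like the following (the section names follow the extract_wisdom example linked above; the wording is illustrative, not an official template):

```markdown
# IDENTITY and PURPOSE

You are an expert at summarizing complex content clearly and concisely.

# STEPS

- Read the entire input carefully.
- Identify the most important ideas and supporting points.

# OUTPUT INSTRUCTIONS

- Output only Markdown.
- Do not repeat ideas.

# INPUT

INPUT:
```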
The following examples use the macOS pbpaste command to paste from the clipboard. See the pbpaste section below for Windows and Linux alternatives.
Now let’s look at some things you can do with Fabric.
Run the summarize Pattern based on input from stdin. In this case, the body of an article.
pbpaste | fabric --pattern summarize
Run the analyze_claims Pattern with the --stream option to get immediate and streaming results.
pbpaste | fabric --stream --pattern analyze_claims
Run the extract_wisdom Pattern with the --stream option to get immediate and streaming results from any YouTube video (much like in the original introduction video).
fabric -y "https://youtube.com/watch?v=uXs-zPc63kM" --stream --pattern extract_wisdom
Create patterns: you must create a .md file with the pattern and save it to ~/.config/fabric/patterns/[yourpatternname].
Run the analyze_claims Pattern on a website. Fabric uses Jina AI to scrape the URL into Markdown format before sending it to the model.
fabric -u https://github.com/danielmiessler/fabric/ -p analyze_claims
If you’re not looking to do anything fancy, and you just want a lot of great prompts, you can navigate to the /patterns directory and start exploring!
We hope that if you used nothing else from Fabric, the Patterns by themselves will make the project useful.
You can use any of the Patterns you see there in any AI application that you have, whether that’s ChatGPT or some other app or website. Our plan and prediction is that people will soon be sharing many more than those we’ve published, and they will be way better than ours.
The wisdom of crowds for the win.
Fabric also implements prompt strategies like “Chain of Thought” or “Chain of Draft” which can be used in addition to the basic patterns.
See the Thinking Faster by Writing Less paper and the Thought Generation section of Learn Prompting for examples of prompt strategies.
Each strategy is available as a small json file in the /strategies directory.
The prompt modification of the strategy is applied to the system prompt and passed on to the LLM in the chat session.
Use fabric -S and select the option to install the strategies in your ~/.config/fabric directory.
You may want to use Fabric to create your own custom Patterns—but not share them with others. No problem!
Fabric now supports a dedicated custom patterns directory that keeps your personal patterns separate from the built-in ones. This means your custom patterns won’t be overwritten when you update Fabric’s built-in patterns.
Run the Fabric setup:
fabric --setup
Select the “Custom Patterns” option from the Tools menu and enter your desired directory path (e.g., ~/my-custom-patterns)
Fabric will automatically create the directory if it does not exist.
Create your custom pattern directory structure:
mkdir -p ~/my-custom-patterns/my-analyzer
Create your pattern file
echo "You are an expert analyzer of ..." > ~/my-custom-patterns/my-analyzer/system.md
Use your custom pattern:
fabric --pattern my-analyzer "analyze this text"
Your custom patterns appear in the fabric --listpatterns output alongside built-in ones, and they are not touched by fabric --updatepatterns. Your custom patterns are completely private and won’t be affected by Fabric updates!
Fabric also makes use of some core helper apps (tools) to make it easier to integrate with your various workflows. Here are some examples:
to_pdf is a helper command that converts LaTeX files to PDF format. You can use it like this:
to_pdf input.tex
This will create a PDF file from the input LaTeX file in the same directory.
You can also use it with stdin which works perfectly with the write_latex pattern:
echo "ai security primer" | fabric --pattern write_latex | to_pdf
This will create a PDF file named output.pdf in the current directory.
To install to_pdf, install it the same way as you install Fabric, just with a different repo name.
go install github.com/danielmiessler/fabric/cmd/to_pdf@latest
Make sure you have a LaTeX distribution (like TeX Live or MiKTeX) installed on your system, as to_pdf requires pdflatex to be available in your system’s PATH.
code2context is used in conjunction with the create_coding_feature pattern.
It generates a json representation of a directory of code that can be fed into an AI model
with instructions to create a new feature or edit the code in a specified way.
See the Create Coding Feature Pattern README for details.
Install it first using:
go install github.com/danielmiessler/fabric/cmd/code2context@latest
The examples use the macOS program pbpaste to paste content from the clipboard to pipe into fabric as the input. pbpaste is not available on Windows or Linux, but there are alternatives.
On Windows, you can use the PowerShell command Get-Clipboard from a PowerShell command prompt. If you like, you can also alias it to pbpaste. If you are using classic PowerShell, edit the file ~\Documents\WindowsPowerShell\.profile.ps1, or if you are using PowerShell Core, edit ~\Documents\PowerShell\.profile.ps1 and add the alias,
Set-Alias pbpaste Get-Clipboard
On Linux, you can use xclip -selection clipboard -o to paste from the clipboard. You will likely need to install xclip with your package manager. For Debian based systems including Ubuntu,
sudo apt update
sudo apt install xclip -y
You can also create an alias by editing ~/.bashrc or ~/.zshrc and adding the alias,
alias pbpaste='xclip -selection clipboard -o'
Fabric now includes a built-in web interface that provides a GUI alternative to the command-line interface. Refer to Web App README for installation instructions and an overview of features.
[!NOTE] Special thanks to everyone who has inspired and contributed to Fabric! Contributor graphic made with contrib.rocks.
fabric was created by Daniel Miessler in January of 2024.
Available for macOS, Linux and Windows
fabric is an open-source framework for augmenting humans using AI.