Learn jq — The 10 Commands That Cover 90% of Real-World JSON Work
Practical beginner's guide to jq: install it in one command, understand the filter model, and learn the 10 patterns that handle most real JSON processing tasks.
You can learn the 80% of jq you'll actually use in about 15 minutes. The remaining 20% is genuinely complex — recursive descent, advanced aggregations, custom function definitions — but you don't need any of it to transform API responses, debug MCP server output, or parse CI/CD logs. The official docs read like a language specification, not a tutorial, which is why most developers know jq exists but never actually learn it. This guide closes that gap: from zero to productive in one read.
What Is jq?
jq is a lightweight command-line JSON processor. Think of it as sed for JSON — you can slice, filter, map, and transform structured data with the same ease that sed, awk, and grep handle text. Current version is 1.8.1 (released July 2025), written in portable C with zero runtime dependencies. Download a single binary, copy it to any machine of the same architecture, and expect it to work.
It excels at three things: making JSON readable, extracting specific values from API responses, and transforming JSON structures into formats your scripts can consume.
Prerequisites
- A terminal (macOS, Linux, or Windows with WSL or PowerShell)
- Basic comfort with the command line — if you can run curl and pipe output, you're ready
Installation
One command per platform. No package manager conflicts, no dependency hell.
macOS (Homebrew):
brew install jq
Linux (Debian/Ubuntu):
sudo apt-get install jq
Linux (Fedora/Red Hat):
sudo dnf install jq
Windows (winget):
winget install jqlang.jq
Windows (Chocolatey):
choco install jq
Verify the install worked:
jq --version
# jq-1.8.1
That’s it. No config files, no runtime to manage. The binary is self-contained.
Your First Command
Start with pretty-printing. Take any compressed JSON and make it readable:
echo '{"name":"Alice","age":30,"city":"NYC"}' | jq '.'
Output:
{
"name": "Alice",
"age": 30,
"city": "NYC"
}
The quoting rule you must know before anything else: The jq filter must always be quoted. On Unix shells, use single quotes: jq '.'. On PowerShell, use double quotes: jq ".". Without quotes, the shell tries to interpret the filter itself — your command breaks immediately and the error message won’t tell you why. This is the single most common beginner mistake, and it will cost you ten minutes the first time it happens.
What happened in that command? You piped JSON to jq with the . filter. The dot is the identity operator — it takes the input and passes it through unchanged, but jq adds pretty formatting and syntax highlighting in the process. That brings us to the mental model you need before the patterns make sense.
The Mental Model: Filters and Pipes
jq thinks in filters and pipes, not loops. Every expression takes input, produces output. The . is identity. The | chains filters together. Each filter transforms the data stream it receives.
Think of it as a pipeline: Input → Filter → Output, where you chain as many filters as you need. This has one consequence that surprises almost every beginner: .[] iterating over an array produces multiple separate outputs, not a single array result. Each element appears as its own line of output. If you expect a JSON array back and you’re getting newline-separated values instead, this is why — and once it clicks, the rest follows naturally.
Things that would require loops and iteration in other languages are done by gluing filters together in jq. It’s a different way of thinking about data transformation, but it’s consistent — and that consistency is what makes it fast to learn. A filter takes an input stream and produces an output stream. Piping (|) connects them. That’s the whole model.
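A one-liner makes the stream behavior concrete (the input array here is made up): iterating an array emits each transformed element as its own result, not a new array.

```shell
# .[] explodes the array; each element flows through the next filter
echo '[1, 2, 3]' | jq '.[] | . * 2'
# 2
# 4
# 6
```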
The 10 Essential Patterns
1. Pretty-Print Any JSON
When you receive compressed API output and need to read it.
curl -s https://api.github.com/repos/jqlang/jq | jq '.'
The . filter is identity — it passes input through unchanged, but jq formats it with indentation and color. This alone is worth installing jq. Every developer has squinted at a wall of minified JSON at some point, and pretty-printing is the first thing you’ll reach for.
2. Extract a Single Field
When you need one specific value from a JSON object.
curl -s https://api.github.com/repos/jqlang/jq | jq '.description'
# Output: "Command-line JSON processor"
Dot notation accesses any key directly: .name, .status, .created_at. If the key doesn’t exist, jq returns null rather than throwing an error — which is useful when schemas aren’t guaranteed. This is the pattern you’ll use dozens of times a day once jq is in your workflow.
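A quick local check of the null behavior, with made-up input — plus jq's // alternative operator (not covered in the patterns above), which substitutes a default when the left side is null or false:

```shell
# A missing key yields null instead of an error
echo '{"name":"Alice"}' | jq '.nickname'
# null

# The // operator supplies a fallback for null results
echo '{"name":"Alice"}' | jq -r '.nickname // "unknown"'
# unknown
```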
3. Navigate Nested Objects
When your data is buried several levels deep.
curl -s https://api.github.com/repos/jqlang/jq | jq '.owner.login'
# Output: "jqlang"
Chain dots to traverse the structure: .user.profile.city, .response.data.items. Each dot is just another filter in sequence. You can go as deep as the structure allows. GitHub’s API response for a repository is a good test object for this — the owner field is an object with its own fields, so .owner.login goes two levels deep.
4. Access Array Elements by Index
When you need a specific item from a list.
curl -s "https://api.github.com/repos/jqlang/jq/commits?per_page=3" | jq '.[0].commit.message'
# Output: the most recent commit message
.[0] is the first element. .[-1] is the last. .[2:5] is a slice covering the third through fifth elements. Indexing is zero-based. Combine indexing with field access by chaining: .[0].author.name gets the first commit’s author in one expression. The key distinction to hold onto: .[0] gives you one element; .[] (no index) gives you all of them as a stream.
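The negative-index and slice forms work the same way; here against a throwaway array so there's no network dependency:

```shell
echo '["a","b","c","d","e","f"]' | jq '.[-1]'
# "f"

# Slice: indices 2 through 4, i.e. the third through fifth elements
echo '["a","b","c","d","e","f"]' | jq -c '.[2:5]'
# ["c","d","e"]
```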
5. Iterate Over an Array
When you need to process every element in a list.
curl -s https://api.github.com/repos/jqlang/jq/contents | jq '.[] | .name'
# Output: every filename in the repo root, one per line
.[] produces each element as a separate output — this is the foundation for almost every real jq task involving arrays. It does not return an array; it explodes one into a stream of values. Pipe that stream into any subsequent filter. This two-step pattern — iterate with .[], then extract with .fieldname — handles a large share of real API work. Once you’re comfortable with it, you’ll use it constantly.
6. Filter with Conditions
When you only want elements that meet a criterion.
curl -s https://api.github.com/repos/jqlang/jq/contents | jq '.[] | select(.size > 10000) | .name'
# Output: only files larger than 10KB
select() passes through elements where the condition is true and drops the rest. Works with numbers, strings, and booleans. String matching: select(.name | contains("test")). Combined conditions: select(.active and .role == "admin"). Regex matching: select(.name | test("^error")). If you know SQL’s WHERE clause, select() is the same idea — it’s a filter that keeps or discards based on a predicate.
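Each variant in action against a small made-up dataset (the names and fields are illustrative, not from any real API):

```shell
DATA='[{"name":"test_api","active":true,"role":"admin"},
       {"name":"errors_db","active":true,"role":"user"}]'

# Substring match
echo "$DATA" | jq -r '.[] | select(.name | contains("test")) | .name'
# test_api

# Combined boolean conditions
echo "$DATA" | jq -r '.[] | select(.active and .role == "admin") | .name'
# test_api

# Regex match, anchored at the start of the string
echo "$DATA" | jq -r '.[] | select(.name | test("^errors")) | .name'
# errors_db
```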
7. Build New Objects
When you need to reshape data into a different structure.
curl -s https://api.github.com/repos/jqlang/jq | jq '{repo: .name, stars: .stargazers_count, language: .language}'
# Output: {"repo":"jq","stars":29847,"language":"C"}
Construct a new object with {key: expression, key: expression}. The keys are literal names you choose; the values are filters applied to the current input. If the key name matches the field name, jq has a shorthand: {name, language} is equivalent to {name: .name, language: .language}. This is the primary tool for trimming bloated API responses down to what you actually need — pick the five fields you care about and discard the other forty.
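The shorthand is easy to verify locally (sample input made up, -c added for compact output):

```shell
# {name, language} expands to {name: .name, language: .language}
echo '{"name":"jq","language":"C","size":123}' | jq -c '{name, language}'
# {"name":"jq","language":"C"}
```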
8. Chain Multiple Operations
When the transformation requires several steps.
echo '[{"name":"alice","active":true,"email":"alice@example.com"},{"name":"bob","active":false,"email":"bob@example.com"}]' \
| jq '.[] | select(.active) | .email'
# Output: "alice@example.com"
This chains three filters: iterate the array, keep only active users, extract the email. Each | passes the output of the left side as input to the right side. This is where jq becomes genuinely useful — complex transformations assembled from simple, composable pieces. A pipe chain like .users[] | select(.role == "admin") | {id: .id, email: .email} is readable on first glance once you have the mental model. The key insight: each step does one thing, and steps compose cleanly.
9. Count and Measure
When you need the size of an array, object, or string.
curl -s https://api.github.com/repos/jqlang/jq/contents | jq 'length'
# Output: number of files and directories in the repo root
length returns the number of elements in an array, the number of keys in an object, or the number of characters in a string. Combine with iteration: .items | length counts items in a nested array. Combine with select to count filtered results: [.logs[] | select(.level == "error")] | length counts error-level log entries. Note the wrapping brackets — you need to collect the stream back into an array before calling length on it.
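The collect-then-count shape, self-contained with made-up log data:

```shell
echo '{"logs":[{"level":"error"},{"level":"info"},{"level":"error"}]}' \
  | jq '[.logs[] | select(.level == "error")] | length'
# 2
```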
10. Raw Output for Scripts
When you need strings without JSON quotes for use in shell variables or downstream tools.
REPO_NAME=$(curl -s https://api.github.com/repos/jqlang/jq | jq -r '.name')
echo "Repository: $REPO_NAME"
# Output: Repository: jq
Without -r, you get "jq" — with quotes — which breaks variable assignments and string comparisons in bash. With -r, you get jq. This flag is essential the moment you move from exploration to scripting. Forget it once and you'll remember it every time after. The rule is simple: if you're reading output on screen to understand the data, skip -r; if the output feeds a script or another tool, add -r.
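The difference is easy to see side by side:

```shell
echo '{"name":"jq"}' | jq '.name'      # JSON string, with quotes
# "jq"
echo '{"name":"jq"}' | jq -r '.name'   # raw string, safe for shell variables
# jq
```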
Real-World Examples
The 10 patterns above are building blocks. Here’s what they look like assembled for actual developer tasks.
Parse API Error Responses
When an API call fails and the error is buried in a nested structure:
echo '{"error":{"code":404,"message":"Repository not found","details":{"resource":"repository","field":"name"}}}' \
| jq '.error | {code: .code, message: .message}'
# Output: {"code":404,"message":"Repository not found"}
One pipe chain extracts the meaningful signal from the noise. Add -r if you want to assign the message to a bash variable for logging or alerting.
Filter JSON Logs by Severity
Production services logging as JSON are common. Extract only what’s on fire:
cat app.log | jq -c 'select(.level == "error")'
The -c flag outputs compact JSON — one result per line. Better for piping to grep, wc -l, or further jq stages. This pattern scales to any structured log format: swap .level == "error" for whatever field your logging library uses. It also composes well with the count pattern, with one twist: a log file is a stream of separate JSON values rather than a single array, so you need the -s (slurp) flag to gather them into one array first. cat app.log | jq -s '[.[] | select(.level == "error")] | length' tells you exactly how many errors are in the file.
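To try it without a production log handy, write a few NDJSON lines to a temp file first (the path and log contents here are made up):

```shell
cat > /tmp/app.log <<'EOF'
{"level":"info","msg":"started"}
{"level":"error","msg":"db timeout"}
{"level":"error","msg":"retry failed"}
EOF

# jq applies the filter to each JSON value in the stream
jq -c 'select(.level == "error")' /tmp/app.log
# {"level":"error","msg":"db timeout"}
# {"level":"error","msg":"retry failed"}
```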
Convert API Response to CSV
When a downstream tool expects CSV and your API returns JSON:
echo '[{"id":1,"name":"Alice","email":"alice@example.com"},{"id":2,"name":"Bob","email":"bob@example.com"}]' \
| jq -r '.[] | [.id, .name, .email] | @csv'
# Output:
# 1,"Alice","alice@example.com"
# 2,"Bob","bob@example.com"
The @csv built-in handles quoting and escaping correctly. Use @tsv for tab-separated output. The -r flag is mandatory here — without it, @csv output is double-quoted as a JSON string, which defeats the purpose entirely.
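The same shape with @tsv, which many Unix tools (cut, awk, sort) consume more cleanly than CSV:

```shell
echo '[{"id":1,"name":"Alice"},{"id":2,"name":"Bob"}]' \
  | jq -r '.[] | [.id, .name] | @tsv'
# 1	Alice
# 2	Bob
```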
Extract Enabled Tools from a Config File
Useful for MCP server configs, feature flags, or any boolean-gated JSON structure:
echo '{"tools":[{"name":"github","enabled":true},{"name":"slack","enabled":false},{"name":"filesystem","enabled":true}]}' \
| jq -r '.tools[] | select(.enabled) | .name'
# Output:
# github
# filesystem
This is patterns 5, 6, and 10 composed. Three filters, one pipe chain, the result you need. The same structure works for any configuration format that uses enabled/disabled toggles.
Remove Sensitive Fields Before Committing
Strip credentials from config before it hits version control:
jq 'del(.password, .api_key, .secret_token)' config.json > config.public.json
del() removes one or multiple keys in a single pass. Safe to run on fields that don’t exist — jq ignores them silently. The output is the full JSON minus the named keys. This is one of the less-obvious jq features that saves real pain: instead of manually editing config files before committing, you automate the scrubbing.
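A self-contained check of both behaviors (keys and values here are invented):

```shell
echo '{"host":"db.local","password":"hunter2","api_key":"abc123"}' \
  | jq -c 'del(.password, .api_key)'
# {"host":"db.local"}

# Deleting a nonexistent key is a silent no-op
echo '{"host":"db.local"}' | jq -c 'del(.password)'
# {"host":"db.local"}
```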
Three Mistakes Beginners Make
Forgetting to quote the filter. Running jq .name file.json instead of jq '.name' file.json. The shell sees .name and tries to interpret it — what follows is a confusing error that has nothing to do with your JSON. On Unix, always use single quotes. On PowerShell, always use double quotes. Make it a habit before you’ve even finished this guide.
Confusing .[] with .[0]. .[] iterates — it produces one output per element, each appearing on its own line as a separate result. .[0] indexes — it returns exactly the first element as a single result. If you expect a JSON array back and you’re getting newline-separated values, you used .[] when you wanted [.[] | filter] — wrapping the output in [...] collects a stream back into a proper array. The brackets matter: they’re the difference between a stream and an array.
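All three forms on the same input make the distinction obvious:

```shell
echo '[1, 2, 3]' | jq '.[0]'     # index: one result
# 1
echo '[1, 2, 3]' | jq '.[]'      # iterate: three separate results
# 1
# 2
# 3
echo '[1, 2, 3]' | jq -c '[.[]]' # collect the stream back into an array
# [1,2,3]
```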
Skipping -r for scripts. jq outputs properly-quoted JSON strings by default. "alice@example.com" with quotes is valid JSON, but useless in a bash variable or a comparison. The rule is simple: if you’re reading the output on screen to understand data, skip -r. If the output feeds a script, a variable, or another tool, add -r. Forget this once while piping jq output into a SQL query and you’ll never forget it again.
Where to Go From Here
These 10 patterns cover the overwhelming majority of real jq use. When you hit the edge cases — recursive descent with .., complex aggregations with reduce, custom function definitions — the official jq manual is the authoritative reference. It’s dense, but searchable. Use it as a lookup, not a tutorial. The jq playground lets you test filters against JSON in a browser and share examples with teammates — it’s the fastest way to debug a filter that isn’t behaving the way you expect.
For the strategic question — when to use jq versus Python, gron, or fx — read our jq tool profile. The short answer: use jq for one-liners and shell scripts where the JSON structure is known and stable. Use Python when the logic gets complex, the schema is uncertain, or you need error handling and multiple API calls in sequence. jq is sharp and narrow; Python is broad and forgiving. Most pipelines need both at different points.
The tool itself is simple at its core: filters compose, pipes chain, the dot is identity. Everything else is just more filters.