# pick

[pick-cli.pages.dev](https://pick-cli.pages.dev)

Extract values from anything — JSON, YAML, TOML, .env, HTTP headers, logfmt, CSV, and more.

```sh
cargo install pick-cli
```
pick auto-detects the input format and lets you extract, filter, and transform values using a unified selector syntax. Think of it as jq for all config formats — no more juggling `jq`, `yq`, `grep`, `awk`, and `cut`.
## Quick Start

```sh
# JSON
curl -s https://api.github.com/users/octocat | pick login
# octocat

# .env
cat .env | pick DATABASE_URL
# postgres://localhost:5432/mydb

# YAML
cat config.yaml | pick server.port
# 8080

# TOML
cat Cargo.toml | pick package.version
# 0.1.0

# HTTP headers
curl -sI https://example.com | pick content-type
# text/html; charset=UTF-8

# logfmt
echo 'level=info msg="request handled" status=200' | pick status
# 200

# CSV
cat data.csv | pick '[0].name'
# Alice
```
## Selectors

| Syntax | Description |
|---|---|
| `foo` | Top-level key |
| `foo.bar` | Nested key |
| `foo[0]` | Array index |
| `foo[-1]` | Last element |
| `foo[*].name` | All elements, pluck field |
| `foo[1:3]` | Array slice (elements 1 and 2) |
| `foo[:2]` | First 2 elements |
| `foo[-2:]` | Last 2 elements |
| `[0]` | Index into root array |
| `..name` | Recursive descent — find `name` at any depth |
| `"dotted.key".sub` | Quoted key (for keys containing dots) |
| `name, age` | Multiple selectors (union) |
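As a mental model for the `..name` recursive-descent selector, here is an illustrative Python sketch (not pick's actual implementation, which is in Rust):

```python
def descend(node, key):
    """Collect every value stored under `key`, at any depth."""
    found = []
    if isinstance(node, dict):
        if key in node:
            found.append(node[key])
        for value in node.values():
            found.extend(descend(value, key))
    elif isinstance(node, list):
        for item in node:
            found.extend(descend(item, key))
    return found

doc = {"id": 1, "items": [{"id": 2}, {"meta": {"id": 3}}]}
print(descend(doc, "id"))  # [1, 2, 3]
```

Slices like `foo[1:3]` and `foo[-2:]` follow the same half-open, negative-index conventions as Python list slicing.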
## Pipes & Filters

Chain operations with the pipe operator (`|`), filter with `select()`, and transform with builtins:

```sh
# Filter: find expensive items
cat data.json | pick 'items[*] | select(.price > 100) | name'

# Regex: match patterns
cat data.json | pick 'items[*] | select(.email ~ "@gmail\\.com$") | name'

# Boolean logic: and, or, not
cat data.json | pick 'users[*] | select(.age >= 18 and .active == true) | name'

# Builtins: keys, values, length
cat config.json | pick 'keys()'
cat config.json | pick 'dependencies | length()'
cat data.json | pick 'items[*].name | length()'

# Chain multiple stages
cat data.json | pick 'items[*] | select(.price > 50) | name | length()'
```
### Filter Operators

| Operator | Example | Description |
|---|---|---|
| `==` | `select(.role == "admin")` | Equality |
| `!=` | `select(.status != "deleted")` | Inequality |
| `>` | `select(.price > 100)` | Greater than |
| `<` | `select(.age < 18)` | Less than |
| `>=` | `select(.score >= 90)` | Greater or equal |
| `<=` | `select(.count <= 10)` | Less or equal |
| `~` | `select(.name ~ "^A")` | Regex match |
| `and` | `select(.a > 1 and .b < 5)` | Logical AND |
| `or` | `select(.x == 1 or .y == 2)` | Logical OR |
| `not` | `select(not .deleted)` | Logical NOT |
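Each operator is evaluated per element: `select()` keeps an element only when its predicate holds. Purely to illustrate the semantics (not pick's implementation), a Python sketch with made-up sample data:

```python
import re

def select(items, predicate):
    """Keep only the elements for which the predicate holds."""
    return [item for item in items if predicate(item)]

users = [
    {"name": "Alice", "price": 120, "active": True},
    {"name": "Bob",   "price": 80,  "active": True},
    {"name": "Ann",   "price": 150, "active": False},
]

# select(.price > 100)
print(select(users, lambda u: u["price"] > 100))
# [{'name': 'Alice', ...}, {'name': 'Ann', ...}]

# select(.name ~ "^A" and .active == true)  — "~" behaves like a regex search
print(select(users, lambda u: re.search(r"^A", u["name"]) and u["active"]))
# [{'name': 'Alice', ...}]
```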
### Builtins

| Builtin | Description |
|---|---|
| `keys()` | Get object keys or array indices |
| `values()` | Get object values |
| `length()` | Length of array, object, or string |
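In Python terms, these builtins map roughly onto `dict.keys()`, `dict.values()`, and `len()` (illustrative only, with made-up data):

```python
config = {"name": "pick", "version": "0.1.0", "deps": ["serde", "clap"]}

print(list(config.keys()))    # keys()   -> ['name', 'version', 'deps']
print(list(config.values()))  # values() -> ['pick', '0.1.0', ['serde', 'clap']]
print(len(config["deps"]))    # length() of an array  -> 2
print(len("pick"))            # length() of a string  -> 4
```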
## Mutation

Modify data in-place with `set()` and `del()`:

```sh
# Set a value
cat config.json | pick 'set(.version, "2.0")' --json

# Delete a key
cat config.json | pick 'del(.temp)' --json

# Chain mutations
cat data.json | pick 'set(.status, "active") | del(.temp)' --json

# Mutate then extract
cat config.json | pick 'set(.version, "2.0") | version'
```
## Output Formats

Convert between formats with `--output`:

```sh
# JSON to YAML
cat data.json | pick -o yaml

# JSON to TOML
cat data.json | pick 'config' -o toml

# Always output JSON
cat data.json | pick 'name' --json
```
## Streaming (JSONL)

Process newline-delimited JSON (JSONL) line-by-line with `--stream`:

```sh
# Extract from each line
cat events.jsonl | pick 'user.name' --stream

# Filter streamed data
cat logs.jsonl | pick 'select(.level == "error") | message' --stream

# Stream from file
pick -f events.jsonl 'timestamp' --stream
```
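Conceptually, stream mode treats each line as an independent JSON document and applies the selector to it, so memory stays flat regardless of file size. A Python sketch of that loop (illustrative, with made-up log lines):

```python
import json

def stream(lines, pick_fn):
    """Parse each JSONL line on its own and yield the picked value."""
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        record = json.loads(line)
        result = pick_fn(record)
        if result is not None:
            yield result

jsonl = '{"level":"error","message":"disk full"}\n{"level":"info","message":"ok"}\n'
errors = list(stream(jsonl.splitlines(),
                     lambda r: r["message"] if r["level"] == "error" else None))
print(errors)  # ['disk full']
```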
## Flags

| Flag | Description |
|---|---|
| `-i, --input <format>` | Force input format (`json`, `yaml`, `toml`, `env`, `headers`, `logfmt`, `csv`, `text`) |
| `-o, --output <format>` | Output format (`json`, `yaml`, `toml`) |
| `-f, --file <path>` | Read from file instead of stdin |
| `--json` | Output result as JSON |
| `--raw` | Output without trailing newline |
| `-1, --first` | Only output first result |
| `--lines` | Output array elements one per line |
| `-d, --default <value>` | Default value if selector doesn't match |
| `-q, --quiet` | Suppress error messages |
| `-e, --exists` | Check if selector matches (exit code only) |
| `-c, --count` | Output count of matches |
| `--stream` | Stream mode: process JSONL input line-by-line |
## Examples

### Pipe-friendly

```sh
# Get a GitHub user's repos
curl -s https://api.github.com/users/octocat/repos | pick '[*].name' --lines

# Check if a key exists before using it
if cat config.json | pick database.host --exists; then
  DB_HOST=$(cat config.json | pick database.host)
fi

# Extract with a fallback
cat config.yaml | pick server.port --default 3000

# Count results
echo '[1,2,3,4,5]' | pick '[*]' --count
# 5
```
### Real-world

```sh
# Docker container status
docker inspect mycontainer | pick '[0].State.Status'

# Kubernetes pod IPs
kubectl get pods -o yaml | pick 'items[*].status.podIP' --lines

# Find all IDs recursively in any JSON
cat response.json | pick '..id'

# Cargo.toml dependencies
pick -f Cargo.toml dependencies.serde.version

# GitHub API: filter Rust repos with more than 100 stars
curl -s https://api.github.com/users/octocat/repos | \
  pick '[*] | select(.language == "Rust" and .stargazers_count > 100) | name'

# Terraform outputs
terraform output -json | pick '..value'

# npm: count dependencies
cat package.json | pick 'dependencies | keys() | length()'

# .env database URL for a script
export DB=$(cat .env | pick DATABASE_URL)
```
## Formats

| Format | Auto-detected | Example |
|---|---|---|
| JSON | Yes | `{"key": "value"}` |
| YAML | Yes | `key: value` |
| TOML | Yes | `[section]` / `key = "value"` |
| .env | Yes | `KEY=value` |
| HTTP headers | Yes | `Content-Type: text/html` |
| logfmt | Yes | `level=info msg="hello"` |
| CSV/TSV | Yes | `name,age\nAlice,30` |
| Plain text | Fallback | Key-value extraction and substring search |

Auto-detection works in most cases. Use `-i` to override when the input is ambiguous.
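The real heuristics live in `detector.rs`; purely to illustrate the idea of format sniffing, here is a deliberately naive Python sketch (the ordering and category names are made up, not pick's actual rules):

```python
import json

def detect(text):
    """Very rough format guess — real detectors are far more careful."""
    stripped = text.strip()
    # Valid JSON wins outright.
    try:
        json.loads(stripped)
        return "json"
    except ValueError:
        pass
    first = stripped.splitlines()[0] if stripped else ""
    # "Key: value" suggests YAML or HTTP headers.
    if ": " in first and "=" not in first:
        return "yaml-or-headers"
    # "KEY=value" suggests .env or logfmt.
    if "=" in first:
        return "env-or-logfmt"
    return "text"

print(detect('{"key": "value"}'))         # json
print(detect("KEY=value"))                # env-or-logfmt
print(detect("Content-Type: text/html"))  # yaml-or-headers
```

This is why ambiguous inputs (e.g. a bare number is valid JSON) sometimes need the `-i` override.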
## Install

### Cargo (Rust)

```sh
cargo install pick-cli
```

### Homebrew (macOS/Linux)

```sh
brew install aryanbhosale/pick/pick
```

### npm

```sh
npm install -g @aryanbhosale/pick
```

### Snap (Linux)

```sh
snap install pick-cli
```

### Chocolatey (Windows)

```sh
choco install pick
```

### WinGet (Windows)

```sh
winget install aryanbhosale.pick
```

### Docker

```sh
echo '{"foo":"bar"}' | docker run -i ghcr.io/aryanbhosale/pick foo
```

### GitHub Releases

Download pre-built binaries from Releases — macOS (ARM/x64), Linux (x64/ARM64), and Windows (x64).

### From source

Requires Rust 1.85+:

```sh
git clone https://github.com/aryanbhosale/pick.git
cd pick
cargo install --path .
```
## Contributing

Contributions are welcome! Here's how to get started:

1. Fork the repository
2. Create a feature branch: `git checkout -b my-feature`
3. Make your changes
4. Run the tests: `cargo test`
5. Commit and push: `git push origin my-feature`
6. Open a pull request
### Development

```sh
# Run the full test suite (866 tests)
cargo test

# Run a specific test
cargo test test_name

# Build the release binary
cargo build --release
# The binary will be at target/release/pick
```
## Project Structure

```text
src/
  main.rs          Entry point, stdin/file reading, streaming
  lib.rs           Orchestration and format routing
  cli.rs           CLI argument definitions
  error.rs         Error types
  selector/        Selector engine (modular)
    types.rs       AST types (Expression, Pipeline, Selector, Filter, etc.)
    parser.rs      Hand-rolled recursive descent parser
    extract.rs     Path traversal and pipeline execution
    filter.rs      Filter evaluation (select, comparisons, regex)
    manipulate.rs  set() and del() operations
  detector.rs      Format auto-detection heuristics
  output.rs        Output formatting (plain, JSON, YAML, TOML)
  streaming.rs     JSONL streaming processor
  formats/         Per-format parsers
    json.rs, yaml.rs, toml_format.rs, env.rs,
    headers.rs, logfmt.rs, csv_format.rs, text.rs
tests/
  integration.rs   CLI integration tests
```
## Issues

Found a bug or have a feature request? Open an issue.

## License

MIT