
# Glob Patterns & Multi-File Mode

Several commands support glob patterns for batch processing multiple files.

| Command | Glob Support | Output Behavior |
| --- | --- | --- |
| `split` | ✓ | Creates a subdirectory per input file |
| `analyze` | ✓ | Aggregates results across all files |
| `convert` | ✓ | Requires an output directory |
| `validate` | ✓ | Validates each file, aggregates results |
```sh
# All SQL files in the current directory
sql-splitter analyze "*.sql"

# All SQL files in a directory
sql-splitter validate "dumps/*.sql"

# Recursive (all subdirectories)
sql-splitter analyze "backups/**/*.sql"

# Compressed files
sql-splitter validate "dumps/*.sql.gz"
```

**Important:** Quote glob patterns to prevent the shell from expanding them before sql-splitter sees them.
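A quick way to see the difference (a generic shell sketch, not specific to sql-splitter): an unquoted pattern is expanded by the shell before the command runs, while a quoted one reaches the command verbatim.

```sh
# Set up two sample files (illustrative paths)
mkdir -p /tmp/glob-demo && cd /tmp/glob-demo
touch a.sql b.sql

echo *.sql      # shell expands first: prints "a.sql b.sql"
echo "*.sql"    # quoted: the command receives the literal pattern "*.sql"
```

This matters most for `**` patterns, which many shells do not expand recursively by default; quoting lets the tool handle expansion consistently.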

| Pattern | Matches |
| --- | --- |
| `*` | Any characters except `/` |
| `**` | Any characters, including `/` (recursive) |
| `?` | Any single character |
| `[abc]` | Any one of the bracketed characters |
| `[a-z]` | Any character in the range |
```sh
# All .sql files in dumps/
sql-splitter analyze "dumps/*.sql"

# All .sql and .sql.gz files recursively
sql-splitter validate "backups/**/*.sql*"

# Files starting with "prod_"
sql-splitter analyze "prod_*.sql"

# Files from 2024
sql-splitter validate "backup_2024*.sql"
```
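The table above follows standard glob semantics, so a pattern can be sanity-checked in the shell before running a long batch job. This is a generic POSIX `case`-based sketch (the helper name `matches` is hypothetical), unrelated to sql-splitter's own matcher:

```sh
# Return 0 if a name matches a glob pattern, using POSIX case patterns
matches() {
  case "$1" in
    $2) return 0 ;;   # unquoted $2 is interpreted as a glob pattern
    *)  return 1 ;;
  esac
}

matches "dump_1.sql"    "dump_?.sql"        && echo "? matches one char"
matches "backup_q1.sql" "backup_[a-z]1.sql" && echo "range matches"
matches "prod_users.sql" "prod_*.sql"       && echo "prefix matches"
```

Note that `case` patterns, unlike path-aware globbing, let `*` match `/` as well; for path patterns the table above applies.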

By default, glob processing continues even if one file fails. Use `--fail-fast` to stop on the first error:

```sh
# Stop on first error
sql-splitter validate "*.sql" --fail-fast

# Continue despite errors (default)
sql-splitter validate "*.sql"
```

The `analyze` command outputs aggregate statistics across all files:

```sh
sql-splitter analyze "dumps/*.sql" --json
```

```json
{
  "files_processed": 5,
  "total_tables": 42,
  "total_rows": 150000,
  "results": [
    { "file": "dumps/users.sql", "tables": 1, "rows": 1000 },
    { "file": "dumps/orders.sql", "tables": 1, "rows": 50000 }
  ]
}
```
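Fields from the aggregated report can be pulled out with a JSON processor; this sketch assumes `jq` is installed (any JSON tool works):

```sh
# Total rows across all processed files
sql-splitter analyze "dumps/*.sql" --json | jq '.total_rows'

# Per-file row counts as "file: rows" lines
sql-splitter analyze "dumps/*.sql" --json | \
  jq -r '.results[] | "\(.file): \(.rows)"'
```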

The `validate` command reports validation status for each file:

```sh
sql-splitter validate "*.sql" --json
```

```json
{
  "total_files": 3,
  "passed": 2,
  "failed": 1,
  "results": [
    { "file": "a.sql", "summary": { "errors": 0, "warnings": 0 } },
    { "file": "b.sql", "summary": { "errors": 0, "warnings": 1 } },
    { "file": "c.sql", "summary": { "errors": 2, "warnings": 0 } }
  ]
}
```
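For scripting, it is often enough to list only the files that failed. A sketch assuming `jq` is available:

```sh
# Print only files whose validation reported errors
sql-splitter validate "*.sql" --json | \
  jq -r '.results[] | select(.summary.errors > 0) | .file'
```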

The `split` command creates a subdirectory for each input file:

```sh
sql-splitter split "dumps/*.sql" -o output/
```

```
output/
├── dump1/
│   ├── users.sql
│   └── orders.sql
└── dump2/
    ├── users.sql
    └── products.sql
```
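Once a batch split finishes, the per-dump subdirectories can be walked with ordinary shell loops. A generic sketch (directory layout assumed as shown above):

```sh
# Count the table files produced for each input dump
for dir in output/*/; do
  printf '%s: %s table files\n' "$dir" "$(find "$dir" -name '*.sql' | wc -l)"
done
```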

The `convert` command requires an output directory (not a file):

```sh
# Correct: output directory
sql-splitter convert "*.sql" --to postgres -o converted/

# Creates:
# converted/
# ├── file1.sql
# ├── file2.sql
```

The `--progress` flag shows per-file and overall progress:

```sh
sql-splitter validate "backups/**/*.sql.gz" --progress
```

```
[1/5] backups/2024/jan.sql.gz ✓
[2/5] backups/2024/feb.sql.gz ✓
[3/5] backups/2024/mar.sql.gz ⚠ 2 warnings
[4/5] backups/2024/apr.sql.gz ✓
[5/5] backups/2024/may.sql.gz ✗ 1 error

Summary: 4 passed, 1 failed
```

For parallel processing of many files, combine sql-splitter with external tools such as `find` and `xargs`:

```sh
# Process 4 files in parallel
find dumps -name '*.sql.gz' -print0 | \
  xargs -0 -n1 -P4 sql-splitter validate --strict
```

```yaml
# GitHub Actions example
- name: Validate all SQL dumps
  run: |
    sql-splitter validate "migrations/*.sql" --strict --json > validation.json
    if jq -e '.failed > 0' validation.json; then
      echo "Validation failed"
      exit 1
    fi
```