# Unix Piping
sql-splitter follows the Unix philosophy: it reads from stdin, writes to stdout, and composes cleanly with other tools in pipelines.
## Standard I/O
Omit `-o` to write to stdout, enabling pipes:
```bash
sql-splitter convert mysql.sql --to postgres | psql "$PG_CONN"
```

Read from stdin with `-`:
```bash
cat dump.sql | sql-splitter analyze -
```

Note: Most commands default to stdout when `-o` is not specified.
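Because `-` and the stdout default compose, sql-splitter can sit anywhere in a pipeline. A minimal sketch for an ad-hoc check, assuming `validate` accepts `-` for stdin as the later examples show (the inline SQL is illustrative):

```bash
# Validate a snippet straight from the shell, no temporary file
echo "SELECT 1;" | sql-splitter validate -
```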
## Pipeline Examples
### Convert and Import
```bash
# MySQL to PostgreSQL, direct import
sql-splitter convert mysql.sql.gz --to postgres | psql "$PG_CONN"
```
```bash
# PostgreSQL to MySQL, direct import
sql-splitter convert pg_dump.sql --to mysql | mysql -u user -p db
```
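If the target database should stay untouched when any statement fails, psql can wrap the piped script in a single transaction. A sketch using psql's standard `--single-transaction` flag, with `-f -` to read the script from stdin:

```bash
# Roll back the entire import if any statement fails
sql-splitter convert mysql.sql --to postgres | \
  psql --single-transaction -f - "$PG_CONN"
```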
### Compress on the Fly

```bash
# Merge and compress (omit -o to write to stdout)
sql-splitter merge tables/ | gzip > merged.sql.gz
```
```bash
# Merge and compress with zstd
sql-splitter merge tables/ | zstd > merged.sql.zst
```
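zstd's own flags trade speed for compression ratio without changing the pipeline; a sketch (the level is illustrative):

```bash
# Higher compression level, using all cores (-T0 = auto thread count)
sql-splitter merge tables/ | zstd -19 -T0 > merged.sql.zst
```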
### Filter and Process

```bash
# Sample, redact, and save
sql-splitter sample prod.sql --percent 10 | \
  sql-splitter redact - --hash "*.email" -o dev.sql
```
### Parallel Validation

```bash
# Validate many files in parallel
find dumps -name '*.sql.gz' -print0 | \
  xargs -0 -n1 -P4 sql-splitter validate --strict
```
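xargs aggregates exit statuses (it exits 123 if any invocation fails), so the parallel fan-out can still gate a script:

```bash
# Fail the step if any dump fails validation
find dumps -name '*.sql.gz' -print0 | \
  xargs -0 -n1 -P4 sql-splitter validate --strict || {
  echo "At least one dump failed validation"
  exit 1
}
```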
### Decompress on the Fly

```bash
# Decompress, process, recompress
zcat backup.sql.gz | \
  sql-splitter convert - --to postgres | \
  gzip > backup_pg.sql.gz
```
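The same pattern works for zstd archives, pairing `zstdcat` with the `zstd` compression shown earlier:

```bash
# Decompress zstd, convert, recompress
zstdcat backup.sql.zst | \
  sql-splitter convert - --to postgres | \
  zstd > backup_pg.sql.zst
```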
## Combining Commands

### Full Dev Dataset Pipeline
```bash
sql-splitter sample prod.sql.gz --percent 10 --preserve-relations | \
  sql-splitter redact - --hash "*.email" --fake "*.name" | \
  sql-splitter validate - --strict && \
  echo "Valid dev dataset created"
```
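As written, `validate` consumes the stream, so the redacted dataset itself is not saved anywhere. Inserting a `tee` persists it while still validating; a sketch, with `dev.sql` as an assumed output path:

```bash
sql-splitter sample prod.sql.gz --percent 10 --preserve-relations | \
  sql-splitter redact - --hash "*.email" --fake "*.name" | \
  tee dev.sql | \
  sql-splitter validate - --strict && \
  echo "Valid dev dataset created: dev.sql"
```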
### Migration with Validation

```bash
# Validate, convert, validate, import
sql-splitter validate source.sql --strict && \
sql-splitter convert source.sql --to postgres | \
  sql-splitter validate - --dialect postgres --strict && \
sql-splitter convert source.sql --to postgres | \
  psql "$PG_CONN"
```
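This runs the conversion twice: once to validate and once to import. Converting once into a temporary file avoids the duplicate work; a sketch using standard `mktemp`:

```bash
# Convert once, validate the artifact, then import it
tmp=$(mktemp)
sql-splitter validate source.sql --strict && \
  sql-splitter convert source.sql --to postgres -o "$tmp" && \
  sql-splitter validate "$tmp" --dialect postgres --strict && \
  psql "$PG_CONN" < "$tmp"
rm -f "$tmp"
```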
## Exit Code Handling

```bash
# Stop on first failure
set -e
sql-splitter validate dump.sql --strict
sql-splitter convert dump.sql --to postgres -o output.sql
```
```bash
# Or explicit checking
if sql-splitter validate dump.sql --strict; then
  sql-splitter convert dump.sql --to postgres -o output.sql
else
  echo "Validation failed"
  exit 1
fi
```
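Note that `set -e` alone ignores failures on the left side of a pipe; bash's `pipefail` option makes a pipeline's exit status reflect any failing stage:

```bash
# Fail if any stage of the pipeline fails, not just the last one
set -euo pipefail
sql-splitter convert dump.sql --to postgres | psql "$PG_CONN"
```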
## JSON Pipeline

```bash
# Extract specific info with jq
sql-splitter analyze dump.sql --json | jq '.tables[].name'
```
```bash
# Filter validation results
sql-splitter validate "*.sql" --json | \
  jq '.results[] | select(.passed == false)'
```
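jq's `-e` flag derives its exit status from the output, so the same filter can gate a CI step; a sketch assuming the `.results` shape shown above:

```bash
# Exit non-zero when any file failed validation
if sql-splitter validate "*.sql" --json | \
   jq -e 'any(.results[]; .passed == false)' > /dev/null; then
  echo "Some dumps failed validation"
  exit 1
fi
```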
## Tee for Logging

```bash
# Process and log
sql-splitter convert dump.sql --to postgres | \
  tee conversion.log | \
  psql "$PG_CONN"
```
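tee can also fan out to a compressing process via bash process substitution, keeping the on-disk log small (requires bash, not plain POSIX sh):

```bash
# Keep a compressed copy of the converted SQL while importing
sql-splitter convert dump.sql --to postgres | \
  tee >(gzip > conversion.log.gz) | \
  psql "$PG_CONN"
```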