Pavel Stehule: using jq for processing PostgreSQL logs in json format

PostgreSQL supports logging in JSON format. From my perspective, JSON logs are hard to read directly, but they allow machine processing, and with good tools it is beautifully simple. There are several tools for JSON processing; I use jq. For a simple analysis of errors in the log, I can use a sequence of commands:

    cat postgresql-Sun.json | jq 'select(.error_severity=="ERROR").message' | sort -n | uniq -c

      1 "canceling statement due to user request"
      1 "column \"de.id\" must appear in the GROUP BY clause or be used in an aggregate function"
      1 "column reference \"modify_time\" is ambiguous"
      3 "column \"us.show_name\" must appear in the GROUP BY clause or be used in an aggregate function"
     24 "current transaction is aborted, commands ignored until end of transaction block"
      3 "deadlock detected"

For transformation to CSV and viewing it in pspg:

    cat postgresql-Sun.json | \
      jq -r 'select(.error_severity=="ERROR")
             | [.timestamp, .user, .ps, .error_severity, .message]
             | @csv' | \
      pspg --csv

Note: I split the jq command across several lines for readability; in real life it is one line. With these tools, working with the log is "almost" pleasant and friendly (pspg supports sorting, searching, and the clipboard).
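The examples above assume the server is already producing JSON log files. As a minimal sketch, assuming PostgreSQL 15 or newer (where the jsonlog destination was added), the relevant settings in postgresql.conf could look like this; the log_filename value is a hypothetical choice that matches the per-weekday file name used above:

    # collect server log output into files (changing this requires a restart)
    logging_collector = on
    # write JSON-format log files; 'jsonlog' can be combined with other destinations
    log_destination = 'jsonlog'
    # hypothetical naming scheme producing per-weekday files such as postgresql-Sun.json
    log_filename = 'postgresql-%a.log'

With the jsonlog destination, the .log suffix in log_filename is replaced by .json, so a setup like this yields file names of the form used in the commands above. Once the JSON files exist, the same jq approach extends naturally; for example, counting errors per user instead of per message (same log file and field names as above):

    jq -r 'select(.error_severity=="ERROR").user' postgresql-Sun.json | sort | uniq -c | sort -rn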
