You could have a Telegraf agent scrape your Apache logs with the tail input plugin (parsing each entry into the desired metrics at the same time) and save them to a flat file in CSV format with the file output plugin. Then you could have a script parse the CSV file and upload it to MySQL at your desired cadence. Or you could even have Telegraf execute the script for you at your desired batch interval with the exec output plugin.
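A rough sketch of what that pipeline could look like in a Telegraf config. The plugin names (tail, file, exec) are real Telegraf plugins, but the file paths, the grok pattern, and the script name are illustrative assumptions; check the plugin READMEs for the options your Telegraf version supports.

```toml
# Tail the Apache access log and parse each line into metrics.
# (Path and grok pattern are assumptions for illustration.)
[[inputs.tail]]
  files = ["/var/log/apache2/access.log"]
  data_format = "grok"
  grok_patterns = ["%{COMBINED_LOG_FORMAT}"]

# Write the parsed metrics to a flat file as CSV.
[[outputs.file]]
  files = ["/tmp/apache_metrics.csv"]
  data_format = "csv"

# Alternative: have Telegraf pipe each batch to your own upload
# script at the flush interval instead of writing a file.
# (Script path is hypothetical.)
# [[outputs.exec]]
#   command = ["/usr/local/bin/upload_to_mysql.sh"]
#   data_format = "csv"
```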
Yep, you can create a filter in jq to do that. Alternatively, if you prefer Python syntax, you could try jello, which works like jq but is really Python under the hood. (I am also the author of jello.)
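To make the comparison concrete, here is a minimal sketch in plain Python of the kind of filter being discussed, pulling the entries whose `"state"` is `"succeeded"` out of a `resources` array (as in the question below). The sample document and its field names are invented for illustration; a jello expression would be essentially the comprehension shown here, and the rough jq equivalent would be `.resources[] | select(.state == "succeeded") | .name`.

```python
import json

# Hypothetical payload: a "resources" array whose entries each
# carry a "state" field, like the question in the related posts.
doc = json.loads("""
{
  "resources": [
    {"name": "a", "state": "succeeded"},
    {"name": "b", "state": "failed"},
    {"name": "c", "state": "succeeded"}
  ]
}
""")

# Keep the names of the resources that succeeded. Inside jello this
# same comprehension works directly, with the parsed JSON bound to _.
succeeded = [r["name"] for r in doc["resources"] if r["state"] == "succeeded"]
print(succeeded)  # ['a', 'c']
```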
Related posts
- the case for bash
- Parsing Complex JSON
- Anyone have a resource to filter out information from complex json outputs? In the example, I am trying to get the "state": "succeeded" information for each entry in the resource array.
- FX: An interactive alternative to jq to process JSON
- Json file handling with python