- csv_log_cleaner: Clean CSV files to conform to a type schema by streaming them through small memory buffers using multiple threads, logging any data loss.
- CSVLint: CSV Lint plug-in for Notepad++ offering syntax highlighting, CSV validation, automatic column and datatype detection, fixed-width dataset support, datetime format and decimal separator changes, data sorting, unique value counts, and conversion to XML, JSON, SQL, etc. A plugin for data cleaning and working with messy data files.
Sounds like it could be more of a data cleansing problem you're facing than a type inference one. Even a single non-numerical value in a million rows of numbers will necessarily break type inference for the whole column. I work with a lot of CSVs, and that's one of the issues we have to spend a huge amount of time dealing with. I even ended up writing an open source tool to handle the cleansing: https://github.com/ambidextrous/csv_log_cleaner
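To see how a single stray value poisons inference for the whole column, here's a minimal pandas sketch (pandas is my choice of illustration here, not something the tools above require; the `"oops"` value is a made-up example of dirty data):

```python
import io

import pandas as pd

# One stray string among otherwise numeric rows: pandas falls back to
# the generic "object" dtype for the entire column.
csv_data = "value\n1\n2\n3\noops\n5\n"
df = pd.read_csv(io.StringIO(csv_data))
print(df["value"].dtype)  # object, not int64

# Coercing with errors="coerce" replaces unparseable values with NaN,
# recovering a numeric column (float64, since NaN is a float).
cleaned = pd.to_numeric(df["value"], errors="coerce")
print(cleaned.dtype)  # float64
```

The cleansing step is essentially that `to_numeric` call done at scale, plus a log of which values got dropped so the data loss is visible rather than silent.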
There's also the CSV Lint plug-in for Notepad++, which can detect datatypes; from there you can do CSV Lint > Generate metadata > Python script. Though it might not work correctly for all datetime formats.