I currently have TeslaMate running on a Raspberry Pi that I previously used for Pi-hole. As I deprecate the Pi in favor of my FWG, I just don't know enough to determine whether the added load of TeslaMate running in a Docker container will affect the FWG's core competencies. To disclose: it's for use on my home network, certainly not a highly trafficked environment, and my connection is ~460 Mbps. I'm assuming it's fine, since FWG's own documentation shows how to install Pi-hole, but I don't know how to determine whether TeslaMate will be "too much" for the FWG hardware and affect its performance. Thoughts?
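One hedged way to answer the "too much" question is with numbers rather than guesswork: measure what TeslaMate actually consumes on the Pi today, then compare that against the FWG's idle headroom. A minimal sketch using standard Linux tools (the `docker stats` line is a real Docker CLI command, but it's commented out here since it only applies on the box where the container runs):

```shell
# Run these on both the Pi (with TeslaMate up) and the FWG (at idle),
# then compare the numbers.

# Load averages: sustained values near the CPU core count mean the
# box is saturated; well below it means there's headroom.
uptime

# Memory in MB: the "available" column is what a new container could use.
free -m

# On the Pi only, per-container CPU% and memory for the TeslaMate stack
# (TeslaMate itself, its database, and Grafana each show up separately):
# docker stats --no-stream
```

If TeslaMate's peak CPU and memory on the Pi fit comfortably inside the FWG's idle headroom, it's unlikely to disturb routing or filtering; if they don't, that's the answer before any migration work.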