For better or worse, I inherited some Express.js apps running on Heroku. It's been a year and it was mostly fine until recently. I need to stream a response back to the client, and while Node and Express support that quite well, Heroku has a response buffer that I haven't found a way to flush from within the app or disable with any configuration. Oh well, I'm not interested in Heroku's pricing, security breaches, recent downtimes, etc. Thanks to this subreddit, I came upon the recommendations for render.com, and this morning I quickly spun up a free tier test server and confirmed that they do indeed support response streaming without any buffering. They even gzipped each response chunk. I don't know much else about render.com, and who knows if I'll actually be migrating, but in case anyone else needs this info, here it is :)
Logging the response chunks as they're received, all chunks arrive at the same time. When the app runs elsewhere (locally, for example), you can watch each chunk arrive one after another. Now, that doesn't actually tell us that Heroku is buffering the response, just that something between the client and server is. For example, I found that Firefox will buffer the first 512B of a response for content-type sniffing when no Content-Type header is present. Basically, by testing various combinations of factors, I isolated Heroku as the common denominator, found some hints that a proxy server called Vegur may be the piece of infrastructure to blame, and then finally found this from Heroku themselves that calls the buffering out as a feature.
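The chunk-timing check described above can be sketched like this (the function name is mine; assumes Node 18+ where `fetch` is a global, but the same code works in a browser console):

```javascript
// Read the response body as a stream and record when each chunk arrives.
// If every chunk shows roughly the same elapsed time, something between
// the client and server buffered the whole response; spread-out
// timestamps mean streaming actually worked end to end.
async function logChunkTimings(url) {
  const res = await fetch(url);
  const reader = res.body.getReader();
  const start = Date.now();
  const timings = [];
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    timings.push({ elapsedMs: Date.now() - start, bytes: value.length });
    console.log(`+${Date.now() - start}ms: ${value.length} bytes`);
  }
  return timings;
}
```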