This doesn’t seem correct to me. In another pull request, an Ollama contributor said:
> As you pointed out, we carry patches, although in general we try to upstream those.
— https://github.com/ollama/ollama/issues/2534#issuecomment-19...
So I followed the link to his profile and saw that he has opened some non-documentation pull requests for llama.cpp:
https://github.com/ggerganov/llama.cpp/pull/5244
https://github.com/ggerganov/llama.cpp/pull/5576
I didn’t dig any deeper, but it took me less than thirty seconds to find those, so I expect there are more.