-
azurechatgpt
Discontinued 🤖 Azure ChatGPT: Private & secure ChatGPT for internal enterprise use 💼
-
sample-app-aoai-chatGPT
Sample code for a simple web chat experience through Azure OpenAI, including Azure OpenAI On Your Data.
-
azure-search-openai-demo
A sample app for the Retrieval-Augmented Generation pattern running in Azure, using Azure AI Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&A experiences.
-
anything-llm
The all-in-one Desktop & Docker AI application with full RAG and AI Agent capabilities.
Assuming you are referring to this section: https://github.com/microsoft/azurechatgpt/blob/main/docs/3-r...
It means you run the front end (the chat GUI) and the backend code from the repo. This code connects to Cosmos DB for uploading the documents used for "chat with your PDF", and to an Azure OpenAI instance for the chat inference.
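For context on what that backend call looks like: a minimal sketch of building an Azure OpenAI chat completions request, assuming hypothetical resource and deployment names (Azure routes requests to a deployment under your own resource endpoint and authenticates with an `api-key` header, unlike the public OpenAI API's Bearer token).

```python
import json

# Hypothetical values -- your Azure resource, deployment, and API version will differ.
RESOURCE = "my-aoai-resource"
DEPLOYMENT = "gpt-35-turbo"
API_VERSION = "2023-05-15"

def build_chat_request(messages):
    """Build the URL, headers, and JSON body for an Azure OpenAI chat call."""
    url = (
        f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
        f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
    )
    headers = {"Content-Type": "application/json", "api-key": "<AZURE_OPENAI_KEY>"}
    body = json.dumps({"messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    [{"role": "user", "content": "Summarize this PDF section."}]
)
```

The request itself can then be sent with any HTTP client; the key difference from the public API is the per-resource endpoint and deployment path.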
How is this different from the other OpenAI GUI? Why another one? https://github.com/microsoft/sample-app-aoai-chatGPT.
There are at least two more. There's also https://github.com/Azure-Samples/azure-search-openai-demo
And you can deploy a chat bot from within the Azure playground which runs on another codebase.
A lot of companies are already using projects like chatbot-ui with Azure's OpenAI service for similar local deployments. Given this is about as close to a local ChatGPT as any project can get, it's a big deal for all the enterprises looking to maintain control over their data.
Shameless plug: Given the sensitivity of the data involved, we believe most companies prefer locally installed solutions to cloud-based ones, at least in the early days. To this end, we just open-sourced LLMStack (https://github.com/TryPromptly/LLMStack), which we have been working on for a few months now. LLMStack is a platform to build LLM apps and chatbots by chaining multiple LLMs and connecting to users' data. There's a quick demo at https://www.youtube.com/watch?v=-JeSavSy7GI. Still early days for the project and there are a few kinks to iron out, but we are very excited about it.
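The "chaining multiple LLMs" idea mentioned above can be illustrated with a toy sketch: each step's output feeds the next step's input. The model calls here are stubs standing in for real LLM calls; this is not LLMStack's actual API.

```python
# Toy LLM chain: outputs flow forward through a pipeline of steps.

def summarize(text: str) -> str:
    # Stub standing in for an LLM call that condenses the input.
    return f"summary({text})"

def translate(text: str) -> str:
    # Stub standing in for a second LLM call that transforms the summary.
    return f"translation({text})"

def run_chain(steps, user_input):
    """Run the input through each step in order, passing outputs forward."""
    result = user_input
    for step in steps:
        result = step(result)
    return result

print(run_chain([summarize, translate], "quarterly report"))
# → translation(summary(quarterly report))
```

Real chaining platforms add prompt templating, retries, and data connectors around this core loop, but the composition pattern is the same.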
This appears to be a web frontend with authentication for Azure's OpenAI API, which is a great choice if you can't use ChatGPT or its API at work.
If you're looking to try the "open" models like Llama 2, definitely check out https://github.com/jmorganca/ollama, or some of the lower-level runners like llama.cpp (which powers the aforementioned project) or Candle, the new project by Hugging Face.
What are folks' takes on this vs Llama 2, which was recently released by Facebook Research? While I haven't tested it extensively, the 70B model is supposed to rival ChatGPT 3.5 in most areas, and there are now some fine-tuned versions that excel at specific tasks, like Codeup.
I'm not surprised Azure would add something like this to the stack. We built AnythingLLM (https://github.com/Mintplex-Labs/anything-llm) back in June due to some enterprise customers wanting something isolated they could run on-premises, with Azure OpenAI support plus any vector DB they want.
With Azure's move to internalize enterprise AI integration, it makes sense to ship a chatbot wrapper, because it's a no-moat move. I think a lot of the "moat", if one can exist in the "chat with your docs" vertical, is just integrations into the flows and data sources SMBs/enterprises are already using.
For businesses, in my experience, the on-prem question has been the first decision point, without question. The Azure wrapper could be nice to have for those who cannot use ChatGPT on their work computer but have access to this instead.
I wonder what kind of hypervisor view it gives Azure admins for those who use it, if any. Multi-tenant instances were the second-highest demand from SMB/enterprise customers for AnythingLLM.