openai-python vs llama_index

| | openai-python | llama_index |
|---|---|---|
| Mentions | 71 | 78 |
| Stars | 27,279 | 42,912 |
| Growth | 1.7% | 2.4% |
| Activity | 9.7 | 9.9 |
| Last commit | 4 days ago | 4 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
openai-python
- Structured Output with LangChain and Llamafile
- Building an Azure OpenAI Chatbot: Challenges, Solutions & Why JavaScript Beats Python for the Web
Check the official migration guide for updates.
- xAI Has Acquired X
Okay, I know Tesla's extremely high P/E ratio is because its worth is not just tied to cars, and so xAI priced at $20B more than Anthropic does not necessarily mean xAI's AI products are that much better than Anthropic's (e.g. presumably xAI's worth is tied to synergies with Tesla FSD, Optimus, and maybe even Neuralink)...but what products does xAI actually offer, other than Grok being an add-on for premium X subscriptions?
Not only does the Grok API not have access to Grok 3, which was released more than a month ago, it doesn't even have its own SDK? [0]
> Some of Grok users might have migrated from other LLM providers. xAI API is designed to be compatible with both OpenAI and Anthropic SDKs, except certain capabilities not offered by respective SDK. If you can use either SDKs, we recommend using OpenAI SDK for better stability.
(every code example has a call for `from openai import OpenAI`)
How would using Grok be viable for any enterprise? And if Grok's API is designed to be a drop-in replacement for OpenAI's, how are they not able to just use Grok to whip up their own SDK variant based on OpenAI's open-sourced SDK [1] and API spec?
[0] https://docs.x.ai/docs/guides/migration
[1] https://github.com/openai/openai-python
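To illustrate the OpenAI-SDK compatibility the quoted xAI docs describe, here is a minimal sketch: it simply points the openai client at a different base URL. The base URL and model name below are assumptions, not details confirmed by the comment or by xAI's documentation here.

```python
# Sketch only: base URL and model name are assumptions; check https://docs.x.ai before use.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_XAI_API_KEY",       # placeholder credential
    base_url="https://api.x.ai/v1",   # assumed xAI endpoint
)

response = client.chat.completions.create(
    model="grok-2-latest",            # assumed model identifier
    messages=[{"role": "user", "content": "Summarize today's top post on X."}],
)
print(response.choices[0].message.content)
```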
- New Tools for Building Agents
If you want to get an idea of the changes, here's a giant commit where they updated ALL of the Python library examples in one go from the old chat completions to the new resources APIs: https://github.com/openai/openai-python/commit/2954945ecc185...
- Build your next AI Tech Startup with DeepSeek
The API itself is pretty straightforward. You can use it with the OpenAI package from npm or pip, or make an HTTP request directly. Note that for this demo I will be using Node.js, working in an empty folder with an index.js file and a package.json file.
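The post's demo is in Node.js, but since both projects compared here are Python, a rough Python equivalent of the same idea is sketched below; the DeepSeek base URL and model name are assumptions to verify against DeepSeek's own docs.

```python
# Sketch: reuse the openai client against an OpenAI-compatible provider.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_DEEPSEEK_API_KEY",        # placeholder credential
    base_url="https://api.deepseek.com",    # assumed DeepSeek endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",                  # assumed model name
    messages=[{"role": "user", "content": "Pitch me an AI startup idea in one sentence."}],
)
print(response.choices[0].message.content)
```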
- Introduction to Using Generative AI Models: Create Your Own Chatbot!
To interact with the OpenAI API, you will install the openai package:
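A minimal sketch of that install plus a first request might look like the following; the model name is an assumption rather than something the post specifies.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whichever model you use
    messages=[{"role": "user", "content": "Say hello to my chatbot's first user."}],
)
print(response.choices[0].message.content)
```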
- Exploring Job Market for Software Engineers
Python was chosen for its versatile libraries, particularly linkedin_jobs_scraper and openai. These packages streamlined the scraping and processing of job data.
- OpenAI adds new o1 models
- LLM Fine-Tuning: Domain Embeddings with GPT-3
The essential library for this project is OpenAI, supported by two helper libraries. Install them with the Poetry dependency manager as shown:
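A rough sketch of that setup and a first embedding call follows; the embedding model name is an assumption, and the two helper libraries are not named in the excerpt, so they are left out.

```python
# poetry add openai   (the two helper libraries from the post are not named in the excerpt)
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Embed a couple of domain-specific snippets; the model name is an assumed example.
result = client.embeddings.create(
    model="text-embedding-ada-002",
    input=[
        "Refund policy for enterprise customers",
        "Onboarding checklist for new hires",
    ],
)
vectors = [item.embedding for item in result.data]
print(len(vectors), len(vectors[0]))
```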
- The Stainless SDK Generator
llama_index
- Complete Large Language Model (LLM) Learning Roadmap
Resource: LlamaIndex Documentation
- Quick tip: Replace MongoDB® Atlas with SingleStore Kai in LlamaIndex
The notebook is adapted from the LlamaIndex GitHub repo.
- Show HN: Route your prompts to the best LLM
- LlamaIndex: A data framework for your LLM applications
- FLaNK AI - 01 April 2024
- Show HN: Ragdoll Studio (fka Arthas.AI) is the FOSS alternative to character.ai
For anyone curious about LlamaIndex's "prompt mixins": they're actually dead simple: https://github.com/run-llama/llama_index/blob/8a8324008764a7... - and maybe no longer supported.
I basically reinvented this wheel in ragdoll but made it more dynamic: https://github.com/bennyschmidt/ragdoll/blob/master/src/util...
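For readers who don't want to dig through the linked source, here is a generic sketch of the prompt-mixin idea being described, not LlamaIndex's actual implementation: components that own prompt templates expose them so callers can inspect and override them.

```python
# Generic illustration of the "prompt mixin" pattern; class and method names are
# invented for this sketch and do not mirror LlamaIndex's real classes.
from typing import Dict


class PromptMixin:
    """Anything that owns prompt templates exposes them for inspection and override."""

    def __init__(self) -> None:
        self._prompts: Dict[str, str] = {}

    def get_prompts(self) -> Dict[str, str]:
        return dict(self._prompts)

    def update_prompts(self, new_prompts: Dict[str, str]) -> None:
        self._prompts.update(new_prompts)


class SummaryEngine(PromptMixin):
    def __init__(self) -> None:
        super().__init__()
        self._prompts["summary"] = "Summarize the following text:\n{text}"

    def run(self, text: str) -> str:
        return self._prompts["summary"].format(text=text)


engine = SummaryEngine()
engine.update_prompts({"summary": "Give a one-line TL;DR of:\n{text}"})
print(engine.run("Prompt mixins let callers swap a component's prompts at runtime."))
```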
- LlamaIndex is a data framework for your LLM applications
- How to verify that a snippet of Python code doesn't access protected members
- Local & Open Source AI: a kind ollama & LlamaIndex intro
Being able to plug in third-party frameworks (LangChain, LlamaIndex) so you can build more complex projects
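As a concrete sketch of that kind of plug-in, the snippet below drives a local Ollama model through LlamaIndex; the import path assumes the post-0.10 package split (llama-index-core plus llama-index-llms-ollama), and the model name assumes you have already pulled it with Ollama.

```python
# pip install llama-index-llms-ollama   (assumed package name for the Ollama integration)
from llama_index.llms.ollama import Ollama

# Assumes an Ollama server is running locally and `ollama pull mistral` has been done.
llm = Ollama(model="mistral", request_timeout=120.0)

print(llm.complete("In one sentence, what is retrieval-augmented generation?"))
```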
- I made an app that runs Mistral 7B 0.2 LLM locally on iPhone Pros
Mistral Instruct does use a system prompt.
You can see the raw format here: https://www.promptingguide.ai/models/mistral-7b#chat-templat... and you can see how LlamaIndex uses it here (as an example): https://github.com/run-llama/llama_index/blob/1d861a9440cdc9...
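A rough sketch of the Mistral-Instruct prompt shape being referenced is below; the exact system-prompt handling varies between implementations, and folding it into the first user turn is just one common convention, not necessarily what LlamaIndex does.

```python
# Sketch of the [INST] ... [/INST] wrapping used by Mistral-7B-Instruct prompts.
# Prepending the system prompt to the first user turn is an assumption, not the
# only convention in the wild.
def format_mistral_prompt(system: str, user: str) -> str:
    return f"<s>[INST] {system}\n\n{user} [/INST]"


print(format_mistral_prompt(
    "You are a concise assistant.",
    "Explain what a chat template is.",
))
```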
What are some alternatives?
Awesome-LLMOps - An awesome & curated list of best LLMOps tools for developers
langchain - 🦜🔗 Build context-aware reasoning applications
maelstrom - A workbench for writing toy implementations of distributed systems.
chatgpt-retrieval-plugin - The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
openai-node - Official JavaScript / TypeScript library for the OpenAI API
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]