langchainjs vs instructor

| | langchainjs | instructor |
|---|---|---|
| Mentions | 12 | 17 |
| Stars | 11,089 | 5,417 |
| Growth | 4.8% | - |
| Activity | 9.9 | 9.8 |
| Latest commit | 5 days ago | about 16 hours ago |
| Language | TypeScript | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
langchainjs
-
On the unpredictable nature of LLM output and type safety in LangChain TS
*** All code examples use LangChain TS on the main branch as of September 22nd, 2023 (roughly version 0.0.153).
-
Moving from Typescript and Langchain to Rust and Loops
At the time of the prototype's development, the Langchain GitHub loader fetched a repository sequentially, sending one request per file, which led to prolonged download times: about 2 minutes for the insights.opensauced.pizza repository in our case. This issue was later resolved in hwchase17/langchainjs#2224, which enabled parallel requests for faster retrieval.
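The difference described above can be sketched in a few lines. This is an illustrative stand-in, not the loader's actual code: `fetch_file` is a hypothetical placeholder for a real GitHub API call, and the point is only the sequential-vs-parallel request pattern.

```python
# Sketch of sequential vs. parallel per-file fetching. fetch_file is a
# hypothetical stand-in for an HTTP request to the GitHub contents API.
from concurrent.futures import ThreadPoolExecutor

def fetch_file(path: str) -> str:
    # Placeholder for a network round trip that returns file contents.
    return f"contents of {path}"

def fetch_sequentially(paths: list[str]) -> list[str]:
    # One request at a time: total wall time grows linearly with file count.
    return [fetch_file(p) for p in paths]

def fetch_in_parallel(paths: list[str], workers: int = 8) -> list[str]:
    # Overlapping requests, the approach the linked fix enabled.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_file, paths))
```

With real network latency, the parallel version's wall time is roughly the slowest single request times the number of batches, rather than the sum of all requests.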
-
ai-utils.js VS langchainjs - a user suggested alternative
2 projects | 26 Jul 2023
Another llm orchestration library for js/ts
-
Ai personal assistant with long term memory?
You will probably need to create a custom agent with custom tools to do what you want to do. Look at Langchain (it seems there is an open PR for Google Calendar tools here: https://github.com/hwchase17/langchainjs/pull/1777). There are a lot of great integration examples on their website, including for vector DB memory: https://python.langchain.com/docs/modules/memory/how_to/vectorstore_retriever_memory
-
Building A Chat GPT Clone With Strapi Open AI and LangChain with Next JS 13 Frontend
You can check out their docs [here](https://js.langchain.com/docs/).
- Show HN: Python package for interfacing with ChatGPT with minimized complexity
-
Is there any project on langchain with scala
The strategy I tried was to point ScalablyTyped at langchainJS.
-
open-source app helps you brainstorm BANGER TWEETS
TL;DR: the BANGER TWEET BRAINSTORMER is an open-source, fullstack React/Express/Postgres/Pinecone app that brainstorms new ideas and tweet drafts based on your own notes/ideas and the tweets of your favorite Twitter users. This isn't a bot; think of it rather as your personal Twitter intern that monitors current Twitter **trends**, keeps note of your **ideas**, helps you **brainstorm** new ones, and writes **draft** tweets. It's your job to find and edit the best ideas before saving them to your personal notes database or tweeting them out from the app itself. It uses pg-boss cron jobs via Wasp, OpenAI, LangChain, and Pinecone for the vector store: https://github.com/vincanger/twitter-brainstorming-agent
- Paid AI to train on company docs?
-
MongoDB and Generative AI
It is not great. It has a lot of limitations, but can be used under certain conditions. https://github.com/hwchase17/langchainjs/pull/655
instructor
- Instructor: Structured Outputs for LLMs
-
Anthropic's Haiku Beats GPT-4 Turbo in Tool Use
Ah yes. Have you tried out instructor [0] or Guidance [1]?
[0]: https://github.com/jxnl/instructor/
- Instructor: Structured Data Like JSON from Large Language Models
-
Show HN: Fructose, LLM calls as strongly typed functions
Good stuff. How does this compare to Instructor? I've been using this extensively
https://jxnl.github.io/instructor/
-
Show HN: Ellipsis – Automatic pull request reviews
It's super cool! Check out how the Instructor repo uses it to keep various parts of their docs in sync: https://github.com/jxnl/instructor/blob/main/ellipsis.yaml
-
Pushing ChatGPT's Structured Data Support to Its Limits
I've been using the instructor[1] library recently and have found the abstractions simple and extremely helpful for getting great structured outputs from LLMs with pydantic.
1 https://github.com/jxnl/instructor/tree/main
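The pattern that comment describes, declaring the shape you want up front and validating the model's raw JSON output against it, can be sketched with the standard library alone. Instructor itself does this with pydantic models and OpenAI function calling; the `Recipe` class and `parse_llm_json` helper below are hypothetical illustrations of the idea, not part of the library's API.

```python
# Minimal sketch of typed, validated structured output: parse an LLM's
# JSON text into a declared shape, failing loudly so the caller can
# re-prompt. Instructor does the same job with pydantic models.
import json
from dataclasses import dataclass, fields

@dataclass
class Recipe:
    name: str
    minutes: int

def parse_llm_json(raw: str) -> Recipe:
    data = json.loads(raw)
    kwargs = {}
    for f in fields(Recipe):
        if f.name not in data:
            raise ValueError(f"missing field: {f.name}")
        if not isinstance(data[f.name], f.type):
            raise ValueError(f"wrong type for field: {f.name}")
        kwargs[f.name] = data[f.name]
    return Recipe(**kwargs)
```

The payoff is that downstream code works with a typed object instead of a raw dict, and malformed completions surface as a single, catchable error.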
-
Efficiently using python in GPTs
Maybe try using Jason Liu's instructor package (https://github.com/jxnl/instructor) to structure the outputs with pydantic? It's explained in his presentation from the AI Engineer Summit (https://youtu.be/yj-wSRJwrrc)
-
Ask HN: Cheapest way to run local LLMs?
One of the most powerful ways to integrate LLMs with existing systems is constrained generation. Libraries such as outlines[1] and instructor[2] allow structural specification of the expected outputs as regex patterns, simple types, jsonschema or pydantic models.
These outputs often consume significantly fewer tokens than chat or text completion.
[1] https://github.com/outlines-dev/outlines
[2] https://github.com/jxnl/instructor
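The regex-pattern form of constrained generation mentioned above can be illustrated with the standard library. Note the hedge: outlines enforces the pattern token by token *during* generation, whereas this sketch only checks a finished completion after the fact, which is the weaker post-hoc form; the `DATE_PATTERN` and `accept_completion` names are illustrative, not from either library.

```python
# Specify the expected output as a regex pattern and accept only
# completions that match it exactly. Libraries like outlines build this
# constraint into decoding itself; here we merely validate afterwards.
import re

DATE_PATTERN = re.compile(r"\d{4}-\d{2}-\d{2}")

def accept_completion(text: str) -> bool:
    # Accept the model output only if it is exactly a YYYY-MM-DD date.
    return DATE_PATTERN.fullmatch(text.strip()) is not None
```

Because the acceptable output is so tightly specified, a valid completion is just ten characters, which is where the token savings mentioned above come from.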
- OpenAI Function Calls for Humans
-
Unbounded Books: Search by ~Vibes
The best GPT-wrapper you'll see today?
...but this one hasn't raised oodles of cash.
Mike (creator) here, excited to hear what HN-folks think. Anything to add/improve?
Had fun building, extra s/out to Railway, NextJS, and https://github.com/jxnl/instructor
Check it out: https://www.unboundedbooks.com/
What are some alternatives?
modelfusion - The TypeScript library for building AI applications.
simpleaichat - Python package for easily interfacing with chat apps, with robust features and minimal code complexity.
Converter - Typescript to Scala.js converter
chatgpt-localfiles - Make local files accessible to ChatGPT
PythonGPT - PythonGPT writes and indexes code to implement dynamic code execution using generative models. Younger sibling of DoctorGPT.
ort - A Rust wrapper for ONNX Runtime
httpx - A next-generation HTTP client for Python.
app - Insights into your entire open source ecosystem.
outlines - Structured Text Generation
camel - CAMEL: Communicative Agents for "Mind" Exploration of Large Language Model Society (NeurIPS 2023) https://www.camel-ai.org
next13-chat-blog-repo