|  | FinBERT-QA | haystack |
|---|---|---|
| Mentions | 1 | 55 |
| Stars | 113 | 13,711 |
| Growth | - | 3.1% |
| Activity | 0.0 | 9.9 |
| Last commit | 11 months ago | 3 days ago |
| Language | Python | Python |
| License | - | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity - a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
FinBERT-QA
- Best way to approach financial statement analysis with NLP and Image Recognition?
  Open Domain Question Answering (ODQA) using a deep transformer NLP model that has been fine-tuned on a financial-domain dataset such as FiQA.
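A common ODQA design is retrieve-then-rerank: a cheap retriever narrows the candidate pool, then a fine-tuned transformer re-scores the survivors. A minimal sketch of that general pattern in plain Python, where the re-ranker is a stub standing in for the fine-tuned model and the corpus and function names are illustrative, not taken from FinBERT-QA:

```python
def lexical_score(query: str, doc: str) -> float:
    """Word-overlap score used as a cheap first-stage retriever."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

def rerank_score(query: str, doc: str) -> float:
    """Placeholder for a fine-tuned transformer cross-encoder.

    In a real system this would be model(query, doc) -> relevance score;
    here it just nudges the lexical score, for illustration only.
    """
    return lexical_score(query, doc) + 0.1 * (len(doc.split()) > 3)

def answer(query: str, corpus: list[str], k: int = 2) -> str:
    # Stage 1: retrieve the k best candidates cheaply.
    candidates = sorted(corpus, key=lambda d: lexical_score(query, d),
                        reverse=True)[:k]
    # Stage 2: re-rank only those candidates with the expensive scorer.
    return max(candidates, key=lambda d: rerank_score(query, d))
```

The point of the two stages is cost: the expensive model only ever sees `k` documents, however large the corpus is.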
haystack
- Haystack DB – 10x faster than FAISS with binary embeddings by default
  I was confused for a bit, but there is no relation to https://haystack.deepset.ai/
- Release Radar • March 2024 Edition
- First 15 Open Source Advent projects
  4. Haystack by Deepset | Github | tutorial
- Generative AI Frameworks and Tools Every Developer Should Know!
  Haystack can be classified as an end-to-end framework for building applications powered by various NLP technologies, including but not limited to generative AI. While it doesn't directly focus on building generative models from scratch, it provides a robust platform for:
- Best way to programmatically extract data from a set of .pdf files?
  But if you want an API that you can use to develop your own flow, Haystack from Deepset could be worth a look.
- Which LLM framework(s) do you use in production and why?
  Haystack for production. We cannot afford breaking changes in our production apps. It's stable, the documentation is excellent, and did I mention it's STABLE!?
- Overview: AI Assembly Architectures
- Llama2 and Haystack on Colab
  I recently conducted some experiments with Llama2 and Haystack (https://github.com/deepset-ai/haystack), the NLP/LLM framework. The notebook can be helpful for those trying to load Llama2 on Colab.
  1) Installed Transformers from the main branch (and other libraries)
- Build with LLMs for production with Haystack – has 10k stars on GitHub
- Show HN: Haystack – Production-Ready LLM Framework
What are some alternatives?
BERT-QE - Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
langchain - 🦜🔗 Build context-aware reasoning applications
happy-transformer - Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
kiri - Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
KitanaQA - KitanaQA: Adversarial training and data augmentation for neural question-answering models
BentoML - The most flexible way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Inference Graph/Pipelines, Compound AI systems, Multi-Modal, RAG as a Service, and more!
kiri - Kiri is a visual tool designed for reviewing schematics and layouts of KiCad projects that are version-controlled with Git.
label-studio - Label Studio is a multi-type data labeling and annotation tool with standardized output format
TextFooler - A Model for Natural Language Attack on Text Classification and Inference
jina - ☁️ Build multimodal AI applications with cloud-native stack