bert2bert-summarization vs haystack

| | bert2bert-summarization | haystack |
|---|---|---|
| Mentions | 1 | 55 |
| Stars | 30 | 13,883 |
| Growth | - | 4.3% |
| Activity | 0.0 | 9.9 |
| Last commit | over 3 years ago | 1 day ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
bert2bert-summarization
-
[P] Summarization using Bert2Bert Frameworks
Here is an implementation of a summarization model using PyTorch Lightning and Hugging Face Transformers. The model uses Bert2Bert, which arranges a Korean BERT in an encoder-decoder structure, and recorded a ROUGE-1 score of 44.8 on a Korean benchmark dataset. Details of the implementation can be found here (https://github.com/hyunwoongko/bert2bert-summarization).
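The ROUGE-1 figure quoted above is a unigram-overlap F1 score between a generated summary and a reference. A minimal sketch of how such a score is computed (the function name and example sentences are illustrative, not from the repo):

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: F-measure over unigram overlap between candidate and reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat", "the cat lay on the mat"), 3))  # 0.833
```

Published ROUGE numbers typically come from a standard scorer with stemming and tokenization options; this is only the core idea.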
haystack
-
Haystack DB – 10x faster than FAISS with binary embeddings by default
I was confused for a bit but there is no relation to https://haystack.deepset.ai/
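The "binary embeddings" claim in the post above refers to a general technique: quantize each float dimension to one bit, so similarity reduces to XOR plus popcount instead of float dot products. A minimal sketch of that idea (not the Haystack DB implementation; function names are illustrative):

```python
def binarize(vec):
    """Quantize a float vector to a packed bit pattern: 1 if the component is positive."""
    bits = 0
    for x in vec:
        bits = (bits << 1) | (1 if x > 0 else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Hamming distance between two packed binary embeddings via XOR + popcount."""
    return bin(a ^ b).count("1")

q = binarize([0.3, -1.2, 0.8, 0.1])   # 0b1011
d = binarize([0.5, -0.7, -0.2, 0.4])  # 0b1001
print(hamming(q, d))  # 1
```

The speedup comes from comparing whole machine words of bits at once, at some cost in recall versus full-precision vectors.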
-
Release Radar • March 2024 Edition

-
First 15 Open Source Advent projects
4. Haystack by Deepset | GitHub | tutorial
-
Generative AI Frameworks and Tools Every Developer Should Know!
Haystack can be classified as an end-to-end framework for building applications powered by various NLP technologies, including but not limited to generative AI. While it doesn't directly focus on building generative models from scratch, it provides a robust platform for:
-
Best way to programmatically extract data from a set of .pdf files?
But if you want an API that you can use to develop your own flow, Haystack from Deepset could be worth a look.
-
Which LLM framework(s) do you use in production and why?
Haystack for production. We cannot afford breaking changes in our production apps. It's stable, the documentation is excellent, and did I mention it's STABLE!?
-
Overview: AI Assembly Architectures
-
Llama2 and Haystack on Colab
I recently conducted some experiments with Llama2 and Haystack (https://github.com/deepset-ai/haystack), the NLP/LLM framework.
The notebook can be helpful for those trying to load Llama2 on Colab.
1) Installed Transformers from the main branch (and other libraries)
-
Build with LLMs for production with Haystack – has 10k stars on GitHub
-
Show HN: Haystack – Production-Ready LLM Framework
What are some alternatives?
pytextrank - Python implementation of TextRank algorithms ("textgraphs") for phrase extraction
langchain - 🦜🔗 Build context-aware reasoning applications
sumy - Module for automatic summarization of text documents and HTML pages.
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
summarizers - Package for controllable summarization
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
BentoML - The most flexible way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Inference Graph/Pipelines, Compound AI systems, Multi-Modal, RAG as a Service, and more!
label-studio - Label Studio is a multi-type data labeling and annotation tool with standardized output format
jina - ☁️ Build multimodal AI applications with cloud-native stack
BERT-pytorch - Google AI 2018 BERT pytorch implementation
BERT-NER - Pytorch-Named-Entity-Recognition-with-BERT
gpt-neox - An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries