HugNLP VS knowledge-rumination

Compare HugNLP vs knowledge-rumination and see how they differ.

                HugNLP         knowledge-rumination
Mentions        2              1
Stars           248            16
Growth          -              -
Activity        7.3            4.8
Latest Commit   9 months ago   11 months ago
Language        Python         Python
License         -              MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
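The exact weighting behind the activity number is not published here; purely as an illustration of "recent commits have higher weight than older ones", a recency-weighted score could be computed as below. The half-life, decay formula, and function name are assumptions, not the site's actual method.

```python
# Hypothetical recency-weighted activity score: each commit contributes a weight
# that halves every `half_life_days`, so recent commits dominate the total.
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=30.0):
    now = datetime.now(timezone.utc)
    score = 0.0
    for ts in commit_dates:
        age_days = (now - ts).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)  # exponential decay with age
    return score

# Example: the most recent commit contributes the most to the score.
commits = [datetime(2024, 5, 1, tzinfo=timezone.utc),
           datetime(2024, 1, 10, tzinfo=timezone.utc),
           datetime(2023, 6, 2, tzinfo=timezone.utc)]
print(round(activity_score(commits), 3))
```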

HugNLP

Posts with mentions or reviews of HugNLP. We have used some of these posts to build our list of alternatives and similar projects.

knowledge-rumination

Posts with mentions or reviews of knowledge-rumination. We have used some of these posts to build our list of alternatives and similar projects.
  • Knowledge Rumination for Pre-trained Language Models
    1 project | /r/BotNewsPreprints | 16 May 2023
    Previous studies have revealed that vanilla pre-trained language models (PLMs) lack the capacity to handle knowledge-intensive NLP tasks alone; thus, several works have attempted to integrate external knowledge into PLMs. However, despite the promising outcome, we empirically observe that PLMs may have already encoded rich knowledge in their pre-trained parameters but fail to fully utilize it when applied to knowledge-intensive tasks. In this paper, we propose a new paradigm dubbed Knowledge Rumination to help pre-trained language models utilize this related latent knowledge without retrieving it from an external corpus. By simply adding a prompt such as "As far as I know" to the PLMs, we try to review related latent knowledge and inject it back into the model for knowledge consolidation. We apply the proposed knowledge rumination to various language models, including RoBERTa, DeBERTa, GPT-3 and OPT. Experimental results on six commonsense reasoning tasks and GLUE benchmarks demonstrate the effectiveness of our proposed approach, which further proves that the knowledge stored in PLMs can be better exploited to enhance downstream performance. Code will be available at https://github.com/zjunlp/knowledge-rumination.
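As a rough, text-level sketch of the paradigm the abstract describes (the paper consolidates the ruminated knowledge inside the model rather than in plain text; the model name, prompts, and simple two-pass generation below are illustrative assumptions, not the authors' implementation):

```python
# Illustrative two-pass "knowledge rumination" sketch with a causal LM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # OPT is one of the model families evaluated in the paper
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

question = "Where would you put a plate after washing it?"  # hypothetical example query

# Pass 1 (rumination): prompt the model with "As far as I know" to surface
# latent knowledge it already holds, with no external retrieval.
rumination = f"{question} As far as I know,"
ids = tokenizer(rumination, return_tensors="pt")
gen = model.generate(**ids, max_new_tokens=40, do_sample=False)
knowledge = tokenizer.decode(gen[0][ids["input_ids"].shape[1]:], skip_special_tokens=True)

# Pass 2 (consolidation): inject the ruminated knowledge back into the task input.
prompt = f"Knowledge: {knowledge}\nQuestion: {question}\nAnswer:"
ids = tokenizer(prompt, return_tensors="pt")
gen = model.generate(**ids, max_new_tokens=10, do_sample=False)
print(tokenizer.decode(gen[0][ids["input_ids"].shape[1]:], skip_special_tokens=True))
```

The point of the sketch is the two passes: the first elicits knowledge the model already encodes, and the second feeds that knowledge back with the original question, so no external corpus is consulted.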

What are some alternatives?

When comparing HugNLP and knowledge-rumination you can also consider the following projects:

HugNLP - CIKM2023 Best Demo Paper Award. HugNLP is a unified and comprehensive NLP library based on HuggingFace Transformers. Please hugging for NLP now!😊

LLMSurvey - The official GitHub page for the survey paper "A Survey of Large Language Models".

megabots - 🤖 State-of-the-art, production ready LLM apps made mega-easy, so you don't have to build them from scratch 🤯 Create a bot, now 🫵

OpenPrompt - An Open-Source Framework for Prompt-Learning.

llmware - Providing enterprise-grade LLM-based development framework, tools, and fine-tuned models.

prompt-lib - A set of utilities for running few-shot prompting experiments on large-language models

AdaKGC - [EMNLP 2023 (Findings)] Schema-adaptable Knowledge Graph Construction

haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.