MOSS vs AdaKGC

| | MOSS | AdaKGC |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 11,825 | 16 |
| Growth | 0.2% | - |
| Activity | 8.5 | 7.7 |
| Latest commit | 8 months ago | 3 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
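The site does not publish the exact formula behind the activity number, but the description above (recent commits weighted more than older ones, squashed into a small range) can be sketched as a recency-weighted score. The half-life, the squashing function, and the 0-10 scale below are illustrative assumptions, not the site's actual computation:

```python
from datetime import datetime, timezone

def activity_score(commit_dates, half_life_days=90.0, scale=10.0):
    """Illustrative recency-weighted activity score.

    Each commit contributes a weight that decays exponentially with its
    age, so recent commits count more than older ones. The half-life and
    the squashing into a 0..scale display range are assumptions; the
    real metric is not documented.
    """
    now = datetime.now(timezone.utc)
    total = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        total += 0.5 ** (age_days / half_life_days)  # weight halves every half_life_days
    # squash the unbounded sum into a bounded 0..scale display value
    return scale * total / (total + 1.0)
```

Under this sketch, a repository whose commits are all recent scores higher than one with the same number of commits made years ago, matching the behavior the description claims.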
MOSS
- Has anyone tried fine tuning on a dataset of complex tasks that require tool use?
- Benchmarks for Recent LLMs: Missing Vicuna, Dolly, BELLE, phoenix, MOSS, the ones used by open assistant.
- [D] Open-Source LLMs vs APIs
- GitHub - OpenLMLab/MOSS: An open-source tool-augmented conversational language model from Fudan University
AdaKGC
- Schema-adaptable Knowledge Graph Construction
Conventional Knowledge Graph Construction (KGC) approaches typically follow a static information extraction paradigm with a closed set of pre-defined schema. As a result, such approaches fall short in dynamic scenarios or domains where new types of knowledge emerge. This necessitates a system that can handle an evolving schema automatically and extract information for KGC. To address this need, we propose a new task called schema-adaptable KGC, which aims to continually extract entities, relations, and events based on a dynamically changing schema graph without re-training. We first split and convert existing datasets according to three principles, i.e., horizontal schema expansion, vertical schema expansion, and hybrid schema expansion, to build a benchmark; we then investigate the schema-adaptable performance of several well-known approaches such as Text2Event, TANL, UIE, and GPT-3. We further propose a simple yet effective baseline dubbed AdaKGC, which contains a schema-enriched prefix instructor and schema-conditioned dynamic decoding to better handle evolving schema. Comprehensive experimental results illustrate that AdaKGC outperforms the baselines but still has room for improvement. We hope the proposed work can deliver benefits to the community. Code and datasets will be available at https://github.com/zjunlp/AdaKGC.
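The abstract distinguishes horizontal, vertical, and hybrid schema expansion. A minimal sketch of what those settings mean on a schema graph, assuming a toy representation of my own (the `Schema` class, type names, and role names below are hypothetical and are not AdaKGC's code or data):

```python
class Schema:
    """Toy schema: a mapping from type names (e.g. event types) to their
    role/argument names. Purely illustrative of the expansion settings."""

    def __init__(self, types):
        self.types = {t: set(roles) for t, roles in types.items()}

    def horizontal_expand(self, new_type, roles):
        # horizontal expansion: a brand-new type is added alongside
        # the existing ones
        self.types[new_type] = set(roles)

    def vertical_expand(self, existing_type, new_roles):
        # vertical expansion: an existing type gains additional roles
        self.types[existing_type] |= set(new_roles)

schema = Schema({"Attack": ["attacker", "target"]})
schema.horizontal_expand("Transport", ["agent", "destination"])  # new type appears
schema.vertical_expand("Attack", ["instrument"])                 # known type deepens
# hybrid expansion: both kinds of change arrive over time
```

The task's difficulty is that an extractor trained on the initial schema must handle both kinds of change at inference time, without re-training, which is what the schema-enriched prefix instructor and schema-conditioned dynamic decoding in AdaKGC are designed for.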
What are some alternatives?
LLMZoo - ⚡LLM Zoo is a project that provides data, models, and evaluation benchmarks for large language models.⚡
llm-guard - The Security Toolkit for LLM Interactions
private-gpt - Deploy smart and secure conversational agents for your employees, using Azure. Able to use both private and public data.
spacy-llm - 🦙 Integrating LLMs into structured NLP pipelines
databunker - A secure user directory built for developers to comply with the GDPR [Moved to: https://github.com/securitybunker/databunker]
LLMSurvey - The official GitHub page for the survey paper "A Survey of Large Language Models".
alpaca_farm - A simulation framework for RLHF and alternatives. Develop your RLHF method without collecting human data.
knowledge-rumination - [EMNLP 2023] Knowledge Rumination for Pre-trained Language Models
Yi - A series of large language models trained from scratch by developers @01-ai
marqo - Unified embedding generation and search engine. Also available on cloud - cloud.marqo.ai
awesome-totally-open-chatgpt - A list of totally open alternatives to ChatGPT
Baichuan-7B - A large-scale 7B pretraining language model developed by BaiChuan-Inc.