DeepKE vs OpenNRE

| | DeepKE | OpenNRE |
|---|---|---|
| Mentions | 2 | 3 |
| Stars | 2,973 | 4,254 |
| Growth | 4.6% | 0.8% |
| Activity | 9.5 | 6.5 |
| Latest commit | 13 days ago | 4 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DeepKE
- Would this method work to increase the memory of the model? Saving summaries generated by a 2nd model and injecting them depending on the current topic.
- How to Unleash the Power of Large Language Models for Few-shot Relation Extraction?
Scaling language models has revolutionized a wide range of NLP tasks, yet few-shot relation extraction with large language models remains largely unexplored. In this paper, we investigate the principal methodologies, in-context learning and data generation, for few-shot relation extraction via GPT-3.5 through exhaustive experiments. To enhance few-shot performance, we further propose task-related instructions and schema-constrained data generation. We observe that in-context learning can achieve performance on par with previous prompt-learning approaches, and that data generation with the large language model can boost previous solutions to obtain new state-of-the-art few-shot results on four widely studied relation extraction datasets. We hope our work can inspire future research on the capabilities of large language models in few-shot relation extraction. Code is available at https://github.com/zjunlp/DeepKE/tree/main/example/llm.
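The abstract above describes in-context learning with task-related instructions for few-shot relation extraction. A minimal sketch of how such a few-shot prompt might be assembled; the instruction wording, relation labels, and demonstration format here are illustrative assumptions, not the paper's actual template:

```python
# Illustrative sketch of building an in-context-learning prompt for
# few-shot relation extraction. The template below is an assumption,
# not the template used in the DeepKE paper.

RELATIONS = ["founded_by", "place_of_birth", "no_relation"]

def build_icl_prompt(demos, query):
    """Assemble a prompt: task instruction + labeled demonstrations + query."""
    lines = [
        "Classify the relation between the head and tail entities.",
        f"Possible relations: {', '.join(RELATIONS)}.",
        "",
    ]
    # Each demonstration shows the text, the entity pair, and the gold label.
    for text, head, tail, relation in demos:
        lines += [f"Text: {text}", f"Head: {head}  Tail: {tail}",
                  f"Relation: {relation}", ""]
    # The query ends with an open "Relation:" slot for the model to complete.
    text, head, tail = query
    lines += [f"Text: {text}", f"Head: {head}  Tail: {tail}", "Relation:"]
    return "\n".join(lines)

demos = [
    ("Steve Jobs co-founded Apple in 1976.", "Apple", "Steve Jobs", "founded_by"),
]
prompt = build_icl_prompt(
    demos, ("Marie Curie was born in Warsaw.", "Marie Curie", "Warsaw"))
print(prompt)
```

The resulting string would then be sent to GPT-3.5 (or any chat-completion endpoint) and the completion parsed as the predicted relation label.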
OpenNRE
- RAG Using Unstructured Data and Role of Knowledge Graphs
OpenNRE (https://github.com/thunlp/OpenNRE) is another good approach to neural relation extraction, though it's slightly dated. What would be particularly interesting is to combine models like OpenNRE or SpanMarker with entity-linking models to construct KG triples. And a solid, scalable graph database underneath would make for a great knowledge base that can be constructed from unstructured text.
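The pipeline sketched in the comment above can be outlined in code: run a relation-extraction model over candidate entity pairs, then pass surviving pairs through an entity linker to form KG triples. Here `predict_relation` and `link_entity` are mocked stand-ins for a real model such as OpenNRE or SpanMarker and a real entity linker, so the example is self-contained:

```python
# Hedged sketch: turning relation-extraction output into KG triples.
# predict_relation and link_entity are toy stand-ins, not real model calls.

def predict_relation(text, head, tail):
    """Stand-in for a relation classifier (e.g. OpenNRE's model.infer);
    returns a (label, confidence) pair."""
    if "capital" in text:
        return ("capital_of", 0.93)
    return ("no_relation", 0.5)

def link_entity(mention):
    """Stand-in for an entity linker mapping surface forms to KB identifiers."""
    kb = {"Paris": "Q90", "France": "Q142"}
    return kb.get(mention, mention)

def extract_triples(text, entity_pairs, threshold=0.7):
    """Keep confident, non-null relations and link both entity mentions."""
    triples = []
    for head, tail in entity_pairs:
        label, score = predict_relation(text, head, tail)
        if label != "no_relation" and score >= threshold:
            triples.append((link_entity(head), label, link_entity(tail)))
    return triples

print(extract_triples("Paris is the capital of France.", [("Paris", "France")]))
# → [('Q90', 'capital_of', 'Q142')]
```

The linked triples could then be loaded into a graph database to serve as the knowledge base built from unstructured text.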
- How to get knowledge graph triples from text?
- KELM: Integrating Knowledge Graphs with Language Model Pre-Training Corpora
What are some alternatives?
llama_farm - Use a local Llama LLM or OpenAI to chat with, discuss, or summarize your documents, YouTube videos, and so on.
OpenIE-standalone
GoLLIE - Guideline following Large Language Model for Information Extraction
CogCompNLP - CogComp's Natural Language Processing Libraries and Demos: Modules include lemmatizer, ner, pos, prep-srl, quantifier, question type, relation-extraction, similarity, temporal normalizer, tokenizer, transliteration, verb-sense, and more.
zshot - Zero- and few-shot named entity and relationship recognition
minie - An open information extraction system that provides compact extractions
NaLLM - Repository for the NaLLM project
PURE - [NAACL 2021] A Frustratingly Easy Approach for Entity and Relation Extraction https://arxiv.org/abs/2010.12812
ARElight - Granular Viewer of Sentiments Between Entities in Massively Large Documents and Collections of Texts, powered by AREkit
knowledge-net - KnowledgeNet: A Benchmark Dataset for Knowledge Base Population
VLDet - [ICLR 2023] PyTorch implementation of VLDet (https://arxiv.org/abs/2211.14843)
llm-experiments - Experiments using ChatGPT, Jupyter, and rdflib for distributed knowledge graph construction