| | BERT-Relation-Extraction | DeepKE |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 557 | 3,001 |
| Growth | - | 5.5% |
| Activity | 4.4 | 9.5 |
| Latest commit | 8 months ago | 2 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
BERT-Relation-Extraction
- How to enable GPU when using the GitHub BERT model?
  I'm currently following this GitHub repo - https://github.com/plkmo/BERT-Relation-Extraction - in an attempt to parse my own data later. When I try to pre-train this model with cnn.txt, the compute seems to go entirely to the CPU/RAM. The run has taken over 7 hours so far with no sign of completion.
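A likely cause is that neither the model nor the input batches are being moved to the GPU. A minimal PyTorch sketch of the fix, assuming the repo uses PyTorch (the model and batch below are stand-ins, not the repo's actual variables):

```python
# Minimal sketch: move a PyTorch model and its batches to the GPU.
# The model/batch here are placeholders for the repo's BERT model and data.
import torch

# Falls back to CPU when no CUDA device is visible.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training on: {device}")

model = torch.nn.Linear(768, 2)   # stand-in for the BERT model
model = model.to(device)          # move all parameters to the device

batch = torch.randn(8, 768)       # stand-in for a tokenized batch
batch = batch.to(device)          # every tensor fed to the model must move too

logits = model(batch)
```

If `torch.cuda.is_available()` prints `False`, the installed PyTorch build has no CUDA support, and training will stay on the CPU regardless of this change.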
DeepKE
- Would this method work to increase the model's memory? The idea: save summaries generated by a second model and inject them depending on the current topic.
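The idea in the question can be sketched as a topic-keyed external memory: summaries from a second model are stored per topic and prepended to the prompt when that topic comes up. All names below (`store_summary`, `build_prompt`) are illustrative, not any real DeepKE API:

```python
# Hypothetical sketch of a topic-keyed summary memory.
# A second model would produce the summaries; here they are stored manually.
memory: dict[str, list[str]] = {}

def store_summary(topic: str, summary: str) -> None:
    """Append a generated summary under its topic key."""
    memory.setdefault(topic, []).append(summary)

def build_prompt(topic: str, question: str) -> str:
    """Inject all stored summaries for the current topic into the prompt."""
    context = "\n".join(memory.get(topic, []))
    return f"Known context on '{topic}':\n{context}\n\nQuestion: {question}"

store_summary("gpu-training", "User's pre-training run stays on CPU.")
prompt = build_prompt("gpu-training", "How do I move training to the GPU?")
print(prompt)
```

Whether this "works" depends on retrieval quality: the topic key must be chosen well enough that the injected summaries are actually relevant, and the summaries must fit in the model's context window.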
- How to Unleash the Power of Large Language Models for Few-shot Relation Extraction?
  Scaling language models has revolutionized many NLP tasks, yet few-shot relation extraction with large language models has received little comprehensive exploration. In this paper, we investigate two principal methodologies, in-context learning and data generation, for few-shot relation extraction via GPT-3.5 through exhaustive experiments. To enhance few-shot performance, we further propose task-related instructions and schema-constrained data generation. We observe that in-context learning can achieve performance on par with previous prompt-learning approaches, and that data generation with the large language model can boost previous solutions to obtain new state-of-the-art few-shot results on four widely studied relation extraction datasets. We hope our work can inspire future research into the capabilities of large language models for few-shot relation extraction. Code is available at \url{https://github.com/zjunlp/DeepKE/tree/main/example/llm}.
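The in-context learning setup described above can be sketched as a schema-constrained few-shot prompt: the allowed relation labels and a handful of demonstrations are placed before the target sentence. The relation schema and demonstration pair below are made up for illustration and are not from the paper's datasets:

```python
# Sketch of a schema-constrained in-context prompt for few-shot relation
# extraction. Relations and demos are illustrative placeholders.
RELATIONS = ["founded_by", "located_in", "works_for"]  # the allowed schema

DEMOS = [
    ("Steve Jobs founded Apple in 1976.", "(Apple, founded_by, Steve Jobs)"),
]

def build_re_prompt(sentence: str) -> str:
    """Assemble instruction + demonstrations + target sentence."""
    parts = [
        "Extract one relation triple. "
        f"Allowed relations: {', '.join(RELATIONS)}."
    ]
    for text, triple in DEMOS:
        parts.append(f"Sentence: {text}\nTriple: {triple}")
    parts.append(f"Sentence: {sentence}\nTriple:")
    return "\n\n".join(parts)

prompt = build_re_prompt("Marie Curie worked for the University of Paris.")
print(prompt)
```

Constraining the output to a fixed relation schema is what makes the model's completions parseable; without it, a general-purpose LLM tends to invent free-form relation names.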
What are some alternatives?
ARElight - Granular Viewer of Sentiments Between Entities in Massively Large Documents and Collections of Texts, powered by AREkit
llama_farm - Use local llama LLM or openai to chat, discuss/summarize your documents, youtube videos, and so on.
bertviz - BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
GoLLIE - Guideline following Large Language Model for Information Extraction
OpenNRE - An Open-Source Package for Neural Relation Extraction (NRE)
zshot - Zero and Few shot named entity & relationships recognition
NaLLM - Repository for the NaLLM project
VLDet - [ICLR 2023] PyTorch implementation of VLDet (https://arxiv.org/abs/2211.14843)
llm-experiments - Experiments using ChatGPT, Jupyter, and rdflib for distributed knowledge graph construction
Video-LLaVA - Video-LLaVA: Learning United Visual Representation by Alignment Before Projection
ai_llm_kb_sandbox - Investigating the use of LLMs to populate knowledge graphs (KG) and then use KG to utilize predictive models