| | NLTK | Jieba |
|---|---|---|
| Mentions | 68 | 7 |
| Stars | 13,760 | 33,578 |
| Growth | 0.8% | - |
| Activity | 9.3 | 0.0 |
| Last Commit | 2 months ago | 5 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
NLTK
-
Create a Question/Answer Chatbot in Python
Using the NLTK Natural Language Toolkit
-
NLTK version 3.8.2 is no longer available on PyPI
-
350M Tokens Don't Lie: Love and Hate in Hacker News
Is this just using an LLM to be cool? How does a pure LLM with a simple "on a scale between 0-10" prompt stack up against traditional, battle-tested sentiment analysis tools?
Gemini suggests NLTK and spaCy
https://www.nltk.org/
https://spacy.io/
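For context, the "traditional, battle-tested" baseline is only a few lines in NLTK; a minimal sketch of its bundled VADER analyzer (not the thread's actual methodology):

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon")  # one-time download of the VADER lexicon

sia = SentimentIntensityAnalyzer()

# 'compound' ranges from -1 (most negative) to +1 (most positive)
print(sia.polarity_scores("I love Hacker News"))
print(sia.polarity_scores("I hate flame wars"))
```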
-
Building a local AI smart Home Assistant
Alternatively, could we not simply split on common characters such as newlines and periods to break the text into sentences? It would be fragile, though, with special handling required for numbers with decimal points and probably various other edge cases.
There are also Python libraries meant for natural language parsing[0] that could do that task for us. I even see examples on Stack Overflow[1] that simply split text into sentences.
[0]: https://www.nltk.org/
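For reference, the NLTK route the comment points at is essentially a one-liner; a minimal sketch (punkt is the pretrained sentence-boundary model):

```python
import nltk
from nltk.tokenize import sent_tokenize

nltk.download("punkt")      # pretrained sentence-boundary model
nltk.download("punkt_tab")  # resource name used by newer NLTK releases

text = "Dr. Smith paid $3.50 for coffee. Then he left. Was it worth it?"
# handles abbreviations and decimal points that naive splitting breaks on
print(sent_tokenize(text))
# ['Dr. Smith paid $3.50 for coffee.', 'Then he left.', 'Was it worth it?']
```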
-
Sorry if this is a dumb question but is the main idea behind LLMs to output text based on user input?
Check out https://www.nltk.org/ and work through it; it'll give you a foundational understanding of how all this works. Very basically, it's just a fancy auto-complete.
-
Best Portfolio Projects for Data Science
NLTK Documentation
-
Where to start learning NLP?
-
Is there a programmatic way to check if two strings are paraphrased?
If this is True, then you also need the Natural Language Toolkit to process the words.
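NLTK has no built-in paraphrase detector; a crude lexical-overlap baseline is easy to sketch, though anything robust needs semantic similarity on top (the helper below is illustrative, not from the thread):

```python
import nltk
from nltk.tokenize import word_tokenize

nltk.download("punkt")  # tokenizer model ('punkt_tab' on newer releases)

def jaccard_similarity(a: str, b: str) -> float:
    """Crude lexical-overlap signal: intersection over union of token sets."""
    ta, tb = set(word_tokenize(a.lower())), set(word_tokenize(b.lower()))
    return len(ta & tb) / len(ta | tb)

# identical token sets score 1.0 regardless of word order, which shows
# both the appeal and the blindness of a purely lexical baseline
print(jaccard_similarity("The cat sat on the mat",
                         "On the mat sat the cat"))
```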
-
[CROSS-POST] What programming language should I learn for corpus linguistics?
In that case, you should definitely have a look at Python's nltk library, which stands for Natural Language Toolkit. It ships a rich corpus collection and all kinds of specialized tools: grammars, taggers, chunkers, etc.
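As an illustration of those bundled resources, here is a minimal sketch of loading a corpus and running the default tagger (resource names vary slightly across NLTK releases):

```python
import nltk

nltk.download("brown")                        # a classic bundled corpus
nltk.download("punkt")                        # tokenizer model
nltk.download("averaged_perceptron_tagger")   # default English POS tagger

from nltk.corpus import brown
print(brown.words()[:8])  # corpora are exposed as ready-made readers

tokens = nltk.word_tokenize("Corpus linguistics is great fun.")
print(nltk.pos_tag(tokens))  # off-the-shelf part-of-speech tagging
```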
Jieba
-
PostgreSQL Full-Text Search in a Nutshell
Let's continue with jieba as an example. It supplies the main program logic for pg_jieba; jieba itself is also available as a Python package, so let's use Python for the example.
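The post's actual example isn't reproduced here, but jieba's basic Python API looks like this; a minimal sketch of the segmenter's standard modes:

```python
import jieba

text = "我来到北京清华大学"  # "I came to Tsinghua University in Beijing"

# default (accurate) mode; cut_all=True gives the coarser full mode
print(jieba.lcut(text))
# ['我', '来到', '北京', '清华大学']

# search-engine mode further splits long words, useful for full-text indexing
print(jieba.lcut_for_search(text))
```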
-
[OC] How Many Chinese Characters You Need to Learn to Read Chinese!
jieba to do Chinese word segmentation
-
Sentence parser for Mandarin?
Jieba: Chinese text segmenter
-
How many in here use google sheets to keep track on their Chinese vocabulary? (2 pics) - More info in the comments
If you know some Python you can use a popular library called Jieba 结巴 to automatically get pinyin for every word. (Jieba has actually been ported to many languages.) You can also use it to break a Chinese text into a set of unique words for easy addition to your spreadsheet.
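jieba covers the segmentation half of that workflow; for the pinyin half, a companion library is typically used (pypinyin below is an assumption, not something the comment names):

```python
import jieba
from pypinyin import lazy_pinyin  # assumed companion library for pinyin

text = "我喜欢学习中文,我每天都学习。"

# break the text into a set of unique words for a spreadsheet
PUNCT = {",", "。"}
words = {w for w in jieba.lcut(text) if w.strip() and w not in PUNCT}

for w in sorted(words):
    print(w, " ".join(lazy_pinyin(w)))  # word plus toneless pinyin
```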
-
Where can I download a database of Chinese word classifications (noun, verb, etc.)?
-
Learn vocabulary effortlessly while browsing the web [FR,EN,DE,PT,ES]
Since you're saying the main issue is segmentation, there are libraries to help with that. jieba is fantastic if you have a Python backend; nodejieba (50k downloads/week) if it's more JS-side.
-
I'm looking for a specific vocab list
https://github.com/fxsjy/jieba/ (has some good word frequency data)
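That frequency data also backs jieba's keyword extraction; a minimal sketch of its TF-IDF interface:

```python
import jieba.analyse

text = "我爱自然语言处理,自然语言处理很有趣。"

# top keywords ranked by TF-IDF against jieba's bundled frequency data
print(jieba.analyse.extract_tags(text, topK=3))
```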
What are some alternatives?
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
TextBlob - Simple, Pythonic, text processing--Sentiment analysis, part-of-speech tagging, noun phrase extraction, translation, and more.
SnowNLP - Python library for processing Chinese text
bert - TensorFlow code and pre-trained models for BERT
pkuseg-python - The pkuseg toolkit for multi-domain Chinese word segmentation
Stanza - Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
polyglot - Multilingual text (NLP) processing toolkit
PyTorch-NLP - Basic Utilities for PyTorch Natural Language Processing (NLP)
textacy - NLP, before and after spaCy