tiktoken vs codon

| | tiktoken | codon |
|---|---|---|
| Mentions | 32 | 34 |
| Stars | 9,980 | 13,861 |
| Growth (monthly) | 6.4% | 0.6% |
| Activity | 6.7 | 7.9 |
| Last commit | about 1 month ago | 5 days ago |
| Language | Python | C++ |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
tiktoken
- FLaNK AI - 01 April 2024
-
GPT-3.5 crashes when it thinks about useRalativeImagePath too much
Their tokenizer is open source: https://github.com/openai/tiktoken
Data files that contain vocabulary are listed here: https://github.com/openai/tiktoken/blob/9e79899bc248d5313c7d...
-
How fast is JS tiktoken?
OpenAI's reference tokenizer - https://github.com/openai/tiktoken
-
Anthropic announces Claude 2.1 – 200k context, less refusals
ChatGPT presumably adds them as special tokens to the cl100k_base tokenizer, as they demo in the tiktoken documentation: https://github.com/openai/tiktoken#extending-tiktoken
-
What is the best way to get an approximate number of tokens for a piece of text?
I want to measure the approximate number of tokens in a piece of text to understand if I will need to modify it before passing it into the context of an OpenAI API call. Tiktoken can do this, but I'm not sure if it's overkill to use that library just for this simple task. I don't need to actually tokenize the text, I just need an approximate count (e.g. within like 1% of the text's actual token length for text that represents the visible text on a webpage).
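A common heuristic, assuming English text and a cl100k-style BPE vocabulary, is roughly 4 characters (or about 3/4 of a word) per token. It is a sketch, not a guarantee: expect errors on the order of 10-20% on real text, so a 1% target still requires an actual tokenizer like tiktoken.

```python
def approx_token_count(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    This is the rule of thumb commonly cited for cl100k-style BPE
    vocabularies; it is cheap but only approximate.
    """
    return max(1, round(len(text) / 4))


def approx_token_count_blended(text: str) -> int:
    """Slightly tighter estimate: average the character- and word-based rules."""
    by_chars = len(text) / 4
    by_words = len(text.split()) / 0.75  # ~3/4 of a word per token
    return max(1, round((by_chars + by_words) / 2))
```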
-
Show HN: LLaMA tokenizer that runs in browser
https://platform.openai.com/tokenizer or the official python library tiktoken https://github.com/openai/tiktoken or this JS port of tiktoken https://github.com/dqbd/tiktoken
-
Made a GPT-3.5-Turbo and GPT-4 Tokenizer
It's built on top of the tiktoken library and is basically just a lambda function in the backend.
- AiPrice - an API for calculating OpenAI tokens and pricing
-
Anyone able to explain what happened here?
"All" is a single token in OpenAI's tiktoken Tokenizer, unrelated to the token for capital "A". Even lowercase "all" is a distinct token from "All" or "ALL."
-
Which lib is the tokenizer page using to calculate the tokens?
Check tiktoken: https://github.com/openai/tiktoken
codon
-
Should I Open Source my Company?
https://github.com/exaloop/codon/blob/develop/LICENSE
Here are some others: https://github.com/search?q=%22Business+Source+License%22+%2...
-
Python running on the Dart VM?
I found at least one project that managed to compile python AOT to LLVM https://github.com/exaloop/codon. Even if LLVM is more expressive than Dart Kernel, that should at least be some evidence that this might not be too impractical.
-
Codon: Python Compiler
Their fannkuch benchmark seems to be a bit dishonest. They claim an enormous perf delta on https://exaloop.io/benchmarks.html but fannkuch uses factorial a lot and they define factorial with a very small (n=20) table: https://github.com/exaloop/codon/blob/fb461371613049539654c1...
Disclaimer: I've worked on several Python runtimes and compilers, but I'm not by any means out to get Codon. Just happened across this by accident while looking at their inline LLVM, which is neat.
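For context on the objection above, a table-based factorial of the kind described looks like the sketch below (this is an illustration, not Codon's actual code). The n = 20 cutoff matters because 20! is the largest factorial that fits in a signed 64-bit integer:

```python
import math

# Precomputed table up to 20! -- the largest factorial below 2**63 - 1,
# so every entry fits in a 64-bit machine integer.
FACT = [math.factorial(i) for i in range(21)]


def factorial(n: int) -> int:
    # O(1) lookup, but only defined for 0 <= n <= 20.
    return FACT[n]
```

Replacing a loop with a 21-entry lookup is valid for fannkuch's input sizes, but it means the benchmark partly measures table indexing rather than general arithmetic, which is the commenter's point about the perf delta.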
-
The father of Swift made another baby: Mojo: looks to be based on Python using MLIR
If you literally want Python, but compiled ... Look at Codon: https://github.com/exaloop/codon
-
Mojo – a new programming language for all AI developers
Another "Python with high-performance compiled builds" would be https://github.com/exaloop/codon.
-
MIT Turbocharges Python’s Notoriously Slow Compiler
This is the project being discussed: https://github.com/exaloop/codon
-
Is there a way to use turn a project into a single executable file that doesn't require anyone to do anything like install Python before using it?
Try Codon? https://github.com/exaloop/codon
- Since when did Python haters spread out everywhere? Maybe DNF5 would be faster because it ditched Python, maybe.
-
Budget HomeLab converted to endless money-pit
https://github.com/exaloop/codon might save you from the rewrite.
- What are your thoughts on Codon compiler having a paid licence?
What are some alternatives?
tokenizer - Pure Go implementation of OpenAI's tiktoken tokenizer
Nuitka - Nuitka is a Python compiler written in Python. It's fully compatible with Python 2.6, 2.7, 3.4, 3.5, 3.6, 3.7, 3.8, 3.9, 3.10, and 3.11. You feed it your Python app, it does a lot of clever things, and spits out an executable or extension module.
daath-ai-parser - Daath AI Parser is an open-source application that uses OpenAI to parse visible text of HTML elements.
Numba - NumPy aware dynamic Python compiler using LLVM
CLIP - CLIP (Contrastive Language-Image Pretraining), Predict the most relevant text snippet given an image
Cython - The most widely used Python to C compiler
skypilot - SkyPilot: Run LLMs, AI, and Batch jobs on any cloud. Get maximum savings, highest GPU availability, and managed execution—all with a simple interface.
taichi - Productive, portable, and performant GPU programming in Python.
bricks - Open-source natural language enrichments at your fingertips.
julia - The Julia Programming Language
terminal-copilot - A smart terminal assistant that helps you find the right command.
Nim - Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).