| | grok-1 | fructose |
|---|---|---|
| Mentions | 8 | 3 |
| Stars | 48,188 | 692 |
| Growth | 4.5% | 12.1% |
| Activity | 5.9 | 9.1 |
| Latest commit | 8 days ago | 23 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
grok-1
- Maxtext: A simple, performant and scalable Jax LLM
Is t5x an encoder/decoder architecture?
Some more general options: the Flax ecosystem (https://github.com/google/flax?tab=readme-ov-file) and dm-haiku (https://github.com/google-deepmind/dm-haiku) have some of the best-developed communities in the Jax AI field.
Perhaps the “trax” repo? https://github.com/google/trax
Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...
Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py
- Elon Musk's xAI previews Grok-1.5V, its first multimodal model
Anyone know the system requirements? Is anyone even able to run it? For their last release, Grok-1, the issues are full of people who can't even run it: https://github.com/xai-org/grok-1/issues
- Grok-1 Weights Published
- Grok-1
- FLaNK AI Weekly 18 March 2024
- X.ai's Grok-1 Model Is Officially Open-Source and Larger Than Expected
- Grok-1 (LLM with 314B parameters) is now open source
- Elon drops open source Grok onto the stage
fructose
- FLaNK AI Weekly 18 March 2024
- Show HN: Fructose, LLM calls as strongly typed functions
This approach may be too high-level "magic" to the point of being difficult to work with and iterate upon.
Looking at the prompt templates (https://github.com/bananaml/fructose/tree/main/src/fructose/... ), they use a LangChain-esque "just try to make the output valid JSON" approach, when APIs such as GPT-4 Turbo (which this library uses by default) now support function calling/structured data natively, and libraries such as outlines (https://github.com/outlines-dev/outlines) are more complex but can better ensure a dictionary output for local LLMs.
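The "LLM calls as strongly typed functions" idea under discussion can be sketched without Fructose itself: a decorator reads a function's docstring and return annotation, builds a prompt asking for JSON, then checks the parsed result against the annotated type. This is a minimal illustration, not Fructose's actual implementation; `call_model` is a hypothetical stub standing in for a real chat-completion API call, and all names here are invented for the example.

```python
import inspect
import json
from typing import Callable, TypeVar

T = TypeVar("T")


def call_model(prompt: str) -> str:
    """Stub standing in for a real chat-completion API call.

    A real implementation would send `prompt` to an LLM and
    return its raw text response.
    """
    return json.dumps({"value": [1, 2, 3]})


def typed_llm_call(fn: Callable[..., T]) -> Callable[..., T]:
    """Turn a typed, docstring-only function into an LLM call whose
    JSON output is checked against the return annotation."""
    sig = inspect.signature(fn)
    return_type = sig.return_annotation
    # For generics like list[int], isinstance needs the bare origin (list).
    origin = getattr(return_type, "__origin__", return_type)

    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        prompt = (
            f"{inspect.getdoc(fn)}\n"
            f"Arguments: {dict(bound.arguments)}\n"
            f'Respond only with JSON of the form {{"value": <{return_type}>}}.'
        )
        raw = json.loads(call_model(prompt))["value"]
        if not isinstance(raw, origin):
            raise TypeError(
                f"model returned {type(raw).__name__}, expected {return_type}"
            )
        return raw

    return wrapper


@typed_llm_call
def small_integers(n: int) -> list[int]:
    """Return a list of n small integers."""
```

Calling `small_integers(3)` prompts the (stubbed) model and returns the parsed list. Swapping the "hope it's valid JSON" parse for the native structured-output modes or a constrained-generation library like outlines, as the comment suggests, would make the type contract enforceable rather than merely checked after the fact.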
What are some alternatives?
FLaNK-python-processors - Many processors
outlines - Structured Text Generation
rnote - Sketch and take handwritten notes.
pytorch-image-models - PyTorch image models, scripts, pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (ViT), MobileNet-V3/V2, RegNet, DPN, CSPNet, Swin Transformer, MaxViT, CoAtNet, ConvNeXt, and more
sqlglot - Python SQL Parser and Transpiler
quarto-cli - Open-source scientific and technical publishing system built on Pandoc.
openvino_notebooks - 📚 Jupyter notebook tutorials for OpenVINO™
brave-browser - Brave browser for Android, iOS, Linux, macOS, Windows.
bebop - 🎷No ceremony, just code. Blazing fast, typesafe binary serialization.
mini.nvim - Library of 35+ independent Lua modules improving overall Neovim (version 0.7 and higher) experience with minimal effort
FLiPStackWeekly - FLaNK AI Weekly covering Apache NiFi, Apache Flink, Apache Kafka, Apache Spark, Apache Iceberg, Apache Ozone, Apache Pulsar, and more...