llm.f90 Alternatives
Similar projects and alternatives to llm.f90
-
llvm-project
The LLVM Project is a collection of modular and reusable compiler and toolchain technologies.
-
ai-notes
notes for software engineers getting up to speed on new AI developments. Serves as datastore for https://latent.space writing, and product brainstorming, but has cleaned up canonical references under the /Resources folder.
-
TinyLlama
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
-
inference-engine
A deep learning library for use in high-performance computing applications in modern Fortran
-
curated-transformers
🤖 A PyTorch library of curated Transformer models and their composable components
-
Time-LLM
[ICLR 2024] Official implementation of " 🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
-
heinsen_sequence
Code implementing "Efficient Parallelization of a Ubiquitous Sequential Computation" (Heinsen, 2023)
llm.f90 reviews and mentions
- llm.f90: LLM Inference in Fortran
-
karpathy/llm.c
I'd like to think he took the name from my llm.f90 project https://github.com/rbitr/llm.f90
It was originally based off of Karpathy's llama2.c but I renamed it when I added support for other architectures.
Probably a coincidence :)
-
Winteracter – The Fortran GUI Toolset
I'm a Fortran hobbyist. I'm working (unfortunately less frequently now) on an LLM framework in Fortran: https://github.com/rbitr/llm.f90
- Fortran implementation of phi-2 LLM
- Fortran implementation of phi-2 language model
-
TinyLlama: An Open-Source Small Language Model
Also, I should promote the code I wrote for running this. It runs models in ggml format; the one I made available is an older checkpoint, but it's easy to convert the newer one. It's written in Fortran, and gfortran is easy to install if you don't already have it.
https://github.com/rbitr/llm.f90/tree/optimize16/purefortran
- Mamba LLM Inference on CPU
-
Minimal implementation of Mamba, the new LLM architecture, in 1 file of PyTorch
The original Mamba code has a lot of speed optimizations and other details that make it hard to follow at first, so this will help with learning.
I can't help but also plug my own Mamba inference implementation. https://github.com/rbitr/llm.f90/tree/master/ssm
- Mamba state-space LLM inference
-
Guide to the Mamba architecture that claims to be a replacement for Transformers
You may also be interested in https://github.com/rbitr/llm.f90/tree/master/ssm, my inference-only implementation of Mamba, which ends up being much simpler than the training code in the original repo.
Stats
rbitr/llm.f90 is an open source project licensed under the MIT License, an OSI-approved license.
The primary programming language of llm.f90 is Fortran.