mud-pi
A simple MUD server in Python, for teaching purposes, which could be run on a Raspberry Pi (by Frimkron)
gemma.cpp
lightweight, standalone C++ inference engine for Google's Gemma models. (by google)
| | mud-pi | gemma.cpp |
|---|---|---|
| Mentions | 5 | 8 |
| Stars | 341 | 5,516 |
| Growth | - | 7.7% |
| Activity | 0.0 | 9.3 |
| Last commit | almost 3 years ago | 7 days ago |
| Language | Python | C++ |
| License | MIT License | Apache License 2.0 |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mud-pi
Posts with mentions or reviews of mud-pi.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2024-02-26.
- FLaNK Stack 26 February 2024
- A simple MUD server in Python which can be run on a Raspberry Pi
- Python for dnd
- What kind of data structure is this and how can I modify it to meet my needs?
- Has anyone created a MUD (multi-user dungeon) game in Python?

  Maybe look at MUD Pi. Are you sure you want to develop a MUD, though? That will require you to run a server (possibly hosted on a paid platform like AWS or Heroku, although if you only plan to connect over a LAN that won't be necessary). A single-player roguelike would be a bit easier.
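As the reply notes, the hard part of a MUD is the server: at its core it is just a TCP listener that greets each connection and handles line-based commands. A minimal sketch in plain Python (names like `serve` and `handle_client` are illustrative, not mud-pi's actual API):

```python
import socket
import threading

def handle_client(conn: socket.socket) -> None:
    """Greet the player, then echo line-based commands until 'quit'."""
    with conn, conn.makefile("rb") as reader:
        conn.sendall(b"Welcome to the MUD!\n")
        for line in reader:
            cmd = line.decode().strip()
            if cmd == "quit":
                conn.sendall(b"Goodbye.\n")
                break
            conn.sendall(f"You said: {cmd}\n".encode())

def accept_loop(srv: socket.socket) -> None:
    """Hand each incoming connection to its own thread."""
    while True:
        try:
            conn, _addr = srv.accept()
        except OSError:  # server socket closed
            return
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

def serve(host: str = "127.0.0.1", port: int = 0) -> socket.socket:
    """Start listening and return the server socket (port 0 picks a free port)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen()
    threading.Thread(target=accept_loop, args=(srv,), daemon=True).start()
    return srv
```

Players can then connect with a plain telnet client, which is exactly how classic MUDs work; a real server would add per-player state, rooms, and a command parser on top of this loop.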
gemma.cpp
Posts with mentions or reviews of gemma.cpp.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2024-03-31.
- LLaMA Now Goes Faster on CPUs

  For C++, also check out our https://github.com/google/gemma.cpp/blob/main/gemma.cc, which has direct calls to MatVec.
- FLaNK Stack 26 February 2024
- Gemma.cpp: lightweight, standalone C++ inference engine for Gemma models

  Looks like they're working on it: https://github.com/google/gemma.cpp/issues/16
- Source code of Google Gemma model in C++
- Gemma: New Open Models

  They have also implemented the model in their own C++ inference engine: https://github.com/google/gemma.cpp
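The MatVec calls mentioned in the posts above are the basic kernel of CPU inference: every transformer layer reduces to matrix-vector products when generating one token at a time. A toy Python illustration of that operation (not gemma.cpp's actual implementation, which is vectorized, multithreaded C++):

```python
def matvec(mat: list[list[float]], vec: list[float]) -> list[float]:
    """Compute mat @ vec: one dot product per output row.

    During single-token generation, the 'vector' is the current hidden
    state and the 'matrix' is a weight matrix, so this dot-product loop
    is where nearly all the inference time goes.
    """
    assert all(len(row) == len(vec) for row in mat), "shape mismatch"
    return [sum(m * v for m, v in zip(row, vec)) for row in mat]
```

Engines like gemma.cpp speed up exactly this loop with SIMD instructions and thread-level parallelism, which is why a dedicated MatVec path matters for CPU performance.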
What are some alternatives?
When comparing mud-pi and gemma.cpp you can also consider the following projects:
evennia - Python MUD/MUX/MUSH/MU* development system
llamafile - Distribute and run LLMs with a single file.