h-former VS EasyLM

Compare h-former vs EasyLM and see how they differ.

h-former

H-Former is a VAE for generating in-between fonts (or combining fonts). Its encoder uses a PointNet and a transformer to compute a code vector for a glyph. Its decoder is composed of multiple independent decoders, each of which acts on the code vector to reconstruct a point cloud representing a glyph. (by mzguntalan)
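
As a concrete reading of that description, the sketch below shows how a PointNet-plus-transformer encoder and a bank of independent decoders can be wired into a VAE. It is not the repository's code: the 2-D points, the 64-dimensional code vector, the four decoders, and all class names are illustrative assumptions, written in JAX/Flax.

```python
import jax
import jax.numpy as jnp
import flax.linen as nn


class GlyphEncoder(nn.Module):
    """PointNet-style per-point features + transformer mixing -> latent code."""
    latent_dim: int = 64

    @nn.compact
    def __call__(self, points):                   # points: (num_points, 2)
        h = nn.relu(nn.Dense(128)(points))        # shared per-point MLP (PointNet-like)
        h = nn.SelfAttention(num_heads=4)(h)      # transformer-style interaction
        h = jnp.max(h, axis=0)                    # permutation-invariant pooling
        mean = nn.Dense(self.latent_dim)(h)       # VAE posterior parameters
        logvar = nn.Dense(self.latent_dim)(h)
        return mean, logvar


class GlyphDecoder(nn.Module):
    """Several independent MLP decoders, each emitting one slice of the cloud."""
    num_decoders: int = 4
    points_per_decoder: int = 32

    @nn.compact
    def __call__(self, code):                     # code: (latent_dim,)
        parts = []
        for _ in range(self.num_decoders):
            h = nn.relu(nn.Dense(128)(code))
            part = nn.Dense(self.points_per_decoder * 2)(h)
            parts.append(part.reshape(self.points_per_decoder, 2))
        return jnp.concatenate(parts, axis=0)     # reconstructed glyph point cloud


def reparameterize(rng, mean, logvar):
    """Standard VAE reparameterization trick."""
    return mean + jnp.exp(0.5 * logvar) * jax.random.normal(rng, mean.shape)


rng = jax.random.PRNGKey(0)
cloud = jax.random.normal(rng, (200, 2))          # stand-in for a glyph point cloud
enc, dec = GlyphEncoder(), GlyphDecoder()
enc_params = enc.init(rng, cloud)
mean, logvar = enc.apply(enc_params, cloud)
code = reparameterize(rng, mean, logvar)
dec_params = dec.init(rng, code)
recon = dec.apply(dec_params, code)               # (128, 2) point cloud
```

The max-pool after the attention block is what makes the code vector independent of point ordering, which is the usual reason to pair a PointNet-style encoder with point-cloud data.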

EasyLM

Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax. (by young-geng)
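
EasyLM's own training scripts and flags are not reproduced on this page, so the following is only a generic JAX/Flax next-token training step, the kind of loop a framework like EasyLM packages behind its pre-training and fine-tuning entry points. Every name in it (TinyLM, loss_fn, train_step) is an illustrative assumption, not EasyLM's API.

```python
import jax
import jax.numpy as jnp
import optax
import flax.linen as nn


class TinyLM(nn.Module):
    """A toy stand-in for a language model: embed, attend, project to logits."""
    vocab_size: int = 256
    hidden: int = 64

    @nn.compact
    def __call__(self, tokens):                       # tokens: (batch, seq)
        x = nn.Embed(self.vocab_size, self.hidden)(tokens)
        x = nn.SelfAttention(num_heads=4)(x)
        x = nn.Dense(self.hidden)(nn.relu(x))
        return nn.Dense(self.vocab_size)(x)           # next-token logits


model = TinyLM()
optimizer = optax.adamw(1e-3)


def loss_fn(params, tokens):
    # Shifted next-token cross-entropy, the standard LM objective.
    logits = model.apply(params, tokens[:, :-1])
    targets = tokens[:, 1:]
    log_probs = jax.nn.log_softmax(logits)
    nll = -jnp.take_along_axis(log_probs, targets[..., None], axis=-1)
    return nll.mean()


@jax.jit
def train_step(params, opt_state, tokens):
    loss, grads = jax.value_and_grad(loss_fn)(params, tokens)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    return optax.apply_updates(params, updates), opt_state, loss


rng = jax.random.PRNGKey(0)
tokens = jax.random.randint(rng, (2, 16), 0, 256)     # toy batch of token ids
params = model.init(rng, tokens[:, :-1])
opt_state = optimizer.init(params)
params, opt_state, loss = train_step(params, opt_state, tokens)
```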

             h-former              EasyLM
Mentions     3                     8
Stars        5                     2,228
Growth       -                     -
Activity     0.0                   7.7
Last commit  almost 2 years ago    4 months ago
Language     Python                Python
License      -                     Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

h-former

Posts with mentions or reviews of h-former. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-29.

EasyLM

Posts with mentions or reviews of EasyLM. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.

What are some alternatives?

When comparing h-former and EasyLM, you can also consider the following projects:

GradCache - Run Effective Large Batch Contrastive Learning Beyond GPU/TPU Memory Constraint

mlc-llm - Enable everyone to develop, optimize and deploy AI models natively on everyone's devices.

pointnet - PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation

camel - 🐫 CAMEL: Communicative Agents for “Mind” Exploration of Large Language Model Society (NeurIPS 2023) https://www.camel-ai.org

Open-Llama - The complete training code of the open-source high-performance Llama model, including the full process from pre-training to RLHF.

brev-cli - Connect your laptop to cloud computers.

RWKV-LM - RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable), so it combines the best of RNN and transformer: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embedding.

modal-examples - Examples of programs built using Modal

text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

h2o-llmstudio - H2O LLM Studio - a framework and no-code GUI for fine-tuning LLMs. Documentation: https://h2oai.github.io/h2o-llmstudio/

Macaw-LLM - Macaw-LLM: Multi-Modal Language Modeling with Image, Video, Audio, and Text Integration