tensorflow VS LightGBM

Compare tensorflow vs LightGBM and see how they differ.

LightGBM

A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. (by Microsoft)
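
To give a concrete feel for what that description means in practice, here is a minimal sketch of training a LightGBM classifier through its scikit-learn-style Python interface. The synthetic dataset and the parameter values are illustrative assumptions, not taken from this page:

    # Minimal LightGBM classification sketch (illustrative parameters).
    from lightgbm import LGBMClassifier
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    # Synthetic binary-classification data stands in for a real dataset.
    X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Gradient-boosted decision trees; n_estimators is the number of boosting rounds.
    model = LGBMClassifier(n_estimators=100, learning_rate=0.1)
    model.fit(X_train, y_train)

    print("test accuracy:", model.score(X_test, y_test))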
                tensorflow           LightGBM
Mentions        216                  11
Stars           177,728              15,479
Growth          0.7%                 1.0%
Activity        10.0                 0.0
Latest commit   4 days ago           3 days ago
Language        C++                  C++
License         Apache License 2.0   MIT License
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

tensorflow

Posts with mentions or reviews of tensorflow. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-08-04.

LightGBM

Posts with mentions or reviews of LightGBM. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-29.
  • SIRUS.jl: Interpretable Machine Learning via Rule Extraction
    2 projects | /r/Julia | 29 Jun 2023
    SIRUS.jl is a pure-Julia implementation of the SIRUS algorithm by Bénard et al. (2021). It is a rule-based machine learning model, meaning it is fully interpretable: it first fits a random forest and then converts the forest to rules. Furthermore, the algorithm is stable and achieves predictive performance comparable to LightGBM, a state-of-the-art gradient boosting model created by Microsoft. Interpretability, stability, and predictive performance are described in more detail below.
  • What's New with AWS: Amazon SageMaker built-in algorithms now provides four new Tabular Data Modeling Algorithms
    3 projects | dev.to | 28 Jun 2022
    LightGBM is a popular and high-performance open-source implementation of the Gradient Boosting Decision Tree (GBDT). To learn how to use this algorithm, please see example notebooks for Classification and Regression.
  • Search YouTube from the terminal written in python
    2 projects | /r/Python | 28 Feb 2022
    Microsoft LightGBM: https://github.com/microsoft/LightGBM
  • LightGBM VS CXXGraph - a user suggested alternative
    2 projects | 28 Feb 2022
  • Writing the fastest GBDT library in Rust
    6 projects | dev.to | 11 Jan 2022
    Here are our benchmarks on training time comparing Tangram's Gradient Boosted Decision Tree Library to LightGBM, XGBoost, CatBoost, and sklearn.
  • Workstation Management With Nix Flakes: Build a Cmake C++ Package
    2 projects | dev.to | 31 Oct 2021
    {
      inputs = {
        nixpkgs = { url = "github:nixos/nixpkgs/nixos-unstable"; };
        flake-utils = { url = "github:numtide/flake-utils"; };
      };
      outputs = { nixpkgs, flake-utils, ... }:
        flake-utils.lib.eachDefaultSystem (system:
          let
            pkgs = import nixpkgs { inherit system; };
            lightgbm-cli = (with pkgs; stdenv.mkDerivation {
              pname = "lightgbm-cli";
              version = "3.3.1";
              src = fetchgit {
                url = "https://github.com/microsoft/LightGBM";
                rev = "v3.3.1";
                sha256 = "pBrsey0RpxxvlwSKrOJEBQp7Hd9Yzr5w5OdUuyFpgF8=";
                fetchSubmodules = true;
              };
              nativeBuildInputs = [ clang cmake ];
              buildPhase = "make -j $NIX_BUILD_CORES";
              installPhase = ''
                mkdir -p $out/bin
                mv $TMP/LightGBM/lightgbm $out/bin
              '';
            });
          in rec {
            defaultApp = flake-utils.lib.mkApp { drv = defaultPackage; };
            defaultPackage = lightgbm-cli;
            devShell = pkgs.mkShell {
              buildInputs = with pkgs; [ lightgbm-cli ];
            };
          });
    }
  • Is it possible to clean memory after using a package that has a memory leak in my python script?
    2 projects | /r/Python | 29 Apr 2021
    I'm working on an AutoML Python package (GitHub repo). In my package, I use many different algorithms, one of which is LightGBM. After training, the algorithm doesn't release memory, even if del is called and gc.collect() is run afterwards. I created an issue on the LightGBM GitHub -> link. Because of this leak, memory consumption grows very quickly during algorithm training.
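
The leak described in that last post is reported against LightGBM itself, so it can't be fixed from the calling script. Until it is resolved upstream, a generic workaround is to isolate training in a child process so the operating system reclaims all of its memory when the process exits. The sketch below assumes training can be moved into a standalone function; train_one_model and the synthetic dataset are hypothetical stand-ins:

    # Sketch: run a leaky training step in a child process so the OS
    # reclaims its memory on exit.
    from multiprocessing import get_context

    from lightgbm import LGBMClassifier
    from sklearn.datasets import make_classification

    def train_one_model(queue):
        # Train inside the child process; any memory LightGBM fails to
        # release dies with the process.
        X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
        model = LGBMClassifier(n_estimators=100)
        model.fit(X, y)
        queue.put(model.score(X, y))  # send back only small results

    if __name__ == "__main__":
        ctx = get_context("spawn")  # fresh interpreter, no inherited state
        queue = ctx.Queue()
        proc = ctx.Process(target=train_one_model, args=(queue,))
        proc.start()
        score = queue.get()
        proc.join()
        print("training accuracy:", score)

Using the "spawn" start method keeps each run in a clean interpreter; the cost is re-importing libraries per run, which is usually negligible next to training time.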

What are some alternatives?

When comparing tensorflow and LightGBM you can also consider the following projects:

PaddlePaddle - PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (the PaddlePaddle 『飞桨』 core framework: high-performance single-machine and distributed training and cross-platform deployment for deep learning and machine learning)

Prophet - Tool for producing high-quality forecasts for time series data that has multiple seasonality with linear or non-linear growth.

Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more

scikit-learn - scikit-learn: machine learning in Python

LightFM - A Python implementation of LightFM, a hybrid recommendation algorithm.

xgboost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

PyBrain

MLflow - Open source platform for the machine learning lifecycle

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

Deeplearning4j - Suite of tools for deploying and training deep learning models on the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch; a modular, tiny C++ library for running math code; and a Java-based math library on top of the core C++ library. Also includes SameDiff, a PyTorch/TensorFlow-like library for running deep learning with automatic differentiation.

Keras - Deep Learning for humans

CNTK - Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit