tensorflow VS LightGBM

Compare tensorflow vs LightGBM and see what their differences are.

LightGBM

A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used for ranking, classification and many other machine learning tasks. (by Microsoft)
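
For orientation, here is a minimal sketch of what using LightGBM from Python typically looks like; the synthetic dataset and hyperparameters are illustrative only, not taken from either project's documentation.

    # Minimal sketch: training a LightGBM classifier via its scikit-learn-style API.
    # The synthetic dataset and hyperparameters are illustrative only.
    import lightgbm as lgb
    from sklearn.datasets import make_classification
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = lgb.LGBMClassifier(n_estimators=200, learning_rate=0.05)
    model.fit(X_train, y_train)

    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
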
                        tensorflow            LightGBM
Mentions                145                   6
Stars                   165,908               13,926
Stars growth (MoM)      0.9%                  1.6%
Activity                10.0                  9.5
Last commit             6 days ago            3 days ago
Language                C++                   C++
License                 Apache License 2.0    MIT License

The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
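
As a rough illustration of the percentile idea behind that activity score (a hypothetical sketch, not the site's actual formula), a 0-10 value can be produced by ranking projects on some recency-weighted commit measure and scaling the rank:

    # Hypothetical sketch of a percentile-style activity score (not the site's
    # actual formula): rank projects by a raw, recency-weighted activity measure
    # and map the rank onto a 0-10 scale, so a 9.0 means roughly "top 10%".
    def activity_scores(raw_activity: dict[str, float]) -> dict[str, float]:
        ranked = sorted(raw_activity, key=raw_activity.get)
        n = len(ranked)
        return {name: round(10 * (i + 1) / n, 1) for i, name in enumerate(ranked)}

    print(activity_scores({"tensorflow": 5400.0, "LightGBM": 610.0, "tiny-lib": 12.0}))
    # {'tiny-lib': 3.3, 'LightGBM': 6.7, 'tensorflow': 10.0}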

tensorflow

Posts with mentions or reviews of tensorflow. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-08.

LightGBM

Posts with mentions or reviews of LightGBM. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-02-28.
  • Search YouTube from the terminal written in python
    2 projects | reddit.com/r/Python | 28 Feb 2022
    Microsoft LightGBM: https://github.com/microsoft/LightGBM
  • LightGBM VS CXXGraph - a user suggested alternative
    2 projects | 28 Feb 2022
  • Writing the fastest GBDT library in Rust
    6 projects | dev.to | 11 Jan 2022
    Here are our benchmarks on training time comparing Tangram's Gradient Boosted Decision Tree Library to LightGBM, XGBoost, CatBoost, and sklearn.
  • Workstation Management With Nix Flakes: Build a Cmake C++ Package
    2 projects | dev.to | 31 Oct 2021
    {
      inputs = {
        nixpkgs = { url = "github:nixos/nixpkgs/nixos-unstable"; };
        flake-utils = { url = "github:numtide/flake-utils"; };
      };
      outputs = { nixpkgs, flake-utils, ... }:
        flake-utils.lib.eachDefaultSystem (system:
          let
            pkgs = import nixpkgs { inherit system; };
            lightgbm-cli = (with pkgs; stdenv.mkDerivation {
              pname = "lightgbm-cli";
              version = "3.3.1";
              src = fetchgit {
                url = "https://github.com/microsoft/LightGBM";
                rev = "v3.3.1";
                sha256 = "pBrsey0RpxxvlwSKrOJEBQp7Hd9Yzr5w5OdUuyFpgF8=";
                fetchSubmodules = true;
              };
              nativeBuildInputs = [ clang cmake ];
              buildPhase = "make -j $NIX_BUILD_CORES";
              installPhase = ''
                mkdir -p $out/bin
                mv $TMP/LightGBM/lightgbm $out/bin
              '';
            });
          in rec {
            defaultApp = flake-utils.lib.mkApp { drv = defaultPackage; };
            defaultPackage = lightgbm-cli;
            devShell = pkgs.mkShell {
              buildInputs = with pkgs; [ lightgbm-cli ];
            };
          }
        );
    }
  • Is it possible to clean memory after using a package that has a memory leak in my python script?
    2 projects | reddit.com/r/Python | 29 Apr 2021
    I'm working on an AutoML Python package (Github repo). My package uses many different algorithms, one of which is LightGBM. After training, the algorithm doesn't release its memory, even if del is called and gc.collect() is run afterwards. I created an issue on the LightGBM GitHub -> link. Because of this leak, memory consumption grows very quickly during training. (A common process-isolation workaround is sketched after this list.)
  • LightGBM (kind of!) available in scikit-learn
    1 project | reddit.com/r/learnmachinelearning | 25 Apr 2021
    Finally, if you're still using XGBoost, take a look at LightGBM (or scikit-learn's version; see the sketch after this list). In my experience, it trains faster and usually performs better than XGBoost. See some benchmarks here.
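
On the memory-leak question above, a common workaround (a sketch only, with hypothetical helper names, not an official LightGBM recommendation) is to run each training in a child process, so the operating system reclaims all memory when that process exits:

    # Sketch of a process-isolation workaround for libraries that keep memory
    # after training: do the training in a child process and hand back only a
    # serialized model, so everything else is reclaimed when the process exits.
    # The helper names and parameters here are hypothetical placeholders.
    import multiprocessing as mp

    import lightgbm as lgb


    def _train_worker(X, y, params, out_queue):
        booster = lgb.train(params, lgb.Dataset(X, label=y))
        out_queue.put(booster.model_to_string())  # return only the serialized model


    def train_in_subprocess(X, y, params):
        queue = mp.Queue()
        proc = mp.Process(target=_train_worker, args=(X, y, params, queue))
        proc.start()
        model_str = queue.get()  # read before join() to avoid blocking on a full pipe
        proc.join()
        return lgb.Booster(model_str=model_str)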
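
The "scikit-learn's version" mentioned in the last post refers to scikit-learn's histogram-based gradient boosting estimators, which were inspired by LightGBM. A minimal sketch of trying both on the same data (synthetic dataset and settings are illustrative only):

    # Sketch: LightGBM alongside scikit-learn's LightGBM-inspired
    # HistGradientBoostingClassifier. Dataset and settings are illustrative only.
    import lightgbm as lgb
    from sklearn.datasets import make_classification
    from sklearn.ensemble import HistGradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=20_000, n_features=30, random_state=0)

    estimators = {
        "LightGBM": lgb.LGBMClassifier(n_estimators=100),
        "sklearn HistGradientBoosting": HistGradientBoostingClassifier(max_iter=100),
    }
    for name, clf in estimators.items():
        scores = cross_val_score(clf, X, y, cv=3)
        print(f"{name}: mean accuracy {scores.mean():.3f}")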

What are some alternatives?

When comparing tensorflow and LightGBM you can also consider the following projects:

PaddlePaddle - PArallel Distributed Deep LEarning: Machine Learning Framework from Industrial Practice (『飞桨』核心框架,深度学习&机器学习高性能单机、分布式训练和跨平台部署)

Prophet - Tool for producing high quality forecasts for time series data that has multiple seasonality with linear or non-linear growth.

scikit-learn - scikit-learn: machine learning in Python

xgboost - Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on single machine, Hadoop, Spark, Dask, Flink and DataFlow

PyBrain

LightFM - A Python implementation of LightFM, a hybrid recommendation algorithm.

Keras - Deep Learning for humans

MLflow - Open source platform for the machine learning lifecycle

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

gensim - Topic Modelling for Humans

CNTK - Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit

Deeplearning4j - Suite of tools for deploying and training deep learning models on the JVM. Highlights include model import for Keras, TensorFlow, and ONNX/PyTorch; a modular, tiny C++ library for running math code; and a Java math library on top of the core C++ library. Also includes SameDiff, a PyTorch/TensorFlow-like library for running deep learning with automatic differentiation.