Using the st_size field of the stat data structure to scan directories' sizes gives me different results than other disk usage analyzers

This page summarizes the projects mentioned and recommended in the original post on /r/linuxquestions

  • ncda

    Ncurses disk usage analyzer

  • I'm working on an Ncurses-based disk usage analyzer. The only way I have found to get directories' sizes is to call lstat() and read the st_size field. So I wrote a recursive function, get_dir_tree(): it reads a directory's contents and, for each entry, calls lstat() if it is a file or calls itself recursively if it is a directory. However, it reports larger totals than other disk usage analyzers such as du, ncdu and baobab, and I can't figure out why, so any help is really appreciated. Here is the file on GitHub that contains the function; it is on line 282. (A sketch contrasting st_size with du-style block accounting follows this list.)

  • baobab

    Read-only mirror of https://gitlab.gnome.org/GNOME/baobab

  • Disk Usage Analyzer (baobab)

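For context on why the numbers diverge: du-style tools report allocated space (st_blocks * 512, i.e. space rounded up to filesystem blocks), count each hard-linked inode only once, and use lstat() so symbolic links are not followed. st_size, by contrast, is the apparent length of a file, which can be larger than the allocated space (sparse files) or smaller (block rounding). Below is a minimal sketch of a du-style scan for comparison; it is not the author's get_dir_tree(), and the function name dir_usage and the fixed-size inode table are assumptions made purely for illustration.

    /*
     * Minimal sketch of a du-style directory scan (not the author's
     * get_dir_tree()): it sums st_blocks * 512 (allocated space) rather
     * than st_size (apparent size) and counts each hard-linked inode once.
     * The name dir_usage and the fixed-size inode table are assumptions
     * made for illustration; error handling is deliberately minimal.
     */
    #include <stdio.h>
    #include <string.h>
    #include <dirent.h>
    #include <sys/stat.h>

    /* Tiny table of (device, inode) pairs already counted, so files with
     * st_nlink > 1 are only added once. Real tools use a hash table. */
    static struct { dev_t dev; ino_t ino; } seen[4096];
    static size_t nseen;

    static int already_seen(dev_t dev, ino_t ino)
    {
        for (size_t i = 0; i < nseen; i++)
            if (seen[i].dev == dev && seen[i].ino == ino)
                return 1;
        if (nseen < sizeof(seen) / sizeof(seen[0])) {
            seen[nseen].dev = dev;
            seen[nseen].ino = ino;
            nseen++;
        }
        return 0;
    }

    /* Returns the total size of everything under path: apparent bytes
     * (st_size) when apparent != 0, otherwise allocated bytes
     * (st_blocks * 512), which is what du reports. */
    static long long dir_usage(const char *path, int apparent)
    {
        struct stat sb;
        if (lstat(path, &sb) == -1) {      /* lstat: do not follow symlinks */
            perror(path);
            return 0;
        }

        /* A hard-linked regular file is counted only the first time. */
        if (!S_ISDIR(sb.st_mode) && sb.st_nlink > 1 &&
            already_seen(sb.st_dev, sb.st_ino))
            return 0;

        long long total = apparent ? (long long)sb.st_size
                                   : (long long)sb.st_blocks * 512;

        if (!S_ISDIR(sb.st_mode))
            return total;

        DIR *dir = opendir(path);
        if (dir == NULL)
            return total;

        struct dirent *ent;
        while ((ent = readdir(dir)) != NULL) {
            if (strcmp(ent->d_name, ".") == 0 || strcmp(ent->d_name, "..") == 0)
                continue;
            char child[4096];
            snprintf(child, sizeof(child), "%s/%s", path, ent->d_name);
            total += dir_usage(child, apparent);
        }
        closedir(dir);
        return total;
    }

    int main(int argc, char *argv[])
    {
        const char *root = argc > 1 ? argv[1] : ".";
        printf("apparent size (st_size):        %lld bytes\n", dir_usage(root, 1));
        nseen = 0;                          /* reset the inode table between passes */
        printf("allocated size (st_blocks*512): %lld bytes\n", dir_usage(root, 0));
        return 0;
    }

Comparing the two printed totals against du -s and du -s --apparent-size on the same tree usually shows which convention another analyzer follows, and whether hard links, sparse files, or block rounding account for the gap.
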
NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
