Need help in processing a CSV file with around a million lines.

This page summarizes the projects mentioned and recommended in the original post on /r/laravel

  • CSV

    CSV data manipulation made easy in PHP

    Either use something like league/csv or use fgetcsv directly.

  • csv-collection

    Read and write large CSV files using the power of Laravel's lazy collections

I wrote a package for this a while ago for some projects in our company :) https://github.com/DutchCodingCompany/csv-collection. It has helped us quite a few times! It uses `fgetcsv` and a LazyCollection internally.

  • fast-excel

    🦉 Fast Excel import/export for Laravel

    For performance: https://github.com/rap2hpoutre/fast-excel

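The fgetcsv-based suggestions above share one technique: stream the file row by row instead of loading it into memory, which is what makes a million-line CSV tractable. A minimal sketch using plain `fgetcsv` in a generator (the csv-collection package wraps the same idea in Laravel's LazyCollection); the file path and column names are illustrative:

```php
<?php
// Stream a CSV with fgetcsv wrapped in a generator, so only one
// row is held in memory at a time regardless of file size.

function readCsvRows(string $path): \Generator
{
    $handle = fopen($path, 'r');
    if ($handle === false) {
        throw new RuntimeException("Cannot open {$path}");
    }

    try {
        // Read the first line as the column names.
        $header = fgetcsv($handle);

        while (($row = fgetcsv($handle)) !== false) {
            // Yield each row as an associative array keyed by header.
            yield array_combine($header, $row);
        }
    } finally {
        fclose($handle);
    }
}

// Usage: iterate lazily; nothing is buffered beyond the current row.
// foreach (readCsvRows('big.csv') as $row) {
//     // process $row, e.g. insert in batches
// }
```

league/csv offers the same streaming behavior behind a richer API (`Reader::createFromPath()` plus `getRecords()`), which is worth the dependency if you also need filtering, offsets, or encoding handling.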
NOTE: The number of mentions on this list reflects mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
