Processing Wikipedia Dumps With Python

This page summarizes the projects mentioned and recommended in the original post on /r/programming

  • mwparserfromhell

    A Python parser for MediaWiki wikicode

  • There's also https://github.com/earwig/mwparserfromhell, if you don't want to roll your own.

NOTE: The number of mentions on this list counts mentions in common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.
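The posts above all revolve around the same task: streaming pages out of a Wikipedia XML backup dump and then parsing the wikitext (for example with mwparserfromhell's `parse()` and `strip_code()`). As an illustrative sketch, the helper below (`iter_page_texts` is a hypothetical name, not from any library) uses only the standard library's `iterparse` to walk a dump incrementally, since real dumps are far too large to load whole:

```python
import io
import xml.etree.ElementTree as ET

def iter_page_texts(xml_file):
    """Stream (title, wikitext) pairs from a MediaWiki XML dump
    without loading the whole file into memory."""
    title, text = None, ""
    for event, elem in ET.iterparse(xml_file, events=("end",)):
        tag = elem.tag.rsplit("}", 1)[-1]  # drop the XML namespace, if any
        if tag == "title":
            title = elem.text
        elif tag == "text":
            text = elem.text or ""
        elif tag == "page":
            yield title, text
            elem.clear()  # free memory for the finished page

# Tiny inline sample standing in for a real multi-gigabyte dump.
sample = io.StringIO(
    "<mediawiki><page><title>Example</title>"
    "<revision><text>'''Example''' is a [[test]] page.</text></revision>"
    "</page></mediawiki>"
)
for title, text in iter_page_texts(sample):
    print(title, "->", text)
```

Each yielded `text` is raw wikitext; that is the point where a wikicode parser such as mwparserfromhell would take over to strip markup or extract templates and links.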


Related posts

  • [Python] How can I clean up Wikipedia's XML backup dump to create dictionaries of commonly used words for multiple languages?

    1 project | /r/learnprogramming | 12 Oct 2021
  • How can I clean up Wikipedia's XML backup dump to create dictionaries of commonly used words for multiple languages?

    2 projects | /r/learnpython | 10 Oct 2021
  • I spent 2 weeks building a complex data parsing program for a data project, and today I found out that such a library already exists.

    1 project | /r/learnprogramming | 14 May 2022
  • [UPDATE] Here's the transcript of the 1781 most-used German Nouns according to a 4.2 million word corpus research performed by Routledge

    1 project | /r/German | 9 Jul 2021
  • The Future of MySQL is PostgreSQL: an extension for the MySQL wire protocol

    1 project | news.ycombinator.com | 26 Apr 2024