libpostal
splink
| | libpostal | splink |
|---|---|---|
| Mentions | 5 | 16 |
| Stars | 3,951 | 1,086 |
| Growth | 0.9% | 8.7% |
| Activity | 5.9 | 9.9 |
| Latest commit | 3 months ago | 5 days ago |
| Language | C | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
libpostal
-
Install Python Libraries Using Command Prompt
```bat
@echo off

REM Check if MSYS2 and MinGW are installed
where msys2 2>nul >nul
if %errorlevel% equ 0 (
    echo MSYS2 is already installed. Use --force to reinstall.
) else (
    REM Install MSYS2 and MinGW
    choco install msys2
    refreshenv
)

REM Check if MSYS2 packages are updated
pacman -Qu 2>nul >nul
if %errorlevel% equ 0 (
    echo MSYS2 packages are already updated. Use --force to reinstall.
) else (
    REM Update MSYS2 packages
    pacman -Syu
)

REM Check if build dependencies are installed
pacman -Q autoconf automake curl git make libtool gcc mingw-w64-x86_64-gcc 2>nul >nul
if %errorlevel% equ 0 (
    echo Build dependencies are already installed. Use --force to reinstall.
) else (
    REM Install build dependencies
    pacman -S autoconf automake curl git make libtool gcc mingw-w64-x86_64-gcc
)

REM Check if libpostal is cloned
if exist libpostal (
    echo libpostal repository is already cloned. Use --force to reinstall.
) else (
    REM Clone libpostal repository
    git clone https://github.com/openvenues/libpostal
)
cd libpostal

REM Check if libpostal is built and installed
if exist "C:\Program Files\libpostal\bin\libpostal.dll" (
    echo libpostal is already built and installed. Use --force to reinstall.
) else (
    REM Build and install libpostal
    cp -rf windows/* ./
    ./bootstrap.sh
    ./configure --datadir=C:/libpostal
    make -j4
    make install
)

REM Add libpostal to PATH environment variable
REM (setx /m needs admin rights; fall back to the user-level PATH on failure)
setx /m PATH "%PATH%;C:\Program Files\libpostal\bin" 2>nul >nul
if %errorlevel% neq 0 (
    setx PATH "%PATH%;C:\Program Files\libpostal\bin"
)

REM Test libpostal installation
libpostal "100 S Broad St, Philadelphia, PA"
pause
```
-
Transforming free-form geospatial directions into addresses - SOTA?
I know of https://github.com/openvenues/libpostal which handles typos and omissions in addresses, but I am looking into a more fuzzy description of a location.
-
[P] Better ways to clean lots of text?
use an address parser library like libpostal https://github.com/openvenues/libpostal
-
complete stack for an analysis team
Also, what OS(s) does IT support for clients and servers? I think Libpostal doesn't officially support Windows, but you can build it to target that. Seems difficult and/or unreliable though: https://github.com/openvenues/libpostal/issues/219
-
Automating a Web Scraper
You can feed libpostal a sequence of strings until it gives good results. A lot of misses, some hits; score the hits. https://github.com/openvenues/libpostal
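The feed-and-score loop described above can be sketched with the standard library alone. Here `difflib.SequenceMatcher` stands in for whatever scoring the scraper uses, and the reference address and candidate strings are made-up illustrations, not part of any real pipeline:

```python
from difflib import SequenceMatcher

def score_hit(candidate: str, reference: str) -> float:
    """Similarity in [0, 1] between a scraped string and a known-good address."""
    return SequenceMatcher(None, candidate.lower(), reference.lower()).ratio()

# Hypothetical scraped strings; only some resemble an address at all.
reference = "100 S Broad St, Philadelphia, PA"
candidates = [
    "100 South Broad Street Philadelphia PA",
    "Contact us today for a quote",
    "100 S Broad St, Phila, PA",
]

# Keep the best-scoring candidate; a real pipeline would also set a
# minimum threshold to discard pages with no address at all.
best = max(candidates, key=lambda c: score_hit(c, reference))
```

In practice you would run each candidate through libpostal's parser first and score the parsed components rather than the raw strings.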
splink
- Splink: Fast, accurate, scalable probabilistic data linkage
-
Ask HN: What projects are you working on?
https://github.com/moj-analytical-services/splink
-
Record linkage/Entity linkage
Record linkage has been a big part of a project I've been working on for 6 months now. I personally think a great and free solution would be the splink package in Python, which can handle 10+ million rows. It implements the Fellegi-Sunter model (equivalent to a naive-Bayes model), the classical model in record linkage. It can be trained in an unsupervised manner using some initial parameter estimates (these are quite intuitive) and then expectation maximisation. The features in the model will be different pairwise string comparisons on your fields of interest. These can include exact equality; edit-distance comparisons like Levenshtein distance and Jaro-Winkler; and phonetic comparisons like Soundex and Double Metaphone. The splink package will handle training the model and then all the graph theory at the end to connect all your links into clusters. All the details you'll need are in the links. https://www.robinlinacre.com/probabilistic_linkage/ https://moj-analytical-services.github.io/splink/
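The Fellegi-Sunter scoring described above can be sketched in a few lines. The m and u probabilities here are illustrative numbers, not trained values (in splink they would come from the unsupervised estimation and expectation maximisation steps), and the field names are invented for the example:

```python
import math

# Illustrative m/u probabilities for two comparison fields.
# m = P(fields agree | records are a match)
# u = P(fields agree | records are not a match)
fields = {
    "surname":    {"m": 0.90, "u": 0.01},
    "birth_year": {"m": 0.95, "u": 0.10},
}

def match_weight(agreements: dict) -> float:
    """Sum of log2 Bayes factors: agreement on a field adds log2(m/u),
    disagreement adds log2((1-m)/(1-u)). Positive totals favour a match."""
    w = 0.0
    for field, p in fields.items():
        if agreements[field]:
            w += math.log2(p["m"] / p["u"])
        else:
            w += math.log2((1 - p["m"]) / (1 - p["u"]))
    return w

# A record pair agreeing on surname but not birth year:
w = match_weight({"surname": True, "birth_year": False})
```

Real implementations replace the boolean agree/disagree with graded comparison levels (exact, close edit distance, phonetic match, etc.), each with its own m and u.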
-
What is the best approach to removing duplicate person records if the only identifiers are first name, middle name and last name? These names are entered in varying ways into the DB, thus they are free-formatted.
https://moj-analytical-services.github.io/splink/ is a FOSS Python package (but it runs against your DB using SQL).
-
DuckDB – in-process SQL OLAP database management system
If you're curious, I've written a FOSS record linkage library that executes everything as SQL. It supports multiple SQL backends including DuckDB and Spark for scale, and runs faster than most competitors because it's able to leverage the speed of these backends: https://github.com/moj-analytical-services/splink
-
Ask HN: What have you created that deserves a second chance on HN?
Splink - a python library for probabilistic record linkage (fuzzy matching/entity resolution).
Splink is dramatically faster and works on much larger datasets than other open source libraries. I'm particularly proud of the fact we support multiple execution backends (at the moment DuckDB, Spark, Athena and SQLite, but additional adaptors are relatively straightforward to write).
We've had >4 million PyPI downloads and it's used in government, academia and the private sector, often replacing extremely expensive proprietary solutions.
https://github.com/moj-analytical-services/splink
More info in blog posts here:
-
Conformed Dimensions problem that keeps recurring on every project
Splink is a SQL tool that can do this https://github.com/moj-analytical-services/splink
-
How do you join two sources with attributes that aren't identical?
Probabilistic record matching model such as a Fellegi-Sunter. Check out the splink package in Python.
-
Splink 3: Fast, accurate and scalable record linkage (entity resolution) in Python
Main docs here: https://moj-analytical-services.github.io/splink
-
Splink 3: Fast, accurate and scalable fuzzy record linkage in Python with support for multiple backends (FOSS)
It'd be great to see Splink add value in this area! Do give us a shout if you have any questions. The best place to post is on the Github discussions: https://github.com/moj-analytical-services/splink/discussions
What are some alternatives?
usaddress - :us: a python library for parsing unstructured United States address strings into address components
zingg - Scalable identity resolution, entity resolution, data mastering and deduplication using ML
neuralcoref - ✨Fast Coreference Resolution in spaCy with Neural Networks
dedupe - :id: A python library for accurate and scalable fuzzy matching, record deduplication and entity-resolution.
rmlint - Extremely fast tool to remove duplicates and other lint from your filesystem
sqlglot - Python SQL Parser and Transpiler
jdupes - A powerful duplicate file finder and an enhanced fork of 'fdupes'.
entity-embed - PyTorch library for transforming entities like companies, products, etc. into vectors to support scalable Record Linkage / Entity Resolution using Approximate Nearest Neighbors.
kvdo - A kernel module which provides a pool of deduplicated and/or compressed block storage.
dblink - Distributed Bayesian Entity Resolution in Apache Spark
spaCy - 💫 Industrial-strength Natural Language Processing (NLP) in Python
KaithemAutomation - Pure Python, GUI-focused home automation/consumer grade SCADA