sqlitefs
datasette
| | sqlitefs | datasette |
| --- | --- | --- |
| Mentions | 3 | 150 |
| Stars | 19 | 7,270 |
| Growth | - | - |
| Activity | 10.0 | 9.3 |
| Last commit | 7 months ago | 7 days ago |
| Language | Rust | Python |
| License | MIT License | Apache License 2.0 |
- Stars - the number of stars a project has on GitHub.
- Growth - month-over-month growth in stars.
- Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.
sqlitefs
- SQLite: 35% Faster Than the Filesystem
> but also presents as a true filesystem.
As does:
https://github.com/guardianproject/libsqlfs
https://github.com/narumatt/sqlitefs
(I know nothing about these; they just came up in a quick search.)
- Why SQLite may become foundational for digital progress
- Fd: A simple, fast and user-friendly alternative to 'find'
datasette
- Not by AI
If anyone doubts that simonw knows Datasette well, I encourage a close examination of this URL: https://github.com/simonw/datasette
- Ask HN: Small scripts, hacks and automations you're proud of?
I have a neat Hacker News scraping setup that I'm really pleased with.
The problem: I want to know when content from one of my sites is submitted to Hacker News, and keep track of the points and comments over time. I also want to be alerted when it happens.
Solution: https://github.com/simonw/scrape-hacker-news-by-domain/
This repo does a LOT of things.
It's an implementation of my Git scraping pattern - https://simonwillison.net/2020/Oct/9/git-scraping/ - in that it runs a script once an hour to check for more content.
It scrapes https://news.ycombinator.com/from?site=simonwillison.net (scraping the HTML because this particular feature isn't supported by the Hacker News API) using shot-scraper - a tool I built for command-line browser automation: https://shot-scraper.datasette.io/
The scraper works by running this JavaScript against the page and recording the resulting JSON to the Git repository: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
That solves the "monitor and record any changes" bit.
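The hourly "scrape and commit" loop at the heart of git scraping fits in a single GitHub Actions workflow. This is an illustrative sketch, not the repo's actual workflow - the file name, CSS selectors, and output path are all assumptions:

```yaml
name: Scrape Hacker News mentions
on:
  schedule:
    - cron: "0 * * * *" # once an hour
  workflow_dispatch: {} # allow manual runs too
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install shot-scraper and its headless browser
        run: |
          pip install shot-scraper
          shot-scraper install
      - name: Scrape the HN "from" page to JSON
        run: |
          shot-scraper javascript \
            "https://news.ycombinator.com/from?site=simonwillison.net" \
            "Array.from(document.querySelectorAll('.athing'), row => ({
               id: row.id,
               title: row.querySelector('.titleline a')?.textContent
             }))" > hacker-news.json
      - name: Commit only if the data changed
        run: |
          git config user.name "Automated"
          git config user.email "actions@users.noreply.github.com"
          git add hacker-news.json
          if ! git diff --staged --quiet; then
            git commit -m "Latest data"
            git push
          fi
```

Because commits only happen when the JSON differs, the repo's history becomes a changelog of every appearance and score change.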
But... I want alerts when my content shows up.
I solve that using three more tools I built: https://datasette.io/ and https://datasette.io/plugins/datasette-atom and https://datasette.cloud/
This script here runs to push the latest scraped JSON to my SQLite database hosted using my in-development SaaS platform, Datasette Cloud: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
I defined this SQL view https://simon.datasette.cloud/data/hacker_news_posts_atom which shows the latest data in the format required by the datasette-atom plugin.
Which means I can subscribe to the resulting Atom feed (add .atom to that URL) in NetNewsWire and get alerted when my content shows up on Hacker News!
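The datasette-atom plugin keys off column aliases - `atom_id`, `atom_title` and `atom_updated` are required, with `atom_link` and `atom_content` optional - so the view just renames columns. Here's a minimal sketch with an invented table and row, not the actual view behind that URL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    -- Hypothetical table of scraped Hacker News posts
    CREATE TABLE hacker_news_posts (
        id INTEGER PRIMARY KEY,
        title TEXT,
        url TEXT,
        submitted_at TEXT
    );
    INSERT INTO hacker_news_posts VALUES
        (34052, 'Datasette is my data hammer',
         'https://news.ycombinator.com/item?id=34052',
         '2022-12-02T10:00:00Z');

    -- datasette-atom looks for the atom_* column names
    CREATE VIEW hacker_news_posts_atom AS
    SELECT
        'hn:' || id  AS atom_id,
        title        AS atom_title,
        submitted_at AS atom_updated,
        url          AS atom_link
    FROM hacker_news_posts
    ORDER BY submitted_at DESC;
    """
)
print(conn.execute("SELECT atom_id, atom_title FROM hacker_news_posts_atom").fetchone())
```

With a view like that in place, appending `.atom` to the view's Datasette URL produces a feed any reader can subscribe to.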
I wrote a bit more about how this all works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
- Tool to parse, index, and search local documents? (Windows)
Datasette
- What is the easiest way to make a searchable, sortable, multi-criteria database front page?
Datasette is remarkably well-suited to exactly the use case you've described. You essentially just point it at a SQLite database and get a reasonably friendly interface for querying, sorting, all that good stuff. You can customize pretty much everything about it depending on how dirty you want to get your hands, but the out-of-the-box experience may well suffice for what you're trying to do.
- Ask HN: Any low-code frameworks on top of Django?
datasette? https://datasette.io/
- Want to build an API but have no idea where to start
Would this tool work for you? It looks specifically tailored to exposing DB data as APIs: https://datasette.io/
- The Untold Story of SQLite
- Open Source Project to Create a Comprehensive Food Database [Help wanted]
Very cool. Have you considered making the data available as a (static) Datasette website? https://datasette.io/ You can probably host it on GitHub Pages for free.
- I'm sure I'm being stupid... Copying data from an API and making a database
My project https://datasette.io/ is ideal for this kind of thing. You can use https://sqlite-utils.datasette.io/ to load JSON data into a SQLite database, then publish it with Datasette.
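The load-then-publish flow takes only a few lines. Here's a stdlib-only sketch of roughly what `sqlite-utils insert data.db items items.json` automates - note that sqlite-utils also infers the table schema from the JSON, which this toy version hardcodes, and the table name and fields here are invented:

```python
import json
import sqlite3

# Hypothetical API response saved as JSON
records = json.loads('[{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]')

conn = sqlite3.connect("data.db")  # Datasette can serve this file directly
conn.execute(
    "CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY, name TEXT)"
)
# Named-parameter style lets each JSON object map straight onto a row
conn.executemany(
    "INSERT OR REPLACE INTO items (id, name) VALUES (:id, :name)", records
)
conn.commit()

print(conn.execute("SELECT count(*) FROM items").fetchone()[0])  # 2
```

Once the database file exists, `datasette data.db` serves a browsable UI and a read-only JSON API over it.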
- Datasette is my data hammer
I'm definitely keen on suggestions for improvements I can make to the default UI.
Datasette provides both a JSON API (easily enabled for CORS access) and supports custom templates, so it's possible to customize the UI any way you like.
So far I've not seen many examples of extensive customization. I use custom templates a lot myself - these sites are all just regular Datasette with some custom templates:
- https://til.simonwillison.net/
- https://www.niche-museums.com/
- https://www.rockybeaches.com/us/pillar-point
Source code for all of them is on GitHub.
What are some alternatives?
nocodb - 🔥 🔥 🔥 Open Source Airtable Alternative
duckdb - DuckDB is an in-process SQL OLAP Database Management System
sql.js-httpvfs - Hosting read-only SQLite databases on static file hosters like Github Pages
litestream - Streaming replication for SQLite.
gomodest - A complex SAAS starter kit using Go, the html/template package, and sprinkles of javascript.
Sequel-Ace - MySQL/MariaDB database management for macOS
beekeeper-studio - Modern and easy to use SQL client for MySQL, Postgres, SQLite, SQL Server, and more. Linux, MacOS, and Windows.
dbhub.io - A "Cloud" for SQLite databases. Collaborative development for your data. :)
roapi - Create full-fledged APIs for slowly moving datasets without writing a single line of code.
temporal_tables - Temporal Tables PostgreSQL Extension
Sapper - The next small thing in web development, powered by Svelte
Redash - Make Your Company Data Driven. Connect to any data source, easily visualize, dashboard and share your data.