tcl VS drydock

Compare tcl vs drydock and see what their differences are.

             tcl   drydock
Mentions     11    3
Stars        -     6
Growth       -     -
Activity     -     0.0
Last commit  -     almost 2 years ago
Language     -     Go
License      -     Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

tcl

Posts with mentions or reviews of tcl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-02-10.
  • A brief interview with Tcl creator John Ousterhout
    9 projects | news.ycombinator.com | 10 Feb 2023
  • Tcl Ported to Go
    2 projects | /r/programming | 14 Oct 2022
    Behold, a 16MB "Go" file: https://gitlab.com/cznic/tcl/-/blob/master/lib/tcl_windows_amd64.go
    1 project | /r/patient_hackernews | 13 Oct 2022
    1 project | /r/hackernews | 13 Oct 2022
    1 project | /r/hypeurls | 13 Oct 2022
    9 projects | news.ycombinator.com | 13 Oct 2022
    Based upon info in an AUTHORS file [0] for a different project by the user cznic, I think there is indeed some kind of connection with nic.cz.

    But I believe, based upon my membership on the GoNuts group / mailing list, that this is mostly the work of one individual, Jan Mercl (he is quite active in GoNuts) — as also stated in the previously mentioned AUTHORS file, and in the AUTHORS file for the port of Tcl that is the subject of the OP [1].

    I have used some of his non-transpiled code/projects in my own Go projects in the past. He seems to be a very solid coder, is often happy to share his views in GoNuts, and frequently reviews others' code as well.

    [0] https://github.com/cznic/golex/blob/master/AUTHORS

    [1] https://gitlab.com/cznic/tcl/-/blob/master/AUTHORS

  • News clippings 2022-10-14
    2 projects | dev.to | 13 Oct 2022
  • SQLite in Go, with and Without Cgo
    16 projects | news.ycombinator.com | 13 May 2022
    I think the author of modernc.org/sqlite also ported the test suite. They wrote https://gitlab.com/cznic/tcl to run the Tcl-based tests, for example.

drydock

Posts with mentions or reviews of drydock. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-05-13.
  • SQLite in Go, with and Without Cgo
    16 projects | news.ycombinator.com | 13 May 2022
    I have been using SQLite in Go projects for a few years now. During early stages of development I always start with SQLite as the main database, then when the project matures, I usually add support for PostgreSQL.

    (I usually make a Store interface which is application-specific and doesn't even assume there is an SQL database underneath. Then I make "driver" packages for each storage system - be it PostgreSQL, SQLite, flat files, time series, etc. I have only one set of unit tests, which is then run against all drivers. And when I have a caching layer, I also run all the unit tests with and without caching. The cache is usually just an adapter that wraps a Store type. I maintain a separate schema and driver for each storage system because I have found that this is actually faster and easier than trying to make generic SQL drivers, for instance.)
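
    As a rough sketch of that layering in Go (the Note type and method names below are made up for illustration; they are not from the commenter's codebase):

      package store

      import "context"

      type Note struct {
          ID   int64
          Body string
      }

      // Store is the application-specific interface; it does not assume
      // that an SQL database (or any database at all) sits underneath.
      type Store interface {
          AddNote(ctx context.Context, n Note) (int64, error)
          GetNote(ctx context.Context, id int64) (Note, error)
      }

      // CachingStore is the caching layer: just an adapter wrapping any Store.
      type CachingStore struct {
          Store
          notes map[int64]Note
      }

      func (c *CachingStore) GetNote(ctx context.Context, id int64) (Note, error) {
          if n, ok := c.notes[id]; ok {
              return n, nil
          }
          n, err := c.Store.GetNote(ctx, id)
          if err == nil {
              if c.notes == nil {
                  c.notes = make(map[int64]Note)
              }
              c.notes[id] = n
          }
          return n, err
      }

    Each storage backend (PostgreSQL, SQLite, flat files, ...) then lives in its own package implementing Store, and a single shared test suite that takes a Store can be run against every implementation, with or without the caching adapter in front.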

    However, I always keep the SQLite support, and it is usually the default when you start up the application without explicitly specifying a database. This means that it is easy for other developers to do ad-hoc experiments or even create integration tests without having to fire up a database, which, even when you are able to do it quickly, still takes time and effort. In production you usually want to point to a PostgreSQL (or other) database. Usually, but not always.

    I also use it extensively in unit tests (often creating and destroying in-memory databases hundreds of times during just a couple of seconds of tests). I run all my tests on every build while developing, and then speed matters a lot. When testing with PostgreSQL I usually set a build tag that specifies that I want to run the tests against PostgreSQL as well. I always want to run all the database tests - I don't always need to run them against PostgreSQL.
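
    A minimal sketch of that build-tag arrangement, with hypothetical names (storetest.TestStore, newPostgresStore, and the import path are illustrative, not part of any of the projects above):

      //go:build postgres

      package store_test

      import (
          "testing"

          "example.com/app/store/storetest" // hypothetical shared test-suite package
      )

      // This file only compiles when the tests are run with -tags postgres:
      //
      //   go test ./...                 # fast, in-memory SQLite only
      //   go test -tags postgres ./...  # SQLite plus PostgreSQL
      func TestPostgresStore(t *testing.T) {
          // newPostgresStore is a hypothetical helper that provisions a
          // per-test database (for example via something like drydock).
          s := newPostgresStore(t)
          storetest.TestStore(t, s) // the same suite every driver runs
      }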

    (Actually, I made a quick hack called Drydock which takes care of creating a PostgreSQL instance and creating one database per test. This is experimental, but I've gotten a lot of use out of it: https://github.com/borud/drydock)

    The reason I do this is that it results in much quicker turnaround during the initial phase when the data model may go through several complete rewrites. The lack of friction is significant.

    SQLite has actually surprised me. I use it in a project where I routinely have tens of millions of rows in the biggest table. And it still performs well enough at well north of 100M rows. I wouldn't recommend it in production, but for a surprising number of systems you could if you wanted to.

    The transpiled SQLite is very interesting to me for two reasons. It makes cross-compiling a lot less complex. I make extensive use of Go and SQLite on embedded ARM platforms, and then you have to choose between compiling on the target platform and messing around with C libraries. It also eliminates the need to do two-stage Docker builds (which cuts down building Docker images from 50+ seconds to perhaps 4-5 seconds).
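
    For reference, the transpiled driver discussed in this thread is modernc.org/sqlite; a minimal usage sketch (the query and in-memory DSN are just for illustration):

      package main

      import (
          "database/sql"
          "fmt"
          "log"

          _ "modernc.org/sqlite" // pure-Go, CGo-free SQLite driver
      )

      func main() {
          // Note the driver name "sqlite"; the CGo-based mattn/go-sqlite3
          // registers itself as "sqlite3" instead.
          db, err := sql.Open("sqlite", ":memory:")
          if err != nil {
              log.Fatal(err)
          }
          defer db.Close()

          var version string
          if err := db.QueryRow("select sqlite_version()").Scan(&version); err != nil {
              log.Fatal(err)
          }
          fmt.Println("SQLite", version)
      }

    Because no CGo is involved, a plain cross-compile such as CGO_ENABLED=0 GOOS=linux GOARCH=arm64 go build needs no C toolchain for the target, which is presumably what removes the need for the two-stage Docker builds described above.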

    The transpiled version is slower by quite a lot. I haven't done a systematic benchmark, but I noticed that a server that stores 30-40 datapoints per second went from 0.5% average CPU load to about 2% average CPU load. I'm not terribly worried about it, but it does mean that when I increase the influx of data I'm most likely going to hit a wall sooner.

    I'll be using the transpiled SQLite a lot more in the coming year, and I'll be on the Gophers Slack, so if anyone is interested in sharing experiences or discussing SQLite in Go, please don't be shy.

  • Exiting the Vietnam of Programming: Our Journey in Dropping the ORM (In Golang)
    7 projects | news.ycombinator.com | 26 Nov 2021
    This isn't new. A lot of applications and libraries do this. And I think it is a good way to design things.

    Usually the database I use to develop a SQL schema is SQLite3, since it allows for really nice testing. Then I add PostgreSQL support (which requires a more involved testing setup, but I have a library that makes this somewhat easier: https://github.com/borud/drydock). (SQLite being in C is a bit of a problem since it means I can't get a purely statically linked binary on all platforms - at least I haven't found a way to do that except on Linux. So if anyone has some opinions on alternatives in pure Go, I'm all ears.)

    In the Java days, with JDBC, every single method implementing some operation would be a lot of boilerplate. JDBC wasn't a very good API. But in Go that is much less of a problem, in part because you have struct tags and libraries like sqlx. To that I also add some helper functions to deal with result/error combos. It turns out the majority of my interactions with SQL databases can be carried out in 1-3 lines of code - with a surprising number of cases just being a one-liner. (The performance hit from using sqlx is in most cases so minimal it doesn't matter. If it matters to you: use sqlx when modeling and evolving the persistence, and then optimize it out if you must. I think I've done that just once in about 100 kLOC worth of code written over the last few years.)
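
    A small sketch of that sqlx style (the User type and the users table are invented for illustration):

      package main

      import (
          "log"

          "github.com/jmoiron/sqlx"
          _ "modernc.org/sqlite"
      )

      type User struct {
          ID   int64  `db:"id"`
          Name string `db:"name"`
      }

      func main() {
          db, err := sqlx.Connect("sqlite", ":memory:")
          if err != nil {
              log.Fatal(err)
          }
          db.MustExec(`CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)`)
          db.MustExec(`INSERT INTO users (name) VALUES (?)`, "alice")

          // With struct tags, most lookups really are one- or two-liners.
          var u User
          if err := db.Get(&u, `SELECT id, name FROM users WHERE name = ?`, "alice"); err != nil {
              log.Fatal(err)
          }

          var all []User
          if err := db.Select(&all, `SELECT id, name FROM users`); err != nil {
              log.Fatal(err)
          }
          log.Printf("got %+v (one of %d users)", u, len(all))
      }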

    And best of all: I get to deal with the database as a database. I write SQL DDL statements to define the schema, and SQL to perform the transactions. I don't have to pretend it is an object model, so I can make full use of the SQL. (Well, actually, I try to make do as far as possible with trivial SQL, but that's a whole different discussion). The interface type takes care of exposing the persistence in a way that fits the application.

    (Another thing I've started experimenting with a bit is returning channels or objects containing channels instead of arrays of things. But there is still some experimenting that needs to be done to find a pleasing design.)
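
    One possible shape for that channel-returning idea (again with a hypothetical Note type and an SQLStore holding a *sql.DB; error handling kept minimal):

      func (s *SQLStore) StreamNotes(ctx context.Context) (<-chan Note, <-chan error) {
          out := make(chan Note)
          errc := make(chan error, 1)
          go func() {
              defer close(out)
              defer close(errc)
              rows, err := s.db.QueryContext(ctx, `SELECT id, body FROM notes`)
              if err != nil {
                  errc <- err
                  return
              }
              defer rows.Close()
              for rows.Next() {
                  var n Note
                  if err := rows.Scan(&n.ID, &n.Body); err != nil {
                      errc <- err
                      return
                  }
                  select {
                  case out <- n:
                  case <-ctx.Done():
                      errc <- ctx.Err()
                      return
                  }
              }
              errc <- rows.Err()
          }()
          return out, errc
      }

    The caller ranges over the first channel and checks the second once it closes, instead of waiting for a fully materialized slice.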

  • Show HN: Idea for unit testing with PostgreSQL in Go
    1 project | news.ycombinator.com | 10 Feb 2021

What are some alternatives?

When comparing tcl and drydock you can also consider the following projects:

tk

sqinn - SQLite over stdin/stdout

pure-data - Pure Data - a free real-time computer music system

xgo - Go CGO cross compiler

x11

sqlite - work in progress

cppwin32 - A modern C++ projection for the Win32 SDK

framework - PHP Framework providing ActiveRecord models and out of the box CRUD controllers with versioning and ORM support

zigwin32 - Zig bindings for Win32 generated by https://github.com/marlersoft/zigwin32gen

zeidon-joe - Zeidon Java Object Engine and related projects.

sqlite

pggen - Generate type-safe Go for any Postgres query. If Postgres can run the query, pggen can generate code for it.