Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).
> not slowing one's work
Put back into context, your reply makes sense, as these popular libraries are pretty battle tested. Having said that, it is a valid point that type hints being voluntary means they can only be relied upon with disciplined developers and for code you control. Of course, the same point could be made for any code you can't control, especially if the library is written in a weakly typed language like C (or JS).
> I just don't see its absence as the crippling dealbreaker
My genuine question would be: what does dynamic typing offer over static typing? Reduced verbosity would be my expectation, but that only really seems to apply without type inference. The other advantage often mentioned is that it's faster to iterate. Neither of these seems particularly compelling to me, but I'm probably biased, as I've spent all of my career working with static typing, aside from a few projects in Python and JS.
> if there are languages besides JS that you feel get their type system "just right", I'd be curious as to what they are
This is use case dependent, of course. Personally I get on well with Nim's (https://nim-lang.org/) type system: https://nim-lang.org/docs/manual.html#types. It's certainly not perfect, but it lets me write code with a similar 'pseudocode' feel to Python and gets out of my way, whilst being compile time bound and very strict (the C-like run time performance doesn't hurt, either). It can be written much as you'd write type hinted Python, but its strictness is sensible.
For example, you can write `var a = 1.5; a += 1` because `1` can be implicitly converted to a float here, but `var a = 1; a += 1.5` will not compile because int and float aren't directly compatible - you'd need an explicit conversion: `a += int(1.5)`, which makes it obvious something weird is happening.
Similarly `let a = 1; let b: uint = a` will not compile because `int` and `uint` aren't compatible (you'd need to use `uint(a)`). You can however write `let b: uint = 1` as the type can be implicitly converted. You can see/play with this online here: https://play.nim-lang.org/#ix=3MRD
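Putting both examples together, a runnable sketch of these conversion rules (variable names are my own):

```nim
var a = 1.5
a += 1             # OK: the literal `1` is implicitly converted to float
doAssert a == 2.5

var b = 1
# b += 1.5         # would not compile: int and float aren't compatible
b += int(1.5)      # explicit conversion (truncates to 1)
doAssert b == 2

let c: uint = 1    # OK: the literal can be implicitly converted
# let d: uint = b  # would not compile: int and uint aren't compatible
let d: uint = uint(b)
doAssert c == 1'u and d == 2'u
```

The commented-out lines are the ones that would be rejected at compile time.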
This kind of strict typing can save a lot of head scratching if you're doing low level work, but it also just validates that what you're doing is sensible, without the cognitive overhead or syntactic noise of something like Rust (Nim uses implicit lifetimes for performance and threading, rather than as a constraint).
Compared to Python, Nim won't let you silently overwrite things by redefining them, and raises a compile time error if two functions with the same name ambiguously use the same types. However, it has function overloading based on types, which helps in writing statically checked APIs that are type driven rather than name driven.
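As a small sketch of that type-driven overloading (the `describe` name is my own toy example):

```nim
proc describe(x: int): string = "int: " & $x
proc describe(x: float): string = "float: " & $x

# The compiler selects the overload from the argument's static type.
doAssert describe(1) == "int: 1"
doAssert describe(1.5) == "float: 1.5"

# Declaring `proc describe(x: int)` again in the same scope would be a
# compile time redefinition error, not a silent overwrite.
```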
One of my favourite features is distinct types, which allow you to model different things that are all the same underlying type:
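A minimal sketch, with `Dollars`/`Euros` as my own illustrative names:

```nim
type
  Dollars = distinct float
  Euros = distinct float

# Borrow `+` from the underlying float, but only for matching types.
proc `+`(a, b: Dollars): Dollars {.borrow.}

var total = Dollars(10.0)
total = total + Dollars(5.0)
# total = total + Euros(5.0)  # would not compile: the types don't mix
doAssert float(total) == 15.0
```

Both types are plain floats at run time, yet the compiler refuses to mix them up.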
> I think the message is more nuanced
I thought it was more nuanced too, as they were explaining how integer types can be derived; but by the time I finished the article, they really did just seem to be complaining that there's a mismatch between compile time and run time.
Dynamic types don't really solve the problems they mention as far as I can tell either (perhaps I am misunderstanding), they just don't provide any guarantees at all and so "work" in the loosest sense.
> otherwise wouldn't lisp with its homoiconicity and compile time macros fit the bill perfectly?
That's a good point, I do wonder why they didn't mention Lisp at all.
> we don't have a solution yet
What they want to do can, as far as I can see, be implemented in Nim easily in a standard, imperative form, without any declarative shenanigans. Indeed, it is implemented here: https://github.com/nim-lang/Nim/blob/ce44cf03cc4a78741c423b2...
Of course, that implementation is more complex than the one in the article because it handles a lot more.
At the end of the day, it's really a capability mismatch at the language level and the author even states this:
> Programming languages ought to be rethought.
I'd argue that Nim has been 'rethought' specifically to address the issues they mention. The language was built with extension in mind, and whilst the author states that macros are a bad thing, I get the impression this is because most languages implement them as tacked on substitution mechanisms (Rust/D), and/or are declarative rather than "simple" imperative processes. IMHO, most people want to write general code for compile time work (like Zig), not learn a new sub-language. The author states this as well.
Nim has a VM for running the language at compile time, so you can do whatever you want, including the recursive type decomposition (for example: https://github.com/status-im/nim-stint). It also has 'real' macros that aren't substitutions but work directly on the core AST and can inspect types at compile time, and it is a systems language that is also high level. It seems to solve their problems, but of course, they simply might not have used or even heard of it.
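As a rough sketch of both features - ordinary imperative code evaluated by the compile time VM, and an AST-level macro (`argCount` is my own toy example):

```nim
import std/macros

# Ordinary imperative code, evaluated entirely at compile time by the VM.
const fib10 = block:
  var (a, b) = (0, 1)
  for _ in 1 .. 10:
    (a, b) = (b, a + b)
  a
doAssert fib10 == 55

# A 'real' macro: it receives its arguments as AST nodes, not as text,
# so it can inspect and count them before any code is generated.
macro argCount(args: varargs[untyped]): int =
  newLit(args.len)
doAssert argCount(1, "two", 3.0) == 3
```

No separate template sub-language is needed: the `const` block is the same Nim you'd write at run time.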
- It allows you to express abstractions in pure code, rather than in documents or comments.
The first advantage requires little explanation. The second I think is more difficult to appreciate without enough experience with typed systems and an interface oriented programming style.
How important these two things are for a given person is difficult to say. I have come to the conclusion that it relates a lot to how your brain works. Unfortunately, mine can only track 3 or 4 things at the same time, so I need to abstract away everything other than the little piece of code I'm working on and get the compiler to worry about how things glue together.
I need the comfort of getting the compiler to help me change hats and be able to think about nothing other than the concrete piece of code I'm working on and the interfaces of the other parts it uses. When I don't have this possibility, I miss it even more than the comfort of programming with the immutable data structures that Clojure spoilt me with. I think I need to seriously try Scala 3 at some point, since it seems to combine immutable data structures by default with an advanced type system (although the lens libraries I've seen look like an abomination compared with Clojure's update-in and company, not to mention the absolute elegance and triviality of the latter's implementation: https://github.com/clojure/clojure/blob/clojure-1.10.1/src/c...).
So I would second your recommendation and encourage people trapped in dynamic language echo chambers to try something like TypeScript or Kotlin for a year, to appreciate other ways of doing things in languages with practical type systems. Perhaps some of them will discover that it suits them better, or it will help them better understand why they prefer the dynamic style.
> Python's behavior (though correct to spec) is arguably worse
Another gotcha I hit in Python is the scoping of for loops, e.g., https://stackoverflow.com/questions/3611760/scoping-in-pytho...
Python takes a very non-obvious position on this from my perspective.
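For contrast, a sketch of Nim's behaviour, where the loop variable is scoped to the loop body rather than leaking into the surrounding scope:

```nim
var last = 0
for i in 1 .. 3:
  last = i          # `i` exists only inside the loop body
# echo i            # would not compile: undeclared identifier `i`
doAssert last == 3
```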
Ultimately, all these things are about the balance of correctness versus productivity.
I don't want to be writing types everywhere when it's "obvious" to me what's going on, yet I want my idea of obvious confirmed by the language. At the other end of the scale I don't want to have to annotate the lifetime of every bit of memory to formally prove some single use script. The vast majority of the time a GC is fine, but there are times I want to manually manage things without it being a huge burden.
Python makes a few choices that seem to be good for productivity but end up making things more complicated as projects grow. For me, being able to redefine variables in the same scope is an example of ease of use at the cost of clarity. Another is having to be careful of not only what you import but also the order you import it in, as rather than raising an ambiguity error the language just silently overwrites function definitions.
For everything else, I can directly use Python from Nim and vice versa with Nimpy: https://github.com/yglukhov/nimpy. This is particularly useful if you have some slow Python code bottlenecking production, since the similar syntax makes it relatively straightforward to port over and use the resultant compiled executable within the larger Python code base.
Perhaps ironically, as it stands the most compelling reason not to use Nim isn't technical: it's that it's not a well known language yet, so it can be a hard sell to employers who want a) to hire developers with experience from a large pool, and b) to know that a language is well supported and tested. Luckily, it's fairly quick to onboard people thanks to the familiar syntax, and the multiple compile targets let it use the C/C++/Python ecosystems natively. Arguably the smaller community also means companies can have more influence over the language's direction. Still, this is, in my experience, a not insignificant issue, at least for the time being.
Are there nim users?
3 projects | reddit.com/r/Python | 16 Sep 2022
daily report for Nim language
2 projects | dev.to | 16 Jan 2022
NimConf 2021 (Sat, June 26th)
2 projects | news.ycombinator.com | 23 Jun 2021
Pattern Matching in Nim
3 projects | news.ycombinator.com | 11 Mar 2021
Learn to program with Nim (Aprende a programar con Nim)
3 projects | dev.to | 11 Mar 2021