Feels appropriate to say that Hylang is in version 1.0 alpha:
https://github.com/hylang/hy/releases/tag/1.0a4
That's Lisp over Python -- you can do data science properly in Lisp.
I repeat: you can do data science properly in Lisp now.
Maybe things have changed over the years. I played around with Hy in 2017 and in the end felt that it combined the bad parts of Python with the bad parts of a Lisp.
If you're forced to use Python, it's usually for one of two reasons: either you need to work with colleagues on something (in which case they almost certainly wouldn't accept a lispified version of it), or you need some of the libraries that are already implemented.
In the latter case, you might say Hy is a good option. But why bind yourself to the slowness of Python when you could just as well use some fast Lisp (e.g. Common Lisp) that interacts with a Python interpreter similar to nimpy [0] from Nim? Not sure if such a thing already exists for CL, but it should be quite feasible to write and then you get access to all your Python library needs while actually using a better language for the rest of your code.
But I'm all ears on the real advantages Hy provides aside from being neat.
[0]: https://github.com/yglukhov/nimpy
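For context, bridges of this kind (py4cl works this way too) typically launch a Python subprocess and exchange messages over its standard streams: the Lisp side writes an expression, the Python side evaluates it and writes the result back. Here is a minimal sketch of the Python half of such a protocol; the JSON message format and the `serve` function are my own invention for illustration, not py4cl's actual wire format:

```python
import json
import sys

def serve(infile=sys.stdin, outfile=sys.stdout):
    """Read one JSON request per line, evaluate it, reply with JSON.

    Request:  {"expr": "1 + 2"}          -- a Python expression as text
    Response: {"ok": true, "value": 3}   -- or {"ok": false, "error": "..."}
    The host Lisp process would write requests and read responses.
    """
    env = {}  # persistent environment shared across requests
    for line in infile:
        try:
            request = json.loads(line)
            value = eval(request["expr"], env)  # trusted host process only
            reply = {"ok": True, "value": value}
        except Exception as exc:
            reply = {"ok": False, "error": str(exc)}
        outfile.write(json.dumps(reply) + "\n")
        outfile.flush()
```

Most of the real complexity in py4cl and friends lives in what this sketch skips: values that aren't serializable, which have to be kept on the Python side and referenced through proxy handles.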
Janet [1] is a clean, modern Lisp.
Clojure is also an option, but of course that comes with a heavy JVM burden.
[1] https://janet-lang.org
> Not sure if such a thing already exists for CL
A couple of solutions exist for this:
https://github.com/bendudson/py4cl
https://github.com/pinterface/burgled-batteries
> (but I don’t buy the argument that “real” macros are only able in s-exprs)
The Rhombus project in Racket^1 is an attempt to fix a lot of things about Racket and s-exprs, but one of the goals is creating a syntax that can be manipulated as easily as s-exprs.
> By “macro-extensible,” we mean that Rhombus users should be able to extend the syntax of Rhombus in much the same way that Racket users can extend the syntax of Racket. Complex syntactic extensions such as object and class systems, static type checkers, and pattern matching should be implemented as libraries while still providing a surface syntax familiar to users of these features in other languages.
1. https://github.com/racket/rhombus-prototype/blob/master/reso...
Julia [1] has macros (not necessarily based on s-exprs though you can use s-exprs if you want) and a REPL [2] with some decent tooling (e.g., [3] [4])
[1]: http://julialang.org/
Shining light on another: https://github.com/lem-project/async-process
> What about the interactive environment like the one in Common Lisp?
Never used a "real" Lisp REPL, but Conjure[0] seems like it ticks a lot of boxes.
> Is this one going to remain obscure?
There are only a ~dozen mainstream languages. Once you get outside of the popular zeitgeist, you have to appreciate exactly what you are getting. It takes a lot of dedication to become fluent in a language/ecosystem, so it is no surprise that people are reluctant to switch to a novel platform. Clojure is far more likely to ever become mainstream (owing to the huge JVM ecosystem), but even that seems to have only limited industry penetration.
[0] https://github.com/Olical/conjure
Clojure doesn't need the JVM. You can run it with GraalVM either directly [1] or via Babashka [2]. Both are great options for getting into Clojure, since there's almost none of the JVM's startup time and you don't have to take on the learning curve and heaviness of the JVM stack.
[1] https://github.com/oracle/graal
[2] https://github.com/babashka/babashka
> One of my favorite examples is the regex library cl-ppcre. Thanks to the nature of Lisp, the recognizer for each regex you create can be compiled to native code on compiler implementations of CL.
That is not true: cl-ppcre generates a chain of closures. Empirically, its performance is in the same ballpark as typical "bytecode"-interpreting regex implementations.
(Disclosure: I wrote another regex library at <https://github.com/telekons/one-more-re-nightmare>, which does do native code compilation.)
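To make the distinction concrete, here is a toy illustration (in Python, for brevity) of the closure-chaining strategy described above: each pattern element compiles to a closure that delegates to the next one, so "compiling" the regex means building a chain of function objects rather than emitting native code. This is my own simplified sketch, supporting only literal characters and the `.` wildcard, not cl-ppcre's actual internals:

```python
def compile_pattern(pattern):
    """Compile a tiny regex (literal chars and '.' only) into a chain
    of matcher closures; the chain ends with an end-of-input check."""
    def accept_end(text, pos):
        return pos == len(text)

    matcher = accept_end
    # Build the chain back to front so each closure captures its successor.
    for ch in reversed(pattern):
        def make(ch, nxt):
            if ch == ".":
                return lambda text, pos: pos < len(text) and nxt(text, pos + 1)
            return lambda text, pos: (
                pos < len(text) and text[pos] == ch and nxt(text, pos + 1)
            )
        matcher = make(ch, matcher)

    return lambda text: matcher(text, 0)

m = compile_pattern("a.c")
# m("abc") and m("axc") are truthy; m("abd") and m("ab") are falsy
```

The point of the comparison: each closure here is still interpreted by the host runtime, one call at a time, which is why this approach lands in the same performance ballpark as a bytecode interpreter rather than natively compiled code.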
Depends on what you mean.
If you just mean writing an implementation of Emacs (or something Emacs-like) in Common Lisp, that's not very hard, and it's been done a few times. See, for example, Lem[1] and Hemlock[2].
Heck, I wrote one for Mac OS X around 2001 or 2002.
If you mean a drop-in replacement for GNU Emacs, that's a lot harder. Besides the UI and the editing infrastructure, you need to write a bug-compatible implementation of GNU's elisp, or you lose the whole GNU Emacs ecosystem. That ecosystem is most of its practical appeal. That's a whole bunch of work.
[1] https://github.com/40ants/lem-pareto/blob/master/lem-pareto-...
It's not easy to get the point across just by reading about it. I'd honestly say you should just learn some Lisp and implement something cool.
> I don't see why this would be any different than defining a function like I normally do.
Because with functions you don't define new syntax or implement a DSL, and you don't have access to the compiler at read, compile, and run time. Macros let you extend the language easily: do you miss list comprehensions from Python? Add them yourself with a macro. Want to generate boilerplate code? Macro. A Prolog compiler? Yes. Basically anything in this book: https://github.com/norvig/paip-lisp
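A rough analogy for what this means, in Python only because it's the thread's common denominator (in Hy or Common Lisp this machinery is built in): represent code as nested lists, and a "macro" is just an ordinary function that rewrites one list into another before evaluation. The `unless` example and the tiny evaluator below are my own illustration, not any particular Lisp's implementation:

```python
def expand_unless(form):
    """A 'macro': rewrite (unless test body) into (if (not test) body).
    It runs on the code-as-data representation, before evaluation."""
    _, test, body = form
    return ["if", ["not", test], body]

def evaluate(form, env):
    """A tiny evaluator for s-expression-style nested lists.
    Strings are variable references; other atoms are literals."""
    if isinstance(form, str):
        return env[form]
    if not isinstance(form, list):
        return form
    head = form[0]
    if head == "unless":                 # macro expansion happens first
        return evaluate(expand_unless(form), env)
    if head == "if":
        _, test, body = form
        return evaluate(body, env) if evaluate(test, env) else None
    if head == "not":
        return not evaluate(form[1], env)
    if head == "+":
        return evaluate(form[1], env) + evaluate(form[2], env)
    raise ValueError(f"unknown form: {head}")

# (unless false (+ 1 2)) expands to (if (not false) (+ 1 2)) and yields 3
```

The key property is that `expand_unless` manipulates code as plain data before anything runs, which is exactly what a function called at run time cannot do.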
> But why is this more productive than simply defining a "exponent()" function or something similar?
It's not just about defining a new operator; you can basically implement a DSL with C-like syntax if you want to: https://github.com/y2q-actionman/with-c-syntax