Julia 1.9 Highlights

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • llvm-project

    Fork of https://github.com/llvm/llvm-project (by JuliaLang)

    I'm not aware of bugs with offset arrays in the standard library. It's happened before and it may happen again, but Base and the standard library are generally very good at avoiding that.

    The main problem is non-standard-library packages that were written back in the early Julia days before OffsetArrays existed (e.g. a big offender IIRC was StatsBase.jl), and so weren't written with any awareness of how to deal with generic indexing.

    OffsetArrays.jl is a neat trick, and sometimes it really is useful, e.g. when mimicking some code that was written in a 0-based language, or just when you're working with array offsets a lot, but I wouldn't really recommend using them everywhere. Other non-array indexable types like Tuple don't have 0-based counterparts (as far as I'm aware), so you'll still be jumping back and forth between 0-based and 1-based indexing, and it's just an extra layer of mental load.

    Honestly though, it's often not necessary to talk about array indices at all. The preferred pattern is just to use `for i in eachindex(A)`, `A[begin]`, `A[end]`, etc., as in the sketch below.
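    A minimal sketch of that pattern (the function names here are just for illustration, not from the thread): generic code iterates whatever indices the array actually has, so it works for plain Arrays and OffsetArrays alike.

    ```julia
    # Sum over whatever index range A actually uses (1-based or offset).
    function total(A::AbstractArray)
        s = zero(eltype(A))
        for i in eachindex(A)   # the array's own indices, not 1:length(A)
            s += A[i]
        end
        return s
    end

    # First and last elements regardless of where the indices start.
    first_last(A) = (A[begin], A[end])
    ```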

    > and IIRC also the language build depends on a fork of LLVM (https://github.com/JuliaLang/llvm-project)

    Yes, we use a fork of LLVM, but not because we're really changing its functionality, just because we carry patches for bugs. The bugs are typically reported upstream and our patches are contributed back, but the feedback loop is slow enough that it's easiest to just maintain our own patched fork. We do keep it updated though (this release brings us up to v14), and there shouldn't be any divergences from upstream other than the bugfixes, as far as I'm aware.

  • PlotDocs.jl

    Documentation for Plots.jl

    https://docs.juliaplots.org/stable/

    3. See https://juliaacademy.com

    Another alternative environment is Pluto notebooks. They're reactive like a spreadsheet and easy to use in your browser.

    https://featured.plutojl.org/

    I have several users without much coding experience who use Pluto notebooks just to generate plots from CSV files. They find the combination of a web-based interface, reactive UI, and fast execution easier to use than a MATLAB Live Script.
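    A rough sketch of what such a notebook might contain, assuming CSV.jl, DataFrames.jl, and Plots.jl (those package names and the file/column names are my assumptions, not from the comment); each expression would live in its own Pluto cell, and cells that reference `df` re-run automatically when it changes:

    ```julia
    using CSV, DataFrames, Plots

    # Cell: load the CSV into a DataFrame (hypothetical file name).
    df = CSV.read("measurements.csv", DataFrame)

    # Cell: plot one column against another (hypothetical column names).
    plot(df.time, df.value; xlabel = "time", ylabel = "value", legend = false)
    ```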

  • Pkg.jl

    Pkg - Package manager for the Julia programming language

    There was a "bug" (or just an unhandled caching case) affecting the Pluto notebook system that required precompilation on every launch. This is because Pluto notebooks keep a manifest (so they always instantiate with the same packages every time for full reproducibility), and instantiating that manifest triggered not just package installation but also precompilation. That was fixed in https://github.com/JuliaLang/Pkg.jl/pull/3378, with a larger discussion in https://discourse.julialang.org/t/first-pluto-notebook-launc.... That should largely remove this issue, as the fix is included in the v1.9 release (it was first in v1.9-RC2 IIRC).
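    For context, a hedged sketch of what that manifest-driven startup amounts to in Pkg terms (the path here is hypothetical):

    ```julia
    using Pkg

    # A Pluto notebook activates its own environment and installs exactly the
    # versions pinned in its Manifest.toml, for full reproducibility.
    Pkg.activate("/path/to/notebook/env")   # hypothetical path
    Pkg.instantiate()                       # fetch the exact versions from the manifest
    Pkg.precompile()                        # before the fix above, this work was redone on every launch
    ```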

  • julia

    The Julia Programming Language

    My favorite change (even though it's not listed in the changelog) is that just-in-time compiled code now has frame pointers [1], making Julia code much more debuggable. Profilers, debuggers, etc. can all now work out of the box.

    Extra excited that the project I happen to work on (the Parca open source project [2]) influenced this change [3][4]. Shout out to Valentin Churavy for driving this on the Julia front!

    [1] https://github.com/JuliaLang/julia/commit/06d4cf072db24ca6df...

  • parca

    Continuous profiling for analysis of CPU and memory usage, down to the line number and throughout time. Saving infrastructure cost, improving performance, and increasing reliability.

  • parca-demo

    A collection of languages and frameworks profiled by Parca and Parca agent

  • CondaPkg.jl

    Add Conda dependencies to your Julia project

    You can use CondaPkg.jl (https://github.com/cjdoris/CondaPkg.jl) to set up Python dependencies with version control. I haven't played with it too much, but it seemed to work for what I tried.
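    A small sketch of how that can look; the NumPy example and the PythonCall.jl pairing are my assumptions, not from the comment:

    ```julia
    using CondaPkg

    # Record a Conda dependency in the project's CondaPkg.toml so it is
    # resolved into a project-local Conda environment.
    CondaPkg.add("numpy")   # or pin a version: CondaPkg.add("numpy"; version = ">=1.24")

    # CondaPkg is commonly paired with PythonCall.jl to call into that environment.
    using PythonCall
    np = pyimport("numpy")
    np.sum(np.arange(10))
    ```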

  • Pytorch

    Tensors and Dynamic neural networks in Python with strong GPU acceleration

    >I work for AWS; these are the definitions we use, more or less.

    you're just being asinine - we're literally talking about binary code that's never seen by anyone who doesn't compile from source and go digging around in the build dir - how could you possibly call that code "transparent" in any sense of the word? are blob drivers also transparent according to these "AWS" definitions?

    >I was specifically referring to the computation graph of the model that is used by autograd.

    it's literally right there in bolded text on the first page of the original paper (the 2017 neurips paper):

    >Immediate, eager execution. An eager framework runs tensor computations as it encounters them; it avoids ever materializing a “forward graph”, recording only what is necessary to differentiate the computation

    autograd has absolutely nothing to do with the graph - autograd is literally tens of thousands of lines of generated, templatized code that connects edges one op at a time. you can argue with me all you want or you can just go to repo tip and see for yourself https://github.com/pytorch/pytorch/blob/main/aten/src/ATen/t...


