oorb vs Transformers.jl
| | oorb | Transformers.jl |
|---|---|---|
| Mentions | 3 | 7 |
| Stars | 54 | 505 |
| Growth | - | - |
| Activity | 5.3 | 5.4 |
| Latest Commit | about 2 months ago | 1 day ago |
| Language | Fortran | Julia |
| License | GNU General Public License v3.0 only | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
oorb
- Show HN: OpenOrb, a curated search engine for Atom and RSS feeds
Imagine my surprise to see OpenOrb, the standard open source software package for orbit determination and minor planet propagation, on the front page of HackerNews. It's interesting software with a beautiful theoretical basis in Bayesian statistics, and a gnarly Fortran codebase - I can’t wait to see the discussion!
Oh.
It’s one thing to land near a name in use. It is quite another to take it directly!
https://github.com/oorb/oorb
- Julia 1.10 Released
- NASA’s Double Asteroid Redirection Test Is a Smashing Success
Mostly Python and Fortran. See for example https://github.com/oorb/oorb.
The hardest problems are always the social ones. How do you get uptake of a new method, how do you get funding, how do you politely tell a collaboration they are doing the wrong thing, etc.
But if you mean pure technical stuff - the hardest problem I had to solve was rethinking some of the inner loops of the THOR algorithm. The problem was essentially to speed up a Hough transform in 6D space. Lots of time spent profiling CPU cache timings to get that fast.
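For readers who haven't met the technique, the sketch below is the classic 2-D line-finding Hough transform written in Julia. It is purely illustrative and not the actual THOR code; the point is the accumulator-voting inner loop, whose memory-access pattern is exactly the kind of thing one ends up profiling for cache behaviour, and which gets much harder in six dimensions.

```julia
# Illustrative 2-D Hough transform: vote each point into a discretised
# (ρ, θ) parameter space and look for peaks. A 6-D version has the same
# structure, only with a vastly larger accumulator.
function hough_votes(points; nθ = 180, nρ = 200, ρmax = 2.0)
    acc = zeros(Int, nρ, nθ)                 # accumulator over (ρ, θ) bins
    θs  = range(0, π; length = nθ)
    for (x, y) in points, (j, θ) in enumerate(θs)
        ρ = x * cos(θ) + y * sin(θ)          # signed distance for this angle
        i = clamp(round(Int, (ρ + ρmax) / (2ρmax) * (nρ - 1)) + 1, 1, nρ)
        acc[i, j] += 1                       # the cache-sensitive write
    end
    return acc
end

votes = hough_votes([(rand(), rand()) for _ in 1:1_000])
```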
Transformers.jl
- Julia 1.10 Released
Flux is quite a nice lower-level library:
https://github.com/FluxML/Flux.jl
On top of that there are many higher-level libraries, such as Transformers.jl
https://github.com/chengchingwen/Transformers.jl
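As a taste of what the lower-level library looks like, here is a minimal Flux model (a sketch assuming a recent Flux release; older versions spell the layer constructor `Dense(10, 32, relu)`):

```julia
using Flux

# A small multilayer perceptron built from Flux's composable layers.
model = Chain(
    Dense(10 => 32, relu),   # 10 inputs -> 32 hidden units
    Dense(32 => 2),          # 32 hidden units -> 2 outputs
    softmax,                 # normalise the outputs
)

x = rand(Float32, 10)        # one sample
y = model(x)                 # forward pass
```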
- How is Julia Performance with GPUs (for LLMs)?
- Load a transformer model with julia
Check out Transformers.jl. It’s a library that implements transformer-based models in Julia using Flux.jl. They have support for some of the Hugging Face transformers.
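A minimal loading sketch, assuming the `hgf""` string macro from Transformers.HuggingFace described in the package README (the exact names and encoder API have changed between releases, so treat this as indicative rather than exact):

```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace   # provides the hgf"" string macro

# Download a pretrained tokenizer and model from the Hugging Face hub.
textenc = hgf"bert-base-uncased:tokenizer"
model   = hgf"bert-base-uncased:model"

# Encode a sentence and run the forward pass.
input  = encode(textenc, "Transformers.jl runs BERT in pure Julia.")
output = model(input)
```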
- Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
https://github.com/chengchingwen/Transformers.jl, but I have no personal experience with it.
All of this is built by the community, and your mileage may vary.
In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations, whereas, for example, PyTorch and TensorFlow each contain separate NumPy derivatives. One could say that you can write an ML framework in Julia, instead of writing a DSL in Python as part of your C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
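A hedged sketch of that last point: a custom layer is just a callable Julia struct, and the same code can be moved to the GPU (the layer name and sizes here are made up for illustration):

```julia
using Flux

# A custom layer is an ordinary callable struct; Flux.@functor tells Flux
# which fields hold trainable parameters.
struct Scale{T}
    w::T
end
Scale(n::Integer) = Scale(ones(Float32, n))
(s::Scale)(x) = s.w .* x
Flux.@functor Scale

# It composes with the built-in layers...
model = Chain(Dense(10 => 32, relu), Scale(32), Dense(32 => 2))
y = model(rand(Float32, 10))

# ...and with CUDA.jl loaded, `model |> gpu` runs the same Julia code on
# the GPU; no hand-written C++/CUDA kernel is required.
```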
- Help on Differentiable Programming
I think you might have some luck looking at a transformers implementation in Flux, e.g. https://github.com/chengchingwen/Transformers.jl/tree/master/src/basic
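To see why transformers fit naturally into differentiable programming, a minimal scaled dot-product attention function (a sketch, not code from Transformers.jl) is enough to differentiate through:

```julia
using Flux   # provides softmax (from NNlib) and gradient (from Zygote)

# Scaled dot-product attention as plain Julia array code; Zygote can take
# gradients through it without any framework-specific layer definition.
function attention(Q, K, V)
    d = size(K, 1)
    scores = (K' * Q) ./ sqrt(Float32(d))   # (seq_k, seq_q) similarities
    A = softmax(scores; dims = 1)           # normalise over the keys
    return V * A                            # weighted sum of the values
end

Q = rand(Float32, 8, 5)    # (d_k, seq_q)
K = rand(Float32, 8, 6)    # (d_k, seq_k)
V = rand(Float32, 16, 6)   # (d_v, seq_k)

grads = gradient(q -> sum(attention(q, K, V)), Q)
```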
- Fastai.jl: Fastai for Julia
Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
- Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
If NLP primitives are all that's keeping you from testing the waters, have a look at https://github.com/chengchingwen/Transformers.jl.