Catlab.jl vs glow

| | Catlab.jl | glow |
|---|---|---|
| Mentions | 4 | 6 |
| Stars | 585 | 3,151 |
| Growth | 0.7% | 1.0% |
| Activity | 9.0 | 8.2 |
| Latest commit | 6 days ago | 5 days ago |
| Language | Julia | C++ |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
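The exact formula behind the activity number is not published, but a weighting of this kind, where recent commits count for more than older ones, is commonly implemented as exponential decay. A minimal sketch in Python, assuming an invented 30-day half-life (both the function and the parameter are illustrative, not the site's actual metric):

```python
from math import exp, log

def activity_score(commit_ages_days, half_life_days=30.0):
    """Sum of per-commit weights; each weight halves every half_life_days."""
    return sum(exp(-age * log(2) / half_life_days) for age in commit_ages_days)

# Five recent commits outscore five old ones, even though the counts match.
recent = activity_score([1, 2, 3, 5, 8])
stale = activity_score([60, 90, 120, 150, 180])
```

Under this scheme two projects with identical commit counts can have very different scores, which is the point of the metric.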
Catlab.jl
-
Data Structures as Topological Spaces (2002) [pdf]
Related to this, AlgebraicJulia has been doing a lot with applying concepts from algebra and category theory to data analysis and modelling.
https://www.algebraicjulia.org/
There are some blog posts that are also interesting:
https://blog.algebraicjulia.org/
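Catlab.jl implements these ideas in Julia; the core mechanism it builds on, composition of typed morphisms, can be sketched in a few lines of Python. This is an illustrative toy, not Catlab's API (all names here are made up):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Morphism:
    """An arrow f: dom -> cod between named objects."""
    name: str
    dom: str
    cod: str

def compose(f: Morphism, g: Morphism) -> Morphism:
    """Compose f: A -> B with g: B -> C into f;g: A -> C, checking types."""
    if f.cod != g.dom:
        raise TypeError(f"cannot compose {f.name} and {g.name}: "
                        f"codomain {f.cod} != domain {g.dom}")
    return Morphism(f"{f.name};{g.name}", f.dom, g.cod)

f = Morphism("f", "A", "B")
g = Morphism("g", "B", "C")
h = compose(f, g)  # h is f;g : A -> C
```

The domain/codomain check is what makes this "data structures as spaces" style of modelling compositional: ill-typed pipelines are rejected at construction time rather than at run time.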
-
Fart Proudly – An Essay by Benjamin Franklin
> Maybe I’m just too bitter about academia in this point in my career but it seems like we’ve run out of things to study and/or have too many people doing it.
We have certainly not run out of things to study, but I think we've hit the limit on what can effectively be communicated through traditional science journals [1], and we need to address the reproducibility crisis through open source science and reconsider the incentive structures around academia [2]. We need to oppose initiatives from people like Bill Gates who wish to privatize science through his various non-profits, as knowledge works better as a commons (we were unable to deal with the pandemic partly because Bill Gates prevented Oxford from open sourcing their work on COVID [3]). We need software that can compose scientific models [4], and organizations that can facilitate greater coordination among scientists. Science will become all the more important in an increasingly uncertain world, but are we up to the task?
[1] https://www.science.org/content/article/frustrated-science-s...
[2] https://numfocus.org/open-source-science-initiative-ossci
[3] https://www.wired.com/story/opinion-the-world-loses-under-bi...
[4] https://www.algebraicjulia.org/
-
Anyone know whether the source for cl-cat: a DSEL for computational category theory is publicly available?
Thank you for replying, but what prevents you from releasing your code? Dr Rydeheard has shared the Standard ML version from his book (and the book). Of course if you don't want to share your code that is your prerogative and that is fine, but I am just trying to understand the issue that is preventing you a little more clearly. My interest in your implementation is strictly one of personal education. With applied category theory becoming more popular and computing implementations often used for teaching purposes (e.g. this book) I would like to see a Lisp implementation. It is built into Haskell, mostly, and people are developing libraries for Idris and Julia. I would find it instructive to see the implementation in Common Lisp. Thank you for taking the time to respond to my original question.
-
From Julia to Rust
The biggest group outside of numerical computing in Julia land is the PL and systems people, though. This includes type theorists [1], database folks [2], distributed systems people ([3] to name just one). There are also a fair number of compiler nuts, hence the existence of multiple projects [4][5] in this space. And this is before getting into things that bridge more than one of the domains above, e.g. [7] or [8].
FTR, I think it's fair to question whether numerical computing should have an outsized influence on the direction of the language. I also think it's a pretty fair comparison to point out how standardized and consistent the Rust governance process is compared to Julia's (the Rust RFC system is an exemplar here). That doesn't mean there is a dearth of PL and systems knowledge in the Julia community though.
[1] https://github.com/AlgebraicJulia/Catlab.jl
glow
-
Accelerating AI inference?
PyTorch supports other kinds of accelerators (e.g. FPGA, and https://github.com/pytorch/glow), but unless you want to become a ML systems engineer and have money and time to throw away, or a business case to fund it, it is not worth it. In general, both PyTorch and TensorFlow have hardware abstractions that will compile down to device code (XLA, https://github.com/pytorch/xla, https://github.com/pytorch/glow). TPUs and GPUs have very different strengths, so getting top performance requires a lot of manual optimizations. Considering the cost of training LLMs, it is time well spent.
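To illustrate what "compile down to device code" means, here is a toy lowering pass in pure Python: it flattens a small expression graph into a linear list of register-style instructions, loosely analogous to how Glow or XLA lower a framework graph toward device kernels. The IR format here is invented for the sketch, not Glow's actual IR:

```python
def lower(node, program):
    """Post-order walk: emit operands first, then the op that consumes them."""
    if isinstance(node, (int, float)):
        reg = f"r{len(program)}"
        program.append((reg, "const", node))
        return reg
    op, lhs, rhs = node
    a = lower(lhs, program)
    b = lower(rhs, program)
    reg = f"r{len(program)}"
    program.append((reg, op, a, b))
    return reg

def run(program):
    """Interpret the flat instruction list, standing in for a device."""
    env = {}
    for instr in program:
        if instr[1] == "const":
            env[instr[0]] = instr[2]
        elif instr[1] == "add":
            env[instr[0]] = env[instr[2]] + env[instr[3]]
        elif instr[1] == "mul":
            env[instr[0]] = env[instr[2]] * env[instr[3]]
    return env[program[-1][0]]

expr = ("add", ("mul", 2, 3), 4)  # 2*3 + 4
prog = []
out_reg = lower(expr, prog)
```

Real compilers then run optimization passes (fusion, quantization, layout) over the flat form before emitting backend code; that pass structure is where most of the engineering cost lives.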
-
Decompiling x86 Deep Neural Network Executables
It's pretty clear it's referring to the output of Apache TVM and Meta's Glow.
-
US government bans export of NVIDIA A100 to China and Russia, effective immediately
I also disagree with this. For example, Meta seems desperate about AI accelerators, and in fact is already doing the "hardware customers develop the software stack themselves" approach I mentioned above: Glow is that stack. Meta is doing Glow even though there are no promising AI accelerators right now; they are that desperate.
-
If data science uses a lot of computational power, then why is python the most used programming language?
For reference: In TensorFlow and JAX, for example, the tensor computation gets compiled to the intermediate XLA format (https://www.tensorflow.org/xla), then passed to the XLA compiler (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/compiler/xla/service) or the new TFRT runtime (https://github.com/tensorflow/runtime/blob/master/documents/tfrt_host_runtime_design.md), or some more esoteric hardware (https://github.com/pytorch/glow).
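The way TensorFlow and JAX capture a Python function before handing it to XLA is tracing: the function is run once on symbolic values that record each operation instead of computing it. A minimal sketch (this tracer is invented for illustration and is not the real `jax.jit` machinery):

```python
class Tracer:
    """A stand-in value that records operations applied to it."""
    def __init__(self, expr):
        self.expr = expr
    def __add__(self, other):
        return Tracer(("add", self.expr, _expr(other)))
    def __mul__(self, other):
        return Tracer(("mul", self.expr, _expr(other)))

def _expr(x):
    return x.expr if isinstance(x, Tracer) else x

def trace(fn):
    """Run fn once on a symbolic input to capture its computation graph."""
    return fn(Tracer("x")).expr

graph = trace(lambda x: x * 3 + 1)
```

Once the graph exists as data, the slow Python interpreter is out of the loop: the whole graph can be optimized and compiled for a GPU or TPU, which is why Python's own speed matters so little here.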
-
Esperanto Champions the Efficiency of Its 1,092-Core RISC-V Chip
The main reasons are hiring, and depth and breadth of the product.
Compilers are hard, device support is hard, the compiler community is small and closed source compilers quickly become weird tech islands.
https://github.com/pytorch/glow
-
From Julia to Rust
What are some alternatives?
StaticArrays.jl - Statically sized arrays for Julia
tvm - Open deep learning compiler stack for cpu, gpu and specialized accelerators
egg - egg is a flexible, high-performance e-graph library
serving - A flexible, high-performance serving system for machine learning models
julia - The Julia Programming Language
XLA.jl - Julia on TPUs
Juleps - Julia Enhancement Proposals
MacroTools.jl - MacroTools provides a library of tools for working with Julia code and expressions.
Symbolics.jl - Symbolic programming for the next generation of numerical software
runtime - A performant and modular runtime for TensorFlow