ITensors.jl vs tntorch

| | ITensors.jl | tntorch |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 485 | 271 |
| Stars growth (monthly) | 1.6% | - |
| Activity | 9.4 | 1.6 |
| Last commit | 7 days ago | about 1 year ago |
| Language | Julia | Python |
| License | Apache License 2.0 | GNU Lesser General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
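The site does not publish its exact weighting, but the idea of a recency-weighted activity score can be sketched in a few lines. Everything here is illustrative: the exponential half-life decay and the `activity_score` helper are assumptions, not the actual formula behind the numbers above.

```python
def activity_score(commit_ages_days, half_life_days=30.0):
    """Illustrative recency-weighted activity: each commit contributes a
    weight that halves every `half_life_days`, so recent commits count
    more than older ones. Not the comparison site's real formula."""
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# A recent burst of commits outscores the same number of commits made long ago.
recent = activity_score([1, 2, 3, 5, 8])       # ages in days
old = activity_score([300, 310, 320, 330, 340])
```

Any monotonically decaying weight (exponential, linear window, etc.) produces the same qualitative ranking; exponential decay is just the simplest to state.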
ITensors.jl
-
A question relating to the BCS theory ground state
DMRG packages are available in Julia, C++, and Python. (Don't use Fortran. But here is a Fortran library if you insist.)
-
To those working in computational physics, what do you think of Julia?
As one example, one of the leading libraries for tensor network simulations (https://itensor.org) was recently rewritten in Julia (it was previously C++), and the Flatiron Institute, which develops it and is certainly one of the leading computational physics institutions in the world, is advising new users to use the Julia version. I also know other computational groups that use Julia, even for things like quantum Monte Carlo (where I personally would have expected C++ to have an edge, but people tell me otherwise)! When even leading computational groups switch, I think Julia is almost always the much better option for the average user writing code from scratch (a situation not so rare in condensed matter). If you need to use existing libraries or legacy code, that obviously changes the situation.
-
Julia 1.8 has been released
> One thing that supports this view is that there are several Julia packages that are wrappers around existing C/Fortran/C++ libraries, and basically no examples (that I know of) of people porting existing libraries to Julia.
As with the others, I'll strongly disagree and chime in with a few examples off the top of my head:
* ITensors.jl : They started moving from C++ to Julia a couple of years ago, and now their homepage (https://itensor.org/) no longer even mentions the original C++ implementation.
* DifferentialEquations.jl : This has many state-of-the-art differential equation solvers, many of which are improvements over old Fortran libraries.
* SpecialFunctions.jl, Julia's own libm, Bessels.jl, SLEEFPirates.jl : Many core math functions have ancient Fortran or C implementations from OpenLibm and elsewhere, and they're being progressively replaced with better, faster versions written in pure Julia that outperform the old ones.
-
Initializing an n^k array as a sparse array?
Otherwise, maybe check ITensors.jl, or look for packages that aim to do the same thing?
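For the sparse n^k question above, the core idea is to store only the nonzero entries in a map keyed by index tuples. A minimal pure-Python sketch, with a hypothetical `SparseNd` class (not from ITensors.jl or any package):

```python
class SparseNd:
    """Sparse k-dimensional array of shape (n, ..., n) storing only nonzeros
    in a dict keyed by index tuples. Illustrative, not a library API."""

    def __init__(self, n, k, default=0.0):
        self.n, self.k, self.default = n, k, default
        self.data = {}  # maps index tuple -> nonzero value

    def _check(self, idx):
        if len(idx) != self.k or not all(0 <= i < self.n for i in idx):
            raise IndexError(f"index {idx} invalid for shape {(self.n,) * self.k}")

    def __setitem__(self, idx, value):
        self._check(idx)
        if value == self.default:
            self.data.pop(idx, None)  # writing the default keeps the store sparse
        else:
            self.data[idx] = value

    def __getitem__(self, idx):
        self._check(idx)
        return self.data.get(idx, self.default)

    def nnz(self):
        return len(self.data)

# Usage: a dense 10^6 array would need a million entries; here we store 2.
a = SparseNd(n=10, k=6)
a[1, 2, 3, 4, 5, 6] = 7.5
a[0, 0, 0, 0, 0, 0] = 1.0
```

Memory scales with the number of nonzeros rather than n^k, at the cost of hash lookups per access; that trade-off is what dedicated sparse/tensor libraries optimize further.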
tntorch
-
ETH Zurich AI Researchers Introduce ‘tntorch’: a PyTorch-Powered Tensor Learning Python Library That Supports Multiple Decompositions Under a Unified Interface
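To illustrate the kind of decomposition such a library provides under its unified interface, here is a sketch of the classic TT-SVD algorithm (tensor-train decomposition via sequential truncated SVDs) in plain NumPy. The function names are mine, not tntorch's API, and the truncation rule is a simple relative-threshold assumption:

```python
import numpy as np

def tt_decompose(t, eps=1e-10):
    """Tensor-train decomposition via sequential truncated SVDs (TT-SVD).
    Returns a list of 3-D cores; core k has shape (r_k, n_k, r_{k+1})."""
    shape, d = t.shape, t.ndim
    cores, r = [], 1
    c = t.reshape(shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        rk = max(1, int(np.sum(s > eps * s[0])))   # drop tiny singular values
        cores.append(u[:, :rk].reshape(r, shape[k], rk))
        c = s[:rk, None] * vt[:rk, :]              # carry the remainder forward
        r = rk
        if k < d - 2:
            c = c.reshape(r * shape[k + 1], -1)
    cores.append(c.reshape(r, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the train of cores back into a dense tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=1)      # contract r_k bond index
    return res.reshape(res.shape[1:-1])            # strip the boundary 1-dims
```

With a loose `eps` the reconstruction is exact; raising `eps` trades accuracy for lower TT ranks, which is the compression that makes tensor-train formats useful.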
What are some alternatives?
Fastor - A lightweight high performance tensor algebra framework for modern C++
ludwig - Low-code framework for building custom LLMs, neural networks, and other AI models
danfojs - Danfo.js is an open source, JavaScript library providing high performance, intuitive, and easy to use data structures for manipulating and processing structured data.
norse - Deep learning for spiking neural networks
Measurements.jl - Error propagation calculator and library for physical measurements. It supports real and complex numbers with uncertainty, arbitrary precision calculations, operations with arrays, and numerical integration.
torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
NTNk.jl - Unsupervised Machine Learning: Nonnegative Tensor Networks + k-means clustering
Octavian.jl - Multi-threaded BLAS-like library that provides pure Julia matrix multiplication
ProtoStructs.jl - Easy prototyping of structs
RecursiveArrayTools.jl - Tools for easily handling objects like arrays of arrays and deeper nestings in scientific machine learning (SciML) and other applications
GenericArpack.jl - A pure Julia translation of the Arpack library for eigenvalues and eigenvectors but for any numeric types. (Symmetric only right now)
ObjectOriented.jl - Conventional object-oriented programming in Julia without breaking Julia's core design ideas