root vs NetworkX

| | root | NetworkX |
|---|---|---|
| Mentions | 31 | 61 |
| Stars | 2,430 | 14,278 |
| Growth | 1.5% | 1.3% |
| Activity | 10.0 | 9.6 |
| Last commit | 3 days ago | 1 day ago |
| Language | C++ | Python |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
root
-
If you can't reproduce the model then it's not open-source
I think the process of data acquisition isn't so clear-cut. Take CERN as an example: they release loads of data from various experiments under the CC0 license [1]. This isn't just a few small datasets for classroom use; we're talking big-league data, like the entire first run data from LHCb [2].
On their portal, they don't just dump the data and leave you to it. They've got guides on analysis and the necessary tools (mostly open source stuff like ROOT [3] and even VMs). This means anyone can dive in. You could potentially discover something new or build on existing experiment analyses. This setup, with open data and tools, ticks the boxes for reproducibility. But does it mean people need to recreate the data themselves?
Ideally, yeah, but realistically, while you could theoretically rebuild the LHC (since most technical details are public), it would take an army of skilled people, billions of dollars, and years to do it.
This contrasts with open-source models, where you can in principle retrain a model on the data to reproduce the weights, but both getting hold of the data and the cost of retraining are usually prohibitive. I get that CERN's approach might seem to counter this, but remember, they're not releasing the raw data (which is mostly noise) but a more refined version; good luck downloading several petabytes of raw data otherwise. For training something like an LLM, though, you might need the whole dataset, which in many cases has its own problems with copyright, etc.
[1] https://opendata.cern.ch/docs/terms-of-use
[2] https://opendata.cern.ch/docs/lhcb-releases-entire-run1-data...
[3] https://root.cern/
- What software is used to generate plots/graphs like this seen in many particle physics papers?
-
Interactive GCC (igcc) is a read-eval-print loop (REPL) for C/C++
The odd part is that this is not just for fun. When I was at CERN, a C++ REPL was a tool many physicists used to interactively debug their analyses, to such a degree that some never compiled their code at all. Back then, I believe, it was a custom implementation included in ROOT (https://root.cern/). I even went out of my way to write C++ code compatible with it, just so it could run in that interpreter; otherwise some colleagues weren't interested in collaborating at all.
-
Stable Diffusion in pure C/C++
That Python ML code is calling C++ code running on the GPU, which is one more reason to use C++ across the whole stack.
CERN was already prototyping in C++, with ROOT and CINT, 20 years ago.
https://root.cern/
Nowadays it is even usable from Jupyter notebooks via Xeus.
It is more a matter of lack of exposure to C++ interpreters than anything else.
- Root: Analyzing Petabytes of Data, Scientifically
-
Aliens might be waiting for humans to solve a puzzle
Quantum computing is a pretty interesting science too: https://home.cern/news/press-release/knowledge-sharing/cern-quantum-technology-initiative-unveils-strategic-roadmap. They have to deal with lots of streaming data as well: https://root.cern/
-
cppyy Generated Wrappers and Type Annotations
I'm a user of CERN's ROOT (https://root.cern/) and while I'd usually write in C++, I've been trying to write as much Python as I can recently to get a bit better at the language.
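For context, here is a toy cppyy example (my own snippet, not from the post): define a small C++ function on the fly and call it from Python. cppyy is the binding layer that modern PyROOT is built on, so the same pattern works alongside ROOT code.

```python
# Toy example: JIT-compile a C++ function with cppyy and call it from Python.
import cppyy

cppyy.cppdef("""
double invariant_mass_sq(double e, double px, double py, double pz) {
    // E^2 - |p|^2 for a single four-vector
    return e*e - (px*px + py*py + pz*pz);
}
""")

print(cppyy.gbl.invariant_mass_sq(5.0, 1.0, 2.0, 3.0))   # -> 11.0
```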
- Root: Analyzing Petabytes of Scientific Data
-
Span: how to cast pointer of pointer to other types?
I'm dealing with a C++ software package called ROOT, made by CERN, which is, if I'm not wrong, the only C++ API we could use for data analysis tasks such as plotting histograms, fitting multi-parameter functions, and storing terabytes of data to disk, among many other things. That's the only reason why physicists still stick to this software. You can check here.
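For a sense of that workflow, here is a minimal sketch using ROOT's Python bindings (PyROOT); the file name, histogram, and fit below are purely illustrative.

```python
# Minimal PyROOT sketch: make a histogram, fit it, and persist it to a ROOT file.
import ROOT

f = ROOT.TFile("analysis.root", "RECREATE")   # output file on disk
h = ROOT.TH1F("mass", "Invariant mass;m [GeV];Events", 100, 0.0, 10.0)
h.FillRandom("gaus", 10000)                   # stand-in for filling from real data
h.Fit("gaus")                                 # multi-parameter function fit
h.Write()                                     # persist the histogram to the file
f.Close()
```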
-
How exactly would you go about writing a program to simplify algebraic expressions?
Hey, I found something which could be useful: https://root.cern
NetworkX
-
Routes to LANL from 186 sites on the Internet
Built from this data... https://github.com/networkx/networkx/blob/main/examples/grap...
-
The Hunt for the Missing Data Type
I think one of the elements the author is missing here is that graphs are sparse matrices, and thus can be expressed with Linear Algebra. They mention adjacency matrices, but not sparse adjacency matrices, or incidence matrices (which can express multigraphs and hypergraphs).
Linear Algebra is how almost all academic graph theory is expressed, and large chunks of machine learning and AI research are expressed in this language as well. There was a recent thread here about PageRank and how it's really an eigenvector problem over a matrix, and the reality is that all graphs are matrices; they're just typically sparse ones.
One question you might ask is, why would I do this? Why not just write my graph algorithms as a function that traverses nodes and edges? And one of the big answers is, parallelism. How are you going to do it? Fork a thread at each edge? Use a thread pool? What if you want to do it on CUDA too? Now you have many problems. How do you know how to efficiently schedule work? By treating graph traversal as a matrix multiplication, you just say Ax = b, and let the library figure it out on the specific hardware you want to target.
Here for example is a recent question on the NetworkX repo for how to find the boundary of a triangular mesh, it's one single line of GraphBLAS if you consider the graph as a matrix:
https://github.com/networkx/networkx/discussions/7326
This brings a very powerful language to the table: Linear Algebra, a language spoken by every scientist, engineer, mathematician, and researcher on the planet. By treating graphs like matrices, graph algorithms become expressible as mathematical formulas. For example, neural networks are graphs of adjacent layers, and the operation used to traverse from layer to layer is matrix multiplication. This generalizes to all graphs.
There is a lot of very new and powerful research and development going on around sparse graphs with linear algebra in the GraphBLAS API standard and its best reference implementation, SuiteSparse:GraphBLAS:
https://github.com/DrTimothyAldenDavis/GraphBLAS
SuiteSparse provides highly optimized, parallel sparse matrix multiplication with CPU and GPU support. This is relevant because traversing graph edges IS matrix multiplication, once you realize that graphs are matrices.
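As a rough illustration of that point (my own sketch, not from the GraphBLAS docs), here is the "traversal is matrix multiplication" idea written with scipy.sparse as a stand-in; SuiteSparse:GraphBLAS adds semirings, masking, and far better performance, but the shape of the computation is the same.

```python
# Breadth-first traversal expressed as repeated sparse matrix-vector products.
import numpy as np
from scipy.sparse import csr_matrix

# Directed edges i -> j become A[i, j] = 1 in the adjacency matrix (toy graph).
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
rows, cols = zip(*edges)
n = 5
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(n, n))

# Each step is a plain sparse mat-vec (the "Ax = b" above); the library
# decides how to schedule the sparse work.
frontier = np.zeros(n, dtype=bool)
frontier[0] = True                             # start the traversal at node 0
visited = frontier.copy()
while frontier.any():
    reached = A.T @ frontier.astype(float)     # nodes one hop from the frontier
    frontier = (reached > 0) & ~visited        # keep only newly discovered nodes
    visited |= frontier

print(np.flatnonzero(visited))                 # all nodes reachable from node 0
```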
Recently NetworkX has grown the ability to use different "graph engine" backends, and one of the first to be developed uses the python-graphblas library, which binds to SuiteSparse. I'm not a direct contributor to that particular work, but as I understand it there have been great results.
-
Build the dependency graph of your BigQuery pipelines at no cost: a Python implementation
In the project we used the Python library NetworkX and a DiGraph object (directed graph). To detect table references in a query, we use sqlglot, a SQL parser (among other things) that works well with BigQuery.
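A hedged sketch of that approach (queries and table names below are invented for illustration): find the tables each query reads from with sqlglot, then store "source feeds target" edges in a networkx DiGraph.

```python
# Build a table-dependency DiGraph from SQL queries using sqlglot + networkx.
import networkx as nx
import sqlglot
from sqlglot import exp

pipeline = {
    # destination table -> SQL that builds it (hypothetical)
    "analytics.daily_orders": "SELECT * FROM raw.orders WHERE order_date = CURRENT_DATE()",
    "analytics.daily_summary": "SELECT customer_id, SUM(amount) AS total "
                               "FROM analytics.daily_orders GROUP BY customer_id",
}

graph = nx.DiGraph()
for target, sql in pipeline.items():
    tree = sqlglot.parse_one(sql, read="bigquery")   # parse with the BigQuery dialect
    for table in tree.find_all(exp.Table):
        source = table.sql(dialect="bigquery")       # e.g. "raw.orders"
        graph.add_edge(source, target)               # edge: source table feeds target

print(list(nx.topological_sort(graph)))              # a valid build order for the pipeline
```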
- NetworkX – Network Analysis in Python
-
Custom libraries and utility tools for challenges
If you program in Python, you can use NetworkX for that. But it's probably a good idea to implement the basic algorithms yourself at least once.
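A small sketch of both approaches on a toy graph (the edges are made up): the NetworkX one-liner next to the same search written by hand.

```python
from collections import deque
import networkx as nx

edges = [(1, 2), (2, 3), (1, 4), (4, 3)]

# The library call:
G = nx.Graph(edges)
print(nx.shortest_path(G, 1, 3))        # e.g. [1, 2, 3]

# The same search by hand, once, to see what the library is doing:
def bfs_path(edges, start, goal):
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    queue, prev = deque([start]), {start: None}
    while queue:
        node = queue.popleft()
        if node == goal:                 # rebuild the path by walking back
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in adj.get(node, []):
            if nxt not in prev:          # not visited yet
                prev[nxt] = node
                queue.append(nxt)

print(bfs_path(edges, 1, 3))             # same answer as the library call
```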
-
Google open-sources their graph mining library
For those wanting to play with graphs and ML: I was browsing the ArangoDB docs recently and saw that it includes integrations with various graph libraries and machine learning frameworks [1]. I also saw a few Jupyter notebooks dealing with machine learning on graphs [2].
Integrations include:
* NetworkX -- https://networkx.org/
* DeepGraphLibrary -- https://www.dgl.ai/
* cuGraph (Rapids.ai Graph) -- https://docs.rapids.ai/api/cugraph/stable/
* PyG (PyTorch Geometric) -- https://pytorch-geometric.readthedocs.io/en/latest/
--
1: https://docs.arangodb.com/3.11/data-science/adapters/
2: https://github.com/arangodb/interactive_tutorials#machine-le...
-
org-roam-pygraph: Build a graph of your org-roam collection for use in Python
org-roam-ui is a great interactive visualization tool, but its main use is visualization. The hope of this library is that it could be part of a larger graph analysis pipeline. The demo provides an example graph visualization, but what you choose to do with the resulting graph certainly isn't limited to that. See for example networkx.
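For instance, a hedged sketch of that kind of downstream analysis: once the note graph is loaded into networkx (node names below are invented), common graph metrics become one-liners.

```python
# Toy note graph: nodes are note titles, edges are org-roam links.
import networkx as nx

G = nx.Graph([
    ("inbox", "projects"), ("projects", "graph notes"),
    ("projects", "reading list"), ("graph notes", "networkx"),
])

print(nx.degree_centrality(G))              # which notes act as hubs
print(list(nx.connected_components(G)))     # clusters of linked notes
```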
What are some alternatives?
PyMesh - Geometry Processing Library for Python
Numba - NumPy aware dynamic Python compiler using LLVM
xeus - Implementation of the Jupyter kernel protocol in C++
Dask - Parallel computing with task scheduling
tfgo - Tensorflow + Go, the gopher way
julia - The Julia Programming Language
windows-telemetry-blocklist - Blocks outgoing Windows telemetry, compatible with Pi-Hole.
RDKit - The official sources for the RDKit library
decimal - Arbitrary-precision fixed-point decimal numbers in Go
snap - Stanford Network Analysis Platform (SNAP) is a general purpose network analysis and graph mining library.
apd - Arbitrary-precision decimals for Go
SymPy - A computer algebra system written in pure Python