FYI: PyTorch now supports "same" padding for convolutions with stride=1; see the docs here: https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.ht...
The stride>1 case has been more controversial within TensorFlow, and there is ongoing discussion about the correct way to implement it in PyTorch in this issue: https://github.com/pytorch/pytorch/issues/3867
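For reference, a minimal sketch of what the supported stride=1 case looks like (shapes are illustrative):

```python
import torch
import torch.nn as nn

# padding="same" keeps the spatial size unchanged; per the linked docs,
# PyTorch only accepts it when stride == 1
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3,
                 stride=1, padding="same")

x = torch.randn(1, 3, 32, 32)   # (batch, channels, height, width)
y = conv(x)                     # spatial dims preserved: still 32x32
```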
-
jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
As a researcher in RL & ML in a big industry lab, I would say most of my colleagues are moving to JAX [https://github.com/google/jax], which this article kind of ignores. JAX is XLA-accelerated NumPy; it's useful well beyond just machine learning, but it only provides low-level linear algebra abstractions. However, you can put something like Haiku [https://github.com/deepmind/dm-haiku] or Flax [https://github.com/google/flax] on top of it and get what the cool kids are using :)
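For anyone who hasn't tried it, the "composable transformations" idea looks roughly like this (a toy least-squares loss, purely illustrative):

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # plain NumPy-style code; JAX traces and transforms it
    return jnp.mean((x @ w - y) ** 2)

# compose transformations: differentiate, then JIT-compile via XLA
grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((4, 3))
y = jnp.ones(4)
g = grad_fn(w, x, y)   # gradient of the loss w.r.t. w
```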
-
Do any JAX experts know if there is an equivalent to https://captum.ai/ - a model interpretability library for PyTorch?
In particular, I want to be able to measure feature importance on both inputs and internal layers on a sample-by-sample basis. This is the only thing currently holding me back from using JAX right now.
Alternatively, a simple-to-read/understand/port implementation of DeepLIFT would work too.
Thanks!
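Not a Captum equivalent, but a per-sample gradient-times-input attribution (a common, much simpler baseline than DeepLIFT) is only a few lines in JAX — sketched here with a hypothetical two-layer model; attributing internal layers would mean splitting the model and differentiating w.r.t. the intermediate activations instead:

```python
import jax
import jax.numpy as jnp

# hypothetical toy model; stand-in for whatever you're interpreting
def model(params, x):
    h = jnp.tanh(x @ params["w1"])
    return jnp.sum(h @ params["w2"])

# gradient-times-input attribution for a single sample
def attribute(params, x):
    g = jax.grad(model, argnums=1)(params, x)
    return g * x

# vmap gives per-sample attributions over a whole batch
batched_attribute = jax.vmap(attribute, in_axes=(None, 0))

key = jax.random.PRNGKey(0)
params = {
    "w1": jax.random.normal(key, (5, 4)),
    "w2": jax.random.normal(key, (4, 1)),
}
xs = jnp.ones((8, 5))
attrs = batched_attribute(params, xs)  # (8, 5): one score per feature per sample
```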
-
The thing is, TensorFlow is better at running cross-platform.
I help maintain https://github.com/capitalone/DataProfiler
Our sensitive data detection library is exported to iOS, Android, and Java, in addition to Python. We also run distributed and federated use cases with custom layers. All of that works better in TensorFlow.
That said, I’d use pytorch if I could. Simply put, it has a better user experience.
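The cross-platform export described above is essentially the TensorFlow Lite path — a minimal sketch with a placeholder Keras model (the real detection network is obviously more involved):

```python
import tensorflow as tf

# placeholder model; stands in for the actual detection network
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])

# convert to a TFLite flatbuffer that iOS/Android runtimes can load
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()
```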
-
Yeah they made ml5.js for this reason: https://ml5js.org/
I do feel like Google could do better communicating all of their different tools though. Their ecosystem is large and pretty confusing - they've got so many projects going on at once that it always seems like everyone gets fed up with them before they take a second pass and make them more friendly to newcomers.
Facebook seems to have taken a much more focused approach, as you can see with PyTorch Live.