alexander-held opened 1 year ago
I am not sure whether or not it would be feasible to access numpy operations via `keras_core.src.ops.numpy`, and whether they are sufficiently complete. From a brief look I think there are also some scipy functions that would be needed but are not there (e.g. `scipy.special.gammaln`). The other part that might be tricky is that `keras_core` has (at least currently) a TensorFlow dependency, which is optional for `pyhf`.
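As a feasibility check for the missing `scipy.special.gammaln`, the same quantity can be computed elementwise from the stdlib via `math.lgamma`. This is only a hedged sketch of a stopgap (slow for large arrays); a real backend would use each framework's native op (e.g. `tf.math.lgamma`, `torch.lgamma`) where available:

```python
import math
import numpy as np

def gammaln(x):
    """Elementwise log|Gamma(x)|, a stand-in for scipy.special.gammaln.

    Wraps the stdlib math.lgamma with np.vectorize, so it is slow for
    large arrays -- a proof-of-concept fallback, not a production shim.
    """
    return np.vectorize(math.lgamma, otypes=[float])(np.asarray(x, dtype=float))

# gammaln(5) == log(4!) = log(24)
print(gammaln([1.0, 2.0, 5.0]))
```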
edit: the dependency will eventually become optional: https://github.com/keras-team/keras/issues/18455
One interesting thing is that we can always define `keras_core` layers to inject or patch missing functionality (e.g. shims).
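The shim idea could look roughly like the following: check whether the ops namespace already provides a function and patch in a fallback only when it is missing. This is a hedged sketch using a stand-in namespace object rather than the real `keras_core.ops` (the exact patch point in keras_core would need checking):

```python
import math
import types

# Stand-in for keras_core.ops; the real module would be used instead.
ops = types.SimpleNamespace(abs=abs)  # pretend only `abs` exists

def ensure_op(namespace, name, fallback):
    """Attach `fallback` as `namespace.name` only if the op is missing."""
    if not hasattr(namespace, name):
        setattr(namespace, name, fallback)
    return getattr(namespace, name)

# Patch in a scalar gammaln fallback from the stdlib.
ensure_op(ops, "gammaln", math.lgamma)

print(ops.gammaln(5.0))  # log(4!) = log(24)
```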
While I'm interested to learn more about `keras_core`, I'm also not interested in taking on `tensorflow` as a required dependency:
```python
install_requires=[
    "tensorflow",
    "absl-py",
    "numpy",
    "rich",
    "namex",
    "h5py",
],
```
So I think this would, for the time being, need to be an experimental research area and not something that we would seriously consider adopting in the near term.
The dependency will be dropped already in the next version: https://twitter.com/fchollet/status/1679948051140743170
Hm, interesting. We should indeed try it out then, but I wonder if this will even be necessary given the new Array API work that is coming out (cf. @asmeurer's SciPy 2023 talk from today: Python Array API Standard: Toward Array Interoperability in the Scientific Python Ecosystem).
The rebranded Keras v3.0 has now dropped TensorFlow as a required dependency:
```python
install_requires=[
    "absl-py",
    "numpy",
    "rich",
    "namex",
    "h5py",
    "dm-tree",
],
```
Though the resulting install adds 9 dependencies in total (we already have `numpy`):

```
Installing collected packages: namex, dm-tree, pygments, numpy, mdurl, absl-py, markdown-it-py, h5py, rich, keras_core
```

So I'm still hesitant to add this as a core dependency unless the impact is substantial.
I will also admit that it might be worth exploring switching to `keras` and then, as the Array API standard gains adoption, switching from Keras to that in a big library swap. That would, however, also need exploration of how Keras supports this API translation and how much (or little) effort adopting new backends through a plugin-like system with `array-api-compat` would be (e.g. CuPy and possibly Dask for nearly free).
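The Array API route would mean writing tensor code against a namespace argument rather than a hard-coded library. A hedged sketch with plain numpy as the namespace (under `array-api-compat`, the same function could in principle receive a CuPy or Dask namespace unchanged):

```python
import numpy as np

def gaussian_logpdf(xp, x, mu, sigma):
    """Log-density of a normal distribution, written against an
    Array-API-style namespace `xp` instead of importing numpy directly."""
    z = (x - mu) / sigma
    return -0.5 * z**2 - xp.log(sigma) - 0.5 * xp.log(2.0 * xp.pi)

# With numpy itself as the namespace:
x = np.array([0.0, 1.0])
print(gaussian_logpdf(np, x, 0.0, 1.0))
```

Only names in the Array API standard (`log`, `pi`, arithmetic) are used, which is what makes the backend swap cheap in theory.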
Using `keras-core` would also require abandoning the ability to switch backends on the fly, as it locks the backend as soon as you `import keras`.
> **Configuring your backend**
>
> You can export the environment variable `KERAS_BACKEND` or you can edit your local config file at `~/.keras/keras.json` to configure your backend. Available backend options are: `"tensorflow"`, `"jax"`, `"torch"`. Example:
>
> ```shell
> export KERAS_BACKEND="jax"
> ```
>
> In Colab, you can do:
>
> ```python
> import os
> os.environ["KERAS_BACKEND"] = "jax"
> import keras_core as keras
> ```
>
> Note: The backend must be configured before importing `keras`, and the backend cannot be changed after the package has been imported.
It also would require dropping Python 3.8 support as it is Python 3.9+ only:

```python
python_requires=">=3.9",
```
Neither of these are deal breakers IMO, but just info.
Summary

Keras Core (https://keras.io/keras_core/announcement/) looks like it might provide a convenient unified way to access the relevant tensor operations for TensorFlow, PyTorch, and JAX via a single interface. This might allow simplifying the `pyhf` backend implementation code, only requiring a pure `numpy` interface and a Keras Core interface (which then distributes to the three tensor backends).

cc @kratsg who also mentioned this
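The proposed simplification is roughly: keep one pure-numpy backend plus one Keras-Core-backed backend that fans out to TF/PyTorch/JAX. A minimal sketch of the shape such a backend interface could take (hypothetical class and method names for illustration, not pyhf's actual API):

```python
import numpy as np

class NumpyBackend:
    """Minimal stand-in for a pyhf-style tensor backend (hypothetical API)."""

    def astensor(self, data):
        return np.asarray(data, dtype=float)

    def log(self, tensor):
        return np.log(tensor)

    def sum(self, tensor):
        return np.sum(tensor)

def total_log(backend, values):
    """Backend-agnostic computation: touches only the backend interface."""
    t = backend.astensor(values)
    return backend.sum(backend.log(t))

# A KerasCoreBackend exposing the same three methods via keras_core.ops
# could slot in here without changing `total_log`.
print(total_log(NumpyBackend(), [1.0, np.e]))
```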
Additional Information
n/a