facebook / Ax

Adaptive Experimentation Platform
https://ax.dev

`_random_seed` not retained when using `ax_client.save_to_json_file()` and `AxClient.load_from_json_file()` #2166

Open · sgbaird opened this issue 9 months ago

sgbaird commented 9 months ago

I'm not sure if this is intended or not, but here is a Colab reproducer: https://colab.research.google.com/drive/12_9U4ikUu9q9hIwz9w_8krPzuDOx9JWc?usp=sharing

For provenance, here is the reproducer code:

```python
import numpy as np
from ax.service.ax_client import AxClient, ObjectiveProperties

obj1_name = "branin"

def branin(x1, x2):
    y = float(
        (x2 - 5.1 / (4 * np.pi**2) * x1**2 + 5.0 / np.pi * x1 - 6.0) ** 2
        + 10 * (1 - 1.0 / (8 * np.pi)) * np.cos(x1)
        + 10
    )

    return y

ax_client = AxClient(random_seed=42)
ax_client.create_experiment(
    parameters=[
        {"name": "x1", "type": "range", "bounds": [-5.0, 10.0]},
        {"name": "x2", "type": "range", "bounds": [0.0, 10.0]},
    ],
    objectives={
        obj1_name: ObjectiveProperties(minimize=True),
    },
)

for _ in range(10):
    parameters, trial_index = ax_client.get_next_trial()
    results = branin(
        parameters["x1"],
        parameters["x2"],
    )
    ax_client.complete_trial(trial_index=trial_index, raw_data=results)

best_parameters, metrics = ax_client.get_best_parameters()

ax_client.save_to_json_file()

ax_client_restored = AxClient.load_from_json_file()
print(ax_client_restored._random_seed == 42)
# False
```

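As an interim workaround until the seed is serialized, one option is to write the seed to a small sidecar file next to the snapshot and re-apply it after loading. This is only a sketch: the file names are illustrative, and it assumes that resetting the private `_random_seed` attribute on the restored client is sufficient for downstream reproducibility, which I haven't verified against Ax internals.

```python
# Continues from the reproducer above (`ax_client` was built with random_seed=42).
import json

snapshot_path = "ax_client_snapshot.json"  # illustrative path
seed_path = "ax_client_seed.json"          # illustrative sidecar file for the seed

# Save the client state and, separately, the seed it was created with.
ax_client.save_to_json_file(snapshot_path)
with open(seed_path, "w") as f:
    json.dump({"random_seed": 42}, f)

# Restore the client, then manually re-apply the seed to the private attribute.
ax_client_restored = AxClient.load_from_json_file(snapshot_path)
with open(seed_path) as f:
    ax_client_restored._random_seed = json.load(f)["random_seed"]

print(ax_client_restored._random_seed == 42)
# True
```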
bernardbeckerman commented 9 months ago

Hi @sgbaird, thanks for the crystal clear issue and repro! Taking a look now.

bernardbeckerman commented 9 months ago

This seems like an oversight on our part, but unfortunately I won't be able to implement the fix before I leave town for a few weeks. I'll follow up internally to find a new owner, and if that isn't possible I'll pick it up once I'm back. Of course, please feel free to submit a fix if you have the bandwidth; otherwise I'll be happy to get to it on my return. Thanks again for surfacing this!
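For whoever picks this up, a regression test along these lines might be worth bundling with the fix. It only exercises the public save/load round trip plus the private `_random_seed` attribute from the repro; the test name and use of pytest's `tmp_path` fixture are illustrative, not a claim about how the Ax test suite is organized.

```python
from ax.service.ax_client import AxClient, ObjectiveProperties


def test_random_seed_round_trips_through_json(tmp_path):
    # Build a minimal experiment with an explicit seed.
    ax_client = AxClient(random_seed=42)
    ax_client.create_experiment(
        parameters=[
            {"name": "x1", "type": "range", "bounds": [-5.0, 10.0]},
            {"name": "x2", "type": "range", "bounds": [0.0, 10.0]},
        ],
        objectives={"branin": ObjectiveProperties(minimize=True)},
    )

    # Round-trip through JSON and check that the seed survives.
    filepath = str(tmp_path / "ax_client_snapshot.json")
    ax_client.save_to_json_file(filepath)
    restored = AxClient.load_from_json_file(filepath)
    assert restored._random_seed == 42
```

As reported above, the final assertion currently fails, so this would serve as the repro-turned-test for the fix.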