davifebba opened this issue 2 days ago
Hi @davifebba, thanks for opening the issue. One thing I don't understand is why "time" is part of your search space: as time goes on, more and more of the search space will become inaccessible, which doesn't really make sense for a sweep.
Some suggestions
Hi @mgrange1998 !
Yes, it works with only temperature, and even with time using ax_client.get_next_trial(fixed_features=FixedFeatures(parameters={'time': time.time() - start_time})).
However, the time that goes into FixedFeatures is offset by -delta_T from the real time I wanted to pass to the model.
I'm measuring the current-voltage response of a device, which degrades over time. The idea is to use time and temperature as parameters to build an interpolation model that predicts the device response as a function of these two variables: the device depends not only on temperature but also on its age. I'm trying to build a model that can make predictions as a function of (temperature, time) in an active learning scenario. If I pass time through fixed_features, the model can then take temperature and time as independent variables and calculate the mean and uncertainty of the device response. When Ax suggests a new temperature with time as a fixed feature, this would work just fine if I didn't have to wait for a temperature ramp before measuring the device response, which happens at a different time than the one suggested.
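In sketch form, the workaround looks roughly like this (parameter bounds and the ramp_to_temperature / measure_response helpers are placeholders, and the FixedFeatures import path may differ between Ax versions):

```python
import time

from ax.service.ax_client import AxClient
from ax.service.utils.instantiation import FixedFeatures  # import path may vary by Ax version


def ramp_to_temperature(temp_c: float) -> None:
    """Placeholder for the instrument control; takes roughly delta_T to settle."""


def measure_response() -> float:
    """Placeholder for the current-voltage measurement."""
    return 0.0


ax_client = AxClient()
ax_client.create_experiment(
    name="device_characterization",
    parameters=[
        {"name": "temperature", "type": "range", "bounds": [20.0, 100.0]},
        {"name": "time", "type": "range", "bounds": [0.0, 3600.0]},
    ],
    objective_name="response",  # newer Ax versions use objectives={...} instead
)

start_time = time.time()

# Pin "time" to the elapsed time at suggestion, so Ax only searches over temperature.
suggestion_time = time.time() - start_time
parameters, trial_index = ax_client.get_next_trial(
    fixed_features=FixedFeatures(parameters={"time": suggestion_time})
)

ramp_to_temperature(parameters["temperature"])  # takes delta_T seconds
response = measure_response()

# The device is now older by delta_T, so the "time" the model conditioned on
# lags the time at which the response was actually measured.
ax_client.complete_trial(trial_index=trial_index, raw_data={"response": response})
```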
Interesting. You're essentially trying to use time as a "contextual" variable. Unfortunately, I don't think this is easily supported in the current service API, but it's something that we want to support going forward. This would require quite a bit of additional work on our internals though, so having full support for this may take some time.
That said, @sdaulton has done related things using a TimeAsFeature transform. This is not exactly what you'd need here, since that transform uses the trial start and end times rather than the age of the device. What you'd need is some other transform that converts the "time" metric into an additional parameter for the model.
Exactly, time in this case would be a "contextual" variable. Since we cannot control time, the model would suggest only temperature points, but also use time to make predictions. In my case, "age" means the elapsed time since the start of the device characterization. Would it be possible to implement something using the Developer API and inform the model of the correct time after a new trial has been suggested?
Would it be possible to implement something using the Developer API and inform the model of the correct time after a new trial has been suggested?
Yes, that should be possible. We can implement a new Transform that (i) adds the new time parameter to the (transformed) search space and (ii) grabs the specified metric from the observation features and injects it as the parameter value instead. This will be very similar to the TimeAsFeature transform. We would then have to specify this as a transform to use in a custom GenerationStrategy setup.
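A rough sketch of what this could look like (the Transform base-class hooks and constructor signatures have changed between Ax versions, so the method names below are assumptions patterned on TimeAsFeature; the ElapsedTimeAsFeature name, the "elapsed_time" key, and reading the value from observation-feature metadata are likewise placeholders):

```python
from ax.core.parameter import ParameterType, RangeParameter
from ax.core.search_space import SearchSpace
from ax.modelbridge.transforms.base import Transform


class ElapsedTimeAsFeature(Transform):
    """Sketch: expose device age ("elapsed_time") to the model as an extra input.

    Patterned on ax.modelbridge.transforms.time_as_feature.TimeAsFeature, but
    injects an elapsed time stored with each observation instead of the trial
    start/end times. Check the hook names against your Ax version.
    """

    def _transform_search_space(self, search_space: SearchSpace) -> SearchSpace:
        # (i) Add "elapsed_time" to the transformed search space so the model
        # treats it as an input dimension (bounds are placeholders).
        search_space.add_parameter(
            RangeParameter(
                name="elapsed_time",
                parameter_type=ParameterType.FLOAT,
                lower=0.0,
                upper=86400.0,
            )
        )
        return search_space

    def transform_observation_features(self, observation_features):
        # (ii) Inject the recorded elapsed time into each observation so the
        # model sees it as a feature; here it is assumed to live in the
        # metadata attached to the observation features.
        for obsf in observation_features:
            if obsf.metadata and "elapsed_time" in obsf.metadata:
                obsf.parameters["elapsed_time"] = obsf.metadata["elapsed_time"]
        return observation_features
```

To have a model actually use it, the transform list can be passed through model_kwargs on a GenerationStep, e.g. model_kwargs={"transforms": Cont_X_trans + Y_trans + [ElapsedTimeAsFeature]} with the default lists imported from ax.modelbridge.registry; again, treat this as an assumption and check the registry for your version.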
@Balandat, any guidance or example on how to implement that would be very appreciated!
Question
I have the following problem (using the Service API): I have time and temperature as search parameters. When I use get_next_trial(), AxClient suggests both, which is fine for temperature. However, the trial is only marked as completed after the evaluation method runs, so by then the suggested time is no longer valid and needs to be updated to the actual time at which the evaluation took place. For example:
How can I solve this? Using fixed_features in get_next_trial does not work, because the correct time is only known after the evaluation. So how can I update the parameterization with the adjusted time before marking the trial as complete, while still using a generation strategy? In other words, how can I update the arm parameters in the generation strategy before marking the trial as complete?
I'm using a generation strategy (sketched below) so that BoTorch takes over after a few Sobol samples. If I use ax_client.attach_trial(), then it's a manual step and not part of the generation strategy.
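Something along these lines (the step counts and the GPEI model choice are placeholders):

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient

gs = GenerationStrategy(
    steps=[
        # Quasi-random initialization for the first few trials.
        GenerationStep(model=Models.SOBOL, num_trials=5),
        # BoTorch-backed Gaussian process model for all remaining trials.
        GenerationStep(model=Models.GPEI, num_trials=-1),
    ]
)

ax_client = AxClient(generation_strategy=gs)
```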