alan-turing-institute / ThermodynamicAnalyticsToolkit

Sampling-based approach to analyse neural networks using TensorFlow
https://alan-turing-institute.github.io/ThermodynamicAnalyticsToolkit/
GNU General Public License v3.0

Can we use placeholders in input pipeline construction for max_steps? #49

Closed FrederikHeber closed 6 years ago

FrederikHeber commented 6 years ago

max_steps is changed frequently. For example, when first optimizing and then sampling, the only thing that changes about the dataset is this value.

We might achieve this using feedable iterators, i.e. the values for all the placeholders are fed once when the iterator is initialized (or reset).

This would also impact the simulation interface, see #20, as we would then have to recreate the input pipeline (which might force a re-parsing of files) in fewer cases. Further values such as batch_size and batch_data_files might also be replaceable, making the input pipeline even more flexible.
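A minimal sketch of the idea, assuming TF 1.x-style initializable iterators; the placeholder names and the text-file format are hypothetical stand-ins, not the toolkit's actual pipeline:

```python
import tensorflow as tf

# Hypothetical placeholders for pipeline parameters that change between
# optimization and sampling runs.
batch_data_files = tf.placeholder(tf.string, shape=[None], name="batch_data_files")
batch_size = tf.placeholder(tf.int64, shape=[], name="batch_size")

dataset = tf.data.TextLineDataset(batch_data_files).batch(batch_size)
iterator = dataset.make_initializable_iterator()
next_batch = iterator.get_next()

with tf.Session() as sess:
    # Placeholder values are fed once per (re-)initialization of the
    # iterator; new values can be supplied without rebuilding the graph.
    sess.run(iterator.initializer,
             feed_dict={batch_data_files: ["train.csv"], batch_size: 32})
    batch = sess.run(next_batch)
```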

FrederikHeber commented 6 years ago

No, this is not possible. TensorFlow raises the following error when calling self.dataset.repeat(ceil(max_steps*batch_size/dimension)) with a placeholder max_steps:

TypeError: a float is required
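For reference, a minimal sketch reproducing the failure (batch_size and dimension are hypothetical values standing in for the real pipeline parameters). Presumably math.ceil needs a concrete Python number, while the placeholder is only a symbolic Tensor at graph-construction time, so the call fails before repeat() is ever reached:

```python
import math
import tensorflow as tf

# Hypothetical constants standing in for the real pipeline parameters.
batch_size = 32
dimension = 1000

max_steps = tf.placeholder(tf.int64, shape=[], name="max_steps")

dataset = tf.data.Dataset.range(dimension)
# max_steps * batch_size / dimension is a symbolic Tensor, not a number,
# so math.ceil() raises "TypeError: a float is required" under Python 2.
dataset = dataset.repeat(math.ceil(max_steps * batch_size / dimension))
```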