In torch, in order to follow the Python implementation closely, we implemented iterators as R6 classes with .iter() and .next() methods.
A minimal example would be something like this:
library(coro)

Range <- R6::R6Class(
  classname = "Range",
  public = list(
    n = NULL,
    i = 0,
    initialize = function(n) {
      self$n <- n
    },
    # Return the next value, or signal exhaustion once we go past `n`.
    .next = function() {
      self$i <- self$i + 1
      if (self$i <= self$n)
        return(self$i)
      coro::exhausted()
    }
  )
)
# Wrap the R6 object's $.next() method into a coro iterator.
as_iterator.Range <- function(x) {
  force(x)
  coro::as_iterator(function() {
    x$.next()
  })
}
coro::iterate(for (i in as_iterator.Range(Range$new(3))) {
  print(i)
})
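iterate() steps the iterator until it returns coro::exhausted(), so this prints the values 1, 2 and 3.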
This works, but it would be nice to avoid the explicit as_iterator.Range call:
coro::iterate(for (i in Range$new(3)) {
  print(i)
})
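For this to work, iterate() (or as_iterator()) would presumably have to dispatch on the class of the object given in the for condition.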
The $.iter() method can be used to decouple different iteration strategies, or simply to avoid having to reinitialize the class. In torch, the context is something like this:
dl <- Dataloader$new(dataset)
for (i in 1:epochs) {
  iterate(for (batch in dl) {
    # train model
  })
}
The dl instance can be used for every epoch without recreating the object. This would be possible by abusing as_iterator and implementing a method for Dataloader instances that returns a fresh iterator each time.
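For illustration, here is a minimal sketch of that idea. The Dataloader class below is a hypothetical stand-in, not torch's actual class, and as_iterator.Dataloader is an assumed method name: $.iter() hands out fresh iteration state, and wrapping it means every pass over dl can start from the beginning.

# Hypothetical stand-in for torch's Dataloader, just to sketch the idea:
# $.iter() returns a fresh stateful iterator over the dataset.
Dataloader <- R6::R6Class(
  classname = "Dataloader",
  public = list(
    dataset = NULL,
    initialize = function(dataset) {
      self$dataset <- dataset
    },
    .iter = function() {
      i <- 0
      list(.next = function() {
        i <<- i + 1
        if (i <= length(self$dataset))
          return(self$dataset[[i]])
        coro::exhausted()
      })
    }
  )
)

# Every call asks the dataloader for a fresh iterator, so each epoch
# starts again from the first batch.
as_iterator.Dataloader <- function(x) {
  force(x)
  it <- x$.iter()
  coro::as_iterator(function() it$.next())
}

dl <- Dataloader$new(dataset = list("batch1", "batch2"))
for (epoch in 1:2) {
  coro::iterate(for (batch in as_iterator.Dataloader(dl)) {
    print(batch) # stands in for the training step
  })
}

With the explicit as_iterator.Dataloader() call this already works; getting the plain for (batch in dl) form would again require the dispatch discussed above.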
Here's torch's code: https://github.com/mlverse/torch/blob/master/R/utils-data-dataloader.R#L84