Ok, so now it's time to handle those time series that don't have an underlying array (or, more generally, storage). At least two types of time series fall into this category:
those driven by a generating process: these are infinite and inherently lazy. You don't compute more of the series until you need that part.
time series that arrive from a continuous stream: for example, a time series coming from a web-server log, or a stream of events from an Internet-of-Things device in your infrastructure. (Eventually you might save these in a database, but you want to operate on them first.)
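Both cases map naturally onto Python generators. As a minimal sketch of the first case (the random-walk process here is an illustrative assumption, not part of the spec), note that nothing is computed until you ask for it:

```python
import random

def random_walk(start=0.0):
    """An infinite, lazy generating process: yields one value per next()."""
    value = start
    while True:
        yield value
        value += random.normalvariate(0, 1)

# Nothing is computed until we pull values out:
walk = random_walk()
first_three = [next(walk) for _ in range(3)]
```

A stream source (the second case) looks the same from the consumer's side: each `next()` hands you the next element as it becomes available.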
To handle these cases, we'll add a single additional abstract (generator) method to the StreamTimeSeriesInterface:
def produce(self, chunk=1):
pass
which produces a chunk-sized batch of new elements of the time series each time it is called (document it).
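One way to declare this, assuming the interface is an `abc`-style abstract base class as in the earlier parts of the project (the exact base class in your codebase may differ), is:

```python
import abc

class StreamTimeSeriesInterface(abc.ABC):
    """Interface for time series with no underlying storage."""

    @abc.abstractmethod
    def produce(self, chunk=1):
        """Yield a chunk-sized batch of new elements of the time
        series each time it is called.
        """
```

Concrete subclasses such as `SimulatedTimeSeries` must override `produce`; instantiating the interface directly raises `TypeError`.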
Now create a SimulatedTimeSeries class whose constructor takes a generator as an argument, starts it, and perhaps primes it (or uses a boolean to keep track of whether priming has happened).
An example of such a generator would be make_data from HW6.
The subtlety here is that the produce method must handle data in chunks, so to implement this class properly, the instance variable in which the constructor saves the called generator can be used inside produce to advance by chunk nexts (i.e., by calling next on the stored generator chunk times).
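A minimal sketch of that idea follows. The class is kept standalone here so it runs on its own (in the project it would subclass StreamTimeSeriesInterface), and the make_data stand-in below is hypothetical, since the real make_data from HW6 is not reproduced in this spec:

```python
import itertools

class SimulatedTimeSeries:
    """Sketch only; in the project this would subclass
    StreamTimeSeriesInterface."""

    def __init__(self, generator):
        # Call the generator function once and save the resulting
        # generator object; produce() advances this single object.
        self._gen = generator()

    def produce(self, chunk=1):
        """Return the next `chunk` values by advancing the stored
        generator `chunk` nexts."""
        return list(itertools.islice(self._gen, chunk))

# Hypothetical stand-in for HW6's make_data:
def make_data():
    n = 0
    while True:
        yield n
        n += 1

ts = SimulatedTimeSeries(make_data)
print(ts.produce(3))  # first three values: [0, 1, 2]
print(ts.produce(2))  # the generator keeps its position: [3, 4]
```

Because the generator object lives on the instance, successive produce calls pick up exactly where the previous one left off.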
From Project Spec 5
Indeed, you could use make_data to test your code.