grahamcrowell opened this issue 7 years ago
```scala
trait Metric {
  def calculate(process: Process, dateId: Int): Double
}
```
A `Metric` basically takes a time series, performs some calculation, and returns the value of the metric for a given date.
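A minimal sketch of a concrete `Metric`, in plain Scala. The issue doesn't define `Process`, so its shape here (a label plus a `dateId -> value` series) and the example metric `DailyChange` are assumptions:

```scala
// Assumed shape of Process: a label plus a time series keyed by dateId.
case class Process(label: String, series: Map[Int, Double])

trait Metric {
  def calculate(process: Process, dateId: Int): Double
}

// Hypothetical example metric: day-over-day change in the series value.
object DailyChange extends Metric {
  def calculate(process: Process, dateId: Int): Double =
    process.series.getOrElse(dateId, 0.0) - process.series.getOrElse(dateId - 1, 0.0)
}
```

Usage: `DailyChange.calculate(Process("price", Map(1 -> 100.0, 2 -> 103.5)), 2)` returns the change between date 1 and date 2.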
Denormalize price data into a Spark `Dataset[State]`:
```scala
case class State(objectLabel: String, processLabel: String, dateId: Int, value: Double)
```
Start with strings; determine whether performance blocks progress.
Map symbol to asset.
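The denormalization and symbol-to-asset steps above could be sketched like this, using plain Scala collections to stand in for a Spark `Dataset`. `RawPrice`, the `"close_price"` label, and the lookup table contents are all assumptions for illustration:

```scala
// State as described in the issue (types are assumed); repeated here so
// the sketch is self-contained.
case class State(objectLabel: String, processLabel: String, dateId: Int, value: Double)

// Hypothetical raw price row keyed by ticker symbol.
case class RawPrice(symbol: String, dateId: Int, close: Double)

// "Map symbol to asset": a lookup from ticker symbol to asset label,
// starting with strings as suggested above.
val symbolToAsset: Map[String, String] = Map("MSFT" -> "Microsoft Corp")

// Denormalize raw rows into flat State records.
def denormalize(rows: Seq[RawPrice]): Seq[State] =
  rows.map { r =>
    State(symbolToAsset.getOrElse(r.symbol, r.symbol), "close_price", r.dateId, r.close)
  }
```

In Spark the same `map` would run over a `Dataset[RawPrice]` instead of a `Seq`.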
@grahamcrowell this is basically the same as Spark's machine learning pipeline. Spark ml lib pipeline docs
Set up AWS for Hadoop: http://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-gs-prerequisites.html
Analytical model:

- Subject
- Attribute
- State
- Event
- Process
- Time Granularity
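One way the six terms above could be encoded as Scala types. Only the names come from the list; every field and the granularity variants are assumptions:

```scala
// Hypothetical encoding of the analytical model's vocabulary.
case class Subject(label: String)                  // the thing being measured
case class Attribute(name: String)                 // a measurable property of a Subject
case class State(subject: Subject, attribute: Attribute, dateId: Int, value: Double)
case class Event(subject: Subject, dateId: Int, kind: String)
case class Process(subject: Subject, attribute: Attribute, states: Seq[State])

// Assumed granularity levels; the issue names only the concept.
sealed trait TimeGranularity
case object Daily extends TimeGranularity
case object Monthly extends TimeGranularity
```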