bertocast opened 8 years ago
@pegli any idea why this wouldn't be working?
My guess is that it is happening because the current version of ARIMA.autofit() is looking for a pyspark.mllib.linalg.Vector as the first parameter, not an array. The changes I made in PR #137 should fix this issue.
I tried to run the code above on my laptop (which has the code from the PR) and the call to autofit()
does not result in a JavaError. However, it does appear to hang when provided with [1,2,3,4,5]
as source data. When I increased the number of data points in the array, autofit()
returned almost immediately. Here's my test code:
import numpy as np
from sparkts.models.ARIMA import autofit
from pyspark.context import SparkContext
a = np.linspace(1, 5, 10) # array starting at 1, ending at 5, and containing 10 elements
model = autofit(a, sc=SparkContext('local', 'blabla'))
print(model.coefficients)
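For what it's worth, a quick check (plain numpy; variable names are mine) shows that both test inputs are perfectly linear series, so they differ only in length. That suggests the hang is triggered by the number of data points rather than by the shape of the data:

```python
import numpy as np

# Both inputs from this discussion have constant first differences,
# i.e. they are perfectly linear; only their lengths differ.
short = np.array([1, 2, 3, 4, 5], dtype=float)
longer = np.linspace(1, 5, 10)
print(np.diff(short).var())   # exactly 0.0
print(np.diff(longer).var())  # ~0.0 (up to floating-point noise)
```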
I'm getting this error when I try to create a new ARIMAModel.
I think it's the SparkContext's fault. Any thoughts?