Barathwaja opened this issue 1 year ago
Hello @Barathwaja,
When you initialize the class `TimeSeriesKMeans` with an `init` input parameter equal to an ndarray, this parameter is stored and is accessible via the `init` attribute (in your case, `model.init`).
When you use the `fit` method on a dataset, the `init` parameter is left unchanged. The k-means algorithm is initialized using the `init` ndarray.
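To make the distinction concrete, here is a minimal sketch in plain numpy (not tslearn itself — the class name `SketchTSKMeans` is hypothetical) of the behaviour described above: `init` is stored exactly as given, and `fit` writes its result to `cluster_centers_` without touching `init`:

```python
import numpy as np

class SketchTSKMeans:
    """Toy Euclidean k-means on time series, mimicking the init /
    cluster_centers_ split described above (not the tslearn class)."""

    def __init__(self, n_clusters, init, n_iter=10):
        self.n_clusters = n_clusters
        self.init = init          # stored unchanged, like model.init
        self.n_iter = n_iter

    def fit(self, X):
        # X has shape (n_ts, sz, d); centers start from the `init` ndarray
        centers = self.init.copy()
        for _ in range(self.n_iter):
            # assign each series to its nearest center (Euclidean distance)
            dists = ((X[:, None] - centers[None]) ** 2).sum(axis=(2, 3))
            labels = dists.argmin(axis=1)
            # move each center to the mean of its assigned series
            for k in range(self.n_clusters):
                if (labels == k).any():
                    centers[k] = X[labels == k].mean(axis=0)
        # final positions go to cluster_centers_, like model.cluster_centers_
        self.cluster_centers_ = centers
        self.labels_ = labels
        return self

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0.0, 0.1, (20, 5, 1)),
                    rng.normal(3.0, 0.1, (20, 5, 1))])
init = np.zeros((2, 5, 1))
init[1] += 3.0

model = SketchTSKMeans(n_clusters=2, init=init).fit(X)
print(np.array_equal(model.init, init))   # True: init is left unchanged
print(model.cluster_centers_.shape)       # (2, 5, 1)
```

After `fit`, `model.init` still holds the initial centers you passed in, while `model.cluster_centers_` holds the converged ones.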
Then, after running the k-means algorithm, the final positions of the cluster centers are stored in the `cluster_centers_` attribute. In your case, you can access the cluster centers via `model.cluster_centers_`.
If you want to predict the label of a new point, the `cluster_centers_` attribute will be used. If you want to fit your model on a new dataset, the `init` attribute will be used.
I am not sure I understand what you are trying to do. If you want to update your `init` parameter using the final positions of the cluster centers, you can use:

`model.init = model.cluster_centers_`
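This gives a warm-start pattern: fit once, feed the converged centers back in as the next `init`. Below is a hedged plain-numpy sketch of that idea (the function `kmeans_fit` is hypothetical, standing in for a fit call); after convergence, refitting from the previous result leaves the centers essentially unchanged:

```python
import numpy as np

def kmeans_fit(X, init, n_iter=10):
    """Toy Euclidean k-means; X: (n, sz, d), init: (k, sz, d).
    Returns the final cluster centers without modifying `init`."""
    centers = init.copy()
    for _ in range(n_iter):
        dists = ((X[:, None] - centers[None]) ** 2).sum(axis=(2, 3))
        labels = dists.argmin(axis=1)
        for k in range(len(centers)):
            if (labels == k).any():
                centers[k] = X[labels == k].mean(axis=0)
    return centers

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.0, 0.1, (15, 4, 1)),
                    rng.normal(5.0, 0.1, (15, 4, 1))])
init = np.stack([X[0], X[-1]])   # crude initial guess, one series per cluster

centers = kmeans_fit(X, init)
# Warm start: the converged centers become the init of the next fit,
# so a single extra pass leaves them (essentially) unchanged.
centers_again = kmeans_fit(X, centers, n_iter=1)
print(np.allclose(centers, centers_again))   # True
```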
If you want to control the value of the cluster centers, you can use:

`model.cluster_centers_ = cluster_centers`

where `cluster_centers` is an ndarray of shape `(n_clusters, sz, d)`.
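Once the centers are set this way, prediction reduces to a nearest-center lookup. A small sketch (the `predict` helper here is hypothetical, not the tslearn method) assuming Euclidean distance and hand-picked centers of shape `(n_clusters, sz, d)`:

```python
import numpy as np

def predict(cluster_centers, X_new):
    """Assign each series in X_new, shape (n, sz, d), to the index of its
    nearest center among cluster_centers, shape (n_clusters, sz, d)."""
    dists = ((X_new[:, None] - cluster_centers[None]) ** 2).sum(axis=(2, 3))
    return dists.argmin(axis=1)

# Hand-picked centers: n_clusters=2, sz=3, d=1
cluster_centers = np.array([[[0.0], [0.0], [0.0]],
                            [[4.0], [4.0], [4.0]]])
X_new = np.array([[[0.1], [-0.2], [0.0]],
                  [[3.9], [4.2], [4.1]]])
print(predict(cluster_centers, X_new))   # [0 1]
```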
I hope this helps!
Describe the bug Hi, I'm trying to set the cluster centers through the `init` argument, but after `fit` they are recomputed for that dataset. How can I tell whether the algorithm really starts from my initialization or not?
To Reproduce Code
Results