MicroStrategy / mstrio-py

Python integration for MicroStrategy
Apache License 2.0

ValueError when uploading a dataframe to a cube #169

Closed · magerdaniel closed this issue 1 month ago

magerdaniel commented 9 months ago

Hi,

this is the first time I am logging an issue here rather than with MicroStrategy tech support. I need to upload a dataframe with more than 100,000 rows and I am getting an error. As a next step, I tried to loop over the dataframe and upload multiple smaller chunks, but now I get an error even when uploading a smaller chunk directly. Please find a Jupyter Notebook with all the details in the attachment. I think the issue is somehow related to the index of the dataframe :-)

Have a great day,
Dan

MSTR_CUBE_CASE.zip
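
For illustration, here is a minimal sketch of the kind of manual chunking described above (the file name, chunk size, and variable names are placeholders, not taken from the attached notebook). It shows how slicing keeps the original row labels, which is where an index-related problem can creep in:

```python
import pandas as pd

# Hypothetical input file standing in for the attached data; > 100,000 rows.
df = pd.read_csv("large_data.csv")

chunk_size = 50_000
chunks = [df.iloc[i:i + chunk_size] for i in range(0, len(df), chunk_size)]

# Each slice keeps the row labels of the original frame, so the second chunk
# starts at index 50,000 rather than 0. Resetting the index gives every chunk
# a clean, zero-based RangeIndex before it is handed to the upload call.
chunks = [chunk.reset_index(drop=True) for chunk in chunks]
```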

mgorskiMicroStrategy commented 8 months ago

Hi @magerdaniel, you are right, the issue is related to the index of the dataframe. You are dividing the dataframe into chunks yourself; could you please try the built-in mstrio-py functionality for this instead? The ds.create() and ds.update() methods have a chunksize parameter, which defaults to 100 000. Please try setting it to, for example, 50 000 and upload the whole dataframe without dividing it beforehand. (chunksize (int, optional): number of rows to transmit to the server with each request.)
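
A minimal sketch of the suggested approach, assuming the Dataset class from mstrio.dataset and placeholder connection details, cube name, and table name; the whole dataframe is passed in one go and mstrio-py splits it into requests of chunksize rows internally:

```python
import pandas as pd
from mstrio.connection import Connection
from mstrio.dataset import Dataset

# Placeholder connection details.
conn = Connection(base_url="https://env.example.com/MicroStrategyLibrary/api",
                  username="user", password="password",
                  project_name="MyProject")

# The full, undivided dataframe.
df = pd.read_csv("large_data.csv")

ds = Dataset(connection=conn, name="MY_CUBE")
ds.add_table(name="MY_TABLE", data_frame=df, update_policy="add")

# Let mstrio-py split the upload: 50,000 rows are sent per request.
ds.create(chunksize=50_000)
```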

magerdaniel commented 8 months ago

Hi @mgorskiMicroStrategy,

thanks for your response. I think there are multiple issues around the upload. I tested the chunksize parameter, but it still fails. Please find enclosed a new notebook and a CSV file to reproduce the issue. It works fine if the chunksize is big enough to upload all the data at once; as soon as that is not the case, it breaks.

Have a great day,
Daniel

cube_data.zip
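
To make the reproduction concrete, a minimal sketch under the same assumptions as above (placeholder connection details; the CSV name is only a guess at the contents of cube_data.zip):

```python
import pandas as pd
from mstrio.connection import Connection
from mstrio.dataset import Dataset

conn = Connection(base_url="https://env.example.com/MicroStrategyLibrary/api",
                  username="user", password="password",
                  project_name="MyProject")

df = pd.read_csv("cube_data.csv")  # data extracted from cube_data.zip

# Works: chunksize exceeds the row count, so everything goes in one request.
ds_ok = Dataset(connection=conn, name="CUBE_SINGLE_REQUEST")
ds_ok.add_table(name="DATA", data_frame=df, update_policy="add")
ds_ok.create(chunksize=len(df) + 1)

# Breaks: this chunksize forces the upload to be split across several
# requests, which is the failure reported above.
ds_fail = Dataset(connection=conn, name="CUBE_CHUNKED")
ds_fail.add_table(name="DATA", data_frame=df, update_policy="add")
ds_fail.create(chunksize=1_000)
```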

mgorskiMicroStrategy commented 8 months ago

@magerdaniel thanks for the update. I have reproduced the issue now and need to take a closer look at it. I would appreciate it if you could also log this issue with MicroStrategy tech support and share the CS number here; it will help me prioritize this case.

Have a great day, Michał

apiotrowskiMicroStrategy commented 1 month ago

Issue resolved. Closing