fboylu opened this issue 8 years ago
We need to switch to a chunked upload to handle large payloads. See the Python SDK change for reference: https://github.com/Azure/Azure-MachineLearning-ClientLibrary-Python/commit/4204e2b2f1540e6d9f936b19457048c5afdca860
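For anyone picking this up, here is a minimal sketch of the idea: instead of sending the whole serialized dataset in one request (which trips the server's maximum request length), the client sends it in fixed-size pieces. The endpoint URL, headers, and chunk size below are placeholders for illustration only; the actual AzureML upload API and block protocol should be taken from the Python SDK commit linked above, not from this sketch.

import requests

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per request; placeholder value

def upload_in_chunks(url, payload_bytes, headers=None):
    """Send payload_bytes to url in fixed-size chunks so that no single
    request exceeds the service's maximum request length.
    The url and headers are hypothetical stand-ins for the real API."""
    headers = dict(headers or {})
    total = len(payload_bytes)
    for offset in range(0, total, CHUNK_SIZE):
        chunk = payload_bytes[offset:offset + CHUNK_SIZE]
        # Content-Range tells the service where this chunk belongs in the payload.
        headers["Content-Range"] = f"bytes {offset}-{offset + len(chunk) - 1}/{total}"
        resp = requests.put(url, data=chunk, headers=headers)
        resp.raise_for_status()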
Any ETA available on a resolution to this issue?
Pull requests are welcome. I don't think anybody is actively working on this issue.
Ok
There are no contributor guidelines or anything like that on this repo. Before PRs can be accepted, does the code need to go through a Microsoft compliance review, or is it fine as-is?
While using upload.dataset() on a large data frame, the following error is returned:
Error: AzureML returns error code: HTTP status code : 500 Maximum request length exceeded. Traceback:
I was able to upload a smaller subset of the data frame without issues.
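Until a chunked upload is implemented in the client, one possible stop-gap is to split the data frame into smaller pieces and upload each piece as its own dataset. A rough Python/pandas sketch is below; upload_dataset is a hypothetical callable standing in for whatever upload function your client exposes, and rows_per_part must be tuned so each part stays under the request limit.

import pandas as pd

def upload_in_parts(df, base_name, upload_dataset, rows_per_part=100_000):
    """Split df into row blocks and upload each block as a separately
    named dataset. upload_dataset(name, frame) is a placeholder for the
    real client call; rows_per_part is an assumed, tunable size."""
    for i, start in enumerate(range(0, len(df), rows_per_part)):
        part = df.iloc[start:start + rows_per_part]
        upload_dataset(f"{base_name}_part{i}", part)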