Hi, I am trying to upload a UTF-8 encoded string to Cloud Storage and I am getting the following error:
Exception: Failed to process HTTP response.
The code is the following:
import datalab.storage as gcs
import pandas as pd
from io import StringIO  # Python 3; on Python 2: from StringIO import StringIO

items = gcs.Bucket('astrologer-2').items(prefix=file_prefix)
for item in items:
    if not item.key.endswith('/') and item.exists():
        data = StringIO(item.read_from())
        dataFrame = pd.read_csv(data, low_memory=False, sep=',', encoding='utf-8')
        df_string = dataFrame.to_csv(index=False, encoding='utf-8')
        print(df_string)
        response = item.write_to(df_string, 'text/csv')
The error is raised on the line item.write_to(df_string, 'text/csv').
And the content of the file is:
Nombre,Apellido
Lluís,Gassó
Test,Testson
最高,サートした
I tried the content types 'text/plain', 'text/plain;encoding=UTF-8', 'text/csv;encoding=UTF-8' and 'application/octet-stream', and none of them worked.
Uploading a string without special characters works just fine, but as soon as non-ASCII characters are present the error is raised.
I also tried the google.datalab.storage module and got exactly the same error.
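For what it's worth, the pandas round-trip itself handles the non-ASCII characters fine when run locally, so the failure seems limited to the upload step. A minimal local sketch, assuming only pandas (the .encode('utf-8') at the end is a workaround I am considering, on the assumption that write_to mishandles a unicode payload):

    # -*- coding: utf-8 -*-
    import pandas as pd
    from io import StringIO

    # Sample data matching the file contents above
    csv_text = u"Nombre,Apellido\nLluís,Gassó\nTest,Testson\n最高,サートした\n"

    # Round-trip through pandas exactly as in the question
    df = pd.read_csv(StringIO(csv_text), sep=',')
    df_string = df.to_csv(index=False)

    # Encoding to UTF-8 bytes before upload is the workaround to try
    # (assumption: write_to may fail on a non-ASCII unicode string)
    payload = df_string.encode('utf-8')
    print(payload.decode('utf-8'))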