DeepLcom / deepl-php

Official PHP library for the DeepL language translation API.
MIT License

Glossary Japanese symbols issue? #26

Closed ivdimova closed 1 year ago

ivdimova commented 1 year ago

I am trying to import a glossary with Japanese symbols, but the request keeps failing. What is the right way to fix this?

JanEbbing commented 1 year ago

Hi @ivdimova , could you please share your code and the error you receive? I can create EN/JP and JP/EN glossaries without any issue:

    $t = new \DeepL\Translator('MY_AUTH_KEY');
    $gi = $t->createGlossary(
        'my test glossary', 
        \DeepL\LanguageCode::ENGLISH_BRITISH,
        \DeepL\LanguageCode::JAPANESE, 
        \DeepL\GlossaryEntries::fromEntries(['companion' => '旧友', 'friend' => '青二才'])
    );
    var_dump($gi);
    $t = new \DeepL\Translator('MY_AUTH_KEY');
    $gi = $t->createGlossary(
        'my test glossary',
        \DeepL\LanguageCode::JAPANESE,
        \DeepL\LanguageCode::ENGLISH_BRITISH,
        \DeepL\GlossaryEntries::fromEntries(['旧友' => 'companion', '青二才' => 'friend'])
    );
    var_dump($gi);
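For completeness, a glossary created this way can then be passed to a translation call. A minimal sketch, assuming a valid auth key (so it is not runnable as-is) and the `TranslateTextOptions::GLOSSARY` option of deepl-php; glossary translations also require the source language to be given explicitly:

```php
<?php
require 'vendor/autoload.php';

$translator = new \DeepL\Translator('MY_AUTH_KEY');

$glossary = $translator->createGlossary(
    'my test glossary',
    \DeepL\LanguageCode::ENGLISH_BRITISH,
    \DeepL\LanguageCode::JAPANESE,
    \DeepL\GlossaryEntries::fromEntries(['friend' => '青二才'])
);

// Use the glossary when translating; the source language must be set.
$result = $translator->translateText(
    'My friend arrived.',
    \DeepL\LanguageCode::ENGLISH,
    \DeepL\LanguageCode::JAPANESE,
    [\DeepL\TranslateTextOptions::GLOSSARY => $glossary]
);
echo $result->text;
```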
ivdimova commented 1 year ago

`DeepL\DeepLException: Term "" contains no non-whitespace characters` is what I am getting, and only for Japanese. There are no whitespaces in the terms, and all other languages work. I've tried trimming all whitespace, but I still get the same error. I can't see anything wrong with the terms. @JanEbbing

JanEbbing commented 1 year ago

Hi @ivdimova , the error you get means that some of the glossary entries were empty. If I had to guess, it is an encoding-related issue (the Japanese characters are not saved correctly because the file uses an incompatible encoding such as ASCII). Can you run `file <yoursourcefile.php>` in the shell? For my example file it says `PHP script text, Unicode text, UTF-8 text`; yours should also say UTF-8. If your file is in a different encoding, convert it to UTF-8 and try again.

(If you are running the script in a terminal, also make sure the terminal charset supports Japanese characters.)
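To make the failure mode concrete: if the terms reach the library in a non-UTF-8 encoding, their bytes are not valid UTF-8 and the API ends up seeing empty terms. A small pre-flight check along these lines (not part of deepl-php; the function name is hypothetical, and the `mbstring` extension is assumed) can catch this before calling `createGlossary()`:

```php
<?php
// Hypothetical helper: report glossary terms that are not valid UTF-8
// or that contain no non-whitespace characters.
function findBadGlossaryTerms(array $entries): array
{
    $bad = [];
    foreach ($entries as $source => $target) {
        foreach ([$source, $target] as $term) {
            if (!mb_check_encoding($term, 'UTF-8')) {
                // Invalid byte sequence, e.g. a file saved in Shift_JIS.
                $bad[] = 'not UTF-8: 0x' . bin2hex($term);
            } elseif (trim($term) === '') {
                $bad[] = 'term contains no non-whitespace characters';
            }
        }
    }
    return $bad;
}

// Valid UTF-8 entries pass the check:
var_dump(findBadGlossaryTerms(['旧友' => 'companion']));

// The same Japanese term re-encoded as Shift_JIS is flagged,
// because its bytes are not valid UTF-8:
$sjis = mb_convert_encoding('旧友', 'SJIS', 'UTF-8');
var_dump(findBadGlossaryTerms([$sjis => 'companion']));
```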

ivdimova commented 1 year ago

Thanks, I found the issue!