plata closed this issue 7 years ago.
Another possibility would be to call tr directly:
Packages.org.phoenicis.configuration.localisation.Localisation.tr("translation test.")
This works, but I'm not sure how we can get the text into our .pot and .po files (especially for user repositories).
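One way to get such strings into a .pot file would be to scan the script sources for tr(...) calls. A minimal sketch of that extraction step (the function name and regex are my assumptions, not Phoenicis code, and it only handles simple unescaped string literals):

```javascript
// Hypothetical extractor: collect the string literals passed to tr("...")
// or tr('...') in a script's source, so they can be appended to a .pot file.
function extractTrStrings(source) {
  const strings = [];
  // \btr\( matches bare tr(...) as well as Localisation.tr(...);
  // the backreference \1 keeps quote styles matched.
  const re = /\btr\(\s*(['"])((?:(?!\1).)*)\1\s*\)/g;
  let m;
  while ((m = re.exec(source)) !== null) {
    strings.push(m[2]);
  }
  return strings;
}

const script = `Localisation.tr("translation test.");\nprint(tr('hello'));`;
console.log(extractTrStrings(script)); // → ["translation test.", "hello"]
```

A real extraction would also need to handle escaped quotes, concatenation, and plural calls, which is roughly what xgettext does for supported languages.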
Calling tr directly makes more sense. Analyzing the scripts is a separate problem that we can handle quite easily.
If we do that, all strings have to be in the properties file that is generated with mvn gettext:dist. I don't see how that would work for user repositories.
The .pot files don't necessarily have to be generated by Maven. We could imagine another process that generates all the translations from all our git repositories and uploads them to a translation platform. That's how it is done in POLv4.
Sure, but that's still not enough. For example, I could provide my own scripts in a separate repository; in that case, I would have no way to translate them.
You could add your translations to the main repository; that should work fine.
Doesn't that defeat the whole point of custom repositories? I might not even want my strings to be publicly available (e.g. in a company environment). With a solution that loads a JSON file with the translations at runtime, I could provide my own translation file for my repository.
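To make the idea concrete, here is a minimal sketch of such a runtime lookup, assuming each repository ships its own translations JSON (the file layout, function name, and sample data are hypothetical):

```javascript
// Hypothetical per-repository translator: build a tr function from a
// translations object loaded at runtime (e.g. from <repo>/translations/de.json),
// falling back to the original string when no translation exists.
function makeTranslator(repoTranslations) {
  return (text) => repoTranslations[text] ?? text;
}

// Sample data standing in for a repository's de.json
const de = { "translation test.": "Übersetzungstest." };
const tr = makeTranslator(de);

console.log(tr("translation test.")); // → "Übersetzungstest."
console.log(tr("untranslated"));      // → "untranslated" (falls back to the key)
```

The fallback matters: a private repository without translations would still display its original strings instead of failing.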
Correct
Then we should also allow including .po files in a repository.
Yes.
The problem with the localeplanet solution is that it seems to do only a simple mapping, so no plural forms etc.
I managed to get i18next running after changing i18next.js a little, so I guess we could use that.
However, there's still another problem: how do we translate the .json files?
I opened a PR (PlayOnLinux/POL-POM-5#870) to solve the .json translation.
Fixed.
We could use http://www.localeplanet.com/ and https://docs.transifex.com/formats/json.