weakish closed this issue 7 years ago.
Hi @weakish,
Thanks for offering to translate the blog post to Chinese.
In the past, another one of my blog posts has been translated to Japanese and Chinese. This is the one:
original: http://blog.otoro.net/2015/05/07/creatures-avoiding-planks/ translation: http://www.wukai.me/2016/03/01/creatures-avoiding-planks/
I'm certainly okay for you to translate it, but can you include links and attributions like the translation link above?
I would also prefer if you publish the translation on a public website (and link to that website from social media), since I want to link my blog post to your translated version as well, for Chinese speakers who read my blog. Perhaps you can just use GitHub Pages if you need to host it somewhere. Would this be possible?
Thanks
I'm certainly okay for you to translate it
Glad to hear that.
can you include links and attributions like the translation link above?
Of course.
I would also prefer if you translate it as a public website
Actually there is a public website. (I said "under construction" because there will be a redesign of the site next month.) I will post the translation on that site and post the URL here. The URLs shouldn't change after the redesign; if they do, I will make sure there is a 301 redirect to the new URL.
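For what it's worth, a permanent redirect is easy to sketch. This is just a toy illustration in plain Python's standard library (the real site would of course use its own web server's redirect rules), and the paths in `REDIRECTS` are made-up placeholders, not the actual URLs:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old-to-new URL mapping; the real paths would come
# from the site redesign.
REDIRECTS = {
    "/old-translation-url": "/new-translation-url",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target is None:
            self.send_error(404, "Not Found")
            return
        # 301 (Moved Permanently) tells browsers and search crawlers
        # to update their stored links to the new location.
        self.send_response(301)
        self.send_header("Location", target)
        self.end_headers()
```

Serving it is just `HTTPServer(("", 8000), RedirectHandler).serve_forever()`; in practice the hosting web server's own redirect configuration would do the same job.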
Thanks and happy coding!
Great, keep me updated later 👍
Translated Recurrent Net Dreams Up Fake Chinese Characters in Vector Format with TensorFlow (http://blog.otoro.net/2015/12/28/recurrent-net-dreams-up-fake-chinese-characters-in-vector-format-with-tensorflow/) to Chinese.
The link is https://www.jqr.com/news/008561
The link and attribution are at the bottom of the translated text, and Otoro Studio is linked at the beginning of the translated text.
I omitted some content in the Chinese translation, mainly the sketch-rnn setup section, since the README said that the project is deprecated. Some facts about characters that are well known to Chinese speakers are also omitted.
Unfortunately the translated text is in Simplified Chinese. Personally I think Traditional Chinese looks better than Simplified Chinese (devices with small font sizes are an exception). But most readers in mainland China are more familiar with Simplified Chinese, so Simplified Chinese is used.
BTW, personally I cannot see the connection between Simplified Chinese and Newspeak. But Simplified Chinese does have some nasty bugs, like (余, 餘), (后, 後), (适, 適), etc.
If you have any issue or question, please kindly let me know.
And thanks for your inspiring introduction to this interesting topic.
Thanks for letting me know!
I’ll link the main article to your translation next time I update the blog.
I might have been a bit harsh on simplified Chinese to be honest ...
But it’s fun to point out the design flaws :-)
Like 愛 without a 心
Just translated Neural Slime Volleyball into Chinese.
The url is https://www.jqr.com/news/008677
I might have been a bit harsh on simplified Chinese
Maybe not. I reconsidered your opinion and recalled my struggle when I started reading ancient Chinese books. One of the main hassles was being unable to recognise Traditional Chinese characters. It was long ago, so I had almost forgotten it. From this point of view, Simplified Chinese does have some sense of Newspeak, as a hindrance to accessing publications from before the establishment of the PRC. But I am not sure whether that is just a side effect or it was on purpose.
As for design flaws, I think that although "爱" without "心" breaks the traditional structure, it is still acceptable, since "友" (friendship) is somewhat related in meaning. But there are a lot of nonsensical simplifications, for example the pattern "又": "鄧" has been simplified to "邓" (breaking the phonetic relation between "登" and "鄧"), and "聖" has been simplified to "圣" (breaking the semantic relation), making "邓" and "圣" harder to learn and memorise. There are dozens of Chinese characters simplified with the pattern "又", and there are several other patterns like it.
Thanks for the translation; this article is a bit old. I updated the Kanji article to link to your translation, btw.
I was taught both Simplified and Traditional as a kid. I think my brain is somehow wired to recognise Traditional characters quicker, perhaps because there's more information and more "redundancy" in the way they are presented, from an information perspective.
Just finished a translation of Recurrent Neural Network Tutorial for Artists (http://blog.otoro.net/2017/01/01/recurrent-neural-network-artist/).
The url is https://www.jqr.com/news/008900
P.S. I am neither an artist nor a designer (though one of my favourite books is "How to Design Programs"), but I really enjoyed this simple, clear, and inspiring demonstration of how to model handwriting.
The JavaScript code doesn’t seem to work on your site :)
The JavaScript code doesn’t seem to work
Unfortunately the markdown editor used on jqr.com does not allow JavaScript. :-(
And the Chinese translation for the neurogram blog post:
https://www.jqr.com/news/008965
P.S. This web app is really cool. To be honest, the slime volleyball app is also very cool, but its too-hard-to-beat AI player made me feel hopeless very quickly. With this one, though, I can evolve really interesting pictures. It will be another time killer on my mobile device. 💓
Excuse me, I forgot to mention that there seem to be some typos in the blog post:
...to work on my NEAT implementation in javascript
Javascript (not really a typo, just to be consistent with the spelling in the later part of the article)
CPPN’s are used to generate patterns
CPPNs
A Backpropable versiion of NEAT might be very interesting
version
Thanks
Have corrected them.
I finished translating Generating Abstract Patterns with TensorFlow (http://blog.otoro.net/2016/03/25/generating-abstract-patterns-with-tensorflow/) yesterday:
https://www.jqr.com/news/009203
Possible typos encountered during translation:
and contain 32 activations at each neural net layer
contains
Let’s generate another random image using the in the same latent space
I guess you meant either "using the same latent space" or "in the same latent space".
Thanks for the correction. I’ve fixed the errors now, appreciate it.
Hi, I translated Generating Large Images from Latent Vectors Part One and Part Two:
https://www.jqr.com/news/009358 https://www.jqr.com/news/009391
Some possible typos in Part One:
z_1_before = sampler.encode(sampler.get_random_specific_mnist(1))
sampler.show_image_from_z(z_1_before)
z_1_after = sampler.encode(sampler.get_random_specific_mnist(1))
sampler.show_image_from_z(z_1_before)
z_1_before on the last line should be z_1_after instead.
z_5_before = sampler.encode(sampler.get_random_specific_mnist(5))
sampler.show_image_from_z(z_5_before)
z_5_after = sampler.encode(sampler.get_random_specific_mnist(5))
sampler.show_image_from_z(z_5_before)
z_5_before on the last line should be z_5_after instead.
BTW, IDEs may not detect these typos because z_{1, 5}_after is used in later code. However, a paper submitted to ICLR 2018 from MS Research, representing programs as graphs, seems to be able to detect this kind of typo (at least for a statically typed language).
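As a much cruder sketch than the graph-based model in that paper, this particular copy-paste pattern can be caught by a tiny heuristic: flag an expression statement that repeats the previous expression statement verbatim even though a fresh assignment sits between them. The `sampler` names below are just the ones from the snippet above; the code only parses the source, so nothing needs to be importable:

```python
import ast

def copy_paste_suspects(source):
    """Flag a statement that textually repeats the previous
    expression statement even though a new assignment intervened --
    the pattern behind the z_1_before typo above."""
    tree = ast.parse(source)
    findings = []
    prev_expr = None        # ast.dump of the last expression statement
    assigned_between = False
    for stmt in tree.body:
        if isinstance(stmt, ast.Assign):
            assigned_between = True
        elif isinstance(stmt, ast.Expr):
            key = ast.dump(stmt.value)
            if key == prev_expr and assigned_between:
                findings.append(stmt.lineno)
            prev_expr = key
            assigned_between = False
    return findings

buggy = """
z_1_before = sampler.encode(sampler.get_random_specific_mnist(1))
sampler.show_image_from_z(z_1_before)
z_1_after = sampler.encode(sampler.get_random_specific_mnist(1))
sampler.show_image_from_z(z_1_before)
"""
print(copy_paste_suspects(buggy))  # → [5], the second show call
```

Of course this only looks at top-level statements and exact repetition, so it is nowhere near what the ICLR paper does; it just shows why the bug is mechanical enough to be detectable.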
Some possible typos in Part Two:
resovoir computing
reservoir
Current Model - 24 Epochs:
The video below seems to be missing?
Just translated Neural Network Evolution Playground with Backprop NEAT to Chinese.
I enjoyed translating this gentle introduction to genetic algorithms and NEAT. The design choices of the interactive demo are well explained in the article. Awesome!
A possible typo:
Imagine if we have a set of 100 random weights for a neural network
I guess you meant "100 sets of random weights"?
Thanks! I have fixed the typos, will push them out in a bit.
I translated The Frog of CIFAR 10 and Hyper Networks to Chinese; the translations are published at https://www.jqr.com/article/000013 and https://www.jqr.com/article/000037 respectively.
Possible typos in Hyper Networks:
the weights a recurrent network
the weights of a recurrent network
Thanks for the catch! btw, you spelled "hardmu" in one of the posts.
you spelled "hardmu"
Sorry. Fixed now.
Finished an (abridged) translation of "World Models": https://www.jqr.com/article/000124
Enjoyed reading and translating this paper. Awesome and inspiring.
Hi, thanks for the translation!
For this article, since it is not just me (it is joint work with my coauthor), I request that you put the original URL, the URL of the arXiv PDF, and also the citation information at the top of your article, before the article content, if that's fine.
Thanks!
Basically:
- Original article URL: worldmodels.github.io
- arXiv URL (you have a link to it in your 1st paragraph, but I would like the URL to be displayed in its text, like arxiv.org/abs/....)
- The BibTeX citation is in the original article, in the appendix.
Hi! I discussed with my coauthor; it is okay not to include the BibTeX (just the other 2 URLs). Thanks-
Sorry for the late reply (haven't checked GitHub during weekend).
I made the following edits:
- Mention Schmidhuber in the translator's note (sorry for forgetting to mention him)
- Add the worldmodels.github.io and arxiv.org links at the beginning (before the first section, "Introduction")
- Add the BibTeX citation at the end of the translated text
The online version is updated:
https://www.jqr.com/article/000124
If you see anything improper, or any room for improvements, please kindly let me know.
Thanks.
Thanks a lot, as always!
Hi.
A lot of the articles on your blog are very comprehensible, with cute images. I'd like to translate them to Chinese, making them more approachable to Chinese ML developers.
I am currently working for a Chinese new-media company covering AI and machine learning. I would like to know: is it OK to publish the translation on the media/company's WeChat public account, and on the website later in the future (the website is under construction)? And what is the license of your blog posts?
And if you have any specific requirements, or questions, please let me know.
Thanks.
P.S. The WeChat account name is 论智, which basically means "on intelligence" in Chinese.