weinman / cnn_lstm_ctc_ocr

Tensorflow-based CNN+LSTM trained with CTC-loss for OCR
GNU General Public License v3.0
497 stars · 170 forks

Irrelevancy #20

Closed by ghost 6 years ago

ghost commented 6 years ago

Training fixed models is a vast waste of your time and, especially, your students' time. This focus on narrow AI only sidesteps our goal of developing human-level AI. Narrow AI work from a person or group with a fairly deep (or deeper) understanding of AI can be justified for only two reasons. One is a product that is needed fairly quickly before human-level AI ultimately arrives, a product the world would greatly suffer without until strong AI finally gets here. Among a special group of others, I place Tesla's AI vision wing in this category: its goal is part of a far greater picture, "necessary" if an explanation is needed. The other reason is if the narrow work is opening up new territory, such as potent models different from CNNs or RNNs, or pushing those models into new territory.

This project, among others, is the social equivalent of globalization. Yes, it might help a bit, but what it does more is waste talent on things that will eventually be replaced. It is building a far superior sail while knowing the world is on the verge of the steam engine, which will revolutionize the field.

Great courage is required to make great strides. That courage is diving into something when you don't even know what you are searching for. That courage is knowing very well that a life's work might come to nothing. That courage is selflessly taking an impossible chance at a breakthrough over a few publications in your name. Lead your students into territories that even you do not know comfortably. Be their leader on routes you feel no one is exploring, perhaps even against your own beliefs. Grow, like the AI you are building, after reading this, if you read it entirely. Be the building block.

melki commented 6 years ago

:beer: