gunthercox / ChatterBot

ChatterBot is a machine learning, conversational dialog engine for creating chat bots
https://chatterbot.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Problem installing chatterbot #399

Closed: nnt0 closed this issue 7 years ago

nnt0 commented 7 years ago

So. I have a problem.

I'll begin from the start.

I started with installing chatterbot using:

pip3 install chatterbot

which worked well and installed this:

chatterbot in /usr/local/lib/python3.5/dist-packages
fuzzywuzzy<0.13,>=0.12 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
nltk<4.0.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
pymongo<4.0.0,>=3.3.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
textblob<0.12.0,>=0.11.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
python-twitter>=3.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
jsondatabase>=0.1.1 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
requests in /usr/lib/python3/dist-packages (from python-twitter>=3.0->chatterbot)
future in /usr/local/lib/python3.5/dist-packages (from python-twitter>=3.0->chatterbot)
requests-oauthlib in /usr/local/lib/python3.5/dist-packages (from python-twitter>=3.0->chatterbot)
oauthlib>=0.6.2 in /usr/lib/python3/dist-packages (from requests-oauthlib->python-twitter>=3.0->chatterbot)

But if I check the installed version with python -m chatterbot --version, it tells me this:

/usr/bin/python: No module named chatterbot

So I decided to check the path with cd /usr/bin/python, but then:

bash: cd: /usr/bin/python: Not a directory

Just to make sure that chatterbot isn't installed, I copied the example from the documentation, made it executable with chmod +x chatbot.py, and started it with ./chatbot.py. But then I got this (translated from German):

from: can't read /var/mail/chatterbot
import: unable to open X server ' @ error/import.c/ImportImageCommand/364.
./chatbot.py: line 9: syntax error near unexpected token `('
./chatbot.py: line 9: `bot = ChatBot("Terminal",'

First line: I checked cd /var/mail/chatterbot, but got bash: cd: root: Not a directory. Why should it make a folder there? Second and third lines: I don't understand. Why does it have a problem with that?
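Those errors are typical of bash trying to run a Python file as a shell script, which is what happens when ./chatbot.py is executed and its first line is not a Python shebang: the shell hands `from` to the mail reader and `import` to ImageMagick, then chokes on the Python syntax at line 9. Assuming that is what happened here, the top of chatbot.py would need to look roughly like this for ./chatbot.py to work (the ChatBot arguments below are only a sketch based on the traceback, not the exact documentation example):

#!/usr/bin/env python3
# Shebang so that ./chatbot.py is executed by Python 3 instead of the shell.
from chatterbot import ChatBot

# Minimal, hypothetical construction; the real documentation example
# passes more arguments (adapters, a database path, and so on).
bot = ChatBot("Terminal")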

But I don't give up yet!

I did python3 chatbot.py which gave this response.

/usr/local/lib/python3.5/dist-packages/fuzzywuzzy/fuzz.py:35: UserWarning: Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning
  warnings.warn('Using slow pure-python SequenceMatcher. Install python-Levenshtein to remove this warning')
/usr/local/lib/python3.5/dist-packages/nltk/decorators.py:59: DeprecationWarning: inspect.getargspec() is deprecated, use inspect.signature() instead
  regargs, varargs, varkwargs, defaults = inspect.getargspec(func)
/usr/local/lib/python3.5/dist-packages/chatterbot/adapters/storage/jsonfile.py:19: UnsuitableForProductionWarning: The JsonFileStorageAdapter is not recommended for production application environments.
  self.UnsuitableForProductionWarning


Resource 'tokenizers/punkt/PY3/english.pickle' not found. Please use the NLTK Downloader to obtain the resource: >>> nltk.download() Searched in:

  • '/home/[don't want to tell my username]/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/local/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/local/lib/nltk_data'
  • ''

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/textblob/decorators.py", line 35, in decorated
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/textblob/tokenizers.py", line 57, in tokenize
    return nltk.tokenize.sent_tokenize(text)
  File "/usr/local/lib/python3.5/dist-packages/nltk/tokenize/__init__.py", line 90, in sent_tokenize
    tokenizer = load('tokenizers/punkt/{0}.pickle'.format(language))
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 801, in load
    opened_resource = _open(resource_url)
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 919, in _open
    return find(path_, path + ['']).open()
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 641, in find
    raise LookupError(resource_not_found)
LookupError:


    Resource 'tokenizers/punkt/PY3/english.pickle' not found. Please use the NLTK Downloader to obtain the resource: >>> nltk.download() Searched in:

  • '/home/[don't want to tell my username]/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/local/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/local/lib/nltk_data'
  • ''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "chatbot.py", line 18, in <module>
    database="../database.db"
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/chatterbot.py", line 61, in __init__
    self.add_adapter(adapter, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/chatterbot.py", line 81, in add_adapter
    adapter = NewAdapter(**kwargs)
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/adapters/logic/time_adapter.py", line 26, in __init__
    self.classifier = NaiveBayesClassifier(training_data)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 192, in __init__
    self.train_features = [(self.extract_features(d), c) for d, c in self.train_set]
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 192, in <listcomp>
    self.train_features = [(self.extract_features(d), c) for d, c in self.train_set]
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 169, in extract_features
    return self.feature_extractor(text, self.train_set)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 81, in basic_extractor
    word_features = _get_words_from_dataset(train_set)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 63, in _get_words_from_dataset
    return set(all_words)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 62, in <genexpr>
    all_words = chain.from_iterable(tokenize(words) for words, _ in dataset)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 59, in tokenize
    return word_tokenize(words, include_punc=False)
  File "/usr/local/lib/python3.5/dist-packages/textblob/tokenizers.py", line 72, in word_tokenize
    for sentence in sent_tokenize(text))
  File "/usr/local/lib/python3.5/dist-packages/textblob/base.py", line 64, in itokenize
    return (t for t in self.tokenize(text, *args, **kwargs))
  File "/usr/local/lib/python3.5/dist-packages/textblob/decorators.py", line 38, in decorated
    raise MissingCorpusError()
textblob.exceptions.MissingCorpusError:
Looks like you are missing some required data for this feature.

To download the necessary data, simply run

python -m textblob.download_corpora

or use the NLTK downloader to download the missing data: http://nltk.org/data.html If this doesn't fix the problem, file an issue at https://github.com/sloria/TextBlob/issues.

(Marking it as code didn't work. I don't know why.)

And for Python 2, python chatbot.py told me this:

Traceback (most recent call last):
  File "chatbot.py", line 1, in <module>
    from chatterbot import ChatBot
ImportError: No module named chatterbot

Aha!

Okay. For the first time I was told plainly that the chatterbot module isn't there.

Let's try to install it with git:

git clone https://github.com/gunthercox/ChatterBot.git

Output:

Cloning into 'ChatterBot'...
remote: Counting objects: 5401, done.
remote: Total 5401 (delta 0), reused 0 (delta 0), pack-reused 5400
Receiving objects: 100% (5401/5401), 2.73 MiB | 47.00 KiB/s, done.
Resolving deltas: 100% (3433/3433), done.
Checking connectivity... done.

(The original output was in German.)

Installing with pip install ./ChatterBot. Okay, installed.

Let's try again.

python3 chatbot.py: same response.
python chatbot.py: same response.
./chatbot.py: same response.

I don't know what I should do now. Please help me.

Thank you.

gunthercox commented 7 years ago

Sure, let's pick just one version of Python to use and go from there. Which version of Python would you prefer to get chatterbot installed under?

nnt0 commented 7 years ago

I would prefer Python 3.

gunthercox commented 7 years ago

Ok, so bear with me if you have already done this. Can you tell me the output if you enter the following:

pip3 install chatterbot --upgrade
python3 -m chatterbot --version
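As an aside, one way to guarantee that pip installs into the same interpreter that will later run chatbot.py is to invoke pip as a module; a roughly equivalent sketch, assuming the pip module is importable by this python3:

python3 -m pip install chatterbot --upgrade   # installs into the interpreter that runs chatbot.py
python3 -m chatterbot --version               # should then report the installed version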
nnt0 commented 7 years ago

Of course I can.

pip3 install chatterbot --upgrade

Output:

The directory '/home/[don't want to tell my username]/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
The directory '/home/[don't want to tell my username]/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Collecting chatterbot
  Downloading ChatterBot-0.4.12-py2.py3-none-any.whl (100kB)
    100% |████████████████████████████████| 102kB 1.3MB/s
Requirement already up-to-date: textblob<0.12.0,>=0.11.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Requirement already up-to-date: nltk<4.0.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Requirement already up-to-date: pymongo<4.0.0,>=3.3.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Requirement already up-to-date: fuzzywuzzy<0.13,>=0.12 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Requirement already up-to-date: jsondatabase>=0.1.1 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Requirement already up-to-date: python-twitter>=3.0 in /usr/local/lib/python3.5/dist-packages (from chatterbot)
Collecting requests (from python-twitter>=3.0->chatterbot)
  Downloading requests-2.11.1-py2.py3-none-any.whl (514kB)
    100% |████████████████████████████████| 522kB 1.4MB/s
Requirement already up-to-date: future in /usr/local/lib/python3.5/dist-packages (from python-twitter>=3.0->chatterbot)
Requirement already up-to-date: requests-oauthlib in /usr/local/lib/python3.5/dist-packages (from python-twitter>=3.0->chatterbot)
Collecting oauthlib>=0.6.2 (from requests-oauthlib->python-twitter>=3.0->chatterbot)
  Downloading oauthlib-2.0.0.tar.gz (122kB)
    100% |████████████████████████████████| 122kB 2.6MB/s
Installing collected packages: chatterbot, requests, oauthlib
  Found existing installation: ChatterBot 0.4.11
    Uninstalling ChatterBot-0.4.11:
      Successfully uninstalled ChatterBot-0.4.11
  Found existing installation: requests 2.10.0
    Uninstalling requests-2.10.0:
      Successfully uninstalled requests-2.10.0
  Found existing installation: oauthlib 1.1.2
    Uninstalling oauthlib-1.1.2:
      Successfully uninstalled oauthlib-1.1.2
  Running setup.py install for oauthlib ... done
Successfully installed chatterbot-0.4.12 oauthlib-2.0.0 requests-2.11.1

And now:

python3 -m chatterbot --version

Output: 0.4.12

Huh.

Well. Let's test if it works.

python3 chatbot.py

Output:

/usr/local/lib/python3.5/dist-packages/chatterbot/adapters/storage/jsonfile.py:19: UnsuitableForProductionWarning: The JsonFileStorageAdapter is not recommended for production application environments.
  self.UnsuitableForProductionWarning


Resource 'tokenizers/punkt/PY3/english.pickle' not found. Please use the NLTK Downloader to obtain the resource: >>> nltk.download() Searched in:

  • '/home/[don't want to tell my username]/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/local/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/local/lib/nltk_data'
  • ''

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/textblob/decorators.py", line 35, in decorated
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/textblob/tokenizers.py", line 57, in tokenize
    return nltk.tokenize.sent_tokenize(text)
  File "/usr/local/lib/python3.5/dist-packages/nltk/tokenize/__init__.py", line 90, in sent_tokenize
    tokenizer = load('tokenizers/punkt/{0}.pickle'.format(language))
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 801, in load
    opened_resource = _open(resource_url)
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 919, in _open
    return find(path_, path + ['']).open()
  File "/usr/local/lib/python3.5/dist-packages/nltk/data.py", line 641, in find
    raise LookupError(resource_not_found)
LookupError:


Resource 'tokenizers/punkt/PY3/english.pickle' not found. Please use the NLTK Downloader to obtain the resource: >>> nltk.download() Searched in:

  • '/home/[don't want to tell my username]/nltk_data'
  • '/usr/share/nltk_data'
  • '/usr/local/share/nltk_data'
  • '/usr/lib/nltk_data'
  • '/usr/local/lib/nltk_data'
  • ''

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "chatbot.py", line 18, in <module>
    database="../database.db"
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/chatterbot.py", line 57, in __init__
    self.add_logic_adapter(adapter, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/chatterbot.py", line 88, in add_logic_adapter
    adapter = self.initialize_class(adapter, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/chatterbot.py", line 84, in initialize_class
    return Class(**kwargs)
  File "/usr/local/lib/python3.5/dist-packages/chatterbot/adapters/logic/time_adapter.py", line 26, in __init__
    self.classifier = NaiveBayesClassifier(training_data)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 192, in __init__
    self.train_features = [(self.extract_features(d), c) for d, c in self.train_set]
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 192, in <listcomp>
    self.train_features = [(self.extract_features(d), c) for d, c in self.train_set]
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 169, in extract_features
    return self.feature_extractor(text, self.train_set)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 81, in basic_extractor
    word_features = _get_words_from_dataset(train_set)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 63, in _get_words_from_dataset
    return set(all_words)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 62, in <genexpr>
    all_words = chain.from_iterable(tokenize(words) for words, _ in dataset)
  File "/usr/local/lib/python3.5/dist-packages/textblob/classifiers.py", line 59, in tokenize
    return word_tokenize(words, include_punc=False)
  File "/usr/local/lib/python3.5/dist-packages/textblob/tokenizers.py", line 72, in word_tokenize
    for sentence in sent_tokenize(text))
  File "/usr/local/lib/python3.5/dist-packages/textblob/base.py", line 64, in itokenize
    return (t for t in self.tokenize(text, *args, **kwargs))
  File "/usr/local/lib/python3.5/dist-packages/textblob/decorators.py", line 38, in decorated
    raise MissingCorpusError()
textblob.exceptions.MissingCorpusError:
Looks like you are missing some required data for this feature.

To download the necessary data, simply run

python -m textblob.download_corpora

or use the NLTK downloader to download the missing data: http://nltk.org/data.html If this doesn't fix the problem, file an issue at https://github.com/sloria/TextBlob/issues.

(Had to use quote instead of code again.)

It doesn't. ):

gunthercox commented 7 years ago

Ok, so looking at the traceback, the UnsuitableForProductionWarning is just a warning and can be ignored. The MissingCorpusError is definitely an issue. What happens when you run the command it recommends?

python3 -m textblob.download_corpora
python3 chatbot.py
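For reference, the NLTK downloader route that the error message mentions can also be run from Python; a minimal sketch that fetches only the punkt tokenizer, which is the resource the LookupError names:

import nltk

# Download just the punkt sentence tokenizer that the LookupError asked for.
# textblob.download_corpora fetches this plus several other corpora in one go.
nltk.download('punkt')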
nnt0 commented 7 years ago

Okay.

python3 -m textblob.download_corpora

Output:

[nltk_data] Downloading package brown to /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping corpora/brown.zip.
[nltk_data] Downloading package punkt to /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping tokenizers/punkt.zip.
[nltk_data] Downloading package wordnet to /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping corpora/wordnet.zip.
[nltk_data] Downloading package averaged_perceptron_tagger to
[nltk_data]     /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping taggers/averaged_perceptron_tagger.zip.
[nltk_data] Downloading package conll2000 to
[nltk_data]     /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping corpora/conll2000.zip.
[nltk_data] Downloading package movie_reviews to
[nltk_data]     /home/[don't want to tell my username]/nltk_data...
[nltk_data]   Unzipping corpora/movie_reviews.zip.
Finished.

python3 chatbot.py

Output:

/usr/local/lib/python3.5/dist-packages/chatterbot/adapters/storage/jsonfile.py:19: UnsuitableForProductionWarning: The JsonFileStorageAdapter is not recommended for production application environments.
  self.UnsuitableForProductionWarning
Type something to begin...

Yay. It works. Apparently textblob.download_corpora is needed for it to work.

gunthercox commented 7 years ago

Awesome, I'm glad it worked. I will note that I do have plans to remove textblob from ChatterBot in the future in favor of handling tasks like this automatically.

nnt0 commented 7 years ago

Why don't you make a .sh script that automates the install and checks whether everything that's needed is installed?

gunthercox commented 7 years ago

ChatterBot actually already handles this internally for the NLTK data corpus. An example can be seen here: https://github.com/gunthercox/ChatterBot/blob/master/chatterbot/utils/wordnet.py#L16. The issue is textblob (which is just another library that wraps NLTK to provide convenient functionality). Textblob, as you saw, handles the downloads explicitly instead of automatically. The goal moving forward would be simply to remove ChatterBot's dependency on textblob.
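The pattern in that wordnet.py utility is essentially "look for the corpus, download it only on a LookupError"; a minimal sketch of the same idea (the helper name and arguments here are illustrative, not ChatterBot's actual code):

import nltk

def download_corpus_if_missing(resource_path, package_name):
    # Illustrative helper: fetch an NLTK resource only if it is not already on disk.
    # resource_path is the lookup path (e.g. 'tokenizers/punkt'),
    # package_name is the downloader package name (e.g. 'punkt').
    try:
        nltk.data.find(resource_path)
    except LookupError:
        nltk.download(package_name)

download_corpus_if_missing('corpora/wordnet', 'wordnet')
download_corpus_if_missing('tokenizers/punkt', 'punkt')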

lock[bot] commented 5 years ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.