fs0c131y / AadhaarSearchEngine

Find Aadhaar cards thanks to Google

SearchEngines module not found #6

Open Confidant23 opened 6 years ago

Confidant23 commented 6 years ago

No module named SearchEngines

nzec commented 6 years ago

Same.

abhijitborah commented 6 years ago

On running `scrapy crawl AadhaarSpider -a keyword="aadhaar meri pehachan filetype:pdf" -a se=google -a pages=10` in the AadhaarSearchEngine directory, I get the following. (Please tolerate my Python illiteracy.)

> al@axon:~/AadhaarSearchEngine$ scrapy crawl AadhaarSpider -a keyword="aadhaar meri pehachan filetype:pdf" -a se=google -a pages=10
> :0: UserWarning: You do not have a working installation of the service_identity module: 'cannot import name 'opentype''.  Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied.  Without the service_identity module, Twisted can perform only rudimentary TLS client hostname verification.  Many valid certificate/hostname mappings may be rejected.
> Traceback (most recent call last):
>   File "/usr/local/bin/scrapy", line 11, in <module>
>     sys.exit(execute())
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/cmdline.py", line 149, in execute
>     cmd.crawler_process = CrawlerProcess(settings)
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/crawler.py", line 249, in __init__
>     super(CrawlerProcess, self).__init__(settings)
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/crawler.py", line 137, in __init__
>     self.spider_loader = _get_spider_loader(settings)
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/crawler.py", line 336, in _get_spider_loader
>     return loader_cls.from_settings(settings.frozencopy())
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/spiderloader.py", line 61, in from_settings
>     return cls(settings)
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/spiderloader.py", line 25, in __init__
>     self._load_all_spiders()
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
>     for module in walk_modules(name):
>   File "/usr/local/lib/python3.5/dist-packages/scrapy/utils/misc.py", line 71, in walk_modules
>     submod = import_module(fullpath)
>   File "/usr/lib/python3.5/importlib/__init__.py", line 126, in import_module
>     return _bootstrap._gcd_import(name[level:], package, level)
>   File "<frozen importlib._bootstrap>", line 986, in _gcd_import
>   File "<frozen importlib._bootstrap>", line 969, in _find_and_load
>   File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
>   File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
>   File "<frozen importlib._bootstrap_external>", line 665, in exec_module
>   File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
>   File "/home/al/AadhaarSearchEngine/AadhaarSearchEngine/spiders/AadhaarSpider.py", line 4, in <module>
>     from AadhaarSearchEngine.common.SearchResultPages import SearchResultPages
>   File "/home/al/AadhaarSearchEngine/AadhaarSearchEngine/common/SearchResultPages.py", line 1, in <module>
>     from SearchEngines import SearchEngines
> ImportError: No module named 'SearchEngines'
> al@axon:~/AadhaarSearchEngine$

shaileshsharan98 commented 5 years ago

Getting the same error!

rshiva commented 5 years ago

In SearchResultPages.py, replace `from SearchEngines import SearchEngines` with `from AadhaarSearchEngine.common.SearchEngines import SearchEngines`.
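
For context: the bare `from SearchEngines import SearchEngines` only resolves under Python 2's implicit relative imports, which is why it breaks on the Python 3.5 install shown in the traceback. A minimal sketch of the corrected first line of `AadhaarSearchEngine/common/SearchResultPages.py` (the rest of the file is unchanged and elided here):

```python
# AadhaarSearchEngine/common/SearchResultPages.py
# Absolute import qualified with the project package, as suggested above,
# so Python 3 can locate SearchEngines.py:
from AadhaarSearchEngine.common.SearchEngines import SearchEngines

# An explicit relative import should also work, since SearchEngines.py
# lives in the same `common` package:
# from .SearchEngines import SearchEngines
```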