stephen304 / bitcannon

A torrent index archiving, browsing, and backup tool
http://bitcannon.io/
MIT License
408 stars 40 forks

Search not working #49

Closed: HighCommander4 closed this issue 9 years ago

HighCommander4 commented 9 years ago

I installed BitCannon, and then imported the full Kickass dump (it took about 15 hours, but it did complete).

However, searching for anything turns up no results. For example, a search for "Mozart" turns up 0 results, even though the corresponding search on Kickass turns up plenty of results.

What can I do to further diagnose or fix this issue?

EntropicNinja commented 9 years ago

Does it return a page that has the top green bar with the tabs/options/etc. but no results? How long did you wait on that page? I only ask because I did the same (500+MB archive file) and it took, according to the CLI readout, 1m49s to complete.

e.g.:

The last line bitcannon prints to stdout is [martini] Completed 200 OK in 2.52071642s if I watch the output after running bitcannon from the CLI on my Debian 7 VPS.

The time it took to complete a search after adding the archive became [martini] Completed 200 OK in 1m47s

I understand that you may not be running it in the same situation as I am, but I thought I'd chime in. After I used it last night, I stopped it running, started it tonight, and now I'm getting [martini] Completed 200 OK in 15.470252215s for searches that are older than 24hrs.

HTH

HighCommander4 commented 9 years ago

The last line is [martini] Completed 404 Not Found in 62.5ms.

HighCommander4 commented 9 years ago

Here is the complete output during starting up and searching for "mozart":

[OK!] Connecting to Mongo at 127.0.0.1
[OK!] BitCannon is live at http://127.0.0.1:1337/
[martini] listening on :1337 (development)
[OK!] Starting to import from url:
      https://kickass.so/hourlydump.txt.gz
[OK!] Compression detection complete
[OK!] GZip detected, unzipping enabled
[OK!] Reading initialized
[OK!] Reading completed
      191 torrents imported
      9328 torrents skipped
[OK!] Starting to import from url:
      http://www.demonoid.pw/api/demonoid24h.txt.gz
[OK!] Compression detection complete
[OK!] GZip detected, unzipping enabled
[OK!] Reading initialized
[OK!] Reading completed
      3 torrents imported
      273 torrents skipped
[OK!] Finished auto importing.
[martini] Started GET / for 127.0.0.1:4836
[martini] [Static] Serving /
[martini] Completed 200 OK in 109.375ms
[martini] Started GET /styles/vendor.02d380f0.css for 127.0.0.1:4836
[martini] [Static] Serving /styles/vendor.02d380f0.css
[martini] Started GET /styles/main.5f7ccbfa.css for 127.0.0.1:4837
[martini] [Static] Serving /styles/main.5f7ccbfa.css
[martini] Completed 200 OK in 0
[martini] Started GET /scripts/vendor.e0da37b0.js for 127.0.0.1:4838
[martini] [Static] Serving /scripts/vendor.e0da37b0.js
[martini] Completed 200 OK in 187.5ms
[martini] Completed 200 OK in 0
[martini] Started GET /scripts/scripts.c18719dd.js for 127.0.0.1:4839
[martini] [Static] Serving /scripts/scripts.c18719dd.js
[martini] Completed 200 OK in 0
[martini] Started GET /views/main.html for 127.0.0.1:4838
[martini] [Static] Serving /views/main.html
[martini] Completed 200 OK in 0
[martini] Started GET /images/bitcannon.6afc99ec.png for 127.0.0.1:4838
[martini] [Static] Serving /images/bitcannon.6afc99ec.png
[martini] Completed 200 OK in 15.625ms
[martini] Started GET /views/search.html for 127.0.0.1:4838
[martini] [Static] Serving /views/search.html
[martini] Completed 200 OK in 0
[martini] Started GET /search/mozart for 127.0.0.1:4838
[martini] Completed 404 Not Found in 62.5ms

stephen304 commented 9 years ago

Sorry for the long load times - I recently fixed the browse page performance issues and that will be in v0.1.0

As for the 404 error, it appears that error is returned when BitCannon can't read the results from the database into a variable. If you right-click on the page, choose Inspect Element, and open the Network tab, you can see the request. Reload the search so that the 404 request shows up, click on it, and open the Response tab. It should hopefully contain something like {"message": "something useful here"}

Also, skinnx86, what page was the last 15-second request for?

HighCommander4 commented 9 years ago

The message is: "no such cmd: aggregate".

stephen304 commented 9 years ago

Do you know which version of MongoDB you are using? I use the aggregation pipeline to make searches fast (and quite honestly I don't know how to do it without aggregation), and it may be the case that MongoDB versions below 2.6 don't have the aggregation pipeline.
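As a toy illustration only (this is not BitCannon's actual code, and the collection contents and field names here are made up), the shape of such a pipeline, a $match stage followed by $sort and $limit, can be mimicked in plain Python over an in-memory list:

```python
import re

# Toy stand-in for the torrents collection (documents and fields are invented).
torrents = [
    {"title": "Mozart Symphony No. 40", "seeders": 120},
    {"title": "Mozart Requiem", "seeders": 45},
    {"title": "Beethoven Sonatas", "seeders": 80},
]

def search(collection, query, limit=10):
    """Mimic a $match -> $sort -> $limit aggregation pipeline."""
    pattern = re.compile(re.escape(query), re.IGNORECASE)
    matched = [t for t in collection if pattern.search(t["title"])]  # $match
    matched.sort(key=lambda t: t["seeders"], reverse=True)           # $sort
    return matched[:limit]                                           # $limit

print([t["title"] for t in search(torrents, "mozart")])
# ['Mozart Symphony No. 40', 'Mozart Requiem']
```

The point of doing this server-side in MongoDB rather than in application code is that the database can use its indexes and avoid shipping every document to the client, which is why the feature's absence in 2.0.x breaks the search endpoint outright.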

HighCommander4 commented 9 years ago

I use MongoDB 2.0.7. It is, I believe, the latest version that supports my OS, Windows XP.

stephen304 commented 9 years ago

Sorry about that; I don't think there's much I can do. Perhaps there is another computer you can use for this? It's very difficult to make software compatible with an OS that old.

HighCommander4 commented 9 years ago

OK, I'll try using it on my Linux VM instead.

Perhaps you can document that BitCannon doesn't support Windows XP?

stephen304 commented 9 years ago

I definitely will! I had it sort of documented that it doesn't support MongoDB < 2.6, but I didn't realize that also implied requiring something newer than Windows XP.

Feel free to open more issues if you have any other problems running it!

HighCommander4 commented 9 years ago

Heh, my Linux VM runs Ubuntu 10.04, which ships with MongoDB 1.2 :/

stephen304 commented 9 years ago

Maybe this works? http://docs.mongodb.org/manual/tutorial/install-mongodb-on-ubuntu/#install-a-specific-release-of-mongodb

I'm not sure what you can do other than upgrading Ubuntu to a more recent version. Ubuntu 10.04 desktop reached EOL 2 years ago, and the server version reaches EOL in 3 months.

HighCommander4 commented 9 years ago

Yep, installing 2.6 from MongoDB's own package repo worked.

Haven't imported the full dump yet, but search on the last 24 hours of data is working!

stephen304 commented 9 years ago

Glad to see you have it working! I would advise waiting until the next release before importing a full dump; the interface lags a lot in 0.0.5 with large amounts of data.

Also, keep in mind that I've been changing the database structure almost every release so far, meaning you would have to clear out the database to upgrade versions. I hope this will stabilize soon though.

HighCommander4 commented 9 years ago

OK, I'll hold off on importing the full dump.

Thanks for working on this, Stephen! It's a really cool (and timely) project.

EntropicNinja commented 9 years ago

Speaking of clearing out the database, would you include some quick steps on doing so when 0.0.6 is released? I have looked in /data/db and in the bitcannon working directory, but there is nothing in either. TIA

stephen304 commented 9 years ago

Normally there should be some files in /data/db, unless maybe you're on Linux. In any case, you can either delete the data files (which I guess you don't know the location of) or start the mongo shell and enter use bitcannon followed by db.dropDatabase().

I may include a program to clear the bitcannon database with the next release to make it easier. I'd like to have an upgrade program that could apply the changes to existing database entries, but it would be very slow and would require tons of testing. If this is the last major database change, it may be feasible, but a lot changed between the last release and the next one.

HighCommander4 commented 9 years ago

> Normally there should be some files in /data/db, unless you're on linux maybe. In any case you can either delete the data files (which I guess you don't know where they are)

FWIW on my Linux system the data files are in /var/lib/mongodb, and this path is specified in /etc/mongod.conf.

EntropicNinja commented 9 years ago

NINJA EDIT (Before Submitting): Thanks @HighCommander4

I do indeed use Linux (Debian 7 on DigitalOcean hosting). Issuing mongod gives me the following; I have no idea if this could be an issue.

journal dir=/data/db/journal
recover : no journal files present, no recovery needed
ERROR: Insufficient free space for journal files
Please make at least 3379MB available in /data/db/journal or use --smallfiles
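The mismatch above happens because running mongod by hand uses the compiled-in default dbpath of /data/db, while the packaged service reads its settings from a config file instead. A sketch of the relevant 2.x-era keys (the paths are assumed from the package defaults, and smallfiles is the optional tweak the error message itself suggests):

```
# /etc/mongodb.conf or /etc/mongod.conf, depending on the package
dbpath=/var/lib/mongodb
# smallfiles=true   # shrink journal/data file preallocation on small VPSes
```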

Listing all files in /data/db/ originally told me there are 8 files, although it would not list them. After running mongod (the final line was dbexit: really exiting now), ls -la now informs me there are 12 files, but only the journal and mongod.lock files are listed. After seeing where the journal was stored, I thought I'd look there (/var/lib/mongodb).

root@Skinnx86:/var/lib/mongodb# du -cksh *|sort -hr|head -11
9.1G    total
3.1G    journal
2.0G    bitcannon.6
2.0G    bitcannon.5
1.1G    bitcannon.4
513M    bitcannon.3
257M    bitcannon.2
129M    bitcannon.1
65M local.0
65M bitcannon.0
16M local.ns

I'm guessing by the length of the journal that bitcannon.0 was the first database file to be created, bitcannon.5 is the full dump import, and bitcannon.6 is the 24hr dump after the full import. Do you think that removing bitcannon.0 would affect bitcannon, or would it be one to clear out? I will need to clear out eventually, as I only have 20GB on this VPS.

As an aside, I am not a dev/coder; I am doing this for myself and to learn alongside my job! I have, however, created a fairly comprehensive Debian 7 server install guide and am in the process of setting up GitLab before I make a pull request & merge. If you would like to read it over, I can send it beforehand.

Many Thanks again,

Skinnx86

stephen304 commented 9 years ago

MongoDB doesn't differentiate between the imports; they are all dumped into the same database through a bunch of individual inserts, so the file splitting is up to Mongo.
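For what it's worth, the file sizes in the earlier du listing line up with MongoDB's classic (MMAPv1-era) preallocation scheme, in which each new data file doubles the previous one up to a 2 GB cap. A minimal sketch of that doubling rule:

```python
def prealloc_sizes_mb(n, first_mb=64, cap_mb=2048):
    """Sizes in MB of the first n data files under MongoDB's classic
    preallocation: each file doubles the previous one, capped at 2 GB."""
    sizes, size = [], first_mb
    for _ in range(n):
        sizes.append(size)
        size = min(size * 2, cap_mb)
    return sizes

print(prealloc_sizes_mb(7))  # [64, 128, 256, 512, 1024, 2048, 2048]
```

That matches bitcannon.0 through bitcannon.6 in the listing (du rounds 64 MB up to 65M, and so on). Since those files are all slices of one database rather than one file per import, deleting an individual bitcannon.N file would likely corrupt the database; dropping the whole bitcannon database is the safe way to reclaim the space.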

If you have a debian guide I could put it up on the wiki! You can email it to me or something.

EntropicNinja commented 9 years ago

Fair play. I start work in a mo, for a few hours, so I will try to ping it over to you when I get a chance.