ephraim / lcurse

Python script providing a "Curse"-compatible client for Linux
The Unlicense

IndexError #14

Closed: savirien closed this issue 8 years ago

savirien commented 8 years ago

This looks like a great script, but I can't get "Update Catalog" to run without either crashing or perpetually displaying its load bar. Python 3 with PyQt5 is installed.

Bash spits out these errors:

WARNING: gui translation file could not be found: /opt/lcurse/translations/en_US.qm
Traceback (most recent call last):
  File "/opt/lcurse/modules/waitdlg.py", line 252, in run
    self.retrieveListOfAddons()
  File "/opt/lcurse/modules/waitdlg.py", line 242, in retrieveListOfAddons
    lastpage = self.retrievePartialListOfAddons(page)
  File "/opt/lcurse/modules/waitdlg.py", line 225, in retrievePartialListOfAddons
    lastpage = self.parsePager(pager[0].string)
IndexError: list index out of range
Traceback (most recent call last):
  File "/opt/lcurse/modules/application.py", line 319, in addAddon
    url = [ item[1] for item in self.availableAddons if item[0] == name ][0]

ephraim commented 8 years ago

Hey,

which addon are you trying to add?

ephraim commented 8 years ago

please try again with the current commits pulled.

ephraim commented 8 years ago

By the way, if you find anything not translated correctly, I would be glad to get a proper translation for the en_US locale.

savirien commented 8 years ago

Thanks for the reply. As of now I get this:

WARNING: gui translation file could not be found: /home/greg/Downloads/lcurse-master/translations/en_US.qm
Traceback (most recent call last):
  File "/home/greg/Downloads/lcurse-master/modules/waitdlg.py", line 331, in run
    self.retrieveListOfAddons()
  File "/home/greg/Downloads/lcurse-master/modules/waitdlg.py", line 321, in retrieveListOfAddons
    lastpage = self.retrievePartialListOfAddons(page)
  File "/home/greg/Downloads/lcurse-master/modules/waitdlg.py", line 304, in retrievePartialListOfAddons
    lastpage = self.parsePager(pager[0].string)
IndexError: list index out of range

I am by no means a Python expert. What I've tried thus far is changing waitdlg.py line 300 to:

response = self.opener.open("http://www.curse.com/addons/wow?page=" + str(page))

I thought the Curse site may have changed its URL scheme, but I still get the same errors.

It looks to me like something is not initialized correctly at line 304 of waitdlg.py, but I don't have the knowledge to fix it.
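
For reference, line 304 indexes pager[0], where pager is presumably the result of a BeautifulSoup lookup on the downloaded page; if the site's markup no longer contains the expected pager element, that lookup returns an empty list and pager[0] raises exactly this IndexError. A minimal standalone sketch of the failure mode and a guard (not lcurse code; the markup and the selector here are invented for illustration):

from bs4 import BeautifulSoup

# Invented markup: the page no longer contains the pager element the code expects
html = "<html><body><p>no pager element here</p></body></html>"
soup = BeautifulSoup(html, "lxml")

pager = soup.find_all("ul", class_="paging")  # selector is a guess, not lcurse's actual one
if pager:
    lastpage = pager[0].string
else:
    lastpage = 1  # fall back instead of letting pager[0] raise IndexError
print(lastpage)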

Thanks again.

ephraim commented 8 years ago

You didn't pull the latest changes, did you?

savirien commented 8 years ago

Sorry for the late reply; that was with the latest changes pulled.

hotice commented 8 years ago

It doesn't work for me either, using the latest Git checkout. This is what I get under Ubuntu 16.04:

QXcbWindow: Unhandled client message: "_GTK_LOAD_ICONTHEMES"
/usr/lib/python3/dist-packages/bs4/__init__.py:166: UserWarning: No parser was explicitly specified, so I'm using the best available HTML parser for this system ("lxml"). This usually isn't a problem, but if you run this code on another system, or in a different virtual environment, it may use a different parser and behave differently.

To get rid of this warning, change this:

 BeautifulSoup([your markup])

to this:

 BeautifulSoup([your markup], "lxml")

  markup_type=markup_type))
Traceback (most recent call last):
  File "/opt/lcurse/modules/waitdlg.py", line 331, in run
    self.retrieveListOfAddons()
  File "/opt/lcurse/modules/waitdlg.py", line 321, in retrieveListOfAddons
    lastpage = self.retrievePartialListOfAddons(page)
  File "/opt/lcurse/modules/waitdlg.py", line 304, in retrievePartialListOfAddons
    lastpage = self.parsePager(pager[0].string)
IndexError: list index out of range
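
As an aside, the BeautifulSoup warning above is unrelated to the traceback; it goes away once the parser is named explicitly, for example (a one-line sketch, assuming lxml is installed):

from bs4 import BeautifulSoup

# Naming the parser explicitly makes the choice deterministic across systems
soup = BeautifulSoup("<html><body></body></html>", "lxml")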

ephraim commented 8 years ago

Sorry, my bad. I forgot to add one file to the commit. Please pull and try again. Thanks!

savirien commented 8 years ago

Looks like things are working now. I did add a small sleep delay to your function in waitdlg.py:

def retrieveListOfAddons(self):
    page = 1
    lastpage = 1
    self.sem.acquire()
    lastpage = self.retrievePartialListOfAddons(page)
    page += 1
    self.retrievedLastpage.emit(lastpage)

    while page <= lastpage:
        sleep(0.5)  # added delay so the workers don't open connections faster than they are closed
        self.sem.acquire()
        start_new_thread(self.retrievePartialListOfAddons, (page,))
        page += 1
The workers were requesting pages so fast that urllib was timing out from too many connections left open. I believe this is an issue with my ISP rather than with your code.
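
If the fixed half-second sleep ever turns out to be too aggressive or too slow, capping the number of simultaneous requests is another way to get the same throttling effect. A rough standalone sketch, not lcurse code (the URL scheme is taken from earlier in this thread; the page range and worker count are arbitrary):

from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def fetch_page(page):
    # URL scheme as posted earlier in this thread
    url = "http://www.curse.com/addons/wow?page=" + str(page)
    with urlopen(url, timeout=30) as response:
        return response.read()

# At most four connections are open at any time, so no sleep is needed
with ThreadPoolExecutor(max_workers=4) as pool:
    pages = list(pool.map(fetch_page, range(1, 11)))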

Thank you for getting lcurse working again!

hotice commented 8 years ago

Now it works here too. Thanks!