peterbmckinley opened this issue 4 years ago
huh, interesting project!
It looks as though the project was written for python2, and you are invoking it with the python3 runtime.
print "Unable to access %s, try running as root?" % (VIRTUAL_TERMINAL_DEVICE,)
is the old Python 2 print statement, which is not supported in Python 3; it should be:
print("Unable to access %s, try running as root?" % (VIRTUAL_TERMINAL_DEVICE,))
You should be able to use the 2to3 command (bundled with Python) to convert it. See: https://docs.python.org/3/library/2to3.html
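For reference, the Python 3 form of that failing line can be checked in isolation. The device path below is a stand-in for illustration, not the script's actual value:

```python
# Python 3 makes print a function, so parentheses are mandatory;
# the bare `print "..."` statement form is a SyntaxError.
# VIRTUAL_TERMINAL_DEVICE here is a placeholder value, not oledterm's real one.
VIRTUAL_TERMINAL_DEVICE = "/dev/vcs1"

message = "Unable to access %s, try running as root?" % (VIRTUAL_TERMINAL_DEVICE,)
print(message)
```

The %-formatting itself works identically in both Python versions; only the print syntax changed.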
Hi R, thank you for your prompt response. I'm a huge admirer of your work in this area. I especially love carousel.py.
I do think oledterm could be transformative for OLED displays on SBCs. It's a shame it's caught up in the Python 2 vs Python 3 quicksand that bedevils so much code.
I'll try it this evening when I get home; it's 7.24am here in the UK.
Thanks again!
Incidentally, I have to invoke it using python3, otherwise I get the classic luma.core not found error, because of the (correct) way luma.oled was installed. But I'm keen to play with the conversion script.
Peter
Here's a thought... do you think you could tweak the oledterm script to bring it into the 21st-century Python 3 world, test it, and consider adding it to your list of examples?
Don't get me wrong, the dancing banana is cute, but I'm drawn to things with a more practical application than making me smile 😜
What do you think?
Peter
bring it into the 21st century Python3 world, test it and consider adding it to your list of examples?
Including it in the repository would require permission from the author.
Hi Richard, I know the author doesn't appear to be responding; can we even be sure he or she is alive 🙄? Not sure where that leaves us! I'll see about converting it later.
Will you be trying it yourself?
Peter
Oops sorry thijstriemstra!
OK, I ran the script through 2to3 with the write-back option (-w); now when I run:
python3 oledterm.py --i2c-port 0 --display sh1106
I get:
Traceback (most recent call last):
  File "oledterm.py", line 113, in &lt;module&gt;
    main()
  File "oledterm.py", line 100, in main
    term.putch(char)
  File "/usr/local/lib/python3.7/dist-packages/luma/core/virtual.py", line 327, in putch
    w = self.font.getsize(char)[0]
  File "/usr/local/lib/python3.7/dist-packages/PIL/ImageFont.py", line 262, in getsize
    size, offset = self.font.getsize(text, False, direction, features, language)
TypeError: expected string
I realise this is an unfair question for here, but since oledterm uses luma.oled you guys are my only hope!
Peter
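One guess at that TypeError (not verified against oledterm's actual source): in Python 3, reading the vcs device in binary mode yields bytes rather than str, and iterating over bytes produces ints, while luma's Terminal.putch() expects a one-character string. Decoding before the loop would address that; the helper and encoding choice below are assumptions, not oledterm's code:

```python
def to_text(raw_bytes):
    """Decode raw bytes read from a /dev/vcs* device into a printable string.

    cp437 is an assumption: it maps all 256 byte values to characters,
    which suits the console's legacy 8-bit character set; the real
    console encoding may differ on a given system.
    """
    return raw_bytes.decode("cp437", errors="replace")

# Hypothetical usage inside oledterm's main loop:
#   for char in to_text(raw_bytes):
#       term.putch(char)   # putch now receives str, never int or bytes
```

The key point is simply that whatever reaches putch must already be str under Python 3.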
Hi all! Are we admitting defeat on this?
I'll have a go at a clean-room re-implementation of what is in that repo, directly in luma.examples
Exciting!
As you can see from the oledterm issues, the maintainer is no longer responding. I've tried twice. I don't know what the GitHub procedure is to formally declare a project orphaned.
An interactive luma example would, I think, attract significant attention. Many SBC projects run headless, and providing a means of accessing the console directly, simply by adding a $2 OLED display, would be great progress for a lot of people.
Thanks and good luck, I'm looking forward to the next installment!
Peter
Hi, just wondered if satoshinm's script looks viable? I have no clue how his spell works, if it ever did...
Krizzel87 has had some success modifying satoshinm's oledterm code and making it sort of work, but it's highly unstable. See the thread here:
https://github.com/satoshinm/oledterm/issues/4
Krizzel87 has started his own repo, but I consider it a work in progress as it's known not to work properly:
https://github.com/Krizzel87/Oledterm_SH1106
Hope this helps
Hi rm-hull, any news on your proposed clean-room rebuild?
Peter
BUMP
Hi Peter, sorry, I will try and get to this in the next few days...
Hi Richard,
Great to hear from you. Glad you still think oledterm is worth tackling. My project includes an OPi0 with a 1.3" OLED display in a custom case. Having access to the CLI is a whole other use case, outside the project, but an uber-cool one. That's why I registered the domain www.tinycomputers.co.uk as the Speedsaver (my project) alter ego.
If anyone on the planet can make oledterm stable, it is you. If you can't make it work, no-one can. I'm not just saying that to big you up or encourage you, you are light years ahead of anyone else in this field. I defy anyone to disagree with me.
Good luck and I look forward to hearing from you regarding progress or lack thereof. Maybe it can't be done, who knows. In that case, Carousel will remain the coolest luma.oled example in my book.
Best wishes,
Peter McKinley
Speedsaver Ltd
I guess this is permanently consigned to the Too Hard basket of history........
Time, life, sun, all gets in the way ;)
My setup: Orange Pi Zero, Armbian Buster, 1.3" OLED on the I2C bus (SH1106 controller).
I have all the luma.examples running fine (with the exception of the Raspberry Pi camera-specific ones) and I'm intrigued by the idea of piping the console to my OLED display. I came across this cool project:
https://github.com/satoshinm/oledterm
But when I run:
python3 oledterm.py --i2c-port 0 --display sh1106
I get:
File "oledterm.py", line 55
    print "Unable to access %s, try running as root?" % (VIRTUAL_TERMINAL_DEVICE,)
        ^
SyntaxError: invalid syntax
I opened an issue on the oledterm project, but I don't think it's actively maintained. I think oledterm would be a killer example to be considered for possible inclusion in luma.examples, but any tips why it's not working for me?
Peter
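For anyone picking this up later, the core idea of oledterm, as I understand it, is to read the current screen text from a Linux /dev/vcsN device and split it into rows for the display. This is a sketch under assumptions, not the project's actual code, and the row-splitting part can be tested without hardware:

```python
def console_rows(raw, cols=80):
    """Split raw bytes read from a /dev/vcsN device into text rows.

    /dev/vcsN holds plain character data with no size header, so the
    column count must be known in advance (80 is just a common default;
    the attribute device /dev/vcsaN carries the real geometry in its
    first bytes). cp437 decoding is an assumption about the console's
    legacy 8-bit character set.
    """
    text = raw.decode("cp437", errors="replace")
    return [text[i:i + cols] for i in range(0, len(text), cols)]

# Hypothetical wiring, requiring read access to the device (often root):
#   with open("/dev/vcs1", "rb") as f:
#       rows = console_rows(f.read(), cols=80)
#   # each row would then be drawn to the OLED via luma
```

The fact that reading /dev/vcs1 usually needs root also matches the "try running as root?" message in the script.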