SpenceKonde / megaTinyCore

Arduino core for the tinyAVR 0/1/2-series - ones digit 2, 4, 5, 7 (pin count: 8, 14, 20, 24), tens digit 0, 1, or 2 (featureset), preceded by flash size in KB. Library maintainers: porting help available!

ATTiny1614 - SERIAL_RS485 define causes error #930

Closed CraigBurden closed 1 year ago

CraigBurden commented 1 year ago

Attempting to use RS485 mode for the UART on an ATtiny1614, I get the following error:

.platformio/packages/framework-arduino-megaavr-megatinycore/cores/megatinycore/UART_constants.h:116:45: error: 'USART_RS485_0_bm' was not declared in this scope
   #define SERIAL_RS485         (((uint16_t) USART_RS485_0_bm) << 8)// 0x0100

It appears that the issue is with the following code in UART_constants.h:

#if defined(USART_RS485_0_bm) || defined(USART_RS4850_bm)
  #define SERIAL_RS485         (((uint16_t) USART_RS485_0_bm) << 8)// 0x0100
  #define SERIAL_RS485_OTHER   (((uint16_t) USART_RS485_1_bm) << 8)// 0x0200 tinyAVR 0/1
#else
  #define SERIAL_RS485         (((uint16_t) USART_RS485_bm)  << 8)// 0x0100
  #define SERIAL_RS485_OTHER   (((uint16_t) USART_RS485_bm)  << 9)// 0x0200
#endif

The issue is that the condition passes if either USART_RS485_0_bm or USART_RS4850_bm is defined, but both of the resulting macro definitions rely on USART_RS485_0_bm.

In this case USART_RS485_0_bm is not defined so the macro fails.

I would fix this myself and open a PR, but I don't have the context to know where else (if anywhere) this needs to be addressed too.

SpenceKonde commented 1 year ago

Da fuck you mean USART_RS485_0_bm isn't defined?! It's a bloody tinyAVR 1-series, how the fuck is that possible?

USART_RS485_0_bm IS DEFINED RIGHT IN THE IO HEADER!!! Right on line 5013:

#define USART_RS485_0_bm  (1<<0)  /* RS485 Mode internal transmitter bit 0 mask. */

Are you using the Azduino6 compiler package as required by versions of the core that have had that feature added?

You need to be using the latest compiler toolchain package, because the clowns at Microchip fucking changed the names of over 1000 bitfields in the latest releases of the ATPacks. They realized there really should be an underscore between the bitfield name and the number of the bit within that bitfield (this is true), and so they RENAMED EVERY ONE. (Once you establish a convention, the number 1 rule of design is that you don't change it unless you have a VERY GOOD REASON. They have changed the names of things too many times.) I try to catch them all, but the >3000 lines of compatibility code that I have in core_devices.h may be missing a few.

If USART_RS4850_bm is defined, then you have the old toolchain, and my core won't work (there are a few other things that I know will be broken, too). Basically, every register bitfield name of the form PERIPHERAL_OPTIONBITNUMBER_[bm|bp] was changed to PERIPHERAL_OPTION_BITNUMBER_[bm|bp]. They put compatibility defines in, like, one group of bitfields out of hundreds.

Of course, if you were using the Arduino IDE and you used board manager to install, you would be assured you had the right toolchain and this problem would never arise. But these third-party tools don't actually honor (or even look at) the board manager json, which defines the tools required by the core, their versions, and where to download them.

CraigBurden commented 1 year ago

This is the greatest issue response I have ever gotten. Thank you!

I am using PlatformIO with the 'atmelmegaavr' platform. The initial dump I get when I build looks like this:

Processing ATtiny1614 (platform: atmelmegaavr; board: ATtiny1614; framework: arduino)
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Verbose mode can be enabled via `-v, --verbose` option
CONFIGURATION: https://docs.platformio.org/page/boards/atmelmegaavr/ATtiny1614.html
PLATFORM: Atmel megaAVR (1.7.0) > ATtiny1614
HARDWARE: ATTINY1614 20MHz, 2KB RAM, 16KB Flash
PACKAGES: 
 - framework-arduino-megaavr-megatinycore @ 2.6.5 
 - toolchain-atmelavr @ 3.70300.220127 (7.3.0)
LDF: Library Dependency Finder -> https://bit.ly/configure-pio-ldf
LDF Modes: Finder ~ chain, Compatibility ~ soft
Found 15 compatible libraries
Scanning dependencies...
No dependencies
Building in release mode

Re the lack of definition, my line 5013 in the IO header is:

#define USART_TXPL3_bm  (1<<3)  /* Transmit pulse length bit 3 mask. */

I hope that helps with diagnosing this; hopefully I am just getting the wrong versions via PIO.
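For what it's worth, a quick way to tell which naming generation an installed io header uses (the default path below is a guess at a stock PlatformIO install; pass your own header path instead):

```shell
# check_naming FILE: report which bitfield naming generation an io header uses.
check_naming() {
  if [ ! -r "$1" ]; then
    echo "header not found: $1"
  elif grep -q 'USART_RS485_0_bm' "$1"; then
    echo "new ATPack naming - megaTinyCore should build"
  elif grep -q 'USART_RS4850_bm' "$1"; then
    echo "old ATPack naming - megaTinyCore will fail as above"
  else
    echo "no RS485 bitfield found"
  fi
}

# Example: default PlatformIO toolchain path (an assumption; adjust to taste).
check_naming "$HOME/.platformio/packages/toolchain-atmelavr/avr/include/avr/iotn1614.h"
```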

SpenceKonde commented 1 year ago

Yeah, that's the old header version, from the old toolchain. (Note: I can't tell from the version string it shows. I don't know how to display that value or what determines it, and since I don't know what determines it, I probably haven't been incrementing it or anything!)

But the USART_TXPL bits also get that treatment: they are USART_TXPL3_bm on the older version, USART_TXPL_3_bm on the new one.

I don't know what platform you are using, but you want to be using the build from the json file for your platform:

          "systems": [
            {
              "size": "38098458",
              "checksum": "SHA-256:75b9740cf47d41177aff14f9674e25ad378e29fae7633920cdcc7b056e8e9fbe",
              "host": "aarch64-pc-linux-gnu",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-aarch64-pc-linux-gnu.tar.bz2",
              "url": "https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-aarch64-pc-linux-gnu.tar.bz2"
            },
            {
              "size": "34520389",
              "checksum": "SHA-256:b41a827e92f6a87c45f2e37029865b6bbd57e0eeadb639be66416b89e8f77b78",
              "host": "arm-linux-gnueabihf",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-arm-linux-gnueabihf.tar.bz2",
              "url": "https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-arm-linux-gnueabihf.tar.bz2"
            },
            {
              "size": "37148876",
              "checksum": "SHA-256:24fc6bcd0786d3015346342a2fea5701a0ce11eeea178bac1c1d2b6a9e6d6d03",
              "host": "i686-pc-linux-gnu",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-i686-pc-linux-gnu.tar.bz2",
              "url": "https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-i686-pc-linux-gnu.tar.bz2"
            },
            {
              "size": "37715121",
              "checksum": "SHA-256:7c4cc781343cbae77328e7d69433458c105d1efca87a56863947cb73966fe821",
              "host": "x86_64-apple-darwin14",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-x86_64-apple-darwin14.tar.bz2",
              "url": "https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-x86_64-apple-darwin14.tar.bz2"
            },
            {
              "size": "37714803",
              "checksum": "SHA-256:3a4be4dde46b9ee5af1d89fb50512dea0a1c49fca5f2b18357bd5fd12d6c330d",
              "host": "x86_64-pc-linux-gnu",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-x86_64-pc-linux-gnu.tar.bz2",
              "url": "https://github.com/SpenceKonde/DxCore/raw/gh-pages/avr-gcc-7.3.0-atmel3.6.1-azduino6-x86_64-pc-linux-gnu.tar.bz2"
            },
            {
              "size": "44740781",
              "checksum": "SHA-256:4b4a25ca7935402998b27befd9439300ad642c4e21b0becb3f945748090c7c74",
              "host": "i686-w64-mingw32",
              "archiveFileName": "avr-gcc-7.3.0-atmel3.6.1-azduino6-i686-w64-mingw32.tar.bz2",
              "url": "https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-i686-w64-mingw32.tar.bz2"
            }
          ]

(The one that's hosted on GitHub is hosted that way because it's used for CI, so hosting it within GitHub serves everyone except the network providers who charge for bandwidth. Before I realized this, I got my alarm for the entire monthly spend 2 days after I uploaded the first new toolchain that way, and subsequent investigation showed that the vast majority of it was from the CI, which is extremely inefficient. So GitHub it is, for Linux x64 only - I'm doing it to save myself money, not because I'm trying to be a freeloader. If I were, I'd host everything on GitHub, but I somehow think they'd get annoyed if I stuffed 300 MB per toolchain version in as blobs and, every time I release a new toolchain and core version, suddenly everyone starts downloading it. I'm not sure, but I think these cores may be widely used enough to move the needle; it's 40-50 MB per user when a new toolchain version goes out, and all my cores will typically get that update within a few days of each other. I don't know how many people have actually installed the core, though.)
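If PlatformIO is the blocker, its `platform_packages` option can, in principle, point the build at one of the tarballs listed above instead of the stock toolchain. A sketch for Linux x86_64 (untested with this platform; `platform_packages` is PlatformIO's mechanism, not something this core ships):

```ini
[env:attiny1614]
platform = atmelmegaavr
board = ATtiny1614
framework = arduino
; Override the stock toolchain with the Azduino6 build from the json above.
platform_packages =
  toolchain-atmelavr@https://spencekondetoolchains.s3.amazonaws.com/avr-gcc-7.3.0-atmel3.6.1-azduino6-x86_64-pc-linux-gnu.tar.bz2
```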

Also ugh, this reminds me that I need to deal with AWS stuff because my reserved instances ran out..... but no time for that now. The plan is to move the json file to azduino.com, but only after rebuilding the VM behind that name, because the type of instance it's using is no longer favorable (they encourage migration to newer VM types and newer OS versions, and upgrading an OS on a Linux machine in the cloud? Fuck that, it's easier to deploy a new one). None of the artifacts on azduino.com or drazzy.com are unique to the webservers; they all have a backup copy elsewhere (mostly in private GitHub repos that I sync). One wrinkle: I can't set a script +x from the GitHub web interface or from my Windows computer, and from the command line I can't use a password to log in to commit the change, so I'm stuck with one of the release-building scripts being non-executable, because it's such a huge pain in the ass to commit anything except when I'm allowed to log in with a password.

And yes, if you are thinking "wtf, this guy knows every detail of some things, and is completely clueless about basic stuff other times" that is pretty accurate.

CraigBurden commented 1 year ago

I'm on Linux x86_64, and I am happy to pull the tools manually as you suggest. I would love to solve this properly though, not only for the greater good but for my colleagues who will want to pull this at some point.

It looks to me like some nice person at PlatformIO is your advocate:

https://registry.platformio.org/tools/platformio/framework-arduino-megaavr-megatinycore

They are keeping things up to date over there, at least when it comes to the core itself. But clearly not the tools:

https://registry.platformio.org/tools/platformio/toolchain-atmelavr/versions

On the surface the tool version looks okay to me: v7.3.0? But I assume there is either a subversion, or perhaps a later release or fork, that you want to use?

SpenceKonde commented 1 year ago

v7.3.0 of avr-gcc has LIKE FIVE OTHER INDEPENDENT VERSIONS INVOLVED

There's the version of AVRlibc (that's not changed in a while).

Then there is the version of the ATmega ATPACK, the ATtiny ATPACK, the Dx-series ATPACK and the Ex-series ATPACK.

Because there are so many variables and so many ways that things can get screwed up, I make my own toolchain packages that are known to work with all modern and classic AVRs (well, I don't test with classic ATmega, but they haven't changed much about those in years). And I point to them from the json file, which is how the Arduino IDE's board manager can always install the right toolchain version! That's what Arduino itself does too - except they haven't updated theirs since 2020.

Something isn't right with the build scripts; they don't produce a fully working toolchain anymore. Not quite.

So what I do is fire up an AWS XXL compute instance (so the build finishes in a sane time) and run the build script there, with modifications to use the latest tinyAVR, megaAVR, Dx, and Ex ATPack header packages. I also had to get a Linux guy to help me extend the io.h file to use all the headers from the ATPacks. (I hope I never need to change it, though, because there's a bit of an issue between me and that dude. He's always been my go-to for stuff like that, but I can't get any help from him (I don't think he even wants to talk to me) until we get this other issue worked out, which depends on a third party who was never supposed to be involved. He was not happy with me when I last brought it up, but this is not the place to go over that stupidity.)

Then I download the result and shut down the instance. Then I copy it to the WSL linux-in-windows thing. I also copy the most recent working official Arduino package, Arduino7, for all the platforms, and untar all of them. Then, from the newly compiled one, using commands I keep in a text file and paste in, I delete all the executables and shared object files that run on the host computer, as well as, IIRC, the linker scripts, because those seem to be one of the problems. (I think Microchip is one version ahead of us on avr-libc, but even though it's open source in theory, I don't think I've ever been able to find a working link to the source code, only platform-specific compiled packages.) The majority of the files - precompiled AVR code, plus device specs and I/O headers - we keep. Then I copy that tree on top of the Arduino7 one for each platform (since the platform-specific stuff was all stripped out already) and recompress them. Then I do the same thing on Windows for the Windows one, because otherwise we get newline'd.
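The overlay step of that dance, as a shell sketch (all paths and file names here are invented stand-ins; the real script, trees, and file lists differ):

```shell
# Sketch of the strip-and-overlay step, using throwaway directories in place of
# real toolchain trees: azduino/ stands in for the freshly built package,
# arduino7/ for the official Arduino package it gets layered onto.
set -e
work=$(mktemp -d)
mkdir -p "$work/azduino/bin" "$work/azduino/avr/include/avr"
mkdir -p "$work/arduino7/bin" "$work/arduino7/avr/include/avr"
echo 'new header'  > "$work/azduino/avr/include/avr/iotn1614.h"
echo 'build-host binary' > "$work/azduino/bin/avr-gcc"
echo 'old header'  > "$work/arduino7/avr/include/avr/iotn1614.h"
echo 'host binary' > "$work/arduino7/bin/avr-gcc"

# 1) Strip everything host-specific from the fresh build: executables,
#    shared objects, and (IIRC) the linker scripts.
rm -rf "$work/azduino/bin"

# 2) Overlay the remainder (precompiled AVR code, device specs, io headers)
#    on top of the per-platform Arduino7 tree; recompressing comes after.
cp -R "$work/azduino/." "$work/arduino7/"
```

After the copy, the Arduino7 tree keeps its own host executables but carries the fresh ATPack headers.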

I have repeatedly offered my toolchain packages to PIO, and my only condition is that they not piggyback on my file hosting, because I pay for hosting of those files, get charged for bandwidth, and don't want to wake up one morning facing a huge bill. This is my full-time job and, guess what, it doesn't exactly pay very well.

SpenceKonde commented 1 year ago

According to my notes, I suspected the reason there were problems is that I was not able to update to the latest avr-libc version, because I could only find Windows and Mac binaries - that is, no source, so nothing I could do anything with. So even if I were some kind of cross-compiling guru, or Arduino had included complete environment setup directions with their toolchain GitHub repo, I couldn't just build the thing normally with fresh binary builds for all platforms (like Arduino used to do, until they didn't), because I think I need source code that Microchip is legally required to make available and does not. This may be why Arduino stopped building, too. Though the error you get doesn't sound anything like it's related to libc, I think the genesis of the problem is a difference in how some linker script variable is treated.

That implies that if I had days of time to burn (I don't) setting up a cross-compilation environment, I could build fresh binaries each time by running the full toolchain build script for all platforms. It might or might not work any better, but I do know one thing: right now it takes me a little over half an hour to build on a c5.4xlarge instance - something like 40 cents of compute time. We should expect cross-compilation for other platforms to be slower, so most likely 50-75 cents per platform other than x64 Linux. That's an extra cup of good coffee every time I attempt building the toolchain, so I could burn 8-10 times as much money as I do now on compute time too, after burning days of time I don't have doing something I hate (wrangling toolchains, and any programming for a system with more than a 16-bit address space - notice how I never used the m2560? My brain only handles 16-bit addresses natively).