ianrrees opened 2 years ago
There is also this repo for a different driver: https://github.com/Disasm/stm32-usbd-tests. I think a collection of links in this repo, with some description of the test setup, would be good to have.
Ah, cool, thanks @Disasm. I'll open a PR to the main usb-device README.md as a start. Do you have links to boards that someone interested in testing STM32s might be able to buy?
Does the USB2 test suite pass? I've not got a login handy but could have a look at that at work sometime.
> Do you have links to boards that someone interested in testing STM32s might be able to buy?
The various STM32 "Discovery" boards are high quality and include a debugger compatible with probe-run. Cheaper options include the Black Pill boards based on the STM32F4x1 chips.
Following on from the discussion in PRs #101 and #103 - my feeling is that the hardware-specific code and documentation should not live in `usb-device`; instead it should be linked from documentation here. At least in the ATSAMD case, I think the logical place for it is with the examples for what we call BSPs (Metro M0 examples).
This thinking comes down to the scenario where that device-specific code is used: someone is trying to test a change to some combination of `usb-device` and a device-specific implementation of its traits (a HAL or similar). That person necessarily has some hardware to run the test on, so they are only interested in a subset of the hardware-specific tests. Unless the changes are strictly confined to `usb-device` (and not to an interface it exposes), the person will also need a repo for the HAL on hand anyway.
Basically, it seems easier to work on `usb-device` in the context of the hardware's HAL with a patched `usb-device`, rather than the other way around. At least in the ATSAMD case, there is also more change in the HAL than in `usb-device`.
Thinking ahead to CI too - if `usb-device` CI ran the hardware-specific tests, wouldn't that be quite noisy? A change to a `usb-device` interface would result in failures across all the hardware-specific tests until their respective HALs caught up. Ideally, there would be a hardware-independent test, maybe using something like the Linux USB Gadget API as in #12, and that probably should be included with `usb-device`.
> The various STM32 "Discovery" boards are high quality and include a debugger compatible with probe-run. Cheaper options include the Black Pill boards based on the STM32F4x1 chips.
Ah, gotcha. In terms of embedded Rust, I'm mostly coming at this from an ATSAMD background, and in that world our typical projects are more per-board than per-MCU. So, if a person wanted to use the test firmware I made, they would need the same board, not just the same chip. That said, we have a few rough edges in this area (mainly around maintenance/examples), and I think for something like this work, it might make sense to make the `usb-device` test firmware MCU-specific only. The only thing that's really per-board in that firmware currently is a blinking heartbeat LED, which obviously isn't critical. (Edit to add: I believe the ATSAMD examples currently use an on-board oscillator - a second dependency on the board - but the MCUs can behave as a USB device without an external oscillator.)
I can get on board with keeping hardware specifics out - I had an MQTT project where HITL (hardware-in-the-loop) code was checked in, and it fell into disrepair incredibly quickly. Is there any USB test-class code that we need to have in `usb-device`, or does that already exist?
> Is there any USB test-class code that we need to have in `usb-device`, or does that already exist?
Yep, this file is used by the firmwares mentioned in the first couple of comments of this thread. I don't have strong opinions on where it lives within `usb-device` - it seems fine for it to stay where it is, or perhaps, if we set up the workspaces per #103, it could be its own package.
I'll pursue merging the ATSAMD `usb-device` test code into the ATSAMD HAL; no reason we couldn't move it again later if desired. Happy to have the `usb-device` documentation link to either my separate testing repo (linked in the OP), or wait until the ATSAMD source is in the HAL main branch.
Beyond the location of the pieces of test code - how much testing should we require of PRs? For example, I started this issue in the context of #60, so will shamelessly refer to that :). Is the test coverage there sufficient? Is it necessary? Does someone with a particular role in this project need to confirm the tests pass, since they currently need to be run manually?
Some parts of `usb-device` can be tested in software, without OS facilities like USB-over-network or USB gadgets. It won't cover everything, but it can still help a lot: basically no setup is necessary, it's easy to do weird things that would be difficult with real hardware, and it's sometimes easier to debug. Testing in software worked pretty well for the `usbd-dfu` crate, so I decided to move and extend that testing machinery into a dedicated crate - `usbd-class-tester`. While initially intended for testing classes, it turned out that it can also test `usb-device` itself; in its current state it covers quite a lot of the code related to Control transfers. A few examples are here: https://github.com/vitalyvb/usbd-class-tester/blob/344b4cd7bcb09b225d4f653ee3f02c25dca28495/tests/test_device.rs#L42
This works by having a (very crude) software `UsbBus` implementation which provides methods to send and receive data on the endpoints, and a test case has access to both the class and the emulated `UsbBus`, with `usb-device` in the middle.
For this PR I've thrown together a firmware and minimal documentation to use this crate's TestClass. @Disasm has an implementation for a different micro here. I wonder if we should associate test firmware sources with this project somehow? It's not clear to me whether that should look more like a link in `usb-device` documentation, checking test firmware sources in to `usb-device`, or putting the test firmwares with the HALs that implement `usb-device` support. Any thoughts?
For me, this was a bit of a mission (partly down to flaky SWD, it seems), and I think it would be good to reduce friction around testing as much as practical.