erickson-nist closed this issue 7 years ago
ACK. The register layout still needs some documentation. I'll also mention which checksum algorithm is used. But "documenting the checksum algorithm used" itself is not sensible; there are entire books about those checksum algorithms.
Hardware testbenches are trickier. If we can agree on a hardware setup for them that can be replicated both at my end and at yours, and if we agree on what should be tested, we can look into it. But that won't be easy, and I don't think it is really helpful, as it seems that neither of us intends to run those hardware testbenches continuously.
Your quotes are from the work statement, not from the quote that we supplied in response. I am unsure what the contract is based on but I suspect it would be the latter (as that's also what's quoted in the final contract). But let's be pragmatic here: what do you actually need?
M-Labs uses the Python package coverage to run the unittest suite systematically; see the FAQ. That package can also optionally produce a report indicating the code coverage achieved by the unit tests.
A partial coverage analysis of the ARTIQ code base can be generated as follows:
$ cd ~/artiq-dev/artiq
$ coverage run --source artiq -m unittest discover -v
$ coverage html
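coverage html writes its report to the htmlcov directory by default; open htmlcov/index.html in a browser for a per-file breakdown of which lines the tests exercise.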
Documenting the checksum algorithm is taken to mean: state which algorithm is used (so we can look it up to understand any limitations), and explain how it is used in practice to check for communication errors (i.e. which methods are called on the core device to retrieve the checksum from the PDQ and compare it to the expected value). No need for an extensive lesson on error-corrected communications :)
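Something along the lines of the following sketch is what we have in mind. (read_crc here is a guess at the method name, mirroring the write_crc call in the example elsewhere in this thread; whatever the driver actually provides is fine.)

from artiq.experiment import *

class CRCCheck(EnvExperiment):
    def build(self):
        self.setattr_device("core")
        self.setattr_device("pdq0")

    @kernel
    def run(self):
        self.core.reset()
        self.pdq0.setup_bus(write_div=50, read_div=50)
        self.pdq0.write_crc(0)      # zero the running checksum
        self.pdq0.write_frame(0)    # example traffic to be checksummed
        crc = self.pdq0.read_crc()  # hypothetical read-back of the device CRC
        # compare crc against the value computed on the host for the same
        # byte stream to detect SPI communication errors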
Documentation is in the award, deliverable 1.3, as follows, verbatim: "SPI protocol and register layout, protocol documentation, manual, unit tests". I think this can be taken to mean reasonable documentation, a manual, and unit tests of the type that existed for pdq2, given what was in the performance work statement as well.
I think that hardware unit tests are not necessarily as big an issue -- we can write up waveforms that we want to test and see if they are generated properly. What IS important from our end is full documentation, on readthedocs or equivalent as was done with pdq2, of the register layout, communications protocols, triggering protocols, essentially updating the existing pdq2 documentation with whatever nontrivial changes have been made in this latest generation of the gateware.
Topics which I think should be covered in the documentation/manual -- and obviously some of these need paragraphs, others might just need a sentence or two -- include (I may be forgetting some here):
@jordens your email (copied below) to Yong on May 12, 2017 is the kind of thing I find very useful, and what I think of when I hear the words "hardware test bench". Simple examples like that for the key functionality would be very useful to have in the testbench folder.
For example, when I first tried to test the SPI triggering, my interpretation of the write_config function was that it configures the PDQ to accept triggering via either SPI or F1, not that it performs the triggering itself. I spent a good amount of time poring over the code looking for an SPI trigger function before giving up and posting an issue. It ended up being incredibly simple to do, but it was not obvious to me at first (and likely won't be to others just starting out). A simple program like the one in your email that executes the two correct lines of code to turn the trigger on/off, and/or a similar note in the documentation, would be quite helpful to beginners.
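For concreteness, something as short as the following would do. (This is a sketch based on the write_config call in your example; I'm assuming the trigger flag is all that needs toggling and that the other flags just need to match the rest of the setup.)

# inside a @kernel method, after setup_bus(...)
self.pdq0.write_config(clk2x=1, trigger=1, enable=0, aux_miso=1)  # trigger on
delay(1*us)
self.pdq0.write_config(clk2x=1, trigger=0, enable=0, aux_miso=1)  # trigger off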
I think it would be very helpful if there were simple example experiments in the testbench folder showing all the basic functionality like SPI triggering, checking whether the returned checksum value is correct, programming a simple waveform via SPI, etc.
It's very much along the lines of a usual ARTIQ setup.
from artiq.experiment import *


class PDQ2SPI(EnvExperiment):
    def build(self):
        self.setattr_device("core")
        self.setattr_device("pdq0")
        self.setattr_device("led")

    @kernel
    def run(self):
        self.core.reset()
        self.core.break_realtime()
        # configure the SPI clock dividers for writes and reads
        self.pdq0.setup_bus(write_div=50, read_div=50)
        # reset the PDQ
        self.pdq0.write_config(reset=1)
        for i in range(100):
            delay(80*us)
            self.led.on()
            # write the config, CRC and frame registers over SPI
            self.pdq0.write_config(clk2x=1, trigger=0, enable=0, aux_miso=1)
            self.pdq0.write_crc(0)
            self.pdq0.write_frame(0)
            self.led.off()
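This assumes the usual device_db entries for core, pdq0 and led; it runs like any other ARTIQ experiment, e.g. via artiq_run.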
@erickson-nist That is not what is commonly understood by a hardware testbench. But if the example I sent you, plus a few more like it, satisfies your needs, I'll check them in. And always feel free to ask a question if something is not clear.

@dhslichter As I said, the registers need documentation; expect that to be done today. But I need to point out that most of the new code is indeed already extensively documented. Other things, like extending pre-existing documentation or code that was written previously (such as the USB and wavesynth code that was carried over), are not covered.
@jordens thank you. Looking over the pdq readthedocs this morning, it looks generally very clear and good; obviously it's still awaiting the register info you will be adding soon.
@dhslichter http://pdq.readthedocs.io/en/latest/manual.html#registers Let me know if there is something unclear.
@jordens thanks. I would specify in the manual.rst documentation that the maximum number of frames is 32 (i.e. fewer than the full 256 one might naively expect), and document how the code handles the three MSBs of the register (just ignored, I assume? In the Migen I only see the coercion Signal(max=32); I'm not sure how that behaves in practice).
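For what it's worth, here is my understanding of how Signal(max=32) behaves (a sketch; the actual gateware context may differ):

from migen import Signal

frame = Signal(max=32)  # holds values 0..31, i.e. 5 bits wide
print(frame.nbits)      # prints 5

# Assigning a wider value (e.g. an 8-bit register field) to this 5-bit
# signal truncates it: the three MSBs are simply dropped in the
# generated logic.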
@jordens thanks for the new example files. The software itself is in pretty good shape, and the example files will simplify our testing procedure. I would also add example files for
See #14
As I go through the contract, testing the various features, I notice that one of the bullet points states the following:
I find brief mentions of SPI in /doc/architecture.rst and in https://m-labs.hk/artiq/manual-master/core_drivers_reference.html#module-artiq.coredevice.pdq
Am I looking in the wrong place, is there more to be written, or do you feel that the above references satisfy the stipulations of the contract?
In another section of the contract it states:
The testbench folder appears to satisfy the first half with software simulations, though all of the included files seem to deal strictly with simulation and not with hardware. Is there a set of hardware testbenches elsewhere that I have missed?
The above contract quotes come from what looks to be an earlier draft sent to me by Yong, so please let me know if there are any discrepancies with the final version. I'm tagging @dhslichter and @yw-nist in case they have any additional insight as they were actually involved in formulating the contract.