Given the growing popularity of these tests, I have noticed a significant increase in questions regarding the accuracy and source of the test data. There are also known issues with some of the tests (e.g. #78 and #72).
Clearly declaring how each set of tests was generated is necessary to clear up any confusion regarding the accuracy (or lack thereof) of this data, as well as to properly attribute any projects that were used in the process. It may also reveal why some errors exist, and would make it easier for the emulation community to improve the original source and/or the test data itself.
The 8088 tests contributed by @dbalsom and Folkert van Heusden already do this, which is greatly appreciated!
If the generation process is highly complex and would require a large amount of work/time to document, then at a minimum state the source of the test data in each README, whether that is real hardware, an emulator/simulator, or anything else.