markbrough opened 10 years ago
Yes, I agree that tags for documents should be sampled to see whether donors are correctly tagging documents and whether the documents contain the looked-for information. PWYF is monitoring IATI data quality as part of its raison d'être, and this must be considered a priority area for testing.
I would suggest that every organisation-level document is tested for all donors, and that for the largest recipient, three activities have their activity-level document tags checked. Errors should be communicated to the donor in question.
In terms of scoring on the Index, some of the indicators are almost wholly about ensuring that a document has been correctly tagged in IATI and contains the correct information. If all three pass, the full score is awarded; if two pass, a 2/3 score; if one passes, a 1/3 score; if none pass, no score for the indicator. Documents that are not part of the indicator (such as associated project documents included for completeness) should not affect the indicator score, but the donor should be notified.
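The pass-count-to-score mapping proposed above could be sketched as follows. This is illustrative only: the function name and the fixed check of three documents are assumptions for the sketch, not part of any agreed methodology.

```python
from fractions import Fraction

def indicator_score(passes: int, checked: int = 3) -> Fraction:
    """Fractional indicator score from manual document checks.

    Illustrative rule: 3/3 passes -> full score, 2/3 -> two-thirds,
    1/3 -> one-third, 0/3 -> no score for the indicator.
    """
    if not 0 <= passes <= checked:
        raise ValueError("passes must be between 0 and checked")
    return Fraction(passes, checked)

print(indicator_score(2))  # 2/3
```

Using exact fractions avoids floating-point rounding when indicator scores are later aggregated into an overall Index score.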
Activity-level document links should relate to documents that are about that specific activity, or contain a clearly identified section dealing with the specific activity. Using activity-level links for country- or agency-level documents is generally not acceptable.
We support point sampling of documents. Please make sure the sample size is large enough to avoid anomalous results.
Hi, one of the general issues with the tester regarding documents is around quality-testing the content and format of documents. I have outlined the points below:
Thank you, Akshay
Thanks for all the comments.
For 2014
A total of 14 indicators refer to documents. These documents are manually checked to verify that they contain the required information to score for the indicator. For IATI publishers, the documents may be located via links in their XML files.
10 documents will be randomly sampled from organisations' IATI files, with a minimum of five documents needing to meet the criteria for the indicator.
For organisation level documents where only a single document is expected, the document will be checked to see if it contains the required information to score on the indicator.
We will also be sampling data on results, sub-national location and conditions.
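The sampling step described above could be sketched as follows. The element names (document-link with a category code) follow the IATI activity standard, but the example file, URLs, category codes, and the sample size of 10 are illustrative assumptions, not a definitive implementation.

```python
import random
import xml.etree.ElementTree as ET

# Minimal illustrative IATI-style file; real publisher files contain many
# more elements. The URLs and category codes here are made up.
IATI_XML = """
<iati-activities>
  <iati-activity>
    <document-link url="http://example.org/docs/contract.pdf">
      <category code="A06"/>
    </document-link>
    <document-link url="http://example.org/docs/budget.pdf">
      <category code="B01"/>
    </document-link>
    <document-link url="http://example.org/docs/results.pdf">
      <category code="A08"/>
    </document-link>
  </iati-activity>
</iati-activities>
"""

def sample_document_links(xml_text, sample_size=10, seed=None):
    """Collect document-link URLs from an IATI XML file and draw a
    random sample for manual checking. If fewer links exist than
    sample_size, all of them are returned."""
    root = ET.fromstring(xml_text)
    urls = [dl.get("url") for dl in root.iter("document-link")]
    rng = random.Random(seed)
    return rng.sample(urls, min(sample_size, len(urls)))

sample = sample_document_links(IATI_XML, sample_size=10, seed=42)
print(len(sample))  # 3 -- only three links exist, so all are sampled
```

The sampled URLs would then be checked manually, with (per the proposal above) a minimum of five of the ten needing to meet the indicator's criteria.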
Issues
In the 2013 Index, it was possible to tag activities with documents which didn't contain the information being sought in two ways (for example, tagging a document as contract but not containing any contract information).

Questions
By sampling we mean: randomly selecting a sample of documents for manual checks.
2014 Index

We are proposing to sample documents to manually check whether they contain the information requested. We are considering how this would be incorporated into the scoring methodology.