Closed — kimisatoPC closed this issue 5 years ago
@kimisatoPC is working on the feedback below from the GBV team.
[x] Is it possible to program “controls” into the tool to reduce entry errors? Specifically, we are wondering if entry of a partial score (e.g., 0.75) would result in an “invalid” data entry message. During the in-country trainings we had a number of questions about wanting partial scores, and we could foresee this coming up again. Yes, we can add controls via drop-down menu items where users would select only a “1” or a “0”, so no partial scores would be allowed.
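As an illustrative sketch only (not the tool's actual build process), this kind of drop-down control can be scripted with Python's openpyxl; the sheet name, score range, and file name below are hypothetical:

```python
from openpyxl import Workbook
from openpyxl.worksheet.datavalidation import DataValidation

wb = Workbook()
ws = wb.active
ws.title = "Data Entry Form"  # hypothetical sheet name

# Restrict the Score column to a drop-down of "1" or "0";
# any other value (e.g. 0.75) triggers the error message below.
dv = DataValidation(type="list", formula1='"1,0"', allow_blank=True)
dv.errorTitle = "Invalid data entry"
dv.error = "Partial scores are not allowed; please enter 1 or 0."
ws.add_data_validation(dv)
dv.add("C2:C100")  # hypothetical score-entry range

wb.save("gbv_qa_tool_sketch.xlsx")
```

In the shipped workbook the same effect is achieved directly through Excel's Data Validation dialog; the script just shows the equivalent settings.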
[x] Can the ICPI Team clarify the differences and purposes of the two windows, labeled “raw data” and “data entry form”? We are curious if these two windows are linked or if one is simply there for explanatory purposes? The Raw Dataset tab is used only for populating the master dashboard, while the Data Entry Form is for clinic visits to assess the QA standards. We can also add some clarifying language on the Instructions tab.
[x] We had been operating under the assumption that the “data entry form” was where the data would be first entered. If this is indeed correct, we have been discussing how best to approach incorporating both versions of the tool (Full and Minimum). While there is a column that specifies Full or Minimum on the data entry form, we are wondering if there is a way to select “Full” or “Minimum” for each standard during data entry via a drop-down option that triggers auto-calculation. For example, if the data entry clerk selects “Minimum”, then certain criteria would automatically disappear to include only the relevant criteria, and the scores would also adjust. This programming would also have to be linked to multiple sheets (data entry and score entry) as well as the actual dashboard. This would be possible but would require a complete redesign of the tool in Excel, given its limitations. If this type of functionality is desired, then we would advise moving the tool into a web app that would allow for more advanced features, such as DHIS2.
[x] We might be missing it, but it seems like the current data entry and Dashboard are set up to record one assessment per site. This raises the question: how will we handle multiple assessments over time? Is there a way to allow a ‘filter by’ date/assessment #/etc. so we can eventually see changes in scores over time for sites that conduct more than one assessment? Preferably, this would be within the same dashboard and function similarly to Pano filters. The master dashboard is set up so that multiple assessments per site can be stored through the use of a unique ID.
[x] On a related note, would we be able to create a pivot table to compare scores across time? Yes, we can definitely include a pivot table to filter by date of assessment and other filters. For the Dashboard, can we explore ways to incorporate comments? Perhaps this could be another tab? We can explore this; however, Excel will be limited in displaying comments beyond a basic pivot table. If we would like more advanced dashboard functionality, we would suggest considering a web app version of data entry and then a more advanced dashboard tool, within DHIS2, Tableau, or R Shiny, that could display text data in a more interactive way.
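To sketch how a unique site/date ID supports a scores-over-time pivot, here is a minimal pandas example; the column names and values are hypothetical stand-ins for the master dashboard's raw dataset:

```python
import pandas as pd

# Hypothetical extract of the master dashboard's raw dataset:
# one row per site per assessment, keyed by a unique ID.
df = pd.DataFrame({
    "site_id":     ["S001", "S001", "S002", "S002"],
    "assess_date": ["2018-06-01", "2018-12-01", "2018-06-01", "2018-12-01"],
    "qa_category": ["Privacy", "Privacy", "Privacy", "Privacy"],
    "score_pct":   [55.6, 77.8, 44.4, 66.7],
})
df["unique_id"] = df["site_id"] + "_" + df["assess_date"]

# Pivot: QA categories down the side, one column per site/assessment date,
# so score changes over time read across each row.
pivot = pd.pivot_table(df, values="score_pct", index="qa_category",
                       columns=["site_id", "assess_date"], aggfunc="mean")
print(pivot)
```

An Excel PivotTable built on the raw dataset tab, with assessment date as a filter or column field, gives the same view without any code.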
@shapaklyak Dashboards are up and ready for your review! Please go through the final feedback items above.
Tools finalized and submitted. Great work, @shapaklyak !
@kimisatoPC has addressed the following comments from the ICPI leads in the current versions posted here.
Tab by Tab Review
Raw dataset tab
Instructions tab
About section:
[x] Thanks for providing the background on the tool. Please include when the tool was first developed. That information is relevant now and will be if someone comes across this tool in a few years.
[x] There are currently no dates/years anywhere on the Instructions page.
[x] In addition to letting the reader know the year that the Jhpiego tool was developed, it would be helpful to add the year that the ICPI tool was made.
[x] Can you add the ICPI Cluster that made this tool and/or a contact email (icpi@state.gov) for questions?
Instructions section:
[x] How does one know whether they should complete the minimum version of the tool or the full version? It would be helpful to specify these criteria briefly upfront. For ex., can you give a ballpark amount of time that it takes to complete the minimum vs. the full? Would I use the minimum approach if I were missing key personnel who work on GBV or key documents that would help me in making the assessment?
[x] Are there any other documents that I need prior to starting this exercise? For ex., would it go faster if I have certain paperwork in front of me, or do I need particular HCWs present?
Using the Data Entry Form section:
[x] For step #2, when you are instructing to “transfer the information collected onto the Data Entry Form”, does this only apply if one has used the paper-based approach?
[x] Are you missing a step at the end here, where one is supposed to copy and paste the data from the raw dataset tab to the 2nd tool? If this is correct, please make this clearer in the instructions.
Data Entry Form tab
[x] Score: It seems that there are 3 acceptable answer choices here (1, 0, or NA). Why not include dropdowns with those 3 options? That would decrease data entry error here.
[x] Are the answers always binary?
[x] It seems that with the questions, there could be a lot of gray area. For example, “Facility maintains patient privacy during triage/intake process”.
[x] I could see this answer being “sometimes, when the clinic is not extremely busy”.
[x] Is there any guidance to offer staff before they answer these questions? For example, if they are unsure, should they choose “no”? I think you referenced a Facilitation Guide; it would be important to specify any instructions for whoever is answering these questions, to make it more likely that respondents take a similar approach.
[x] It would be great to encourage the use of the Facilitation guide to the extent that you can. Asking staff to fill out this set of information is essentially asking them to do operations research.
Score Summary tab
[x] It seems that the “Standard Achieved” cell is an important outcome of this exercise. It might be good to have a visual cue indicating good performance or poor performance here.
a. For ex., if my site fails to meet the criteria for Availability and Appropriateness of Services, and the NO appears under Standard Achieved, can it change to red?
b. Conversely if there is a YES, can it change to green?
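The red/green cue requested above is standard Excel conditional formatting. As a hedged sketch of the equivalent rules in openpyxl (sheet name, cell range, and file name are hypothetical):

```python
from openpyxl import Workbook
from openpyxl.formatting.rule import CellIsRule
from openpyxl.styles import PatternFill

wb = Workbook()
ws = wb.active
ws.title = "Score Summary"  # hypothetical sheet name
ws["D2"] = "YES"            # hypothetical "Standard Achieved" cells
ws["D3"] = "NO"

# Excel's standard "bad" (light red) and "good" (light green) fills.
red = PatternFill(start_color="FFC7CE", end_color="FFC7CE", fill_type="solid")
green = PatternFill(start_color="C6EFCE", end_color="C6EFCE", fill_type="solid")

# Turn "NO" cells red and "YES" cells green in the Standard Achieved column.
ws.conditional_formatting.add(
    "D2:D20", CellIsRule(operator="equal", formula=['"NO"'], fill=red))
ws.conditional_formatting.add(
    "D2:D20", CellIsRule(operator="equal", formula=['"YES"'], fill=green))

wb.save("score_summary_sketch.xlsx")
```

In the workbook itself, the same rules can be set via Home → Conditional Formatting → Highlight Cell Rules → Equal To.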
[x] The font on this form is size 9. The content on this page is important; can you increase the font size?
2) GBV QA Master Dataset and Dashboard 20181217
Tab by Tab Review
[x] Can you add an instructions page to this tool? It might seem like overkill, but it would be good to specify that it is connected to the first tool.
[x] Is it possible to combine these 2 tools into 1 tool?
[x] Confirming that the raw data tab will be hidden when packaged for others to use, right?
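Hiding the raw data tab before packaging is a one-line sheet setting. A minimal openpyxl sketch, with hypothetical tab names; formulas and the dashboard keep reading the hidden sheet:

```python
from openpyxl import Workbook

wb = Workbook()
raw = wb.create_sheet("Raw Data")    # hypothetical tab name
dash = wb.create_sheet("Dashboard")  # hypothetical tab name

# Hide the raw data tab before distributing the workbook; the data
# remains available to formulas, but the tab is not shown to users.
raw.sheet_state = "hidden"

wb.save("dashboard_packaged.xlsx")
```

The same is done manually by right-clicking the tab and choosing Hide (or Very Hidden via the VBA editor, if users should not be able to unhide it themselves).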
Overview of Results tab
[x] The visuals that you have on the Results by Site tab are really great. They are so great, that I am wondering, what is the added value of the Overview of Results tab? Is it actually telling us something in addition to the analyses that are already being shown on the Results by Site tab? Or is it just adding bar graphs?
[x] In reviewing the QA Category Scores – I would display the sum of the score and the total verification criteria differently.
a. If the interpretation of the first graph is that, out of a total possible score of 9, the facility scored 5, then I would show this as a percentage.
b. You could use a stacked bar chart. A regular bar chart would normally compare values, and it looks like you really want to show a percentage.
[x] Bottom light blue area – the values for the Sum of Score and Total Verification Criteria should be centered so it is clear what the values refer to.
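As a quick worked version of the percentage suggestion above (using the figures from the comment: 5 criteria met of a possible 9), the pair of values feeding a 100%-stacked bar would be computed like this; the function name is hypothetical:

```python
# Convert a raw "sum of score / total verification criteria" pair into
# the two segments of a 100%-stacked bar (% achieved, % remaining).
def score_percentages(score_sum: int, total_criteria: int) -> tuple[float, float]:
    """Return (% achieved, % remaining), each rounded to one decimal."""
    achieved = round(100 * score_sum / total_criteria, 1)
    return achieved, round(100 - achieved, 1)

print(score_percentages(5, 9))  # -> (55.6, 44.4)
```

In Excel this is simply `=score/total` formatted as a percentage, charted as a 100% Stacked Bar.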
Results by Site tab
[x] Can you add a legend for the colors? Are those thresholds consistent with the Jhpiego tool (and important to the overall methodology) or established by ICPI to indicate good performance or needs improvement?
[x] The Radar Chart is very good at summarizing the overall situation. Consider moving this up on the page so it isn’t overlooked.
@shapaklyak can you please review the comments below and address if appropriate?
Additional Comments
[x] It’s great that this analysis draws on data from a health facility assessment. Do our SIMS indicators capture any similar data?
[x] Are there any other existing instructions as to how to interpret the findings & take next steps? For ex., if 50% of my categories are in red, are there priority areas to focus on? Or are all areas priorities?
[x] It would be great for someone new to this technical area to have definitions for QA Categories, QA Standards, and QA Verification. Maybe these could be spelled out in the overview tab you will add, since the visuals will likely be used separately from the facilitation guide and data entry, and there would otherwise be no sense of the overall meaning.