Change:
The DPA’s Research & Analysis unit, responsible for mandatory federal reporting, regularly produces Timeliness Performance reports. The report titled “Applications Timeliness” will be used to produce this metric. It measures the percentage of applications processed in a timely manner.
To:
The DPA’s Research & Analysis unit, responsible for mandatory federal reporting, regularly produces Timeliness Performance reports. These reports will be used to produce this metric, as they measure the percentage of applications processed in a timely manner.
Change: As we begin, our baseline timeliness percentage, across all programs, is 73%, while our goal is 95% timeliness.
To: The current timeliness percentage, averaged across all programs, is 93%, while our goal is 95% timeliness.
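For illustration, here is a minimal Python sketch of how a timeliness percentage averaged across programs could be computed. The program names and counts below are hypothetical placeholders, not figures from the Timeliness Performance reports.

    def timeliness_pct(timely, total):
        # Percentage of applications processed within the required window.
        return 100.0 * timely / total

    # Hypothetical per-program counts: (timely applications, total applications).
    programs = {
        "Program A": (940, 1000),
        "Program B": (930, 1000),
        "Program C": (920, 1000),
    }

    # Simple unweighted average across programs.
    average = sum(timeliness_pct(t, n) for t, n in programs.values()) / len(programs)
    print(f"Averaged timeliness: {average:.1f}%  (goal: 95%)")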
(2. Acceptable Error Rate)
Change: The case management system in use by the DPA (PathOS) contains a reporting feature which allows us to generate a “Twelve Month Summary” report of each program’s error rates.
To: The case error rate will be generated from the Division’s case management system by a Public Assistance Analyst assisting the project, who produces a statewide summary of each program’s error rates within a given time frame. We are currently updating these yearly with twelve months’ data.
Change: Our baseline error rate across programs is 14%, and our goal is 8%.
To: Our current error rate across all programs is 25%, and our goal is 8%. In 2020, the error rate was 11%. The increase is due to an expansion in the category of errors. Previously, errors were only reported when the benefit amount was incorrect, or the category or subtype was incorrect. In 2020, the division began counting errors in correspondence, even when the error does not affect the benefit amount or category.
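To make the definitional change concrete, the sketch below (Python, with hypothetical case records) contrasts the pre-2020 error definition with the expanded one that also counts correspondence errors. None of the fields or counts are drawn from the case management system.

    # Hypothetical error flags per case; real data comes from the Division's
    # case management system.
    cases = [
        {"benefit": False, "category": False, "correspondence": False},
        {"benefit": True,  "category": False, "correspondence": False},
        {"benefit": False, "category": False, "correspondence": True},
        {"benefit": False, "category": True,  "correspondence": True},
    ]

    def is_error(case, include_correspondence):
        # Pre-2020: only benefit-amount or category/subtype errors counted.
        core = case["benefit"] or case["category"]
        # Expanded (2020+): correspondence errors also count, even when the
        # benefit amount and category are correct.
        return core or (include_correspondence and case["correspondence"])

    pre_2020 = 100 * sum(is_error(c, False) for c in cases) / len(cases)
    expanded = 100 * sum(is_error(c, True) for c in cases) / len(cases)
    print(f"Pre-2020 definition: {pre_2020:.0f}%   Expanded definition: {expanded:.0f}%")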
(3. Applications per day per Eligibility Technician)
Change: We work with our PathOS consultant, receiving this information quarterly.
To: To obtain this metric, we work with the contractor for our workflow management platform, Current™.
Change: Our baseline metric is 6, and our goal is 9 applications per day, per worker
To: Our current averaged metric is 5, and our goal is 9 applications per day, per worker.
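For reference, the applications-per-day-per-worker arithmetic, sketched in Python with illustrative numbers only; actual figures come from the contractor for our workflow management platform.

    # Illustrative counts only, not contractor-reported figures.
    applications_processed = 1100  # applications completed in the period
    working_days = 20              # working days in the period
    technicians = 11               # eligibility technicians on staff

    per_day_per_worker = applications_processed / (working_days * technicians)
    print(f"{per_day_per_worker:.1f} applications per day, per worker (goal: 9)")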
(SME Assignment)
Change: 1. DPA director sends SME request to Regional Managers and Supervisors
To: 1. DPA director, or designee, sends SME request to Regional Managers and Supervisors
Change: 3. DPA director forwards SME names to Project Management Office (PMO)
To: 3. DPA director, or designee, forwards SME names to Project Management Office (PMO)
Change: 4. PMO sends welcome letter: a. Introduces contract purpose b. SME participation parameters i. Method of research data collection ii. Contact and Meeting expectations c. Next Steps
To: 4. PMO sends welcome letter: a. Introduces contract purpose b. SME participation parameters (including method of research data collection and meeting expectations) c. Next Steps
(Discovery Session Format)
Remove: Diary studies-self-reported data from users over a period of time such as a few days. This method for Discovery necessitates a short diary period, in order to be contained within a two week sprint
Because: We have never employed this method, and probably wouldn’t.
(Pre-Production Usability Testing)
Change: With designated Subject Matter Experts (SME’s), guided and documented by the EIS-R Project Management Office (PMO).
To: With designated Subject Matter Experts (SME’s), guided and documented by the EIS-M Project Management Office (PMO).
Remove: These tests could be performed on location in Staff Development and Training (SD&T) training rooms, with results logged by the PMO into Azure DevOps
(Post Production Usability Testing)
Change: We’ll also need to begin collecting information from all users on the implemented feature
To: We may also need to collect information from users on the implemented feature.
Change: To launch this effort after release, we’ll release a short user survey to all DPA field workers
To: To launch this effort after release, we may, if appropriate, send a short user survey to select DPA workers.