Evaluation of the Michigan Disease Surveillance System: A Subjectivist Approach to Assessing an EDSS and Its Environmental Context

Buck, Matthew Paul (2017)

Permanent URL: https://etd.library.emory.edu/concern/etds/0z708x246?locale=pt-BR
Published

Abstract

Background:

Since 2004, the State of Michigan's Communicable Disease Division (CD Division) at the Michigan Department of Health and Human Services has used a custom-built, web-based surveillance tool, the Michigan Disease Surveillance System (MDSS), as its electronic disease surveillance system (EDSS). The MDSS accepts both manual entry of cases and electronic laboratory reports (ELRs), supports case investigation, and is the basis for subsequent reporting to the State's partners (e.g., to CDC). In that time, however, a comprehensive assessment of the system has never been completed. To assist the CD Division in assessing the remainder of the MDSS lifecycle, its environmental context, and whether it can remain viable into the future, a comprehensive analysis of system requirements, a comparative assessment of existing ('off-the-shelf') systems, an informatics capacity assessment, and a funding assessment were conducted.

Methods:

Using a subjectivist approach within a responsive/illuminative framework, system user feedback was solicited and analyzed to progressively elucidate specific system requirements, successes, and functionality gaps. Applying a standardized EDSS comparison tool (the vendor analysis), MDSS was then compared against other comprehensive 'off-the-shelf' EDSSs. To support identification and ongoing use of the most appropriate EDSS, an additional standardized assessment tool was used to assess the CD Division's capacity to engage in the informatics activities that support EDSSs for communicable disease surveillance. Finally, a high-level funding analysis was conducted to describe the funding environment in which MDSS is currently maintained and developed.
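To make the vendor analysis step concrete, the sketch below shows one way per-requirement vendor ratings can be rolled up into weighted core-area scores, which is the general shape of a standardized comparison matrix. This is a minimal illustration in Python; the core areas, requirement names, weights, and 0-3 rating scale shown here are hypothetical stand-ins, not the PHII tool's actual criteria.

    # Minimal sketch of a weighted EDSS comparison rollup (hypothetical
    # core areas, requirements, weights, and 0-3 rating scale).
    scores = {
        "Data Quality": {
            "ELR intake":    (3, 2),  # (weight, vendor rating 0-3)
            "Deduplication": (2, 1),
        },
        "Case Management": {
            "Investigation forms": (3, 3),
            "Case follow-up":      (2, 0),
        },
    }

    MAX_RATING = 3

    def core_area_score(requirements):
        """Weighted mean rating for one core area, normalized to 0-1."""
        total_weight = sum(w for w, _ in requirements.values())
        weighted_sum = sum(w * r for w, r in requirements.values())
        return weighted_sum / (total_weight * MAX_RATING)

    for area, reqs in scores.items():
        print(f"{area}: {core_area_score(reqs):.2f}")

Normalizing each core area to the same 0-1 range keeps areas with many requirements from dominating a cross-vendor comparison.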

Results:

The responsive/illuminative approach found that, overall, 82.7% of users would give MDSS an adequate or mostly adequate rating. Results showed that the CD Division should identify an EDSS solution that addresses specific attributes, including: Accuracy, Completeness, Consistency, Data Quality, Error Reduction, and System Reliability and Functionality. The standardized EDSS comparison tool provided an effective methodology and format for identifying and articulating system needs and for critically comparing systems. However, while the tool suggested that an alternative system may warrant consideration, significant limitations, namely the tool's dated representation of the evaluated systems and the rater bias introduced by adding the MDSS assessment, permitted only general findings. The informatics capacity assessment showed that the CD Division has a strong foundation of informatics practices and principles, sits in the range of a 'managed' level of maturity, and is well positioned to continue maturing. The funding review illustrated significant environmental constraints on the CD Division's ability to select and/or develop an EDSS for communicable disease surveillance, including a lack of state sponsorship, system enhancements restricted to funding opportunity requirements, and the lack of an effective funding communication and advocacy strategy.
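As an illustration of how a headline figure such as the 82.7% overall rating is computed from survey responses, the sketch below tallies the share of respondents choosing the top two rating categories. The category labels and counts are invented to reproduce a similar figure; the actual distribution appears in Figure 3.6 of the thesis.

    from collections import Counter

    # Hypothetical overall MDSS ratings, one entry per respondent.
    responses = (
        ["adequate"] * 71 + ["mostly adequate"] * 53 +
        ["somewhat inadequate"] * 18 + ["inadequate"] * 8
    )

    counts = Counter(responses)
    favorable = counts["adequate"] + counts["mostly adequate"]
    print(f"Adequate or mostly adequate: {favorable / len(responses):.1%}")
    # -> Adequate or mostly adequate: 82.7%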

Conclusions:

While this evaluation did not determine whether any 'off-the-shelf' EDSS should replace MDSS or whether MDSS should be further enhanced to address the system gaps identified in this analysis, several recommendations were identified to assist the CD Division in determining which attributes matter most in promoting confidence in any EDSS solution for communicable disease surveillance in Michigan. Specific recommendations were made that, regardless of the EDSS solution, would target: reducing variability in data input, supporting sophisticated data analytics, improving system response time, reducing incidents of missing case data, enhancing system alerts, developing a case follow-up module, and improving communication and transparency in system support. The vendor analysis showed that other 'off-the-shelf' EDSSs have potential as EDSS solutions and should be further investigated by the CD Division as viable options. The informatics capacity assessment led to the development of a specific five-year plan, with a three-year interim benchmark, that would move the CD Division from a 'managed' state to a 'defined' state of informatics practice. The funding review yielded three recommendations: that the CD Division develop a funding management model to organize and direct its funding activities; that it develop a managed EDSS development communication strategy; and that it work to identify more diversified funding sources, including efforts to obtain state sponsorship of communicable disease surveillance activities. The recommendations resulting from this evaluation were also compared to the recently published Public Health Commission (PHC) report to Governor Snyder and found to be consistent with the recommended PHC approach and its efforts to modernize public health toward Public Health 3.0.
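For readers less familiar with capability-maturity terminology, the sketch below shows one common way an averaged self-assessment score can be mapped onto ordered maturity levels such as 'managed' and 'defined'. The level names follow the usual capability-maturity ordering, but the score bands are illustrative assumptions; the 'Building an Informatics-Savvy Health Department' instrument defines its own scoring.

    import bisect

    # Illustrative maturity bands over a 1-5 average score; the real
    # self-assessment instrument may score and label levels differently.
    LEVELS = ["initial", "managed", "defined",
              "quantitatively managed", "optimizing"]
    THRESHOLDS = [1.8, 2.6, 3.4, 4.2]  # upper bound of each lower band

    def maturity_level(avg_score: float) -> str:
        """Map an average self-assessment score to a maturity level."""
        return LEVELS[bisect.bisect_right(THRESHOLDS, avg_score)]

    print(maturity_level(2.4))  # -> managed
    print(maturity_level(3.0))  # -> defined

Under a mapping like this, moving from a 'managed' to a 'defined' state amounts to raising the averaged self-assessment score past the next band boundary.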

Table of Contents

Chapter 1 - MDSS Evaluation Literature Review
1.1: What is an Electronic Disease Surveillance System - (a) General Description
1.1: What is an Electronic Disease Surveillance System - (b) Critical Functions
Figure 1.1: "Sequence of Actions Needed to Gather and Use Health-Related Information for Public Health Purposes"
1.3: EDSS Evaluations - (a) Necessity
1.3: EDSS Evaluations - (b) Core Requirements
1.3: EDSS Evaluations - (c) Existing Frameworks
Figure 1.2: "Logic Model for the nine-dimension evaluation framework to assess public health information systems"
Figure 1.3: "A Conceptual Framework of the Public Health System as a basis for measuring system performance"
1.4: Necessity of Public Health Informatics
1.5: MDSS
1.6: MDSS Evaluation

Chapter 2 - MDSS Evaluation Methodology
2.1: Evaluation Methodology Basis and Justification
Figure 2.1: "Natural History of a Subjectivist Study"
Table 2.1: "Classification of generic study types by broad study questions and the stakeholders most concerned"
Table 2.2: "Factors distinguishing the nine generic study types"
2.2: Evaluation Framework - (a) Overarching Design
2.2: Evaluation Framework - (b) Iterative Observations and User Feedback
Table 2.3: "Representative Focus Group Composition"
2.2: Evaluation Framework - (c) Cross-System Comparative Analysis
2.2: Evaluation Framework - (d) Funding Analysis
2.2: Evaluation Framework - (e) Informatics Capacity Self-Assessment
2.2: Evaluation Framework - (f) Prospective Public Health Capacity Readiness

Chapter 3 - MDSS Evaluation Results
3.1: Responsive/Illuminative MDSS Analysis - (a) Initial High-Level Requirements Gathering Results
3.1: Responsive/Illuminative MDSS Analysis - (b) Focus Group Discussion Results
Table 3.1: "Actual Representative Focus Group Composition"
3.1: Responsive/Illuminative MDSS Analysis - (c) MDSS User Survey Results
Table 3.2: "User Survey Population"
Table 3.3: "Respondent Population"
Table 3.4: "Adjusted Respondent Population"
Figure 3.1: "Information Quality Attributes"
Figure 3.2: "System Quality Attributes"
Figure 3.3: "Service Quality Attributes"
Figure 3.4: "User-Recommended Targeted Improvements"
Figure 3.5: "Satisfaction with Task Completion"
Figure 3.6: "Overall MDSS Rating"
Table 3.5: "Impact and Outcomes"
3.2: PHII EDSS Vendor Comparison - Results
Table 3.6: "Vendor Analysis - MDSS Results"
Figure 3.7: "MDSS Core Area Scores"
3.3: Building an Informatics-Savvy Health Department: A Self-Assessment - Results
Figure 3.8: "Informatics Capacity Self-Assessment Score Distribution"
Figure 3.9: "Informatics Capacity Self-Assessment Score Frequency Distribution"
3.4: Funding Assessment - Results

Chapter 4 - Discussion and Recommendations
4.1: Discussion - Overview
4.2: Responsive/Illuminative Approach - (a) Discussion
4.2: Responsive/Illuminative Approach - (b) Recommendations
Table 4.1: Recommendations - Goal 1, Objective 1
Table 4.2: Recommendations - Goal 1, Objective 2
Table 4.3: Recommendations - Goal 2
Table 4.4: Recommendations - Goal 3
4.3: PHII EDSS Vendor Comparison - Discussion and High-Level Recommendations
4.4: Building an Informatics-Savvy Health Department: A Self-Assessment - Discussion and Recommendations
4.5: Funding Assessment - Discussion and Recommendations
4.6: Additional Considerations
4.7: Limitations

Chapter 5 - Executive Summary
5.1: Executive Summary

Bibliography


Appendices

MDSS Evaluation: Appendix A - Responsive/Illuminative Analysis - High-Level Requirements Discussions and Funding Review
Requirements Derivation - Program/Unit Shadowing
9/28/2016 - 2:30 PM to 3:30 PM - MDSS Discussion with STD Unit Manager
10/5/2016 - 8:00 AM to 9:30 AM - MDSS Discussion with STD Unit Staff
10/6/2016 - 8:00 AM to 9:30 AM - MDSS Discussion with STD Unit Staff
10/12/2016 - 10:00 AM to 10:30 AM - MDSS Discussion with STD Unit Manager
10/12/2016 - 2:30 PM to 3:30 PM - MDSS Discussion with Hepatitis Surveillance Unit Staff
12/2/2016 - 1:00 PM to 2:00 PM - MDSS Discussion with Hepatitis Surveillance Unit Staff
Funding Review
Public Health Emergency Preparedness (PHEP) Cooperative Agreement Funding
Assessment, Assurance, Policy Development, and Prevention Strategies (AAPPS) Grant Review
Tuberculosis Surveillance Funding Review

MDSS Evaluation: Appendix B - Responsive/Illuminative Analysis - Focus Group Discussion Summaries
3.1.17 - Focus Group Discussion #1
3.1.17 - Focus Group Discussion, Organized by Attribute
3.8.17 - Focus Group Discussion #2
3.15.17 - Focus Group Discussion #3

MDSS Evaluation: Appendix C - PHII EDSS Vendor Comparison - Comparability Matrix Results

MDSS Evaluation: Appendix D - Building an Informatics-Savvy Health Department: A Self-Assessment - Completed for MDHHS CD Division
Section 1: Vision, Strategy, and Governance
Section 2: Skilled Workforce
Section 3: Effectively Used and Well-Designed Systems
Score Assessment
Vision, Strategy, and Governance
Skilled Workforce
Effectively Used and Well-Designed Systems
Overall Score



About this Master's Thesis

Rights statement
  • Permission granted by the author to include this thesis or dissertation in this repository. All rights reserved by the author. Please contact the author for information regarding the reproduction and use of this thesis or dissertation.
Language
  • English