62. How do you think the partners involved in Digital contact tracing would have defined success?
<--- Score
63. What intelligence can you gather?
<--- Score
64. How do you gather requirements?
<--- Score
65. Who is gathering information?
<--- Score
66. What sources do you use to gather information for a Digital contact tracing study?
<--- Score
67. When are meeting minutes sent out? Who is on the distribution list?
<--- Score
68. What are the roles and responsibilities of each team member and of the team's leadership? Where is this documented?
<--- Score
69. What is the scope of sensitive information?
<--- Score
70. If substitutes have been appointed, have they been briefed on the Digital contact tracing goals and received regular communications as to the progress to date?
<--- Score
71. How does the Digital contact tracing manager ensure against scope creep?
<--- Score
72. Has a team charter been developed and communicated?
<--- Score
73. What was the context?
<--- Score
74. Are there different segments of customers?
<--- Score
75. What scope do you want your strategy to cover?
<--- Score
76. Do you have organizational privacy requirements?
<--- Score
77. Is scope creep really all bad news?
<--- Score
78. When is/was the Digital contact tracing start date?
<--- Score
79. Have all basic functions of Digital contact tracing been defined?
<--- Score
80. How do you keep key subject matter experts in the loop?
<--- Score
81. Who defines (or who defined) the rules and roles?
<--- Score
82. How can the value of Digital contact tracing be defined?
<--- Score
83. Has the Digital contact tracing work been fairly and/or equitably divided and delegated among team members who are qualified and capable to perform the work? Has everyone contributed?
<--- Score
84. Are the required metrics defined, and if so, what are they?
<--- Score
85. Has everyone on the team, including the team leaders, been properly trained?
<--- Score
86. Is the current ‘as is’ process being followed? If not, what are the discrepancies?
<--- Score
87. What information should you gather?
<--- Score
88. What is the definition of success?
<--- Score
89. Is the Digital contact tracing scope complete and appropriately sized?
<--- Score
90. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
91. Are the Digital contact tracing requirements testable?
<--- Score
92. What is the scope of the Digital contact tracing work?
<--- Score
93. Are there any constraints known that bear on the ability to perform Digital contact tracing work? How is the team addressing them?
<--- Score
94. What are the rough order estimates on cost savings/opportunities that Digital contact tracing brings?
<--- Score
95. What is the scope?
<--- Score
96. Have the customer needs been translated into specific, measurable requirements? How?
<--- Score
97. How and when will the baselines be defined?
<--- Score
98. How do you gather the stories?
<--- Score
99. Are accountability and ownership for Digital contact tracing clearly defined?
<--- Score
100. Have specific policy objectives been defined?
<--- Score
101. Is there a critical path to deliver Digital contact tracing results?
<--- Score
102. What are the tasks and definitions?
<--- Score
103. Is there any additional Digital contact tracing definition of success?
<--- Score
104. What scope needs to be assessed?
<--- Score
105. Are audit criteria, scope, frequency and methods defined?
<--- Score
106. Are different versions of process maps needed to account for the different types of inputs?
<--- Score
107. What customer feedback methods were used to solicit their input?
<--- Score
108. What are the compelling stakeholder reasons for embarking on Digital contact tracing?
<--- Score
109. How do you build the right business case?
<--- Score
110. Do you have a Digital contact tracing success story or case study ready to tell and share?
<--- Score
111. Is data collected and displayed to better understand the customers' critical needs and requirements?
<--- Score
112. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score
113. Are all requirements met?
<--- Score
114. Is Digital contact tracing currently on schedule according to the plan?
<--- Score
115. How was the ‘as is’ process map developed, reviewed, verified and validated?
<--- Score
116. What key stakeholder process output measure(s) does Digital contact tracing leverage and how?
<--- Score
117. Has the improvement team collected the ‘voice of