70. How do you manage scope?
<--- Score
71. Are task requirements clearly defined?
<--- Score
72. Is full participation by members in regularly held team meetings guaranteed?
<--- Score
73. Is there a critical path to deliver DRIVE Technology results?
<--- Score
74. Have specific policy objectives been defined?
<--- Score
75. Has the DRIVE Technology work been fairly and/or equitably divided and delegated among team members who are qualified and capable of performing the work? Has everyone contributed?
<--- Score
76. What is the worst-case scenario?
<--- Score
77. When is the estimated completion date?
<--- Score
78. Have all of the relationships been defined properly?
<--- Score
79. How is the team tracking and documenting its work?
<--- Score
80. Has a high-level ‘as is’ process map been completed, verified and validated?
<--- Score
81. What are the DRIVE Technology tasks and definitions?
<--- Score
82. What would be the goal or target for a DRIVE Technology improvement team?
<--- Score
83. How will the DRIVE Technology team and the group measure complete success of DRIVE Technology?
<--- Score
84. Is there a clear DRIVE Technology case definition?
<--- Score
85. Are customer(s) identified and segmented according to their different needs and requirements?
<--- Score
86. Are required metrics defined? If so, what are they?
<--- Score
87. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) stakeholder process map?
<--- Score
88. What happens if DRIVE Technology’s scope changes?
<--- Score
89. Does the scope remain the same?
<--- Score
90. How did the DRIVE Technology manager receive input into the development of a DRIVE Technology improvement plan and the estimated completion dates/times of each activity?
<--- Score
91. What defines best in class?
<--- Score
92. Are team charters developed?
<--- Score
93. Are there any known constraints that bear on the ability to perform DRIVE Technology work? How is the team addressing them?
<--- Score
94. What is the definition of success?
<--- Score
95. What knowledge or experience is required?
<--- Score
96. Do you have organizational privacy requirements?
<--- Score
97. How do you catch DRIVE Technology definition inconsistencies?
<--- Score
98. How and when will the baselines be defined?
<--- Score
99. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score
100. When is/was the DRIVE Technology start date?
<--- Score
101. What are the roles and responsibilities of each team member and the team's leadership? Where is this documented?
<--- Score
102. Is data collected and displayed to better understand customer(s)' critical needs and requirements?
<--- Score
103. In what way can you redefine, in your favor, the criteria of choice that clients have in your category?
<--- Score
104. Are stakeholder processes mapped?
<--- Score
105. What is the scope of the DRIVE Technology work?
<--- Score
106. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
107. Is what you do clearly defined within and communicated to your organization?
<--- Score
108. How would you define the culture at your organization, and how susceptible is it to DRIVE Technology changes?
<--- Score
109. What are the dynamics of the communication plan?
<--- Score
110. What DRIVE Technology services do you require?
<--- Score
111. How do you manage unclear DRIVE Technology requirements?
<--- Score
112. Do you have a DRIVE Technology success story or case study ready to tell and share?
<--- Score
113. Is the DRIVE Technology scope manageable?
<--- Score
114. Is DRIVE Technology linked to key stakeholder goals and objectives?
<--- Score
115. Is the DRIVE Technology scope complete and appropriately sized?
<--- Score
116. What specifically is the problem? Where does it occur? When does it occur? What is its extent?
<--- Score
117. Are roles and responsibilities formally defined?
<--- Score
118. Is the work to date meeting requirements?
<--- Score
119. Is the team adequately staffed with the desired cross-functionality? If not, what additional resources are available to the team?
<--- Score
120. Is there a completed SIPOC representation, describing the Suppliers, Inputs, Process, Outputs, and Customers?
<--- Score
121. What is the scope of the DRIVE Technology effort?
<--- Score
122. Have the customer needs been translated into specific, measurable requirements? How?
<--- Score
123. What is in scope and what is not in scope?
<--- Score
124. What are the boundaries of the scope?