20. Is the Master Data Services scope complete and appropriately sized?
<--- Score
21. Are team charters developed?
<--- Score
22. How often are the team meetings?
<--- Score
23. How was the ‘as is’ process map developed, reviewed, verified and validated?
<--- Score
24. What are the compelling stakeholder reasons for embarking on Master Data Services?
<--- Score
25. Is the work to date meeting requirements?
<--- Score
26. Is there a Master Data Services management charter, including a stakeholder case, problem and goal statements, scope, milestones, roles and responsibilities, and a communication plan?
<--- Score
27. Have all of the relationships been defined properly?
<--- Score
28. What are the dynamics of the communication plan?
<--- Score
29. Who are the Master Data Services improvement team members, including Management Leads and Coaches?
<--- Score
30. Are the required metrics defined, and if so, what are they?
<--- Score
31. What is the scope of sensitive information?
<--- Score
32. Is there a completed, verified, and validated high-level ‘as is’ (not ‘should be’ or ‘could be’) stakeholder process map?
<--- Score
33. Has a project plan, Gantt chart, or similar been developed/completed?
<--- Score
34. What defines best in class?
<--- Score
35. Are additional requirements, or greater clarity of existing requirements, necessary?
<--- Score
36. Have specific policy objectives been defined?
<--- Score
37. What are the requirements for systems?
<--- Score
38. Does the function perform as designed and according to requirements?
<--- Score
39. Has/have the customer(s) been identified?
<--- Score
40. Is there a critical path to deliver Master Data Services results?
<--- Score
41. Who defines (or who defined) the rules and roles?
<--- Score
42. Has the improvement team collected the ‘voice of the customer’ (obtained feedback – qualitative and quantitative)?
<--- Score
43. What critical content must be communicated – who, what, when, where, and how?
<--- Score
44. How do you hand over Master Data Services context?
<--- Score
45. How do you manage changes in Master Data Services requirements?
<--- Score
46. When are meeting minutes sent out? Who is on the distribution list?
<--- Score
47. Is what you do clearly defined within, and communicated to, your organization?
<--- Score
48. How would you define Master Data Services leadership?
<--- Score
49. How much governance is required based on domains and use cases?
<--- Score
50. How do you gather Master Data Services requirements?
<--- Score
51. Who is gathering Master Data Services information?
<--- Score
52. Is Master Data Services currently on schedule according to the plan?
<--- Score
53. What scope should be assessed?
<--- Score
54. What gets examined?
<--- Score
55. Are accountability and ownership for Master Data Services clearly defined?
<--- Score
56. Have all basic functions of Master Data Services been defined?
<--- Score
57. How do you build the right business case?
<--- Score
58. Are customers identified and high impact areas defined?
<--- Score
59. What would be the goal or target for a Master Data Services improvement team?
<--- Score
60. Is there regularly 100% attendance at the team meetings? If not, have appointed substitutes attended to preserve cross-functionality and full representation?
<--- Score
61. What are the requirements for audit information?
<--- Score
62. What is in scope?
<--- Score
63. What sources do you use to gather information for a Master Data Services study?
<--- Score
64. Is the team equipped with available and reliable resources?
<--- Score
65. What is the estimated completion date?
<--- Score
66. Are all requirements met?
<--- Score
67. Is the improvement team aware of the different versions of a process: what they think it is vs. what it actually is vs. what it should be vs. what it could be?
<--- Score
68. Has the Master Data Services work been fairly and/or equitably divided and delegated among team members who are qualified and capable to perform the work? Has everyone contributed?
<--- Score
69. What information do you gather?
<--- Score
70. The political context: who holds power?
<--- Score
71. Are different versions of process maps needed to account for the different types of inputs?
<--- Score
72. Are resources adequate for the scope?
<--- Score
73. Has anyone else (internal or external to the group) attempted to solve this problem or a similar one before? If so, what knowledge can be leveraged from these previous efforts?
<--- Score
74. What are your hardware and software requirements?
<--- Score
75. Is the Master Data Services scope manageable?
<--- Score