Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Many stakeholders - especially on marketing or UX teams - push back on the awkward phrasing of desired outcome statements because people don't talk that way. The alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey. The number of statements is arbitrary and can be expanded to accommodate your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to verify the repair meets predefined quality standards, e.g., functionality, appearance, etc.
- Minimize the time it takes to confirm all initially reported issues have been addressed, e.g., software bugs, hardware malfunctions, etc.
- Minimize the time it takes to ensure the solution integrates seamlessly with existing systems, e.g., CRM, data analytics platforms, etc.
- Minimize the time it takes to validate the repair's compliance with industry regulations, e.g., data protection, consumer safety, etc.
- Minimize the time it takes to assess the repair's impact on overall system performance, e.g., speed, reliability, etc.
- Minimize the time it takes to check for any unintended consequences of the repair, e.g., new bugs, system vulnerabilities, etc.
- Minimize the time it takes to evaluate the repair's adherence to the project timeline, e.g., completion dates, milestones, etc.
- Minimize the time it takes to review warranty or guarantee terms post-repair, e.g., coverage period, conditions, etc.
- Minimize the time it takes to gather feedback from stakeholders on the repair outcome, e.g., end-users, project team, etc.
- Minimize the time it takes to document the repair process and outcome for future reference, e.g., repair methods, parts used, etc.
- Minimize the time it takes to communicate the completion of the repair to all relevant parties, e.g., team members, management, etc.
- Minimize the time it takes to plan for any necessary follow-up actions or monitoring, e.g., additional tests, ongoing maintenance, etc.
- Minimize the time it takes to assess the cost-effectiveness of the repair solution, e.g., expenses, resource allocation, etc.
- Minimize the time it takes to ensure the repair does not disrupt existing workflows or processes, e.g., daily operations, user experience, etc.
- Minimize the time it takes to verify the repair's durability and long-term viability, e.g., stress tests, usage simulations, etc.
- Minimize the time it takes to confirm the repair has not affected system security, e.g., data encryption, access controls, etc.
- Minimize the time it takes to evaluate the repair provider's service quality, e.g., responsiveness, expertise, etc.
- Minimize the time it takes to determine the need for any additional training or support post-repair, e.g., user manuals, helpdesk, etc.
- Minimize the time it takes to assess the repair's alignment with business objectives, e.g., customer satisfaction, market competitiveness, etc.
- Minimize the time it takes to plan for the decommissioning or replacement of irreparable components, e.g., outdated hardware, unsupported software, etc.
Customer Success Statements (PJTBD)
- Verify the repair meets predefined quality standards, e.g., functionality, appearance, etc.
- Confirm all initially reported issues have been addressed, e.g., software bugs, hardware malfunctions, etc.
- Ensure the solution integrates seamlessly with existing systems, e.g., CRM, data analytics platforms, etc.
- Validate the repair's compliance with industry regulations, e.g., data protection, consumer safety, etc.
- Assess the repair's impact on overall system performance, e.g., speed, reliability, etc.
- Check for any unintended consequences of the repair, e.g., new bugs, system vulnerabilities, etc.
- Evaluate the repair's adherence to the project timeline, e.g., completion dates, milestones, etc.
- Review warranty or guarantee terms post-repair, e.g., coverage period, conditions, etc.
- Gather feedback from stakeholders on the repair outcome, e.g., end-users, project team, etc.
- Document the repair process and outcome for future reference, e.g., repair methods, parts used, etc.
- Communicate the completion of the repair to all relevant parties, e.g., team members, management, etc.
- Plan for any necessary follow-up actions or monitoring, e.g., additional tests, ongoing maintenance, etc.
- Assess the cost-effectiveness of the repair solution, e.g., expenses, resource allocation, etc.
- Ensure the repair does not disrupt existing workflows or processes, e.g., daily operations, user experience, etc.
- Verify the repair's durability and long-term viability, e.g., stress tests, usage simulations, etc.
- Confirm the repair has not affected system security, e.g., data encryption, access controls, etc.
- Evaluate the repair provider's service quality, e.g., responsiveness, expertise, etc.
- Determine the need for any additional training or support post-repair, e.g., user manuals, helpdesk, etc.
- Assess the repair's alignment with business objectives, e.g., customer satisfaction, market competitiveness, etc.
- Plan for the decommissioning or replacement of irreparable components, e.g., outdated hardware, unsupported software, etc.
Test Fit Structure
Apply this test to Customer Success Statements only; everything should fit together naturally. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as the test does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]