Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people don't talk like that), the alternative is a natural-language structure that gets to the heart of the outcome and avoids tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey. The number of statements generated is arbitrary and can be expanded to accommodate your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to verify the functionality of the customized solution or product, e.g., performance tests, user acceptance tests, etc.
- Minimize the time it takes to confirm the customized solution or product meets the specified requirements, e.g., design specifications, user requirements, etc.
- Minimize the likelihood of overlooking any defects or issues in the customized solution or product, e.g., software bugs, hardware malfunctions, etc.
- Minimize the time it takes to assess the user-friendliness of the customized solution or product, e.g., ease of use, intuitiveness, etc.
- Minimize the time it takes to evaluate the performance of the customized solution or product under different conditions, e.g., stress tests, load tests, etc.
- Minimize the likelihood of missing any non-compliance issues with the customized solution or product, e.g., regulatory standards, industry norms, etc.
- Minimize the time it takes to determine the scalability of the customized solution or product, e.g., capacity tests, growth projections, etc.
- Minimize the time it takes to confirm the security features of the customized solution or product, e.g., penetration tests, vulnerability assessments, etc.
- Minimize the likelihood of overlooking any integration issues with the customized solution or product, e.g., compatibility with existing systems, interoperability, etc.
- Minimize the time it takes to assess the reliability of the customized solution or product, e.g., failure rate, mean time between failures, etc.
- Minimize the likelihood of failing to identify any potential improvements in the customized solution or product, e.g., performance enhancements, feature additions, etc.
- Minimize the time it takes to verify the maintainability of the customized solution or product, e.g., ease of updates, modularity, etc.
- Minimize the likelihood of overlooking any usability issues in the customized solution or product, e.g., user interface design, accessibility, etc.
- Minimize the time it takes to confirm the robustness of the customized solution or product, e.g., error handling, fault tolerance, etc.
- Minimize the likelihood of missing any potential risks associated with the customized solution or product, e.g., security risks, operational risks, etc.
- Minimize the time it takes to evaluate the efficiency of the customized solution or product, e.g., resource usage, response time, etc.
- Minimize the likelihood of failing to identify any cost implications of the customized solution or product, e.g., operational costs, maintenance costs, etc.
- Minimize the time it takes to confirm the compatibility of the customized solution or product with the intended environment, e.g., hardware requirements, software dependencies, etc.
- Minimize the likelihood of overlooking any potential conflicts with existing systems or processes, e.g., data conflicts, process disruptions, etc.
- Minimize the time it takes to verify the overall quality of the customized solution or product, e.g., quality assurance tests, quality control checks, etc.
Customer Success Statements (PJTBD)
- Verify the functionality of the customized solution or product, e.g., performance tests, user acceptance tests, etc.
- Confirm the customized solution or product meets the specified requirements, e.g., design specifications, user requirements, etc.
- Avoid overlooking any defects or issues in the customized solution or product, e.g., software bugs, hardware malfunctions, etc.
- Assess the user-friendliness of the customized solution or product, e.g., ease of use, intuitiveness, etc.
- Evaluate the performance of the customized solution or product under different conditions, e.g., stress tests, load tests, etc.
- Avoid missing any non-compliance issues with the customized solution or product, e.g., regulatory standards, industry norms, etc.
- Determine the scalability of the customized solution or product, e.g., capacity tests, growth projections, etc.
- Confirm the security features of the customized solution or product, e.g., penetration tests, vulnerability assessments, etc.
- Avoid overlooking any integration issues with the customized solution or product, e.g., compatibility with existing systems, interoperability, etc.
- Assess the reliability of the customized solution or product, e.g., failure rate, mean time between failures, etc.
- Avoid failing to identify any potential improvements in the customized solution or product, e.g., performance enhancements, feature additions, etc.
- Verify the maintainability of the customized solution or product, e.g., ease of updates, modularity, etc.
- Avoid overlooking any usability issues in the customized solution or product, e.g., user interface design, accessibility, etc.
- Confirm the robustness of the customized solution or product, e.g., error handling, fault tolerance, etc.
- Avoid missing any potential risks associated with the customized solution or product, e.g., security risks, operational risks, etc.
- Evaluate the efficiency of the customized solution or product, e.g., resource usage, response time, etc.
- Avoid failing to identify any cost implications of the customized solution or product, e.g., operational costs, maintenance costs, etc.
- Confirm the compatibility of the customized solution or product with the intended environment, e.g., hardware requirements, software dependencies, etc.
- Avoid overlooking any potential conflicts with existing systems or processes, e.g., data conflicts, process disruptions, etc.
- Verify the overall quality of the customized solution or product, e.g., quality assurance tests, quality control checks, etc.
Test Fit Structure
Apply this structure to Customer Success Statements only; the pieces should fit together naturally when read aloud. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]