Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially those on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people don't talk like that), the alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each format. You will likely need to reduce this set for a survey; the number of statements generated is arbitrary and can be expanded to suit your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to evaluate the compatibility of different solutions, e.g., software compatibility, hardware compatibility, etc.
- Minimize the time it takes to verify the successful integration of all solutions, e.g., system tests, functionality checks, etc.
- Minimize the likelihood of integration failures leading to system downtime, e.g., software crashes, hardware malfunctions, etc.
- Minimize the time it takes to identify and address integration issues, e.g., error logs, troubleshooting, etc.
- Minimize the time it takes to confirm the stability of the integrated system, e.g., stress tests, performance monitoring, etc.
- Minimize the likelihood of data loss during integration, e.g., data backups, data validation, etc.
- Minimize the time it takes to ensure the security of the integrated system, e.g., security audits, vulnerability scans, etc.
- Minimize the likelihood of security breaches due to integration, e.g., unauthorized access, data leaks, etc.
- Minimize the time it takes to assess the performance of the integrated system, e.g., speed tests, load tests, etc.
- Minimize the likelihood of performance degradation due to integration, e.g., slow response times, system lags, etc.
- Minimize the time it takes to validate the functionality of the integrated system, e.g., user acceptance tests, functionality checks, etc.
- Minimize the likelihood of functionality issues due to integration, e.g., broken features, incorrect outputs, etc.
- Minimize the time it takes to document the integration process and outcomes, e.g., integration reports, system documentation, etc.
- Minimize the likelihood of miscommunication or misunderstanding due to lack of documentation, e.g., unclear instructions, missing information, etc.
- Minimize the time it takes to communicate the integration success to stakeholders, e.g., status updates, progress reports, etc.
- Minimize the likelihood of stakeholder dissatisfaction due to lack of communication, e.g., delayed updates, incomplete information, etc.
- Minimize the time it takes to plan for future integrations based on current success, e.g., lessons learned, best practices, etc.
- Minimize the likelihood of repeating past mistakes in future integrations, e.g., overlooked issues, repeated errors, etc.
- Minimize the time it takes to train end-users on the integrated system, e.g., user manuals, training sessions, etc.
- Minimize the likelihood of user errors due to lack of training, e.g., incorrect usage, system misuse, etc.
Customer Success Statements (PJTBD)
- Evaluate the compatibility of different solutions, e.g., software compatibility, hardware compatibility, etc.
- Verify the successful integration of all solutions, e.g., system tests, functionality checks, etc.
- Avoid integration failures leading to system downtime, e.g., software crashes, hardware malfunctions, etc.
- Identify and address integration issues, e.g., error logs, troubleshooting, etc.
- Confirm the stability of the integrated system, e.g., stress tests, performance monitoring, etc.
- Avoid data loss during integration, e.g., data backups, data validation, etc.
- Ensure the security of the integrated system, e.g., security audits, vulnerability scans, etc.
- Avoid security breaches due to integration, e.g., unauthorized access, data leaks, etc.
- Assess the performance of the integrated system, e.g., speed tests, load tests, etc.
- Avoid performance degradation due to integration, e.g., slow response times, system lags, etc.
- Validate the functionality of the integrated system, e.g., user acceptance tests, functionality checks, etc.
- Avoid functionality issues due to integration, e.g., broken features, incorrect outputs, etc.
- Document the integration process and outcomes, e.g., integration reports, system documentation, etc.
- Avoid miscommunication or misunderstanding due to lack of documentation, e.g., unclear instructions, missing information, etc.
- Communicate the integration success to stakeholders, e.g., status updates, progress reports, etc.
- Avoid stakeholder dissatisfaction due to lack of communication, e.g., delayed updates, incomplete information, etc.
- Plan for future integrations based on current success, e.g., lessons learned, best practices, etc.
- Avoid repeating past mistakes in future integrations, e.g., overlooked issues, repeated errors, etc.
- Train end-users on the integrated system, e.g., user manuals, training sessions, etc.
- Avoid user errors due to lack of training, e.g., incorrect usage, system misuse, etc.
Test Fit Structure
Apply this to Customer Success Statements only; everything should fit together nicely. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]
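For illustration only, here is a minimal sketch in Python of how the test-fit sentence can be assembled from its parts. The end user, Job, and Job Step values below are hypothetical placeholders, not part of the catalog; only the success statement is taken from the list above.

```python
def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Assemble a test-fit sentence for a Customer Success Statement (PJTBD)."""
    return (
        f"As a(n) {end_user} who is {job}, "
        f"you're trying to {success_statement} "
        f'"faster and more accurately" '
        f"so that you can successfully {job_step}."
    )

# Example usage with one statement from the catalog
# (end user, job, and job step are assumed for the sake of the example):
print(test_fit(
    end_user="systems administrator",
    job="integrating solutions",
    success_statement="verify the successful integration of all solutions",
    job_step="confirm the integration's success",
))
```

If the resulting sentence reads naturally for each statement you keep, the statement is a good fit for the survey; awkward results usually signal a task or activity masquerading as an outcome.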