Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially those on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people don't talk like that), the alternative is a natural-language structure that gets to the heart of the outcome and avoids tasks and activities where feasible.
This catalog contains 20 potential metrics in each format. You will likely need to reduce this set for a survey; the number of statements generated is arbitrary and can be expanded or trimmed to suit your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to track the progress of the product configuration, e.g., status updates, progress reports, etc.
- Minimize the time it takes to identify any issues or errors during the configuration process, e.g., system alerts, error messages, etc.
- Minimize the likelihood of missing critical configuration steps, e.g., software installation, hardware setup, etc.
- Minimize the time it takes to verify the configuration settings, e.g., system checks, parameter validation, etc.
- Minimize the likelihood of configuration conflicts causing system instability, e.g., software incompatibility, hardware conflicts, etc.
- Minimize the time it takes to communicate configuration progress to stakeholders, e.g., status meetings, progress emails, etc.
- Minimize the likelihood of configuration delays impacting project timelines, e.g., resource allocation, scheduling conflicts, etc.
- Minimize the time it takes to resolve any configuration issues, e.g., troubleshooting, technical support, etc.
- Minimize the likelihood of configuration errors leading to system downtime, e.g., system crashes, performance issues, etc.
- Minimize the time it takes to document the configuration process, e.g., process logs, configuration reports, etc.
- Minimize the likelihood of incomplete configuration causing security vulnerabilities, e.g., open ports, default passwords, etc.
- Minimize the time it takes to validate the configuration against requirements, e.g., specification checks, requirement validation, etc.
- Minimize the likelihood of configuration changes causing data loss, e.g., data migration, backup failures, etc.
- Minimize the time it takes to update the configuration as requirements change, e.g., system updates, parameter changes, etc.
- Minimize the likelihood of configuration issues impacting user experience, e.g., system lag, functionality issues, etc.
- Minimize the time it takes to train end-users on the new configuration, e.g., user guides, training sessions, etc.
- Minimize the likelihood of configuration issues causing compliance violations, e.g., data privacy, industry standards, etc.
- Minimize the time it takes to test the product post-configuration, e.g., functionality tests, performance tests, etc.
- Minimize the likelihood of configuration issues going undetected, e.g., system monitoring, alert systems, etc.
- Minimize the time it takes to finalize the configuration and prepare for deployment, e.g., final checks, deployment planning, etc.
Customer Success Statements (PJTBD)
- Track the progress of the product configuration, e.g., status updates, progress reports, etc.
- Identify any issues or errors during the configuration process, e.g., system alerts, error messages, etc.
- Avoid missing critical configuration steps, e.g., software installation, hardware setup, etc.
- Verify the configuration settings, e.g., system checks, parameter validation, etc.
- Avoid configuration conflicts causing system instability, e.g., software incompatibility, hardware conflicts, etc.
- Communicate configuration progress to stakeholders, e.g., status meetings, progress emails, etc.
- Avoid configuration delays impacting project timelines, e.g., resource allocation, scheduling conflicts, etc.
- Resolve any configuration issues, e.g., troubleshooting, technical support, etc.
- Avoid configuration errors leading to system downtime, e.g., system crashes, performance issues, etc.
- Document the configuration process, e.g., process logs, configuration reports, etc.
- Avoid incomplete configuration causing security vulnerabilities, e.g., open ports, default passwords, etc.
- Validate the configuration against requirements, e.g., specification checks, requirement validation, etc.
- Avoid configuration changes causing data loss, e.g., data migration, backup failures, etc.
- Update the configuration as requirements change, e.g., system updates, parameter changes, etc.
- Avoid configuration issues impacting user experience, e.g., system lag, functionality issues, etc.
- Train end-users on the new configuration, e.g., user guides, training sessions, etc.
- Avoid configuration issues causing compliance violations, e.g., data privacy, industry standards, etc.
- Test the product post-configuration, e.g., functionality tests, performance tests, etc.
- Avoid configuration issues going undetected, e.g., system monitoring, alert systems, etc.
- Finalize the configuration and prepare for deployment, e.g., final checks, deployment planning, etc.
Test Fit Structure
Apply this to Customer Success Statements only; when the statements are well formed, every element should fit together naturally. Here’s an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply to their format directly.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]
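If you want to sanity-check fit mechanically, you can assemble the sentence from its parts and read the result aloud. Below is a minimal Python sketch of that assembly; the end user, job, and job step values are hypothetical placeholders, and the success statement is drawn from the catalog above.

```python
# Minimal sketch of the test-fit structure. All example field values are
# hypothetical placeholders; substitute your own job map and catalog entries.

def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Assemble one test-fit sentence:
    As a(n) [end user] + who is + [Job] + you're trying to +
    [success statement] + "faster and more accurately" +
    so that you can successfully + [Job Step]
    """
    return (
        f"As a(n) {end_user} who is {job}, "
        f"you're trying to {success_statement} faster and more accurately "
        f"so that you can successfully {job_step}."
    )

# Example using the first Customer Success Statement from the catalog;
# the end user, job, and job step are assumptions for illustration only.
print(test_fit(
    end_user="systems administrator",
    job="configuring the product",
    success_statement="track the progress of the product configuration",
    job_step="validate the configuration",
))
```

If the assembled sentence reads awkwardly, that usually signals the success statement still contains a task or activity rather than an outcome.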