Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially those on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people simply don't talk that way), the alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each format. You will likely need to reduce this set for a survey; the number of statements generated is arbitrary and can be expanded to accommodate your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to identify the necessary configuration steps, e.g., installation, setup, testing, etc.
- Minimize the time it takes to determine the order of configuration steps, e.g., sequential, parallel, dependent, etc.
- Minimize the time it takes to identify the tools and resources required for configuration, e.g., software, hardware, manuals, etc.
- Minimize the time it takes to understand the product's technical specifications, e.g., system requirements, compatibility, etc.
- Minimize the time it takes to identify potential configuration challenges, e.g., complex settings, compatibility issues, etc.
- Minimize the likelihood of missing a critical configuration step, e.g., system checks, user permissions, etc.
- Minimize the time it takes to document the configuration process, e.g., step-by-step instructions, diagrams, etc.
- Minimize the time it takes to verify the configuration settings, e.g., system checks, user testing, etc.
- Minimize the likelihood of configuration errors leading to system issues, e.g., crashes, performance degradation, etc.
- Minimize the time it takes to communicate the configuration process to stakeholders, e.g., team members, clients, etc.
- Minimize the time it takes to update the configuration documentation, e.g., version control, change logs, etc.
- Minimize the likelihood of miscommunication about the configuration process, e.g., unclear instructions, missing information, etc.
- Minimize the time it takes to train others on the configuration process, e.g., workshops, tutorials, etc.
- Minimize the likelihood of configuration changes leading to system instability, e.g., crashes, performance issues, etc.
- Minimize the time it takes to review and validate the configuration process, e.g., peer review, quality checks, etc.
- Minimize the likelihood of configuration documentation becoming outdated, e.g., product updates, system changes, etc.
- Minimize the time it takes to incorporate feedback into the configuration process, e.g., user feedback, system logs, etc.
- Minimize the likelihood of configuration steps being performed out of order, e.g., dependencies, prerequisites, etc.
- Minimize the time it takes to resolve configuration issues, e.g., troubleshooting, technical support, etc.
- Minimize the likelihood of configuration settings causing security vulnerabilities, e.g., open ports, default passwords, etc.
Customer Success Statements (PJTBD)
- Identify the necessary configuration steps, e.g., installation, setup, testing, etc.
- Determine the order of configuration steps, e.g., sequential, parallel, dependent, etc.
- Identify the tools and resources required for configuration, e.g., software, hardware, manuals, etc.
- Understand the product's technical specifications, e.g., system requirements, compatibility, etc.
- Identify potential configuration challenges, e.g., complex settings, compatibility issues, etc.
- Avoid missing a critical configuration step, e.g., system checks, user permissions, etc.
- Document the configuration process, e.g., step-by-step instructions, diagrams, etc.
- Verify the configuration settings, e.g., system checks, user testing, etc.
- Avoid configuration errors leading to system issues, e.g., crashes, performance degradation, etc.
- Communicate the configuration process to stakeholders, e.g., team members, clients, etc.
- Update the configuration documentation, e.g., version control, change logs, etc.
- Avoid miscommunication about the configuration process, e.g., unclear instructions, missing information, etc.
- Train others on the configuration process, e.g., workshops, tutorials, etc.
- Avoid configuration changes leading to system instability, e.g., crashes, performance issues, etc.
- Review and validate the configuration process, e.g., peer review, quality checks, etc.
- Avoid configuration documentation becoming outdated, e.g., product updates, system changes, etc.
- Incorporate feedback into the configuration process, e.g., user feedback, system logs, etc.
- Avoid configuration steps being performed out of order, e.g., dependencies, prerequisites, etc.
- Resolve configuration issues, e.g., troubleshooting, technical support, etc.
- Avoid configuration settings causing security vulnerabilities, e.g., open ports, default passwords, etc.
Test Fit Structure
Apply this test to Customer Success Statements only; when a statement is plugged into the structure, everything should fit together naturally. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]
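To check the fit, it can help to assemble the full sentence from its parts. Below is a minimal sketch in Python; the end user, Job, and Job Step values are hypothetical placeholders (only the success statement comes from the catalog above), so substitute the parts from your own job map.

```python
# Minimal sketch for assembling the test-fit sentence.
# The end user, job, and job step values used below are hypothetical
# placeholders; only the success statement is taken from the catalog above.

def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Assemble the test-fit sentence for a Customer Success Statement."""
    return (
        f"As a(n) {end_user} who is {job}, you're trying to "
        f"{success_statement} faster and more accurately "
        f"so that you can successfully {job_step}."
    )

print(test_fit(
    end_user="IT administrator",            # hypothetical end user
    job="configuring the product",          # hypothetical Job
    success_statement="identify the necessary configuration steps",  # from the catalog
    job_step="plan the configuration",      # hypothetical Job Step
))
```

With these placeholder values, the assembled sentence reads: "As a(n) IT administrator who is configuring the product, you're trying to identify the necessary configuration steps faster and more accurately so that you can successfully plan the configuration." If a statement produces a sentence that doesn't read naturally, revise the statement until it does.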