Success Metrics
There are two formatting options. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people don't talk like that), the alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey; the number of statements generated is arbitrary and can be expanded to suit your needs. A short sketch after the two lists shows how the formats map onto each other.
Desired Outcome Statements (ODI)
- Minimize the time it takes to confirm the product details, e.g., model number, specifications, etc.
- Minimize the time it takes to verify the configuration requirements, e.g., hardware compatibility, software requirements, etc.
- Minimize the likelihood of overlooking critical product details, e.g., unique features, limitations, etc.
- Minimize the time it takes to cross-check product details with customer requirements, e.g., performance needs, budget constraints, etc.
- Minimize the likelihood of misinterpreting configuration details, e.g., setup instructions, user manuals, etc.
- Minimize the time it takes to validate the product's compatibility with existing systems, e.g., network infrastructure, software platforms, etc.
- Minimize the likelihood of missing important configuration steps, e.g., installation order, calibration procedures, etc.
- Minimize the time it takes to confirm the product's operational status, e.g., power-on self-test, diagnostic tests, etc.
- Minimize the likelihood of incorrectly configuring the product, e.g., wrong settings, improper connections, etc.
- Minimize the time it takes to verify the product's functionality post-configuration, e.g., performance tests, functionality checks, etc.
- Minimize the likelihood of overlooking product defects or malfunctions, e.g., physical damage, software bugs, etc.
- Minimize the time it takes to confirm the product's compliance with industry standards, e.g., safety regulations, quality standards, etc.
- Minimize the likelihood of failing to document the configuration process, e.g., setup steps, configuration settings, etc.
- Minimize the time it takes to validate the product's readiness for deployment, e.g., final checks, user acceptance tests, etc.
- Minimize the likelihood of overlooking necessary product updates or patches, e.g., firmware updates, software patches, etc.
- Minimize the time it takes to confirm the product's warranty and support details, e.g., warranty period, support channels, etc.
- Minimize the likelihood of failing to communicate the configuration details to relevant stakeholders, e.g., end-users, support team, etc.
- Minimize the time it takes to verify the product's integration with other systems, e.g., data flow, interoperability, etc.
- Minimize the likelihood of overlooking the need for user training or orientation, e.g., product usage, troubleshooting, etc.
- Minimize the time it takes to confirm the product's security settings and protocols, e.g., encryption, access controls, etc.
Customer Success Statements (PJTBD)
- Confirm the product details, e.g., model number, specifications, etc.
- Verify the configuration requirements, e.g., hardware compatibility, software requirements, etc.
- Avoid overlooking critical product details, e.g., unique features, limitations, etc.
- Cross-check product details with customer requirements, e.g., performance needs, budget constraints, etc.
- Avoid misinterpreting configuration details, e.g., setup instructions, user manuals, etc.
- Validate the product's compatibility with existing systems, e.g., network infrastructure, software platforms, etc.
- Avoid missing important configuration steps, e.g., installation order, calibration procedures, etc.
- Confirm the product's operational status, e.g., power-on self-test, diagnostic tests, etc.
- Avoid incorrectly configuring the product, e.g., wrong settings, improper connections, etc.
- Verify the product's functionality post-configuration, e.g., performance tests, functionality checks, etc.
- Avoid overlooking product defects or malfunctions, e.g., physical damage, software bugs, etc.
- Confirm the product's compliance with industry standards, e.g., safety regulations, quality standards, etc.
- Avoid failing to document the configuration process, e.g., setup steps, configuration settings, etc.
- Validate the product's readiness for deployment, e.g., final checks, user acceptance tests, etc.
- Avoid overlooking necessary product updates or patches, e.g., firmware updates, software patches, etc.
- Confirm the product's warranty and support details, e.g., warranty period, support channels, etc.
- Avoid failing to communicate the configuration details to relevant stakeholders, e.g., end-users, support team, etc.
- Verify the product's integration with other systems, e.g., data flow, interoperability, etc.
- Avoid overlooking the need for user training or orientation, e.g., product usage, troubleshooting, etc.
- Confirm the product's security settings and protocols, e.g., encryption, access controls, etc.
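The two catalogs are parallel: each Customer Success Statement is its ODI counterpart with the metric wrapper stripped away ("Minimize the time it takes to X" becomes "X"; "Minimize the likelihood of X" becomes "Avoid X"). Here is a minimal sketch of that mapping in Python, assuming those two wrapper patterns are the only ones in play:

```python
import re

def to_success_statement(odi: str) -> str:
    """Derive a Customer Success Statement (PJTBD) from a Desired Outcome Statement (ODI)."""
    # "Minimize the time it takes to <x>" -> "<X>" (capitalized)
    m = re.match(r"Minimize the time it takes to (.+)", odi)
    if m:
        rest = m.group(1)
        return rest[0].upper() + rest[1:]
    # "Minimize the likelihood of <x>" -> "Avoid <x>"
    m = re.match(r"Minimize the likelihood of (.+)", odi)
    if m:
        return "Avoid " + m.group(1)
    return odi  # leave anything that doesn't match untouched

print(to_success_statement(
    "Minimize the time it takes to confirm the product details, "
    "e.g., model number, specifications, etc."))
# Confirm the product details, e.g., model number, specifications, etc.
```

Running this over the ODI list reproduces the Customer Success list verbatim, which makes it a quick consistency check if you expand the catalog.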
Test Fit Structure
Apply this test to Customer Success Statements only; everything should fit together naturally. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] who is [Job], you're trying to [success statement] "faster and more accurately" so that you can successfully [Job Step].
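A minimal sketch of how the pieces compose, using one statement from the catalog above; the end user, Job, and Job Step values are hypothetical placeholders for illustration, not part of the catalog:

```python
def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Assemble a Test Fit sentence from its four components."""
    return (f"As a(n) {end_user} who is {job}, you're trying to "
            f"{success_statement} faster and more accurately "
            f"so that you can successfully {job_step}.")

print(test_fit(
    end_user="field technician",                      # hypothetical end user
    job="commissioning new network hardware",         # hypothetical Job
    success_statement="confirm the product details",  # from the catalog above
    job_step="complete the installation",             # hypothetical Job Step
))
```

If the assembled sentence reads awkwardly for a given statement, that is usually a sign the statement still contains a task or activity rather than an outcome.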