Success Metrics
There are two formatting options available. The traditional desired outcome statement is a structure used in the Outcome-Driven Innovation (ODI) methodology. Many stakeholders, especially those on marketing or UX teams, push back on the awkward phrasing of desired outcome statements because people don't talk that way. The alternative is a natural-language structure that gets to the heart of the outcome and avoids tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey. The number of statements generated here is arbitrary and can be expanded to accommodate your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to identify the necessary configuration settings, e.g., network settings, user permissions, etc.
- Minimize the time it takes to input the initial configuration settings, e.g., IP addresses, usernames, etc.
- Minimize the time it takes to verify the correctness of the initial configuration settings, e.g., syntax checks, value ranges, etc.
- Minimize the likelihood of inputting incorrect configuration settings, e.g., wrong IP addresses, incorrect user permissions, etc.
- Minimize the time it takes to save and apply the initial configuration settings, e.g., saving configuration files, rebooting devices, etc.
- Minimize the likelihood of configuration settings not being saved correctly, e.g., file corruption, insufficient storage, etc.
- Minimize the time it takes to test the initial configuration settings, e.g., connectivity tests, functionality tests, etc.
- Minimize the likelihood of initial configuration settings causing system errors, e.g., system crashes, performance issues, etc.
- Minimize the time it takes to document the initial configuration process, e.g., noting settings, recording changes, etc.
- Minimize the likelihood of losing track of the configuration changes made, e.g., unsaved notes, unclear documentation, etc.
- Minimize the time it takes to communicate the initial configuration status to stakeholders, e.g., status reports, meetings, etc.
- Minimize the likelihood of miscommunication about the configuration status, e.g., unclear reports, missed meetings, etc.
- Minimize the time it takes to resolve any issues found during the initial configuration, e.g., resetting settings, troubleshooting, etc.
- Minimize the likelihood of issues persisting after the initial configuration, e.g., unresolved errors, recurring issues, etc.
- Minimize the time it takes to prepare for the next configuration steps, e.g., planning, scheduling, etc.
- Minimize the likelihood of being unprepared for the next configuration steps, e.g., lack of planning, scheduling conflicts, etc.
- Minimize the time it takes to train end users on the new configuration, e.g., training sessions, user guides, etc.
- Minimize the likelihood of end users struggling with the new configuration, e.g., lack of training, unclear user guides, etc.
- Minimize the time it takes to monitor the system after the initial configuration, e.g., system checks, performance monitoring, etc.
- Minimize the likelihood of missing any issues after the initial configuration, e.g., unnoticed errors, performance drops, etc.
Customer Success Statements (PJTBD)
- Identify the necessary configuration settings, e.g., network settings, user permissions, etc.
- Input the initial configuration settings, e.g., IP addresses, usernames, etc.
- Verify the correctness of the initial configuration settings, e.g., syntax checks, value ranges, etc.
- Avoid inputting incorrect configuration settings, e.g., wrong IP addresses, incorrect user permissions, etc.
- Save and apply the initial configuration settings, e.g., saving configuration files, rebooting devices, etc.
- Avoid configuration settings not being saved correctly, e.g., file corruption, insufficient storage, etc.
- Test the initial configuration settings, e.g., connectivity tests, functionality tests, etc.
- Avoid initial configuration settings causing system errors, e.g., system crashes, performance issues, etc.
- Document the initial configuration process, e.g., noting settings, recording changes, etc.
- Avoid losing track of the configuration changes made, e.g., unsaved notes, unclear documentation, etc.
- Communicate the initial configuration status to stakeholders, e.g., status reports, meetings, etc.
- Avoid miscommunication about the configuration status, e.g., unclear reports, missed meetings, etc.
- Resolve any issues found during the initial configuration, e.g., resetting settings, troubleshooting, etc.
- Avoid issues persisting after the initial configuration, e.g., unresolved errors, recurring issues, etc.
- Prepare for the next configuration steps, e.g., planning, scheduling, etc.
- Avoid being unprepared for the next configuration steps, e.g., lack of planning, scheduling conflicts, etc.
- Train end users on the new configuration, e.g., training sessions, user guides, etc.
- Avoid end users struggling with the new configuration, e.g., lack of training, unclear user guides, etc.
- Monitor the system after the initial configuration, e.g., system checks, performance monitoring, etc.
- Avoid missing any issues after the initial configuration, e.g., unnoticed errors, performance drops, etc.
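Since the full catalog is usually too long for a single survey, it can help to thin it programmatically. Here is a minimal sketch: the statement list is abbreviated to a few entries from the catalog above, and the subset size is an arbitrary assumption you should tune to your survey length.

```python
import random

# Abbreviated catalog; in practice, paste in the full set of 20 statements.
statements = [
    "Identify the necessary configuration settings",
    "Input the initial configuration settings",
    "Verify the correctness of the initial configuration settings",
    "Test the initial configuration settings",
    "Document the initial configuration process",
]

# Draw a smaller subset for the survey (k is arbitrary; adjust as needed).
random.seed(7)  # fixed seed so the draw is reproducible across runs
survey_subset = random.sample(statements, k=3)
for s in survey_subset:
    print(s)
```

In practice you would more likely prioritize statements by stakeholder input or prior research rather than sample at random; the sketch only shows the mechanics of cutting the list down.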
Test Fit Structure
Apply this structure to Customer Success Statements only; the parts should fit together naturally. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]
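The template above can be assembled mechanically. The sketch below fills it with hypothetical values drawn from the configuration examples in this catalog; the end user, job, and job step are illustrative assumptions, not prescribed wording.

```python
# Assemble a test-fit sentence from the template parts.
# All filled-in values are hypothetical examples for illustration.
end_user = "network administrator"
job = "setting up a new device"
success_statement = "verify the correctness of the initial configuration settings"
job_step = "complete the initial configuration"

sentence = (
    f"As a(n) {end_user} who is {job}, you're trying to "
    f'{success_statement} "faster and more accurately" '
    f"so that you can successfully {job_step}."
)
print(sentence)
```

If the assembled sentence reads awkwardly for a given statement, that is a useful signal the statement is a task or activity rather than an outcome, and it may need rewording.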