Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially those on marketing or UX teams, push back on desired outcome statements as awkward (people simply don't talk that way), the alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each format. You will likely need to reduce the set for a survey; the number of statements generated is arbitrary and can be expanded to accommodate your needs.
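If the full set is too long to field, one simple (though naive) way to start trimming is to draw a random subset and review it by hand. The sketch below is only an illustration: it assumes the statements are kept in a plain Python list and that a survey budget of roughly ten items is the target, which is an arbitrary number here.

```python
import random

# A few Customer Success Statements from the catalog below
# (abbreviated; add the remaining statements as needed).
statements = [
    "Identify the necessary configuration settings",
    "Adjust the configuration settings to meet specific requirements",
    "Verify the configuration settings are correctly adjusted",
    "Document the configuration settings for future reference",
    "Resolve issues related to configuration settings",
]

# Draw a reproducible subset for the survey.
# The target size of 10 is an assumption, not a recommendation.
random.seed(42)
survey_subset = random.sample(statements, k=min(10, len(statements)))

for statement in survey_subset:
    print(statement)
```

A random draw is only a starting point; you would normally prune and prioritize the subset by hand before fielding it.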
Desired Outcome Statements (ODI)
- Minimize the time it takes to identify the necessary configuration settings, e.g., system preferences, user roles, etc.
- Minimize the time it takes to adjust the configuration settings to meet specific requirements, e.g., security protocols, data access, etc.
- Minimize the likelihood of overlooking critical configuration settings, e.g., network settings, user permissions, etc.
- Minimize the time it takes to verify the configuration settings are correctly adjusted, e.g., system tests, user tests, etc.
- Minimize the likelihood of configuration settings causing system errors or malfunctions, e.g., software crashes, data corruption, etc.
- Minimize the time it takes to document the configuration settings for future reference, e.g., system logs, user manuals, etc.
- Minimize the likelihood of configuration settings conflicting with other system settings, e.g., software compatibility, hardware requirements, etc.
- Minimize the time it takes to communicate the configuration settings to relevant stakeholders, e.g., system users, IT support, etc.
- Minimize the likelihood of configuration settings being incorrectly adjusted, e.g., input errors, misunderstanding requirements, etc.
- Minimize the time it takes to review and update the configuration settings as needed, e.g., system upgrades, policy changes, etc.
- Minimize the likelihood of configuration settings compromising system security, e.g., weak passwords, open ports, etc.
- Minimize the time it takes to train users on the new configuration settings, e.g., user guides, training sessions, etc.
- Minimize the likelihood of configuration settings causing user confusion or frustration, e.g., complex interfaces, unclear instructions, etc.
- Minimize the time it takes to resolve issues related to configuration settings, e.g., troubleshooting, system patches, etc.
- Minimize the likelihood of configuration settings leading to data loss or corruption, e.g., incorrect data paths, insufficient storage, etc.
- Minimize the time it takes to test the system performance after adjusting configuration settings, e.g., load tests, stress tests, etc.
- Minimize the likelihood of configuration settings affecting system performance negatively, e.g., slow response times, high CPU usage, etc.
- Minimize the time it takes to back up the system before adjusting configuration settings, e.g., data backups, system snapshots, etc.
- Minimize the likelihood of configuration settings causing system downtime, e.g., server crashes, network outages, etc.
- Minimize the time it takes to restore the system to previous configuration settings if needed, e.g., system restore, rollback procedures, etc.
Customer Success Statements (PJTBD)
- Identify the necessary configuration settings, e.g., system preferences, user roles, etc.
- Adjust the configuration settings to meet specific requirements, e.g., security protocols, data access, etc.
- Avoid overlooking critical configuration settings, e.g., network settings, user permissions, etc.
- Verify the configuration settings are correctly adjusted, e.g., system tests, user tests, etc.
- Avoid configuration settings causing system errors or malfunctions, e.g., software crashes, data corruption, etc.
- Document the configuration settings for future reference, e.g., system logs, user manuals, etc.
- Avoid configuration settings conflicting with other system settings, e.g., software compatibility, hardware requirements, etc.
- Communicate the configuration settings to relevant stakeholders, e.g., system users, IT support, etc.
- Avoid configuration settings being incorrectly adjusted, e.g., input errors, misunderstanding requirements, etc.
- Review and update the configuration settings as needed, e.g., system upgrades, policy changes, etc.
- Avoid configuration settings compromising system security, e.g., weak passwords, open ports, etc.
- Train users on the new configuration settings, e.g., user guides, training sessions, etc.
- Avoid configuration settings causing user confusion or frustration, e.g., complex interfaces, unclear instructions, etc.
- Resolve issues related to configuration settings, e.g., troubleshooting, system patches, etc.
- Avoid configuration settings leading to data loss or corruption, e.g., incorrect data paths, insufficient storage, etc.
- Test the system performance after adjusting configuration settings, e.g., load tests, stress tests, etc.
- Avoid configuration settings affecting system performance negatively, e.g., slow response times, high CPU usage, etc.
- Back up the system before adjusting configuration settings, e.g., data backups, system snapshots, etc.
- Avoid configuration settings causing system downtime, e.g., server crashes, network outages, etc.
- Restore the system to previous configuration settings if needed, e.g., system restore, rollback procedures, etc.
Test Fit Structure
Apply this to Customer Success Statements only; everything should fit together nicely. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] + you're trying to + [success statement] + "faster and more accurately" + so that you can successfully + [Job Step]
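To see how the pieces fit together, here is a minimal sketch that assembles the test-fit sentence from its parts. The success statement is taken from the catalog above; the end user, Job, and Job Step values are illustrative assumptions, not part of the catalog.

```python
def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Assemble the test-fit sentence for a Customer Success Statement."""
    # "a(n)" mirrors the template; pick "a" or "an" to match the end user.
    return (
        f"As a(n) {end_user} who is {job}, "
        f"you're trying to {success_statement} faster and more accurately "
        f"so that you can successfully {job_step}."
    )

print(test_fit(
    end_user="system administrator",   # assumed end user
    job="setting up new software",     # assumed Job
    success_statement="adjust the configuration settings to meet specific requirements",
    job_step="configure the system",   # assumed Job Step
))
```

If the filled-in sentence reads naturally, the statement fits the job step; if it sounds forced, revisit the statement's wording.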