Success Metrics
There are two formatting options available. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Because many stakeholders, especially those on marketing or UX teams, push back on the awkward phrasing of desired outcome statements (people simply don't talk that way), the alternative is a natural-language structure that gets to the heart of the outcome while avoiding tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey; the number of statements generated here is arbitrary and can be expanded to accommodate your needs.
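If you only want to field a subset, one lightweight approach is to keep the statements as plain text and sample from them. Below is a minimal sketch in Python; the statement list is abbreviated and the sample size of 10 is an assumption, not a recommendation.

```python
import random

# Abbreviated catalog; in practice, paste in the full set of statements below.
customer_success_statements = [
    "Confirm the solution's operational parameters",
    "Validate the compatibility of the solution with existing systems",
    "Ensure all safety protocols are enabled",
    # ...
]

# Draw a random subset to keep the survey short (k=10 is an assumed target length).
survey_statements = random.sample(
    customer_success_statements,
    k=min(10, len(customer_success_statements)),
)

for statement in survey_statements:
    print(statement)
```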
Desired Outcome Statements (ODI)
- Minimize the time it takes to confirm the solution's operational parameters, e.g., voltage, current, frequency, etc.
- Minimize the time it takes to validate the compatibility of the solution with existing systems, e.g., software versions, hardware interfaces, etc.
- Minimize the time it takes to ensure all safety protocols are enabled, e.g., emergency stop functions, overload protections, etc.
- Minimize the time it takes to verify user access levels are correctly set, e.g., administrator, standard user, guest, etc.
- Minimize the time it takes to check the solution's performance under expected load conditions, e.g., maximum users, data throughput, etc.
- Minimize the time it takes to document the final configuration settings, e.g., IP addresses, device names, routing paths, etc.
- Minimize the time it takes to train end-users on the solution's operation, e.g., daily tasks, troubleshooting steps, etc.
- Minimize the time it takes to establish a maintenance schedule, e.g., software updates, hardware inspections, etc.
- Minimize the time it takes to create a backup and recovery plan, e.g., data backups, system images, etc.
- Minimize the time it takes to ensure compliance with relevant regulations, e.g., data protection laws, safety standards, etc.
- Minimize the time it takes to configure alerts for system errors or failures, e.g., email notifications, SMS alerts, etc.
- Minimize the time it takes to integrate the solution with monitoring tools, e.g., performance dashboards, error logs, etc.
- Minimize the time it takes to test failover and redundancy mechanisms, e.g., secondary power supply, network paths, etc.
- Minimize the time it takes to finalize user documentation and support materials, e.g., user manuals, FAQs, troubleshooting guides, etc.
- Minimize the time it takes to confirm remote access capabilities are functional, e.g., VPN connections, remote desktop, etc.
- Minimize the time it takes to ensure data encryption and security measures are in place, e.g., SSL certificates, firewalls, etc.
- Minimize the time it takes to validate external connectivity and integrations, e.g., third-party services, external databases, etc.
- Minimize the time it takes to optimize the solution for energy efficiency, e.g., power saving modes, efficient coding practices, etc.
- Minimize the likelihood of system downtime due to incorrect settings, e.g., network configurations, storage allocations, etc.
- Minimize the likelihood of user errors due to inadequate training, e.g., incorrect data entry, misuse of features, etc.
Customer Success Statements (PJTBD)
- Confirm the solution's operational parameters, e.g., voltage, current, frequency, etc.
- Validate the compatibility of the solution with existing systems, e.g., software versions, hardware interfaces, etc.
- Ensure all safety protocols are enabled, e.g., emergency stop functions, overload protections, etc.
- Verify user access levels are correctly set, e.g., administrator, standard user, guest, etc.
- Check the solution's performance under expected load conditions, e.g., maximum users, data throughput, etc.
- Document the final configuration settings, e.g., IP addresses, device names, routing paths, etc.
- Train end-users on the solution's operation, e.g., daily tasks, troubleshooting steps, etc.
- Establish a maintenance schedule, e.g., software updates, hardware inspections, etc.
- Create a backup and recovery plan, e.g., data backups, system images, etc.
- Ensure compliance with relevant regulations, e.g., data protection laws, safety standards, etc.
- Configure alerts for system errors or failures, e.g., email notifications, SMS alerts, etc.
- Integrate the solution with monitoring tools, e.g., performance dashboards, error logs, etc.
- Test failover and redundancy mechanisms, e.g., secondary power supply, network paths, etc.
- Finalize user documentation and support materials, e.g., user manuals, FAQs, troubleshooting guides, etc.
- Confirm remote access capabilities are functional, e.g., VPN connections, remote desktop, etc.
- Ensure data encryption and security measures are in place, e.g., SSL certificates, firewalls, etc.
- Validate external connectivity and integrations, e.g., third-party services, external databases, etc.
- Optimize the solution for energy efficiency, e.g., power saving modes, efficient coding practices, etc.
- Avoid system downtime due to incorrect settings, e.g., network configurations, storage allocations, etc.
- Avoid user errors due to inadequate training, e.g., incorrect data entry, misuse of features, etc.
Test Fit Structure
Apply this structure to Customer Success Statements only; if a statement is well formed, every element should fit together naturally. Here's an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job], you're trying to [success statement] + "faster and more accurately" so that you can successfully [Job Step].
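To see how the parts compose, here is a minimal sketch in Python that fills the template with one of the Customer Success Statements above. The end user, job, and job step values are hypothetical placeholders, not part of the catalog.

```python
def test_fit(end_user: str, job: str, success_statement: str, job_step: str) -> str:
    """Compose a test-fit sentence from the template parts."""
    # Choose "a" or "an" based on the end user's first letter.
    article = "an" if end_user[0].lower() in "aeiou" else "a"
    return (
        f"As {article} {end_user} who is {job}, "
        f"you're trying to {success_statement} faster and more accurately "
        f"so that you can successfully {job_step}."
    )

# Hypothetical end user, job, and job step; the success statement is taken
# from the Customer Success Statements catalog above.
print(test_fit(
    end_user="systems administrator",
    job="implementing a new monitoring solution",
    success_statement="validate the compatibility of the solution with existing systems",
    job_step="prepare the solution for day-to-day use",
))
```

If the resulting sentence reads naturally and each bracketed element makes sense in context, the statement passes the test fit; if it sounds forced, the statement likely describes a task or activity rather than an outcome.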