Success Metrics
There are two formatting options. The traditional desired outcome statement is the structure used in the Outcome-Driven Innovation (ODI) methodology. Many stakeholders, especially on marketing or UX teams, push back on desired outcome statements because people don't talk that way; the alternative is a natural-language structure that gets to the heart of the outcome and avoids tasks and activities where feasible.
This catalog contains 20 potential metrics in each formatting option. You will likely need to reduce this set for a survey. The number of statements is arbitrary and can be expanded to accommodate your needs.
Desired Outcome Statements (ODI)
- Minimize the time it takes to verify the solution meets all specified requirements, e.g., functionality, performance, compatibility, etc.
- Minimize the time it takes to identify any discrepancies between the expected and actual solution outcomes, e.g., output accuracy, speed, user experience, etc.
- Minimize the time it takes to ensure all components of the solution are fully integrated, e.g., software modules, hardware parts, network connections, etc.
- Minimize the time it takes to confirm the solution's compatibility with existing systems, e.g., databases, operating systems, third-party applications, etc.
- Minimize the time it takes to validate the solution's performance under various conditions, e.g., peak load, stress scenarios, normal operation, etc.
- Minimize the time it takes to assess the solution's user interface for intuitiveness and ease of use, e.g., navigation, accessibility, error messaging, etc.
- Minimize the time it takes to check for any security vulnerabilities within the solution, e.g., data encryption, authentication mechanisms, access controls, etc.
- Minimize the time it takes to evaluate the solution's scalability for future growth, e.g., database expansion, user load increase, feature addition, etc.
- Minimize the time it takes to determine the solution's maintainability over time, e.g., code modularity, documentation quality, ease of updates, etc.
- Minimize the time it takes to confirm the solution adheres to relevant industry standards and regulations, e.g., privacy laws, data protection, compliance requirements, etc.
- Minimize the time it takes to ensure the solution's reliability for critical operations, e.g., system uptime, failover capabilities, data integrity, etc.
- Minimize the time it takes to verify the solution's interoperability with other systems and devices, e.g., API integrations, hardware compatibility, network protocols, etc.
- Minimize the time it takes to assess the solution's impact on existing workflows and processes, e.g., automation potential, efficiency gains, user adoption, etc.
- Minimize the time it takes to evaluate the solution's environmental impact, e.g., energy consumption, carbon footprint, recyclability, etc.
- Minimize the time it takes to confirm the solution's ability to handle data accurately and securely, e.g., encryption standards, backup procedures, data recovery, etc.
- Minimize the time it takes to ensure the solution provides adequate support and documentation for end-users, e.g., user manuals, online help, training materials, etc.
- Minimize the time it takes to verify the solution's ease of deployment and configuration, e.g., installation processes, customization options, initial setup, etc.
- Minimize the time it takes to assess the solution's total cost of ownership, e.g., initial investment, ongoing maintenance costs, upgrade expenses, etc.
- Minimize the likelihood of unresolved issues remaining after solution implementation, e.g., software bugs, hardware malfunctions, integration errors, etc.
- Minimize the likelihood of user resistance due to inadequate training or support, e.g., insufficient training materials, lack of user engagement, unclear instructions, etc.
Customer Success Statements (PJTBD)
- Verify the solution meets all specified requirements, e.g., functionality, performance, compatibility, etc.
- Identify any discrepancies between the expected and actual solution outcomes, e.g., output accuracy, speed, user experience, etc.
- Ensure all components of the solution are fully integrated, e.g., software modules, hardware parts, network connections, etc.
- Confirm the solution's compatibility with existing systems, e.g., databases, operating systems, third-party applications, etc.
- Validate the solution's performance under various conditions, e.g., peak load, stress scenarios, normal operation, etc.
- Assess the solution's user interface for intuitiveness and ease of use, e.g., navigation, accessibility, error messaging, etc.
- Check for any security vulnerabilities within the solution, e.g., data encryption, authentication mechanisms, access controls, etc.
- Evaluate the solution's scalability for future growth, e.g., database expansion, user load increase, feature addition, etc.
- Determine the solution's maintainability over time, e.g., code modularity, documentation quality, ease of updates, etc.
- Confirm the solution adheres to relevant industry standards and regulations, e.g., privacy laws, data protection, compliance requirements, etc.
- Ensure the solution's reliability for critical operations, e.g., system uptime, failover capabilities, data integrity, etc.
- Verify the solution's interoperability with other systems and devices, e.g., API integrations, hardware compatibility, network protocols, etc.
- Assess the solution's impact on existing workflows and processes, e.g., automation potential, efficiency gains, user adoption, etc.
- Evaluate the solution's environmental impact, e.g., energy consumption, carbon footprint, recyclability, etc.
- Confirm the solution's ability to handle data accurately and securely, e.g., encryption standards, backup procedures, data recovery, etc.
- Ensure the solution provides adequate support and documentation for end-users, e.g., user manuals, online help, training materials, etc.
- Verify the solution's ease of deployment and configuration, e.g., installation processes, customization options, initial setup, etc.
- Assess the solution's total cost of ownership, e.g., initial investment, ongoing maintenance costs, upgrade expenses, etc.
- Avoid unresolved issues remaining after solution implementation, e.g., software bugs, hardware malfunctions, integration errors, etc.
- Avoid user resistance due to inadequate training or support, e.g., insufficient training materials, lack of user engagement, unclear instructions, etc.
Test Fit Structure
Apply this structure to Customer Success Statements only; the parts should fit together naturally. Here’s an article where I introduced the concept. Feel free to devise your own version for Desired Outcome Statements, as this structure does not apply directly to their format.
As a(n) [end user] + who is + [Job] you're trying to [success statement] + "faster and more accurately" so that you can successfully [Job Step]