When to spend your usability budget?
- Dave Ellender
Understanding how well something is used is not a new concept; it just seems that way when new technologies appear. In fact, the word ‘usable’ dates back to the 14th century. In today’s world, ensuring usability requires an ever-increasing diversity of techniques. User testing is simply one of them: often, it means watching videos of users struggling with an apparently finished design. Yet spending even a small amount of money can improve customer sales, productivity and proficiency, according to "Return on Investment for Usability", a recent survey of over 800 development projects by the Nielsen Norman Group. The report estimates a 100% increase in sales conversion rates on the web, a 161% increase in user productivity and a 202% increase in the use of specific features, after an average spend on usability of around 10% per project (Nielsen Norman Group, 2003). The trouble is that allocating and spending a usability budget will not guarantee these returns: it might be too late in the project to make a significant improvement, particularly if this 10% is spent after months of developing the interface and underlying code. In short, usability testing is not a panacea. This article seeks to answer the question: when is the right time to spend your usability budget?
Spending some of the budget before any programmers or graphic designers start work will ensure the most effective return on investment. At this stage there is a large number of possible design solutions that can be explored with users cost-effectively: potential problems can be eliminated before too many resources are committed. Conducting usability analysis will also enable an accurate specification of what the users need and, consequently, a better estimate of the cost of developing the design. In a survey of over 100 managers, the four most common reasons for projects taking longer and costing more than expected were all usability-related: frequent requests for changes by users, overlooked tasks, users’ lack of understanding of their own requirements and insufficient user-analyst communication (Lederer and Prasad, 1992). Another study found that a 70% increase in specification costs reduced overall costs (including the extra spent on specification!) by 18%. Recommendation: describe what the users will do (with the support of the technology), not what the technology does or is. Then develop functional and technical specifications, and only then draw up the project development plans.
Spending some of the budget during development will ensure a good return on investment. Gathering feedback on how users actually use a prototype of the proposed system will significantly improve a design and its chances of succeeding in the real world. Anticipating how people will use a design is notoriously difficult: people use technology unpredictably, and when enough people do so in similar ways, a new social custom can emerge. Because of this, what users require from a design changes over time, and the difference is most noticeable between the time a design is conceived and the time it is released. Testing prototypes during development ensures that as the design grows, so does the understanding of how it can be used. Cost-benefit analysis becomes easier and, as a result, business managers are able to commercialise the design more effectively. Famously, mobile phones were designed with little expectation that people would use them for inputting and reading text: the screens and buttons were small, and sending text messages was free. But text messaging became phenomenally popular, rising to 500m messages a month in its first two years (Mobile Data Association, 2003), despite designs and business models that did not effectively support or commercialise this use. Not long afterwards, mobile phone companies began to charge users for sending text messages, and the design of a new generation of handsets got underway. Needless to say, these businesses are a little coy about this recent bit of history. Recommendation: review the user requirements specification constantly. Then incorporate additional user requirements as they are identified during the development process, and manage the costs and benefits of designing for them.
Spending all of the budget after development will ensure only a moderate return on investment. Testing after development always shows that every design is imperfect and can be improved. Even if there is little time or budget left, some improvements can still be made, and these will improve the performance of the system. But the cost of making a design change after development can be up to 100 times that of making the same change before development (Pressman, 1992). As a result, fewer design solutions are economically viable at this stage, despite the benefits; often, only some will be implemented. Sometimes even experts, following centuries of tradition, do not catch design flaws until this late stage. The £18m Millennium Bridge, a footbridge crossing the River Thames in London, was designed and built in 2000 by Arup, a leading engineering firm. The bridge had to be closed after two days because the number of people walking over it caused it to sway: not up and down, as might be expected, but unusually from side to side. This design error was not identified before or during development, but it had to be fixed, whatever the cost. After all, it was the first pedestrian bridge across the Thames for over 100 years. Recommendation: use cost-benefit analysis to choose which parts of the design to improve.
Spending money on usability testing at any stage will always identify ways to improve a design. These improvements can have significant benefits for the user experience and for the bottom line. Placing usability at the centre of every stage of the development cycle allows designers to explore and test solutions in the most cost-effective way. In turn, this allows better management of the costs and business benefits of undertaking the design. Leaving it until the end of development can make the cost-benefit analysis of design improvements a painful hangover: in the case of the Millennium Bridge, Arup ended up paying an extra £5m of their own money to fix it, and the bridge finally opened two years after the anniversary it was supposed to celebrate (Arup, 2002). Subsequently, Arup’s engineers went on to help update the official British Standard on bridge safety, and their business development team began to approach organisations around the world whose bridges could have the same problem. There are many different techniques to ensure usability, and Human-Centred Design is one of them. In a nutshell: start early, test often and monitor constantly, and the result will be more effective technology that fits the way people behave. The best returns on investment in usability are those made earliest.
References
1. Nielsen Norman Group, 2003. "Return on Investment for Usability".
2. Lederer and Prasad, 1992. "Nine management guidelines for better cost estimating". Communications of the ACM, 35(2), 51-59.
3. Mobile Data Association, 2003. "Text mania continues in 2003".
4. Pressman, 1992. "Software Engineering: A Practitioner’s Approach". McGraw-Hill, New York.
5. Arup, 2002. "Millennium Bridge Time Line".