Guest Post by: Dominic Miller (Mentee, Session 10, The Product Mentor) [Paired with Mentor, Scott Stokke]
Before the start of The Product Mentor programme, I wrote an article, “The cost of non-compliance”, describing how the relative cost of fixing a defect increases exponentially the later in the design/development process it is identified and fixed.
This is particularly important in my current organisation because our customers are reporting a large number of defects, and we are spending a lot of time dealing with the issues they raise. In many cases the time spent resolving an issue (by support, development, account management, training and even, sometimes, the CEO) seems far in excess of the time it would have taken to design and build the software correctly in the first place. In addition, because of the usability issues and missing functionality, we are losing customers and struggling to attract new ones.
The graph below illustrates this principle, showing that the cost of fixing an issue in production can be ten or even a hundred times higher than the cost of fixing it before it reaches production.
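To make the multiplier concrete, here is a minimal sketch of the arithmetic. The stage names and cost multipliers are illustrative assumptions in the spirit of the 10x–100x claim, not figures from the graph:

```python
# Illustrative relative cost of fixing one defect, by the stage at which
# it is found (assumed multipliers; production is ~100x design-stage cost).
RELATIVE_COST = {
    "design": 1,
    "development": 5,
    "testing": 15,
    "production": 100,
}

def total_fix_cost(defects_by_stage):
    """Sum the relative cost of fixing each defect at the stage it was found."""
    return sum(RELATIVE_COST[stage] * count
               for stage, count in defects_by_stage.items())

# Ten defects all caught at design time, versus all escaping to production.
early = total_fix_cost({"design": 10})      # 10 units
late = total_fix_cost({"production": 10})   # 1000 units
print(late / early)                         # → 100.0
```

Even with far gentler multipliers than these, the conclusion is the same: effort spent catching issues at the design stage pays for itself many times over.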
As part of The Product Mentor programme I wanted to explore whether, and how, better customer research and engagement could reduce the number of issues that go undiscovered until production, so that we could cut costs and improve customer satisfaction. In particular I wanted to explore how customer research can be used to more effectively ‘discover’ and validate requirements, and thereby improve the design at the earliest stages of development, where the potential cost savings are highest.
The approach I wanted to take was to conduct some research with about 10 customers to find out more about their businesses, their processes, their challenges and their use of our product (FileFinder). Then I wanted to focus on one or two areas of importance to customers and go through a process of designing and developing some improvements in those areas.
The initial background research revealed lots of different areas that customers want to see improved, but a consistent finding was the importance of the speed of entry of candidate data onto the system and the importance of maintaining the quality of that data. [FileFinder is an Applicant Tracking System for the Executive Search industry. Customers are continuously adding a high volume of candidates onto their systems, and individual candidates are potentially worth tens of thousands in placement fees – hence the importance of speed and quality.]
Based on the initial research I decided to focus on two key features of the system – CV import and duplicate data management.
Almost all customers gave feedback about how the CV import options in general could be better, and one customer was particularly vociferous about the shortcomings of one import feature – email attachment import – which he had stopped using because it didn’t meet his needs. I also found out that the training team don’t even cover that feature in their training sessions because it is so clunky.
So I decided to take the email attachment import feature and design and implement improvements to it. I knew I would only be able to commit a small amount of development time to the improvements, so I needed to be careful not to be too ambitious.
The first step was to speak to the vociferous customer to understand as much as possible about his process and his issues. We had already made a small specific change that he had requested to the email import functionality back in late summer 2018 but that hadn’t improved it enough for him to start using it again. So it clearly needed further investigation.
After speaking to the customer and digging deeper I realised that the small change he had requested, and which we had implemented, was just one, albeit possibly the most visible, of a number of changes that were needed to make the feature usable.
So following the discussion I spoke to one of my developers and mocked up a couple of different options, which I then discussed again with the customer. I was surprised at the option he chose, and the subsequent discussion about why he chose that option helped to flush out some further details about his process and mental model that I hadn’t hitherto been aware of.
We developed the changes we’d agreed on and had a number of demos of the working code. Each demo stimulated further discussion, deeper understanding of the needs, and further refinement of how the customer wanted the feature to work. This led to some frustration for the developer, who is not used to refining and revising code several times. For me, though, the refining and revising is worth it: the customer will (hopefully) be happier and more successful, and we will save the time and effort of fixing or changing the feature again in the future.
With regard to duplicate data management, a ‘deduplication tool’ had previously been built for FileFinder but was decommissioned because it was error-prone and caused crashes. I thought it would be worth getting a version of the tool working in a test environment and conducting some user testing on it to get feedback from customers.
I conducted testing with 5 users, asking them to perform a number of tasks such as setting up duplicate identification rules, running the tool to identify potential duplicates, and then reviewing and managing the duplicates found. The testing revealed that the tool had significant usability issues. In particular, no user was able to configure the duplication rules without support, and most of the other tasks also baffled most users. More significantly, the testing cast doubt on whether the tool was even needed – many customers already had efficient processes in place to prevent and manage duplicate data, and the tool in its current state would not have helped them much. Several of the test participants said they would not use the tool even with the usability issues resolved, and those who said they might use it would want significant changes to the way it works.
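To give a feel for the kind of ‘duplicate identification rule’ the tool asked users to configure, here is a minimal sketch of flagging potential duplicate candidate records. The field names (`name`, `email`), the exact-email and fuzzy-name rules, and the threshold are my own assumptions for illustration, not FileFinder’s actual schema or logic:

```python
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Fuzzy similarity between two names, 0.0 (different) to 1.0 (identical)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def is_potential_duplicate(c1, c2, name_threshold=0.85):
    """Flag two candidate records as potential duplicates.

    Illustrative rules: an exact (case-insensitive) email match is treated
    as a duplicate outright; otherwise, sufficiently similar names are
    flagged for manual review.
    """
    if c1["email"] and c1["email"].lower() == c2["email"].lower():
        return True
    return name_similarity(c1["name"], c2["name"]) >= name_threshold

# Likely the same person entered twice with a typo and a different email.
a = {"name": "Jane Smith", "email": "jane.smith@example.com"}
b = {"name": "Jane Smyth", "email": "j.smith@example.com"}
print(is_potential_duplicate(a, b))  # → True
```

Even a toy version like this shows why configuration baffled users: choosing which fields to match on, and how strictly, requires the user to understand their own data quality – which is exactly the judgement the test participants struggled to express through the tool’s interface.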
So, based on the research I performed, here is the lesson I learned:
The lesson is that a very customer-focussed discovery-design-development process can be very effective at identifying customer requirements and issues at an early stage but the process needs to allow for iteration, change and unpredictability.
There are several important implications from this:
- The time it takes to discover-design-develop will (obviously) be longer if you have 5 or 6 iterations of demos/discussions/testing with users compared to just one or two.
- Discovery, design and development are not discrete stages and are not done sequentially – they are very closely coupled and iterative. The number of iterations needed to get a feature right won’t be known at the start, so planning which discovery/design/development activities will take place when is a challenge.
- For much of the discovery/design/development process it may not appear that much progress is being made – particularly in terms of ‘agreed’ requirements/user stories/acceptance criteria, or working code.
- The result of some discovery/design/development effort may be that it becomes clear that it is not worth going ahead with the development of a specific feature – the cost & effort required to complete it would exceed the value to the customer.
- The above challenges can be very difficult for some stakeholders, especially management, to handle. There will be unpredictability, uncertainty, risk and cost, and no guarantee that a product or feature will even emerge at the end of it. But what needs to be borne in mind is that a product/feature that does emerge through a good process of discovery/design/development will be more likely to meet the users’ needs, and therefore less likely to need changes in production, where changes are significantly more expensive. The overall cost of development and support should therefore be lower, and customers will be happier.
- Practices such as wire-framing, mock-ups, prototypes and user testing can help hugely in improving the discussions with users and the quality of their feedback, and can thereby help minimise the amount of iteration and revision required. But they are only ever approximations of how the product/feature will look and behave, so we should EXPECT that further changes will be required once the code has been ‘completed’ and rolled out to real-world users – and we should therefore have a process in place to manage such changes.
- The approach I propose requires lots of input and time from users. I am fortunate to have a large pool of users who are willing to spend time engaging with me but I recognise this is not always easy for all product teams. Nonetheless, it is critical for the long term success of the product so the time and cost incurred in getting the user involvement is a good investment.
Dominic is a Product Manager with Crunch Accounting in Brighton, UK, having recently worked as Product Manager for Dillistone Systems in London. He has a background in Business Analysis and User Experience Design before moving into Product Management several years ago.
More About The Product Mentor
The Product Mentor is a program designed to pair Product Management Mentors and Mentees around the World, across all industries, from start-up to enterprise, guided by the fundamental goals…
Better Decisions. Better Products. Better Product People.
Each Session of the program runs for 6 months with paired individuals…
- Conducting regular 1-on-1 mentor-mentee chats
- Sharing experiences with the larger Product community
- Participating in live-streamed product management lessons and Q&A
- Mentors and Mentees sharing their product management knowledge with the broader community
Check out the Mentors & Enjoy!
The Product Guy