A question I was fond of in my early user research career was “If you had a magic wand and could change the product in any way you liked, what would you change and why?” The “magic wand” question was always a conversation starter: expert users always have suggestions about the product they’re using. But there’s a problem with the magic wand question: asking users what they want doesn’t work.
When people make feature requests, they’re usually some combination of what they know to exist and what they think is feasible. Their requests don’t actually tell you what to change; the “magic wand” question only gives you an idea of where to start. The same is true of many user research and web analytics requests, which are likewise limited to what the requester knows to exist and believes is possible.
On the other side of the coin, one of the hardest parts of research is a lack of connection with the subject matter. As an analyst, I don’t know the subject matter nearly as well as the requester does. When the requested report is complex or requires investigation, limited domain knowledge means stakeholders must fill the analyst in with context. But who can predict what will or won’t be relevant to what comes back from an investigation?
The Analyst-Stakeholder Disconnect
Stakeholders are unfamiliar with the capabilities of the tools. Analysts have limited familiarity with the context. The hand-off between them is constrained by time. The result is that important trends, or the big picture, are sometimes missed; other times, reports are iterated to death. When we’re talking about analysis of existing data, this isn’t a big problem: analysis is relatively cheap to adjust.
When we’re talking about a change that affects data collection, deficiencies get really expensive. We’re talking about analysts’ time, engineering, QA, Product, and the PMO. Fixes only ship with a release, and that’s if you’re lucky: there is no guarantee that you’ll have the resources to make the fix. Data may be delayed for months or never delivered at all.
Errors caught at requirements time are 100x cheaper to fix than those caught at release. More complete requirements yield more reliable specifications and reduce the need for downstream fixes. Web analysts must dig deeper during requirements analysis. Responses to the “magic wand” question, that is, requests from stakeholders, must be refined into something specific and actionable. Good, reliable data collection happens when web analysts ask “why” up front.