Making the UX Process Visible
Leading via influence
No one knew exactly what the role would entail, but "influence without authority" was how my manager framed the challenge of filling a new "Product Insights Lead" role at Mailchimp.
To be successful, I needed to communicate with peers on the leadership team (often senior managers and directors) and with the team of four product analysts and researchers working in the same domain.
The goal was for us to work closely with each other to break down silos and give our four product teams the skills to identify actionable insights. As Product Insights Lead for an area spanning Mailchimp's core products, including Email, I represented the disciplines of UX Research and Product Analytics on the leadership team.
If, as researchers, we present data in a way that supports critical thinking, and are up front about limitations, we build stakeholder trust in and understanding of the research process. We can choose to present research findings as more of a 'done deal', but when we do, it can signal to stakeholders that their relationship with research is a passive one. An audience that can ask critical questions about research is more confident in how and where to apply findings, and is more likely to genuinely want research and carve out space for it in their timelines.
Role /
Workshop designer + facilitator
Audience /
Internal stakeholders
Industry /
Healthcare B2B
What happened in the workshop?
The workshop was titled "Listening to Data" and guided stakeholders from diverse teams (Marketing, Voice of Customer, Content Strategy) through reading (and questioning) qualitative and quantitative user research data that was immediately relevant to their own work (writing to customers).
I printed enlarged Excel graphs on 11 × 17 in. paper and used color coding to make the quantitative data friendlier and easier to analyze. Together we observed themes, first as a group and then in pairs. As a group, we built the themes up into hypotheses, pairing each hypothesis with the observations that supported it, and we noted the new questions that came up. Hypotheses from the workshop were applied to the next round of prototypes, and the results were shared back with the workshop group.
For months afterward, I got feedback about the impact of the workshop, particularly about how well the metaphor of "deep listening" to data (listening in an open way, not just for what you 'want' or 'expect' to hear) had resonated.
Using observations to create hypotheses
In pairs, participants identified themes and questions in the research. Then as a group we made observations that allowed us to develop hypotheses as to why one version of the prototype had been better received than the others according to the quantitative data from an online test (above right).
Explaining the process
This poster explained how our interpretation of the data in the workshop fit into our iterative design process: without that step, we couldn't create the next set of prototypes.
It also set expectations that we would continue iterating, and that we would uncover new questions, not just answers.