Learnings from UX Australia Design Research 2021
Summary of things I learned from this jam-packed three-day event, with amazing insights from UX researchers.
UX Australia’s Design Research 2021 took place between 17–19 March 2021, featuring talks on a broad set of topics related to conducting research during COVID-19; involving stakeholders in research; reporting research findings to your audience; and several deep-dive case studies.
It was amazing to listen to talks by UX researchers from around the world, doing different types of research in so many different industries and conditions.
I’ve always found one of the best ways to keep these types of events and conferences fresh in mind is to write summaries of main takeaways and learnings.
For each presentation, I have summarised my personal learnings, with the constraint of keeping the summary to one paragraph (as much as possible!).
Thanks again to UX Australia for organising this, and all the amazing presenters for sharing their knowledge and experiences with us.
(Quick note: these are the takeaways that have resonated with me the most from each presentation, and don’t necessarily represent the entire breadth of what was covered during the presentation.)
Day 1 (17 March)
Indi Young — People, Purposes, Patterns & Problem Space (Opening Keynote)
We often solve for the ‘typical’ user, considering only one way to be in this world, and within our own systems. We need to recognise the systems we swim in, and know that there is a wider world out there with diverse ways of existing. We can shift our focus from studying processes or solutions to studying humans, and people’s purposes. A person’s purpose is their aim, intent, objective, what they want to accomplish, achieve, plan, put off, or make progress on. We can form cognitive empathy by asking people about their inner thinking, emotional reactions, and guiding principles — core parts of being human. We can use thinking styles instead of personas, as thinking styles represent different philosophic approaches to a person’s purpose. This is a much richer experience than just asking superficial questions about needs, or using shorthands like demographics to represent people.
Dalia El-Shimy — So You’ve Earned a Seat at the Table. What’s Next?
Many design researchers have already earned a seat at the table, as research increasingly informs strategic decisions made by organisations. But to really utilise this seat, the research needs to be made understandable to stakeholders. Ask yourself what language or terminology the stakeholders talk in, and whether you are communicating in their language. Look for stakeholder needs in every information source you have access to — from emails and reports, to check-ins and presentations. When you understand someone’s needs, you can tailor your communication to have those needs met. Effective storytelling and memorable shorthand can be powerful ways to communicate complex narratives to stakeholders. Finally, your work does not stop after you have delivered the research, as you need to ask for feedback from stakeholders. If research recommendations don’t get adopted, really try to understand why that is, and in what ways you can improve. It’s a process of iteration and improvement each time.
Kate Hardgrave — Finding the Sweet Spot: Leveraging Insights to Shape Strategic Direction
Research and insights can be used to put customers at the heart of what organisations do. The needs of customers can be brought in directly to stakeholders through the research with customer participants, particularly with customer journey maps. Research and discovery workshops with stakeholders can be conducted to fill in the business layer. Together, both the customer and business journeys can be mapped alongside each other. This allows an organisation to be customer-centric but speak the language of the business.
Stephen Cox — Teaching an Old Dog New Tricks: Moving from Experience Design Research to Product Design Research
Advanced data analytics techniques can be utilised effectively when making the switch from user experience design research to product design research. To address the challenge of better integrating human insights into an analytics-based business, lean experiments can be used to bring people back into the research and make research more customer-focused.
Vivienne Dinh & Kimberly Nguyen Don — Embracing Diversity in Research
Digital accessibility is incredibly important. Research conducted showed that all users wanted an inclusive, consistent experience, not an exclusive one. Consider accessibility upfront, rather than leaving it as an afterthought. Accessibility needs to be fully documented as part of the design itself, rather than leaving it up to front-end developers. Make sure accessibility and diversity are prioritised throughout the research process, from planning and recruitment, all the way to analysis. It makes us better designers when we design with diversity in mind. Take steps towards more inclusive design.
Olivia Kirk & Belinda Tobias — Exploring Health Sector Complexity through Design Research
Over time health is delivered to a person through multiple touchpoints and complex interacting systems. Consider the ecosystem of care by looking at it holistically. Being adaptive during research is really useful. For example, due to COVID-19, in-person research was changed to online research, which actually ended up allowing for many more sessions to be conducted, and allowed for more observers to participate in the research. Furthermore, to design for inclusivity, we must recruit for inclusivity. So look for the hardest-to-reach participants first, otherwise you might run out of time. Also, be aware of biases and allow for contradictions during analysis — having multiple perspectives from your own team can help with this. Don’t be swept into group-think — give space to everyone’s opinions, and also voice your own.
Caylie Panuccio — Sociolinguistics, Design Research, and You: How to Language Gooder
There is no right or wrong variety of language to use. Avoid judgement of how language is used — be a descriptivist. Meet your participant where they are. Establish trust. Adapt language to suit them. Avoid registers and jargon (register = variety of language that is determined by what you are talking about; jargon = the really technical words you might find in a register). Do word swaps. Notice if the participant is code-switching (switching styles), and if that indicates discomfort during research. Make participants feel more at ease. Phatic communication (things like chit chat) can establish rapport with participants. Avoid stereotyping. Treat every individual as unique. Ask yourself: who are you in the research room?
Ben Kraal — Maths for Design Researchers
As a design researcher, when you read other people’s research and numbers, you have to ask yourself if the numbers are saying what they are claiming to say. Your job as a researcher is to read these critically, and look at the research methods used. Good numbers have a benchmark — unless you can compare a measurement, you just have a single number. Benchmarks help you answer questions by comparing and making sense of the numbers. Clients and stakeholders love benchmarks, because benchmarks help make your data more objective, and let you tell a better story. Benchmarks contextualise your research against the research that has been done before it. Quantitative research is not suited to every type of research, but it can be part of a richer whole of all the research you have done. There are very well-designed user experience surveys and research methods, and lots of literature on how to analyse the data with specially designed tools. You have to do more work to understand the outcomes of these surveys, but you get a much richer story at the end of that research.
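To make the benchmark point concrete, here is a minimal sketch of my own (not from the talk) comparing a set of hypothetical System Usability Scale (SUS) scores against the commonly cited benchmark average of 68, using a one-sample t-test. All the numbers are illustrative assumptions.

```python
# Minimal sketch (not from the talk): is our SUS score meaningfully
# different from a published benchmark, or is it just a single number?
from scipy import stats

# Hypothetical SUS scores (0-100) from ten usability test participants
sus_scores = [72.5, 65.0, 80.0, 70.0, 77.5, 62.5, 85.0, 75.0, 67.5, 72.5]

BENCHMARK = 68  # commonly cited average SUS score; treat this as an assumption

mean_score = sum(sus_scores) / len(sus_scores)
result = stats.ttest_1samp(sus_scores, BENCHMARK)

print(f"Mean SUS: {mean_score:.1f} vs benchmark {BENCHMARK}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
# The benchmark and the p-value give the mean some context before it is
# reported to stakeholders; on its own it is just a number.
```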
Day 2 (18 March)
Lauren Isaacson — Insights from Anywhere: A Guide and Options for Doing Remote Research
There is a rich variety of options to do research remotely. Before selecting your tools for remote research, clearly define:
- your research objectives,
- your target participants,
- and your final deliverables, including any privacy compliance.
User interviews, usability tests, focus groups, ethnography, and more, can all be done using various research software that offer many tools, including privacy compliance. Remote research tools also allow stakeholders to be active, by providing observation-only channels. To find new and interesting research tools, look through curated directories, or join research associations and trade groups — they run demo days where vendors demo their products. Make sure to practice using the tools before conducting tests with participants. Analysis of remote research data can be done with your usual methods — spreadsheets, collaborative affinity maps, qualitative analysis software, etc. Professional transcripts of research help make the data analysis speedier.
Alex Crook — Reimagining Ethnography During COVID with Adults, Kids and Cars
Due to the onset of the COVID-19 pandemic, planned in-person ethnographic research had to be quickly adapted. The research team pivoted to remote ethnography as a method. The end result was rich data from more diverse people across Australia. Remote ethnography allowed the research team to connect with people who would otherwise be hard to reach via in-person ethnography and usability tests. In particular, the remote nature of research allowed for richer insights, as people did not have to adhere to restrictive research lab conditions. All of this research directly affected product designs, and remote research methods will continue to be used for future product rollouts.
Stephanie Lamont — Remote Research for Business Engagement
Remote research allows participants to join from anywhere, at any time — the same applies to stakeholders. Get stakeholders involved directly in your research. This allows stakeholders to be more engaged throughout the process and advocate for customers with insight. Build inclusivity, excitement, and opportunity for the research. Don’t exclude anybody because you assume the research might not be relevant to them. Invite stakeholders to research sessions with personalised email invitations. In the email, introduce the study, explain its relevance to the stakeholder, list the session timings, and give the stakeholder a supporting role. At the conclusion of the research, write a relevant report to increase engagement and bring different team silos into one whole. Make sure to send out the report to all involved, along with personal thanks to each stakeholder for being involved.
Alana Wade & Serena Lai — Gloves in the time of Corona: Guerrilla Testing in a Pandemic
How can in-person, on-the-street, guerrilla testing be conducted during the COVID-19 pandemic? Guerrilla research was the method chosen because the user tests were for the initial validation of a facial recognition app. This required a diversity of people testing in real-world conditions. To conduct the research, a COVID safety plan was created and followed during testing. The researchers packed a kit for testing, including battery packs, antibacterial wipes, gloves, and printed consent forms. Different suburbs were selected as testing sites based on demographic data. Key learnings from these guerrilla tests were:
- Success was found when testing in lower-traffic local areas, like outside cafes, rather than high-traffic areas like outside malls.
- Setting up a stand and allowing people to approach through curiosity was more successful than walking around approaching people.
- Moving around in different areas allowed for the best mix of people and conditions. Three locations would have been optimal.
- People appreciated the COVID safety plan. And they loved knowing they were helping by being involved.
The researchers got really good insights, with diverse people and real-world conditions. Biggest takeaway with guerrilla testing during COVID: it’s uncomfortable, but don’t take rejection personally. You can get great results through this method.
Laura Ryan — The Role of the Design Researcher in Creating a Wellbeing Economy
The designed world has a profound effect on the people who live in it. Our actions as designers have an impact on people’s lives. If well-being is not kept in mind, we won’t be conscious of what impact we are having. Using human-centered methods alone is not enough. Many products that have had negative consequences on people’s lives were made using human-centered design methods. Design researchers should be able to answer the questions:
- What is my role here?
- What happens to the research participants and the people using the products I am testing?
When conducting design research, test people’s comprehension of something before testing for desirability and value attribution. Comprehension should form the base for testing.
Jax Wechsler — Trauma-Informed Design Research
Human-centered design research can cause harm to people who have experienced trauma. Power differentials can feel unsafe and cause distress. Design can also be extractive — extracting ideas from people without acting in the best interest of the people being researched. Trauma-informed practice is strength-based. Each person is seen as a unique individual, capable of being in charge of their own life and decisions. Always put the well-being of the person in front of you ahead of your research goals. Make the person feel heard and understood. Never force sharing. Ensure there are appropriate follow-up counselling services for participants and the team. Ensure there is sufficient time for everyone’s self-care, including your own. (Book recommendation: Design Justice: Community-Led Practices to Build the Worlds We Need, by Sasha Costanza-Chock)
Styliana Sarris — Reporting your User Research Findings like a Psychologist
Some of the artefacts at the end of design research can be really high fidelity, which obfuscates the methodology and results. Reporting design research using the structure used by academic papers can allow you to communicate with stakeholders in a meaningful and scientific way. This is how these reports can be structured:
- Abstract — this is the executive summary. The entire report should be summarised here, so someone can gauge if they need to read the rest of the report. This section should be written last.
- Introduction — explain the Why — why are you doing the research? Include a literature review, and how this research aims to address the gap in the literature. Conclude with the testable hypothesis, which is an educated prediction.
- Method — the step-by-step procedure: how data was collected and analysed; the sample size; demographic characteristics (whom did you recruit and why); eligibility and exclusion criteria; how the sample was divided; measures utilised; and the tools used. If we don’t write down the method, someone else cannot replicate it to reproduce the results.
- Results — results of the research, including looking at their statistical significance. Add descriptive and inferential statistics here where relevant. Without looking at statistical significance, we can mislead our stakeholders by drawing the wrong conclusions — the results might not be statistically significant, in which case they cannot be generalised to the wider population (a small illustrative sketch follows this list). Avoid reflection and interpretation of results here — that is for the discussion section.
- Discussion — a summary of the hypothesis, along with interpretation of the findings, and what conclusions can be made from this research. Use this section to also call out any study flaws and limits, without being defensive. Discuss future directions.
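As a small illustration of that Results point, here is a minimal sketch of my own (not from the presentation) showing the kind of descriptive and inferential check that stops a raw difference between two design variants from being over-generalised. The completion times and the 0.05 threshold are assumptions for the example.

```python
# Minimal sketch (my own illustration): check statistical significance
# before generalising a difference in the Results section of a report.
from scipy import stats

# Hypothetical task completion times (seconds) for two prototype variants
variant_a = [34, 41, 29, 38, 45, 33, 40, 36]
variant_b = [28, 31, 35, 27, 30, 33, 26, 29]

# Descriptive statistics
mean_a = sum(variant_a) / len(variant_a)
mean_b = sum(variant_b) / len(variant_b)
print(f"Mean completion time: A = {mean_a:.1f}s, B = {mean_b:.1f}s")

# Inferential statistics: independent-samples t-test
result = stats.ttest_ind(variant_a, variant_b)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

if result.pvalue < 0.05:
    print("The difference is statistically significant at the 0.05 level.")
else:
    print("The difference is not statistically significant; be careful generalising it.")
```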
Brooke Jamieson — The Persuasion Equation: How to Effectively Communicate Results to People Who Don’t Want to Listen
To communicate research effectively to stakeholders, understand their needs and make your presentation relevant to those needs. Make sure each person, with their different wants and needs, gets what they need from your presentation. Seeing your stakeholders as personas is a good way to accomplish this. When reporting, don’t just hoard data and make endless charts. Instead, try to really understand what’s going on, and why people should care about this research. Use research to help people make decisions, not overwhelm them. Make sure you are using research to make decisions, not just to keep researching forever — be data driven, not data derailed. The Pareto Principle can be a helpful rule of thumb to keep things in check: 80% of your insights will be formed by 20% of your data.
Paul Merrel, Rebecca Hendry, Sean Rom and Peta Marks — The One-Two Punch: Context and Co-Design
In this case study, the researchers started with the hypothesis: If we can improve early detection and intervention in primary care (particularly at GPs), we should be able to limit progression of illness and improve outcomes. This was a very complex issue, with complex systems and individual factors interacting with each other. The end system needed to be usable by GPs, and have input from the people with lived experiences whom it will affect. Hence co-design was used. There were several key steps which helped make the co-design research approach successful:
- Participants were always humanised and referred to in respectful terms such as “people with lived experiences”.
- Autoethnography was used to allow participants to send images and self-recordings. This allowed for a deeper understanding of people’s lived experiences.
- Roleplaying was used with GPs to walk through how a consultation would happen. This gave a lot of detail and nuance on the context, rather than simply asking the GP interview-style questions.
- Prototypes were co-designed with GPs, and tested in context of use.
- Both GPs and people with lived experiences were reimbursed equally.
The conclusion of this co-design practice was that co-design increases confidence in how to move forward in terms of development, especially when funding is limited. But make sure to avoid co-creation theatre — avoid spending resources on promoting the idea of “letting patients help” rather than doing the hard work of co-creation. Make sure the process is not tokenistic. Let it be led by people at as many points as possible.
Day 3 (19 March)
Vidhika Bansal — Pensieves & Crystal Balls: Mitigating Bias When Investigating Past Behavior and Future Intent
Memories are fallible and predictions can be misleading. What can we do to mitigate bias when conducting research? There are several types of biases to watch out for when conducting research, some of which are: hindsight bias, confirmation bias, social desirability bias, framing bias, present bias, affective forecasting, and projection bias. Some ways to mitigate biases include: limiting time to recall; framing questions neutrally and using open-ended questions; avoiding hypotheticals; and getting specific about the timeframe when asking someone to recall something. Strive for diversity, both in participant recruitment and in your team, which can help reduce biases. Take things with a grain of salt — watch out for ideal states and exaggerated emotions.
Michael Bloom — What They Say vs What They Do
Diary studies are a great way to study people’s behaviour, as they allow you to be there without actually being there. This allows people to behave more naturally compared to being observed directly. Smartphones are often used to conduct diary studies, allowing participants to take photos, audio recordings, and notes. Diary studies can provide a rich understanding of a person in the context of their environment, rather than observing them at a task or action with a narrow lens. Early prototypes can also be tested using diary studies, to give context of use and show how much each feature is actually being used. Diary studies are a good research method if you want to see a diversity of experiences; learn how patterns and behaviours change over time; understand what motivates someone to act in a certain way; capture micro-moments that result in big changes; and observe behaviours that happen sporadically.
Katrina Ryl & Roland Wimbush — Get Your Hands Dirty: Gaining the Trust of Hard to Reach Users
It’s not always easy to reach “hard to reach” users, but it’s always worth it. In this case study of user research with miners working in remote locations around the world, the researchers had to utilise several methods to build trust with participants. The main method was to really understand the lives of the participating miners, especially their pain points, to build empathy. Spoken language, industry lingo, work ethics, religion, and regional cultures were some of the aspects that were essential to understanding the participants. Engaging with these genuinely helped improve relationships with the local community.
Kim Chatterjee — How To Design Your Own “Empathy Walk” (And Make Your Stakeholders Live Your Research Findings)
Empathy walks allow you to think of people as whole human beings, rather than seeing them as one task or one aspect of research. Empathy walks can be used to immerse your audience in your persona’s life, and really reflect on vulnerability and privilege. When designing an empathy walk, have a clear mission and timeframe in mind for the activities, and let people physically move through the space. The persona you use should be based on real research. Make the personas contextual and concrete, based on what aspects actually affect people’s lives and decision-making, rather than just creating personalities.
Saher Zafar — Designing a National Behaviour Change Campaign for Rural Cambodia
Iterative design was used to find ways to communicate new hygiene practices to Cambodian children, and improve the uptake of these practices. Behaviour change is a process and needs to go beyond awareness. When designing research for behaviour change, define people, problem, practices, and have a clear goal in mind. Throughout the research, keep going back to your goal to keep things in perspective. It’s easy to get lost if you don’t focus on the goal and where the biggest identified needs were. Be willing to test and iterate. Even “good” ideas can fail. Be open-minded and humble. Testing early and often is best. Constantly engage with multiple stakeholders and consider all agendas and needs along the way. Be strategic — make decisions, and prioritise according to which feedback is necessary and when to move forward. If we understand people and create informed nudges, we can influence actions.
Jess Nichols — Creating Impact through the Research Journey
Creating research impact happens throughout the research journey. Research is a partnership — you can’t do research in a black box, without taking your team on a journey with you. Impact from research comes from team alignment and communication. Your results should not come as a shock to your team. Share early and often, and share clear and actionable insights. Define your research impact at the start:
- Direct impact research — explicitly connected to a business outcome. There is a clear solution outcome from this research.
- Indirect impact research — broader impact, without knowing exactly which solution it will be applied to.
Without this clarity, you will not be successful in your research goals. Align research outcomes to what your business wants to achieve. Advocate for researchers to be in the room when decisions happen. Help create research maturity by visualising and promoting research outcomes (through data viz, videos, audio clips, etc). Take the lid off the research box — be transparent about the time and effort it takes to do research. Creating research impact is an ongoing process — see it as a marathon, not a sprint.
Lucy Denton — A Stakeholder’s Point of View on Engaging in Research
Research insights/outcomes can become lost in reports. Reports are often boring to create and boring to consume. They don’t encourage action. To truly build engagement with stakeholders, involve decision makers in every step of the research process. Create a research plan and share the plan with stakeholders, and then bring them together for discussions. During participant recruitment, get stakeholders involved by asking them questions on inclusion criteria. Push for details, because stakeholders are often thinking of detailed criteria but not expressing it fully. During testing, make sure stakeholders attend more than one session, to truly build engagement with participants. Involve stakeholders in the analysis step as well (or at least in post-session debriefs). Self-service research repositories can be used to “report” back research, rather than long reports. These repositories use atomic design principles to allow research to be presented in relevant bite-sized pieces.
Ruth Ellison & Michelle Pickrell — A Framework for Creating Actionable Insights
Don’t mistake findings for insights. An insight creation canvas is a useful tool to create actionable insights. “How might we…” statements can be used to increase team alignment and engagement.