Day 1 - Redefining Truth and Inclusivity: Navigating data ownership and ethical research in the age of disinformation
— Hi everyone, I do UXR in the trust and safety space, and this presentation is about disinformation and its impact on research
- Excited to be here; many things to cover, along with other portions of the conference
— Will talk about a problem facing the profession with the rise of disinformation and what it means for conducting research, along with actionable next steps and a Q&A
— Problem is that truth is being reshaped by the rise of disinformation, and considering how truth coincides with inclusivity, it is changing the nature of data ownership
- Work will be about respecting ownership of narratives while engaging with the work equitably, and how that clashes with organizational policies
— Practical strategies for understanding what that actually means, amid a landscape that benefits from disinformation, and what we can do to benefit research
- Based on research and scholarship, largely conducted by non-Western scholars
— Jem pointed out three relationships to investigate as sites of the problem
- Researchers and organizations
- Organizations and participants
- Researchers and participants
— Final recommendations will ladder back up to these
— Before recommendations, let me acknowledge my own positionality, be honest about it, and about how my perspective shapes the presentation
- Over the course of a career in product equity and trust and safety, and through my lived experience as a Black man, my worldview shifted dramatically toward how current systems benefit those in power
— Ultimately believe here is what we must do to engage in this work equitably
- Still have to live and work within the constraints of our present situation, and do our best to shift it a little bit
- Do my best to tie recommendations to the reality of the profession and stakeholder needs, without risking livelihood or reputation
- Reflection of my own opinion, not that of my employer or past employers, and based on the collective knowledge of people I know
— Let’s align on disinformation and what it means
- Disinformation: False information shared to deceive
- Misinformation: False information shared without malicious intent
— Widespread misinformation fuels distrust and impacts willingness to engage in research; people can blur fact and opinion, and this undermines sound research insights
- For actual research participants, we have had to address how misinformation shaped what they believed was an informed point of view
— Disinformation thrives because it’s profitable, and algorithms prioritize engagement over accuracy and polarize public discourse
- Academics have illustrated how media shapes narrative for vested interest rather than objective truths
- This is how people and stakeholders get infected by disinformation, and we are susceptible to the same vectors as anyone else
- Historians warn that disinformation used to control people and destabilize democratic discourse
- And actors are exploiting the reduction in fact-checking to fuel further disinformation
— Example of weak- and strong-tie connections, fueled largely by technology companies, with policies that outline hiring and signal to research participants how their data might empower or exploit them
- AI will exacerbate this
— Many people impacted as an explosion of platforms changes how info is generated and consumed
- New efficiencies and profound risks— like misleading info at scale
- Deepfakes and AI-generated content can amplify disinformation, and algorithmic biases impact and amplify societal inequities without rigorous accountability embedded into the design process
— The rapid rise of these tools has made it harder to distinguish between manipulated and authentic info
- People need to be mindful of this, and of how marginalized communities are impacted
- Comes up with people all the time: people are turning to AI tools for connection, yet don't trust AI tools to offer empathy
— Have real reasons for worry
— Given this, people who have been marginalized view tech companies with distrust, and extractive research and design practices alienate them even further
- Instagram's creation of AI influencers gave the impression that engagement with influencers would be representative
- Trained on stolen data to drive engagement, using identity politics to distract from policies actively harming vulnerable communities
— The bigger problem is that people don’t really understand the communities they are designing for, beyond the stereotypes they have of these communities
- Instead of really researching communities to uncover their needs, they run the risk of significant PR disasters and poor product-market fit
— Potential participants might not have list of companies that wronged them, but they do have a pervasive sense of distrust, and no incentive to help companies to get the findings right
— Under these conditions, research samples for our products are now more likely to include negative selection effects, including:
- People who want to vent about problems
- People least likely to be harmed when the company gets it wrong
- People seeking approval and access to jobs at the company
— And these panels will NOT include
- Representative samples
- People most vulnerable to harm
- People with thoughtful recommendations and ways to provide value
— When we can't recruit, we risk descending further to where the truths we center are shallow, easily manipulated, and in service of the status quo
- Looking back at my own qualitative studies, maybe four participants out of 10 who make the study offer something truly new and add value to stakeholders and to my role as researcher
— I repeat this mantra when setting up any study
- The most important part of any study is the research sample — if you target the wrong audience you've already failed, and insights will mislead
- An audience you can't recruit can't tell you what their community's problems are
— Reflecting on threads
- Rise of disinformation, unethical design decisions, growing distrust, and participants' desire to own their own narratives
— Why it matters
- People refuse to take part
- More effort required to research vulnerable communities
- Crucially, misinformed stakeholders and misaligned incentives
- Short-term thinking and myopic leadership
- Confusing facts with opinions
— So how can we push back against this
— List of practices that ladder back up to the researcher–organization relationship
- Positionality: Reflective practice to understand your identity and social position and how they shape your work; resources linked for each of these
- Equity Pause: Lilly Zhang said that checking biases means focusing on the decisions you make that impact other people
- Leverage these learnings to create an equity pause at key moments to surface biases that might be present before making decisions that impact people's lives
- Baseline questions in appendix and resources of this talk
- What Could Go Wrong?: Grasping what happens if impact veers from intent; that's the whole game in product or design strategy. This aligns with red team/blue team workshops
- Strategic Positions: Pack this into metrics, business outcomes, and PR disasters; this gets at the litany of cognitive biases at play
- Intentional intervention and influence on decision making
— Now let’s look at tactics to build trust with participants
- Informed Consent: If internal support is unavailable, institutional review boards exist for research ethics
- Data Quality: Organize around and uphold more rigorous data standards; verify consent and identity
- Community-Based Partnerships: Authentic relationships with communities to get rare insights
- Co-Design and Accountability: Participatory co-design with communities, plans for accountability, and a research reckoning. Strategic, tactical, and ethnographic research greatly appreciated
- Transparency: Share case studies about design processes and highlight them for future participants; after an interview or workshop, people feel better about the company and product based on the care, respect, and intentionality we shared
- Create moments of ownership of the experience, share them as case studies, and set a standard that builds social and business pressure to follow
— Strategic positioning: amid wide-scale divestments from DEI, this work faces headwinds; align equity research efforts with trust and safety
- Helps protect the work amid shifts in political priorities; ties to GDPR and bad-actor detection
— End with some mistakes to avoid and lessons learned from my work in equity and trust and safety
- Equity research is just another tool, and like any tool it works in some areas and not others
- Takes courage to push back against dominant culture and lazy narratives in an organization of people who are not trained in this
- It’s stressful and anxiety inducing
- Inaction helps reinforce systemic inequity, and pushing against it is a great emotional challenge
- Self-reflection to grasp your position and leverage power in the work
- Otherwise extremely emotionally taxing; lean on family and friends
- By sharing power and increasing transparency you are not decreasing your own power— and this work takes time
— Some may view investment in equity as a cost without immediate returns, but it has real impacts on business value
— Benefits
- Enhanced consumer trust and higher engagement
- Participants who feel seen, valued, and respected, with real problems in their lives being solved; engaging with people who have greater ownership and agency in the process results in deeper insights
- Competitive advantage— Market differentiator and helps us stand out
- Risk mitigation— Helps prevent PR disasters, lawsuits, regulatory penalties, etc.
— Thank you for joining me on this journey and open to questions
Q&A
- As disinformation holds power in the US and impacts many countries as well, how can we resist while striving together in our specific roles? What advice would you give here?
- Build community and organize together with people who have been doing this work longer than just this moment in time
- Lots of political views overlap with this, but engage with orgs and others, and be strategic about the initiatives you build and the battles you pick with leadership
- Happy to think more deeply