Day 3: Why Changing Hearts & Minds Doesn’t Work When Promoting DE&I Efforts, but Checklists Do

ch1

— Thank you for the introduction; we are pumped to be here
  • We use a lot of animated GIFs and have tried to avoid unexpected contrasts
  • We will lean on those in the chat to look at certain content

 

ch2

— We can all say 2020 was a very interesting time
  • Witnessed an unprecedented number of people taking to the streets
    • It felt like people were getting on board

 

ch3

— Saw brands with sponsored conversations, DEI statements, and a focus on equity-focused roles
  • What a time for the DEI enthusiast

 

 

ch4

— Wanted to bring the design org into parity with the rest of the org on DEI
  • We did a book club, a media club, and a Slack channel

 

 

ch5

— These efforts didn’t work. Excitement turned to burnout; we were stressed and not moving the needle on our goals

 

ch6

— Most importantly, we couldn’t scale
  • Dominique Ward defines scale as doing less while having greater impact
    • We were doing the most on our team while having little impact

 

ch7

— So what were the goals?
  • 1) Change makeup of the design team
  • 2) Provide access to jobs and salaries to those historically excluded

 

ch8

— Reflected on the things we did
  • Had good attendance and engagement, but that didn’t equal success in terms of outcomes

 

ch9

— So people were talking, but when we asked “How have you seen this show up at our org?”
  • There was radio silence

 

ch10

— This surfaced the problem that everything we talked about was too removed from day-to-day work and organizational context
  • People were at remedial Race Relations 101, while we were asking for high-level analysis

 

ch11

— This was bolstered by data in our org and others
  • An HBR study on DEI in the workplace pointed out that BIPOC employees have more mistakes listed in their feedback
    • Fewer leadership mentions for BIPOC versus white women
    • Women are 11x more likely to be described as abrasive

 

ch12

— So we changed our focus from hearts and minds to tangible impact on company goals

 

ch13

— Two ways we did this:
  1. Created objective evaluation criteria
  2. Introduced evidence-based performance reviews

 

ch14

— Before the criteria, we’ll touch upon three group norms (from Michelle Bess): commitments we make to the virtual space together

 

— Three group norms
  1. Stay present and don’t check out of the conversation, as much as possible
  2. Be open, as we are offering new perspectives, and resist the urge to dismiss what you are hearing.
    1. Reflect on how this might work, and approach it with curiosity
  3. Sit with discomfort, as you may see yourself reflected in the stories told
    1. Sit with uncomfortable feelings and where they are coming from

ch15

— We’ll kick off the discussion with a story

 

ch16

— We were hiring an early-career researcher for the org
  • We had a Black female candidate who had agency credentials and master’s-level experience
  • Our white female research lead passed on her, and we could understand this

 

ch17

— But to our surprise, she then had a portfolio review with a white man who had no degree, and said to hire him

 

ch18

— I would say that’s pretty racist, but

 

ch19

— There was a lot of word salad thrown out to minimize what happened

 

ch20

— So we decided to do a side-by-side comparison of the two candidates, and this got me thinking about objective metrics
  • We were moving the goalposts on who was applying

 

ch21

— So we created 12 core competencies for the design org
  • Not just for product designers and researchers, but also analysts
  • Worked to talk through the skills important for each capability
— Note that we don’t expect a person to be strong in all 12, but to have the most important skills for the capability they represent and their level in the org

 

ch22

— We used examples and demonstrated behaviors for a skill like writing
  • e.g., writing macro- and micro-copy

 

ch23

— What is the best way to measure them, and how would the org measure them?
  • Encouraged people to share evaluation methods on Slack

 

ch24

— We used a scale based on actions over perceived knowledge, i.e., how the individual used the competency
  • The scale ranged from being completely unfamiliar with the competency to using it in new ways
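A minimal sketch of how a behavior-based competency rubric like this could be represented as data; the scale labels, competency name, and evidence strings below are illustrative placeholders, not the team’s actual 12 competencies or level names.

```python
# Sketch only: an action-based proficiency scale attached to a competency rating.
from dataclasses import dataclass
from enum import IntEnum


class Proficiency(IntEnum):
    """Scale anchored to demonstrated action, not perceived knowledge."""
    UNFAMILIAR = 0   # completely unfamiliar with the competency
    AWARE = 1        # can describe it but has not applied it
    APPLIES = 2      # applies it in day-to-day work
    EXTENDS = 3      # uses it in new ways / teaches others


@dataclass
class CompetencyRating:
    competency: str        # e.g. "writing" (macro- and micro-copy)
    evidence: list[str]    # concrete examples of demonstrated behavior
    level: Proficiency


# The rating is tied to what the person did, not what they claim to know.
rating = CompetencyRating(
    competency="writing",
    evidence=["Rewrote onboarding micro-copy after a content audit"],
    level=Proficiency.APPLIES,
)
print(rating.level.name)  # APPLIES
```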

 

ch25

— How knowledgeable you are often depends on the eye of the beholder
  • You might go to a great school, but an interviewer can disregard that

 

ch26

— You can move from “I know” which is subjective, to “I can” which can be demonstrated

 

ch27

— So coming up with objective criteria is the first thing to do
  • Without this, you run the risk of biases creeping into your processes
  • This is hard to fix after the fact

 

ch28

— We laid the foundation for the future work we did
  • Filling out a cheat sheet as a guide, capturing basic hiring information and what you are looking for when hiring for a role
    • Pick the required competencies (three, and at what level)
    • Behaviors and actions to demonstrate in case studies, along with sample discussion questions
    • Finally, an archetype
  • Highlight three non-negotiables, and use the archetype to help sort out closely qualified candidates
    • We included a FigJam on the work and joint notes for evaluating a candidate and why they might be a good fit
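One way to picture the cheat sheet is as structured data. The field names below follow the pieces named in the talk (required competencies with levels, behaviors, sample questions, three non-negotiables, an archetype); the role and values are made up for illustration.

```python
# Sketch only: the hiring cheat sheet as a small, checkable data structure.
from dataclasses import dataclass


@dataclass
class HiringCheatSheet:
    role: str
    required_competencies: dict[str, str]  # three competencies -> expected level
    behaviors_to_demonstrate: list[str]    # what to look for in case studies
    sample_questions: list[str]
    non_negotiables: list[str]             # exactly three
    archetype: str                         # used to sort closely qualified candidates

    def __post_init__(self) -> None:
        # Guard rails from the talk: three competencies, three non-negotiables.
        assert len(self.required_competencies) == 3, "pick exactly 3 competencies"
        assert len(self.non_negotiables) == 3, "highlight exactly 3 non-negotiables"


sheet = HiringCheatSheet(
    role="Early-career researcher",
    required_competencies={"research methods": "applies", "writing": "applies", "synthesis": "aware"},
    behaviors_to_demonstrate=["Explains how a study shaped a product decision"],
    sample_questions=["Walk us through a study you ran end to end."],
    non_negotiables=["communicates findings clearly", "collaborates with design", "handles ambiguity"],
    archetype="Generalist researcher",
)
```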

 

ch29

— Creating objective criteria helped the team be more expansive and helped people notice when they were looking for reasons not to hire someone
  • If a person met the basic competencies, they could pass on to the next tier of interviews
— This helped us ask the right questions: why this particular candidate, why select this one, and is bias showing up?
  • We got great feedback that this made the role clear for the team

 

ch30

— Found that objective criteria, tied to the organizational cycle, helped shift the work with individuals from changing the way we all think to changing specific processes
  • Applied this logic to the employee journey
  • For example, the performance review process at the organization
— So where would you focus next, in the context of the employee journey?
  • We asked ourselves: where do we see a high likelihood of bias?
    • The more individuals involved, as opposed to processes, the more likely bias is
    • So we looked at parts of the journey where HR was not in charge of the process

 

ch31

— So we focused on performance reviews, where a lot of bias existed
  • The literature finds global ratings are a petri dish for bias, due to:
    • 1) Confirmation bias: preconceived notions based on characteristics and traits, and paying attention to those
    • 2) People crediting potential impact only for one group (‘white men’), while others are asked to perform and provide results
    • 3) Performance reviews also have short-term and long-term stakes
      • Short-term: performance reviews play a part in the bonus and compensation package for the year
      • Long-term: the evaluation is tied to career opportunities inside and outside the org

 

ch32

— So we introduced evidence-based performance reviews through a workshop, laying out a process that would remove bias from feedback discussions as much as possible
  • The focus was on minimizing bias in the conversation

 

ch33

— Introduced three principles for managers and team members
  • Give three pieces of concrete evidence for any feedback you give about an individual
    • If asked whether this is a trend, you have a lot of information about the individual, and more constructive criticism
    • It also minimizes the halo/horn effect, where white men who do one thing right have everything else cast as positive,
      • while for BIPOC workers, doing one thing wrong is a black mark against you, whatever else you do
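A minimal sketch of the “three pieces of concrete evidence” rule as a simple check; the `Feedback` structure and `is_reviewable` name are illustrative, not the team’s actual tooling.

```python
# Sketch only: feedback enters a review only if the claim is backed by evidence.
from dataclasses import dataclass


@dataclass
class Feedback:
    author: str
    subject: str          # person the feedback is about
    claim: str            # e.g. "communicates project status clearly"
    evidence: list[str]   # concrete, observable examples


def is_reviewable(feedback: Feedback, minimum_evidence: int = 3) -> bool:
    """Require the agreed-upon number of concrete examples before the claim is used."""
    return len(feedback.evidence) >= minimum_evidence


fb = Feedback(
    author="manager_a",
    subject="report_b",
    claim="communicates project status clearly",
    evidence=[
        "Weekly status notes to stakeholders in Q2",
        "Ran the kickoff readout for project X",
    ],
)
print(is_reviewable(fb))  # False -> gather more evidence or drop the claim
```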

 

ch34

— It’s not enough for people to look at their own feedback and eliminate bias from the conversation themselves
  • Instead we ask managers to audit feedback and take what they learn to see where bias shows up
— Performance reviews are grounded in research, with examples drawn from the workshop and from examining biased feedback

 

ch35

— Managers audited individual pieces of feedback, and then audited all feedback to look for trends and outliers
  • This is a way to see where bias slips in
  • A manager with three individuals noticed one stood out
    • Look at the characteristics and you can probably see bias at play
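A minimal sketch of that audit step, assuming the signal being audited is the share of trait-based (versus work-based) comments per person; the word list and data are purely illustrative.

```python
# Sketch only: audit all feedback per person and surface outliers.
from collections import defaultdict

PERSONALITY_WORDS = {"abrasive", "aggressive", "emotional", "difficult"}


def audit(feedback_items: list[dict]) -> dict[str, float]:
    """Return, per person, the share of feedback that is trait-based."""
    totals: dict = defaultdict(int)
    flagged: dict = defaultdict(int)
    for item in feedback_items:
        person = item["subject"]
        totals[person] += 1
        if any(word in item["text"].lower() for word in PERSONALITY_WORDS):
            flagged[person] += 1
    return {person: flagged[person] / totals[person] for person in totals}


shares = audit([
    {"subject": "A", "text": "Shipped the redesign on time"},
    {"subject": "B", "text": "Can come across as abrasive in reviews"},
    {"subject": "B", "text": "Too emotional in planning meetings"},
])
# A person whose share stands out is a prompt to go back and look for bias.
print(shares)  # {'A': 0.0, 'B': 1.0}
```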

 

ch36

— Performance review conversations become constructive by following up with reviewers and making them dynamic
  • Go back, and if there aren’t three pieces of evidence, ask whether there are other times the behavior showed up
  • Consider other critical context you may be missing
    • Pay attention to trends, and have conversations with people

 

ch37

— We introduced the workshop to move away from looking at personality-driven moments in time and to make performance reviews constructive
  • Concrete, evidence-based criticism is what let us do that

 

ch38

— People thought this was an incredibly successful workshop compared to others

 

ch39

— Asked participants to do the opposite of introspection; instead…

 

ch40

— We asked them to audit other people’s feedback, bringing an anti-bias lens to it

 

ch41

— The results?
  • Empowered team members to call out bias and opened up performance reviews to scrutiny for potential bias
  • Held managers to a system for reviews
    • “They liked how the objective criteria were taken to heart, used empirical data to back up claims, and empowered people to do this themselves”

 

ch42

— We had a different idea of success
  • Checklists tied to objective criteria and organizational cycles
— We are not fully there on accountability
  • We still need to tie performance to incentives and disincentives
— We are eager to hear your thoughts on accountability

 

ch43

— We also have another request
  • We want to scale this practice beyond our org and will link all the resources from this session to help you find the pieces of the org cycle relevant to you, as well as the associated checklists

 

ch44

— To sum up:
  • Checklists moved the needle because they were different from free-form discussions and centered facts through objective criteria and evidence
  • We got around the table and came to mutual agreement
  • They are not impacted by individual bias, making for conversations where bias wouldn’t affect the end result

 

ch45

— In conclusion, forget hearts and minds and don’t go chasing waterfalls
  • Disrupt your org with checklists
FAQ
  1. Can you share a version of the cheat sheet? Looks amazing.
    1. Yes, we can share the template we used, but if you have competencies and criteria, do that first. The cheat sheet is a playbook.
      1. The link will be shared via Google Drive, in the request for resources.
  2. Is the hiring cheat sheet used before interviews or after? Does the panel know beforehand?
    1. Yes, it goes out to the entire panel, and we have a FigJam to take notes and gut-check against the non-negotiables
    2. It’s also used to help write the job description itself, beyond the strict interview space
  3. Story behind the matching citrus shirts on the call?
    1. No story; we just decided on matching citrus shirts