One of the most interesting parts of working at Google is learning what other people do here — it’s not uncommon to come across a job title you’ve never heard of. For example: ProFair Program Manager, or ProFair Analyst.
These roles are part of our Responsible Innovation team, which focuses on making sure our tech supports Google’s AI Principles. One way the team does this is by conducting proactive algorithmic product fairness — or ProFair — testing. This means bringing social and cultural perspectives to the testing process, to assess how an AI or ML application, dataset or technique might avoid reinforcing unfair bias. Three women who work on ProFair testing are Anne Peckham, N’Mah Y. and Cherish M., and today we’re asking them: What’s your job?
The job: ProFair Responsible Innovation
Anne is a program manager; N’Mah and Cherish are analysts.
So…what do you do?
Anne Peckham, a program manager working on ProFair for Responsible Innovation, says she primarily helps others get things done. “I organize projects, figure out strategies, identify what needs to get done, provide documentation, keep track of learnings…and do it again for each project.” N’Mah is a ProFair analyst. “I lead ProFair training across Google, coordinate an ethics fellowship program for Googlers and design and conduct fairness tests for products before launch.” Cherish, also an analyst, does similar work. “I help product teams understand how to improve products ahead of launch. I drive our company-wide program in teaching Googlers how to test products, too.” Cherish says a big part of her role is making sure that when product teams are building something, they think of everyone who will use it — referencing the Google AI Principle of “avoid[ing] creating or reinforcing unfair bias.” “Far ahead of launch time, I look for ways a proposed AI application, ML model or dataset might not function optimally for a user due to unfair bias, so we can help fix it proactively.”
All three enjoy the variety that comes with this work. “I love how collaborative my role is,” Anne says. “I get to work on many types of projects and with lots of different teams — including the Responsible AI research group.” N’Mah also enjoys seeing the products she’s supported make a difference in the world once they’ve actually launched.
“This role forces me to think outside the box, which I enjoy, and I’m able to advocate for users who may not be in the room,” Cherish says. “This job is very cerebral in nature. And I love collaborating with others to build these products for good.”
How did you choose this job?
None of the three Googlers knew ProFair was an option when they were first considering their careers. “For a while, I wanted to be a librarian, but coming out of college, I’d been interested in doing political science research or program operations,” Anne says. “I had an entry-level job as a program assistant where I was making lists and helping others move goals forward, and that skill transferred to different sectors.”
“I wanted to be a lawyer, but ended up studying Middle East Studies and Spanish,” says N’Mah. “I focused on cross-cultural experiences, and that’s ultimately what drew me to this work.” That ended up aiding her, she says — it helps her understand how products impact people from different cultural backgrounds. Cherish also wanted to be a lawyer, and was interested in technology and ethics. “I was always interested in serving others,” she says. “But I had no idea this sort of career even existed! The teams and roles we work in were developed within the past few years.”
What would you tell someone who wants your job?
Today, there are more straightforward paths toward this work. “Thankfully people who are currently in school have networks to leverage to learn more about this work,” Cherish says. Still, she says, “there is no linear path.” Someone who wants to do this kind of work should be interested in technological innovation, but also focused on pursuing it with social benefit top of mind.
Anne agrees with Cherish: “There is no single path to this kind of work, but I’ve noticed people who choose this career are curious and passionate about whatever it is they are working on. I love program management, but others are passionate about building testing infrastructure, or achieving the most social benefit. You see them bring that enthusiasm to their teams.” Anne mentions that she didn’t think there was “room” for her in this field, which is something to consider for those interested in similar careers: The point of Product Fairness work is that all perspectives and backgrounds are included, not just people with MBAs and computer science degrees. “Ultimately, technology shouldn’t be built for homogenous audiences,” Cherish says — and the people who work in this field should be just as diverse, too.
N’Mah says you shouldn’t feel pigeon-holed by your academic or career background; different experiences, personal and professional, are needed here. “There are a variety of backgrounds you can come from to work in this space — that’s what makes the team great,” she says. “If you’re interested in cross-cultural connections, or socially beneficial technical solutions, this could be an area of interest.” And if you’re someone who’s aware of their own unconscious biases, you might be naturally inclined toward a career in product fairness.
Bonus question: For Women’s History Month, who are some of your women role models?
“I have a strong group of female friends from high school who I’ve kept in touch with over the years,” Anne says. “We’ve all pursued different paths and have various strengths in our careers, but when we meet up, I love hearing what they’re passionate about and what they’re working on.” N’Mah says Harriet Tubman has always been a symbol to her of what’s possible in this country. “She persevered during a challenging moment in history and has done so much to push America forward socially.” For Cherish, she looks up to Maya Angelou. “She had such an incredibly poignant impact on society through her activism and her literature.”
By Tasnoba Nusrat, Keyword Contributor
Source: Google Cloud