A study comparing acceptance rates of contributions from men and women in an open-source software community finds that, overall, women’s contributions tend to be accepted more often than men’s. But when a woman’s gender is identifiable, her contributions are rejected more often.
“There are a number of questions and concerns related to gender bias in computer programming, but this project was focused on one specific research question: To what extent does gender bias exist when pull requests are judged on GitHub?” says Emerson Murphy-Hill, corresponding author of a paper on the study and an associate professor of computer science at North Carolina State University.
GitHub is an online programming community that fosters collaboration on open-source software projects. When people identify ways to improve code on a given project, they submit a “pull request.” Those pull requests are then approved or denied by “insiders,” the programmers who are responsible for overseeing the project.
For this study, researchers looked at more than 3 million pull requests from approximately 330,000 GitHub users, of whom about 21,000 were women.
The researchers found that 78.7 percent of women’s pull requests were accepted, compared to 74.6 percent for men.
However, when looking at pull requests by people who were not insiders on the relevant project, the results got more complicated.
Programmers who could easily be identified as women based on their names or profile pictures had lower pull request acceptance rates (58 percent) than users who could be identified as men (61 percent). But women programmers who had gender-neutral profiles had higher acceptance rates (70 percent) than any other group, including men with gender-neutral profiles (65 percent).
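The comparison above boils down to a simple calculation: an acceptance rate is just accepted pull requests divided by submitted pull requests, computed per group. The sketch below illustrates that calculation; the counts are hypothetical placeholders chosen only so the rates match the percentages reported in the study, not the study’s raw data:

```python
# Acceptance rate = accepted pull requests / submitted pull requests, per group.
# Counts are hypothetical; only the resulting percentages mirror the study.
submissions = {
    "women, gender identifiable": (580, 1000),  # (accepted, submitted)
    "men, gender identifiable": (610, 1000),
    "women, gender neutral": (700, 1000),
    "men, gender neutral": (650, 1000),
}

def acceptance_rate(accepted, submitted):
    """Return the acceptance rate as a percentage."""
    return 100.0 * accepted / submitted

rates = {group: acceptance_rate(a, s) for group, (a, s) in submissions.items()}

# Print groups from highest to lowest acceptance rate.
for group, rate in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{group}: {rate:.0f}%")
```

With the study’s figures, the ordering this prints puts gender-neutral women first and gender-identifiable women last, which is the pattern the researchers interpret as evidence of bias.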
“Our results indicate that gender bias does exist in open-source programming,” Murphy-Hill says. “The study also tells us that, in general, women on GitHub are strong programmers. We don’t think that’s because gender affects one’s programming skills, but likely stems from strong self-selection among women who submit pull requests on the site.
“We also want to note that this paper builds on an earlier version that had not been peer reviewed, which garnered a lot of input that improved the research,” Murphy-Hill says.