Software is Biased
Software is Infallible, Developers are Fallible
The logic follows: developers are fallible, and developers write software, therefore software is fallible. We all carry bias; it's impossible not to. After all, we're each shaped heavily by our experiences and perceptions of the world. We may both understand what the color red is or how chocolate smells, yet perceive them ever so slightly differently.
And that's okay, except for when it isn't. Software is when it isn't.
Software is Understanding Codified
That's literally all software is: a set of rules that a computer adheres to for processing. These rules are written by a developer, a live person with their own perceptions and biases about the world.
Let's use a concrete example: a developer writes a program for studying English, something akin to digital flashcards. When covering professions and using images of people, they may associate certain professions with preconceived notions of what the sex of that professional might be (e.g., a woman for a teacher, a man for an electrician). This bias exists, most often, because of a lack of exposure to professionals of the other sex. Note that I'm using sex and not gender; they're not the same thing.
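As a sketch of how such an understanding gets codified – every name and image path below is hypothetical, invented purely for illustration – the flashcard example might boil down to nothing more than a hard-coded mapping:

```python
# Hypothetical flashcard data for an English-study app.
# The developer's preconceived notions are baked directly into the rules:
# nothing in the code marks these pairings as a choice rather than a fact.
PROFESSION_CARDS = {
    "teacher": "images/woman_teacher.png",        # bias: teacher -> woman
    "electrician": "images/man_electrician.png",  # bias: electrician -> man
    "nurse": "images/woman_nurse.png",
    "engineer": "images/man_engineer.png",
}

def flashcard_image(profession: str) -> str:
    """Return the image every user will see for a given profession."""
    return PROFESSION_CARDS[profession]
```

To the program, and to everyone studying with it, the mapping isn't a bias; it's simply the rule.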
In the example, the developer has preconceived notions on sex and profession. This bias is translated into an application that is then distributed and spreads the bias as the norm, where countless others are implicitly consuming that bias. This was a fairly straightforward example with a clear bias that's easy to talk about in the current political climate, but bias in software can get very subtle. Again, every developer has a bias. In those subtleties, there be dragons.
Technical Grief
At this point, the more technical among you are somewhere in your journey of grief. If this is the first time you're hearing this, you're probably somewhere between the denial and anger stages right now. That's fair; after all, I did just call you unaware and – gasp – imperfect. For those who have heard it before, you might be further along; some may be at acceptance, while others may be stuck in denial.
Regardless, this is the sticking point. Unless we understand the biases we carry, we're prone to implement software imbued with them – software that can do damage far graver than implying certain sexes are better fit for certain professions.
Diversity in Tech
You had to know this was coming. I mean, you're not expected to master every profession and be all-knowing to solve all your own problems. There are doctors, home contractors, pilots, etc.; you rely on them to know things you don't and how to handle them. Why is filling bias gaps any different?
This isn't advocating for "lowering the bar," but for recognizing that the value of diversity isn't virtue-signaling – it's that there's a legitimate viewpoint likely going unrepresented in most software. I don't strictly define diversity as ethnicity, skin color, or gender. Socioeconomic status unquestionably factors in as well; it just so happens that people of color, on average, come from lower socioeconomic backgrounds. For better or worse, poor people think differently: they've been exposed to different problems and can have a different understanding of life priorities.
That's an enormous value-add for nearly any software team. People from lower socioeconomic backgrounds exist in tech, too; after all, I grew up in a lower-middle-class household with a single parent and a younger sibling. I know, personally, that much of what I've learned and how I think has been shaped heavily by what has happened in my life and how I've learned to deal with those circumstances.
Software is Not a Meritocracy
There is a fallacy in the tech world that software development is a meritocracy – that the only consideration should be your strict skill set as it applies to the position. This couldn't be further from the truth.
Several times, I've sat across from a candidate who had all the right things on paper and answered every technical question acceptably, yet still failed to show they'd be a valuable team member. These candidates often lacked strong non-technical communication skills and displayed, in their answers and mannerisms, clear biases that appeared too rigid to work around. To be sure, their technical skills have real value – I'm sure they found employment elsewhere – but they weren't going to provide the value we needed at the time.
Some technical people might read that and think, "well, then you weren't being clear enough in your job posting!" That may be true, but just how clear do you need those postings to be? Many job postings don't state things such as "must be nice to others," and yet that's often assessed in interviews.
To assume software is a meritocracy, and to operate as such, is to ignore the privilege many have enjoyed while entering the industry. Despite my background, I was very privileged in my access to technical knowledge: I was exposed to family friends with deep technical skills and attended a magnet school focused on science and technology. That access propelled my technical understanding, and I was fortunate to have it where others may not. And yet, others had even more privilege.
Privilege is a Bias
That's the big one, isn't it? We're all biased, but it matters which bias we're talking about. When it comes to tech, many who enter already come from financially privileged backgrounds. Nearly all the current success stories of tech founders who "started out in their garage" are about people who came from already-privileged backgrounds and benefited from historically privileged ethnicities.
This is a bias that, once we're aware of it and accept it, can help tear down the barriers to diversity in tech. Bias in software will only grow more subtle as time wears on and, now that "software is eating the world," could cause some truly unfortunate incidents. There's no rolling back a release on the real world.