
Does artificial intelligence have a gender?

Source: hindustantimes.com

Advances in Artificial Intelligence are helping medical professionals detect and treat cancers; emergency responders predict and prepare for impending natural disasters; police identify criminals and safely disarm bombs; organisations improve products, services and processes; and school children receive tailored help from virtual teachers suited to their learning style. Through the use of robots and software agents, machines may even perform these tasks alone or as team members collaborating with humans. If we are going to build machines that play roles simulating human reasoning, behaviour and activities, then as a society we should ensure that those machines benefit all members of society, regardless of their age, gender, religion or social status, rather than replicate human biases, perpetuate disparities or widen the gap between the haves and have-nots.

If AI is a simulation of human intelligence, who does it simulate and does it have a gender? Whether you view gender as socially constructed by one’s environment and culture, a biologically determined factor as in the essentialist perspective, or adhere to the theory of individual differences, gender plays a role in who we are.

All too often it affects how we are perceived and what we can do. That can vary from opportunities to pursue a certain career to whether our car navigation system recognises or ignores our voice commands.

In my area of AI research, female avatars are most commonly used to play virtual assistants and companions. This perpetuates a perspective that helping roles are best performed by women.

These characters are friendly and empathic, but also submissive, and there are no negative consequences for users who ignore or even verbally abuse them. More often, though, AI represents males.

I recall a medical specialist at a Digital Health conference in Melbourne earlier this year confessing that, 25 years ago, when he was a rural GP, he misdiagnosed a female patient, nearly costing her her life, because he had never seen that condition in a woman.

The dataset he was operating from, his experience, was biased. Similarly, the bias within AI is due to the inherent bias in our world.

It exists in the expertise we capture in knowledge-based systems, in the datasets from which we develop predictive models, and in the software and hardware designed for and tested by (mostly) men who naturally operate from their own experiences and needs. To make matters worse, because the AI is doing the task, the bias becomes more hidden, particularly in methods like deep learning that are difficult for humans to interpret or understand.
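One way to begin surfacing this hidden bias is to evaluate a trained model separately for each gender group rather than relying on a single overall score. The sketch below is only an illustration, not from the article; the dataset, column names and figures in it are hypothetical.

```python
# A minimal, illustrative sketch (not from the article): evaluating a trained
# classifier separately for each gender group to surface hidden disparities.
# The dataset, column names and predictions below are hypothetical.
import pandas as pd
from sklearn.metrics import recall_score


def recall_by_group(df: pd.DataFrame, group_col: str,
                    true_col: str, pred_col: str) -> pd.Series:
    """Recall (true-positive rate) computed per group; a large gap flags bias."""
    return df.groupby(group_col).apply(
        lambda g: recall_score(g[true_col], g[pred_col])
    )


# Hypothetical held-out evaluation data for a diagnosis model.
results = pd.DataFrame({
    "gender":        ["female"] * 4 + ["male"] * 4,
    "has_condition": [1, 1, 0, 1, 1, 1, 0, 1],
    "predicted":     [0, 1, 0, 0, 1, 1, 0, 1],
})

# Female recall is 0.33 versus male recall of 1.0: the condition is being
# missed in women, echoing the misdiagnosis story above, yet the overall
# accuracy (75%) would hide this gap.
print(recall_by_group(results, "gender", "has_condition", "predicted"))
```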

In order for AI technology to meet the needs of both men and women, both genders should be the target of innovations, involved in the design of these systems, and represented in datasets and evaluations. For example, we need to avoid unconscious bias in deciding what features to include or exclude in the training of predictive models. But how can we deliver inclusive solutions given current gender gaps?

Globally, women are underrepresented in Engineering and Information Technology classrooms and workplaces, with representation around 30% in India and significantly lower in other countries, resulting in products and technology mostly designed with men in mind.

In AI research, that percentage is closer to 10%, as I observed in 2018 at the joint AI conference held in Stockholm with thousands of delegates, where another woman and I had the rare experience of walking straight into a toilet cubicle after the keynote speeches, and watched with some amusement the long and winding queue emanating from the men's bathroom.

Governments, universities, industry and wider society need to work together to develop ethical frameworks that harness the benefits of AI without ignoring concerns such as cognitive degeneration, threats to autonomy, accountability, privacy, security, discrimination, societal implications and economic impacts.

Five key principles found across existing frameworks mandate that AI technology should benefit the common good (beneficence); do no harm (non-maleficence); maintain human agency (autonomy); promote diversity and fairness (justice); and ensure accountability, responsibility and transparency (explicability) with respect to the other principles.

Particularly relevant to the gender question, the principle of justice aims to eliminate discrimination, minimise data bias and promote shared benefits.

So back to our question. Does AI represent or favour a particular gender? Yes, currently it mirrors our world dominated by data, decisions and designs for and by males.

Explainable AI, AI that can explain its goals, beliefs, reasoning and knowledge boundaries, provides a fresh opportunity to make this bias transparent.

Bringing the female voice to AI is another key solution, achievable through initiatives such as the women in STEM programs at Macquarie University in Sydney, Australia. With a commitment to follow ethical principles, together we can build AI that exposes bias and does not discriminate based on gender; in so doing, artificial intelligence can transform human intelligence and our society.
