2025 Social Inequality
Challenge Overview
Social inequality is one of the biggest problems facing Western society. Inequalities defined by gender, sexuality, race, ethnicity, education and class hold back individuals and societies.
Did you know, for instance, that office temperature standards are based on the average metabolic rate of men, which makes offices 5°C too cold for women? Or that 10.3% of black students drop out of university in England, compared with 6.9% of the student population as a whole? These are just two examples of the ground-breaking research in recent years that has shone a light on the endemic, often invisible, structures that underlie inequality.
This is a dynamic Challenge, looking at what we can do to build an environment that addresses inequality and makes a difference. Topics may include policy-making and inequality, recognising gender inequality in media and film, poverty, global economic inequality, social mobility, and safety on the streets.
This year we are also working in collaboration with the South-West Social Mobility Commission (SWSMC) on a new and very exciting enquiry group focussing on the transformational changes needed in education and employment to achieve better outcomes for young people from disadvantaged backgrounds. More information on the work the commission was tasked to do can be found on its website: https://www.exeter.ac.uk/about/south-westsocialmobilitycommission/
This challenge will run on Streatham Campus.
Like most twenty-first century institutions, the University of Exeter aims to be equal and inclusive in every aspect of its policies. But does it always succeed? And how can you help to shape policy and/or publicise the university’s policies around inequality?
This enquiry group will investigate the policies currently in place to ensure equality amongst students at the university. It will discuss these policies with a policy-maker in the university’s Equality and Diversity Team and seek to identify areas of strength and weakness. Drawing on these insights, alongside workshopping ideas in focus groups and analysing policy documents, the enquiry group will then set out to develop or refine policy in an area it feels could be improved.
The end goal is to present a campaign aimed at university leaders, using a range of media that could include an infographic poster, a short film, or a pamphlet.
Is the Harvey Weinstein scandal indicative of an endemic problem in the Hollywood film industry? How does the Daily Mail’s language position and judge women as bodies? To what extent does the BBC perpetuate gendered assumptions in how it sets the news agenda? How are women portrayed in films and television?
These very diverse and current questions cut to the heart of gender bias in the media. The task of this enquiry group is to identify where and how media bias operates and to consider what the next generation of journalists, policymakers, presenters and media executives can do about it.
In Grand Challenges week, this enquiry group will investigate how the media and film shape our perceptions of gender in numerous subtle, often invisible, ways. It will give you the opportunity to draw on recent articles, cutting-edge research in the field, and extensive discussion and analysis within your group. The task is to draw this research together into a campaign, specifically aimed at people in the 18-25 age category, highlighting the images and language that define damaging gender stereotypes in the media and/or film. The aim? By showing exactly how gender is represented, you will help your target audience to see beyond the myths these representations perpetuate.
Your product might include a pamphlet, a short film, a website, or a poster to showcase your ideas, alongside a campaign. But ultimately, it’s up to you!
Health differences between people are often unfair and impossible to ignore. Health gaps exist everywhere, both in local communities and around the world. Research from the British Cardiovascular Society shows that women continue to face underdiagnosis and inadequate treatment for heart disease: in one study, women were found to be 50% more likely than men to be given an incorrect diagnosis following a heart attack. There is also clear evidence of racism and discrimination in the UK, which harms both the physical and mental health of people from ethnic minority groups and creates barriers to obtaining health information and accessing health care services.

These differences in health outcomes and access to care are deeply rooted in histories of power and inequality. They affect people based on race, ethnicity, class, gender, sexual orientation and citizenship, often targeting groups that are excluded from what is considered “normal”. This exclusion makes health inequalities seem acceptable or unavoidable. This group aims to reduce these barriers, ensuring that everyone has access to good health care and supporting the right to health for all.
Generative artificial intelligence (gen AI) has the potential to boost the economy. Studies show it can make workers more productive, especially in jobs that require advanced thinking skills, and help businesses grow and innovate. This could lead to higher overall output and wages. However, if gen AI is not adopted equally across different groups, it might widen gaps in pay and job opportunities, exacerbating inequality, which is a key concern for policymakers.

There is a notable "gender gap" in the use of gen AI. On average, 50% of men report having used generative AI in the past year, compared with 37% of women (BIS, 2024). Factors such as income, education, age and race do not explain this difference; instead, gender differences in privacy concerns, trust in gen AI tools, and the way men and women perceive the economic risks and benefits of AI contribute to the remaining gap.

Bias can also emerge in the content that gen AI itself generates. Research from Charles Sturt University has shown that AI-generated images of professional roles in health and medicine, such as medical students and pharmacists, often exhibit gender and ethnic bias. Despite women making up 54.3% of medical students in Australia, only 39.9% of AI-generated images depicted women, and the images often lacked ethnic diversity, with 0% representing people with dark skin tones. Such biases in AI-generated images could perpetuate misrepresentations of the diversity in these fields, potentially deterring women and minorities from pursuing careers in health and medicine.