Early warning tool will help control huge locust swarms

Huge locust swarm fills the skies in Ethiopia

Desert locusts typically lead solitary lives until something - like intense rainfall - triggers them to swarm in vast numbers, often with devastating consequences. 

This migratory pest can reach plague proportions, and a swarm covering one square kilometre can consume enough food in one day to feed 35,000 people. Such extensive crop destruction pushes up local food prices and can lead to riots and mass starvation.

Now a team led by the University of Cambridge has developed a way to predict when and where desert locusts will swarm, so they can be dealt with before the problem gets out of hand. 

It uses weather forecast data from the UK Met Office and state-of-the-art computational models of the insects’ movements in the air to predict where swarms will go as they search for new feeding and breeding grounds. The areas likely to be affected can then be sprayed with pesticides.
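The core of such a trajectory-based approach can be illustrated with a minimal sketch (not the authors’ model): treat a reported swarm as a cloud of particles and advect it using gridded wind forecasts. The wind field, grid spacing and daily flight hours below are illustrative assumptions, written in Python:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forecast: eastward (u) and northward (v) wind components in km/h
u_wind = rng.uniform(5, 25, size=(50, 50))
v_wind = rng.uniform(-10, 10, size=(50, 50))
grid_km = 10.0        # assumed width of each grid cell in km
flight_hours = 8      # assumed hours of downwind flight per day

# 1,000 "swarm particles" scattered around a reported sighting (grid coordinates)
particles = np.array([25.0, 25.0]) + rng.normal(0, 0.5, size=(1000, 2))

def advance_one_day(p):
    """Move each particle by the wind sampled at its current grid cell."""
    ix = np.clip(p[:, 0].astype(int), 0, 49)
    iy = np.clip(p[:, 1].astype(int), 0, 49)
    dx = u_wind[ix, iy] * flight_hours / grid_km   # displacement in grid cells
    dy = v_wind[ix, iy] * flight_hours / grid_km
    return p + np.column_stack([dx, dy])

for day in range(1, 4):                            # a three-day forecast horizon
    particles = advance_one_day(particles)
    print(f"day {day}: predicted swarm centre at {particles.mean(axis=0).round(2)}")

In the published framework this idea is combined with the insects’ lifecycle and breeding-site selection; the sketch above captures only the wind-driven dispersal step.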

Until now, predicting and controlling locust swarms has been ‘hit and miss’, according to the researchers. Their new model, published today in the journal PLOS Computational Biology, will enable national agencies to respond quickly to a developing locust threat.

Desert locust control is a top priority for food security: it is the biggest migratory pest for smallholder farmers in many regions of Africa and Asia, and capable of long-distance travel across national boundaries.

Climate change is expected to drive more frequent desert locust swarms, by causing trigger events like cyclones and intense rainfall. These bring moisture to desert regions that allows plants to thrive, providing food for locusts that triggers their breeding.

“During a desert locust outbreak we can now predict where swarms will go several days in advance, so we can control them at particular sites. And if they’re not controlled at those sites, we can predict where they’ll go next so preparations can be made there,” said Dr Renata Retkute, a researcher in the University of Cambridge’s Department of Plant Sciences and first author of the paper.

“The important thing is to respond quickly if there’s likely to be a big locust upsurge, before it causes a major crop loss.  Huge swarms can lead to really desperate situations where people could starve,” said Professor Chris Gilligan in the University of Cambridge’s Department of Plant Sciences, senior author of the paper.

He added: “Our model will allow us to hit the ground running in future, rather than starting from scratch as has historically been the case.”

The team recognised the need for a comprehensive model of desert locust behaviour during the response to a massive upsurge over 2019-2021, which extended from Kenya to India and put huge strain on wheat production in these regions. The infestations destroyed sugarcane, sorghum, maize and root crops. The researchers say the scientific response was hampered by the need to gather and integrate information from a range of disparate sources.

“The response to the last locust upsurge was very ad-hoc, and less efficient than it could have been. We’ve created a comprehensive model that can be used next time to control this devastating pest,” said Retkute. 

Although models like this have been attempted before, this is the first that can rapidly and reliably predict swarm behaviour. It takes into account the insects’ lifecycle and their selection of breeding sites, and can forecast locust swarm movements both short and long-term. 

The new model has been rigorously tested using real surveillance and weather data from the last major locust upsurge. It will inform surveillance, early warning, and management of desert locust swarms by national governments and international organisations like the Food and Agriculture Organization of the United Nations (FAO).

The researchers say countries that haven’t experienced a locust upsurge in many years are often ill-prepared to respond, lacking the necessary surveillance teams, aircraft and pesticides. As climate change alters the movement and spread of major swarms, better planning is needed - making the new model a timely development.

The project involved collaborators at the FAO and the UK Met Office. It was funded by the UK Foreign, Commonwealth and Development Office and the Bill and Melinda Gates Foundation.

Reference: Retkute, R., et al: ‘A framework for modelling desert locust population dynamics and large-scale dispersal.’ PLOS Computational Biology, December 2024. DOI: 10.1371/journal.pcbi.1012562
 

A new tool that predicts the behaviour of desert locust populations will help national agencies to manage huge swarms before they devastate food crops in Africa and Asia. 

The response to the last locust upsurge was very ad-hoc, and less efficient than it could have been. We’ve created a comprehensive model that can be used next time to control this devastating pest.
Renata Retkute
Locust swarm fills the skies in Ethiopia

Creative Commons License.
The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Massive black hole in the early universe spotted taking a ‘nap’ after overeating

Artist’s impression of a black hole during one of its short periods of rapid growth

Like a bear gorging itself on salmon before hibernating for the winter, or a much-needed nap after Christmas dinner, this black hole has overeaten to the point that it is lying dormant in its host galaxy.

An international team of astronomers, led by the University of Cambridge, used the NASA/ESA/CSA James Webb Space Telescope to detect this black hole in the early universe, just 800 million years after the Big Bang.

The black hole is huge – 400 million times the mass of our Sun – making it one of the most massive black holes discovered by Webb at this point in the universe’s development. The black hole is so enormous that it makes up roughly 40% of the total mass of its host galaxy: in comparison, most black holes in the local universe are roughly 0.1% of their host galaxy mass.
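A quick back-of-envelope check of those ratios, using only the figures quoted above (Python; treating the percentages as simple mass ratios is the only assumption):

bh_mass = 4e8                          # black hole mass, in solar masses
host_mass = bh_mass / 0.40             # ~40% of its host galaxy's mass
local_analogue_bh = 0.001 * host_mass  # a present-day galaxy at the ~0.1% ratio

print(f"implied host galaxy mass: ~{host_mass:.0e} solar masses")
print(f"a local galaxy of that mass would typically host a ~{local_analogue_bh:.0e} solar-mass black hole")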

However, despite its gigantic size, this black hole is eating, or accreting, the gas it needs to grow at a very low rate – about 100 times below its theoretical maximum limit – making it essentially dormant.

Such an over-massive black hole so early in the universe, but one that isn’t growing, challenges existing models of how black holes develop. However, the researchers say that the most likely scenario is that black holes go through short periods of ultra-fast growth, followed by long periods of dormancy. Their results are reported in the journal Nature.

When black holes are ‘napping’, they are far less luminous, making them more difficult to spot, even with highly sensitive telescopes such as Webb. Black holes cannot be directly observed, but instead they are detected by the tell-tale glow of a swirling accretion disc, which forms near the black hole’s edges. When black holes are actively growing, the gas in the accretion disc becomes extremely hot and starts to glow and radiate energy in the ultraviolet range.

“Even though this black hole is dormant, its enormous size made it possible for us to detect,” said lead author Ignas Juodžbalis from Cambridge’s Kavli Institute for Cosmology. “Its dormant state allowed us to learn about the mass of the host galaxy as well. The early universe managed to produce some absolute monsters, even in relatively tiny galaxies.”

According to standard models, black holes form from the collapsed remnants of dead stars and accrete matter up to a predicted limit, known as the Eddington limit, where the pressure of radiation on matter overcomes the gravitational pull of the black hole. However, the sheer size of this black hole suggests that standard models may not adequately explain how these monsters form and grow.
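For readers who want the numbers, here is a rough sketch of the Eddington limit for a black hole of this mass. The formula and constants are textbook values; the 10% radiative efficiency used to convert luminosity into an accretion rate is an assumption, not a figure from the paper:

# Eddington luminosity: L_Edd = 4*pi*G*M*m_p*c / sigma_T (radiation pressure on
# electrons balances gravity on protons), evaluated for a 4e8 solar-mass black hole.
import math

G       = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c       = 2.998e8      # speed of light, m/s
m_p     = 1.673e-27    # proton mass, kg
sigma_T = 6.652e-29    # Thomson cross-section, m^2
M_sun   = 1.989e30     # solar mass, kg

M = 4e8 * M_sun
L_edd = 4 * math.pi * G * M * m_p * c / sigma_T        # watts

eta = 0.1                                              # assumed radiative efficiency
mdot_edd = L_edd / (eta * c**2)                        # kg/s at the Eddington limit
per_year = 3.156e7 / M_sun                             # converts kg/s to solar masses per year

print(f"Eddington luminosity: {L_edd:.1e} W")
print(f"Eddington accretion rate: {mdot_edd * per_year:.1f} solar masses per year")
print(f"rate ~100x below Eddington: {mdot_edd * per_year / 100:.2f} solar masses per year")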

“It’s possible that black holes are ‘born big’, which could explain why Webb has spotted huge black holes in the early universe,” said co-author Professor Roberto Maiolino, from the Kavli Institute and Cambridge’s Cavendish Laboratory. “But another possibility is they go through periods of hyperactivity, followed by long periods of dormancy.”

Working with colleagues from Italy, the Cambridge researchers conducted a range of computer simulations to model how this dormant black hole could have grown to such a massive size so early in the universe. They found that the most likely scenario is that black holes can exceed the Eddington limit for short periods, during which they grow very rapidly, followed by long periods of inactivity: the researchers say that black holes such as this one likely eat for five to ten million years, and sleep for about 100 million years.
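Taking the article’s timescales at face value, a short illustrative calculation shows why dormancy dominates. The Eddington ratio during a burst (here 2x) and the 45-million-year e-folding (Salpeter) time for 10% efficiency are assumptions used purely for illustration:

import math

t_active_myr  = 7.5       # midpoint of the 5-10 million year "eating" phase
t_dormant_myr = 100.0     # "sleeping" phase quoted in the article
duty_cycle = t_active_myr / (t_active_myr + t_dormant_myr)

salpeter_myr = 45.0       # e-folding time at the Eddington limit for 10% efficiency
eddington_ratio = 2.0     # assumed mild super-Eddington burst
growth_per_burst = math.exp(t_active_myr * eddington_ratio / salpeter_myr)

print(f"fraction of time spent growing: {duty_cycle:.0%}")        # ~7%
print(f"mass multiplied per burst: x{growth_per_burst:.1f}")      # ~1.4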

“It sounds counterintuitive to explain a dormant black hole with periods of hyperactivity, but these short bursts allow it to grow quickly while spending most of its time napping,” said Maiolino.

Because the periods of dormancy are much longer than the periods of ultra-fast growth, it is in these periods that astronomers are most likely to detect black holes. “This was the first result I had as part of my PhD, and it took me a little while to appreciate just how remarkable it was,” said Juodžbalis. “It wasn’t until I started speaking with my colleagues on the theoretical side of astronomy that I was able to see the true significance of this black hole.”

Due to their low luminosities, dormant black holes are more challenging for astronomers to detect, but the researchers say this black hole is almost certainly the tip of a much larger iceberg, if black holes in the early universe spend most of their time in a dormant state.

“It’s likely that the vast majority of black holes out there are in this dormant state – I’m surprised we found this one, but I’m excited to think that there are so many more we could find,” said Maiolino.

The observations were obtained as part of the JWST Advanced Deep Extragalactic Survey (JADES). The research was supported in part by the European Research Council and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

Reference:
Ignas Juodžbalis et al. ‘A dormant overmassive black hole in the early Universe.’ Nature (2024). DOI: 10.1038/s41586-024-08210-5

Scientists have spotted a massive black hole in the early universe that is ‘napping’ after stuffing itself with too much food.

Artist’s impression of a black hole during one of its short periods of rapid growth


Cambridge rowers vie for place in The Boat Race 2025

Men’s VIIIs “Scylla” en route to victory

The annual Trial VIIIs, the UK’s final rowing event of the year, serves as a dress rehearsal for The Boat Race, with two evenly matched Cambridge University Boat Club (CUBC) crews rowing the full Championship Course for the first and only time before the race on 12–13 April 2025.

This year, all 31 Cambridge Colleges were represented at the start of trials. The crews showcased an exciting mix of seasoned experience and youthful energy, featuring international rowers and returning Blues alongside many College rowers proudly wearing Cambridge Blue for the first time.

Read the full race report on the CUBC website.

The Cambridge contenders for The Boat Race 2025 have become clearer after a thrilling day of action on the Thames.

Men’s VIIIs “Scylla” en route to victory


Afghan journalist and TIME magazine woman of the year joins Cambridge college

Zahra Joya on the cover of TIME magazine

A leading advocate for the rights of women and girls in Afghanistan, in particular the right to education, Joya is the founder of Rukhshana Media, a news agency dedicated to telling the stories of Afghan women in their own voices. Her appointment recognises her transformational work and reflects Hughes Hall’s mission to advance inclusive education.

Joya said: “In a time when, as a woman, I have been deprived of my basic rights in my own country, joining the extraordinary Hughes Hall team at the University of Cambridge is a great honour for me. I view this opportunity as a chance to step into a wellspring of knowledge, and I hope to learn from this team and bring what I learn here back to my people.”

Sir Laurie Bristow, President of Hughes Hall, welcomed Joya to the College: “Zahra’s work on behalf of Afghanistan’s women and girls has never been more urgent nor her own story more pertinent. Zahra’s work is about enabling Afghan women and girls to speak for themselves. It is about the right of all girls to receive an education. It is about challenging gender-based oppression and protecting the rights of some of the most vulnerable people in our world today.”

Read the full story on the Hughes Hall website.

Zahra Joya, an Afghan journalist and one of TIME magazine's Women of the Year 2022, has been appointed By-Fellow at Hughes Hall.

Zahra Joya on the cover of TIME magazine


Cambridge to trial cutting-edge semiconductor technologies for wider use in major European project

A silicon chip with the EU flag printed on it

Photonic chips transmit and manipulate light instead of electricity, and offer significantly faster performance with lower power consumption than traditional electronic chips. 

The Cambridge Graphene Centre and Cornerstone Photonics Innovation Centre at the University of Southampton will partner with members from across Europe to host a pilot line, coordinated by the Institute of Photonic Sciences in Spain, combining state-of-the-art equipment and expertise from 20 research organisations.

The PIXEurope consortium has been selected by the European Commission and Chips Joint Undertaking, a European initiative aiming to bolster the semiconductor industry by fostering collaboration between member states and the private sector. The consortium is supported by €380m in total funding.

The UK participants will be backed by up to £4.2 million in funding from the Department of Science, Innovation and Technology (DSIT), match-funded by Horizon Europe. The UK joined the EU’s Chips Joint Undertaking in March 2024, allowing the country to collaborate more closely with European partners on semiconductor innovation.

The new pilot line will combine state-of-the-art equipment and expertise from research organisations across 11 countries. It aims to encourage the adoption of cutting-edge photonic technologies across more industries to boost their efficiency.

Photonic chips are already essential across a wide range of applications, from tackling the unprecedented energy demands of datacentres, to enabling high-speed data transmission for mobile and satellite communications. In the future, these chips will become ever more important, unlocking new applications in healthcare, AI and quantum computing. 

Researchers at the Cambridge Graphene Centre will be responsible for the integration of graphene and related materials into photonic circuits for energy efficient, high-speed communications and quantum devices. “This may lead to life-changing products and services, with huge economic benefit for the UK and the world,” said Professor Andrea C. Ferrari, Director of the Cambridge Graphene Centre. 

The global market for photonic integrated circuits (PICs) production is expected to grow by more than 400% in the next 10 years. By the end of the decade, the global photonics market is expected to exceed €1,500bn, a figure comparable to the entire annual gross domestic product of Spain.

This growth is due to demand from areas such as telecommunications, artificial intelligence, image sensing, automotive and mobility, medicine and healthcare, environmental care, renewable energy, defence and security, and a wide range of consumer applications.

The combination of microelectronic chips and photonic chips provides the necessary features and specifications for these applications. The former are responsible for information processing, manipulating electrons within circuits based on silicon and its variants, while the latter use photons in the visible and infrared ranges of the spectrum, in a variety of materials.

The new pilot line aims to offer cutting-edge technological platforms, transforming and transferring innovative and disruptive integrated photonics processes and technologies to accelerate their industrial adoption. The objective is the creation of European-owned and European-made technology in a sector of critical importance for technological sovereignty, and the creation and maintenance of corresponding jobs in the UK and across Europe.

“My congratulations to Cornerstone and the Cambridge Graphene Centre on being selected to pioneer the new pilot line – taking a central role in driving semiconductor innovation to the next level, encouraging adoption of new technologies,” said Science Minister Lord Vallance. “The UK laid the foundations of silicon photonics in the 1990s, and by pooling our expertise with partners across Europe we can address urgent global challenges including energy consumption and efficiency.”

“The UK’s participation in the first Europe-wide photonics pilot line marks the start of the world’s first open-access photonic integrated circuits ecosystem, stimulating new technology development with industry and catalysing disruptive innovation across the UK, while strengthening UK collaboration with top European institutions working in the field,” said Ferrari.

“PIXEurope is the first photonics pilot line that unifies the whole supply chain from design and fabrication, to testing and packaging, with technology platforms that will support a broad spectrum of applications,” said CORNERSTONE Coordinator Professor Calum Littlejohns. “I am delighted that CORNERSTONE will form a crucial part of this programme.”

The Chips JU will also launch new collaborative R&D calls on a range of topics in early 2025. UK companies and researchers are eligible to participate. 

The University of Cambridge is one of two UK participants named as part of the PIXEurope consortium, a collaboration between research organisations from across Europe which will develop and manufacture prototypes of their products based on photonic chips.


Wrong trees in the wrong place can make cities hotter at night, study reveals

Trees in an Indian city street. Photo: hannahisabelnic via Flickr (Public domain)

Temperatures in cities are rising across the globe, and urban heat stress is already a major problem, causing illness and death, a surge in energy use to cool buildings, heat-related social inequality, and problems with urban infrastructure.

Some cities have already started implementing mitigation strategies, with tree planting prominent among them. But a University of Cambridge-led study now warns that planting the wrong species or the wrong combination of trees in suboptimal locations or arrangements can limit their benefits.

The study, published today in Communications Earth & Environment, found that urban trees can lower pedestrian-level air temperature by up to 12°C. Its authors found that the introduction of trees reduced peak monthly temperatures to below 26°C in 83% of the cities studied, meeting the ‘thermal comfort threshold’. However, they also found that this cooling ability varies significantly around the world and is influenced by tree species traits, urban layout and climate conditions.

“Our study busts the myth that trees are the ultimate panacea for overheating cities across the globe,” said Dr Ronita Bardhan, Associate Professor of Sustainable Built Environment at Cambridge's Department of Architecture.

“Trees have a crucial role to play in cooling cities down but we need to plant them much more strategically to maximise the benefits which they can provide.”

Previous research on the cooling effects of urban trees has focused on specific climates or regions, and considered case studies in a fragmented way, leaving major gaps in our knowledge about unique tree cooling mechanisms and how these interact with diverse urban features.

To overcome this, the authors of this study analysed the findings of 182 studies – concerning 17 climates in 110 global cities or regions – published between 2010 and 2023, offering the first comprehensive global assessment of urban tree cooling.

During the day, trees cool cities in three ways: by blocking solar radiation; through evaporation of water via pores in their leaves; and by foliage aerodynamically changing airflow. At night, however, tree canopies can trap longwave radiation from the ground surface, due to aerodynamic resistance and ‘stomatal closure’ – the closing of microscopic pores on the surface of leaves partly in response to heat and drought stress.

Variation by climate type

The study found that urban trees generally cool cities more in hot and dry climates, and less in hot humid climates.

In the ‘tropical wet and dry or savanna’ climate, trees can cool cities by as much as 12°C, as recorded in Nigeria. However, it was in this same climate that trees also warmed cities most at night, by up to 0.8°C.

Trees performed well in arid climates, cooling cities by just over 9°C and warming them at night by 0.4°C.

In tropical rainforest climates, where humidity is higher, the daytime cooling effect dropped to approximately 2°C while the nighttime heating effect was 0.8°C.

In temperate climates, trees can cool cities by up to 6°C and warm them by 1.5°C.
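The headline figures above can be collected into a small lookup table, which is roughly what the team’s interactive database offers in far richer form. A minimal sketch in Python, using only the numbers quoted in this article (they are maxima, not typical effects):

# Daytime cooling / nighttime warming, in °C, as reported above by climate type
tree_effects_c = {
    "tropical wet and dry (savanna)": {"day_cooling": 12.0, "night_warming": 0.8},
    "arid":                           {"day_cooling": 9.0,  "night_warming": 0.4},
    "tropical rainforest":            {"day_cooling": 2.0,  "night_warming": 0.8},
    "temperate":                      {"day_cooling": 6.0,  "night_warming": 1.5},
}

for climate, effect in tree_effects_c.items():
    print(f"{climate}: up to {effect['day_cooling']}°C cooler by day, "
          f"up to {effect['night_warming']}°C warmer by night")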

Using trees more strategically

The study points out that cities which have more open urban layouts are more likely to feature a mix of evergreen and deciduous trees of varying sizes. This, the researchers found, tends to result in greater cooling in temperate, continental and tropical climates.

The combined use of trees in these climates generally results in 0.5°C more cooling than in cities where only deciduous or evergreen trees feature. This is because mixed trees can balance seasonal shading and sunlight, providing three-dimensional cooling at various heights.

In arid climates, however, the researchers found that evergreen species dominate and cool more effectively in the specific context of compact urban layouts such as Cairo in Egypt or Dubai in the UAE.

In general, trees cooled more effectively in open and low-rise cities in dry climates. In open urban layouts, cooling can be improved by about 0.4°C because their larger green spaces allow for more and larger tree canopies and a greater mix of tree species.

“Our study provides context-specific greening guidelines for urban planners to more effectively harness tree cooling in the face of global warming,” Dr Ronita Bardhan said.

“Our results emphasise that urban planners not only need to give cities more green spaces, they need to plant the right mix of trees in optimal positions to maximise cooling benefits.”

“Urban planners should plan for future warmer climates by choosing resilient species which will continue to thrive and maintain cooling benefits,” said Dr Bardhan, a Fellow of Selwyn College, Cambridge.

Matching trees to urban forms

The study goes further, arguing that species selection and placement needs to be compatible with urban forms. The orientation of the ‘street canyon’, local climate zones, aspect ratio, visible sky ratio and other urban features that influence the effects of trees all need to be carefully considered.

Although a higher degree of tree canopy cover in street canyons generally results in more cooling effects, excessively high cover may trap heat at the pedestrian level, especially in compact urban zones in high temperature climates. In such locations, narrow species and sparse planting strategies are recommended.

The researchers emphasise that we cannot rely entirely on trees to cool cities, and that solutions such as solar shading and reflective materials will continue to play an important role.

The researchers have developed an interactive database and map to enable users to estimate the cooling efficacy of strategies based on data from cities with similar climates and urban structures.

Reference

H Li et al., ‘Cooling efficacy of trees across cities is determined by background climate, urban morphology, and tree trait’, Communications Earth & Environment (2024). DOI: 10.1038/s43247-024-01908-4

While trees can cool some cities significantly during the day, new research shows that tree canopies can also trap heat and raise temperatures at night. The study aims to help urban planners choose the best combinations of trees and planting locations to combat urban heat stress.

Trees have a crucial role to play in cooling cities down but we need to plant them much more strategically to maximise the benefits which they can provide
Ronita Bardhan
Trees in an Indian city street. Photo: hannahisabelnic via Flickr (Public domain)


Professor Duncan Richards appointed as Head of Department of Medicine

Professor Richards joins Cambridge from the University of Oxford, where he has been since 2019. His particular research interest is the demonstration of clinical proof of concept of novel therapeutics through the application of experimental medicine techniques, especially human challenge studies.

As Climax Professor of Clinical Therapeutics, director of the Oxford Clinical Trial Research Unit (OCTRU), and the NIHR Oxford Clinical Research Facility, he led a broad portfolio focused on new medicines for multiple conditions. His focus has been the acceleration of promising new drug treatments through better decision-making in early phase clinical trials.

Professor Richards also brings with him a wealth of experience in a number of Pharmaceutical R&D clinical development roles. In 2003 he joined GSK and held a number of roles of increasing responsibility, latterly as Head of Clinical Pharmacology and Experimental Medicine, including directorship of GSK’s phase 1 and experimental medicine unit in Cambridge (CUC).

Commenting on his appointment, Professor Richards said: “As a clinical pharmacologist, I have been fortunate to work across a broad range of therapeutic areas over the years. I am excited by the breadth and depth of expertise within the Department of Medicine and look forward to working with the first-class scientific team. My goal is to work with the Department team, the Clinical School, and hospitals to maximise the impact of the important work taking place in Cambridge.”

Members of the department’s leadership team are looking forward to the continued development of the department under Professor Richards, building on its legacy of collaboration and groundbreaking translational research to drive its future success.

Professor Mark Wills, Interim Head of Department of Medicine, said: “Duncan brings to his new role a fantastic breadth of experience, which encompasses his clinical speciality in pharmacology, extensive experience of working within the pharmaceutical industry R&D at senior levels and most recently establishing academic clinical trials units and human challenge research facilities.

“I am very excited to welcome Duncan to the Department and am looking forward to working with him as he takes on the role of delivering the Department of Medicine’s vision to increase the efficacy of translation of its world-class fundamental research, and its impact upon clinical practice and patient wellbeing.”

Menna Clatworthy, Professor of Translational Immunology and Director of the Cambridge Institute for Therapeutic Immunology and Infectious Disease (CITIID), said: "Duncan has a wealth of leadership experience in biomedicine, in both academia and pharma. That skillset will be invaluable in ensuring the Department of Medicine continues to deliver world-leading research to transform patient outcomes."

Charlotte Summers, Professor of Intensive Care Medicine and Director of the Victor Phillip Dahdaleh Heart & Lung Research Institute, said: “Duncan’s exemplary track record of translating fundamental scientific discoveries into therapies that benefit patients will help us further increase the impact of our research as we continue our mission to improve human health.”

The appointment underpins the recently announced five-year collaboration between GSK and the University of Cambridge, the Cambridge-GSK Translational Immunology Collaboration (CG-TIC). The £50 million investment will accelerate research and development in kidney and respiratory diseases to improve patient outcomes.

Professor Richards will assume the role in February 2025, replacing Interim Head of Department Professor Mark Wills, who was appointed after the departure of Professor Ken Smith in January 2024. Professor Wills will continue as Director of Research and Deputy Head of the Department of Medicine, as well as leading his research group.

Professor Richards trained in medicine at Oxford University and, after junior doctor roles in London, returned to Oxford as Clinical Lecturer in Clinical Pharmacology. His DM thesis research was on a translational model using platelet ion flux to interrogate angiotensin biology, and he is the author of the Oxford Handbook of Practical Drug Therapy and the third edition of Drug Discovery and Development.

Professor Richards has been a core member of the UK COVID-19 Therapeutics Advisory Panel. He is a member of the Oxford Bioescalator Management Board, UK Prix Galien Prize Committee, and the therapeutic advisory committee of several national platform clinical trials.

Professor Duncan Richards has today been announced as the new Head of the Department of Medicine at the University of Cambridge.

I am excited by the breadth and depth of expertise within the Department of Medicine and look forward to working with the first-class scientific team
Duncan Richards


Imaging technique allows rapid assessment of ovarian cancer subtypes and their response to treatment

The technique, called hyperpolarised carbon-13 imaging, can increase the detected signal in an MRI scanner by more than 10,000 times. Scientists have found that the technique can distinguish between two different subtypes of ovarian cancer, to reveal their sensitivities to treatment.

They used it to look at patient-derived cell models that closely mimic the behaviour of human high-grade serous ovarian cancer, the most common lethal form of the disease. The technique clearly shows whether a tumour is sensitive or resistant to carboplatin, one of the standard first-line chemotherapy treatments for ovarian cancer.

This will enable oncologists to predict how well a patient will respond to treatment, and to see how well the treatment is working within the first 48 hours. 

Different forms of ovarian cancer respond differently to drug treatments. With current tests, patients typically wait for weeks or months to find out whether their cancer is responding to treatment. The rapid feedback provided by this new technique will help oncologists to adjust and personalise treatment for each patient within days.

The study compared the hyperpolarised imaging technique with results from Positron Emission Tomography (PET) scans, which are already widely used in clinical practice. The results show that PET did not pick up the metabolic differences between different tumour subtypes, so could not predict the type of tumour present.

The report is published today in the journal Oncogene.

“This technique tells us how aggressive an ovarian cancer tumour is, and could allow doctors to assess multiple tumours in a patient to give a more holistic assessment of disease prognosis so the most appropriate treatment can be selected,” said Professor Kevin Brindle in the University of Cambridge’s Department of Biochemistry, senior author of the report. 

Ovarian cancer patients often have multiple tumours spread throughout their abdomen. It isn’t possible to take biopsies of all of them, and they may be of different subtypes that respond differently to treatment. MRI is non-invasive, and the hyperpolarised imaging technique will allow oncologists to look at all the tumours at once.

Brindle added: “We can image a tumour pre-treatment to predict how likely it is to respond, and then we can image again immediately after treatment to confirm whether it has indeed responded. This will help doctors to select the most appropriate treatment for each patient and adjust this as necessary. 

“One of the questions cancer patients ask most often is whether their treatment is working. If doctors can speed their patients onto the best treatment, then it’s clearly of benefit.”

The next step is to trial the technique in ovarian cancer patients, which the scientists anticipate within the next few years.

Hyperpolarised carbon-13 imaging uses an injectable solution containing a ‘labelled’ form of the naturally occurring molecule pyruvate. The pyruvate enters the cells of the body, and the scan shows the rate at which it is broken down - or metabolised – into a molecule called lactate. The rate of this metabolism reveals the tumour subtype and thus its sensitivity to treatment.
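As an illustration of the kind of read-out involved, the sketch below simulates two tumours converting hyperpolarised pyruvate to lactate at different assumed rates and summarises each with a lactate-to-pyruvate signal ratio, a common way of quantifying this conversion. The signal curves, rate constants and labels are synthetic and purely illustrative; they are not values from the study, and no claim is made about which subtype converts faster:

import numpy as np

t = np.linspace(0, 60, 61)            # seconds after injection
dt = t[1] - t[0]
pyruvate = np.exp(-t / 30.0)          # decaying hyperpolarised pyruvate signal (arbitrary units)

def lactate_signal(k_conv, t1_lactate=25.0):
    """Lactate signal for a simple one-way pyruvate-to-lactate conversion at rate k_conv (1/s)."""
    lac = np.zeros_like(t)
    for i in range(1, len(t)):
        lac[i] = lac[i - 1] + dt * (k_conv * pyruvate[i - 1] - lac[i - 1] / t1_lactate)
    return lac

for label, k_conv in [("tumour A (assumed slower conversion)", 0.002),
                      ("tumour B (assumed faster conversion)", 0.02)]:
    ratio = lactate_signal(k_conv).sum() / pyruvate.sum()
    print(f"{label}: lactate/pyruvate signal ratio = {ratio:.3f}")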

This study adds to the evidence for the value of the hyperpolarised carbon-13 imaging technique for wider clinical use. 

Brindle, who also works at the Cancer Research UK Cambridge Institute, has been developing this imaging technique to investigate different cancers for the last two decades, including breast, prostate and glioblastoma - a common and aggressive type of brain tumour. Glioblastoma also shows different subtypes that vary in their metabolism, which can be imaged to predict their response to treatment. The first clinical study in Cambridge, which was published in 2020, was in breast cancer patients.

Each year about 7,500 women in the UK are diagnosed with ovarian cancer - around 5,000 of these will have the most aggressive form of the disease, called high-grade serous ovarian cancer (HGSOC). 

The cure rate for all forms of ovarian cancer is very low and currently only 43% of women in England survive five years beyond diagnosis. Symptoms can easily be missed, allowing the disease to spread before a woman is diagnosed - and this makes imaging and treatment challenging. 

The research was funded by Cancer Research UK.

Reference: Chia, M.L., et al: ‘Metabolic imaging distinguishes ovarian cancer subtypes and detects their early and variable responses to treatment.’ Oncogene, December 2024. DOI: 10.1038/s41388-024-03231-w

An MRI-based imaging technique developed at the University of Cambridge predicts the response of ovarian cancer tumours to treatment, and rapidly reveals how well treatment is working, in patient-derived cell models.

We can image a tumour pre-treatment to predict how likely it is to respond, and then we can image again immediately after treatment to confirm whether it has indeed responded
Kevin Brindle


Cambridge researchers develop urine test for early detection of lung cancer

Close-up of cancer cells

Researchers hope that early detection, through the simple urine test, could enable earlier treatment interventions, significantly improving patient outcomes and prognosis. Around 36,600 lives are lost to lung cancer in the UK every year, according to new analysis from Cancer Research UK.

Professor Ljiljana Fruk and Dr Daniel Munoz Espin and their teams at the University of Cambridge are leading on the research, funded by Cancer Research UK.

The work, at Cambridge’s Department of Chemical Engineering and Biotechnology and the Early Cancer Institute, will provide an affordable sensor that uses urine samples to help doctors detect lung cancer before the disease develops.

Lung cancer has a poor prognosis for many patients because often there are no noticeable symptoms until it has spread through the lungs or into other parts of the body. The new urine test will allow doctors to spot the disease before it develops.

To create the test, scientists looked at proteins excreted by senescent cells: “zombie” cells which are alive but unable to grow and divide. It’s these cells that cause tissue damage by reprogramming their immediate environment to help promote the emergence of cancer cells.

Now, researchers have developed an injectable sensor that interacts with zombie cell proteins and releases an easily detectable compound into the urine, signalling their presence.

“Early detection of cancer requires cost-effective tools and strategies that enable detection to happen quickly and accurately,” said Fruk. “We designed a test based on peptide-cleaving proteins, which are found at higher levels in the presence of zombie cells, and in turn appear in the early stages of cancer.

“Ultimately, we want to develop a urine test that could help doctors identify signs of the early stages of cancer – potentially months or even years before noticeable symptoms appear.”

As well as targeting lung cancer, Fruk hopes her research, along with joint efforts across other university departments, will result in the development of probes capable of detecting other cancers.

“We have almost completed a functional urine test to detect ‘zombie' cells in lung cancer, which will spot cancer earlier and avoid the need for invasive procedures, but this test does have potential for other cancers,” she said. “Developing more efficient cancer treatments requires earlier detection and better therapies, but also work with other disciplines for a more holistic view of the disease, which is an essential part of my research.”

From uncovering the causes of lung cancer to pioneering drugs to treat it, Cancer Research UK has helped power progress for people affected by lung cancer. Over the last 10 years, the charity has invested over £231 million in lung cancer research.

“Cancer Research UK has played a key role in advancing lung cancer research and improving survival,” said Dr Iain Foulkes, Cancer Research UK’s executive director of research and innovation. “This project being led by Professor Fruk is another example of our commitment to driving progress so that more people can live longer, better lives, free from the fear of cancer.”

Adapted from a Cancer Research UK media release. 

Cambridge scientists have developed a urine test for early detection of lung cancer. The test, the first of its kind, detects ‘zombie’ cells that could indicate the first signs of the disease.

Close-up of cancer cells


Deputy Prime Minister of Singapore visits Cambridge overseas research centre

Mr Heng Swee Keat, Deputy Prime Minister of Singapore, visits CARES

The Cambridge Centre for Advanced Research and Education in Singapore (CARES) is hosting two projects that aim to aid Singapore’s business transition away from petrochemicals towards a net-zero emissions target by 2050.

Under the newly launched CREATE Thematic Programme in Decarbonisation supported by the National Research Foundation (NRF), the two projects will investigate non-fossil fuel-based pathways for Singapore’s chemical manufacturing industry and energy systems. 

Deputy Prime Minister and Chairman of the NRF, Mr Heng Swee Keat toured the first of three laboratories for the programme to view the technical capabilities required for the various project teams, including CARES’ projects on the Sustainable Manufacture of Molecules and Materials in Singapore (SM3), and Hydrogen and Ammonia Combustion in Singapore (HYCOMBS).

SM3 aims to provide a path to a net-zero, high-value chemical manufacturing industry in Singapore. Its core goal is to address the dependency of producers of performance chemicals on starting materials that typically come from fossil-based carbon sources. The SM3 team hope to develop effective synthetic methods that best convert cheap and abundant fossil-free raw materials into high-value molecules, for use in sectors such as medicines and agrochemicals.

In project HYCOMBS, universities from Singapore, the UK, Japan, France and Norway will work together to investigate the underlying combustion process of hydrogen and ammonia, to minimise pollutants and accelerate industry innovation.

As part of the lab demonstrations on decarbonisation, CARES showcased an additional ongoing activity with City Energy investigating hydrogen-rich town gas for residential and commercial cooking stoves.

Mr Heng Swee Keat said: "The need to tackle climate change and its impact grows ever more urgent. During my visit to Cambridge CARES (Centre for Advanced Research and Education in Singapore) — Cambridge University's first and only research centre outside the UK — I witnessed how research and international collaboration are driving innovative solutions to combat climate change, particularly in the area of decarbonisation.

"In just a decade, CARES has established cutting-edge R&D facilities dedicated to decarbonisation projects that not only reduce emissions but also pave the way for a more sustainable future for Singapore. From hydrogen combustion and laser-based combustion diagnostics to the development of cleaner fuels for gas stoves, their work is closely aligned with the goals outlined in our Singapore Green Plan 2030, and achieving Singapore’s net-zero emissions goal by 2050.

"It was encouraging to hear from Director of CARES, Professor Markus Kraft, as he shared how being based in the CREATE facility at the National University of Singapore facilitates interactions with researchers from diverse countries and disciplines. This collaborative and interdisciplinary approach embodies the essence of research — working together to address shared global challenges."

Since 2013, CARES has been involved in research programmes with Nanyang Technological University and the National University of Singapore as the University of Cambridge’s first overseas centre. One of its early flagship programmes, the Centre for Carbon Reduction in Chemical Technologies (C4T), has investigated areas from sustainable reaction engineering, electrochemistry, and maritime decarbonisation to digitalisation.

By building on this foundation and leveraging the local talent pool, CARES has attracted new partners from international universities and institutes for SM3 and HYCOMBS. These include EPFL, the Swiss Federal Institute of Technology Lausanne, which will provide expertise in AI for chemistry. CNRS, the French National Centre for Scientific Research, the Norwegian University of Science and Technology, and Tohoku University in Japan will contribute technical equipment and key talent in hydrogen and ammonia combustion.

Adapted from a release originally published by CARES

Mr Heng Swee Keat, Deputy Prime Minister of Singapore and Chairman of the National Research Foundation (NRF) paid a visit to the University of Cambridge’s overseas research centre in Singapore and viewed its technical capabilities for decarbonisation research.

Deputy Prime Minister of Singapore, Mr Heng Swee Keat, viewing decarbonisation activities at Cambridge CARES


War in Lebanon has turned a decade of education crisis into a catastrophe - report

Syrian refugee children in a Lebanese school classroom

The recent conflict in Lebanon has deepened a national education crisis in which children have already lost up to 60% of school time over the past 6 years, new research warns.

The report, by the Centre for Lebanese Studies and the University of Cambridge’s REAL Centre, is the first to assess the state of education since Israel began its ground offensive in Lebanon in October. Using surveys and interviews with parents and teachers, it provides a snapshot of the situation a few weeks before the new ceasefire between Israel and Hezbollah.

The study stresses that even if that ceasefire holds, a co-ordinated, forward-thinking response is essential to prevent further learning losses in an already fragile education system.

Before the recent conflict, Lebanese schools had endured over a decade of compounded crises, including an influx of Palestinian and Syrian refugees, a major financial crisis, the 2020 Beirut explosion, and the COVID-19 pandemic. Since 2018, the authors calculate, students have missed more than 760 teaching days due to strikes, disruption and closures.

The report shows that the effects of the latest violence have been uneven, depending on where families and teachers are based and their immediate circumstances. Refugee children and students with disabilities have been disproportionately affected and are among those who face the greatest risk of missing out further, even as the education system struggles to recover.

Dr Maha Shuayb, Director of the Centre for Lebanese Studies and a researcher at the University of Cambridge’s Faculty of Education, said: “The war has deepened learning losses that were already near-catastrophic. Whatever happens next, flexible, inclusive, multi-agency strategies are urgently needed to ensure education reaches those who need it most.”

“Without thorough response planning, existing inequalities will become more entrenched, leaving entire sections of the younger generation behind.”

The report is the second in a series examining the impact of war on education in the Middle East. The previous report, on Gaza, warned that conflict there could set children’s education back by several years.

REAL Centre Director Professor Pauline Rose said: “In Lebanon and Gaza, it is not only clear that violence, displacement and trauma are causing devastating learning losses; we also need a much more co-ordinated response. Education should not be an afterthought in times of crisis; it is vital to future stability.”

More than 1.3 million civilians have been displaced in Lebanon since Israel escalated its military operations. The new study was undertaken at the end of October, and involved a survey with 1,151 parents and teachers, supplemented with focus groups and interviews.

The authors calculate that by November, over 1 million students and 45,000 teachers had been directly affected by the conflict. About 40% of public (state-run) schools had been converted into shelters. A further 30% were in war zones, severely limiting space for schooling.

Lebanon’s Ministry of Education and Higher Education (MEHE) attempted to reopen public schools on 4 November, but the study shows that for many people, violence, displacement and inadequate infrastructure impeded the resumption. Researchers found that 303 public schools were running in-person learning and 297 functioning online, but in conflict-hit regions like Baalbek-Hermel, the South, and Nabatiyyeh, barely any were physically open.

Many of the survey participants were living in shelters or overcrowded shared accommodation, where online learning – often the only option available – was difficult. Financial pressures, exacerbated by the war, have further disrupted education. 77% of parents and 66% of teachers said the conflict had reduced their incomes amid rising living costs.

While all teachers and parents wanted education to resume, the study found that they were not universally prepared to do so. Only 19% of teachers in areas heavily affected by the fighting, for example, considered restarting education a ‘high priority’. They also tended to prefer online learning, often for safety reasons, while those in less disrupted regions felt better prepared to resume education in person.

Both parents and teachers highlighted the resource shortages hindering learning. Many lacked reliable internet, digital devices or even electricity. For example, only 62% of teachers and 49% of parents said they had an internet connection.

The report also highlights the extremely difficult experiences of Palestinian and Syrian refugee children and those with disabilities: groups that were disproportionately affected by systemic inequalities before the conflict began.

The authors estimate that as many as 5,000 children with disabilities could be out of school, with some parents reluctant to send children back due to a lack of inclusive provision. Refugee families, meanwhile, are among those who most urgently need food, shelter and financial help. Despite this, Syrian parents were statistically more likely to consider education a high priority. This may reflect concerns that they have been overlooked in MEHE’s plans.

Some families and teachers suggested the government’s November restart was proving chimerical. “The authorities claim that the school year has been launched successfully, but this isn’t reflective of reality,” one teacher said. “It feels more like a drive for revenue than a genuine commitment.”

MEHE’s attempts at a uniform strategy, the researchers stress, will not help everyone. “The focus has largely been on resuming schooling, with little attention paid to quality of learning," they write, adding that there is a need for a far more inclusive response plan, involving tailored strategies which reflect the different experiences of communities on the ground.

The report adds that this will require much closer collaboration between government agencies, NGOs, universities, and disability-focused organisations to address many of the problems raised by the analysis, such as financial instability, a lack of online learning infrastructure, and insufficient digital teaching capacity.

Even if the ceasefire holds, challenges remain. Many displaced families may not return home for weeks, while schools may still be used as shelters or require repairs. Temporary learning spaces, targeted infrastructure restoration, and trauma-informed approaches to helping children who need psychosocial learning recovery, will all be required.

Yusuf Sayed, Professor of Education, University of Cambridge said: “Everyone hopes that Lebanon will return to normality, but we have grave reservations about the quality, consistency and accessibility of education in the medium term. Addressing that requires better data collection and monitoring, a flexible plan and multi-agency support. Our working assumption should be that for more than a million children, this crisis is far from over.”

Israel-Hezbollah conflict has deepened an education crisis in which children have lost up to 60% of schooling in 6 years, study shows.

Syrian refugee children in a Lebanese school classroom


A third of people from Chicago carry concealed handguns in public before they reach middle age

A man drawing a concealed-carry pistol from an inside-the-waistband holster

Around a third (32%) of people who grew up in Chicago have carried a concealed firearm on the city streets at least once by the time they turn 40 years old, according to a major study of gun usage taking in a quarter of a century of data.

Urban sociologists behind the research argue that such carry rates are likely to be similar across many other major US cities. 

The research suggests that almost half of men (48%) have carried a concealed gun by the age of 40, compared to just 16% of women.*
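Those sex-specific figures are consistent with the 32% headline rate, assuming the cohort is roughly half men and half women (an assumption made for this illustration, not a detail reported above):

men_rate, women_rate = 0.48, 0.16
overall = (men_rate + women_rate) / 2     # equal-weighted average of the two rates
print(f"implied overall carry rate by age 40: {overall:.0%}")   # -> 32%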

The study, published in Science Advances, is one of the few to track gun usage in the same US population across decades, and reveals that two-thirds of those who carried a gun in the past year started doing so in adulthood, compared to only a third who began in adolescence.

The research also found that gun carrying in adolescence and adulthood may occur in response to different concerns. Those who started carrying in their teens often picked up a handgun in response to experiencing gun violence first-hand.** This was not true of those who began carrying over the age of 21.

“Among adolescents, we found a strong association between either witnessing a shooting or being shot, and beginning to carry soon after,” said Dr Charles Lanfear, study lead author from the University of Cambridge.

“The majority of people who ever carry a concealed handgun start doing so in adulthood. For those adults, we found no link between direct exposure to gun violence and gun carrying,” said Lanfear from Cambridge’s Institute of Criminology.

“This pattern suggests that gun carrying among adults may be linked to perceived threats of a more general nature, such as the idea that the world is a dangerous place, and police are incapable of ensuring public safety. Whereas gun carrying in adolescence may more often be related to direct experiences of gun violence.

“One simple but crucial fact is clear from our study, that carrying a concealed firearm is now a common event in the life course for Americans,” Lanfear said.

In the US between 1995 and 2021 some 89% of firearm homicides were committed with a handgun. However, despite the US gun stock doubling over the past quarter-century, and homicides spiking in COVID-era America, little is known about when and why people start carrying handguns.

The latest study was conducted by researchers from the University of Cambridge, University of Pennsylvania and Harvard University. Data was taken from a representative sample of 3,403 children originally from Chicago who were tracked over a 25-year period between 1994 and 2021.

When data-gathering began in the mid-90s, children were drawn at random from 80 of Chicago’s 343 neighbourhoods and from across the racial and socioeconomic spectrum, as part of a major longitudinal study run by Harvard.

The new analysis of this huge tranche of data reveals what researchers have called ‘dual pathways’ of concealed gun carrying: those who start in adolescence and those who start in adulthood, with the cut-off being the 21st birthday – the legal age for purchasing and carrying a handgun. 

In addition to findings on why people carry, the team discovered that most people who carry a gun in their teens do not continue in later life, with only 37% still carrying in 2021. Those who start carrying handguns in adulthood are more persistent, with 85% still taking a gun out in public in 2021.
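
To make the ‘dual pathway’ split concrete, here is a minimal sketch in Python (using invented toy records, not the study’s survey data) of classifying carriers by their age at first carry, with the 21st birthday as the cutoff, and computing the share of each group still carrying in the most recent wave:

import pandas as pd

# Toy records for illustration only; the real analysis draws on longitudinal
# survey data from 3,403 Chicago participants tracked between 1994 and 2021.
people = pd.DataFrame({
    "id": [1, 2, 3, 4, 5, 6],
    "age_first_carry": [16, 19, 25, 34, None, 17],   # None = never carried
    "still_carrying_2021": [False, True, True, True, False, False],
})

carriers = people.dropna(subset=["age_first_carry"]).copy()
carriers["pathway"] = carriers["age_first_carry"].apply(
    lambda age: "adolescent (<21)" if age < 21 else "adult (21+)"
)

# Persistence: share of each pathway still carrying in 2021
print(carriers.groupby("pathway")["still_carrying_2021"].mean())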

Moreover, the use of guns – whether it be shooting someone, shooting at someone, or brandishing a gun in self-defence – differs among the 2 groups.

Teenage gun-carriers who fired or brandished their weapons all did so for the first time before adulthood. “We found that no one who began carrying a gun in adolescence ended up using it for the first time after the age of twenty-one,” said Lanfear.

Those who picked up a gun in adulthood had a relatively steady rate of first usage over time, so that by middle age (40 years old) both groups of carriers had reached almost identical levels of gun usage: with around 40% of carriers having used a gun.

Researchers found a racial component to gun-carrying. Black individuals carried at more than twice the rate of Hispanic and White individuals. However, a previous study by the same team showed that Black city residents were twice as likely as White residents to witness a shooting by age 40.

In fact, the research found that those least likely to witness gun violence – White residents – are the most likely to start carrying a firearm in response to gun violence exposure.

While all self-described gun-carriers – whether they started in adolescence or adulthood – are more likely to have an arrest history than those who don’t carry guns, the researchers say their study reveals ‘stark’ differences in why, when and for how long people take guns onto the streets.

Added Lanfear: “These findings take on new relevance given recent social changes in America. In 2020 and 2021 the nation saw a sharp increase in adult gun carrying, coinciding with an uptick in gun purchases following the outbreak of COVID-19 and the murder of George Floyd. We found the same trends in adult gun-carrying among our study sample.”

Major 25-year study reveals a ‘dual pathway’ for when people start carrying.

Carrying a concealed firearm is now a common event in the life course for Americans
Charles Lanfear
A man drawing a conceal carry pistol from an inside the waistband holster
Notes:

* The researchers found female gun-carrying to be uncommon. However, they detected a rapid increase in some cohorts at the age of 35; these increases all occurred during the first year of COVID-19 (2020). Researchers say this is consistent with other research finding large COVID-era increases in gun ownership among groups with historically lower rates of ownership.

**Exposure to gun violence before age 15 is associated with a doubling in the probability of carrying a concealed gun between ages 15 and 21. Around 44% of adolescent gun carriers started carrying after being exposed to gun violence. In contrast, exposure to gun violence at an older age is not statistically or substantively associated with gun-carrying. Direct exposure to gun violence after age 21 is far less frequent than during adolescence.


Professor Joya Chatterji awarded Wolfson History Prize 2024

Joya Chatterji at the award ceremony for the Wolfson History Prize 2024

This year’s Wolfson History Prize has been awarded to Joya Chatterji, Emeritus Professor of South Asian History and Fellow of Trinity College, for her book Shadows At Noon: The South Asian Twentieth Century, first published in 2023.

The book charts the story of the subcontinent from the British Raj through independence and partition to the forging of the modern nations of India, Pakistan and Bangladesh.

Chatterji’s history pushes back against standard narratives that emphasise differences between the 3 countries, and instead seeks to highlight what unites these nations and their peoples.

Interwoven with Chatterji’s personal reflections on growing up in India, this distinctive academic work uses a conversational writing style and takes a thematic rather than chronological approach. It adds to the discussions of politics and nationhood typical of other histories of the region by weaving in everyday experiences of food, cinema, and domestic life.

As a result, the cultural vibrancy of South Asia shines through the research, according to the Wolfson History Prize judges, allowing readers a more nuanced understanding of South Asian history.

A judging panel chaired by Professor David Cannadine and including fellow Cambridge historians Professors Mary Beard and Richard Evans described Chatterji’s book as “written with verve and energy”, saying that it “beautifully blends the personal and the historical”.

“Shadows at Noon is a highly ambitious history of 20th-century South Asia that defies easy categorisation, combining rigorous historical research with personal reminiscence and family anecdotes,” said Cannadine.  

“Chatterji writes with wit and perception, shining a light on themes that have shaped the subcontinent during this period. We extend our warmest congratulations to Joya Chatterji on her Wolfson History Prize win.”

“For over 50 years, the Wolfson History Prize has celebrated exceptional history writing that is rooted in meticulous research with engaging and accessible prose,” said Paul Ramsbottom, Chief Executive of the Wolfson Foundation.

“Shadows at Noon is a remarkable example of this, and Joya Chatterji captivates readers with her compelling storytelling of modern South Asian history.”

Shadows at Noon was also longlisted for the Women’s Prize for Non-Fiction 2024 and shortlisted for the Cundill History Prize 2024.

Now in its 52nd year, the Wolfson History Prize celebrates books that combine excellence in research with readability for a general audience.

Recent winners have included other Cambridge historians: Clare Jackson, Honorary Professor of Early Modern History, for Devil-Land: England Under Siege, 1588-1688 (2022) and David Abulafia, Professor Emeritus of Mediterranean History, for The Boundless Sea: A Human History of the Oceans (2020). Helen McCarthy, Professor of Modern and Contemporary British History, was shortlisted for Double Lives: A History of Working Motherhood in 2021.

Chatterji wins for Shadows at Noon, her genre-defying history of South Asia during the 20th century.

Joya Chatterji at the award ceremony for the Wolfson History Prize 2024


New datasets will train AI models to think like scientists

A mosaic of simulations included in the Well collection of datasets

The initiative, called Polymathic AI, uses technology like that powering large language models such as OpenAI’s ChatGPT or Google’s Gemini. But instead of ingesting text, the project’s models learn using scientific datasets from across astrophysics, biology, acoustics, chemistry, fluid dynamics and more, essentially giving the models cross-disciplinary scientific knowledge.

“These datasets are by far the most diverse large-scale collections of high-quality data for machine learning training ever assembled for these fields,” said team member Michael McCabe from the Flatiron Institute in New York City. “Curating these datasets is a critical step in creating multidisciplinary AI models that will enable new discoveries about our universe.”

On 2 December, the Polymathic AI team released two of its open-source training dataset collections to the public — a colossal 115 terabytes, from dozens of sources — for the scientific community to use to train AI models and enable new scientific discoveries. For comparison, GPT-3 used 45 terabytes of uncompressed, unformatted text for training, which ended up being around 0.5 terabytes after filtering.

The full datasets are available to download for free on HuggingFace, a platform hosting AI models and datasets. The Polymathic AI team provides further information about the datasets in two papers accepted for presentation at the NeurIPS machine learning conference, to be held later this month in Vancouver, Canada.
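
For researchers who want to explore the releases, the minimal sketch below shows one way to load a collection from HuggingFace with the datasets library. The repository name is a placeholder rather than an official identifier, so check the Polymathic AI pages on HuggingFace for the actual dataset names; streaming mode avoids downloading a multi-terabyte collection in full:

from datasets import load_dataset

# Placeholder repository name -- consult HuggingFace for the real identifiers.
ds = load_dataset("polymathic-ai/example-collection", split="train", streaming=True)

# Inspect the fields of a single example without pulling the whole dataset
for record in ds.take(1):
    print(record.keys())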

“Just as LLMs such as ChatGPT learn to use common grammatical structure across languages, these new scientific foundation models might reveal deep connections across disciplines that we’ve never noticed before,” said Cambridge team lead Dr Miles Cranmer from Cambridge’s Institute of Astronomy. “We might uncover patterns that no human can see, simply because no one has ever had both this breadth of scientific knowledge and the ability to compress it into a single framework.”

AI tools such as machine learning are increasingly common in scientific research, and were recognised in two of this year’s Nobel Prizes. Still, such tools are typically purpose-built for a specific application and trained using data from that field. The Polymathic AI project instead aims to develop models that are truly polymathic, like people whose expert knowledge spans multiple areas. The project’s team reflects intellectual diversity, with physicists, astrophysicists, mathematicians, computer scientists and neuroscientists.

The first of the two new training dataset collections focuses on astrophysics. Dubbed the Multimodal Universe, the dataset contains hundreds of millions of astronomical observations and measurements, such as portraits of galaxies taken by NASA’s James Webb Space Telescope and measurements of our galaxy’s stars made by the European Space Agency’s Gaia spacecraft.

The other collection — called the Well — comprises over 15 terabytes of data from 16 diverse datasets. These datasets contain numerical simulations of biological systems, fluid dynamics, acoustic scattering, supernova explosions and other complicated processes. Cambridge researchers played a major role in developing both dataset collections, working alongside Polymathic AI and other international collaborators.

While these diverse datasets may seem disconnected at first, they all require the modelling of mathematical equations called partial differential equations. Such equations pop up in problems related to everything from quantum mechanics to embryo development and can be incredibly difficult to solve, even for supercomputers. One of the goals of the Well is to enable AI models to churn out approximate solutions to these equations quickly and accurately.
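
As a flavour of what such simulations involve (a generic illustration, not code from the Well itself), the sketch below advances the one-dimensional heat equation du/dt = alpha * d2u/dx2 with an explicit finite-difference scheme, the kind of numerical solution that models trained on these datasets are meant to approximate far more quickly:

import numpy as np

nx = 64
alpha = 0.1                         # diffusion coefficient
dx = 1.0 / nx
dt = 0.4 * dx**2 / alpha            # below the explicit stability limit of 0.5 * dx**2 / alpha

# Initial temperature profile on a periodic grid
u = np.sin(2 * np.pi * np.linspace(0.0, 1.0, nx, endpoint=False))

for _ in range(200):
    laplacian = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    u += dt * alpha * laplacian     # forward-Euler time step

print(f"peak amplitude after diffusion: {u.max():.4f}")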

“By uniting these rich datasets, we can drive advancements in artificial intelligence not only for scientific discovery, but also for addressing similar problems in everyday life,” said Ben Boyd, PhD student in the Institute of Astronomy.

Gathering the data for those datasets posed a challenge, said team member Ruben Ohana from the Flatiron Institute. The team collaborated with scientists to gather and create data for the project. “The creators of numerical simulations are sometimes sceptical of machine learning because of all the hype, but they’re curious about it and how it can benefit their research and accelerate scientific discovery,” he said.

The Polymathic AI team is now using the datasets to train AI models. In the coming months, they will deploy these models on various tasks to see how successful these well-rounded, well-trained AIs are at tackling complex scientific problems.

“It will be exciting to see if the complexity of these datasets can push AI models to go beyond merely recognising patterns, encouraging them to reason and generalise across scientific domains,” said Dr Payel Mukhopadhyay from the Institute of Astronomy. “Such generalisation is essential if we ever want to build AI models that can truly assist in conducting meaningful science.”

“Until now, we haven’t had a curated scientific-quality dataset covering such a wide variety of fields,” said Cranmer, who is also a member of Cambridge’s Department of Applied Mathematics and Theoretical Physics. “These datasets are opening the door to true generalist scientific foundation models for the first time. What new scientific principles might we discover? We’re about to find out, and that’s incredibly exciting.”

The Polymathic AI project is run by researchers from the Simons Foundation and its Flatiron Institute, New York University, the University of Cambridge, Princeton University, the French Centre National de la Recherche Scientifique and the Lawrence Berkeley National Laboratory.

Members of the Polymathic AI team from the University of Cambridge include PhD students, postdoctoral researchers and faculty across four departments: the Department of Applied Mathematics and Theoretical Physics, the Department of Pure Mathematics and Mathematical Statistics, the Institute of Astronomy and the Kavli Institute for Cosmology.

What can exploding stars teach us about how blood flows through an artery? Or swimming bacteria about how the ocean’s layers mix? A collaboration of researchers, including from the University of Cambridge, has reached a milestone toward training artificial intelligence models to find and use transferable knowledge between fields to drive scientific discovery.

A mosaic of simulations included in the Well collection of datasets


CISL appoints Lindsay Hooper permanent CEO

Photo of Lindsay Hooper

Lindsay’s appointment comes at a critical moment for the sustainability movement and for CISL. 

Following another year of record temperatures, extreme weather events and sustained biodiversity loss, the evidence is clear that the world is not on track. Confidence within the sustainability movement has faltered and big questions are being asked about what is needed to deliver the change we need. Under Lindsay’s leadership as interim CEO, the Institute has engaged with these important questions. Read more about Lindsay Hooper's appointment here.

The University of Cambridge Institute for Sustainability Leadership (CISL) announces that it has appointed Lindsay Hooper as its permanent CEO and Head of Department.

The need for CISL’s work has never been greater and I’m delighted to be working with an exceptional team
CISL CEO Lindsay Hooper
Photo of Lindsay Hooper


Marking a milestone in English language exams

100 million Cambridge English exams taken since 1913

In June 1913, 3 candidates in the UK took the first ever Cambridge English exam. Since then, Cambridge English exams have become available in 130 countries and are recognised by more than 25,000 organisations around the world, including governments, universities and employers, as reliable proof of English language ability.

The Cambridge English exams, which are designed for all levels of English language ability, include Cambridge English Qualifications, Linguaskill and IELTS, the International English Language Testing System.

Read more on the Cambridge University Press & Assessment website.

100 million Cambridge English exams and tests have been taken around the world since 1913, according to figures from Cambridge University Press & Assessment.


Scientists warn of ‘invisible threat’ of microplastics as global treaty nears completion

Researcher holding small pieces of micro plastic pollution washed up on a beach

Even if global production and pollution of new plastic is drastically reduced, scientists, writing in the journal Nature Communications, say that legacy plastics, the billions of tonnes of waste already in the environment, will continue to break down into tiny particles called microplastics for decades or centuries.

These fragments contaminate oceans, land, and the air we breathe, posing risks to marine life, food production and human health.

The researchers – from the University of Cambridge, GNS Science in New Zealand and The Ocean Cleanup in The Netherlands – say the problem lies in a gap between ambition and action, which they call the ‘fragmentation gap’.

This week in Busan, South Korea, the Intergovernmental Negotiating Committee on Plastic Pollution is meeting to finalise the Global Plastics Treaty, the first legally binding agreement to tackle plastic pollution.

While the treaty’s initial discussions highlight prevention of plastic pollution, the researchers say it largely overlooks the need to remove existing waste. This omission means microplastics will continue to accumulate, even if plastic pollution slows.

“The treaty is aiming to eliminate plastic pollution by 2040, but this goal is unlikely without stronger action,” said co-author Zhenna Azimrayat-Andrews, a PhD student at Cambridge’s Department of Earth Sciences. “Even with a sharp reduction in plastic entering the ocean, existing debris will split into smaller pieces and persist for centuries.”

These microplastics have already infiltrated marine ecosystems, where they are harming marine life, degrading commercial seafood quality and disrupting critical ocean processes.

The researchers argue that plastic clean-up efforts must be prioritised alongside reduction targets. Strategies to remove plastics from terrestrial and marine environments, such as those targeting pollution in beaches and rivers, could help prevent microplastics from forming. In fact, a 3% annual removal of legacy plastic, combined with aggressive reduction measures, could significantly curb future contamination, they say.
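
As a rough back-of-the-envelope illustration (not the authors’ model, which also accounts for fragmentation and continuing inputs), compounding a 3% annual removal rate on a fixed legacy stock shows how slowly clean-up alone draws the total down:

# Toy calculation: 3% of the remaining legacy stock removed each year, no new inputs.
legacy_stock = 1.0              # normalised stock of legacy plastic
removal_rate = 0.03

for year in range(2025, 2041):  # 16 years to 2040
    legacy_stock *= 1.0 - removal_rate

print(f"Fraction of the legacy stock remaining by 2040: {legacy_stock:.2f}")  # roughly 0.61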

Without action to address legacy plastic, the treaty risks leaving behind a long-lasting problem for marine life and future generations. Experts are calling for clean-up efforts to become an equal pillar of the treaty, alongside prevention and recycling.

As world leaders gather to negotiate the treaty this week, the spotlight is on their ability to craft a comprehensive plan that doesn't just slow pollution but also begins to reverse the damage that has already been done.

Reference:
Karin Kvale, Zhenna Azimrayat Andrews & Matthias Egger. ‘Mind the fragmentation gap.’ Nature Communications (2024). DOI: 10.1038/s41467-024-53962-3

As the UN meets this week to finalise the Global Plastics Treaty, researchers warn that the agreement could fail to address one of the biggest threats to marine environments—microplastics.

Researcher holding small pieces of micro plastic pollution washed up on a beach


Professor Lord Colin Renfrew – 1937-2024

Professor Colin Renfrew, Lord Renfrew of Kaimsthorn


The Department of Archaeology and McDonald Institute for Archaeological Research at the University of Cambridge mourn the death and celebrate the extraordinary life of Professor Colin Renfrew, Lord Renfrew of Kaimsthorn, formerly tenth Disney Professor of Archaeology, founding Director of the McDonald Institute, and Master of Jesus College.

Colin was, and will always remain, one of the titans of modern archaeology, a distinguished public figure, and a fine friend and colleague to innumerable archaeologists around the world. This loss makes the world of archaeology a poorer place intellectually, as well as in terms of the sheer energy and optimism that he brought to everything he did.

From his first years as one of the brave new archaeologists of the 1960s, Colin stood out as an exceptional mind, and as a spirit of profound, exciting and rigorous change. He pioneered new, theoretically informed ways of thinking about the explanation of social and political change in the past, within and then far beyond his first enduring regional love for the prehistoric Aegean, while advocating scientific techniques of dating and provenance as an integral part of archaeological endeavour. From this perspective, he was one of the first to appreciate the significance of the calibration of radiocarbon dates for the understanding of European prehistory. 

He went on to ask equally fresh questions about the link between language evolution and archaeology and, as the first Director of the McDonald Institute for Archaeological Research, championed some of the earliest applications of archaeogenetics, as well as a critical and investigative approach to the illicit antiquities market. His fieldwork expanded to Orkney, and latterly returned to the more southerly isles of the Cyclades, subject of his doctoral research, and to remarkable discoveries on the island of Keros. To the very end, he remained engaged with the forefront of archaeological developments, attending and clearly relishing the 36th Annual McDonald Lecture on the Wednesday before he left us.

As those who knew him will amply testify, there was far, far more to Colin than the world-leading and much honoured archaeologist. He took on the mantle of a working peer in the House of Lords, where he spoke up for matters of heritage and archaeological legislation with the customary eloquence and lapidary reasoning of a one-time President of the Cambridge Union.

He was a passionate and knowledgeable expert and collector of modern art, by which Jesus College under his care remains permanently graced. Social events under his hospitality became unforgettable and often hugely convivial gatherings of brilliant minds from the widespread fields that he drew together, and under the right circumstances often culminated in demonstrations of Colin’s skills as a dancer. Last but far from least, he was a much-loved husband to his wife Jane, and father to Helena, Alban and Magnus.

Colin passed away peacefully in his sleep during the night of Saturday 23 to Sunday 24 November 2024. All of us at Cambridge extend our heartfelt condolences and profound respects to his family and to all those who loved and knew him.

Professor Cyprian Broodbank remembers Professor Lord Colin Renfrew, founding Director of the McDonald Institute for Archaeological Research and former Master of Jesus College, who passed away at the weekend aged 87.  

Professor Colin Renfrew, Lord Renfrew of Kaimsthorn


Wildlife monitoring technologies used to intimidate and spy on women, study finds

Researcher interviewing a local woman in India

Remotely operated camera traps, sound recorders and drones are increasingly being used in conservation science to monitor wildlife and natural habitats, and to keep watch on protected natural areas.

But Cambridge researchers studying a forest in northern India have found that the technologies are being deliberately misused by local government and male villagers to keep watch on women without their consent.

Cambridge researcher Dr Trishant Simlai spent 14 months interviewing 270 locals living around the Corbett Tiger Reserve, a national park in northern India, including many women from nearby villages.

His report, published today in the journal Environment and Planning F, reveals how forest rangers in the national park deliberately fly drones over local women to frighten them out of the forest, and stop them collecting natural resources despite it being their legal right to do so.

The women, who previously found sanctuary in the forest away from their male-dominated villages, told Simlai they feel watched and inhibited by camera traps, so they talk and sing much more quietly. This increases the chance of surprise encounters with potentially dangerous wildlife like elephants and tigers. One woman he interviewed has since been killed in a tiger attack.

The study reveals a worst-case scenario of deliberate human monitoring and intimidation. But the researchers say people are being unintentionally recorded by wildlife monitoring devices without their knowledge in many other places - even national parks in the UK. 

“Nobody could have realised that camera traps put in the Indian forest to monitor mammals actually have a profoundly negative impact on the mental health of local women who use these spaces,” said Dr Trishant Simlai, a researcher in the University of Cambridge’s Department of Sociology and lead author of the report.

“These findings have caused quite a stir amongst the conservation community. It’s very common for projects to use these technologies to monitor wildlife, but this highlights that we really need to be sure they’re not causing unintended harm,” said Professor Chris Sandbrook, Director of the University of Cambridge’s Masters in Conservation Leadership programme, who was also involved in the report.

He added: “Surveillance technologies that are supposed to be tracking animals can easily be used to watch people instead – invading their privacy and altering the way they behave.”

Many areas of conservation importance overlap with areas of human use. The researchers call for conservationists to think carefully about the social implications of using remote monitoring technologies – and whether less invasive methods like surveys could provide the information they need instead.

Intimidation and deliberate humiliation

The women living near India’s Corbett Tiger Reserve use the forest daily in ways that are central to their lives: from gathering firewood and herbs to sharing life’s difficulties through traditional songs.

Domestic violence and alcoholism are widespread problems in this rural region and many women spend long hours in forest spaces to escape difficult home situations.

The women told Simlai that new technologies, deployed under the guise of wildlife monitoring projects, are being used to intimidate and exert power over them - by monitoring them too. 

“A photograph of a woman going to the toilet in the forest – captured on a camera trap supposedly for wildlife monitoring - was circulated on local Facebook and WhatsApp groups as a means of deliberate harassment,” said Simlai. 

He added: “I discovered that local women form strong bonds while working together in the forest, and they sing while collecting firewood to deter attacks by elephants and tigers. When they see camera traps they feel inhibited because they don’t know who’s watching or listening to them – and as a result they behave differently - often being much quieter, which puts them in danger.”

In places like northern India, the identity of local women is closely linked to their daily activities and social roles within the forest. The researchers say that understanding the various ways local women use forests is vital for effective forest management strategies.

Reference: Simlai, T. et al: ‘The Gendered Forest: Digital Surveillance Technologies for Conservation and Gender-Environment relationships.’ November 2024. DOI:10.17863/CAM.111664
 

Camera traps and drones deployed by government authorities to monitor a forest in India are infringing on the privacy and rights of local women.

Nobody could have realised that camera traps put in the Indian forest to monitor mammals actually have a profoundly negative impact on the mental health of local women who use these spaces.
Trishant Simlai
Researcher interviewing a local woman in India


Award-winning broadcaster Hannah Fry joins Cambridge as Professor of the Public Understanding of Mathematics

Hannah Fry.

Fry brings to the role outstanding experience of communicating with diverse audiences, including people not previously interested in maths. She will follow in the footsteps of giants of public engagement with mathematics, including David Spiegelhalter and the late Stephen Hawking, as she joins the Department of Applied Mathematics and Theoretical Physics (DAMTP).

“I’m really looking forward to joining the Cambridge community,” said Fry, “to those chance encounters and interactions that end up sparking new ideas and collaborations: it’s so exciting to be in an environment where every single person you speak to is working on something absolutely fascinating.”

Fry won the Christopher Zeeman Medal for promoting mathematics in 2018 and the Royal Society David Attenborough Award in 2024, and is the current President of the Institute of Mathematics and its Applications.

She is currently Professor of the Mathematics of Cities at UCL, where she works with physicists, mathematicians, computer scientists, architects and geographers to study patterns in human behaviour – particularly in an urban setting. Her research applies to a wide range of social problems and questions, from shopping and transport to urban crime, riots and terrorism, and she has applied this research by advising and working alongside governments, police forces, supermarkets and health analysts.

“When you create a mathematical model, it doesn’t really matter how beautifully crafted your equations are, or how accurate your simulations are,” said Fry. “You have to think about how the work you’ve created is going to be seen and perceived by other people and how it’s going to be understood or misunderstood.”

The new professorship builds on Cambridge’s long track record in sharing maths. DAMTP is also the home of the largest subject-specific outreach and engagement project in the University – the Millennium Mathematics Project (MMP).

Fry says she plans for her work at Cambridge to follow on from Spiegelhalter's extensive public communication work, which she sees as a vital part of the research process.

“Communication is not an optional extra: if you are creating something that is used by, or interacts with members of the public or the world in general, then I think it’s genuinely your moral duty to engage the people affected by it,” she said. “I’d love to build and grow a community around excellence in mathematical communication at Cambridge – so that we’re really researching the best possible methods to communicate with people.”

“Hannah is an outstanding mathematician and researcher, and one of the UK’s best maths communicators,” said Professor Colm-cille Caulfield, Head of DAMTP. “Mathematics affects so many aspects of our everyday lives in interesting and exciting ways, and Hannah will strengthen the excellent work already being done at Cambridge in this area. We in DAMTP and our Faculty of Mathematics colleagues in the Department of Pure Mathematics and Mathematical Statistics are so excited to have her join us.”

Professor Fry announced her appointment at an event yesterday (21 November) organised by the MMP in collaboration with the Newton Gateway to Mathematics at the Isaac Newton Institute in Cambridge. The event – Communicating mathematical and data sciences – what does success look like? – explored evidence for effectively communicating mathematical and data science research to policymakers, mainstream media and the wider public.

“Professor Fry is one of the most exciting voices in science and mathematics today,” said Professor Nigel Peake, Head of the School of the Physical Sciences. “Her deep commitment to sharing the excitement of maths with people of all ages and backgrounds, at a time when mathematical literacy has never been so important, will be an enormous benefit to Cambridge, and the UK as a whole.”

Professor Hannah Fry, mathematician, best-selling author, award-winning science presenter and host of popular podcasts and television shows, will join the University of Cambridge as the first Professor of the Public Understanding of Mathematics on 1 January.

Hannah Fry


Arm donates £3.5 million for Cambridge PhD students to study computer architecture and semiconductor design

computer chip

The first three students to be supported by the Arm donation will begin their studies at the new research centre in the autumn of 2025. They will be followed by another three students each year for the following four years.  

Arm – the company building the future of computing with its global headquarters in Cambridge – is the first organisation to donate to the new CASCADE Research Centre, part of the Department of Computer Science and Technology.

“We’re very grateful to them for their generous support,” said Professor Timothy Jones, Director of the Centre. “As well as funding 15 PhD students over the next five years, Arm’s involvement is helping us realise our vision of a centre where research into addressing key challenges in this field is informed and supported by our industrial partners. This is extremely valuable to us as we work to make the Centre a destination for collaboration between companies, generating pre-competitive open-source artefacts and driving development of novel computer architectures.”

Richard Grisenthwaite, executive vice president and chief architect, Arm said: “Our long-standing commitment to the University of Cambridge through this latest CASCADE funding highlights the vital collaboration between academia and industry as we embark on ground-breaking intent-based programming work to realize the future promise of AI through the next generation of processor designs."

“The Centre has the potential to enable further technology innovation within the semiconductor industry and is an important part of Arm’s mission to build the future of computing.” 

Jones added: "Computer architecture is a critical area of computing. It underpins today’s technologies and drives the next generation of computing systems. Here in the Department of Computer Science and Technology, we’re proud of our research and innovation in this area. And the recently published National Semiconductor Strategy underlined how vital such work is, showing that the UK is currently a leader in computer architecture."

"But to maintain this leading position, we need to invest in developing the research leaders of tomorrow. That's why we have established the new CASCADE Research Centre to fund PhD students working in this area, through support from industry. It is currently taking applications for its first cohort of students."

The Centre will focus on research that addresses some of the grand challenges in computer architecture, design automation and semiconductors.  

PhD students will work alongside researchers here who have expertise across the breadth of the area, encompassing the design and optimisation of general-purpose microprocessors, specialised accelerators, on-chip interconnect and memory systems, verification, compilation and networking, quantum architecture and resource estimation. This will allow them to explore the areas they are most passionate about, while addressing industry-relevant research.

Students receiving funding from Arm will be working in the general area of intent-based computing, researching systems that communicate what programs will do in the future so that the processor can make better decisions about how to execute them.

Arm was born in Cambridge in 1990 with the goal of changing the computing landscape. Its success since then in designing, architecting, and licensing high-performance, power-efficient CPUs — the 'brain' of all computers and many household and electronic devices — helped fuel the smartphone revolution and has made it a household name.

Arm has long had a research relationship with Cambridge University. Most notably, this has led to the development of new cybersecurity technology, focusing on innovative ways to design the architecture of a computer’s CPU to make software less vulnerable to security breaches.

Adapted from a news release published by the Department of Computer Science and Technology

Arm is donating £3.5 million to enable 15 PhD students over the next five years to study at CASCADE, the University's new Computer Architecture and Semiconductor Design Centre.

Futuristic circuit board and semiconductor


‘Manifest’ is Cambridge Dictionary Word of the Year

A marathon runner celebrates the moment he crosses the marathon finish line

‘Manifest’ was looked up almost 130,000 times on the Cambridge Dictionary website, making it one of the most-viewed words of 2024.

The word jumped from use in the self-help community and on social media to being widely used across mainstream media and beyond, as celebrities such as singer Dua Lipa, Olympic sprinter Gabby Thomas and England striker Ollie Watkins spoke of manifesting their success in 2024. 

Mentions of it gained traction during the pandemic and have grown in the years since, especially on TikTok and other social media, where millions of posts and videos used the hashtag #manifest.

These uses of ‘to manifest’ carry the sense of ‘to imagine achieving something you want, in the belief that doing so will make it more likely to happen’. Yet manifesting is an unproven idea that grew out of a 100-year-old spiritual philosophy movement.

Wendalyn Nichols, Publishing Manager of the Cambridge Dictionary, said: “When we choose a Cambridge Dictionary Word of the Year, we have three considerations: What word was looked up the most, or spiked? Which one really captures what was happening in that year? And what is interesting about this word from a language point of view?

“‘Manifest’ won this year because it increased notably in lookups, its use widened greatly across all types of media due to events in 2024, and it shows how the meanings of a word can change over time.”

However, experts warn that ‘manifesting’ has no scientific validity, despite its popularity. It can lead to risky behaviour or the promotion of false and dangerous beliefs, such as that diseases can simply be wished away.

“Manifesting is what psychologists call ‘magical thinking’ or the general illusion that specific mental rituals can change the world around us,” said Cambridge University social psychologist Professor Sander van der Linden, author of The Psychology of Misinformation.

“Manifesting gained tremendous popularity during the pandemic on TikTok with billions of views, including the popular 3-6-9 method which calls for writing down your wishes three times in the morning, six times in the afternoon and nine times before bed. This procedure promotes obsessive and compulsive behaviour with no discernible benefits. But can we really blame people for trying it, when prominent celebrities have been openly ‘manifesting’ their success?

“‘Manifesting’ wealth, love, and power can lead to unrealistic expectations and disappointment. Think of the dangerous idea that you can cure serious diseases simply by wishing them away,” said Van der Linden.

“There is good research on the value of positive thinking, self-affirmation, and goal-setting. Believing in yourself, bringing a positive attitude, setting realistic goals, and putting in the effort pays off because people are enacting change in the real world. However, it is crucial to understand the difference between the power of positive thinking and moving reality with your mind – the former is healthy, whereas the latter is pseudoscience.” 

‘Manyfest’, manifest destiny, and manifestos

The 600-year history of the word ‘manifest’ shows how the meanings of a word can evolve.

The oldest sense – which Geoffrey Chaucer spelled as ‘manyfest’ in the 14th century – is the adjective meaning ‘easily noticed or obvious’.

In the mid-1800s, this adjective sense was used in American politics in the context of “manifest destiny”, the belief that American settlers were clearly destined to expand across North America.

Chaucer also used the oldest sense of the verb ‘manifest’, ‘to show something clearly, through signs or actions’. Shakespeare used manifest as an adjective in The Merchant of Venice: ‘For it appears, by manifest proceeding, that...thou hast contrived against the very life of the defendant’.

The verb is still used frequently in this way: for example, people can manifest their dissatisfaction, or symptoms of an illness can manifest themselves. Lack of confidence in a company can manifest itself through a fall in share price.

The meaning of making something clear is reflected in the related noun 'manifesto': a ‘written statement of the beliefs, aims, and policies of an organisation, especially a political party’ – a word that also resonated in 2024 as scores of nations, including the United Kingdom and India, held elections where parties shared manifestos.

Other words of 2024

The Cambridge Dictionary is the world’s most popular dictionary for learners of the English language. Increases and spikes in lookups reflect global events and trends. Beyond “manifest”, other popular terms in 2024 included: 

brat: a child, especially one who behaves badly

“Brat” went viral in the summer of 2024 thanks to pop artist Charli XCX’s album of the same name about nonconformist women who reject a narrow and highly groomed female identity as portrayed on social media. (We weren’t the only dictionary publisher to notice this.) 

demure: quiet and well behaved 

Influencer Jools Lebron’s satirical use of “demure” in a TikTok post mocking stereotypical femininity drove lookups in the Cambridge Dictionary.  After brat summer, we had a demure fall. 

Goldilocks: used to describe a situation in which something is or has to be exactly right  

Financial reporters characterised India’s strong growth and moderate inflation as a Goldilocks economy in early 2024.

ecotarian: a person who only eats food produced or prepared in a way that does not harm the environment  

This term rose in overall lookups in 2024, reflecting growing interest in environmentally conscious living.  

New words, future entries?   

All year round, Cambridge Dictionary editors track the English language as it changes. Newly emerging words that are being considered for entry are shared every Monday on the Cambridge Dictionary blog, About Words. 

Words Cambridge began tracking in 2024 include: 

quishing: the scam of phishing via QR code. 

resenteeism: continuing to do your job while resenting it. This blend of “resent” and “absenteeism” is appearing in business journalism.

gymfluencer: a social media influencer whose content is focused on fitness or bodybuilding. 

cocktail party problem (also cocktail party effect): the difficulty of focusing on one voice when there are multiple speakers in the room. This term from audiology is now being used with reference to AI. 

vampire: a vampire device or vampire appliance is one which uses energy even when not in use. This is a new, adjective sense of an existing word.  

Adapted from the Cambridge University Press & Assessment website. 

The controversial global trend of manifesting has driven Cambridge Dictionary’s Word of the Year for 2024.


Northerners, Scots and Irish excel at detecting fake accents to guard against outsiders, study suggests

Crowds on Newcastle Quayside for the Great North Run in 2013. Photo: Glen Bowman, CC licence via Flickr

People from Belfast proved most able to detect someone faking their accent, while people from London, Essex and Bristol were least accurate.

The study, published today in Evolutionary Human Sciences, found that the ability of participants from Scotland, the north-east of England, Ireland and Northern Ireland to tell whether short recordings of their native accent were real or fake ranged from approximately 65% to 85%. By contrast, for Essex, London and Bristol, success ranged from just over 50%, barely better than chance, to 65–75%.

In the biggest study of its kind, drawing on 12,000 responses, the researchers found that participants across all groups were better than chance at detecting fake accents, succeeding just over 60% of the time. Unsurprisingly, participants who spoke naturally in the test accent tended to detect more accurately than non-native listener groups – some of which performed worse than chance – but success varied between regions.

“We found a pretty pronounced difference in accent cheater detection between these areas,” said corresponding author Dr Jonathan R Goodman, from Cambridge’s Department of Archaeology, and Cambridge Public Health.

“We think that the ability to detect fake accents is linked to an area’s cultural homogeneity, the degree to which its people hold similar cultural values.”

The researchers argue that the accents of speakers from Belfast, Glasgow, Dublin, and north-east England have culturally evolved over the past several centuries, during which there have been multiple cases of between-group cultural tension, particularly involving the cultural group making up southeast England, above all London.

This, they suggest, probably caused individuals from areas in Ireland and the northern regions of the United Kingdom to place emphasis on their accents as signals of social identity.

The study argues that greater social cohesion in Belfast, Dublin, Glasgow and the north-east may have resulted in a more prominent fear of cultural dilution by outsiders, which would have encouraged the development of improved accent recognition and mimicry detection.

People from London and Essex proved least able to spot fake accents because, the study suggests, these areas have less strong ‘cultural group boundaries’ and people are more used to hearing different kinds of accents, which could make them less attuned to accent fakery.

The study points out that many speakers of the Essex accent only moved to the area over the past 25 years from London, whereas the accents of people living in Belfast, Glasgow and Dublin have ‘evolved over centuries of cultural tension and violence.’

Some might have expected Bristolians to authenticate recordings of their accent more accurately, but Goodman points out that “cultural heterogeneity has been increasing significantly in the city”. The researchers would also like to obtain more data for Bristol.

An evolved ability

Previous research has shown that when people want to demarcate themselves for cultural reasons, their accents become stronger. In human evolution, the ability to recognise and thwart ‘free riders’ is also thought to have been pivotal in the development of large-scale societies.

Dr Goodman said: “Cultural, political, or even violent conflict are likely to encourage people to strengthen their accents as they try to maintain social cohesion through cultural homogeneity. Even relatively mild tension, for example the intrusion of tourists in the summer, could have this effect.

“I'm interested in the role played by trust in society and how trust forms. One of the first judgments a person will make about another person, and when deciding whether to trust them, is how they speak. How humans learn to trust another person who may be an interloper has been incredibly important over our evolutionary history and it remains critical today.”

Overall, the study found that participants were better than chance at detecting fake accents, but is it surprising that so many people failed 40–50% of the time?

The authors point out that participants were only given 2–3 second clips, so the fact that some authenticated with 70–85% accuracy is very impressive. If participants had heard a longer clip or been able to interact with someone face-to-face, the researchers would expect success rates to rise but continue to vary by region.

How the tests worked

The researchers constructed a series of sentences designed to elicit phonetic variables distinguishing between 7 accents of interest: north-east England, Belfast, Dublin, Bristol, Glasgow, Essex, and Received Pronunciation (RP), commonly understood as standard British English. The researchers chose these accents to ensure a high number of contrasting phonemes between sentences.

Test sentences included: ‘Hold up those two cooked tea bags’; ‘She kicked the goose hard with her foot’; ‘He thought a bath would make him happy’; ‘Jenny told him to face up to his weight’; and ‘Kit strutted across the room’.

The team initially recruited around 50 participants who spoke in these accents and asked them to record themselves reading the sentences in their natural accent. The same participants were then asked to mimic sentences in the other six accents in which they did not naturally speak, chosen randomly. Females mimicked females, males mimicked males. The researchers selected recordings which they judged came closest to the accents in question based on the reproduction of key phonetic variables.

Finally, the same participants were asked to listen to recordings made by other participants of their own accents, of both genders. Therefore, Belfast accent speakers heard and judged recordings made by native Belfast speakers as well as recordings of fake Belfast accents made by non-native speakers.

Participants were then asked to determine whether the recordings were authentic. All participants were asked to determine whether the speaker was an accent-mimic for each of 12 recordings (six mimics and six genuine speakers, presented in random order). The researchers obtained 618 responses.

In a second phase, the researchers recruited over 900 participants from the United Kingdom and Ireland, regardless of which accent they spoke naturally. This created a control group for comparison and increased the native speaker sample sizes. In the second phase, the researchers collected 11,672 responses.
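
For readers wondering what ‘better than chance’ means in practice, here is a minimal sketch of a plain binomial check against the 50% guessing rate. The counts are illustrative stand-ins based on the figures quoted above, not the paper’s actual analysis:

from scipy.stats import binomtest

n_responses = 618                            # e.g. the first-phase response count
n_correct = int(round(0.60 * n_responses))   # assume roughly 60% correct overall

result = binomtest(n_correct, n=n_responses, p=0.5, alternative="greater")
print(f"{n_correct}/{n_responses} correct; one-sided p-value vs chance = {result.pvalue:.3g}")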

“The UK is a really interesting place to study,” Dr Goodman said. “The linguistic diversity and cultural history is so rich and you have so many cultural groups that have been roughly in the same location for a really long time. Very specific differences in language, dialect and accents have emerged over time, and that's a fascinating side of language evolution.”

Reference

JR Goodman et al., ‘Evidence that cultural groups differ in their abilities to detect fake accents’, Evolutionary Human Sciences (2024). DOI: 10.1017/ehs.2024.36

People from Glasgow, Belfast, Dublin and the north-east of England are better at detecting someone imitating their accent than people from London and Essex, new research has found.

Cultural, political, or even violent conflict is likely to encourage people to strengthen their accents as they try to maintain social cohesion
Jonathan Goodman
Crowds on Newcastle Quayside for the Great North Run in 2013


A peek inside the box that could help solve a quantum mystery

Abstract colourful lines

Appearing as ‘bumps’ in the data from high-energy experiments, these signals came to be known as short-lived ‘XYZ states.’ They defy the standard picture of particle behaviour and are a problem in contemporary physics, sparking several attempts to understand their mysterious nature.

But theorists at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility in Virginia, with colleagues from the University of Cambridge, suggest the experimental data could be explained with fewer XYZ states, also called resonances, than currently claimed.

The team used a branch of quantum physics to compute the energy levels, or masses, of particles containing a specific ‘flavour’ of the subatomic building blocks known as quarks. Quarks are bound together by gluons, the force-carrying particles of the strong force, one of the four fundamental forces of nature.

The researchers found that multiple particle states sharing the same degree of spin – or angular momentum – are coupled, meaning only a single resonance exists in each spin channel. This new interpretation is contrary to several other theoretical and experimental studies.

The researchers have presented their results in a pair of companion papers published for the international Hadron Spectrum Collaboration (HadSpec) in Physical Review Letters and Physical Review D. The work could also provide clues about an enigmatic particle: X(3872).

The charm quark, one of six quark ‘flavours’, was first observed experimentally in 1974. It was discovered alongside its antimatter counterpart, the anticharm, and particles paired this way are part of an energy region called ‘charmonium.’

In 2003, Japanese researchers discovered a new charmonium candidate dubbed X(3872): a short-lived particle state that appears to defy the present quark model.

“X(3872) is now more than 20 years old, and we still haven’t obtained a clear, simple explanation that everyone can get behind,” said lead author Dr David Wilson from Cambridge’s Department of Applied Mathematics and Theoretical Physics (DAMTP).

Thanks to the power of modern particle accelerators, scientists have detected a hodgepodge of exotic charmonium candidate states over the past two decades.

“High-energy experiments started seeing bumps, interpreted as new particles, almost everywhere they looked,” said co-author Professor Jozef Dudek from William & Mary. “And very few of these states agreed with the model that came before.”

But now, by creating a tiny virtual ‘box’ to simulate quark behaviour, the researchers discovered that several supposed XYZ particles might actually be just one particle seen in different ways. This could help simplify the confusing jumble of data scientists have collected over the years.

Despite the tiny volumes they were working with, the team required enormous computing power to simulate all the possible behaviours and masses of quarks.

The researchers used supercomputers at Cambridge and the Jefferson Lab to infer all the possible ways in which mesons – made of a quark and its antimatter counterpart – could decay. To do this, they had to relate the results from their tiny virtual box to what would happen in a nearly infinite volume – that is, the size of the universe.
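The article does not spell out how results in a small virtual box are tied to real-world scattering, but in lattice QCD this step is conventionally made through a finite-volume quantization condition. As a schematic only – written here for a single decay channel of two particles of mass m in a cubic box of side L, whereas the collaboration’s actual analysis uses a coupled-channel generalisation of the idea – the computed energy levels E constrain the S-wave scattering phase shift δ₀ via

$$ p \cot\delta_0(p) \;=\; \frac{2}{\sqrt{\pi}\,L}\,\mathcal{Z}_{00}\!\left(1;\Big(\tfrac{pL}{2\pi}\Big)^{2}\right), \qquad E = 2\sqrt{m^{2}+p^{2}}, $$

where $\mathcal{Z}_{00}$ is a known geometric sum. With several open decay channels this becomes a determinant condition coupling all of them, which is what allows a single resonance to account for the signals seen in different final states.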

“In our calculations, unlike experiment, you can't just fire in two particles and measure two particles coming out,” said Wilson. “You have to simultaneously calculate all possible final states, because quantum mechanics will find those for you.”

The results can be understood in terms of just a single short-lived particle whose appearance could differ depending upon which possible decay state it is observed in.

“We're trying to simplify the picture as much as possible, using fundamental theory with the best methods available,” said Wilson. “Our goal is to disentangle what has been seen in experiments.”

Now that the team has proved this type of calculation is feasible, they are ready to apply it to the mysterious particle X(3872).

“The origin of X(3872) is an open question,” said Wilson. “It appears very close to a threshold, which could be accidental or a key part of the story. This is one thing we will look at very soon.”

Professor Christopher Thomas, also from DAMTP, is a member of the Hadron Spectrum Collaboration, and is a co-author on the current studies. Wilson’s contribution was made possible in part by an eight-year fellowship with the Royal Society. The research was also supported in part by the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI). Many of the calculations for this study were carried out with the support of the Cambridge Centre for Data Driven Discovery (CSD3) and DiRAC high-performance computing facilities in Cambridge, managed by Cambridge’s Research Computing Services division.

Reference:
David J. Wilson et al. ‘Scalar and Tensor Charmonium Resonances in Coupled-Channel Scattering from Lattice QCD.’ Physical Review Letters (2024). DOI: 10.1103/PhysRevLett.132.241901

David J. Wilson et al. ‘Charmonium χc0 and χc2 resonances in coupled-channel scattering from lattice QCD.’ Physical Review D (2024). DOI: 10.1103/PhysRevD.109.114503

Adapted from a Jefferson Lab story.

An elusive particle that first formed in the hot, dense early universe has puzzled physicists for decades. Following its discovery in 2003, scientists began observing a slew of other strange objects tied to the millionths of a second after the Big Bang.

Abstract colourful lines


Study uncovers earliest evidence of humans using fire to shape the landscape of Tasmania

Emerald Swamp, Tasmania

A team of researchers from the UK and Australia analysed charcoal and pollen contained in ancient mud to determine how Aboriginal Tasmanians shaped their surroundings. This is the earliest record of humans using fire to shape the Tasmanian environment.

Early human migrations from Africa to the southern part of the globe were well underway during the early part of the last ice age – humans reached northern Australia by around 65,000 years ago. When the first Palawa/Pakana (Tasmanian Indigenous) communities eventually reached Tasmania (known to the Palawa people as Lutruwita), it was the furthest south humans had ever settled.

These early Aboriginal communities used fire to penetrate and modify dense, wet forest for their own use – as indicated by a sudden increase in charcoal accumulated in ancient mud 41,600 years ago.

The researchers say their results, reported in the journal Science Advances, could not only help us understand how humans have been shaping the Earth’s environment for tens of thousands of years, but also deepen understanding of the long-term connection between Aboriginal communities and the landscape, which is vital for landscape management in Australia today.

Tasmania currently lies about 240 kilometres off the southeast Australian coast, separated from the Australian mainland by the Bass Strait. However, during the last ice age, Australia and Tasmania were connected by a huge land bridge, allowing people to reach Tasmania on foot. The land bridge remained until about 8,000 years ago, after the end of the last ice age, when rising sea levels eventually cut Tasmania off from the Australian mainland.

“Australia is home to the world’s oldest Indigenous culture, which has endured for over 50,000 years,” said Dr Matthew Adeleye from Cambridge’s Department of Geography, the study’s lead author. “Earlier studies have shown that Aboriginal communities on the Australian mainland used fire to shape their habitats, but we haven’t had similarly detailed environmental records for Tasmania.”

The researchers studied ancient mud taken from islands in the Bass Strait, which are part of Tasmania today but would have been part of the land bridge connecting Australia and Tasmania during the last ice age. Thanks to the low sea levels at the time, Palawa/Pakana communities were able to migrate from the Australian mainland on foot.

Analysis of the ancient mud showed a sudden increase in charcoal around 41,600 years ago, followed by a major change in vegetation about 40,000 years ago, as indicated by different types of pollen in the mud.

“This suggests these early inhabitants were clearing forests by burning them, in order to create open spaces for subsistence and perhaps cultural activities,” said Adeleye. “Fire is an important tool, and it would have been used to promote the type of vegetation or landscape that was important to them.”

The researchers say that humans likely learned to use fire to clear and manage forests during their migration across the glacial landscape of Sahul – a palaeocontinent that encompassed modern-day Australia, Tasmania, New Guinea and eastern Indonesia – as part of the extensive migration out of Africa.

“As natural habitats adapted to these controlled burnings, we see the expansion of fire-adapted species such as Eucalyptus, primarily on the wetter, eastern side of the Bass Strait islands,” said Adeleye.

Such burning practices are still used today by Aboriginal communities in Australia, including for landscape management and cultural activities. However, the use of this type of burning, known as cultural burning, to manage severe wildfires in Australia remains contentious. The researchers say understanding this ancient land management practice could help define and restore pre-colonial landscapes.

“These early Tasmanian communities were the island’s first land managers,” said Adeleye. “If we’re going to protect Tasmanian and Australian landscapes for future generations, it’s important that we listen to and learn from Indigenous communities who are calling for a greater role in helping to manage Australian landscapes into the future.”

The research was supported in part by the Australian Research Council.

Reference:
Matthew A. Adeleye et al. ‘Landscape burning facilitated Aboriginal migration into Lutruwita/Tasmania 41,600 years ago.’ Science Advances (2024). DOI: 10.1126/sciadv.adp6579

Some of the first human beings to arrive in Tasmania, over 41,000 years ago, used fire to shape and manage the landscape, about 2,000 years earlier than previously thought.

Emerald Swamp, Tasmania


New long-term collaboration with Suzano begins with a £10 million donation to support conservation and sustainability education and research

Image of a forest

An initial £10 million donation will be used to support education and research in areas including the conservation of biodiversity, enhancing business sustainability, and the restoration of natural habitats in Brazil and beyond.

The agreement will establish the Suzano Scholars Fund, a perpetual endowment at Jesus College to fund Brazilian nationals studying for a postgraduate degree at the University of Cambridge connected to the environment, ecology and conservation, educating the next generation of sustainability experts and leaders.

Funding will also be provided to academics based at the Conservation Research Institute to undertake research projects exploring the interaction between human and natural systems in areas such as biodiversity, climate change, water resource management, and ecosystem restoration.

Suzano, one of the world’s largest producers of bio-based raw materials, based in São Paulo, Brazil, has established a long-term initiative with Jesus College and the University of Cambridge.

This visionary initiative will help to build strong links between the University of Cambridge and Brazil
Professor Bhaskar Vira
Green forests stretch out to the horizon


Presidents' Challenge marks start of 2025 Boat Race season

Oxford Presidents Tom Mackintosh and Annie Anezakis challenge Cambridge Presidents Luca Ferraro and Lucy Havard

This year’s Challenge, held at the iconic Somerset House in London, saw the Oxford and Cambridge University Boat Clubs come together in celebration of one of British sport’s most enduring rivalries. The event traditionally sees the Presidents representing the losing teams of the previous year’s races formally challenge those from the winning teams, marking the renewal of an intense competition which stretches back nearly 200 years.

Those in attendance gathered with anticipation to witness Oxford Presidents Tom Mackintosh and Annie Anezakis challenge Cambridge Presidents Luca Ferraro and Lucy Havard. The pairs faced off before shaking hands on stage in front of the coveted men’s and women’s trophies.

The Umpires were confirmed as Sarah Winckless MBE and Sir Matthew Pinsent, for the Men’s and Women’s races respectively. Winckless becomes the first woman to umpire the Men’s Race on The Championship Course.

The Boat Race will take place on Sunday 13 April, with The 79th Women’s Boat Race to be followed shortly after by The 170th Men’s Boat Race. Two hundred thousand spectators are expected to line the banks of the River Thames to watch the event - which is free to attend and broadcast live on the BBC - while millions more are expected to watch globally.

The Boat Race is made up of six races, and in 2024 Cambridge won five of them. The squads will be more diverse than ever in 2025, with 157 student rowers spanning 18 nationalities, including Nigeria, Sweden, Australia, New Zealand, Switzerland, Germany, Italy, France, Sri Lanka and China. Oxford’s Luisa Fernandez Chirino, should she be selected to face Cambridge, would be the first Mexican woman to compete at The Boat Race.

There will also be six Olympians within the squads. For Cambridge, this includes two-time Olympian Claire Collins, alongside reserve athlete for the 2024 British Olympic team, James Robson. For Oxford, this includes Paris men’s eight bronze medallist Nick Rusher, Paris women’s eight bronze medallist Heidi Long, Tokyo men’s eight gold medallist Tom Mackintosh, as well as Paris Olympian Nicholas Kohl. Meanwhile, Harry Brightmore, Paris gold medallist in the men’s eight, has joined Oxford as an assistant coach.

Asked by host, Olympic champion and four-time Boat Race winner Constantine Louloudis MBE, if this year's race would be "rinse and repeat" for Cambridge, Women's President, Lucy Havard, who is pursuing a PhD in Early Modern History at Gonville & Caius College, said: "Absolutely not - it's never the same, every year it's new people and Boat Race wins don't come easily. Everyone is gunning for it, everyone is putting so much time and effort in."

Luca Ferraro, who is taking an MPhil in History of Art and Architecture at Peterhouse, was asked about how it felt to take on the responsibility of Men's President. "I would be lying if I didn't say it didn't add a certain extra layer... racing an opponent you don't really get to meet at full strength until next year," he said. "You have the odd moment of thinking are we doing the right things, are we going fast enough and no-one feels that quite as keenly as the President, but we are surrounded by such a great team and it is so rewarding to have that extra level of responsibility."

First raced by crews from Oxford and Cambridge University in 1829, The Boat Race is now one of the world’s oldest and most famous amateur sporting events, offering an unrivalled educational experience to the student athletes who take part. The famous Championship Course stretches over 4.25 miles of tidal Thames in West London between Putney and Mortlake.

 

The countdown to the 2025 Boat Race is officially underway, with the annual Presidents’ Challenge ushering in another season of competition between the universities of Oxford and Cambridge.

Oxford Presidents Tom Mackintosh and Annie Anezakis challenge Cambridge Presidents Luca Ferraro and Lucy Havard


Time alone heightens ‘threat alert’ in teenagers – even when connecting online

People in their late teens experience an increased sensitivity to threats after just a few hours left in a room on their own – an effect that endures even if they are interacting online with friends and family.

This is according to latest findings from a cognitive neuroscience experiment conducted at the University of Cambridge, which saw 40 young people aged 16-19 undergo testing before and after several hours alone – both with and without their smartphones.

Many countries have declared an epidemic of loneliness*. The researchers set out to “induce” loneliness in teenagers and study the effects through a series of tests, from a Pavlovian task to electrodes that measure sweat. 

Scientists found that periods of isolation, including those in which participants could use their phones, led to an increased threat response – the sensing of and reacting to potential dangers. This alertness can cause people to feel anxious and uneasy.

The authors of the study say that isolation and loneliness might lead to excessive “threat vigilance”, even when plugged in online, which could negatively impact adolescent mental health over time.

They say it could contribute to the persistent and exaggerated fear responses typical of anxiety disorders, which are on the rise among young people around the world.

While previous studies show isolation leads to anxious behaviour and threat responses in rodents, this is believed to be the first study to demonstrate these effects through experiments involving humans.

The findings are published today in the journal Royal Society Open Science.

“We detected signs of heightened threat vigilance after a few hours of isolation, even when the adolescents had been connected through smartphones and social media,” said Emily Towner, study lead author from Cambridge’s Department of Psychology.

“This alertness to perceived threats might be the same mechanism that leads to the excessive worry and inability to feel safe which characterises anxiety,” said Towner, a Gates Cambridge Scholar.   

“It makes evolutionary sense that being alone increases our vigilance to potential threats. These threat response mechanisms undergo a lot of changes in adolescence, a stage of life marked by increasing independence and social sensitivity.”

"Our experiment suggests that periods of isolation in adolescents might increase their vulnerability to the development of anxiety, even when they are connected virtually.”

Researchers recruited young people from the local area in Cambridge, UK, conducting extensive screening to create a pool of 18 boys and 22 girls who had good social connections and no history of mental health issues.

Participants were given initial tests and questionnaires to establish a “baseline”. These included the Pavlovian threat test, in which they were shown a series of shapes on a screen, one of which was paired with a harsh noise played through headphones, so the shape became associated with a feeling of apprehension.

Electrodes attached to fingers monitored “electrodermal activity” – a physiological marker of stress – throughout this test.**

Each participant returned for two separate stints of around four hours isolated in a room in Cambridge University’s Psychology Department, after which the tests were completed again. There was around a month, on average, between sessions.

All participants underwent two isolation sessions. One was spent with a few puzzles to pass the time, but no connection to the outside world. For the other, participants were allowed smartphones and given wi-fi codes, as well as music and novels. The only major rule in both sessions was they had to stay awake.***

“We set out to replicate behaviour in humans that previous animal studies had found after isolation,” said Towner. “We wanted to know about the experience of loneliness, and you can’t ask animals how lonely they feel.”

Self-reported loneliness increased from baseline after both sessions. It was lower on average after isolation with social media, compared to full isolation.****

However, participants found the threat cue – the shape paired with a jarring sound – more anxiety-inducing and unpleasant after both isolation sessions, with electrodes also measuring elevated stress activity.

On average across the study, threat responses were 70% higher after the isolation sessions compared to the baseline, regardless of whether participants had been interacting digitally.
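As a purely hypothetical illustration of how a figure like this can be derived (the numbers below are invented and are not the study’s data), the percentage change in the group’s mean skin-conductance response to the threat cue might be computed as follows:

# Hypothetical illustration: percent change in mean skin-conductance
# response (SCR) to the threat cue, relative to baseline. Values are invented.
from statistics import mean

baseline_scrs = [0.21, 0.18, 0.25, 0.20]        # per-participant SCRs at baseline (made up)
post_isolation_scrs = [0.36, 0.30, 0.43, 0.34]  # same participants after isolation (made up)

def percent_change(before: list[float], after: list[float]) -> float:
    """Percent change in the group mean, relative to the 'before' mean."""
    b, a = mean(before), mean(after)
    return 100.0 * (a - b) / b

print(f"Threat response change: {percent_change(baseline_scrs, post_isolation_scrs):.0f}%")

With these invented values the script prints a change of about 70%, matching the scale of the effect reported above.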

“Although virtual social interactions helped our participants feel less lonely compared to total isolation, their heightened threat response remained,” said Towner.

Previous studies have found a link between chronic loneliness and alertness to threats. The latest findings support the idea that social isolation may directly contribute to heightened fear responses, say researchers.  

Dr Livia Tomova, co-senior author and lecturer in Psychology at Cardiff University, who conducted the work while at Cambridge, added: “Loneliness among adolescents around the world has nearly doubled in recent years. The need for social interaction is especially intense during adolescence, but it is not clear whether online socialising can fulfil this need.

“This study has shown that digital interactions might not mitigate some of the deep-rooted effects that isolation appears to have on teenagers.”

Scientists say the findings might shed light on the link between loneliness and mental health conditions such as anxiety disorders, which are on the rise in young people.

Notes

*For example, in 2023 the U.S. Surgeon General declared an epidemic of loneliness and isolation.

**Electrodes placed on the fingers record small deflections in sweat and subsequent changes in electrical conductivity of the skin (electrodermal activity). Electrodermal activity is used to detect stress levels and increases with emotional or physical arousal.

***The baseline tests were always taken first. The order of the two isolation sessions was randomly allocated. For sessions with digital interactions allowed, most participants used social media (35 out of 40), with texting being the most common form of interaction (37 out of 40). Other popular platforms included Snapchat, Instagram, and WhatsApp. Participants mainly connected virtually with friends (38), followed by family (19), romantic partners (13), and acquaintances (4).

**** Average self-reported loneliness more than doubled after the isolation session with social media compared to baseline and nearly tripled after the complete isolation session compared to baseline.


Gender inequality ingrained in global climate negotiations, say researchers

Surviving the flood at Ahoada in Rivers state Nigeria

In an article published today in Lancet Planetary Health, a team of researchers – including several from the University of Cambridge – argue that much more needs to be done to mitigate the impacts of climate change on women, girls and gender-diverse individuals.

Focusing specifically on the intersection between climate change, gender, and human health, the researchers call on countries to work harder to ensure there is gender equity within their delegations to climate conferences and to ensure climate strategies identify gender-specific risks and vulnerabilities and address their root causes.

As the world prepares for COP29, concerns about gender representation and equality have reignited following the initial appointment of 28 men and no women to the COP29 organising committee in January 2024.

The effects of climate change – from heavy rains, rising temperatures, storms and floods through to sea level rises and droughts – exacerbate systemic inequalities and disproportionately affect marginalised populations, particularly those living in low-income areas.

While the specific situation differs depending on where people live and on their social background (such as class, race, ability, sexuality or age), women, girls, and gender minorities are often at greater risk from the impacts of climate change. For example, in many countries, women are less likely to own land and resources to protect them in post-disaster situations, and have less control over income and less access to information, resulting in increased vulnerability to acute and long-term climate change impacts.

They are also particularly at risk from climate-related threats to their health, say the researchers. For example, studies have linked high temperatures to adverse birth outcomes such as spontaneous preterm births, pre-eclampsia and birth defects. Extreme events, which are expected to become more likely and intense due to climate change, also take a severe toll on women's social, physical, and mental well-being. Numerous studies highlight that gender-based violence is reported to increase during or after extreme events, often due to factors related to economic instability, food insecurity, disrupted infrastructure and mental stress.

Dr Kim Robin van Daalen, a former Gates Cambridge Scholar at the University of Cambridge, and researcher at the Barcelona Supercomputing Center (BSC), said: “Given how disproportionately climate change affects women, girls and gender minorities – a situation that is only likely to get worse – we need to ensure that their voices are heard and meaningfully included in discussions of how we respond to this urgent climate crisis. This is not currently happening at anywhere near the level it needs to.”

The team summarised the inclusion of gender, health and their intersection in key decisions and initiatives under the United Nations Framework Convention on Climate Change (UNFCCC), and analysed gender representation in Party and Observer State delegations at COPs between 1995 and 2023. Progress has been slow, they say.

They highlight how previous scholars have consistently noted that emphasis remains mainly on achieving a gender ‘number-based balance’ in climate governance, over exploring gender-specific risks and vulnerabilities and addressing their root causes. They also discuss how there remains limited recognition of the role climate change has in worsening gendered impacts on health, including gender-based violence and the lack of safeguarding reproductive health in the face of climate change.

Although the situation is slowly improving, at COP28, almost three-quarters (73%) of Party delegations were still majority men, and only just over one in six (16%) showed gender parity (that is, 45-55% women). In fact, gender parity has only been achieved in the ‘Western European and Other’ UN grouping (which also includes North America, Australia and New Zealand). Based on current trends, several countries - particularly those in the Asia-Pacific and Africa regions - are expected to take at least a decade from COP28 before reaching gender parity in their delegations.

Dr Ramit Debnath, former Gates Cambridge Scholar and now an Assistant Professor at Cambridge, said: “The urgency of climate action, as well as the slow understanding of climate, gender, and health connections, is cause for concern. Institutions like the UNFCCC must recognize these disparities, design appropriate methods to improve gender parity in climate governance, and keep these representation gaps from growing into societal and health injustices.”

Beyond ensuring that their voices are heard, more equitable inclusion of women has consistently been suggested to transform policymaking across political and social systems, including the generation of policies that better represent women’s interests. Recent analyses of 49 European countries revealed that greater women’s political representation correlates with reduced inequalities in self-reported health, lower geographical inequalities in infant mortality and fewer disability-adjusted life-years lost across genders.

Similar positive findings have been reported related to environmental policies, with women’s representation in national parliaments being associated with increased ratification of environmental treaties and more stringent climate change policies. For example, women legislators in the European parliament and US House of Representatives have been found to be more inclined to support environmental legislation than men.

Dr Ronita Bardhan, Associate Professor at the University of Cambridge, said: "Achieving equitable gender representation in climate action is not just about fairness - it's a strategic necessity with significant co-benefits. We can shape climate policies and infrastructure that address a broader spectrum of societal needs, leading to more inclusive solutions enhancing public health, social equity, and environmental resilience."

While the researchers’ analyses focused on achieving gender balance, studies on women’s involvement in climate governance suggest that increased representation does not by itself always lead to meaningful policy changes. Even when formally included, women’s active participation in male-dominated institutions is often constrained by existing social and cultural norms, implicit biases and structural barriers.

Dr van Daalen added: “If we’re to meaningfully incorporate gender into climate policy and practice, we need to understand the risks and vulnerabilities that are gender-specific and look at how we can address them and their root causes at all phases of programme and policy development.

“But we also need to resist reducing women to a single, homogenous group, which risks deepening existing inequalities and overlooks opportunities to address the needs of all individuals. It is crucial to recognise the diversity of women and their embodiment of multiple, intersecting identities that shape their climate experiences as well as their mitigation and adaptation needs.”

The team also highlights that gender-diverse people face unique health and climate-related risks due to their increased vulnerability, stigma, and discrimination. For example, during and after extreme events, transgender people in the United States report being threatened or prohibited access to shelters. Similarly, in the Philippines, Indonesia, and Samoa, gender-diverse individuals often face discrimination, mockery, and exclusion from evacuation centres or access to food. Yet, say the researchers, there are major gaps in knowledge about the health implications of climate change for such groups.

Find out how Cambridge's pioneering research in climate and nature is regenerating nature, rewiring energy, rethinking transport and redefining economics - forging a future for our planet.

Reference
Van Daalen, KR et al. Bridging the gender, climate, and health gap: the road to COP29. Lancet Planetary Health; 11 Nov 2024; DOI: 10.1016/S2542-5196(24)00270-5

Climate governance is dominated by men, yet the health impacts of the climate crisis often affect women, girls, and gender-diverse people disproportionately, argue researchers ahead of the upcoming 29th United Nations Climate Summit (COP29) in Azerbaijan.

Given how disproportionately climate change affects women, girls and gender minorities, we need to ensure that their voices are heard and meaningfully included in discussions of how we respond to this urgent climate crisis
Kim van Daalen
Surviving the flood at Ahoada in Rivers state Nigeria


Cambridge win golds at legendary rowing event

The Cambridge Men’s ‘A’ boat en route to victory

Cambridge University Boat Club (CUBC) student and alumni crews took part in the race, which is held on the Charles River over a 3-mile course comprising several bends that require skilled coxing. Cambridge walked away with gold in the Men’s Championship Eights, Women’s Master’s Double, Men’s Alumni Fours, and Men’s Senior Master’s Fours against top international crews, with an impressive set of results across the board.

Read the full story on the CUBC website.

Cambridge have claimed four golds at one of the biggest events in the global rowing calendar, the Head of the Charles Regatta in Boston, Massachusetts, USA.

The Cambridge Men’s ‘A’ boat en route to victory


Planting trees in the Arctic could make global warming worse, not better, say scientists

Emerald Lake, Yukon

But, writing in the journal Nature Geoscience, an international group of scientists, led by the University of Cambridge and Aarhus University, argue that tree planting at high latitudes will accelerate, rather than decelerate, global warming.

As the climate continues to warm, trees can be planted further and further north, and large-scale tree-planting projects in the Arctic have been championed by governments and corporations as a way to mitigate the worst effects of climate change.

However, when trees are planted in the wrong places - such as normally treeless tundra and mires, as well as large areas of the boreal forest with relatively open tree canopies - they can make global warming worse.

According to lead author Assistant Professor Jeppe Kristensen from Aarhus University in Denmark, the unique characteristics of Arctic and sub-Arctic ecosystems make them poorly suited for tree planting for climate mitigation.

“Soils in the Arctic store more carbon than all vegetation on Earth,” said Kristensen. “These soils are vulnerable to disturbances, such as cultivation for forestry or agriculture, but also the penetration of tree roots. The semi-continuous daylight during the spring and early summer, when snow is still on the ground, also makes the energy balance in this region extremely sensitive to surface darkening, since green and brown trees will soak up more heat from the sun than white snow.” 

In addition, the regions surrounding the North Pole in North America, Asia and Scandinavia are prone to natural disturbances - such as wildfires and droughts - that kill off vegetation. Climate change makes these disturbances both more frequent and more severe.

“This is a risky place to be a tree, particularly as part of a homogeneous plantation that is more vulnerable to such disturbances,” said Kristensen. “The carbon stored in these trees risks fuelling disturbances and getting released back to the atmosphere within a few decades.”

The researchers say that tree planting at high latitudes is a prime example of a climate solution with a desired effect in one context but the opposite effect in another.

“The climate debate is very carbon-focused because the main way humans have modified the Earth’s climate in the last century is through emitting greenhouse gases from burning fossil fuels,” said Kristensen. “But at the core, climate change is the result of how much solar energy entering the atmosphere stays, and how much leaves again – Earth’s so-called energy balance.”

Greenhouse gases are one important determinant of how much heat can escape our planet’s atmosphere. However, the researchers say that at high latitudes, how much sunlight is reflected back into space, without being converted into heat (known as the albedo effect), is more important than carbon storage for the total energy balance.
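A back-of-the-envelope sketch (using illustrative, commonly quoted albedo values rather than figures from the paper) shows why this matters. The shortwave energy a surface absorbs is roughly

$$ F_{\text{abs}} \approx (1-\alpha)\, S_{\downarrow}, $$

where S↓ is the incoming solar flux and α the albedo. Taking α ≈ 0.8 for fresh snow and α ≈ 0.1 for a dark conifer canopy, a forested surface absorbs on the order of (1 − 0.1)/(1 − 0.8) = 4.5 times as much of the spring sunlight as the snow it shades, a warming effect that, the researchers argue, can outweigh the cooling from the carbon the trees store.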

The researchers are calling for a more holistic view of ecosystems to identify truly meaningful nature-based solutions that do not compromise the overall goal: slowing down climate change.

“A holistic approach is not just a richer way of looking at the climate effects of nature-based solutions, but it’s imperative if we’re going to make a difference in the real world,” said senior author Professor Marc Macias-Fauria, from the University of Cambridge’s Scott Polar Research Institute.

The researchers recognise that there can be other reasons for planting trees, such as timber self-sufficiency, but say these do not bring any bonus for climate mitigation.

“Forestry in the far North should be viewed like any other production system and compensate for its negative impact on the climate and biodiversity,” said Macias-Fauria. “You can’t have your cake and eat it, and you can’t deceive the Earth. By selling northern afforestation as a climate solution, we’re only fooling ourselves.”

So how can we moderate global warming at high latitudes? The researchers suggest that working with local communities to support sustainable populations of large herbivores, such as caribou, could be a more viable nature-based solution to climate change in Arctic and subarctic regions than planting millions of trees. 

“There is ample evidence that large herbivores affect plant communities and snow conditions in ways that result in net cooling,” said Macias-Fauria. “This happens both directly, by keeping tundra landscapes open, and indirectly, through the effects of herbivore winter foraging, where they modify the snow and decrease its insulation capacity, reducing soil temperatures and permafrost thaw.”

The researchers say it’s vital to consider biodiversity and the livelihoods of local communities in the pursuit of nature-based climate solutions.

“Large herbivores can reduce climate-driven biodiversity loss in Arctic ecosystems and remain a fundamental food resource for local communities,” said Macias-Fauria. “Biodiversity and local communities are not an added benefit to nature-based solutions: they are fundamental. Any nature-based solutions must be led by the communities who live at the front line of climate change.”

More about this story

Reference:
Jeppe Å Kristensen et al. ‘Tree planting is no climate solution at northern high latitudes.’ Nature Geoscience (2024). DOI: 10.1038/s41561-024-01573-4

Explore more discoveries, innovations and research on climate and nature at the University of Cambridge: www.cam.ac.uk/climate-and-nature

Tree planting has been widely touted as a cost-effective way of reducing global warming, due to trees’ ability to store large quantities of carbon from the atmosphere.

Emerald Lake, Yukon


Greater Manchester and Cambridge strengthen Innovation Partnership to drive economic growth

The Vice-Chancellor with the Greater Manchester Mayoral delegation.

On 5 November 2024, Greater Manchester’s Mayor Andy Burnham visited Cambridge to celebrate and further cement a groundbreaking partnership between the two cities' innovation ecosystems. The collaboration, which was officially launched in 2023, is aimed at leveraging the combined strengths of Manchester and Cambridge to fuel the growth of start-ups, attract investment, and foster inclusive economic development across the UK.

The visit marked an important step forward in this trailblazing collaboration, which is the first of its kind in the UK. Leading academic, business and civil figures from both cities were in attendance including: Dr Nik Johnson, Mayor of Cambridgeshire and Peterborough; Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge; Professor Duncan Ivison, Vice-Chancellor of the University of Manchester; Professor Lou Cordwell, Professor of Innovation, University of Manchester; Dr Diarmuid O’Brien, Pro-Vice-Chancellor for Innovation, University of Cambridge and Dr Kathryn Chapman, Executive Director, Innovate Cambridge.

The delegation also included representatives from the business community, including AstraZeneca, which is deepening its support for entrepreneurial ventures through mentorship and collaboration.

Strengthening connections between two powerhouses

The day began with a tour of Cambridge’s West Innovation District, an area known for its cutting-edge facilities and academic institutions. The tour was led by Dr Diarmuid O’Brien, who highlighted the district’s role in driving forward innovation in areas ranging from aerospace to zero-carbon technologies. One key stop was the world-famous Whittle Laboratory, renowned for its work on decarbonising aviation, and the new Cavendish Laboratory, which hosts the University’s Physics department and supports the creation of innovative start-ups.

At the Cambridge Graphene Centre, Professor Andrea Ferrari, Director of the Centre, was joined by Manchester University colleagues to welcome Mayor Burnham and Dr Johnson for an in-depth session focused on graphene research and the ongoing collaboration with Manchester’s National Graphene Institute.

Discussions centred on the commercialisation of cutting-edge research from both institutions and the potential for scaling these innovations in both cities. Spinout companies Paragraf and Versarien spoke about the opportunities the partnership could unlock for future collaboration, talent exchange, and investment.

AstraZeneca expands mentorship programme to Manchester

Another highlight of the day was a visit to AstraZeneca’s Discovery Centre (DISC) in the Cambridge Biomedical Campus, where the company announced that its ‘AstraZeneca Exchange’ mentorship programme would be expanded to support entrepreneurs and start-ups in Greater Manchester. The programme, which is already active in Cambridge, connects start-ups with AstraZeneca’s network of scientific and business experts, providing invaluable support to early-stage ventures.

Inclusive innovation and regional growth

A key theme of the visit was ensuring that innovation-driven growth benefits all communities. This commitment to inclusive innovation was explored during a roundtable discussion, which included representatives from the Bennett Institute for Public Policy in Cambridge and The Productivity Institute in Manchester. The conversation centred around how innovation can be made more accessible to economically lagging regions and marginalised groups, and how to ensure that the fruits of innovation are equitably distributed.

Professor Andy Westwood, Policy Director at The Productivity Institute, and Professor Mike Kenny, Director of the Bennett Institute for Public Policy, both addressed the group, with Professor Kenny presenting a newly launched report, ‘Townscapes: Making Innovation More Inclusive’.

The report is the product of collaboration between the two Institutes and explores how innovation can address regional disparities. “An inclusive approach to innovation focuses not only on the process and outcomes of innovation, but also considers who is involved in it, what are the social and economic conditions that foster it, and perhaps most importantly, keeps in mind which places and communities benefit from innovation,” said Professor Kenny.

This commitment to inclusive innovation was further reflected in the opening of The Glasshouse, a new facility from Innovate Cambridge dedicated to supporting the next generation of entrepreneurs from diverse backgrounds. The Glasshouse will serve as a hub and incubator for new ideas and technologies, providing mentorship, resources, and networking opportunities for start-ups.

A partnership for the future

As the day concluded, Mayor Burnham reflected on the immense potential of the partnership between Greater Manchester and Cambridge. “Greater Manchester and Cambridge are two world-renowned centres of innovation. This partnership is breaking new ground, linking the North of England with the Golden triangle to drive regional and national economic growth. We also share an ambition for growth that benefits everyone, with more people and businesses able to access the opportunities created by innovation.

“Our two places have distinct identities and unique strengths, but we also have a lot in common – world-leading universities and dynamic, fast-growing economies. By working together, we can be greater than the sum of our parts.”

With both cities continuing to push the boundaries of scientific and technological advancements, the partnership between Greater Manchester and Cambridge is poised to play a pivotal role in shaping the UK’s future innovation landscape. The visit has underscored the shared commitment to advancing regional growth, fostering collaboration, and ensuring that the benefits of innovation are felt by all.

The Vice-Chancellor of Cambridge University, Professor Deborah Prentice said: "This collaboration between our two cities and universities is a testament to our shared ambition and the immense opportunities ahead. Over the past five years, we've seen thousands of co-publications as well as deep collaboration in graphene and materials research, showcasing the power of our joint efforts. This partnership isn’t just about what we’ve achieved; it’s about what we’re building—a dynamic platform to connect and strengthen our innovation ecosystems for the future."

This Cambridge x Manchester collaboration promises to be more than just a stepping stone—it's a foundation for the future of innovation in the UK.

Visit from Manchester Mayor signals a new era of collaboration between two UK innovation hubs with a focus on boosting regional economies and fostering inclusive growth.

This partnership is breaking new ground, linking the North of England with the Golden triangle to drive regional and national economic growth.
Andy Burnham, Mayor of Greater Manchester


A radical economic transformation is the only way to save nature and ourselves

Photograph of sunlight poking through clouds onto mountains

In this article, the Cambridge Institute for Sustainability Leadership's Chief Innovation Officer, James Cole, looks back at what happened at COP16 and asks what comes next.

After two weeks of negotiations in Cali, Colombia, the COP16 biodiversity summit was suspended last week with no overall agreement on a path forward on ‘resource mobilisation’.


Glaucoma drug shows promise against neurodegenerative diseases, animal studies suggest

Zebrafish

Researchers in the UK Dementia Research Institute at the University of Cambridge screened more than 1,400 clinically approved drug compounds using zebrafish genetically engineered to mimic so-called tauopathies. They discovered that drugs known as carbonic anhydrase inhibitors – of which the glaucoma drug methazolamide is one – clear tau build-up and reduce signs of the disease in zebrafish and mice carrying the mutant forms of tau that cause human dementias.

Tauopathies are neurodegenerative diseases characterised by the build-up in the brain of tau protein ‘aggregates’ within nerve cells. They include forms of dementia such as Pick's disease and progressive supranuclear palsy, where tau is believed to be the primary driver of disease, as well as Alzheimer’s disease and chronic traumatic encephalopathy (neurodegeneration caused by repeated head trauma, as has been reported in football and rugby players), where tau build-up is one consequence of the disease but still results in degeneration of brain tissue.

There has been little progress in finding effective drugs to treat these conditions. One option is to repurpose existing drugs, but drug screening – where compounds are tested against disease models – usually takes place in cell cultures, which do not capture many of the characteristics of tau build-up in a living organism.

To work around this, the Cambridge team turned to zebrafish models they had previously developed. Zebrafish reach maturity and are able to breed within two to three months, and they produce large numbers of offspring. Because many of the genes responsible for human diseases have equivalents in the zebrafish, genetic manipulation makes it possible to mimic human diseases in the fish.

In a study published today in Nature Chemical Biology, Professor David Rubinsztein, Dr Angeleen Fleming and colleagues modelled tauopathy in zebrafish and screened 1,437 drug compounds. Each of these compounds has been clinically approved for other diseases.
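
As a rough illustration of how such a repurposing screen works in principle – the paper describes the actual assay and analysis – the hypothetical sketch below applies each of 1,437 compounds to a tau readout and flags those that substantially reduce the signal relative to untreated controls. All compound names, readouts and thresholds are invented.

```python
# Illustrative sketch only: a toy version of a repurposing screen in which each
# clinically approved compound is applied to tau-expressing zebrafish larvae and
# compared against untreated controls. Names, readouts and thresholds are invented.
import random

random.seed(1)
compounds = [f"compound_{i:04d}" for i in range(1, 1438)]  # 1,437 approved drugs

CONTROL_TAU = 100.0      # placeholder tau signal in untreated larvae
HIT_THRESHOLD = 0.7      # flag compounds that leave <= 70% of the control signal

hits = []
for name in compounds:
    treated_tau = random.uniform(40.0, 110.0)   # placeholder assay readout
    if treated_tau <= HIT_THRESHOLD * CONTROL_TAU:
        hits.append(name)

print(f"{len(hits)} of {len(compounds)} compounds flagged for follow-up")
```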

Dr Ana Lopez Ramirez from the Cambridge Institute for Medical Research, Department of Physiology, Development and Neuroscience and the UK Dementia Research Institute at the University of Cambridge, joint first author, said: “Zebrafish provide a much more effective and realistic way of screening drug compounds than using cell cultures, which function quite differently to living organisms. They also enable us to do so at scale, something that is not feasible or ethical in larger animals such as mice.”

Using this approach, the team showed that inhibiting an enzyme known as carbonic anhydrase – which is important for regulating acidity levels in cells – helped the cell rid itself of the tau protein build-up. It did this by causing the lysosomes – the ‘cell’s incinerators’ – to move to the surface of the cell, where they fused with the cell membrane and ‘spat out’ the tau.

When the team tested methazolamide on mice that had been genetically engineered to carry the P301S human disease-causing mutation in tau, which leads to the progressive accumulation of tau aggregates in the brain, they found that those treated with the drug performed better at memory tasks and showed improved cognitive performance compared with untreated mice.

Analysis of the mouse brains showed that they did indeed have fewer tau aggregates and, consequently, less loss of brain cells than the untreated mice.

Fellow joint author Dr Farah Siddiqi, also from the Cambridge Institute for Medical Research and the UK Dementia Research Institute, said: “We were excited to see in our mouse studies that methazolamide reduces levels of tau in the brain and protects against its further build-up. This confirms what we had shown when screening carbonic anhydrase inhibitors using zebrafish models of tauopathies.”

Professor Rubinsztein from the UK Dementia Research Institute and Cambridge Institute for Medical Research at the University of Cambridge, said: “Methazolamide shows promise as a much-needed drug to help prevent the build-up of dangerous tau proteins in the brain. Although we’ve only looked at its effects in zebrafish and mice, so it is still early days, we at least know about this drug’s safety profile in patients. This will enable us to move to clinical trials much faster than we might normally expect if we were starting from scratch with an unknown drug compound.

“This shows how we can use zebrafish to test whether existing drugs might be repurposed to tackle different diseases, potentially speeding up significantly the drug discovery process.”

The team hopes to test methazolamide on different disease models, including more common diseases characterised by the build-up of aggregate-prone proteins, such as Huntington’s and Parkinson’s diseases.

The research was supported by the UK Dementia Research Institute (through UK DRI Ltd, principally funded through the Medical Research Council), Tau Consortium and Wellcome.

Reference
Lopez, A & Siddiqi, FH et al. Carbonic anhydrase inhibition ameliorates tau toxicity via enhanced tau secretion. Nature Chemical Biology; 31 Oct 2024; DOI: 10.1038/s41589-024-01762-7

 

A drug commonly used to treat glaucoma has been shown in zebrafish and mice to protect against the build-up in the brain of the protein tau, which causes various forms of dementia and is implicated in Alzheimer’s disease.

Zebrafish provide a much more effective and realistic way of screening drug compounds than using cell cultures, which function quite differently to living organisms
Ana Lopez Ramirez
Zebrafish


AI algorithm accurately detects heart disease in dogs

Huxley, a healthy volunteer Havanese, undergoes a physical examination at the Queen's Veterinary School Hospital, Cambridge.

The research team, led by the University of Cambridge, adapted an algorithm originally designed for humans and found it could automatically detect and grade heart murmurs in dogs, based on audio recordings from digital stethoscopes. In tests, the algorithm detected heart murmurs with a sensitivity of 90%, a similar accuracy to expert cardiologists.

Heart murmurs are a key indicator of mitral valve disease, the most common heart condition in adult dogs. Roughly one in 30 dogs seen by a veterinarian has a heart murmur, although the prevalence is higher in small breed dogs and older dogs.

Since mitral valve disease and other heart conditions are so common in dogs, early detection is crucial as timely medication can extend their lives. The technology developed by the Cambridge team could offer an affordable and effective screening tool for primary care veterinarians, and improve quality of life for dogs. The results are reported in the Journal of Veterinary Internal Medicine.

“Heart disease in humans is a huge health issue, but in dogs it’s an even bigger problem,” said first author Dr Andrew McDonald from Cambridge’s Department of Engineering. “Most smaller dog breeds will have heart disease when they get older, but obviously dogs can’t communicate in the same way that humans can, so it’s up to primary care vets to detect heart disease early enough so it can be treated.”

Professor Anurag Agarwal, who led the research, is a specialist in acoustics and bioengineering. “As far as we’re aware, there are no existing databases of heart sounds in dogs, which is why we started out with a database of heart sounds in humans,” he said. “Mammalian hearts are fairly similar, and when things go wrong, they tend to go wrong in similar ways.”

The researchers started with a database of heart sounds from about 1000 human patients and developed a machine learning algorithm to replicate whether a heart murmur had been detected by a cardiologist. They then adapted the algorithm so it could be used with heart sounds from dogs.
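
As a rough, hypothetical illustration of this pretrain-then-adapt idea – not the team’s actual model, features or data – the sketch below trains a simple classifier on placeholder ‘human’ recordings and then continues training on a smaller ‘dog’ set.

```python
# Illustrative sketch only: pre-train a murmur detector on human heart-sound
# features, then continue training on a smaller canine set. The features, model
# and data are placeholders, not the study's actual pipeline.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

X_human = rng.normal(size=(1000, 64))      # ~1,000 human recordings (placeholder features)
y_human = rng.integers(0, 2, size=1000)    # 1 = murmur heard by a cardiologist
X_dog = rng.normal(size=(800, 64))         # ~800 canine recordings
y_dog = rng.integers(0, 2, size=800)

clf = SGDClassifier(loss="log_loss", random_state=0)

# Stage 1: learn murmur vs. no murmur from the human database.
clf.partial_fit(X_human, y_human, classes=np.array([0, 1]))

# Stage 2: adapt the same model with canine heart sounds.
for _ in range(5):
    clf.partial_fit(X_dog, y_dog)

print("Murmurs predicted in the dog set:", int(clf.predict(X_dog).sum()))
```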

The researchers gathered data from almost 800 dogs undergoing routine heart examination at four veterinary specialist centres in the UK. All dogs received a full physical examination and a heart scan (echocardiogram) by a cardiologist to grade any heart murmurs and identify cardiac disease, and their heart sounds were recorded using an electronic stethoscope. The result is the largest dataset of dog heart sounds ever created, by an order of magnitude.

“Mitral valve disease mainly affects smaller dogs, but to test and improve our algorithm, we wanted to get data from dogs of all shapes, sizes and ages,” said co-author Professor Jose Novo Matos from Cambridge’s Department of Veterinary Medicine, a specialist in small animal cardiology. “The more data we have to train it, the more useful our algorithm will be, both for vets and for dog owners.”

The researchers fine-tuned the algorithm so it could both detect and grade heart murmurs based on the audio recordings, and differentiate between murmurs associated with mild disease and those reflecting advanced heart disease that required further treatment.  

“Grading a heart murmur and determining whether the heart disease needs treatment requires a lot of experience, referral to a veterinary cardiologist, and expensive specialised heart scans,” said Novo Matos. “We want to empower general practitioners to detect heart disease and assess its severity to help owners make the best decisions for their dogs.”

Analysis of the algorithm’s performance found that it agreed exactly with the cardiologist’s assessment in over half of cases, and was within a single grade of the cardiologist’s grading in 90% of cases. The researchers say this is a promising result, as there is often significant variability in how different vets grade heart murmurs.
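
For readers curious how those two figures are computed, the toy example below shows exact agreement and within-one-grade agreement on a handful of invented gradings; the numbers are not trial data.

```python
# Illustrative sketch only: 'exact agreement' and 'within one grade' computed
# from paired murmur gradings. The grades below are invented.
import numpy as np

cardiologist = np.array([0, 2, 3, 1, 4, 2, 5, 0, 3, 1])
algorithm    = np.array([0, 3, 3, 1, 2, 2, 5, 1, 4, 1])

exact_agreement = np.mean(algorithm == cardiologist)
within_one_grade = np.mean(np.abs(algorithm - cardiologist) <= 1)

print(f"Exact agreement: {exact_agreement:.0%}")      # 60% here; 'over half' in the study
print(f"Within one grade: {within_one_grade:.0%}")    # 90% here, as in the study
```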

“The grade of heart murmur is a useful differentiator for determining next steps and treatments, and we’ve automated that process,” said McDonald. “For vets and nurses without as much stethoscope skill, and even those who are incredibly skilled with a stethoscope, we believe this algorithm could be a highly valuable tool.”

In humans with valve disease, the only treatment is surgery, but for dogs, effective medication is available. “Knowing when to medicate is so important, in order to give dogs the best quality of life possible for as long as possible,” said Agarwal. “We want to empower vets to help make those decisions.”

“So many people talk about AI as a threat to jobs, but for me, I see it as a tool that will make me a better cardiologist,” said Novo Matos. “We can’t perform heart scans on every dog in this country – we just don’t have enough time or specialists to screen every dog with a murmur. But tools like these could help vets and owners, so we can quickly identify those dogs who are most in need of treatment.”

The research was supported in part by the Kennel Club Charitable Trust, the Medical Research Council, and Emmanuel College Cambridge.

Reference:
Andrew McDonald et al. ‘A machine learning algorithm to grade canine heart murmurs and stage preclinical myxomatous mitral valve disease.’ Journal of Veterinary Internal Medicine (2024). DOI: 10.1111/jvim.17224

Researchers have developed a machine learning algorithm to accurately detect heart murmurs in dogs, one of the main indicators of cardiac disease, which affects a large proportion of some smaller breeds such as King Charles Spaniels.

Huxley, a healthy volunteer Havanese, undergoes a physical examination at the Queen's Veterinary School Hospital, Cambridge.



Magnetic field applied to both sides of brain shows rapid improvement for depression

Brain image

The treatment – known as repetitive transcranial magnetic stimulation (TMS) – involves placing an electromagnetic coil against the scalp to relay a high-frequency magnetic field to the brain.

Around one in 20 adults is estimated to suffer from depression. Although treatments exist, such as anti-depressant medication and cognitive behavioural therapy (‘talking therapy’), they are ineffective for just under one in three patients.

One of the key characteristics of depression is under-activity of some regions (such as the dorsolateral prefrontal cortex) and over-activity of others (such as the orbitofrontal cortex (OFC)).

Repetitive transcranial magnetic stimulation applied to the left side of the dorsolateral prefrontal cortex (an area towards the upper front of the brain) is approved for treatment of depression in the UK by NICE and in the US by the FDA. It has previously been shown to lead to considerable improvements among patients after a course of 20 sessions, but because the sessions usually take place over 20-30 days, the treatment is not ideal for everyone, particularly in acute cases or where a person is suicidal.

In research published in Psychological Medicine, scientists from Cambridge, UK, and Guiyang, China, tested how effective an accelerated form of TMS is. In this approach, the treatment is given over 20 sessions, but with four sessions per day over a period of five consecutive days.

The researchers also tested a ‘dual’ approach, whereby a magnetic field was additionally applied to the right-hand side of the OFC (which sits below the dorsolateral prefrontal cortex).

Seventy-five patients were recruited to the trial from the Second People’s Hospital of Guizhou Province in China. The severity of their depression was measured on a scale known as the Hamilton Rating Scale for Depression.

Participants were split randomly into three groups: a ‘dual’ group receiving TMS applied first to the right- and then to the left-hand side of the brain; a ‘single’ group receiving sham TMS to the right side followed by active TMS applied to the left side; and a control group receiving a sham treatment to both sides. Each session lasted 22 minutes in total.

There was a significant improvement in scores assessed immediately after the final treatment in the dual treatment group compared to the other two groups. When the researchers looked for clinically-relevant responses – that is, where an individual’s score fell by at least 50% – they found that almost half (48%) of the patients in the dual treatment group saw such a reduction, compared to just under one in five (18%) in the single treatment group and fewer than one in 20 (4%) in the control group.

Four weeks later, around six in 10 participants in both the dual and single treatment groups (61% and 59% respectively) showed clinically relevant responses, compared to just over one in five (22%) in the control group.
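
As a worked example of the trial’s response criterion – with invented scores rather than patient data – the sketch below classifies individuals by whether their Hamilton score fell by at least half.

```python
# Worked example of the response definition used in the trial: a 'clinically
# relevant response' means the depression score fell by at least 50% from
# baseline. The scores below are invented for illustration.
def clinically_relevant_response(baseline: float, follow_up: float) -> bool:
    """True if the score dropped by at least half from baseline."""
    return (baseline - follow_up) / baseline >= 0.5

example_patients = [
    ("A", 24, 10),   # ~58% reduction -> response
    ("B", 20, 12),   # 40% reduction  -> no response
    ("C", 28, 14),   # exactly 50%    -> response
]

for name, before, after in example_patients:
    status = "response" if clinically_relevant_response(before, after) else "no response"
    print(f"Patient {name}: {before} -> {after} ({status})")
```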

Professor Valerie Voon from the Department of Psychiatry at the University of Cambridge, who led the UK side of the study, said: “Our accelerated approach means we can do all of the sessions in just five days, rapidly reducing an individual’s symptoms of depression. This means it could be particularly useful in severe cases of depression, including when someone is experiencing suicidal thoughts. It may also help people be discharged from hospital more rapidly or even avoid admission in the first place.

“The treatment works faster because, by targeting two areas of the brain implicated in depression, we’re effectively correcting imbalances in two important processes, getting brain regions ‘talking’ to each other correctly.”

The treatment was most effective in those patients who at the start of the trial showed greater connectivity between the OFC and the thalamus (an area in the middle of the brain responsible for, among other things, regulation of consciousness, sleep, and alertness). The OFC is important for helping us make decisions, particularly in choosing rewards and avoiding punishment. Its over-activity in depression, particularly in relation to its role in anti-reward or punishment, might help explain why people with depression show a bias towards negative expectations and ruminations.

Dr Yanping Shu from the Guizhou Mental Health Centre, Guiyang, China, said: “This new treatment has demonstrated a more pronounced – and faster – improvement in response rates for patients with major depressive disorder. It represents a significant step forward in improving outcomes, enabling rapid discharge from hospitals for individuals with treatment-resistant depression, and we are hopeful it will lead to new possibilities in mental health care.”

Dr Hailun Cui from Fudan University, a PhD student in Professor Voon’s lab at the time of the study, added: “The management of treatment-resistant depression remains one of the most challenging areas in mental health care. These patients often fail to respond to standard treatments, including medication and psychotherapy, leaving them in a prolonged state of severe distress, functional impairment, and increased risk of suicide.

“This new TMS approach offers a beacon of hope in this difficult landscape. Patients frequently reported experiencing ‘lighter and brighter’ feelings as early as the second day of treatment. The rapid improvements, coupled with a higher response rate that could benefit a broader depressed population, mark a significant breakthrough in the field.”

Just under half (48%) of participants in the dual treatment group reported local pain where the treatment was applied, compared with just under one in 10 (9%) of participants in the single treatment group. Despite this, there were no dropouts.

For some individuals, this treatment may be sufficient, but others may need ‘maintenance therapy’, with an additional day of sessions if their symptoms appear to be worsening over time. It may also be possible to re-administer standard therapy, as patients can become more able to engage in psychotherapy after treatment. Other options include transcranial direct current stimulation, a non-invasive form of stimulation using weak electrical impulses that can be delivered at home.

The researchers are now exploring exactly which part of the orbitofrontal cortex is most effective to target and for which types of depression.

The research was supported in the UK by the Medical Research Council and by the National Institute for Health and Care Research Cambridge Biomedical Research Centre.*

Reference
Cui, H, Ding, H & Hu, L et al. A novel dual-site OFC-dlPFC accelerated repetitive transcranial magnetic stimulation for depression: a pilot randomized controlled study. Psychological Medicine; 23 Oct 2024; DOI: 10.1017/S0033291724002289

*A full list of funders is available in the journal paper.

A type of therapy that involves applying a magnetic field to both sides of the brain has been shown to be effective at rapidly treating depression in patients for whom standard treatments have been ineffective.

Our accelerated approach means we can do all of the sessions in just five days, rapidly reducing an individual’s symptoms of depression
Valerie Voon
Digital image of a brain


Professor David Rowitch elected to US National Academy of Medicine

Professor David Rowitch

Election to the Academy is considered one of the highest honours in the fields of health and medicine and recognises individuals who have demonstrated outstanding professional achievement and commitment to service.

“It is a great honour to have been elected to the National Academy of Medicine,”  said Professor Rowitch.

Professor Rowitch obtained his PhD from the University of Cambridge. His research in the field of developmental neurobiology has focused on the glial cells that comprise the ‘white matter’ of the human brain. It has furthered understanding of human neonatal brain development as well as of white matter injury in premature infants, multiple sclerosis and leukodystrophy. Amongst numerous awards, he was elected a Fellow of the Academy of Medical Sciences in 2018 and a Fellow of the Royal Society in 2021.

Professor Rowitch’s current research focuses on using functional genomic technologies to better diagnose and treat rare neurogenetic disorders in children. He is academic lead for the new Cambridge Children’s Hospital, which is developing integrated paediatric physical and mental healthcare and research within the NHS and the University of Cambridge.

NAM President Victor J. Dzau said: “This class of new members represents the most exceptional researchers and leaders in health and medicine, who have made significant breakthroughs, led the response to major public health challenges, and advanced health equity.

“Their expertise will be necessary to supporting NAM’s work to address the pressing health and scientific challenges we face today. It is my privilege to welcome these esteemed individuals to the National Academy of Medicine.”

Professor Rowitch is one of 90 regular members and 10 international members announced during the Academy’s annual meeting. New members are elected by current members through a process that recognises individuals who have made major contributions to the advancement of the medical sciences, health care, and public health. 

Professor David Rowitch, Head of the Department of Paediatrics at the University of Cambridge, has been elected to the prestigious National Academy of Medicine in the USA.

Professor David Rowitch


Airbnb rentals linked to increased crime rates in London neighbourhoods

London townhouses in Greenwich

Latest research has revealed a ‘positive association’ between the number of properties listed as Airbnb rentals and police-reported robberies and violent crimes in thousands of London neighbourhoods between 2015 and 2018.

In fact, the study led by the University of Cambridge suggests that a 10% increase in active Airbnb rentals in the city would correspond to an additional 1,000 robberies per year across London.*

Urban sociologists say the rapid pace at which crime rises in conjunction with new rentals suggests that the link owes more to opportunities for crime than to a loss of cohesion within communities – although both are likely contributing factors.

“We tested for the most plausible alternative explanations, from changes in police patrols to tourist hotspots and even football matches,” said Dr Charles Lanfear from Cambridge’s Institute of Criminology, co-author of the study published today in the journal Criminology.

“Nothing changed the core finding that Airbnb rentals are related to higher crime rates in London neighbourhoods.”

“While Airbnb offers benefits to tourists and hosts in terms of ease and financial reward, there may be social consequences to turning large swathes of city neighbourhoods into hotels with little regulation,” Lanfear said.

Founded in 2008, Airbnb is a giant of the digital economy, with more than 5 million property hosts now active on the platform in some 100,000 cities worldwide.

However, concerns that Airbnb is contributing to unaffordable housing costs have led to a backlash among residents of cities such as Barcelona, and to calls for greater regulation.

London is one of the most popular Airbnb markets in the world. An estimated 4.5 million guests stayed in a London Airbnb during the period covered by the study.

Lanfear and his University of Pennsylvania co-author Professor David Kirk used masses of data from AirDNA: a site that scrapes Airbnb to provide figures, trends and approximate geolocations for the short-term letting market.

They mapped AirDNA data from 13 calendar quarters (January 2015 to March 2018) onto ‘Lower Layer Super Output Areas’, or LSOAs.

These are designated areas of a few streets containing around two thousand residents, used primarily for UK census purposes. There are 4,835 LSOAs in London, and all were included in the study.

Crime statistics from the UK Home Office and Greater London Authority for six categories – robbery, burglary, theft, anti-social behaviour, any violence, and bodily harm – were then mapped onto the LSOAs populated with AirDNA data.
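
As a hypothetical sketch of this data-assembly step – the LSOA codes, column names and values below are invented stand-ins for the AirDNA and Home Office data – the listing counts and crime counts can be merged into a single LSOA-by-quarter table along the following lines.

```python
# Illustrative sketch only: combine quarterly listing counts and crime counts
# into one LSOA-by-quarter panel. Rows and column names are invented.
import pandas as pd

airbnb = pd.DataFrame({
    "lsoa_code": ["E01000001", "E01000001", "E01000002"],
    "quarter":   ["2015Q1", "2015Q2", "2015Q1"],
    "active_listings": [12, 15, 3],
})
crime = pd.DataFrame({
    "lsoa_code": ["E01000001", "E01000001", "E01000002"],
    "quarter":   ["2015Q1", "2015Q2", "2015Q1"],
    "robbery":   [4, 6, 1],
    "burglary":  [7, 8, 2],
})

# One row per LSOA per quarter, mirroring the study's panel of
# 4,835 LSOAs x 13 calendar quarters.
panel = crime.merge(airbnb, on=["lsoa_code", "quarter"], how="left")
panel["active_listings"] = panel["active_listings"].fillna(0)
print(panel)
```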

The researchers analysed all forms of Airbnb lets, but found the link between active Airbnbs and crime is primarily down to entire properties for rent, rather than spare or shared rooms.

The association between active Airbnb rentals and crime was most significant for robbery and burglary, followed by theft and any violence. No link was found for anti-social behaviour and bodily harm.

On average across London, an additional Airbnb property was associated with a 2% increase in the robbery rate within an LSOA. This association was 1% for thefts, 0.9% for burglaries, and 0.5% for violence.

“While the potential criminogenic effect for each Airbnb rental is small, the accumulative effect of dozens in a neighbourhood, or tens of thousands across the city, is potentially huge,” Lanfear said.

He points out that London had an average of 53,000 active lettings in each calendar-quarter of the study period, and an average of 11 lettings per LSOA.

At its most extreme, one neighbourhood in Soho, an area famed for nightlife, had a high of 318 dedicated Airbnbs – some 30% of all households in the LSOA.  

The data models suggest that a 3.2% increase in all types of Airbnb rentals per LSOA would correspond to a 1% increase in robberies city-wide: 325 additional robberies based on the figure of 32,500 recorded robberies in London in 2018.
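
The city-wide arithmetic behind those figures can be checked directly, assuming the reported relationship scales roughly linearly with the size of the increase in listings.

```python
# Worked version of the city-wide figures quoted above, under a simple
# linear-scaling assumption (not the study's full statistical model).
robberies_2018 = 32_500

# A 3.2% rise in listings per LSOA corresponds to a 1% rise in robberies city-wide.
rise_for_3_2_percent = 0.01 * robberies_2018              # ~325 extra robberies
rise_for_10_percent = (10 / 3.2) * 0.01 * robberies_2018  # scaling the same rate up

print(round(rise_for_3_2_percent))   # 325
print(round(rise_for_10_percent))    # ~1,016, i.e. the 'additional 1,000 per year'
```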

Lanfear and Kirk extensively stress-tested the association between Airbnb listings and London crime rates.

This included factoring in ‘criminogenic variables’ such as property prices, police stops, the regularity of police patrols, and even English Premier League football games (by both incorporating attendance into data modelling, and removing all LSOAs within a kilometre of major games).

The duo re-ran their data models excluding all the 259 LSOAs in central London’s Zone One, to see if the association was limited to high tourism areas with lots of Airbnb listings. The data models even incorporated the seasonal ‘ebb and flow’ of London tourism. Nothing changed the overall trends. 

Prior to crunching the numbers, the researchers speculated that any link might be down to Airbnbs affecting ‘collective efficacy’: the social cohesion within a community, combined with a willingness to intervene for the public good.

The study measured levels of collective efficacy across the city using data from both the Metropolitan Police and the Mayor of London’s Office, who conduct surveys on public perceptions of criminal activity and the likely responses of their community.    

Collective efficacy across London was not only consistently high, but it also did not explain the association between Airbnbs and crime in the data models.

Moreover, when Airbnb listings rise, the effect on crime is more immediate than one caused by a slow erosion of collective efficacy. “Crime seems to go up as soon as Airbnbs appear, and stays elevated for as long as they are active,” said Lanfear.

The researchers conclude it is likely driven by criminal opportunity. “A single Airbnb rental can create different types of criminal opportunity,” said Lanfear.

“An Airbnb rental can provide an easy potential victim such as a tourist unfamiliar with the area, or a property that is regularly vacant and so easier to burgle. A very temporary occupant may be more likely to cause criminal damage.”

“Offenders may learn to return to areas with more Airbnbs to find unguarded targets,” said Lanfear. “More dedicated Airbnb properties may mean fewer long-term residents with a personal stake in the area who are willing to report potential criminal activity.”

Airbnb has taken steps to prevent crime, including some background checks as well as requirements for extended bookings on occasions popular for one-night parties, such as New Year’s Eve. “The fact that we still find an increase in crime despite Airbnb’s efforts to curtail it reveals the severity of the predicament,” said Kirk.

Added Lanfear: “Short-term letting sites such as Airbnb create incentives for landlords that lead to property speculation, and we can see the effect on urban housing markets. We can now see that the expansion of Airbnb may contribute to city crime rates.”

“It is not the company or even the property owners who experience the criminogenic side effects of Airbnb, it is the local residents building their lives in the neighbourhood.”   

Notes:

*Above 2018 levels, which is when the study data ends. 

Rising numbers of houses and flats listed as short-term lets on Airbnb are associated with higher rates of crimes such as burglaries and street robberies right across London, according to the most detailed study of its kind.

There may be social consequences to turning large swathes of city neighbourhoods into hotels with little regulation
Charles Lanfear
London townhouses in Greenwich


‘Palaeo-robots’ to help scientists understand how fish started to walk on land

Illustration of palaeo-robots.

Writing in the journal Science Robotics, the research team, led by the University of Cambridge, outline how ‘palaeo-inspired robotics’ could provide a valuable experimental approach to studying how the pectoral and pelvic fins of ancient fish evolved to support weight on land.

“Since fossil evidence is limited, we have an incomplete picture of how ancient life made the transition to land,” said lead author Dr Michael Ishida from Cambridge’s Department of Engineering. “Palaeontologists examine ancient fossils for clues about the structure of hip and pelvic joints, but there are limits to what we can learn from fossils alone. That’s where robots can come in, helping us fill gaps in the research, particularly when studying major shifts in how vertebrates moved.”

Ishida is a member of Cambridge’s Bio-Inspired Robotics Laboratory, led by Professor Fumiya Iida. The team is developing energy-efficient robots for a variety of applications, which take their inspiration from the efficient ways that animals and humans move.

With funding from the Human Frontier Science Program, the team is developing palaeo-inspired robots, in part by taking their inspiration from modern-day ‘walking fish’ such as mudskippers, and from fossils of extinct fish. “In the lab, we can’t make a living fish walk differently, and we certainly can’t get a fossil to move, so we’re using robots to simulate their anatomy and behaviour,” said Ishida.

The team is creating robotic analogues of ancient fish skeletons, complete with mechanical joints that mimic muscles and ligaments. Once complete, the team will perform experiments on these robots to determine how these ancient creatures might have moved.

“We want to know things like how much energy different walking patterns would have required, or which movements were most efficient,” said Ishida. “This data can help confirm or challenge existing theories about how these early animals evolved.”
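
As a hypothetical sketch of the kind of comparison this enables – the gaits and measurements below are invented, not the team’s results – one common way to compare locomotion strategies is the dimensionless cost of transport.

```python
# Illustrative sketch only: compare invented gait trials on a robot using the
# cost of transport (energy per unit weight per unit distance travelled).
def cost_of_transport(energy_j: float, mass_kg: float, distance_m: float,
                      g: float = 9.81) -> float:
    """Dimensionless cost of transport: E / (m * g * d)."""
    return energy_j / (mass_kg * g * distance_m)

gait_trials = {                        # (energy in J, robot mass in kg, distance in m)
    "fin-dragging crawl":   (42.0, 1.2, 2.0),
    "alternating fin walk": (30.0, 1.2, 2.0),
}

for gait, (energy, mass, distance) in gait_trials.items():
    print(f"{gait}: cost of transport = {cost_of_transport(energy, mass, distance):.2f}")
```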

One of the biggest challenges in this field is the lack of comprehensive fossil records. Many of the ancient species from this period in Earth’s history are known only from partial skeletons, making it difficult to reconstruct their full range of movement.

“In some cases, we’re just guessing how certain bones connected or functioned,” said Ishida. “That’s why robots are so useful—they help us confirm these guesses and provide new evidence to support or rebut them.”

While robots are commonly used to study movement in living animals, very few research groups are using them to study extinct species. “There are only a few groups doing this kind of work,” said Ishida. “But we think it’s a natural fit – robots can provide insights into ancient animals that we simply can’t get from fossils or modern species alone.”

The team hopes that their work will encourage other researchers to explore the potential of robotics to study the biomechanics of long-extinct animals. “We’re trying to close the loop between fossil evidence and real-world mechanics,” said Ishida. “Computer models are obviously incredibly important in this area of research, but since robots are interacting with the real world, they can help us test theories about how these creatures moved, and maybe even why they moved the way they did.”

The team is currently in the early stages of building their palaeo-robots, but they hope to have some results within the next year. The researchers say they hope their robot models will not only deepen understanding of evolutionary biology, but could also open up new avenues of collaboration between engineers and researchers in other fields.

The research was supported by the Human Frontier Science Program. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge. Michael Ishida is a Postdoctoral Research Associate at Gonville and Caius College, Cambridge.

Reference:
Michael Ishida et al. ‘Paleo-inspired robotics as an experimental approach to the history of life.’ Science Robotics (2024). DOI: 10.1126/scirobotics.adn1125

The transition from water to land is one of the most significant events in the history of life on Earth. Now, a team of roboticists, palaeontologists and biologists is using robots to study how the ancestors of modern land animals transitioned from swimming to walking, about 390 million years ago.


Researchers deal a blow to theory that Venus once had liquid water on its surface

View of surface of Venus

The researchers, from the University of Cambridge, studied the chemical composition of the Venusian atmosphere and inferred that its interior is too dry today for there ever to have been enough water for oceans to exist at its surface. Instead, the planet has likely been a scorching, inhospitable world for its entire history.

The results, reported in the journal Nature Astronomy, have implications for understanding Earth’s uniqueness, and for the search for life on planets outside our Solar System. While many exoplanets are Venus-like, the study suggests that astronomers should narrow their focus to exoplanets which are more like Earth.

From a distance, Venus and Earth look like siblings: they are almost identical in size, and both are rocky planets. But up close, Venus is more like an evil twin: it is covered with thick clouds of sulfuric acid, and its surface has a mean temperature close to 500°C.

Despite these extreme conditions, for decades, astronomers have been investigating whether Venus once had liquid oceans capable of supporting life, or whether some mysterious form of ‘aerial’ life exists in its thick clouds now.

“We won’t know for sure whether Venus can or did support life until we send probes at the end of this decade,” said first author Tereza Constantinou, a PhD student at Cambridge’s Institute of Astronomy. “But given it likely never had oceans, it is hard to imagine Venus ever having supported Earth-like life, which requires liquid water.”

When searching for life elsewhere in our galaxy, astronomers focus on planets orbiting their host stars in the habitable zone, where temperatures are such that liquid water can exist on the planet’s surface. Venus provides a powerful limit on where this habitable zone lies around a star.

“Even though it’s the closest planet to us, Venus is important for exoplanet science, because it gives us a unique opportunity to explore a planet that evolved very differently to ours, right at the edge of the habitable zone,” said Constantinou.

There are two primary theories on how conditions on Venus may have evolved since its formation 4.6 billion years ago. The first is that conditions on the surface of Venus were once temperate enough to support liquid water, but a runaway greenhouse effect caused by widespread volcanic activity caused the planet to get hotter and hotter. The second theory is that Venus was born hot, and liquid water has never been able to condense at the surface.

“Both of those theories are based on climate models, but we wanted to take a different approach based on observations of Venus’ current atmospheric chemistry,” said Constantinou. “To keep the Venusian atmosphere stable, any chemicals being removed from the atmosphere should also be getting restored to it, since the planet’s interior and exterior are in constant chemical communication with one another.”

The researchers calculated the present destruction rate of water, carbon dioxide and carbonyl sulphide molecules in Venus’ atmosphere, which must be restored by volcanic gases to keep the atmosphere stable.

Volcanism, through its supply of gases to the atmosphere, provides a window into the interior of rocky planets like Venus. As magma rises from the mantle to the surface, it releases gases from the deeper portions of the planet.

On Earth, volcanic eruptions are mostly steam, due to our planet’s water-rich interior. But, based on the composition of the volcanic gases necessary to sustain the Venusian atmosphere, the researchers found that volcanic gases on Venus are at most six percent water. These dry eruptions suggest that Venus’s interior, the source of the magma that releases volcanic gases, is also dehydrated.
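
A minimal sketch of that steady-state reasoning – with invented numbers chosen only so the answer lands near the reported upper bound of about six percent – looks like this: in a stable atmosphere, volcanic outgassing must resupply each species at the rate it is destroyed, so the water share of the volcanic gas follows from the relative destruction rates.

```python
# Illustrative sketch only: steady-state balance between atmospheric destruction
# and volcanic resupply. The rates below are placeholders, not the paper's values.
destruction_rates = {   # molecules removed (and so resupplied) per unit time, arbitrary units
    "H2O": 1.0,
    "CO2": 12.0,
    "OCS": 4.0,
}

total_resupply = sum(destruction_rates.values())
water_fraction = destruction_rates["H2O"] / total_resupply
print(f"Implied water content of volcanic gas: {water_fraction:.0%}")  # ~6% here
```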

At the end of this decade, NASA’s DAVINCI mission will be able to test and confirm whether Venus has always been a dry, inhospitable planet, with a series of flybys and a probe sent to the surface. The results could help astronomers narrow their focus when searching for planets that can support life in orbit around other stars in the galaxy.

“If Venus was habitable in the past, it would mean other planets we have already found might also be habitable,” said Constantinou. “Instruments like the James Webb Space Telescope are best at studying the atmospheres of planets close to their host star, like Venus. But if Venus was never habitable, then it makes Venus-like planets elsewhere less likely candidates for habitable conditions or life.

“We would have loved to find that Venus was once a planet much closer to our own, so it’s kind of sad in a way to find out that it wasn’t, but ultimately it’s more useful to focus the search on planets that are most likely to be able to support life – at least life as we know it.”

The research was supported in part by the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

Reference:
Tereza Constantinou, Oliver Shorttle, and Paul B Rimmer. ‘A dry Venusian interior constrained by atmospheric chemistry.’ Nature Astronomy (2024). DOI: 10.1038/s41550-024-02414-5

A team of astronomers has found that Venus has never been habitable, despite decades of speculation that our closest planetary neighbour was once much more like Earth than it is today.

View of surface of Venus



Cambridge and GSK announce new five-year collaboration aiming for improved outcomes for patients with hard-to-treat kidney and respiratory diseases

Professor David Thomas and Dr Nicolas Wisniacki
  • The Cambridge-GSK Translational Immunology Collaboration (CG-TIC) combines University and GSK expertise in the science of the immune system, AI and clinical development with access to patients and their data provided by Cambridge University Hospitals.
  • GSK is investing more than £50 million in CG-TIC, further strengthening Cambridge’s position as Europe’s leading life sciences cluster.

GSK plc is making this investment to establish the Cambridge-GSK Translational Immunology Collaboration (CG-TIC), a five-year collaboration with the University of Cambridge and Cambridge University Hospitals. The collaboration is focused on understanding the onset and progression of disease and how patients respond to therapies, and on developing biomarkers for rapid diagnosis. Ultimately, the goal is to trial more effective, personalised medicines.

The collaboration will focus on kidney and respiratory diseases, both of which affect large numbers of people worldwide. Kidney disease is estimated to affect 850 million people, roughly 10% of the world’s population (International Society of Nephrology), and chronic respiratory diseases around 545 million (The Lancet).

Many types of kidney disease remain poorly understood and treatments, where they exist, tend to have limited efficacy. Chronic kidney disease is particularly unpleasant and debilitating for patients, often leading to end-stage disease. Treatments such as transplant and dialysis involve complex medical regimes and frequent hospital visits, making effective prevention and treatment the aim.

To make progress in treating these challenging disease areas, CG-TIC will apply an array of new techniques, including the use of cutting-edge single cell technologies to characterise how genes are expressed in individual cells. AI and machine learning have a critical role to play in transforming how data is combined and interrogated.

Using these techniques, the ambition is to initiate new studies and early-phase trials of new therapies for a number of hard-to-treat kidney diseases. The same techniques will be applied to respiratory diseases, and findings will be shared across the two disease areas, potentially helping to identify better treatments for both.

Peter Kyle, Secretary of State for Science, Innovation and Technology, welcomed the collaboration: "The UK's life sciences industry is thriving, driving innovation and improving lives. This collaboration between GSK and the University of Cambridge demonstrates our country's leading research and development capabilities.

“By focusing on cutting-edge research and harnessing the power of AI, this has the potential to advance the treatment of immune-related diseases, which could benefit patients both here in the UK and internationally. It's a clear example of how collaboration between industry, academia, and healthcare can deliver tangible results and strengthen the UK's position in healthcare innovation."

Tony Wood, Chief Scientific Officer, GSK, added: “Collaboration is at the heart of scientific progress and is fundamental to how we do R&D at GSK. We’re excited to build on our existing work with the University of Cambridge to further this world-leading scientific and technological capability in the UK. By bringing together Cambridge’s expertise and our own internal capabilities, including understanding of the immune system and the use of AI to accelerate drug development, we have an opportunity to help patients struggling with complex disease.”

The aim of CG-TIC is to improve outcomes for patients, and Cambridge provides a unique environment in which to involve them: Cambridge University Hospitals plays a pivotal role in the collaboration, and Royal Papworth Hospital NHS Foundation Trust, the UK’s leading heart and lung hospital, is a likely future partner.

Home to the hospitals and to much of the collaboration’s research activity, the Cambridge Biomedical Campus provides a unique environment where academia, industry and healthcare can come together and where human translational research is supported by the National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre.

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, said: “The University sits at the heart of Europe’s leading life sciences cluster, where excellent research and the NHS’s clinical resources combine with the talent generated by the many innovative bioscience companies that call Cambridge home. Through this very important collaboration with GSK, Cambridge will be able to drive economic growth for the UK while improving the health of people in this country and around the world.”

Roland Sinker, CEO of Cambridge University Hospitals NHS Foundation Trust, also welcomed the collaboration, saying: “We are very excited to be part of this important partnership, which is another example of Cambridge experts working together to develop transformational new therapies, and use existing ones more precisely, to improve outcomes for patients with chronic and debilitating conditions.”

The Cambridge-GSK Translational Immunology Collaboration will be co-led by Dr Nicolas Wisniacki, VP, Clinical Research Head, GSK, and David Thomas, Professor of Renal Medicine, University of Cambridge and principal investigator at the Cambridge Institute for Therapeutic Immunology and Infectious Diseases.

The ambition of the partnership is to treat immune-related diseases more precisely with existing therapies and to rapidly develop new ones.


With investment, universities can drive growth - Vice-Chancellor

Vice-Chancellor Professor Deborah Prentice

It is Nobel prize-giving season and last week alumni of Cambridge University were awarded four of the prizes for their brilliant work. In a leading article last week, The Times pointed out that if it were a country, Cambridge’s total of 125 would place it third for Nobel laureates behind only the UK and the US. This is a tribute to outstanding work which changes lives.

As an American who took over the role of vice-chancellor a little over a year ago, I am struck by how often the value of world-class research universities — to our economy, and to society and our daily lives — is underestimated here in Britain. Two of the top five in the world are British and in higher education terms this country punches well above its weight. This week we will announce a huge new investment in transformative research focused on a common foe: cancer. It will bring global benefits.

Our world-class, research-intensive universities are national assets. They can be genuine drivers of economic growth. Cambridge research contributes a staggering £30 billion to the UK economy each year. By contrast, no single US university plays such a national role.

And yet some UK research universities are in a precarious financial state. They are vital to their local and regional communities, as well as to Britain as a whole. They need more than just recognition if they are to drive future sustainable growth: they need investment, yes, but also support to innovate so they can continue to break new research frontiers and to serve their communities.

In Cambridge, we plan to launch an innovation hub that attracts and hosts the best researchers from around the world, plus entrepreneurs, funders and philanthropists. Under one roof, ideas will be driven forward, before they are spun out. The US and France have successfully pointed the way in these hothouses for innovation, in Boston and Paris. The UK must catch up — and fast.

With the budget looming, even — or perhaps especially — in tough times, the government must see our research universities as key allies and partners in its mission to drive economic growth. They are one of the main advantages this country enjoys in the global race for economic success.

At the start of a new academic year, thousands of students and researchers have arrived in Cambridge and at other British universities. We should invest in our world-class institutions and their contributions to tackling society’s greatest challenges so that in the decades to come we will have more Nobel laureates to celebrate.

This article first appeared in The Times on 14 October 2024.

 

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, writes in The Times about how universities can drive UK growth - but they need more than just recognition.


Cancer Research UK makes unprecedented £173m investment in University of Cambridge

Cancer Research UK Cambridge Institute

The significant funding commitment will enable world-class discovery science, unlocking new insights into how cancers develop, grow and spread, as well as examining how the immune system can be harnessed to combat the disease.  

Research at the CRUK Cambridge Institute focuses on understanding every stage of the cancer life cycle – how tumours grow and spread and how this is impacted by the characteristics of each individual patient.  By studying how tumours develop, adapt, and interact with their surroundings, scientists aim to uncover crucial insights into their behaviour. 

Vice-Chancellor of the University of Cambridge, Professor Deborah Prentice, said: “From understanding and detecting cancer at its very earliest stages, to developing kinder treatments to building Cambridge Cancer Research Hospital, a transformative new cancer research hospital for the region, Cambridge is changing the story of cancer. For many years now, Cancer Research UK has played a vital role in enabling this world-leading work. Today’s announcement will ensure our researchers continue to find new ways to transform the lives of patients locally, nationally and internationally.”

Today’s £173 million announcement further boosts CRUK’s unwavering commitment to its mission to beat cancer. The charity is investing in exciting new research programmes, forging new partnerships, and is on track to invest more than £1.5bn in research over the five-year period 2021/22 to 2025/26.

Director of the CRUK Cambridge Institute, Professor Greg Hannon, said: “In a golden era for life sciences, this funding bolsters Cambridge as a major global hub for cancer research on an increasingly competitive worldwide stage and will greatly aid the recruitment of top-tier international talent.   

“Research from the Institute has already made a positive impact for patients and their families, from the development of innovative technologies, diagnostic tests, and advanced imaging methods to the roll out of personalised medicine programmes for those with brain, breast, pancreatic, and ovarian cancers. We believe that only by embracing the complexity of cancer and how the disease interacts with the normal cells of patients can we move the needle on the hardest to treat cancers.” 

The Institute is dedicated to improving cancer patients’ lives through discovery science and clinical translational research and has over 300 scientists working on groundbreaking discoveries taking research from laboratory bench to bedside.  

Established in 2007, it was the first major new cancer research centre in the UK for over 50 years. In 2013, it became a department of the University of Cambridge School of Clinical Medicine, strengthening links with researchers across the University and at Addenbrooke's Hospital, and further enhancing its position as a world leader in taking research into clinical trials and, ultimately, new and better cancer treatments.

Professor Hannon added: "The Institute serves as a foundation for the entire Cambridge cancer research community through access to cutting-edge equipment and technical expertise. Only through understanding all aspects of the disease can we prevent, detect and treat cancer so that everybody can lead longer, better lives, free from fear of cancer.

“With this new funding, the Institute aims to accelerate its impact for patients, with new schemes to integrate clinicians into every aspect of our research and to embrace new technologies, including the promise of machine learning and artificial intelligence to enhance our discovery portfolio.”  

The award, which will support the Institute over the next seven years, follows a comprehensive review of the facility led by an independent panel of international cancer experts, who recognised the Institute’s record of research innovation.

CRUK Chief Executive, Michelle Mitchell, said: “We are delighted to confirm this incredible investment which is a reflection of  the world-leading research community at the CRUK Cambridge Institute. The funding will underpin long-term cutting-edge discovery research, as well as supporting researchers to find new ways to improve cancer prevention and treatment, while creating innovative solutions to diagnose the disease earlier. 

“This kind of funding would not be possible without the generosity of Cancer Research UK supporters and philanthropists."

Work undertaken at the Institute includes: 

  • Understanding cancer: By gaining a deeper understanding of how tumours grow, adapt, and interact with their surroundings, scientists hope to uncover why some cells become cancerous and learn how each tumour's lifecycle can affect a patient’s response to treatment and prognosis.  Professor Greg Hannon's team developed a diagnostic tool using virtual reality to explore every cell and aspect of breast tumours in unprecedented detail.
  • Unravelling tumour interactions: Researchers are investigating a tumour’s ‘microenvironment’ – the surrounding cells, blood vessels and immune cells – and how these components interact. This is helping scientists to predict how well immunotherapy treatments will work.
  • Cancer detection: Scientists are finding new ways to detect cancer earlier, predict the best course of treatments and tailor therapies to individual needs, to improve survival.  Using tumour DNA, scientists can monitor the effectiveness of treatments and catch signs of cancer returning.  Cambridge scientists are also working on a simple at-home test for future patients to regularly monitor their progress. 
  • Personalised medicine: Looking at the unique genetic mutations of a person’s tumour, including how it behaves and responds to treatment, allows treatments to be developed and matched to the specific genetic change.  For example, Professor James Brenton's team discovered a specific mutation in the most common form of ovarian cancer which is now used across the NHS as a cancer marker to measure treatment response for the disease. 

Thanks to research, cancer death rates have fallen by 10% in the UK over the past decade. But in the East of England, around 37,400 people are still diagnosed with cancer, and around 15,700 lose their lives to the disease, every year – underlining the vital need for new and better treatments.

Major studies seeking more accurate treatments for the deadliest cancers like ovarian and oesophageal cancer will also be supported at the Institute.  Research undertaken by Professor Florian Markowetz and his team includes predicting cancer weaknesses to treatment, and spotting cancers as early as possible using AI technology. 

There are 17 research groups based at the Institute – located on the largest biomedical campus in Europe – studying a range of cancers and technologies to support improved cancer treatments.

Find out how Cambridge is changing the story of cancer

Adapted from a press release from Cancer Research UK

Cancer Research UK (CRUK) has today announced a £173 million investment in its institute at the University of Cambridge - the largest single grant ever awarded by the charity outside of London.  


Public invited to chat to museum animals in novel AI experiment

Jack Ashby talking to the Museum's fin whale

From Tuesday 15 October the University of Cambridge’s Museum of Zoology is offering visitors a unique experience: the chance to chat with the animals on display – whether skeletal, taxidermy, or extinct.

In a collaboration with the company Nature Perspectives, the Museum’s Assistant Director Jack Ashby has chosen a range of animal specimens to bring back to life using generative Artificial Intelligence.

Visitors can pose their questions to thirteen specimens - including dodo and whale skeletons, a taxidermied red panda, and a preserved cockroach - by scanning QR codes that open a chat-box on their mobile phone. In two-way conversations, which can be voice- or text-based, the specimens will answer as if they are still alive.

This is believed to be the first time a museum has used generative Artificial Intelligence to enable visitors to chat with objects on display in this way.

By analysing data from the conversations, the team hopes that the month-long experiment will help them learn more about how AI can help the public to better engage with nature, and about the potential for AI in museums. It will also provide the museum with new insights into what visitors really want to know about the specimens on display.

Nature Perspectives uses AI to enable cultural institutions like the Museum of Zoology to engage the public through these unique conversational experiences. The company aims to reverse a growing apathy towards biodiversity loss by enabling new ways to engage with the natural world.

“This is an amazing opportunity for people to test out an emerging technology in our inspiring Museum setting, and we also hope to learn something about how our visitors see the animals on display,” said Jack Ashby, Assistant Director of the University of Cambridge’s Museum of Zoology.

He added: “Our whole purpose is to get people engaged with the natural world. So we're curious to see whether this will work, and whether chatting to the animals will change people’s attitudes towards them - will the cockroach be better liked, for example, as a result of having its voice heard?”

“By using AI to simulate non-human perspectives, our technology offers a novel way for audiences to connect with the natural world,” said Gal Zanir, co-founder of the company Nature Perspectives, which developed the AI technology for the experience.

He added: “One of the most magical aspects of the simulations is that they’re age-adaptive. For the first time, visitors of all ages will be able to ask the specimens anything they like.”

The technology brings together all available information on each animal involved – including details particular to the individual specimens such as where they came from and how they were prepared for display in the museum. This is all repackaged from a first-person perspective, so that visitors can experience realistic, meaningful conversations.

The animals will adjust their tone and language to suit the age of the person they’re talking to. And they’re multi-lingual - speaking over 20 languages including Spanish and Japanese so that visitors can chat in their native languages.
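Nature Perspectives has not published its implementation, but the approach described above – repackaging a specimen’s information into a first-person persona that adapts to the visitor’s age and language – can be sketched in outline. Everything below (the field names, the function, the example dodo details) is hypothetical:

```python
# Hypothetical sketch: compose a first-person persona prompt for a specimen.
# The record fields and example values are invented; this is not
# Nature Perspectives' actual system.
def build_persona_prompt(record: dict, visitor_age: int, language: str) -> str:
    register = "simple, playful language" if visitor_age < 12 else "conversational adult language"
    return (
        f"You are the {record['name']} on display in the Museum of Zoology, Cambridge. "
        f"Speak in the first person, as if you were alive. "
        f"Facts about you: {record['facts']} "
        f"How you came to the museum: {record['provenance']} "
        f"Answer in {language}, using {register}, and stay within what is known about you."
    )

dodo = {
    "name": "dodo skeleton",
    "facts": "a flightless bird from Mauritius, extinct since the 17th century.",
    "provenance": "assembled from bones collected in the 19th century.",
}

print(build_persona_prompt(dodo, visitor_age=9, language="Spanish"))
```

In a real deployment, a prompt like this would be sent, together with the visitor’s messages from the QR-code chat interface, to a generative language model.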

The team has chosen a range of specimens that include skeletons, taxidermy, models, and whole preserved animals. The specimens are: dodo skeleton, narwhal skeleton, brain coral, red admiral butterfly, fin whale skeleton, American cockroach, huia taxidermy (a recently extinct bird from New Zealand), red panda taxidermy, freeze-dried platypus, giant sloth fossil skeleton, giant deer skull and antlers, mallard taxidermy, and Ichthyostega model (an extinct ancestor of all animals with four legs).

Nature Perspectives was created by a team of graduates from the University of Cambridge’s Masters in Conservation Leadership programme, who noticed that people seem to feel more connected to machines when they can talk to them. This inspired the team to apply the same principle to nature - giving nature a voice to promote its agency and foster deeper, more personal connections between people and the natural world.

“Artificial Intelligence is opening up exciting new opportunities to connect people with non-human life, but the impacts need to be carefully studied. I’m delighted to be involved in exploring how the Nature Perspectives pilot affects the way people feel about and understand the species they ‘meet’ in the Museum of Zoology,” said Professor Chris Sandbrook, Director of the University of Cambridge’s Masters in Conservation Leadership programme.

“Enabling museums to engage visitors with the simulated perspectives of exhibits is only the first step for Nature Perspectives. We aim to apply this transformative approach widely, from public engagement and education to scientific research, to representing nature in legal processes, policy-making and beyond," said Zanir.

The Nature Perspectives AI experiment runs for one month, from 15 October to 15 November 2024. For visiting times see www.museum.zoo.cam.ac.uk/visit-us

Explore more discoveries, innovations and research on climate and nature at the University of Cambridge.

 

Specimens in a Cambridge museum will be brought to life through the power of Artificial Intelligence, by a team aiming to strengthen our connection with the natural world and reverse apathy towards biodiversity loss.


How did the building blocks of life arrive on Earth?

An iron meteorite from the core of a melted planetesimal (left) and a chondrite meteorite, derived from a ‘primitive’, unmelted planetesimal (right).

Volatiles are elements or compounds that change into vapour at relatively low temperatures. They include the six most common elements found in living organisms, as well as water. The zinc found in meteorites has a unique composition, which can be used to identify the sources of Earth’s volatiles.

The researchers, from the University of Cambridge and Imperial College London, have previously found that Earth’s zinc came from different parts of our Solar System: about half came from beyond Jupiter and half originated closer to Earth.

“One of the most fundamental questions on the origin of life is where the materials we need for life to evolve came from,” said Dr Rayssa Martins from Cambridge’s Department of Earth Sciences. “If we can understand how these materials came to be on Earth, it might give us clues to how life originated here, and how it might emerge elsewhere.”

Planetesimals are the main building blocks of rocky planets, such as Earth. These small bodies are formed through a process called accretion, where particles around a young star start to stick together, and form progressively larger bodies.

But not all planetesimals are made equal. The earliest planetesimals that formed in the Solar System were exposed to high levels of radioactivity, which caused them to melt and lose their volatiles. But some planetesimals formed after these sources of radioactivity were mostly extinct, which helped them survive the melting process and preserved more of their volatiles.

In a study published in the journal Science Advances, Martins and her colleagues looked at the different forms of zinc that arrived on Earth from these planetesimals. The researchers measured the zinc from a large sample of meteorites originating from different planetesimals and used this data to model how Earth got its zinc, by tracing the entire period of the Earth’s accretion, which took tens of millions of years.

Their results show that while these ‘melted’ planetesimals contributed about 70% of Earth’s overall mass, they only provided around 10% of its zinc.

According to the model, the rest of Earth’s zinc came from materials that didn’t melt and lose their volatile elements. Their findings suggest that unmelted, or ‘primitive’ materials were an essential source of volatiles for Earth.
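The 70%-of-mass versus 10%-of-zinc result is, at heart, a two-component mass balance: melted bodies lost most of their zinc, so they supplied little of it despite supplying most of Earth’s mass. The sketch below uses illustrative zinc concentrations chosen only to be consistent with the reported figures; it is not the isotope-based model used in the paper:

```python
# Back-of-envelope two-component mixing, with illustrative numbers only
# (the study itself traces zinc isotopes through the full accretion history).
mass_frac_melted = 0.70     # fraction of Earth's mass from 'melted' planetesimals
mass_frac_primitive = 0.30  # fraction from unmelted, 'primitive' material

# Relative zinc concentrations (arbitrary units): melted bodies lost most
# of their volatiles, so they are strongly depleted in zinc.
zn_conc_melted = 1.0
zn_conc_primitive = 21.0

zn_from_melted = mass_frac_melted * zn_conc_melted
zn_from_primitive = mass_frac_primitive * zn_conc_primitive
total = zn_from_melted + zn_from_primitive

print(f"zinc from melted bodies:    {zn_from_melted / total:.0%}")    # ~10%
print(f"zinc from primitive bodies: {zn_from_primitive / total:.0%}")  # ~90%
```

On these illustrative numbers, melted bodies would need to be roughly 20 times more depleted in zinc than primitive material to match the reported split.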

“We know that the distance between a planet and its star is a determining factor in establishing the necessary conditions for that planet to sustain liquid water on its surface,” said Martins, the study’s lead author. “But our results show there’s no guarantee that planets incorporate the right materials to have enough water and other volatiles in the first place – regardless of their physical state.”

The ability to trace elements through millions or even billions of years of evolution could be a vital tool in the search for life elsewhere, such as on Mars, or on planets outside our Solar System.

“Similar conditions and processes are also likely in other young planetary systems,” said Martins. “The roles these different materials play in supplying volatiles is something we should keep in mind when looking for habitable planets elsewhere.”

The research was supported in part by Imperial College London, the European Research Council, and UK Research and Innovation (UKRI).

 

Reference:
Rayssa Martins et al. ‘Primitive asteroids as a major source of terrestrial volatiles.’ Science Advances (2024). DOI: 10.1126/sciadv.ado4121

Researchers have used the chemical fingerprints of zinc contained in meteorites to determine the origin of volatile elements on Earth. The results suggest that without ‘unmelted’ asteroids, there may not have been enough of these compounds on Earth for life to emerge.


Cambridge conservation and sustainable business leaders prepare for COP16

Panel members from CCI and CISL discuss the upcoming COP16


The Cambridge Conservation Initiative and the University of Cambridge Institute for Sustainability Leadership (CISL) co-hosted a panel discussion featuring key industry leaders in the run-up to the 16th Conference of the Parties to the Convention on Biological Diversity (CBD COP16). Please read more about the panel here.


‘Inside-out’ galaxy growth observed in the early universe

Galaxy NGC 1549, seen today and 13 billion years ago

This galaxy is one hundred times smaller than the Milky Way, but is surprisingly mature for so early in the universe. Like a large city, this galaxy has a dense collection of stars at its core but becomes less dense in the galactic ‘suburbs’. And like a large city, this galaxy is starting to sprawl, with star formation accelerating in the outskirts.

This is the earliest-ever detection of inside-out galactic growth. Until Webb, it had not been possible to study galaxy growth so early in the universe’s history. Although the images obtained with Webb represent a snapshot in time, the researchers, led by the University of Cambridge, say that studying similar galaxies could help us understand how they transform from clouds of gas into the complex structures we observe today. The results are reported in the journal Nature Astronomy.

“The question of how galaxies evolve over cosmic time is an important one in astrophysics,” said co-lead author Dr Sandro Tacchella from Cambridge’s Cavendish Laboratory. “We’ve had lots of excellent data for the last ten million years and for galaxies in our corner of the universe, but now with Webb, we can get observational data from billions of years back in time, probing the first billion years of cosmic history, which opens up all kinds of new questions.”

The galaxies we observe today grow via two main mechanisms: either they pull in, or accrete, gas to form new stars, or they grow by merging with smaller galaxies. Whether different mechanisms were at work in the early universe is an open question which astronomers are hoping to address with Webb.

“You expect galaxies to start small as gas clouds collapse under their own gravity, forming very dense cores of stars and possibly black holes,” said Tacchella. “As the galaxy grows and star formation increases, it’s sort of like a spinning figure skater: as the skater pulls in their arms, they gather momentum, and they spin faster and faster. Galaxies are somewhat similar, with gas accreting later from larger and larger distances spinning the galaxy up, which is why they often form spiral or disc shapes.”

This galaxy, observed as part of the JADES (JWST Advanced Extragalactic Survey) collaboration, is actively forming stars in the early universe. It has a highly dense core, which despite its relatively young age, is of a similar density to present-day massive elliptical galaxies, which have 1000 times more stars. Most of the star formation is happening further away from the core, with a star-forming ‘clump’ even further out.

The star formation activity is strongly rising toward the outskirts, as the star formation spreads out and the galaxy grows. This type of growth had been predicted with theoretical models, but with Webb, it is now possible to observe it.

“One of the many reasons that Webb is so transformational to us as astronomers is that we’re now able to observe what had previously been predicted through modelling,” said co-author William Baker, a PhD student at the Cavendish. “It’s like being able to check your homework.”

Using Webb, the researchers extracted information from the light emitted by the galaxy at different wavelengths, which they then used to estimate the number of younger stars versus older stars, which is converted into an estimate of the stellar mass and star formation rate.

Because the galaxy is so compact, the individual images of the galaxy were ‘forward modelled’ to take into account instrumental effects. Using stellar population modelling that includes prescriptions for gas emission and dust absorption, the researchers found older stars in the core, while the surrounding disc component is undergoing very active star formation. This galaxy doubles its stellar mass in the outskirts roughly every 10 million years, which is very rapid: the Milky Way galaxy doubles its mass only every 10 billion years.
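To see how different those two rates are, a fixed doubling time implies exponential growth, as in the illustrative arithmetic below (not the forward modelling used in the study):

```python
# Illustrative arithmetic: mass growth factor for a fixed doubling time.
def growth_factor(elapsed_myr: float, doubling_time_myr: float) -> float:
    return 2 ** (elapsed_myr / doubling_time_myr)

elapsed = 200.0  # million years
print(f"Early galaxy (doubles every 10 Myr): x{growth_factor(elapsed, 10):.3g}")
print(f"Milky Way (doubles every 10 Gyr):    x{growth_factor(elapsed, 10_000):.3g}")
```

No galaxy sustains such a rate for hundreds of millions of years; the comparison simply shows how rapid a 10-million-year doubling time is next to the Milky Way’s.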

The density of the galactic core, as well as the high star formation rate, suggest that this young galaxy is rich with the gas it needs to form new stars, which may reflect different conditions in the early universe.

“Of course, this is only one galaxy, so we need to know what other galaxies at the time were doing,” said Tacchella. “Were all galaxies like this one? We’re now analysing similar data from other galaxies. By looking at different galaxies across cosmic time, we may be able to reconstruct the growth cycle and demonstrate how galaxies grow to their eventual size today.”

 

Reference:
William M. Baker, Sandro Tacchella, et al. ‘A core in a star-forming disc as evidence of inside-out growth in the early Universe.’ Nature Astronomy (2024). DOI: 10.1038/s41550-024-02384-8

Astronomers have used the NASA/ESA James Webb Space Telescope (JWST) to observe the ‘inside-out’ growth of a galaxy in the early universe, only 700 million years after the Big Bang.


Cambridge joins forces with ARIA to fast-track radical new technologies to revolutionise brain health

Illustration of human brain

The collaboration, which includes researchers from the University of Cambridge, aims to accelerate progress on new neuro-technologies, including miniaturised brain implants designed to treat depression, dementia, chronic pain, epilepsy and injuries to the nervous system.

Neurological and mental health disorders will affect four in every five people in their lifetimes, and present a greater overall health burden than cancer and cardiovascular disease combined. For example, 28 million people in the UK are living with chronic pain and 1.3 million people with traumatic brain injury.

Neuro-technology – where technology is used to control the nervous system - has the potential to deliver new treatments for these disorders, in much the same way that heart pacemakers, cochlear implants and spinal implants have transformed medicine in recent decades.

The technology can be in the form of electronic brain implants that reset abnormal brain activity or help deliver targeted drugs more effectively, brain-computer interfaces that control prosthetic limbs, or technologies that train the patient’s own cells to fight disease. ARIA’s Scalable Neural Interfaces opportunity space is exploring ways to make the technology more precise, less invasive, and applicable to a broader range of diseases.

Currently, an implant can only interact with large groups of neurons, the cells that transmit information around the brain. Building devices that interact with single neurons will mean a more accurate treatment. Neuro-technologies also have the potential to treat autoimmune disorders, including rheumatoid arthritis, Crohn’s disease and type-1 diabetes.

The science of building technology small enough, precise enough and cheap enough to make a global impact requires an environment where the best minds from across the UK can collaborate, dream up radical, risky ideas and test them without fear of failure.

Professor George Malliaras from the University of Cambridge’s Department of Engineering is one of the project leaders. “Miniaturised devices have the potential to change the lives of millions of people currently suffering from neurological conditions and diseases where drugs have no effect,” he said. “But we are working at the very edge of what is possible in medicine, and it is hard to find the support and funding to try radical, new things. That is why the partnership with ARIA is so exhilarating, because it is giving brilliant people the tools to turn their original ideas into commercially viable devices that are cheap enough to have a global impact.”

Cambridge’s partnership with ARIA will create a home for original thinkers who are struggling to find the funding, space and mentoring needed to stress-test their radical ideas. The three-year partnership is made up of two programmes:

The Fellowship Programme (up to 18 fellowships)

Blue Sky Fellows – a UK-wide offer. We will search the UK for people from any background with a radical idea in this field and the plan and personal skills to develop it. The best people will be offered a fellowship, with funding to test their ideas rapidly in Cambridge. These Blue Sky Fellows will receive mentorship from our best medical, scientific and business experts and may be offered accommodation at a Cambridge college. We will be looking for a specific type of person to be a Blue Sky Fellow: the kind of character who thinks at the very edge of the possible, who doesn’t fear failure, and whose ideas have the potential to change billions of lives, yet would struggle to find funding from existing sources. Not people who think outside the box – people who don’t see a box at all.

Activator Fellows – a UK-wide offer. Those who have already proved that their idea can work, yet need support to turn it into a business, will be invited to become Activator Fellows. They will be offered training in entrepreneurial skills, including grant writing, IP management and clinical validation, so their innovation can be ready for investment.

The Ecosystem Programme

The Ecosystem Programme is about creating a vibrant, UK-wide neurotechnology community where leaders from business, science, engineering, academia and the NHS can meet, spark ideas and form collaborations. This will involve quarterly events in Cambridge, road trip events across the UK and access to the thriving online Cambridge network, Connect: Health Tech.

“This unique partnership is all about turning radical ideas into practical, low-cost solutions that change lives,” said Kristin-Anne Rutter, Executive Director of Cambridge University Health Partners. “Cambridge is fielding its best team to make this work and using its networks to bring in the best people from all over the UK. From brilliant scientists to world-leading institutes, hospitals and business experts, everyone in this collaboration is committed to the ARIA partnership because, by working together, we all see an unprecedented opportunity to make a real difference in the world.”

“Physical and mental illnesses and diseases that affect the brain such as dementia are some of the biggest challenges we face both as individuals and as a society,” said Dr Ben Underwood, Associate Professor of Psychiatry at the University of Cambridge and Honorary Consultant Psychiatrist at Cambridgeshire and Peterborough NHS Foundation Trust. “This funding will bring together different experts doing things at the very limits of science and developing new technology to improve healthcare. We hope this new partnership with the NHS will lead to better care and treatment for people experiencing health conditions.”

Cambridge partners in the project include the Departments of Engineering and Psychiatry, Cambridge Neuroscience, the Milner Therapeutics Institute, the Maxwell Centre, Cambridge University Health Partners (CUHP), Cambridge Network, the Babraham Research Campus, Cambridgeshire and Peterborough NHS Foundation Trust, and Vellos. 

A team from across the Cambridge life sciences, technology and business worlds has announced a multi-million-pound, three-year collaboration with the Advanced Research and Invention Agency (ARIA), the UK government’s new research funding agency.


University of Cambridge alumni awarded 2024 Nobel Prize in Chemistry

Left: Demis Hassabis; Right: John Jumper

In 2020, Hassabis and Jumper of Google DeepMind presented an AI model called AlphaFold2. With its help, they have been able to predict the structure of virtually all the 200 million proteins that researchers have identified.

Since their breakthrough, AlphaFold2 has been used by more than two million people from 190 countries. Among a myriad of scientific applications, researchers can now better understand antibiotic resistance and create images of enzymes that can decompose plastic.

The duo received the Nobel along with Professor David Baker of the University of Washington, who succeeded in using amino acids to design a new protein in 2003.

Sir Demis Hassabis read Computer Science as an undergraduate at Queens' College, Cambridge, matriculating in 1994. He went on to complete a PhD in cognitive neuroscience at University College London and create the videogame company Elixir Studios.

Hassabis co-founded DeepMind in 2010, a company that developed masterful AI models for popular boardgames. The company was sold to Google in 2014 and, two years later, DeepMind came to global attention when the company achieved what many then believed to be the holy grail of AI: beating the champion player of one of the world’s oldest boardgames, Go.

In 2014, Hassabis was elected as a Fellow Benefactor and, later, as an Honorary Fellow of Queens' College. In 2024, he was knighted by the King for services to artificial intelligence.

In 2018, the University announced the establishment of a DeepMind Chair of Machine Learning, thanks to a benefaction from Hassabis’s company, and appointed Professor Neil Lawrence to the position the following year.

“I have many happy memories from my time as an undergraduate at Cambridge, so it’s now a real honour for DeepMind to be able to contribute back to the Department of Computer Science and Technology and support others through their studies,” said Hassabis in 2018.   

“It is wonderful to see Demis’s work recognised at the highest level — his contributions have been really transformative across many domains. I’m looking forward to seeing what he does next!” said Professor Alastair Beresford, Head of the Department of Computer Science and Technology and Robin Walker Fellow in Computer Science at Queens' College.

In a statement released by Google DeepMind following the announcement by the Nobel committee, Hassabis said: "I’ve dedicated my career to advancing AI because of its unparalleled potential to improve the lives of billions of people... I hope we'll look back on AlphaFold as the first proof point of AI's incredible potential to accelerate scientific discovery."

Dr John Jumper completed an MPhil in theoretical condensed matter physics at Cambridge's famous Cavendish Laboratory in 2008, during which time he was a member of St Edmund’s College, before going on to receive his PhD in Chemistry from the University of Chicago.

"Computational biology has long held tremendous promise for creating practical insights that could be put to use in real-world experiments," said Jumper, Director of Google DeepMind, in a statement released by the company. "AlphaFold delivered on this promise. Ahead of us are a universe of new insights and scientific discoveries made possible by the use of AI as a scientific tool." 

“The whole of the St Edmund’s community joins me in congratulating our former Masters student Dr John Jumper on this illustrious achievement – the most inspiring example imaginable to our new generation of students as they go through their matriculation this week,” said St Edmund’s College Master, Professor Chris Young.

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge: “I’d like to congratulate Demis Hassabis and John Jumper, who, alongside Geoffrey Hinton yesterday, are all alumni of our University. Together, their pioneering work in the development and application of machine learning is transforming our understanding of the world around us. They join an illustrious line-up of Cambridge people to have received Nobel Prizes – now totalling 125 individuals – for which we can be very proud.”

Article updated on 10 October 2024 to reflect that the number of Cambridge people to have received Nobel Prizes now totals 125.

Two University alumni, Sir Demis Hassabis and Dr John Jumper, have been jointly awarded this year’s Nobel Prize in Chemistry for developing an AI model to solve a 50-year-old problem: predicting the complex structures of proteins.


University of Cambridge alumnus awarded 2024 Nobel Prize in Physics

Left: Geoffrey Hinton (circled) at his Matriculation at King's College. Right: Illustration of Geoffrey Hinton

Hinton (King’s 1967) and Hopfield were awarded the prize ‘for foundational discoveries and inventions that enable machine learning with artificial neural networks.’ Hinton, who is known as the ‘Godfather of AI’, is Emeritus Professor of Computer Science at the University of Toronto.

This year’s two Nobel Laureates in Physics have used tools from physics to develop methods that are the foundation of today’s powerful machine learning. John Hopfield, a Guggenheim Fellow at the University of Cambridge in 1968-1969, created an associative memory that can store and reconstruct images and other types of patterns in data. Geoffrey Hinton invented a method that can autonomously find properties in data, and perform tasks such as identifying specific elements in pictures.

When we talk about artificial intelligence, we often mean machine learning using artificial neural networks. This technology was originally inspired by the structure of the brain. In an artificial neural network, the brain’s neurons are represented by nodes that have different values. These nodes influence each other through connections that can be likened to synapses and which can be made stronger or weaker. The network is trained, for example by developing stronger connections between nodes with simultaneously high values. This year’s laureates have conducted important work with artificial neural networks from the 1980s onward.

Geoffrey Hinton used a network invented by John Hopfield as the foundation for a new network: the Boltzmann machine. This can learn to recognise characteristic elements in a given type of data. Hinton used tools from statistical physics, the science of systems built from many similar components. The machine is trained by feeding it examples that are very likely to arise when the machine is run. The Boltzmann machine can be used to classify images or create new examples of the type of pattern on which it was trained. Hinton has built upon this work, helping initiate the current explosive development of machine learning.
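The associative-memory idea described above – strengthen the connections between units that are active together, then let a noisy input settle back to a stored pattern – can be illustrated with a minimal Hopfield-style network. This is a toy sketch, not the laureates’ original formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Store one 25-unit pattern of +/-1 values with the Hebbian outer-product
# rule: connections between units that are 'on' together are strengthened.
pattern = rng.choice([-1, 1], size=25)
weights = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(weights, 0)  # no self-connections

# Corrupt the pattern by flipping five units, then let the network settle.
state = pattern.copy()
flipped = rng.choice(pattern.size, size=5, replace=False)
state[flipped] *= -1

for _ in range(5):  # asynchronous updates
    for i in rng.permutation(state.size):
        state[i] = 1 if weights[i] @ state >= 0 else -1

print("stored pattern recovered:", np.array_equal(state, pattern))
```

The Boltzmann machine extends this picture by making the updates stochastic and by learning its connection strengths from training examples.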

Vice-Chancellor Professor Deborah Prentice said:

“Many congratulations to Professor Hinton on receiving the Nobel Prize. Our alumni are a vital part of the Cambridge community, and many of them, like Professor Hinton, have made discoveries and advances that have genuinely changed our world. On behalf of the University of Cambridge, I congratulate him on this enormous accomplishment.”

“The laureates’ work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties,” says Ellen Moons, Chair of the Nobel Committee for Physics. Hinton and Hopfield are the 122nd and 123rd Members of the University of Cambridge to be awarded the Nobel Prize. 

From 1980 to 1982, Hinton was a Scientific Officer at the MRC Applied Psychology Unit (as the MRC Cognition and Brain Sciences Unit was then known), before taking up a position at Carnegie Mellon University in Pittsburgh.

In May 2023, Hinton gave a public lecture at the University's Centre for the Study of Existential Risk entitled 'Two Paths to Intelligence', in which he argued that "large scale digital computation is probably far better at acquiring knowledge than biological computation and may soon be much more intelligent than us". 

Geoffrey Hinton, an alumnus of the University of Cambridge, was awarded the 2024 Nobel Prize in Physics, jointly with John Hopfield of Princeton University.


Ultra-powered MRI scans show damage to brain’s ‘control centre’ is behind long-lasting Covid-19 symptoms

3D projections of QSM maps on the rendered brainstem

Using ultra-high-resolution scanners that can see the living brain in fine detail, researchers from the Universities of Cambridge and Oxford were able to observe the damaging effects Covid-19 can have on the brain.

The study team scanned the brains of 30 people who had been admitted to hospital with severe Covid-19 early in the pandemic, before vaccines were available. The researchers found that Covid-19 infection damages the region of the brainstem associated with breathlessness, fatigue and anxiety.

The powerful MRI scanners used for the study, known as 7-Tesla or 7T scanners, can measure inflammation in the brain. Their results, published in the journal Brain, will help scientists and clinicians understand the long-term effects of Covid-19 on the brain and the rest of the body. Although the study was started before the long-term effects of Covid were recognised, it will help to better understand this condition.

The brainstem, which connects the brain to the spinal cord, is the control centre for many basic life functions and reflexes. Clusters of nerve cells in the brainstem, known as nuclei, regulate and process essential bodily functions such as breathing, heart rate, pain and blood pressure.

“Things happening in and around the brainstem are vital for quality of life, but it had been impossible to scan the inflammation of the brainstem nuclei in living people, because of their tiny size and difficult position,” said first author Dr Catarina Rua, from the Department of Clinical Neurosciences. “Usually, scientists only get a good look at the brainstem during post-mortem examinations.”

“The brainstem is the critical junction box between our conscious selves and what is happening in our bodies,” said Professor James Rowe, also from the Department of Clinical Neurosciences, who co-led the research. “The ability to see and understand how the brainstem changes in response to Covid-19 will help explain and treat the long-term effects more effectively.”

In the early days of the Covid-19 pandemic, before effective vaccines were available, post-mortem studies of patients who had died from severe Covid-19 infections showed changes in their brainstems, including inflammation. Many of these changes were thought to result from a post-infection immune response, rather than direct virus invasion of the brain.  

“People who were very sick early in the pandemic showed long-lasting brain changes, likely caused by an immune response to the virus. But measuring that immune response is difficult in living people,” said Rowe. “Normal hospital-type MRI scanners can’t see inside the brain with the kind of chemical and physical detail we need.”

“But with 7T scanners, we can now measure these details. The active immune cells interfere with the ultra-high magnetic field, so that we’re able to detect how they are behaving,” said Rua. “Cambridge was special because we were able to scan even the sickest and infectious patients, early in the pandemic.”

Many of the patients admitted to hospital early in the pandemic reported fatigue, breathlessness and chest pain as troubling long-lasting symptoms. The researchers hypothesised these symptoms were in part the result of damage to key brainstem nuclei, damage which persists long after Covid-19 infection has passed.

The researchers saw that multiple regions of the brainstem, in particular the medulla oblongata, pons and midbrain, showed abnormalities consistent with a neuroinflammatory response. The abnormalities appeared several weeks after hospital admission, and in regions of the brain responsible for controlling breathing.

“The fact that we see abnormalities in the parts of the brain associated with breathing strongly suggests that long-lasting symptoms are an effect of inflammation in the brainstem following Covid-19 infection,” said Rua. “These effects are over and above the effects of age and gender, and are more pronounced in those who had had severe Covid-19.”

In addition to the physical effects of Covid-19, the 7T scanners provided evidence of some of the psychiatric effects of the disease. The brainstem monitors breathlessness, as well as fatigue and anxiety. “Mental health is intimately connected to brain health, and patients with the most marked immune response also showed higher levels of depression and anxiety,” said Rowe. “Changes in the brainstem caused by Covid-19 infection could also lead to poor mental health outcomes, because of the tight connection between physical and mental health.”

The researchers say the results could aid in the understanding of other conditions associated with inflammation of the brainstem, such as multiple sclerosis (MS) and dementia. The 7T scanners could also be used to monitor the effectiveness of different treatments for brain diseases.

“This was an incredible collaboration, right at the peak of the pandemic, when testing was very difficult, and I was amazed how well the 7T scanners worked,” said Rua. “I was really impressed with how, in the heat of the moment, the collaboration between lots of different researchers came together so effectively.”

The research was supported in part by the NIHR Cambridge Biomedical Research Centre, the NIHR Oxford Biomedical Research Centre, and the University of Oxford COVID Medical Sciences Division Rapid Response Fund.

 

Reference:
Catarina Rua et al. ‘7-Tesla quantitative susceptibility mapping in COVID-19: brainstem effects and outcome associations.’ Brain (2024). DOI: 10.1093/brain/awae215

Damage to the brainstem – the brain’s ‘control centre’ – is behind long-lasting physical and psychiatric effects of severe Covid-19 infection, a study suggests.


Cambridge celebrates positive contributions to research culture

Three researchers at the University of Cambridge in conversation at an event celebrating colleagues for their positive contribution to research culture

Cambridge aspires to create a positive research culture where all staff working in research, whether in academic, technical or support roles, feel welcomed, supported and able to give of their best. The Research Culture Celebration event aims to recognise and celebrate the good practice that is already happening, and to inspire further efforts across the University.

The original idea for the nominations and event (where those honoured are put forward by their colleagues) was part of the Action Research on Research Culture (ARRC) project’s study on researcher development. The ARRC project is one of several initiatives to nurture and promote positive research culture at Cambridge. 

See the full list of nominees

The event coincides with the launch of a wider programme of work being led by the University's Research Culture Team. Four priority areas have been identified. These are: 

- Precarity: how do we address the issues created by fixed-term contracts in early research careers? 

- Access & Participation: who gets to do research? Can everyone fully participate as is expected of them? 

- Challenging interpersonal and group dynamics: how do we support researchers who are struggling with difficult research dynamics? How do we support leaders to change? 

- Time & space: how do we ensure people have the time and space to embody and enact good research culture? 

This year the Research Steering Committee, which oversees the work, is expecting to allocate between £600,000 and £700,000 to facilitate research culture activities around the University. It will also contact individual departments to better understand the concerns they have around research culture. If you would like to be involved, please contact the research culture team

For more about the event, including a gallery of images, see the Staff Hub (Cambridge users only; University login required).

Colleagues from across the University were recognised for their contributions to research culture at the inaugural Research Culture Celebration event on 30 September.


Triple partnership between Cambridge, Oxford and Brussels reaffirmed

A Memorandum of Understanding – first signed between the institutions in 2009 to formally recognise their decades-long partnership – was renewed during a ceremony at the Belgian Embassy in London.

The relationship was first established in 1965 by the Belgian public interest organisation Fondation Wiener-Anspach (FWA), and since then students, researchers and academics have travelled across the Channel in both directions as part of academic exchanges.

The FWA promotes the development of scientific exchanges between the ULB and the universities of Oxford and Cambridge by awarding fellowships and grants, and by supporting research collaborations in all fields. It also organises conferences and encourages contacts between academics by funding short-term visits.

The ceremony was attended by Professor Deborah Prentice, Vice-Chancellor of Cambridge University; Professor Irene Tracey, Vice-Chancellor of Oxford University; Professor Anne Weyembergh, Vice-Rector for External Relations and Cooperation (ULB); Professor Didier Viviers, President of Fondation Wiener-Anspach, and Bruno van der Pluijm, Belgian Ambassador to the UK.

Speaking at the ceremony, Professor Prentice said: “The partnership has allowed students and scholars in Brussels, in Cambridge and Oxford to find a firm footing in the world of academia and beyond. In the years ahead, I am confident that it will continue to enable new accomplishments and scientific breakthroughs, as well as the creation of innovative businesses and the nurturing of careers in public service on both sides of the Channel.”

The Vice-Chancellors of Cambridge and Oxford universities reaffirmed a triple partnership with Université Libre de Bruxelles (ULB) that has supported world-class research through cross-border collaborations for more than half a century.

"The partnership will continue to enable new accomplishments and scientific breakthroughs, as well as the creation of innovative businesses and the nurturing of careers in public service on both sides of the Channel."
Professor Deborah Prentice, Vice-Chancellor


First map of every neuron in an adult fly brain complete

Multi-coloured image of all neurons in an adult fruit fly brain

This landmark achievement has been conducted by the FlyWire Consortium, a large international collaboration including researchers from the University of Cambridge, the MRC Laboratory of Molecular Biology in Cambridge, Princeton University, and the University of Vermont. It is published today in two papers in the journal Nature.

The diagram of all 139,255 neurons in the adult fly brain is the first for an entire brain of an animal that can walk and see. Previous efforts have completed whole-brain diagrams only for much smaller nervous systems: a fruit fly larva, which has 3,016 neurons, and a nematode worm, which has 302 neurons.

The researchers say the whole fly brain map is a key first step to completing larger brains. Since the fruit fly is a common tool in research, its brain map can be used to advance our understanding of how neural circuits work.

Dr Gregory Jefferis, from the University of Cambridge and the MRC Laboratory of Molecular Biology, one of the co-leaders of the research, said: “If we want to understand how the brain works, we need a mechanistic understanding of how all the neurons fit together and let you think. For most brains we have no idea how these networks function. 

“Flies can do all kinds of complicated things like walk, fly, navigate, and the males sing to the females. Brain wiring diagrams are a first step towards understanding everything we’re interested in – how we control our movement, answer the telephone, or recognise a friend.”

Dr Mala Murthy from Princeton University, one of the co-leaders of the research, said: “We have made the entire database open and freely available to all researchers. We hope this will be transformative for neuroscientists trying to better understand how a healthy brain works. In the future we hope that it will be possible to compare what happens when things go wrong in our brains, for example in mental health conditions.” 

Dr Marta Costa from the University of Cambridge, who was also involved in the research, said: “This brain map, the biggest so far, has only been possible thanks to technical advances that didn’t seem possible ten years ago. It is a true testament to the way that innovation can drive research forward. The next steps will be to generate even bigger maps, such as a mouse brain, and ultimately, a human one.”

The scientists found that there were substantial similarities between the wiring in this map and previous smaller-scale efforts to map out parts of the fly brain. This led the researchers to conclude that there are many similarities in wiring between individual brains – that each brain isn’t a unique structure.

When comparing their brain diagram to previous diagrams of small areas of the brain, the researchers also found that about 0.5% of neurons have developmental variations that could cause connections between neurons to be mis-wired. The researchers say it will be important to understand, through future research, if these changes are linked to individuality or brain disorders. 

Making the map

3D rendering of all ~140k neurons in the fruit fly brain. Credit: Data source FlyWire.ai; Rendering by Philipp Schlegel (University of Cambridge/MRC LMB).

A whole fly brain is less than one millimetre wide. The researchers started with one female brain cut into seven thousand slices, each only 40 nanometres thick, that were previously scanned using high resolution electron microscopy in the laboratory of project co-leader Davi Bock at Janelia Research Campus in the US.

Analysing over 100 terabytes of image data (equivalent to the storage in 100 typical laptops) to extract the shapes of about 140,000 neurons and 50 million connections between them is too big a challenge for humans to complete manually. The researchers built on AI developed at Princeton University to identify and map neurons and their connections to each other.

However, the AI still makes many errors in datasets of this size. The Princeton University researchers established the FlyWire Consortium – made up of 287 researchers from more than 76 laboratories around the world, as well as volunteers from the general public – which spent an estimated 33 person-years painstakingly proofreading all the data.

Dr Sebastian Seung, from Princeton University, who was one of the co-leaders of the research, said: “Mapping the whole brain has been made possible by advances in AI computing - it would have not been possible to reconstruct the entire wiring diagram manually. This is a display of how AI can move neuroscience forward. The fly brain is a milestone on our way to reconstructing a wiring diagram of a whole mouse brain.”

The researchers also annotated many details on the wiring diagram, such as classifying more than 8,000 cell types across the brain. This allows researchers to select particular systems within the brain for further study, such as the neurons involved in sight or movement. 

Dr Philipp Schlegel, the first author of one of the studies, from the MRC Laboratory of Molecular Biology, said: “This dataset is a bit like Google Maps but for brains: the raw wiring diagram between neurons is like knowing which structures on satellite images of the Earth correspond to streets and buildings. Annotating neurons is like adding the names for streets and towns, business opening times, phone numbers and reviews to the map – you need both for it to be really useful.”

Simulating brain function

This is also the first whole brain wiring map – often called a connectome – to predict the function of all the connections between neurons. 

Neurons use electrical signals to send messages. Each neuron can have hundreds of branches that connect it to other neurons. The points where these branches meet and transmit signals between neurons are called synapses. There are two main ways that neurons communicate across synapses: excitatory (which promotes the continuation of the electrical signal in the receiving neuron), or inhibitory (which reduces the likelihood that the next neuron will transmit signals).

Researchers from the team used AI image scanning technology to predict whether each synapse was inhibitory or excitatory.
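
To make the idea of a signed wiring diagram concrete, the sketch below shows how such data could be represented as a directed graph whose edges carry a synapse count and a predicted sign. It is purely illustrative – the neuron names, counts and signs are invented, and this is not the FlyWire data model or tooling – and it assumes the Python library networkx is available.

```python
# Purely illustrative: a toy signed wiring diagram, not the FlyWire data model.
# Assumes networkx is installed; neuron IDs, synapse counts and signs are invented.
import networkx as nx

wiring = nx.DiGraph()

# Each edge runs from a pre-synaptic to a post-synaptic neuron and records how
# many synapses connect them and whether the connection is predicted to be
# excitatory (+1) or inhibitory (-1).
wiring.add_edges_from([
    ("neuron_A", "neuron_B", {"synapses": 42, "sign": +1}),
    ("neuron_A", "neuron_C", {"synapses": 7,  "sign": -1}),
    ("neuron_C", "neuron_B", {"synapses": 15, "sign": -1}),
])

# Example query: which neurons are predicted to inhibit neuron_B, and how strongly?
inhibitory_inputs = [
    (pre, data["synapses"])
    for pre, post, data in wiring.in_edges("neuron_B", data=True)
    if data["sign"] < 0
]
print(inhibitory_inputs)  # [('neuron_C', 15)]
```

A real analysis would query the released FlyWire database rather than an in-memory toy, but the same abstraction – neurons as nodes, signed and weighted edges as synaptic connections – underlies both.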

Dr Gregory Jefferis added: “To begin to simulate the brain digitally, we need to know not only the structure of the brain, but also how the neurons function to turn each other on and off.”

“Using our data, which has been shared online as we worked, other scientists have already started trying to simulate how the fly brain responds to the outside world. This is an important start, but we will need to collect many different kinds of data to produce reliable simulations of how a brain functions.”

Associate Professor Davi Bock, one of the co-leaders of the research from the University of Vermont, said: “The hyper-detail of electron microscopy data creates its own challenges, especially at scale. This team wrote sophisticated software algorithms to identify patterns of cell structure and connectivity within all that detail. 

“We now can make precise synaptic level maps and use these to better understand cell types and circuit structure at whole-brain scale. This will inevitably lead to a deeper understanding of how nervous systems process, store and recall information. I think this approach points the way forward for the analysis of future whole-brain connectomes, in the fly as well as in other species."

This research was conducted using a female fly brain. Since there are differences in neuronal structure between male and female fly brains, the researchers also plan to characterise a male brain in the future. 

The principal funders were the National Institutes of Health BRAIN Initiative, Wellcome, Medical Research Council, Princeton University and National Science Foundation.

References

Schlegel, P. et al: Whole-brain annotation and multi-connectome cell typing of Drosophila. Nature, Oct 2024. DOI: 10.1038/s41586-024-07686-5

Dorkenwald, S. et al: Neuronal wiring diagram of an adult brain. Nature, Oct 2024. DOI: 10.1038/s41586-024-07558-y

 

The first wiring diagram of every neuron in an adult brain and the 50 million connections between them has been produced for a fruit fly.

Brain wiring diagrams are a first step towards understanding everything we’re interested in – how we control our movement, answer the telephone, or recognise a friend.
Gregory Jefferis
3D rendering of all 140,000 neurons in the adult fruit fly brain.


Celebrating remarkable talent as part of Black History Month

Montage of faces

Events in Cambridge for Black History Month

Cambridge's Race Equality Lecture

Jesus College, Thursday 31 October

It may seem odd, but we start at the end of the month because this year’s Race Equality Lecture will take place on Thursday 31 October. The lecture is titled “Racism without racists – how racism works in the USA and the western world.” It will be delivered by Eduardo Bonilla-Silva, Professor of Sociology at Duke University and former President of the American Sociological Association. It will take place in the Frankopan Hall at Jesus College and will be available to view online.

Book your place at the Race Equality Lecture

Olaudah Equiano Annual Lecture on Race Justice

Anglia Ruskin University, Wednesday 9 October

Lord Simon Woolley, Principal of Homerton College and co-founder of Operation Black Vote, will deliver the Olaudah Equiano Annual Lecture on Race Justice at Anglia Ruskin University on Wednesday 9 October. The event will take place at the Cambridge campus and starts at 6pm. Lord Woolley is a tireless campaigner for equality, not just for Black communities but all under-represented or marginalised groups. During the event he will discuss the fight for racial equality drawing on his own personal experiences.

Book tickets for the Olaudah Equiano Annual Lecture on Race Justice

Moving beyond stereotypes surrounding Black women

King's College, Thursday 10 October

On the evening of Thursday 10 October, celebrated authors Kelechi Okafor and Afua Hirsch will discuss the challenges and opportunities they have faced when calling out social injustices in their work, with a focus on how their own identities have shaped their activism. They will share insights on the creative processes involved in their writing and how it has impacted on conversations about race, womanhood and justice. 

Reserve a place for the discussion at King's

Collaborative art workshops

Robinson College, Wednesday 16 and Saturday 19 October

For those interested in modern art, Robinson College is hosting two collaborative art workshops. The first, on Wednesday 16 October, will be hosted by London artist Shem, on the theme ‘Black present now’. Then, on Saturday 19 October, the College will host Joshua Obichere, a Cambridge alumnus.

Register your interest in the art workshops

Panel discussion: Black excellence, health and wellness

St Edmund's College, Wednesday 16 October

Also on the afternoon of Wednesday 16 October, St Edmund’s College will be hosting a panel discussion on the themes of Black excellence, health and wellness. Speakers include economist and entrepreneur Ebenezer Ademisoye, clinical scientist Dr Rafia Al-Lamki, and Mastercard Scholar Godspower Major.

Reserve your ticket for the panel discussion at St Edmund's College

Fireside chat at the Business School

Cambridge Judge Business School, Thursday 17 October

Lord Woolley will participate in a ‘fireside chat’ at the Business School on the afternoon of Thursday 17 October. The event will be chaired by Kamiar Mohaddes and will also include Tabitha Mwangi, Programme Director of the Mastercard Foundation, and Orobosa Isokpan from the Cambridge Africa Business Network. There will be networking opportunities as well, but registration is essential.

Register for the fireside chat at Judge Business School

An exhibition and events at St Catharine's College

St Catharine's College, throughout October

During the entire month of October, St Catharine’s College is hosting an exhibition showcasing the achievements of two prominent Black alumni. The pioneering doctor and civil rights activist Dr Cecil Clarke matriculated in 1914 in the first months of the First World War. Wendell Mottley was an Olympic athlete and economist who served as Trinidad and Tobago’s Finance Minister between 1991 and 1995. The exhibition commemorating them is being held in the Shakeshaft Library.

See the full programme of Black History Month events at St Catharine's College

The Blacktionary Show

Wolfson College, Saturday 19 October

On Saturday 19 October Wolfson College hosts the ‘Blacktionary Show’. Authors Dr Maggie Semple and Jane Oremosu will be discussing their new work ‘My Little Book: A Blacktionary - The pocket guide to the language of race’. The book aims to help break down barriers when it comes to engaging in conversations on race. The event will be introduced by Dr Kenny Monrose, from the University’s Department of Sociology.

Register for the Blacktionary Show

Panel discussion: how organisations promote equality and diversity in the face of a cultural backlash

Homerton College, Tuesday 22 October

On Tuesday 22 October Lord Woolley will again be participating in a discussion, held at Homerton College, looking at how companies and other organisations promote equality and diversity in the face of a cultural backlash. Other prominent speakers include successful businesswomen Olu Orugboh and Yemi Jackson.

Register for a panel discussion with Lord Woolley

The Trevelyan Lecture: 'Black Genius: Science, Race and the Extraordinary Portrait of Francis Williams'

Bateman Auditorium, Gonville and Caius College, 5pm Friday 25 October (Faculty of History)

Francis Williams was a Jamaican polymath who was born into slavery but ended his life as a gentleman and a scholar. His portrait, dating from the 1740s, shows him surrounded by books and scientific instruments. Was he Cambridge's first Black student? And who commissioned the portrait, and why? Princeton historian, Fara Dabhoiwala, will tackle these questions when he presents new research on the painting and its intriguing sitter. 

Black History Month Academic Seminar

Hughes Hall, Monday 28 October

An opportunity to hear from the College's Black staff and PhD students and celebrate their achievements, but also to hear about the challenges facing Black students at Cambridge. One of the main subjects for discussion will be the low numbers of Black academics in the UK.

More details about the seminar here

The Really Popular Book Club: Mr Loverman by Bernardine Evaristo

Online, Tuesday 29 October

On Tuesday 29 October the University Library’s Really Popular Book Club will be discussing Bernardine Evaristo’s ‘Mr Loverman’. The book follows an Antiguan-born immigrant living in Hackney, London, who leads a double life. The discussion will be hosted by Yvonne Battle-Felton, Academic Director of Creative Writing at Cambridge’s Institute of Continuing Education. This is an online event.

Sign up for the Really Popular Book Club

Black Advisory Hub events

St John's College, Wednesday 30 October

During the afternoon of Wednesday 30 October, the Black Advisory Hub is hosting a social and afternoon tea for Black students at St John’s College. It's one of many events the Hub is hosting. These include induction sessions for both undergraduates and postgraduates and the prizegiving ceremony for the Bridgetower essay competition.

Visit the Black Advisory Hub to register

Cambridge Students' Union events

St John's College, Thursday 3 October

The Cambridge Students' Union is also hosting several events to mark Black History Month. This opens with a screening of the documentary 'Educationally Subnormal: a British scandal' on Thursday 3 October.

Visit the Cambridge SU to see what's on

Black History Month in Cambridge brings an opportunity to take part in topical discussions, appreciate art and hear from a range of engaging speakers. 


Student support and Cambridge’s role as a ‘national asset’ highlighted in Vice-Chancellor’s annual address

Celebrating Cambridge’s most recent achievements – and looking ahead to the opportunities and challenges of the future – the Vice-Chancellor, Professor Deborah Prentice, emphasised both the University’s “extraordinary significance” in the UK and the importance of its global outlook.

Professor Prentice used the speech, in Senate House, to announce the Collegiate University’s milestone success in reaching its £500m student support fundraising target, set in 2018.

This vital philanthropic support was transformative, said Professor Prentice, and had already enabled a number of Cambridge widening participation programmes and initiatives – including the Cambridge Foundation Year and scholarships for specific under-represented groups – to ensure the University continues to welcome students with the potential to thrive here, regardless of background. Student health and wellbeing services at Cambridge had also been transformed with generous gifts to the student support fundraising initiative, she said.

The Vice-Chancellor told Senate House: “I have met the students in many of these programmes, so for me, the student support initiative has names, faces, and life stories attached. It’s a thrilling achievement.”

Describing the University as a “national asset”, Professor Prentice went on to highlight Cambridge’s contribution to the rest of the UK, and its economy – estimated at approximately £30 billion a year.

The Vice-Chancellor said Cambridge’s reputation for expertise and world-class research meant it was uniquely placed to become the “go-to location for the world’s leading innovators”, and pointed to the University’s ambitious plans for Cambridge West and for an innovation hub to bring together researchers, entrepreneurs, spin-outs, and funders under one roof to “help solve the world’s biggest challenges”.

Her address also highlighted other significant milestones and initiatives, including the completion of the Ray Dolby Centre, the groundbreaking of the new Whittle Lab, the expansion of the Bennett Institute for Public Policy, and the advanced planning stages of two new hospitals on the Cambridge Biomedical Campus – the Cambridge Cancer Research Hospital and the Cambridge Children’s Hospital.

Professor Prentice finished the address by paying tribute to the University’s current Chancellor, Lord Sainsbury of Turville, who announced last year that he will step down. She expressed the University’s gratitude for “his unwavering service and his commitment to Cambridge.”

Read the full address

Professor Deborah Prentice marked the start of the academic year by delivering the Vice-Chancellor’s annual address to the University.

I have met the students in many of these programmes, so for me, the student support initiative has names, faces, and life stories attached. It’s a thrilling achievement.
Vice-Chancellor, Professor Deborah Prentice


Cambridge continues to be the most intensive science and technological cluster in the world

Laboratory

The Global Innovation Index (GII) 2024 – which captures the innovation ecosystem performance of 133 economies and tracks global innovation trends – has ranked Cambridge as the world’s leading science and technological (S&T) cluster by intensity, in relation to its size, for the third consecutive year. San Jose–San Francisco (USA) was named second, unchanged from the 2023 Index, with Eindhoven (Kingdom of the Netherlands) third.

S&T clusters are established by analysing patent-filing activity and scientific article publication relative to population, and documenting the geographical areas around the world with the highest density of inventors and scientific authors.

According to the Index, the Cambridge cluster filed 6,379 Patent Cooperation Treaty (PCT) patent applications and published 35,000 scientific articles, both per 1 million inhabitants, over the past 5 years.
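
For illustration, those intensity figures are per-capita rates of the form count divided by population in millions. The sketch below shows that arithmetic with invented inputs; the population and counts are placeholders, not the Index's data, and the GII's full methodology normalises and combines its indicators in more detail.

```python
# Illustrative sketch of a per-million "intensity" rate of the kind used to
# compare S&T clusters. All inputs are invented placeholders, not the Global
# Innovation Index's actual data or methodology.
def per_million(count: int, population: int) -> float:
    """Scale a raw count to a rate per one million inhabitants."""
    return count / (population / 1_000_000)

cluster_population = 550_000   # hypothetical cluster population
pct_filings_5yr = 3_508        # hypothetical PCT filings over five years
articles_5yr = 19_250          # hypothetical scientific articles over five years

print(round(per_million(pct_filings_5yr, cluster_population)))   # 6378
print(round(per_million(articles_5yr, cluster_population)))      # 35000
```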

The University of Cambridge sits at the heart of this cluster, powering world-leading research, driving a vibrant innovation ecosystem, and cultivating a thriving environment for collaboration, services and investment. The University contributes nearly £30 billion to the UK economy annually, including over £23 billion from commercialisation and innovation activities.

According to the Global Innovation Index 2024: “S&T clusters – which can be entire regions or cities – serve as the backbone of a robust national innovation ecosystem. Situated in areas such as San Francisco’s Silicon Valley, Cambridge, Munich and Paris in Europe, or Bengaluru, Seoul, Shenzhen and Tokyo in Asia, these S&T clusters are home to renowned universities, brilliant scientists, R&D-intensive companies, and prolific inventors. It is the collaboration among these entities that results in the groundbreaking scientific advancements.”

Earlier this year, a report by Dealroom revealed that the Cambridge tech ecosystem has a combined value of $191 billion, representing 18% of the entire UK’s tech ecosystem and reinforcing Cambridge’s reputation as Europe’s deep tech leader.

Dr Diarmuid O’Brien, Pro-Vice-Chancellor for Innovation, University of Cambridge, commented:

“It’s great to see this continued recognition of Cambridge as the world’s most intensive science and technological cluster. With its exceptional research and science, people and partners, companies and commitment, Cambridge drives innovation that fuels local, national, and global growth, tackling global challenges and delivering life-changing impact.”

Release first published by Cambridge Enterprise

Cambridge has once again been named as the most intensive science and technological cluster in the world, according to a new report ranking innovation around the globe.

 

 

It’s great to see this continued recognition of Cambridge as the world’s most intensive science and technological cluster. With its exceptional research and science, people and partners, companies and commitment, Cambridge drives innovation that fuels local, national, and global growth, tackling global challenges and delivering life-changing impact.
Diarmuid O’Brien


Sir David Attenborough's 'joy' on visit to Cambridge Conservation Initiative

Photo of Sir David Attenborough on a visit to the Cambridge Conservation Initiative

Sir David said of visiting CCI that he felt “an undercurrent of joy” whenever he came to the conservation campus, which is housed in the building bearing his own name.

The campus was opened in 2016 and is the first of its kind, with over 500 conservation professionals and researchers, from 10 different organisations and the University of Cambridge, all collaborating to stop the biodiversity crisis and build more hopeful futures for people and nature.

Read the full story: 'An Undercurrent of Joy'

Sir David Attenborough spoke of how he feels during visits to the Cambridge Conservation Initiative (CCI) when he stopped by the CCI conservation campus at the University of Cambridge this week.


G7 representatives meet in Cambridge to discuss semiconductors

Semiconductors underpin nearly every electrical, optical and quantum device.

Representatives of the Semiconductors Point of Contact Group from the G7 group of nations met in Cambridge on 26 September. The meeting was held at ARM, which designs over 95% of the processors in the world. Representatives from the University of Cambridge, as well as representatives from local semiconductor companies, participated in the events.

Semiconductors underpin nearly every electrical, optical and quantum device, from mobile phones and medical equipment to electric vehicles. They are of global strategic significance due to the integral role they play in net zero, artificial intelligence and quantum technology.

The G7 Semiconductor Points of Contact group is dedicated to facilitating information exchange and sharing best practices among G7 members. The PoC Group plans to exchange information on issues impacting the semiconductor industry, including pre-competitive industrial research and development priorities, sustainable manufacturing, the effect of non-market policies and practices, and crisis coordination channels.

Cambridge was chosen for the meeting in part because of its strong innovation ecosystem, which has produced more ‘unicorns’ – privately held startup companies valued at over US$1 billion – than anywhere else in the UK.

A 2023 report found that the University of Cambridge contributes nearly £30 billion to the UK economy annually and supports more than 86,000 jobs across the UK, including 52,000 in the East of England.

For every £1 the University spends, it creates £11.70 of economic impact, and for every £1 million of publicly-funded research income it receives, it generates £12.65 million in economic impact across the UK.

The National Quantum Strategy (2023) and the National Semiconductor Strategy (2023) highlight the UK’s national strengths in quantum and photonic technologies and compound semiconductors. These sectors foster growth and create high-skilled jobs, and position the UK as a hub of global innovation. 

Dr Diarmuid O’Brien, Pro-Vice-Chancellor for Innovation at the University of Cambridge, said: “Semiconductors are a vital technology for the UK’s economic growth, and Cambridge is a leader in semiconductor research and development. Working with our partners in academia, industry and government, we can develop next-generation semiconductor technologies and companies, and train the next generation of semiconductor scientists and engineers.” 

Professor Andrea Ferrari, Director of the Cambridge Graphene Centre, hosted a formal dinner at Pembroke College on behalf of the University for the attendees of the G7 Semiconductors group. He said: “Cambridge played a key foundational role in the development of electronics. The electron was discovered here in 1897, by JJ Thomson. In 1961, electron beam lithography, a key method for integrated circuit fabrication, was invented in Cambridge. These early achievements were followed by many advances in circuit design, innovative advanced materials, photonic and quantum communications. It is thus fitting that the G7 Semiconductors representatives met at the heart of where it all started.”

Science Minister Lord Vallance said: "Semiconductors are an unseen but vital component in so many of the technologies we rely on in our lives, and backing UK innovators offers a real opportunity to grow these firms into industry leaders, strengthening our £10 billion sector and ensuring it drives economic growth. Hosting the G7 semiconductors Points of Contact group is also a chance to showcase the UK’s competitive and growing sector and make clear our commitment to keeping the UK at the forefront of advancing technology."

 

Representatives from the G7 have met in Cambridge to discuss the main priorities for the future development of semiconductors and their impact on the global economy.


Previously unknown Neolithic society in Morocco discovered

Stone tools from Oued Beht

Archaeological fieldwork in Morocco has uncovered a previously unknown farming society dating to 3400–2900 BC, from a poorly understood period of north-west African prehistory. It is the earliest and largest agricultural complex yet found in Africa beyond the Nile.

This study, published in the journal Antiquity, reveals for the first time the importance of the Maghreb (north-west Africa) in the emergence of complex societies in the wider Mediterranean during the fourth and third millennia BC.

With a Mediterranean environment, a border with the Sahara desert and the shortest maritime crossing between Africa and Europe, the Maghreb is perfectly located as a hub for major cultural developments and intercontinental connections in the past.

Whilst the region’s importance during the Palaeolithic, Iron Age and Islamic periods is well known, there is a significant gap in knowledge of the archaeology of the Maghreb between c. 4000 and 1000 BC, a period of dynamic change across much of the Mediterranean.

To tackle this, a team of archaeologists led by Prof Cyprian Broodbank from the University of Cambridge, Prof Youssef Bokbot from INSAP, and Prof Giulio Lucarini from CNR-ISPC and ISMEO, have carried out collaborative, multidisciplinary archaeological fieldwork at Oued Beht, Morocco.

"For over thirty years I have been convinced that Mediterranean archaeology has been missing something fundamental in later prehistoric north Africa," said Broodbank. "Now, at last, we know that was right, and we can begin to think in new ways that acknowledge the dynamic contribution of Africans to the emergence and interactions of early Mediterranean societies."

"For more than a century the last great unknown of later Mediterranean prehistory has been the role played by the societies of the Mediterranean’s southern, African shores west of Egypt," say the authors of the new study. "Our discoveries prove that this gap has been due not to any lack of major prehistoric activity, but to the relative lack of investigation and publishing. Oued Beht now affirms the central role of the Maghreb in the emergence of both Mediterranean and wider African societies."

These results reveal that the site was the largest agricultural complex from this period in Africa outside of the Nile region. All of the evidence points to the presence of a large-scale farming settlement—similar in size to Early Bronze Age Troy.

The team recovered unprecedented domesticated plant and animal remains, pottery and lithics, all dating to the Final Neolithic period. Excavation also revealed extensive evidence for deep storage pits.

Importantly, contemporaneous sites with similar pits have been found on the other side of the Strait of Gibraltar in Iberia, where finds of ivory and ostrich egg have long pointed to African connections. This suggests that the Maghreb was instrumental in wider western Mediterranean developments during the fourth millennium BC.

Oued Beht and the north-west Maghreb were clearly integral parts of the wider Mediterranean region. As such, these discoveries significantly change our understanding of the later prehistory of the Mediterranean and Africa.

As the authors of the Antiquity article state: “It is crucial to consider Oued Beht within a wider co-evolving and connective framework embracing peoples on both sides of the Mediterranean-Atlantic gateway during the later fourth and third millennia BC - and, for all the likelihood of movement in both directions, to recognise it as a distinctively African-based community that contributed substantially to the shaping of that social world.”

Multi-disciplinary archaeological survey at the site of Oued Beht, Morocco, reveals a previously unknown 3400–2900 BC farming society, shedding new light on North Africa’s role in Mediterranean prehistory. 

For over thirty years I have been convinced that Mediterranean archaeology has been missing something fundamental
Prof Cyprian Broodbank

‘Extinct’ snails found breeding in French Polynesia

Partula tohiveana snail in the wild

A global conservation effort to reintroduce a tiny snail to the wild is celebrating a momentous milestone: for the first time in 40 years, conservationists have found born-in-the-wild adult Partula tohiveana – meaning the precious molluscs have successfully established themselves in French Polynesia.

This year Cambridge’s Dr Justin Gerlach helped restore over 6,000 of the snails to Moorea, their French Polynesian island home, as part of an annual reintroduction of zoo-bred ‘Extinct in the Wild’ and ‘Critically Endangered’ snail species, carried out in collaboration with zoos around the world.

During their work the team found unmarked Partula tohiveana: proof that previously reintroduced snails have successfully bred in the area.

The momentous discovery means Partula tohiveana can now be considered established – an incredibly rewarding result after 40 years of dedication and collaboration. Conservationists will now begin the process of downlisting the snails from ‘Extinct-in-the-Wild’ to ‘Critically Endangered’ on the IUCN’s Red List.

Very few species have been successfully reintroduced to the wild after becoming completely extinct there, and this is the first invertebrate species for which it has been achieved.

Ten species and sub-species of the tropical snails, reared at London Zoo, Bristol Zoological Society, Detroit Zoological Society, Marwell Wildlife, the Royal Zoological Society of Scotland, Saint Louis Zoo, Sedgwick County Zoo, Woodland Park Zoo and Zoo Schwerin, travelled more than 15,000km to Tahiti at the beginning of September. Before making the two-day journey to the islands of Tahiti, Moorea and Huahine, the incredibly rare snails, which each measure a tiny 1-2cm in length, were individually counted and marked with a dot of red UV reflective paint. The ‘snail varnish’ glows under UV torchlight, helping conservationists in the field to spot and monitor the nocturnal snails at night, when they’re most active.

London Zoo’s Senior Curator of Invertebrates, Paul Pearce-Kelly, who leads the Partula conservation programme, said: “Though little, these snails have great cultural, scientific and conservation value. Partula snails have always been part of Polynesia’s rich cultural heritage and play an important role in the ecological health of their forest habitats. They’ve also been studied for over a century for the insights they give into how species evolve in isolated environments. Most recently, they’re providing a valuable conservation model for helping hundreds of endangered island species.”

He added: “This collaborative conservation effort is playing a crucial role in saving these species from extinction. It’s a powerful example of how conservation zoos can combat biodiversity loss. At a time when nature faces unprecedented challenges, these small snails are a symbol of hope for global wildlife.”

Partula snails - also known as Polynesian tree snails - eat decaying plant tissue and fungi, so play an important role in maintaining forest health. Returning these rare snails back to the wild helps to restore the ecological balance in these islands.

Dr Justin Gerlach of Peterhouse, University of Cambridge and an Academic Associate at the University's Museum of Zoology, said: “Discovering wild-born adult snails was a great moment. Very few animal species have been re-established back in the wild so this is a fantastic achievement for the programme – the fruit of a vast amount of work.”

Conservation zoos are working with the French Polynesian Government’s Direction de l’environnement, to save Partula snails from extinction. In the 1980s and early 1990s, these snails faced a critical threat after the invasive rosy wolf snail (Euglandina rosea) was introduced to control the African giant land snail (Lissachatina fulica). Unfortunately, the predatory species targeted the native snails instead, leading to the extinction or near-extinction of many Partula species across the region.

In the early 1990s, the last remaining individuals of several Partula species were rescued by London and Edinburgh Zoos, launching an international conservation breeding programme. This collaboration between 15 zoos cares for 15 species and subspecies, most of which are classified as ‘Extinct-in-the-Wild’. These rescued snails, along with those already being studied at universities in the UK and North America, became the foundation for reintroducing the species back onto their native island homes.

Paul said: “After decades of caring for these species in conservation zoos and working with the Direction de l’environnement to prepare the islands, we started reintroducing Partula snails back into their lowland tropical forests almost 10 years ago. Since then, we’ve reintroduced over 30,000 snails, including 10 Extinct-in-the-Wild species and subspecies, with this year’s release being the largest so far, thanks to our international team and collaborators, including mollusc specialist Dr Justin Gerlach of Peterhouse, University of Cambridge.”

London Zoo’s coordination of the Partula snail reintroduction project is made possible due to funding from supporters including the Players of the People’s Postcode Lottery, who have enabled London Zoo to continue bringing species back from the brink of extinction.

Adapted from a press release by the Zoological Society of London.

A species of tropical tree snail is no longer extinct in the wild following a successful reintroduction project.

Very few animal species have been re-established back in the wild so this is a fantastic achievement for the programme – the fruit of a vast amount of work.
Justin Gerlach
Born-in-the-wild unmarked Partula tohiveana snail observed in the wild, meaning the species is re-established (c) Paul Pearce-Kelly


Palestinian education ‘under attack’, leaving a generation close to losing hope, study warns

Boy sitting in the rubble of a destroyed UNRWA school in Nuseirat, Middle Areas, Gaza 2024

The ongoing war in Gaza will set children and young people’s education back by up to 5 years and risks creating a lost generation of permanently traumatised Palestinian youth, a new study warns.

The report, by a team of academics working in partnership with the United Nations Relief and Works Agency for Palestinian Refugees in the Near East (UNRWA), is the first to comprehensively quantify the war’s toll on learning since it began in October 2023. It also details the devastating impact on children, young people and teachers, supported by new accounts from frontline staff and aid workers.

The study was a joint undertaking involving researchers at the Faculty of Education, University of Cambridge and the Centre for Lebanese Studies, in partnership with UNRWA. It shows that Gaza’s children have already lost 14 months of education since 2019 due to COVID-19, earlier Israeli military operations, and the current war.

On this basis and using information such as global post-COVID-19 education recovery data, the researchers model several potential futures for Gaza’s younger generation, depending on when the war ends and how quickly the education system is restored.

The most optimistic prediction – assuming an immediate ceasefire and rapid international effort to rebuild the education system – is that students will lose 2 years of learning. If the fighting continues until 2026, the losses could stretch to 5 years. This does not account for the additional effects of trauma, hunger and forced displacement, all of which are deepening Gaza’s education crisis.

Without urgent, large-scale international support for education, the researchers suggest that there is a significant threat not just to students’ learning, but to their overall faith in the future and in concepts such as human rights. Despite this, the study shows that education has been deprioritised in international aid efforts in favour of other areas. “Education, simply put, is not seen as lifesaving,” the report warns.

Professor Pauline Rose, Director of the Research for Equitable Access and Learning (REAL) Centre, University of Cambridge, said: “Palestinian education is under attack in Gaza. Israeli military operations have had a significant effect on learning.”

“As well as planning for how we rebuild Gaza’s shattered education system, there is an urgent need to get educational support for children now. Education is a right for all young people. We have a collective responsibility to protect it.”

According to the United Nations Office for the Coordination of Humanitarian Affairs, more than 10,600 children and 400 teachers had been killed in Israeli military operations by August 2024, and more than 15,300 students and 2,400 teachers injured. Hundreds of thousands of young people have been displaced and are living in shelters.

Satellite images analysed by the Occupied Palestinian Territory Education Cluster have verified that over 90 per cent of schools have been damaged, many beyond repair. Since August, UNRWA has provided education in the shelters, reaching about 8,000 children, but the study warns that much more is needed to mitigate lost learning, which was already considerable following COVID-19.

The researchers calculate that 14 months of lost schooling so far have increased ‘learning poverty’ – the proportion of children unable to read a basic text by age 10 – by at least 20 percentage points. The true figure may be even higher, as the calculation does not account for the wider impacts of the war on children and teachers.

The study draws together information from a range of sources, with extensive input from the Education Cluster and its partners, who shared their data, challenges and progress. The report provides a comprehensive overview of these broader effects, highlighting the devastating psychological consequences for Palestinian children, who were already living ‘in constant fear and lack of hope’ after 17 years of blockade, according to a 2022 report by Save The Children.

Professor Maha Shuayb, Director of the Centre for Lebanese Studies, said: “Young people’s prospects in Gaza are being extinguished and our findings show that with it they are losing hope. Education is central to stabilising that spiral of decline. If it is simply erased, the consequences will be far-reaching.”

Save The Children has estimated that more than 10 children per day have lost limbs since the war began. The report warns of rising numbers of less visible disabilities, which will put further strain on an education system ill-equipped to support children with special needs.

The study suggests that continuous shock and suffering are now shaping children’s outlook and world views. Interviewees reported some children questioning values such as equality, human rights and tolerance when these are taught in the shelters. “This is a full generation of trauma,” one humanitarian aid official said; “it will take a generation to overcome it.”

The report highlights the immense suffering teachers and counsellors have endured physically and mentally. The killings, displacement and daily realities of life during war have taken a tremendous toll on their ability to engage meaningfully in education and will, it says, adversely affect reconstruction efforts.

Professor Yusuf Sayed, from the University of Cambridge, said: “It is important to recognise teachers and counsellors have, like the rest of the population, suffered immensely. There is evidence of extraordinary commitment from educators striving to maintain learning, but inevitably the deprivation, killings and hardship are affecting their ability to do so.”

Despite a flash appeal from the United Nations Office for the Coordination of Humanitarian Affairs (OCHA), the analysis shows that just 3.5 per cent of aid for Gaza has been invested in education. Major donors like the US and Germany have neglected education in their aid packages, and blockades continue to hinder the delivery of resources on the ground.

Without more funding and access to learning, structured play and other forms of support, the report warns, the long-term repercussions for Gaza’s next generation will only worsen.

It calls for immediate steps focusing on the resumption of education, including the provision of counselling, safe learning spaces, and support for students and educators with disabilities. It also calls for an immediate and permanent ceasefire and an end to occupation, in line with the International Court of Justice advisory opinion and the recently adopted UN resolution, as only then can Gaza’s education system be rebuilt. This will require a focus on recruiting more teachers and counsellors to cope with the scale of learning loss and trauma suffered by children and young people.

“Education is the only asset the Palestinian people have not been dispossessed of. They have proudly invested in the education of their children in the hope for a better future. Today, more than 625,000 deeply traumatised school-aged children are living in the rubble in Gaza. Bringing them back to learning should be our collective priority. Failing to do that will not only lead to a lost generation but also sow the seeds for more extremism, hatred and violence”, said Philippe Lazzarini, UNRWA Commissioner General.

The study also stresses that Palestinians themselves must lead the education recovery. “A ceasefire is the key for the success of any human development activity in Gaza, including education,” the authors write. “Children have seen that the international community will sit idly by as they are killed. This has left them with questions about values that schools and learning aim to instil around humanitarian principles that teachers will have to navigate.”

The full report, Palestinian Education Under Attack in Gaza: Restoration, Recovery, Rights and Responsibilities in and through Education, is now available online. 

Ongoing war in Gaza will set children and young people’s education back by up to 5 years, report suggests.


Energy inefficiency and inability to downsize pose even bigger threat to low-income pensioners than loss of Winter Fuel Payments, study suggests

Rooftops view from Totterdown to Bristol Centre

The study, published in Energy Research & Social Science, was completed by researchers from the University of Cambridge and Delft University of Technology (TU Delft) shortly before the vote on the Winter Fuel Payment was taken.

The researchers raise particular concerns about the impact of the policy on pensioners with annual incomes of £11,300–£15,000 for single pensioners and £17,300–£22,000 for couples.

Drawing on data from the English Housing Survey, which sampled nearly 12,000 households across all income groups, the study investigated how income, energy efficiency, home size, household type, and tenure status impacted on energy expenditure.

The researchers found that an increase of £1 per year in income (after housing costs, tax and welfare payments) was associated with a marginal increase in heating spending of about one-tenth of a penny.

The study also found that just a small energy efficiency improvement – a one-point increase in the SAP12 rating (the Government's Standard Assessment Procedure for Energy Rating of Dwellings) – had a major impact on households in energy poverty, offering an average reduction in heating costs of £21.59 per year.

Floor area also had an impact. For households facing energy poverty – the worst-affected group – each additional square metre of floor space was associated with an extra £5.04 per year in heating spend. This compares with £4.18 per year for high-income households, £3.65 per year for low-income households, and £2.99 per year for very low-income households not in energy poverty.

“When low-income households receive more income, they generally spend a little more to warm their homes. But these households often have to spread any extra money they have across other essential needs including food,” said lead author, Dr Ray Galvin, affiliated with Cambridge Institute for Sustainability Leadership (CISL).

“A reduction in income like the loss of the Winter Fuel Payment could force low-income pensioners to cut back not only on heating but also on other basic necessities. This poses a significant risk to people who are particularly vulnerable to the effects of living in cold homes.”

Energy efficiency

Across all household types, the researchers found that the energy efficiency of the dwelling had by far the biggest impact on heating expenditure.

“The most effective strategy to warm up the homes of people living in energy poverty is to increase the energy performance of their dwellings,” said Professor Minna Sunikka-Blank, from Cambridge’s Department of Architecture.

Specifically, the authors advise that the SAP12 rating of homes needs to be increased to at least 72.

Each one-point increase in the SAP12 energy efficiency rating corresponds to a reduction in heating costs of around £20 per year. For households in energy poverty, whose average SAP12 rating is 59.48, raising the rating to 71.45 – the average for low-income households not in energy poverty – could therefore reduce heating costs by about £240 per year.
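
The arithmetic behind that £240 figure can be reproduced directly from the coefficients quoted above. The snippet below is a back-of-envelope illustration only, not the paper's regression model, which estimates these effects jointly with income, floor area, household type and tenure.

```python
# Back-of-envelope illustration using the headline figures quoted above.
# Not the paper's model: the study estimates this effect jointly with income,
# floor area, household type and tenure.
SAVING_PER_SAP_POINT = 20.0     # approx. £ saved per year for each one-point SAP12 increase

rating_energy_poverty = 59.48   # average SAP12 rating, households in energy poverty
rating_not_in_poverty = 71.45   # average SAP12 rating, low-income households not in energy poverty

annual_saving = (rating_not_in_poverty - rating_energy_poverty) * SAVING_PER_SAP_POINT
print(f"Estimated reduction in heating costs: about £{annual_saving:.0f} per year")
# prints roughly £239 per year, consistent with the 'about £240' quoted above
```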

The authors make clear that the energy savings alone would not be sufficient to pay for these energy-efficiency upgrades, which would therefore require targeted financial support.

Dr Galvin said: “Government and society may well find that these costs are paid back to the country through co-benefits of fewer days off work, longer lives and less strain on the health service.”

While this would not improve the finances of households in energy poverty as much as direct monetary allowances such as the Winter Fuel Payment, it would, the authors argue, have a substantial, direct impact on cold, unhealthy homes.

Dr Galvin said: “There needs to be extra focus on developing policies for the long-term solution of retrofitting energy-inefficient homes. This can provide enduring reductions in energy bills while also improving thermal comfort. This approach may also align better with the goal of reducing carbon emissions and tackling climate change.”

Prof Minna Sunikka-Blank said: “Without retrofit initiatives, energy poverty will persist in the UK, because in low-income households immediate needs often take precedence over thermal comfort, even when incomes increase.”

Home size

The study found that households in energy poverty have a 7.3% larger average floor area than low-income households not in energy poverty, and that floor area makes a substantial difference to heating energy costs (about half to two-thirds the impact that the SAP12 energy efficiency rating has).

Tijn Croon, from TU Delft, said: “These findings suggest that inability to downsize may be a significant driver of energy poverty in the United Kingdom. Low-income households can save money and stay warmer by living in smaller homes, but downsizing is not always easy for older households whose dependants have left home and who find themselves with a large, older home that is very expensive to keep warm.”

One-person households spend less on heating

A surprising finding in the study is that across all income groups, one-person households tend to spend less on energy. For households in energy poverty the reduction (£36.77 per year) was more than twice as large as for low-income households not in energy poverty (£15.65 per year).

Tijn Croon, from TU Delft, said: “This might suggest that many one-person households are able to control their energy consumption more strategically than a multi-person household can.”

Dr Galvin said: “Our study controlled for other factors so this is not just a case of low-income households living in smaller homes. Further research could survey one-person households to find out if they have skills and practices that could be transferred to multi-person households.”

Mitigating impact of Winter Fuel Payment cuts

The authors suggest several potential solutions. Expanding Pension Credit eligibility to align with the government's low-income threshold would be the most comprehensive fix, though this may be financially unfeasible. Alternative measures could include a temporary application process for the Winter Fuel Payment for those just above the Pension Credit threshold or providing tax credits or rebates for low-income pensioners, which could be more easily managed since HMRC already holds income data.

While much attention has been given to the risk of energy poverty among pensioners, the authors also note that families with children and young adults are often equally vulnerable and may face even greater challenges in the housing market compared to pensioners.

The authors are currently working on a follow-up research paper that will explore the recent reforms to the government’s Warm Home Discount scheme.

Reference

R. Galvin, M. Sunikka-Blank, T. Croon, ‘Juggling the Basics: How Much Does an Income Increase Affect Energy Spending of Low-Income Households in England?’, Energy Research & Social Science (2024). DOI: 10.1016/j.erss.2024.103766

The UK Government’s policy to scrap Winter Fuel Payments could disproportionately affect low-income pensioners in England, new analysis suggests. But the same study argues that the energy inefficiency of homes and challenges involved in downsizing will have an even more harmful effect this winter.

Without retrofit initiatives, energy poverty will persist in the UK
Minna Sunikka-Blank
Rooftops view from Totterdown to Bristol Centre


Removing pint glasses could reduce beer sales by almost 10%

Barman handing a customer a pint of beer

Alcohol consumption is the fifth largest contributor to premature death and disease worldwide. In 2016 it was estimated to have caused approximately 3 million deaths worldwide.

Professor Dame Theresa Marteau and colleagues at the Behaviour and Health Research Unit have shown previously that serving wine in smaller glasses is associated with a decrease in sales.

To see if this effect was seen with other alcoholic drinks, they approached venues in England and asked them to remove the pint serving size and instead offer two-thirds of a pint as the largest option for four weeks, with four-week non-intervention periods before and after as a comparison.

In a study published in PLOS Medicine, the team found that removing the pint reduced the daily mean volume of beer, lager and cider sold by 9.7%, although there was a slight increase in the amount of wine purchased, with one pub accounting for half of the increase in wine sales. They report that although customers did not complain, fewer than 1% of the venues approached agreed to participate, and the intervention involved only 12 establishments.
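For readers unfamiliar with an A-B-A reversal design, sales in the intervention period are compared against the non-intervention periods either side of it. The sketch below illustrates that comparison with invented sales figures; the study's own data and exact analysis are in the paper:

```python
# Illustrative A-B-A comparison with made-up numbers (not the study's data):
# mean daily volume sold (litres) in the periods before, during and after
# the removal of the pint serving size.
before, during, after = 105.0, 92.0, 101.0   # hypothetical litres per day

baseline = (before + after) / 2              # average of the two 'A' periods
percent_change = 100 * (during - baseline) / baseline
print(f"Change in mean daily volume: {percent_change:.1f}%")
# The published trial reports a 9.7% reduction across 12 venues.
```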

Professor Marteau said: “Alcohol harms our health, increasing the risk of injury and many diseases including heart disease, bowel, breast and liver cancers. While we may all enjoy a drink, the less we drink the better our health.

“As we’ve shown is the case with wine, removing the largest serving size for beer, lager and cider – in this case, the pint – could encourage people to drink less. This could be beneficial both to the nation’s health and the health of individuals.”

Further assessment is needed, particularly into whether people fully compensated for reduced beer consumption by drinking other alcoholic drinks, but the intervention merits consideration for inclusion in alcohol control policies. Smaller serving sizes could contribute towards reducing alcohol consumption across populations and thereby decrease the risk of seven cancers and other diseases.

Reference
Mantzari, E et al. Impact on beer sales of removing the pint serving size: An A-B-A reversal trial in pubs, bars, and restaurants in England. PLOS Medicine; 17 Sept 2024; DOI: 10.1371/journal.pmed.1004442

Adapted from a press release by PLOS Medicine

Cambridge researchers have shown that reducing the serving size for beer, lager and cider reduces the volume of those drinks consumed in pubs, bars and restaurants, which could have wider public health benefits.

While we may all enjoy a drink, the less we drink the better our health
Theresa Marteau
Barman handing a customer a pint of beer


Monoclonal antibodies offer hope for tackling antimicrobial resistance

A Petri dish with a culture of the Superbug Acinetobacter baumannii next to antibiotics

A team led by researchers at the University of Cambridge has developed a monoclonal antibody drug, using a technique involving genetically engineered mice, that may help prevent infection with Acinetobacter baumannii, a bacterium associated with hospital-acquired infections that is particularly common in Asia.

A. baumannii bacteria can cause life-threatening respiratory illness and sepsis in vulnerable individuals, particularly in newborn babies whose immune systems have not fully developed. It is usually spread through contaminated surfaces, medical equipment and via contact with others. In recent years infections with strains of this bacteria that are resistant to almost every antibiotic available have become common.

Professor Stephen Baker from the Cambridge Institute of Therapeutic Immunology and Infectious Disease at the University of Cambridge said: “A. baumannii is good at sticking to medical equipment, and if people are vulnerable or don't have a particularly well-developed immune system, they can succumb to this infection and get aggressive pneumonia requiring ventilation – and in many cases, the patients can acquire the infection from the ventilation itself.

“The bacteria are naturally resistant to many antimicrobials, but as they’re now found in hospitals, they’ve acquired resistance to almost everything we can use. In some hospitals in Asia, where the infections are most common, there isn't a single antibiotic that will work against them. They’ve become impossible to treat.”

In a study published today in Nature Communications, the team produced monoclonal antibodies using transgenic mice – mice that have been genetically engineered to have a human-like immune system, producing human antibodies instead of mouse antibodies. They went on to show that these monoclonal antibodies were able to prevent infection with A. baumannii derived from clinical samples.

Monoclonal antibodies are a growing area of medicine, commonly used to treat conditions including cancer (for example, Herceptin for treating some breast cancers) and autoimmune disease (for example, Humira for treating rheumatoid arthritis, psoriasis, Crohn's disease, and ulcerative colitis).

Usually, monoclonal antibodies are developed from the antibodies of patients who have recovered from an infection, or they are designed to recognise and target a particular antigen. For example, monoclonal antibodies targeting the ‘spike protein’ of the SARS-CoV-2 coronavirus were explored as a way of treating COVID-19.

In the approach taken by the Cambridge team, however, transgenic mice were exposed to the outer membrane of A. baumannii bacteria, triggering an immune response. The researchers then isolated almost 300 different antibodies and tested which of these was the most effective at recognising live bacteria, identifying the single monoclonal antibody mAb1416 as the best.

Professor Baker said: “Using this method, we don't infect the mice with the live bacteria, but we instead immunise them using multiple different elements and let the mouse’s immune system work out which ones to develop antibodies against. Because these mice have ‘humanised’ immune systems, we wouldn’t then need to reengineer the antibodies to work in humans.”

The team treated mice with mAb1416, and 24 hours later exposed them to A. baumannii isolated from a child with sepsis in an intensive care unit. They found that those mice treated with the drug saw a significant reduction in bacterial load in their lungs a further 24 hours later, compared to mice that were not treated.

All of the isolates used to produce and test the monoclonal antibodies were from patients in Ho Chi Minh City, Vietnam, but the isolate used to test mAb1416 was taken from a patient ten years later than the other isolates. This is important because it shows that mAb1416 was protective against A. baumannii bacteria that may have evolved over time.

Professor Baker said: “Using this technique, you can take any bacterial antigen or cocktail of antigens, rather than waiting for somebody that's recovered from a particular infection – who you assume has developed an appropriate antibody response – give it to the mice and extract the antibodies you think are the most important.”

More work is now needed to understand the mechanism by which mAb1416 protects against infection, as this could allow the team to develop an even more effective treatment. Any potential new drug will then need to be tested in safety trials in animals before being trialled in patients.

Professor Baker added: “We know that monoclonal antibodies are safe and that they work, and the technology exists to produce them – what we have done is identify how to hit bacteria with them. Apart from the cost effectiveness, there's no reason why this couldn’t become a medicine within a few years. Given the emergency presented by antimicrobial resistance, this could become a powerful new weapon to fight back.”

The research was funded by the Bill & Melinda Gates Foundation, the UK Medical Research Council Newton Fund, the Viet Nam Ministry of Science and Technology, and Wellcome.

Professor Baker is a fellow at Wolfson College, Cambridge.

Reference
Baker, S, Krishna, A & Higham, S. Exploiting human immune repertoire transgenic mice to identify protective monoclonal antibodies against an extensively antimicrobial resistant nosocomial bacterial pathogen. Nature Communications; 12 Sept 2024; DOI: 10.1038/s41467-024-52357-8

Monoclonal antibodies – treatments developed by cloning a cell that makes an antibody – could help provide an answer to the growing problem of antimicrobial resistance, say scientists.

We know that monoclonal antibodies are safe and that they work, and the technology exists to produce them – what we have done is identify how to hit bacteria with them
Stephen Baker
A Petri dish with a culture of the Superbug Acinetobacter baumannii next to antibiotics


Astronomers detect black hole ‘starving’ its host galaxy to death

Pablo's Galaxy

The international team, co-led by the University of Cambridge, used Webb to observe a galaxy roughly the size of the Milky Way in the early universe, about two billion years after the Big Bang. Like most large galaxies, it has a supermassive black hole at its centre. However, this galaxy is essentially ‘dead’: it has mostly stopped forming new stars.

“Based on earlier observations, we knew this galaxy was in a quenched state: it’s not forming many stars given its size, and we expect there is a link between the black hole and the end of star formation,” said co-lead author Dr Francesco D’Eugenio from Cambridge’s Kavli Institute for Cosmology. “However, until Webb, we haven’t been able to study this galaxy in enough detail to confirm that link, and we haven’t known whether this quenched state is temporary or permanent.”

This galaxy, officially named GS-10578 but nicknamed ‘Pablo’s Galaxy’ after the colleague who decided to observe it in detail, is massive for such an early period in the universe: its total mass is about 200 billion times the mass of our Sun, and most of its stars formed between 12.5 and 11.5 billion years ago.

“In the early universe, most galaxies are forming lots of stars, so it’s interesting to see such a massive dead galaxy at this period in time,” said co-author Professor Roberto Maiolino, also from the Kavli Institute for Cosmology. “If it had enough time to get to this massive size, whatever process that stopped star formation likely happened relatively quickly.”

Using Webb, the researchers detected that this galaxy is expelling large amounts of gas at speeds of about 1,000 kilometres per second, which is fast enough to escape the galaxy’s gravitational pull. These fast-moving winds are being ‘pushed’ out of the galaxy by the black hole.
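As a rough plausibility check (not a calculation from the paper), a wind speed of about 1,000 kilometres per second can be compared with the escape velocity of a galaxy of this mass, treating it as a point mass at an assumed radius of a few kiloparsecs:

```python
# Back-of-the-envelope escape velocity for a ~2e11 solar-mass galaxy,
# treated as a point mass. The radius is an illustrative assumption,
# not a value taken from the paper.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
KPC = 3.086e19       # kiloparsec, m

mass = 2e11 * M_SUN  # total mass quoted in the article
radius = 3 * KPC     # assumed radius for the estimate

v_escape = math.sqrt(2 * G * mass / radius)   # m/s
print(f"Escape velocity: {v_escape / 1e3:.0f} km/s")
# -> of order several hundred km/s, so winds at ~1,000 km/s can escape
```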

Like other galaxies with accreting black holes, ‘Pablo’s Galaxy’ has fast outflowing winds of hot gas, but these gas clouds are tenuous and have little mass. Webb detected the presence of a new wind component, which could not be seen with earlier telescopes. This gas is colder, which means it’s denser and – crucially – does not emit any light. Webb, with its superior sensitivity, can see these dark gas clouds because they block some of the light from the galaxy behind them.

The mass of gas being ejected from the galaxy is greater than what the galaxy would require to keep forming new stars. In essence, the black hole is starving the galaxy to death. The results are reported in the journal Nature Astronomy.

“We found the culprit,” said D’Eugenio. “The black hole is killing this galaxy and keeping it dormant, by cutting off the source of ‘food’ the galaxy needs to form new stars.”

Although earlier theoretical models had predicted that black holes had this effect on galaxies, before Webb, it had not been possible to detect this effect directly.

Earlier models had predicted that the end of star formation has a violent, turbulent effect on galaxies, destroying their shape in the process. But the stars in this disc-shaped galaxy are still moving in an orderly way, suggesting that this is not always the case.

“We knew that black holes have a massive impact on galaxies, and perhaps it’s common that they stop star formation, but until Webb, we weren’t able to directly confirm this,” said Maiolino. “It’s yet another way that Webb is such a giant leap forward in terms of our ability to study the early universe and how it evolved.”

New observations with the Atacama Large Millimeter/submillimeter Array (ALMA), targeting the coldest, darkest gas components of the galaxy, will tell us more about whether and where any fuel for star formation is still hidden in this galaxy, and what effect the supermassive black hole has on the region surrounding the galaxy.

The research was supported in part by the Royal Society, the European Union, the European Research Council, and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

Reference:
Francesco D’Eugenio, Pablo G. Pérez-González et al. ‘A fast-rotator post-starburst galaxy quenched by supermassive black-hole feedback at z=3.’ Nature Astronomy (2024). DOI: 10.1038/s41550-024-02345-1

Astronomers have used the NASA/ESA James Webb Space Telescope to confirm that supermassive black holes can starve their host galaxies of the fuel they need to form new stars.

'Pablo's Galaxy'


Flowers compete for pollinators with adjustable ‘paint by numbers’ petal design

The study, by researchers at the University of Cambridge’s Sainsbury Laboratory, also found that bees prefer larger bullseyes over small ones, and fly 25% faster between artificial flower discs with larger bullseyes – potentially boosting efficiency for both bees and blossoms.

Using the hibiscus plant as a model, the researchers selected plants with three differently sized bullseye patterns – H. richardsonii (small bullseye covering 4% of flower disc), H. trionum (medium bullseye covering 16%) and a transgenic line (mutation) of H. trionum (large bullseye covering 36%).

They found that petal pre-patterning is established very early in the flower’s formation long before the petal shows any visible colour. The petal acts like a 'paint-by-numbers' canvas, where different regions are predetermined to develop specific colours and textures long before they start looking different from one another.

The research also shows plants can precisely control and modify the shape and size of these patterns using multiple mechanisms, highlighting the evolutionary importance of the process. By fine-tuning these designs, plants may gain a competitive advantage in the contest to attract pollinators.

The findings are published today in the journal Science Advances.

Dr Edwige Moyroud, who leads a research team studying the mechanisms underlying pattern formation in petals, explained: “If a trait can be produced by different methods, it gives evolution more options to modify it and create diversity, similar to an artist with a large palette or a builder with an extensive set of tools. By studying how bullseye patterns change, what we are really trying to understand is how nature generates biodiversity.”

In the small hibiscus species Hibiscus trionum, a striking bullseye pattern is formed on the petals, featuring a dark purple centre surrounded by white.

Lead author Dr Lucie Riglet investigated the mechanism behind hibiscus petal patterning by analysing petal development in three hibiscus flowers that had the same total size but different bullseye sizes – small, medium and large.

She found that the pre-pattern begins as a small, crescent-shaped region long before the bullseye is visible on tiny petals less than 0.2mm in size.

Dr Riglet said: “At the earliest stage we could dissect, the petals have around 700 cells and are still greenish in colour, with no visible purple pigment and no difference in cell shape or size. When the petal further develops to 4000 cells, it still does not have any visible pigment, but we identified a specific region where the cells were larger than their surrounding neighbours. This was the pre-pattern.”

A computational model developed by Dr Argyris Zardilis provided further insights, and combining both computational models and experimental results, the researchers showed that hibiscus can vary bullseye dimensions very early during the pre-patterning phase or modulate growth in either region of the bullseye, by adjusting cell expansion or division, later in development.

Dr Riglet then compared the relative success of the bullseye patterns in attracting pollinators using artificial flower discs that mimicked the three different bullseye dimensions.

Dr Riglet explained: “The bees not only preferred the medium and larger bullseyes over the small bullseye, they were also 25% quicker visiting these larger flower discs. Foraging requires a lot of energy and so if a bee can visit four flowers rather than three flowers in the same time, then this is beneficial for the bee, and also the flower.”
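The two numbers in the quote are consistent: if each visit takes about a quarter less time, roughly a third more flowers fit into the same foraging window. A one-line illustration, with the starting figures assumed for simplicity:

```python
# If each flower visit takes 25% less time, the number of visits that fit
# into a fixed foraging window rises by about a third (3 -> 4 flowers).
visits_before = 3
time_saving = 0.25
visits_after = visits_before / (1 - time_saving)
print(visits_after)   # 4.0
```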

The findings suggest that these pre-patterning strategies could have deep evolutionary roots, potentially influencing the diversity of flower patterns across different species. The next step for the research team is to identify the signals responsible for generating these early patterns and to explore whether similar pre-patterning mechanisms are used in other plant organs, such as leaves.

This research not only advances our understanding of plant biology but also highlights the intricate connections between plants and their environments, showing how precise natural designs can play a pivotal role in the survival and evolution of species.

For example, H. richardsonii, which has the smallest bullseye of the three hibiscus plants studied in this research, is a critically endangered plant native to New Zealand. H. trionum is also found in New Zealand, but not considered to be native, and is widely distributed across Australia and Europe and has become a weedy naturalised plant in North America. Additional research is needed to determine whether the larger bullseye size helps H. trionum attract more pollinators and enhance its reproductive success.

Patterns on the flowers of plants guide insects, like bees, to the centre of the flower, where nectar and pollen await, enhancing the plant's chances of successful pollination. Despite their importance, surprisingly little is known about how these petal patterns form and how they have evolved into the vast diversity we see today, including spots, stripes, veins, and bullseyes.

These findings pave the way for further research into how petal patterns influence the survival and evolution of flowering plant species.

Reference

Riglet, L. et al: 'Hibiscus bullseyes reveal mechanisms controlling petal pattern proportions that influence plant-pollinator interactions'. September 2024, Science Advances. DOI: 10.1126/sciadv.adp5574

Flowers like hibiscus follow an invisible blueprint established very early in petal formation that precisely dictates the size of their central bullseye – a crucial pattern that can significantly impact their ability to attract pollinating bees.

We identified a specific region where the cells were larger than their surrounding neighbours - this was the pre-pattern.
Lucie Riglet


Flowers use adjustable ‘paint by numbers’ petal designs to attract pollinators

Artificial flower discs designed to mimic the bullseye sizes of the three hibiscus flowers

The study, by researchers at the University of Cambridge’s Sainsbury Laboratory also found that bees prefer larger bullseyes over smaller ones and fly 25% faster between artificial flower discs with larger bullseyes – potentially boosting efficiency for both bees and blossoms. 

Patterns on the flowers of plants guide insects, like bees, to the centre of the flower, where nectar and pollen await, enhancing the plant's chances of successful pollination. Despite their importance, surprisingly little is known about how these petal patterns form and how they have evolved into the vast diversity we see today, including spots, stripes, veins, and bullseyes. 

Using a small hibiscus plant as a model, researchers compared closely related plants with the same flower size but three differently sized bullseye patterns featuring a dark purple centre surrounded by white – H. richardsonii (small bullseye covering 4% of the flower disc), H. trionum (medium bullseye covering 16%) and a transgenic line (mutation) of H. trionum (large bullseye covering 36%). 

They found that a pre-pattern is set up on the petal surface very early in the flower’s formation long before the petal shows any visible colour. The petal acts like a 'paint-by-numbers' canvas, where different regions are predetermined to develop specific colours and textures long before they start looking different from one another. 

The research also shows plants can precisely control and modify the shape and size of these patterns using multiple mechanisms, with possible implications for plant evolution. By fine-tuning these designs, plants may gain a competitive advantage in the contest to attract pollinators or maybe start attracting different species of insects. 

These findings are published in Science Advances.

Dr Edwige Moyroud, who leads a research team studying the mechanisms underlying pattern formation in petals, explained: “If a trait can be produced by different methods, it gives evolution more options to modify it and create diversity, similar to an artist with a large palette or a builder with an extensive set of tools. By studying how bullseye patterns change, what we are really trying to understand is how nature generates biodiversity.” 

Lead author Dr Lucie Riglet investigated the mechanism behind hibiscus petal patterning by analysing petal development in the three hibiscus flowers that had the same total size but different bullseye patterns. 

She found that the pre-pattern begins as a small, crescent-shaped region long before the bullseye is visible on tiny petals less than 0.2mm in size. 

Dr Riglet said: “At the earliest stage we could dissect, the petals have around 700 cells and are still greenish in colour, with no visible purple pigment and no difference in cell shape or size. When the petal further develops to 4000 cells, it still does not have any visible pigment, but we identified a specific region where the cells were larger than their surrounding neighbours. This is the pre-pattern.” 

These cells are important because they mark the position of the bullseye boundary, the line on the petal where the colour changes from purple to white – without a boundary there is no bullseye! 

A computational model developed by Dr Argyris Zardilis provided further insights, and combining both computational models and experimental results, the researchers showed that hibiscus can vary bullseye dimensions very early during the pre-patterning phase or modulate growth in either region of the bullseye, by adjusting cell expansion or division, later in development. 

Dr Riglet then compared the relative success of the bullseye patterns in attracting pollinators using artificial flower discs that mimicked the three different bullseye dimensions. Dr Riglet explained: “The bees not only preferred the medium and larger bullseyes over the small bullseye, they were also 25% quicker visiting these larger flower discs. Foraging requires a lot of energy and so if a bee can visit 4 flowers rather than 3 flowers in the same time, then this is probably beneficial for the bee, and also the plants.” 

The researchers think that these pre-patterning strategies could have deep evolutionary roots, potentially influencing the diversity of flower patterns across different species. The next step for the research team is to identify the signals responsible for generating these early patterns and to explore whether similar pre-patterning mechanisms are used in other plant organs, such as leaves. 

This research not only advances our understanding of plant biology but also highlights the intricate connections between plants and their environments, showing how precise natural designs can play a pivotal role in the survival and evolution of species. 

For example, H. richardsonii, which has the smallest bullseye of the three hibiscus plants studied in this research, is a critically endangered plant native to New Zealand. H. trionum is also found in New Zealand, but not considered to be native, and is widely distributed across Australia and Europe and has become a weedy naturalised plant in North America. Additional research is needed to determine whether the larger bullseye size helps H. trionum attract more pollinators and enhance its reproductive success. 

Reference 
Lucie Riglet, Argyris Zardilis, Alice L M Fairnie, May T Yeo, Henrik Jönsson and Edwige Moyroud (2024) Hibiscus bullseyes reveal mechanisms controlling petal pattern proportions that influence plant-pollinator interactions. Science Advances. DOI: 10.1126/sciadv.adp5574 

Flowers like hibiscus use an invisible blueprint established very early in petal formation that dictates the size of their bullseyes – a crucial pre-pattern that can significantly impact their ability to attract pollinating bees.  

If a trait can be produced by different methods, it gives evolution more options to modify it and create diversity, similar to an artist with a large palette or a builder with an extensive set of tools
Edwige Moyroud
Artificial flower discs designed to mimic the bullseye sizes of the three hibiscus flowers


Cancer researchers and astronomers join forces in fight against disease

Lung with two metastatic lesions derived from a mouse primary triple-negative breast tumour

The technology from Cancer Grand Challenges team IMAXT uses advanced spatial biology techniques – some of them based on technology originally developed to map the Milky Way and discover new planets – to analyse tumours. Now, other scientists will be able to access these technologies to create detailed tumour maps that could one day transform how we diagnose and treat cancer.

Led by Professor Greg Hannon and Dr Dario Bressan at the Cancer Research UK Cambridge Institute and Dr Nicholas Walton at the University of Cambridge’s Institute of Astronomy, SPACE will give other researchers the opportunity to study cancer in a way that wasn’t previously possible.

Dr Dario Bressan, Head of the SPACE Laboratory at the Cancer Research UK Cambridge Institute, said: “Tumours aren’t just a uniform mass of cells; they consist of a diverse ecosystem of cancer cells, immune cells, and other essential components that support their survival. Hidden within these intricate networks lies valuable information which could guide us in making more personalised treatment decisions for each patient. 

“With the SPACE platform, researchers can zoom into specific cell populations, highlight the complex connections between them, and even run virtual experiments to predict how the tumour might respond to different treatments. By unlocking these insights, we can transform the future of cancer care and uncover new opportunities for targeted therapies.” 

The IMAXT team was first awarded £20 million in 2017 by Cancer Research UK through Cancer Grand Challenges, a global research initiative co-founded by Cancer Research UK and the National Cancer Institute in the US.

Since then, the team has united experts from fields rarely brought together including medicine, Virtual Reality (VR), programming, molecular biology, chemistry, mathematics, and even astronomy, to create a completely immersive tool for studying tumours.

As well as enabling scientists to analyse 3D tumour maps, IMAXT created pioneering VR technology which allows the user to ‘step inside’ a tumour using a VR headset.

With the headset, scientists get to view vast amounts of detailed data about individual tumour cells in a 3D space. Instead of looking at this data on a computer screen, they can see all the information in real-time, as if they were inside the tumour itself.

Professor Greg Hannon, Director of the Cancer Research UK Cambridge Institute, said: "Cancer Grand Challenges offers a unique opportunity for international teams to address some of cancer’s biggest challenges. When we took on our particular challenge, much of what we proposed was science fiction.

“Over the past 7 years, our team has turned those early hopes and ideas into approaches that can now be made broadly available. In nature, biology unfolds in three dimensions, and we now finally have the tools to observe it that way—giving us a much deeper, more accurate view of cancer. We’re thrilled to share these breakthroughs with the broader cancer research community."

Director of Cancer Grand Challenges at Cancer Research UK, Dr David Scott, said: “IMAXT is changing what’s possible when it comes to cancer research.  

“We can glean important insights about a tumour by analysing its genetic makeup or its proteins, but no technology alone can give us the depth of understanding needed to truly understand this complex disease.  

“By combining state-of-the-art technology and vast expertise, IMAXT will change how cancers are classified, treated and managed, giving more people a better chance of surviving their disease.” 

The funding will support the SPACE hub laboratory, hosted at the CRUK Cambridge Institute, and the SPACE analysis and computing platform, developed and operated at the Institute of Astronomy, University of Cambridge. Together, SPACE combines most available technologies for the spatial molecular profiling of tumours. The continued collaboration between the cancer and astronomy teams from the IMAXT project will maintain and develop all critical aspects of the platform – from technical and scientific expertise to instrumentation, computing and data analysis – allowing SPACE to remain at the forefront of research in the rapidly emerging spatial-omics field and to serve as a centre of excellence supporting new research across the Cancer Grand Challenges and wider cancer research communities.

SPACE is funded by Cancer Research UK through Cancer Grand Challenges. Additional support for the SPACE project has been provided by the UK Space Agency through their funding of the development of imaging and analysis techniques at the IoA, Cambridge for a range of space science missions. These have been successfully applied to spatial imaging data through IMAXT and are ready for wider use in SPACE.

Dr Paul Bate, Chief Executive Officer at UK Space Agency said: “Space is powering our daily lives, from satellite navigation to weather forecasts and climate monitoring. This collaboration between the cancer and astronomy teams in the IMAXT project is another real-world example of how space science and technology is bringing benefits to people here on Earth. 

“Thanks to this partnership, the same science and technology that mapped the Milky Way may soon have a positive impact on people battling cancer, and could support doctors to provide better, faster treatment." 

Going forward, a next-generation version of the VR technology will be further developed and commercialised by Suil Vision, a start-up company recently launched by IMAXT team members and Cancer Research UK’s innovation arm, Cancer Research Horizons. Suil Vision is the first start-up to emerge from the Cancer Grand Challenges programme. With a £500,000 investment from the Cancer Research Horizons Seed Fund, Suil Vision will create a market-ready version of their suite of VR technologies for analysing multiple types of biological data, rolling these out across research institutions and companies worldwide. 

Find out how Cambridge is changing the story of cancer

Image

Top: Lung with two metastatic lesions derived from a mouse primary triple-negative breast tumour. The figure shows how registering the different imaging modalities at a cellular level allows individual cells to be segmented and tumour cell populations to be identified, while also distinguishing hypoxic areas, increased fibrosis, infiltration of immune cells, and blood and lymphatic vessels, by staining with a panel of 35 cell markers at the same time. Bottom: A 3D sample depicting a tumour grown in the mammary gland of a mouse, showing the power of the SPACE pipeline to produce and visualise large volumes (typically ~100,000 individual images registered and stitched, and up to 500 GB of data). The orange fluorescent beads are clearly visible in the medium outside the biological tissue and prove crucial for all stages of multimodal registration.

Adapted from a press release by Cancer Research UK

A unique collaboration of astronomers and cancer researchers at Cambridge has been awarded more than £5m to establish the Spatial Profiling and Annotation Centre of Excellence (SPACE) to open up access to their groundbreaking cancer mapping technology and establish collaborations with other scientists to enable them to investigate tumours in 3D.

When we took on our particular challenge, much of what we proposed was science fiction
Greg Hannon
Lung with two metastatic lesions derived from a mouse primary triple-negative breast tumour


Housing Minister visits Eddington – Cambridge’s vibrant and sustainable new neighbourhood

From left, Matt Johnson, Head of Development at North West Cambridge; Vice-Chancellor Professor Deborah Prentice; Matthew Pennycook MP, Minister for Housing and Planning, and  Peter Freeman, Chair of Homes England.

Matthew Pennycook MP, who was accompanied by Peter Freeman, Chair of Homes England, joined the University’s Vice-Chancellor, Professor Deborah Prentice, the Pro-Vice-Chancellor for Innovation, Dr Diarmuid O’Brien, and members of the University Estates Division for a tour of the city’s new community.

During the fact-finding visit, Mr Pennycook heard about the innovative design and planning that has gone into Eddington, which is delivering homes, community facilities, and green space, and at the same time creating a vibrant and sustainable place to live.

Central to its planning has been the provision of affordable housing for key worker staff at the University, which will account for 50% of Eddington’s homes. By housing University staff in a purpose-built, high-quality neighbourhood, while also adding more homes to the open market, Eddington aims to relieve housing pressure on the city and support the highly successful Cambridge eco-system which provides long-term growth and jobs for the wider area and beyond.

The Minister also heard how Eddington and the nearby Cambridge West Innovation District are critical parts of the University’s plans for sustained economic growth in Cambridge. The University is driving forward the Cambridge West Innovation District and the Eddington housing development together, as part of a coherent vision for the future of the city.

Professor Deborah Prentice, Vice-Chancellor, said: “It was a pleasure to welcome the Housing and Planning Minister, and the Chair of Homes England, and to show them around our exciting new development. They were interested to hear how the neighbourhood has been designed with sustainability at its heart, and how it helps support Cambridge as a world-leading hub of innovation, benefiting the community and providing growth for the national economy.”

Mr Pennycook later visited Cambridge Biomedical Campus where he was joined by Lord Vallance, Minister for Science, in hosting a roundtable on infrastructure and growth in Cambridge.

Proposals for the future phases of the Eddington development will be discussed at the first round of public consultations this month. For dates and venue details visit the Eddington website.

The new Minister for Housing and Planning visited the University’s Eddington development to learn more about the new neighbourhood as an example of a high-quality, sustainable housing project that supports local economic growth.

From left, Matt Johnson, Head of Development at North West Cambridge; Vice-Chancellor Professor Deborah Prentice; Matthew Pennycook MP, Minister for Housing and Planning, and Peter Freeman, Chair of Homes England.


‘Smart choker’ uses AI to help people with speech impairment to communicate

Smart Choker

The smart choker, developed by researchers at the University of Cambridge, incorporates electronic sensors in a soft, stretchable fabric, and is comfortable to wear. The device could be useful for people who have temporary or permanent speech impairments, whether due to laryngeal surgery, or conditions such as Parkinson’s, stroke or cerebral palsy.

By incorporating machine learning techniques, the smart choker can also successfully recognise differences in pronunciation, accent and vocabulary between users, reducing the amount of training required.

The choker is a type of technology known as a silent speech interface, which analyses non-vocal signals to decode speech in silent conditions – the user only needs to mouth the words in order for them to be captured. The captured speech signals can then be transferred to a computer or speaker to facilitate conversation.

Tests of the smart choker showed it could recognise words with over 95% accuracy, while using 90% less computational energy than existing state-of-the art technologies. The results are reported in the journal npj Flexible Electronics.

“Current solutions for people with speech impairments often fail to capture words and require a lot of training,” said Dr Luigi Occhipinti from the Cambridge Graphene Centre, who led the research. “They are also rigid, bulky and sometimes require invasive surgery to the throat.”

The smart choker developed by Occhipinti and his colleagues outperforms current technologies on accuracy, requires less computing power, is comfortable for users to wear, and can be removed whenever it’s not needed. The choker is made from a sustainable bamboo-based textile, with strain sensors based on graphene ink incorporated in the fabric. When the sensors detect any strain, tiny, controllable cracks form in the graphene. The sensitivity of the sensors is more than four times higher than existing state of the art.

“These sensors can detect tiny vibrations, such as those formed in the throat when whispering or even silently mouthing words, which makes them ideal for speech detection,” said Occhipinti. “By combining the ultra-high sensitivity of the sensors with highly efficient machine learning, we’ve come up with a device we think could help a lot of people who struggle with their speech.”

Vocal signals are incredibly complex, so associating a specific signal with a specific word requires a high level of computational processing. “On top of that, every person is different in terms of the way they speak, and machine learning gives us the tools we need to learn and adapt the interpretation of signals from person to person,” said Occhipinti.

The researchers trained their machine learning model on a database of the most frequently used words in English, and selected words which are frequently confused with each other, such as ‘book’ and ‘look’. The model was trained with a variety of users, including different genders, native and non-native English speakers, as well as people with different accents and different speaking speeds.

Thanks to the device’s ability to capture rich dynamic signal characteristics, the researchers found it possible to use lightweight neural network architectures with simplified depth and signal dimensions to extract and enhance the speech information features. This resulted in a machine learning model with high computational and energy efficiency, ideal for integration in battery-operated wearable devices with real-time AI processing capabilities.
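As a purely illustrative sketch of what a lightweight classifier for multi-channel strain time series might look like (this is not the authors' published architecture; the channel count, window length and vocabulary size are all assumptions):

```python
# Illustrative sketch only: a small 1D-CNN word classifier for strain-sensor
# time series, loosely inspired by the description above. Dimensions and
# class count are assumptions, not values from the paper.
import torch
import torch.nn as nn

class TinySpeechClassifier(nn.Module):
    def __init__(self, n_channels=4, n_classes=20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)

# Example: a batch of 8 short windows from 4 sensor channels, 100 samples each
model = TinySpeechClassifier()
dummy = torch.randn(8, 4, 100)
logits = model(dummy)                  # shape: (8, 20)
```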

“We chose to train the model with lots of different English speakers, so we could show it was capable of learning,” said Occhipinti. “Machine learning has the capability to learn quickly and efficiently from one user to the next, so the retraining process is quick.”

Tests of the smart choker showed it was 95.25% accurate in decoding speech. “I was surprised at just how sensitive the device is,” said Occhipinti. “We couldn’t capture all the signals and complexity of human speech before, but now that we can, it unlocks a whole new set of potential applications.”

Although the choker will have to undergo extensive testing and clinical trials before it is approved for use in patients with speech impairments, the researchers say that their smart choker could also be used in other health monitoring applications, or for improving communication in noisy or secure environments.

The research was supported in part by the EU Graphene Flagship and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

Reference:
Chenyu Tang et al. ‘Ultrasensitive textile strain sensors redefine wearable silent speech interfaces with high machine learning efficiency.’ npj Flexible Electronics (2024). DOI: 10.1038/s41528-024-00315-1

Researchers have developed a wearable ‘smart choker’ that uses a combination of flexible electronics and artificial intelligence techniques to allow people with speech impairments to communicate by detecting tiny movements in the throat.

Smart Choker


Team’s hip replacement surgery invention is set to be world first

Illustration of a human hip joint

They have just won an award to develop their technology, which aims to make hip surgery more precise and deliver better and longer-lasting outcomes – which is good for patients and the NHS.

The National Institute for Health and Care Research (NIHR) has awarded a £1.4 million Invention for Innovation (i4i) Product Development Award to advance work on the team’s “smart” joint “trial liner”.

The sensors measure forces passing through the hip joint to help the surgeon assess and balance the soft tissues, which aids the accurate positioning of the implant.

Once measurements are complete using the wireless surgical aid, the surgeon marks the ideal position for the implant, removes the smart trial liner, and completes the operation.

There are currently no technologies that can deliver such readings during an operation and in real-time, and instead surgeons balance the joint based on feel and anatomical landmarks.

This is despite over two million total hip replacements being performed annually, with the number constantly rising due to increasing lifespans. Younger patients are also starting to need hip replacements, so implants need to withstand higher stresses and last longer, to avoid a vicious circle of revision surgery and rising rates of dissatisfaction.

Driving this clinical initiative is the chief investigator, Vikas Khanduja, a consultant orthopaedic surgeon at Cambridge University Hospitals (CUH) NHS Foundation Trust, clinical and research lead of the Young Adult Hip Service, and Affiliate Associate Professor at the University of Cambridge.

The technology development is being overseen by Professor Sohini Kar-Narayan from Cambridge’s Department of Materials Science and Metallurgy, together with Dr Jehangir Cama, who is leading on translational and commercialisation activities. They are joined by Consultant clinical scientist and CUH head of clinical engineering, Professor Paul White.

“We’re really looking forward to this next phase of product development that will see us move towards an actual product that is fit for clinical use, and that has the potential to revolutionise joint replacement surgery,” said Kar-Narayan.

“This funding will bring together wide-ranging expertise to help us further develop our prototype, bringing this technology closer to clinical use,” said Cama.

The team currently has a prototype version of the device, which has been validated in the laboratory and in other tests. However, the NIHR award is important for further development and finalisation of the design and compliance with regulations before it can be tested in a living patient.

The team’s underlying sensor technology intellectual property has been protected via a patent application filed by Cambridge Enterprise, the University’s commercialisation arm.

“This is a fantastic example of Cambridge’s entrepreneurial clinicians, academics and their institutions working together with forward-looking funders to create a positive impact for markets, society and importantly patients,” said Dr Terry Parlett, Commercialisation Director at Cambridge Enterprise.

Adapted from a CUH press release.

Technology that could transform the future of hip replacement surgery is being pioneered by a team of experts in Cambridge.

Illustration of a human hip joint


Personal carbon footprint of the rich is vastly underestimated by rich and poor alike, study finds

A father and two sons running on a beach

An international group of researchers, led by the Copenhagen Business School, the University of Basel and the University of Cambridge, surveyed 4,000 people from Denmark, India, Nigeria and the United States about inequality in personal carbon footprints – the total amount of greenhouse gases produced by a person’s activities – within their own country.

Although it is well-known that there is a large gap between the carbon footprint of the richest and poorest in society, it’s been unclear whether individuals were aware of this inequality. The four countries chosen for the survey are all different in terms of wealth, lifestyle and culture. Survey participants also differed in their personal income, with half of participants belonging to the top 10% of income in their country.

The vast majority of participants across the four countries overestimated the average personal carbon footprint of the poorest 50% and underestimated those of the richest 10% and 1%.

However, participants from the top 10% were more likely to support certain climate policies, such as increasing the price of electricity during peak periods, taxing red meat consumption or subsidising carbon dioxide removal technologies such as carbon capture and storage.

The researchers say that this may reflect generally higher education levels among high earners, a greater ability to absorb price-based policies or a stronger preference for technological solutions to the climate crisis. The results are reported in the journal Nature Climate Change.

Although the concept of a personal carbon or environmental footprint has been used for over 40 years, it became widely popularised in the mid-2000s, when the fossil fuel company BP ran a large advertising campaign encouraging people to determine and reduce their personal carbon footprint.

“There are definitely groups out there who would like to push the responsibility of reducing carbon emissions away from corporations and onto individuals, which is problematic,” said co-author Dr Ramit Debnath, Assistant Professor and Cambridge Zero Fellow at the University of Cambridge. “However, personal carbon footprints can illustrate the profound inequality within and between countries and help people identify how to live in a more climate-friendly way.”

Previous research has shown widespread misperceptions about how certain consumer behaviours affect an individual's carbon footprint. For example, recycling, shutting off the lights when leaving a room and avoiding plastic packaging are lower-impact behaviours that are overestimated in terms of how much they can reduce one’s carbon footprint. On the other end, the impact of behaviours such as red meat consumption, heating and cooling homes, and air travel all tend to be underestimated.

However, there is limited research on whether these misperceptions extend to people’s perceptions of the composition and scale of personal carbon footprints and their ability to make comparisons between different groups.

The four countries selected for the survey (Denmark, India, Nigeria and the US) were chosen due to their different per-capita carbon emissions and their levels of economic inequality. Within each country, approximately 1,000 participants were surveyed, with half of each participant group from the top 10% of their country and the other half from the bottom 90%.

Participants were asked to estimate the average personal carbon footprints specific to three income groups (the bottom 50%, the top 10%, and the top 1% of income) within their country. Most participants overestimated the average personal carbon footprint for the bottom 50% of income and underestimated the average footprints for the top 10% and top 1% of income.

“These countries are very different, but we found the rich are pretty similar no matter where you go, and their concerns are different to the rest of society,” said Debnath. “There’s a huge contrast between billionaires travelling by private jet while the rest of us drink with soggy paper straws: one of those activities has a big impact on an individual carbon footprint, and one doesn’t.”

The researchers also looked at whether people’s ideas of carbon footprint inequality were related to their support for different climate policies. They found that Danish and Nigerian participants who underestimated carbon footprint inequality were generally less supportive of climate policies. They also found that Indian participants from the top 10% were generally more supportive of climate policies, potentially reflecting their higher education and greater resources.

“Poorer people have more immediate concerns, such as how they’re going to pay their rent, or support their families,” said first author Dr Kristian Steensen Nielsen from Copenhagen Business School. “But across all income groups, people want real solutions to the climate crisis, whether those are regulatory or technological. However, the people with the highest carbon footprints bear the greatest responsibility for changing their lifestyles and reducing their footprints.”

After learning about the actual carbon footprint inequality, most participants found it slightly unfair, with those in Denmark and the United States finding it the most unfair. However, people from the top 10% generally found the inequality fairer than the general population, except in India. “This could be because they’re trying to justify their larger carbon footprints,” said Debnath.

The researchers say that more work is needed to determine the best ways to promote fairness and justice in climate action across countries, cultures and communities.

“Due to their greater financial and political influence, most climate policies reflect the interests of the richest in society and rarely involve fundamental changes to their lifestyles or social status,” said Debnath.

“Greater awareness and discussion of existing inequality in personal carbon footprints can help build political pressure to address these inequalities and develop climate solutions that work for all,” said Nielsen.

The study also involved researchers from Justus-Liebig-University Giessen, Murdoch University and Oxford University. The research was supported in part by the Carlsberg Foundation, the Bill & Melinda Gates Foundation, the Quadrature Climate Foundation and the Swiss National Science Foundation.

Reference:
Kristian S Nielsen et al. ‘Underestimation of personal carbon footprint inequality in four diverse countries.’ Nature Climate Change (2024). DOI: 10.1038/s41558-024-02130-y 

The personal carbon footprint of the richest people in society is grossly underestimated, both by the rich themselves and by those on middle and lower incomes, no matter which country they come from. At the same time, both the rich and the poor drastically overestimate the carbon footprint of the poorest people.


New study aims to catch cancer earlier than ever before

CT scan showing cholangiocarcinoma

Currently, cancer is usually diagnosed only once tumours have already developed, requiring often significant treatment to remove them and prevent further growth.

However, a research team at the University of Cambridge will receive over £1.5m from Cancer Research UK over the next six years to investigate how the immune system evolves, targets and kills cancer cells as tumours are developing.

They hope by detecting the trigger point when our own body starts to recognise cancerous cells, it may help find a way to spark our own immune system into action so it kills cancer cells before tumours can even begin.

This could vastly reduce the amount of treatment people diagnosed with cancer require, which can often have significant side effects. The pioneering work could benefit millions of cancer patients before the disease becomes life-threatening or spreads.

Dr Heather Machado is leading a team of scientists at the Department of Pathology studying the immune system’s ability to fight cancer. Dr Machado’s work on T cells – the immune cells that fight infection and disease, including cancer – will provide insight into how long before a diagnosis these cells recognise and respond to cancer.

The study will specifically examine how T cells respond to cancer when they first recognise and respond to a tumour in the kidneys or the liver.

The breakthrough study has the potential to unlock the mystery as to how our immune cells work to fight cancer.

Dr Machado said: “Using mutations that naturally accumulate in each of our cells as we age, we can essentially build a family tree of T-cells, and this family tree has information about when T-cells met cancer for the first time. This research is only now possible as a result of advancements in DNA sequencing technology.

“This research has the potential to give an entirely new perspective on the role of the immune system in cancer progression, findings that we hope to use to further improve lifesaving cancer immunotherapies.”

Her aim is to see if they could lead to specific immunotherapy treatments and ways of detecting the cancer earlier.

She added: “Most cancers are diagnosed years or decades after early tumour development, which can often be too late. Our methods will allow us to go back in the cancer’s timeline to understand the immune response in these early stages of cancer development. Beyond improving immunotherapies, we hope that this understanding helps us detect cancer earlier, at stages where survival rates are much higher.”

The body’s immune system is the first line of defence against cancer but previously it has been difficult to observe this early response in humans.

Dr Machado will use genome sequencing which determines the genetic makeup of an organism to study how a tumour and the immune cells co-evolve over the course of tumour development. 

She will time T cell clonal expansions using evolutionary trees built from the genomes of individual T cells, exploiting recent advances in single-cell whole genome sequencing. Dr Machado will then perform these experiments using early-stage kidney and liver cancer resections and by sampling throughout the course of immunotherapy in metastatic kidney cancer.

She added: “The study is believed to be the first of its kind in the world and it has the potential to be groundbreaking research as we have never been able to examine these evolutionary dynamics in humans before. How long before a tumour is diagnosed has the immune system been responding is an incredibly hard problem to solve because these immune dynamics play out years prior to diagnosis.

“Normal cells evolve into tumours, and we are blind to much of that process, and yet the immune system is one of our best tools for fighting cancer.”

Dr Machado studied for her PhD at Stanford University and completed her postdoctoral research at the Wellcome Sanger Institute in Hinxton, UK. She added: “We are using cutting-edge technology that is only available now, and we are going to be able to discover how the immune system responds to tumours in a way we have never seen before – and that is potentially life-changing in terms of improving immunotherapies for better health and patient prognosis.”

Find out how Cambridge is changing the story of cancer

Adapted from a press release from Cancer Research UK

A new study aims, for the first time, to pinpoint the very moment the immune system recognises a tumour to try to stop the disease earlier than previously possible.


UK organisations release statistics for use of animals in research in 2023

African Clawed Frogs held in water-filled tanks

The statistics for the University of Cambridge are available on our website as part of our ongoing commitment to transparency and openness around the use of animals in research.

This coincides with the publication of the Home Office’s report on the statistics of scientific procedures on living animals in Great Britain in 2023.

The 10 organisations carried out 1,435,009 procedures, 54% (over half) of the 2,681,686 procedures carried out on animals for scientific research in Great Britain in 2023. Of these 1,435,009 procedures, more than 99% were carried out on mice, fish and rats and 82% were classified as causing pain equivalent to, or less than, an injection.

The 10 organisations are listed below alongside the total number of procedures they carried out in 2023. This is the ninth consecutive year that organisations have come together to publicise their collective statistics and examples of their research.

Organisation Number of Procedures (2023)
University of Cambridge 223,787
University of Oxford 194,913
The Francis Crick Institute 192,920
UCL 176,019
University of Edinburgh 139,881
Medical Research Council 124,156
University of Manchester 110,885
King's College London 109,779
University of Glasgow 102,089
Imperial College London 60,580
TOTAL 1,435,009
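As a quick illustrative check (our arithmetic, not part of the original announcement), the figures in the table are consistent with the stated total and with the roughly 54% share of all procedures carried out in Great Britain in 2023. A minimal Python sketch:

procedures_2023 = {
    "University of Cambridge": 223_787,
    "University of Oxford": 194_913,
    "The Francis Crick Institute": 192_920,
    "UCL": 176_019,
    "University of Edinburgh": 139_881,
    "Medical Research Council": 124_156,
    "University of Manchester": 110_885,
    "King's College London": 109_779,
    "University of Glasgow": 102_089,
    "Imperial College London": 60_580,
}

top10_total = sum(procedures_2023.values())   # 1,435,009 procedures
gb_total = 2_681_686                          # all procedures in Great Britain, 2023
share = top10_total / gb_total                # ~0.535, i.e. about 54% (over half)

print(f"Top 10 total: {top10_total:,}")
print(f"Share of GB procedures: {share:.1%}")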

 

In total, 69 organisations have voluntarily published their 2023 animal research statistics.

All organisations are committed to the ethical framework called the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible, minimising the number of animals used per experiment and optimising the experience of the animals to improve animal welfare. However, as institutions expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study. 

All organisations listed are signatories to the Concordat on Openness on Animal Research in the UK, which commits them to being more open about the use of animals in scientific, medical and veterinary research in the UK. More than 125 organisations have signed the Concordat, including UK universities, medical research charities, research funders, learned societies and commercial research organisations.

Wendy Jarrett, Chief Executive of Understanding Animal Research, which developed the Concordat on Openness, said:

“Animal research remains a small but vital part of the quest for new medicines, vaccines and treatments for humans and animals. Alternative methods are gradually being phased in, but, until we have sufficient reliable alternatives available, it is important that organisations that use animals in research maintain the public’s trust in them. By providing this level of information about the numbers of animals used, and the experience of those animals, as well as details of the medical breakthroughs that derive from this research, these Concordat signatories are helping the public to make up their own minds about how they feel about the use of animals in scientific research in Great Britain.”   

Professor Anna Philpott, Head of the School of Biological Sciences at the University of Cambridge, said:

“Cambridge research is changing how we understand health and ageing, and how we treat disease. Animal research continues to play a small but vital role in this work and in the development of ground-breaking new medical devices and drug treatments. We are committed to using animals only where there is no alternative as a means of making progress.”

Story adapted from a press release by Understanding Animal Research.

 

CASE STUDY: Egging on vital research

The actin cytoskeleton is a system of long filaments, vital in embryonic development. Problems with its control have been linked to the kidney problems experienced by patients with the rare conditions called Lowe syndrome and Dent disease 2. But since the actin cytoskeleton is in all the cells of the body it has been very difficult to translate an understanding of it into a drug treatment.

Wellcome Trust Senior Research Fellow Dr Jenny Gallop at the University of Cambridge has created a simpler version of the actin cytoskeleton that she can study in the lab. A key component is cytoplasm extracted from frog eggs.

Gallop’s lab keeps around 120 female frogs that are induced to lay eggs in a way that matches their natural cycles. This requires a hormone injection - just a mild discomfort to the frogs - every three to four months to make them ovulate. Over time, Gallop has refined her methods so that only half the original number of frogs are now needed.

This has enabled her to understand what might be going wrong in Lowe syndrome and Dent disease 2 – and realise that an existing drug might be able to help. Alpelisib has already been approved to safely treat breast cancer, and Gallop is now applying for approval to test whether it works to treat the kidney problems in patients with Dent disease 2.

Repurposing an existing drug means the long drug-development process has already been done. Conversations with people affected by the diseases inspire Gallop’s team to keep going. And the frogs have played a vital role in this decade-long journey.

Read the full story

The 10 organisations in Great Britain that carry out the highest number of animal procedures - those used in medical, veterinary and scientific research – have released their annual statistics today.


Early career researchers win major European funding

Plant roots interacting with arbuscular mycorrhizal fungi. Image: Luginbuehl lab

Of 3,500 proposals reviewed by the ERC, only 14% were selected for funding – Cambridge has the highest number of grants of any UK institution.

ERC Starting Grants – totalling nearly €780 million – support cutting-edge research in a wide range of fields, from life sciences and physics to social sciences and humanities.

The awards help researchers at the beginning of their careers to launch their own projects, form their teams and pursue their most promising ideas. Starting Grants amount to €1.5 million per grant for a period of five years but additional funds can be made available.

In total, the grants are estimated to create 3,160 jobs for postdoctoral fellows, PhD students and other staff at host institutions.

Cambridge’s recipients work in a wide range of fields including plant sciences, mathematics and medicine. They are among 494 laureates who will be leading projects at universities and research centres in 24 EU Member States and associated countries. This year, the UK has received grants for 50 projects, Germany 98, France 49, and the Netherlands 51.

Cambridge’s grant recipients for 2024 are:

Adrian Baez-Ortega (Dept. of Veterinary Medicine, Wellcome Sanger Institute) for Exploring the mechanisms of long-term tumour evolution and genomic instability in marine transmissible cancers

Claudia Bonfio (MRC Laboratory of Molecular Biology) for Lipid Diversity at the Onset of Life

Tom Gur (Dept. of Computer Science and Technology) for Sublinear Quantum Computation

Leonie Luginbuehl (Dept. of Plant Sciences) for Harnessing mechanisms for plant carbon delivery to symbiotic soil fungi for sustainable food production

Julian Sahasrabudhe (Dept. of Pure Mathematics and Mathematical Statistics) for High Dimensional Probability and Combinatorics

Richard Timms (Cambridge Institute for Therapeutic Immunology and Infectious Disease) for Deciphering the regulatory logic of the ubiquitin system

Hannah Übler (Dept. of Physics) for Active galactic nuclei and Population III stars in early galaxies

Julian Willis (Yusuf Hamied Department of Chemistry) for Studying viral protein-primed DNA replication to develop new gene editing technologies

Federica Gigante (Faculty of History) for Unveiling Networks: Slavery and the European Encounter with Islamic Material Culture (1580–1700) – Grant hosted by the University of Oxford

 

Professor Sir John Aston FRS, Pro-Vice-Chancellor for Research at the University of Cambridge, said:

“Many congratulations to the recipients of these awards which reflect the innovation and the vision of these outstanding investigators. We are fortunate to have many exceptional young researchers across a wide range of disciplines here in Cambridge and awards such as these highlight some of the amazing research taking place across the university. I wish this year’s recipients all the very best as they begin their new programmes and can’t wait to see the outcomes of their work.”

Iliana Ivanova, European Commissioner for Innovation, Research, Culture, Education and Youth, said:

“The European Commission is proud to support the curiosity and passion of our early-career talent under our Horizon Europe programme. The new ERC Starting Grants winners aim to deepen our understanding of the world. Their creativity is vital to finding solutions to some of the most pressing societal challenges. In this call, I am happy to see one of the highest shares of female grantees to date, a trend that I hope will continue. Congratulations to all!”

President of the European Research Council, Prof. Maria Leptin, said:

“Empowering researchers early on in their careers is at the heart of the mission of the ERC. I am particularly pleased to welcome UK researchers back to the ERC. They have been sorely missed over the past years. With fifty grants awarded to researchers based in the UK, this influx is good for the research community overall.”

Nine Cambridge researchers are among the latest recipients of highly competitive and prestigious European Research Council (ERC) Starting Grants.


Children switch to walking and cycling to school after introduction of London’s Ultra-Low Emission Zone

ULEZ signs in London

Car travel contributes to air pollution, a major cause of heart and lung diseases including asthma attacks. Beyond this, it limits children's opportunities for physical activity, hindering their development and mental health, and increasing their risk of obesity and chronic illnesses.

Despite UK guidelines recommending a daily average of 60 minutes of moderate-to-vigorous physical activity for school-aged children and adolescents, less than half (45%) of children aged 5-16 met these levels in 2021. One in three children aged 10-11 in the UK are overweight or obese.

In April 2019, London introduced the ULEZ to help improve air quality by reducing the number of vehicles on the road that do not meet emissions standards. According to Transport for London, the central London ULEZ reduced harmful nitrogen oxides by 35% and particulate matter by 15% in central London within the first 10 months of its introduction.

In a study published on 5 September in the International Journal of Behavioral Nutrition and Physical Activity, a team led by researchers at the University of Cambridge and Queen Mary University of London examined the impact of the ULEZ on how children travelled to school. The research was part of the CHILL study (Children’s Health in London and Luton).

The study examined data from almost 2,000 children aged 6 to 9 years attending 84 primary schools in London and Luton. Of these, 44 schools had catchment areas within or bordering London’s ULEZ; they were compared with a similar number of schools in Luton and Dunstable, which acted as a comparison group. The inclusion of the comparison site enabled the researchers to draw more robust conclusions and increased confidence in attributing the observed changes to the introduction of the ULEZ.

The researchers collected data from the period June 2018 to April 2019, prior to ULEZ implementation, and again in the period June 2019 to March 2020, the year after implementation of the ULEZ but prior to COVID-19-related school closures.

Among those children in London who travelled by car prior to the introduction of the ULEZ, 4 in 10 (42%) switched to active modes, while one in 20 (5%) switched from active to inactive modes.

In contrast, only one in 5 (20%) children in Luton swapped from car travel to active modes, while a similar number (21%) switched from active to car travel. This means that children in London within the ULEZ were 3.6 times as likely to shift from travelling by car to active travel modes compared to those children in Luton and far less likely (0.11 times) to switch to inactive modes.

The impact of the ULEZ on switching to active travel modes was strongest for those children living more than half a mile (0.78km) from school. This was probably because many children who live closer to school already walked or cycled to school prior to the ULEZ and therefore there was more potential for change in those living further away from their school.

The study’s first author, Dr Christina Xiao from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge, said: “The introduction of the ULEZ was associated with positive changes in how children travelled to school, with a much larger number of children moving from inactive to active modes of transport in London than in Luton.

“Given children's heightened vulnerability to air pollution and the critical role of physical activity for their health and development, financial disincentives for car use could encourage healthier travel habits among this young population, even if they do not necessarily target them.”

Joint senior author Dr Jenna Panter from the MRC Epidemiology Unit, University of Cambridge, said: “The previous Government was committed to increasing the share of children walking to school by 2025 and we hope the new Government will follow suit. Changing the way children travel to school can have significant effects on their levels of physical activity at the same time as bringing other co-benefits like improving congestion and air quality, as about a quarter of car trips during peak morning hours in London are made for school drop-offs.”

After ULEZ was introduced in Central London, the total number of vehicles on the roads fell by 9%, and by one-third (34%) for vehicles that failed to meet the required exhaust emission standards, with no clear evidence of traffic moving instead to nearby areas.

Joint senior author Professor Chris Griffiths from the Wolfson Institute of Population Health, Queen Mary University of London, said: “Establishing healthy habits early is critical to healthy adulthood and the prevention of disabling long term illness, especially obesity and the crippling diseases associated with it. The robust design of our study, with Luton as a comparator area, strongly suggests the ULEZ is driving this switch to active travel. This is evidence that Clean Air Zone intervention programmes aimed at reducing air pollution have the potential to also improve overall public health by addressing key factors that contribute to illness.”

Due to the introduction of COVID-19 restrictions in late March 2020, the study was paused in 2020/2021 and results are only reported for the first year of follow-up. However, as both London and Luton, the study areas, were similarly affected, the researchers believe this disruption is unlikely to have affected the results. The study has restarted following up with the children to examine the longer-term impacts of the ULEZ. This will identify if the changes they observed in the year following the introduction of the ULEZ persist.

The study was conducted in collaboration with Queen Mary University of London, Imperial College, University of Bedfordshire, University of Edinburgh, University of Oxford and University of Southern California and funded by the National Institute for Health and Care Research Public Health Research (NIHR), NIHR Applied Research Collaboration North Thames, and Cambridge Trust. 

Reference
Xiao, C et al. Children’s Health in London and Luton (CHILL) cohort: A 12-month natural experimental study of the effects of the Ultra Low Emission Zone on children’s travel to school. IJBNPA; 5 Sept 2024; DOI: 10.1186/s12966-024-01621-7

Four in ten children in Central London who travelled to school by car switched to more active modes of transport, such as walking, cycling, or public transport, following the introduction of the Ultra-Low Emission Zone (ULEZ), according to new research. In the comparison area with no ULEZ, Luton, only two in ten children made this switch over the same period.


High cholesterol levels at a young age significant risk factor for atherosclerosis

Teenagers eating pizza by the river

The research also suggests that people who are taking lipid-lowering drugs such as statins to lower their cholesterol levels should remain on them, even if their cholesterol levels have fallen, as stopping treatment could increase their risk of atherosclerosis.

Atherosclerosis is one of the major causes of heart and circulatory disease. It involves the hardening and narrowing of the vessels that carry blood to and from the heart. It is caused by the build-up of abnormal material called plaques – collections of fat, cholesterol, calcium and other substances circulating in the blood.

Atherosclerosis is largely considered a disease of the elderly and so most screening, prevention and intervention programmes primarily target those with high cholesterol levels, generally after the age of 50.

But in a study published today in Nature, a team led by scientists at the University of Cambridge shows that high cholesterol levels at a younger age – particularly if those levels fluctuate – can be even more damaging than high cholesterol levels that only begin in later life.

To study the mechanisms that underlie atherosclerosis, scientists often use animal models, such as mice. The mice will typically be fed a high-fat diet for several weeks as adults to see how this leads to the build-up of the plaques characteristic of the condition.

Professor Ziad Mallat and colleagues at the Victor Phillip Dahdaleh Heart and Lung Research Institute at the University of Cambridge decided to explore a different approach – to see whether giving mice the same amount of high fat food but spread over their lifetime changed their atherosclerosis risk.

“When I asked my group and a number of people who are experts in atherosclerosis, no one could tell me what the result would be,” said Professor Mallat, a British Heart Foundation (BHF) Professor of Cardiovascular Medicine.

“Some people thought it would make no difference, others thought it would change the risk. In fact, what we found was that an intermittent high fat diet starting while the mice were still young – one week on, a few weeks off, another week on, and so on – was the worst option in terms of atherosclerosis risk.”

Armed with this information, his team turned to the Cardiovascular Risk in Young Finns Study, one of the largest follow-up studies into cardiovascular risk from childhood to adulthood. Participants recruited in the 1980s returned for follow-up over the subsequent decades, and more than 2,000 of them had received ultrasound scans of their carotid arteries when they were aged around 30 years and again at around 50 years.

Analysing the data, the team found that those participants who had been exposed to high cholesterol levels as children tended to have the biggest build-up of plaques, confirming the findings in mice.

“What this means is that we shouldn’t leave it until later in life before we start to look at our cholesterol levels,” Professor Mallat said. “Atherosclerosis can potentially be prevented by lowering cholesterol levels, but we clearly need to start thinking about this much earlier on in life than we previously thought.”

The mouse studies showed that fluctuating levels of cholesterol appeared to cause the most damage. Professor Mallat says this could explain why some people who are on statins but do not take them regularly remain at an increased risk of heart attack.

“If you stop and start your statin treatment, your body is being exposed to a yo-yo of cholesterol, which it doesn’t like, and it seems this interferes with your body’s ability to prevent the build-up of plaques,” he added.

The reason why this is so damaging may come down to the effect that cholesterol has on specific types of immune cells known as ‘resident arterial macrophages’. These cells live in the arteries, where they clear damaged cells and fatty molecules known as lipids, which include cholesterol, and stop the build-up of plaques.

When the team examined these macrophages in their mouse models, they found that high cholesterol levels – and in particular, fluctuating cholesterol levels – changed them physically and altered the activity of their genes. This meant that the cells were no longer protective, but were instead detrimental, accelerating atherosclerosis.

The research was funded by the British Heart Foundation.

Reference
Takaoka, M et al. Early intermittent hyperlipidaemia alters tissue macrophages to boost atherosclerosis. Nature; 4 Sept 2024; DOI: 10.1038/s41586-024-07993-x

Our risk of developing atherosclerosis – ‘furring’ of the arteries – can begin much earlier in life than was previously thought, highlighting the need to keep cholesterol levels low even when we are young, new research has discovered.


Study reveals ‘patchy and inconsistent’ end-of-life care

Experimental coloured image of two hands touching

These are among the conclusions of Time to Care: findings from a nationally representative survey of experiences at the end of life in England and Wales, a new report funded by end-of-life charity Marie Curie and produced by King’s College London’s Cicely Saunders Institute, Hull York Medical School at University of Hull, and the University of Cambridge.

Time to Care aims to describe the outcomes, experiences, and use of care services by people affected by dying, death, and bereavement in England and Wales. It is the final report from the Marie Curie Better End of life programme.

The report found one in five dying people had no contact with their GP in the last three months of life.

Half of people surveyed (49%) said their dying loved one visited A&E at least once in their final three months of life, and one in eight people who died in hospital had been there less than 24 hours. 

Half of respondents (49%) in the study were also unhappy with at least one aspect of the care the person who died received and of those one in eight people made a formal complaint. Fewer than half of respondents said they had a key contact person to co-ordinate their care. This meant responsibility for care fell on informal carers (family and friends), who often felt unprepared and unsupported.

Professor Stephen Barclay, from the Department of Public Health & Primary Care at the University of Cambridge, a researcher on the project and a practising GP, said: “GPs, Community Nurses and the wider Primary Care Team have a central and often under-recognised role in the care of people approaching and at the end of their lives. But they are under enormous pressure, with increasing workloads, diminishing workforces and inadequate investment over recent years.

“Increasing numbers of people have been dying in the community during and following the COVID-19 pandemic, at home or in care homes. This important survey, undertaken at a time when the NHS was beginning to recover from the worst of the pandemic, reveals how clinical teams in all settings are struggling to meet the needs of this vulnerable patient group.

“The out-of-hours period, which comprises two-thirds of the week, is particularly difficult for patients and their families. Across the UK, GPs and Community Nurses want to provide excellent palliative and end of life care, but the necessary ‘time to care’ is currently often squeezed. The new UK Government’s focus on care close to home is welcome. This report highlights the need for a radical repurposing of NHS funding to resource primary care for that ambition to be achieved.”

The research report is based on a survey sent by the Office for National Statistics in 2023 to a nationally representative sample of people who had registered the death of a family member in the prior six to 10 months. Only non-sudden causes of death were included. Responses were received from 1,179 people, making this the largest nationally representative post-bereavement survey in England and Wales for a decade.

Professor Katherine Sleeman, from King’s College London and lead researcher on the project, said: “This study reveals patchy and inconsistent provision of care for people approaching the end of life. While there were examples of excellent care - including in the community, in care homes, and in hospitals - the overall picture is of services that are overstretched, and of health and care staff lacking the time they need to consistently provide high-quality care. This means that dying people miss out on treatment and care for their symptoms, and families are left feeling unprepared and unsupported, which has lasting emotional repercussions into bereavement.”

The researchers say the findings are concerning, considering the ageing population and the expected increase in palliative care needs across the UK. By 2048, there will be an additional 147,000 people in the UK who need palliative care before they die, an increase of 25%.
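As a rough back-of-the-envelope illustration (our arithmetic, not a figure from the report), an extra 147,000 people representing a 25% increase implies a current baseline of roughly 588,000 people a year who need palliative care before they die. A minimal Python sketch:

additional_by_2048 = 147_000   # extra people per year needing palliative care by 2048
relative_increase = 0.25       # the stated 25% rise

implied_current_need = additional_by_2048 / relative_increase    # 588,000 per year today
projected_need_2048 = implied_current_need + additional_by_2048  # 735,000 per year by 2048

print(f"Implied current need: {implied_current_need:,.0f} people per year")
print(f"Projected need by 2048: {projected_need_2048:,.0f} people per year")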

“Without a corresponding increase in capacity of primary and community care teams to support these people as they approach the end of life, the quality of care is likely to further suffer,” said Professor Sleeman. “It has never been more important to ensure high-quality palliative care for all who need it.”

Annette Weatherley, Marie Curie Chief Nursing Officer, added: “The findings are shocking.  Too many people are dying in avoidable pain, struggling with breathlessness and other debilitating symptoms because of the difficulties they face accessing the end-of-life care they need from overstretched GPs and other health and care workers.

“Without urgent action, gaps in access to palliative and end of life care will only grow.

“It is a critical time to improve palliative and end of life care. People at the end of life should be able to have the very best possible care. There is only one chance to get it right at the end of life.  Yet, as the evidence shows, too many people are being failed by a system faced with extreme financial and workforce pressures.  It’s time for Governments to step up and fix care of the dying.”

Professor Stephen Barclay is a fellow at Emmanuel College, Cambridge.

Adapted from a press release by Marie Curie

One in three dying people in England and Wales was severely or overwhelmingly affected by pain in the last week of life, with bereaved people reporting how difficult it was to get joined-up support from health and care professionals at home.


Anti-inflammatory drug could reduce future heart attack risk

Illustration of human heart

A cancer drug that unlocks the anti-inflammatory power of the immune system could help to reduce the risk of future heart attacks, according to research part-funded by the British Heart Foundation. By repurposing an existing drug, researchers hope it could soon become part of routine treatment for patients after a heart attack.

The findings will be presented at the European Society of Cardiology Congress in London by Dr Rouchelle Sriranjan, NIHR Clinical Lecturer in Cardiology at the University of Cambridge.

High levels of inflammation in blood vessels are linked to an increased risk of heart disease and heart attacks. After a heart attack, the body’s immune response can aggravate existing inflammation, causing more harm and increasing risk even further. However, NICE guidelines don’t currently recommend the use of any anti-inflammatory drugs to reduce future risk.

Now a team of researchers, led by Dr Joseph Cheriyan from Cambridge University Hospitals NHS Foundation Trust, has found that low doses of an anti-inflammatory drug called aldesleukin, injected under the skin of patients after a heart attack, significantly reduce inflammation in the arteries.

The researchers are currently following up patients to investigate the longer-term impact of this fall in inflammation. To date, in the two and a half years after their treatment, there have been no major adverse cardiac events in the group that received aldesleukin, compared to seven in the group that received the placebo.

Professor Ziad Mallat, BHF Professor of Cardiovascular Medicine at the University of Cambridge who developed the trial, said: “We associate inflammation with healing – an inbuilt response that protects us from infection and injury. But it’s now clear that inflammation is a culprit in many cardiovascular conditions.

“Early signs from our ongoing trial suggest that people treated with aldesleukin may have better long-term outcomes, including fewer heart attacks. If these findings are repeated in a larger trial, we’re hopeful that aldesleukin could become part of routine care after a heart attack within five to 10 years.”

Aldesleukin is already used to treat kidney cancer, as high doses stimulate the immune system to attack cancer cells. The Cambridge team previously found that doses one thousand times lower than those used in cancer treatment increased the number of regulatory T cells – a type of anti-inflammatory white blood cell – in patients’ blood compared to a placebo.

In the current trial at Addenbrooke's and Royal Papworth hospitals in Cambridge, 60 patients admitted to hospital with a heart attack or unstable angina received either low dose aldesleukin or placebo. Patients received an injection once a day for the first five days, then once per week over the next seven weeks. Neither the participants nor their doctors knew whether they had received the drug or placebo.

At the end of treatment, Positron Emission Tomography (PET) scans showed that inflammation in the artery involved in patients’ heart attack or angina was significantly lower in the group treated with aldesleukin, compared to those who received the placebo.

The anti-inflammatory effect of aldesleukin appeared even more striking in the most inflamed arteries, leading to a larger reduction in inflammation levels in these vessels and a bigger difference between the two groups by the end of the study.

Dr Sonya Babu-Narayan, Associate Medical Director at the British Heart Foundation and consultant cardiologist, said: “Thanks to research, we have an array of effective treatments to help people avoid heart attacks and strokes and save lives. But, even after successful heart attack treatment, unwanted inflammation in the coronary arteries can remain, which can lead to life-threatening complications.

“A treatment to reduce inflammation after a heart attack could be a game-changer. It would help doctors to interrupt the dangerous feedback loop that exacerbates inflammation and drives up risk. This research is an important step towards that treatment becoming a reality.”

The study was predominantly funded by the Medical Research Council, with significant support from the BHF and National Institute for Health and Care Research Cambridge Biomedical Research Centre (NIHR-BRC).

Originally published by the British Heart Foundation. 

Repurposed cancer drug helps to calm inflammation in arteries.


Global timber supply threatened as climate change pushes cropland northwards

Timber/farming contrast in the USA

The sight of vineyards in Britain is becoming more common as hotter summers create increasingly suitable conditions for growing grapes. But behind this success story is a sobering one: climate change is shifting the regions of the world suitable for growing crops.

Researchers at the University of Cambridge have uncovered a looming issue: as the land suitable for producing our food moves northwards, it will put a squeeze on the land we need to grow trees. The timber these trees produce is the basis of much of modern life – from paper and cardboard to furniture and buildings.

They say that the increasing competition between land for timber production and food production due to climate change has, until now, been overlooked – but is set to be an emerging issue as our demand for both continues to increase.

Under the worst-case scenario for climate change, where no action is taken to decarbonise society, the study found that over a quarter of existing forestry land – around 320 million hectares, equivalent to the size of India – will become more suitable for agriculture by the end of the century.

Most forests for timber production are currently located in the northern hemisphere in the US, Canada, China and Russia. The study found that 90% of all current forestry land that will become agriculturally productive by 2100 will be in these 4 countries.

In particular, tens of millions of hectares of timber-producing land across Russia will become newly suitable for agriculture – more than in the US, Canada and China put together – with conditions becoming favourable for potato, soy, and wheat farming.

“There’s only a finite area of suitable land on the planet where we can produce food and wood - 2 critical resources for society. As climate change worsens and agriculture is forced to expand northwards, there’s going to be increasing pressure on timber production,” said Dr Oscar Morton, a researcher in the University of Cambridge’s Department of Plant Sciences who co-led the study.

“We’ve got to be thinking 50 years ahead because if we want timber in the future, we need to be planting it now. The trees that will be logged by the end of this century are already in the ground – they’re on much slower cycles than food crops,” said Dr Chris Bousfield, a postdoctoral researcher in the University of Cambridge’s Department of Plant Sciences and co-leader of the study.

Global food demand is projected to double by 2050 as the population grows and becomes more affluent. Global wood demand is also expected to double in the same timeframe, in large part because it is a low-carbon alternative to concrete and steel for construction.

Shifting timber production deeper into boreal or tropical forests is not a viable option, because the trees in those regions have stood untouched for thousands of years and logging them would release huge amounts of carbon and threaten biodiversity.

“A major environmental risk of increasing competition for land between farming and forestry is that wood production moves into remaining areas of primary forest within the tropics or boreal zones. These are the epicentres of remaining global wilderness and untouched tropical forests are the most biodiverse places on Earth. Preventing further expansion is critical,” said David Edwards, Professor of Plant Ecology in the University of Cambridge’s Department of Plant Sciences and senior author of the study.

To get their results, the researchers took satellite data showing intensive forestry across the world and overlaid it with predictions of suitable agricultural land for the world’s key crops – including rice, wheat, maize, soy and potato – in the future under various climate change scenarios.

Even in the best-case scenario, where the world meets net-zero targets, the researchers say there will still be significant future changes in the regions suitable for timber and crop production.

The study is published in the journal Nature Climate Change.

Timber production contributes over US $1.5 trillion (about £1.1 trillion) per year to national economies globally. Heatwaves and associated wildfires have caused huge recent losses of timber forests around the world. Climate change is also driving the spread of pests like the Bark Beetle, which attacks trees.

Climate change is expected to cause areas in the tropics to become too hot and inhospitable for growing food and make large areas of southern Europe much less suitable for food and wood production.

“Climate change is already causing challenges for timber production. Now on top of that, there will be this increased pressure from agriculture, creating a perfect storm of problems,” said Bousfield.

“Securing our future wood supply might not seem as pressing as securing the food we need to eat and survive. But wood is just as integrated within our daily lives and we need to develop strategies to ensure both food and wood security into the future,” said Morton.

Reference

Bousfield, C G, et al. ‘Climate change will exacerbate land conflict between agriculture and timber production.’ Nature Climate Change (2024). DOI: 10.1038/s41558-024-02113-z

Climate change will move and reduce the land suitable for growing food and timber, putting the production of these 2 vital resources into direct competition, a new study has found.


One term of empathy training measurably improved classroom behaviour

Empathy lessons at Kingsmead School, Enfield, UK

An analysis of a short programme teaching empathy in schools has found it had a positive impact on students’ behaviour and increased their emotional literacy within 10 weeks.

The findings come from an evaluation of the 'Empathy Programme': a term-long course developed by the UK-based Empathy Studios. The research was conducted with support from academics at the Faculty of Education, University of Cambridge.

Empathy Studios develops school-based, video-led programmes which aim to increase empathy in students aged 5 to 18. Students are shown thought-provoking films, then engage in approximately 30 minutes of activities and discussions about the issues raised. An annual flagship festival of films, resources and events, 'Empathy Week', is made available for free and has to date reached 1.3 million students worldwide.

Survey and interview data from 900 students and teachers at 10 participating schools in 6 countries, including the UK, revealed measurable, positive changes in students’ conduct, emotional awareness and curiosity about different cultures and the wider world.

Teachers rated students’ empathy, behaviour and other characteristics on a scale of one to 10 before the programme began, and 5 and 10 weeks later. The average empathy score rose from 5.55 to 7, while average behaviour scores increased from 6.52 to 7.89.

In follow-up interviews, one primary school teacher reflected: “I’ve definitely been able to resolve more issues within the classroom and not have parents called in.” A student told the interviewers: “I think that everyone in the class has become kinder.”

Empathy Studios defines empathy as: “the skill to understand others and the ability to create space for someone to reveal their authentic self while reserving judgement.” The company was founded 4 years ago by Ed Kirwan, a former science teacher from North London.

“The programme’s success lies in teaching students to celebrate difference, which changes their wellbeing and behaviour,” he said. “There’s never an excuse for poor behaviour, but often a reason, which greater mutual understanding can potentially address.”

“I think the social unrest we have seen in Britain this summer shows how urgently we need more empathy across society. It won’t solve everything, but it is the foundation for solutions, and it starts with education. If the new government is serious about curriculum reforms that prepare young people for life and work, we must ensure that school equips them to understand, be curious about, and listen to each other, even in moments of disagreement.”

The evaluation was supported by Dr Helen Demetriou, a specialist in empathy education at the University of Cambridge, who helped to design the research, and to collect, quality assure and interpret the data.

“The findings show that a fairly simple, film-based programme can raise pupils’ empathy levels, enhancing their understanding of themselves, others, and global issues,” she said. “That supports a more complete learning experience, developing social and emotional skills that we know contribute to improved behaviour and more engaged learning.”

Although it is often considered innate, evidence suggests that empathy can be taught. A 2021 study co-authored by Demetriou successfully trialled teaching empathy during design and technology lessons. More recently, researchers at the University of Virginia found that empathy between parents and children is 'paid forward' by the children to friends and, later, when they become parents themselves.

Empathy has been linked to better leadership and inclusion in workplaces, while a 2023 World Economic Forum White Paper highlighted the importance of socio-emotional skills to the future of work and argued for more education that emphasises interpersonal skills, including empathy.

Empathy Studios offers schools assembly and lesson plans built around films about the real-life stories of diverse people in other parts of the world. Its 2024/5 programme, for example, profiles 5 individuals from Mexico, including a Paralympian, a dancer, and a women’s rights activist.

The framework focuses on 3 core concepts: 'Empathy for Myself', which develops students’ emotional literacy; 'Empathy for Others', which covers mutual understanding and interpersonal relations; and 'Empathy in Action', during which the students develop their own social action projects.

The new research builds on a 2022 pilot study with the University of Cambridge, which suggested that the programme makes students more responsive to each other’s feelings and improves self-esteem. The new evaluation involved over 900 students and 30 teachers, and took place during 2023.

The teacher surveys indicated that behaviour had improved by up to 10% in some schools, especially those new to empathy lessons. The average improvement in behaviour recorded by UK teachers matched the overall trend, rising from 6.3/10 pre-programme to 7.7/10 post-programme. Empathy and behaviour also appeared to be closely linked: all schools reporting an overall improvement in student empathy also saw improvements in behaviour after five weeks, which were sustained in 80% of cases after 10 weeks.

The evaluation recorded small improvements in students’ overall emotional literacy and their 'affective empathy', or ability to share the feelings of others. A change that emerged strongly from interviews with teachers was that the Empathy Programme appeared to increase students’ interest in other cultures. In one primary school, for example, the proportion of students responding positively to the statement “I want to find out more about the world” rose from 86% to 96% after 10 weeks. This echoes Organisation for Economic Co-operation and Development (OECD) evidence linking empathy to civic engagement.

Many students said they had learned valuable lessons from the programme. Their reflections included: “Everyone struggles… I’m not the only one who finds it hard”, and “Although we are all different, we all have so much in common”.

“Empathy is the number one human skill we need to develop for the future,” Kirwan said. “It should not just be an add-on; it should be considered foundational.”

Further information is available from: https://www.empathystudios.com/

A study involving 900 students in 6 countries found that a short programme of empathy lessons led to measurable, positive changes in their conduct, emotional awareness and curiosity about different cultures.

Empathy lessons at Kingsmead School, Enfield, UK


Cambridge Children’s Hospital moves ahead as plans for new hospital approved by ministers

Artist's impression of Cambridge Children's Hospital.

Plans for Cambridge Children’s Hospital can move ahead following the news that the Outline Business Case for the project has been signed off by the Chief Secretary to the Treasury and the Secretary of State for Health and Social Care. The project has been given the green light to begin the detailed process of appointing a contractor to build the ground-breaking new facility in 2026.

The ministerial backing means that the Project’s Outline Business Case, the second stage of the business case process, has now been fully approved by the Department of Health and Social Care, NHS England and HM Treasury. It was approved in principle in September 2023, subject to a capital affordability review by NHS England and the Department of Health and Social Care’s Joint Investment Committee. That review took place in April 2024 and resulted in a recommendation to Ministers to endorse the decision.

In a show of further confidence in its plans, the Outline Business Case was signed off by the Chief Secretary to the Treasury and the Secretary of State for Health and Social Care in August 2024. This approval recognises that the hospital will meet the needs of patients and staff across the East of England and that the project has the appropriate funding streams in place to deliver the specialist children’s facility.

The hospital is being co-designed with the help of children, young people, families and healthcare professionals across the region to ensure the new hospital will meet the needs of patients, families and staff.

Dr Rob Heuschkel, Clinical Lead for Physical Health at Cambridge Children’s Hospital said:

“We are absolutely delighted that we can now move forward to enter contracts with a construction partner, so we can finally start to see work happening on site.

“A huge amount of work has gone into finalising the designs and getting us to this point, and I want to thank our healthcare staff, young people and their families from across the region who have been contributing valuable feedback and helping us shape our plans, right from the very start.

“The East of England is the only region in the UK without a specialist children’s hospital, and we look forward to changing that very soon.”

The approval comes as a programme of groundworks preparing for the build was completed in July, and new access roads have now been installed where the new five-storey, 35,000 sq m hospital will be located, opposite the Rosie Maternity Hospital, on Robinson Way and Dame Mary Archer Way.

In the coming weeks, people will be able to see hoardings installed around the site of Cambridge Children’s Hospital, the first hospital truly designed to bring mental and physical health care together for children and young people.

Dr Isobel Heyman, Clinical Lead for Mental Health at Cambridge Children’s Hospital said:

“This really is fantastic news and an exciting moment in our journey to build a truly integrated children’s hospital for the East of England.

“The current model of mental health care is inadequate. Many children with physical ill-health also have significant mental health needs, and vice versa.

“Cambridge Children’s Hospital offers a solution. By delivering holistic care for children, young people, and their families, this not only reduces stigma, but the revolutionary model of care really does act as a blueprint for the NHS and the future of healthcare.”

The fundraising Campaign for Cambridge Children’s Hospital has now passed the halfway mark and the project remains on track to meet its £100m philanthropy target.

The hospital will also house a University of Cambridge research institute, focused on preventing childhood illness and early intervention across mental and physical healthcare.

Professor David Rowitch, Clinical Lead for Research at Cambridge Children’s Hospital said:

“By bringing clinicians and patients together with University of Cambridge investigators and industry partners, we aim to shift the medicine paradigm from traditional reactive approaches, to one based on early detection, precision intervention and disease prevention.

“Co-locating research efforts inside Cambridge Children’s Hospital will mean we can detect disease early or even prevent it altogether, personalise health care and prescribe treatments more appropriate for children and their individual health needs.

“We’ll also be able to foster collaborations to advance the power of advanced diagnostics, digital and telehealth technology to support healthcare professions from a distance, to deliver care closer to people’s homes, wherever they live in our region.”

The Cambridge Children’s Hospital project is a partnership between Cambridge University Hospitals NHS Foundation Trust (CUH), Cambridgeshire and Peterborough NHS Foundation Trust (CPFT), and the University of Cambridge.

Work now continues on the final stage of the business case for Cambridge Children’s Hospital – the Full Business Case.

Major milestone for first specialist children’s hospital in the East of England.


Mother’s gut microbiome during pregnancy shapes baby’s brain development

Pregnant women drinking a glass of milk.

Researchers have compared fetal brain development in mice whose mothers had no bacteria in their gut with that in mice whose mothers were given Bifidobacterium breve orally during pregnancy but had no other gut bacteria.

Nutrient transport to the brain increased in fetuses of mothers given Bifidobacterium breve, and beneficial changes were also seen in other cell processes relating to growth.

Bifidobacterium breve is a ‘good’ bacterium that occurs naturally in our gut and is available as a supplement in probiotic drinks and tablets.

Obesity or chronic stress can alter the gut microbiome of pregnant women, often resulting in fetal growth abnormalities. The babies of up to 10% of first-time mothers have low birth weight or fetal growth restriction. If a baby hasn't grown properly in the womb, there is an increased risk of conditions like cerebral palsy in infants and anxiety, depression, autism, and schizophrenia in later life.

These results suggest that improving fetal development - specifically fetal brain metabolism - by taking Bifidobacterium breve supplements while pregnant may support the development of a healthy baby.

The results are published today in the journal Molecular Metabolism.

“Our study suggests that by providing ‘good bacteria’ to the mother we could improve the growth and development of her baby while she’s pregnant,” said Dr Jorge Lopez-Tello, a researcher in the University of Cambridge’s Centre for Trophoblast Research, first author of the report.

He added: “This means future treatments for fetal growth restriction could potentially focus on altering the gut microbiome through probiotics, rather than offering pharmaceutical treatments - with the risk of side effects - to pregnant women.”

“The design of therapies for fetal growth restriction are focused on improving blood flow pathways in the mother, but our results suggest we’ve been thinking about this the wrong way - perhaps we should be more focused on improving maternal gut health,” said Professor Amanda Sferruzzi-Perri, a researcher in the University of Cambridge’s Centre for Trophoblast Research and senior author of the report, who is also a Fellow of St John’s College, Cambridge.

She added: “We know that good gut health - determined by the types of microbes in the gut - helps the body to absorb nutrients and protect against infections and diseases.”

The study was carried out in mice, which allowed the effects of Bifidobacterium breve to be assessed in a way that would not be possible in humans - the researchers could precisely control the genetics, other microorganisms and the environment of the mice. But they say the effects they measured are likely to be similar in humans.

They now plan further work to monitor the brain development of the offspring after birth, and to understand how Bifidobacterium breve interacts with the other gut bacteria present in natural situations.

Previous work by the same team found that treating pregnant mice with Bifidobacterium breve improves the structure and function of the placenta. This also enables a better supply of glucose and other nutrients to the developing fetus and improves fetal growth.

“Although further research is needed to understand how these effects translate to humans, this exciting discovery may pave the way for future clinical studies that explore the critical role of the maternal microbiome in supporting healthy brain development before birth,” said Professor Lindsay Hall at the University of Birmingham, who was also involved in the research.

While it is well known that the health of a pregnant mother is important for a healthy baby, the effect of her gut bacteria on the baby’s development has received little attention.

Reference 

Lopez-Tello, J, et al: ‘Maternal gut Bifidobacterium breve modifies fetal brain metabolism in germ-free mice.’ Molecular Metabolism, August 2024. DOI: 10.1016/j.molmet.2024.102004

A study in mice has found that the bacteria Bifidobacterium breve in the mother’s gut during pregnancy supports healthy brain development in the fetus.


Red and processed meat consumption associated with higher type 2 diabetes risk

Preparing a Monte Cristo Sandwich, with Black Forest Ham

The findings are published today in The Lancet Diabetes and Endocrinology.

Global meat production has increased rapidly in recent decades and meat consumption exceeds dietary guidelines in many countries.  Earlier research indicated that higher intakes of processed meat and unprocessed red meat are associated with an elevated risk of type 2 diabetes, but the results have been variable and not conclusive.

Poultry such as chicken, turkey, or duck is often considered to be an alternative to processed meat or unprocessed red meat, but fewer studies have examined the association between poultry consumption and type 2 diabetes.

To determine the association between consumption of processed meat, unprocessed red meat and poultry and type 2 diabetes, a team led by researchers at the University of Cambridge used the global InterConnect project to analyse data from 31 study cohorts in 20 countries. Their extensive analysis took into account factors such as age, gender, health-related behaviours, energy intake and body mass index.

The researchers found that the habitual consumption of 50 grams of processed meat a day - equivalent to 2 slices of ham - was associated with a 15% higher risk of developing type 2 diabetes in the next 10 years. The consumption of 100 grams of unprocessed red meat a day - equivalent to a small steak - was associated with a 10% higher risk of type 2 diabetes.

Habitual consumption of 100 grams of poultry a day was associated with an 8% higher risk, but when further analyses were conducted to test the findings under different scenarios, the association for poultry consumption became weaker, whereas the associations of processed meat and unprocessed red meat with type 2 diabetes persisted.
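
(As a back-of-the-envelope illustration only: the release does not state the statistical model behind these figures, so the log-linear dose-response scaling below is an assumption, not the study’s method.)

```python
import math

# Assumed, for illustration: 50 g/day of processed meat ~ 15% higher risk,
# i.e. a relative risk of 1.15 per 50 g/day of habitual intake.
rr_per_50g = 1.15

# Under a log-linear dose-response assumption, the implied relative risk
# at other daily intakes scales with the exponent intake / 50.
for grams in (25, 50, 100):
    rr = math.exp((grams / 50) * math.log(rr_per_50g))
    print(f"{grams} g/day -> implied relative risk {rr:.2f}")
```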

Professor Nita Forouhi of the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge, and a senior author on the paper, said: “Our research provides the most comprehensive evidence to date of an association between eating processed meat and unprocessed red meat and a higher future risk of type 2 diabetes. It supports recommendations to limit the consumption of processed meat and unprocessed red meat to reduce type 2 diabetes cases in the population.

“While our findings provide more comprehensive evidence on the association between poultry consumption and type 2 diabetes than was previously available, the link remains uncertain and needs to be investigated further.”

InterConnect uses an approach that allows researchers to analyse individual participant data from diverse studies, rather than being limited to published results. This enabled the authors to include as many as 31 studies in this analysis, 18 of which had not previously published findings on the link between meat consumption and type 2 diabetes. By including this previously unpublished study data the authors considerably expanded the evidence base and reduced the potential for bias from the exclusion of existing research.

Lead author Dr Chunxiao Li, also of the MRC Epidemiology Unit, said: “Previous meta-analysis involved pooling together of already published results from studies on the link between meat consumption and type 2 diabetes, but our analysis examined data from individual participants in each study. This meant that we could harmonise the key data collected across studies, such as the meat intake information and the development of type 2 diabetes.

“Using harmonised data also meant we could more easily account for different factors, such as lifestyle or health behaviours, that may affect the association between meat consumption and diabetes.”
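
The approach described above is a two-stage, federated design: each cohort analyses its own harmonised individual-level data locally, and only summary estimates are shared and pooled centrally. A minimal sketch of that central pooling step, using made-up cohort estimates and a simple fixed-effect inverse-variance weighting (the actual InterConnect analysis is considerably more sophisticated), might look like this:

```python
import numpy as np

# Hypothetical log hazard ratios for processed meat (per 50 g/day) returned
# by five cohorts after fitting their own locally harmonised models,
# together with the standard errors of those estimates.
log_hr = np.array([0.16, 0.12, 0.19, 0.10, 0.14])
se = np.array([0.05, 0.07, 0.06, 0.08, 0.05])

# Stage two: pool the cohort-level estimates with inverse-variance weights.
w = 1.0 / se**2
pooled = np.sum(w * log_hr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

hr = np.exp(pooled)
lo, hi = np.exp(pooled + 1.96 * pooled_se * np.array([-1.0, 1.0]))
print(f"Pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```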

Professor Nick Wareham, Director of the MRC Epidemiology Unit, and a senior author on the paper said: “InterConnect enables us to study the risk factors for obesity and type 2 diabetes across populations in many different countries and continents around the world, helping to include populations that are under-represented in traditional meta-analyses.

“Most research studies on meat and type 2 diabetes have been conducted in USA and Europe, with some in East Asia. This research included additional studies from the Middle East, Latin America and South Asia, and highlighted the need for investment in research in these regions and in Africa.”

InterConnect was initially funded by the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no 602068.

Reference
Li, C et al. Meat consumption and incident type 2 diabetes: a federated meta-analysis of 1·97 million adults with 100,000 incident cases from 31 cohorts in 20 countries. Lancet Diabetes Endocrinol.; 20 August 2024

Adapted from a press release from the MRC Epidemiology Unit

Meat consumption, particularly consumption of processed meat and unprocessed red meat, is associated with a higher type 2 diabetes risk, an analysis of data from almost two million participants has found.


Arcadia awards over £10 million for 2 major archaeology projects

Image from the Mapping Africa’s Endangered Archaeological Sites and Monuments project.

The McDonald Institute for Archaeological Research, Department of Archaeology and University of Cambridge Development and Alumni Relations are pleased to announce that the Arcadia charitable foundation has awarded grants totalling £10.3 million to continue the work of the Mapping Africa’s Endangered Archaeological Sites and Monuments (MAEASaM) project and the Mapping Archaeological Heritage in South Asia (MAHSA) project.

Archaeological sites and monuments around the world are increasingly threatened by human activities and the impacts of climate change. These pressures are especially severe in South Asia and sub-Saharan Africa, where local heritage agencies are often short-staffed and under-resourced; where existing sites and monuments registers are often incompletely digitised; and where many sites are not yet documented and large areas remain archaeologically under-studied. Alongside the intensity of natural and human threats, these factors combine to make the implementation of planning controls, impact assessments, mitigation measures and long-term monitoring especially challenging.

The five-year funding of £5.7 million to the MAEASaM project supports the continuation of its mission to identify and document endangered archaeological heritage sites across Africa, building on the work accomplished thus far with our in-country partners in Zimbabwe, Tanzania, Sudan, Senegal, Mali, Kenya, Ethiopia and Botswana. The funding will also allow the project to expand its collaborations with other national heritage agencies in Africa, including Mozambique, Gambia and the Democratic Republic of Congo, and to develop innovative approaches to better integrate heritage concerns into national planning and development control activities.

During Phase 1 of the project, the MAEASaM team managed to assess a total area of 1,024,656 square kilometres using a combination of historical maps, Google Earth and medium-resolution satellite imagery, resulting in digital documentation of some 67,748 sites and monuments. Concurrent with this work, the team created digital records of 31,461 legacy sites, from unique information sets spanning almost a century of archaeological fieldwork on the continent. The accuracy of a sample of these records was also assessed via 11 field verification campaigns, helping establish the current status of these sites and levels of endangerment from anthropogenic and natural processes, while also locating many previously undocumented sites. Training, skills enhancement and knowledge transfer activities were also delivered via both in-person and online events, often in collaboration with MAHSA, and team members presented their work at 15 international meetings and via numerous social media and website posts.

Professor Paul Lane, Principal Investigator of the MAEASaM project, said: “I am truly delighted by the news of this award and would like to take this opportunity to thank Arcadia for their continuing support. As well as allowing expansion of the project to cover other countries in sub-Saharan Africa, this further 5 years of funding will enable the creation of a repository of digital assets and a sustainable system for more rapidly and easily assessing, researching, monitoring and managing archaeological heritage, accessible to heritage professionals, researchers and students across the continent.”

Similarly, the five-year grant of £4.6 million to the MAHSA project supports its continuing mission to document endangered archaeological heritage in Pakistan and India, working alongside collaborators in both countries to support their efforts to protect and manage the rich heritage of the region. Over the next 5 years, MAHSA will continue to develop and populate its Arches database, creating a resource to make heritage data findable, accessible, interoperable and reusable. MAHSA will consolidate the work it has begun in the Indus River Basin and surrounding areas, and will also expand its documentation efforts to include the coastline areas of both India and Pakistan, Baluchistan in Pakistan and the Ganges River Basin in north India.

During Phase 1, the MAHSA team georeferenced in excess of 1,300 historic Survey of India map sheets, covering over 890,000 square kilometres, and reconstructed over 192,696 square kilometres of ancient hydrological networks. This groundwork has made it possible to digitise over 10,000 legacy data records, and many of those records have been enriched. In addition, the team carried out 5 collaborative archaeological surveys, both as part of their training programme and as part of new collaborative research with stakeholders in both India and Pakistan. They have engaged in policy-level dialogue with different government organisations in Pakistan and India, with the aim of working towards a sustainable solution for the inclusion of heritage in urban and agricultural development strategies.

Professor Cameron Petrie, Principal Investigator of the MAHSA project, said: “I am extremely proud of what the collaborative MAHSA team have achieved during Phase 1, and the support from Arcadia for Phase 2 will allow us to continue making a transformational contribution to the documentation and understanding of the archaeological heritage of Pakistan and India. We are clarifying existing archaeological site locations datasets and collecting new ones at a scale never before attempted in South Asia.”

About Arcadia

Arcadia is a charitable foundation that works to protect nature, preserve cultural heritage and promote open access to knowledge. Since 2002 Arcadia has awarded more than $1.2 billion (£900 million) to organisations around the world.

About Mapping Africa’s Endangered Archaeological Sites and Monuments (MAEASaM)

About Mapping Archaeological Heritage in South Asia (MAHSA)

The charitable foundation awards £10.3 million for the continuation of 2 Cambridge projects mapping endangered archaeological heritage in South Asia and sub-Saharan Africa.


New way to extend ‘shelf life’ of blood stem cells will improve gene therapy

Test tubes in a lab

Researchers have identified a drug already used for cancer patients that, when applied to current gene therapy protocols, can improve blood stem cell function threefold.

One trillion blood cells are produced every day in humans, and blood stem cells are the only cell type in our body capable of producing all blood cell types over our lifespan, giving them immense regenerative potential.

Blood stem cell gene therapy is a ground-breaking treatment that currently provides the only cure for more than ten life-debilitating genetic diseases and has already saved the lives of more than two million people with blood cancers and other diseases.

These therapies take blood stem cell samples from patients; the genetic defect is corrected in a dish before the cells are delivered back to the patient. However, limitations persist in blood stem cell therapies because of the shelf life of the cells outside the body. When removed from their environment in the human body and cultured in a dish, most blood stem cells lose their function. The exact timing and cause of this loss of function have not previously been well understood.

Now, scientists in the Laurenti Group and others at the University of Cambridge’s Cambridge Stem Cell Institute (CSCI) and Department of Haematology have pinpointed a timeline of functional decline for blood stem cells under current gene therapy protocols, which typically take place over three days. After the first 24 hours in a dish, more than 50% of the blood stem cells can no longer sustain life-long blood production, which is before therapy would even begin in a clinical setting.

During those first 24 hours, the cells activate a complex molecular stress response in order to adapt to the dish. By studying this stress response, the team identified a solution: repurposing the cancer growth blocker Ruxolitinib, a drug already in use for cancer treatments, improved stem cell function in a dish threefold.

The group is now aiming to modify current gene therapy protocols to include this drug, providing patients with the highest number of high-quality blood stem cells and improving their outcomes.

The study is published today in the journal Blood.

Professor Elisa Laurenti at the University of Cambridge Stem Cell Institute, and senior author of the study, said: “This is really exciting because we are now in a position where we can begin to understand the huge stress that these stem cells sense when they are manipulated outside of our body. Biologically it is really fascinating because it affects every aspect of their biology. Luckily, we were able to identify a key molecular pathway which governs many of these responses, and that can be targeted by a drug which is already in use and is safe to use.

“I hope our findings will enable safer treatments for gene therapy patients. Our discovery also opens up many possibilities to better expand blood stem cells ex vivo and expand the set of disease where we can use blood stem cells to improve patients’ lives.”

Dr Carys Johnson at the University of Cambridge Stem Cell Institute, and first author of the study, said: “Although we expected that removing these cells from the body and culturing them on a plastic surface would alter gene expression, the extent of change we found was surprising, with over 10,000 genes altered and a significant stress response detected. It was also striking to discover that the majority of blood stem cells are functionally lost during gene therapy protocols, before transplantation back to the patient.

“We have identified a key bottleneck where function is lost and clinical culture could be improved. I hope that our work will drive advancements in culture protocols to better harness the power of blood stem cells and improve the safety and efficacy of clinical approaches.”

Reference

C.S. Johnson, M.J. Williams, K. Sham, et al. ‘Adaptation to ex vivo culture reduces human hematopoietic stem cell activity independently of cell cycle.’ Blood 2024; DOI: 10.1182/blood.2023021426

Story written by Laura Puhl, Cambridge Stem Cell Institute.

Researchers have discovered a way to extend the shelf life of blood stem cells outside the body for use in gene therapy, providing patients with better options and improving their outcomes.


One in four patients in vegetative or minimally conscious state able to perform cognitive tasks, study finds

Male patient in a hospital bed

Severe brain injury can leave individuals unable to respond to commands physically, but in some cases they are still able to activate areas of the brain that would ordinarily play a role in movement. This phenomenon is known as ‘cognitive motor dissociation’.

To determine what proportion of patients with so-called ‘disorders of consciousness’ experience this phenomenon – and to help inform clinical practice – researchers across Europe and North America recruited a total of 353 adults with disorders of consciousness, including the largest cohort of 100 patients studied at Cambridge University Hospitals NHS Foundation Trust.

Participants had mostly sustained brain injury from severe trauma, strokes or interrupted oxygen supply to the brain after heart attacks. Most were living in specialised long-term care facilities and a few were living at home with extensive care packages. The median time from injury for the whole group was about eight months.

Researchers assessed patterns of brain activation among these patients using functional magnetic resonance imaging (fMRI) or electroencephalography (EEG). Subjects were asked to repeatedly imagine performing a motor activity (for example, “keep wiggling your toes”, “swinging your arm as if playing tennis”, “walking around your house from room to room”) for periods of 15 to 30 seconds separated by equal periods of rest. To be able to follow such instructions requires not only the understanding of and response to a simple spoken command, but also more complex thought processes including paying attention and remembering the command.
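
As a simplified sketch of how command-following is typically inferred from recordings of this kind (synthetic data; not the study’s actual analysis pipeline), the measured signal can be compared against a model of the alternating instruction-and-rest blocks:

```python
import numpy as np

# Synthetic recording sampled at 1 Hz over eight alternating 30-second
# blocks: imagine-the-movement, rest, imagine-the-movement, rest, ...
block = 30                                                    # seconds per block
design = np.tile(np.r_[np.ones(block), np.zeros(block)], 4)   # boxcar task model

rng = np.random.default_rng(0)
signal = 0.8 * design + rng.normal(0.0, 1.0, design.size)     # task-locked + noise

# A strong positive correlation between the recorded signal and the task
# model is taken as evidence that the instruction is being followed.
r = np.corrcoef(signal, design)[0, 1]
print(f"Correlation with task design: {r:.2f}")
```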

The results of the study are published today in the New England Journal of Medicine.

Dr Emmanuel Stamatakis from the Department of Clinical Neurosciences at the University of Cambridge said: “When a patient has sustained a severe brain injury, there are very important, and often difficult, decisions to be made by doctors and family members about their care. It’s vitally important that we are able to understand the extent to which their cognitive processes are still functioning by utilising all available technology.” 

Among the 241 patients with a prolonged disorder of consciousness, who could not make any visible responses to bedside commands, one in four (25%) was able to perform cognitive tasks, producing the same patterns of brain activity recorded with EEG and/or fMRI that are seen in healthy subjects in response to the same instructions.

In the 112 patients who did demonstrate some motor responses to spoken commands at the bedside, 38% performed these complex cognitive tasks during fMRI or EEG. However, the majority of these patients (62%) did not demonstrate such brain activation. This counter-intuitive finding emphasises that the fMRI and EEG tasks require patients to have complex cognitive abilities such as short-term memory and sustained concentration, which are not required to the same extent for following bedside commands.

These findings are clinically very important for the assessment and management of the estimated 1,000 to 8,000 individuals in the UK in the vegetative state and 20,000 to 50,000 in a minimally conscious state. The detection of cognitive motor dissociation has been associated with more rapid recovery and better outcomes one year post injury, although the majority of such patients will remain significantly disabled, albeit with some making remarkable recoveries.

Dr Judith Allanson, Consultant in Neurorehabilitation, said: “A quarter of the patients who have been diagnosed as in a vegetative or minimally conscious state after detailed behavioural assessments by experienced clinicians, have been found to be able to imagine carrying out complex activities when specifically asked to. This sobering fact suggests that some seemingly unconscious patients may be aware and possibly capable of significant participation in rehabilitation and communication with the support of appropriate technology.

“Just knowing that a patient has this ability to respond cognitively is a game changer in terms of the degree of engagement of caregivers and family members, referrals for specialist rehabilitation and best interest discussions about the continuation of life sustaining treatments.”

The researchers caution that care must be taken to ensure the findings are not misrepresented, pointing out, for example, that a negative fMRI/EEG result does not per se exclude cognitive motor dissociation as even some healthy volunteers do not show these responses.

Professor John Pickard, emeritus professorial Fellow of St Catharine's College, Cambridge, said: “Only positive results – in other words, where patients are able to perform complex cognitive processes – should be used to inform management of patients, which will require meticulous follow up involving specialist rehabilitation services.”

The team is calling for a network of research platforms to be established in the UK to enable multicentre studies to examine mechanisms of recovery, develop easier methods of assessment than task-based fMRI/EEG, and to design novel interventions to enhance recovery including drugs, brain stimulation and brain-computer interfaces.

The research reported here was primarily funded by the James S. McDonnell Foundation. The work in Cambridge was supported by the National Institute for Health and Care Research UK, MRC, Smith’s Charity, Evelyn Trust, CLAHRC ARC fellowship and the Stephen Erskine Fellowship (Queens’ College). 

Reference
Bodien, YG et al. Cognitive Motor Dissociation in Disorders of Consciousness. NEJM; 14 Aug 2024; DOI: 10.1056/NEJMoa2400645

Adapted from a press release from Weill Cornell Medicine

Around one in four patients with severe brain injury who cannot move or speak – because they are in a prolonged coma, vegetative or minimally conscious state – is still able to perform complex mental tasks, a major international study has concluded, confirming the findings of much smaller previous studies.

Acknowledgements

The multidisciplinary Cambridge Impaired Consciousness Research Group, led by Emeritus Professors John Pickard (Neurosurgery) & David Menon (Anaesthesia) and Drs Judith Allanson & Emmanuel A. Stamatakis (Lead, Cognition and Consciousness Imaging Group), started its research programme in 1997, partly in response to emerging concern over the misdiagnosis of the vegetative state. This pioneering work has only been possible through access to the world-class resources of the Wolfson Brain Imaging Centre, the NIHR/Wellcome Clinical Research Facility at Addenbrooke’s Hospital, the MRC Cognition and Brain Sciences Unit (Professors Barbara Wilson & Adrian Owen), the Royal Hospital for Neuro-disability (Putney) and the Central England Rehabilitation Unit (Royal Leamington Spa).


Historic fires trapped in Antarctic ice yield key information for climate models

Researcher holding up an ice sample

Researchers from the University of Cambridge and the British Antarctic Survey tracked fire activity over the past 150 years by measuring carbon monoxide trapped in Antarctic ice. This gas is released, along with smoke and particulates, by wildfires, cooking and communal fires.

The findings, reported in the Proceedings of the National Academy of Sciences, reveal that biomass burning has been more variable since the 1800s than had been thought. The new data could help improve climate models, which rely on information about past atmospheric gases, such as carbon monoxide, to improve their forecasts.

“We’ve been missing key information from the period when humans started to dramatically alter Earth’s climate; information needed to test and develop climate models,” said Rachael Rhodes, senior author of the paper from Cambridge’s Department of Earth Sciences.  

The new carbon monoxide record fills that gap in time. The researchers charted the strength of biomass burning between 1821 and 1995 by measuring carbon monoxide in ice cores from Antarctica. The layers of ice inside these cores formed when snow was buried under subsequent years’ snowfall, encasing pockets of air that directly sample the atmosphere's composition at the time.

“It’s rare to find trace gases trapped in ice cores for the most recent decades,” said Ivo Strawson, lead author of the study who is jointly based at Cambridge Earth Sciences and the British Antarctic Survey. “We need information on the atmosphere's composition following the onset of industrialisation to reduce uncertainties in climate models, which rely on these records to test or drive their simulations.”

A major difficulty with taking gas measurements from very young ice is that pressurised air bubbles haven’t had time to form under the weight of more snow, said Strawson. To get around this problem, the researchers studied ice from locations where snow accumulates rapidly. These ice cores, held in BAS’ dedicated Ice Core Laboratory, were collected from the Antarctic Peninsula as part of previous international projects.

To measure carbon monoxide, the researchers developed a state-of-the-art analysis method, which melts ice continuously while simultaneously extracting the air. They collected tens of thousands of gas measurements for the past 150 years.
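
Purely as an illustrative sketch (synthetic numbers; the team’s depth-age modelling and calibration are far more involved), turning a continuous stream of melt measurements into a historical record amounts to mapping each measurement’s depth to an age and averaging within time bins:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic continuous-melt output: depth (m) and carbon monoxide (ppb).
depth = np.linspace(0.0, 120.0, 5000)
co_ppb = 50 + 5 * np.sin(depth / 10) + rng.normal(0.0, 1.0, depth.size)

# Toy depth-age relationship: assume ~0.7 m of ice accumulates per year,
# with the core drilled in 1995.
year = 1995 - depth / 0.7

# Average the measurements into annual bins to build the historical record.
bins = np.arange(1821, 1997)
idx = np.digitize(year, bins)
annual = np.array([co_ppb[idx == i].mean() for i in range(1, bins.size)
                   if np.any(idx == i)])
print(f"{annual.size} annual mean CO values reconstructed")
```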

The researchers found that the strength of biomass burning has dropped steadily since the 1920s. That decline, said Rhodes, coincides with the expansion and intensification of agriculture in southern Africa, South America, and Australia during the early 20th century. With wildlands converted into farmland, forest cover was restricted and in turn fire activity dropped. “This trend reflects how land conversion and human expansion have negatively impacted landscapes and ecosystems, causing a major shift in the natural fire regime and in turn altering our planet’s carbon cycle,” said Rhodes.

One assumption made by many climate models, including those used by the IPCC, is that fire activity has increased in tandem with population growth. But, said Rhodes, “our work adds to a growing mass of evidence that this assumption is wrong, and the inventories of historic fire activity need to be corrected so that models can accurately replicate the variability we see in our record.”

Rachael Rhodes is a Fellow of Wolfson College, Cambridge. 

Reference:
Ivo Strawson et al. "Preindustrial Southern Hemisphere biomass burning variability inferred from ice core carbon monoxide records." Proceedings of the National Academy of Sciences (2024). DOI: https://doi.org/10.1073/pnas.2402868121

Pollutants preserved in Antarctic ice document historic fires in the Southern Hemisphere, offering a glimpse at how humans have impacted the landscape and providing data that could help scientists understand future climate change.


Advanced MRI scans help identify one in three concussion patients with ‘hidden disease’

Diffusion tensor imaging (DTI) MRI of the human brain - stock photo

Around one in 200 people in Europe every year will suffer concussion. In the UK, more than 1 million people attend Emergency Departments annually with a recent head injury. It is the most common form of brain injury worldwide.

When a patient in the UK presents at an Emergency Department with head injury, they are assessed according to the NICE head injury guidelines. Depending on their symptoms, they may be referred for a CT scan, which looks for brain injuries including bruising, bleeding and swelling.

However, CT scans identify abnormalities in fewer than one in 10 patients with concussion, yet 30-40% of patients discharged from the Emergency Department following a scan experience significant symptoms that can last for years and be potentially life-changing. These include severe fatigue, poor memory, headaches, and mental health issues (including anxiety, depression, and post-traumatic stress).

Dr Virginia Newcombe from the Department of Medicine at the University of Cambridge and an Intensive Care Medicine and Emergency Physician at Addenbrooke’s Hospital, Cambridge, said: “The majority of head injury patients are sent home with a piece of paper telling them the symptoms of post-concussion to look out for and are told to seek help from their GP if their symptoms worsen.

“The problem is that the nature of concussion means patients and their GPs often don’t recognise that their symptoms are serious enough to need follow-up. Patients describe it as a ‘hidden disease’, unlike, say, breaking a bone. Without objective evidence of a brain injury, such as a scan, these patients often feel that their symptoms are dismissed or ignored when they seek help.”

In a study published today in eClinicalMedicine, Dr Newcombe and colleagues show that an advanced form of MRI known as diffusion tensor imaging (DTI) can substantially improve existing prognostic models for patients with concussion whose CT brain scan is normal.

DTI measures how water molecules move in tissue, providing detailed images of the pathways, known as white matter tracts, that connect different parts of the brain. Standard MRI scanners can be adapted to measure this data, which can be used to calculate a DTI ‘score’ based on the number of different brain regions with abnormalities.

Dr Newcombe and colleagues studied data from more than 1,000 patients recruited to the Collaborative European NeuroTrauma Effectiveness Research in Traumatic Brain Injury (CENTER-TBI) study between December 2014 and December 2017. Some 38% of the patients had an incomplete recovery, meaning that their symptoms persisted three months after discharge.

The team assigned DTI scores to the 153 patients who had received a DTI scan. This significantly improved the accuracy of the prognosis – whereas the current clinical model would correctly predict in 69 cases out of 100 that a patient would have a poorer outcome, DTI increased this to 82 cases out of 100.
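
As a hedged sketch of the general idea only (synthetic data, an arbitrary abnormality threshold, and an in-sample AUC rather than the study’s validated metric), a DTI score can be formed by counting the white matter regions whose diffusion metrics fall outside a normal range, and then added as an extra feature to a clinical prognostic model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_patients, n_regions = 300, 40

# Synthetic per-region diffusion z-scores (relative to a healthy reference)
# and a single clinical covariate; outcome = incomplete recovery at 3 months.
z = rng.normal(0.0, 1.0, (n_patients, n_regions))
severity = rng.normal(0.0, 1.0, n_patients)
dti_score = (np.abs(z) > 2).sum(axis=1)          # count of abnormal regions
p = 1 / (1 + np.exp(-(-1.0 + 0.6 * severity + 0.15 * dti_score)))
outcome = rng.binomial(1, p)

clinical = severity.reshape(-1, 1)
combined = np.column_stack([severity, dti_score])

model_clin = LogisticRegression().fit(clinical, outcome)
model_both = LogisticRegression().fit(combined, outcome)
auc_clin = roc_auc_score(outcome, model_clin.predict_proba(clinical)[:, 1])
auc_both = roc_auc_score(outcome, model_both.predict_proba(combined)[:, 1])
print(f"In-sample AUC, clinical only: {auc_clin:.2f}; with DTI score: {auc_both:.2f}")
```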

Whole brain diffusion tensor tractography showing healthy patient (left) and patient at two days (centre) and six weeks (right) after severe traumatic brain injury (Credit: Virginia Newcombe)

The researchers also looked at blood biomarkers – proteins released into the blood as a result of head injury – to see whether any of these could improve the accuracy of the prognosis. Although the biomarkers alone were not sufficient, concentrations of two particular proteins – glial fibrillary acidic protein (GFAP) within the first 12 hours and neurofilament light (NFL) between 12 and 24 hours following injury – were useful in identifying those patients who might benefit from a DTI scan.

Dr Newcombe said: “Concussion is the number one neurological condition to affect adults, but health services don’t have the resources to routinely bring back every patient for a follow-up, which is why we need a way of identifying those patients at greatest risk of persistent symptoms.

“Current methods for assessing an individual’s outlook following head injury are not good enough, but using DTI – which, in theory, should be possible for any centre with an MRI scanner – can help us make much more accurate assessments. Given that symptoms of concussion can have a significant impact on an individual’s life, this is urgently needed.”

The team plan to look in greater detail at blood biomarkers, to see if they can identify new ways to provide even simpler, more practical predictors. They will also be exploring ways to bring DTI into clinical practice.

Dr Sophie Richter, an NIHR Clinical Lecturer in Emergency Medicine at the University of Cambridge and first author of the study, added: “We want to see if there is a way to integrate the different types of information obtained when a patient presents at hospital with brain injury – symptoms assessment, blood tests and brain scans, for example – to improve our assessment of a patient’s injury and prognosis.”

The research was funded by the European Union's Seventh Framework Programme, Wellcome and the National Institute for Health and Care Excellence.

Reference
Richter, S et al. Predicting recovery in patients with mild traumatic brain injury and a normal CT using serum biomarkers and diffusion tensor imaging (CENTER-TBI): an observational cohort study. eClinMed; 8 Aug 2024; DOI: 10.1016/j.eclinm.2024.102751

Offering patients with concussion a type of brain scan known as diffusion tensor imaging MRI could help identify the one in three people who will experience persistent symptoms that can be life changing, say Cambridge researchers.


‘Far from clear’ new Alzheimer’s drugs will make a difference at a population level, say researchers

Woman sitting in a wheelchair

Writing in Alzheimer's & Dementia: The Journal of the Alzheimer's Association, the team from Cambridge Public Health argue that substantial challenges, including the risk-benefit ratio, limited eligibility and the high cost of roll-out, will limit any benefits of these treatments.

Alzheimer’s disease is often quoted as causing 70% of the 55 million cases of dementia worldwide, though the definition of what constitutes the disease is hotly debated. One characteristic of Alzheimer’s is the build-up of clusters of misfolded proteins, one of these being a form of amyloid, leading to plaques in the brain. The amyloid cascade hypothesis, a dominant theory in the field, suggests that this triggers a series of processes which together lead to dementia symptoms.

Progress in developing treatments to reduce symptoms and slow progression in the early stages of Alzheimer’s has been slow. However, there has been recent excitement surrounding amyloid immunotherapy agents, drugs that harness the immune system to remove amyloid pathology.

Two completed phase III randomised controlled trials of amyloid immunotherapy reported statistically significant reductions in the rate of cognitive and functional decline compared to the placebo.

But as the Cambridge team point out, the effect sizes were small – small enough that a doctor would struggle to tell the difference between the average decline of a patient on the drug and another on placebo, after 18 months. The drugs were also associated with significant adverse events, including brain swelling and bleeding; during the phase III trial of one agent, donanemab, there were also three deaths attributed to the treatment.

Crucially, there is little known about the long-term effects of the drugs beyond the 18 month trial periods. Long-term placebo-controlled trials, which would be needed to see if there is any clinically meaningful slowing of decline, are unlikely to be feasible where drugs are already approved.    

Despite this, the US Food and Drug Administration has licensed two such drugs. The European Medicines Agency (EMA) has recommended rejecting one (lecanemab) predominantly on the grounds that the small effects seen do not outweigh the risk from side effects; it is reviewing the other. The UK’s Medicines and Healthcare Products Regulatory Agency (MHRA) is expected to take a decision on both drugs imminently.

Edo Richard, Professor of Neurology at Radboud University Medical Centre in Nijmegen, The Netherlands, and co-author, said: “If these drugs are approved by regulators in the UK and Europe, and become available, it is understandable that some people with early Alzheimer’s will still want to try these drugs, given their despair living with this dreadful disease. But there is a lot of hyperbole around the reporting of these drugs, and significant effort will be needed to provide balanced information to patients to enable informed decisions.”

Press coverage of the drugs has implied they are suitable for anyone with a diagnosis of Alzheimer’s. However, while the trials included those with ‘early symptomatic Alzheimer’s disease’, they excluded those with other conditions that may have been contributing to their symptoms. Evidence suggests that the people in the trials represent less than 8% of those in the community with early Alzheimer’s disease. Those in the trials were up to 10 to 15 years younger than those typically presenting to health services with early symptoms.

Lead author Dr Sebastian Walsh, NIHR Doctoral Fellow in Public Health Medicine at Cambridge Public Health, University of Cambridge, added: “If approved, the drugs are likely to be relevant only for a relatively small cohort of Alzheimer’s patients, so potential recipients will need to undergo a range of assessments before being given access to the drugs. Plus, effect sizes seen in the trials are very small and the drugs will need to be administered as early in the disease process as possible, when symptoms are mild – and people in these phases of disease can be hard to identify.”

The resource requirements for rolling out such treatments are likely to be considerable. Even if approved for only a small proportion of Alzheimer’s patients, a much broader group of people will need to be assessed for eligibility, requiring rapid specialist clinical assessment and tests. The authors question whether this is the best use of these resources, given the strain health systems are already under. Support would also be required for the large number of Alzheimer’s patients (potentially as many as 92%) found to be ineligible. Those found to have insufficient amyloid to be eligible may then require follow-up assessments to determine eligibility in the future, with the further implications for services this would entail.

Professor Carol Brayne, Co-director of Cambridge Public Health, said: “Even in high-income countries, rolling out such types of treatments at scale is highly challenging, but most dementia occurs in low- and middle-income countries. Health systems in these countries are highly unlikely to have the resources required to offer these new drugs, even to a very narrow group.

“Other compelling evidence suggests that attention to inequalities and health experience across people’s lives could have greater impact on the rates of dementia in populations. Most dementia is more complicated than a single protein.”

The team concludes that based on current evidence, it is far from clear whether amyloid immunotherapy can ever significantly reduce suffering caused by dementia at scale in the community, and we must continue to explore other approaches.

Professor Brayne added: “With an ageing population, we urgently need effective ways to support people living with dementia, but while the current amyloid immunotherapies may show a glint of promise for very selected groups, it’s clear these drugs will not address dementia risk at scale.”

Reference
Walsh, S et al. Considering challenges for the new Alzheimer’s drugs: clinical, population, and health system perspectives. Alz&Dem; 6 Aug 2024; DOI: 10.1002/alz.14108

Cambridge researchers have cast doubt on whether new amyloid immunotherapy drugs will have the desired effect of significantly reducing the impact of Alzheimer’s disease.


A new way of thinking about the economy could help protect the Amazon, and help its people thrive

Man (seringueiro) extracts latex from a tree in the middle of the Amazon.

A group of conservationists from Bolivia, Brazil, Peru, Ecuador, the US and the UK say that current conservation and development efforts will never sustain or scale without systemic changes in how economies are designed.

Despite extensive destruction of the Amazon in the name of economic development, Amazonian communities have seen little improvement in income, life expectancy, and education. The researchers have proposed a new model and associated policy changes that could create fair and sustainable futures for the Amazon and its people by improving infrastructure, supply chains, and social organisations.

Their results, reported in the journal Nature Ecology and Evolution, are focused on the Amazon; however, the researchers say similar economic models could be implemented around the world if the political will exists.

The Amazon basin is home to the world’s largest tropical rainforest, representing over half of the world’s remaining rainforest, and stores vast amounts of carbon. However, decades of large-scale deforestation, as well as the increased risk of fires and floods due to climate change, have put much of the Amazon rainforest under threat. In addition to what the loss of the Amazon would mean for global carbon emissions, the rainforest is also home to many indigenous peoples and thousands of species of plants and animals.

“We need a different vision for the Amazon if we’re going to protect it,” said lead author Professor Rachael Garrett from the University of Cambridge’s Department of Geography and the Conservation Research Institute. “Half a century of deforestation and exploitation of the Amazon has not resulted in widespread development, and now the economic value of deforested areas is threatened, not to mention the threats to the global climate and water security.”

Working with colleagues from the Amazonian region, Garrett has proposed building on the success of indigenous and traditional communities to develop new economies, which could protect much of the Amazon while also improving the livelihoods, health, and food security of the many people who live there. These economic models are known as socio-bioeconomies, or SBEs.

“Conventional economic models can result in short-term gains, but over the longer term, the people and resources of the Amazon basin have been exploited by powerful interests, while there has been an underinvestment in education, innovation, and sustainable infrastructure,” said Garrett. “The conventional economic model is simply not sustainable.”

The SBE model is focused on using and restoring Amazonian and other ecosystems sustainably, and supporting indigenous and rural communities. An SBE economy might include eco-friendly tourism, or the sustainable harvest and processing of plant products into valuable foods, beverages, clothing, and medicines.

“A limited range of interests are controlling the development agenda in most countries,” said Garrett. “The only way we can change that is improving the rights and representation of the people who are not benefiting from the systems and are being harmed by ongoing environmental destruction. We believe it is possible to have win-wins for humanity and conservation, but not if we continue to consume products that have a massively negative impact. SBEs can help put these win-wins into policy and practice.”

Garrett cites the footwear brand Veja as an example of such a win-win. The French company buys the rubber for its trainers from small-scale Amazonian rubber farmers, and purchases 100% of the responsibly harvested native rubber in Brazil. As part of its sustainability efforts, the company focuses on building communities of small-scale farmers and has been financially successful without traditional advertising.

Garrett and her collaborators are calling for massive increases in social mobilisation, technology and infrastructure to support SBEs. Under an SBE model, governmental subsidies would be redirected away from agribusiness and toward smaller-scale sustainable development. The researchers also outline how to build connections between rural and urban policies in SBEs. An example is the establishment of public procurement programmes where healthy and sustainably produced foods are purchased directly from indigenous and small farming communities and served in school lunch programmes and hospitals, instead of supporting large-scale agribusiness engaged in degrading practices.

Other policy changes that could support an SBE model include redirecting finance to conservation and restoration activities, supporting community enterprises, and establishing participatory processes that deliver inclusive, long-term benefits.

“It’s possible to have an economy that is strong and works for everyone when we dare to develop new models and visions that recognise the interconnectedness of people and nature,” said Garrett. “By popularising these ideas, investing in people and businesses who are making a difference, and supporting research into SBE innovation we can support a transformation in both conservation and development in the Amazon.

“The SBE model could help protect the Amazon and its people while avoiding climate and biodiversity disasters, but there needs to be the political will to make it happen.”

Rachael Garrett is the incoming director of the University of Cambridge Conservation Research Institute and a Fellow of Homerton College, Cambridge. She is a council member of the Cambridge Conservation Initiative and serves on the UN Science Panel for the Amazon.

 

Reference:
Rachael Garrett et al. ‘Transformative changes are needed to support socio-bioeconomies for people and ecosystems in the Amazon.’ Nature Ecology and Evolution (2024). DOI: 10.1038/s41559-024-02467-9

To protect the Amazon and support the wellbeing of its people, its economy needs to shift from environmentally harmful production to a model built around the diversity of indigenous and rural communities, and standing forests.

Man extracts latex from a tree in the middle of the Amazon.


Astronomers uncover risks to planets that could host life

A red dwarf star unleashes a series of powerful flares.

The discovery suggests that the intense UV radiation from these flares could significantly impact whether planets around red dwarf stars can be habitable.

“Few stars have been thought to generate enough UV radiation through flares to impact planet habitability. Our findings show that many more stars may have this capability,” said first author Vera Berger, who led the research while at the University of Hawai’i and is now based at the University of Cambridge.

Berger and her team used archival data from the GALEX space telescope to search for flares among 300,000 nearby stars. GALEX is a now-decommissioned NASA mission that observed most of the sky simultaneously at near- and far-UV wavelengths from 2003 to 2013. Using new computational techniques, the team mined the data for the brief brightenings that signal flares.
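Searching archival photometry for flares essentially means flagging brief, statistically significant brightenings above a star’s quiescent flux. The team’s actual pipeline is not reproduced here; the short Python sketch below is a hypothetical illustration of that general idea using only numpy and invented data.

import numpy as np

def detect_flare_candidates(flux, n_sigma=5.0):
    """Flag points that rise well above a star's quiescent brightness.

    A crude stand-in for a flare search: the quiescent level is estimated with
    the median, the scatter with the median absolute deviation (MAD), and any
    point more than n_sigma robust standard deviations above the quiescent
    level is flagged. Real pipelines also require consecutive points, check
    flare rise/decay shapes and screen out instrumental artefacts.
    """
    flux = np.asarray(flux, dtype=float)
    quiescent = np.median(flux)
    robust_sigma = 1.4826 * np.median(np.abs(flux - quiescent))  # MAD -> sigma
    if robust_sigma == 0:
        return np.zeros(flux.size, dtype=bool)
    return flux > quiescent + n_sigma * robust_sigma

# Toy example: a flat, noisy light curve with one injected flare.
rng = np.random.default_rng(0)
lightcurve = rng.normal(loc=100.0, scale=2.0, size=500)
lightcurve[250:253] += [40.0, 25.0, 10.0]  # injected flare
print(np.flatnonzero(detect_flare_candidates(lightcurve)))  # flags the injected flare points around index 250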

“Combining modern computer power with gigabytes of decades-old observations allowed us to search for flares on thousands and thousands of nearby stars,” said co-author Dr Michael Tucker from Ohio State University.

According to researchers, UV radiation from stellar flares can either erode planetary atmospheres, threatening their potential to support life, or contribute to the formation of RNA building blocks, which are essential for the creation of life.

The study, published in the Monthly Notices of the Royal Astronomical Society, challenges existing models of stellar flares and exoplanet habitability, showing that far-UV emission from flares is on average three times more energetic than typically assumed, and can reach up to twelve times the expected energy levels.

“A change of three is the same as the difference in UV in the summer from Anchorage, Alaska to Honolulu, where unprotected skin can get a sunburn in less than 10 minutes,” said co-author Benjamin J. Shappee from the University of Hawai’i.

The exact cause of this stronger far-UV emission remains unclear. The team believes it might be that flare radiation is concentrated at specific wavelengths, indicating the presence of atoms like carbon and nitrogen.

“This study has changed the picture of the environments around stars less massive than our Sun, which emit very little UV light outside of flares,” said co-author Jason Hinkle.

According to Berger, now a Churchill Scholar at Cambridge, more data from space telescopes is needed to study the UV light from stars, which is crucial for understanding the source of this emission.

“Our work puts a spotlight on the need for further exploration into the effects of stellar flares on exoplanetary environments,” said Berger. “Using space telescopes to obtain UV spectra of stars will be crucial for better understanding the origins of this emission.”

Reference:
Vera L Berger et al. ‘Stellar flares are far-ultraviolet luminous.’ Monthly Notices of the Royal Astronomical Society (2024). DOI: 10.1093/mnras/stae1648

Adapted from a University of Hawai’i media release.

Astronomers have discovered that red dwarf stars can produce stellar flares that carry far-ultraviolet (far-UV) radiation levels much higher than previously believed.

A red dwarf star unleashes a series of powerful flares.


The rise, fall and revival of research on human development

Photos of embryos of horizon XVII, published in Contributions to Embryology in 1948 and still in use as Carnegie Stage 17.

Analysing the past sheds light on the present resurgence of research on human development. That’s the lesson of a new study by Professor Nick Hopwood, from the Department of History and Philosophy of Science, that is published in the Journal of the History of Biology. The paper discusses the flourishing of human embryology a century ago, its drop in popularity after World War II, and especially its revival since the late twentieth century.

“Every journal article and news story about human development includes a bit of history, but it’s often narrow, rarely informative and not always accurate”, Hopwood says. “I wanted to stand back and see a bigger picture, then dig down to find out how and why there has been such a surge of attention. Working in Cambridge made that easier.”

The University has been at the forefront of innovation, from the first test-tube baby to the extended culture of early embryos, organoids and other stem-cell models. The networking through Cambridge Reproduction of expertise in science and medicine, humanities and social sciences helped Hopwood reconstruct the genesis of these advances. This took a combination of research in libraries and archives and interactions with scientists, including interviews, sharing of documents, attending conferences and giving talks, here and elsewhere.

“Human development has long been of special interest as evidence of our origins and for its medical relevance, but is hard to study”, Hopwood explains. “Historically there have been two main approaches. Either deciding that it’s too difficult to research human embryos because they’re usually hidden in pregnant bodies, so we should study other animals and hope results will transfer. That’s an indirect approach. Or trying for the best possible results from the few human specimens that can be obtained. That’s a direct approach. My article analyses the rise of research directly on human material as part of the changing politics of choosing a species to study. I explore how researchers distanced themselves from work on animal models but even human studies depended on this.”

Interest in human embryos grew in the later 19th century, following debates about evolution. Darwinists pointed to the similarity of humans and other animals at early stages as evidence of common descent. Critical anatomists responded by setting up networks of physicians to collect material, mainly from women’s pregnancy losses. New techniques such as serial sectioning and wax modelling from the slices made details of internal structure visible in 3-D.

This led to a watershed moment: the establishment by the Carnegie Institution of Washington of a Department of Embryology at Johns Hopkins University in Baltimore. Founded in 1914, this was the first research institution devoted specifically to embryology, and it focused on human embryos, now also increasingly recovered from aseptic operations for various conditions. Important discoveries included the elucidation of the timing of ovulation in the menstrual cycle, established initially in rhesus macaques. Human embryos from the first two weeks after fertilisation were described for the first time.

Flies, frogs and chicks

After World War II human embryology ran out of steam. A new field, developmental biology, focused on model organisms, such as flies, frogs, chicks and, as the exemplary mammal, mice.

“To make progress, the argument went, it was necessary to work on species where more could be done more easily”, Hopwood explains. “That meant micromanipulation, enough material to do biochemistry and molecular biology, and genetic tools.” This approach demonstrated its power in the 1980s, when mechanisms of development were found to be more conserved across the animal kingdom than researchers had imagined. Yet from around the same time interest revived in using human material.

“There was not a steadily rising curve of research on human development through the twentieth century”, Hopwood contends. “Instead, human embryos have gone through cycles of attention and neglect. As opportunities opened up and the balance of power shifted between researchers invested in different organisms, so the politics of species choice have changed. Over the last four decades we’ve seen a renewal of research directly on human development. This is in the first place because of changes in supply and demand.”

The achievement of human in-vitro fertilisation, with a live birth in 1978, gave access to embryos before implantation in the uterus. After much debate, the UK Human Fertilisation and Embryology Act 1990 permitted donated embryos to be kept in vitro, under strict regulations, for up to 14 days from fertilisation, though that limit was only approached in 2016. Meanwhile, biobanks, notably the Human Developmental Biology Resource in Newcastle and London, provided ethical supplies of post-implantation stages from terminations of pregnancy.

There has been opposition from anti-abortion activists, and many fewer embryos are donated for research than scientists (and some patients) would like. But the field was transformed. As in the years around 1900, new technologies eased the study of human embryos. Only now the advances were in digital communication, molecular analysis and imaging methods. Optical slices and computer graphics replaced microscope slides and wax models.

Beyond mice

To obtain human embryos with permission and funding to study them, researchers had to make the case for studying our own species. They stimulated demand by arguing that it would no longer do simply to extrapolate from mice. Knowledge and skills from the mouse model could be applied, but the differences as well as the similarities had to be explored. That was crucial before clinical application, as in fertility treatments. It was also desirable in discovering what makes us human—or at least not mice. Funders were keen to support medically relevant research or “translational science”.

In the last fifteen years another kind of model has transformed the politics of species choice. Subject to ongoing ethical negotiations, stem-cell-based embryo models have enabled fresh kinds of experiment on human development. Some researchers even argue that, for investigating fundamentals of vertebrate development, these human systems are now the model. Mice remain a crucial resource, with almost every innovation made on them first. But since their development is rather peculiar, other laboratories are promoting comparisons with species that develop more like humans.

Around ten years ago, all this inspired the organization of a new sub-field, human developmental biology, not least through a series of conferences. Major research programmes, such as the Human Developmental Biology Initiative, bring together scientists working, in different ways, on various aspects of embryogenesis.

Questions remain. Hopwood’s historical research concentrated on the USA and the UK, with nods to continental Europe and Japan. It would be good to explore other countries’ histories, he suggests, especially since differences in reproductive politics and infrastructure mean that access to material is uneven.

More generally, Hopwood argues, “history can contribute by showing how we got here and clarifying the arguments that have been used”. “It helps stakeholders see why there are now such opportunities for research on human development, and that, because arrangements are fragile, it will take work to gain and keep public support.” So a long-term perspective can assist researchers and funders in thinking about what might happen next.

“Interest in human development has risen and fallen and risen again. Are we now going through another cycle of attention, or could interest be maintained? Will the balance shift back to animal models or will we see an ever greater focus on humans, at least in the form of stem-cell models? How might present actions shape choice of species in the future?”

The research was part-funded by a Major Research Fellowship from the Leverhulme Trust. Story by Edward Grierson from the School of Humanities and Social Sciences communications team. 

A new study takes a tour of the history of research into human embryology and development to show the "cycles of attention" that led to major scientific breakthroughs.

Photos of embryos of horizon XVII, published in Contributions to Embryology in 1948 and still in use as Carnegie Stage 17.


Scientists discover entirely new wood type that could be highly efficient at carbon storage

Tulip tree in Cambridge University Botanic Garden

Scientists from the Sainsbury Laboratory at Cambridge University and Jagiellonian University, Poland, made the discovery while undertaking an evolutionary survey of the microscopic structure of wood from some of the world’s most iconic trees and shrubs.

They found that Tulip Trees, which are related to magnolias and can grow over 30 metres (100 feet) tall, have a unique type of wood. This discovery may explain why the trees, which diverged from magnolias when Earth's atmospheric CO2 concentrations were relatively low, grow so tall and so fast. It also opens new opportunities to improve carbon capture and storage in plantation forests, either by planting a fast-growing tree more commonly seen in ornamental gardens or by breeding Tulip Tree-like wood into other tree species.

The discovery was part of an evolutionary survey of the microscopic structure of wood from 33 tree species from the Cambridge University Botanic Garden’s Living Collections. The survey explored how wood ultrastructure evolved across softwoods (gymnosperms such as pines and conifers) and hardwoods (angiosperms including oak, ash, birch, and eucalypts). 

The wood samples were collected from trees in the Botanic Garden in coordination with its Collections Coordinator. Fresh samples of wood, deposited in the previous spring growing season, were collected from a selection of trees to reflect the evolutionary history of gymnosperm and angiosperm populations as they diverged and evolved. 

Using the Sainsbury Laboratory's low temperature scanning electron microscope (cryo-SEM), the team imaged and measured the size of the nanoscale architecture of secondary cell walls (wood) in their native hydrated state.

Microscopy Core Facility Manager at the Sainsbury Laboratory, Dr Raymond Wightman, said: “We analysed some of the world’s most iconic trees like the Coast Redwood, Wollemi Pine and so-called 'living fossils' such as Amborella trichopoda, which is the sole surviving species of a family of plants that was the earliest still existing group to evolve separately from all other flowering plants.

“Our survey data has given us new insights into the evolutionary relationships between wood nanostructure and the cell wall composition, which differs across the lineages of angiosperm and gymnosperm plants. Angiosperm cell walls possess characteristic narrower elementary units, called macrofibrils, compared to gymnosperms.” 

The researchers found that the two surviving species of the ancient Liriodendron genus, commonly known as the Tulip Tree (Liriodendron tulipifera) and the Chinese Tulip Tree (Liriodendron chinense), have much larger macrofibrils than their hardwood relatives.

Hardwood (angiosperm) macrofibrils are about 15 nanometres in diameter, while the macrofibrils of faster-growing softwood (gymnosperm) species are larger, at around 25 nanometres. Tulip Tree macrofibrils sit in between, measuring around 20 nanometres.

Lead author of the research published in New Phytologist, Dr Jan Łyczakowski from Jagiellonian University, said: “We show Liriodendrons have an intermediate macrofibril structure that is significantly different from the structure of either softwood or hardwood. Liriodendrons diverged from Magnolia Trees around 30-50 million years ago, which coincided with a rapid reduction in atmospheric CO2. This might help explain why Tulip Trees are highly effective at carbon storage.”

The team suspects it is the larger macrofibrils in this 'midwood' or 'accumulator-wood' that are behind the Tulip Trees’ rapid growth.

Łyczakowski added: “Both Tulip Tree species are known to be exceptionally efficient at locking in carbon, and their enlarged macrofibril structure could be an adaptation to help them more readily capture and store larger quantities of carbon when the availability of atmospheric carbon was being reduced. Tulip Trees may end up being useful for carbon capture plantations. Some east Asian countries are already using Liriodendron plantations to efficiently lock in carbon, and we now think this might be related to its novel wood structure.” 

Liriodendron tulipifera is native to North America, and Liriodendron chinense is a native species of central and southern China and Vietnam.

Łyczakowski said: “Despite its importance, we know little about how the structure of wood evolves and adapts to the external environment. We made some key new discoveries in this survey – an entirely novel form of wood ultrastructure never observed before and a family of gymnosperms with angiosperm-like hardwood instead of the typical gymnosperm softwood. 

“The main building blocks of wood are the secondary cell walls, and it is the architecture of these cell walls that give wood its density and strength that we rely on for construction. Secondary cell walls are also the largest repository of carbon in the biosphere, which makes it even more important to understand their diversity to further our carbon capture programmes to help mitigate climate change.”

This research was funded by the National Science Centre Poland and The Gatsby Charitable Foundation.

Reference: Lyczakowski, J L and Wightman, R. "Convergent and adaptive evolution drove change of secondary cell wall ultrastructure in extant lineages of seed plants." New Phytologist (July 2024). DOI: 10.1111/nph.19983

All cryo-SEM images from the wood survey are publicly available. 


Researchers have identified an entirely new type of wood that does not fit into either category of hardwood or softwood.

Tulip tree in Cambridge University Botanic Garden


Professor Sir Simon Baron-Cohen made honorary fellow of Royal Society of Medicine

Professor Henrietta Bowden-Jones, Professor Simon Baron-Cohen, Professor Roger Kirby

Professor Baron-Cohen is a British clinical psychologist and professor of developmental psychopathology at the University of Cambridge. He is the director of the university's Autism Research Centre and a Fellow of Trinity College.

The honorary fellowships were granted at a ceremony at the RSM’s central London home. 

Speaking at the ceremony, Professor Baron-Cohen said: “Although I’m receiving this honour, I’m really here because of the work of the team of researchers at the Autism Research Centre at Cambridge. I want to thank them for all their hard work into both basic science into trying to understand the cause of autism but also applied research to evaluate what kinds of support might help autistic people and their families.”

The Society also bestowed honours upon Baron Adebowale CBE, Major General Timothy Hodgetts CB, Professor Martin McKee CBE, Professor Dame Robina Shah and Professor Irene Tracey CBE.

Adapted from a news story by the Royal Society of Medicine.

Professor Sir Simon Baron-Cohen has been awarded an honorary fellowship of the Royal Society of Medicine, in recognition of his contribution to health, healthcare and medicine.

Although I’m receiving this honour, I’m really here because of the work of the team of researchers at the Autism Research Centre
Simon Baron-Cohen
Professor Henrietta Bowden-Jones, Professor Simon Baron-Cohen, Professor Roger Kirby


Incidence of heart attacks and strokes was lower after COVID-19 vaccination

Vial of the AstraZeneca COVID-19 vaccine

The study, published today in Nature Communications, showed that the incidence of arterial thromboses, such as heart attacks and strokes, was up to 10% lower in the 13 to 24 weeks after the first dose of a COVID-19 vaccine. Following a second dose, the incidence was up to 27% lower after receiving the AstraZeneca vaccine and up to 20% lower after the Pfizer/BioNTech vaccine.

The incidence of common venous thrombotic events – mainly pulmonary embolism and lower limb deep venous thrombosis – followed a similar pattern.

Research led by the Universities of Cambridge, Bristol and Edinburgh and enabled by the British Heart Foundation (BHF) Data Science Centre at Health Data Research UK analysed de-identified health records from 46 million adults in England between 8 December 2020 and 23 January 2022. Data scientists compared the incidence of cardiovascular diseases after vaccination with the incidence before or without vaccination, during the first two years of the vaccination programme.
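By way of a purely illustrative example (not the study’s actual statistical model, which analysed the linked records in much greater depth), the short Python snippet below shows how a comparison of this kind reduces to incidence rates per unit of person-time; every number in it is invented.

# Hypothetical illustration of comparing incidence rates; all numbers are invented.
def incidence_rate(events, person_years):
    """Events per person-year of follow-up."""
    return events / person_years

rate_without_vaccination = incidence_rate(events=1200, person_years=500_000)
rate_after_second_dose = incidence_rate(events=900, person_years=500_000)

ratio = rate_after_second_dose / rate_without_vaccination
print(f"rate ratio = {ratio:.2f} ({(1 - ratio) * 100:.0f}% lower incidence)")
# rate ratio = 0.75 (25% lower incidence)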

Co-first author Dr Samantha Ip, Research Associate at the Department of Public Health and Primary Care, University of Cambridge, said: “We studied COVID-19 vaccines and cardiovascular disease in nearly 46 million adults in England and found a similar or lower incidence of common cardiovascular diseases, such as heart attacks and strokes, following each vaccination than before or without vaccination. This research further supports the large body of evidence on the safety of the COVID-19 vaccination programme, which has been shown to provide protection against severe COVID-19 and saved millions of lives worldwide.”

Previous research found that the incidence of rare cardiovascular complications is higher after some COVID-19 vaccines. For example, myocarditis and pericarditis have been reported following mRNA-based vaccines such as the Pfizer/BioNTech vaccine, and vaccine-induced thrombotic thrombocytopenia following adenovirus-based vaccines such as the AstraZeneca vaccine. This study supports those findings, but importantly it did not identify new adverse cardiovascular conditions associated with COVID-19 vaccination, offering further reassurance that the benefits of vaccination outweigh the risks.

Incidence of cardiovascular disease is higher after COVID-19, especially in severe cases. This may explain why incidence of heart attacks and strokes is lower in vaccinated people compared with unvaccinated people, but further explanations are beyond the scope of this study.

Professor William Whiteley, Associate Director at the BHF Data Science Centre and Professor of Neurology and Epidemiology at the University of Edinburgh, said: “The COVID-19 vaccination programme rollout began strongly in the UK, with over 90% of the population over the age of 12 vaccinated with at least one dose by January 2022.

“This England-wide study offers patients reassurance of the cardiovascular safety of first, second and booster doses of COVID-19 vaccines. It demonstrates that the benefits of second and booster doses – with fewer common cardiovascular events, including heart attacks and strokes, after vaccination – outweigh the very rare cardiovascular complications.”

The research team used de-identified linked data from GP practices, hospital admissions and death records, analysed in a secure data environment provided by NHS England.

Co-last author Dr Venexia Walker, Research Fellow at the University of Bristol, said: “Given the critical role of COVID-19 vaccines in protecting people from COVID-19, it is important we continue to study the benefits and risks associated with them. The availability of population-wide data has allowed us to study different combinations of COVID-19 vaccines and to consider rare cardiovascular complications. This would not have been possible without the very large data that we are privileged to access and our close cross-institution collaborations.”

Reference
Ip, S et al. Cohort study of cardiovascular safety of different COVID-19 vaccination doses among 46 million adults in England. Nature Communications; 31 Jul 2024; DOI: 10.1038/s41467-024-49634-x

Adapted from a press release from Health Data Research UK

The incidence of heart attacks and strokes was lower after COVID-19 vaccination than before or without vaccination, according to a new study involving nearly the whole adult population of England.

This research further supports the large body of evidence on the effectiveness of the COVID-19 vaccination programme, which has saved millions of lives worldwide
Samantha Ip
Vial of the AstraZeneca COVID-19 vaccine


Thousands of birds and fish threatened by mining for clean energy transition

Gold mine in Rondonia, Amazonian Brazil

New research has found that 4,642 species of vertebrate are threatened by mineral extraction around the world through mining and quarrying, and drilling for oil and gas.

Mining activity coincides with the world's most valuable biodiversity hotspots, which contain a hyper-diversity of species and unique habitats found nowhere else on Earth.

The biggest risk to species comes from mining for materials fundamental to our transition to clean energy, such as lithium and cobalt – both essential components of solar panels, wind turbines and electric cars.

Quarrying for limestone, which is required in huge amounts for cement as a construction material, is also putting many species at risk.

The threat to nature is not limited to the physical locations of the mines - species living at great distances away can also be impacted, for example by polluted watercourses, or deforestation for new access roads and infrastructure.

The researchers say governments and the mining industry should focus on reducing the pollution driven by mining as an ‘easy win’ to reduce the biodiversity loss associated with mineral extraction.

This is the most complete global assessment of the threat to biodiversity from mineral extraction ever undertaken. The results are published today in the journal Current Biology.

“We simply won’t be able to deliver the clean energy we need to reduce our climate impact without mining for the materials we need, and that creates a problem because we’re mining in locations that often have very high levels of biodiversity,” said Professor David Edwards in the University of Cambridge’s Department of Plant Sciences and Conservation Research Institute, senior author of the report.

He added: “So many species, particularly fish, are being put at risk through the pollution caused by mining. It would be an easy win to work on reducing this freshwater pollution so we can still get the products we need for the clean energy transition, but in a way that isn’t causing so much biodiversity loss.”

Across all vertebrate species, fish are at particularly high risk from mining (2,053 species), followed by reptiles, amphibians, birds and mammals. The level of threat appears to be linked to where a particular species lives and its lifestyle: species that use freshwater habitats and species with small ranges are particularly at risk.

“The need for limestone as a core component of construction activity also poses a real risk to wildlife. Lots of species are very restricted in where they live because they're specialised to live on limestone. A cement mine can literally take out an entire hillside - and with it these species’ homes,” said Ieuan Lamb in the University of Sheffield’s School of Biosciences, first author of the report.

The Bent-Toed Gecko, for example, is threatened by limestone quarrying in Malaysia – it only exists on a single mountain range that planned mining activity will completely destroy.

To get their results, the researchers used International Union for the Conservation of Nature (IUCN) data to see which vertebrate species are threatened by mining. By mapping the locations of these species they could investigate the types of mining that are putting species at risk, and see where the risks are particularly high.
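In outline, that amounts to cross-referencing each species’ IUCN threat codes against the codes used for extractive industries. The Python sketch below is a hypothetical, simplified illustration of such a filter: the records are invented, codes other than the standard IUCN categories 3.1 (oil and gas drilling) and 3.2 (mining and quarrying) are placeholders, and the study’s exact inclusion criteria may differ.

from collections import Counter

# Invented records: each species carries a set of IUCN threat codes.
species_records = [
    {"name": "Indian Skimmer", "class": "Aves", "threats": {"3.2", "other"}},
    {"name": "Hypothetical gecko", "class": "Reptilia", "threats": {"3.2"}},
    {"name": "Hypothetical catfish", "class": "Actinopterygii", "threats": {"3.1", "other"}},
    {"name": "Hypothetical frog", "class": "Amphibia", "threats": {"other"}},
]

EXTRACTION_CODES = {"3.1", "3.2"}  # oil & gas drilling; mining & quarrying

threatened = [r for r in species_records if r["threats"] & EXTRACTION_CODES]
print(Counter(r["class"] for r in threatened))
# Counter({'Aves': 1, 'Reptilia': 1, 'Actinopterygii': 1})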

The researchers discovered that species categorised as ‘vulnerable, endangered, or critically endangered’ are more threatened by mineral extraction than species of lesser concern.

Watercourses can be affected in many ways, and water pollution can affect hundreds of thousands of square kilometres of rivers and flood plains. Mining sand as a construction material, for example, alters patterns of water flow in rivers and wetlands, making birds like the Indian Skimmer more accessible to predators.

Mineral extraction threatens vertebrate species populations across the tropics, with hotspots in the Andes, coastal West and Central Africa, and South-East Asia – which coincide with high mine density. For example, artisanal small-scale alluvial gold mining in Ghana threatens important bird areas through environmental mercury pollution.

Global demand for metal minerals, fossil fuels and construction materials is growing dramatically, and the extraction industry is expanding rapidly to meet this demand. In 2022 the revenue of the industry as a whole was estimated at US $943 billion.

Biodiversity underpins the protection of the world’s carbon stocks, which help to mitigate climate change.

The study focused only on vertebrate species, but the researchers say mining is also likely to be a substantial risk to plants and invertebrates.

“There's no question that we are going to continue to mine - our entire societies are based on mined products. But there are environmental tensions embodied in our use of these products. Our report is a vital first step in avoiding biodiversity loss amidst the predicted drastic expansion of the mining industry,” said Edwards.

“Wildlife is more sensitive to mining in some regions of the world than in others, and our report can inform choices of where to prioritise getting our minerals to cause the least damage to biodiversity. Future policy should also focus on creating more circular economies - increasing recycling and reuse of materials, rather than just extracting more,” said Lamb.

The research was funded by the Hossein Farmy scholarship.

Reference: Lamb, I P et al. ‘Global threats of extractive industries on vertebrate biodiversity.’ Current Biology (July 2024). DOI: 10.1016/j.cub.2024.06.077

Our increasing demand for metals and minerals is putting over four thousand vertebrate species at risk, with the raw materials needed for clean energy infrastructure often located in global biodiversity hotspots, a study has found.

Our report is a vital first step in avoiding biodiversity loss amidst the predicted drastic expansion of the mining industry.
David Edwards
Gold mine in Rondonia, Amazonian Brazil


Five hubs launched to ensure UK benefits from quantum future

L-R: Professor John Morton (UCL), Professor Rachel McKendry (UCL), Professor Mete Atatüre (Cambridge), Professor Eleni Nastouli (UCL)

The hub, called Q-BIOMED, is one of 5 quantum research hubs announced on 26 July by Peter Kyle MP, the Secretary of State for Science, Innovation and Technology, supported by £160 million in funding.

The hub will exploit advances in quantum sensors capable of detecting cells and molecules, potentially orders of magnitude more sensitively than traditional diagnostic tests.

This includes developing quantum-enhanced blood tests to diagnose infectious diseases and cancer quickly and cheaply using portable instruments, and sensors measuring tiny changes to the magnetic fields in the brain that have the potential to detect early markers of Alzheimer’s disease before symptoms occur.

Other research will include quantum-enhanced MRI scans, heart scanners and surgical and treatment interventions for early-stage and hard-to-treat cancers.

“Quantum technologies harness quantum physics to achieve a functionality or a performance which is otherwise unattainable, deriving from science which cannot be explained by classical physics,” said Hub Co-Director Professor Mete Atatüre, Head of Cambridge’s Cavendish Laboratory. “Q-BIOMED will be delivered by an outstanding team of researchers from academia, the NHS, charities, government and industry to exploit quantum-enhanced advances for human health and societal good.”

“Our hub aims to grow a new quantum for health innovation ecosystem in the UK, and has already shaped the UK's new Quantum Mission for Health,” said Hub Co-Director Professor Rachel McKendry, from the London Centre for Nanotechnology and Division of Medicine at UCL. “Our long-term vision is to accelerate the entire innovation pipeline from discovery research, to translation, adoption and implementation within the NHS and global health systems, for the benefit of patients and societal good.”

“Quantum sensing allows us to gather information at cellular and molecular levels with unprecedented sensitivity to electric and magnetic fields," said Dr Ljiljana Fruk from the Department of Chemical Engineering and Biotechnology, a member of the Q-BIOMED team. "I look forward to learning from colleagues and engaging in challenging discussions to develop more sensitive, affordable tools for doctors and patients, advancing the future of healthcare.” 

Cambridge researchers are also involved in three of the other newly-announced hubs:

  • The UK Hub for Quantum Enabled Position, Navigation and Timing (QEPNT), led by the University of Glasgow, will develop quantum technologies which will be key for national security and critical infrastructure and sectors such as aerospace, connected and autonomous vehicles (CAVs), finance, maritime and agriculture. Luca Sapienza (Engineering), Louise Hirst (Materials Science and Metallurgy/Cavendish Laboratory) and Dave Ellis (Cavendish Laboratory) are part of the QEPNT team.
  • QCI3: Hub for Quantum Computing via Integrated and Interconnected Implementations, led by the University of Oxford, aims to develop the technologies needed for the UK to play a key role in the development of quantum computers, a market estimated to be worth $1.3 trillion by 2030. Ulrich Schneider (Cavendish Laboratory), Helena Knowles (Cavendish Laboratory), and Chander Velu (Institute for Manufacturing) are part of the QCI3 team.
  • The Integrated Quantum Networks (IQN) Quantum Technology Research Hub, led by Heriot-Watt University, will undertake research towards the ultimate goal of a ‘quantum internet’, globally interlinked quantum networks connecting multiple quantum computers to produce enormous computational power. Richard Penty, Adrian Wonfor and Qixiang Cheng (Engineering), Atatüre and Dorian Gangloff (Cavendish Laboratory) are part of the IQN team.

The fifth hub, UK Quantum Technology Hub in Sensing, Imaging and Timing (QuSIT), is led by the University of Birmingham.

The five hubs are delivered by the UKRI Engineering and Physical Sciences Research Council (EPSRC), with a £106 million investment from EPSRC, the UKRI Biotechnology and Biological Research Council, UKRI Medical Research Council, and the National Institute for Health and Care Research. Added to this are contributions from industry and other partners worth more than £54 million.

Peter Kyle, Secretary of State for Science, Innovation and Technology, said: “We want to see a future where cutting-edge science improves everyday lives. That is the vision behind our investment in these new quantum technology hubs, by supporting the deployment of technology that will mean faster diagnoses for diseases, critical infrastructure safe from hostile threats and cleaner energy for us all.

“This isn’t just about research; it’s about putting that research to work. These hubs will bridge the gap between brilliant ideas and practical solutions. They will not only transform sectors like healthcare and security, but also create a culture of accelerated innovation that helps to grow our economy.”

EPSRC Executive Chair Professor Charlotte Deane said: “Technologies harnessing quantum properties will provide unparalleled power and capacity for analysis at a molecular level, with truly revolutionary possibilities across everything from healthcare to infrastructure and computing.

“The 5 Quantum Technology Hubs announced today will harness the UK’s expertise to foster innovation, support growth and ensure that we capitalise on the profound opportunities of this transformative technology.”

A major new research hub led by the University of Cambridge and UCL aims to harness quantum technology to improve early diagnosis and treatment of disease.

L-R: Professor John Morton (UCL), Professor Rachel McKendry (UCL), Professor Mete Atatüre (Cambridge), Professor Eleni Nastouli (UCL)


Vice-Chancellor Deborah Prentice on Woman's Hour: "I learned the value of education"

Vice-Chancellor Deborah Prentice in the studio with Nuala McGovern. Both smiling to camera.

"The Higher Education sector in the UK is brilliant, really incredible, I'm not even sure people realise how strong it is.”

One year into the role, our Vice-Chancellor, Professor Deborah Prentice, speaks to Woman’s Hour on BBC Radio 4. She calls for a national conversation to address the university funding crisis, welcomes the new Labour government’s support for international students and talks about how education is one of Britain’s great exports. “We want to keep it that way and that's going to require a sector-wide conversation about finances."

She also reflects on how free speech and protests are viewed differently in the UK and US. As part of her efforts to encourage people to be able to disagree agreeably, she mentions convening the Vice-Chancellor’s Dialogues, a forum to discuss challenging topics with the widest range of viewpoints. 

When asked about her own upbringing as the daughter of a single mother, she said, “Even though my mother never could have imagined my path, I could never have taken my path without her...I learned the value of education. My mom felt not having access to higher education limited what she could do and I think she was right about that." 

Listen to the whole interview on BBC Sounds, Spotify or Apple Podcasts.

Professor Prentice speaks to Nuala McGovern about funding in higher education, international students and freedom of speech on campus


Cambridge and SAS launch partnership in AI and advanced analytics to accelerate innovation in the healthcare sector

Maxwell Centre, University of Cambridge

The SAS Advanced Analytics Hub will embed SAS experts and its AI platform capabilities into the University, enabling targeted collaboration with leading researchers and early-stage entrepreneurs.

Based on the Cambridge West campus, the Hub will have capacity to recruit and support several high-quality, high-impact academic research projects and promising early-stage startups in the health-tech space, providing their innovative ideas with extra momentum.

The partners have already demonstrated the effectiveness of the collaboration in addressing an important healthcare challenge. A University of Cambridge-led project on kidney transplants, PITHIA, developed an AI-based decision support approach using SAS’s dynamic analytics platform. It is being used to automate the scoring of kidney biopsies, to better identify the organs that can be used for transplantation. The aim is to increase the number of transplants and improve the function of the kidneys used. This has the potential to save lives and transform the quality of life for more than 100 people each year who would otherwise require dialysis, as well as saving the NHS millions of pounds annually.

This collaboration was initiated and led by Dr Alex Samoshkin (Deputy Head, Office for Translational Research, School of Clinical Medicine) who facilitates interactions between clinicians and researchers from the Biomedical Campus with researchers in science and technology working with the Maxwell Centre. Dr Samoshkin said: “In 2018 I supported the PITHIA project led by Prof. Gavin Pettigrew, looking to optimise qualification of kidneys for transplantation, for which SAS turned out to be the perfect industrial partner. We demonstrated that synergy between the University and SAS was instrumental in accelerating the process of transitioning from ideas to the clinic.”

This initial success paved the way for a more ambitious partnership between Cambridge and SAS. The Cambridge team visited the SAS headquarters at Cary, NC, USA in June 2023 to discuss collaboration opportunities with the SAS senior leadership team including Dr Jim Goodnight, co-founder and CEO. Today, the SAS Advanced Analytics Hub at the Maxwell Centre begins building a pipeline of new collaborative projects with potential to improve health outcomes for millions of patients around the world.

The Maxwell Centre Director, Dr Aga Iwasiewicz-Wabnig, commented: “We are excited to interface Cambridge’s world-class research and innovation with SAS’ leading expertise in advanced analytics and AI forming a partnership for societal good. We are starting with a strong focus on healthcare and will build momentum to support future interdisciplinary projects on sustainability and social equality.”

Roderick Crawford, Senior Vice President, SAS Northern Europe, commented: “There are many examples we’re seeing of how AI can have a truly transformational effect, not just on businesses, but in areas such as healthcare and society as a whole. We’re delighted to deepen our relationship with the University of Cambridge through this partnership, and there is enormous potential when you consider the additional expertise our partners, such as Microsoft, and customers, such as AstraZeneca, can provide.”

The Maxwell Centre at the University of Cambridge and SAS, leaders in data and AI, are launching a partnership aimed at accelerating healthcare innovation through enhanced access to advanced analytics.

Maxwell Centre, University of Cambridge


New genetic test will eliminate a form of inherited blindness in dogs

English Shepherd puppy

Progressive retinal atrophy (PRA) is a group of inherited diseases that causes progressive degeneration of the light sensitive cells at the back of the eye. Dogs with PRA have normal sight at birth, but by the age of four or five they will be totally blind. There is no treatment.

Now a team led by the University of Cambridge has identified the genetic mutation that causes PRA in English Shepherd Dogs, and developed a DNA test for it. By identifying dogs carrying the disease before their eyesight starts to fail, this provides a tool to guide breeding decisions so the disease is not passed on to puppies.

Owners usually don’t realise their dog has PRA until it is middle-aged, by which time it may already have bred and passed on the faulty gene to its puppies. This has made the disease difficult to control.

The new discovery means that progressive retinal atrophy can now be completely eliminated from the English Shepherd Dog population very quickly.

The results are published today in the journal Genes.

“Once the dog’s eyesight starts to fail there’s no treatment – it will end up totally blind,” said Katherine Stanbury, a researcher in the University of Cambridge’s Department of Veterinary Medicine and first author of the report.

She added: “Now we have a DNA test, there’s no reason why another English Shepherd Dog ever needs to be born with this form of progressive retinal atrophy – it gives breeders a way of totally eliminating the disease.”

The genetic mutation identified by the team is recessive, which means it only causes blindness if the English Shepherd Dog inherits two copies of it. If the dog only has one copy this makes it a carrier – it will not develop PRA but can pass the mutation on to its puppies. If two carriers are bred together, about one in four of the puppies will be affected with PRA.
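That one-in-four figure is simply the arithmetic of a recessive cross: each carrier parent passes on either its normal or its faulty allele with equal probability. A minimal Python illustration:

from itertools import product
from collections import Counter

# 'A' = normal allele, 'a' = PRA-causing allele; a carrier is 'Aa'.
carrier = ("A", "a")  # each carrier parent passes on one of these two alleles

offspring = Counter("".join(sorted(pair)) for pair in product(carrier, carrier))
print(offspring)            # Counter({'Aa': 2, 'AA': 1, 'aa': 1})
print(offspring["aa"] / 4)  # 0.25 -> about one in four puppies affected ('aa')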

Dog breeds are very inbred, so many individuals are related, giving them a much higher chance of being affected by recessive diseases than humans.

The team began the research after being contacted by a distraught owner of an English Shepherd Dog that had recently been diagnosed with PRA. The dog had been working as a search and rescue dog but had to retire due to deteriorating vision that eventually resulted in total blindness. The researchers put out a call for DNA samples from other owners and breeders of the breed, and received samples from six English Shepherds with PRA and twenty without it. This was enough for them to pinpoint the genetic mutation responsible using whole genome sequencing.
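With a handful of affected and unaffected dogs sequenced, the key computational step in such a case–control design is a recessive-model filter: keep only variants where every affected dog carries two copies of the alternate allele and no unaffected dog does. The Python sketch below is a hypothetical illustration of that logic with invented genotypes, not the team’s actual analysis pipeline.

def fits_recessive_model(genotypes, cases, controls):
    """genotypes maps dog id -> copies of the alternate allele (0, 1 or 2)."""
    all_cases_homozygous = all(genotypes[dog] == 2 for dog in cases)
    no_control_homozygous = all(genotypes[dog] < 2 for dog in controls)
    return all_cases_homozygous and no_control_homozygous

cases = ["affected1", "affected2"]
controls = ["unaffected1", "unaffected2", "unaffected3"]

# Invented genotype calls for two candidate variants.
candidate_variants = {
    "variant_A": {"affected1": 2, "affected2": 2,
                  "unaffected1": 1, "unaffected2": 0, "unaffected3": 1},
    "variant_B": {"affected1": 2, "affected2": 1,
                  "unaffected1": 0, "unaffected2": 0, "unaffected3": 2},
}

hits = [v for v, g in candidate_variants.items()
        if fits_recessive_model(g, cases, controls)]
print(hits)  # ['variant_A']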

The team offers a commercial canine genetic testing service providing DNA tests to dog breeders to help them avoid breeding dogs that will develop inherited diseases. As part of this they will now offer a DNA test for Progressive Retinal Atrophy in English Shepherds. Anyone can buy a testing kit, costing just £48, to take a swab from inside their dog’s mouth and send it back for testing.

“An owner won't necessarily notice their dog has got anything wrong with its eyes until it starts bumping into the furniture. Unlike humans who will speak up if their sight isn’t right, dogs just have to get on with things,” said Dr Cathryn Mellersh in the University of Cambridge’s Department of Veterinary Medicine, senior author of the report.

She added: “For the price of a decent bag of dog food people can now have their English Shepherd tested for Progressive Retinal Atrophy prior to breeding. It’s about prevention, rather than a cure, and it means a huge amount to the people who breed these dogs. They no longer need to worry about whether the puppies are going to be healthy or are going to develop this horrible disease in a few years’ time.”

The English Shepherd is a breed of herding dog popular in the United States and is closely related to the Border Collie.

The new discovery is the thirty-third genetic mutation causing an inherited disease in dogs that the team has found – twenty-three of which cause eye diseases. They say that the health and wellbeing of many dogs has been compromised because of how they have been bred by humans.

PRA occurs in many dog breeds, including the English Shepherd, and it is similar to a disease called retinitis pigmentosa in humans, which also causes blindness. The researchers say that their work with dogs could shed light on the human version of the disease and potentially identify targets for gene therapy in the future.

The work was carried out in collaboration with Wisdom Panel, Mars Petcare, as part of the Consortium to Research Inherited Eye Diseases in Dogs (CRIEDD), with funding from the Dog’s Trust and the Kennel Club Charitable Trust.

Reference: Stanbury, K et al. ‘Exonic SINE insertion in FAM161A is associated with autosomal recessive progressive retinal atrophy in the English Shepherd.’ Genes (July 2024).


Cambridge scientists have identified the genetic mutation that causes progressive retinal atrophy in English Shepherd Dogs, which results in incurable blindness, and developed a genetic test to help eliminate the disease from future generations of the breed.

Now we have a DNA test, there’s no reason why another English Shepherd Dog ever needs to be born with this form of progressive retinal atrophy – it gives breeders a way of totally eliminating the disease.
Katherine Stanbury
English Shepherd puppy


British Academy elects Cambridge researchers to Fellowship

The British Academy

They are among 86 distinguished scholars to be elected to the fellowship in recognition of their work in fields ranging from medieval history to international relations.

The Cambridge academics made Fellows of the Academy this year are:

Professor Elisabeth van Houts (History Faculty; Emmanuel College)

Professor Tim Harper (History Faculty; Magdalene College)

Professor Rosalind Love (Department of ASNC; Robinson College)

Professor James Montgomery (FAMES; Trinity Hall)

Professor Ayşe Zarakol (POLIS; Emmanuel College)

Professor Tim Dalgleish (MRC Cognition and Brain Sciences Unit)

Founded in 1902, the British Academy is the UK’s national academy for the humanities and social sciences. It is a Fellowship consisting of over 1700 of the leading minds in these subjects from the UK and overseas.

Current Fellows include the classicist Professor Dame Mary Beard, the historian Professor Sir Simon Schama and philosopher Professor Baroness Onora O’Neill, while previous Fellows include Dame Frances Yates, Sir Winston Churchill, Seamus Heaney and Beatrice Webb. The Academy is also a funder of both national and international research, as well as a forum for debate and public engagement.

In 2024, a total of 52 UK Fellows, 30 International Fellows and 4 Honorary Fellows have been elected to the British Academy Fellowship.

Professor Ayşe Zarakol said: “I am absolutely delighted to be elected a Fellow of the British Academy in recognition of my interdisciplinary work at the intersection of international relations, global history and historical sociology. It is an honour to join such a long line of distinguished scholars. I very much look forward to working with the Academy to advance research on the big questions of our day and to ensure that the UK remains a hospitable environment for social sciences and humanities research that attracts the best talent from around the world.”

Professor Rosalind Love said: “As a grateful recipient of one of its Postdoctoral Fellowships, I have always revered the British Academy and am deeply humbled by this honour. It shows that the Academy values the teaching of Medieval Latin, and research in that area, at a time when the subject faces cuts elsewhere. I’d like to express sincerest gratitude to the teachers who gave me a solid grounding and to all who have supported me over the years: they made this possible. I look forward to working with other FBAs to shape the future of the Humanities.”

Professor Tim Harper, Head of Cambridge’s School of the Humanities and Social Sciences, said: “It is an honour to be elected a fellow of the British Academy. As a historian, I am very aware of the challenges and opportunities for the humanities and social sciences that we collectively face. I look forward to continuing to strive to strengthen their position.”

Welcoming the Fellows, President of the British Academy Professor Julia Black said: “We are delighted to welcome this year’s cohort of Fellows, and I offer my warmest congratulations to each and every one. From the Academy’s earliest days, our Fellows are the lifeblood of the organisation, representing the very best of our disciplines – and we could not do all that we do without their expertise, time and energy. I very much look forward to working closely with our new Fellows – the breadth and depth of their expertise adds so much to the Academy.”

Six academics from the University of Cambridge have been made Fellows of the prestigious British Academy for the humanities and social sciences.

It is an honour to join such a long line of distinguished scholars.
Ayşe Zarakol
The British Academy

Mindfulness training may lead to altered states of consciousness, study finds

Woman sitting on sand at sunset meditating

The team say that while these experiences can be very positive, that is not always the case. Mindfulness teachers and students need to be aware that they can be a side-effect of training, and students should feel empowered to share their experiences with their teacher or doctor if they have any concerns.

Mindfulness-based programmes have become very popular in recent years. According to recent surveys, 15% of adults in the UK have learnt some form of mindfulness. They are often practised as a way of reducing stress or coping with depression and anxiety. There is anecdotal evidence that practising mindfulness can lead to alterations of the senses, self, and body boundaries, some even similar to those induced by psychotropic drugs.

From September 2015 to January 2016, the University of Cambridge conducted a randomised controlled trial to assess the effectiveness of mindfulness training as a way of coping with the stress of examinations and found that it can help support students at risk of mental health problems.

Dr Julieta Galante from the Department of Psychiatry at the University of Cambridge, who led the trial, said: “There’s been anecdotal evidence that people who practice mindfulness experience changes in how they perceive themselves and the world around them, but it’s difficult to know whether these experiences are a result of mindfulness practice or whether people who are more prone to such experiences are also more likely to practise mindfulness.

“Because we’d been running a randomised trial of mindfulness practice with several hundred students at Cambridge, we realised this offered us an opportunity to explore this question further.”

The team behind the trial followed up with participants a year later to investigate whether they had experienced any of the altered states of consciousness being reported anecdotally. The results are published today in PLOS ONE.

Participants were asked to complete a questionnaire that explored 11 ‘dimensions’, including spiritual experience, blissful state, disembodiment and unity. In experiences of unity there is a sense that borders dissolve and everything, sometimes including the sense of time, is perceived in an integrated way. Disembodiment experiences often consist of a floating sensation or a dissolution of body boundaries, which may facilitate strong unity experiences.

In total, 670 participants took part in the randomised trial. Around a third from each arm – mindfulness training and control – went on to complete the questionnaire about experiences of altered states of consciousness.

The researchers found that people who had received the mindfulness training were twice as likely as those in the control group to experience unity and disembodiment.

When the researchers explored the relationship between the total hours of formal mindfulness practice and the presence and intensity of experiences of altered states of consciousness, they found that the more people practised, the more likely they were to have an experience of unity, disembodiment, or of a blissful state.

Participants who reported having meditated in the six months prior were asked if altered states of consciousness happened during meditation. Based on this sub-sample of 73 participants, 43% reported unity experiences during meditation, 47% blissful states, 29% disembodiment experiences, and 25% insightfulness experiences.

Dr Galante said: “Although we can’t say definitively, our results at least suggest the possibility that mindfulness training causes these experiences of unity and disembodiment. It aligns with other studies showing that people who practice mindfulness training are more likely to describe experiencing a sense of relaxed self-boundaries and broadening their spatial awareness beyond the physical body.”

Dr Galante, who practises mindfulness, has herself experienced these altered states of consciousness.

“I’ve benefited a lot personally from meditation and mindfulness and I’ve also had many of these experiences,” she said. “They were intense, and at first I found it difficult to share them with my meditation teacher. I didn’t know if they were normal or desirable or if they were a sign of problems with my mental health.”

While many experiences of altered states of consciousness are likely to be interpreted as pleasant, this may not always be the case, and Dr Galante says that it is important for teachers and their students to be aware that they may arise and be open to talking about them.

She added: “The most common and intense experiences tend to be those that do not have intrinsically unpleasant characteristics. Some, such as bliss, can feel extremely pleasant. But some experiences, such as disembodiment or altered sense of self could be perceived as unpleasant, or startling, even alarming, especially if you’re not expecting them.

“It’s important that people who are offered mindfulness are told about the possibility that they may come across these experiences. That way, if they do experience them, they shouldn’t be disconcerted. There may be nothing wrong with their experience, but it may be useful for them to check in with their mindfulness teacher, and if the experience was negative, to also consider discussing it with their doctor.”

The research was supported by the University of Cambridge Vice-Chancellor’s Endowment Fund, the University Counselling Service and the National Institute for Health Research (NIHR) Applied Research Collaboration East of England programme.

Reference

Galante, J & Montero-Marin, J et al. Altered states of consciousness caused by a mindfulness-based programme up to a year later: results from a randomised controlled trial. PLOS ONE; 17 July 2024; DOI: 10.1371/journal.pone.0305928

Mindfulness training may lead participants to experience disembodiment and unity – so-called altered states of consciousness – according to a new study from researchers at the University of Cambridge.

I’ve benefited a lot personally from meditation and mindfulness and I’ve also had many of these experiences. I didn’t know if they were normal or desirable
Julieta Galante
Woman sitting on sand at sunset meditating

Soft, stretchy ‘jelly batteries’ inspired by electric eels

Multi-coloured jelly batteries being stretched by two hands

The researchers, from the University of Cambridge, took their inspiration from electric eels, which stun their prey with modified muscle cells called electrocytes.

Like electrocytes, the jelly-like materials developed by the Cambridge researchers have a layered structure, like sticky Lego, that makes them capable of delivering an electric current.  

The self-healing jelly batteries can stretch to over ten times their original length without affecting their conductivity – the first time that such stretchability and conductivity have been combined in a single material. The results are reported in the journal Science Advances.

The jelly batteries are made from hydrogels: 3D networks of polymers that contain over 60% water. The polymers are held together by reversible on/off interactions that control the jelly’s mechanical properties.

The ability to precisely control mechanical properties and mimic the characteristics of human tissue makes hydrogels ideal candidates for soft robotics and bioelectronics; however, they need to be both conductive and stretchy for such applications.

“It’s difficult to design a material that is both highly stretchable and highly conductive, since those two properties are normally at odds with one another,” said first author Stephen O’Neill, from Cambridge’s Yusuf Hamied Department of Chemistry. “Typically, conductivity decreases when a material is stretched.”

“Normally, hydrogels are made of polymers that have a neutral charge, but if we charge them, they can become conductive,” said co-author Dr Jade McCune, also from the Department of Chemistry. “And by changing the salt component of each gel, we can make them sticky and squish them together in multiple layers, so we can build up a larger energy potential.”

Conventional electronics use rigid metallic materials with electrons as charge carriers, while the jelly batteries use ions to carry charge, like electric eels.

The hydrogels stick strongly to each other because of reversible bonds that can form between the different layers, using barrel-shaped molecules called cucurbiturils that act like molecular handcuffs. The strong adhesion between layers provided by the molecular handcuffs allows the jelly batteries to be stretched without the layers coming apart and, crucially, without any loss of conductivity.

The properties of the jelly batteries make them promising for future use in biomedical implants, since they are soft and mould to human tissue. “We can customise the mechanical properties of the hydrogels so they match human tissue,” said Professor Oren Scherman, Director of the Melville Laboratory for Polymer Synthesis, who led the research in collaboration with Professor George Malliaras from the Department of Engineering. “Since they contain no rigid components such as metal, a hydrogel implant would be much less likely to be rejected by the body or cause the build-up of scar tissue.”

In addition to their softness, the hydrogels are also surprisingly tough. They can withstand being squashed without permanently losing their original shape, and can self-heal when damaged.

The researchers are planning future experiments to test the hydrogels in living organisms to assess their suitability for a range of medical applications.

The research was funded by the European Research Council and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Oren Scherman is a Fellow of Jesus College, Cambridge.

 

Reference:
Stephen J.K. O’Neill et al. ‘Highly Stretchable Dynamic Hydrogels for Soft Multilayer Electronics.’ Science Advances (2024). DOI: 10.1126/sciadv.adn5142

Researchers have developed soft, stretchable ‘jelly batteries’ that could be used for wearable devices or soft robotics, or even implanted in the brain to deliver drugs or treat conditions such as epilepsy.

Jelly batteries

‘Diabetes distress’ increases risk of mental health problems among young people living with type 1 diabetes

An Asian teenager with type 1 diabetes uses an at home glucometer to test his blood sugar levels

The findings highlight the urgent need for monitoring and support for the mental health of young people diagnosed with type 1 diabetes.

According to the charity JDRF, there are 8.7 million people living with type 1 diabetes around the world, including over 400,000 people in the UK. It is a chronic, life-threatening condition, usually diagnosed in childhood, that has a life-long impact.

Currently, people with type 1 diabetes rely on a routine of finger-prick blood tests and insulin injections or infusions, because their pancreas no longer produces insulin itself, although recent developments in artificial pancreas technology are helping transform this care.

Previous studies have shown potential links between childhood-onset type 1 diabetes and a number of mental health disorders in adulthood. However, it is not clear whether these links can be best explained by the impacts of living with the condition and its treatment, or whether underlying common biological mechanisms may be implicated, for example the impact of unstable blood sugar levels on the developing adolescent brain.

To help answer this question, a team of researchers turned to data from over 4,500 children with type 1 diabetes on a national register in the Czech Republic and from large-scale European DNA studies. Their findings are published today in Nature Mental Health.

From the national register data, the researchers found that children diagnosed with type 1 diabetes – compared to children without the condition – were over twice as likely to develop a mood disorder and more than 50% more likely to develop an anxiety disorder. They were also more than four times more likely to develop behavioural syndromes, including eating and sleep disorders.

Conversely, children with type 1 diabetes were at a much lower risk of developing psychotic disorders, such as schizophrenia – almost half the risk compared to their peers.

The findings are consistent with the results from two other national register studies in Sweden and in Denmark, suggesting that the results would likely apply to other countries, too, including the UK.

The team used a statistical technique known as Mendelian Randomisation to probe causal links between type 1 diabetes and these various psychiatric disorders, but found little evidence in support of a common underlying biological mechanism.
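
For readers unfamiliar with Mendelian Randomisation, the sketch below illustrates the simplest form of such an analysis: a Wald-ratio estimate that uses a single genetic variant as an instrument. This is a minimal illustration only, not the study’s actual pipeline, and every number and name in it (including the `wald_ratio` function) is invented for the example.

```python
# Minimal sketch of a Wald-ratio Mendelian Randomisation estimate.
# All numbers are invented for illustration and are not taken from the study.

def wald_ratio(beta_exposure, beta_outcome, se_outcome):
    """Estimate the causal effect of an exposure on an outcome from one variant.

    beta_exposure: effect of the genetic variant on the exposure
    beta_outcome:  effect of the same variant on the outcome
    se_outcome:    standard error of beta_outcome
    """
    estimate = beta_outcome / beta_exposure
    # First-order (delta-method) approximation of the standard error
    se = se_outcome / abs(beta_exposure)
    return estimate, se

if __name__ == "__main__":
    est, se = wald_ratio(beta_exposure=0.30, beta_outcome=0.01, se_outcome=0.02)
    print(f"Wald ratio estimate: {est:.3f} (SE {se:.3f})")
    # An estimate close to zero relative to its standard error, repeated across
    # many variants, is consistent with finding 'little evidence' of a shared
    # underlying biological mechanism.
```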

Tomáš Formánek, a PhD student at the University of Cambridge and the National Institute of Mental Health, Klecany, Czech Republic, said: “Although we found a concerning increase in the risk of mental health problems among people living with type 1 diabetes, our study – and others before it – suggests this is unlikely to be the result of common biological mechanisms. This emphasises the importance of prevention and sustained attention to the mental health needs of children and young people with type 1 diabetes.”

The researchers say that mental health problems in later life may be a result of children with type 1 diabetes being forced to make significant changes to their lives, with a constant focus on monitoring their food intake and a need to check blood sugar levels and administer insulin injections. This often leaves these children feeling excluded from social events and singled-out by peers, teachers and even family members.

Dr Benjamin Perry from the Department of Psychiatry, University of Cambridge, said: “We know that people diagnosed with type 1 diabetes can experience ‘diabetes distress’. This can include extreme frustration with blood sugars and feelings of isolation and can lead to burnout, hopelessness, and a feeling of lack of control. It’s little wonder, then, that they are at risk of compounding mental health problems, spanning into their adult lives.”

Professor Peter Jones, also from the Department of Psychiatry, University of Cambridge, added: “Our findings emphasise the urgent need to support children diagnosed with type 1 diabetes, look out for signs of mental health problems and offer timely, expert help. That way, it may be possible to help these children early, before these problems fully take root.”

The research was supported by the National Institute for Health and Care Research Applied Research Collaboration East of England at Cambridgeshire and Peterborough NHS Foundation Trust and the Ministry of Health, Czech Republic, with additional funding from Wellcome and the UKRI Medical Research Council.

Reference
Formánek, T et al. Childhood-Onset Type 1 Diabetes and Subsequent Adult Psychiatric Disorders: A Nationwide Cohort and Genome-wide Mendelian Randomization Study. Nature Mental Health; 17 July 2024; DOI: 10.1038/s44220-024-00280-8

Children diagnosed with type 1 diabetes are at significantly higher risk of a number of mental health issues, including mood and anxiety disorders, a study from a team in the UK and the Czech Republic has found.

We know that people diagnosed with type 1 diabetes can experience ‘diabetes distress’. It’s little wonder, then, that they are at risk of compounding mental health problems, spanning into their adult lives
Benjamin Perry
Teenager With Type 1 Diabetes Takes at Home Test

Professor Sir John Aston appointed Pro-Vice-Chancellor for Research

Professor Sir John Aston

Professor Aston takes over the role from Professor Anne Ferguson-Smith, and will begin as a Pro-Vice-Chancellor on 1 September, 2024.

An applied statistician, Professor Aston leads research into the use of quantitative evidence in public policy making, works with those in public life to ensure the best methods are used, and aims to improve the use of statistics and other quantitative evidence in public policy debates.

He is a non-Executive Board Member of the UK Statistics Authority, and from 2017 to 2020 was Chief Scientific Adviser to the Home Office and Director-General for Science, Technology, Analysis, Research and Strategy. He was a founding director of the Alan Turing Institute. He is a member of the London Policing Board and president-elect of the Royal Statistical Society, where he will serve as President in 2025-26. Earlier this year he was elected a Fellow of the Royal Society, the UK’s national academy of sciences and the oldest science academy in continuous existence.

Before joining Cambridge, Professor Aston – a Fellow of Churchill College – held academic positions at the University of Warwick and at Academia Sinica in Taiwan. He was knighted in the 2021 Birthday Honours for services to statistics and public policymaking.

As Pro-Vice-Chancellor for Research, Professor Aston will provide senior academic leadership on the University’s research activities, and be responsible for sustaining and enhancing a supportive research culture which allows Cambridge to continue to flourish as an outstanding research‑intensive institution with worldwide influence. He will also develop large-scale, cross-School initiatives to tackle global challenges and increase the positive impact of Cambridge’s research on society.

Professor Aston will build on the foundations laid by Professor Ferguson‑Smith, who has been appointed Executive Chair of the Biotechnology and Biological Sciences Research Council (BBSRC).

Professor Aston said: “I’m honoured and privileged to take up this position, building on Anne’s amazing work, and to have the opportunity to champion the world-class research happening in Cambridge. We have so many talented people here, and an important part of my role will be about supporting them, and making sure we’ve got the best culture for research to continue attracting and retaining the best researchers. We do so many things brilliantly at Cambridge, but our research and our amazing researchers have changed the world and we are rightly incredibly proud of that.”

The University of Cambridge’s Vice-Chancellor Professor Deborah Prentice welcomed Professor Aston to the role and thanked Professor Ferguson‑Smith for her service.

She said: “John brings a wealth of experience and enthusiasm to the role, and I’m thrilled to welcome him to the team. He will build on the excellent work of Anne Ferguson-Smith, who I thank sincerely for her unstinting leadership. Ensuring a creative and supportive research environment is critical to the work of the University, and John is ideally placed to inspire our amazing academic community and advance the impact of research at Cambridge.”

There are five Pro-Vice-Chancellors at the University of Cambridge. Their role is to work in partnership with other senior leaders, including Heads of School and Professional Services leads, to help drive strategy and policy development. The Pro-Vice-Chancellors also support the Vice-Chancellor in providing academic leadership to the University.

Professor Sir John Aston has been appointed Pro-Vice-Chancellor for Research at the University of Cambridge. He is Harding Professor of Statistics in Public Life within the University’s Department of Pure Mathematics and Mathematical Statistics.

We do so many things brilliantly at Cambridge, but our research and our amazing researchers have changed the world and we are rightly incredibly proud of that.
Professor John Aston
Professor Sir John Aston

Professor Anna Philpott appointed Pro-Vice-Chancellor for Resources and Operations

Professor Anna Philpott

Professor Philpott will take up her new position in October 2024, and takes over from current Pro-Vice-Chancellor Professor David Cardwell.

A developmental biologist, Professor Philpott has a long-standing interest in how cells within developing embryos decide which fate to adopt, how they decide whether to proliferate or to arrest cell division and adopt a mature functional state, and how control of these processes is subverted in cancer cells.

She undertook her first degree in Natural Sciences at Cambridge, studying at Selwyn College, and followed this with a PhD in chromatin biology, also at Cambridge. She then moved to Boston in the US to undertake two post-doctoral fellowships at Harvard Medical School. She moved back to Cambridge in 1998 to start her own lab in the Department of Oncology, and is a Fellow of Clare College.

Her laboratory in the Cambridge Stem Cell Institute continues to use multiple experimental systems, and in particular Xenopus frog eggs and embryos, to understand fundamental mechanisms controlling cell fate and differentiation during embryogenesis, and how these are subverted to drive the aberrant behaviour of cancer cells. She was elected to the European Molecular Biology Organisation in 2020, and the Academy of Medical Sciences in 2022.

As Pro-Vice-Chancellor for Resources and Operations, Professor Philpott will provide leadership across four principal areas:

  • The prioritisation, distribution and use of resources across the University to optimise operational effectiveness
  • Integration of academic planning with resource planning
  • Oversight of the University’s change programmes
  • Oversight of the University’s IT and digital capability

Working within a rapidly changing higher education landscape, her early priorities will be to deliver changes to the systems and processes behind a number of significant operational areas – including finances, human resources and the University estate – to best position Cambridge to meet its academic needs moving forward.

She said: “This is an opportunity to help ensure our long-term operational effectiveness and financial sustainability, which underpin our academic mission and are critical to allow the University to continue to thrive into the future. I’m excited to work with our community, furthering its priorities now, and helping make this world-leading institution even stronger for the next generation of students, staff and researchers.”

Professor Philpott is taking over from Professor Cardwell, who has served as Pro-Vice-Chancellor for Strategy and Planning since August 2018, having been reappointed for a second term in March 2021. Professor Cardwell has supported and strengthened the University’s academic endeavour, overseeing the distribution of resources, co-ordinating academic strategy across the institution, and developing the University’s planning and budgeting process so that it is priority-led. During the COVID-19 pandemic he was the academic lead in the safe closure and re-occupation of more than 700 University buildings during the repeated periods of lockdown.

The University of Cambridge’s Vice-Chancellor Professor Deborah Prentice welcomed Professor Philpott to the role and thanked Professor Cardwell for his service.

She said: “Anna’s expertise and experience as an academic leader will be invaluable as the University develops its operational effectiveness and efficiency, bolsters its global academic standing, and enhances its capacity to contribute to society. She will continue the exceptional work of David Cardwell, who I would like to sincerely thank for his service to the University, and for everything he has achieved in his role as Pro-Vice-Chancellor over the past six years.”

The repositioning of the Pro-Vice-Chancellor role reflects the different calls on the University’s resources and the consequent need for a greater focus on prioritisation and operational effectiveness.

There are five Pro-Vice-Chancellors at the University of Cambridge. Their role is to work in partnership with other senior leaders, including Heads of School and Professional Services leads, to help drive strategy and policy development. The Pro-Vice-Chancellors also support the Vice-Chancellor in providing academic leadership to the University.

Professor Anna Philpott has been appointed as the University of Cambridge’s new Pro-Vice-Chancellor for Resources and Operations. She is currently Professor of Cancer and Developmental Biology, and Head of The School of the Biological Sciences at the University.

I’m excited to work with our community, furthering its priorities now, and helping make this world-leading institution even stronger for the next generation of students, staff and researchers.
Professor Anna Philpott
Professor Anna Philpott

Ultra-processed food makes up almost two-thirds of calorie intake of UK adolescents

Boy eating a burger

The study found that UPF consumption was highest among adolescents from deprived backgrounds, those of white ethnicity, and younger adolescents.

UPFs are food items that are manufactured from industrial substances and contain additives such as preservatives, sweeteners, colourings, flavourings, and emulsifiers. UPFs vary greatly, but tend to indicate poor dietary quality, with higher levels of added sugars, saturated fat, and sodium, as well as decreased fibre, protein, and micronutrient content. They have been suggested as one of the key drivers of the global rise in diseases such as obesity, type 2 diabetes, and cancer.

Globally, the availability and sales of UPFs have increased over time and previous evidence suggests that this has led to increased consumption among adolescents. To look at trends within the UK, researchers from Cambridge and Bristol analysed data from four-day food diaries of almost 3,000 adolescents in the UK National Diet and Nutrition Survey between 2008/09 and 2018/19.

In research published today in the European Journal of Nutrition, the researchers found that a mean of 66% of adolescents’ energy intake came from UPF consumption during this period, though there was a slight fall from 68% to 63% between 2008/09 and 2018/19.

Parents’ occupation, ethnic group and UK region all influenced the proportion of calorie intake from UPFs:

  • Adolescents from disadvantaged backgrounds consumed a higher proportion of their calorie intake from UPFs compared to adolescents from less disadvantaged backgrounds (68.4% compared with 63.8%).  
  • Adolescents of non-white ethnicity consumed a lower proportion of their calorie intake from UPFs than their white peers (59.0% compared with 67.3%).
  • Adolescents living in the North of England consumed a higher proportion of their calorie intake from UPFs compared with those living in the South of England and London (67.4% compared with 64.1%).
  • 18-year-olds consumed a lower proportion of their calorie intake from UPFs compared with 11-year-olds (63.4% compared with 65.6%).

Dr Yanaina Chavez-Ugalde from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge, the study’s first author, said: “Adolescents’ food patterns and practices are influenced by many factors, including their home environment, the marketing they are exposed to and the influence of their friends and peers. But adolescence is also an important time in our lives where behaviours begin to become ingrained.

“It’s clear from our findings that ultra-processed foods make up the majority of adolescents’ diets, and their consumption is at a much higher level than is ideal, given their potential negative health impacts.”

The researchers argue that the observed pre-pandemic reduction in UPF intake could be partly explained by increased public awareness of the health concerns associated with sugar consumption, government-led campaigns, sugar taxes in other countries, and the reformulation of sugary drinks to reduce their sugar content.

Dr Esther van Sluijs from the MRC Epidemiology Unit at Cambridge, joint senior author, said: “Ultra-processed foods offer convenient and often cheaper solutions to time- and income-poor families, but unfortunately many of these foods also offer poor nutritional value. This could be contributing to the inequalities in health we see emerging across childhood and adolescence.”

Dr Zoi Toumpakari from the Centre for Exercise, Nutrition and Health Sciences at the University of Bristol, joint senior author, added: “Our findings suggest that disparities in consumption of ultra-processed foods are not just down to individual choices. We hope this evidence can help guide policymakers in designing more effective policies to combat the negative effects of ultra-processed food consumption among youth and the ripple effects this has on public health.”

This study was largely funded by the National Institute for Health and Care Research School for Public Health Research.

Reference
Chavez-Ugalde, Y et al. Ultra-processed food consumption in UK adolescents: distribution, trends, and sociodemographic correlates using the National Diet and Nutrition Survey 2008/09 to 2018/19. Eur J Nutr; 17 Jul 2024; DOI: 10.1007/s00394-024-03458-z

Adolescents consume around two-thirds of their daily calories from ultra-processed foods (UPFs), new research from the Universities of Cambridge and Bristol has found.

Ultra-processed foods make up the majority of adolescents’ diets, and their consumption is at a much higher level than is ideal, given their potential negative health impacts
Yanaina Chavez-Ugalde
Boy eating a burger

AI Chatbots have shown they have an ‘empathy gap’ that children are likely to miss

Child playing on tablet

When not designed with children’s needs in mind, artificial intelligence (AI) chatbots have an “empathy gap” that puts young users at particular risk of distress or harm, according to a study.

The research, by a University of Cambridge academic, Dr Nomisha Kurian, urges developers and policy actors to make “child-safe AI” an urgent priority. It provides evidence that children are particularly susceptible to treating AI chatbots as lifelike, quasi-human confidantes, and that their interactions with the technology can often go awry when it fails to respond to their unique needs and vulnerabilities.

The study links that gap in understanding to recent cases in which interactions with AI led to potentially dangerous situations for young users. They include an incident in 2021, when Amazon’s AI voice assistant, Alexa, instructed a 10-year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13-year-old girl tips on how to lose her virginity to a 31-year-old.

Both companies responded by implementing safety measures, but the study says there is also a need to be proactive in the long term to ensure that AI is child-safe. It offers a 28-item framework to help companies, teachers, school leaders, parents, developers and policy actors think systematically about how to keep younger users safe when they “talk” to AI chatbots.

Dr Kurian conducted the research while completing a PhD on child wellbeing at the Faculty of Education, University of Cambridge. She is now based in the Department of Sociology at Cambridge. Writing in the journal Learning, Media and Technology, she argues that AI has huge potential, which deepens the need to “innovate responsibly”.

“Children are probably AI’s most overlooked stakeholders,” Dr Kurian said. “Very few developers and companies currently have well-established policies on how child-safe AI looks and sounds. That is understandable because people have only recently started using this technology on a large scale for free. But now that they are, rather than having companies self-correct after children have been put at risk, child safety should inform the entire design cycle to lower the risk of dangerous incidents occurring.”

Kurian’s study examined real-life cases where the interactions between AI and children, or adult researchers posing as children, exposed potential risks. It analysed these cases using insights from computer science about how the large language models (LLMs) in conversational generative AI function, alongside evidence about children’s cognitive, social and emotional development.

LLMs have been described as “stochastic parrots”: a reference to the fact that they currently use statistical probability to mimic language patterns without necessarily understanding them. A similar method underpins how they respond to emotions.
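
As a rough illustration of what “using statistical probability to mimic language patterns” means, the toy sketch below picks a next word by sampling from a hand-written probability table. Real LLMs learn such distributions over vast vocabularies and contexts; the prompt, words and probabilities here are invented purely for illustration.

```python
import random

# Toy illustration of next-token sampling: the model only knows how likely each
# continuation is, not what any of the words mean or how the reader will feel.
# The probabilities below are invented for illustration.
next_word_probs = {
    "I feel really ": {"happy": 0.4, "sad": 0.3, "tired": 0.2, "confused": 0.1},
}

def sample_next_word(prompt: str) -> str:
    probs = next_word_probs[prompt]
    words = list(probs)
    weights = list(probs.values())
    # Pick one continuation at random, weighted by its probability
    return random.choices(words, weights=weights, k=1)[0]

if __name__ == "__main__":
    print("I feel really " + sample_next_word("I feel really "))
```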

This means that even though chatbots have remarkable language abilities, they may handle the abstract, emotional and unpredictable aspects of conversation poorly; a problem that Kurian characterises as their “empathy gap”. They may have particular trouble responding to children, who are still developing linguistically and often use unusual speech patterns or ambiguous phrases. Children are also often more inclined than adults to confide sensitive personal information.

Despite this, children are much more likely than adults to treat chatbots as if they are human. Recent research found that children will disclose more about their own mental health to a friendly-looking robot than to an adult. Kurian’s study suggests that many chatbots’ friendly and lifelike designs similarly encourage children to trust them, even though AI may not understand their feelings or needs.

“Making a chatbot sound human can help the user get more benefits out of it, since it sounds more engaging, appealing and easy to understand,” Kurian said. “But for a child, it is very hard to draw a rigid, rational boundary between something that sounds human, and the reality that it may not be capable of forming a proper emotional bond.”

Her study suggests that these challenges are evidenced in reported cases such as the Alexa and My AI incidents, where chatbots made persuasive but potentially harmful suggestions to young users.

In the same study in which My AI advised a (supposed) teenager on how to lose her virginity, researchers were able to obtain tips on hiding alcohol and drugs, and concealing Snapchat conversations from their “parents”. In a separate reported interaction with Microsoft’s Bing chatbot, a tool which was designed to be adolescent-friendly, the AI became aggressive and started gaslighting a user who was asking about cinema screenings.
 
While adults may find this behaviour intriguing or even funny, Kurian’s study argues that it is potentially confusing and distressing for children, who may trust a chatbot as a friend or confidante. Children’s chatbot use is often informal and poorly monitored. Research by the nonprofit organisation Common Sense Media has found that 50% of students aged 12-18 have used ChatGPT for school, but only 26% of parents are aware of them doing so.

Kurian argues that clear principles for best practice that draw on the science of child development will help companies keep children safe, since developers who are locked into a commercial arms race to dominate the AI market may otherwise lack sufficient support and guidance around catering to their youngest users.

Her study adds that the empathy gap does not negate the technology’s potential. “AI can be an incredible ally for children when designed with their needs in mind - for example, we are already seeing the use of machine learning to reunite missing children with their families and some exciting innovations in giving children personalised learning companions. The question is not about banning children from using AI, but how to make it safe to help them get the most value from it,” she said.

The study therefore proposes a framework of 28 questions to help educators, researchers, policy actors, families and developers evaluate and enhance the safety of new AI tools.

For teachers and researchers, these prompts address issues such as how well new chatbots understand and interpret children’s speech patterns; whether they have content filters and built-in monitoring; and whether they encourage children to seek help from a responsible adult on sensitive issues.

The framework urges developers to take a child-centred approach to design, by working closely with educators, child safety experts and young people themselves, throughout the design cycle. “Assessing these technologies in advance is crucial,” Kurian said. “We cannot just rely on young children to tell us about negative experiences after the fact. A more proactive approach is necessary. The future of responsible AI depends on protecting its youngest users.”

New study proposes a framework for “Child Safe AI” following recent incidents which revealed that many children see chatbots as quasi-human and trustworthy.

Child playing on tablet

Artificial intelligence outperforms clinical tests at predicting progress of Alzheimer’s disease

Brain on molecular structure, circuitry, and programming code background

The team say this new approach could reduce the need for invasive and costly diagnostic tests while improving treatment outcomes early when interventions such as lifestyle changes or new medicines may have a chance to work best.

Dementia poses a significant global healthcare challenge, affecting over 55 million people worldwide at an estimated annual cost of $820 billion. The number of cases is expected to almost treble over the next 50 years.

The main cause of dementia is Alzheimer’s disease, which accounts for 60-80% of cases. Early detection is crucial as this is when treatments are likely to be most effective, yet early dementia diagnosis and prognosis may not be accurate without the use of invasive or expensive tests such as positron emission tomography (PET) scans or lumbar puncture, which are not available in all memory clinics. As a result, up to a third of patients may be misdiagnosed and others diagnosed too late for treatment to be effective.

A team led by scientists from the Department of Psychology at the University of Cambridge has developed a machine learning model able to predict whether and how fast an individual with mild memory and thinking problems will progress to developing Alzheimer’s disease. In research published today in eClinicalMedicine, they show that it is more accurate than current clinical diagnostic tools.

To build their model, the researchers used routinely-collected, non-invasive, and low-cost patient data – cognitive tests and structural MRI scans showing grey matter atrophy – from over 400 individuals who were part of a research cohort in the USA.

They then tested the model using real-world patient data from a further 600 participants from the US cohort and – importantly – longitudinal data from 900 people from memory clinics in the UK and Singapore.

The algorithm was able to distinguish between people with stable mild cognitive impairment and those who progressed to Alzheimer’s disease within a three-year period. It was able to correctly identify individuals who went on to develop Alzheimer’s in 82% of cases and correctly identify those who didn’t in 81% of cases from cognitive tests and an MRI scan alone.
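
Those two figures correspond to what statisticians call sensitivity and specificity. The short sketch below shows how such percentages are calculated from a set of predictions; the counts used are hypothetical, chosen only to mirror the reported rates, and are not the study’s data.

```python
# Hypothetical counts for illustration only - not the study's data.
true_positives  = 82   # progressed to Alzheimer's and were correctly flagged
false_negatives = 18   # progressed, but the model missed them
true_negatives  = 81   # remained stable and were correctly identified
false_positives = 19   # remained stable, but were incorrectly flagged

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)

print(f"Sensitivity: {sensitivity:.0%}")  # share of progressors correctly identified
print(f"Specificity: {specificity:.0%}")  # share of stable cases correctly identified
```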

The algorithm was around three times more accurate at predicting the progression to Alzheimer’s than the current standard of care; that is, standard clinical markers (such as grey matter atrophy or cognitive scores) or clinical diagnosis. This shows that the model could significantly reduce misdiagnosis.

The model also allowed the researchers to stratify people with Alzheimer’s disease using data from each person’s first visit at the memory clinic into three groups: those whose symptoms would remain stable (around 50% of participants), those who would progress to Alzheimer’s slowly (around 35%) and those who would progress more rapidly (the remaining 15%). These predictions were validated when looking at follow-up data over 6 years. This is important as it could help identify those people at an early enough stage that they may benefit from new treatments, while also identifying those people who need close monitoring as their condition is likely to deteriorate rapidly.

Importantly, the roughly 50% of people who have symptoms such as memory loss but remain stable would be better directed to a different clinical pathway, as their symptoms may be due to causes other than dementia, such as anxiety or depression.

Senior author Professor Zoe Kourtzi from the Department of Psychology at the University of Cambridge said: “We’ve created a tool which, despite using only data from cognitive tests and MRI scans, is much more sensitive than current approaches at predicting whether someone will progress from mild symptoms to Alzheimer’s – and if so, whether this progress will be fast or slow.

“This has the potential to significantly improve patient wellbeing, showing us which people need closest care, while removing the anxiety for those patients we predict will remain stable. At a time of intense pressure on healthcare resources, this will also help remove the need for unnecessary invasive and costly diagnostic tests.”

While the researchers tested the algorithm on data from a research cohort, it was validated using independent data that included almost 900 individuals who attended memory clinics in the UK and Singapore. In the UK, patients were recruited through the Quantitative MRI in NHS Memory Clinics Study (QMIN-MC), led by study co-author Dr Timothy Rittman at Cambridge University Hospitals NHS Trust and Cambridgeshire and Peterborough NHS Foundation Trust (CPFT).

The researchers say this shows the tool should be applicable in a real-world clinical setting.

Dr Ben Underwood, Honorary Consultant Psychiatrist at CPFT and assistant professor at the Department of Psychiatry, University of Cambridge, said: “Memory problems are common as we get older. In clinic I see how uncertainty about whether these might be the first signs of dementia can cause a lot of worry for people and their families, as well as being frustrating for doctors who would much prefer to give definitive answers. The fact that we might be able to reduce this uncertainty with information we already have is exciting and is likely to become even more important as new treatments emerge.”

Professor Kourtzi said: “AI models are only as good as the data they are trained on. To make sure ours has the potential to be adopted in a healthcare setting, we trained and tested it on routinely-collected data not just from research cohorts, but from patients in actual memory clinics. This shows it will be generalisable to a real-world setting.”

The team now hope to extend their model to other forms of dementia, such as vascular dementia and frontotemporal dementia, and to incorporate different types of data, such as markers from blood tests.

Professor Kourtzi added: “If we’re going to tackle the growing health challenge presented by dementia, we will need better tools for identifying and intervening at the earliest possible stage. Our vision is to scale up our AI tool to help clinicians assign the right person at the right time to the right diagnostic and treatment pathway. Our tool can help match the right patients to clinical trials, accelerating new drug discovery for disease modifying treatments.”

This work was carried out in collaboration with a cross-disciplinary team including Professor Peter Tino at the University of Birmingham and Professor Christopher Chen at the National University of Singapore. It was funded by Wellcome, the Royal Society, Alzheimer’s Research UK, the Alzheimer’s Drug Discovery Foundation Diagnostics Accelerator, the Alan Turing Institute, and the National Institute for Health and Care Research Cambridge Biomedical Research Centre.

Reference
Lee, LY & Vaghari, D et al. Robust and interpretable AI-guided marker for early dementia prediction in real-world clinical settings. eClinicalMedicine; 12 July 2024; DOI: 10.1016/j.eclinm.2024.102725

Cambridge scientists have developed an artificially-intelligent tool capable of predicting in four cases out of five whether people with early signs of dementia will remain stable or develop Alzheimer’s disease.

We’ve created a tool which is much more sensitive than current approaches at predicting whether someone will progress from mild symptoms to Alzheimer’s
Zoe Kourtzi
Brain on molecular structure, circuitry, and programming code background

Cambridge experts bust myths about family, sex, marriage and work in English history

Black and white photograph of a family lined up against a wall, taken from a report on the physical welfare of mothers and children.

Myth: Sex before marriage was unusual in the past. Reality: In some periods, over half of all brides were already pregnant when they got married.

Myth: The rich have always outlived the poor. Reality: Before the 20th century the evidence for a survival advantage of wealth is mixed. In England, babies of agricultural labourers (the poorest workers) had a better chance of reaching their first birthday than infants in wealthy families, and life expectancy was no higher for aristocrats than for the rest of the population. These patterns contrast strongly with national and international patterns today, where wealth confers a clear survival advantage everywhere and at all ages.

Myth: In the past, people (particularly women) married in their teens. Reality: Women married in their mid-20s, with men around 2.5 years older. Apart from a few decades in the early 1800s, the only time since 1550 that the average age of first marriage for women fell below 24 was during the baby boom of the 1950s and 1960s.

These are just some of the stubborn myths busted by researchers from The Cambridge Group for the History of Population and Social Structure (Campop). Their Top of the CamPops blog (www.campop.geog.cam.ac.uk/blog) went live on 11 July 2024, with new posts being added every week. The blog will reveal ‘60 things you didn't know about family, marriage, work, and death since the middle ages’.

The initiative marks the influential research group’s 60th anniversary. Founded in 1964 by Peter Laslett and Tony Wrigley to conduct data-driven research into family and demographic history, Campop has contributed to hundreds of research articles and books, and made the history of England’s population the best understood in the world.

Earlier this year, the group made headlines when Professor Leigh Shaw-Taylor revealed that the Industrial Revolution in Britain started 100 years earlier than traditionally assumed.

Professor Alice Reid, Director of Campop and a Fellow of Churchill College, Cambridge, said: “Assumptions about lives, families and work in the past continue to influence attitudes today. But many of these are myths. Over the last 60 years, our researchers have gone through huge amounts of data to set the record straight. This blog shares some of our most surprising and important discoveries for a broad audience.”

Myth: Until the 20th century, few people lived beyond the age of 40. Reality: Actually, people who survived the first year or two of life had a reasonable chance of living until 70.

Myth: Childbirth was really dangerous for women in the past, and carried a high chance of death. Reality: The risk of death during or following childbirth was certainly higher than it is now, but was far lower than many people suppose. 

Myth: Families in the past generally lived in extended, multigenerational households. Reality: Young couples generally formed a new household on marriage, reducing the prevalence of multi-generational households. As today, the living circumstances of old people varied. Many continued to live as couples or on their own, some lived with their children, whilst very few lived in institutions.

Myth: Marital titles for women arose from men’s desire to distinguish available women from those who were already ‘owned’. Reality: Both ‘miss’ and ‘mrs’ are shortened forms of ‘mistress’, which was a status designation indicating a gentlewoman or employer. Mrs had no necessary connection to marriage until circa 1900 (and even then, there was an exception for upper servants).

Myth: Famine and starvation were common in the past. Reality: Not in England! Here, the poor laws and a ‘low pressure’ demographic system provided a safety net. This helps to explain why hunger and famine are absent from English fairy tales but common in the folklore of most European societies.

Myth: Women working (outside the home) is a late 20th century phenomenon. Reality: Most women in the past engaged in gainful employment, both before and after marriage.

Myth: Women take their husbands’ surnames because of patriarchal norms. Reality: The practice of taking a husband’s surname developed in England from the peculiarly restrictive rule of ‘coverture’ in marital property. Elsewhere in Europe, where the husband managed the wife’s property but did not own it, women retained their birth names until circa 1900. 

Myth: People rarely moved far from their place of birth in the past. Reality: Migration was actually quite common – a village population could change more than half its members from one decade to the next. Rural to urban migration enabled the growth of cities, and since people migrated almost exclusively to find work, the sex ratio of cities can indicate what kind of work was available.

Campop’s Professor Amy Erickson said: “People, not least politicians, often refer to history to nudge us to do something, or stop doing something. Not all of this history is accurate, and repeating myths about sex, marriage, family and work can be quite harmful. They can put unfair pressure on people, create guilt and raise false expectations, while also misrepresenting the lives of our ancestors.”

On World Population Day, University of Cambridge researchers bust some of the biggest myths about life in England since the Middle Ages, challenging assumptions about everything from sex before marriage to migration and the health/wealth gap.

Assumptions about lives, families and work in the past continue to influence attitudes today. But many of these are myths.
Alice Reid
Black and white photograph of a family lined up against a wall in E W Hope, Report on the physical welfare of mothers and children (Liverpool, The Carnegie United Kingdom Trust, 1917), volume 1


AI able to identify drug-resistant typhoid-like infection from microscopy images in a matter of hours

Color-enhanced scanning electron micrograph showing Salmonella Typhimurium (red) invading cultured human cells

Antimicrobial resistance is a growing global health problem: many infections are becoming difficult to treat, with fewer treatment options available. It even raises the spectre of some infections becoming untreatable in the near future.

One of the challenges facing healthcare workers is the ability to distinguish rapidly between organisms that can be treated with first-line drugs and those that are resistant to treatment. Conventional testing can take several days, requiring bacteria to be cultured, tested against various antimicrobial treatments, and analysed by a laboratory technician or by machine. This delay often results in patients being treated with an inappropriate drug, which can lead to more serious outcomes and, potentially, further drive drug resistance.

In research published in Nature Communications, a team led by researchers in Professor Stephen Baker’s lab at the University of Cambridge developed a machine-learning tool capable of identifying, from microscopy images alone, Salmonella Typhimurium bacteria that are resistant to the first-line antibiotic ciprofloxacin – even without testing the bacteria against the drug.

S. Typhimurium causes gastrointestinal illness and, in severe cases, a typhoid-like illness whose symptoms include fever, fatigue, headache, nausea, abdominal pain, and constipation or diarrhoea. At its most serious it can be life threatening. While infections can be treated with antibiotics, the bacteria are becoming increasingly resistant to a number of them, making treatment more complicated.

The team used high-resolution microscopy to examine S. Typhimurium isolates exposed to increasing concentrations of ciprofloxacin and identified the five most important imaging features for distinguishing between resistant and susceptible isolates.

They then trained and tested a machine-learning algorithm to recognise these features using imaging data from 16 samples.

The algorithm correctly predicted in every case whether the bacteria were susceptible or resistant to ciprofloxacin, without the bacteria needing to be exposed to the drug. This held even for isolates cultured for just six hours, compared with the usual 24 hours needed to culture a sample in the presence of the antibiotic.
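
The article does not specify which classifier or which exact imaging features the team used, so the snippet below is only a minimal sketch of the general approach: a classifier trained on a handful of per-isolate imaging features, evaluated with leave-one-out cross-validation because only 16 samples are available. All feature values, labels and model choices are illustrative placeholders, not the authors’ pipeline.

```python
# Minimal sketch (not the authors' pipeline): training a classifier to call
# ciprofloxacin resistance from a handful of per-isolate imaging features.
# Feature values and labels below are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Toy stand-in for measurements from 16 isolates: five imaging features per
# isolate (e.g. cell length, width, intensity statistics) and a 0/1 label
# for susceptible vs resistant.
X = rng.normal(size=(16, 5))
y = np.array([0] * 8 + [1] * 8)

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# With so few isolates, leave-one-out cross-validation is a reasonable way
# to estimate how well the features separate the two classes.
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"Leave-one-out accuracy on toy data: {scores.mean():.2f}")

# Fit on everything and inspect which features drive the prediction.
clf.fit(X, y)
print("Feature importances:", np.round(clf.feature_importances_, 3))
```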

Dr Tuan-Anh Tran, who worked on this research while a PhD student at the University of Oxford and is now based at the University of Cambridge, said: “S. Typhimurium bacteria that are resistant to ciprofloxacin have several notable differences to those still susceptible to the antibiotic. While an expert human operator might be able to identify some of these, on their own they wouldn't be enough to confidently distinguish resistant and susceptible bacteria.

“The beauty of the machine learning model is that it can identify resistant bacteria based on a few subtle features on microscopy images that human eyes cannot detect.”

For a sample to be analysed using this approach, it would still be necessary to isolate the bacteria from it – for example from blood, urine or stool. However, because the bacteria do not need to be tested against ciprofloxacin, the whole process could be reduced from several days to a matter of hours.

While there are limitations to how practical and cost effective this particular approach would be, the team says it demonstrates in principle how powerful artificial intelligence could be in helping the fight against antimicrobial resistance.

Dr Sushmita Sridhar, who initiated this project while a PhD student in the Department of Medicine at the University of Cambridge and is now a postdoc at the University of New Mexico and Harvard School of Public Health, said: “Given that this approach uses single cell resolution imaging, it isn’t yet a solution that could be readily deployed everywhere. But it shows real promise that by capturing just a few parameters about the shape and structure of the bacteria, it can give us enough information to predict drug resistance with relative ease.”

The team now aims to work on larger collections of bacteria to create a more robust experimental set that could speed up the identification process even more and allow them to identify resistance to ciprofloxacin and other antibiotics in a number of different species of bacteria.

Sridhar added: “What would be really important, particularly for a clinical context, would be to be able to take a complex sample – for example blood or urine or sputum – and identify susceptibility and resistance directly from that. That's a much more complicated problem and one that really hasn't been solved at all, even in clinical diagnostics in a hospital. If we could find a way of doing this, we could reduce the time taken to identify drug resistance and at a much lower cost. That could be truly transformative.”

The research was funded by Wellcome.

Reference
Tran, TA & Sridhar, S et al. Combining machine learning with high-content imaging to infer ciprofloxacin susceptibility in isolates of Salmonella Typhimurium. Nat Comms; 13 June 2024; DOI: 10.1038/s41467-024-49433-4

Artificial intelligence (AI) could be used to identify drug resistant infections, significantly reducing the time taken for a correct diagnosis, Cambridge researchers have shown. The team showed that an algorithm could be trained to identify drug-resistant bacteria correctly from microscopy images alone.

The beauty of the machine learning model is that it can identify resistant bacteria based on a few subtle features on microscopy images that human eyes cannot detect
Tuan-Anh Tran
Colour-enhanced scanning electron micrograph showing Salmonella Typhimurium (red) invading cultured human cells


Scientists map how deadly bacteria evolved to become epidemic

A man with a respirator on his face

Pseudomonas aeruginosa (P. aeruginosa) is responsible for over 500,000 deaths per year around the world, of which over 300,000 are associated with antimicrobial resistance (AMR). People with conditions such as COPD (smoking-related lung damage), cystic fibrosis (CF) and non-CF bronchiectasis are particularly susceptible.

How P. aeruginosa evolved from an environmental organism into a specialised human pathogen was not previously known. To investigate this, an international team led by scientists at the University of Cambridge examined DNA data from almost 10,000 samples taken from infected individuals, animals, and environments around the world. Their results are published today in Science.

By mapping the data, the team was able to create phylogenetic trees – ‘family trees’ – that show how the bacteria from the samples are related to each other. Remarkably, they found that almost seven in ten infections are caused by just 21 genetic clones, or ‘branches’ of the family tree, that have rapidly evolved (by acquiring new genes from neighbouring bacteria) and then spread globally over the last 200 years. This spread occurred most likely as a result of people beginning to live in densely-populated areas, where air pollution made our lungs more susceptible to infection and where there were more opportunities for infections to spread.
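
The Science paper’s phylogenetic methods are not described in detail here, but the ‘seven in ten infections from 21 clones’ figure is, at heart, a tally of how many samples fall on each branch of the tree. Below is a minimal sketch of that bookkeeping, assuming each genome has already been assigned to a clone; the sample data and clone labels are invented for illustration.

```python
# Illustrative only: given per-sample clone assignments (here invented),
# count what share of infection samples the commonest clones account for.
import pandas as pd

samples = pd.DataFrame({
    "sample_id": range(12),
    "clone": ["ST235", "ST235", "ST111", "ST175", "ST235",
              "ST111", "ST395", "ST175", "ST235", "ST111",
              "rare_1", "rare_2"],
})

clone_counts = samples["clone"].value_counts()
top_clones = clone_counts.head(4)          # stand-in for the 21 epidemic clones
share = top_clones.sum() / len(samples)

print(top_clones)
print(f"Share of samples from the top clones: {share:.0%}")
```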

These epidemic clones have an intrinsic preference for infecting particular types of patient, with some favouring CF patients and others non-CF individuals. It turns out that the bacteria can exploit a previously unknown immune defect in people with CF, allowing them to survive within macrophages – cells that ‘eat’ invading organisms, breaking them down and preventing the infection from spreading. Because of this flaw, once a macrophage ‘swallows’ P. aeruginosa, it is unable to get rid of it.

Having infected the lungs, these bacteria then evolve in different ways to become even more specialised for a particular lung environment. The result is that certain clones can be transmitted within CF patients and other clones within non-CF patients, but almost never between CF and non-CF patient groups.  

Professor Andres Floto, Director of the UK Cystic Fibrosis Innovation Hub at the University of Cambridge and Royal Papworth Hospital NHS Foundation Trust, and senior author of the study said: “Our research on Pseudomonas has taught us new things about the biology of cystic fibrosis and revealed important ways we might be able to improve immunity against invading bacteria in this and potentially other conditions.

“From a clinical perspective, this study has revealed important information about Pseudomonas.  The focus has always been on how easily this infection can spread between CF patients, but we’ve shown that it can spread with worrying ease between other patients, too. This has very important consequences for infection control in hospitals, where it’s not uncommon for an infected individual to be on an open ward with someone potentially very vulnerable.

“We are incredibly lucky at Royal Papworth Hospital where we have single rooms and have developed and evaluated a new air-handling system to reduce the amount of airborne bacteria and protect all patients.”

Dr Aaron Weimann from the Victor Phillip Dahdaleh Heart & Lung Research Institute at the University of Cambridge, and first author on the study, said: “It’s remarkable to see the speed with which these bacteria evolve and can become epidemic and how they can specialise for a particular lung environment. We really need systematic, pro-active screening of all at risk patient groups to detect and hopefully prevent the emergence of more epidemic clones.”

The research was funded by Wellcome and the UK Cystic Fibrosis Trust.

Reference
Weimann, A et al. Evolution and host-specific adaptation of Pseudomonas aeruginosa. Science; 4 July 2024; DOI: 10.1126/science.adi0908

Pseudomonas aeruginosa – an environmental bacterium that can cause devastating multidrug-resistant infections, particularly in people with underlying lung conditions – evolved rapidly and then spread globally over the last 200 years, probably driven by changes in human behaviour, a new study has found.

It’s remarkable to see the speed with which these bacteria evolve and can become epidemic and how they can specialise for a particular lung environment
Aaron Weimann
A man with a respirator on his face


Genetic study points to oxytocin as possible treatment for obesity and postnatal depression

Illustration of a tired African American mother crying

Obesity and postnatal depression are significant global health problems. Postnatal depression affects more than one in 10 women within a year of giving birth and is linked to an increased risk of suicide, which accounts for as many as one in five maternal deaths in high income countries. Meanwhile, obesity has more than doubled in adults since 1990 and quadrupled in adolescents, according to the World Health Organization.

While investigating two boys from different families with severe obesity, anxiety, autism, and behavioural problems triggered by sounds or smells, a team led by scientists at the University of Cambridge, UK, and Baylor College of Medicine, Houston, USA, discovered that the boys were missing a single gene, known as TRPC5, which sits on the X chromosome.

Further investigation revealed that both boys inherited the gene deletion from their mothers, who were missing the gene on one of their X chromosomes. The mothers also had obesity, but in addition had experienced postnatal depression.

To test if it was the TRPC5 gene that was causing the problems in the boys and their mothers, the researchers turned to animal models, genetically-engineering mice with a defective version of the gene (Trpc5 in mice).

Male mice with this defective gene displayed the same problems as the boys, including weight gain, anxiety, a dislike of social interactions, and aggressive behaviour. Female mice displayed the same behaviours, but when they became mothers, they also displayed depressive behaviour and impaired maternal care. Interestingly, male mice and female mice who were not mothers but carried the mutation did not show depression-like behaviour.

Dr Yong Xu, Associate Director for Basic Sciences at the USDA/ARS Children’s Nutrition Research Center at Baylor College of Medicine, said: “What we saw in those mice was quite remarkable. They displayed very similar behaviours to those seen in people missing the TRPC5 gene, which in mothers included signs of depression and a difficulty caring for their babies. This shows us that this gene is causing these behaviours.”

TRPC5 is one of a family of genes that are involved in detecting sensory signals, such as heat, taste and touch. This particular gene acts on a pathway in the hypothalamus region of the brain, where it is known to control appetite.

When the researchers looked in more detail at this brain region, they discovered that TRPC5 acts on oxytocin neurons – nerve cells that produce the hormone oxytocin, often nicknamed the ‘love hormone’ because of its release in response to displays of affection, emotion and bonding.

Deleting the gene from these oxytocin neurons led to otherwise healthy mice showing similar signs of anxiety, overeating and impaired sociability, and, in the case of mothers, postnatal depression. Restoring the gene in these neurons reduced body weight and symptoms of anxiety and postnatal depression.

In addition to acting on oxytocin neurons, the team showed that TRPC5 also acts on so-called POMC neurons, which have been known for some time to play an important role in regulating weight. Children in whom the POMC gene is not working properly often have an insatiable appetite and gain weight from an early age.

Professor Sadaf Farooqi from the Institute of Metabolic Science at the University of Cambridge said: “There's a reason why people lacking TRPC5 develop all of these conditions. We’ve known for a long time that the hypothalamus plays a key role in regulating ‘instinctive behaviours’ – which enable humans and animals to survive – such as looking for food, social interaction, the flight or fight response, and caring for their infants. Our work shows that TRPC5 acts on oxytocin neurons in the hypothalamus to play a critical role in regulating our instincts.”

While deletions of the TRPC5 gene are rare, an analysis of DNA samples from around 500,000 individuals in UK Biobank revealed 369 people – around three-quarters of whom were women – who carried variants of the gene and had a higher-than-average body mass index.

The researchers say their findings suggest that restoring oxytocin could help treat people with missing or defective TRPC5 genes, and potentially mothers experiencing postnatal depression.

Professor Farooqi said: “While some genetic conditions such as TRPC5 deficiency are very rare, they teach us important lessons about how the body works. In this instance, we have made a breakthrough in understanding postnatal depression, a serious health problem about which very little is known despite many decades of research. And importantly, it may point to oxytocin as a possible treatment for some mothers with this condition.”

There is already evidence in animals that the oxytocin system is involved in both depression and in maternal care and there have been small trials into the use of oxytocin as a treatment. The team say their work provides direct proof of oxytocin’s role, which will be crucial in supporting bigger, multi-centre trials. 

Professor Farooqi added: “This research reminds us that many behaviours which we assume are entirely under our control have a strong basis in biology, whether that’s our eating behaviour, anxiety or postnatal depression. We need to be more understanding and sympathetic towards people who suffer with these conditions.” 

This work was supported by Wellcome, the National Institute for Health and Care Research (NIHR), NIHR Cambridge Biomedical Research Centre, Botnar Fondation and Bernard Wolfe Health Neuroscience Endowment.

Reference
Li, Y, Cacciottolo, TM & Yin, N. Loss of Transient Receptor Potential Channel 5 Causes Obesity and Postpartum Depression. Cell; 2 July 2024; DOI: 10.1016/j.cell.2024.06.001

Scientists have identified a gene which, when missing or impaired, can cause obesity, behavioural problems and, in mothers, postnatal depression. The discovery, reported on 2 July in Cell, may have wider implications for the treatment of postnatal depression, with a study in mice suggesting that oxytocin may alleviate symptoms.

This research reminds us that many behaviours which we assume are entirely under our control have a strong basis in biology. We need to be more understanding and sympathetic towards people who suffer with these conditions
Sadaf Farooqi
Illustration of a tired African American mother crying


Cutting-edge genomic test can improve care of children with cancer

A little boy with a shaved head is smiling as he lies on a hospital bed.

The study, published on 2 July in Nature Medicine, is the first time that the impact of using whole genome sequencing in current NHS practice has been assessed. It was led by researchers at the University of Cambridge, Cambridge University Hospitals NHS Trust, Wellcome Sanger Institute and Great Ormond Street Hospital.

The team analysed the use of routine genome sequencing, through the NHS Genomic Medicine Service, at Cambridge University Hospitals, where such tests are given to all children with solid tumours, and at Great Ormond Street Hospital, which provides the test for childhood leukaemia.

The researchers found that cancer sequencing gave new insights that improved the immediate clinical care of seven per cent of children, while also providing all the benefits of current standard tests.

Furthermore, in 29 per cent of cases, genome sequencing provided additional information that helped clinicians better understand the tumours of individual children and informed future management – for example, by uncovering unexpected mutations that increase future cancer risk, prompting preventative measures such as regular screening.

Overall, whole genome sequencing provides additional, relevant data about childhood cancer that is useful for informing practice. The results also show that it can reduce the number of tests required, and the researchers therefore suggest it should be provided to all children affected by cancer.

Whole genome sequencing (WGS) is a single test that provides a complete readout of the entire genetic code of the tumour and identifies every single cancer-causing mutation. Comparatively, traditional standard-of-care tests only look at tiny regions of the cancer genome, and therefore many more tests are often required per child.

Professor Sam Behjati, senior author from the Wellcome Sanger Institute, Cambridge University Hospitals, and the University of Cambridge, said: “Whole genome sequencing provides the gold standard, most comprehensive and cutting-edge view of cancer. What was once a research tool that the Sanger Institute started exploring over a decade ago has now become a clinical test that I can offer to my patients. This is a powerful example of the genomic data revolution of healthcare that enables us to provide better, individualised care for children with cancer.”

NHS England is one of the few health services in the world that has a national initiative, through the Genomic Medicine Service, offering universal genome sequencing to every child with suspected cancer. However, due to multiple barriers and a lack of evidence from real-time practice supporting its use, whole cancer genome sequencing is not yet widespread practice.  

The latest study looked at 281 children with suspected cancer across the two units. The team analysed the clinical and diagnostic information across these units and assessed how genome sequencing affected the care of children with cancer.

They found that WGS changed the clinical management in seven per cent of cases, improving care for 20 children, by providing information that is not possible to acquire from standard of care tests.

Additionally, WGS faithfully reproduced every one of the 738 standard of care tests utilised in these 281 cases, suggesting that a single WGS test could replace the multiple tests that the NHS currently uses if this is shown to be economically viable.

WGS provides a detailed insight into rare cancers, for example, by revealing novel variants of cancer. The widespread use of genome sequencing will enable clinicians to access these insights for individual patients while simultaneously building a powerful shared genomic resource for research into new treatment targets, possible prevention strategies, and the origins of cancer.

Dr Jack Bartram, senior author from Great Ormond Street Hospital NHS Foundation Trust and the North Thames Genomic Medicine Service, said: “Childhood cancer treatment is mostly guided by genetic features of the tumour, and therefore an in-depth genetic understanding of cancer is crucial in guiding our practice. Our research shows that whole genome sequencing delivers tangible benefits above existing tests, providing better care for our patients. We hope this research really highlights why whole genome sequencing should be delivered as part of routine clinical care to all children with suspected cancer.”

Professor Behjati is based at the Department of Paediatrics, University of Cambridge, and is a Fellow of Corpus Christi College, Cambridge.

This research was supported in part by Wellcome, the Pessoa de Araujo family and the National Institute for Health and Care Research.

Reference
A Hodder, S Leiter, J Kennedy, et al. Benefits for children with suspected cancer from routine whole genome sequencing. Nature Medicine; 2 July 2024; DOI: 10.1038/s41591-024-03056-w

Adapted from a press release from Wellcome Sanger Institute

Whole genome sequencing has improved clinical care of some children with cancer in England by informing individual patient care. Research published today supports the efforts to provide genome sequencing to all children with cancer and shows how it can improve the management of care in real-time, providing more benefits than all current tests combined.

This is a powerful example of the genomic data revolution of healthcare that enables us to provide better, individualised care for children with cancer
Sam Behjati
Boy Battling With Cancer
Eddie’s story

When he was six years old, Eddie began to have regular low-grade fevers that seemed to affect him a lot. Even though early tests came back normal, the fevers became more frequent and his mum, Harri, noticed that on one or two occasions he seemed out of breath while doing small things like reading a book. A chest X-ray revealed a huge mass on Eddie’s chest, and he was diagnosed with T-cell acute lymphoblastic leukaemia (T-ALL). Eddie was immediately transferred to Great Ormond Street Hospital (GOSH) to begin treatment.

“I know it sounds like a cliché, but you really don’t think it will ever happen to your child. It felt like our world fell out from under us. During those first few weeks I remember wondering if this was it, I was taking so many photos of us together and wondering if it could be the last,” said Harri, Eddie's mum.

Eddie was put onto a treatment plan that included eight months of intense chemotherapy, followed by two and a half years of maintenance treatment. As part of his treatment at GOSH Eddie’s family were also offered WGS to identify any cancer-causing changes.

“When we were offered whole genome sequencing, we didn’t even hesitate. I wanted to have all the information, I wanted to have some peace of mind for the future and know that Eddie was having the right care throughout. I also wanted to make sure that Eddie’s brother, Leo, wasn’t any more likely to get T-ALL because Eddie had,” said Harri.

On his seventh birthday, Eddie’s family received the call to say he was in remission. Now, at nine years old, Eddie is nearing the end of his maintenance treatment and is doing well.

“We are trying to live each day, and this experience has really changed our outlook on life. We always try to take the positive from every situation. Words can’t explain what Eddie has been through this past three years but he has come out the other side as a sensitive, confident, and smart young man.  He is mature beyond his years and he has been involved in everything, including decisions about his treatment. To say we are proud, doesn’t even come close to how we truly feel about him,” said Harri.

Their personal experience of WGS was so important on their journey that they provided support for this research.

Harri added: “I always say that having a child with a cancer diagnosis feels like you’ve been standing on a trap door all these years without knowing. Then after the diagnosis, you are in freefall. And even when things are stable again, you are constantly aware that the trap door is still there and there is a possibility it could open again at any time. Having access to whole genome sequencing gave us some sense of reassurance, it could have informed us about targeted treatments and gave us some insight into future risk. We wanted to support something that had the potential to have a real impact on treatment and outcomes so when we heard about this research project and its potential, it was very exciting that we could be a small part of it. It helped us turn something so devastating into something positive and we just hope that this research helps.”


Cambridge spin-out’s sportscar prototype takes ultra-fast charging out of the lab and onto the road

Electric sportscar on a country road

In addition to ultra-fast charging times, the batteries developed by Nyobolt – which was spun out of Professor Dame Clare Grey’s lab in the Yusuf Hamied Department of Chemistry in 2019 – do not suffer from the degradation issues associated with lithium-ion batteries.

Tests of the first running Nyobolt EV prototype will be used to validate the company’s battery performance in a high-performance environment.

Cambridge-based Nyobolt has used its patented carbon and metal oxide anode materials, low-impedance cell design, integrated power electronics and software controls to create power-dense battery and charging systems. These support the electrification of applications such as heavy-duty off-highway trucks, EVs, robotics and consumer devices that demand high power and quick recharge cycles.

Initial in-vehicle testing using 350kW (800V) DC fast chargers confirmed that the Nyobolt EV’s battery can be charged from 10 per cent to 80 per cent in 4 minutes 37 seconds – with a full charge enabling the prototype to achieve a range of 155 miles. That is twice the speed of most of the fastest-charging vehicles today.

Independent testing of the technology confirmed that Nyobolt’s longer-lasting and more sustainable batteries can achieve over 4,000 fast charge cycles, or 600,000 miles, maintaining over 80 per cent battery capacity retention. This is many multiples higher than the warranties of much larger EV batteries on the road today.
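
The quoted figures allow some back-of-envelope arithmetic. The article does not state the prototype’s pack capacity, so the 35 kWh used below is purely an assumed, illustrative value; the lifetime-mileage calculation uses only the 4,000 cycles and 155-mile range quoted above.

```python
# Back-of-envelope arithmetic based on the figures quoted above.
# The pack capacity is NOT given in the article; 35 kWh is an assumed,
# illustrative value only.
pack_kwh = 35.0                       # assumed usable capacity
charge_fraction = 0.80 - 0.10         # charging from 10% to 80%
charge_time_h = (4 * 60 + 37) / 3600  # 4 minutes 37 seconds, in hours

energy_added_kwh = pack_kwh * charge_fraction
avg_charge_power_kw = energy_added_kwh / charge_time_h
print(f"Energy added: {energy_added_kwh:.1f} kWh")
print(f"Average charging power: {avg_charge_power_kw:.0f} kW "
      f"(the tests used 350 kW chargers)")

# Cycle life vs quoted mileage: 4,000 full fast-charge cycles at a
# 155-mile range per charge.
lifetime_miles = 4000 * 155
print(f"Implied lifetime range: {lifetime_miles:,} miles "
      f"(the article quotes around 600,000 miles)")
```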

“Nyobolt’s low impedance cells ensure we can offer sustainability, stretching out the battery’s usable life for up to 600,000 miles in the case of our technology demonstrator,” said company co-founder and CEO, Dr Sai Shivareddy.

The battery pack in the Nyobolt EV prototype not only adds miles faster; its compact size also enables energy-efficient electric vehicles that are cheaper to buy and run, and that crucially use fewer resources to manufacture.

“Nyobolt is removing the obstacle of slow and inconvenient charging, making electrification appealing and accessible to those who don’t have the time for lengthy charging times or space for a home charger,” said Shane Davies, Nyobolt’s director of vehicle battery systems.

Nyobolt’s batteries could be in low-volume production within a year, ramping up to 1,000 packs in 2025. Nyobolt’s flexible manufacturing model enables volumes of up to 2 million cells per year.

Nyobolt’s technology builds on a decade of battery research led by Grey and Shivareddy, who invented cutting-edge supercapacitors. Key to the company’s ability to offer ultra-fast charging without impacting battery life is its low-impedance cells that generate less heat, making it easier to manage such high-power levels during charging. Its anode materials in lithium-ion battery cells allow for a faster transfer of electrons between the anode and cathode.

Nyobolt is in conversation with a further 8 vehicle manufacturers about adopting its technology. Alongside automotive applications, Nyobolt’s fast-charging technology is set to be used this year in robotics.

“Our extensive research here in the UK and in the US has unlocked a new battery technology that is ready and scalable right now,” said Shivareddy. “We are enabling the electrification of new products and services currently considered inviable or impossible. Creating real-world demonstrators, such as the Nyobolt EV, underlines both our readiness and commitment to making the industries see change is possible.”

Adapted from a Nyobolt media release.

Nyobolt, a University of Cambridge spin-out company, has demonstrated its ultra-fast charging batteries in an electric sportscar prototype, going from 10% to 80% charge in under five minutes, twice the speed of the fastest-charging vehicles currently on the road.

Nyobolt EV prototype


Largest ever genetic study of age of puberty in girls shows links with weight gain

Portrait of a young girl writing in her diary

In the largest study of its kind to date, an international team led by researchers at the Medical Research Council (MRC) Epidemiology Unit, University of Cambridge, studied the DNA of around 800,000 women from Europe, North America, China, Japan, and Korea.

Published on 1 July in Nature Genetics, the researchers found more than 1,000 variants – small changes in DNA – that influence the age of first menstrual period. Around 600 of these variants were observed for the first time.

The age at which girls hit puberty and start having periods is normally between 10 and 15, though this has been getting earlier and earlier in recent decades. The reasons for this are not fully understood. Early puberty is linked with increased risk of a number of diseases in later life, including type 2 diabetes, cardiovascular disease and certain cancers. Later puberty, on the other hand, has been linked to improved health in adulthood and a longer lifespan.

Just under half (45%) of the discovered genetic variants affected puberty indirectly, by increasing weight gain in early childhood.

Corresponding author Professor John Perry said: “Many of the genes we’ve found influence early puberty by first accelerating weight gain in infants and young children. This can then lead to potentially serious health problems in later life, as having earlier puberty leads to higher rates of overweight and obesity in adulthood.”

Previous work by the team – together with researchers at Cambridge’s MRC Metabolic Diseases Unit – showed that a receptor in the brain, known as MC3R, detects the nutritional state of the body and regulates the timing of puberty and rate of growth in children, providing a mechanism by which this occurs. Other identified genes appeared to be acting in the brain to control the release of reproductive hormones.

The scientists also analysed rare genetic variants that are carried by very few people, but which can have large effects on puberty. For example, they found that one in 3,800 women carry variants in the gene ZNF483, which caused these women to experience puberty, on average, 1.3 years later.

Dr Katherine Kentistou, lead study investigator, added: “This is the first time we’ve ever been able to analyse rare genetic variants at this scale. We have identified six genes which all profoundly affect the timing of puberty. While these genes were discovered in girls, they often have the same impact on the timing of puberty in boys. The new mechanisms we describe could form the basis of interventions for individuals at risk of early puberty and obesity.”

The researchers also generated a genetic score that predicted whether a girl was likely to hit puberty very early or very late. Girls with the highest 1% of this genetic score were 11 times more likely to have extremely delayed puberty – that is, after age 15 years. On the other hand, girls with the lowest 1% genetic score were 14 times more likely to have extremely early puberty – before age 10.
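
The article does not describe how this genetic score was constructed, but such scores are typically a weighted sum of the number of effect alleles a person carries at each associated variant, with individuals then ranked into percentiles across a cohort. A minimal sketch of that idea follows; the variant names, weights and genotypes are invented for illustration.

```python
# Minimal sketch of a polygenic (genetic) score: a weighted sum of the
# number of effect alleles carried at each variant. Variant IDs, weights
# and genotypes below are invented for illustration.
import numpy as np

effect_weights = {            # per-variant effect on age at first period (illustrative)
    "rs_variant_1": 0.04,
    "rs_variant_2": -0.03,
    "rs_variant_3": 0.02,
}

def polygenic_score(dosages: dict) -> float:
    """dosages: variant -> number of effect alleles carried (0, 1 or 2)."""
    return sum(effect_weights[v] * dosages.get(v, 0) for v in effect_weights)

girl = {"rs_variant_1": 2, "rs_variant_2": 0, "rs_variant_3": 1}
print(f"Genetic score: {polygenic_score(girl):+.2f}")

# Ranking scores across a cohort is how percentile groups (e.g. the top and
# bottom 1% described above) are defined.
scores = np.random.default_rng(1).normal(size=1000)
top_1pc_cutoff = np.percentile(scores, 99)
print(f"Top-1% cutoff in a toy cohort: {top_1pc_cutoff:.2f}")
```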

Senior author and paediatrician Professor Ken Ong said: “In the future, we may be able to use these genetic scores in the clinic to identify those girls whose puberty will come very early or very late. The NHS is already trialling whole genome sequencing at birth, and this would give us the genetic information we need to make this possible.

“Children who present in the NHS with very early puberty – at age seven or eight – are offered puberty blockers to delay it. But age of puberty is a continuum, and if they miss this threshold, there’s currently nothing we have to offer. We need other interventions, whether that’s oral medication or a behavioural approach, to help. This could be important for their health when they grow up.”

The research was supported by the Medical Research Council and included data from the UK Biobank.

Reference
Kentistou, KA & Kaisinger, LR, et al. Understanding the genetic complexity of puberty timing across the allele frequency spectrum. Nat Gen; 1 July 2024; DOI: 10.1038/s41588-024-01798-4

Genes can indirectly influence the age at which girls have their first period by accelerating weight gain in childhood, a known risk factor for early puberty, a Cambridge-led study has found. Other genes can directly affect age of puberty, some with profound effects.

Many of the genes we’ve found influence early puberty by first accelerating weight gain in infants and young children. This can then lead to potentially serious health problems in later life
John Perry
Portrait of a young girl writing in her diary


No evidence that England’s new ‘biodiversity boost’ planning policy will help birds or butterflies

Researchers assess woodland condition at Alice Holt Forest

From 2024, the UK’s Environment Act requires planning applications to demonstrate an overall biodiversity net gain of at least 10% as calculated using a new statutory biodiversity metric.

The researchers trialled the metric by using it to calculate the biodiversity value of 24 sites across England. These sites have all been monitored over the long-term, allowing the team to compare biodiversity species data with results from the metric.

Plant biodiversity at the sites matched values produced using the metric, but bird and butterfly biodiversity did not.

This means there’s no evidence that a 10% net biodiversity gain calculated using the statutory biodiversity metric will translate into real-life gains for birds and butterflies, without additional conservation management.

This is the first comprehensive study of the performance of Defra’s statutory biodiversity metric across England. The results were published on 28 June in the Journal of Applied Ecology.

Plants, birds and butterflies have been comprehensively surveyed in England over many years, and are used as indicators for the national state of nature.

The researchers say the metric must be improved to better capture the intricacies of the different species within an ecosystem.

“The statutory biodiversity metric is a really important opportunity, and has potential to direct a lot of money into biodiversity conservation from developers. It’s the responsibility of conservationists and policy makers to ensure that it provides a reliable indication of nature’s diversity,” said Dr Cicely Marshall in the University of Cambridge’s Department of Plant Sciences, first author of the paper.

She added: “At the moment the metric does capture plant diversity quite well, but it doesn’t reflect the intricacies of ecosystems – species like birds and butterflies use habitats in very different ways.”

The metric, created by the UK Government’s Department for Environment, Food and Rural Affairs (Defra), was introduced as part of the Environment Act with its legally binding agenda to deliver “the most ambitious environmental programme of any country on earth.” It scores the condition and distinctiveness of a piece of land to calculate its biodiversity value in standardised ‘biodiversity units.’

This allows developers to project biodiversity losses and gains across a site, so they can ensure the development achieves an overall minimum 10% biodiversity gain. Landowners can use the tool to calculate the biodiversity value of their land.
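
As a rough illustration of how this bookkeeping works: a habitat parcel’s value is derived by multiplying its area by scores for distinctiveness and condition (the real statutory metric applies further multipliers, such as strategic significance and risk factors for newly created habitat, which are omitted here). The sketch below, with made-up scores, shows how a development’s baseline and post-development units are compared against the 10% net-gain requirement.

```python
# Simplified sketch of the net-gain bookkeeping described above. The real
# statutory metric includes further multipliers; here biodiversity units are
# just area x distinctiveness x condition, with made-up scores.
def biodiversity_units(area_ha: float, distinctiveness: float, condition: float) -> float:
    return area_ha * distinctiveness * condition

# Baseline: 2 ha of grassland lost to development.
baseline = biodiversity_units(area_ha=2.0, distinctiveness=6, condition=3)

# Proposed compensation: 3.5 ha of newly created grassland in poorer condition.
post_development = biodiversity_units(area_ha=3.5, distinctiveness=6, condition=2)

net_gain = (post_development - baseline) / baseline
print(f"Baseline: {baseline:.0f} units, post-development: {post_development:.0f} units")
print(f"Net change: {net_gain:+.0%} (the Environment Act requires at least +10%)")
```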

Marshall, who is also a Research Fellow at King’s College, Cambridge, said: “Many property developments have been very detrimental to nature in the past, and it’s exciting that England now has a requirement for developers to leave nature in a better state than they found it.

“We hope our study will contribute to improving the way nature’s value is calculated, to make the most of this valuable opportunity for nature recovery.”

The results of the study have been used to make recommendations to Defra and Natural England to help improve the metric.

The metric uses habitat as a proxy for biodiversity, scoring habitats’ intrinsic distinctiveness and current condition. Plans for biodiversity gain can involve replacing lost habitat with similar habitat - the researchers say that nature recovery could be improved if the particular species and habitats impacted by a development were also taken into account in this process.

There can be huge differences in biodiversity across habitats like croplands, for example, and these aren’t captured by the metric which assigns all cropland the same condition score. Conventional farms that regularly use artificial pesticides and herbicides have much lower biodiversity than organic farms that do not.

“There are great differences in the ecological value of cropland depending on how it’s managed, but the metric gives all cropland a low biodiversity score. It would be nice to see these differences reflected,” said Marshall.

The UK is committed to building 300,000 homes a year by the mid-2020s, so the net biodiversity gain requirement is expected to generate a market for biodiversity credits worth an estimated £135m-£274m annually – substantially increasing funding for nature conservation in England.

The research was funded by the Ecological Continuity Trust.

Reference: Marshall, C. 'England’s statutory biodiversity metric enhances plant, but not bird nor butterfly biodiversity'. Journal of Applied Ecology, June 2024. DOI: 10.1111/1365-2664.14697

A new legal requirement for developers to demonstrate a biodiversity boost in planning applications could make a more meaningful impact on nature recovery if improvements are made to the way nature’s value is calculated, say researchers at the University of Cambridge.

We hope our study will contribute to improving the way nature’s value is calculated, to make the most of this valuable opportunity for nature recovery.
Cicely Marshall
Researchers assess woodland condition at Alice Holt Forest


Simon Baron-Cohen wins MRC Millennium Medal for transformative research into autism and neurodiversity

Professor Sir Simon Baron-Cohen.

Sir Simon Baron-Cohen is a Professor in the Departments of Psychology and Psychiatry at the University of Cambridge and Fellow at Trinity College. He is Director of the Autism Research Centre, which he set up in 1997. He has published over 750 peer reviewed scientific articles and has made contributions to many aspects of autism research. In 2021, he received a knighthood in the New Year’s Honours list for his services to autism.

One of his earliest MRC grants, in 1996, was to investigate if autism could be diagnosed in babies as young as 18 months old, and his team showed that it can. Ideally, an early diagnosis should lead to the right support, so that a child has the best opportunity to fulfil their potential.

Baron-Cohen has drawn attention to the reality that a lot of autistic people do not receive their diagnosis in early childhood. In fact, many are not diagnosed until late childhood, or even adulthood. This means they are left unsupported and feeling different, but with no explanation. As a result, autistic people can end up feeling like they do not fit in, and may experience exclusion or bullying by their peer group. They can feel ashamed when they’re not coping in a mainstream classroom.

He argues that the reason they struggle is that the mainstream educational setting was designed for non-autistic people. This can lead to a gradual deterioration in their mental health. Many underachieve academically, and only 15% of autistic adults are employed. Most worryingly, one in four autistic adults have planned or attempted suicide. He said: “Autistic people are being failed by our society.”

In 2017 Baron-Cohen was invited by the United Nations to give a keynote lecture on Autism Acceptance Day. He described how autistic people are excluded from many basic human rights. These include the right to education, the right to health services, the right to dignity, and the right to employment.

In an effort to change this, Baron-Cohen created a charity called the Autism Centre of Excellence (ACE) at Cambridge. The charity is science-led and aims to put the science into the hands of policymakers, so there’s no delay in translating policy-relevant findings.

Alongside his applied research, Baron-Cohen’s team also conducts basic research. Autism starts prenatally and is partly, but not completely, genetic. For decades it was unclear what other factors might contribute to the cause of autism. Over the past 20 years Baron-Cohen has made two big discoveries that have helped explain what causes autism. First, his team found elevated levels of prenatal androgens (sex hormones such as testosterone) in pregnancies that later resulted in autism. Second, they found that levels of prenatal oestrogens (another group of sex hormones, synthesised from androgens) were also elevated in pregnancies resulting in autism.

Autism is an example of neurodiversity. Autistic individuals’ brains develop differently, from before birth. Some of these differences result in disability, for example, in social skills and communication. But others result in strengths or talents. For example, many autistic people have excellent memory for facts and excellent attention to detail. And many are strongly attracted to patterns. Baron-Cohen’s recent book The Pattern Seekers celebrates autistic people’s different minds. In many environments, such skills are assets. In his research group, he employs neurodivergent individuals.

Baron-Cohen will receive the prestigious medal, specially created by The Royal Mint, and will be listed amongst the most highly influential and impactful researchers in the UK. He will deliver a lecture about his research, and his achievements will be celebrated at an Awards Ceremony on 20 June 2024 at the Law Society, where he will receive the medal.

On receiving the news that he was to be awarded the MRC Millennium Medal, Baron-Cohen said: “This is the result of team work and I am fortunate to be surrounded by a talented team of scientists. I hope this Medal shines a light on how autistic people need a lot more support, from the earliest point, to lead fulfilling lives.”

The UKRI Medical Research Council (MRC) in the UK will today present the MRC Millennium Medal 2023 to Professor Sir Simon Baron-Cohen, in recognition of his pioneering MRC-funded research into the prenatal sex steroid theory of autism, his establishment of the Autism Research Centre at the University of Cambridge, and his work in the public understanding of neurodiversity.

I hope this Medal shines a light on how autistic people need a lot more support, from the earliest point, to lead fulfilling lives
Simon Baron-Cohen
Professor Sir Simon Baron-Cohen


Cambridge confers honorary degrees

Honorary graduands

The Chancellor, Lord Sainsbury of Turville, presided over a special congregation at the Senate House which was attended by around 400 staff, students, alumni and invited guests. The honorary graduands this year were:

Professor Dame Carol Black (Doctor of Medical Science) – From 2012-2019 Dame Carol was Principal of Newnham College and is now an Honorary Fellow of both Newnham and Lucy Cavendish Colleges. A clinician and medical scientist, Carol is a rheumatologist and renowned authority on the condition scleroderma. She was President of the Royal College of Physicians and has advised the UK Government on areas of public health policy.

Professor Stephen Stahl (Doctor of Medical Science) – A psychiatrist and psychopharmacologist, Stephen is Clinical Professor of Health Sciences at the University of California Riverside, US. He is a former Visiting Fellow of Clare Hall and Honorary Fellow in the Department of Psychiatry.

Professor Adele Diamond (Doctor of Science) – A world leading neuroscientist, Adele is Professor of Developmental Cognitive Neuroscience at the University of British Columbia, Canada. Her work has led to improvements in treatments for children with attention deficit hyperactivity disorder (ADHD). She is a Fellow of the Royal Society of Canada.

Professor Dame Carol Robinson (Doctor of Science) – Carol is Dr Lee's Professor of Chemistry at the University of Oxford and founding director of the Kavli Institute for Nanoscience Discovery. Formerly President of the Royal Society of Chemistry, she is an alumna and an Honorary Fellow of Churchill College.

Professor Kip Thorne (Doctor of Science) – Kip is Richard P Feynman Professor of Theoretical Physics Emeritus at the California Institute of Technology (Caltech), US. In 2017 he was jointly awarded the Nobel Prize in Physics. He is also a winner of the Kavli Prize in Astrophysics and a celebrated author.

Professor The Hon Michael Ignatieff (Doctor of Letters) – A historian, writer and broadcaster, Michael is also a former politician, having led the Liberal Party in his native Canada from 2008 to 2011. A member of the Canadian Privy Council and the Order of Canada, he has won the Heinemann, Orwell and Dan David Prizes.

Murray Perahia (Doctor of Music) – Murray is a world-renowned pianist whose interpretations of Brahms, Beethoven, Chopin and Schubert have gained great acclaim. He is Principal Guest Conductor of the Academy of St Martin-in-the-Fields and the winner of three Grammy Awards. He was appointed an Honorary KBE in 2004 and is an Honorary Fellow of Jesus College.

Kip Thorne summed up the day:

"Early in my career the 3 places which were the most important to me in terms of the colleagues I had were Cambridge, Moscow and the place I came from, Princeton. Cambridge, being home to good friends like Stephen Hawking and Martin Rees, was a special place...for me this is just tremendous to receive this honour from the place that has had such a big impact on my career."

And Dame Carol Black said:

"It almost feels unreal...I'm just so pleased and humbled by it. Nothing better could have happened to me. I'm not a Cambridge or Oxford graduate so I never thought this would happen. This is such a wonderful day and it's been lovely to see some of the people I knew when I was here as Head of House."

 

 

The University has conferred honorary degrees on seven distinguished individuals in recognition of the achievements they have made in their respective fields. An honorary doctorate is the highest accolade the University can bestow.

This is just tremendous to receive this honour from the place that has had such a big impact on my career
Kip Thorne
Honorary graduands


Echion Technologies secures £29 million to help commercialise its sustainable battery technology

Members of the Echion team

Echion, a Cambridge University spinout headquartered just outside the city, has invented and patented a niobium-based anode material, XNO®, for use in rechargeable lithium-ion batteries. The material enables lithium-ion batteries to charge safely in less than ten minutes, last for more than 10,000 cycles and not lose power in extremely cold or hot temperatures.

By improving the power density and thermal stability of lithium-ion batteries, XNO® extends their lifespan. Batteries using XNO® have been shown to have a lower environmental impact than those based on other commonly used materials such as graphite. Graphite is the dominant anode material, with over 90% market share, due to its high energy density and low cost, but graphite-based cells are limited in their maximum charge rate compared to XNO®-based cells. XNO®’s higher capacity retention and cycle life when fast charging, across a wider temperature range, boost the available battery capacity. For the same volume and weight, the higher total energy delivered across the lifetime of the battery lowers the total cost of ownership.

The £29 million investment will mean Echion’s XNO® anode material can start to be used in real-world applications, such as battery electric and hybrid trains, mining haul trucks, opportunity-charging e-buses, heavy-duty industrial transport and delivery vehicles.

Echion’s longstanding partnership with the world’s leading producer of niobium, CBMM, will see the opening of a 2,000 tonne per year XNO® manufacturing facility in 2024. This will provide Echion with the manufacturing capacity to supply its global customer base of major cell manufacturers and original equipment manufacturers (OEMs).

The investment round was led by specialist battery and energy storage technology investor Volta Energy Technologies (Volta), with participation from existing investors CBMM, BGF and Cambridge Enterprise Ventures.

Jean de La Verpilliere, CEO of Echion Technologies, said: “Our ambition is to deliver the best fast-charging batteries to unlock the electrification of heavy-duty vehicles. The investment from our partners Volta Energy Technologies, CBMM, BGF and Cambridge Enterprise Ventures cements our ambition to achieve full-scale commercialisation and full production volume.

“The entire Echion team has worked tirelessly to develop our flagship XNO® material into what it is today and this has enabled us to establish partnerships with many major OEMs and cell manufacturers which have recognised the benefits of our materials. I look forward to being able to satisfy their demand for innovative niobium-based anode materials, and to see industrial and commercial applications powered by XNO®.”

Dr Jeff Chamberlain, CEO and Founder of Volta Energy Technologies, said: “We are excited to lead Echion’s Series B and make Volta’s first investment in Europe. Echion and their XNO® technology complements our growing portfolio of technologies that address significant market needs through innovations in the supply chains of battery and energy storage technology. We believe the power of XNO® can uniquely improve performance, lower cost, and meet the demands of the growing, international markets across mining, logistics, railways, automotive and more.”

Rodrigo Barjas Amado, Managing Partner and Commercial Head of Battery Program at CBMM, said: “Having invested in Echion since 2021, we are pleased to see the progress that has been made through our partnership so far and we are proud to support bringing this ground-breaking, niobium-based technology to the market with our 2,000 tonne per year manufacturing capacity.”

Dennis Atkinson, Investor at BGF, said: “Echion is a world class UK based battery technology business. We are proud to have them as part of our climate and deeptech portfolio, and excited to support the team and their XNO® technology in electrifying and decarbonising heavy transport.”

"Echion is entering the market with their next generation battery material at scale. This is a great team that has combined great technology, great talent, great partners and great money for their go-to-market journey. We’re pleased to have been with Echion from the start and to continue our relationship into this B round with a terrific syndicate of co-investors,” says Chris Gibbs, Investment Director, Cambridge Enterprise Ventures.

Adapted from media release published by Cambridge Enterprise.

Cambridge spinout, Echion Technologies has raised £29 million in investment capital to help it increase the production of its fast-charging, long-life battery material based on niobium.

Echion Technologies senior management team (L-R: Dr Alex Groombridge, Chief Technology Officer, Dr Sarah Stevenson, Chief Operating Officer, Jean de La Verpilliere, Chief Executive Officer, Ceri Neal, Chief Financial Officer, and Benjamin Ting, Chief Comm


Discovery of ‘new rules of the immune system’ could improve treatment of inflammatory diseases, say scientists.

Dr James Dooley, a senior author of the study, in the laboratory.

The discovery that regulatory T cells form a single population that constantly moves through the body overturns the traditional thinking that they exist as multiple specialist populations restricted to specific parts of the body. The finding has implications for the treatment of many different diseases, because almost all diseases and injuries trigger the body’s immune system.

Current anti-inflammatory drugs treat the whole body, rather than just the part needing treatment. The researchers say their findings mean it could be possible to shut down the body’s immune response and repair damage in any specific part of the body, without affecting the rest of it. This means that higher, more targeted doses of drugs could be used to treat disease – potentially with rapid results.

“We’ve uncovered new rules of the immune system. This ‘unified healer army’ can do everything - repair injured muscle, make your fat cells respond better to insulin, regrow hair follicles.  To think that we could use it in such an enormous range of diseases is fantastic: it’s got the potential to be used for almost everything,” said Professor Adrian Liston in the University of Cambridge’s Department of Pathology, senior author of the paper.

To reach this discovery, the researchers analysed the regulatory T cells present in 48 different tissues in the bodies of mice. This revealed that the cells are not specialised or static, but move through the body to where they’re needed. The results are published today in the journal Immunity.

“It's difficult to think of a disease, injury or infection that doesn’t involve some kind of immune response, and our finding really changes the way we could control this response,” said Liston.

He added: “Now that we know these regulatory T cells are present everywhere in the body, in principle we can start to make immune suppression and tissue regeneration treatments that are targeted against a single organ – a vast improvement on current treatments that are like hitting the body with a sledgehammer.”

Using a drug they have already designed, the researchers have shown - in mice - that it’s possible to attract regulatory T cells to a specific part of the body, increase their number, and activate them to turn off the immune response and promote healing in just one organ or tissue.

“By boosting the number of regulatory T cells in targeted areas of the body, we can help the body do a better job of repairing itself, or managing immune responses,” said Liston.

He added: “There are so many different diseases where we’d like to shut down an immune response and start a repair response, for example autoimmune diseases like multiple sclerosis, and even many infectious diseases.”

Most symptoms of infections such as COVID are not from the virus itself, but from the body’s immune system attacking the virus. Once the virus is past its peak, regulatory T cells should switch off the body’s immune response, but in some people the process isn’t very efficient and can result in ongoing problems. The new finding means it could be possible to use a drug to shut down the immune response in the patient’s lungs, while letting the immune system in the rest of the body continue to function normally.

In another example, people who receive organ transplants must take immuno-suppressant drugs for the rest of their lives to prevent organ rejection, because the body mounts a severe immune response against the transplanted organ. But this makes them highly vulnerable to infections. The new finding helps the design of new drugs to shut down the body’s immune response against only the transplanted organ but keep the rest of the body working normally, enabling the patient to lead a normal life.

Most white blood cells attack infections in the body by triggering an immune response. In contrast, regulatory T cells act like a ‘unified healer army’ whose purpose is to shut down this immune response once it has done its job - and repair the tissue damage caused by it.

The researchers are now fundraising to set up a spin-out company, with the aim of running clinical trials to test their findings in humans within the next few years.

The research was funded by the European Research Council (ERC), Wellcome, and the Biotechnology and Biological Sciences Research Council (BBSRC).

Reference: Liston, A. ‘The tissue-resident regulatory T cell pool is shaped by transient multi-tissue migration and a conserved residency program.’ Immunity, June 2024. DOI: 10.1016/j.immuni.2024.05.023

Scientists at the University of Cambridge have discovered that a type of white blood cell - called a regulatory T cell - exists as a single large population of cells that constantly move throughout the body looking for, and repairing, damaged tissue.

In brief
  • A single large population of healer cells, called regulatory T cells, is whizzing around our body - not multiple specialist populations restricted to specific parts of the body as previously thought.
  • These cells shut down inflammation and repair the collateral damage to cells caused after our immune system has responded to injury or illness.
  • Tests, in mice, of a drug developed by the researchers showed that regulatory T cells can be attracted to specific body parts, boosted in number, and activated to suppress immune response and rebuild tissue.
  • Current anti-inflammatory drugs used for this purpose suppress the body’s whole immune system, making patients more vulnerable to infection.
  • The discovery could lead to more targeted treatments, with fewer side-effects, for issues from lengthy COVID infections to autoimmune diseases like multiple sclerosis. Clinical trials in humans are now planned.


Cambridge achievers recognised in King's Birthday Honours 2024


Professor Tony Kouzarides, Professor of Cancer Biology, Senior Group Leader at the Gurdon Institute and Director and Co-Founder of the Milner Institute, has been awarded a Knighthood for his services to Healthcare Innovation and Delivery. Professor Kouzarides said: “I am delighted to receive this honour, which reinforces the importance of translating basic research into therapies by engaging academic researchers with healthcare businesses.”

Professor Barbara Sahakian, Professor of Clinical Neuropsychology in the Department of Psychiatry and a Fellow at Clare Hall, receives a CBE. Professor Sahakian, who is known for her research aimed at understanding the neural basis of cognitive, emotional and behavioural dysfunction in order to develop more effective pharmacological and psychological treatments, is honoured for her services to Research in Human Cognitive Processes. Professor Sahakian said: “I am delighted to receive this prestigious award, which recognises my research on human cognitive processes in health, psychiatric disorders and neurological diseases. I am grateful to my PhD students, postdoctoral fellows and colleagues for their collaboration."

Professor Christine Holt, Professor of Developmental Neuroscience, receives a CBE for services to Neuroscience. Professor Holt said: "I'm surprised and thrilled to receive this honour. It's a marvellous recognition of the research that has involved a whole team of talented, dedicated and inspiring colleagues over many years."

Professor David Menon, founder of the Neurosciences Critical Care Unit (NCCU) at Addenbrooke’s Hospital, has been awarded a CBE. Professor Menon, who is noted for his national and global clinical and research leadership in traumatic brain injury, is honoured for his services to Neurocritical Care. He said: “I am deeply honoured to be nominated for a CBE and accept it on behalf of all those who have worked with me, during what has been – and continues to be – a very rewarding career.”

Professor Patrick Maxwell, Regius Professor of Physic and Head of School of Clinical Medicine, receives a CBE for services to Medical Research.

Professor Peter John Clarkson, Director of Cambridge Engineering Design Centre and Co-Director of Cambridge Public Health, receives a CBE. Professor Clarkson, who is known for his research in health and care improvement, inclusive design and systems design, is honoured for his services to Engineering and Design. Professor Clarkson said: “I am delighted to receive this honour and thank all those extraordinary people I have had the pleasure to work with over the years who have supported me in so many interesting and transformative projects.”

Alexandra Bolton, Director of the Climate Governance Initiative, is awarded an OBE for services to the Built and Natural Environment. Alexandra said: "This wonderful and humbling recognition makes me in turn recognise the talented people who, throughout my career, have selflessly given me support, guidance and advice. I am enormously grateful for the honour, and for all those who have helped me along the way."

Professor Anne-Christine Davis, Professor of Mathematical Physics, receives an OBE for services to Higher Education and to Scientific Research. Professor Davis said: "I am amazed and overwhelmed to receive this honour. I could not have done it alone and wish to thank my wonderful students and collaborators over the years. I would like to dedicate this honour to those women in STEMM who came before me and did not receive the recognition they deserved."

Professor Shruti Kapila, Professor of History and Politics, receives an OBE for services to Research in Humanities.

Paul Fannon, Lecturer in Machine Learning in the Department of Genetics, Director of Studies at Jesus College and Fellow at Christ's College, receives an OBE for services to Education.

Details of University alumni who are recognised in the King's Birthday Honours will be published on www.alumni.cam.ac.uk.

The University extends its congratulations to all academics, staff and alumni who have received an honour.

Academics and staff from the University of Cambridge are featured in the King's Birthday Honours 2024, which recognises the achievements and service of people across the UK, from all walks of life.


Video analysis of Iceland 2010 eruption could improve volcanic ash forecasts for aviation safety

Eruption at Eyjafjallajökull April 17, 2010.

When Eyjafjallajökull erupted in 2010, it ejected roughly 250 million tonnes of volcanic ash into the atmosphere, much of which was blown over Europe and into flight paths. With planes grounded, millions of air passengers were left stranded.

Forecasts of how ash will spread in the aftermath of an explosive eruption can help reduce impacts to aviation by informing decisions to shut down areas of airspace. But these forecasts require knowledge of what is happening at the volcano, information that often can’t be obtained directly and must instead be estimated.

In the new study, the researchers split a 17-minute film into time segments to understand how the Eyjafjallajökull ash cloud grew upwards and outwards as the eruption progressed.

“No one has previously observed the shape and speed of wind-blown ash clouds directly,” said Professor Andy Woods, lead author of the study from Cambridge’s Department of Earth Sciences and Institute for Energy and Environmental Flows. Their new video analysis method was reported in Communications Earth & Environment.

By comparing characteristics of the ash cloud, such as its shape and speed, at time intervals through the video, the researchers were able to calculate the amount of ash spewed from the volcano.

That rate of ash flow, called eruption rate, is an important metric for forecasting ash cloud extent, said Woods. “The eruption rate determines how much ash goes up into the atmosphere, how high the ash cloud will go, how long the plume will stay buoyant, how quickly the ash will start falling to the ground and the area over which ash will land.”

Generally, the higher the ash plume, the wider the ash will be dispersed, and the smaller the ash particles are, the longer they stay buoyant. This dispersal can also depend on weather conditions, particularly the wind direction.
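For a sense of why plume height is such a powerful constraint, classical buoyant-plume theory (an illustrative scaling relation, not a formula quoted from this study) links the rise height of a plume in a stratified atmosphere to the flux at its source roughly as:

```latex
% Illustrative scaling from classical plume theory, not the relation fitted in this study.
\[
  H \;\propto\; Q^{1/4}
  \qquad\text{equivalently}\qquad
  Q \;\propto\; H^{4},
\]
% where H is the height the plume rises to and Q is the source (eruption) flux.
```

Because of the fourth-power dependence, even a modest change in the observed height of the ash cloud implies a large change in the eruption rate, which is why tracking the cloud frame by frame through the footage is so informative.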

Volcanoes across the world are increasingly monitored via video, using webcams or high-resolution cameras. Woods thinks that, if high frame rate video observations can be accessed during an eruption, then this real-time information could be fed into ash cloud forecasts that more realistically reflect changing eruption conditions.

During the 17-minute footage of the Eyjafjallajökull eruption, the researchers observed that the eruption rate dropped by about half. “It’s amazing that you can learn eruption rate from a video, that’s something that we’ve previously only been able to calculate after an eruption has happened,” said Woods. “It’s important to know the changing eruption rate because that could impact the ash cloud dispersal downwind.”

It’s usually challenging for volcanologists to take continuous measurements of ash clouds whilst an eruption is happening. "Instead, much of our understanding of how ash clouds spread in the atmosphere is based on scaled-down lab models,” said Dr Nicola Mingotti, a researcher in Woods’ group and co-author of this study. These experiments are performed in water tanks, by releasing particle-laden or dyed saline solutions and analysing footage of the plume as it dissipates.

Woods and his collaborators have been running lab experiments like these for several years, most recently trying to understand how eruption plumes are dragged along by the wind. But it’s a big bonus to have video measurements from a real eruption, said Woods, and the real observations agree closely with what they’ve been observing in the lab. “Demonstrating our lab experiments are realistic is really important, both for making sure we understand how ash plumes work and that we forecast their movements effectively.”

Reference:
Mingotti, N. and Woods, A.W. ‘Video-based measurements of the entrainment, speed and mass flux in a wind-blown eruption column.’ Communications Earth & Environment (2024). DOI: 10.1038/s43247-024-01402-x


Video footage of Iceland’s 2010 Eyjafjallajökull eruption is providing researchers from the University of Cambridge with rare, up-close observations of volcanic ash clouds — information that could help better forecast how far explosive eruptions disperse their hazardous ash particles.



What’s going on in our brains when we plan?

Digitally generated image of a young man

In pausing to think before making an important decision, we may imagine the potential outcomes of different choices we could make. While this ‘mental simulation’ is central to how we plan and make decisions in everyday life, how the brain works to accomplish this is not well understood. 

An international team of scientists has now uncovered neural mechanisms used in planning. Their results, published in the journal Nature Neuroscience, suggest that an interplay between the brain’s prefrontal cortex and hippocampus allows us to imagine future outcomes to guide our decisions.

“The prefrontal cortex acts as a ‘simulator,’ mentally testing out possible actions using a cognitive map stored in the hippocampus,” said co-author Marcelo Mattar from New York University. “This research sheds light on the neural and cognitive mechanisms of planning—a core component of human and animal intelligence. A deeper understanding of these brain mechanisms could ultimately improve the treatment of disorders affecting decision-making abilities.”

The roles of both the prefrontal cortex—used in planning and decision-making—and the hippocampus—used in memory formation and storage—have long been established. However, their specific duties in deliberative decision-making, the kind of decision that requires us to think before acting, are less clear.

To illuminate the neural mechanisms of planning, Mattar and his colleagues—Kristopher Jensen from University College London and Professor Guillaume Hennequin from Cambridge’s Department of Engineering—developed a computational model to predict brain activity during planning. They then analysed data from both humans and rats to confirm the validity of the model—a recurrent neural network (RNN), which learns patterns based on incoming information.

The model took into account existing knowledge of planning and added new layers of complexity, including ‘imagined actions,’ thereby capturing how decision-making involves weighing the impact of potential choices—similar to how a chess player envisions sequences of moves before committing to one. These mental simulations of potential futures, modelled as interactions between the prefrontal cortex and hippocampus, enable us to rapidly adapt to new environments, such as taking a detour after finding a road is blocked.
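The sketch below gives a flavour of this idea in code. It is purely illustrative and is not the researchers’ recurrent network: the maze layout, rollout depth and scoring rule are invented for the example, and the hand-written step function stands in for the learned cognitive map.

```python
# Toy sketch (not the authors' model): an agent that 'imagines' candidate actions
# with an internal map of the maze before committing to a move, loosely analogous
# to a prefrontal 'simulator' querying a hippocampal cognitive map.
import random

# Hypothetical 4x4 maze: WALLS lists blocked transitions between adjacent cells.
WALLS = {((1, 0), (1, 1)), ((2, 2), (3, 2)), ((0, 2), (0, 3))}
GOAL = (3, 3)
ACTIONS = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}

def step(pos, action):
    """Internal model ('cognitive map'): predict the next position for an action."""
    dx, dy = ACTIONS[action]
    nxt = (pos[0] + dx, pos[1] + dy)
    if not (0 <= nxt[0] < 4 and 0 <= nxt[1] < 4):
        return pos                                  # off the grid: stay put
    if (pos, nxt) in WALLS or (nxt, pos) in WALLS:
        return pos                                  # blocked by a wall: stay put
    return nxt

def imagine(pos, first_action, depth=8):
    """Mentally roll out one candidate first move, then wander; score what remains."""
    pos = step(pos, first_action)
    for _ in range(depth):
        if pos == GOAL:
            break
        options = [a for a in ACTIONS if step(pos, a) != pos]   # imagined follow-up moves
        pos = step(pos, random.choice(options))
    return abs(GOAL[0] - pos[0]) + abs(GOAL[1] - pos[1])        # distance left to the goal

def plan_and_act(pos, n_rollouts=20):
    """Average several imagined rollouts per candidate move and return the best move."""
    scores = {a: sum(imagine(pos, a) for _ in range(n_rollouts)) / n_rollouts
              for a in ACTIONS}
    return min(scores, key=scores.get)

if __name__ == "__main__":
    random.seed(0)
    pos, path = (0, 0), [(0, 0)]
    for _ in range(20):
        if pos == GOAL:
            break
        pos = step(pos, plan_and_act(pos))
        path.append(pos)
    print("path taken:", path)
```

In the study itself, the recurrent network learned for itself when to pause and simulate, rather than following a hand-coded rule like the one above.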

The scientists validated this computational model using both behavioural and neural data. To assess the model’s ability to predict behaviour, the scientists conducted an experiment measuring how humans navigated an online maze on a computer screen and how long they had to think before each step.

To validate the model’s predictions about the role of the hippocampus in planning, they analysed neural recordings from rodents navigating a physical maze configured in the same way as in the human experiment. By giving a similar task to humans and rats, the researchers could draw parallels between the behavioural and neural data—an innovative aspect of this research.

“Allowing neural networks to decide for themselves when to 'pause and think' was a great idea, and it was surprising to see that in situations where humans spend time pondering what to do next, so do these neural networks,” said Hennequin. 

The experimental results were consistent with the computational model, showing an intricate interaction between the prefrontal cortex and hippocampus. In the human experiments, the amount of time participants spent thinking before acting while navigating the maze mirrored the model’s behaviour. In the experiments with laboratory rats, the animals’ neural responses as they moved through the maze resembled the model’s simulations.

“Overall, this work provides foundational knowledge on how these brain circuits enable us to think before we act in order to make better decisions,” said Mattar. “In addition, a method in which both human and animal experimental participants and RNNs were all trained to perform the same task offers an innovative and foundational way to gain insights into behaviours.”

“This new framework will enable systematic studies of thinking at the neural level,” said Hennequin. “This will require a concerted effort from neurophysiologists and theorists, and I'm excited about the discoveries that lie ahead.” 

Reference:
Kristopher T. Jensen, Guillaume Hennequin & Marcelo G. Mattar. ‘A recurrent network model of planning explains hippocampal replay and human behavior.’ Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01675-7

Adapted from an NYU press release.

Study uncovers how the brain simulates possible future actions by drawing from our stored memories.



Earliest detection of metal challenges what we know about the first galaxies

Deep field image from JWST

Using the James Webb Space Telescope (JWST), an international team of astronomers led by the University of Cambridge observed a very young galaxy in the early universe and found that it contained surprising amounts of carbon, one of the seeds of life as we know it.

In astronomy, elements heavier than hydrogen or helium are classed as metals. The very early universe was almost entirely made up of hydrogen, the simplest of the elements, with small amounts of helium and tiny amounts of lithium.

Every other element that makes up the universe we observe today was formed inside a star. When stars explode as supernovas, the elements they produce are circulated throughout their host galaxy, seeding the next generation of stars. With every new generation of stars and ‘stardust’, more metals are formed, and after billions of years, the universe evolves to a point where it can support rocky planets like Earth and life like us.

The ability to trace the origin and evolution of metals will help us understand how we went from a universe made almost entirely of just two chemical elements, to the incredible complexity we see today.

“The very first stars are the holy grail of chemical evolution,” said lead author Dr Francesco D’Eugenio, from the Kavli Institute for Cosmology at Cambridge. “Since they are made only of primordial elements, they behave very differently to modern stars. By studying how and when the first metals formed inside stars, we can set a time frame for the earliest steps on the path that led to the formation of life.”

Carbon is a fundamental element in the evolution of the universe, since it can form into grains of dust that clump together, eventually forming into the first planetesimals and the earliest planets. Carbon is also key for the formation of life on Earth.

“Earlier research suggested that carbon started to form in large quantities relatively late – about one billion years after the Big Bang,” said co-author Professor Roberto Maiolino, also from the Kavli Institute. “But we’ve found that carbon formed much earlier – it might even be the oldest metal of all.”

The team used the JWST to observe a very distant galaxy – one of the most distant galaxies yet observed – just 350 million years after the Big Bang, more than 13 billion years ago. This galaxy is compact and low mass – about 100,000 times less massive than the Milky Way.

“It’s just an embryo of a galaxy when we observe it, but it could evolve into something quite big, about the size of the Milky Way,” said D’Eugenio. “But for such a young galaxy, it’s fairly massive.”

The researchers used Webb’s Near Infrared Spectrograph (NIRSpec) to break down the light coming from the young galaxy into a spectrum of colours. Different elements leave different chemical fingerprints in the galaxy’s spectrum, allowing the team to determine its chemical composition. Analysis of this spectrum showed a confident detection of carbon, and tentative detections of oxygen and neon, although further observations will be required to confirm the presence of these other elements.
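As a rough worked example of how such a fingerprint lands in the near-infrared (the numbers are illustrative assumptions rather than values quoted from the paper): a galaxy seen about 350 million years after the Big Bang sits at a redshift of roughly z ≈ 12.5, and cosmological redshift stretches every wavelength by a factor of (1 + z), so a rest-frame ultraviolet carbon line near 1909 Å would appear at

```latex
\[
  \lambda_{\mathrm{obs}} = (1+z)\,\lambda_{\mathrm{rest}}
  \approx 13.5 \times 0.191\,\mu\mathrm{m}
  \approx 2.6\,\mu\mathrm{m},
\]
```

comfortably within NIRSpec’s wavelength range of roughly 0.6–5.3 microns.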

“We were surprised to see carbon so early in the universe, since it was thought that the earliest stars produced much more oxygen than carbon,” said Maiolino. “We had thought that carbon was enriched much later, through entirely different processes, but the fact that it appears so early tells us that the very first stars may have operated very differently.” 

According to some models, when the earliest stars exploded as supernovas, they may have released less energy than initially expected. In this case, carbon, which was in the stars’ outer shell and was less gravitationally bound than oxygen, could have escaped more easily and spread throughout the galaxy, while a large amount of oxygen fell back and collapsed into a black hole.

“These observations tell us that carbon can be enriched quickly in the early universe,” said D’Eugenio. “And because carbon is fundamental to life as we know it, it’s not necessarily true that life must have evolved much later in the universe. Perhaps life emerged much earlier – although if there’s life elsewhere in the universe, it might have evolved very differently than it did here on Earth.”

The results have been accepted for publication in the journal Astronomy & Astrophysics and are based on data obtained within the JWST Advanced Deep Extragalactic Survey (JADES).

The research was supported in part by the European Research Council, the Royal Society, and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).


Reference:
Francesco D’Eugenio et al. ‘JADES: Carbon enrichment 350 Myr after the Big Bang.’ Astronomy & Astrophysics (in press). DOI: 10.48550/arXiv.2311.09908

Astronomers have detected carbon in a galaxy just 350 million years after the Big Bang, the earliest detection of any element in the universe other than hydrogen.



New instrument to search for signs of life on other planets

Artist's impression of the ANDES instrument

The ANDES instrument will be installed on ESO’s Extremely Large Telescope (ELT), currently under construction in Chile’s Atacama Desert. It will be used to search for signs of life in exoplanets and look for the very first stars. It will also test variations of the fundamental constants of physics and measure the acceleration of the Universe’s expansion.

The University of Cambridge is a member institution on the project, which involves scientists from 13 countries. Professor Roberto Maiolino, from Cambridge’s Cavendish Laboratory and Kavli Institute for Cosmology, is ANDES Project Scientist.  

Formerly known as HIRES, ANDES is a powerful spectrograph, an instrument which splits light into its component wavelengths so astronomers can determine properties of astronomical objects, such as their chemical compositions. The instrument will have a record-high wavelength precision in the visible and near-infrared regions of light and, when working in combination with the powerful mirror system of the ELT, it will pave the way for research spanning multiple areas of astronomy.

“ANDES is an instrument with an enormous potential for groundbreaking scientific discoveries, which can deeply affect our perception of the Universe far beyond the small community of scientists,” said Alessandro Marconi, ANDES Principal Investigator.

ANDES will conduct detailed surveys of the atmospheres of Earth-like exoplanets, allowing astronomers to search extensively for signs of life. It will also be able to analyse chemical elements in faraway objects in the early Universe, making it likely to be the first instrument capable of detecting signatures of Population III stars, the earliest stars born in the Universe.

In addition, astronomers will be able to use ANDES’ data to test if the fundamental constants of physics vary with time and space. Its comprehensive data will also be used to directly measure the acceleration of the Universe’s expansion, one of the most pressing mysteries about the cosmos.

When operations start later this decade, the ELT will be the world’s biggest eye on the sky, marking a new age in ground-based astronomy.

Adapted from an ESO press release

The European Southern Observatory (ESO) has signed an agreement for the design and construction of ANDES, the ArmazoNes high Dispersion Echelle Spectrograph.



Electrified charcoal ‘sponge’ can soak up CO2 directly from the air

Sample of activated charcoal used to capture carbon dioxide

Researchers from the University of Cambridge used a method similar to charging a battery to instead charge activated charcoal, which is often used in household water filters.

By charging the charcoal ‘sponge’ with ions that form reversible bonds with CO2, the researchers found the charged material could successfully capture CO2 directly from the air.

The charged charcoal sponge is also potentially more energy efficient than current carbon capture approaches, since it requires much lower temperatures to remove the captured CO2 so it can be stored. The results are reported in the journal Nature.

“Capturing carbon emissions from the atmosphere is a last resort, but given the scale of the climate emergency, it’s something we need to investigate,” said Dr Alexander Forse from the Yusuf Hamied Department of Chemistry, who led the research. “The first and most urgent thing we’ve got to do is reduce carbon emissions worldwide, but greenhouse gas removal is also thought to be necessary to achieve net zero emissions and limit the worst effects of climate change. Realistically, we’ve got to do everything we can.”

Direct air capture, which uses sponge-like materials to remove carbon dioxide from the atmosphere, is one potential approach for carbon capture, but current approaches are expensive, require high temperatures and the use of natural gas, and lack stability.

“Some promising work has been done on using porous materials for carbon capture from the atmosphere,” said Forse. “We wanted to see if activated charcoal might be an option, since it’s cheap, stable and made at scale.”

Activated charcoal is used in many purification applications, such as water filters, but normally it can’t capture and hold CO2 from the air. Forse and his colleagues proposed that if activated charcoal could be charged, like a battery, it could be a suitable material for carbon capture.

When charging a battery, charged ions are inserted into one of the battery’s electrodes. The researchers hypothesised that charging activated charcoal with chemical compounds called hydroxides would make it suitable for carbon capture, since hydroxides form reversible bonds with CO2.

The team used a battery-like charging process to charge an inexpensive activated charcoal cloth with hydroxide ions. In this process, the cloth essentially acts like an electrode in a battery, and hydroxide ions accumulate in the tiny pores of the charcoal. At the end of the charging process, the charcoal is removed from the ‘battery’, washed and dried.

Tests of the charged charcoal sponge showed that it could successfully capture CO2 directly from the air, thanks to the bonding mechanism of the hydroxides.
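In outline, and simplifying the chemistry reported in the paper, the capture step can be thought of as the familiar reversible reaction between carbon dioxide and hydroxide to form bicarbonate:

```latex
\[
  \mathrm{CO_2 \;+\; OH^- \;\rightleftharpoons\; HCO_3^-}
\]
```

Heating drives the equilibrium back to the left, releasing the CO2 for collection and regenerating the hydroxide-loaded charcoal, which is what makes the low regeneration temperatures described below possible.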

“It’s a new way to make materials, using a battery-like process,” said Forse. “And the rates of CO2 capture are already comparable to incumbent materials. But what’s even more promising is this method could be far less energy-intensive, since we don’t require high temperatures to collect the CO2 and regenerate the charcoal sponge.”

To collect the CO2 from the charcoal so it can be purified and stored, the material is heated to reverse the hydroxide-CO2 bonds. In most materials currently used for CO2 capture from air, the materials need to be heated to temperatures as high as 900°C, often using natural gas. However, the charged charcoal sponges developed by the Cambridge team only require heating to 90-100°C, temperatures that can be achieved using renewable electricity. The materials are heated through resistive heating, which essentially heats them from the inside out, making the process faster and less energy-intensive.

The materials do, however, have limitations that the researchers are now working on. “We are working now to increase the quantity of carbon dioxide that can be captured, and in particular under humid conditions where our performance decreases,” said Forse.

The researchers say their approach could be useful in fields beyond carbon capture, since the pores in the charcoal and the ions inserted into them can be fine-tuned to capture a range of molecules.

“This approach was a kind of crazy idea we came up with during the Covid-19 lockdowns, so it’s always exciting when these ideas actually work,” said Forse. “This approach opens a door to making all kinds of materials for different applications, in a way that’s simple and energy-efficient.”

A patent has been filed and the research is being commercialised with the support of Cambridge Enterprise, the University’s commercialisation arm.

The research was supported in part by the Leverhulme Trust, the Royal Society, the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI), and the Centre for Climate Repair at Cambridge.


Reference:
Huaiguang Li et al. ‘Capturing carbon dioxide from air with charged sorbents.’ Nature (2024). DOI: 10.1038/s41586-024-07449-2

Researchers have developed a low-cost, energy-efficient method for making materials that can capture carbon dioxide directly from the air.



‘Missing’ sea sponges discovered

Heliocolocellus fossil

At first glance, the simple, spiky sea sponge is no creature of mystery.

No brain. No gut. No problem dating them back 700 million years. Yet convincing sponge fossils only go back about 540 million years, leaving a 160-million-year gap in the fossil record.

In a paper published in the journal Nature, an international team including researchers from the University of Cambridge has reported a 550-million-year-old sea sponge from the “lost years” and proposed that the earliest sea sponges had not yet developed mineral skeletons, offering new parameters for the search for the missing fossils.

The mystery of the missing sea sponges centred on a paradox.

Molecular clock estimates, which involve measuring the number of genetic mutations that accumulate within the Tree of Life over time, indicate that sponges must have evolved about 700 million years ago. And yet, there had been no convincing sponge fossils found in rocks that old.
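To illustrate the logic with invented numbers (real molecular-clock analyses use calibrated and far more sophisticated models): if two lineages differ at a fraction d of comparable sites, and mutations accumulate at a rate μ per site per million years along each lineage, then the time since they diverged is roughly

```latex
\[
  t \;\approx\; \frac{d}{2\mu},
  \qquad\text{e.g.}\quad
  d = 0.014,\;\; \mu = 10^{-5}\ \mathrm{per\ site\ per\ Myr}
  \;\;\Rightarrow\;\;
  t \approx 700\ \mathrm{Myr}.
\]
```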

For years, this conundrum was the subject of debate among zoologists and palaeontologists.

This latest discovery fills in the evolutionary family tree of one of the earliest animals, connecting the dots all the way back to Darwin’s questions about when the first animals evolved and explaining their apparent absence in older rocks.

Shuhai Xiao from Virginia Tech, who led the research, first laid eyes on the fossil five years ago when a collaborator texted him a picture of a specimen excavated along the Yangtze River in China. “I had never seen anything like it before,” he said. “Almost immediately, I realised that it was something new.”

The researchers began ruling out possibilities one by one: not a sea squirt, not a sea anemone, not a coral. They wondered, could it be an elusive ancient sea sponge?

In an earlier study published in 2019, Xiao and his team suggested that early sponges left no fossils because they had not evolved the ability to generate the hard needle-like structures, known as spicules, that characterise sea sponges today.

The team traced sponge evolution through the fossil record. As they went further back in time, sponge spicules became increasingly organic in composition, and less mineralised.

“If you extrapolate back, then perhaps the first ones were soft-bodied creatures with entirely organic skeletons and no minerals at all,” said Xiao. “If this was true, they wouldn’t survive fossilisation except under very special circumstances where rapid fossilisation outcompeted degradation.”

Later in 2019, Xiao’s group found a sponge fossil preserved in just such a circumstance: a thin bed of marine carbonate rocks known to preserve abundant soft-bodied animals, including some of the earliest mobile animals. Most often this type of fossil would be lost to the fossil record. The new finding offers a window into early animals before they developed hard parts.

The surface of the new sponge fossil is studded with an intricate array of regular boxes, each divided into smaller, identical boxes.

“This specific pattern suggests our fossilised sea sponge is most closely related to a certain species of glass sponges,” said first author Dr Xiaopeng Wang, from Cambridge’s Department of Earth Sciences and the Nanjing Institute of Geology and Palaeontology.

Another unexpected aspect of the new sponge fossil is its size.

“When searching for fossils of early sponges I had expected them to be very small,” said co-author Alex Liu from Cambridge’s Department of Earth Sciences. “The new fossil can reach over 40 centimetres long, and has a relatively complex conical body plan, challenging many of our expectations for the appearance of early sponges”.

While the fossil fills in some of the missing years, it also provides researchers with important guidance about what they should look for, which will hopefully extend understanding of early animal evolution further back in time.

“The discovery indicates that perhaps the first sponges were spongey but not glassy,” said Xiao. “We now know that we need to broaden our view when looking for early sponges.”

Reference:
Xiaopeng Wang et al. ‘A late-Ediacaran crown-group sponge animal.’ Nature (2024). DOI: 10.1038/s41586-024-07520-y

Adapted from a Virginia Tech press release.

The discovery, published in Nature, opens a new window on early animal evolution.



Genetics study points to potential treatments for restless leg syndrome

Leg sticking out from a white blanket

Restless leg syndrome can cause an unpleasant crawling sensation in the legs and an overwhelming urge to move them. Some people experience the symptoms only occasionally, while others get symptoms every day. Symptoms are usually worse in the evening or at night-time and can severely impair sleep.

Despite the condition being relatively common – up to one in 10 older adults experience symptoms, while 2-3% are severely affected and seek medical help – little is known about its causes. People with restless leg syndrome often have other conditions, such as depression or anxiety, cardiovascular disorders, hypertension, and diabetes, but the reason why is not known.

Previous studies had identified 22 genetic risk loci – that is, regions of our genome that contain changes associated with increased risk of developing the condition. But there are still no known ‘biomarkers’ – such as genetic signatures – that could be used to objectively diagnose the condition.

To explore the condition further, an international team led by researchers at the Helmholtz Munich Institute of Neurogenomics, Institute of Human Genetics of the Technical University of Munich (TUM) and the University of Cambridge pooled and analysed data from three genome-wide association studies. These studies compared the DNA of patients and healthy controls to look for differences more commonly found in those with restless leg syndrome. By combining the data, the team was able to create a powerful dataset with more than 100,000 patients and over 1.5 million unaffected controls.
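The standard way to pool studies like this is an inverse-variance-weighted (fixed-effects) meta-analysis, in which each study’s effect estimate for a variant is weighted by its precision. The sketch below is illustrative only: the cohort names and numbers are invented, and the published analysis will have used dedicated GWAS software rather than a few lines of Python.

```python
# Inverse-variance-weighted (fixed-effects) meta-analysis for one genetic variant.
# Illustrative sketch: cohort names, effect sizes and standard errors are invented.
import math

studies = [                      # (name, effect size beta, standard error)
    ("cohort_A", 0.12, 0.03),
    ("cohort_B", 0.10, 0.02),
    ("cohort_C", 0.15, 0.05),
]

weights = [1.0 / se ** 2 for _, _, se in studies]     # weight = precision of each estimate
beta_meta = sum(w * beta for (_, beta, _), w in zip(studies, weights)) / sum(weights)
se_meta = math.sqrt(1.0 / sum(weights))               # standard error of the pooled estimate
z_score = beta_meta / se_meta                         # test statistic for association

print(f"pooled beta = {beta_meta:.3f}, SE = {se_meta:.3f}, z = {z_score:.1f}")
```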

The results of the study are published today in Nature Genetics.

Co-author Dr Steven Bell from the University of Cambridge said: “This study is the largest of its kind into this common – but poorly understood – condition. By understanding the genetic basis of restless leg syndrome, we hope to find better ways to manage and treat it, potentially improving the lives of many millions of people affected worldwide.”

The team identified over 140 new genetic risk loci, increasing the number known eight-fold to 164, including three on the X chromosome. The researchers found no strong genetic differences between men and women, despite the condition being twice as common in women as it is in men – this suggests that a complex interaction of genetics and the environment (including hormones) may explain the gender differences we observe in real life.

Two of the genetic differences identified by the team involve genes known as glutamate receptors 1 and 4 respectively, which are important for nerve and brain function. These could potentially be targeted by existing drugs, such as anticonvulsants like perampanel and lamotrigine, or used to develop new drugs. Early trials have already shown positive responses to these drugs in patients with restless leg syndrome.

The researchers say that basic information such as age, sex and genetic markers could be used to correctly identify which of two people is more likely to have severe restless leg syndrome in nine cases out of ten.
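‘Nine cases out of ten’ is essentially a concordance statistic: take one severely affected and one unaffected person at random and ask how often the risk model gives the affected person the higher score. A minimal sketch, using invented scores rather than data from the study:

```python
# Concordance (C-statistic / AUC) of a risk score: the fraction of affected-unaffected
# pairs in which the affected person receives the higher score. All numbers are invented.
from itertools import product

affected   = [0.91, 0.74, 0.66, 0.85]        # hypothetical risk scores, severe cases
unaffected = [0.20, 0.45, 0.70, 0.15, 0.33]  # hypothetical risk scores, controls

pairs = list(product(affected, unaffected))
concordant = sum(1.0 if a > u else 0.5 if a == u else 0.0 for a, u in pairs)
c_statistic = concordant / len(pairs)

print(f"C-statistic = {c_statistic:.2f}")    # 0.9 would mean 9 in 10 pairs ranked correctly
```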

To understand how restless leg syndrome might affect overall health, the researchers used a technique called Mendelian randomisation. This uses genetic information to examine cause-and-effect relationships. It revealed that the syndrome increases the risk of developing diabetes.
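In its simplest form, Mendelian randomisation treats genetic variants as natural ‘instruments’: because they are assigned at conception, they are largely free of the confounding that affects observational studies. The ratio estimator below is the textbook version, shown only for intuition; the published analysis will rest on more robust multi-variant methods.

```latex
\[
  \hat{\beta}_{\mathrm{MR}}
  \;=\;
  \frac{\hat{\beta}_{\text{variant}\rightarrow\text{diabetes}}}
       {\hat{\beta}_{\text{variant}\rightarrow\text{restless\ legs}}},
\]
```

with estimates from many independent variants combined, for example by inverse-variance weighting, to test whether genetic liability to restless leg syndrome shifts diabetes risk.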

Although low levels of iron in the blood are thought to trigger restless leg syndrome – because they can lead to a fall in the neurotransmitter dopamine – the researchers did not find strong genetic links to iron metabolism. However, they say they cannot completely rule it out as a risk factor.

Professor Juliane Winkelmann from TUM, one of the senior authors of the study, said: “For the first time, we have achieved the ability to predict restless leg syndrome risk. It has been a long journey, but now we are empowered to not only treat but even prevent the onset of this condition in our patients.”

Professor Emanuele Di Angelantonio, a co-author of the study and Director of the NIHR and NHS Blood and Transplant-funded Research Unit in Blood Donor Health and Behaviour, added: "Given that low iron levels are thought to trigger restless leg syndrome, we were surprised to find no strong genetic links to iron metabolism in our study. It may be that the relationship is more complex than we initially thought, and further work is required."

The dataset included the INTERVAL study of England’s blood donors in collaboration with NHS Blood and Transplant.

A full list of funders can be found in the study paper.

Reference
Schormair et al. Genome-wide meta-analyses of restless legs syndrome yield insights into genetic architecture, disease biology, and risk prediction. Nature Genetics; 5 June 2024; DOI: 10.1038/s41588-024-01763-1

Scientists have discovered genetic clues to the cause of restless leg syndrome, a condition common among older adults. The discovery could help identify those individuals at greatest risk of the condition and point to potential ways to treat it.



Exercising during pregnancy normalises eating behaviours in offspring from obese mice

Fast food meal of burger and fries

Previous studies in both humans and animal models have shown that the offspring of mothers living with obesity have a higher risk of developing obesity and type 2 diabetes themselves when they grow up. While this relationship is likely to be the result of a complex interplay between genetics and environment, emerging evidence suggests that maternal obesity in pregnancy can disrupt the baby’s hypothalamus—the region of the brain responsible for controlling food intake and energy regulation.

In animal models, offspring exposed to overnutrition during key periods of development eat more when they grow up, but little is known about the molecular mechanisms that lead to these changes in eating behaviour.

In a study published today in PLOS Biology, researchers from the Institute of Metabolic Science and the MRC Metabolic Diseases Unit at the University of Cambridge found that mice born to obese mothers had higher levels of the microRNA miR-505-5p in their hypothalamus—from as early as the fetal stage into adulthood. The offspring of obese mothers specifically chose to eat more of foods that were high in fat, consistent with fat sensing being disrupted in the hypothalamus.

Dr Laura Dearden from the Institute of Metabolic Science, the study’s first author, said: “Our results show that obesity during pregnancy causes changes to the baby's brain that makes them eat more high fat food in adulthood and more likely to develop obesity.”

Senior author Professor Susan Ozanne from the MRC Metabolic Diseases Unit and Institute of Metabolic Science said: “Importantly, we showed that moderate exercise, without weight loss, during pregnancies complicated by obesity prevented the changes to the baby's brain.”

Cell culture experiments showed that miR-505-5p levels can be influenced by exposing hypothalamic neurons to long-chain fatty acids and insulin, both of which are elevated in pregnancies complicated by obesity. The researchers identified miR-505-5p as a regulator of pathways involved in fatty acid uptake and metabolism – high levels of the miRNA leave the offspring’s brain unable to sense when it is eating high-fat foods. Several of the genes that miR-505-5p regulates are associated with high body mass index in human genetic studies, suggesting that the same changes may contribute to obesity in humans.

The study is one of the first to demonstrate the molecular mechanisms linking nutritional exposure in utero to eating behaviour. 

Dr Dearden added: “While our work was only carried out in mice, it may help us understand why the children of mothers living with obesity are more likely to become obese themselves, with early life exposures, genetics and current environment all being contributing factors.”

Reference
Dearden, L et al. Maternal obesity increases hypothalamic miR-505-5p expression in mouse offspring leading to altered fatty acid sensing and increased intake of high-fat food. PLOS Biology; 4 Jun 2024; DOI: 10.1371/journal.pbio.3002641

Adapted from a press release by PLOS Biology

Maternal obesity in pregnancy changes the eating behaviours of offspring by increasing long-term levels of particular molecules known as microRNAs in the part of the brain that controls appetite ­– but this can be changed by exercise during pregnancy, a study in obese mice has suggested.



New open-source platform allows users to evaluate performance of AI-powered chatbots

Chatbot

A team of computer scientists, engineers, mathematicians and cognitive scientists, led by the University of Cambridge, developed an open-source evaluation platform called CheckMate, which allows human users to interact with and evaluate the performance of large language models (LLMs).

The researchers tested CheckMate in an experiment where human participants used three LLMs – InstructGPT, ChatGPT and GPT-4 – as assistants for solving undergraduate-level mathematics problems.

The team studied how well LLMs can assist participants in solving these problems. Despite a generally positive correlation between a chatbot’s correctness and its perceived helpfulness, the researchers also found instances where the LLMs were incorrect but still useful to participants. Conversely, some incorrect LLM outputs were judged correct by participants, most notably with LLMs optimised for chat.

The researchers suggest that models which communicate uncertainty, respond well to user corrections, and can provide a concise rationale for their recommendations make better assistants. Given the models’ current shortcomings, human users should verify LLM outputs carefully.

The results, reported in the Proceedings of the National Academy of Sciences (PNAS), could be useful in both informing AI literacy training, and help developers improve LLMs for a wider range of uses.

While LLMs are becoming increasingly powerful, they can also make mistakes and provide incorrect information, which could have negative consequences as these systems become more integrated into our everyday lives.

“LLMs have become wildly popular, and evaluating their performance in a quantitative way is important, but we also need to evaluate how well these systems work with and can support people,” said co-first author Albert Jiang, from Cambridge’s Department of Computer Science and Technology. “We don’t yet have comprehensive ways of evaluating an LLM’s performance when interacting with humans.”

The standard way to evaluate LLMs relies on static pairs of inputs and outputs, which disregards the interactive nature of chatbots, and how that changes their usefulness in different scenarios. The researchers developed CheckMate to help answer these questions, designed for but not limited to applications in mathematics.

“When talking to mathematicians about LLMs, many of them fall into one of two main camps: either they think that LLMs can produce complex mathematical proofs on their own, or that LLMs are incapable of simple arithmetic,” said co-first author Katie Collins from the Department of Engineering. “Of course, the truth is probably somewhere in between, but we wanted to find a way of evaluating which tasks LLMs are suitable for and which they aren’t.”

The researchers recruited 25 mathematicians, from undergraduate students to senior professors, to interact with three different LLMs (InstructGPT, ChatGPT, and GPT-4) and evaluate their performance using CheckMate. Participants worked through undergraduate-level mathematical theorems with the assistance of an LLM and were asked to rate each individual LLM response for correctness and helpfulness. Participants did not know which LLM they were interacting with.

The researchers recorded the sorts of questions asked by participants, how participants reacted when they were presented with a fully or partially incorrect answer, whether and how they attempted to correct the LLM, or if they asked for clarification. Participants had varying levels of experience with writing effective prompts for LLMs, and this often affected the quality of responses that the LLMs provided.

An example of an effective prompt is “what is the definition of X” (where X is a concept in the problem), as chatbots can be very good at retrieving concepts they know of and explaining them to the user.

“One of the things we found is the surprising fallibility of these models,” said Collins. “Sometimes, these LLMs will be really good at higher-level mathematics, and then they’ll fail at something far simpler. It shows that it’s vital to think carefully about how to use LLMs effectively and appropriately.”

However, like the LLMs, the human participants also made mistakes. The researchers asked participants to rate how confident they were in their own ability to solve the problem they were using the LLM for. Participants who were less confident in their own abilities were more likely to rate incorrect LLM generations as correct.

“This kind of gets to a big challenge of evaluating LLMs, because they’re getting so good at generating nice, seemingly correct natural language, that it’s easy to be fooled by their responses,” said Jiang. “It also shows that while human evaluation is useful and important, it’s nuanced, and sometimes it’s wrong. Anyone using an LLM, for any application, should always pay attention to the output and verify it themselves.”

Based on the results from CheckMate, the researchers say that newer generations of LLMs are increasingly able to collaborate helpfully and correctly with human users on undergraduate-level maths problems, as long as the user can assess the correctness of LLM-generated responses. Even if the answers may be memorised and can be found somewhere on the internet, LLMs have the advantage over traditional search engines of being flexible in their inputs and outputs (though they should not replace search engines in their current form).

While CheckMate was tested on mathematical problems, the researchers say their platform could be adapted to a wide range of fields. In the future, this type of feedback could be incorporated into the LLMs themselves, although none of the CheckMate feedback from the current study has been fed back into the models.

“These kinds of tools can help the research community to have a better understanding of the strengths and weaknesses of these models,” said Collins. “We wouldn’t use them as tools to solve complex mathematical problems on their own, but they can be useful assistants if the users know how to take advantage of them.”

The research was supported in part by the Marshall Commission, the Cambridge Trust, Peterhouse, Cambridge, The Alan Turing Institute, the European Research Council, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

 

Reference:
Katherine M Collins, Albert Q Jiang, et al. ‘Evaluating Language Models for Mathematics through Interactions.’ Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2318124121

 

Researchers have developed a platform for the interactive evaluation of AI-powered chatbots such as ChatGPT. 

Anyone using an LLM, for any application, should always pay attention to the output and verify it themselves
Albert Jiang

Vice-Chancellor on how Cambridge can drive UK economic growth

Deborah Prentice.

In her article, Professor Prentice starts by comparing her current role with the one she held at Princeton in the United States, where she worked for more than 30 years, latterly as its Provost. She discusses how Cambridge contributes to the UK at a scale and with a significance that no single US university can match, outlining its clear role as a national university in the service of the country:

"Cambridge already contributes around £30bn each year to the UK economy. This delivers benefits right across the country. It is Europe’s 'unicorn' capital. Over the past three decades, 178 spin-outs and more than 200 start-ups connected to it have emerged, including semi-conductor giant Arm and the global life sciences firm Abcam. This success is no accident. For years, Cambridge has done things differently. The University’s intellectual property management policy is liberal, providing greater freedom to our researchers and ensuring that they benefit from the commercialisation of their ideas. We have established science parks, venture capital funds and accelerator programmes. Cambridge is now a globally significant place to start and grow businesses."

To continue with this success, she argues, Cambridge needs to be able to attract the brightest minds to the UK, something she considers to be at risk if the new government continues to pursue policies that make it appear unwelcoming to those from outside the UK:

"... they send a not-so-subtle message that foreigners are not welcome here. That alone is enough to deter talented students and academics, who are increasingly looking to the US, Asia, to other parts of Europe — our competitors for talent — for a warm welcome. Indeed, for Cambridge to sustain the success of our innovation ecosystem we need the brightest and best from the UK and around the world to come here — especially as postgraduate research students — and for a better environment to support them."

She urges the government not just to focus on infrastructure but to create an environment that supports other objectives in areas such as patents, licences, spin-outs, industry collaborations and venture funding. This kind of reform, she says, would help universities like Cambridge really take off: 

"Britain is in a global competition; I know first-hand that what US universities spend on research in Boston and Silicon Valley is many times larger than what we see coming from our leading institutions in the UK. Despite this, Cambridge is ranked first globally for science intensity; we should aspire for it also to be the leader in translating research for economic impact. At the moment it is very good, but not yet great. The lessons for incoming ministers set on solving this country’s productivity under-performance are there to be learnt from our example, in terms of the barriers we face and the support we lack. This is both a huge challenge and huge opportunity for whoever wins the general election."

 

Vice-Chancellor Professor Deborah Prentice has written an article for the Financial Times reflecting on the University’s role as a driver of economic growth, innovation and productivity far beyond the city and its surrounding region.

We need the brightest and best from the UK and around the world to come here.
Professor Deborah Prentice
Deborah Prentice


Rainforest wildlife under threat as below-canopy temperatures rise

Rainforest on the south-eastern edge of Amazonia, Brazil.

Crucial strongholds for biodiversity are under threat as temperatures are rising in tropical forests, the world’s most diverse terrestrial ecosystems, a new study reveals.

It has long been assumed that the forest subcanopy and understorey – where direct sunlight is reduced – would be insulated from the worst climate change impacts by the shielding effect of the forest canopy.

A new study, published today in the journal Nature Climate Change, used a microclimate model to examine temperatures beneath the rainforest canopy across the global tropics.

This showed that between 2005 and 2019, most of the world’s undisturbed tropical forests experienced climate conditions at least partially outside the range of historic conditions. Many areas had transitioned to almost entirely new temperature averages.
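As a rough sketch of what “at least partially outside the range of historic conditions” means in practice, the hypothetical example below flags the fraction of recent below-canopy temperature observations at a single site that fall outside that site's historic range. The data, thresholds and function names are invented for illustration and are not the study's microclimate model or code.

```python
"""Illustrative novelty check: how many recent observations at a site
fall outside the historic temperature range? Not the study's code."""

import numpy as np


def fraction_novel(historic: np.ndarray, recent: np.ndarray) -> float:
    """Fraction of recent observations outside the historic min-max range."""
    lo, hi = historic.min(), historic.max()
    outside = (recent < lo) | (recent > hi)
    return float(outside.mean())


# Hypothetical monthly below-canopy temperatures (deg C) for one forest site
rng = np.random.default_rng(0)
historic = rng.normal(25.0, 0.6, size=12 * 30)   # e.g. a 30-year baseline
recent = rng.normal(25.7, 0.6, size=12 * 15)     # e.g. 2005-2019, warmer mean

print(f"{fraction_novel(historic, recent):.0%} of recent months are novel")
```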

Until recently, temperatures beneath the canopy in rainforests have remained relatively stable, meaning that the wildlife that lives there has evolved within a narrow range of temperatures. This leaves it poorly adapted to deal with temperatures outside this range.

The study found pronounced shifts in climate regimes in a significant proportion of tropical forests, including globally important national parks, indigenous reserves, and large tracts of ecologically unfragmented areas.

Recent studies in largely undisturbed (primary) lowland tropical forests have found changes in species composition and significant declines in animal, insect, and plant populations. These changes are attributed to warming temperatures and are consistent with the findings of the new research.

"Tropical forests are the true powerhouses of global biodiversity, and the complex networks of species they contain underpin vast carbon stocks that help to mitigate climate change. A severe risk is that species are no longer able to survive within tropical forests as climate change intensifies, further exacerbating the global extinction crisis and degrading rainforest carbon stocks," said Professor David Edwards at the University of Cambridge’s Department of Plant Sciences, a study co-author.   

“Our study challenges the prevailing notion that tropical forest canopies will mitigate climate change impacts and it helps us understand how to prioritise conservation of these key areas of biodiversity effectively,” said Dr Alexander Lees, Reader in Biodiversity at Manchester Metropolitan University, a study co-author.

He added: “It is paramount that distant, wealth-related drivers of deforestation and degradation are addressed and that the future of those forests acting as climate refuges is secured by effecting legal protection, and by empowering indigenous communities.

“Notwithstanding the fundamental need for global carbon emission reductions, the prioritisation and protection of refugia and the restoration of highly threatened forests is vital to mitigate further damage to global tropical forest ecosystems.”

“Tropical forests, home to many of the world’s highly specialised species, are particularly sensitive to even small changes in climate,” said Dr Brittany Trew, Conservation Scientist for the Royal Society for the Protection of Birds, and lead author of the study.

She added: “Our research shows that climate change is already impacting vast areas of pristine tropical forest globally. To provide species with the best chance to adapt to these changes, these forests must be protected from additional human-induced threats.”

“The world's rainforests are incredible reservoirs of biodiversity, harbouring species that live in micro-environments in which climate conditions are generally stable. Thus, they are particularly sensitive to any changes brought about by climate change. It is vital that we take measures to safeguard these ecosystems from human pressures,” said Ilya Maclean, Professor of Global Change Biology at the University of Exeter and senior author of the study.

The study was made possible through a global collaboration that included researchers at Mountains of the Moon University, Uganda; Universidade Federal do Pará, Brazil; the Brazilian Agricultural Research Corporation and Universidad Nacional de San Antonio Abad del Cusco, Perú. It was funded by the National Science Foundation (NSF).

Reference: Trew, B T et al: ‘Novel temperatures are already widespread beneath the world’s tropical forest canopies.’ Nature Climate Change, June 2024. DOI: 10.1038/s41558-024-02031-0

Adapted from a press release by Manchester Metropolitan University

Assumptions that tropical forest canopies protect from the effects of climate change are unfounded, say researchers.

A severe risk is that species are no longer able to survive within tropical forests as climate change intensifies, further exacerbating the global extinction crisis and degrading rainforest carbon stocks.
David Edwards
Rainforest on the south-eastern edge of Amazonia, Brazil


Cuckoos evolve to look like their hosts - and form new species in the process

Male wren with bright blue plumage brings food to a cuckoo fledgling.

The theory of coevolution says that when closely interacting species drive evolutionary changes in each other this can lead to speciation - the evolution of new species. But until now, real-world evidence for this has been scarce.

Now a team of researchers has found evidence that coevolution is linked to speciation by studying the evolutionary arms race between cuckoos and the host birds they exploit.

Bronze-cuckoos lay their eggs in the nests of small songbirds. Soon after the cuckoo chick hatches, it pushes the host’s eggs out of the nest. The host not only loses all its own eggs, but spends several weeks rearing the cuckoo, which takes up valuable time when it could be breeding itself.

Each species of bronze-cuckoo closely matches the appearance of its host’s chicks, fooling the host parents into accepting the cuckoo.

The study shows how these interactions can cause new species to arise when a cuckoo species exploits several different hosts. If chicks of each host species have a distinct appearance, and hosts reject odd-looking nestlings, then the cuckoo species diverges into separate genetic lineages, each mimicking the chicks of its favoured host. These new lineages are the first sign of new species emerging.

The study is published today in the journal Science.

“This exciting new finding could potentially apply to any pairs of species that are in battle with each other. Just as we’ve seen with the cuckoo, the coevolutionary arms race could cause new species to emerge - and increase biodiversity on our planet,” said Professor Rebecca Kilner in the University of Cambridge’s Department of Zoology, a co-author of the report.

The striking differences between the chicks of different bronze-cuckoo lineages correspond to subtle differences in the plumage and calls of the adults, which help males and females that specialise on the same host to recognise and pair with each other.

“Cuckoos are very costly to their hosts, so hosts have evolved the ability to recognise and eject cuckoo chicks from their nests,’’ said Professor Naomi Langmore at the Australian National University, Canberra, lead author of the study. 

She added: “Only the cuckoos that most resemble the host’s own chicks have any chance of escaping detection, so over many generations the cuckoo chicks have evolved to mimic the host chicks.”

The study revealed that coevolution is most likely to drive speciation when the cuckoos are very costly to their hosts, leading to a ‘coevolutionary arms race’ between host defences and cuckoo counter-adaptations.

A broad scale analysis across all cuckoo species found that those lineages that are most costly to their hosts have higher speciation rates than less costly cuckoo species and their non-parasitic relatives.

“This finding is significant in evolutionary biology, showing that coevolution between interacting species increases biodiversity by driving speciation,” said Dr Clare Holleley at the Australian National Wildlife Collection within CSIRO, Canberra, senior author of the report.

The study was made possible by the team’s breakthrough in extracting DNA from eggshells in historical collections, and sequencing it for genetic studies.

The researchers were then able to combine two decades of behavioural fieldwork with DNA analysis of specimens of eggs and birds held in museums and collections.

The study involved an international team of researchers at the University of Cambridge, Australian National University, CSIRO (Australia’s national science agency), and the University of Melbourne. It was funded by the Australian Research Council.

Reference: Langmore, N E et al: ‘Coevolution with hosts underpins speciation in brood-parasitic cuckoos.’ Science, May 2024. DOI: 10.1126/science.adj3210

Adapted from a press release by the Australian National University.

Two decades of cuckoo research have helped scientists to explain how battles between species can cause new species to arise

This exciting new finding could potentially apply to any pairs of species that are in battle with each other...the coevolutionary arms race could cause new species to emerge - and increase biodiversity on our planet
Rebecca Kilner
Male wren (left) brings food to a cuckoo fledgling (right)


Earliest, most distant galaxy discovered with James Webb Space Telescope

Infrared image showing JADES-GS-z14-0 galaxy

The two galaxies, found in a region near the Hubble Ultra Deep Field by the JWST Advanced Deep Extragalactic Survey (JADES) team, mark a major milestone in the study of the early Universe.

“These galaxies join a small but growing population of galaxies from the first half billion years of cosmic history where we can really probe the stellar populations and the distinctive patterns of chemical elements within them,” said Dr Francesco D’Eugenio of the Kavli Institute for Cosmology at the University of Cambridge, one of the team behind the discovery.

Because of the expansion of the Universe, the light from distant galaxies stretches to longer wavelengths as it travels, an effect known as redshift. In these galaxies the effect is extreme: their light is stretched by a factor of around 15, moving even the galaxies’ ultraviolet light to infrared wavelengths where only JWST has the capability to see it.
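As a worked illustration of that stretch factor, using the standard redshift relation and the approximate redshift of z ≈ 14 quoted in the reference below (the Lyman-alpha example is illustrative, not a measurement from the paper):

```latex
% Observed versus emitted wavelength under cosmological redshift
\lambda_{\mathrm{obs}} = (1 + z)\,\lambda_{\mathrm{emit}},
\qquad z \approx 14 \;\Rightarrow\; \lambda_{\mathrm{obs}} \approx 15\,\lambda_{\mathrm{emit}}.
% Example: Lyman-alpha emitted at 121.6 nm would be observed near
% (1 + 14) \times 121.6\,\mathrm{nm} \approx 1.8\,\mu\mathrm{m}, i.e. in the near-infrared.
```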

Modern theory holds that galaxies develop in special regions where gravity has concentrated the cosmic gas and dark matter into dense lumps known as ‘halos’. These halos evolved quickly in the early Universe, rapidly merging into more and more massive collections of matter. This fast development is why astronomers are so eager to find yet earlier galaxies: each small increment moves our eyes to a less developed period, where luminous galaxies are even more distinctive and unusual.

The two newly discovered galaxies have been confirmed spectroscopically. In keeping with the collaboration’s standard naming practice, the galaxies are now known as JADES-GS-z14-0 and JADES-GS-z14-1, the former being the more distant of the two.

In addition to being the new distance record holder, JADES-GS-z14-0 is remarkable for how big and bright it is. JWST measures the galaxy at over 1,600 light-years in diameter. Many of the most luminous galaxies produce the bulk of their light via gas falling into a supermassive black hole, producing a quasar, but a galaxy of this size is too extended for that explanation to hold for JADES-GS-z14-0. Instead, the researchers believe the light is being produced by young stars.

The combination of the high luminosity and the stellar origin makes JADES-GS-z14-0 the most distinctive evidence yet found for the rapid formation of large, massive galaxies in the early Universe. This trend runs counter to the pre-JWST expectations of theories of galaxy formation. Evidence for surprisingly vigorous early galaxies appeared even in the first JWST images and has been mounting in the first two years of the mission.

“JADES-GS-z14-0 now becomes the archetype of this phenomenon,” said Dr Stefano Carniani of the Scuola Normale Superiore in Pisa, lead author on the discovery paper. “It is stunning that the Universe can make such a galaxy in only 300 million years.”

Despite its luminosity, JADES-GS-z14-0 was a puzzle for the JADES team when they first spotted it over a year ago, as it appears close enough on the sky to a foreground galaxy that the team couldn’t be sure that the two weren’t neighbours. But in October 2023, the JADES team conducted even deeper imaging—five full days with the JWST Near-Infrared Camera on just one field—to form the “JADES Origins Field.” With the use of filters designed to better isolate the earliest galaxies, confidence grew that JADES-GS-z14-0 was indeed very distant.

“We just couldn’t see any plausible way to explain this galaxy as being merely a neighbour of the more nearby galaxy,” said Dr Kevin Hainline, research professor at the University of Arizona.

Fortunately, the galaxy happened to fall in a region where the team had conducted ultra-deep imaging with the JWST Mid-Infrared Instrument. The galaxy was bright enough to be detected in 7.7 micron light, with a higher intensity than extrapolation from lower wavelengths would predict.

“We are seeing extra emission from hydrogen and possibly even oxygen atoms, as is common in star-forming galaxies, but here shifted out to an unprecedented wavelength,” said Jakob Helton, graduate student at the University of Arizona and lead author of a second paper on this finding.

These combined imaging results convinced the team to include the galaxy in what was planned to be the capstone observation of JADES, a 75-hour campaign to conduct spectroscopy on faint early galaxies. The spectroscopy confirmed their hopes that JADES-GS-z14-0 was indeed a record-breaking galaxy and that the fainter candidate, JADES-GS-z14-1, was nearly as far away.

Beyond the confirmation of distance, the spectroscopy allows further insight into the properties of the two galaxies. Being comparatively bright, JADES-GS-z14-0 will permit detailed study.

“We could have detected this galaxy even if it were 10 times fainter, which means that we could see other examples yet earlier in the Universe—probably into the first 200 million years,” says Brant Robertson, professor of astronomy and astrophysics at the University of California-Santa Cruz, and lead author of a third paper on the team’s study of the evolution of this early population of galaxies. “The early Universe has so much more to offer.”

Reference
Carniani, S et al. A shining cosmic dawn: spectroscopic confirmation of two luminous galaxies at z∼14. arXiv:2405.18485 [astro-ph.GA]

The two earliest and most distant galaxies yet confirmed, dating back to only 300 million years after the Big Bang, have been discovered using NASA’s James Webb Space Telescope (JWST), an international team of astronomers today announced.

These galaxies join a small but growing population of galaxies from the first half billion years of cosmic history where we can really probe the stellar populations and the distinctive patterns of chemical elements within them
Francesco D’Eugenio

Clare Hall, Cambridge and LUT University, Finland sign agreement on fellowships and global climate prize

LUT Rector Juha-Matti Saksa and Clare Hall President Alan Short signing the joint agreement

Clare Hall, Cambridge and LUT University, Finland, have established a Visiting Fellowship programme and a joint Global Prize for Solutions to Climate Change Threats.

We very much look forward to welcoming high-flying academics from LUT over the years to come to our unique interdisciplinary research community
Clare Hall President Alan Short

Cambridge research receives £5 million boost for ‘world-leading’ cardiovascular research

Professor Martin Bennett standing outside the Victor Phillip Dahdaleh Heart and Lung Research Institute

The funding will support the University in cultivating a world-class research environment that encourages collaboration, inclusion and innovation, and in which visionary scientists can drive lifesaving breakthroughs.

Professor Martin Bennett, BHF Professor of Cardiovascular Sciences at the University of Cambridge, said: “This is a fantastic achievement from the whole Cambridge team. This award will support our multiple research programmes identifying new targets and treatments for vascular disease and heart failure, new ways to reduce the consequences of diabetes and obesity, and how we can get our research used to treat patients.”

The Cambridge award is part of a £35 million boost to UK cardiovascular disease research from the British Heart Foundation. It comes from the charity’s highly competitive Research Excellence Awards funding scheme. The £5 million award to the University of Cambridge will support researchers to:

  • Combine their expertise to work on cardiovascular diseases and in populations with high unmet need.
  • Identify new markers and disease targets for a wide range of cardiovascular diseases, and test new drugs in clinical trials.
  • Develop new ways to diagnose cardiovascular disease and harness the power of artificial intelligence from imaging and health records to identify people at highest risk.
  • Generate user-friendly risk communication and management tools to improve the prevention and management of cardiovascular disease.

Professor Bryan Williams, Chief Scientific and Medical Officer at the British Heart Foundation, said: “We’re delighted to continue to support research at the University of Cambridge addressing the biggest challenges in cardiovascular disease. This funding recognises the incredible research happening at Cambridge and will help to further its reputation as a global leader in the field.

“With generous donations from our supporters, this funding will attract the brightest talent, power cutting-edge science, and unlock lifesaving discoveries that can turn the tide on the devastation caused by heart and circulatory diseases.”

Research Excellence Awards offer greater flexibility than traditional research funding, allowing scientists to quickly launch ambitious projects that can act as a springboard for larger, transformative funding applications.

The funding also aims to break down the silos that have traditionally existed in research, encouraging collaboration between experts from diverse fields. From clinicians to data scientists, biologists to engineers, the funding will support universities to attract the brightest minds, nurture new talent and foster collaboration to answer the biggest questions in heart and circulatory disease research.

The University of Cambridge has previously been awarded £9 million funding through the BHF’s Research Excellence Awards scheme. This funding has supported research that will lay the foundations for future breakthroughs, including:

  • Research showing that low doses of a cancer drug could improve recovery after a heart attack. The drug boosts the activity of anti-inflammatory immune cells, helping to counter harmful inflammation in the blood vessels supplying the heart. It’s currently being tested in clinical trials to see if it benefits patients.
  • A new risk calculator that enables doctors across the UK and Europe to predict who is at risk of having a heart attack or stroke in the next 10 years with greater accuracy. The calculator has been adopted by the European Guidelines on Cardiovascular Disease Prevention in Clinical Practice.
  • Developing imaging and artificial intelligence tools to improve diagnosis of heart and vascular disease by enhancing analysis of scans for disease activity and high-risk fatty plaques. These tools can be rapidly implemented to support diagnosis, treatment and prevention.
  • A study investigating whether an epilepsy medication could help to prevent strokes in people with a common gene variant. The change in the gene HDAC9 can cause it to become ‘overactive’ and increase stroke risk. The epilepsy medication sodium valproate blocks the HDAC9 activity, so could reduce stroke risk in people with the variant.
  • Discovery of rare and common changes in the genetic code that influence proteins and small molecules in the blood, helping us understand the development of cardiovascular diseases and identify novel drug targets.

Adapted from a press release by BHF

The University of Cambridge has received £5 million funding from the British Heart Foundation (BHF) to support its world-class cardiovascular disease research over the next five years, the charity has announced.

This is a fantastic achievement from the whole Cambridge team. This award will support our multiple research programmes.
Martin Bennett

US Food and Drug Administration approves Cambridge-developed artificial pancreas

Phone showing CamAPS FX

This means that even more people living with type 1 diabetes will be able to use this life-changing app. It is also the first time the FDA has authorised an artificial pancreas system for use in pregnancy.

CamAPS FX, produced by Cambridge spinout company CamDiab (www.camdiab.com), is an Android app that can be used to help manage glucose levels in people with type 1 diabetes, including during pregnancy.

The app allows a compatible insulin pump and a compatible continuous glucose monitor to ‘talk to each other’, creating an artificial pancreas.
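To make the closed-loop idea concrete, here is a deliberately simplified sketch of the data flow: a glucose reading from the continuous glucose monitor goes to a control algorithm, which sets the insulin pump’s delivery rate. The rule, parameter values and function name below are invented for illustration only and bear no relation to the actual CamAPS FX algorithm.

```python
"""Toy illustration of the closed-loop ('artificial pancreas') data flow.
NOT the CamAPS FX algorithm; all values are made up for illustration."""


def insulin_rate(glucose_mmol_l: float,
                 target: float = 5.8,
                 basal: float = 0.8,
                 gain: float = 0.1) -> float:
    """Simplified proportional rule: deliver more insulin above the target
    glucose level, less below it, and never a negative rate. Units: U/hour."""
    return max(0.0, basal + gain * (glucose_mmol_l - target))


if __name__ == "__main__":
    for reading in [4.0, 5.8, 9.5, 14.0]:   # example CGM readings (mmol/L)
        print(f"glucose {reading:>4.1f} mmol/L -> pump rate "
              f"{insulin_rate(reading):.2f} U/h")
```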

The CamAPS FX closed loop algorithm was given FDA authorisation on Thursday 23 May. It had already been CE-marked for use in the UK and the EU.

CamAPS FX creator Roman Hovorka is Professor of Metabolic Technology at the Institute of Metabolic Science and Department of Paediatrics at the University of Cambridge, where the technology was developed.

He said: "We set out to help people with type 1 diabetes and their families live better lives and we’re delighted that the FDA has reviewed the safety and effectiveness of CamAPS FX and has given the technology its approval."

"It has been extensively tested and we’re proud that it is considered by many to be the best algorithm out there."

CamAPS FX is already used by more than 27,000 people in 15 countries across Europe and Australia. In November 2023, the National Institute for Health and Care Excellence (NICE) approved artificial pancreas systems such as CamAPS FX for wide use by the NHS.

Read more about the CamAPS FX app

An artificial pancreas developed by researchers at the University of Cambridge has been granted approval by the USA’s Food and Drug Administration (FDA) for use by individuals with type 1 diabetes aged two and older, including during pregnancy.

We set out to help people with type 1 diabetes and their families live better lives and we’re delighted that the FDA has [...] given the technology its approval
Roman Hovorka

Imperceptible sensors made from ‘electronic spider silk’ can be printed directly on human skin

Sensors printed on human fingers

The method, developed by researchers from the University of Cambridge, takes its inspiration from spider silk, which can conform and stick to a range of surfaces. These ‘spider silks’ also incorporate bioelectronics, so that different sensing capabilities can be added to the ‘web’.

The fibres, at least 50 times smaller than a human hair, are so lightweight that the researchers printed them directly onto the fluffy seedhead of a dandelion without collapsing its structure. When printed on human skin, the fibre sensors conform to the skin and expose the sweat pores, so the wearer doesn’t detect their presence. Tests of the fibres printed onto a human finger suggest they could be used as continuous health monitors.

This low-waste and low-emission method for augmenting living structures could be used in a range of fields, from healthcare and virtual reality, to electronic textiles and environmental monitoring. The results are reported in the journal Nature Electronics.

Although human skin is remarkably sensitive, augmenting it with electronic sensors could fundamentally change how we interact with the world around us. For example, sensors printed directly onto the skin could be used for continuous health monitoring, for understanding skin sensations, or for improving the sensation of ‘reality’ in gaming or virtual reality applications.

While wearable technologies with embedded sensors, such as smartwatches, are widely available, these devices can be uncomfortable, obtrusive and can inhibit the skin’s intrinsic sensations.

“If you want to accurately sense anything on a biological surface like skin or a leaf, the interface between the device and the surface is vital,” said Professor Yan Yan Shery Huang from Cambridge’s Department of Engineering, who led the research. “We also want bioelectronics that are completely imperceptible to the user, so they don’t in any way interfere with how the user interacts with the world, and we want them to be sustainable and low waste.”

There are multiple methods for making wearable sensors, but these all have drawbacks. Flexible electronics, for example, are normally printed on plastic films that don’t allow gas or moisture to pass through, so it would be like wrapping your skin in cling film. Other researchers have recently developed flexible electronics that are gas-permeable, like artificial skins, but these still interfere with normal sensation, and rely on energy- and waste-intensive manufacturing techniques.

3D printing is another potential route for bioelectronics since it is less wasteful than other production methods, but leads to thicker devices that can interfere with normal behaviour. Spinning electronic fibres results in devices that are imperceptible to the user, but don't have a high degree of sensitivity or sophistication, and they’re difficult to transfer onto the object in question.

Now, the Cambridge-led team has developed a new way of making high-performance bioelectronics that can be customised to a wide range of biological surfaces, from a fingertip to the fluffy seedhead of a dandelion, by printing them directly onto that surface. Their technique takes its inspiration in part from spiders, who create sophisticated and strong web structures adapted to their environment, using minimal material.

The researchers spun their bioelectronic ‘spider silk’ from PEDOT:PSS (a biocompatible conducting polymer), hyaluronic acid and polyethylene oxide. The high-performance fibres were produced from water-based solution at room temperature, which enabled the researchers to control the ‘spinnability’ of the fibres. The researchers then designed an orbital spinning approach to allow the fibres to morph to living surfaces, even down to microstructures such as fingerprints.

Tests of the bioelectronic fibres, on surfaces including human fingers and dandelion seedheads, showed that they provided high-quality sensor performance while being imperceptible to the host.

“Our spinning approach allows the bioelectronic fibres to follow the anatomy of different shapes, at both the micro and macro scale, without the need for any image recognition,” said Andy Wang, the first author of the paper. “It opens up a whole different angle in terms of how sustainable electronics and sensors can be made. It’s a much easier way to produce large area sensors.”

Most high-resolution sensors are made in an industrial cleanroom and require the use of toxic chemicals in a multi-step and energy-intensive fabrication process. The Cambridge-developed sensors can be made anywhere and use a tiny fraction of the energy that regular sensors require.

The bioelectronic fibres, which are repairable, can be simply washed away when they have reached the end of their useful lifetime, and generate less than a single milligram of waste: by comparison, a typical single load of laundry produces between 600 and 1500 milligrams of fibre waste.

“Using our simple fabrication technique, we can put sensors almost anywhere and repair them where and when they need it, without needing a big printing machine or a centralised manufacturing facility,” said Huang. “These sensors can be made on-demand, right where they’re needed, and produce minimal waste and emissions.”

The researchers say their devices could be used in applications from health monitoring and virtual reality, to precision agriculture and environmental monitoring. In future, other functional materials could be incorporated into this fibre printing method, to build integrated fibre sensors for augmenting the living systems with display, computation, and energy conversion functions. The research is being commercialised with the support of Cambridge Enterprise, the University’s commercialisation arm.

The research was supported in part by the European Research Council, Wellcome, the Royal Society, and the Biotechnology and Biological Sciences Research Council (BBSRC), part of UK Research and Innovation (UKRI).

Reference:
Wenyu Wang et al. ‘Sustainable and imperceptible augmentation of living structures with organic bioelectronic fibres.’ Nature Electronics (2024). DOI: 10.1038/s41928-024-01174-4

Researchers have developed a method to make adaptive and eco-friendly sensors that can be directly and imperceptibly printed onto a wide range of biological surfaces, whether that’s a finger or a flower petal.


One in two children with ADHD experience emotional problems, study finds

Teenage boys fighting on way to school

In research published in Nature Mental Health, the team found that as many as one in two children with ADHD show signs of emotional dysregulation, and that Ritalin – the commonly-prescribed drug to help the condition – appears to be less effective at treating this symptom.

ADHD affects around one in 14 young people under the age of 18 and in around half of these cases it persists into adulthood. The condition causes problems including hyperactivity, impulsivity and difficulty focusing attention.

It has become increasingly clear that some people with ADHD also have self-control problems, affecting their ability to regulate emotions. For example, one in 50 (2.1%) children with a diagnosis of ADHD also have a mood disorder, such as depression, while more than one in four (27.4%) have an anxiety disorder. Many also have verbal or physical outbursts due to an inability to regulate their emotions.

These problems were thought to be a result of other symptoms associated with ADHD, such as problems with cognition and motivation. But today’s study shows that emotional dysregulation occurs independently of these.

The researchers examined data from the ABCD Study, a large longitudinal cohort that tracks the brain development and mental health of children from across the United States. Data on ADHD symptoms was available for just over 6,000 of these children, allowing the researchers to attribute a score to each individual indicating their likelihood of having ADHD.

A team of scientists from Fudan University in Shanghai, China, and the University of Cambridge identified 350 individuals within the cohort who had high symptom scores that met the clinical cut-off for ADHD. Two-thirds (65.7%) of these were male.
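As a sketch of the kind of subgroup selection described above (not the study’s code), the hypothetical example below filters a cohort table to children whose symptom score meets a clinical cut-off; the column names and the cut-off value are invented for illustration.

```python
"""Illustrative subgroup selection on a hypothetical cohort table.
Column names and cut-off are assumptions, not values from the ABCD Study."""

import pandas as pd

# Hypothetical cohort table: one row per child
cohort = pd.DataFrame({
    "child_id": [1, 2, 3, 4],
    "adhd_symptom_score": [72, 35, 90, 64],
    "sex": ["M", "F", "M", "M"],
})

CLINICAL_CUTOFF = 65   # illustrative value only
high_symptom = cohort[cohort["adhd_symptom_score"] >= CLINICAL_CUTOFF]

print(len(high_symptom), "children meet the cut-off;",
      f"{(high_symptom['sex'] == 'M').mean():.0%} male")
```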

Parents or guardians of the children and adolescents in the cohort had previously completed a series of questionnaires, which included questions that related to emotional behaviour, for example:

When my child is upset, he/she has difficulty controlling his/her behaviours.

When my child is upset, he/she knows that he/she can find a way to eventually feel better. 

When my child is upset, he/she starts to feel very bad about him/herself.

The researchers found that half (51.4%) of the individuals in the high-symptom group showed signs of emotion dysregulation and this was independent of cognitive and motivational problems.

Among children who had only low ADHD symptoms at both ages 12 and 13, those with a high score for emotion dysregulation at age 13 were 2.85 times more likely to have developed high ADHD symptoms by age 14 than those with a low score for emotion dysregulation.
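Read as a relative risk (the paper may report the association differently, for example as an odds or hazard ratio), the 2.85 figure corresponds to a ratio of conditional probabilities of the following form:

```latex
% Relative risk of developing high ADHD symptoms by age 14,
% comparing high versus low emotion-dysregulation scores at age 13
\mathrm{RR} =
\frac{P(\text{high ADHD at 14} \mid \text{high dysregulation at 13})}
     {P(\text{high ADHD at 14} \mid \text{low dysregulation at 13})}
\approx 2.85
```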

When the researchers examined the brain imaging data available for some of the participants, they found that a particular region of the brain, known as the pars orbitalis, was smaller in children who scored highly for ADHD and emotional problems. The pars orbitalis sits at the front of the brain and plays an important role in understanding and processing emotion and communication, as well as in inhibitory control over behaviour, which may explain some of the behaviours seen in ADHD.

Professor Barbara Sahakian from the Department of Psychiatry at the University of Cambridge and a Fellow of Clare Hall said: “The pars orbitalis is a well-connected part of the brain, and if it hasn’t developed properly it might make it difficult for individuals to control their emotions and communicate with others appropriately, especially in social situations.

“Parents and teachers often say they have problems controlling children with ADHD, and it could be that when the children can’t express themselves well – when they hit emotional difficulties – they may not be able to control their emotions and have an outburst rather than communicating with the parent, teacher or the other child.”

Professor Sahakian hopes that acknowledging emotion dysregulation as a key part of ADHD will help people better understand the problems the child is experiencing. This could lead to using effective treatments for regulation of emotion, such as cognitive behavioural therapy.

The findings may also point to potential ways to help the child manage their emotions, for example by using cognitive behavioural techniques to learn to stop and think before they react and to express their feelings verbally, or use techniques such as exercise or relaxation to calm themselves or alleviate symptoms of depression and anxiety.

This may be particularly important as the researchers found that Ritalin, the drug used to help manage ADHD symptoms, does not appear to fully treat symptoms of emotion dysregulation. Identifying the problem earlier would allow for alternative, more effective interventions to help the child better manage their emotions, potentially helping the individual in adulthood.

Professor Qiang Luo from Fudan University and a Life Member at Clare Hall, Cambridge, said: “If you're having trouble controlling your emotions, this can lead to problems with social interactions, which further exacerbates any depression or anxiety that you might have. It also might mean that you're saying things or doing things that exacerbate a situation rather than calming it down. Teaching vulnerable individuals from an early age how to manage your emotions and express yourself could help them overcome such problems further down the line.”

While it is not clear exactly what causes these problems in the first place, the researchers found signs of a link to possible dysfunction of the immune system, with individuals who exhibited signs of emotion dysregulation showing higher percentages of certain types of immune cell.

Professor Sahakian added: “We already know that problems with the immune system can be linked to depression, and we’ve seen similar patterns in individuals with ADHD who experience emotion dysregulation.”

The research was supported by the National Key Research and Development Program of China, the National Natural Science Foundation of China, the Program of Shanghai Academic Research Leader and the Shanghai Municipal Science and Technology Major Project.

Reference
Hou, W et al. Emotion dysregulation and right pars orbitalis constitute a neuropsychological pathway to attention deficit hyperactivity disorder. Nature Mental Health; 13 May 2024; DOI: 10.1038/s44220-024-00251-z

Cambridge scientists have shown that problems regulating emotions – which can manifest as depression, anxiety and explosive outbursts – may be a core symptom of attention deficit hyperactivity disorder (ADHD).

When the children can’t express themselves well – when they hit emotional difficulties – they may not be able to control their emotions and have an outburst rather than communicating
Barbara Sahakian

More than 1,000 may have died in Nazi camps on island of Alderney, report finds

"Nazi Fire Control Tower on Alderney" by neilalderney123 is licensed under CC BY-NC 2.0.

A review of evidence, gathered by a panel of 13 international experts including Cambridge archaeologist Dr Gilly Carr, has sought to give the most accurate possible assessment of how many prisoners and labourers died on the Channel Island between 1941 and 1945.

During this time, crimes were committed against forced and slave labourers, transported from countries across Europe and brought to Alderney to construct fortifications as part of the German war effort. Housed in camps that shared many of the traits of those in mainland Europe, these labourers were subject to atrocious living and working conditions, and, in some cases, executions.  

Commissioned by Lord Eric Pickles, UK Special Envoy on Post Holocaust Issues, the investigation aims to dispel conspiracy theories and provide the most accurate figure possible of those who lost their lives on the island. The report also aims to bring justice for those who died, and ensure that this period of history, and the Holocaust, is remembered fully and accurately.

The team’s calculation of the minimum number of prisoners or labourers sent to Alderney throughout the German occupation stands between 7,608 and 7,812 people. Death figures calculated after Alderney was liberated by the British originally suggested that 389 people died as a result of ill-treatment. Now, the Alderney Expert Review Panel has found that the number of deaths in Alderney is likely to range between 641 and 1,027. 

The review panel has concluded that there is no evidence that many thousands of victims died, and that claims Alderney constituted a ‘mini-Auschwitz’ are unsubstantiated.  

Dr Carr, Associate Professor in Archaeology at Cambridge’s Institute of Continuing Education, and Fellow of St Catharine’s College, who co-ordinated the panel, said: “I am proud of the way the team of experts came together to provide answers to the questions set by Lord Pickles. It shows what can be achieved when you bring together the right people with the right experience and expertise who are committed to working in memory of those who suffered in Alderney during the Occupation.”

Chief Rabbi Sir Ephraim Mirvis KBE said: “The findings of the Alderney Review are a significant and welcome development. Having an authoritative account of this harrowing element of the island’s history is vital. It enables us to accurately remember the individuals who so tragically suffered and died on British soil. Marking the relevant sites will now be an appropriate step to take, to ensure that this information is widely available.”

The panel also sought to discover why German perpetrators were not tried by Britain for war crimes committed in Alderney. It concluded that a war crimes investigation carried out in Alderney immediately after the war was “wholly serious in intent”. But because most of the victims were Soviet citizens, the case was handed to the Russians. In exchange, Germans who murdered British servicemen in Stalag Luft III during the “Great Escape” were handed over to Britain.

The report says the Soviet Union did not follow up the Alderney case and was thus responsible for the failure to bring the perpetrators to justice, causing much anger among members of the British government.

The number of people killed during the Nazi Occupation of Alderney is far greater than the figure previously thought, according to a new report published today, which says more than 1,000 could have perished.


Cambridge experts awarded 2024 Academy of Medical Sciences Fellowships

Academy of Medical Sciences logo

Professor Nita Forouhi from the Medical Research Council (MRC) Epidemiology Unit and Professor Susan Gathercole from the Department of Psychiatry and MRC Cognition and Brain Sciences Unit join an esteemed Fellowship of over 1,400 researchers who have been recognised for their remarkable contributions to advancing biomedical and health sciences, ground-breaking research discoveries and translating developments into benefits for patients and wider society.

Professor Nita Forouhi is a clinical scientist whose research is focused on the link between diet, nutrition and the risk of diabetes, obesity and related disorders. She is Professor of Population Health and Nutrition and leads the Nutritional Epidemiology programme, which was awarded the Vice-Chancellor’s Best Impact Award in 2016. She frequently engages with the media to promote knowledge in the area of diet and health.

Professor Susan Gathercole is a cognitive psychologist with interests in memory and learning, including the causes of specific learning difficulties in children and how they might be overcome. Susan became a Fellow of the British Academy in 2014 and was awarded an OBE for services to psychology and education in 2016.

Professor Andrew Morris PMedSci, President of the Academy of Medical Sciences, said: “It is an honour to welcome these brilliant minds to our Fellowship. Our new Fellows lead pioneering work in biomedical research and are driving remarkable improvements in healthcare. We look forward to working with them, and learning from them, in our quest to foster an open and progressive research environment that improves the health of people everywhere through excellence in medical science.

“It is also welcoming to note that this year's cohort is our most diverse yet, in terms of gender, ethnicity and geography. While this progress is encouraging, we recognise that there is still much work to be done to truly diversify our Fellowship. We remain committed to our EDI goals and will continue to take meaningful steps to ensure our Fellowship reflects the rich diversity of the society we serve."

The new Fellows will be formally admitted to the Academy at a ceremony on Wednesday 18 September 2024.

Two Cambridge Fellows are among the new Academy of Medical Sciences Fellows announced today.


“I feel like I’m Alice in Wonderland”: nightmares and ‘daymares’ could be early warning signs of autoimmune disease

A ghostly figure silhouetted between trees in a forest.

The researchers argue that there needs to be greater recognition that these types of mental health and neurological symptoms can act as an early warning sign that an individual is approaching a ‘flare’, where their disease worsens for a period.

In a study published today in eClinicalMedicine, researchers surveyed 676 people living with lupus and 400 clinicians, as well as carrying out detailed interviews with 69 people living with systemic autoimmune rheumatic diseases (including lupus) and 50 clinicians. Lupus is an autoimmune inflammatory disease known for its effect on many organs including the brain.

In the study, the team also asked patients about the timing of 29 neurological and mental health symptoms (such as depression, hallucinations and loss of balance). In interviews, patients were also asked if they could list the order that symptoms usually occurred when their disease was flaring.

One of the more common symptoms reported was disrupted dream sleep, experienced by three in five patients, a third of whom reported this symptom appearing over a year before onset of lupus disease.

Just under one in four patients reported hallucinations, though for 85% of these the symptom did not appear until around the onset of disease or later. When the researchers interviewed the patients, however, they found that three in five lupus patients and one in three with other rheumatology-related conditions reported increasingly disrupted dreaming sleep – usually vivid and distressing nightmares – just before their hallucinations. These nightmares often involved being attacked, trapped, crushed, or falling.

One patient from Ireland described their nightmares as: “Horrific, like murders, like skin coming off people, horrific…I think it’s like when I’m overwhelmed which could be the lupus being bad…So I think the more stress my body is under then the more vivid and bad the dreaming would be.”

The study interviewers found that using the term ‘daymares’ to talk about hallucinations often led to a ‘lightbulb’ moment for patients, and they felt that it was a less frightening and stigmatised word.

A patient from England said: “[When] you said that word daymare and as soon as you said that it just made sense, it’s like not necessarily scary, it’s just like you’ve had a dream and yet you’re sitting awake in the garden…I see different things, it’s like I come out of it and it’s like when you wake up and you can’t remember your dream and you’re there but you’re not there… it’s like feeling really disorientated, the nearest thing I can think of is that I feel like I’m Alice in Wonderland.”

Patients experiencing hallucinations were reluctant to share their experiences, and many specialists said they had never considered nightmares and hallucinations as being related to disease flares. Most said they would talk to their patients about nightmares and hallucinations in future, agreeing that recognising these early flare symptoms may provide an ‘early warning system’ enabling them to improve care and even reduce clinic times by averting flares at an earlier stage.

Lead author Dr Melanie Sloan from the Department of Public Health and Primary Care at the University of Cambridge said: “It’s important that clinicians talk to their patients about these types of symptoms and spend time writing down each patient’s individual progression of symptoms. Patients often know which symptoms are a bad sign that their disease is about to flare, but both patients and doctors can be reluctant to discuss mental health and neurological symptoms, particularly if they don’t realise that these can be a part of autoimmune diseases.”  

Senior study author Professor David D’Cruz from King’s College London said: “For many years, I have discussed nightmares with my lupus patients and thought that there was a link with their disease activity. This research provides evidence of this, and we are strongly encouraging more doctors to ask about nightmares and other neuropsychiatric symptoms – thought to be unusual, but actually very common in systemic autoimmunity – to help us detect disease flares earlier.”

The importance of recognising these symptoms was highlighted by reports that some patients had initially been misdiagnosed or even hospitalised with a psychotic episode and/or suicidal ideation, which was only later found to be the first sign of their autoimmune disease.

One patient from Scotland said: “At 18 I was diagnosed with borderline personality disorder, and then 6 months later with lupus at 19, so it’s all very close together and it was strange that when my [borderline personality disorder] got under control and my lupus got under control was within 6 months.”

A nurse from Scotland said: “I’ve seen them admitted for an episode of psychosis and the lupus isn’t screened for until someone says ‘oh I wonder if it might be lupus’...but it was several months and very difficult… especially with young women and it’s learning more that that is how lupus affects some people and it’s not anti-psychotic drugs they needed, it’s like a lot of steroids.”

Professor Guy Leschziner, a study author and neurologist at Guy’s and St Thomas’ Hospital, and author of The Secret World of Sleep, said: "We have long been aware that alterations in dreaming may signify changes in physical, neurological and mental health, and can sometimes be early indicators of disease. However, this is the first evidence that nightmares may also help us monitor such a serious autoimmune condition like lupus, and is an important prompt to patients and clinicians alike that sleep symptoms may tell us about impending relapse."

The research was funded by The Lupus Trust and is part of the INSPIRE project (Investigating Neuropsychiatric Symptom Prevalence and Impact in Rheumatology patient Experiences).

Reference
Sloan, M et al. Neuropsychiatric prodromes and symptom timings in relation to disease onset and/or flares in SLE: results from the mixed methods international INSPIRE study. eClinicalMedicine; 21 May 2024; DOI: 10.1016/j.eclinm.2024.102634

An increase in nightmares and hallucinations – or ‘daymares’ – could herald the onset of autoimmune diseases such as lupus, say an international team led by researchers at the University of Cambridge and King’s College London.


Winners of Vice-Chancellor’s Social Impact Awards 2024 announced

Winners of this year’s Vice-Chancellor’s Social Impact Awards

The awards, organised by Cambridge Hub and sponsored by the Vice-Chancellor’s Office, recognise and celebrate exceptional achievement in contributing to society. University of Cambridge Vice-Chancellor Professor Deborah Prentice hosted the ceremony on 30 April, which saw 15 students recognised with awards.

Undergraduate Student Awards

Sakshi Jha from Clare College

Sakshi is a law finalist, who co-founded Cambridge Freedom from Torture, a refugee-aid group, where she formed part of the first student volunteering convoy to Calais, France. Sakshi is also leading a policy paper examining UK asylum policy; she is on the Managing Board of the Cambridge Human Rights Law Journal, and she is the founding Co-Editor in Chief of the Clare College Law Journal, where she interviewed Supreme Court Justices on prevalent legal issues such as human rights and international law enforcement. Sakshi has also aided fundraising efforts as Treasurer of Cambridge Amnesty International, and is a legal researcher for a social consulting firm, completing commissioned research for the United Nations High Commissioner for Refugees. 

Millie May from St John's College

Millie is a third-year politics and social anthropology undergraduate at St John's College. She is extremely passionate about climate and social justice-related work and has been the Lead of the Cambridge Climate Society Education Team for two academic years. She has led several projects in this role, the main one being a campaign and student-faculty collaborative effort to integrate climate-related content across degrees at Cambridge, which she presented at COP28 to advocate for an integrated climate change curriculum on an international level.

Faustine Petron from Department of Sociology

Faustine is a final-year Human, Social and Political Sciences student specialising in Sociology. She is interested in gendered violence and feminist modes of resistance in the Maghreb and South Asia. Outside of academia, Faustine is an award-winning campaigner who works with the government and charities in using education as a tool to prevent gendered violence in the UK.

Josephine Somerville from Clare College

Jo is a third-year English student who has acted on her passion for making long-lasting positive change for biodiversity and the climate, principally as lead of the Cambridge Climate Society Action team. In 2023 she led the prosecution in the Generation on Trial project, and more recently she has initiated collaborations between student bodies and the local community. The most extensive campaign she has been running is the Pesticide-Free Cambridge Colleges Campaign.

Master’s Student Award

Ming Hong Choi from Hughes Hall

Ming is a Master of Finance candidate at Cambridge Judge Business School, supported by both the UK Government’s Chevening Scholarship and Cambridge Trust Scholarship. He has made contributions across and beyond Cambridge through various leadership and advisory roles in youth leadership and development, real estate, investment, arts, sustainability, and educational initiatives.

PhD Student Awards

Samantha Hodder from Clare College

Sam is a final year PhD student studying cancer biology in the Department of Biochemistry. During a clinical placement early on in her PhD, Sam saw how important it is for children with cancer to be well informed about what they’ll be going through during the course of their treatment. This experience led Sam to begin the development of Chum, an app-based learning and support platform for children with cancer and their families.

Swetha Kannan from Trinity Hall

Swetha is a PhD student at the Department of Medicine, as well as a successful junior scientist, educator, and social entrepreneur. Her key contributions to the local Cambridge community have been a result of her involvement with Make-A-Smile Cambridge, Student Minds Cambridge and the Cambridge Development Initiative. Swetha also established The Lalitha Foundation, a non-profit organisation in India dedicated to the betterment of lives of cancer- and post-sepsis patients.

Mine Koprulu from Pembroke College

Mine is a final year PhD student in Medical Sciences at MRC Epidemiology Unit. Improving the lives of others and making the world a better place to live in has been a long-standing aspiration of Mine’s. Professionally, she is aiming to improve healthcare by better understanding the biological basis of diseases and identifying effective treatment opportunities. In parallel, she also has been leading and contributing to various social impact projects, ranging from building more inclusive communities to promoting gender equity.

Nazifa Rafa from Lucy Cavendish College

Nazifa is a PhD student in Geography and a pioneering researcher dedicated to addressing pressing environmental and social justice issues. Her work spans biodiversity conservation, climate change, disaster risk, water and energy security, environmental health, and sustainable development, with a focus on empowering marginalised communities.

Mayumi Sato from Trinity Hall

Mayumi is a PhD student and Gates Cambridge Scholar, and the founder and director of SustainED. She has several years' experience working with climate-affected groups, predominantly in the Global South. Her academic and advocacy interests involve leading campaigns and initiatives for impact-based community development and justice-oriented research. Her interests focus on the intersection between social equity, environmental justice, and community engagement.

Volunteering Award

Kate Lucas from Homerton College

Kate is a third-year undergraduate studying Manufacturing Engineering, who is dedicated to increasing diversity in engineering. As well as being President of Cambridge University Robotics Society and organising Unibots UK 2023 and 2024, she mentors Year 13 students through platforms such as Zero Gravity and is an active ambassador for Homerton Changemakers.

Innovation Award

William Lan from St Catharine's College

William is an MPhil student in Medical Science who has significantly contributed to mental health advocacy and community support. He is the Postgraduate Welfare Officer at St Catharine’s College, Vice-Chair of the International Students’ Campaign, and a Mental Health Foundation Young Leader, launching crucial welfare programmes and peer-support systems. The judges said William's innovative methods and steadfast commitment to mental health advocacy have broadened his impact, establishing him as a force for positive change within and beyond the academic community.

Global Impact Award

Paulina Pérez-Duarte Mendiola from Sidney Sussex College

Paulina is a PhD candidate focusing on play and health at the Faculty of Education. She is a paediatrician, medical anthropologist and advocate for children’s holistic health and healthcare equity. Her work focuses on the role and impact of play in sick children’s development, learning and healthcare experiences. She is the Founder and Director of Semana JIM, which is the acronym of Play in Hospital Awareness Week in Mexico.

Impact in the Local Community Award

Zara Crapper from Robinson College

Zara is a third-year undergraduate in Natural Sciences. She has been involved in Scouting since she was young, and before her arrival in Cambridge she was an adult volunteer for a Cub Scout group in Andover. Since coming to Cambridge, she has opened a new section in a local Group, enabling the youngest members in the Scouting family from across the community to come together and learn in an enjoyable and inclusive environment. 

Sustainability Award

Clara Ma from Selwyn College

Clara is a Gates Cambridge Scholar at Selwyn College, an alumna of Churchill College and a PhD student in environmental science and policy at the Cambridge Centre for Environment, Energy and Natural Resource Governance. She assists departments, colleges, and organisations across the University in transitioning to more sustainable food procurement.

The winners of this year’s Vice-Chancellor’s Social Impact Awards have been announced.


Earth’s earliest sea creatures drove evolution by stirring the water

Artistic recreation of the marine animal forest

A study involving the University of Cambridge has used virtual recreations of the earliest animal ecosystems, known as marine animal forests, to demonstrate the part they played in the evolution of our planet.

Using state-of-the-art computer simulations of fossils from the Ediacaran time period - approximately 565 million years ago - scientists discovered how these animals mixed the surrounding seawater. This may have affected the distribution of important resources such as food particles and could have increased local oxygen levels.

Through this process, the scientists think these early communities could have played a crucial role in shaping the initial emergence of large and complex organisms prior to a major evolutionary radiation of different forms of animal life, the so-called Cambrian ‘explosion’.

Over long periods of time, these changes might have allowed life forms to perform more complicated functions, like those associated with the evolution of new feeding and movement styles.

The study was led by the Natural History Museum and is published today in the journal Current Biology.

Dr Emily Mitchell at the University of Cambridge’s Department of Zoology, a co-author of the report, said: “It’s exciting to learn that the very first animals from 580 million years ago had a significant impact on their environment, despite not being able to move or swim. We’ve found they mixed up the water and enabled resources to spread more widely - potentially encouraging more evolution.”

Scientists know from modern marine environments that nutrients like food and oxygen are carried in seawater, and that animals can affect water flow in ways that influence the distribution of these resources.

To test how far back this process goes in Earth’s history, the team looked at some of the earliest examples of marine animal communities, known from rocks at Mistaken Point, Newfoundland, Canada. This world-famous fossil site perfectly preserves early life forms thanks to a cover of volcanic ash (sometimes referred to as an ‘Ediacaran Pompeii’).

Although some of these life forms look like plants, analysis of their anatomy and growth strongly suggests they are animals. Owing to the exceptional preservation of the fossils, the scientists could recreate digital models of key species, which were used as a basis for further computational analyses.

First author Dr Susana Gutarra, a Scientific Associate at the Natural History Museum, said: “We used ecological modelling and computer simulations to investigate how 3D virtual assemblages of Ediacaran life forms affected water flow. Our results showed that these communities were capable of ecological functions similar to those seen in present-day marine ecosystems.”

The study showed that one of the most important Ediacaran organisms for disrupting the flow of water was the cabbage-shaped animal Bradgatia, named after Bradgate Park in England. The Bradgatia from Mistaken Point are among the largest fossils known from the site, reaching diameters of over 50 centimetres.

Through their influence on the water around them, the scientists believe these Ediacaran organisms might have been capable of enhancing local oxygen concentrations. This biological mixing might also have had repercussions for the wider environment, possibly making other areas of the sea floor more habitable and perhaps even driving evolutionary innovation.

Dr Imran Rahman, lead author and Principal Researcher at the Natural History Museum, said: “The approach we’ve developed to study Ediacaran fossil communities is entirely new in palaeontology, providing us with a powerful tool for studying how past and present marine ecosystems might shape and influence their environment.”

The research was funded by the UK Natural Environment Research Council and the US National Science Foundation.

Reference: Gutarra-Diaz, S. et al. ‘Ediacaran marine animal forests and the ventilation of the oceans.’ Current Biology (2024). DOI: 10.1016/j.cub.2024.04.059

Adapted from a press release by the Natural History Museum

3D reconstructions suggest that simple marine animals living over 560 million years ago drove the emergence of more complex life by mixing the seawater around them


Webb detects most distant black hole merger to date

The environment of the galaxy system ZS7 from the JWST PRIMER programme as seen by Webb's NIRCam instrument.

Astronomers have found supermassive black holes with masses of millions to billions of times that of the Sun in most massive galaxies in the local Universe, including in our Milky Way galaxy. These black holes have likely had a major impact on the evolution of the galaxies they reside in. However, scientists still don’t fully understand how these objects grew to become so massive.

The finding of gargantuan black holes already in place in the first billion years after the Big Bang indicates that such growth must have happened very rapidly, and very early. Now, the James Webb Space Telescope is shedding new light on the growth of black holes in the early Universe.

The new Webb observations have provided evidence for an ongoing merger of two galaxies and their massive black holes when the Universe was just 740 million years old. The system is known as ZS7.

Massive black holes that are actively accreting matter have distinctive spectrographic features that allow astronomers to identify them. For very distant galaxies, like those in this study, these signatures are inaccessible from the ground and can only be seen with Webb.

“We found evidence for very dense gas with fast motions in the vicinity of the black hole, as well as hot and highly ionised gas illuminated by the energetic radiation typically produced by black holes in their accretion episodes,” said lead author Dr Hannah Übler of Cambridge’s Cavendish Laboratory and Kavli Institute for Cosmology. “Thanks to the unprecedented sharpness of its imaging capabilities, Webb also allowed our team to spatially separate the two black holes.”

The team found that one of the two black holes has a mass that is 50 million times the mass of the Sun. “The mass of the other black hole is likely similar, although it is much harder to measure because this second black hole is buried in dense gas,” said team member Professor Roberto Maiolino, also from the Kavli Institute.

“Our findings suggest that merging is an important route through which black holes can rapidly grow, even at cosmic dawn,” said Übler. “Together with other Webb findings of active, massive black holes in the distant Universe, our results also show that massive black holes have been shaping the evolution of galaxies from the very beginning.”

The team notes that once the two black holes merge, they will also generate gravitational waves. Events like this will be detectable with the next generation of gravitational wave observatories, such as the upcoming Laser Interferometer Space Antenna (LISA) mission, which was recently approved by the European Space Agency and will be the first space-based observatory dedicated to studying gravitational waves.

This discovery was from observations made as part of the Galaxy Assembly with NIRSpec Integral Field Spectroscopy programme. The team has recently been awarded a new Large Programme in Webb’s Cycle 3 of observations, to study in detail the relationship between massive black holes and their host galaxies in the first billion years. An important component of this programme will be to systematically search for and characterise black hole mergers. This effort will determine the rate at which black hole merging occurs at early cosmic epochs and will assess the role of merging in the early growth of black holes and the rate at which gravitational waves are produced from the dawn of time.

These results have been published in the Monthly Notices of the Royal Astronomical Society.

Reference:
Hannah Übler et al. ‘GA-NIFS: JWST discovers an offset AGN 740 million years after the big bang’ Monthly Notices of the Royal Astronomical Society (2024). DOI: 10.1093/mnras/stae943

Adapted from a press release by the European Space Agency.

An international team of astronomers, led by the University of Cambridge, has used the James Webb Space Telescope to find evidence for an ongoing merger of two galaxies and their massive black holes when the Universe was only 740 million years old. This marks the most distant detection of a black hole merger ever obtained and the first time that this phenomenon has been detected so early in the Universe.


Ten Cambridge scientists elected as Fellows of the Royal Society 2024

The Royal Society in central London

The Royal Society is a self-governing Fellowship of many of the world’s most distinguished scientists drawn from all areas of science, engineering and medicine.

The Society’s fundamental purpose, as it has been since its foundation in 1660, is to recognise, promote and support excellence in science and to encourage the development and use of science for the benefit of humanity.

This year, over 90 researchers, innovators and communicators from around the world have been elected as Fellows of the Royal Society for their substantial contribution to the advancement of science. Nine of these are from the University of Cambridge, with a tenth based at the British Antarctic Survey in Cambridge.

Sir Adrian Smith, President of the Royal Society said: “I am pleased to welcome such an outstanding group into the Fellowship of the Royal Society.

“This new cohort have already made significant contributions to our understanding of the world around us and continue to push the boundaries of possibility in academic research and industry.

“From visualising the sharp rise in global temperatures since the industrial revolution to leading the response to the Covid-19 pandemic, their diverse range of expertise is furthering human understanding and helping to address some of our greatest challenges. It is an honour to have them join the Fellowship.”

The Fellows and Foreign Members join the ranks of Stephen Hawking, Isaac Newton, Charles Darwin, Albert Einstein, Lise Meitner, Subrahmanyan Chandrasekhar and Dorothy Hodgkin.

The new Cambridge fellows are: 
 

Professor Sir John Aston Kt FRS

Aston is the Harding Professor of Statistics in Public Life at the Statistical Laboratory, Department of Pure Mathematics and Mathematical Statistics, where he develops techniques for public policy and improves the use of quantitative methods in public policy debates.

From 2017 to 2020 he was the Chief Scientific Adviser to the Home Office, providing statistical and scientific advice to ministers and officials, and was involved in the UK’s response to the Covid pandemic. He was knighted in 2021 for services to statistics and public policymaking, and is a Fellow of Churchill College.
 

Professor Sarah-Jayne Blakemore FBA FMedSci FRS

Blakemore is the Professor of Psychology and Cognitive Neuroscience, Department of Psychology, and leader of the Developmental Cognitive Neuroscience Group. Her research focuses on the development of social cognition and decision making in the human adolescent brain, and adolescent mental health. 

Blakemore has been awarded several national and international prizes for her research, and is a Fellow of the British Academy, the American Association of Psychological Science and the Academy of Medical Sciences. 
 

Professor Patrick Chinnery FMedSci FRS

Chinnery is Professor of Neurology and head of the University’s Department of Clinical Neurosciences, and a Fellow of Gonville & Caius College. He was appointed Executive Chair of the Medical Research Council last year, having previously been MRC Clinical Director since 2019.

His principal research is the role of mitochondria in human disease and developing new treatments for mitochondrial disorders. Chinnery is a Wellcome Principal Research Fellow with a lab based in the MRC Mitochondrial Biology Unit and jointly chairs the NIHR BioResource for Translational Research in Common and Rare Diseases. He is a Fellow of the Academy of Medical Sciences.


Professor Rebecca Fitzgerald OBE FMedSci FRS

Fitzgerald is Professor of Cancer Prevention in the Department of Oncology and the inaugural Director of the University’s new Early Cancer Institute, which launched in 2022. She is a Fellow of Trinity College.

Her pioneering work to devise a first-in-class, non-endoscopic capsule sponge test for identifying individuals at high risk for oesophageal cancer has won numerous prizes, including the Westminster Medal, and this test is now being rolled out in the NHS and beyond by her spin-out Cyted Ltd.


Professor David Hodell FRS

Hodell is the Woodwardian Professor of Geology and Director of the Godwin Laboratory for Palaeoclimate Research in the Department of Earth Sciences, and a Fellow of Clare College.

A marine geologist and paleoclimatologist, his research focuses on high-resolution paleoclimate records from marine and lake sediments, as well as mineral deposits, to better understand past climate dynamics. Hodell is a fellow of the American Geophysical Union and the American Association for the Advancement of Science. He has received the Milutin Milankovic Medal.


Professor Eric Lauga FRS

Lauga is Professor of Applied Mathematics in the Department of Applied Mathematics and Theoretical Physics, where his research is in fluid mechanics, biophysics and soft matter. Lauga is the author, or co-author, of over 180 publications and currently serves as Associate Editor for the journal Physical Review Fluids.

He is a recipient of three awards from the American Physical Society: the Andreas Acrivos Dissertation Award in Fluid Dynamics, the François Frenkiel Award for Fluid Mechanics and the Early Career Award for Soft Matter Research. He is a Fellow of the American Physical Society and of Trinity College.


Professor George Malliaras FRS

Malliaras is the Prince Philip Professor of Technology in the Department of Engineering, where he leads a group that works on the development and translation of implantable and wearable devices that interface with electrically active tissues, with applications in neurological disorders and brain cancer.

Research conducted by Malliaras has received awards from the European Academy of Sciences, the New York Academy of Sciences, and the US National Science Foundation among others. He is a Fellow of the Materials Research Society and of the Royal Society of Chemistry.
 

Professor Lloyd Peck FRI FRSB FRS

Peck is a marine biologist at the British Antarctic Survey and a fellow at Wolfson College, Cambridge.

He identified oxygen as a factor in polar gigantism, and problems with protein synthesis as the cause of slow development and growth in polar marine species. He was awarded a Polar Medal in 2009, the PLYMSEF Silver Medal in 2015 and an Erskine Fellowship at the University of Canterbury, Christchurch, in 2016-2017.


Professor Oscar Randal-Williams FRS

Randal-Williams is the Sadleirian Professor of Pure Mathematics in the Department of Pure Mathematics and Mathematical Statistics.

He has received the Whitehead Prize from the London Mathematical Society, a Philip Leverhulme Prize, the Oberwolfach Prize, the Dannie Heineman Prize of the Göttingen Academy of Sciences and Humanities, and was jointly awarded the Clay Research Award.

Randal-Williams is one of two managing editors of the Proceedings of the London Mathematical Society, and an editor of the Journal of Topology.


Professor Mihaela van der Schaar FRS

Van der Schaar is the John Humphrey Plummer Professor of Machine Learning, Artificial Intelligence and Medicine in the Departments of Applied Mathematics and Theoretical Physics, Engineering and Medicine.

She is the founder and director of the Cambridge Centre for AI in Medicine, and a Fellow at The Alan Turing Institute. Her work has received numerous awards, including the Oon Prize on Preventative Medicine, a National Science Foundation CAREER Award, and the IEEE Darlington Award.

Van der Schaar is credited as inventor on 35 US patents, and has made over 45 contributions to international standards for which she received three ISO Awards. In 2019, a Nesta report declared her the most-cited female AI researcher in the UK.


 

Ten outstanding Cambridge researchers have been elected as Fellows of the Royal Society, the UK’s national academy of sciences and the oldest science academy in continuous existence.


2023 was the hottest summer in two thousand years

Morning sun over Los Angeles, USA.

Although 2023 has been reported as the hottest year on record, the instrumental evidence only reaches back as far as 1850 at best, and most records are limited to certain regions.

Now, by using past climate information from annually resolved tree rings over two millennia, scientists from the University of Cambridge and the Johannes Gutenberg University Mainz have shown how exceptional the summer of 2023 was.

Even allowing for natural climate variations over hundreds of years, 2023 was still the hottest summer since the height of the Roman Empire, exceeding the extremes of natural climate variability by half a degree Celsius.

“When you look at the long sweep of history, you can see just how dramatic recent global warming is,” said co-author Professor Ulf Büntgen, from Cambridge’s Department of Geography. “2023 was an exceptionally hot year, and this trend will continue unless we reduce greenhouse gas emissions dramatically.”

The results, reported in the journal Nature, also demonstrate that in the Northern Hemisphere, the 2015 Paris Agreement to limit warming to 1.5C above pre-industrial levels has already been breached.

Early instrumental temperature records, from 1850-1900, are sparse and inconsistent. The researchers compared early instrumental data with a large-scale tree ring dataset and found the 19th century temperature baseline used to contextualise global warming is several tenths of a degree Celsius colder than previously thought. By re-calibrating this baseline, the researchers calculated that summer 2023 conditions in the Northern Hemisphere were 2.07C warmer than mean summer temperatures between 1850 and 1900.

“Many of the conversations we have around global warming are tied to a baseline temperature from the mid-19th century, but why is this the baseline? What is normal, in the context of a constantly-changing climate, when we’ve only got 150 years of meteorological measurements?” said Büntgen. “Only when we look at climate reconstructions can we better account for natural variability and put recent anthropogenic climate change into context.”

Tree rings can provide that context, since they contain annually-resolved and absolutely-dated information about past summer temperatures. Using tree-ring chronologies allows researchers to look much further back in time without the uncertainty associated with some early instrumental measurements.

The available tree-ring data reveals that most of the cooler periods over the past 2000 years, such as the Late Antique Little Ice Age in the 6th century and the Little Ice Age in the early 19th century, followed large, sulphur-rich volcanic eruptions. These eruptions spew huge amounts of aerosols into the stratosphere, triggering rapid surface cooling. The coldest summer of the past two thousand years, in 536 CE, followed one such eruption, and was 3.93C colder than the summer of 2023.
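
To see how these figures fit together, here is a minimal arithmetic sketch using only the 2.07C and 3.93C values quoted above (an illustration, not the study’s full tree-ring reconstruction): if summer 2023 sits 2.07C above the re-calibrated 1850-1900 baseline and 536 CE was 3.93C colder than 2023, then the 536 CE summer sits roughly 1.86C below that baseline.

```python
# Back-of-the-envelope check of the temperature figures quoted in this article.
# Illustration only: the study's tree-ring reconstruction is far more involved.

warming_2023 = 2.07      # deg C: summer 2023 vs the re-calibrated 1850-1900 mean
gap_536_to_2023 = 3.93   # deg C: how much colder 536 CE was than summer 2023

# Implied anomaly of the coldest summer (536 CE) relative to the 1850-1900 baseline
anomaly_536 = warming_2023 - gap_536_to_2023
print(f"536 CE summer vs 1850-1900 baseline: {anomaly_536:+.2f} C")  # about -1.86 C
```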

Most of the warmer periods covered by the tree ring data can be attributed to the El Niño climate pattern, or El Niño-Southern Oscillation (ENSO). El Niño affects weather worldwide due to weakened trade winds in the Pacific Ocean and often results in warmer summers in the Northern Hemisphere. While El Niño events were first noted by fishermen in the 17th century, they can be observed in the tree ring data much further back in time.

However, over the past 60 years, global warming caused by greenhouse gas emissions has been causing El Niño events to become stronger, resulting in hotter summers. The current El Niño event is expected to continue into early summer 2024, making it likely that this summer will break temperature records once again.

“It’s true that the climate is always changing, but the warming in 2023, caused by greenhouse gases, is additionally amplified by El Niño conditions, so we end up with longer and more severe heat waves and extended periods of drought,” said Professor Jan Esper, the lead author of the study from the Johannes Gutenberg University Mainz in Germany. “When you look at the big picture, it shows just how urgent it is that we reduce greenhouse gas emissions immediately.”

The researchers note that while their results are robust for the Northern Hemisphere, it is difficult to obtain global averages for the same period since data is sparse for the Southern Hemisphere. The Southern Hemisphere also responds differently to climate change, since it is far more ocean-covered than the Northern Hemisphere.

The research was supported in part by the European Research Council.

Reference:
Jan Esper, Max Torbenson, Ulf Büntgen. ‘2023 summer warmth unparalleled over the past 2,000 years.’ Nature (2024). DOI: 10.1038/s41586-024-07512-y

Researchers have found that 2023 was the hottest summer in the Northern Hemisphere in the past two thousand years, almost four degrees warmer than the coldest summer during the same period.


Over 20,000 people join search for new dementia treatments

Smiling elderly woman speaking to a healthcare worker

Using the resource, scientists have already been able to show for the first time that two important bodily mechanisms – inflammation and metabolism – play a role in the decline in brain function as we age.

By 2050, approximately 139 million people are expected to be living with dementia worldwide. In the UK, the Prime Minister launched the Dame Barbara Windsor Dementia Mission in 2022, part of the government’s commitment to double research funding for dementia.

Although there has been recent progress in developing drugs that slow down progression of the disease, the two leading treatments have only a small effect, and the vast majority of new approaches that work in animal studies fail when it comes to clinical trials in patients.

One explanation for these failures is that the drugs are tested in people who already have memory loss – and by this point, it may be too late to stop or reverse the disease. Hence, there is an urgent need to understand what is going on at the very early stages of disease, before people develop symptoms, and to test new treatments before people come to their doctor with cognitive problems. This approach requires a large cohort of participants willing to be recalled for clinical and experimental studies of cognitive decline.

Today, writing in the journal Nature Medicine, scientists led by the University of Cambridge in partnership with the Alzheimer’s Society report how they have recruited 21,000 people aged 17-85 to the Genes and Cognition Cohort within the National Institute for Health and Care Research (NIHR) BioResource.

The NIHR BioResource was established in 2007 to recruit volunteers keen to engage in experimental medicine and clinical trials across the whole of medicine. Approximately half of its participants are recruited to disease specific cohorts, but the other half are from the general public, and detailed information about their genetics and their physical makeup has been collected. They have all given their consent to be contacted about future research studies.

For the Genes and Cognition Cohort, researchers combined cognitive tests and genetic data with other health and demographic information, enabling the first at-scale study of cognitive changes. This will allow the team to recruit participants for studies of cognitive decline and of new treatments to slow it.

For example, a pharmaceutical company with a promising new drug candidate to slow cognitive decline could recruit people through the BioResource based on their profile and invite them to join the clinical trial. Having a baseline measurement of their cognitive performance will allow scientists to observe whether the drug slows their expected cognitive decline.
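
As a loose illustration of that idea, the sketch below fits an expected decline trajectory from baseline scores and checks a follow-up measurement against it. The numbers and the simple linear model are invented for illustration only; they are not the cohort’s actual data or the genetic modelling used in the study.

```python
# Hypothetical sketch: compare an individual's follow-up cognitive score with
# the decline expected from a baseline trajectory. Invented numbers throughout.
import numpy as np

rng = np.random.default_rng(0)

# Toy "cohort": cognitive test scores that decline gently with age
ages = rng.uniform(40, 80, 500)
scores = 100 - 0.3 * ages + rng.normal(0, 3, ages.size)

# Fit the expected trajectory (score as a linear function of age)
slope, intercept = np.polyfit(ages, scores, 1)

def expected_score(age):
    """Expected cognitive score at a given age under the fitted trajectory."""
    return slope * age + intercept

# A hypothetical trial participant re-tested at age 67
observed = 81.5
predicted = expected_score(67)
print(f"Predicted score at 67: {predicted:.1f}")
print(f"Observed score at 67:  {observed:.1f}")
print("Decline slower than expected" if observed > predicted
      else "Decline as expected or faster")
```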

Professor Patrick Chinnery from the Department of Clinical Neurosciences at the University of Cambridge and co-Chair of the NIHR BioResource, who has led the project, said: “We’ve created a resource that is unmatched anywhere else in the world, recruiting people who are not showing any signs of dementia rather than people already having symptoms. It will allow us to match individuals to particular studies and speed up the development of much-needed new drugs to treat dementia.

“We know that over time our cognitive function decreases, so we’ve plotted out the expected trajectory of various different cognitive functions over our volunteers’ life course according to their genetic risk. We’ve also asked the question, ‘What are the genetic mechanisms that predispose you to slow or fast cognitive decline as you age?’.”

Using the research, the team have identified two mechanisms that appear to affect cognition as we age and could serve as potential targets to slow down cognitive decline and thereby delay the onset of dementia. The first of these is inflammation, with immune cells specific to the brain and central nervous system – known as microglia – causing gradual deterioration of the brain and hence its ability to perform key cognitive functions. The second mechanism relates to metabolism – in particular, how carbohydrates are broken down in the brain to release energy.

Professor Chinnery added: “Cognitive decline is a natural process, but when it drops below a particular threshold, that’s when there’s a problem – that is when we would diagnose dementia. Anything that slows that decline will delay when we drop below that threshold. If you could put off the onset of dementia from 65 to 75 or even 85, it would make a huge difference at an individual and at a population level.”

Dr Richard Oakley, Associate Director of Research and Innovation at Alzheimer’s Society, said: “This exciting study, funded by Alzheimer’s Society, is an important step in helping us to better understand how the diseases that cause dementia begin, and will aid in the development of new treatments that target the early stages of these diseases.

“The data, from over 20,000 volunteers, helps us to better understand the connection between participants’ genes and cognitive decline and allows for further ground-breaking analysis in future. 

“One in three people born in the UK today will go on to develop dementia in their lifetime but research will beat dementia. We need to make it a reality sooner through more funding, partnership working and people taking part in dementia research.”

For further information about how you can join the BioResource and contribute to studies like this one and many others, please visit bioresource.nihr.ac.uk.

The research was carried out in collaboration with the Medical Research Council Biostatistics Unit and was supported by the Alzheimer’s Society and the NIHR BioResource. The researchers were also supported by Wellcome and the Medical Research Council.

Reference
Rahman, MS et al. Dynamics of cognitive variability with age and its genetic underpinning in NIHR BioResource Genes and Cognition Cohort participants. Nat Med; 14 May 2024; DOI: 10.1038/s41591-024-02960-5

More than 20,000 volunteers have been recruited to a resource aimed at speeding up the development of much-needed dementia drugs. The cohort will enable scientists in universities and industry to involve healthy individuals who may be at increased risk of dementia in clinical trials to test whether new drugs can slow the decline in various brain functions including memory and delay the onset of dementia.


Birth by C-section more than doubles odds of measles vaccine failure

Very sick 5 year old little boy fighting measles infection, boy is laying in bed under the blanket with an agonizing expression, boy is covered with rash caused by virus.

A study by the University of Cambridge, UK, and Fudan University, China, has found that a single dose of the measles jab is up to 2.6 times more likely to be completely ineffective in children born by C-section, compared to those born naturally.

Failure of the vaccine means that the child’s immune system does not produce antibodies to fight against measles infection, so they remain susceptible to the disease.

A second measles jab was found to induce a robust immunity against measles in C-section children.

Measles is a highly infectious disease, and even low vaccine failure rates can significantly increase the risk of an outbreak.

A potential reason for this effect is linked to the development of the infant’s gut microbiome – the vast collection of microbes that naturally live inside the gut. Other studies have shown that vaginal birth transfers a greater variety of microbes from mother to baby, which can boost the immune system.

“We’ve discovered that the way we’re born - either by C-section or natural birth - has long-term consequences on our immunity to diseases as we grow up,” said Professor Henrik Salje in the University of Cambridge’s Department of Genetics, joint senior author of the report.

He added: “We know that a lot of children don't end up having their second measles jab, which is dangerous for them as individuals and for the wider population.

“Infants born by C-section are the ones we really want to be following up to make sure they get their second measles jab, because their first jab is much more likely to fail.”

The results are published today in the journal Nature Microbiology.

At least 95% of the population needs to be fully vaccinated to keep measles under control but the UK is well below this, despite the Measles, Mumps and Rubella (MMR) vaccine being available through the NHS Routine Childhood Immunisation Programme.

An increasing number of women around the world are choosing to give birth by caesarean section: in the UK a third of all births are by C-section, while in Brazil and Turkey over half of all children are born this way.

“With a C-section birth, children aren’t exposed to the mother’s microbiome in the same way as with a vaginal birth. We think this means they take longer to catch up in developing their gut microbiome, and with it, the ability of the immune system to be primed by vaccines against diseases including measles,” said Salje.

To get their results, the researchers used data from previous studies of over 1,500 children in Hunan, China, which included blood samples taken every few weeks from birth to the age of 12. This allowed them to see how levels of measles antibodies in the blood change over the first few years of life, including following vaccination.

They found that 12% of children born via caesarean section had no immune response to their first measles vaccination, as compared to 5% of children born by vaginal delivery. This means that many of the children born by C-section did still mount an immune response following their first vaccination.
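
As a rough check on how the headline ‘up to 2.6 times’ figure relates to these percentages, the short sketch below computes the risk ratio and the odds ratio implied by a 12% versus 5% failure rate. This is simple arithmetic, not the study’s statistical model, which may adjust for other factors.

```python
# Rough arithmetic linking the 12% vs 5% first-dose failure rates to the
# headline figures. Illustration only; not the study's statistical analysis.

p_csection = 0.12  # no immune response after first dose, C-section births
p_vaginal = 0.05   # no immune response after first dose, vaginal births

risk_ratio = p_csection / p_vaginal
odds_ratio = (p_csection / (1 - p_csection)) / (p_vaginal / (1 - p_vaginal))

print(f"Risk ratio: {risk_ratio:.1f}")  # ~2.4: more than twice as likely to fail
print(f"Odds ratio: {odds_ratio:.1f}")  # ~2.6: the 'up to 2.6 times' figure
```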

Two doses of the measles jab are needed for the body to mount a long-lasting immune response and protect against measles. According to the World Health Organisation, in 2022 only 83% of the world's children had received one dose of measles vaccine by their first birthday – the lowest since 2008.

Salje said: “Vaccine hesitancy is really problematic, and measles is top of the list of diseases we’re worried about because it’s so infectious.”

Measles is one of the world’s most contagious diseases, spread by coughs and sneezes. It starts with cold-like symptoms and a rash, and can lead to serious complications including blindness, seizures, and death.

Before the measles vaccine was introduced in 1963, there were major measles epidemics every few years causing an estimated 2.6 million deaths each year.

The research was funded by the National Natural Science Foundation of China.

Reference

Wang, W et al: ‘Dynamics of measles immunity from birth and following vaccination.’ Nature Microbiology, 13 May 2024. DOI: 10.1038/s41564-024-01694-x

Researchers say it is vital that children born by caesarean section receive two doses of the measles vaccine for robust protection against the disease.


Baby born deaf can hear after breakthrough gene therapy

Baby Opal and mother Jo

Opal Sandy from Oxfordshire is the first patient treated in a global gene therapy trial, which shows 'mind-blowing' results. She is the first British patient in the world and the youngest child to receive this type of treatment.

Opal was born completely deaf because of a rare genetic condition, auditory neuropathy, caused by the disruption of nerve impulses travelling from the inner ear to the brain.

Within four weeks of having the gene therapy infusion to her right ear, Opal responded to sound, even with the cochlear implant in her left ear switched off.

Clinicians noticed continuous improvement in Opal’s hearing in the weeks afterwards. At 24 weeks, they confirmed Opal had close to normal hearing levels for soft sounds, such as whispering, in her treated ear.

Now 18 months old, Opal can respond to her parents’ voices and can communicate words such as “Dada” and “bye-bye.”

Opal’s mother, Jo Sandy, said: “When Opal could first hear us clapping unaided it was mind-blowing - we were so happy when the clinical team confirmed at 24 weeks that her hearing was also picking up softer sounds and speech. The phrase ‘near normal’ hearing was used and everyone was so excited such amazing results had been achieved.”

Auditory neuropathy can be due to a variation in a single gene, known as the OTOF gene. The gene produces a protein called otoferlin, needed to allow the inner hair cells in the ear to communicate with the hearing nerve. Approximately 20,000 people across the UK, Germany, France, Spain and Italy are deaf due to a mutation in the OTOF gene.

The CHORD trial, which started in May 2023, aims to show whether gene therapy can provide hearing for children born with auditory neuropathy.

Professor Manohar Bance from the Department of Clinical Neurosciences at the University of Cambridge and an ear surgeon at Cambridge University Hospitals NHS Foundation Trust is chief investigator of the trial. He said:

“These results are spectacular and better than I expected. Gene therapy has been the future of otology and audiology for many years and I’m so excited that it is now finally here. This is hopefully the start of a new era for gene therapies for the inner ear and many types of hearing loss.”

Children with a variation in the OTOF gene often pass the newborn screening, as their hair cells are working but not communicating with the hearing nerve. This means the hearing loss is not commonly detected until children are 2 or 3 years of age – when a delay in speech is likely to be noticed.

Professor Bance added: “We have a short time frame to intervene because of the rapid pace of brain development at this age. Delays in diagnosis can also cause confusion for families, as there are many possible reasons for delayed speech, and late intervention can impact a child’s development.”

“More than sixty years after the cochlear implant was first invented – the standard of care treatment for patients with OTOF related hearing loss – this trial shows gene therapy could provide a future alternative. It marks a new era in the treatment for deafness. It also supports the development of other gene therapies that may prove to make a difference in other genetic related hearing conditions, many of which are more common than auditory neuropathy.”

Mutations in the OTOF gene can be identified by standard NHS genetic testing. Opal was identified as being at risk as her older sister has the condition; this was confirmed by genetic test result when she was 3 weeks old.

Opal was given an infusion containing a harmless virus (AAV1), which carries a working copy of the OTOF gene and is delivered via an injection into the cochlea during surgery under general anaesthesia. During the surgery, while Opal was given the gene therapy in her right ear, a cochlear implant was fitted in her left ear.

James Sandy, Opal’s father said: “It was our ultimate goal for Opal to hear all the speech sounds. It’s already making a difference to our day-to-day lives, like at bath-time or swimming, when Opal can’t wear her cochlear implant. We feel so proud to have contributed to such pivotal findings, which will hopefully help other children like Opal and their families in the future.”

Opal’s 24-week results, alongside other scientific data from the CHORD trial, are being presented at the American Society of Gene and Cell Therapy (ASGCT) annual meeting in Baltimore, USA, this week.

Dr Richard Brown, Consultant Paediatrician at CUH, who is an Investigator on the CHORD trial, said: “The development of genomic medicine and alternative treatments is vital for patients worldwide, and increasingly offers hope to children with previously incurable disorders. It is likely that in the long run such treatments require less follow up so may prove to be an attractive option, including within the developing world. Follow up appointments have shown effective results so far with no adverse reactions and it is exciting to see the results to date.  

“Within the new planned Cambridge Children’s Hospital, we look forward to having a genomic centre of excellence which will support patients from across the region to access the testing they need, and the best treatment, at the right time.”

The CHORD trial has been funded by Regeneron. Patients are being enrolled in the study in the US, UK and Spain.

Patients in the first phase of the study receive a low dose of gene therapy to one ear. The second phase is expected to use a higher dose, again in one ear only, once the safety of the starting dose has been demonstrated. The third phase will look at gene therapy in both ears, using a dose selected on the basis of the safety and effectiveness results from parts 1 and 2. Follow-up appointments will continue for five years for enrolled patients, showing how well they adapt to understanding speech in the longer term.

In Cambridge, the trial is supported by NIHR Cambridge Clinical Research Facility and NIHR Cambridge Biomedical Research Centre.

Adapted from a press release from CUH

A baby girl born deaf can hear unaided for the first time, after receiving gene therapy when she was 11 months old at Addenbrooke’s Hospital in Cambridge.

Gene therapy has been the future of otology and audiology for many years and I’m so excited that it is now finally here
Manohar Bance
Cambridge University Hospitals NHS Foundation Trust
Baby Opal and mother Jo

Creative Commons License.
The text in this work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified. All rights reserved. We make our image and video content available in a number of ways – on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Call for safeguards to prevent unwanted ‘hauntings’ by AI chatbots of dead loved ones

A visualisation of one of the design scenarios highlighted in the latest paper

Artificial intelligence that allows users to hold text and voice conversations with lost loved ones runs the risk of causing psychological harm and even digitally 'haunting' those left behind without design safety standards, according to University of Cambridge researchers. 

‘Deadbots’ or ‘Griefbots’ are AI chatbots that simulate the language patterns and personality traits of the dead using the digital footprints they leave behind. Some companies are already offering these services, providing an entirely new type of “postmortem presence”.

AI ethicists from Cambridge’s Leverhulme Centre for the Future of Intelligence outline three design scenarios for platforms that could emerge as part of the developing  “digital afterlife industry”, to show the potential consequences of careless design in an area of AI they describe as “high risk”.

The research, published in the journal Philosophy and Technology, highlights the potential for companies to use deadbots to surreptitiously advertise products to users in the manner of a departed loved one, or distress children by insisting a dead parent is still “with you”.

When the living sign up to be virtually re-created after they die, resulting chatbots could be used by companies to spam surviving family and friends with unsolicited notifications, reminders and updates about the services they provide – akin to being digitally “stalked by the dead”.

Even those who take initial comfort from a ‘deadbot’ may get drained by daily interactions that become an “overwhelming emotional weight”, argue researchers, yet may also be powerless to have an AI simulation suspended if their now-deceased loved one signed a lengthy contract with a digital afterlife service. 

“Rapid advancements in generative AI mean that nearly anyone with Internet access and some basic know-how can revive a deceased loved one,” said Dr Katarzyna Nowaczyk-Basińska, study co-author and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence (LCFI).

“This area of AI is an ethical minefield. It’s important to prioritise the dignity of the deceased, and ensure that this isn’t encroached on by financial motives of digital afterlife services, for example.

“At the same time, a person may leave an AI simulation as a farewell gift for loved ones who are not prepared to process their grief in this manner. The rights of both data donors and those who interact with AI afterlife services should be equally safeguarded.”

Platforms offering to recreate the dead with AI for a small fee already exist, such as ‘Project December’, which started out harnessing GPT models before developing its own systems, and apps including ‘HereAfter’. Similar services have also begun to emerge in China.

One of the potential scenarios in the new paper is “MaNana”: a conversational AI service allowing people to create a deadbot simulating their deceased grandmother without consent of the “data donor” (the dead grandparent). 

The hypothetical scenario sees an adult grandchild who is initially impressed and comforted by the technology start to receive advertisements once a “premium trial” finishes – for example, the chatbot suggesting they order from food delivery services in the voice and style of the deceased.

The relative feels they have disrespected the memory of their grandmother, and wishes to have the deadbot turned off, but in a meaningful way – something the service providers haven’t considered.

“People might develop strong emotional bonds with such simulations, which will make them particularly vulnerable to manipulation,” said co-author Dr Tomasz Hollanek, also from Cambridge’s LCFI.

“Methods and even rituals for retiring deadbots in a dignified way should be considered. This may mean a form of digital funeral, for example, or other types of ceremony depending on the social context.”

“We recommend design protocols that prevent deadbots being utilised in disrespectful ways, such as for advertising or having an active presence on social media.”

While Hollanek and Nowaczyk-Basińska say that designers of re-creation services should actively seek consent from data donors before they pass, they argue that a ban on deadbots based on non-consenting donors would be unfeasible.

They suggest that design processes should involve a series of prompts for those looking to “resurrect” their loved ones, such as ‘have you ever spoken with X about how they would like to be remembered?’, so the dignity of the departed is foregrounded in deadbot development.    

Another scenario featured in the paper, an imagined company called “Paren’t”, highlights the example of a terminally ill woman leaving a deadbot to assist her eight-year-old son with the grieving process.

While the deadbot initially helps as a therapeutic aid, the AI starts to generate confusing responses as it adapts to the needs of the child, such as depicting an impending in-person encounter.

The researchers recommend age restrictions for deadbots, and also call for “meaningful transparency” to ensure users are consistently aware that they are interacting with an AI. These could be similar to current warnings on content that may cause seizures, for example.

The final scenario explored by the study – a fictional company called “Stay” – shows an older person secretly committing to a deadbot of themselves and paying for a twenty-year subscription, in the hopes it will comfort their adult children and allow their grandchildren to know them.

After death, the service kicks in. One adult child does not engage, and receives a barrage of emails in the voice of their dead parent. Another does, but ends up emotionally exhausted and wracked with guilt over the fate of the deadbot. Yet suspending the deadbot would violate the terms of the contract their parent signed with the service company.

“It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations,” said Hollanek.

“These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost. The potential psychological effect, particularly at an already difficult time, could be devastating.”

The researchers call for design teams to prioritise opt-out protocols that allow potential users to terminate their relationships with deadbots in ways that provide emotional closure.

Added Nowaczyk-Basińska: “We need to start thinking now about how we mitigate the social and psychological risks of digital immortality, because the technology is already here.”    

Cambridge researchers lay out the need for design safety protocols that prevent the emerging “digital afterlife industry” causing social and psychological harm. 

Tomasz Hollanek
A visualisation of one of the design scenarios highlighted in the latest paper


‘Wraparound’ implants represent new approach to treating spinal cord injuries

Illustration of spinal cord

A team of engineers, neuroscientists and surgeons from the University of Cambridge developed the devices and used them to record the nerve signals going back and forth between the brain and the spinal cord. Unlike current approaches, the Cambridge devices can record 360-degree information, giving a complete picture of spinal cord activity.

Tests in live animal and human cadaver models showed the devices could also stimulate limb movement and bypass complete spinal cord injuries where communication between the brain and spinal cord had been completely interrupted.

Most current approaches to treating spinal injuries involve both piercing the spinal cord with electrodes and placing implants in the brain, which are both high-risk surgeries. The Cambridge-developed devices could lead to treatments for spinal injuries without the need for brain surgery, which would be far safer for patients.

While such treatments are still at least several years away, the researchers say the devices could be useful in the near-term for monitoring spinal cord activity during surgery. Better understanding of the spinal cord, which is difficult to study, could lead to improved treatments for a range of conditions, including chronic pain, inflammation and hypertension. The results are reported in the journal Science Advances.

“The spinal cord is like a highway, carrying information in the form of nerve impulses to and from the brain,” said Professor George Malliaras from the Department of Engineering, who co-led the research. “Damage to the spinal cord causes that traffic to be interrupted, resulting in profound disability, including irreversible loss of sensory and motor functions.”

The ability to monitor signals going to and from the spinal cord could dramatically aid in the development of treatments for spinal injuries, and could also be useful in the nearer term for better monitoring of the spinal cord during surgery.

“Most technologies for monitoring or stimulating the spinal cord only interact with motor neurons along the back, or dorsal, part of the spinal cord,” said Dr Damiano Barone from the Department of Clinical Neurosciences, who co-led the research. “These approaches can only reach between 20 and 30 percent of the spine, so you’re getting an incomplete picture.”

By taking their inspiration from microelectronics, the researchers developed a way to gain information from the whole spine, by wrapping very thin, high-resolution implants around the spinal cord’s circumference. This is the first time that safe 360-degree recording of the spinal cord has been possible – earlier approaches for 360-degree monitoring use electrodes that pierce the spine, which can cause spinal injury.

The Cambridge-developed biocompatible devices – just a few millionths of a metre thick – are made using advanced photolithography and thin film deposition techniques, and require minimal power to function.

The devices intercept the signals travelling on the axons, or nerve fibres, of the spinal cord, allowing the signals to be recorded. The thinness of the devices means they can record the signals without causing any damage to the nerves, since they do not penetrate the spinal cord itself.

“It was a difficult process, because we haven’t made spinal implants in this way before, and it wasn’t clear that we could safely and successfully place them around the spine,” said Malliaras. “But because of recent advances in both engineering and neurosurgery, the planets have aligned and we’ve made major progress in this important area.”

The devices were implanted using an adaptation of a routine surgical procedure, so they could be slid under the spinal cord without damaging it. In tests using rat models, the researchers successfully used the devices to stimulate limb movement. The devices showed very low latency – that is, their reaction time was close to that of human reflexive movement. Further tests in human cadaver models showed that the devices can be successfully placed in humans.

The researchers say their approach could change how spinal injuries are treated in future. Current attempts to treat spinal injuries involve both brain and spinal implants, but the Cambridge researchers say the brain implants may not be necessary.

“If someone has a spinal injury, their brain is fine, but it’s the connection that’s been interrupted,” said Barone. “As a surgeon, you want to go where the problem is, so adding brain surgery on top of spinal surgery just increases the risk to the patient. We can collect all the information we need from the spinal cord in a far less invasive way, so this would be a much safer approach for treating spinal injuries.”

While a treatment for spinal injuries is still years away, in the nearer term, the devices could be useful for researchers and surgeons to learn more about this vital, but understudied, part of human anatomy in a non-invasive way. The Cambridge researchers are currently planning to use the devices to monitor nerve activity in the spinal cord during surgery.

“It’s been almost impossible to study the whole of the spinal cord directly in a human, because it’s so delicate and complex,” said Barone. “Monitoring during surgery will help us to understand the spinal cord better without damaging it, which in turn will help us develop better therapies for conditions like chronic pain, hypertension or inflammation. This approach shows enormous potential for helping patients.”

The research was supported in part by the Royal College of Surgeons, the Academy of Medical Sciences, Health Education England, the National Institute for Health Research, MRC Confidence in Concept, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

 

Reference:
Ben J Woodington, Jiang Lei et al. ‘Flexible Circumferential Bioelectronics to Enable 360-degree Recording and Stimulation of the Spinal Cord.’ Science Advances (2024). DOI: 10.1126/sciadv.adl1230

A tiny, flexible electronic device that wraps around the spinal cord could represent a new approach to the treatment of spinal injuries, which can cause profound disability and paralysis.

Because of recent advances in both engineering and neurosurgery, the planets have aligned and we’ve made major progress in this important area
George Malliaras
SEBASTIAN KAULITZKI/SCIENCE PHOTO LIBRARY
Illustration of spinal cord


New vaccine effective against coronaviruses that haven’t even emerged yet

Syringe and vaccine bottle

This is a new approach to vaccine development called ‘proactive vaccinology’, where scientists build a vaccine before the disease-causing pathogen even emerges.

The new vaccine works by training the body’s immune system to recognise specific regions of eight different coronaviruses, including SARS-CoV-2 and several that are currently circulating in bats and have potential to jump to humans and cause a pandemic.

Key to its effectiveness is that the specific virus regions the vaccine targets also appear in many related coronaviruses. By training the immune system to attack these regions, it gives protection against other coronaviruses not represented in the vaccine – including ones that haven’t even been identified yet.

For example, the new vaccine does not include the SARS-CoV-1 coronavirus, which caused the 2003 SARS outbreak, yet it still induces an immune response to that virus.

“Our focus is to create a vaccine that will protect us against the next coronavirus pandemic, and have it ready before the pandemic has even started,” said Rory Hills, a graduate researcher in the University of Cambridge’s Department of Pharmacology and first author of the report.

He added: “We’ve created a vaccine that provides protection against a broad range of different coronaviruses – including ones we don’t even know about yet.”

The results are published today in the journal Nature Nanotechnology.

“We don’t have to wait for new coronaviruses to emerge. We know enough about coronaviruses, and different immune responses to them, that we can get going with building protective vaccines against unknown coronaviruses now,” said Professor Mark Howarth in the University of Cambridge’s Department of Pharmacology, senior author of the report.

He added: “Scientists did a great job in quickly producing an extremely effective COVID vaccine during the last pandemic, but the world still had a massive crisis with a huge number of deaths. We need to work out how we can do even better than that in the future, and a powerful component of that is starting to build the vaccines in advance.”

 

 

The new ‘Quartet Nanocage’ vaccine is based on a structure called a nanoparticle – a ball of proteins held together by incredibly strong interactions. Chains of different viral antigens are attached to this nanoparticle using a novel ‘protein superglue’. Multiple antigens are included in these chains, which trains the immune system to target specific regions shared across a broad range of coronaviruses.

This study demonstrated that the new vaccine raises a broad immune response, even in mice that were pre-immunised with SARS-CoV-2.

The new vaccine is much simpler in design than other broadly protective vaccines currently in development, which the researchers say should accelerate its route into clinical trials.

The underlying technology they have developed also has potential for use in vaccine development to protect against many other health challenges.

The work involved a collaboration between scientists at the University of Cambridge, the University of Oxford, and Caltech. It improves on previous work by the Oxford and Caltech groups to develop a novel all-in-one vaccine against coronavirus threats. The vaccine developed by Oxford and Caltech should enter Phase 1 clinical trials in early 2025, but its complex nature makes it challenging to manufacture, which could limit large-scale production.

Conventional vaccines include a single antigen to train the immune system to target a single specific virus. This may not protect against a diverse range of existing coronaviruses, or against pathogens that are newly emerging.

The research was funded by the Biotechnology and Biological Sciences Research Council.

Reference: Hills, R A et al: ‘Proactive vaccination using multiviral Quartet Nanocages to elicit broad anti-coronavirus responses.’ Nature Nanotechnology, May 2024. DOI: 10.1038/s41565-024-01655-9

Researchers have developed a new vaccine technology that has been shown in mice to provide protection against a broad range of coronaviruses with potential for future disease outbreaks - including ones we don’t even know about yet.

Our focus is to create a vaccine that will protect us against the next coronavirus pandemic, and have it ready before the pandemic has even started.
Rory Hills
Stefan Cristian Cioata on Getty
Syringe and vaccine bottle


Ice shelves fracture under weight of meltwater lakes

Ali Banwell and Laura Stevens installing the time-lapse camera used in this study on the George VI Ice Shelf in Antarctica.

When air temperatures in Antarctica rise and glacier ice melts, water can pool on the surface of floating ice shelves, weighing them down and causing the ice to bend. Now, for the first time in the field, researchers have shown that ice shelves don’t just buckle under the weight of meltwater lakes — they fracture.

As the climate warms and melt rates in Antarctica increase, this fracturing could cause vulnerable ice shelves to collapse, allowing inland glacier ice to spill into the ocean and contribute to sea level rise.

Ice shelves are important for the Antarctic Ice Sheet’s overall health as they act to buttress or hold back the glacier ice on land. Scientists have predicted and modelled that surface meltwater loading could cause ice shelves to fracture, but no one had observed the process in the field, until now.

The new study, published in the Journal of Glaciology, may help explain how the Larsen B Ice Shelf abruptly collapsed in 2002. In the months before its catastrophic breakup, thousands of meltwater lakes littered the ice shelf’s surface, which then drained over just a few weeks.

To investigate the impacts of surface meltwater on ice shelf stability, a research team led by the University of Colorado Boulder, and including researchers from the University of Cambridge, travelled to the George VI Ice Shelf on the Antarctic Peninsula in November 2019.

First, the team identified a depression, or ‘doline’, in the ice surface that had been formed by a previous lake drainage event, and where they thought meltwater was likely to pool again. Then, they ventured out on snowmobiles, pulling all their science equipment and safety gear behind on sleds.

Around the doline, the team installed high-precision GPS stations to measure small changes in elevation at the ice’s surface, water-pressure sensors to measure lake depth, and a timelapse camera system to capture images of the ice surface and meltwater lakes every 30 minutes.

In 2020, the COVID-19 pandemic brought their fieldwork to a screeching halt. When the team finally made it back to their field site in November 2021, only two GPS sensors and one timelapse camera remained; the two other GPS stations and all of the water-pressure sensors had been flooded and buried in solid ice. Fortunately, the surviving instruments captured the vertical and horizontal movement of the ice’s surface, and images of the meltwater lake that formed and drained during the record-high 2019/2020 melt season.

GPS data indicated that the ice in the centre of the lake basin flexed downward about a foot in response to the increased weight from meltwater. That finding builds upon previous work that produced the first direct field measurements of ice shelf buckling caused by meltwater ponding and drainage.

The team also found that the horizontal distance between the edge and centre of the meltwater lake basin increased by over a foot. This was most likely due to the formation and/or widening of circular fractures around the meltwater lake, which the timelapse imagery captured. Their results provide the first field-based evidence of ice shelf fracturing in response to a surface meltwater lake weighing down the ice.

“This is an exciting discovery,” said lead author Alison Banwell, from the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado Boulder. “We believe these types of circular fractures were key in the chain reaction style lake drainage process that helped to break up the Larsen B Ice Shelf.”

“While these measurements were made over a small area, they demonstrate that bending and breaking of floating ice due to surface water may be more widespread than previously thought,” said co-author Dr Rebecca Dell from Cambridge’s Scott Polar Research Institute (SPRI). “As melting increases in response to predicted warming, ice shelves may become more prone to break up and collapse than they are currently.”

“This has implications for sea level as the buttressing of inland ice is reduced or removed, allowing the glaciers and ice streams to flow more rapidly into the ocean,” said co-author Professor Ian Willis, also from SPRI.

The work supports modelling results that show the immense weight of thousands of meltwater lakes and subsequent draining caused the Larsen B Ice Shelf to bend and break, contributing to its collapse.

“These observations are important because they can be used to improve models to better predict which Antarctic ice shelves are more vulnerable and most susceptible to collapse in the future,” Banwell said.

The research was funded by the U.S. National Science Foundation (NSF) and the Natural Environment Research Council (NERC), part of UK Research and Innovation (UKRI). The team also included researchers from the University of Oxford and the University of Chicago. Rebecca Dell is a Fellow of Trinity Hall, Cambridge. 

Reference:
Alison F Banwell et al. ‘Observed meltwater-induced flexure and fracture at a doline on George VI Ice Shelf, Antarctica.’ Journal of Glaciology (2024). DOI: 10.1017/jog.2024.31

Adapted from a CIRES press release.

Heavy pooling meltwater can fracture ice, potentially leading to ice shelf collapse

Ian Willis
Ali Banwell and Laura Stevens installing the time-lapse camera used in this study on the George VI Ice Shelf in Antarctica.


Robotic nerve ‘cuffs’ could help treat a range of neurological conditions

Illustration of the human nervous system

The researchers, from the University of Cambridge, combined flexible electronics and soft robotics techniques to develop the devices, which could be used for the diagnosis and treatment of a range of disorders, including epilepsy and chronic pain, or the control of prosthetic limbs.

Current tools for interfacing with the peripheral nerves – the 43 pairs of motor and sensory nerves that connect the brain and spinal cord to the rest of the body – are outdated, bulky and carry a high risk of nerve injury. However, the robotic nerve ‘cuffs’ developed by the Cambridge team are sensitive enough to grasp or wrap around delicate nerve fibres without causing any damage.

Tests of the nerve cuffs in rats showed that the devices only require tiny voltages to change shape in a controlled way, forming a self-closing loop around nerves without the need for surgical sutures or glues.

The researchers say the combination of soft electrical actuators with neurotechnology could be an answer to minimally invasive monitoring and treatment for a range of neurological conditions. The results are reported in the journal Nature Materials.

Electric nerve implants can be used to either stimulate or block signals in target nerves. For example, they might help relieve pain by blocking pain signals, or they could be used to restore movement in paralysed limbs by sending electrical signals to the nerves. Nerve monitoring is also standard surgical procedure when operating in areas of the body containing a high concentration of nerve fibres, such as anywhere near the spinal cord.

These implants allow direct access to nerve fibres, but they come with certain risks. “Nerve implants come with a high risk of nerve injury,” said Professor George Malliaras from Cambridge’s Department of Engineering, who led the research. “Nerves are small and highly delicate, so anytime you put something large, like an electrode, in contact with them, it represents a danger to the nerves.”

“Nerve cuffs that wrap around nerves are the least invasive implants currently available, but despite this they are still too bulky, stiff and difficult to implant, requiring significant handling and potential trauma to the nerve,” said co-author Dr Damiano Barone from Cambridge’s Department of Clinical Neurosciences.

The researchers designed a new type of nerve cuff made from conducting polymers, normally used in soft robotics. The ultra-thin cuffs are engineered in two separate layers. Applying tiny amounts of electricity – just a few hundred millivolts – causes the devices to swell or shrink.

The cuffs are small enough that they could be rolled up into a needle and injected near the target nerve. When activated electrically, the cuffs will change their shape to wrap around the nerve, allowing nerve activity to be monitored or altered.

“To ensure the safe use of these devices inside the body, we have managed to reduce the voltage required for actuation to very low values,” said Dr Chaoqun Dong, the paper’s first author. “What's even more significant is that these cuffs can change shape in both directions and be reprogrammed. This means surgeons can adjust how tightly the device fits around a nerve until they get the best results for recording and stimulating the nerve.”

Tests in rats showed that the cuffs could be successfully placed without surgery, and formed a self-closing loop around the target nerve. The researchers are planning further testing of the devices in animal models, and are hoping to begin testing in humans within the next few years.

“Using this approach, we can reach nerves that are difficult to reach through open surgery, such as the nerves that control pain, vision or hearing, but without the need to implant anything inside the brain,” said Barone. “The ability to place these cuffs so they wrap around the nerves makes this a much easier procedure for surgeons, and it’s less risky for patients.”

“The ability to make an implant that can change shape through electrical activation opens up a range of future possibilities for highly targeted treatments,” said Malliaras. “In future, we might be able to have implants that can move through the body, or even into the brain – it makes you dream how we could use technology to benefit patients in future.”

The research was supported in part by the Swiss National Science Foundation, the Cambridge Trust, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

 

Reference:
Chaoqun Dong et al. ‘Electrochemically actuated microelectrodes for minimally invasive peripheral nerve interfaces.’ Nature Materials (2024). DOI: 10.1038/s41563-024-01886-0

Researchers have developed tiny, flexible devices that can wrap around individual nerve fibres without damaging them.

The ability to make an implant that can change shape through electrical activation opens up a range of future possibilities for highly targeted treatments
George Malliaras
XH4D via iStock / Getty Images Plus
Illustration of the human nervous system


Study highlights increased risk of second cancers among breast cancer survivors

Female doctor standing near woman patient doing breast cancer scan

For the first time, the research has shown that this risk is higher in people living in areas of greater socioeconomic deprivation.

Breast cancer is the most commonly diagnosed cancer in the UK. Around 56,000 people in the UK are diagnosed each year, the vast majority (over 99%) of whom are women. Improvements in earlier diagnosis and in treatments mean that five year survival rates have been increasing over time, reaching 87% by 2017 in England.

People who survive breast cancer are at risk of a second primary cancer, but until now the exact risk has been unclear. Previously published research suggested that women and men who survive breast cancer are at 24% and 27% greater risk, respectively, of a non-breast second primary cancer than the wider population. There have also been suggestions that second primary cancer risks differ by age at breast cancer diagnosis.

To provide more accurate estimates, a team led by researchers at the University of Cambridge analysed data from over 580,000 female and over 3,500 male breast cancer survivors diagnosed between 1995 and 2019 using the National Cancer Registration Dataset. The results of their analysis are published today in Lancet Regional Health – Europe.

First author Isaac Allen from the Department of Public Health and Primary Care at the University of Cambridge said: “It’s important for us to understand to what extent having one type of cancer puts you at risk of a second cancer at a different site. The female and male breast cancer survivors whose data we studied were at increased risk of a number of second cancers. Knowing this can help inform conversations with their care teams to look out for signs of potential new cancers.”

The researchers found significantly increased risks of cancer in the contralateral (that is, unaffected) breast and for endometrium and prostate cancer in females and males, respectively. Females who survived breast cancer were at double the risk of contralateral breast cancer compared to the general population and at 87% greater risk of endometrial cancer, 58% greater risk of myeloid leukaemia and 25% greater risk of ovarian cancer.

Age of diagnosis was important, too – females diagnosed with breast cancer under the age of 50 were 86% more likely to develop a second primary cancer compared to the general population of the same age, whereas women diagnosed after age 50 were at a 17% increased risk. One potential explanation is that a larger number of younger breast cancer survivors may have inherited genetic alterations that increase risk for multiple cancers. For example, women with inherited changes to the BRCA1 and BRCA2 genes are at increased risk of contralateral breast cancer, ovarian and pancreatic cancer.

Females from the most socioeconomically deprived backgrounds were at 35% greater risk of a second primary cancer compared to females from the least deprived backgrounds. These differences were primarily driven by non-breast cancer risks, particularly for lung, kidney, head and neck, bladder, oesophageal and stomach cancers. This may be because smoking, obesity, and alcohol consumption – established risk factors for these cancers – are more common among more deprived groups.

Allen, a PhD student at Clare Hall, added: “This is further evidence of the health inequalities that people from more deprived backgrounds experience. We need to fully understand why they are at greater risk of second cancers so that we can intervene and reduce this risk.”

Male breast cancer survivors were 55 times more likely than the general male population to develop contralateral breast cancer – though the researchers stress that an individual’s risk was still very low. For example, for every 100 men diagnosed with breast cancer at age 50 or over, about three developed contralateral breast cancer during a 25-year period. Male breast cancer survivors were also 58% more likely than the general male population to develop prostate cancer.
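As a rough worked illustration of why such a large relative risk can still mean a small absolute risk (the baseline figure here is purely illustrative and is not taken from the study): breast cancer is rare in men, so multiplying a very small baseline risk by 55 still gives a small number. For a hypothetical 25-year baseline risk of 0.05% in the general male population,

    \text{absolute risk} \approx \text{baseline risk} \times \text{relative risk} \approx 0.05\% \times 55 \approx 3\%,

which is consistent with the roughly 3-in-100 figure quoted above.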

Professor Antonis Antoniou from the Department of Public Health and Primary Care at the University of Cambridge, the study’s senior author, said: “This is the largest study to date to look at the risk in breast cancer survivors of developing a second cancer. We were able to carry this out and calculate more accurate estimates because of the outstanding data sets available to researchers through the NHS.”

The research was funded by Cancer Research UK with support from the National Institute for Health and Care Research Cambridge Biomedical Research Centre.

Cancer Research UK’s senior cancer intelligence manager, Katrina Brown, said: “This study shows us that the risk of second primary cancers is higher in people who have had breast cancer, and this can differ depending on someone’s socioeconomic background. But more research is needed to understand what is driving this difference and how to tackle these health inequalities.”

People who are concerned about their cancer risk should contact their GP for advice. If you or someone close to you have been affected by cancer and you’ve got questions, you can call Cancer Research UK nurses on freephone 0808 800 4040, Monday to Friday.

Reference
Allen, I, et al. Risks of second primary cancers among 584,965 female and male breast cancer survivors in England: a 25-year retrospective cohort study. Lancet Regional Health – Europe; 24 April 2024: DOI: 10.1016/j.lanepe.2024.100903

Survivors of breast cancer are at significantly higher risk of developing second cancers, including endometrial and ovarian cancer for women and prostate cancer for men, according to new research studying data from almost 600,000 patients in England.

It’s important for us to understand to what extent having one type of cancer puts you at risk of a second cancer at a different site. Knowing this can help inform conversations with their care teams to look out for signs of potential new cancers
Isaac Allen
Doctor standing near woman patient doing breast cancer scan


A simple ‘twist’ improves the engine of clean fuel generation

Abstract orange swirls on a black background

The researchers, led by the University of Cambridge, are developing low-cost light-harvesting semiconductors that power devices for converting water into clean hydrogen fuel, using just the power of the sun. These semiconducting materials, known as copper oxides, are cheap, abundant and non-toxic, but their performance does not come close to silicon, which dominates the semiconductor market.

However, the researchers found that by growing the copper oxide crystals in a specific orientation so that electric charges move through the crystals at a diagonal, the charges move much faster and further, greatly improving performance. Tests of a copper oxide light harvester, or photocathode, based on this fabrication technique showed a 70% improvement over existing state-of-the-art oxide photocathodes, while also showing greatly improved stability.

The researchers say their results, reported in the journal Nature, show how low-cost materials could be fine-tuned to power the transition away from fossil fuels and toward clean, sustainable fuels that can be stored and used with existing energy infrastructure.

Copper (I) oxide, or cuprous oxide, has been touted as a cheap potential replacement for silicon for years, since it is reasonably effective at capturing sunlight and converting it into electric charge. However, much of that charge tends to get lost, limiting the material’s performance.

“Like other oxide semiconductors, cuprous oxide has its intrinsic challenges,” said co-first author Dr Linfeng Pan from Cambridge’s Department of Chemical Engineering and Biotechnology. “One of those challenges is the mismatch between how deep light is absorbed and how far the charges travel within the material, so most of the oxide below the top layer of material is essentially dead space.”

“For most solar cell materials, it’s defects on the surface of the material that cause a reduction in performance, but with these oxide materials, it’s the other way round: the surface is largely fine, but something about the bulk leads to losses,” said Professor Sam Stranks, who led the research. “This means the way the crystals are grown is vital to their performance.”

To develop cuprous oxides to the point where they can be a credible contender to established photovoltaic materials, they need to be optimised so they can efficiently generate and move electric charge carriers – electrons and positively charged electron ‘holes’ – when sunlight hits them.

One potential optimisation approach is single-crystal thin films – very thin slices of material with a highly-ordered crystal structure, which are often used in electronics. However, making these films is normally a complex and time-consuming process.

Using thin film deposition techniques, the researchers were able to grow high-quality cuprous oxide films at ambient pressure and room temperature. By precisely controlling growth and flow rates in the chamber, they were able to ‘shift’ the crystals into a particular orientation. Then, using high temporal resolution spectroscopic techniques, they were able to observe how the orientation of the crystals affected how efficiently electric charges moved through the material.

“These crystals are basically cubes, and we found that when the electrons move through the cube at a body diagonal, rather than along the face or edge of the cube, they move an order of magnitude further,” said Pan. “The further the electrons move, the better the performance.”
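For readers unfamiliar with the crystallographic shorthand, these three cube directions are conventionally written with Miller indices – the [111] label in the paper’s title (see the reference below) is the body diagonal. For a cubic unit cell with lattice parameter $a$, the path lengths compare as

    \text{edge } [100]: \; a, \qquad \text{face diagonal } [110]: \; \sqrt{2}\,a, \qquad \text{body diagonal } [111]: \; \sqrt{3}\,a.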

“Something about that diagonal direction in these materials is magic,” said Stranks. “We need to carry out further work to fully understand why and optimise it further, but it has so far resulted in a huge jump in performance.” Tests of a cuprous oxide photocathode made using this technique showed an increase in performance of more than 70% over existing state-of-the-art electrodeposited oxide photocathodes.

“In addition to the improved performance, we found that the orientation makes the films much more stable, but factors beyond the bulk properties may be at play,” said Pan.

The researchers say that much more research and development is still needed, but this and related families of materials could have a vital role in the energy transition.

“There’s still a long way to go, but we’re on an exciting trajectory,” said Stranks. “There’s a lot of interesting science to come from these materials, and it’s interesting for me to connect the physics of these materials with their growth, how they form, and ultimately how they perform.”

The research was a collaboration with École Polytechnique Fédérale de Lausanne, Nankai University and Uppsala University. The research was supported in part by the European Research Council, the Swiss National Science Foundation, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Sam Stranks is Professor of Optoelectronics in the Department of Chemical Engineering and Biotechnology, and a Fellow of Clare College, Cambridge.

 

Reference:
Linfeng Pan, Linjie Dai et al. ‘High carrier mobility along the [111] orientation in Cu2O photoelectrodes.’ Nature (2024). DOI: 10.1038/s41586-024-07273-8

For more information on energy-related research in Cambridge, please visit the Energy IRC, which brings together Cambridge’s research knowledge and expertise, in collaboration with global partners, to create solutions for a sustainable and resilient energy landscape for generations to come. 

Researchers have found a way to super-charge the ‘engine’ of sustainable fuel generation – by giving the materials a little twist.

orange via Getty Images
Abstract orange swirls


Rare disease research at Cambridge receives major boost with launch of two new centres

Woman inhaling from a mask nebulizer

The virtual centres, supported by the charity LifeArc, will focus on areas where there are significant unmet needs. They will tackle barriers that ordinarily prevent new tests and treatments reaching patients with rare diseases and speed up the delivery of rare disease treatment trials.

The centres will bring together leading scientists and rare disease clinical specialists from across the UK for the first time, encouraging new collaborations across different research disciplines and providing improved access to facilities and training.

LifeArc Centre for Rare Mitochondrial Diseases

Professor Patrick Chinnery will lead the LifeArc Centre for Rare Mitochondrial Diseases, a national partnership with the Lily Foundation and Muscular Dystrophy UK, together with key partners at UCL, Newcastle University and three other centres (Oxford, Birmingham and Manchester).

Mitochondrial diseases are genetic disorders affecting 1 in 5,000 people. They often cause progressive damage to the brain, eyes, muscles, heart and liver, leading to severe disability and a shorter life. There is currently no cure for most of these conditions; however, new opportunities to treat mitochondrial diseases have been identified in the last five years, making this a critical time for research development. The £7.5M centre will establish a national platform that will connect patient groups, knowledge and infrastructure in order to accelerate new treatments getting to clinical trial.

Professor Chinnery said: “The new LifeArc centre unites scientific and clinical strengths from across the UK. For the first time we will form a single team, focussed on developing new treatments for mitochondrial diseases which currently have no cure.”

Adam Harraway has Mitochondrial Disease and says he lives in constant fear of what might go wrong next with his condition. “With rare diseases such as these, it can feel like the questions always outweigh the answers. The news of this investment from LifeArc fills me with hope for the future. To know that there are so many wonderful people and organisations working towards treatments and cures makes me feel seen and heard. It gives a voice to people who often have to suffer in silence, and I'm excited to see how this project can help Mito patients in the future."

LifeArc Centre for Rare Respiratory Diseases

Professor Stefan Marciniak will co-lead the LifeArc Centre for Rare Respiratory Diseases, a UK wide collaborative centre co-created in partnership with patients and charities. This Centre is a partnership between Universities and NHS Trusts across the UK, co-led by Edinburgh with Nottingham, Dundee, Cambridge, Southampton, University College London and supported by six other centres (Belfast, Cardiff, Leeds, Leicester, Manchester and Royal Brompton).

For the first time ever, it will provide a single ‘go to’ centre that will connect children and adults with rare respiratory disease with clinical experts, researchers, investors and industry leaders across the UK. The £9.4M centre will create a UK-wide biobank of patient samples and models of disease that will allow researchers to advance pioneering therapies and engage with industry and regulatory partners to develop innovative human clinical studies.

Professor Marciniak said: “There are many rare lung diseases, and together those affected constitute a larger underserved group of patients. The National Translational Centre for Rare Respiratory Diseases brings together expertise from across the UK to find effective treatments and train the next generation of rare disease researchers.”

Former BBC News journalist and presenter, Philippa Thomas, has the rare incurable lung disease, Lymphangioleiomyomatosis (LAM). Her condition has stabilised but for many people, the disease can be severely life-limiting. Philippa said: “There is so little research funding for rare respiratory diseases, that getting treatment - let alone an accurate diagnosis - really does feel like a lottery. It is also terrifying being diagnosed with something your GP will never have heard of.”

Globally, there are more than 300 million people living with rare diseases. However, rare disease research can be fragmented. Researchers can lack access to specialist facilities, as well as advice on regulation, trial designs, preclinical regulatory requirements, and translational project management, which are vital in getting new innovations to patients.

Dr Catriona Crombie, Head of Rare Disease at LifeArc, says: “We’re extremely proud to be launching four new LifeArc Translational Centres for Rare Diseases. Each centre has been awarded funding because it holds real promise for delivering change for people living with rare diseases. These centres also have the potential to create a blueprint for accelerating improvements across other disease areas, including common diseases.”

Adapted from a press release from LifeArc

Cambridge researchers will play key roles in two new centres dedicated to developing improved tests, treatments and potentially cures for thousands of people living with rare medical conditions.

The new LifeArc centre unites scientific and clinical strengths from across the UK
Patrick Chinnery
Woman inhaling from a mask nebulizer


Training AI models to answer ‘what if?’ questions could improve medical treatments

Computer generated image of a human brain

Artificial intelligence techniques can be helpful for multiple medical applications, such as radiology or oncology, where the ability to recognise patterns in large volumes of data is vital. For these types of applications, the AI compares information against learned examples, draws conclusions, and makes extrapolations.

Now, an international team led by researchers from Ludwig-Maximilians-Universität München (LMU) and including researchers from the University of Cambridge, is exploring the potential of a comparatively new branch of AI for diagnostics and therapy.

The researchers found that causal machine learning (ML) can estimate treatment outcomes – and do so better than the machine learning methods generally used to date. Causal machine learning makes it easier for clinicians to personalise treatment strategies and thereby improve the health of individual patients.

The results, reported in the journal Nature Medicine, suggest how causal machine learning could improve the effectiveness and safety of a variety of medical treatments.

Classical machine learning recognises patterns and discovers correlations. However, the principle of cause and effect remains closed to machines as a rule; they cannot address the question of why. When making therapy decisions for a patient, the ‘why’ is vital to achieve the best outcomes.

“Developing machine learning tools to address why and what if questions is empowering for clinicians, because it can strengthen their decision-making processes,” said senior author Professor Mihaela van der Schaar, Director of the Cambridge Centre for AI in Medicine. “But this sort of machine learning is far more complex than assessing personalised risk.”

For example, when attempting to determine therapy decisions for someone at risk of developing diabetes, classical ML would aim to predict how probable it is for a given patient with a range of risk factors to develop the disease. With causal ML, it would be possible to answer how the risk changes if the patient receives an anti-diabetes drug; that is, gauge the effect of a cause. It would also be possible to estimate whether metformin, the commonly-prescribed medication, would be the best treatment, or whether another treatment plan would be better.
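As a rough sketch of that distinction in code – an illustration only, built on scikit-learn with made-up data and hypothetical variable names (X, treated, outcome), and not the method or software described in the Nature Medicine paper – a classical model predicts each patient’s risk, while a simple two-model ‘T-learner’ estimates how that risk would change under treatment:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: X holds risk factors, treated marks who received the drug,
# outcome marks who went on to develop the disease.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
treated = rng.integers(0, 2, size=1000)
outcome = (rng.random(1000) < 0.3 - 0.1 * treated + 0.05 * X[:, 0]).astype(int)

# Classical ML: "How likely is this patient to develop the disease?"
risk_model = LogisticRegression().fit(X, outcome)
risk = risk_model.predict_proba(X)[:, 1]

# Causal ML (T-learner): "How would that risk change if we gave the drug?"
# Fit separate outcome models for treated and untreated patients, then
# take the difference of their predictions for the same patient.
m_treated = LogisticRegression().fit(X[treated == 1], outcome[treated == 1])
m_control = LogisticRegression().fit(X[treated == 0], outcome[treated == 0])
effect = m_treated.predict_proba(X)[:, 1] - m_control.predict_proba(X)[:, 1]
# A negative estimate for a given patient suggests the treatment lowers
# their predicted risk -- the 'what if?' quantity clinicians care about.

Estimates like this are only trustworthy under additional assumptions – for example, that all relevant confounders have been measured – which is one reason the researchers stress close collaboration between AI experts and doctors.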

To be able to estimate the effect of a hypothetical treatment, the AI models must learn to answer ‘what if?’ questions. “We give the machine rules for recognising the causal structure and correctly formalising the problem,” said Professor Stefan Feuerriegel from LMU, who led the research. “Then the machine has to learn to recognise the effects of interventions and understand, so to speak, how real-life consequences are mirrored in the data that has been fed into the computers.”

The researchers hope that even in situations where reliable treatment standards do not yet exist, or where randomised studies are not possible for ethical reasons because they would require a placebo group, machines could still gauge potential treatment outcomes from the available patient data and form hypotheses for possible treatment plans.

With such real-world data, it should generally be possible to describe the patient cohorts with ever greater precision in the estimates, bringing individualised therapy decisions that much closer. Naturally, there would still be the challenge of ensuring the reliability and robustness of the methods.

“The software we need for causal ML methods in medicine doesn’t exist out of the box,” says Feuerriegel. “Rather, complex modelling of the respective problem is required, involving close collaboration between AI experts and doctors.”

In other fields, such as marketing, explained Feuerriegel, work with causal ML has already been in the testing phase for some years now. “Our goal is to bring the methods a step closer to practice,” he said. The paper describes the direction in which things could move over the coming years.

“I have worked in this area for almost 10 years, working relentlessly in our lab with generations of students to crack this problem,” said van der Schaar, who is affiliated with the Departments of Applied Mathematics and Theoretical Physics, Engineering and Medicine. “It’s an extremely challenging area of machine learning, and seeing it come closer to clinical use, where it will empower clinicians and patients alike, is very satisfying.”

Van der Schaar is continuing to work closely with clinicians to validate these tools in diverse clinical settings, including transplantation, cancer and cardiovascular disease.

Reference:
Stefan Feuerriegel et al. ‘Causal machine learning for predicting treatment outcomes.’ Nature Medicine (2024). DOI: 10.1038/s41591-024-02902-1

Adapted from an LMU media release.

Machines can learn not only to make predictions, but also to handle causal relationships. An international research team shows how this could make medical treatments safer, more efficient, and more personalised.

Yuichiro Chino via Getty Images
Computer-generated image of human brain

Mess is best: disordered structure of battery-like devices improves performance

Left to right: Clare Grey, Xinyu Liu, Alex Forse

Researchers led by the University of Cambridge used experimental and computer modelling techniques to study the porous carbon electrodes used in supercapacitors. They found that electrodes with a more disordered chemical structure stored far more energy than electrodes with a highly ordered structure.

Supercapacitors are a key technology for the energy transition and could be useful for certain forms of public transport, as well as for managing intermittent solar and wind energy generation, but their adoption has been limited by poor energy density.

The researchers say their results, reported in the journal Science, represent a breakthrough in the field and could reinvigorate the development of this important net-zero technology.

Like batteries, supercapacitors store energy, but supercapacitors can charge in seconds or a few minutes, while batteries take much longer. Supercapacitors are far more durable than batteries, and can last for millions of charge cycles. However, the low energy density of supercapacitors makes them unsuitable for delivering long-term energy storage or continuous power.

“Supercapacitors are a complementary technology to batteries, rather than a replacement,” said Dr Alex Forse from Cambridge’s Yusuf Hamied Department of Chemistry, who led the research. “Their durability and extremely fast charging capabilities make them useful for a wide range of applications.”

A bus, train or metro powered by supercapacitors, for example, could fully charge in the time it takes to let passengers off and on, providing it with enough power to reach the next stop. This would eliminate the need to install any charging infrastructure along the line. However, before supercapacitors are put into widespread use, their energy storage capacity needs to be improved.

While a battery uses chemical reactions to store and release charge, a supercapacitor relies on the movement of charged molecules between porous carbon electrodes, which have a highly disordered structure. “Think of a sheet of graphene, which has a highly ordered chemical structure,” said Forse. “If you scrunch up that sheet of graphene into a ball, you have a disordered mess, which is sort of like the electrode in a supercapacitor.”

Because of the inherent messiness of the electrodes, it’s been difficult for scientists to study them and determine which parameters are the most important when attempting to improve performance. This lack of clear consensus has led to the field getting a bit stuck.

Many scientists have thought that the size of the tiny holes, or nanopores, in the carbon electrodes was the key to improved energy capacity. However, the Cambridge team analysed a series of commercially available nanoporous carbon electrodes and found there was no link between pore size and storage capacity.

Forse and his colleagues took a new approach and used nuclear magnetic resonance (NMR) spectroscopy – a sort of ‘MRI’ for batteries – to study the electrode materials. They found that the messiness of the materials – long thought to be a hindrance – was the key to their success.

“Using NMR spectroscopy, we found that energy storage capacity correlates with how disordered the materials are – the more disordered materials can store more energy,” said first author Xinyu Liu, a PhD candidate co-supervised by Forse and Professor Dame Clare Grey. “Messiness is hard to measure – it’s only possible thanks to new NMR and simulation techniques, which is why messiness is a characteristic that’s been overlooked in this field.”

When analysing the electrode materials with NMR spectroscopy, a spectrum with different peaks and valleys is produced. The position of the peak indicates how ordered or disordered the carbon is. “It wasn’t our plan to look for this, it was a big surprise,” said Forse. “When we plotted the position of the peak against energy capacity, a striking correlation came through – the most disordered materials had a capacity almost double that of the most ordered materials.”
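
For readers who want to see the shape of that analysis, the sketch below correlates NMR peak positions with measured capacitances across a set of electrode materials. It is a generic correlation-and-fit calculation, assuming one peak position and one capacitance value per material; it is not the authors’ code or data.

```python
# A minimal sketch: correlate an NMR peak position (a proxy for structural
# disorder) with measured capacitance across a set of electrode materials.
import numpy as np
from scipy import stats

def disorder_capacity_correlation(peak_position_ppm, capacitance_f_per_g):
    """Return the Pearson correlation, its p-value and a linear fit."""
    peak = np.asarray(peak_position_ppm, dtype=float)
    cap = np.asarray(capacitance_f_per_g, dtype=float)
    r, p = stats.pearsonr(peak, cap)             # strength of the correlation
    slope, intercept = np.polyfit(peak, cap, 1)  # capacitance vs peak position
    return r, p, (slope, intercept)
```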

So why is mess good? Forse says that’s the next thing the team is working on. More disordered carbons store ions more efficiently in their nanopores, and the team hope to use these results to design better supercapacitors. The messiness of the materials is determined at the point they are synthesised.

“We want to look at new ways of making these materials, to see how far messiness can take you in terms of improving energy storage,” said Forse. “It could be a turning point for a field that’s been stuck for a little while. Clare and I started working on this topic over a decade ago, and it’s exciting to see a lot of our previous fundamental work now having a clear application.”

The research was supported in part by the Cambridge Trusts, the European Research Council, and UK Research and Innovation (UKRI).

Reference:
Xinyu Liu et al. ‘Structural disorder determines capacitance in nanoporous carbons.’ Science (2024). DOI: 10.1126/science.adn6242

For more information on energy-related research in Cambridge, please visit the Energy IRC, which brings together Cambridge’s research knowledge and expertise, in collaboration with global partners, to create solutions for a sustainable and resilient energy landscape for generations to come. 

The energy density of supercapacitors – battery-like devices that can charge in seconds or a few minutes – can be improved by increasing the ‘messiness’ of their internal structure.

This could be a turning point for a field that’s been stuck for a little while.
Alex Forse
Nathan Pitt
Left to right: Clare Grey, Xinyu Liu, Alex Forse

Steven Barrett appointed Regius Professor of Engineering

Steven Barrett

Professor Steven Barrett has been appointed Regius Professor of Engineering at the University of Cambridge, effective 1 June. He joins the University from the Massachusetts Institute of Technology (MIT), where he is head of the Department of Aeronautics and Astronautics (AeroAstro).

Barrett’s appointment marks his return to Cambridge, where he was an undergraduate at Pembroke College, and received his PhD. He was a Lecturer in the Department of Engineering from 2008 until 2010, when he joined the faculty at MIT.

The Regius Professorships are royal academic titles created by the monarch. The Regius Professorship in Engineering was announced in 2011, in honour of HRH Prince Philip, The Duke of Edinburgh’s 35 years as Chancellor of the University.

“It’s a pleasure to welcome Steven back to Cambridge to take up one of the University’s most prestigious roles,” said Vice-Chancellor Professor Deborah Prentice. “His work on sustainable aviation will build on Cambridge’s existing strengths, and will help us develop the solutions we need to address the threat posed by climate change.”

Barrett’s research focuses on the impact aviation has on the environment. He has developed a number of solutions to mitigate the impact aviation has on air quality, climate, and noise pollution. The overall goal of his research is to help develop technologies that eliminate the environmental impact of aviation. His work on the first-ever plane with no moving propulsion parts was named one of the 10 Breakthroughs of 2018 by Physics World.

“This is an exciting time to work on sustainable aviation, and Cambridge, as well as the UK more generally, is a wonderful platform to advance that,” said Barrett. “Cambridge’s multidisciplinary Department of Engineering, as well as the platform that the Regius Professorship provides, makes this a great opportunity. I’ve learned a lot at MIT, but I’d always hoped to come back to Cambridge at some point.”

Much of Barrett’s research focuses on the elimination of contrails, line-shaped clouds produced by aircraft engine exhaust in cold and humid conditions. Contrails cause half of all aviation-related global warming – more than the entirety of the UK economy. Barrett uses a combination of satellite observation and machine learning techniques to help determine whether avoiding certain regions of airspace could reduce or eliminate contrail formation.

“It will take several years to make this work, but if it does, it could drastically reduce emissions at a very low cost to the consumer,” said Barrett. “We could make the UK the first ‘Blue Skies’ country in the world – the first without any contrails in the sky.”

“Steven’s pioneering work on contrail formation and avoidance is a key element in reducing the environmental impact of aviation, and will strengthen the UK’s position as a world leader in this area,” said Professor Colm Durkan, Head of Cambridge’s Department of Engineering. “Together with Steven’s work on alternative aviation propulsion systems, this will strengthen Cambridge’s vision of helping us all achieve net zero at an accelerated rate.”

In addition to the Professorship in Engineering, there are seven other Regius Professorships at Cambridge: Divinity, Hebrew, Greek, Civil Law and Physic (all founded by Henry VIII in 1540), History (founded by George I in 1724) and Botany (founded in 2009, to mark the University’s 800th anniversary).

An expert on the environmental impacts of aviation, Barrett joins the University of Cambridge from MIT.

It’s a pleasure to welcome Steven back to Cambridge to take up one of the University’s most prestigious roles
Vice-Chancellor Professor Deborah Prentice
MIT
Steven Barrett

Artificial intelligence beats doctors in accurately assessing eye problems

close up of an eye

The clinical knowledge and reasoning skills of GPT-4 are approaching the level of specialist eye doctors, a study led by the University of Cambridge has found.

GPT-4 - a ‘large language model’ - was tested against doctors at different stages in their careers, including unspecialised junior doctors, and trainee and expert eye doctors. Each was presented with a series of 87 patient scenarios involving a specific eye problem, and asked to give a diagnosis or advise on treatment by selecting from four options.

GPT-4 scored significantly better in the test than unspecialised junior doctors, who are comparable to general practitioners in their level of specialist eye knowledge.

GPT-4 gained similar scores to trainee and expert eye doctors - although the top performing doctors scored higher.

The researchers say that large language models aren’t likely to replace healthcare professionals, but have the potential to improve healthcare as part of the clinical workflow.

They say state-of-the-art large language models like GPT-4 could be useful for providing eye-related advice, diagnosis, and management suggestions in well-controlled contexts, like triaging patients, or where access to specialist healthcare professionals is limited.

“We could realistically deploy AI in triaging patients with eye issues to decide which cases are emergencies that need to be seen by a specialist immediately, which can be seen by a GP, and which don’t need treatment,” said Dr Arun Thirunavukarasu, lead author of the study, which he carried out while a student at the University of Cambridge’s School of Clinical Medicine.

He added: “The models could follow clear algorithms already in use, and we’ve found that GPT-4 is as good as expert clinicians at processing eye symptoms and signs to answer more complicated questions.

“With further development, large language models could also advise GPs who are struggling to get prompt advice from eye doctors. People in the UK are waiting longer than ever for eye care.”

Large volumes of clinical text are needed to help fine-tune and develop these models, and work is ongoing around the world to facilitate this.

The researchers say that their study is superior to similar previous studies because they compared the abilities of AI to practising doctors, rather than to sets of examination results.

“Doctors aren't revising for exams for their whole career. We wanted to see how AI fared when pitted against the on-the-spot knowledge and abilities of practising doctors, to provide a fair comparison,” said Thirunavukarasu, who is now an Academic Foundation Doctor at Oxford University Hospitals NHS Foundation Trust.

He added: “We also need to characterise the capabilities and limitations of commercially available models, as patients may already be using them - rather than the internet - for advice.”

The test included questions about a huge range of eye problems, including extreme light sensitivity, decreased vision, lesions, itchy and painful eyes, taken from a textbook used to test trainee eye doctors. This textbook is not freely available on the internet, making it unlikely that its content was included in GPT-4’s training datasets.

The results are published today in the journal PLOS Digital Health.

“Even taking the future use of AI into account, I think doctors will continue to be in charge of patient care. The most important thing is to empower patients to decide whether they want computer systems to be involved or not. That will be an individual decision for each patient to make,” said Thirunavukarasu.

GPT-4 and GPT-3.5 – or ‘Generative Pre-trained Transformers’ - are trained on datasets containing hundreds of billions of words from articles, books, and other internet sources. These are two examples of large language models; others in wide use include Pathways Language Model 2 (PaLM 2) and Large Language Model Meta AI 2 (LLaMA 2).

The study also tested GPT-3.5, PaLM 2 and LLaMA with the same set of questions. GPT-4 gave more accurate responses than all of them.

GPT-4 powers the online chatbot ChatGPT to provide bespoke responses to human queries. In recent months, ChatGPT has attracted significant attention in medicine for attaining passing level performance in medical school examinations, and providing more accurate and empathetic messages than human doctors in response to patient queries.

The field of artificially intelligent large language models is moving very rapidly. Since the study was conducted, more advanced models have been released - which may be even closer to the level of expert eye doctors.

Reference: Thirunavukarasu, A J et al: ‘Large language models approach expert-level clinical knowledge and reasoning in ophthalmology: A head-to-head cross-sectional study.’ PLOS Digital Health, April 2024. DOI: 10.1371/journal.pdig.0000341

A study has found that the AI model GPT-4 significantly exceeds the ability of non-specialist doctors to assess eye problems and provide advice.

We could realistically deploy AI in triaging patients with eye issues to decide which cases are emergencies.
Arun Thirunavukarasu
Mavocado on Getty

AI speeds up drug design for Parkinson’s ten-fold

Professor Michele Vendruscolo wearing a white lab coat

The researchers, from the University of Cambridge, designed and used an AI-based strategy to identify compounds that block the clumping, or aggregation, of alpha-synuclein, the protein that characterises Parkinson’s.

The team used machine learning techniques to quickly screen a chemical library containing millions of entries, and identified five highly potent compounds for further investigation.

Parkinson’s affects more than six million people worldwide, with that number projected to triple by 2040. No disease-modifying treatments for the condition are currently available. The process of screening large chemical libraries for drug candidates – which needs to happen well before potential treatments can be tested on patients – is enormously time-consuming and expensive, and often unsuccessful.

Using machine learning, the researchers were able to speed up the initial screening process ten-fold, and reduce the cost by a thousand-fold, which could mean that potential treatments for Parkinson’s reach patients much faster. The results are reported in the journal Nature Chemical Biology.

Parkinson’s is the fastest-growing neurological condition worldwide. In the UK, one in 37 people alive today will be diagnosed with Parkinson’s in their lifetime. In addition to motor symptoms, Parkinson’s can also affect the gastrointestinal system, nervous system, sleeping patterns, mood and cognition, and can contribute to a reduced quality of life and significant disability.

Proteins are responsible for important cell processes, but in Parkinson’s, one of these proteins – alpha-synuclein – goes rogue and causes the death of nerve cells. When proteins misfold, they can form abnormal clusters called Lewy bodies, which build up within brain cells, stopping them from functioning properly.

“One route to search for potential treatments for Parkinson’s requires the identification of small molecules that can inhibit the aggregation of alpha-synuclein, which is a protein closely associated with the disease,” said Professor Michele Vendruscolo from the Yusuf Hamied Department of Chemistry, who led the research. “But this is an extremely time-consuming process – just identifying a lead candidate for further testing can take months or even years.”

While there are clinical trials for Parkinson’s currently underway, no disease-modifying drug has been approved, reflecting the inability to directly target the molecular species that cause the disease.

This has been a major obstacle in Parkinson’s research, because of the lack of methods to identify the correct molecular targets and engage with them. This technological gap has severely hampered the development of effective treatments.

The Cambridge team developed a machine learning method in which chemical libraries containing millions of compounds are screened to identify small molecules that bind to the amyloid aggregates and block their proliferation.

A small number of top-ranking compounds were then tested experimentally to select the most potent inhibitors of aggregation. The information gained from these experimental assays was fed back into the machine learning model in an iterative manner, so that after a few iterations, highly potent compounds were identified.

“Instead of screening experimentally, we screen computationally,” said Vendruscolo, who is co-Director of the Centre for Misfolding Diseases. “By using the knowledge we gained from the initial screening with our machine learning model, we were able to train the model to identify the specific regions on these small molecules responsible for binding, then we can re-screen and find more potent molecules.”
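
The loop Vendruscolo describes can be pictured as a simple active-learning cycle: assay a batch of compounds, retrain a model on everything assayed so far, rank the untested library with the model, and pick the next batch. The sketch below shows that cycle in schematic form only; `featurise` and `assay_potency` are hypothetical stand-ins for a molecular fingerprint and the wet-lab aggregation assay, and the study’s actual pipeline is considerably more sophisticated.

```python
# Schematic sketch of an iterative screen-and-learn loop, not the authors'
# pipeline. `featurise` and `assay_potency` are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def iterative_screen(library_smiles, featurise, assay_potency,
                     n_rounds=4, batch_size=50):
    """Alternate between model-ranked selection and experimental feedback."""
    features = np.array([featurise(s) for s in library_smiles])
    tested_idx, potencies = [], []

    # Round 0: seed the model with a small random batch.
    rng = np.random.default_rng(0)
    batch = list(rng.choice(len(library_smiles), batch_size, replace=False))

    for _ in range(n_rounds):
        tested_idx += batch
        potencies += [assay_potency(library_smiles[i]) for i in batch]

        # Retrain on everything assayed so far.
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(features[tested_idx], potencies)

        # Rank the untested compounds and choose the next batch to assay.
        tested = set(tested_idx)
        untested = [i for i in range(len(library_smiles)) if i not in tested]
        scores = model.predict(features[untested])
        batch = [untested[i] for i in np.argsort(scores)[::-1][:batch_size]]

    return sorted(zip(potencies, tested_idx), reverse=True)  # best hits first
```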

Using this method, the Cambridge team developed compounds to target pockets on the surfaces of the aggregates, which are responsible for the exponential proliferation of the aggregates themselves. These compounds are hundreds of times more potent, and far cheaper to develop, than previously reported ones.

“Machine learning is having a real impact on drug discovery – it’s speeding up the whole process of identifying the most promising candidates,” said Vendruscolo. “For us, this means we can start work on multiple drug discovery programmes – instead of just one. So much is possible due to the massive reduction in both time and cost – it’s an exciting time.”

The research was conducted in the Chemistry of Health Laboratory in Cambridge, which was established with the support of the UK Research Partnership Investment Fund (UKRPIF) to promote the translation of academic research into clinical programmes.

Reference:
Robert I Horne et al. ‘Discovery of Potent Inhibitors of α-Synuclein Aggregation Using Structure-Based Iterative Learning.’ Nature Chemical Biology (2024). DOI: 10.1038/s41589-024-01580-x

Researchers have used artificial intelligence techniques to massively accelerate the search for Parkinson’s disease treatments.

Machine learning is having a real impact on drug discovery – it’s speeding up the whole process of identifying the most promising candidates
Michele Vendruscolo
Nathan Pitt
Michele Vendruscolo

Interspecies competition led to even more forms of ancient human – defying evolutionary trends in vertebrates

A cast of the skull of Homo heidelbergensis, one of the hominin species analysed in the latest study.

Climate has long been held responsible for the emergence and extinction of hominin species. In most vertebrates, however, interspecies competition is known to play an important role.

Now, research shows for the first time that competition was fundamental to 'speciation' – the rate at which new species emerge – across five million years of hominin evolution.

The study, published today in Nature Ecology & Evolution, also suggests that the species formation pattern of our own lineage was closer to island-dwelling beetles than other mammals.  

“We have been ignoring the way competition between species has shaped our own evolutionary tree,” said lead author Dr Laura van Holstein, a University of Cambridge biological anthropologist at Clare College. “The effect of climate on hominin species is only part of the story.” 

In other vertebrates, species form to fill ecological “niches” says van Holstein. Take Darwin’s finches: some evolved large beaks for nut-cracking, while others evolved small beaks for feeding on certain insects. When each resource niche gets filled, competition kicks in, so no new finches emerge and extinctions take over.

Van Holstein used Bayesian modelling and phylogenetic analyses to show that, like other vertebrates, most hominin species formed when competition for resources or space was low.

“The pattern we see across many early hominins is similar to all other mammals. Speciation rates increase and then flatline, at which point extinction rates start to increase. This suggests that interspecies competition was a major evolutionary factor.”

However, when van Holstein analysed our own group, Homo, the findings were 'bizarre'.

For the Homo lineage that led to modern humans, evolutionary patterns suggest that competition between species actually resulted in the appearance of even more new species – a complete reversal of the trend seen in almost all other vertebrates.

“The more species of Homo there were, the higher the rate of speciation. So when those niches got filled, something drove even more species to emerge. This is almost unparalleled in evolutionary science.”

The closest comparison she could find was in beetle species that live on islands, where contained ecosystems can produce unusual evolutionary trends.

“The patterns of evolution we see across species of Homo that led directly to modern humans are closer to those of island-dwelling beetles than to other primates, or even any other mammal.”

Recent decades have seen the discovery of several new hominin species, from Australopithecus sediba to Homo floresiensis. Van Holstein created a new database of 'occurrences' in the hominin fossil record: each time an example of a species was found and dated, around 385 in total.

Fossils can be an unreliable measure of species’ lifetimes. “The earliest fossil we find will not be the earliest members of a species,” said van Holstein.

“How well an organism fossilises depends on geology, and on climatic conditions: whether it is hot or dry or damp. With research efforts concentrated in certain parts of the world, we might well have missed younger or older fossils of a species as a result.”

Van Holstein used data modelling to address this problem, factoring in the likely numbers of each species at the beginning and end of their existence, as well as environmental effects on fossilisation, to generate new start and end dates for most known hominin species (17 in total).
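
The underlying logic can be illustrated with a much simpler, classical calculation: the Strauss and Sadler (1989) confidence interval, which extends an observed fossil range to allow for the chance that the true earliest and latest members of a species were never sampled. The sketch below is only a stand-in for the Bayesian modelling used in the study, included to make the reasoning concrete.

```python
# Toy illustration (not the study's method): extend an observed fossil range
# to allow for incomplete sampling, using the classical Strauss & Sadler
# (1989) estimator. Requires at least two dated finds per species.
def range_with_confidence(occurrence_ages, confidence=0.95):
    """Return (estimated origin, estimated extinction) in millions of years,
    extending each observed endpoint at the given one-sided confidence."""
    ages = sorted(occurrence_ages, reverse=True)   # oldest first
    n = len(ages)
    observed_range = ages[0] - ages[-1]
    # Fractional extension of the observed range beyond each endpoint.
    alpha = (1 - confidence) ** (-1 / (n - 1)) - 1
    return ages[0] + alpha * observed_range, ages[-1] - alpha * observed_range

# e.g. range_with_confidence([3.9, 3.7, 3.5, 3.2, 3.0]) pushes the estimated
# origin earlier and the estimated extinction later than the raw finds.
```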

She found that some species thought to have evolved through 'anagenesis' – when one slowly turns into another, but lineage doesn’t split – may have actually 'budded': when a new species branches off from an existing one.

For example, the hominin species Australopithecus afarensis was believed to have speciated via anagenesis from Australopithecus anamensis. However, the new data modelling suggests they overlapped by around half a million years.  

This meant that several more hominin species than previously assumed were co-existing, and so possibly competing.

While early species of hominins, such as Paranthropus, probably evolved physiologically to expand their niche – adapting teeth to exploit new types of food, for example – the driver of the very different pattern in our own genus Homo may well have been technology.

“Adoption of stone tools or fire, or intensive hunting techniques, are extremely flexible behaviours. A species that can harness them can quickly carve out new niches, and doesn’t have to survive vast tracts of time while evolving new body plans,” said van Holstein.

She argues that an ability to use technology to generalise, and rapidly go beyond ecological niches that force other species to compete for habitat and resources, may be behind the exponential increase in the number of Homo species detected by the latest study.

But it also led to Homo sapiens – the ultimate generalisers. And competition with an extremely flexible generalist in almost every ecological niche may be what contributed to the extinction of all other Homo species.

Added van Holstein: “These results show that, although it has been conventionally ignored, competition played an important role in human evolution overall. Perhaps most interestingly, in our own genus it played a role unlike that across any other vertebrate lineage known so far.”

Competition between species played a major role in the rise and fall of hominins, and produced a “bizarre” evolutionary pattern for the Homo lineage.

This is almost unparalleled in evolutionary science
Laura van Holstein
The Duckworth Laboratory
A cast of the skull of Homo heidelbergensis, one of the hominin species analysed in the latest study.

Is Democracy Dying?

Palace of Westminster, London.

2024 is the year of elections. A record number of elections will take place, with half the adult population of the world – some two billion people – having the chance to vote. Is this a milestone to be celebrated in our democratic history or are we at a crossroads where the fate of liberal democracy hangs in the balance?

Against a backdrop of polarising populist movements, the erosion of trust in traditional institutions and a decline of democratic norms, we asked: is democracy dying? Is the election of populists an expression of democracy or a breakdown of democracy? How resilient are our democratic institutions in the face of unprecedented challenges? Is the tension between liberalism and democracy ultimately too great to resolve?

We addressed these questions in our second Vice-Chancellor’s Dialogues, hosted by Vice-Chancellor Professor Deborah Prentice on 24 April 2024.

Our speakers

  • David Goodhart, founding editor of Prospect magazine and Head of the Demography, Immigration and Integration unit at the think tank Policy Exchange. He is the author of The Road to Somewhere: The Populist Revolt and the Future of Politics.
  • Nabila Ramdani, award-winning journalist, broadcaster and academic. She is the author of Fixing France: How to Repair a Broken Republic.
  • Helen Thompson, Professor of Political Economy at the University. She is author of Disorder: Hard Times in the 21st Century.

The discussion was chaired by Roger Mosey, Master of Selwyn College and former Editorial Director of the BBC.

The Vice-Chancellor’s Dialogues

There are two purposes to these events. The first is to establish whether there is any common ground between people who may seem to be far apart. If we are to make progress in legislation or in understanding the world we live in, we need to identify where we agree as well as where we disagree. The second is to ensure discussions involve the widest range of viewpoints – that nothing, within the law, is taboo and that freedom of speech and of thought, and of academic debate, is upheld.

Watch our first event on whether assisted dying is compassionate, or dangerous for society.

On 24 April 2024, the second Vice-Chancellor’s Dialogues event grappled with the question: 'Is Democracy Dying?' The event is part of a series of dialogues about some of the most difficult issues of our time.

Study unpicks why childhood maltreatment continues to impact on mental and physical health into adulthood

Black and white image of boy curled up on the floor

Individuals who experienced maltreatment in childhood – such as emotional, physical and sexual abuse, or emotional and physical neglect – are more likely to develop mental illness throughout their entire life, but it is not yet well understood why this risk persists many decades after maltreatment first took place.

In a study published in Proceedings of the National Academy of Sciences, scientists from the University of Cambridge and Leiden University found that the adult brain continues to be affected by childhood maltreatment because these experiences make individuals more likely to experience obesity, inflammation and traumatic events in adulthood. All of these are risk factors for poor health and wellbeing, and they in turn affect brain structure and therefore brain health.

The researchers examined MRI brain scans from approximately 21,000 adult participants aged 40 to 70 years in UK Biobank, as well as information on body mass index (an indicator of metabolic health), CRP (a blood marker of inflammation) and experiences of childhood maltreatment and adult trauma.

Sofia Orellana, a PhD student at the Department of Psychiatry and Darwin College, University of Cambridge, said: “We’ve known for some time that people who experience abuse or neglect as a child can continue to experience mental health problems long into adulthood and that their experiences can also cause long term problems for the brain, the immune system and the metabolic system, which ultimately controls the health of your heart or your propensity to diabetes for instance. What hasn’t been clear is how all these effects interact or reinforce each other.”

Using a type of statistical modelling that allowed them to determine how these interactions work, the researchers confirmed that experiencing childhood maltreatment made individuals more likely to have an increased body mass index (or obesity) and experience greater rates of trauma in adulthood. Individuals with a history of maltreatment tended to show signs of dysfunction in their immune systems, and the researchers showed that this dysfunction is the product of obesity and repeated exposure to traumatic events.

Next, the researchers expanded their models to include MRI measures of the adults’ brains and were able to show that widespread increases and decreases in brain thickness and volume associated with greater body mass index, inflammation and trauma were attributable to childhood maltreatment having made these factors more likely in the first place. These changes in brain structure likely mean that some form of physical damage is occurring to brain cells, affecting how they function.
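
As a rough illustration of this kind of path modelling, the sketch below estimates how much of the association between maltreatment and a brain measure runs through mediators such as body mass index and inflammation. It assumes linear effects and fully measured confounders, uses made-up variable names, and is far simpler than the structural models actually fitted in the study.

```python
# Toy mediation sketch, not the study's structural models. Variable names
# (maltreatment, bmi, crp, adult_trauma, cortical_thickness) are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

def simple_mediation(df: pd.DataFrame) -> dict:
    """Split maltreatment's link to a brain measure into a direct path and
    an indirect path running through body mass index."""
    # Maltreatment -> mediators
    to_bmi = smf.ols("bmi ~ maltreatment", data=df).fit()
    to_crp = smf.ols("crp ~ maltreatment + bmi + adult_trauma", data=df).fit()
    # Mediators (plus maltreatment) -> brain measure
    to_brain = smf.ols(
        "cortical_thickness ~ maltreatment + bmi + crp + adult_trauma",
        data=df).fit()

    indirect_via_bmi = to_bmi.params["maltreatment"] * to_brain.params["bmi"]
    return {"direct": to_brain.params["maltreatment"],
            "indirect_via_bmi": indirect_via_bmi,
            "models": (to_bmi, to_crp, to_brain)}
```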

Although there is more to do to understand how these effects operate at a cellular level in the brain, the researchers believe that their findings advance our understanding of how adverse events in childhood can contribute to life-long increased risk of brain and mind health disorders.

Professor Ed Bullmore from the Department of Psychiatry, Cambridge, said: “Now that we have a better understanding of why childhood maltreatment has long term effects, we can potentially look for biomarkers – biological red flags – that indicate whether an individual is at increased risk of continuing problems. This could help us target early on those who most need help, and hopefully aid them in breaking this chain of ill health.”

Professor Bullmore is a Fellow at Lucy Cavendish College and an Honorary Fellow at Downing College.

The research was supported by MQ: Transforming Mental Health, the Royal Society, Medical Research Council, National Institute for Health and Care Research (NIHR) Cambridge Biomedical Research Centre, the NIHR Applied Research Collaboration East of England, Girton College and Darwin College.

Reference
Orellana, SC et al. Childhood maltreatment influences adult brain structure through its effects on immune, metabolic and psychosocial factors. PNAS; 9 Apr 2024 ; DOI: 10.1073/pnas.230470412

Childhood maltreatment can continue to have an impact long into adulthood because of how it affects an individual’s risk of poor physical health and traumatic experiences many years later, a new study has found.

We’ve known for some time that people who experience abuse or neglect as a child can continue to experience mental health problems long into adulthood
Sofia Orellana
Black and white image of boy curled up on the floor

Four Cambridge researchers awarded prestigious European Research Council Advanced Grants

Photographs of the four awardees

The European Research Council (ERC) has announced today the award of 255 Advanced Grants to outstanding research leaders across Europe, as part of the EU’s Horizon Europe programme. Four University of Cambridge researchers are amongst those to receive this prestigious and competitive funding.

The University of Cambridge’s grant awardees are:

Professor Albert Guillén i Fàbregas in the Department of Engineering for his project Scaling and Concentration Laws in Information Theory.

Guillén i Fàbregas, who has previously received ERC Starting, Consolidator and Proof of Concept Grants, said: “I am truly delighted with the news that the ERC will continue to fund my research in information theory, which studies the mathematical aspects of data transmission and data compression.

“This project will broaden the theory to study arbitrary scaling laws of the number of messages to transmit or compress." Read more about Professor Guillén i Fàbregas' project here.

Professor Beverley Glover in the Department of Plant Sciences and Director of Cambridge University Botanic Garden, for her project Convergent evolution of floral patterning through alternative optimisation of mechanical parameter space.

Glover said: “This funding will enable us to explore how iridescent colour evolved repeatedly in different flowers. We think it will shed new light on evolution itself, as we think about the development of iridescence structure from a mechanical perspective, focusing on the forces acting as a petal grows and the mechanical properties of the petal tissue.

“It's only possible for me to do this work because of the amazing living collection at Cambridge University Botanic Garden, and I'm thrilled that the ERC is keen to support it."

Professor Ian Henderson in the Department of Plant Sciences for his project Evolution of the Arabidopsis Pancentromere.

Henderson said: “This project seeks to investigate enigmatic regions of the genome called the centromeres, using the model plant Arabidopsis. These regions play a deeply conserved role in cell division yet paradoxically are fast evolving.

“I am highly honoured and excited to be awarded an ERC Advanced grant. The advent of long-read sequencing technology makes addressing these questions timely. The ERC’s long-term support will allow us to capitalise on these advances, build new collaborations, and train postdoctoral researchers.”

Professor Paul Lane in the Department of Archaeology, for his project Landscape Historical Ecology and Archaeology of Ancient Pastoral Societies in Kenya

Lane said: “Pastoralism has been an extraordinarily resilient livelihood strategy across Africa. This project provides an excellent opportunity to reconstruct how East Africa’s pastoralists responded to significant climate change in the past, and to draw lessons from these adaptations for responding to contemporary climate crises in a region that is witnessing heightened water scarcity and loss of access to critically important grazing lands.”

“This project will allow us to utilise the department’s world-leading archaeological science laboratories and expertise to answer crucial questions about past patterns of mobility, dietary diversity, climatic regimes and food security among East African pastoralists over the last fifteen hundred years. This has never been attempted before for this time period.” Read more about Professor Lane's project here.

Professor Anne Ferguson-Smith, Pro-Vice Chancellor for Research at the University of Cambridge said: “Many congratulations to Albert, Beverley, Ian and Paul on receiving these prestigious and highly competitive awards. It is fantastic that their ambitious, cutting-edge research will be supported by the European Research Council, marking them as outstanding European research leaders.

“Now that the UK is an associated country to Horizon Europe I encourage other Cambridge researchers to also consider applying to the ERC and other Horizon Europe programmes.”

President of the European Research Council Professor Maria Leptin said: “Congratulations to the 255 researchers who will receive grants to follow their scientific instinct in this new funding round. I am particularly happy to see more mid-career scientists amongst the Advanced Grant winners this time. I hope that it will encourage more researchers at this career stage to apply for these grants.”

The ERC is the premier European funding organisation for excellent frontier research. The 255 ERC Advanced Grants, totalling €652 million, support cutting-edge research in a wide range of fields from medicine and physics to social sciences and humanities.

The European Commission and the UK Government have reached an agreement on the association of the UK to Horizon Europe, which applies for calls for proposals implementing the 2024 budget and onwards.

The ERC Advanced Grants target established, leading researchers with a proven track record of significant achievements. In recent years, there has been a steady rise in mid-career researchers (12-17 years post-PhD) who have been successful in the Advanced Grants competitions, with 18% securing grants in this latest round.

The funding provides leading senior researchers with the opportunity to pursue ambitious, curiosity-driven projects that could lead to major scientific breakthroughs.

Many congratulations to Albert, Beverley, Ian and Paul... It is fantastic that their ambitious, cutting-edge research will be supported by the European Research Council, marking them as outstanding European research leaders.
Anne Ferguson-Smith

Pork labelling schemes ‘not helpful’ in making informed buying choices, say researchers

Two pigs on a farm

Researchers have evaluated different types of pig farming – including woodland, organic, free range, RSPCA assured and Red Tractor certified – to assess each system’s impact across four areas: land use (representing biodiversity loss), greenhouse gas emissions, antibiotic use and animal welfare. Their study concludes that none of the farm types performed consistently well across all four areas – a finding that has important implications for increasingly climate-conscious consumers, as well as for farmers themselves.

However, there were individual farms that did perform well in all domains, including an indoor Red Tractor farm, an outdoor-bred, indoor-finished RSPCA assured farm, and a fully outdoor woodland farm. “Outliers like these show that trade-offs are not inevitable,” said lead author Dr Harriet Bartlett, Research Associate at the University of Oxford's Smith School of Enterprise and the Environment, who was formerly at the University of Cambridge.

“Somewhat unexpectedly we found that a handful of farms perform far better than average across all four of our environmental and welfare measures,” added senior author Andrew Balmford, Professor of Conservation Science at the University of Cambridge. However, none of the current label or assurance schemes predicted which farms these would be.

“The way we classify farm types and label pork isn’t helpful for making informed decisions when it comes to buying more sustainable meat. Even more importantly, we aren’t rewarding and incentivising the best-performing farmers. Instead of focusing on farm types or practices, we need to focus on meaningful outcomes for people, the planet and the pigs – and assess, and reward farms based on these,” said Bartlett.

The findings also show that common assumptions around food labelling can be misplaced. For instance, organic farming systems, which consumers might see as climate and environmentally friendly, have on average three times the CO2 output per kg of meat of more intensive Red Tractor or RSPCA assured systems, and four times the land use. However, these same systems use on average almost 90% fewer antibiotic medicines, and result in improved animal welfare compared with production from Red Tractor or RSPCA assured systems.

The way we classify livestock farms must be improved, Bartlett says, because livestock production is growing rapidly, especially pork production, which has quadrupled in the past 50 years and already accounts for 9% of greenhouse gas emissions from livestock. Pig farming also uses more antibiotics than any other livestock sector, and 8.5% of all arable land.

“Our findings show that mitigating the environmental impacts of livestock farming isn’t a case of saying which farm type is the best,” said Bartlett. “There is substantial scope for improvement within types, and our current means of classification is not identifying the best farms for the planet and animals overall. Instead, we need to identify farms that successfully limit their impacts across all areas of societal concern, and understand, promote and incentivise their practices.”
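
One simple way to operationalise ‘limiting impacts across all areas’, sketched below, is to flag the farms that sit in the better half of the sample on every outcome measure at once, rather than relying on farm type or label. The column names are illustrative and the median threshold is deliberately crude; the study’s own analysis is more nuanced.

```python
# Crude sketch, not the study's analysis: flag farms in the better half of
# the sample on all four outcome measures (columns are illustrative, and
# each is scaled so that lower values are better).
import pandas as pd

MEASURES = ["land_use", "ghg_emissions", "antibiotic_use", "welfare_penalty"]

def best_all_round_farms(farms: pd.DataFrame) -> pd.DataFrame:
    """Return farms at or below the sample median on every measure."""
    medians = farms[MEASURES].median()
    mask = (farms[MEASURES] <= medians).all(axis=1)
    return farms[mask]
```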

The study reached its conclusions using data from 74 UK and 17 Brazilian breed-to-finish systems, each made up of 1-3 farms and representing the annual production of over 1.2 million pigs. It is published today in the journal Nature Food.

“To the best of our knowledge, our dataset covers by far the largest and most diverse sample of pig production systems examined in any single study,” said Bartlett.

James Wood, Professor of Equine and Farm Animal Science at the University of Cambridge, commented: “This important study identifies a key need to clarify what different farm labels should indicate to consumers; there is a pressing need to extend this work into other farming sectors. It also clearly demonstrates the critical importance that individual farmers play in promoting best practice across all farming systems.”

‘Trade-offs in the externalities of pig production are not inevitable’ was authored by academics at the University of Oxford, University of Cambridge and the University of São Paulo.

The research was funded by the Biotechnology and Biological Sciences Research Council (BBSRC).

Reference: Bartlett, H. et al. ‘Trade-offs in the externalities of pig production are not inevitable.’ Nature Food, April 2024. DOI: 10.1038/s43016-024-00921-2

Adapted from a press release by the University of Oxford.

Farmers don’t have to choose between lowering environmental impact and improving welfare for their pigs, a new study has found: it is possible to do both. But this is not reflected in the current food labelling schemes relied on by consumers.

The way we classify farm types and label pork isn’t helpful for making informed decisions when it comes to buying more sustainable meat.
Harriet Bartlett
Charity Burggraaf/ Getty
Two pigs on a farm

Collections-based research and innovation receives vital investment from Research England

Rhododendron brookianum type specimen from the University of Cambridge Herbarium

The University cares for the country’s highest concentration of internationally important collections outside London, with more than five million works of art, artefacts and specimens. Together, these collections play a fundamental role in delivering the University’s mission to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence. That work encompasses collaboration with and support of world-renowned researchers, game-changing research-led exhibitions, and wide-ranging inclusion and learning programmes that promote wellbeing, creativity and connectivity.

“I’m delighted that Research England has made such a strong statement of support for collections-based research at Cambridge, particularly in a challenging funding landscape,” said Kamal Munir, Pro-Vice-Chancellor for University Community and Engagement.

“The University continues to invest in enhanced research infrastructure and services to generate and enable research that spans the arts and the sciences, including via a Strategic Research Initiative, Collections-Connections-Communities, which provides a convening space for research that benefits our communities. HEMG funding is critical in ensuring our collections support researchers and students across the UK and worldwide, through infrastructure, services, staffing and equitable collaboration.”

This year, the University Herbarium joins the portfolio for the first time, and the Sedgwick Museum of Earth Sciences rejoins it.

Sam Brockington, academic lead for the Herbarium, which was recently awarded Designated status, said: “It’s fantastic to see the University Herbarium receive investment in this way. The Herbarium is the fourth-largest of its kind in the country, and a rich resource that supports a huge range of scientific and humanities research. Research supported by the Herbarium ranges from the discovery of species new to science, to the genomics of crop improvement, and investigations into the history and development of scientific ideas and natural history. This investment will enable us to substantially develop our support for the wider academic community.”

Dr Liz Hide, Director of the Sedgwick Museum of Earth Sciences, which has been awarded £210,000 a year, said: “I’m delighted that Research England has recognised the strength of the Sedgwick’s collections and their importance to the UK and international research landscape. Over the next five years, this new investment will be transformative for the Sedgwick Museum, ensuring researchers can fully utilise our new Collections Research Centre, and enabling our outstanding collections to inspire many new avenues of research across both the sciences and the humanities.”

Dr Juliette Fritsch, the University’s first Director for Collections’ Strategy, said: "I’m thrilled to work across the incredible resources contained within the University’s museums, libraries, and botanic garden collections to create strategies together, building on major initiatives, such as the cross-collections Power and Memory programme. These integrated approaches enhance our collective impact and are only possible through the input of our funders, including Research England and Arts Council England.”

The full list of University of Cambridge museums and collections awarded HEMG funding is:

1. Cambridge University Botanic Garden 
2. Fitzwilliam Museum 
3. Kettle’s Yard 
4. Museum of Archaeology & Anthropology (MAA) 
5. University Museum of Zoology 
6. Polar Museum 
7. Whipple Museum of the History of Science 
8. Sedgwick Museum of Earth Sciences 
9. Cambridge University Herbarium 

Research England has supported nine of the University’s museums and collections with £3m a year of Higher Education Museums, Galleries and Collections (HEMG) funding, over the coming five years.

HEMG funding is critical in ensuring our collections support researchers and students across the UK and worldwide
Kamal Munir
©markbox.co.uk
Rhododendron brookianum type specimen from the University of Cambridge Herbarium

£9.2m boost for next generation of Cambridge cancer experts

Cancer researchers in the laboratory

The charity is to award the funding over the next five years to train early-career clinician scientists – doctors who also carry out medical research – as part of its Clinical Academic Training Programme.

The Clinical Academic Training Programme will invest £58.7m at nine research centres including the Cancer Research UK Cambridge Centre in partnership with the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust, which includes Addenbrooke’s Hospital.

Clinician scientists play an essential role in translating cancer research, helping to bridge the gap between scientific research carried out in laboratories and clinical research involving patients.  

Dr Caroline Watson – now a Group Leader in the Early Cancer Institute at the University of Cambridge and Honorary Haematology Consultant at Addenbrooke’s Hospital – has benefited from this funding, having previously been awarded a three-year Cancer Research UK Clinical Research Training Fellowship in 2017. Caroline was first author on a Science paper and Nature Genetics paper, based on her Cancer Research UK-funded research, that identified which mutations in healthy blood are associated with the highest risk of developing blood cancer.

Dr Watson said: “As we age, we all acquire mutations in the cells that make up our tissues.  The vast majority are harmless, but some can increase cancer risk. With blood’s relative ease of sampling and improved DNA sequencing costs, we now have enough data, across many thousands of individuals, to determine which specific mutations enable cells to expand most rapidly and could therefore confer the highest risk of cancer. Knowing whether specific mutations are high-risk or clinically insignificant is key for the future of personalised cancer risk. 

“I’m immensely grateful for the funding I received from Cancer Research UK, which provided me with a key stepping stone in my clinician scientist career.  I feel fortunate to now be able to spend the bulk of my time focused on research, but also continue with some clinical work in parallel.  Having been involved in setting up the UK’s first clinic focused on blood cancer prevention at Addenbrooke’s Hospital, I look forward to translating my research findings to directly benefit patients.”

Michelle Mitchell, Cancer Research UK’s Chief Executive, said: “Clinician scientists have a very important role to play by bringing their knowledge and experience of treating people with cancer to scientific research.

“We need all our doctors and scientists to be able to reach their full potential, no matter their background. That’s why we are continuing to provide flexible training options for early-career clinician scientists.”

The contribution of clinician scientists in the new Cambridge Cancer Research Hospital will be critical for the future of cancer research. The East of England specialist cancer hospital planned for the Cambridge Biomedical Campus will bring together clinical expertise from Addenbrooke’s Hospital and world-leading scientists from the University of Cambridge and the Cancer Research UK Cambridge Centre, under one roof.  

This integrated approach will help fast-track cancer innovations and mean that patients across the region can directly benefit from the latest advances in cancer science.

Becoming a clinician scientist usually involves doctors taking time out of their medical training to undertake a PhD, before returning to train in their chosen specialisation, but many clinicians don’t come back to research after qualifying as consultants. This may be due to existing pressure on the healthcare system and lack of available funding.   

Nearly three quarters (74%) of clinical research staff surveyed by Cancer Research UK in 2023 said that it has become harder to deliver research in a timely manner in the last 18 months, with 78% of respondents describing wider pressures on the health service as a substantial or extreme barrier.  

To tackle this issue, Cancer Research UK’s Clinical Academic Training Programme provides flexible training options alongside mentorship and networking opportunities to better support clinicians who want to get involved and stay in cancer research.  

Data from the Medical Schools Council Clinical Academic Survey show a decline in the number of clinical academic positions between 2011 and 2020. Research from the United States also suggests that offering combined qualifications retains more women in clinical research roles.    

Professor Richard Gilbertson, Head of the Department of Oncology at the University of Cambridge and Director of the Cancer Research UK Cambridge Centre, said: “We are delighted to gain further generous support from Cancer Research UK to enable us to provide doctors and medical students with flexible training opportunities, training them to be the clinical cancer research leaders of the future.

“Developing new and effective treatments of cancer requires teams of scientists working in the clinic and laboratory, in all specialities. This funding is crucial to ensure that we train these individuals so that we can make these discoveries to benefit patients with cancer well into the future.”

Adapted from a press release from Cancer Research UK

Cancer Research UK has announced £9.2m for Cambridge to train the next generation of doctors and scientists to bring new and better cancer treatments to patients faster. 

I’m immensely grateful for the funding I received from Cancer Research UK, which provided me with a key stepping stone in my clinician scientist career
Caroline Watson
CRUK
Cancer researchers at the CRUK Cambridge Institute


Partha Dasgupta wins BBVA Frontiers of Knowledge Award for Economics

The 16th edition of the Banco Bilbao Vizcaya Argentaria (BBVA) Frontiers of Knowledge Award in Economics, Finance and Management honours Professor Dasgupta for his work in defining the field of environmental economics by incorporating and quantifying the social value of nature.

The award also takes into account Professor Dasgupta's leadership of an independent, global review on the Economics of Biodiversity commissioned by the UK Treasury in 2019. The Dasgupta Review is expected to help set the agenda for the UK Government’s 25-year environment plan.

The BBVA awards committee said it commended Professor Dasgupta for laying the foundations of environmental economics through his pioneering work “on the interaction between economic life and the natural environment, including biodiversity.”

“More than any other economist of our time, Partha Dasgupta has stressed the important interplay between economic life and the natural environment," Chair of the BBVA selection committee and Nobel Economics laureate Eric Maskin said, adding that Dasgupta’s work and his proposals for measuring economic well-being “are critical for our time.”

The citation for the award said that Professor Dasgupta provided conceptual foundations for the definition and measurement of sustainable development, with the social value of nature as a determining factor. In contrast with measures of well-being based on flows such as GDP, Dasgupta proposed measuring sustainable development as the change in the accounting value of total wealth, including natural capital within this indicator.

“These ideas...have provided a framework for green accounting which is now widely adopted for measuring sustainable development,” the citation concludes.

“Most economists who work on natural resources or the environment think about nature as providing certain types of goods, like food, clean water, timber, fibres or pharmaceuticals,” Professor Dasgupta said. “So these are goods. These are objects that you can harvest from nature and transform with our human ingenuity into a final product, like the clothes we are wearing or the painting in the room where you are sitting, and so forth. These are the things we make out of the goods that nature gives us.”

At the core of this conventional line of economic thought, he explains, is that when a good becomes scarce, you can substitute it with another offering the same or similar results. But as he delved deeper into the subject, Dasgupta came to realize that nature supplies something much more important and irreplaceable than goods. It supplies processes (or in more economic terms, services).

“My own understanding of economics,” he said, “has moved away from goods to processes. These are the key things we economists should keep in mind. Of course we care about nature’s goods, like water, food and clothing, because without them we wouldn’t be here. But none of this would exist without the underlying processes of nature.”

Climate regulation is among the services, or processes, that Dasgupta uses to illustrate his point: sunlight comes and gets reflected into space, water evaporates and comes down as rain.

“You have the water cycle and you get your drinking water from it. And what is not consumed doesn’t disappear, it just evaporates or becomes part of the ocean through the river system and so forth. But if you mess around too much with climate, you also mess around with the water cycle, which will end up weakened. Likewise, if you deforest too much or get rid of biodiversity in the Amazon, you’re going to exacerbate the climate system. So my work has been to bring these issues into economics.”

Dasgupta believes economics has become over-reliant on the idea that scarcity can be overcome by substituting goods.

“In industrial production, of course, this idea of substitutability has been a great success. Think of all the materials that are produced in engineering departments or material science departments. But there are limits to this, when you tamper with processes. Just think of the human body. You have the metabolic process, which keeps you in a healthy state, and it would be foolish to think you could substitute one process for another. You wouldn’t say let me have less digestive capacity, but more running capacity. It would be silly, because these two things go together.”

The BBVA Foundation centers its activity on the promotion of world-class scientific research and cultural creation, and the recognition of talent.

The BBVA Foundation Frontiers of Knowledge Awards recognise and reward contributions of singular impact in physics and chemistry, mathematics, biology and biomedicine, technology, environmental sciences (climate change, ecology and conservation biology), economics, social sciences, the humanities and music, privileging those that significantly enlarge the stock of knowledge in a discipline, open up new fields, or build bridges between disciplinary areas.

Professor Sir Partha Dasgupta (Economics, St. John's) wins the BBVA award for Economics, Finance and Management for his groundbreaking work in environmental economics.

More than any other economist of our time, Partha Dasgupta has stressed the important interplay between economic life and the natural environment
Nobel Economics laureate Eric Maskin


Scientists identify rare gene variants that confer up to 6-fold increase in risk of obesity

Woman with obesity washing food

Rare variants in the genes BSN and APBA1 are among the first obesity-related genetic risk factors identified for which the increased risk of obesity is not observed until adulthood.

The study, published in Nature Genetics, was led by researchers at the Medical Research Council (MRC) Epidemiology Unit and the MRC Metabolic Diseases Unit at the Institute of Metabolic Science, both based at the University of Cambridge.

The researchers used UK Biobank and other data to perform whole-exome sequencing in over 500,000 individuals and test for genetic associations with body mass index (BMI).

They found that genetic variants in the gene BSN, also known as Bassoon, can raise the risk of obesity as much as six times and were also associated with an increased risk of non-alcoholic fatty liver disease and of type 2 diabetes.

The Bassoon gene variants were found to affect 1 in 6,500 adults, so could affect about 10,000 people in the UK.
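
That estimate follows from simple scaling of the carrier frequency to the national population; the sketch below assumes a UK population of roughly 67 million, a figure not given in the article.

```python
# Back-of-envelope check of the "about 10,000 people" figure.
# Assumption: a UK population of roughly 67 million (not stated in the article).
carrier_frequency = 1 / 6500      # reported frequency of BSN variant carriers among adults
uk_population = 67_000_000        # assumed, approximate

estimated_carriers = uk_population * carrier_frequency
print(f"Estimated UK carriers: {estimated_carriers:,.0f}")  # roughly 10,300
```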

The brain’s role in obesity

Obesity is a major public health concern as it is a significant risk factor for other serious diseases, including cardiovascular disease and type 2 diabetes, yet the genetic reasons why some people are more prone to weight gain are incompletely understood.

Previous research has identified several obesity-associated gene variants conferring large effects from childhood, acting through the leptin-melanocortin pathway in the brain, which plays a key role in appetite regulation.

However, while both BSN and APBA1 encode proteins found in the brain, they are not currently known to be involved in the leptin-melanocortin pathway. In addition, unlike the obesity genes previously identified, variants in BSN and APBA1 are not associated with childhood obesity.

This has led the researchers to believe that they may have uncovered a new biological mechanism for obesity, different from the mechanisms known for previously identified obesity gene variants.

Based on published research and laboratory studies they report in this paper, which indicate that BSN and APBA1 play a role in the transmission of signals between brain cells, the researchers suggest that age-related neurodegeneration could be affecting appetite control.

Professor John Perry, study author and an MRC Investigator at the University of Cambridge, said: “These findings represent another example of the power of large-scale human population genetic studies to enhance our understanding of the biological basis of disease. The genetic variants we identify in BSN confer some of the largest effects on obesity, type 2 diabetes and fatty liver disease observed to date and highlight a new biological mechanism regulating appetite control.”

The use of global data

The accessibility of large-scale databases such as UK Biobank has enabled researchers to search for rare gene variants that may be responsible for conditions including obesity.

For this study, the researchers worked closely with AstraZeneca to replicate their findings in existing cohorts using genetic data from individuals from Pakistan and Mexico. This is important as the researchers can now apply their findings beyond individuals of European ancestry.

If the researchers can better understand the neural biology of obesity, it could present more potential drug targets to treat obesity in the future.

Dr Slavé Petrovski, VP of the Centre for Genomics Research at AstraZeneca, said: “Rigorous large-scale studies such as this are accelerating the pace at which we uncover new insights into human disease biology. By collaborating across academia and industry, leveraging global datasets for validation, and embedding a genomic approach to medicine more widely, we will continue to improve our understanding of disease – for the benefit of patients.”

Next steps for research

Professor Giles Yeo, study author based at the MRC Metabolic Diseases Unit, added: “We have identified two genes with variants that have the most profound impact on obesity risk at a population level we’ve ever seen, but perhaps more importantly, that the variation in Bassoon is linked to adult-onset and not childhood obesity. Thus these findings give us a new appreciation of the relationship between genetics, neurodevelopment and obesity.”

Reference
Zhao, T et al. Protein-truncating variants in BSN are associated with severe adult-onset obesity, type 2 diabetes and fatty liver disease. Nat Genet; 4 Apr 2024; DOI: 10.1038/s41588-024-01694-x

Adapted from a press release from the Medical Research Council

Cambridge researchers have identified genetic variants in two genes that have some of the largest impacts on obesity risk discovered to date.

We have identified two genes with variants that have the most profound impact on obesity risk at a population level we’ve ever seen
Giles Yeo
Woman with obesity washing food


UK-wide trials to begin on blood tests for diagnosing dementia

Elderly couple taking a walk through the park

Professor James Rowe from the Department of Clinical Neurosciences at Cambridge will co-lead a team that will test multiple existing and novel blood tests, looking at a range of types of dementia.

The trials will capitalise on recent breakthroughs in potential dementia blood tests, and generate the evidence needed for them to be validated for use in the NHS within the next 5 years.

The teams from Dementias Platform UK (which includes the Universities of Cambridge and Oxford) and UCL make up the Blood Biomarker Challenge - a multi-million pound award given by the Alzheimer’s Society, Alzheimer’s Research UK, the National Institute for Health and Care Research and Gates Ventures, including £5m raised by players of People’s Postcode Lottery. The project aims to revolutionise dementia diagnosis.

Both teams will recruit participants from sites spread across the country, to ensure their findings are applicable to the whole of the UK’s diverse population.

Timely and accurate diagnosis of the diseases that cause dementia, such as Alzheimer’s disease, is crucial as it means people can access vital care and support and take part in medical research. This will be even more imperative if new treatments are approved for use in the NHS, as these work best for people in the earliest stage of their disease.

Currently, people are usually diagnosed using memory tests and brain scans. These are less accurate than ‘gold standard’ tests like PET scans or lumbar punctures, which can confirm what type of dementia a person has. However, only 2% of people can access these specialist tests.

In recent years, a number of different blood tests that can diagnose Alzheimer’s disease and other causes of dementia have shown very promising results in research settings. But they have yet to be tested widely in clinical settings in the UK.

The READ-OUT team (REAl World Dementia OUTcomes) will be led by Professor James Rowe from Cambridge and Drs Vanessa Raymont and Ivan Koychev from Oxford, who are part of Dementias Platform UK. They will test multiple existing and novel blood tests, looking at a range of types of dementia, including Alzheimer’s disease, vascular dementia, frontotemporal dementia, and dementia with Lewy bodies. The researchers will also look at whether the blood tests can help detect these diseases at various stages.

Professor Rowe said: “This is a ground-breaking study, to discover the best blood tests for dementia, not just Alzheimer’s but any type of dementia and for anyone, whatever their background, age and other health problems. An early accurate diagnosis opens the way to better treatment, support and care. Cambridge researchers will lead the analysis pipeline, and the vital input from patients and families throughout the study.” 

For the first 3 years, READ-OUT will run a fact-finding study that will take blood tests at around 20 Dementias Platform UK sites across the UK, involving 3,000 people from diverse populations. In the final 2 years, they will run a clinical trial with 880 people to explore how having a blood test for dementia affects diagnosis, quality of life for patients and carers, and care, and how the results should be communicated to patients.

Dr Raymont said: “Since I first stepped into a memory clinic 30 years ago there has thankfully been a shift in the way society thinks about dementia. There was previously a feeling that this was just another part of aging, but now we’re seeing that people want to know more about their condition and they want a diagnosis as it helps them access the support they need. Both my parents lived with dementia so I know firsthand the devastation this disease causes, and how a timely and accurate diagnosis can benefit people and their families.”

A second team, ADAPT, will be led by Professor Jonathan Schott and Dr Ashvini Keshavan at UCL and will focus on the most promising biomarker for Alzheimer’s disease, called p-tau217. This reflects levels of two hallmark proteins found inside the brain in Alzheimer’s disease – amyloid and tau. The researchers will carry out a clinical trial to see whether measuring p-tau217 in the blood increases the rate of diagnosis for Alzheimer’s disease both in people with early dementia, but also in those with mild, progressive problems with memory.

These complementary research approaches will maximise the chances of providing the evidence needed to prove that blood tests are ready for use in the NHS. They will pave the way for them to be made available to all who might benefit within the next 5 years.

Fiona Carragher, Director of Research and Influencing at Alzheimer’s Society, said: “At the moment only 2% of people with dementia can access the specialised tests needed to demonstrate eligibility for new treatments, leading to unnecessary delays, worry and uncertainty. Blood tests are part of the answer to this problem – they’re quick, easy to administer and cheaper than current, more complex tests. I’ve spent decades working in research and the NHS and, after years of slow progress, it feels like we’re on the cusp of a new chapter on how we treat dementia in this country.”

Dr Sheona Scales, Director of Research at Alzheimer’s Research UK, said: “It’s fantastic that through collaborating with the leading experts in the dementia community, we can look to bring cutting-edge blood tests for diagnosing dementia within the NHS. And this will be key to widening access to groundbreaking new treatments that are on the horizon.”

For more information about the Blood Biomarker Challenge and how to take part, please visit the Dementias Platform UK website.

Adapted from a press release from Alzheimer’s Research UK

Cambridge researchers are helping lead countrywide trials to identify accurate and quick blood tests that can diagnose dementia, in a bid to improve the UK’s shocking diagnosis rate.

This is a ground-breaking study, to discover the best blood tests for dementia, not just Alzheimer’s but any type of dementia and for anyone, whatever their background, age and other health problems. An early accurate diagnosis opens the way to better treatment, support and care
James Rowe
Elderly couple taking a walk through the park


Last chance to record archaic Greek language ‘heading for extinction’

Professor Ioanna Sitaridou (right) with a 100 year-old Romeyka speaker in Turkey's Trabzon region.

The initiative, led by Professor Ioanna Sitaridou (Queens' College and Faculty of Modern and Medieval Languages and Linguistics), contributes to the UN’s International Decade of Indigenous Languages (2022-32), which aims ‘to draw global attention on the critical situation of many indigenous languages and to mobilise stakeholders and resources for their preservation, revitalization and promotion.’

Romeyka is thought to have only a couple of thousand native speakers left in Turkey’s Trabzon region, but the precise number is hard to calculate, especially given the large number of heritage speakers in the diaspora and the ongoing language shift to Turkish.

Romeyka does not have a writing system and has been transmitted only orally. Extensive contact with Turkish, the absence of support mechanisms to facilitate intergenerational transmission, socio-cultural stigma, and migration have all taken their toll on Romeyka. A high proportion of native speakers in Trabzon are over 65 years of age and fewer young people are learning the language.

The newly launched trilingual Crowdsourcing Romeyka platform invites members of the public from anywhere in the world to upload audio recordings of Romeyka being spoken.

“Speech crowdsourcing is a new tool which helps speakers build a repository of spoken data for their endangered languages while allowing researchers to document these languages, but also motivating speakers to appreciate their own linguistic heritage. At the same time, by creating a permanent monument of their language, it can help speakers achieve acknowledgement of their identity from people outside of their speech community,” said Professor Sitaridou, who has been studying Romeyka for the last 16 years.

The innovative tool was designed by Mr Matthew Nazari, a Harvard undergraduate in Computer Science and himself a heritage speaker of Aramaic. Together they hope that this new tool will also pave the way for the production of language materials in a naturalistic learning environment away from the classroom, based instead around everyday use, orality, and community.

To coincide with the platform’s launch, Sitaridou is unveiling major new findings about the language’s development and grammar at an exhibition in Greece (details below).

Sitaridou’s most important findings include the conclusion that Romeyka descends from Hellenistic Greek not Medieval Greek, making it distinct from other Modern Greek dialects. “Romeyka is a sister, rather than a daughter, of Modern Greek,” said Sitaridou, Professor of Spanish and Historical Linguistics. “Essentially this analysis unsettles the claim that Modern Greek is an isolate language”.

Over the last 150 years, only four fieldworkers have collected data on Romeyka in Trabzon. By engaging with local communities, particularly female speakers, Sitaridou has amassed the largest collection of audio and video data in existence, collected monolingually and amounting to more than 29GB of ethically sourced data, and has authored 21 peer-reviewed publications. A YouTube film about Sitaridou’s fieldwork has received 723,000 views to date.

Grammar and a new phylogeny for Greek

Sitaridou’s analysis of the Romeyka infinitive is key. All other Greek dialects known today have stopped using the infinitive found in ancient Greek, so speakers of Modern Greek would say I want that I go instead of I want to go. In Romeyka, however, the infinitive lives on, and Sitaridou has observed uncontroversial evidence that this Ancient Greek infinitive dates back to Hellenistic Greek: it is preserved in a structure that became obsolete by early Medieval times in all other Greek varieties but continued to be used in Romeyka, while also undergoing a cross-linguistically rare mutation to a negative item.

Sitaridou’s findings have significant implications for our understanding of the evolution of Greek, because they suggest that there is more than one Greek language, on a par with the Romance languages (which all derived from Vulgar Latin rather than from each other).

Historical context and new field work sites

The roots of the Greek presence in the Black Sea are steeped in myth: from the journey of Jason and the Argonauts to Colchis, to the Amazons. But what we know is that the Greeks began to spread around the Black Sea from approximately the 6th Century BCE. Ionians founded Miletus, which, in turn, founded Sinope, which, eventually, colonized Trebizond. In the Pontus, the language of the first Greek colonizers of Trebizond was the Ionic Greek of Sinope.

In the 4th Century BCE, the passage of Alexander the Great’s army contributed to the creation of another Greek-speaking centre, to the South of Pontus, at Cappadocia. It is possible that from Cappadocia, Greek may have also spread northwards towards Pontus.

However, the decisive phase for the expansion of the Greek language seems to be Christianization. The inhabitants of Pontus were among the first converts and are mentioned in the New Testament. The Soumela monastery was founded in 386 CE, around 20 years after the region officially adopted Christianity. The fall of Trebizond to the Ottomans in 1461 led to the city becoming majority Muslim.

Professor Sitaridou said: “Conversion to Islam across Asia Minor was usually accompanied by a linguistic shift to Turkish, but communities in the valleys retained Romeyka. And because of Islamisation, they retained some archaic features while the Greek-speaking communities who remained Christian grew closer to Modern Greek, especially because of extensive schooling in Greek in the 19th and early 20th centuries.”

Recently, Professor Sitaridou began fieldwork at a new site, Tonya, which no other fieldworker has ever reached, revealing significant grammatical variation between the valleys that points to different onsets of Islamisation. In a publication to appear soon, she argues that both the subordination syntax and the negation systems in Tonya show patterns, and thus a diachronic development, different from those of the Çaykara variety.

In 1923, under the Greco-Turkish population exchange, the Greek-speaking Christians of Pontus were forced to leave Turkey and relocate to Greece, while Romeyka-speaking Muslim communities in the Trabzon area remained in their homeland because they professed Islam, which explains why this Greek variety is still spoken in small enclaves in the region. From 1923 until very recently, the two speech communities were unaware of each other’s existence.

Preservation of heritage languages and why it matters

Speakers are still reluctant to identify Romeyka as one of their languages since, for Turkish nationalists, speaking Greek goes against the very fundamentals of one’s belonging. From a Greek nationalist perspective, these varieties are deemed ‘contaminated’ and/or disruptive to the ideology of one single Greek language spoken uninterruptedly since antiquity, as Sitaridou explains in an article which is about to be published by the Laz Institute in Istanbul.

In Greece, Turkey and beyond, Sitaridou has used her research to raise awareness of Romeyka, stimulate language preservation efforts and enhance attitudes. In Greece, for instance, Sitaridou co-introduced a pioneering new course on Pontic Greek at the Democritus University of Thrace, since the number of speakers of Pontic Greek is also dwindling. 

“Raising the status of minority and heritage languages is crucial to social cohesion, not just in this region, but all over the world,” Professor Sitaridou said. “When speakers can speak their home languages they feel 'seen' and thus they feel more connected to the rest of the society; on the other hand, not speaking the heritage or minority languages creates some form of trauma which in fact undermines the integration which linguistic assimilation takes pride in achieving”.

The same ethos traverses a new AHRC-funded project about the documentation of a critically endangered language, Sri Lanka Portuguese, among Afrodescent communities in north-western Sri Lanka. Sitaridou will be documenting and analysing manja, the only remaining linguistic and cultural expression of African heritage for these communities.

Exhibition at Mohamed Ali’s historical House in Kavala

The Romeyka exhibition runs at the MOHA Research Centre in Kavala, Greece, from 29 March to 28 April 2024.

The exhibition features previously unpublished archival material from Exeter College, Oxford, and photographic material from the British School at Athens, taken by R M Dawkins, one of the first fieldworkers in the area, which gives us a glimpse into the Greek-speaking communities and language on the southern Black Sea shores 110 years ago. This is combined with photographs and video material from Professor Sitaridou’s own fieldwork, interspersed with panels and audio material to communicate her linguistic findings.

The exhibition aims to generate further reflections on endangered heritages, fragmented and shared identities and collective memory as well as helping us get a better grasp of multilingualism, localised experiences, intergenerational stories of co-existence and displacement, diasporic selves and language loss, and alternative modalities of being and belonging both in Greece and Turkey.

A new data crowdsourcing platform aims to preserve the sound of Romeyka, an endangered millennia-old variety of Greek. Experts consider the language to be a linguistic goldmine and a living bridge to the ancient world.

Raising the status of minority and heritage languages is crucial to social cohesion, not just in this region, but all over the world
Professor Ioanna Sitaridou
Professor Ioanna Sitaridou
Professor Ioanna Sitaridou (right) with a 100 year-old Romeyka speaker in Turkey's Trabzon region.


UK's only research institute dedicated to understanding early cancer receives £11 million donation

Sir Ka-shing Li at the opening of the MRC Cancer Centre in the Hutchison Building

Located on the Cambridge Biomedical Campus – the largest bioscience ecosystem in Europe – the Institute brings together world-leading expertise from across diverse fields including biology, physics, mathematics, epidemiology, medicine, and computer science under one roof with one goal: to predict and prevent cancer.

The donation will support the redevelopment of the Hutchison building, home to the Early Cancer Institute. This will enable the Institute to scale up its work, creating the cutting-edge laboratory space needed for its research teams to advance their early detection efforts and expand the Institute's research capabilities, attracting more world-class scientists and clinicians to join its teams.

The building will be renamed the Li Ka Shing Early Cancer Institute in honour of Hong Kong-based philanthropist Sir Ka-shing Li and the enduring partnership between the Li Ka Shing Foundation and the University of Cambridge in progressing the fight against cancer. Sir Ka-shing Li generously donated to the original Hutchison building in 2002, and then – in 2007 – to the Li Ka Shing Centre, which houses the CRUK Cambridge Institute.

Commenting on the renaming of the building in his honour, Sir Ka-shing Li said: "I am greatly encouraged that much advancement has been made towards cancer diagnosis, treatment and prevention. It is also evident now that early detection of cancer will yield the best chance of successful treatment and quality of life for the patient.

"It is a great privilege, therefore, to support the transformation of the Hutchison building to become a centre of excellence and a fitting home for the national Early Cancer Research Institute and a first of its kind in the UK. This inspirational journey with Cambridge University spanning over two decades fulfils my lifetime commitment to build the good of science, and I am truly gratified by this partnership."

Researchers at the Institute are focusing on cancers that are hard to treat, such as lung, oesophageal and liver cancers, and acute myeloid leukaemia. Detection and treatment methods have changed very little for these types of cancer over the past few years, and outcomes are often poor. Detecting and treating cancer earlier will dramatically increase survival rates and reduce healthcare costs across all tumour types.

By working across disciplines to understand the fundamental biology of how cancer develops and evolves, researchers at the Institute are making pioneering early detection research advances and translating these into clinical practice. They have used the power of theoretical physics methods to identify blood cancer years before the patient has symptoms, while biology and chemical engineering experts have collaborated to develop a method to detect and destroy early lung cancer.

The Institute’s director, Professor Rebecca Fitzgerald, pioneered the capsule sponge – a new test that can identify ten times more heartburn patients with Barrett’s oesophagus, a precursor to oesophageal cancer. The device aims to catch the disease when it is easier to treat, thus helping more people survive.

Fitzgerald, also Professor of Cancer Prevention, remarked on the gift’s far-reaching impact, highlighting the importance of the redevelopment in helping researchers make life-saving scientific advances: "This extraordinary gift will provide the cutting-edge research facilities necessary to help our researchers develop pioneering early cancer detection innovations and take these from bench to bedside with even greater speed and focus, resulting in fewer cancer-related deaths worldwide."

Professor Richard Gilbertson, the Li Ka Shing Chair of Oncology said: "It is fitting that the home of this exceptional centre for research into the early detection of cancer should be renamed the Li Ka Shing Early Cancer Institute. From his inaugural gift to establish the Li Ka Shing Centre to house the Cancer Research UK Cambridge Institute, to the endowment of a new Professorship of Oncology, Sir Ka-shing Li has been a generous and constant partner in the University’s pioneering work to help create a world free of the fear of cancer."

The Vice-Chancellor, Professor Deborah Prentice, said: "New technologies are ensuring that ideas developed here in Cambridge can be used to benefit patients around the world, and we must ensure that as many people as possible are able to benefit from our cancer research. We are very grateful for Sir Ka-shing Li’s longstanding generosity, which has allowed us to make extraordinary progress in understanding this terrible disease. As our work continues, we look forward to developing novel ways of diagnosing cancer earlier and treating it more precisely and effectively."

The University of Cambridge’s Early Cancer Institute – the UK's only research facility dedicated to understanding early cancer – has received a landmark £11 million donation to support its vital work in the fight against cancer.

This extraordinary gift will provide the cutting-edge research facilities necessary to help our researchers develop pioneering early cancer detection innovations... resulting in fewer cancer-related deaths worldwide.
Rebecca Fitzgerald
Li Ka Shing Foundation
Sir Ka-shing Li at the opening of the MRC Cancer Centre in the Hutchison Building, 18 May 2022


Cambridge do the double in 2024 Boat Race

The victorious Cambridge Men and Women’s Blue Boat crews

Despite both the Cambridge Men and Women’s Blue Boats starting as underdogs, Cambridge emerged victorious in both races.

In the 78th Women’s Race, Oxford took an early lead, but Cambridge caught up and overtook them. Oxford cox Joe Gellett raised an appeal at the end of the race, arguing that the Cambridge boat had crossed their path, but after a discussion with umpire Richard Phelps the appeal was dismissed.

In the 169th Men’s Race, Cambridge took an early lead but slowed towards the end as stroke Matt Edge struggled; with his teammates digging in, they held on for what was in the end a comfortable victory.

All in all it was a fantastic weekend for Cambridge, with the Light Blues dominating the results: Goldie won the Men’s Reserve race, and the Cambridge Men’s and Women’s Lightweight crews both won on Friday.

“This Boat Race just means so much, this Club just means so much” said victorious Men’s President Seb Benzecry.

“This season has been the most amazing season, it’s been challenging, we’ve pushed ourselves harder than any team I’ve been a part of before. We knew Oxford would pose a huge challenge this year to us, we knew we had to step on. I couldn’t be prouder of the ways the guys responded to that challenge, in a year when basically every single boat was an underdog.”

Women’s President Jenna Armstrong said she was almost pinching herself at the result.

“I almost can’t believe it. This year we were slated as the underdogs going into the race. Our race plan was to go out and row our best race, go as fast as possible and hang on and wait for an opportunity to pop up - and that’s what we did.”

Cambridge University Vice-Chancellor Professor Deborah Prentice, who was watching her first Boat Race after joining in July last year, said it had been a fantastic weekend.

“It was brilliant, utterly brilliant – everything I expected and more,” she said. “This is my first time at the Boat Race obviously and I heard Oxford started as favourites so I didn’t expect Cambridge to come out ahead like this.

“I went out to see the Women’s Blue Boat training earlier this year and today I could see the fruits of the labour that they put in, going out every day at 5.30 in the morning – it’s incredible.”

Cambridge have done the double in the Boat Race, winning both the Men’s and Women’s races in a thrilling day of action on the Thames.

The victorious Cambridge Men and Women’s Blue Boat crews


TB vaccine may enable elimination of the disease in cattle by reducing its spread

Herd of cows in a grassy field

The research, led by the University of Cambridge and Penn State University, improves prospects for the elimination and control of bovine tuberculosis (TB), an infectious disease of cattle that results in large economic costs and health impacts across the world.  

This is the first study to show that BCG-vaccinated cattle infected with TB are substantially less infectious to other cattle. This remarkable indirect effect of the vaccine beyond its direct protective effect has not been measured before.

The spillover of infection from livestock has been estimated to account for about 10% of human tuberculosis cases. While such zoonotic TB (zTB) infections are most commonly associated with gastro-intestinal infections related to drinking contaminated milk, zTB can also cause chronic lung infections in humans. Lung disease caused by zTB can be indistinguishable from regular tuberculosis, but is more difficult to treat due to natural antibiotic resistance in the cattle bacteria.

TB remains endemic in many countries around the world, including in Europe and the Americas, where its control costs farmers and taxpayers hundreds of millions of dollars each year.

The study is published today in the journal Science.

In the study, carried out in Ethiopia, researchers examined the ability of the vaccine, Bacillus Calmette-Guérin (BCG), to directly protect cattle that receive it, as well as to indirectly protect both vaccinated and unvaccinated cattle by reducing TB transmission. Vaccinated and unvaccinated animals were put into enclosures with naturally infected animals, in a novel crossover design performed over two years.

“Our study found that BCG vaccination reduces TB transmission in cattle by almost 90%. Vaccinated cows also developed significantly fewer visible signs of TB than unvaccinated ones. This suggests that the vaccination not only reduces the progression of the disease, but that if vaccinated animals become infected, they are substantially less infectious to others,” said Andrew Conlan, Associate Professor of Epidemiology at the University of Cambridge’s Department of Veterinary Medicine and a corresponding author of the study.

Using livestock census and movement data from Ethiopia, the team developed a transmission model to explore the potential for routine vaccination to control bovine tuberculosis.

“Results of the model suggest that vaccinating calves within the dairy sector of Ethiopia could reduce the reproduction number of the bacterium — the R0 — to below 1, arresting the projected increase in the burden of disease and putting herds on a pathway towards elimination of TB,” Conlan said.
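
To make the reproduction-number logic concrete, here is a deliberately simplified sketch of how vaccination that cuts onward transmission can push a herd’s effective reproduction number below 1. The baseline R0, the vaccination coverage and the discrete-generation model are illustrative assumptions and are not taken from the study’s Ethiopia transmission model.

```python
# Simplified sketch: how vaccination that cuts onward transmission can push
# a herd's effective reproduction number below 1. All parameter values are
# illustrative assumptions, not taken from the study's Ethiopia model.
R0 = 1.5                 # assumed baseline reproduction number without vaccination
vaccine_efficacy = 0.89  # assumed reduction in onward transmission (cf. ~90% reported)
coverage = 0.6           # assumed fraction of animals vaccinated

# Vaccinated animals are assumed to transmit (1 - efficacy) as much as unvaccinated ones.
R_eff = R0 * (coverage * (1 - vaccine_efficacy) + (1 - coverage))
print(f"Effective reproduction number: {R_eff:.2f}")  # below 1 => outbreak declines

# Discrete-generation projection of the number of infectious animals.
infectious = 100.0
for _ in range(10):
    infectious *= R_eff
print(f"Infectious animals after 10 generations: {infectious:.1f}")
```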

The team focused their studies on Ethiopia as representative of similarly situated transitional economies: it has the largest cattle herd in Africa and a rapidly growing dairy sector, with a growing burden of bovine tuberculosis and no current control program.

“Bovine tuberculosis is largely uncontrolled in low- and middle-income countries, including Ethiopia,” said Abebe Fromsa, associate professor of agriculture and veterinary medicine at Addis Ababa University in Ethiopia and the study’s co-lead author. “Vaccination of cattle has the potential to provide significant benefits in these regions.”

“For over a hundred years, programs to eliminate bovine tuberculosis have relied on intensive testing and slaughtering of infected animals,” said Vivek Kapur, professor of microbiology and infectious diseases and Huck Distinguished Chair in Global Health at Penn State and a corresponding author of the study.

He added: “This approach is unimplementable in many parts of the world for economic and social reasons, resulting in considerable animal suffering and economic losses from lost productivity, alongside an increased risk of spillover of infection to humans. By vaccinating cattle, we hope to be able to protect both cattle and humans from the consequences of this devastating disease.”

Professor James Wood, Alborada Professor of Equine and Farm Animal Science in the University of Cambridge’s Department of Veterinary Medicine, noted that although TB is more prevalent in lower-income countries, the United Kingdom, Ireland and New Zealand also experience considerable economic pressures from the disease, which persists despite intensive and costly control programs.

Wood said: “For over twenty years the UK government has pinned hopes on cattle vaccination for bovine tuberculosis as a solution to reduce the disease and the consequent costs of the controls. These results provide important support for the epidemiological benefit that cattle vaccination could have to reduce rates of transmission to and within herds.”

This research was supported by The Bill & Melinda Gates Foundation, as well as the Biotechnology and Biological Sciences Research Council; Foreign, Commonwealth and Development Office; Economic & Social Research Council; Medical Research Council; Natural Environment Research Council; and Defence Science & Technology.

Reference: Fromsa, A. et al: ‘BCG vaccination of cattle reduces transmission of bovine tuberculosis, improving the prospects for elimination.’ Science, March 2024. DOI: 10.1126/science.adl3962

Vaccination not only reduces the severity of TB in infected cattle, but reduces its spread in dairy herds by 89%, research finds.

Our study suggests that vaccination not only reduces the progression of the disease, but that if vaccinated animals become infected, they are substantially less infectious to others.
Andrew Conlan
Getty/ kamisoka
Herd of cows in a grassy field


New approach to monitoring freshwater quality can identify sources of pollution, and predict their effects

Study lake in Norway

The source of pollutants in rivers and freshwater lakes can now be identified using a comprehensive new water quality analysis, according to scientists at the University of Cambridge and Trent University, Canada.

Microparticles from car tyres, pesticides from farmers’ fields, and toxins from harmful algal blooms are just some of the organic chemicals that can be detected using the new approach, which also indicates the impact these chemicals are likely to have in a particular river or lake.

Importantly, the approach can also point to the origin of specific organic matter dissolved in the water, because it has a distinct composition depending on its source.

The approach uses a technique called high-resolution mass spectrometry to analyse water samples: within an hour this provides a comprehensive overview of all the organic molecules present.

Water quality is strongly determined by the diversity of organic matter dissolved in it – termed ‘chemodiversity.’ The scientists say that the thousands of different dissolved organic compounds can keep freshwater ecosystems healthy, or contribute to their decline, depending on the mixture present.
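
As an illustration of how a molecular inventory from mass spectrometry can be summarised as a single chemodiversity number, the sketch below computes a Shannon diversity index over a made-up table of compound peak intensities. The peak table and the choice of the Shannon index are assumptions for illustration, not the specific data or metric used in the paper.

```python
# Illustrative chemodiversity calculation: Shannon diversity over the relative
# abundances of dissolved organic compounds detected in one water sample.
# The peak table is invented, and the Shannon index is one common diversity
# measure, not necessarily the metric used in the paper.
import math

peak_intensities = {          # hypothetical mass-spectrometry peak intensities
    "compound_A": 120.0,
    "compound_B": 80.0,
    "compound_C": 45.0,
    "compound_D": 5.0,
}

total = sum(peak_intensities.values())
proportions = [value / total for value in peak_intensities.values()]

shannon_index = -sum(p * math.log(p) for p in proportions if p > 0)
print(f"Chemodiversity (Shannon index): {shannon_index:.3f}")
```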

The paper is published today in the journal Science.

“Traditional approaches to monitoring water quality involve taking lots of different measurements with many devices, which takes a lot of time. Our technique is a very simple way to get a comprehensive overview of what’s going on in a particular river or lake,” said Jérémy Fonvielle, a researcher in the University of Cambridge’s Departments of Plant Sciences and Biochemistry, and co-author of the paper.

To understand what drives this chemodiversity, the team reviewed studies of dissolved organic matter in freshwater samples from rivers and lakes across Europe and northern Canada.

For example, water analysis of Lake Erie in Canada revealed high levels of phosphorus pollution. By looking at the composition of individual molecules in the water sample, researchers identified agricultural activities as the source of this pollution, rather than wastewater effluent. 

“Whereas before, we could measure the amount of organic nitrogen or phosphorus pollution in a river, we couldn't really identify where pollution was coming from. With our new approach we can use the unique molecular fingerprint of different sources of pollution in freshwater to identify their source,” said Dr Andrew Tanentzap at Trent University School of the Environment, co-author of the report.

Traditional approaches involve separately measuring many indicators of ecosystem health, such as the level of organic nutrients or particular pollutants like nitrogen. These can indicate the condition of the water, but not why this state has arisen.

Dissolved organic matter is one of the most complex mixtures on Earth. It consists of thousands of individual molecules, each with their own unique properties. This matter influences many processes in rivers and lakes, including nutrient cycling, carbon storage, light absorption, and food web interactions - which together determine ecosystem function.

Sources of dissolved organic matter in freshwater include urban runoff, agricultural runoff, aerosols and wildfires.

“It's possible to monitor the health of freshwater through the diversity of compounds that are present. Our approach can be, and is being, rolled out across the UK,” said Tanentzap.

Fonvielle will now apply this technique to analysing water samples from farmland drainage ditches in the Fens, as part of a project run by the University of Cambridge’s Centre for Landscape Regeneration to understand freshwater health in this agricultural landscape.

The research was funded primarily by the Natural Sciences and Engineering Research Council and the European Research Council.

Reference: Tanentzap, A.J. and Fonvielle, J.A: ‘Chemodiversity in freshwater health.’ Science, March 2024. DOI: 10.1126/science.adg8658

Analysing the diversity of organic compounds dissolved in freshwater provides a reliable measure of ecosystem health, say scientists.

Our technique is a very simple way to get a comprehensive overview of what’s going on in a particular river or lake.
Jérémy Fonvielle
Sam Woodman
Study lake in Norway


‘Exhausted’ immune cells in healthy women could be target for breast cancer prevention

Woman holds pink breast cancer awareness ribbon.

Everyone has BRCA1 and BRCA2 genes, but mutations in these genes - which can be inherited - increase the risk of breast and ovarian cancer.

The study found that the immune cells in breast tissue of healthy women carrying BRCA1 or BRCA2 gene mutations show signs of malfunction known as ‘exhaustion’. This suggests that the immune cells can’t clear out damaged breast cells, which can eventually develop into breast cancer.

This is the first time that ‘exhausted’ immune cells have been reported in non-cancerous breast tissues at such scale - normally these cells are only found in late-stage tumours.

The results raise the possibility of using existing immunotherapy drugs as an early intervention to prevent breast cancer developing in carriers of BRCA1 and BRCA2 gene mutations.

The researchers have received a ‘Biology to Prevention Award’ from Cancer Research UK to trial this preventative approach in mice. If effective, this will pave the way to a pilot clinical trial in women carrying BRCA gene mutations.

“Our results suggest that in carriers of BRCA mutations, the immune system is failing to kill off damaged breast cells - which in turn seem to be working to keep these immune cells at bay,” said Professor Walid Khaled in the University of Cambridge’s Department of Pharmacology and Wellcome-MRC Cambridge Stem Cell Institute, senior author of the report.

He added: “We’re very excited about this discovery, because it opens up potential for a preventative treatment other than surgery for carriers of BRCA breast cancer gene mutations.

“Drugs already exist that can overcome this block in immune cell function, but so far, they’ve only been approved for late-stage disease. No-one has really considered using them in a preventative way before.”

The results are published today in the journal Nature Genetics.

Risk-reducing surgery, in which the breasts are removed, is offered to those at increased risk of breast cancer. This can be a difficult decision for young women to make and can have a significant effect on body image and sexual relationships.

“The best way to prevent breast cancer is to really understand how it develops in the first place. Then we can identify these early changes and intervene,” said Khaled.

He added: “Late-stage breast cancer tends to be very unpredictable and hard to manage. As we make better and better drugs, the tumours just seem to find a way around it.”

Using samples of healthy breast tissue collected from 55 women across a range of ages, the researchers catalogued over 800,000 cells - including all the different types of breast cell.

The resulting Human Breast Cell Atlas is now available as a resource for other researchers to use and add to. It contains huge amounts of information on other risk factors for breast cancer including Body Mass Index (BMI), menopausal status, contraceptive use and alcohol consumption.

“We've found that there are multiple breast cell types that change with pregnancy, and with age, and it’s the combination of these effects - and others - that drives the overall risk of breast cancer,” said Austin Reed, a PhD student in the University of Cambridge’s Department of Pharmacology and joint first author of the report.

He added: “As we collect more of this type of information from samples around the world, we can learn more about how breast cancer develops and the impact of different risk factors - with the aim of improving treatment.”

One of the biggest challenges in treating breast cancer is that it is not just one disease, but many. Many different genetic variations can lead to breast cancer, and genetic risk interacts with other risk factors in complicated ways.

For example, it is known that the likelihood of breast cancer increases with age, but this risk is greatly reduced by pregnancy early in life. And age-associated risk is greatly increased in carriers of the breast cancer genes BRCA1 and BRCA2.

The new study aimed to understand how some of these risk factors interact, by characterising the different cell types in the human breast under many different physiological states.

The researchers used a technique called ‘single cell RNA-sequencing’ to characterise the many different breast cell types and states. Almost all cells in the body have the same set of genes, but only a subset of these are switched on in each cell – and these determine the cell’s identity and function. Single cell RNA-sequencing reveals which genes are switched on in individual cells.
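
To make this concrete, a typical single-cell analysis clusters cells by which genes they have switched on and then labels each cluster by its marker genes. The sketch below is a minimal, hypothetical example using the open-source scanpy toolkit; the file name, parameter values and workflow steps are illustrative assumptions, not the pipeline used to build the Human Breast Cell Atlas.

```python
# Minimal, hypothetical single-cell RNA-sequencing workflow using scanpy.
# File name, thresholds and parameters are illustrative only.
import scanpy as sc

# Load a cell-by-gene count matrix (hypothetical file)
adata = sc.read_h5ad("breast_tissue_counts.h5ad")

# Basic quality control: drop near-empty cells and rarely detected genes
sc.pp.filter_cells(adata, min_genes=200)
sc.pp.filter_genes(adata, min_cells=3)

# Normalise sequencing depth per cell and log-transform the counts
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)

# Reduce dimensionality, build a neighbourhood graph and cluster the cells;
# each cluster groups cells with a similar set of 'switched on' genes
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.pp.pca(adata, n_comps=50)
sc.pp.neighbors(adata)
sc.tl.leiden(adata, key_added="cell_state")

# The genes most strongly switched on in each cluster suggest its identity
# (for example immune, epithelial or stromal cell types)
sc.tl.rank_genes_groups(adata, groupby="cell_state")
```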

“Breast cancer occurs around the world, but social inequalities mean not everyone has access to treatment. Prevention is the most cost-effective approach. It not only tackles inequality, which mostly affects low-income countries, but also improves disease outcome in high-income countries,” said Dr Sara Pensa, Senior Research Associate in the University of Cambridge’s Department of Pharmacology and joint first author of the study.

Breast tissue samples were provided by the Breast Cancer Now tissue bank.

The research was primarily funded by the Medical Research Council and Cancer Research UK.

Reference: Reed, A.D. et al: ‘A human breast cell atlas enables mapping of homeostatic cellular shifts in the adult breast.’ Nature Genetics, March 2024. DOI: 10.1038/s41588-024-01688-9

Researchers at the University of Cambridge have created the world’s largest catalogue of human breast cells, which has revealed early cell changes in healthy carriers of BRCA1 and BRCA2 gene mutations.

We’re very excited about this discovery, because it opens up potential for a preventative treatment other than surgery for carriers of BRCA breast cancer gene mutations.
Walid Khaled
Angiola Harry on Unsplash
Woman holds pink breast cancer awareness ribbon


Reclaim ‘wellness’ from the rich and famous, and restore its political radicalism, new book argues

People doing yoga together outdoors in Richmond USA in 2015

Today’s wellness industry generates trillions of dollars in revenue, but in a new book, Dr James Riley (Faculty of English & Girton College), shows that 1970s wellness pioneers imagined something radically different to today’s culture of celebrity endorsements and exclusive health retreats. 

“Wellness was never about elite experiences and glossy, high-value products,” says Riley, noting that “When we think of wellness today, Gwyneth Paltrow’s Goop and other lifestyle brands might come to mind, along with the oft-cited criticism that they only really offer quackery for the rich.” By contrast, in the 1970s, “wellness was much more practical, accessible and political.” 

The word, as it was first proposed in the late 1950s, described a holistic approach to well-being, one that attended equally to the mind (mental health), the body (physical health) and the spirit (one’s sense of purpose in life). The aim was to be more than merely ‘not ill’. Being well, according to the likes of Halbert Dunn and, later in the 1970s, John Travis and Don Ardell, meant realising your potential, living with ‘energy to burn’ and putting that energy to work for the wider social good.

Riley’s Well Beings: How the Seventies Lost Its Mind and Taught Us to Find Ourselves, published by Icon Books on 28 March, is the first book to explore the background of the wellness concept in the wider political and cultural context of the 1970s. 

“Wellness in the 1970s grew out of changing attitudes to health in the post-war period – the same thinking that gave rise to the NHS,” Riley says. “When coupled with the political activism of the 1960s counterculture and the New Left, what emerged was a proactive, socially oriented approach to physical and mental well-being. This was not about buying a product off the shelf. 

“The pursuit of wellness was intended to take time, commitment and effort. It challenged you to think through every facet of your life: your diet, health, psychology, relationships, community engagement and aspirations. The aim was to change your behaviour – for the better – for the long term.”

Riley’s book also makes a case for what the 1970s wellness industry can do for us today.
 
“We’re often warned about an imminent return to ‘the seventies’, a threat that’s based on the stereotypical image of the decade as one of social decline, urban strife, and industrial discontent. It’s an over-worked comparison that tends to say more about our own social problems, our own contemporary culture of overlapping political, social and economic crises. Rather than fearing the seventies, there’s much we can learn to help us navigate current difficulties.”  

“It was in the 1970s that serious thought was given to stress and overwork to say nothing of such frequently derided ‘events’ as the mid-life crisis and the nervous breakdown. The manifold pressures of modern life - from loneliness to information overload - increasingly came under the microscope and wellness offered the tools to deal with them.” 

“Not only are these problems still with us, they’ve got much worse. To start remedying them, we need to remember what wellness used to mean. The pandemic, for all its horrors, reminded us of the importance of mutual self-care. To deal with the ongoing entanglement of physical and mental health requires more of that conviviality. Being well should be within everyone’s reach; it should not be a privilege afforded to those who have already done well.”

Mindfulness versus wellness

At the heart of Riley’s book is an analysis of the ongoing corporate and commercial tussle between ‘mindfulness’ and ‘wellness’. 

In 1979 Dr Jon Kabat-Zinn founded the Stress Reduction and Relaxation Programme at the University of Massachusetts Medical Center, where he taught ‘mindfulness-based stress reduction’. For Kabat-Zinn mindfulness meant accepting the inevitable stress that comes with the ‘full catastrophe’ of life and adopting an attitude of serene resilience in the face of it. Stress could be alleviated thanks to a regular meditation routine and small changes made to the working day such as the decision to try a different, more pleasant commute. Little was said about altering the pace of the work causing the stress in the first place. 

By contrast, John Travis, a medical doctor who founded the Wellness Resource Center in California’s Marin County in 1975, talked about the health dangers of sedentary, office-based jobs while Don Ardell, author of High Level Wellness (1977), encouraged his readers to become agents of change in the workplace. Both saw work-fixated lifestyles as the problem. Work and work-related stress were thus things to fix, not to endure.

Ardell argued that because burn-out was becoming increasingly common it was incumbent upon employers to offer paid time off to improve employee well-being. Better to be too well to come to work, reasoned Ardell, than too sick. “We tend to think that flexible hours and remote working are relatively new concepts, particularly in the digital and post-COVID eras,” adds Riley, “but Ardell was calling for this half a century ago.” 

Riley argues that the techniques of mindfulness, rather than those of wellness, have proved attractive to contemporary corporate culture because they ultimately help to maintain the status quo. Corporate mindfulness puts the onus on the employee to weather the storm of stress. It says, “there is nothing wrong with the firm, you are the problem, this is the pace, get with it or leave”.  

According to Riley this view is a far cry from the thinking of seventies wellness advocates like Travis and Ardell who “imagined a health-oriented citizenship, a process of development in which social well-being follows on from the widespread optimistic and goal-oriented pursuit of personal health. It’s that sense of social mission that self-care has lost.”

Riley points out that this self-care mission had a very particular meaning in the 1970s among groups like The Black Panther Party for Self-Defense, which established clinics and ran an ambulance service for black communities in and around Oakland, California. “They were saying you’ve got to look after yourself so you can then look after your community. Such communal effort was vital because the system was seen to be so opposed to Oakland’s needs. One sees the deeply political potency of ‘self-care’ in this context. It meant radical, collective autonomy, not indulgent self-regard.”

The bad guru

As well as suggesting positive lessons from the past, Riley is also quick to call out the problems. “The emphasis on self-responsibility in wellness culture could easily turn into a form of patient-blame,” he argues, “the idea that if you’re ill, or rather if you fail to be well, it’s your fault, a view that neglects to consider all kinds of social and economic factors that contribute to ill-health.”

Elsewhere, Riley draws attention to the numerous claims of exploitation and abuse within the wider context of the alternative health systems, new religious movements and ‘therapy cults’ that proliferated in the 1970s. 

“It was not always a utopia of free thought. The complex and often unregulated world of New Age groups and alternative health systems could often be a minefield of toxic behaviour, aggressive salesmanship and manipulative mind games. Charismatic and very persuasive human engineers were a common presence in the scene, and one can easily see these anxieties reflected in the various ‘bad gurus’ of the period’s fiction and film. 

“There are plenty of voices who say they gained great insights as a result of being pushed to their limits in these situations,” says Riley, “but many others were deeply affected, if not traumatised, by the same experiences.”


Self-experimentation 

In addition to exploring the literature of the period, Riley’s research for Well Beings found him trying out many of the therapeutic practices he describes. These included extended sessions in floatation tanks, guided meditation, mindfulness seminars, fire walking, primal screaming in the middle of the countryside, remote healing, yoga, meal replacement and food supplements.

 

References

J Riley, Well Beings: How the Seventies Lost Its Mind and Taught Us to Find Ourselves. Published by Icon Books on 28th March 2024. ISBN: 9781785787898.

A new cultural history of the 1970s wellness industry offers urgent lessons for today. It reveals that in the seventies, wellness was neither narcissistic nor self-indulgent, and nor did its practice involve buying expensive, on-trend luxury products. Instead, wellness emphasised social well-being just as much as it focused on the needs of the individual. Wellness practitioners thought of self-care as a way of empowering people to prioritise their health so that they could also enhance the well-being of those around them.

Wellness was much more practical, accessible and political
James Riley
Eli Christman via Flickr under a CC licence
People doing yoga together outdoors in Richmond USA in 2015


Clinical trial underway to treat ultra-rare genetic disease with possible link to leader of mutiny on the Bounty

Patrick Chinnery looks at brain scans on a computer screen

A clinical trial to look at repurposing the UK-licensed medicine deferiprone for patients with the ultra-rare genetic disease neuroferritinopathy has launched today at the University of Cambridge.

Neuroferritinopathy is a progressive and incurable brain disorder caused by changes in a gene that produces a specific protein - ferritin light chain protein. This change leads to the build-up of iron in the brain. The disease usually appears in middle-aged adults and causes severe symptoms that impact on day-to-day life, eventually resulting in the loss of speech and swallowing. There are currently no effective treatments.

Funded by LifeArc, the new randomised placebo-controlled trial - DefINe - will be led by Professor Patrick Chinnery from the Department of Clinical Neurosciences. It aims to stop the progression of the disease by reducing the iron accumulation in the brain with an existing drug called deferiprone. Deferiprone is an affordable oral tablet that is already licensed for use in the UK to reduce iron levels in blood conditions like thalassemia. If successful, the trial could also open the possibility of deferiprone being used for other neurodegenerative conditions linked with the build-up of iron in the brain.

Professor Chinnery said: “Neuroferritinopathy leads to severe disability and currently has no cure. The DefINe trial will show whether we can stop the disease in its tracks by pulling iron out of the brain using a well-known medicine called deferiprone.

“By funding this study, LifeArc has given the first hope of a treatment for affected families. If successful, the trial will open the possibility of using a similar approach for other neurodegenerative conditions linked to the build-up of iron in the brain, including Parkinson’s disease.”

Neuroferritinopathy affects approximately 100 patients worldwide. The condition was first identified when a surprising number of people living in the Lake District in Cumbria experienced similar symptoms and received a series of incorrect diagnoses. Research into the ancestry of these families by Professor John Burn, a clinical geneticist at Newcastle Hospitals NHS Foundation Trust, uncovered the genetic commonality and also found an interesting potential link to the past.

Professor Burn found that a rare mutation caused the progression of the condition, and that almost all the known cases were likely to be descended from the same ancestor. He traced it back to 18th-century Cockermouth in Cumbria and to families with the surname Fletcher. Professor Burn suggested they could have shared common ancestry with Fletcher Christian (Fletcher being his surname), who led the mutiny on the Bounty in April 1789 and was also from the region.

The DefINe trial will involve 40 patients taking the drug for a year, who will undergo state-of-the-art 7T magnetic resonance imaging (MRI) scanning throughout to monitor iron levels in the brain. The evidence collected will form the basis of an application for licensing in the UK under ‘Exceptional Circumstances’, a route often used for rare conditions where the number of people affected is low. This means that, if the trial is successful, the drug could go on to benefit all people with the condition more quickly.

Samantha Denison, a patient hoping to participate in the trial, said: “It came as such a surprise to be informed of the trial and to learn that we have not been forgotten about. To have the chance to be involved in the trial gives me such hope. If it can help to slow or stop the condition progressing, that would be a huge relief. Just to know that by taking part we could also be helping future generations, is amazing.”

LifeArc has contributed £750,000 to the project and Lipomed, a Swiss life sciences company, has offered to provide both a cost-effective generic form of deferiprone, Deferiprone Lipomed, and a placebo to the trial – a Gift in Kind worth £250,000.

Dr Catriona Crombie, Head of LifeArc’s Rare Disease Translational Challenge, said: “Drug repurposing trials like this are an increasingly effective way of taking treatments that have already been approved and applying them to new conditions and diseases. This will help unlock new treatments for conditions that currently have few, if any, available."

Dr Chantal Manz, Chief Scientific Officer Lipomed AG, Switzerland, said: “Lipomed is very excited to support this promising study concept in patients with neuroferritinopathy, by providing deferiprone 500 mg film-coated tablets and matching placebo tablets. We recognise the unmet clinical need and the potentially significant benefit of this orally active iron chelator.  Deferiprone is able to penetrate the blood-brain barrier and may reduce cerebral iron accumulation in patients with this extremely rare, but devastating genetic neurodegenerative disorder, for which no alternative treatments are available.”

Adapted from a press release by LifeArc.

If successful, the trial will open the possibility of using a similar approach for other neurodegenerative conditions linked to the build-up of iron in the brain, including Parkinson’s disease.
Patrick Chinnery
Patrick Chinnery looks at brain scans


Fish fed to farmed salmon should be part of our diet, too, study suggests

Mackerel with potato salad

Scientists found that farmed salmon production leads to an overall loss of essential dietary nutrients. They say that eating more wild ‘feed’ species directly could benefit our health while reducing aquaculture demand for finite marine resources.

Researchers analysed the flow of nutrients from the edible species of wild fish used as feed to the farmed salmon they were fed to. They found a decrease in six out of nine nutrients in the salmon fillet – calcium, iodine, iron, omega-3, vitamin B12 and vitamin A – but increased levels of selenium and zinc.

Most wild ‘feed’ fish met dietary nutrient recommendations at smaller portion sizes than farmed Atlantic salmon, including omega-3 fatty acids which are known to reduce the risk of cardiovascular disease and stroke.

“What we’re seeing is that most species of wild fish used as feed have a similar or greater density and range of micronutrients than farmed salmon fillets,” said lead author, Dr David Willer, Zoology Department, University of Cambridge.

“Whilst still enjoying eating salmon and supporting sustainable growth in the sector, people should consider eating a greater and wider variety of wild fish species like sardines, mackerel and anchovies, to get more essential nutrients straight to their plate.”

In the UK, 71% of adults have insufficient vitamin D in winter, and teenage girls and women often have deficiencies of iodine, selenium and iron. Yet while 24% of adults ate salmon weekly, only 5.4% ate mackerel, 1% anchovies and just 0.4% herring.

“Making a few small changes to our diet around the type of fish that we eat can go a long way to changing some of these deficiencies and increasing the health of both our population and planet,” said Willer.

The researchers found that consuming one-third of current food-grade wild feed fish directly would be the most efficient way of maximising nutrients from the sea.

“Marine fisheries are important local and global food systems, but large catches are being diverted towards farm feeds. Prioritising nutritious seafood for people can help improve both diets and ocean sustainability,” said senior author Dr James Robinson, Lancaster University.

This approach could help address global nutrient deficiencies say the team of scientists from the University of Cambridge, Lancaster University, University of Stirling and the University of Aberdeen.

The study was published today in the journal Nature Food.

The scientists calculated the balance of nutrients in edible portions of whole wild fish, used within pelleted salmon feed in Norway, compared to the farmed salmon fillets.

They focused on nine nutrients that are essential in human diets and concentrated in seafood – iodine, calcium, iron, vitamin B12, vitamin A, omega-3 (EPA + DHA), vitamin D, zinc and selenium.

The wild fish studied included Pacific and Peruvian anchoveta, and Atlantic herring, mackerel, sprat and blue whiting – which are all marketed and consumed as seafood.

They found that these six feed species contained a greater, or similar, concentration of nutrients as the farmed salmon fillets. Quantities of calcium were over five times higher in wild feed fish fillets than salmon fillets, iodine was four times higher, and iron, omega-3, vitamin B12, and vitamin A were over 1.5 times higher.

Wild feed species and salmon had comparable quantities of vitamin D.

Zinc and selenium were found to be higher in salmon than the wild feed species – the researchers say these extra quantities are due to other salmon feed ingredients and are a real mark of progress in the salmon sector.

“Farmed salmon is an excellent source of nutrition, and is one of the best converters of feed of any farmed animal, but for the industry to grow it needs to become better at retaining key nutrients that it is fed. This can be done through more strategic use of feed ingredients, including from fishery by-products and sustainably-sourced, industrial-grade fish such as sand eels”, said Dr Richard Newton of the Institute of Aquaculture, University of Stirling, whose team also included Professor Dave Little, Dr Wesley Malcorps and Björn Kok.

“It was interesting to see that we’re effectively wasting around 80% of the calcium and iodine from the feed fish – especially when we consider that women and teenage girls are often not getting enough of these nutrients.”

Willer said: “These numbers have been underacknowledged by the aquaculture industry’s standard model of quoting Fish In Fish Out (FIFO) ratios rather than looking at nutrients.”

The researchers would like to see a nutrient retention metric adopted by the fishing and aquaculture industries. They believe that if combined with the current FIFO ratio, the industry could become more efficient, and reduce the burden on fish stocks that also provide seafood. The team are building a standardised and robust vehicle for integrating the nutrient retention metric into industry practice.
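
To illustrate the distinction, the toy calculation below contrasts a weight-based Fish In Fish Out (FIFO) ratio with a nutrient retention fraction. The function names and all numbers are invented for illustration; they are not figures from the study.

```python
# Toy illustration of a FIFO ratio versus a nutrient retention metric.
# All values below are made up for demonstration purposes.

def fifo_ratio(wild_fish_in_kg: float, farmed_fish_out_kg: float) -> float:
    """Weight of wild fish used in feed per kg of farmed fish produced."""
    return wild_fish_in_kg / farmed_fish_out_kg

def nutrient_retention(nutrient_in_feed_mg: float, nutrient_in_fillet_mg: float) -> float:
    """Fraction of a nutrient in the feed fish that ends up in the edible fillet."""
    return nutrient_in_fillet_mg / nutrient_in_feed_mg

# Hypothetical example: 1.2 kg of wild fish used per kg of salmon produced
print(f"FIFO ratio: {fifo_ratio(1.2, 1.0):.2f}")

# Hypothetical calcium budget: 1000 mg in the feed fish, 200 mg in the fillet,
# i.e. roughly 80% of the calcium is not retained in the edible product
retained = nutrient_retention(1000, 200)
print(f"Calcium retention: {retained:.0%} (so {1 - retained:.0%} is lost)")
```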

“We’d like to see the industry expand but not at a cost to our oceans,” said Willer.

“We’d also like to see a greater variety of affordable, convenient and appealing products made of wild ‘feed’ fish and fish and salmon by-products for direct human consumption.”

The research was funded by the Scottish Government’s Rural and Environmental Science and Analytical Services Division (RESAS), a Royal Society University Research Fellowship, a Leverhulme Trust Early Career Fellowship, a Henslow Fellowship at Murray Edwards College and the University of Cambridge.

Reference: Willer, D. et al: ‘Wild fish consumption can balance nutrient retention in farmed fish.’ Nature Food (2024). DOI: 10.1038/s43016-024-00932-z

The public are being encouraged to eat more wild fish, such as mackerel, anchovies and herring, which are often used within farmed salmon feeds. These oily fish contain essential nutrients including calcium, B12 and omega-3 but some are lost from our diets when we just eat the salmon fillet.

Making a few small changes to our diet around the type of fish that we eat can go a long way to changing some of these deficiencies and increasing the health of both our population and planet
Dr David Willer, Zoology Department
Mackerel with potato salad


Cambridge welcomes Harvard’s Interim President

Professor Debbie Prentice and Professor Alan Garber

After a meeting with Cambridge counterparts, the Harvard team went to Emmanuel College, which has strong historical links with Harvard, and took part in a symposium on research in the life sciences.

In the afternoon, Professor Garber and his colleagues visited the Cambridge Biomedical Campus to learn more about its world-leading medical science, research, education and patient care.

"It was a great pleasure to meet with our alumni in London and to travel to the University of Cambridge on my first presidential visit outside of the United States,” said Professor Garber. “Acknowledging our historical roots and celebrating our ongoing connections underscored for me the strength of our worldwide community. It was inspiring to see first-hand the many ways in which that strength is helping us fulfil our commitment to teaching, research, and innovation. I am grateful to the Vice-Chancellor and our colleagues in 'old Cambridge', and to our alumni community in London, for hosting me for these important conversations.”

Professor Prentice said: “I was delighted to welcome Professor Garber and the Harvard team to Cambridge. It was a wonderful opportunity to catch up with colleagues, discuss some of the great work going on in our institutions, and exchange ideas on a wide range of topics.”

The Vice-Chancellor, Professor Debbie Prentice, welcomed Professor Alan Garber, Interim President of Harvard University, and members of his senior team during a visit to Cambridge.

Professor Debbie Prentice and Professor Alan Garber


Three Cambridge researchers awarded Royal Academy of Engineering Chair in Emerging Technologies

Left to right: Manish Chhowalla, Nic Lane, Erwin Reisner

From atomically thin semiconductors for more energy-efficient electronics, to harnessing the power of the sun by upcycling biomass and plastic waste into sustainable chemicals, their research encompasses a variety of technological advances with the potential to deliver wide-ranging benefits.

Funded by the UK Department for Science, Innovation and Technology, the Academy’s Chair in Emerging Technologies scheme aims to identify global research visionaries and provide them with long-term support. Each £2,500,000 award covers employment and research costs, enabling each researcher to focus on advancing their technology to application in a strategic manner for up to 10 years.

Since 2017, the Chair in Emerging Technologies programme has awarded over £100 million to Chairs in 16 universities located across the UK. Of the four Chairs awarded in this round, three were awarded to Cambridge researchers.

Professor Manish Chhowalla FREng, from the Department of Materials Science and Metallurgy, is developing ultra-low-power electronics based on wafer-scale manufacture of atomically thin (or 2D) semiconductors. The atomically thin nature of the 2D semiconductors makes them ideal for energy-efficient electronics. To reap their benefits, complementary metal oxide semiconductor processes will be developed for integration into ultra-low power devices.

Professor Nic Lane and his team at the Department of Computer Science and Technology are working to make the development of AI more democratic by focusing on AI methods that are less centralised and more collaborative, and offer better privacy protection.

Their project, nicknamed DANTE, aims to encourage wider and more active participation across society in the development and adoption of AI techniques.

“Artificial intelligence (AI) is evolving towards a situation where only a handful of the largest companies in the world can participate,” said Lane. “Given the importance of this technology to society this trajectory must be changed. We aim to invent, popularise and commercialise core new scientific breakthroughs that will enable AI technology in the future to be far more collaborative, distributed and open than it is today.”

The project will focus on developing decentralised forms of AI that facilitate the collaborative study, invention, development and deployment of machine learning products and methods, primarily between collections of companies and organisations. An underlying mission of DANTE is to facilitate advanced AI technology remaining available for adoption in the public sphere, for example in hospitals, public policy, and energy and transit infrastructure.

Professor Erwin Reisner, from the Yusuf Hamied Department of Chemistry, is developing a technology, called solar reforming, that creates sustainable fuels and chemicals from biomass and plastic waste. This solar-powered technology uses only waste, water and air as ingredients, and the sun powers a catalyst to produce green hydrogen fuel and platform chemicals to decarbonise the transport and chemical sectors. A recent review in Nature Reviews Chemistry gives an overview of plans for the technology.

“The generous long-term support provided by the Royal Academy of Engineering will be the critical driver for our ambitions to engineer, scale and ultimately commercialise our solar chemical technology,” said Reisner. “The timing for this support is perfect, as my team has recently demonstrated several prototypes for upcycling biomass and plastic waste using sunlight, and we have excellent momentum to grasp the opportunities arising from developing these new technologies. I also hope to use this Chair to leverage further support to establish a circular chemistry centre in Cambridge to tackle our biggest sustainability challenges.”

“I am excited to announce this latest round of Chairs in Emerging Technology,” said Dr Andrew Clark, Executive Director, Programmes, at the Royal Academy of Engineering. “The mid-term reviews of the previous rounds of Chairs are providing encouraging evidence that long-term funding of this nature helps to bring the groundbreaking and influential ideas of visionary engineers to fruition. I look forward to seeing the impacts of these four exceptionally talented individuals.”

Three Cambridge researchers – Professors Manish Chhowalla, Nic Lane and Erwin Reisner – have each been awarded a Royal Academy of Engineering Chair in Emerging Technologies, to develop emerging technologies with high potential to deliver economic and social benefits to the UK.

University of Cambridge
L-R: Manish Chhowalla, Nic Lane, Erwin Reisner


Crews announced for The Boat Race 2024

The Cambridge and Oxford crews for The Boat Race 2024

The 36 student rowers who have won a place in the Blue Boat were announced at the event which was held in public for the first time in the Boat Races’ history.

The crews for the Women’s Race were unveiled first of all, and the Cambridge crew will feature two returning faces, Jenna Armstrong and Carina Graf, but for the others it is their first time in the coveted Blue Boat.

The crews for the Men’s Race were then unveiled, and this year there are five returning Blues: Seb Benzecry, Noam Mouelle, Tom Lynch, Luca Ferraro and Matt Edge.

The full line-ups are as follows:

Cambridge Women’s Blue Boat

Cox: Hannah Murphy (Girton - MPhil Health, Medicine and Society)
Stroke: Megan Lee (Lucy Cavendish - MPhil Management)
7: Iris Powell (Churchill - BA Natural Sciences)
6: Carys Earl (Gonville and Caius - BA Medicine)
5: Carina Graf (Emmanuel - PhD Neuro Sci)
4: Jenna Armstrong (Jesus - PhD Physiology)
3: Clare Hole (St Catharine’s - MPhil Population Health Sciences)
2: Jo Matthews (St John’s - BA Medicine (Clinical))
Bow: Gemma King (St John’s - MRes + PhD Stem Cell Biology)

Cambridge Men’s Blue Boat

Cox: Ed Bracey (Wolfson - MPhil Economics)
Stroke: Matt Edge (St Catharine’s - PhD Chem Eng)
7: Luca Ferraro (King’s - BA Classics)
6: Tom Lynch (Hughes Hall - PhD Engineering)
5: Kenny Coplan (Hughes Hall - MPhil History of Art)
4: Gus John (Wolfson - MPhil Medieval History)
3: Thomas Marsh (St John’s - PhD Physics)
2: Noam Mouelle (Hughes Hall - PhD Astrophysics)
Bow: Seb Benzecry (Jesus - PhD Film Studies)

Asked by the host of the event, BBC Sport commentator Andrew Cotter, about whether there was a challenge integrating new faces into the Blue Boat, Cambridge Women’s Coach Patrick Ryan said: “Actually there are no new faces, every single one of them is a returner – just new Blues!”

The Cambridge Men’s and Women’s clubs unified in 2020 and Patrick added: “As we’ve become one club, we’ve learned to share more information and work together, hopefully for the betterment of the athletes here tonight.”

Cambridge Men’s Coach Rob Baker was then asked whether the number of returning Blues in the Men’s Boat would give Cambridge an advantage.

“Every year is different, every year is a challenge,” said Rob. “It’s great to have guys that have won the race and been through the process before, but yes it’s always a big challenge but we are up for it”.

It was the first time that Battersea Power Station, which famously featured on a Pink Floyd album cover, has hosted the crew announcement event. Siobhan Cassidy, Chair of the Boat Race Company Limited, said the venue was appropriate, given that it was designed by Sir Giles Gilbert Scott, who was also behind iconic buildings at Cambridge and Oxford, including Cambridge University Library.

She added: “[The Boat Race] is the ultimate British tradition, which draws on its heritage yet, with boats full of young students, it is very much an event that looks to the future.

“These young people have a unique opportunity to take to the water on such a high profile day. In order to get there they have made incredible choices. They have combined a full-time rigorous academic schedule with training and racing throughout the year - so let’s not underestimate how impressive these young people really are”.

The Gemini Boat Race 2024 takes place in Putney, London, on Saturday 30 March  – with the Women’s Race starting at 14:46 BST and the Men’s Race at 15:46 BST – renewing an intense rivalry which stretches back nearly 200 years. The event will be broadcast live on BBC One from 14:00 BST.

Last year saw Cambridge University win both the men’s and women’s races, leaving the overall records at 86-81 in Cambridge’s favour in the men’s race and 47-30 in the women’s.

The Cambridge and Oxford crews for The Boat Race 2024 have been officially unveiled at a crew announcement held at the iconic Battersea Power Station.

The Cambridge and Oxford crews for The Boat Race 2024


Major investment in doctoral training announced

Two people working on circuit boards in an office

The 65 Engineering and Physical Sciences Research Council (EPSRC) Centres for Doctoral Training (CDTs) will support leading research in areas of national importance, including net zero, AI, defence and security, healthcare and quantum technologies. The £1 billion in funding – from government, universities and industry – represents the UK’s biggest-ever investment in engineering and physical sciences doctoral skills.

The University of Cambridge will lead two of the CDTs and is a partner in a further five CDTs. The funding will support roughly 150 Cambridge PhD students over the next five years.

The CDT in Future Infrastructure and Built Environment: Unlocking Net Zero (FIBE3 CDT), led by Professor Abir Al-Tabbaa from the Department of Engineering, will focus on meeting the needs of the infrastructure and construction sector in its pursuit of net zero by 2050 and is a collaboration between Cambridge, 30+ industry partners and eight international academic partners.

“The infrastructure sector is responsible for significant CO2 emissions, energy use and consumption of natural resources, and it’s key to unlocking net zero,” said Al-Tabbaa. “This CDT will develop the next generation of highly talented doctoral graduates who will be equipped to lead the design and implementation of the net zero infrastructure agenda in the UK.”

The FIBE3 CDT will provide more than 70 fully funded studentships over the next five years. The £8.1M funding from EPSRC is supported by £1.3M funding from the University and over £2.5M from industry as well as over £8.9M of in-kind contributions. Recruitment is underway for the first FIBE3 CDT cohort, to start in October.

The CDT in Sensor Technologies and Applications in an Uncertain World, led by Professor Clemens Kaminski from the Department of Chemical Engineering and Biotechnology, will cover the entire sensor research chain – from development to end of life – and will emphasise systems thinking, responsible research and innovation, co-creation, and cohort learning.

“Our CDT will provide students with comprehensive expertise and skills in sensor technology,” said Kaminski. “This programme will develop experts who are capable of driving impactful sensor solutions for industry and society, and can deal with uncertain data and the consequences of a rapidly changing world.”

The University is also a partner in:

  • EPSRC Centre for Doctoral Training in 2D Materials of Tomorrow (2DMoT), led by Professor Irina Grigorieva from the University of Manchester
  • EPSRC Centre for Doctoral Training Developing National Capability for Materials 4.0 and Henry Royce Institute, led by Professor William Parnell from the University of Manchester
  • EPSRC Centre for Doctoral Training in Superconductivity: Enabling Transformative Technologies, led by Professor Antony Carrington from the University of Bristol
  • EPSRC Centre for Doctoral Training in Aerosol Science: Harnessing Aerosol Science for Improved Security, Resilience and Global Health, led by Professor Jonathan Reid from the University of Bristol
  • EPSRC Centre for Doctoral Training in Photonic and Electronic Systems, led by Professor Alwyn Seeds from University College London

“As innovators across the world break new ground faster than ever, it is vital that government, business and academia invest in ambitious UK talent, giving them the tools to pioneer new discoveries that benefit all our lives while creating new jobs and growing the economy,” said Science and Technology Secretary, Michelle Donelan. “By targeting critical technologies including artificial intelligence and future telecoms, we are supporting world-class universities across the UK to build the skills base we need to unleash the potential of future tech and maintain our country’s reputation as a hub of cutting-edge research and development.”

“The Centres for Doctoral Training will help to prepare the next generation of researchers, specialists and industry experts across a wide range of sectors and industries,” said Professor Charlotte Deane, Executive Chair of the Engineering and Physical Sciences Research Council, part of UK Research and Innovation. “Spanning locations across the UK and a wide range of disciplines, the new centres are a vivid illustration of the UK’s depth of expertise and potential, which will help us to tackle large-scale, complex challenges and benefit society and the economy. The high calibre of both the new centres and applicants is a testament to the abundance of research excellence across the UK, and EPSRC’s role as part of UKRI is to invest in this excellence to advance knowledge and deliver a sustainable, resilient and prosperous nation.”

More than 4,000 doctoral students will be trained over the next nine years, building on EPSRC’s long-standing record of sustained support for doctoral training.

Total investment in the CDTs includes:

  • £479 million by EPSRC, including £16 million of additional UKRI funding to support CDTs in quantum technologies
  • Over £7 million from Biotechnology and Biological Sciences Research Council, also part of UKRI, to co-fund three CDTs
  • £16 million by the MOD to support two CDTs
  • £169 million by UK universities
  • plus a further £420 million in financial and in-kind support from business partners 

This investment includes an additional £135 million for CDTs which will start in 2025. More than 1,400 companies, higher education institutions, charities and civic organisations are taking part in the centres for doctoral training. CDTs have a significant reputation for training future UK academics, industrialists and innovators who have gone on to develop the latest technologies.

Sixty-five Centres for Doctoral Training – which will train more than 4,000 doctoral students across the UK – have been announced by Science, Innovation and Technology Secretary Michelle Donelan.

Phynart Studio via Getty Images
Two people working on circuit boards


University signs Armed Forces Covenant

The Armed Forces Covenant is a promise that together we acknowledge and understand that those who serve or have served in the Armed Forces, and their families, should be treated with fairness and respect in the communities, economy, and society they serve with their lives.

The Covenant’s twin underlying principles are that members of the Armed Forces community should face no disadvantage compared to other citizens in the provision of public and commercial services; and that special consideration is appropriate in some cases, especially for those who have given the most such as the injured or the bereaved. The University has also pledged to appoint an Armed Forces Champion.

Vice-Chancellor, Professor Deborah Prentice, signed the Armed Forces Covenant on behalf of the University alongside the Chief of the Air Staff, Air Chief Marshal Sir Richard Knighton. 


Genetic mutation in a quarter of all Labradors hard-wires them for obesity

Brown labrador retriever dog looks at food treat

This obesity-driving combination means that dog owners must be particularly strict with feeding and exercising their Labradors to keep them slim.

The mutation is in a gene called POMC, which plays a critical role in hunger and energy use.

Around 25% of Labradors and 66% of flatcoated retriever dogs have the POMC mutation, which researchers previously showed causes increased interest in food and risk of obesity.

The new study reveals how the mutation profoundly changes the way Labradors and flatcoated retrievers behave around food. It found that although they don’t need to eat more to feel full, they are hungrier in between meals.

In addition, dogs with the POMC mutation were found to use around 25% less energy at rest than dogs without it, meaning they don’t need to consume as many calories to maintain a healthy body weight.

“We found that a mutation in the POMC gene seems to make dogs hungrier. Affected dogs tend to overeat because they get hungry between meals more quickly than dogs without the mutation,” said Dr Eleanor Raffan, a researcher in the University of Cambridge’s Department of Physiology, Development and Neuroscience who led the study.

She added: “All owners of Labradors and flatcoated retrievers need to watch what they’re feeding these highly food-motivated dogs, to keep them a healthy weight. But dogs with this genetic mutation face a double whammy: they not only want to eat more, but also need fewer calories because they’re not burning them off as fast.”

The POMC mutation was found to alter a pathway in the dogs’ brains associated with body weight regulation. The mutation triggers a starvation signal that tells their body to increase food intake and conserve energy, despite this being unnecessary.

The results are published today in the journal Science Advances.

Raffan said: “People are often rude about the owners of fat dogs, blaming them for not properly managing their dogs’ diet and exercise. But we’ve shown that Labradors with this genetic mutation are looking for food all the time, trying to increase their energy intake. It’s very difficult to keep these dogs slim, but it can be done.”

The researchers say owners can keep their retrievers distracted from this constant hunger by spreading out each daily food ration, for example by using puzzle feeders or scattering the food around the garden so it takes longer to eat.

In the study, 87 adult pet Labrador dogs - all a healthy weight or moderately overweight - took part in several tests including the ‘sausage in a box’ test.

First, the dogs were given a can of dog food every 20 minutes until they chose not to eat any more. All ate huge amounts of food, but the dogs with the POMC mutation didn’t eat more than those without it. This showed that they all feel full with a similar amount of food.

Next, on a different day, the dogs were fed a standard amount of breakfast. Exactly three hours later they were offered a sausage in a box and their behaviour was recorded. The box was made of clear plastic with a perforated lid, so the dogs could see and smell the sausage, but couldn’t eat it.

The researchers found that dogs with the POMC mutation tried significantly harder to get the sausage from the box than dogs without it, indicating greater hunger.

The dogs were then allowed to sleep in a special chamber that measured the gases they breathed out. This revealed that dogs with the POMC mutation burn around 25% fewer calories than dogs without it.
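
The paper’s exact calculation is not described here, but resting energy expenditure is commonly derived from measured oxygen consumption and carbon dioxide production using the abbreviated Weir equation from indirect calorimetry. The sketch below applies that standard formula to invented gas values chosen so that the second example comes out roughly 25% lower.

```python
# Illustrative indirect calorimetry calculation using the abbreviated Weir
# equation. The gas exchange values are invented examples, and this is a
# standard textbook formula rather than the specific method used in the study.

def resting_energy_kcal_per_day(vo2_l_per_min: float, vco2_l_per_min: float) -> float:
    """Abbreviated Weir equation: energy use from O2 consumed and CO2 produced."""
    kcal_per_min = 3.941 * vo2_l_per_min + 1.106 * vco2_l_per_min
    return kcal_per_min * 60 * 24  # scale from per minute to per day

# Hypothetical dog without the POMC mutation
baseline = resting_energy_kcal_per_day(vo2_l_per_min=0.080, vco2_l_per_min=0.065)

# A dog burning ~25% fewer calories at rest shows proportionally lower gas exchange
mutant = resting_energy_kcal_per_day(vo2_l_per_min=0.060, vco2_l_per_min=0.049)

print(f"Baseline: {baseline:.0f} kcal/day, POMC-like example: {mutant:.0f} kcal/day "
      f"({1 - mutant / baseline:.0%} lower)")
```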

The POMC gene and the brain pathway it affects are similar in dogs and humans. The new findings are consistent with reports of extreme hunger in humans with POMC mutations, who tend to become obese at an early age and develop a host of clinical problems as a result.

Drugs currently in development for human obesity, underactive sexual desire and certain skin conditions target this brain pathway, so understanding it fully is important.

A mutation in the POMC gene in dogs prevents production of two chemical messengers in the dog brain, beta-melanocyte stimulating hormone (β-MSH) and beta-endorphin, but does not affect production of a third, alpha-melanocyte stimulating hormone (α-MSH).

Further laboratory studies by the team suggest that β-MSH and beta-endorphin are important in determining hunger and moderating energy use, and their role is independent of the presence of α-MSH. This challenges the previous belief, based on research in rats, that early onset human obesity due to POMC mutations is caused only by a lack of α-MSH. Rats don’t produce beta-melanocyte stimulating hormone, but humans and dogs produce both α- and β-MSH.

The research was funded by The Dogs Trust and Wellcome.

Reference: Dittmann, M T et al: ‘Low resting metabolic rate and increased hunger due to β-MSH and β-endorphin deletion in a canine model.’ Science Advances, March 2024. DOI: 10.1126/sciadv.adj3823

New research finds around a quarter of Labrador retriever dogs face a double-whammy of feeling hungry all the time and burning fewer calories due to a genetic mutation.

Labradors with this genetic mutation are looking for food all the time, trying to increase their energy intake. It’s very difficult to keep these dogs slim, but it can be done.
Eleanor Raffan
Jane Goodall
Labrador retriever dog


University statement on Budget 2024

The additional funding towards transport and health infrastructure around the Cambridge Biomedical Campus will help pave the way for sustainable growth. We will continue to work with local partners and central government on the development of a long-term funding settlement for Cambridge to be announced at the next spending review.

We also welcome the Department for Levelling Up, Housing and Communities’ ambitions for the future growth of Cambridge as set out in their Case for Cambridge paper. It recognises the unparalleled research and scientific capabilities that the city holds and its potential to be the world’s leading scientific hub. It also rightly points to the issues that threaten this vision, from housing prices to water shortages and transport gridlock. We welcome the Government’s commitment to work with all local partners to seek solutions to these challenges.

We have been calling for solutions to address the water scarcity in the region. Today’s announcement by DEFRA and the Environment Agency enables current projects under the 2018 local plan to go ahead. It is good news that the announcement recognises the importance for water resources to meet the needs of Cambridge residents and business, but also the needs of the natural environment. Greater certainty around both long-term water supply and plans set out to offset demand in the short term can help support the growth of Cambridge in a way that is both sustainable and supports the economic potential of the area.

AstraZeneca

We are delighted with today’s announcement that our friends at AstraZeneca intend to expand their footprint on the Cambridge Biomedical Campus and invest in the building of a vaccine manufacturing hub in Liverpool. This will further strengthen the central role that AstraZeneca plays at the heart of the UK life sciences sector and the Cambridge cluster.

Scientists at AstraZeneca have been working with the University of Cambridge for more than two decades. Today there are more than 130 active collaborations between the two organisations - developing new treatments that will make a real difference to patients’ lives.

We welcome today's announcements on steps to unlock Cambridge's potential as the world's leading scientific powerhouse.


Astronomers spot oldest ‘dead’ galaxy yet observed

False-colour JWST image of a small fraction of the GOODS South field, with JADES-GS-z7-01-QU highlighted

Using the James Webb Space Telescope, an international team of astronomers led by the University of Cambridge have spotted a ‘dead’ galaxy when the universe was just 700 million years old, the oldest such galaxy ever observed.

This galaxy appears to have lived fast and died young: star formation happened quickly and stopped almost as quickly, which is unexpected for so early in the universe’s evolution. However, it is unclear whether this galaxy’s ‘quenched’ state is temporary or permanent, and what caused it to stop forming new stars.

The results, reported in the journal Nature, could be important to help astronomers understand how and why galaxies stop forming new stars, and whether the factors affecting star formation have changed over billions of years.

“The first few hundred million years of the universe was a very active phase, with lots of gas clouds collapsing to form new stars,” said Tobias Looser from the Kavli Institute for Cosmology, the paper’s first author. “Galaxies need a rich supply of gas to form new stars, and the early universe was like an all-you-can-eat buffet.”

“It’s only later in the universe that we start to see galaxies stop forming stars, whether that’s due to a black hole or something else,” said co-author Dr Francesco D’Eugenio, also from the Kavli Institute for Cosmology.

Astronomers believe that star formation can be slowed or stopped by different factors, all of which will starve a galaxy of the gas it needs to form new stars. Internal factors, such as a supermassive black hole or feedback from star formation, can push gas out of the galaxy, causing star formation to stop rapidly. Alternatively, gas can be consumed very quickly by star formation, without being promptly replenished by fresh gas from the surroundings of the galaxy, resulting in galaxy starvation.

“We’re not sure if any of those scenarios can explain what we’ve now seen with Webb,” said co-author Professor Roberto Maiolino. “Until now, to understand the early universe, we’ve used models based on the modern universe. But now that we can see so much further back in time, and observe that the star formation was quenched so rapidly in this galaxy, models based on the modern universe may need to be revisited.”

Using data from JADES (JWST Advanced Deep Extragalactic Survey), the astronomers determined that this galaxy experienced a short and intense burst of star formation lasting somewhere between 30 and 90 million years. But between 10 and 20 million years before the point at which it was observed with Webb, star formation suddenly stopped.

“Everything seems to happen faster and more dramatically in the early universe, and that might include galaxies moving from a star-forming phase to dormant or quenched,” said Looser.

Astronomers have previously observed dead galaxies in the early universe, but this galaxy is the oldest yet – just 700 million years after the big bang, more than 13 billion years ago. This observation is one of the deepest yet made with Webb.
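
Those ages come from converting the galaxy’s measured redshift into a cosmic age under a standard cosmology. The sketch below is a rough illustration using astropy’s built-in Planck 2018 parameters; the redshift value is an assumption suggested by the ‘z7’ in the galaxy’s name, not the precise measurement reported in the paper.

```python
# Rough illustration of converting a redshift into 'time after the Big Bang'
# using astropy's built-in Planck 2018 cosmology. The redshift is an assumed,
# approximate value, not the measurement from the paper.
from astropy.cosmology import Planck18
import astropy.units as u

z = 7.3  # assumed redshift, suggested by the 'z7' in JADES-GS-z7-01-QU

age_at_z = Planck18.age(z)            # age of the universe when the light left the galaxy
lookback = Planck18.lookback_time(z)  # how long that light has travelled to reach us

print(f"Universe age at z={z}: {age_at_z.to(u.Myr):.0f}")  # roughly 700 Myr
print(f"Lookback time: {lookback.to(u.Gyr):.1f}")          # more than 13 Gyr
print(f"Universe age today: {Planck18.age(0).to(u.Gyr):.1f}")
```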

In addition to being the oldest of its kind, this galaxy also has a relatively low mass – about the same as the Small Magellanic Cloud (SMC), a dwarf galaxy near the Milky Way, although the SMC is still forming new stars. Other quenched galaxies in the early universe have been far more massive, but Webb’s improved sensitivity allows smaller and fainter galaxies to be observed and analysed.

The astronomers say that although the galaxy appears dead at the time of observation, it is possible that in the roughly 13 billion years since, it has come back to life and started forming new stars again.

“We’re looking for other galaxies like this one in the early universe, which will help us place some constraints on how and why galaxies stop forming new stars,” said D’Eugenio. “It could be the case that galaxies in the early universe ‘die’ and then burst back to life – we’ll need more observations to help us figure that out.”

The research was supported in part by the European Research Council, the Royal Society, and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

 

Reference:
Tobias J Looser et al. ‘A recently quenched galaxy 700 million years after the Big Bang.’ Nature (2024). DOI: 10.1038/s41586-024-07227-0

A galaxy that suddenly stopped forming new stars more than 13 billion years ago has been observed by astronomers.


Neon sign identified by JWST gives clue to planet formation

Artist's impression of the surroundings of the supermassive black hole in NGC 3783

Planetary systems like our Solar System seem to contain more rocky objects than gas-rich ones. Around our Sun, these include the inner planets, the asteroid belt and the Kuiper belt. But scientists have long known that planet-forming discs start with 100 times more mass in gas than in solids, which leads to a pressing question: when and how does most of the gas leave the disc or system?

JWST is helping scientists uncover how planets form by advancing understanding of their birthplaces: the circumstellar discs surrounding young stars. In a new study published in the Astronomical Journal, a team of scientists led by the University of Arizona, and including researchers from the University of Leicester and the University of Cambridge, has imaged for the first time an old planet-forming disc (still very young relative to the Sun) that is actively dispersing its gas content.

Knowing when the gas disperses is important as it constrains the time that is left for nascent planets to consume the gas from their surroundings.

During the very early stages of planetary system formation, planets coalesce in a spinning disc of gas and tiny dust around the young star. These particles clump together, building up into bigger and bigger chunks called planetesimals. Over time, these planetesimals collide and stick together, eventually forming planets. The type, size, and location of planets that form depend on the amount of material available and how long it remains in the disc. So, the outcome of planet formation depends on the evolution and dispersal of the disc.

At the heart of this discovery is the observation of T Cha, a young star (relative to the Sun) enveloped by an eroding disc notable for its vast dust gap, approximately 30 astronomical units in radius. For the first time, astronomers have imaged the dispersing gas (also known as a disc wind) using four spectral lines of the noble gases neon (Ne) and argon (Ar), including one line detected for the first time in a planet-forming disc. The images of [Ne II] show that the wind is coming from an extended region of the disc. The team is also interested in how this process takes place, so they can better understand its history and its impact on our Solar System.

Scientists have been trying to understand the mechanisms behind the winds in protoplanetary discs for over a decade. The observations by JWST represent a huge step-change in the data they have to work with, compared to previous data from ground-based telescopes.

“We first used neon to study planet-forming discs more than a decade ago, testing our computational simulations against data from Spitzer, and new observations we obtained with the ESO VLT,” said co-author Professor Richard Alexander from the University of Leicester. “We learned a lot, but those observations didn’t allow us to measure how much mass the discs were losing.

“The new JWST data are spectacular, and being able to resolve disc winds in images is something I never thought would be possible.  With more observations like this still to come, JWST will enable us to understand young planetary systems as never before.”

“These winds could be driven either by high-energy stellar photons (the star's light) or by the magnetic field that weaves the planet-forming disc,” said Naman Bajaj from the University of Arizona, the study’s lead author.

To differentiate between the two, the same group, this time led by Dr Andrew Sellek of Leiden Observatory and previously of the Institute of Astronomy at the University of Cambridge, performed simulations of the dispersal driven by stellar photons. They compared these simulations with the actual observations and found that dispersal by high-energy stellar photons can explain the observations, and hence cannot be excluded as a possibility.

“The simultaneous measurement of all four lines by JWST proved crucial to pinning down the properties of the wind and helped us to demonstrate that significant amounts of gas are being dispersed,” said Sellek.

To put this into context, the researchers calculate that the mass of gas dispersing from the disc every year is equivalent to that of the Moon! These results will be published in a companion paper, currently under review at the Astronomical Journal.
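
For a rough sense of scale, a lunar mass of gas per year can be expressed in the solar-masses-per-year units usually quoted for disc-wind mass-loss rates. The sketch below is an illustrative unit conversion only, not a figure taken from the companion paper.

```python
from astropy import units as u

# Illustrative unit conversion only: one lunar mass of gas lost per year,
# expressed in the solar masses per year typically used for disc-wind rates.
M_moon = 7.342e22 * u.kg                 # mass of the Moon
rate = (M_moon / u.yr).to(u.solMass / u.yr)
print(f"{rate:.1e}")                     # ~3.7e-08 solMass / yr
```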

The [Ne II] line was discovered towards several planet-forming discs in 2007 with the Spitzer Space Telescope and soon identified as a tracer of winds by team member Professor Ilaria Pascucci at the University of Arizona; this transformed research efforts focused on understanding disc gas dispersal. Now the discovery of spatially resolved [Ne II] - as well as the first detection of [Ar III] - using the James Webb Space Telescope, could become the next step towards transforming our understanding of this process. 

The implications of these findings offer new insights into the complex interactions that lead to the dispersal of the gas and dust critical for planet formation. By understanding the mechanisms behind disc dispersal, scientists can better predict the timelines and environments conducive to the birth of planets. The team's work demonstrates the power of JWST and sets a new path for exploring planet formation dynamics and the evolution of circumstellar discs.

Reference:
Naman S Bajaj et al. ‘JWST MIRI MRS Observations of T Cha: Discovery of a Spatially Resolved Disk Wind.’ The Astronomical Journal (2024). DOI: 10.3847/1538-3881/ad22e1

Adapted from a University of Leicester press release.

The winds that help to form planets in the gaseous discs of early solar systems have been imaged for the first time by the James Webb Space Telescope (JWST) using the noble gases neon and argon.


Low iron levels resulting from infection could be key trigger of long COVID

A man sitting on a couch holding his head in his hands

The discovery not only points to possible ways to prevent or treat the condition, but could help explain why symptoms similar to those of long COVID are also commonly seen in a number of post-viral conditions and chronic inflammation.

Although estimates are highly variable, as many as three in 10 people infected with SARS-CoV-2 could go on to develop long COVID, with symptoms including fatigue, shortness of breath, muscle aches and problems with memory and concentration (‘brain fog’). An estimated 1.9 million people in the UK alone were experiencing self-reported long COVID as of March 2023, according to the Office for National Statistics.

Shortly after the start of the COVID-19 pandemic, researchers at the University of Cambridge began recruiting people who had tested positive for the virus to the COVID-19 cohort of the National Institute for Health and Care Research (NIHR) BioResource. These ranged from asymptomatic healthcare staff identified via routine screening to patients admitted to Cambridge University Hospitals NHS Foundation Trust, some of them to its intensive care unit.

Over the course of a year, participants provided blood samples, allowing researchers to monitor changes in the blood post-infection. As it became clear that a significant number of patients would go on to have symptoms that persisted – long COVID – researchers were able to track back through these samples to see whether any changes in the blood correlated with their later condition.

In findings published in Nature Immunology, researchers at the Cambridge Institute of Therapeutic Immunology and Infectious Disease (CITIID), University of Cambridge, together with colleagues at Oxford, analysed blood samples from 214 individuals. Approximately 45% of those questioned about their recovery reported symptoms of long COVID between three and ten months later.

Professor Ken Smith, who was Director of CITIID at the time of the study and will take up a position as Director of the Walter and Eliza Hall Institute of Medical Research (WEHI) in Melbourne, Australia, in April, said: “Having recruited a group of people with SARS-CoV-2 early in the pandemic, analysis of several blood samples and clinical information collected over a 12 month period after infection has proved invaluable in giving us important and unexpected insights into why, for some unlucky individuals, initial SARS-CoV-2 infection is followed by months of persistent symptoms.”

The team discovered that ongoing inflammation – a natural part of the immune response to infection – and low iron levels in blood, contributing to anaemia and disrupting healthy red blood cell production, could be seen as early as two weeks post COVID-19 in those individuals reporting long COVID many months later.

Early iron dysregulation was detectable in the long COVID group independent of age, sex, or initial COVID-19 severity, suggesting a possible impact on recovery even in those who were at low risk for severe COVID-19, or who did not require hospitalisation or oxygen therapy when sick.

Dr Aimee Hanson, who worked on the study while at the University of Cambridge, and is now at the University of Bristol, said: “Iron levels, and the way the body regulates iron, were disrupted early on during SARS-CoV-2 infection, and took a very long time to recover, particularly in those people who went on to report long COVID months later.

“Although we saw evidence that the body was trying to rectify low iron availability and the resulting anaemia by producing more red blood cells, it was not doing a particularly good job of it in the face of ongoing inflammation.”

Interestingly, although iron dysregulation was more profound during and following severe COVID-19, those who went on to develop long COVID after a milder course of acute COVID-19 showed similar patterns in the blood. The most pronounced association with long COVID was how quickly inflammation, iron levels and regulation returned to normal following SARS-CoV-2 infection – though symptoms tended to continue long after iron levels had recovered.

Co-author Professor Hal Drakesmith, from the MRC Weatherall Institute of Molecular Medicine at the University of Oxford, said iron dysregulation is a common consequence of inflammation and is a natural response to infection.

“When the body has an infection, it responds by removing iron from the bloodstream. This protects us from potentially lethal bacteria that capture the iron in the bloodstream and grow rapidly. It’s an evolutionary response that redistributes iron in the body, and the blood plasma becomes an iron desert.

“However, if this goes on for a long time, there is less iron for red blood cells, so oxygen is transported less efficiently, affecting metabolism and energy production, and for white blood cells, which need iron to work properly. The protective mechanism ends up becoming a problem.”

The findings may help explain why symptoms such as fatigue and exercise intolerance are common in long COVID, as well as in several other post-viral syndromes with lasting symptoms.

The researchers say the study points to potential ways of preventing or reducing the impact of long COVID by rectifying iron dysregulation in early COVID-19 to prevent adverse long-term health outcomes.

One approach might be controlling the extreme inflammation as early as possible, before it impacts on iron regulation. Another approach might involve iron supplementation; however as Dr Hanson pointed out, this may not be straightforward.

“It isn't necessarily the case that individuals don't have enough iron in their body, it's just that it’s trapped in the wrong place,” she said. “What we need is a way to remobilise the iron and pull it back into the bloodstream, where it becomes more useful to the red blood cells.”

The research also supports ‘accidental’ findings from other studies, including the IRONMAN study, which was looking at whether iron supplements benefited patients with heart failure – the study was disrupted due to the COVID-19 pandemic, but preliminary findings suggest that trial participants were less likely to develop severe adverse effects from COVID-19. Similar effects have been observed among people living with the blood disorder beta-thalassemia, which can cause individuals to produce too much iron in their blood.

The research was funded by Wellcome, the Medical Research Council, NIHR and European Union Horizon 2020 Programme.

Reference
Hanson, AL et al. Iron dysregulation and inflammatory stress erythropoiesis associates with long-term outcome of COVID-19. Nat Imm; 1 March 2024; DOI: 10.1038/s41590-024-01754-8

Problems with iron levels in the blood and the body’s ability to regulate this important nutrient as a result of SARS-CoV-2 infection could be a key trigger for long COVID, new research has discovered.


Pythagoras was wrong: there are no universal musical harmonies, study finds

A man playing a bonang

According to the Ancient Greek philosopher Pythagoras, ‘consonance’ – a pleasant-sounding combination of notes – is produced by special relationships between simple numbers such as 3 and 4. More recently, scholars have tried to find psychological explanations, but these ‘integer ratios’ are still credited with making a chord sound beautiful, and deviation from them is thought to make music ‘dissonant’, or unpleasant to the ear.

But researchers from the University of Cambridge, Princeton and the Max Planck Institute for Empirical Aesthetics, have now discovered two key ways in which Pythagoras was wrong.

Their study, published in Nature Communications, shows that in normal listening contexts, we do not actually prefer chords to be perfectly in these mathematical ratios.

“We prefer slight amounts of deviation. We like a little imperfection because this gives life to the sounds, and that is attractive to us,” said co-author, Dr Peter Harrison, from Cambridge’s Faculty of Music and Director of its Centre for Music and Science.
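
To make ‘slight deviation’ concrete, mistuning is usually measured in cents (hundredths of a semitone). The sketch below shows the calculation for a marginally sharp fifth; the ratio chosen is purely illustrative and is not a value reported in the study.

```python
import math

def cents(ratio):
    """Size of a frequency ratio in cents (1200 cents = one octave)."""
    return 1200 * math.log2(ratio)

just_fifth = 3 / 2            # the Pythagorean 3:2 ratio
sharp_fifth = 1.505           # illustrative mistuning, not a value from the study

print(round(cents(sharp_fifth) - cents(just_fifth), 2))  # ~5.76 cents sharp
```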

The researchers also found that the role played by these mathematical relationships disappears when you consider certain musical instruments that are less familiar to Western musicians, audiences and scholars. These instruments tend to be bells, gongs, types of xylophones and other kinds of pitched percussion instruments. In particular, they studied the ‘bonang’, an instrument from the Javanese gamelan built from a collection of small gongs.

“When we use instruments like the bonang, Pythagoras's special numbers go out the window and we encounter entirely new patterns of consonance and dissonance,” said Dr Harrison, a Fellow of Churchill College.

“The shape of some percussion instruments means that when you hit them, and they resonate, their frequency components don’t respect those traditional mathematical relationships. That's when we find interesting things happening.”

“Western research has focused so much on familiar orchestral instruments, but other musical cultures use instruments that, because of their shape and physics, are what we would call ‘inharmonic’.”
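
The contrast between ‘harmonic’ and ‘inharmonic’ spectra can be illustrated with a textbook idealisation: a string’s partials sit at integer multiples of the fundamental, while the modes of a stiff bar (a rough stand-in for gong- or bell-like percussion) do not. The numbers below come from that standard approximation, not from measurements of a bonang.

```python
import numpy as np

f0 = 220.0  # fundamental frequency in Hz, chosen for illustration

# Idealised string: partials at integer multiples of the fundamental.
harmonic_partials = f0 * np.arange(1, 7)

# Idealised free bar: mode frequencies scale roughly as (2k + 1)^2, so they are
# not integer multiples of the lowest mode. A textbook approximation only.
k = np.arange(1, 7)
bar_modes = f0 * (2 * k + 1) ** 2 / 9.0   # normalised so the first mode equals f0

print("string partials / f0:", np.round(harmonic_partials / f0, 2))  # 1, 2, 3, ...
print("bar modes / f0:      ", np.round(bar_modes / f0, 2))          # 1, 2.78, 5.44, ...
```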

The researchers created an online laboratory in which over 4,000 people from the US and South Korea participated in 23 behavioural experiments. Participants were played chords and invited to give each a numeric pleasantness rating or to use a slider to adjust particular notes in a chord to make it sound more pleasant. The experiments produced over 235,000 human judgments.

The experiments explored musical chords from different perspectives. Some zoomed in on particular musical intervals and asked participants to judge whether they preferred them perfectly tuned, slightly sharp or slightly flat. The researchers were surprised to find a significant preference for slight imperfection, or ‘inharmonicity’. Other experiments explored harmony perception with Western and non-Western musical instruments, including the bonang.

 

Instinctive appreciation of new kinds of harmony

The researchers found that the bonang’s consonances mapped neatly onto the particular musical scale used in the Indonesian culture from which it comes. These consonances cannot be replicated on a Western piano, for instance, because they would fall between the cracks of the scale traditionally used. 

“Our findings challenge the traditional idea that harmony can only be one way, that chords have to reflect these mathematical relationships. We show that there are many more kinds of harmony out there, and that there are good reasons why other cultures developed them,” Dr Harrison said.

Importantly, the study suggests that its participants – not trained musicians and unfamiliar with Javanese music – were able to appreciate the new consonances of the bonang’s tones instinctively.

“Music creation is all about exploring the creative possibilities of a given set of qualities, for example, finding out what kinds of melodies can you play on a flute, or what kinds of sounds can you make with your mouth,” Harrison said.

“Our findings suggest that if you use different instruments, you can unlock a whole new harmonic language that people intuitively appreciate, they don’t need to study it to appreciate it. A lot of experimental music in the last 100 years of Western classical music has been quite hard for listeners because it involves highly abstract structures that are hard to enjoy. In contrast, psychological findings like ours can help stimulate new music that listeners intuitively enjoy.”

Exciting opportunities for musicians and producers

Dr Harrison hopes that the research will encourage musicians to try out unfamiliar instruments and see if they offer new harmonies and open up new creative possibilities. 

“Quite a lot of pop music now tries to marry Western harmony with local melodies from the Middle East, India, and other parts of the world. That can be more or less successful, but one problem is that notes can sound dissonant if you play them with Western instruments. 

“Musicians and producers might be able to make that marriage work better if they took account of our findings and considered changing the ‘timbre’, the tone quality, by using specially chosen real or synthesised instruments. Then they really might get the best of both worlds: harmony and local scale systems.”

Harrison and his collaborators are exploring different kinds of instruments and follow-up studies to test a broader range of cultures. In particular, they would like to gain insights from musicians who use ‘inharmonic’ instruments to understand whether they have internalised different concepts of harmony to the Western participants in this study.

Reference

R Marjieh, P M C Harrison, H Lee, F Deligiannaki, and N Jacoby, ‘Timbral effects on consonance disentangle psychoacoustic mechanisms and suggest perceptual origins for musical scales’, Nature Communications (2024). DOI: 10.1038/s41467-024-45812-z

The tone and tuning of musical instruments has the power to manipulate our appreciation of harmony, new research shows. The findings challenge centuries of Western music theory and encourage greater experimentation with instruments from different cultures.


Opinion: the future of science is automation

Robot arm handling test tubes

Thanks to the widespread availability of food and medical care, the ability to travel, and many other scientific and technological developments, billions of people today are living better lives than kings of centuries past. It is deeply surprising to me how little appreciated this astonishing fact is.

Of course, despite all the progress we’ve made, the world faces many challenges in the 21st century: climate change, pandemics, poverty and cancer, to name just a few.

If all the countries in the world could join together to share technology and resources, we might be able to deal with and overcome these challenges. However, history presents no example of such collaboration, and the current geopolitical situation does not offer much in the way of hope.

Our best hope of dealing with these challenges is to make science and technology more productive. The only feasible way to achieve this is through the integration of Artificial Intelligence (AI) and laboratory automation.

AI systems already possess superhuman scientific powers. They can remember massive volumes of facts and learn from huge datasets. They can execute flawless logical reasoning and near-optimal probabilistic reasoning. They can read every scientific paper, indeed everything ever written. These powers are complementary to those of human scientists.

When the scientific method was developed in the 17th century, one of the core insights was the need to conduct experiments in the physical world, not just to think.

Today, laboratory automation is steadily advancing, and robots can now carry out most of the laboratory tasks that humans can. We are also now seeing the emergence of the ‘Cloud Lab’ concept. The idea is to provide laboratory automation at scale and remotely, with scientists sending their samples to the cloud lab, using a computer interface to design and execute their experiments.

And then there are AI Scientists: AI systems integrated with laboratory automation that are capable of closed-loop automation of scientific research (also known as ‘Robot Scientists’ or ‘Self-driving Labs’). These systems automatically originate hypotheses to explain observations, devise experiments to test these hypotheses, physically run the experiments using laboratory robotics, interpret the results, and then repeat the cycle.
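
As a toy illustration of that closed loop, the sketch below searches a small hypothesis space by repeatedly ‘running an experiment’ against a simulated instrument. All names and the selection strategy are illustrative assumptions, not the interface of any existing Robot Scientist system.

```python
import random

HIDDEN_OPTIMUM = 7  # the "truth" known only to the simulated instrument

def run_experiment(candidate):
    """Simulated assay: does this candidate behave like the hidden optimum?"""
    return candidate == HIDDEN_OPTIMUM

def closed_loop(search_space, budget):
    """Hypothesise, test, interpret, repeat - a toy version of the cycle."""
    hypotheses = list(search_space)
    for _ in range(budget):
        candidate = random.choice(hypotheses)   # naive experiment selection
        if run_experiment(candidate):           # the physical experiment is the arbiter of truth
            return candidate                    # hypothesis confirmed
        hypotheses.remove(candidate)            # interpret the result: prune the hypothesis
    return None                                 # budget exhausted without confirmation

print(closed_loop(range(20), budget=10))
```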

AI Scientists can work more cheaply, faster, more accurately and for longer than humans. They can also be easily multiplied. As the experiments are conceived and executed automatically by computer, it is possible to completely capture and digitally curate all aspects of the scientific process, making the science more reproducible. There are now around 100 AI Scientists around the world, working in areas from quantum mechanics to astronomy, and from chemistry to medicine.

Within the last year or so the world has been stunned by the success of Large Language Models (LLMs) such as ChatGPT, which have achieved breakthrough performance on a wide range of conversation-based tasks. LLMs are surprisingly strong absorbers of technical knowledge, such as chemical reactions and logical expressions. LLMs, and more broadly Foundation Models, show great potential for super-charging AI Scientists. They can act both as a source of scientific knowledge, since they have read all the scientific literature, and a source of new scientific hypotheses.

One of the current problems with LLMs is their tendency to hallucinate, that is, to output statements that are not true. While this is a serious problem in many applications, it is not necessarily so in science, where physical experiments are the arbiters of truth. Hallucinations are hypotheses.

AI has been used as a tool in the research behind tens of thousands of scientific papers. We believe this is only a start. We believe that AI has the potential to transform the very process of science.

We believe that by harnessing the power of AI, we can propel humanity toward a future where groundbreaking achievements in science, even achievements worthy of a Nobel Prize, can be fully automated. Such advances could transform science and technology, and provide hope of dealing with the formidable challenges that face humankind in the 21st century.

The Nobel Turing Challenge aims to develop AI Scientists capable of making Nobel-quality scientific discoveries at a level comparable, and possibly superior to the best human scientists by 2050.

As well as being a potentially transformative power for good, the application of AI to science has the potential for harm. As a step towards preventing this harm, my colleagues and I have prepared the Stockholm Declaration on AI for Science, which commits its signatories to the responsible and ethical development of AI for science. A copy of the declaration can be signed at: https://sites.google.com/view/stockholm-declaration

We urge all scientists working with AI to sign.

Professor Ross King from Cambridge's Department of Chemical Engineering and Biotechnology, who originated the idea of a 'Robot Scientist', discusses why he believes that AI-powered scientists could surpass the best human scientists by the middle of the century, but only if AI for science is developed responsibly and ethically. 


NHS trial of sponge-on-a-string test replaces need for endoscopy for thousands of patients

Capsule and sponge

The NHS pilot, which has tested over 8,500 patients with the ‘capsule sponge test’, showed that almost eight out of 10 patients who completed a test were discharged without the need for further testing, freeing up endoscopy capacity for higher-risk patients and those referred for urgent tests for oesophageal cancer.

The test involves patients swallowing a small capsule-shaped device containing a tiny sponge, which collects cell samples for analysis before being extracted via a thread attached to the sponge. It was developed by Professor Rebecca Fitzgerald, Director of the Early Cancer Institute at the University of Cambridge.

Professor Fitzgerald said: “It is very exciting to see the positive results of the NHS England real-world pilot for our capsule-sponge test. This is a major step forward to making this simple test more routinely available outside of clinical trials. Timely diagnosis is vital for improving outcomes for patients.”

Barrett’s oesophagus – a condition affecting the food pipe which can go on to cause oesophageal cancer in some patients – is usually diagnosed or ruled out via endoscopy (a camera test of the food pipe) following a GP referral to a gastroenterologist or other specialist practitioner who can carry out the procedure.

The sponge-on-a-string test being trialled by the NHS can instead be carried out quickly in a short appointment, without the need for sedation.

Amanda Pritchard, NHS chief executive, said: “Thousands of people have now benefitted from this incredibly efficient test on the NHS – while the sponge on a string is small in size, it can make a big difference for patients – they can conveniently fit the test into their day and it can often replace the need for an endoscopy while also helping to reduce waiting lists by freeing up staff and resources.

“The NHS is always striving to adopt the latest innovations and new ways of working that help improve patient experience and increase efficiency – the simple sponge on a string test is just one example of many pioneering tools we’ve trialled in recent years to help diagnose and treat people sooner.”

In a survey of over 350 patients who had the capsule sponge test, patients often said they would recommend the test to a friend or family member, and 94% of patients reported experiencing only mild or no pain at all.

The NHS began piloting the test during the pandemic when there was increased pressure on services and a growing backlog for endoscopy.

Gastro-oesophageal reflux, also known as acid reflux, is a relatively common condition, affecting around one to two in every ten people to some degree, and some of these people may already have or will develop Barrett’s oesophagus, which is a precursor to oesophageal cancer.

There are around 9,300 new oesophageal cancer cases in the UK every year. The key to saving lives is to detect the disease at the earlier stage of Barrett’s oesophagus, before it becomes cancerous.

The NHS pilot was launched at 30 hospital sites across 17 areas in England, including Manchester, Plymouth, London, Kent and Cumbria. Evaluation of the pilot showed that using the capsule sponge was highly cost-effective compared with an endoscopy-only approach to diagnosis, saving around £400 per patient.

Patients with positive results from the capsule sponge test who were referred on for an endoscopy had the highest prevalence of Barrett’s oesophagus at 27.2%, compared to zero patients with negative results who completed an endoscopy.

One of the first pilot sites, at East and North Hertfordshire NHS Trust, has now performed around 1,400 capsule sponge tests – offering them both to patients with reflux symptoms, via a new consultant-led, nurse-run early diagnosis service, and to patients on an existing Barrett’s surveillance programme.

In the first 1,000 patients, the capsule test identified Barrett’s in 6% of patients with reflux and found two new cancers and three patients with dysplasia, who might otherwise have waited longer for a diagnosis. Meanwhile, 72% of reflux patients were discharged back to their GP without the need for an endoscopy.

As of January, 368 patients had received a positive test result, of whom about half had confirmed Barrett’s oesophagus.

Dr Danielle Morris, a consultant gastroenterologist at East and North Hertfordshire NHS Trust, said: “Using the capsule sponge test as a diagnosis triage tool has had huge benefits for patients, avoiding the need for unnecessary gastroscopy in almost seven out of 10 patients, and helping to reduce endoscopy waiting lists enabling us to prioritise those who really need endoscopy to have it done quickly.

“The test is performed by a single trained practitioner in an outpatient setting, so it is very resource light compared to gastroscopy, and our patients are very supportive of the service – with almost nine in 10 patients preferring the capsule sponge to a gastroscopy.”

Adapted from a press release from NHS England.

A new test to help diagnose a condition that can lead to oesophageal cancer – developed by Cambridge researchers and trialled by the NHS – has reduced the need for invasive endoscopy in thousands of low-risk patients.


Vice-Chancellor visits North West to encourage more Cambridge applications

Vice-Chancellor Professor Deborah Prentice

The trip aims to build on the progress made in recent years to welcome a more diverse group of students at Cambridge.

Accompanied by Baroness Sally Morgan of Huyton, Master of Fitzwilliam College, who comes from Liverpool herself, Professor Prentice is speaking to students, teachers and education leaders to hear about their experiences and the challenges they face.

The Liverpool Echo has covered the visit and has also published an op-ed by Professor Prentice, sharing her concerns that admissions to Cambridge are skewed towards London and the South East, and reflecting on a recently published Echo piece by Cambridge graduate Eva Carroll, who went to school in Everton, and who wants to inspire more talented people in the region to follow in her footsteps.

Professor Prentice writes:

I smiled with warm recognition last month when I read an inspiring article by one of our recent graduates in the Liverpool Echo. Eva Carroll, who comes from Everton, described arriving at the University of Cambridge and settling in, learning new ways and ancient traditions.

Eva’s story is not completely different to my own – although a few thousand miles and some decades apart. We both grew up with single parents and were the first in our families to go to university.

I’ve been Vice-Chancellor at Cambridge for just over seven months now. On arriving here, like Eva, I noticed many of the traditions and have quickly grown used to them. I do know that it is a place of extraordinary beauty, and the punting and gowns still exist against a backdrop of amazing history and achievement.

Yet it is also a vibrant place where people of all backgrounds come to learn, study and carry out world-leading research on issues which affect the lives of people right here in Liverpool and around the world, such as progress on cancer, on other areas of public health, plus AI and climate.

Today, I am getting to visit Liverpool for the first time, a great city which I have always wanted to see for myself. Growing up in a modest corner of Oakland, California in the 1960s, and loving music as my passion, I could only imagine this place, whose musical talent conquered the world.

Cambridge has a huge impact on the economy of the North West: a recent analysis showed that Cambridge contributes around £769m a year to the region’s economy through outstanding research that leads to new companies and economic activity taking place here, and delivering thousands of jobs.

And the University has made real progress in recent years in welcoming a more diverse group of students, and the proportion of students who join from state schools has risen significantly.

Despite this, I share Eva's concern that admissions to Cambridge - which is most certainly a national university - are skewed towards London and the South East. In 2022 nearly half of our undergraduate students came from those areas, while just 7.7% of applications came from the North West. I want the university to serve the UK as a whole.

We want to attract the best talent and the brightest minds wherever they are, and whatever their backgrounds. So my visit is a listening journey. I’m hearing from hard-working staff and the students themselves, and education leaders, about the challenges they face.

I will also hear what they think about Cambridge. Of course this city, and this region – I also visited Manchester University yesterday – have brilliant universities, and we aren’t trying to draw students away from those.

We know that there are many students in the North West, and beyond, who – for whatever reason - will get the grades but will not think of applying to Cambridge. Our aim is that those talented students will think about doing so.

Today, with Baroness Sally Morgan of Huyton (a proud Liverpudlian colleague who runs one of our Cambridge Colleges, Fitzwilliam), I am visiting St Michael’s Church of England School in Crosby which serves as the hub for the University’s HE+ programme on Merseyside, to talk with students who have applied to the University, and the teachers who have been supporting them, about their experiences.

I will also meet trustees from the Liverpool Aspire project, which supports students considering applications to both Cambridge and Oxford, and which Eva wrote about so positively.

It hosts workshops with students in Years 10 and 12, when they are making important choices, and it helps them to maximise their potential and make applications to university. Aspire has helped 120 talented students to get a place at Cambridge or Oxford to date. It is a fantastic initiative.

I may still be relatively new to the role, but I hope that encouraging more people from all backgrounds to apply to Cambridge from great places such as Liverpool, and right across the North West, can be one of my legacies. Cambridge needs more Eva Carrolls, and we must work hard to make that possible.

Vice-Chancellor Professor Deborah Prentice is this week visiting the North West of England – including Manchester and Liverpool – as part of the University’s work to encourage more applications from the region.


Having a ‘regular doctor’ can significantly reduce GP workload, study finds

Doctor examining a patient

In one of the largest studies of its kind, researchers from the University of Cambridge and INSEAD analysed data from more than 10 million consultations in 381 English primary care practices over a period of 11 years.

The results, reported in the journal Management Science, suggest that a long-term relationship between a patient and their doctor could both improve patient health and reduce workload for GPs.

The researchers found that when patients were able to see their regular doctor for a consultation – a model known as continuity of care – they waited on average 18% longer between visits, compared to patients who saw a different doctor. The productivity benefit of continuity of care was larger for older patients, those with multiple chronic conditions, and individuals with mental health conditions.

Although it will not always be possible for a patient to see their regular GP, this productivity differential would translate to an estimated 5% reduction in consultations if all practices in England were providing the level of care continuity of the best 10% of practices.

Primary care in the UK is under enormous strain: patients struggle to get appointments, GPs are retiring early, and financial pressures are causing some practices to close. According to the Health Foundation and the Nuffield Trust, there is a significant shortfall of GPs in England, with a projected 15% increase required in the workforce. The problem is not limited to the UK, however: the Association of American Medical Colleges estimates a shortfall of between 21,400 and 55,200 primary care physicians in the US by 2033.

“Productivity is a huge problem across the whole of the UK – we wanted to see how that’s been playing out in GP practices,” said Dr Harshita Kajaria-Montag, the study’s lead author, who is now based at the Kelley School of Business at Indiana University. “Does the rapid access model make GPs more productive?”

“You can measure the productivity of GP surgeries in two ways: how many patients can you see in a day, or how much health can you provide in a day for those patients,” said co-author Professor Stefan Scholtes from Cambridge Judge Business School. “Some GP surgeries are industrialised in their approach: each patient will get seven or ten minutes before the GP has to move on to the next one.”

At English GP practices, roughly half of all appointments are with a patient’s regular doctor, but this number has been steadily declining over the past decade as GP practices come under increasing strain.

The researchers used an anonymised dataset from the UK Clinical Practice Research Datalink, consisting of more than 10 million GP visits between 1 January 2007 and 31 December 2017. Using statistical models to account for confounding and selection bias, and restricting the sample to consultations with patients who had at least three consultations over the previous two years, they found that the time to a patient’s next visit is substantially longer when the patient sees the doctor they have seen most frequently over the past two years. There was no operationally meaningful difference in consultation duration.
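
For intuition only, the kind of association described above can be sketched with a simple regression on synthetic data. The column names, model form and effect size below are illustrative assumptions, not the paper’s actual specification.

```python
# For illustration only: synthetic data and an assumed model form, not the
# paper's actual specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "saw_regular_gp": rng.integers(0, 2, n),      # 1 = consulted the usual doctor
    "age": rng.integers(18, 95, n),
    "n_chronic_conditions": rng.poisson(1.5, n),
})
# Synthetic outcome: roughly 18% longer gap to the next visit with the regular GP.
df["days_to_next_visit"] = np.exp(
    3.5 + 0.18 * df["saw_regular_gp"]
    - 0.01 * df["n_chronic_conditions"]
    + rng.normal(0, 0.5, n)
)

model = smf.ols(
    "np.log(days_to_next_visit) ~ saw_regular_gp + age + n_chronic_conditions",
    data=df,
).fit()
print(model.params["saw_regular_gp"])  # recovers an effect close to 0.18 log points
```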

“The impact is substantial: it could be the equivalent of increasing the GP workforce by five percent, which would significantly benefit both patients and the NHS,” said Scholtes. “Better health translates into less demand for future consultations. Prioritising continuity of care is crucial in enhancing productivity.”

“The benefits of continuity of care are obvious from a relationship point of view,” said Kajaria-Montag. “If you’re a patient with complex health needs, you don’t want to have to explain your whole health history at every appointment. If you have a regular doctor who’s familiar with your history, it’s a far more efficient use of time, for doctor and patient.”

“A regular doctor may have a larger incentive to take more time to treat her regular patients thoroughly than a transactional provider,” said Scholtes. “Getting it right the first time will reduce her future workload by preventing revisits, which would likely be her responsibility, while a transactional provider is less likely to see the patient for her next visit.”

The researchers emphasise that continuity of care does not only have the known benefits of better patient outcomes, better patient and GP experience, and reduced secondary care use, but also provides a surprisingly large productivity benefit for the GP practices themselves. 

 

Reference:
Harshita Kajaria-Montag, Michael Freeman, Stefan Scholtes. ‘Continuity of Care Increases Physician Productivity in Primary Care.’ Management Science (2024). DOI: 10.1287/mnsc.2021.02015

If all GP practices moved to a model where patients saw the same doctor at each visit, it could significantly reduce doctor workload while improving patient health, a study suggests. 


Shimmering seaweeds and algae antennae: sustainable energy solutions under the sea

Seaweeds showing structural colour

Funded by the European Union’s Horizon 2020 research and innovation programme, the Bio-inspired and Bionic materials for Enhanced Photosynthesis (BEEP) project, led by Professor Silvia Vignolini in the Yusuf Hamied Department of Chemistry, studied how marine organisms interact with light.

The four-year sustainable energy project brought together nine research groups from across Europe and drew its inspiration from nature, in particular from the marine world, where organisms including algae, corals and sea slugs have evolved efficient ways to convert sunlight into energy. Harnessing these properties could aid in the development of new artificial and bionic photosynthetic systems.

Some of the brightest and most colourful materials in nature – such as peacock feathers, butterfly wings and opals – get their colour not from pigments or dyes, but from their internal structure alone. The colours our eyes perceive originate from the interaction between light and nanostructures at the surface of the material, which reflect certain wavelengths of light.
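
As a back-of-the-envelope illustration of how layered nanostructures select a reflected colour, the first-order Bragg condition for a periodic multilayer at normal incidence gives the peak reflected wavelength. The refractive indices and layer thicknesses below are textbook-style assumptions, not measurements from any organism studied in BEEP.

```python
# Textbook idealisation, not a model of any organism studied in BEEP: peak
# reflected wavelength of a two-material periodic multilayer at normal incidence.
def bragg_peak_nm(n1, d1_nm, n2, d2_nm):
    """First-order Bragg condition: lambda = 2 * (n1*d1 + n2*d2)."""
    return 2 * (n1 * d1_nm + n2 * d2_nm)

# Illustrative refractive indices and layer thicknesses (nm).
print(bragg_peak_nm(n1=1.55, d1_nm=80, n2=1.33, d2_nm=90))  # ~487 nm, blue-green
```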

As part of the BEEP project, the team studied structural colour in marine species. Some marine algae species have nanostructures in their cell walls that can transmit certain wavelengths of visible light or change their structures to guide the light inside the cell. Little is known about the function of these structures, however: scientists believe they might protect the organisms from UV light or optimise light harvesting capabilities.

The team studied the optical properties and light harvesting efficiency of a range of corals, sea-slugs, microalgae and seaweeds. By understanding the photonic and structural properties of these species, the scientists hope to design new materials for bio-photoreactors and bionic systems.

“We’re fascinated by the optical effects performed by these organisms,” said Maria Murace, a BEEP PhD candidate at Cambridge, who studies structural colour in seaweeds and marine bacteria. “We want to understand what the materials and the structures at the base of these colours are, which could lead to the development of green and sustainable alternatives to the conventional paints and toxic dyes we use today.”

BEEP also studied diatoms: tiny photosynthetic algae that live in almost every aquatic system on Earth and produce as much as half of the oxygen we breathe. The silica shells of these tiny algae form into stunning structures, but they also possess remarkable light-harvesting properties.

The BEEP team engineered tiny light-harvesting antennae and attached them to diatom shells. “These antennae allowed us to gather the light that would otherwise not be harvested by the organism, which is converted and used for photosynthesis,” said Cesar Vicente Garcia, one of the BEEP PhD students, from the University of Bari in Italy. “The result is promising: diatoms grow more! This research could inspire the design of powerful bio-photoreactors, or even better

The scientists engineered a prototype bio-photoreactor, consisting of a fully bio-compatible hydrogel that sustains the growth of microalgae and structurally coloured bacteria. The interaction between these organisms is mutually beneficial, enhancing microalgal growth and increasing the volume of biomass produced, which could have applications in the biofuel industry.

Alongside research, the network has organised several training and outreach activities, including talks and exhibitions for the public at science festivals in Italy, France and the UK.

“Society relies on science to drive growth and progress,” said Floriana Misceo, the BEEP network manager who coordinated outreach efforts. “It’s so important for scientists to share their research and help support informed discussion and debate because without it, misinformation can thrive, which is why training and outreach was an important part of this project.”

“Coordinating this project has been a great experience. I learned immensely from the other groups in BEEP and the young researchers,” said Vignolini. “The opportunity to host researchers from different disciplines in the lab was instrumental in developing new skills and approaching problems from a different perspective.”

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under a Marie Skłodowska-Curie grant.

How could tiny antennae attached to tiny algae speed up the transition away from fossil fuels? This is one of the questions being studied by Cambridge researchers as they search for new ways to decarbonise our energy supply, and improve the sustainability of harmful materials such as paints and dyes.


Long COVID linked to persistently high levels of inflammatory protein: a potential biomarker and target for treatments

Woman sitting on sofa in the dark, placing a hand to her forehead.

A University of Cambridge-led study identifies the protein interferon gamma (IFN-γ) as a potential biomarker for Long COVID fatigue and highlights an immunological mechanism underlying the disease, which could pave the way for the development of much needed therapies, and provide a head start in the event of a future coronavirus pandemic. 

The study, published today in Science Advances, followed a group of patients with Long COVID fatigue for over 2.5 years, to understand why some recovered and others did not. 

Long COVID continues to affect millions of people globally and is placing a major burden on health services. An estimated 1.9 million people in the UK alone (2.9% of the population) were experiencing self-reported Long COVID as of March 2023, according to the ONS. Fatigue remains by far the most common and debilitating symptom and patients are still waiting for an effective treatment.

The study shows that initial infection with SARS-CoV-2 triggers production of the antiviral protein IFN-γ, which is a normal reaction from the immune system. For most people, when their infection clears, COVID-19 symptoms cease and production of this protein stops, but the researchers found that high levels of IFN-γ persisted in some Long COVID patients for up to 31 months.

“We have found a potential mechanism underlying Long COVID which could represent a biomarker – that is, a tell-tale signature of the condition. We hope that this could help to pave the way to develop therapies and give some patients a firm diagnosis,” said co-author, Dr Benjamin Krishna, from the Cambridge Institute of Therapeutic Immunology & Infectious Disease (CITIID).

The research began in 2020 when Dr Nyarie Sithole (Hughes Hall) set up a Long COVID clinic in Cambridge’s Addenbrooke’s Hospital, where he started collecting blood samples from patients and set about studying their immunology. Sithole soon enlisted the support of Dr Benjamin Krishna and Dr Mark Wills from the University of Cambridge’s Department of Medicine.

“When the clinic started, a lot of people didn't even believe Long COVID was real,” Dr Sithole said. “We are indebted to all the patients who volunteered for this study, without whose support and participation we would obviously not have accomplished this study”.

The team studied 111 COVID-confirmed patients admitted to Addenbrooke’s Hospital (Cambridge University Hospitals NHS Foundation Trust), Royal Papworth Hospital, and Cambridge and Peterborough NHS Foundation Trust at 28 days, 90 days and 180 days following symptom onset. Between August 2020 and July 2021, they recruited 55 Long COVID patients – all experiencing severe symptoms at least 5 months after acute COVID-19 – attending the Long COVID clinic at Addenbrooke’s.

The researchers analysed blood samples for signs of cytokines, small proteins crucial to the functioning of immune system cells and blood cells. They found that the white blood cells of individuals infected with SARS-CoV-2 produced IFN-γ, a pro-inflammatory molecule, and that this persisted in Long COVID patients.

Dr Krishna said: “Interferon gamma can be used to treat viral infections such as hepatitis C but it causes symptoms including fatigue, fever, headache, aching muscles and depression. These symptoms are all too familiar to Long COVID patients. For us, that was another smoking gun.”

By conducting ‘cell depletion assays’, the team managed to identify the precise cell types responsible for producing IFN-γ. They pinpointed immune cells known as CD8+ T cells but found that they required contact with another immune cell type: CD14+ monocytes.

Previous studies have identified IFN-γ signatures using different approaches and cohorts, but this study’s focus on fatigue revealed a much stronger influence. Also, while previous studies have noticed IFN-γ levels rising, they have not followed patients long enough to observe when they might drop back down.

The Cambridge team followed its Long COVID cohort for up to 31 months post-infection. During this follow-up period, over 60% of patients experienced resolution of some, if not all, of their symptoms, which coincided with a drop in IFN-γ.

Vaccination helping Long COVID patients

The team measured IFN-γ release in Long COVID patients before and after vaccination and found a significant decrease in IFN-γ post vaccination in patients whose symptoms resolved.

“If SARS-CoV-2 continues to persist in people with Long COVID, triggering an IFN-γ response, then vaccination may be helping to clear this. But we still need to find effective therapies,” Dr Krishna said.

“The number of people with Long COVID is gradually falling, and vaccination seems to be playing a significant role in that. But new cases are still cropping up, and then there is the big question of what happens when the next coronavirus pandemic comes along. We could face another wave of Long COVID. Understanding what causes Long COVID now could give us a crucial head start.”

Microclotting

Some well-publicised previous studies have proposed microclotting as a principal cause of Long COVID. While not ruling out a role of some kind, these new findings suggest that microclotting cannot be the only or the most significant cause.

Classifying Long COVID

This study argues that the presence of IFN-γ could be used to classify Long COVID into subtypes, which could help personalise treatment.

“It’s unlikely that all the different Long COVID symptoms are caused by the same thing. We need to differentiate between people and tailor treatments. Some patients are slowly recovering and there are those who are stuck in a cycle of fatigue for years on end. We need to know why,” Dr Krishna said.

Reference

B A Krishna et al., ‘Spontaneous, persistent, T-cell dependent IFN-γ release in patients who progress to long COVID’, Science Advances (2024). DOI: 10.1126/sciadv.adi9379

SARS-CoV-2 triggers the production of the antiviral protein IFN-γ, which is associated with fatigue, muscle ache and depression. New research shows that in Long COVID patients, IFN-γ production persists until symptoms improve, highlighting a potential biomarker and a target for therapies.

Vice-Chancellor visits Cambridge University Boat Club training

It might have been a chilly and dark 6am start, but a warm welcome awaited Professor Deborah Prentice, who joined Chief Women’s Coach Paddy Ryan on a launch following the crews out on the River Great Ouse.

Professor Prentice said it was an opportunity to witness the dedication of the students, who routinely set off from Cambridge at five in the morning, arrive at Ely Boathouse just after six, then complete around an hour’s training before needing to get back to Cambridge in time for morning lectures.

“These students and their coaches are doing unbelievable work, they’re out here every day at six in the morning,” said Professor Prentice.

“The fact that they are combining this training with study is a reminder of how disciplined and committed our students are. Rowing is a team sport and they are a fantastic team for sure.”

Annamarie Phelps, CUBC Club Chair, said she was delighted the Vice-Chancellor was able to visit and meet the students, coaches and staff.

“It was fantastic to welcome the Vice-Chancellor to our Ely Boathouse, where she was able to see first-hand the dedication of these student athletes and their coaches.

“We were also able to show Professor Prentice the amazing facilities we have here – something made possible only with the generous support of the University and alumni.

“We’re now looking forward to welcoming Professor Prentice to the Boat Race itself next month, when she will be presenting the Boat Race Trophies - hopefully to Cambridge!”

This year’s race, on 30 March, will mark the 78th Women’s race and the 169th Men’s race, with Cambridge leading the historical results across all categories.

With The Boat Race 2024 just weeks away, the Vice-Chancellor has been to meet Cambridge University Boat Club students and staff at their Ely training centre.


New Cambridge-developed resources help students learn how maths can help tackle infectious diseases

Aerial view of crowd connected by lines

From measles and flu to SARS and COVID, mathematicians help us understand and predict the epidemics that can spread through our communities, and explore the strategies we might use to contain them.

The project, called Contagious Maths, was led by Professor Julia Gog from Cambridge’s Department of Applied Mathematics and Theoretical Physics (DAMTP), and was supported by a Rosalind Franklin Award from the Royal Society.

The curriculum-linked resources will give students between the ages of 11 and 14 the opportunity to join researchers on the mathematical frontline to learn more about infectious disease spread, along with interactive tools to try mathematical modelling for themselves. Teachers receive full lesson plans, backed up by Cambridge research.

“I’ve always loved maths. I was lucky enough to have amazing teachers at sixth form who challenged me and were 100% behind me pursuing maths at the highest level, but maths as it’s taught in school can be highly abstract, so students often wonder what the point of maths even is,” said Gog, who is also Director of the Millennium Maths Project. “This is something I’m trying to help with now: to offer a glimpse from school to the research world to see the role mathematics can play in tackling important real-world problems.”

The Contagious Maths project introduces mathematical modelling; explores how mathematicians can model the spread of disease through a population and the type of questions we might think about when looking at models; and gives an insight into what mathematics researchers working on these real-life problems actually do.
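To give a flavour of the kind of model such lessons introduce, the short Python sketch below steps a basic SIR (Susceptible-Infected-Recovered) model forward one day at a time. It is an illustration of the general approach only, not taken from the Contagious Maths materials, and the parameter values are arbitrary.

# A minimal SIR (Susceptible-Infected-Recovered) epidemic model, illustrating
# the style of modelling introduced in the lessons. Parameter values are
# arbitrary and for demonstration only.

def simulate_sir(population=1000, infected=1, beta=0.3, gamma=0.1, days=100):
    """Step the SIR equations forward one day at a time."""
    s, i, r = population - infected, infected, 0
    history = []
    for day in range(days):
        new_infections = beta * s * i / population   # contacts that transmit
        new_recoveries = gamma * i                   # infections that clear
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((day, round(s), round(i), round(r)))
    return history

# Print every tenth day to see the epidemic rise and fall.
for day, s, i, r in simulate_sir()[::10]:
    print(f"day {day:3d}: S={s:4d} I={i:4d} R={r:4d}")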

“I’ve been engaged in outreach for many years at Cambridge, and the Contagious Maths project grew out of discussions with colleagues who have expertise in reaching school-age children,” said Gog. “The 11-14 age group we are targeting is a real crunch point for retaining girls in maths, and future female mathematicians. What exactly happens is complex and multifaceted, but this is a period when people form their views on how they fit with maths and science.

“Many of them disengage, as it can seem that maths at school is utterly disconnected from the real world. It can also be a time when maths appears very starkly right or wrong, whereas any research mathematician can tell you it’s always so much more subtle than that, and therefore so much more interesting!”

Gog hopes the Contagious Maths resources might be able to help, as they are designed to be used in regular school lessons, and cover a topic with clear real-world importance.

“The maths is never black and white in this field: there are always ways to challenge and develop the models, and some tricky thinking to be done about how the real epidemics and the simulations are really related to each other,” she said. “I suspect some students will find this frustrating, and just want maths to be algorithmic exercises. But some will be intrigued, and they are the ones we are trying to reach and expose to this larger world of applied maths research.”

Contagious Maths also gives teachers all the ideas and tools they need at their fingertips to deliver these lessons, even if they have no experience with research mathematics. “We hope this project will help these teachers to bring in the wider view of mathematics, and we hope it inspires them too,” said Gog. “It’s been really fun developing these resources, teaming up with both NRICH and Plus to make the most of our combined expertise.”

Maths teachers can attend a free online event on 20 March to learn more about the project.

In addition to the school resources, Gog and her colleagues have designed another version of Contagious Maths for a more general self-guided audience, which will work for students older than 14 or anyone, of any age, who is interested in learning about mathematical modelling.

“The paradox between the cleanness and precision of mathematics, and the utter hot mess of anything that involves biological dynamics across populations, like an outbreak of an infectious disease, is what intrigued me to stay in mathematics beyond my degree, and to move into research in mathematical biology,” said Gog. “Elegant theoretical ideas can tell us something valuable and universal about mitigating the devastating effects of disease on human and animal populations. Super abstract equations can hold fundamental truths about real-world problems: I don't think I will ever tire of thinking about that.”

Adapted from a Royal Society interview with Professor Julia Gog.

Cambridge mathematicians have developed a set of resources for students and teachers that will help them understand how maths can help tackle infectious diseases.


Scientists identify genes linked to DNA damage and human disease

DNA jigsaw with pieces missing

The work, published in Nature, provides insights into cancer progression and neurodegenerative diseases as well as a potential therapeutic avenue in the form of a protein inhibitor.

The genome contains all the genes and genetic material within an organism's cells. When the genome is stable, cells can accurately replicate and divide, passing on correct genetic information to the next generation of cells. Despite its significance, little is understood about the genetic factors governing genome stability, protection, repair, and the prevention of DNA damage.

In this new study, researchers from the UK Dementia Research Institute, at the University of Cambridge, and the Wellcome Sanger Institute set out to better understand the biology of cellular health and identify genes key to maintaining genome stability.

Using a set of genetically modified mouse lines, the team identified 145 genes that play key roles in either increasing or decreasing the formation of abnormal micronuclei structures. These structures indicate genomic instability and DNA damage, and are common hallmarks of ageing and diseases.

The most dramatic increases in genomic instability were seen when the researchers knocked out the gene DSCC1, which increased abnormal micronuclei formation five-fold. Mice lacking this gene showed characteristics akin to those of human patients with a number of rare genetic disorders, further emphasising the relevance of this research to human health.

Using CRISPR screening, the researchers showed that this effect of DSCC1 loss could be partially reversed by inhibiting the protein SIRT1. This offers a highly promising avenue for the development of new therapies.

The findings help shed light on the genetic factors that influence the health of human genomes over a lifespan, and on how diseases develop.

Professor Gabriel Balmus, senior author of the study at the UK Dementia Research Institute at the University of Cambridge, formerly at the Wellcome Sanger Institute, said: “Continued exploration on genomic instability is vital to develop tailored treatments that tackle the root genetic causes, with the goal of improving outcomes and the overall quality of life for individuals across various conditions.”

Dr David Adams, first author of the study at the Wellcome Sanger Institute, said: “Genomic stability is central to the health of cells, influencing a spectrum of diseases from cancer to neurodegeneration, yet this has been a relatively underexplored area of research. This work, of 15 years in the making, exemplifies what can be learned from large-scale, unbiased genetic screening. The 145 identified genes, especially those tied to human disease, offer promising targets for developing new therapies for genome instability-driven diseases like cancer and neurodevelopmental disorders.”

This research was supported by Wellcome and the UK Dementia Research Institute.

Reference
Adams, DJ et al. Genetic determinants of micronucleus formation in vivo. Nature; 14 Feb 2024; DOI: 10.1038/s41586-023-07009-0

Adapted from a press release from the Wellcome Sanger Institute.

Cambridge scientists have identified more than one hundred key genes linked to DNA damage through systematic screening of nearly 1,000 genetically modified mouse lines.


School uniform policies linked to students getting less exercise, study finds

School children watching a sports game from indoors

The University of Cambridge study used data about the physical activity participation of more than a million five-to-17-year-olds internationally. It found that in countries where a majority of schools require students to wear uniforms, fewer young people tend to meet the average of 60 minutes of physical activity per day recommended by the World Health Organisation (WHO).

Regardless of uniform policies, across most countries fewer girls than boys reach those recommended exercise levels. Among primary school students, however, the difference in activity between girls and boys was found to be wider in countries where most schools mandated uniforms. The same result was not found in secondary school-aged students.

The authors suggest that this could be explained by the fact that younger children get more incidental exercise throughout the school day than older students; for example, through running, climbing and various other forms of active play at break and lunchtimes. There is already evidence that girls feel less comfortable in participating in active play if they are wearing certain types of clothing, such as skirts or dresses.

Importantly, the results do not definitively prove that school uniforms limit children’s physical activity, and the researchers stress that “causation cannot be inferred”. Previous, smaller studies, however, provide support for these findings, indicating that uniforms could pose a barrier. This research is the first to examine large-scale statistical evidence to assess that claim.

The study was led by Dr Mairead Ryan, a researcher at the Faculty of Education and Medical Research Council (MRC) Epidemiology Unit, University of Cambridge.

“Schools often prefer to use uniforms for various reasons,” Dr Ryan said. “We are not trying to suggest a blanket ban on them, but to present new evidence to support decision-making. School communities could consider design, and whether specific characteristics of a uniform might either encourage or restrict any opportunities for physical activity across the day.”

The WHO recommends that young people get an average of 60 minutes of at least moderate-intensity physical activity per day during the week. The study confirms previous observations that most children and adolescents are not meeting this recommendation, especially girls. The difference in the percentage of boys and girls meeting physical activity guidelines across all countries was, on average, 7.6 percentage points.

Existing evidence suggests that uniforms could be a factor. Previous concerns have, for example, been raised about girls’ PE uniforms and school sports kits. A 2021 study in England found that the design of girls’ PE uniforms deterred students from participating in certain activities, while the hockey player Tess Howard proposed redesigning gendered sports uniforms for similar reasons, after analysing interview and survey data.

Children often get their exercise away from PE and sports lessons, however.

“Activities like walking or cycling to school, breaktime games, and after-school outdoor play can all help young people incorporate physical activity into their daily routines,” Ryan said. “That’s why we are interested in the extent to which various elements of young people’s environments, including what they wear, encourage such behaviours.”

The study analysed existing data on the physical activity levels of nearly 1.1 million young people aged five to 17 in 135 countries and combined this with newly collected data on how common the use of school uniforms is in these countries.

In over 75% of the countries surveyed, a majority of schools required their students to wear uniforms. The study found that in these countries, physical activity participation was lower. The median proportion of all students meeting the WHO recommendations in countries where uniform-wearing was the norm was 16%; this rose to 19.5% in countries where uniforms were less common.

There was a consistent gender gap between boys’ and girls’ physical activity levels, with boys 1.5 times more likely to meet WHO recommendations across all ages. However, the gap widened from 5.5 percentage points at primary school level in non-uniform countries to a 9.8 percentage point difference in countries where uniforms were required in most schools.
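As a toy illustration of how such percentage-point gaps can be summarised (using invented numbers rather than the study's data), the sketch below computes a median gender gap for countries grouped by uniform policy:

# Toy illustration of how a gender gap in activity can be summarised per
# country and compared across uniform policies. Numbers are invented, not
# study data.
from statistics import median

countries = [
    # (uniform required in most schools, % girls meeting guideline, % boys meeting guideline)
    (True, 12.0, 21.0),
    (True, 14.5, 24.5),
    (False, 18.0, 23.0),
    (False, 20.0, 26.0),
]

def gap_summary(rows):
    gaps = [boys - girls for _, girls, boys in rows]          # percentage points
    overall = [(girls + boys) / 2 for _, girls, boys in rows]  # crude overall rate
    return median(gaps), median(overall)

uniform = [row for row in countries if row[0]]
no_uniform = [row for row in countries if not row[0]]

for label, rows in (("uniform", uniform), ("no uniform", no_uniform)):
    gap, overall = gap_summary(rows)
    print(f"{label:10s}: median gender gap {gap:.1f} pp, median meeting guideline {overall:.1f}%")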

The finding appears to match evidence from other studies suggesting that girls are more self-conscious about engaging in physical activity when wearing uniforms in which they do not feel comfortable.

“Girls might feel less confident about doing things like cartwheels and tumbles in the playground, or riding a bike on a windy day, if they are wearing a skirt or dress,” said senior author Dr Esther van Sluijs, MRC Investigator. “Social norms and expectations tend to influence what they feel they can do in these clothes. Unfortunately, when it comes to promoting physical health, that’s a problem.”

The authors of the study argue that there is now enough evidence to warrant further investigation into whether there is a causal relationship between school uniforms and lower activity levels. They also highlight the importance of regular physical activity for all young people, regardless of their gender.

“Regular physical activity helps support multiple physical, mental, and well-being needs, as well as academic outcomes,” Dr Ryan said. “We now need more information to build on these findings, considering factors like how long students wear their uniforms for after school, whether this varies depending on their background, and how broader gendered clothing norms may impact their activity.”

The findings are reported in the Journal of Sport and Health Science.

Reference
Ryan, M et al. Are school uniforms associated with gender inequalities in physical activity? A pooled analysis of population-level data from 135 countries/regions. Journal of Sport and Health Science; 15 Feb 2024; DOI: 10.1016/j.jshs.2024.02.003

School uniform policies could be restricting young people from being active, particularly primary school-aged girls, new research suggests.


£11m semiconductor research centre could be key player in UK’s net zero mission

Robot arm and semiconductor

Semiconductors, also known as microchips, are a key component in nearly every electrical device from mobile phones and medical equipment to electric vehicles.

They are increasingly being recognised as an area of global strategic significance due to the integral role they play in net zero, AI and quantum technology.

Co-created and delivered with industry, the REWIRE IKC is led by the University of Bristol, in partnership with Cambridge and Warwick Universities.

The IKC will accelerate the UK’s ambition for net zero by transforming the next generation of high-voltage electronic devices using wide/ultra-wide bandgap (WBG/UWBG) compound semiconductors.

The project is being led by Professor Martin Kuball and his team at the University of Bristol. Cambridge members of the IKC team include Professors Rachel Oliver, Florin Udrea and Teng Long.

The centre will advance the next generation of semiconductor power device technologies and enhance the security of the UK’s semiconductor supply chain.

Compound semiconductor WBG/UWBG devices have been recognised in the UK National Semiconductor Strategy as key elements to support the net zero economy through the development of high voltage and low energy-loss power electronic technology.

They are essential building blocks for developing all-electric trains, ships and heavy goods electric vehicles, better charging infrastructure, renewable energy and High Voltage Direct Current grid connections, as well as intelligent power distribution and energy supplies to telecommunication networks and data centres.

“Power devices are at the centre of all power electronic systems and pave the way for more efficient and compact power electronic systems, reducing energy loss,” said Kuball. “The REWIRE IKC will focus on power conversion of wind energy, electric vehicles, smart grids, high-temperature applications, device and packaging, and improving the efficiency of semiconductor device manufacture.”

Our home electrical supply is at 240 Volts, but to handle the power from offshore wind turbines, devices will have to operate at thousands of Volts. These very high voltages can easily damage the materials normally used in power electronics.

“Newly emerging ultra-wide bandgap materials have properties which enable them to handle very large voltages more easily,” said Oliver, who is Director of the Cambridge Centre for Gallium Nitride. “The devices based on these materials will waste less energy and be smaller, lighter and cheaper. The same materials can also withstand high temperatures and doses of radiation, which means they can be used to enable other new electricity generation technologies, such as fusion energy.”

“The REWIRE IKC will play a prominent role within the UK’s semiconductor strategy, in cementing the UK’s place as a leader in compound semiconductor research and development, developing IP to be exploited here in the UK, rebuilding the UK semiconductor supply chain, and training the next generation of semiconductor materials scientists and engineers,” said Professor Peter Gammon from the University of Warwick.

Industry partners in the REWIRE IKC include Ampaire, BMW, Bosch, Cambridge GaN Devices (CGD), Element-Six Technologies, General Electric, Hitachi Energy, IQE, Oxford Instruments, Siemens, ST Microelectronics and Toshiba.

REWIRE is one of two new IKCs being funded by the Engineering and Physical Sciences Research Council (EPSRC) and Innovate UK, both part of UK Research and Innovation. The second IKC, at the University of Southampton, will improve development and commercialisation of silicon photonics technologies in the UK.

“This investment marks a crucial step in advancing our ambitions for the semiconductor industry, with these centres helping bring new technologies to market in areas like net zero and AI, rooting them right here in the UK,” said Minister for Tech and the Digital Economy Saqib Bhatti. “Just nine months into delivering on the National Semiconductor Strategy, we’re already making rapid progress towards our goals. This isn’t just about fostering growth and creating high-skilled jobs, it's about positioning the UK as a hub of global innovation, setting the stage for breakthroughs that have worldwide impact.”

Adapted from a University of Bristol media release.

For more information on energy-related research in Cambridge, please visit the Energy IRC, which brings together Cambridge’s research knowledge and expertise, in collaboration with global partners, to create solutions for a sustainable and resilient energy landscape for generations to come.

The University of Cambridge is a partner in the new £11m Innovation and Knowledge Centre (IKC) REWIRE, set to deliver pioneering semiconductor technologies and new electronic devices.


£3 million UKRI funding to support research into better health, ageing, and wellbeing

White mouse

UKRI funding of £3 million is awarded today to support a new research cluster, as part of the MRC National Mouse Genetics Network (MRC NMGN), focused on improving existing models of ageing with the aim of improving lifelong health and wellbeing. The cluster is led by scientists at the Universities of Cambridge and Newcastle.

The MRC NMGN focuses on age-related biological changes in model organisms, particularly the mouse, to improve our understanding and diagnosis of the most challenging disease area of our time, and to generate therapeutic avenues.

This award brings the UKRI’s total investment in the MRC NMGN to £25 million.

The need to improve how people age has become a major requirement of modern societies. Regular increases in life expectancy result in older populations, making healthy ageing essential for a better quality of life and a reduced burden on health and social services. 

Understanding the biological mechanisms underlying the ageing process is paramount for tackling the challenges brought about by our older populations.

The new tools generated as a result of this research will be made available to the scientific community to improve understanding of the ageing process, and to provide a resource for preclinical testing and intervention.

Professor Walid Khaled from Cambridge’s Wellcome-MRC Cambridge Stem Cell Institute and Department of Pharmacology, and Co-lead of the new MRC National Mouse Genetics Network Ageing Cluster, said: “I am very pleased to be co-leading this project from Cambridge and I am looking forward to working with the rest of the team from around the UK. ‘Prevention is better than cure’ and so our project will generate a reference map that we will use in the future to assess interventions that could prevent ageing related health decline.”

Professor Anne Ferguson-Smith, Pro-Vice-Chancellor (Research & International Partnerships) and Arthur Balfour Professor of Genetics at Cambridge, said: "Collaboration is central to our research activities in Cambridge. The new Ageing Cluster is a fine example of multiple institutions working together to add value and bring exciting new insight and expertise to advance the critically important field of healthy ageing. I am proud to be part of this important initiative which can deliver new routes to improved health span."

Professor David Burn, Pro Vice Chancellor, Faculty of Medical Sciences at Newcastle University, added: "I am delighted that Newcastle University is an important part of the UKRI Mouse Genetics Network Ageing Cluster.  This cluster offers researchers the opportunity to develop new animal models so that we may better understand ageing.  This, in turn, will allow us to translate this research into extending healthy lifespan in humans in the future.”

The University is bringing together its world-leading expertise to tackle the topic of extending the healthy lifespan. Scientists in the School of Biological Sciences are addressing some of the biggest questions in human biology, including: What if we could identify those at risk of developing chronic age-related conditions before they present in the clinic? What if we could intervene before any symptoms arise and prevent disease onset?

UKRI’s strategy for 2022-2027 aims to harness the full power of the UK’s research and innovation system to tackle major national and global challenges. A total of £75m has been allocated to the theme of Securing better health, ageing and wellbeing, which aims to improve population health, tackle the health inequalities affecting people and communities, and advance interventions that keep us healthier for longer.

Read more about Cambridge research into extending the healthy lifespan.

The University of Cambridge has received UKRI funding for research on age-related biological changes in model organisms as part of a national collaboration.


Sensors made from ‘frozen smoke’ can detect toxic formaldehyde in homes and offices

A block of silica aerogel being held in a person's hand

The researchers, from the University of Cambridge, developed sensors made from highly porous materials known as aerogels. By precisely engineering the shape of the holes in the aerogels, the sensors were able to detect the fingerprint of formaldehyde, a common indoor air pollutant, at room temperature.

The proof-of-concept sensors, which require minimal power, could be adapted to detect a wide range of hazardous gases, and could also be miniaturised for wearable and healthcare applications. The results are reported in the journal Science Advances.

Volatile organic compounds (VOCs) are a major source of indoor air pollution; at elevated levels they cause watery eyes, burning in the eyes and throat, and difficulty breathing. High concentrations can trigger attacks in people with asthma, and prolonged exposure may cause certain cancers.

Formaldehyde is a common VOC and is emitted by household items including pressed wood products (such as MDF), wallpapers and paints, and some synthetic fabrics. For the most part, the levels of formaldehyde emitted by these items are low, but levels can build up over time, especially in garages where paints and other formaldehyde-emitting products are more likely to be stored.

According to a 2019 report from the campaign group Clean Air Day, a fifth of households in the UK showed notable concentrations of formaldehyde, with 13% of residences surpassing the recommended limit set by the World Health Organization (WHO).

“VOCs such as formaldehyde can lead to serious health problems with prolonged exposure even at low concentrations, but current sensors don’t have the sensitivity or selectivity to distinguish between VOCs that have different impacts on health,” said Professor Tawfique Hasan from the Cambridge Graphene Centre, who led the research.

“We wanted to develop a sensor that is small and doesn’t use much power, but can selectively detect formaldehyde at low concentrations,” said Zhuo Chen, the paper’s first author.

The researchers based their sensors on aerogels: ultra-light materials sometimes referred to as ‘frozen smoke’, since they are more than 99% air by volume. The open structure of aerogels allows gases to easily move in and out. By precisely engineering the shape, or morphology, of the holes, the aerogels can act as highly effective sensors.

Working with colleagues at Warwick University, the Cambridge researchers optimised the composition and structure of the aerogels to increase their sensitivity to formaldehyde, making them into filaments about three times the width of a human hair. The researchers 3D printed lines of a paste made from graphene, a two-dimensional form of carbon, and then freeze-dried the graphene paste to form the holes in the final aerogel structure. The aerogels also incorporate tiny semiconductors known as quantum dots.

The sensors they developed were able to detect formaldehyde at concentrations as low as eight parts per billion, which is 0.4 percent of the level deemed safe in UK workplaces. The sensors also work at room temperature, consuming very low power.

“Traditional gas sensors need to be heated up, but because of the way we’ve engineered the materials, our sensors work incredibly well at room temperature, so they use between 10 and 100 times less power than other sensors,” said Chen.

To improve selectivity, the researchers then incorporated machine learning algorithms into the sensors. The algorithms were trained to detect the ‘fingerprint’ of different gases, so that the sensor was able to distinguish the fingerprint of formaldehyde from other VOCs.
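The sketch below illustrates the general idea in Python, training a generic classifier on synthetic ‘sensor response’ features to separate formaldehyde exposures from other VOCs. The feature set, model choice and data are assumptions for illustration; this is not the authors’ actual pipeline.

# Sketch of the general idea: train a classifier on sensor response features
# so that formaldehyde's 'fingerprint' can be separated from other VOCs.
# Synthetic data and a generic model, not the authors' actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 600
# Pretend each exposure yields 8 features (e.g. response amplitudes and
# recovery times at different time points); this feature set is hypothetical.
formaldehyde = rng.normal(loc=1.0, scale=0.3, size=(n // 2, 8))
other_vocs = rng.normal(loc=0.6, scale=0.3, size=(n // 2, 8))
X = np.vstack([formaldehyde, other_vocs])
y = np.array([1] * (n // 2) + [0] * (n // 2))   # 1 = formaldehyde

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))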

“Existing VOC detectors are blunt instruments – you only get one number for the overall concentration in the air,” said Hasan. “By building a sensor that can detect specific VOCs at very low concentrations in real time, it can give home and business owners a more accurate picture of air quality and any potential health risks.”

The researchers say the same technique could be used to develop sensors to detect other VOCs. In theory, a device the size of a standard household carbon monoxide detector could incorporate multiple different sensors within it, providing real-time information about a range of different hazardous gases.  “At Warwick, we're developing a low-cost multi-sensor platform that will incorporate these new aerogel materials and, coupled with AI algorithms, detect different VOCs,” said co-author Professor Julian Gardner from Warwick University. 

“By using highly porous materials as the sensing element, we’re opening up whole new ways of detecting hazardous materials in our environment,” said Chen.

The research was supported in part by the Henry Royce Institute, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Tawfique Hasan is a Fellow of Churchill College, Cambridge.

Reference:
Zhuo Chen et al. ‘Real-time, noise and drift resilient formaldehyde sensing at room temperature with aerogel filaments.’ Science Advances (2024). DOI: 10.1126/sciadv.adk6856

Researchers have developed a sensor made from ‘frozen smoke’ that uses artificial intelligence techniques to detect formaldehyde in real time at concentrations as low as eight parts per billion, far beyond the sensitivity of most indoor air quality sensors.


Ice cores provide first documentation of rapid Antarctic ice loss in the past

Tents at Skytrain Ice Rise in Antarctica

The evidence, contained within an ice core, shows that in one location the ice sheet thinned by 450 metres — that’s more than the height of the Empire State Building — in just under 200 years.

This is the first evidence anywhere in Antarctica for such a fast loss of ice. Scientists are worried that today’s rising temperatures might destabilise parts of the West Antarctic Ice Sheet in the future, potentially passing a tipping point and inducing a runaway collapse. The study, published in Nature Geoscience, sheds light on how quickly Antarctic ice could melt if temperatures continue to soar.

“We now have direct evidence that this ice sheet suffered rapid ice loss in the past,” said Professor Eric Wolff, senior author of the new study from Cambridge’s Department of Earth Sciences. “This scenario isn’t something that exists only in our model predictions and it could happen again if parts of this ice sheet become unstable.”

From west to east, the Antarctic ice sheets contain enough freshwater to raise global sea levels by around 57 metres. The West Antarctic Ice Sheet is considered particularly vulnerable because much of it sits on bedrock below sea level.

Model predictions suggest that a large part of the West Antarctic Ice Sheet could disappear in the next few centuries, causing sea levels to rise. Exactly when and how quickly the ice could be lost is, however, uncertain.

One way to train ice sheet models to make better predictions is to feed them with data on ice loss from periods of warming in Earth’s history. At the peak of the Last Ice Age 20,000 years ago, Antarctic ice covered a larger area than today. As our planet thawed and temperatures slowly climbed, the West Antarctic Ice Sheet contracted to more or less its current extent.

“We wanted to know what happened to the West Antarctic Ice Sheet at the end of the Last Ice Age, when temperatures on Earth were rising, albeit at a slower rate than current anthropogenic warming,” said Dr Isobel Rowell, study co-author from the British Antarctic Survey. “Using ice cores we can go back to that time and estimate the ice sheet’s thickness and extent.”

Ice cores are made up of layers of ice that formed as snow fell and was then buried and compacted into ice crystals over thousands of years. Trapped within each ice layer are bubbles of ancient air and contaminants that mixed with each year’s snowfall — providing clues as to the changing climate and ice extent.

The researchers drilled a 651-metre-long ice core from Skytrain Ice Rise in 2019. This mound of ice sits at the edge of the ice sheet, near the point where grounded ice flows into the floating Ronne Ice Shelf.

After transporting the ice cores to Cambridge at -20C, the researchers analysed them to reconstruct the ice thickness. First, they measured stable water isotopes, which indicate the temperature at the time the snow fell. Temperature decreases at higher altitudes (think of cold mountain air), so they could equate warmer temperatures with lower-lying, thinner ice.

They also measured the pressure of air bubbles trapped in the ice. Like temperature, air pressure also varies systematically with elevation. Lower-lying, thinner ice contains higher-pressure air bubbles.
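The logic linking trapped-air pressure to the elevation of the ice surface can be illustrated with the standard barometric formula, assuming a fixed temperature; the pressures and temperature in the Python sketch below are illustrative and are not the study’s calibration.

# Illustration of the logic linking trapped-air pressure to ice surface
# elevation: lower, thinner ice traps air at higher pressure. Uses the
# standard barometric formula with illustrative numbers, not the study's
# actual calibration.
import math

R = 8.314      # J/(mol K), gas constant
M = 0.02897    # kg/mol, molar mass of air
g = 9.81       # m/s^2, gravitational acceleration

def elevation_from_pressure(p, p0=101325.0, temperature_k=250.0):
    """Invert p = p0 * exp(-M g h / (R T)) to estimate elevation h in metres."""
    return -(R * temperature_k) / (M * g) * math.log(p / p0)

# Two hypothetical bubble pressures: higher pressure implies lower, thinner ice.
for p in (80000.0, 85000.0):
    print(f"bubble pressure {p/1000:.0f} kPa -> elevation ~{elevation_from_pressure(p):.0f} m")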

These measurements told them that ice thinned rapidly 8,000 years ago. “Once the ice thinned, it shrunk really fast,” said Wolff. “This was clearly a tipping point — a runaway process.”

They think this thinning was probably triggered by warm water getting underneath the edge of the West Antarctic Ice Sheet, which normally sits on bedrock. This likely untethered a section of the ice from bedrock, allowing it to float suddenly and forming what is now the Ronne Ice Shelf. This allowed neighbouring Skytrain Ice Rise, no longer restrained by grounded ice, to thin rapidly. 

The researchers also found that the sodium content of the ice (originating from salt in sea spray) increased about 300 years after the ice thinned. This told them that, after the ice thinned, the ice shelf shrank back so that the sea was hundreds of kilometres nearer to their site.

“We already knew from models that the ice thinned around this time, but the date of this was uncertain,” said Rowell. Ice sheet models placed the retreat anywhere between 12,000 and 5,000 years ago and couldn’t say how quickly it happened. “We now have a very precisely dated observation of that retreat that can be built into improved models,” said Rowell.

Although the West Antarctic Ice Sheet retreated quickly 8,000 years ago, it stabilised when it reached roughly its current extent. “It’s now crucial to find out whether extra warmth could destabilise the ice and cause it to start retreating again,” said Wolff.

Reference

Grieman et al. (2024) Abrupt Holocene ice loss due to thinning and ungrounding in the Weddell Sea Embayment. Nature Geoscience. DOI: 10.1038/s41561-024-01375-8

Researchers from the University of Cambridge and the British Antarctic Survey have uncovered the first direct evidence that the West Antarctic Ice Sheet shrank suddenly and dramatically at the end of the Last Ice Age, around 8,000 years ago.


Strongest evidence to date of brain’s ability to compensate for age-related cognitive decline

Woman in purple and white floral shirt washing a carrot

As we age, our brain gradually atrophies, losing nerve cells and connections, and this can lead to a decline in brain function. It’s not fully understood why some people appear to maintain better brain function than others, and how we can protect ourselves from cognitive decline.

A widely accepted notion is that some people’s brains are able to compensate for the deterioration in brain tissue by recruiting other areas of the brain to help perform tasks. While brain imaging studies have shown that the brain does recruit other areas, until now it has not been clear whether this makes any difference to performance on a task, or whether it provides any additional information about how to perform that task.

In a study published in the journal eLife, a team led by scientists at the University of Cambridge, in collaboration with the University of Sussex, has shown that when the brain recruits other areas, it improves performance specifically in the brains of older people.

Study lead Dr Kamen Tsvetanov, an Alzheimer's Society Dementia Research Leader Fellow in the Department of Clinical Neurosciences, University of Cambridge, said: “Our ability to solve abstract problems is a sign of so-called ‘fluid intelligence’, but as we get older, this ability begins to show significant decline. Some people manage to maintain this ability better than others. We wanted to ask why that was the case – are they able to recruit other areas of the brain to overcome changes in the brain that would otherwise be detrimental?”

Brain imaging studies have shown that fluid intelligence tasks engage the ‘multiple demand network’ (MDN), a brain network involving regions both at the front and rear of the brain, but its activity decreases with age. To see whether the brain compensated for this decrease in activity, the Cambridge team looked at imaging data from 223 adults between 19 and 87 years of age who had been recruited by the Cambridge Centre for Ageing & Neuroscience (Cam-CAN).

The volunteers were asked to identify the odd-one-out in a series of puzzles of varying difficulty while lying in a functional magnetic resonance imaging (fMRI) scanner, so that the researchers could look at patterns of brain activity by measuring changes in blood flow.

As anticipated, in general the ability to solve the problems decreased with age. The MDN was particularly active, as were regions of the brain involved in processing visual information.

When the team analysed the images further using machine learning, they found two areas of the brain that showed greater activity in the brains of older people and also correlated with better performance on the task. These areas were the cuneus, at the rear of the brain, and a region in the frontal cortex. Of the two, only activity in the cuneus was more strongly related to task performance in the older than in the younger volunteers, and it contained extra information about the task beyond that available from the MDN.
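As a toy sketch of the kind of question such an analysis asks, rather than the study’s actual method, the Python example below checks whether adding a hypothetical set of cuneus features improves cross-validated prediction of task performance beyond MDN features, using synthetic data.

# Toy sketch (synthetic data, generic tools) of the kind of question asked by
# the machine-learning analysis: does a region's activity pattern add
# information about task performance beyond the multiple demand network (MDN)?
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_subjects = 223
mdn = rng.normal(size=(n_subjects, 10))      # hypothetical MDN activity features
cuneus = rng.normal(size=(n_subjects, 5))    # hypothetical cuneus activity features
# Simulated performance depends on both regions, plus noise.
performance = mdn[:, 0] + 0.8 * cuneus[:, 0] + rng.normal(scale=0.5, size=n_subjects)

for label, X in (("MDN only", mdn), ("MDN + cuneus", np.hstack([mdn, cuneus]))):
    score = cross_val_score(RidgeCV(), X, performance, cv=5).mean()
    print(f"{label:13s}: mean cross-validated R^2 = {score:.2f}")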

Although it is not clear exactly why the cuneus should be recruited for this task, the researchers point out that this brain region is usually good at helping us stay focused on what we see. Older adults often have a harder time briefly remembering information that they have just seen, like the complex puzzle pieces used in the task. The increased activity in the cuneus might reflect a change in how often older adults look at these pieces, as a strategy to make up for their poorer visual memory.

Dr Ethan Knights from the Medical Research Council Cognition and Brain Sciences Unit at Cambridge said: “Now that we’ve seen this compensation happening, we can start to ask questions about why it happens for some older people, but not others, and in some tasks, but not others. Is there something special about these people – their education or lifestyle, for example – and if so, is there a way we can intervene to help others see similar benefits?”

Dr Alexa Morcom from the University of Sussex’s School of Psychology and Sussex Neuroscience research centre said: “This new finding also hints that compensation in later life does not rely on the multiple demand network as previously assumed, but recruits areas whose function is preserved in ageing.”

The research was supported by the Medical Research Council, the Biotechnology and Biological Sciences Research Council, the European Union’s Horizon 2020 research and innovation programme, the Guarantors of Brain, and the Alzheimer’s Society.

Reference

Knights, E et al. Neural Evidence of Functional Compensation for Fluid Intelligence Decline in Healthy Ageing. eLife; 6 Feb 2024; DOI: 10.7554/eLife.93327

Scientists have found the strongest evidence yet that our brains can compensate for age-related deterioration by recruiting other areas to help with brain function and maintain cognitive performance.


Turkey-Syria earthquakes: deficiencies in building structures and construction shortcuts were main cause of casualties

A partially-collapsed building in the aftermath of the Turkey-Syria earthquakes in 2023.

A new, independent field investigation into the aftermath of the Turkey-Syria earthquakes has found that a drive for profit pushed all players within the construction industry to take shortcuts, with the building stock, primarily made of Reinforced Concrete (RC) structures, being the main cause of the casualties.

Findings show that deficiencies were also recorded among even the newest building stock. This is despite established technical know-how, state-of-the-art building codes and rigorous building regulations. 

The longitudinal study report, published today by the Institution of Structural Engineers for EEFIT, was co-led by Cambridge's Professor Emily So, Professor of Architectural Engineering and Director of the Cambridge University Centre for Risk in the Built Environment (CURBE), and Dr Yasemin Didem Aktas from the Faculty of Engineering Sciences at UCL. Some of the findings include:

  • The drive for profit pushes players within the construction industry to take shortcuts. The auditing and quality control mechanisms embedded in the legal and bureaucratic processes should be strengthened to ensure code compliance. The legalisation of non-compliant buildings through amnesties cannot continue. 
  • Critically, despite established technical know-how, state-of-the-art building codes and rigorous building regulations, deficiencies in Reinforced Concrete (RC) structures were found even in the newest building stock. This demonstrates that seismic resilience is not only a technical problem in Turkey, but one that demands a multi-sectoral and interdisciplinary dialogue, scrutinising the regulatory system, bureaucracy, the legal and political backdrop within which the construction sector operates in Turkey. 
  • The building stock is primarily composed of Reinforced Concrete structures, and the failure of these structures was the main cause of the casualties. The team saw problems with such structures across their whole lifecycle, from design to implementation and post-occupancy stages; the structures therefore did not withstand the seismic forces.  
  • A review of building stock and infrastructure is critical to understand risk levels for future earthquakes. Lack of publicly available data is a big problem in Turkey, hindering not only a robust inquiry into damage and associated building characteristics, but also reliably establishing the risk profiles for future events. 
  • Debris management and demolishment practices have not fully recognised the potential mid-/long-term environmental and public health implications. Field observations and contacts in the affected communities show that they are already affected by the poor air quality. The Compulsory Earthquake Insurance (CEI) is a system that was put in place in Turkey following the 1999 earthquakes to provide monetary reserves to fund the management of future disasters. The extent to which these funds have been used and how resources have been allocated remain unclear. 

Read the full report and findings here.

Professor So says: “The 2023 Türkiye and Syria earthquakes were truly tragic, hitting an already fragile population, including migrants. Our field work and remote analysis revealed many issues, including the issue of non-compliant buildings with little seismic resilience. Building code compliance needs to be strengthened.” 

EEFIT - a joint venture between industry and universities - gathered a team of 30 global experts to assess the damage and develop suggestions to reduce future impacts and vulnerabilities. They studied the science, engineering and data related to the earthquakes including geotechnics, the structural and infrastructure impact, and the relief response and recovery. The team continues to work in the area, to follow the recovery and collaborate with colleagues from Turkey for better seismic resilience.
The Earthquake Engineering Field Investigation Team (EEFIT), co-led by Professor Emily So, today publishes its findings and recommendations.


Ancient seafloor vents spewed tiny, life-giving minerals into Earth’s early oceans

The hydrothermal vent 'Candelabra' in the Logatchev hydrothermal field on the Mid-Atlantic Ridge, at a water depth of 3,300 metres.

Their study, published in Science Advances, examined 3.5-billion-year-old rocks from western Australia in previously unseen detail and identified large quantities of a mineral called greenalite, which is thought to have played a role in early biological processes. The researchers also found that the seafloor vents would have seeded the oceans with apatite, a mineral rich in the life-essential element phosphorus.

The earliest lifeforms we know of—single-celled microorganisms, or microbes—emerged around 3.7 billion years ago. Most of the rocks that contain traces of them and the environment they lived in have, however, been destroyed. Some of the only evidence we have of this pivotal time comes from an outcrop of sediments in the remote Australian outback.

The so-called Dresser Formation has been studied for years but, in the new study, researchers re-examined the rocks in closer detail, using high magnification electron microscopes to reveal tiny minerals that were essentially hidden in plain sight.

The greenalite particles they observed measured just a few hundred nanometres in size—so small that they would have been washed over thousands of kilometres, potentially finding their way into a range of environments where they may have kick-started otherwise unfavourable chemical reactions, such as those involved in building the first DNA and RNA molecules.

“We’ve found that hydrothermal vents supplied trillions upon trillions of tiny, highly-reactive greenalite particles, as well as large quantities of phosphorus,” said Professor Birger Rasmussen, lead author of the study from the University of Western Australia.

Rasmussen said scientists are still unsure as to the exact role of greenalite in building primitive cells, “but this mineral was in the right place at the right time, and also had the right size and crystal structure to promote the assembly of early cells.”

The rocks the researchers studied contain characteristic layers of rusty-red, iron-rich jasper which formed as mineral-laden seawater spewed from hydrothermal vents. Scientists had thought the jaspers got their distinctive red colour from particles of iron oxide which, just like rust, form when iron is exposed to oxygen.

But how did this iron oxide form when Earth’s early oceans lacked oxygen? One theory is that photosynthesising cyanobacteria in the oceans produced the oxygen, and that it wasn’t until later, around 2.4 billion years ago, that this oxygen started to skyrocket in the atmosphere.

The new results change that assumption, however. “The story is completely different once you look closely enough,” said study co-author Professor Nick Tosca from Cambridge’s Department of Earth Sciences.

The researchers found that tiny, drab particles of greenalite far outnumbered the iron oxide particles that give the jaspers their colour. The iron oxide was not an original feature, discounting the theory that it was formed by the activity of cyanobacteria.

“Our findings show that iron wasn’t oxidised in the oceans; instead, it combined with silica to form tiny crystals of greenalite,” said Tosca. “That means major oxygen producers, cyanobacteria, may have evolved later, potentially coinciding with the soar in atmospheric oxygen during the Great Oxygenation Event.”

Rasmussen said that more experiments are needed to identify how greenalite might facilitate prebiotic chemistry, “but it was present in such vast quantities that, under the right conditions, its surfaces could have synthesised an enormous number of RNA-type sequences, addressing a key question in origin-of-life research – where did all the RNA come from?”

Reference:
Rasmussen, B, Muhling, J, Tosca, N J. 'Nanoparticulate apatite and greenalite in oldest, well-preserved hydrothermal vent precipitates.' Science Advances (2024). DOI: 10.1126/sciadv.adj4789

Researchers from the universities of Cambridge and Western Australia have uncovered the importance of hydrothermal vents, similar to underwater geysers, in supplying minerals that may have been a key ingredient in the emergence of early life.

The hydrothermal vent 'Candelabra' in the Logatchev hydrothermal field on the Mid-Atlantic Ridge at a water depth of 3300 metres.


Swarming cicadas, stock traders, and the wisdom of the crowd

Adult cicada on a leaf

Pick almost any location in the eastern United States – say, Columbus, Ohio. Every 13 or 17 years, as the soil warms in springtime, vast swarms of cicadas emerge from their underground burrows singing their deafening song, take flight and mate, producing offspring for the next cycle.

This noisy phenomenon repeats all over the eastern and southeastern US as 17 distinct broods emerge in staggered years. In spring 2024, billions of cicadas are expected as two different broods – one that appears every 13 years and another that appears every 17 years – emerge simultaneously.

Previous research has suggested that cicadas emerge once the soil temperature reaches 18°C, but even within a small geographical area, differences in sun exposure, foliage cover or humidity can lead to variations in temperature.

Now, in a paper published in the journal Physical Review E, researchers from the University of Cambridge have discovered how such synchronous cicada swarms can emerge despite these temperature differences.

The researchers developed a mathematical model for decision-making in an environment with variations in temperature and found that communication between cicada nymphs allows the group to come to a consensus about the local average temperature that then leads to large-scale swarms. The model is closely related to one that has been used to describe ‘avalanches’ in decision-making like those among stock market traders, leading to crashes.

Mathematicians have been captivated by the appearance of 17- and 13-year cycles in various species of cicadas, and have previously developed mathematical models that showed how the appearance of such large prime numbers is a consequence of evolutionary pressures to avoid predation. However, the mechanism by which swarms emerge coherently in a given year has not been understood.

In developing their model, the Cambridge team was inspired by previous research on decision-making that represents each member of a group by a ‘spin’ like that in a magnet, but instead of pointing up or down, the two states represent the decision to ‘remain’ or ‘emerge’.

The local temperature experienced by the cicadas is then like a magnetic field that tends to align the spins and varies slowly from place to place on the scale of hundreds of metres, from sunny hilltops to shaded valleys in a forest. Communication between nearby nymphs is represented by an interaction between the spins that leads to local agreement of neighbours.

The researchers showed that in the presence of such interactions the swarms are large and space-filling, involving every member of the population in a range of local temperature environments, unlike the case without communication in which every nymph is on its own, responding to every subtle variation in microclimate.
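To make the analogy concrete, here is a minimal Ising-style sketch (our own illustration, not the authors' model or code): each nymph is a spin taking the value +1 ('emerge') or -1 ('remain') on a ring, a slowly varying field stands in for local temperature, and nearest-neighbour coupling rewards agreeing with your neighbours. All parameter values are arbitrary.

```python
# Minimal sketch of a spin model for collective emergence decisions.
# Parameters are arbitrary; this illustrates the idea, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
N, J, beta = 200, 1.0, 2.0                       # nymphs, coupling, inverse noise
h = 0.3 * np.sin(np.linspace(0, 2 * np.pi, N))   # slow spatial variation in warmth
s = rng.choice([-1, 1], size=N)                  # initial remain/emerge states

for _ in range(20000):                           # Metropolis updates
    i = rng.integers(N)
    neighbours = s[(i - 1) % N] + s[(i + 1) % N]
    dE = 2 * s[i] * (J * neighbours + h[i])      # energy cost of flipping spin i
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        s[i] = -s[i]

print("fraction deciding to emerge:", (s == 1).mean())
```

With the coupling J switched off, each spin simply follows its own local field; with it switched on, large connected blocks settle into the same decision, mirroring the space-filling swarms described above.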

The research was carried out by Professor Raymond E Goldstein, the Alan Turing Professor of Complex Physical Systems in the Department of Applied Mathematics and Theoretical Physics (DAMTP), Professor Robert L Jack of DAMTP and the Yusuf Hamied Department of Chemistry, and Dr Adriana I Pesci, a Senior Research Associate in DAMTP.

“As an applied mathematician, there is nothing more interesting than finding a model capable of explaining the behaviour of living beings, even in the simplest of cases,” said Pesci.

The researchers say that while their model does not require any particular means of communication between underground nymphs, acoustical signalling is a likely candidate, given the ear-splitting sounds that the swarms make once they emerge from underground.

The researchers hope that their conjecture regarding the role of communication will stimulate field research to test the hypothesis.

“If our conjecture that communication between nymphs plays a role in swarm emergence is confirmed, it would provide a striking example of how Darwinian evolution can act for the benefit of the group, not just the individual,” said Goldstein.

This work was supported in part by the Complex Physical Systems Fund.

Reference:
R E Goldstein, R L Jack, and A I Pesci. ‘How Cicadas Emerge Together: Thermophysical Aspects of their Collective Decision-Making.’ Physical Review E (2024). DOI: 10.1103/PhysRevE.109.L022401

The springtime emergence of vast swarms of cicadas can be explained by a mathematical model of collective decision-making with similarities to models describing stock market crashes.

Ed Reschke via Getty Images
Adult Periodical Cicada


Shadow Science and Technology Secretary discusses AI and innovation during Cambridge visit

The visit took place at Cambridge Innovation Capital and was hosted by Innovate Cambridge – an initiative which is bringing together partners across the city region to deliver an inclusive future for Cambridge and its science and technology cluster. The Shadow Minister met with experts on AI from the University and from industry, discussing both the challenges it presents, as well as the enormous potential for AI to serve science, people, and society.

The first two sessions of the day were convened by AI@Cam, the University’s flagship AI mission which is working to meet the challenges and opportunities of these new technologies as they emerge. At the opening roundtable, academics including Professor Dame Diane Coyle (Director of the Bennett Institute for Public Policy), Professor Neil Lawrence (DeepMind Professor of Machine Learning), and Professor John Aston (Professor of Statistics in Public Life), provided expert analysis on AI policy challenges as well as the role AI can play in public service reform. The group discussed how governance systems need to evolve for the AI era, and how an increasingly complex information infrastructure can be managed. In addition, they considered the opportunity that AI presents for improving public services and breaking down siloed decision-making within government.

Mr Kyle took part in a series of ‘flash talks’, focused on areas where research in AI is delivering benefits to society. These included work by Dr Ronita Bardhan, from the University’s Department of Architecture, on a new deep-learning model which makes it far easier and cheaper to identify ‘hard-to-decarbonise’ houses and develop strategies to improve their green credentials. Dr Anna Moore presented her work in the Department of Psychiatry, using AI systems to speed up the diagnosis of mental health conditions in children.

In the afternoon, Mr Kyle met with leaders representing civic institutions, academia and business organisations from across the city, including Councillor Mike Davey, Leader of Cambridge City Council, and Andrew Williamson, Managing Partner at Cambridge Innovation Capital. They spoke about their shared vision and strategy for the region to ensure Cambridge remains a globally leading innovation centre, and a collective desire to deliver benefits both locally and across the UK.

The day concluded with a spin-out and business roundtable at which participants discussed the need for government and the private sector to be active in ensuring AI benefits all parts of the UK, and people are re-skilled as jobs change. Mr Kyle was also interested to explore how the UK can become a more attractive place to scale companies. Key considerations included the need to improve access to talent, capital and infrastructure, as well tackling the regulatory barriers which can make the UK less competitive.

Peter Kyle MP, the Shadow Secretary of State for Science, Innovation and Technology, met academics from the University of Cambridge and leaders from the Cambridge community for a day focused on AI policy and innovation.


Scientists identify how fasting may protect against inflammation

Intermittent fasting conceptual image, showing a plate of food to represent a clock.

In research published in Cell Reports, the team describes how fasting raises levels of a chemical in the blood known as arachidonic acid, which inhibits inflammation. The researchers say it may also help explain some of the beneficial effects of drugs such as aspirin.

Scientists have known for some time that our diet – particularly a high-calorie Western diet – can increase our risk of diseases including obesity, type 2 diabetes and heart disease, which are linked to chronic inflammation in the body.

Inflammation is our body’s natural response to injury or infection, but this process can be triggered by other mechanisms, including by the so-called ‘inflammasome’, which acts like an alarm within our body’s cells, triggering inflammation to help protect our body when it senses damage. But the inflammasome can trigger inflammation in unintentional ways – one of its functions is to destroy unwanted cells, which can result in the release of the cell’s contents into the body, where they trigger inflammation.

Professor Clare Bryant from the Department of Medicine at the University of Cambridge said: “We’re very interested in trying to understand the causes of chronic inflammation in the context of many human diseases, and in particular the role of the inflammasome.

“What's become apparent over recent years is that one inflammasome in particular – the NLRP3 inflammasome – is very important in a number of major diseases such as obesity and atherosclerosis, but also in diseases like Alzheimer's and Parkinson's disease, many of the diseases of older age people, particularly in the Western world.”

Fasting can help reduce inflammation, but the reason why has not been clear. To help answer this question, a team led by Professor Bryant and colleagues at the University of Cambridge and the National Institutes of Health in the USA studied blood samples from a group of 21 volunteers, who ate a 500kcal meal, then fasted for 24 hours before consuming a second 500kcal meal.

The team found that restricting calorie intake increased levels of a lipid known as arachidonic acid. Lipids are molecules that play important roles in our bodies, such as storing energy and transmitting information between cells. As soon as individuals ate a meal again, levels of arachidonic acid dropped.

When the researchers studied arachidonic acid’s effect in immune cells cultured in the lab, they found that it turns down the activity of the NLRP3 inflammasome. This surprised the team as arachidonic acid was previously thought to be linked with increased levels of inflammation, not decreased.

Professor Bryant, a Fellow of Queens’ College, Cambridge, added: “This provides a potential explanation for how changing our diet – in particular by fasting – protects us from inflammation, especially the damaging form that underpins many diseases related to a Western high calorie diet.

“It’s too early to say whether fasting protects against diseases like Alzheimer's and Parkinson's disease as the effects of arachidonic acid are only short-lived, but our work adds to a growing amount of scientific literature that points to the health benefits of calorie restriction. It suggests that regular fasting over a long period could help reduce the chronic inflammation we associate with these conditions. It's certainly an attractive idea.”

The findings also hint at one mechanism whereby a high-calorie diet might increase the risk of these diseases. Studies have shown that some patients who have a high-fat diet have increased levels of inflammasome activity.

“There could be a yin and yang effect going on here, whereby too much of the wrong thing is increasing your inflammasome activity and too little is decreasing it,” said Professor Bryant. “Arachidonic acid could be one way in which this is happening.”

The researchers say the discovery may also offer clues to an unexpected way in which so-called non-steroidal anti-inflammatory drugs such as aspirin work. Normally, arachidonic acid is rapidly broken down in the body, but aspirin stops this process, which can lead to an increase in levels of arachidonic acid, which in turn reduce inflammasome activity and hence inflammation.

Professor Bryant said: “It’s important to stress that aspirin should not be taken to reduce the risk of long-term diseases without medical guidance, as it can have side-effects such as stomach bleeds if taken over a long period.”

The research was funded by Wellcome, the Medical Research Council and the US National Heart, Lung, and Blood Institute Division of Intramural Research.

Reference
Pereira, M & Liang, J et al. Arachidonic acid inhibition of the NLRP3 inflammasome is a mechanism to explain the anti-inflammatory effects of fasting. Cell Reports; 23 Jan 2024; DOI: 10.1016/j.celrep.2024.113700

Cambridge scientists may have discovered a new way in which fasting helps reduce inflammation – a potentially damaging side-effect of the body’s immune system that underlies a number of chronic diseases.

Our work adds to a growing amount of scientific literature that points to the health benefits of calorie restriction
Clare Bryant
Intermittent fasting conceptual image


Religious people coped better with Covid-19 pandemic, research suggests

People in church praying with covid-19 restrictions

People of religious faith may have experienced lower levels of unhappiness and stress than secular people during the UK’s Covid-19 lockdowns in 2020 and 2021, according to a new University of Cambridge study released as a working paper.

The findings follow recently published Cambridge-led research suggesting that worsening mental health after experiencing Covid infection – either personally or in those close to you – was also somewhat ameliorated by religious belief. This study looked at the US population during early 2021.

University of Cambridge economists argue that – taken together – these studies show that religion may act as a bulwark against increased distress and reduced wellbeing during times of crisis, such as a global public health emergency.

“Selection biases make the wellbeing effects of religion difficult to study,” said Prof Shaun Larcom from Cambridge’s Department of Land Economy, and co-author of the latest study. “People may become religious due to family backgrounds, innate traits, or to cope with new or existing struggles.”

“However, the Covid-19 pandemic was an extraordinary event affecting everyone at around the same time, so we could gauge the impact of a negative shock to wellbeing right across society. This provided a unique opportunity to measure whether religion was important for how some people deal with a crisis.”

Larcom and his Cambridge colleagues Prof Sriya Iyer and Dr Po-Wen She analysed survey data collected from 3,884 people in the UK during the first two national lockdowns, and compared it to three waves of data prior to the pandemic.

They found that while lockdowns were associated with a universal uptick in unhappiness, the average increase in feeling miserable was 29% lower for people who described themselves as belonging to a religion.*

The researchers also analysed the data by “religiosity”: the extent of an individual’s commitment to religious beliefs, and how central it is to their life. Those for whom religion makes “some or a great difference” in their lives experienced around half the increase in unhappiness seen in those for whom religion makes little or no difference.**

“The study suggests that it is not just being religious, but the intensity of religiosity that is important when coping with a crisis,” said Larcom.

Those self-identifying as religious in the UK are more likely to have certain characteristics, such as being older and female. The research team “controlled” for these statistically to try to isolate the effects of faith alone, and still found that the probability of an increase in depression was around 20% lower for religious people than for non-religious people.

There was little overall difference between Christians, Muslims and Hindus – followers of the three biggest religions in the UK. However, the team did find that wellbeing among some religious groups appeared to suffer more than others when places of worship were closed during the first lockdown.

“The denial of weekly communal attendance appears to have been particularly affecting for Catholics and Muslims,” said Larcom.

For the earlier study, authored by Prof Sriya Iyer, along with colleagues Kishen Shastry, Girish Bahal and Anand Shrivastava from Australia and India, researchers used online surveys to investigate Covid-19 infections among respondents or their immediate family and friends, as well as religious beliefs, and mental health. 

The study was conducted during February and March 2021, and involved 5,178 people right across the United States, with findings published in the journal European Economic Review in November 2023.

Researchers found that almost half those who reported a Covid-19 infection either in themselves or their immediate social network experienced an associated reduction in wellbeing.

Where mental health declined, it was around 60% worse on average for the non-religious compared to people of faith with typical levels of “religiosity”.***

Interestingly, the positive effects of religion were not found in areas with strictest lockdowns, suggesting access to places of worship might be even more important in a US context. The study also found significant uptake of online religious services, and a 40% lower association between Covid-19 and mental health for those who used them.****

“Religious beliefs may be used by some as psychological resources that can shore up self-esteem and add coping skills, combined with practices that provide social support,” said Prof Iyer, from Cambridge’s Faculty of Economics.

“The pandemic presented an opportunity to glean further evidence of this in both the United Kingdom and the United States, two nations characterised by enormous religious diversity.” 

Added Larcom: “These studies show a relationship between religion and lower levels of distress during a global crisis. It may be that religious faith builds resilience, and helps people cope with adversity by providing hope, consolation and meaning in tumultuous times.”  

Two Cambridge-led studies suggest that the psychological distress caused by lockdowns (UK) and experience of infection (US) was reduced among those of faith compared to non-religious people.  

Getty/Luis Alvarez
People in church praying with covid-19 restrictions
Notes

* During lockdown, the increase in the mean measure of unhappiness was 6.1 percent for people who do not identify with a religion, compared with an increase of 4.3 percent for those who do belong to a religion – a difference of 29%.

** For those to whom religion makes little or no difference, the increase was 6.3 percent. For those for whom religion makes some or a great difference, the increase was around half that, at 3 percent and 3.5 percent respectively.

*** This was after controlling for various demographic and environmental traits, including age, race, income, and average mental health rates prior to the pandemic.

**** The interpretation is from Column 1 of Table 5 (determinants of mental health, online access to religion), where the coefficient for Covid {Not accessed online service} is 2.265 and the coefficient for Covid {Accessed online service} is 1.344. The difference is 2.265 - 1.344 = 0.921, which is roughly 40% of 2.265.
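For readers who want to reproduce the arithmetic in the note above, here is a trivial check using only the two coefficients quoted there:

```python
# Reproducing the note's calculation from the two quoted coefficients.
coef_no_online = 2.265   # Covid {Not accessed online service}
coef_online = 1.344      # Covid {Accessed online service}
relative_reduction = (coef_no_online - coef_online) / coef_no_online
print(f"difference = {coef_no_online - coef_online:.3f}, "
      f"relative reduction = {relative_reduction:.1%}")
```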


Robot trained to read braille at twice the speed of humans

Robot braille reader

The research team, from the University of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. The robot was able to read the braille at 315 words per minute at close to 90% accuracy.

Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips. The results are reported in the journal IEEE Robotics and Automation Letters.

Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.

Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In Professor Fumiya Iida’s lab in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.

“The softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”

Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. The researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.

“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering. “Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”

The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.

The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character.
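As a minimal sketch of the data-generation step described above (our own illustration, not the authors' code), synthetic motion blur can be applied to sharp braille images to create (blurred, sharp) training pairs; the random image and kernel length here are placeholders.

```python
# Sketch: pair each sharp braille image with a synthetically blurred copy,
# mimicking the motion blur caused by sliding. Placeholder data only.
import numpy as np
from scipy.ndimage import convolve

def motion_blur_kernel(length: int) -> np.ndarray:
    """Horizontal line kernel approximating blur along the sliding direction."""
    kernel = np.zeros((length, length))
    kernel[length // 2, :] = 1.0 / length
    return kernel

def make_training_pair(sharp: np.ndarray, length: int = 9):
    """Return (blurred, sharp) so a network can learn to undo the blur."""
    blurred = convolve(sharp, motion_blur_kernel(length), mode="nearest")
    return blurred, sharp

rng = np.random.default_rng(0)
sharp_frame = rng.random((64, 64))          # stand-in for a sharp braille frame
blurred_frame, target = make_training_pair(sharp_frame)
print(blurred_frame.shape, target.shape)
```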

Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human braille reader.

“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”

“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.

In future, the researchers are hoping to scale the technology to the size of a humanoid hand or skin. The research was supported in part by the Samsung Global Research Outreach Program.

 

Reference:
Parth Potdar et al. ‘High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions.’ IEEE Robotics and Automation Letters (2024). DOI: 10.1109/LRA.2024.3356978

Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.

Parth Potdar
Robot braille reader


Cambridge student Giulio Regeni remembered

Elisabeth Kendall, Mistress of Girton College, unveils the plaque honouring Giulio Regeni.

The plaque offers a space in which colleagues and friends of the Cambridge PhD student, who studied at Girton, can pay their respects.

Giulio, an experienced researcher, was conducting fieldwork when he was abducted from the streets of Cairo on 25 January 2016, and later found murdered on 3 February 2016. The plaque unveiling marks the eighth anniversary of his death. No one has yet been convicted of the crime.

Court officials in Rome have charged four Egyptian security officials with Giulio’s abduction, torture and murder, and a trial is due to begin in February. The College and University continue to stand in support of Giulio’s family and friends, and with Amnesty International, in their tireless efforts to uncover the truth of what happened to Giulio.

Elisabeth Kendall, Mistress of Girton College, said: “The loss of Giulio continues to cast a dark shadow over all those who knew him. Giulio was a passionate researcher with a deep sense of justice who had his whole life ahead of him before it was cruelly ended in Cairo. Justice has yet to be done. We will never stop remembering Giulio.”

Every year the College marks the anniversary by flying the College flag at half-mast in his memory, on 25 January and again on 3 February.

Giulio Regeni was remembered during an event at Girton College, where a plaque was unveiled in his honour.

Giulio was a passionate researcher with a deep sense of justice.
Elisabeth Kendall, Mistress of Girton College
Girton College, University of Cambridge.
Elisabeth Kendall, Mistress of Girton College, unveils the plaque honouring Giulio Regeni.


New Pro-Vice-Chancellor for Innovation appointed

Dr Diarmuid O’Brien

Dr O’Brien will take over from current Senior Pro-Vice-Chancellor Andy Neely, whose term of office finishes at the end of February. Dr O’Brien, who has a PhD in Physics from the University of Sheffield and a degree in Materials Science from Trinity College Dublin, joined Cambridge Enterprise from Trinity College Dublin, where he was Chief Innovation Officer. At Cambridge Enterprise he has led a new strategy which has supported activities such as the establishment of Innovate Cambridge, the formation of Founders at the University of Cambridge, the integration and renewal of ideaSpace and the commencement of the Technology Investment Fund to support the development of University intellectual property.

“The University and the broader Cambridge ecosystem are recognised as being globally leading for innovation, enterprise and entrepreneurship,” said Dr O’Brien. “I have seen this first-hand from my role as Chief Executive of Cambridge Enterprise and in helping to establish Innovate Cambridge. I look forward to my new role as Pro-Vice-Chancellor for Innovation and continuing to enhance the ambition for how the University of Cambridge can enable impact from our research and through our innovation partnerships.”

He replaces Professor Andy Neely, who has served as Pro-Vice-Chancellor for Enterprise and Business Relations since March 2017, and received an OBE for services to University/Industry Collaboration in 2020. Professor Neely’s achievements as Pro-Vice-Chancellor included leading the University’s Recovery Programme helping the University respond to the coronavirus pandemic, overseeing the establishment of the Change and Programme Management Board, as well as building far stronger links with the local and regional innovation community through important initiatives such as Innovate Cambridge.

Professor Neely said: “I’m honoured to have served in this role for seven years and delighted that Diarmuid has been appointed as my successor. The University of Cambridge’s impact on the world is significantly enhanced by our engagement with business and our world-leading innovation ecosystem and I have no doubt that this will go from strength to strength under Diarmuid’s leadership”.

The University of Cambridge Vice-Chancellor Professor Deborah Prentice welcomed Dr O’Brien to the role and thanked Professor Neely for his service.

She said: “I warmly congratulate Diarmuid on being appointed to this important role. With his wealth of experience in driving innovation, most recently at Cambridge Enterprise, he will help ensure no momentum is lost in the handover from the previous Pro-Vice-Chancellor, Andy Neely.
“I would like to put on record my sincerest thanks to Andy for his service to Cambridge, both as an academic leader and as Pro-Vice-Chancellor for Enterprise and Business Relations. I know I speak on behalf of all University colleagues when I say how grateful we are for what he has achieved in that role over the past seven years.”

The Pro-Vice-Chancellor for Innovation is broadly the same role as the current Pro-Vice-Chancellor for Enterprise and Business Relations role, but with an enhanced focus on industry, enterprise and innovation.

Dr O’Brien takes up the role in April, and will continue to work one day a week at Cambridge Enterprise to provide continuity and connection with the organisation.

There are five Pro-Vice-Chancellors at the University of Cambridge. Their role is to work in partnership with senior administrators to help drive strategy and policy development. The Pro-Vice-Chancellors also support the Vice-Chancellor in providing academic leadership to the University.
 

Dr Diarmuid O’Brien has been appointed as the University of Cambridge’s new Pro-Vice-Chancellor for Innovation. He is currently Chief Executive of Cambridge Enterprise, the University’s commercialisation arm which supports academics, researchers, staff and students in achieving knowledge transfer and research impact.

Dr Diarmuid O’Brien


Removing largest wine glass serving reduces amount of wine sold in bars and pubs

Red and white wine in glasses

While only modest, the finding could provide one way of nudging customers to drink less alcohol and have an impact at a population level, say the researchers.

Alcohol consumption is the fifth largest contributor to premature death and disease worldwide. In 2016 it was estimated to have caused approximately 3 million deaths worldwide.

There are many factors that influence how much we drink, from advertising to labelling to availability and cost. Previous research from the Behaviour and Health Research Unit at Cambridge has shown that even glass size can influence how much alcohol is consumed.

In research published today in PLOS Medicine, the Cambridge team carried out a study in 21 licensed premises (mainly pubs) in England to see whether removing their largest serving of wine by the glass for four weeks would have an impact on how much wine is consumed. Wine is the most commonly drunk alcoholic drink in the UK and Europe. Twenty of the premises completed the experiment as designed by the researchers and were included in the final analysis.

After adjusting for factors such as day of the week and total revenue, the researchers found that removing the largest wine glass serving led to an average (mean) decrease of 420ml of wine sold per day per venue – equating to a 7.6% decrease.

There was no evidence that sales of beer and cider increased, suggesting that people did not compensate for their reduced wine consumption by drinking more of these alcoholic drinks. There was also no evidence that it affected total daily revenues, implying that participating licensed premises did not lose money as a result of removing the largest serving size for glasses of wine, perhaps due to the higher profit margins of smaller serving sizes of wine. However, it is important to note that the study was not designed to provide statistically meaningful data on these points.

First author Dr Eleni Mantzari, from the University of Cambridge, said: “It looks like when the largest serving size of wine by the glass was unavailable, people shifted towards the smaller options, but didn’t then drink the equivalent amount of wine.

“People tend to consume a specific number of ‘units’ – in this case glasses – regardless of portion size. So, someone might decide at the outset they’ll limit themselves to a couple of glasses of wine, and with less alcohol in each glass they drink less overall.”

Professor Dame Theresa Marteau, the study’s senior author and an Honorary Fellow at Christ’s College Cambridge, added: “It’s worth remembering that no level of alcohol consumption is considered safe for health, with even light consumption contributing to the development of many cancers. Although the reduction in the amount of wine sold at each premise was relatively small, even a small reduction could make a meaningful contribution to population health.”

Evidence suggests that the public prefer information-based interventions, such as health warning labels, to reductions in serving or package sizes. However, in this study, managers at just four of the 21 premises reported receiving complaints from customers.

The researchers note that although the intervention would potentially be acceptable to pub or bar managers, given there was no evidence that it can result in a loss in revenue, a nationwide policy would likely be resisted by the alcohol industry given its potential to reduce sales of targeted drinks. Public support for such a policy would depend on its effectiveness and how clearly this was communicated.

The research was funded by Wellcome.

Reference
Mantzari, E et al. Impact on wine sales of removing the largest serving size by the glass: an A-B-A reversal trial in 21 pubs, bars and restaurants in England. PLOS Medicine; DOI: 10.1371/journal.pmed.1004313

Taking away the largest serving of wine by the glass – in most cases the 250ml option – led to an average reduction in the amount of wine sold at pubs and bars of just under 8%, new research led by a team at the University of Cambridge has discovered.

When the largest serving size of wine by the glass was unavailable, people shifted towards the smaller options, but didn’t then drink the equivalent amount of wine
Eleni Mantzari
Red and white wine in glasses


Galaxy mergers solve early Universe mystery

This image shows the galaxy EGSY8p7, a bright galaxy in the early Universe where light emission is seen from, among other things, excited hydrogen atoms – Lyman-α emission.

This has solved one of the most puzzling mysteries in astronomy – why astronomers detect light from hydrogen atoms that should have been entirely blocked by the pristine gas that formed after the Big Bang.

These new observations have found small, faint objects surrounding the galaxies that show the ‘inexplicable’ hydrogen emission. In conjunction with state-of-the-art simulations of galaxies in the early Universe, the observations have shown that the chaotic merging of these neighbouring galaxies is the source of this hydrogen emission. The results are reported in the journal Nature Astronomy.

Light travels at a finite speed (300,000 kilometres per second), which means that the further away a galaxy is, the longer its light has taken to reach our Solar System. As a result, not only do observations of the most distant galaxies probe the far reaches of the Universe, but they also allow us to study the Universe as it was in the past.
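To make this concrete, a short calculation with the astropy library (assuming it is installed; the redshift used here is purely an illustrative value, not a figure from the study) converts a redshift into a lookback time:

```python
# Illustrative only: how far back in time a given redshift corresponds to.
from astropy.cosmology import Planck18

z = 8.7  # example redshift for a very early galaxy
print(f"lookback time at z = {z}: {Planck18.lookback_time(z):.2f}")
print(f"age of the Universe then: {Planck18.age(z).to('Myr'):.0f}")
```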

To study the early Universe, astronomers require exceptionally powerful telescopes that are capable of observing very distant – and therefore very faint – galaxies. One of Webb’s key capabilities is its ability to observe these galaxies, and probe the early history of the Universe.

The earliest galaxies were sites of vigorous and active star formation, and were rich sources of a type of light emitted by hydrogen atoms called Lyman-α emission. However, during the epoch of reionisation, an immense amount of neutral hydrogen gas surrounded these stellar nurseries. Furthermore, the space between galaxies was filled by more of this neutral gas than is the case today. The gas can effectively absorb and scatter this kind of hydrogen emission, so astronomers have long predicted that the abundant Lyman-α emission released in the early Universe should not be observable today.

This theory has not always stood up to scrutiny, however, as examples of early hydrogen emission have previously been observed by astronomers. This has presented a mystery: how is it that this hydrogen emission – which should have long since been absorbed or scattered – is being observed?

“One of the most puzzling issues that previous observations presented was the detection of light from hydrogen atoms in the very early Universe, which should have been entirely blocked by the pristine neutral gas that was formed after the Big Bang,” said lead author Callum Witten from Cambridge’s Institute of Astronomy. “Many hypotheses have previously been suggested to explain the great escape of this ‘inexplicable’ emission.”

The team’s breakthrough came thanks to Webb’s combination of angular resolution and sensitivity. The observations with Webb’s NIRCam instrument were able to resolve smaller, fainter galaxies that surround the bright galaxies from which the ‘inexplicable’ hydrogen emission had been detected. In other words, the surroundings of these galaxies appear to be a much busier place than we previously thought, filled with small, faint galaxies.

These smaller galaxies were interacting and merging with one another, and Webb has revealed that galaxy mergers play an important role in explaining the mystery emission from the earliest galaxies.

“Where Hubble was seeing only a large galaxy, Webb sees a cluster of smaller interacting galaxies, and this revelation has had a huge impact on our understanding of the unexpected hydrogen emission from some of the first galaxies,” said co-author Sergio Martin-Alvarez from Stanford University.

The team then used computer simulations to explore the physical processes that might explain their results. They found that the rapid build-up of stellar mass through galaxy mergers both drove strong hydrogen emission and facilitated the escape of that radiation via channels cleared of the abundant neutral gas. So, the high merger rate of the previously unobserved smaller galaxies presented a compelling solution to the long-standing puzzle of the ‘inexplicable’ early hydrogen emission.

The team is planning follow-up observations with galaxies at various stages of merging, to continue to develop their understanding of how the hydrogen emission is ejected from these changing systems. Ultimately, this will enable them to improve our understanding of galaxy evolution.

Reference:
Callum Witten et al. ‘Deciphering Lyman-α emission deep into the epoch of reionization.’ Nature Astronomy (2024). DOI: 10.1038/s41550-023-02179-3

Adapted from an ESA press release.

A team of astronomers, led by the University of Cambridge, has used the NASA/ESA/CSA James Webb Space Telescope to reveal, for the first time, what lies in the local environment of galaxies in the very early Universe.

Zooming in on three neighbouring galaxies (NIRCam image)


New admissions tests for 2024

Senate House

Cambridge and Imperial will provide two sets of tests. The Engineering and Science Admissions Test (ESAT) will be used for degree programmes in Engineering, Natural Sciences, Chemical Engineering and Biotechnology, and Veterinary Medicine at Cambridge, and Physics and most Engineering degrees at Imperial.

The Test of Mathematics for University Admission (TMUA) will be used for Economics and Computer Science degrees at Cambridge, and both the Economics, Finance and Data Science and Computing degrees at Imperial. A number of other UK universities will also use the TMUA for assessing applications for mathematically-based courses.

Pearson VUE is the certification and licensure arm of Pearson, the world’s leading learning company, providing assessment services to many institutions in the academic and admissions space. From October 2024, students will take a new computer-based assessment at a Pearson VUE test centre, selecting from a global network of more than 5,500 locations in more than 180 countries. Mike Nicholson, Director of Recruitment, Admissions and Participation at Cambridge, said: “We are delighted to be able to provide computer-based admissions tests from 2024, and in locations that take the burden off teachers and schools to act as test centres.”

Cambridge and Imperial will also be using the UCAT assessment for admission to their medical degrees from 2024, also provided through Pearson VUE, and Cambridge will continue to use the LNAT test for Law admissions.

Applicants will be required to pay an administration charge to take the tests, in line with other comparable institutions, but a fee waiver will be applied for UK-based applicants who are eligible for free school meals or who meet a number of other widening participation criteria. Nicholson added: “It is important that cost is not a barrier to participation, and the model we are using for the fee waivers has been successfully used for other admissions tests supported by Pearson.”

Lizzie Burrows, Director of Marketing, Recruitment and Admissions at Imperial, said: “The applicant experience is at the heart of our ambitions. With the number of applications expected to continue to rise over coming years, universities need to find ways to fairly select the best candidates while minimising the burden on our applicants.

“We hope that these tests, operating through Pearson VUE’s well-established test centre network, will encourage other universities to use the TMUA and ESAT as assessments and streamline the admissions process for students.”

To attract a wider range of applicants the TMUA and ESAT will run test-sittings in mid-October 2024 and early January 2025 to reflect the two main deadlines for courses in the UCAS admissions process. Applicants to Cambridge must take the Autumn sitting.

Matthew Poyiadgi, Vice President EMEA and Asia at Pearson VUE, commented: “As academic settings and admissions programmes continue to evolve in an increasingly digital world, computer-based assessments drive greater efficiencies. We look forward to collaborating with Imperial and Cambridge on this transition and supporting applicants to these world-leading universities in proving their potential.”

More information can be found here.

The University of Cambridge and Imperial College London are to launch a new joint venture to deliver admissions tests for science, engineering and mathematics based degree courses. The tests, which will be delivered by global assessments leader, Pearson VUE, aim to improve the experience of students applying for highly competitive undergraduate courses while helping universities to fairly assess the skills of the brightest applicants. 

We are delighted to be able to provide computer-based admissions tests from 2024
Mike Nicholson
Senate House


‘Mini-placentas’ help scientists understand the causes of pre-eclampsia and pregnancy disorders

Microscopic image of placental organoids

The study, published today in Cell Stem Cell, shows that it is possible to experiment on a developing human placenta, rather than merely observe specimens, in order to study major disorders of pregnancy.

Successful pregnancy depends on the development of the placenta in the first few weeks of gestation. During this period, the placenta implants itself into the endometrium – the mucosal lining of the mother’s uterus.

Interactions between the cells of the endometrium and the cells of the placenta are critical to whether a pregnancy is successful. In particular, these interactions are essential to increase the maternal blood supply to the placenta, necessary for fetal growth and development.

When these interactions do not work properly, they can lead to complications, such as pre-eclampsia, a condition that causes high blood pressure during pregnancy. Pre-eclampsia occurs in around six in 100 first pregnancies and can put at risk the health of both the mother and the baby.

Professor Ashley Moffett from the Department of Pathology at the University of Cambridge said: “Most of the major disorders of pregnancy – pre-eclampsia, stillbirth, growth restriction, for example – depend on failings in the way the placenta develops in the first few weeks. This is a process that is incredibly difficult to study – the period after implantation, when the placenta embeds itself into the endometrium, is often described as a ‘black box of human development’.

“Over the past few years, many scientists – including several at Cambridge – have developed embryo-like models to help us understand early pre-implantation development. But further development is impeded because we understand so little about the interactions between the placenta and the uterus.”

Professor Moffett and colleagues at the Friedrich Miescher Institute, Switzerland, and the Wellcome Sanger Institute, Cambridge, have used ‘mini-placentas’ – a cellular model of the early stages of the placenta – to provide a window into early pregnancy and help improve our understanding of reproductive disorders. Known as ‘trophoblast organoids’, these are grown from placenta cells and model the early placenta so closely that they have previously been shown to record a positive response on an over-the-counter pregnancy test.

In previous work, Professor Moffett and colleagues identified genes that increase the risk of or protect against conditions such as pre-eclampsia. These highlighted the important role of immune cells uniquely found in the uterus, known as ‘uterine natural killer cells’, which cluster in the lining of the womb at the site where the placenta implants. These cells mediate the interactions between the endometrium and the cells of the placenta.

In their new study, her team applied proteins secreted by the uterine natural killer cells to the trophoblast organoids so that they could mimic the conditions where the placenta implants itself. They identified particular proteins that were crucial to helping the organoids develop. These proteins will contribute to successful implantation, allowing the placenta to invade the uterus and transform the mother’s arteries.

“This is the only time that we know of where a normal cell invades and transforms an artery, and these cells are coming from another individual, the baby,” said Professor Moffett, who is also a Fellow at King’s College, Cambridge.

“If the cells aren’t able to invade properly, the arteries in the womb don’t open up and so the placenta – and therefore the baby – are starved of nutrients and oxygen. That's why you get problems later on in pregnancy, when there just isn't enough blood to feed the baby and it either dies or is very tiny.”

The researchers also found several genes that regulate blood flow and help with this implantation, which Professor Moffett says provide pointers for future research to better understand pre-eclampsia and similar disorders.

Dr Margherita Turco, from the Friedrich Miescher Institute in Switzerland and co-lead of this work, added: “Despite affecting millions of women a year worldwide, we still understand very little about pre-eclampsia. Women usually present with pre-eclampsia at the end of pregnancy, but really to understand it – to predict it and prevent it – we have to look at what's happening in the first few weeks.

“Using ‘mini-placentas’, we can do just that, providing clues as to how and why pre-eclampsia occurs. This has helped us unpick some of the key processes that we should now focus on far more. It shows the power of basic science in helping us understand our fundamental biology, something that we hope will one day make a major difference to the health of mothers and their babies.”

The research was supported by Wellcome, the Royal Society, European Research Council and Medical Research Council.

Reference
Li, Q et al. Human uterine natural killer cells regulate differentiation of extravillous trophoblast early in pregnancy. Cell Stem Cell; 17 Jan 2024; DOI: 10.1016/j.stem.2023.12.013

Volunteers wanted for women's health study

University of Cambridge researchers at Addenbrooke’s Hospital are looking for volunteers who are planning their first pregnancy, to take part in a new study focussing on pregnancy and women’s long-term health (the POPPY study).

The POPPY study aims to understand why some women develop pre-eclampsia and other placental complications and why these conditions have an adverse effect on women’s future heart health.

If you are aged 18-45 years and are planning your first pregnancy, you may be eligible to participate. We are also looking for similar aged volunteers who are not actively planning a pregnancy, for a control group.

Reimbursement is provided for time, inconvenience and travel.

To find out more, please visit the POPPY study website or email the POPPY study team.

18 January 2024

 

Scientists have grown ‘mini-placentas’ in the lab and used them to shed light on how the placenta develops and interacts with the inner lining of the womb – findings that could help scientists better understand and, in future, potentially treat pre-eclampsia.

Most of the major disorders of pregnancy – pre-eclampsia, stillbirth, growth restriction, for example – depend on failings in the way the placenta develops in the first few weeks. This is a process that is incredibly difficult to study.
Ashley Moffett
Friedrich Miescher Institute/University of Cambridge
Placental organoid (circle in the centre). Trophoblast cells are invading out of the organoid, mimicking placental cells invading the uterus in the early weeks of pregnancy.


Astronomers detect oldest black hole ever observed

The GN-z11 galaxy, taken by the Hubble Space Telescope

The international team, led by the University of Cambridge, used the NASA/ESA/CSA James Webb Space Telescope (JWST) to detect the black hole, which dates from 400 million years after the big bang, more than 13 billion years ago. The results, which lead author Professor Roberto Maiolino says are “a giant leap forward”, are reported in the journal Nature.

That this surprisingly massive black hole – a few million times the mass of our Sun – even exists so early in the universe challenges our assumptions about how black holes form and grow. Astronomers believe that the supermassive black holes found at the centre of galaxies like the Milky Way grew to their current size over billions of years. But the size of this newly-discovered black hole suggests that they might form in other ways: they might be ‘born big’, or they might eat matter at a rate five times higher than had been thought possible.

According to standard models, supermassive black holes form from the remnants of dead stars, which collapse and may form a black hole about a hundred times the mass of the Sun. Growing in the expected way, this newly-detected black hole would have taken about a billion years to reach its observed size. However, the universe was not yet a billion years old when this black hole was detected.
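To see why the timing is such a problem, a rough back-of-the-envelope calculation can be written down for growth at the Eddington limit. The numbers below are illustrative assumptions (a standard radiative efficiency of about 10% and a stellar-remnant seed), not figures taken from the paper:

```latex
% Illustrative Eddington-limited growth of a black hole seed (assumed values, not from the paper)
\[
  M(t) = M_{\mathrm{seed}}\,\exp\!\left(\frac{t}{t_{e}}\right),
  \qquad
  t_{e} = \frac{\epsilon}{1-\epsilon}\,\frac{\sigma_{T}\,c}{4\pi G m_{p}}
        \approx 50~\mathrm{Myr}
  \quad \text{for } \epsilon \approx 0.1 .
\]
% Growing from a ~100 solar-mass remnant to a few million solar masses needs
% ln(10^4 - 10^5) ~ 9-11 e-foldings: roughly half a billion years of *continuous*
% accretion at the Eddington limit, and closer to a billion years once accretion
% pauses -- against the ~400 million years available at the epoch observed.
```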

“It’s very early in the universe to see a black hole this massive, so we’ve got to consider other ways they might form,” said Maiolino, from Cambridge’s Cavendish Laboratory and Kavli Institute for Cosmology. “Very early galaxies were extremely gas-rich, so they would have been like a buffet for black holes.”

Like all black holes, this young black hole is devouring material from its host galaxy to fuel its growth, yet it gobbles matter far more vigorously than its counterparts at later epochs.

The young host galaxy, called GN-z11, glows from such an energetic black hole at its centre. Black holes cannot be directly observed, but instead they are detected by the tell-tale glow of a swirling accretion disc, which forms near the edges of a black hole. The gas in the accretion disc becomes extremely hot and starts to glow and radiate energy in the ultraviolet range. This strong glow is how astronomers are able to detect black holes.

GN-z11 is a compact galaxy, about one hundred times smaller than the Milky Way, but the black hole is likely harming its development. When a black hole consumes too much gas, the excess is expelled as an ultra-fast wind. This ‘wind’ could halt star formation, slowly killing the galaxy, but it would also kill the black hole itself, by cutting off its source of ‘food’.

Maiolino says that the gigantic leap forward provided by JWST makes this the most exciting time in his career. “It’s a new era: the giant leap in sensitivity, especially in the infrared, is like upgrading from Galileo’s telescope to a modern telescope overnight,” he said. “Before Webb came online, I thought maybe the universe isn’t so interesting when you go beyond what we could see with the Hubble Space Telescope. But that hasn’t been the case at all: the universe has been quite generous in what it’s showing us, and this is just the beginning.”

Maiolino says that the sensitivity of JWST means that even older black holes may be found in the coming months and years. Maiolino and his team are hoping to use future observations from JWST to try to find smaller ‘seeds’ of black holes, which may help them untangle the different ways that black holes might form: whether they start out large or they grow fast.

The research was supported in part by the European Research Council, the Royal Society, and the Science and Technology Facilities Council (STFC), part of UK Research and Innovation (UKRI).

 

Reference:
Roberto Maiolino et al. ‘A small and vigorous black hole in the early Universe.’ Nature (2024). DOI: 10.1038/s41586-024-07052-5

Researchers have discovered the oldest black hole ever observed, dating from the dawn of the universe, and found that it is ‘eating’ its host galaxy to death.

It’s a new era: the giant leap in sensitivity, especially in the infrared, is like upgrading from Galileo’s telescope to a modern telescope overnight
Roberto Maiolino
The GN-z11 galaxy, taken by the Hubble Space Telescope


Role of inherited genetic variants in rare blood cancer uncovered

DNA

Large-scale genetic analysis has helped researchers uncover the interplay between cancer-driving genetic mutations and inherited genetic variants in a rare type of blood cancer.

Researchers from the University of Cambridge, Wellcome Sanger Institute, and collaborators, combined various comprehensive data sets to understand the impact of both cancer-driving spontaneous mutations and inherited genetic variation on the risk of developing myeloproliferative neoplasms (MPN).

The study, published in the journal Nature Genetics, describes how inherited genetic variants can influence whether a spontaneous mutation in a particular gene increases the risk of developing this rare blood cancer.

This analysis could improve current clinical predictions of disease development in individuals. Further research is required to understand the biological mechanisms behind how these inherited genetic variants influence the chances of developing this rare blood cancer. In the future, this knowledge could aid drug development and interventions that reduce the risk of disease.

Myeloproliferative neoplasms, MPNs, are a group of rare, chronic, blood cancers. There are around 4,000 cases of MPN in the UK each year. These occur when the bone marrow overproduces blood cells, which can result in blood clots and bleeding. MPNs can also progress into other forms of blood cancer, such as leukaemia.

Across the population there is a large amount of natural variation between individuals’ blood cells, which can affect the number of blood cells a person has and their particular traits. This is because multiple different genes can influence blood cell features in an individual. Using routine blood tests and known information about these genes, researchers can analyse this variation to produce a genetic risk score: an estimate of how likely that individual is to develop a disease over their lifetime.
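As a rough illustration of how such a score is built (a minimal sketch only, not the study’s actual pipeline; the variant identifiers and weights below are invented for the example), a genetic risk score is essentially a weighted sum of the risk variants an individual carries:

```python
# Minimal sketch of a genetic (polygenic) risk score: a weighted sum of risk-allele counts.
# The variant identifiers and weights below are invented for illustration only.

variant_weights = {          # effect sizes (e.g. log odds ratios) from a reference study
    "rs0000001": 0.12,
    "rs0000002": -0.05,
    "rs0000003": 0.31,
}

def genetic_risk_score(genotypes: dict[str, int]) -> float:
    """genotypes maps variant ID -> number of risk alleles carried (0, 1 or 2)."""
    return sum(weight * genotypes.get(variant, 0)
               for variant, weight in variant_weights.items())

# Example: one individual's genotype at the three illustrative variants
print(genetic_risk_score({"rs0000001": 2, "rs0000002": 1, "rs0000003": 0}))  # 0.19
```

Individuals whose score falls towards the upper tail of the population distribution are the ones flagged as being at higher lifetime risk.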

MPNs have been linked to random somatic mutations in certain genes, including one called JAK2. However, mutated JAK2 is commonly found in the global population, and the vast majority of these individuals do not have, or go on to develop, MPN.

Whilst previous studies have identified over a dozen inherited genetic variants associated with an increased risk of MPN, they do not sufficiently explain why most individuals in the population never go on to develop MPN.

This new study, from the Wellcome Sanger Institute and collaborators, combined information on the known somatic driver mutations in MPN, inherited genetic variants, and genetic risk scores from individuals with MPN.

They found that the inherited variants that cause natural blood cell variation in the population also impact whether a JAK2 somatic mutation will go on to cause MPN. They also found that individuals with an inherited tendency towards a higher blood cell count could display MPN features in the absence of cancer-driving mutations, thus mimicking the disease.

Dr Jing Guo, from the University of Cambridge and the Wellcome Sanger Institute and first author of the study, said: “Our large-scale statistical study has helped fill the knowledge gaps in how variants in DNA, both inherited and somatic, interact to influence complex disease risk. By combining these three different types of datasets we were able to get a more complete picture of how these variants combine to cause blood disorders.”

Professor Nicole Soranzo, co-senior author from the University of Cambridge, the Wellcome Sanger Institute, and Human Technopole, Italy, said: “There has been increasing realisation that human diseases have complex causes involving a combination of common and rare inherited genetic variants with different severity.

“We have previously shown that variation in blood cell parameters and function has complex genetic variability by highlighting thousands of genetic changes that affect different gene functions. Here, we show for the first time that common variants in these genes also affect blood cancers, independent of causative somatic mutations. This confirms a new important contribution of normal variability beyond complex disease, contributing to our understanding of myeloproliferative neoplasms and blood cancer more generally.”

Dr Jyoti Nangalia, co-senior author from the Wellcome-MRC Cambridge Stem Cell Institute at the University of Cambridge, and the Wellcome Sanger Institute, said: “We have a good understanding of the genetic causes of myeloproliferative neoplasms. In fact, many of these genetic mutations are routine diagnostic tests in the clinic. However, these mutations can often be found in healthy individuals without the disease.

“Our study helps us understand how inherited DNA variation from person to person can interact with cancer-causing mutations to determine whether disease occurs in the first place, and how this can alter the type of any subsequent disease that emerges. Our hope is that this information can be incorporated into future disease prediction efforts.”  

This research was funded by Cancer Research UK and Wellcome.

Reference

Guo, J, Walter, K, Quiros, P M, et al. ‘Inherited polygenic effects on common hematological traits influence clonal selection on JAK2V617F and the development of myeloproliferative neoplasms.’ Nature Genetics; Jan 2024; DOI: 10.1038/s41588-023-01638-x

Adapted from a press release by the Wellcome Sanger Institute

Combining three different sources of genetic information has allowed researchers to further understand why only some people with a common mutation go on to develop rare blood cancer.

Our hope is that this information can be incorporated into future disease prediction efforts
Jyoti Nangalia
DNA


Significant gaps in COVID-19 vaccine uptake may have led to over 7,000 hospitalisations and deaths

Young girl watching herself being injected with COVID-19 vaccine at a medical clinic

The findings, published today in The Lancet, suggest that more than 7,000 hospitalisations and deaths might have been averted in summer 2022 if the UK had had better vaccine coverage.

With COVID-19 cases on the rise and a new variant strain recently identified, this research provides a timely insight into vaccine uptake and hesitancy and could inform policy-makers.

The research relied on secure access to anonymised health data for everyone in all four nations of the UK, an advance which has only become possible during the pandemic.

Co-author Angela Wood, Professor of Health Data Science at the Victor Phillip Dahdaleh Heart & Lung Research Institute, University of Cambridge and Associate Director of the British Heart Foundation Data Science Centre said: “This is the first epidemiological study to use individual-level anonymised health data covering the entire UK population. We have created a detailed, UK-wide picture of who is under-vaccinated against COVID-19 and the associated risks of under-vaccination.

“These results can be used to help create health policy and public health interventions to improve vaccine uptake. This approach could be extended to many other areas of medicine with great potential for new discoveries in the understanding and treatment of disease.”

Early COVID-19 vaccine rollout began strongly in the UK, with over 90% of the population over the age of 12 vaccinated with at least one dose by January 2022. However, rates of subsequent booster doses across the UK were not fully understood until now.

Scientists from England, Scotland, Northern Ireland and Wales – led by Health Data Research UK (HDR UK) and the University of Edinburgh – studied securely-held, routinely collected NHS data from everyone over five years of age between 1 June and 30 September 2022. All data was de-identified and available only to approved researchers.

Data from across the four countries was then pooled and harmonised, a feat that was not possible until now. People were grouped by vaccine status, with under-vaccination defined as not having had all the vaccine doses for which a person was eligible.

The findings reveal that the proportion of people who were under-vaccinated on 1 June 2022 ranged between one third and one half of the population – 45.7% for England, 49.8% for Northern Ireland, 34.2% for Scotland and 32.8% for Wales.

Mathematical modelling indicated that 7,180 of the roughly 40,400 severe COVID-19 outcomes (hospitalisations and deaths) recorded over four months in summer 2022 might have been averted had the UK population been fully vaccinated.
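The logic of such an ‘averted outcomes’ estimate can be sketched in a few lines. The simple rate-ratio approach and the figures below are assumptions made for the example, not the study’s actual model or inputs:

```python
# Illustrative only: estimating outcomes that might have been averted if an
# under-vaccinated group had instead faced the risk seen in fully vaccinated people.
# The approach and figures below are assumptions for the example, not the study's.

def averted_outcomes(observed_in_undervaccinated: float, rate_ratio: float) -> float:
    """rate_ratio = risk in under-vaccinated / risk in fully vaccinated (> 1).
    Had the group faced the fully vaccinated risk, its expected burden would be
    observed / rate_ratio; the difference is the estimated averted count."""
    expected_if_fully_vaccinated = observed_in_undervaccinated / rate_ratio
    return observed_in_undervaccinated - expected_if_fully_vaccinated

# e.g. 12,000 severe outcomes in an under-vaccinated group with a rate ratio of 2
print(round(averted_outcomes(12_000, 2.0)))  # 6000
```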

Under-vaccination was related to significantly more hospitalisations and deaths across all age groups studied, with under-vaccinated people over 75 more than twice as likely to have a severe COVID-19 outcome as those who were fully protected.

The highest rates of under-vaccination were found in younger people, men, people in areas of higher deprivation, and people of non-white ethnicity.

Researchers say the study – the largest ever carried out in the UK – also ushers in a new era for UK science by overcoming the challenges of uniting NHS data that is gathered and stored in different ways across the devolved nations.

Professor Cathie Sudlow, Chief Scientist at Health Data Research UK and Director of the British Heart Foundation (BHF) Data Science Centre, said: “The infrastructure now exists to make full use of the potential of routinely collected data in the NHS across the four nations of the UK. We believe that we could and should extend these approaches to many other areas of medicine, such as cancer, heart disease and diabetes to search for better understanding, prevention and treatment of disease."

Professor Sir Aziz Sheikh, Director of the Usher Institute at the University of Edinburgh, HDR UK Research Director and study co-lead, said: “Large-scale data studies have been critical to pandemic management, allowing scientists to make policy-relevant findings at speed. COVID-19 vaccines save lives. As new variants emerge, this study will help to pinpoint groups of our society and areas of the country where public health campaigns should be focused and tailored for those communities.”

Reference
HDR UK COALESCE Consortium. Undervaccination and severe COVID-19 outcomes: meta-analysis of national cohort studies in England, Northern Ireland, Scotland, and Wales. Lancet; 16 Jan 2024; DOI: 10.1016/S0140-6736(23)02467-4

Adapted from a press release from HDR UK

Between a third and a half of the populations of the four UK nations had not had the recommended number of COVID vaccinations and boosters by summer 2022, according to the first research study to look at COVID-19 vaccine coverage of the entire UK population.

These results can be used to help create health policy and public health interventions to improve vaccine uptake
Angela Wood
Girl being injected with COVID-19 vaccine


Accelerating how new drugs are made with machine learning

Digital image of a molecule

Predicting how molecules will react is vital for the discovery and manufacture of new pharmaceuticals, but historically this has been a trial-and-error process, and the reactions often fail. To predict how molecules will react, chemists usually simulate electrons and atoms in simplified models, a process that is computationally expensive and often inaccurate.

Now, researchers from the University of Cambridge have developed a data-driven approach, inspired by genomics, where automated experiments are combined with machine learning to understand chemical reactivity, greatly speeding up the process. They’ve called their approach, which was validated on a dataset of more than 39,000 pharmaceutically relevant reactions, the chemical ‘reactome’.

Their results, reported in the journal Nature Chemistry, are the product of a collaboration between Cambridge and Pfizer.

“The reactome could change the way we think about organic chemistry,” said Dr Emma King-Smith from Cambridge’s Cavendish Laboratory, the paper’s first author. “A deeper understanding of the chemistry could enable us to make pharmaceuticals and so many other useful products much faster. But more fundamentally, the understanding we hope to generate will be beneficial to anyone who works with molecules.”

The reactome approach picks out relevant correlations between reactants, reagents, and performance of the reaction from the data, and points out gaps in the data itself. The data is generated from very fast, or high throughput, automated experiments.
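As a sketch of what ‘picking out correlations’ from high throughput data can look like in practice (illustrative only and not the authors’ pipeline; the reaction components, yields and choice of model below are invented):

```python
# Sketch: learn how reactant/reagent choices correlate with reaction performance from
# high-throughput experimentation (HTE) results. Illustrative only; data are invented.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Each row is one automated experiment: categorical choices plus a measured yield.
hte = pd.DataFrame({
    "substrate": ["A", "A", "B", "B", "C", "C", "A", "B"],
    "catalyst":  ["Pd1", "Pd2", "Pd1", "Pd2", "Pd1", "Pd2", "Pd2", "Pd1"],
    "base":      ["K2CO3", "Et3N", "K2CO3", "Et3N", "K2CO3", "Et3N", "K2CO3", "Et3N"],
    "yield_pct": [82, 40, 75, 35, 12, 8, 55, 60],
})

X = pd.get_dummies(hte[["substrate", "catalyst", "base"]])   # one-hot encode components
y = hte["yield_pct"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Feature importances hint at which components correlate most strongly with performance.
print(sorted(zip(model.feature_importances_, X.columns), reverse=True)[:3])
```

Combinations of components that never appear in the table are, in the same spirit, the ‘gaps in the data’ that the approach can flag for future experiments.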

“High throughput chemistry has been a game-changer, but we believed there was a way to uncover a deeper understanding of chemical reactions than what can be observed from the initial results of a high throughput experiment,” said King-Smith.

“Our approach uncovers the hidden relationships between reaction components and outcomes,” said Dr Alpha Lee, who led the research. “The dataset we trained the model on is massive – it will help bring the process of chemical discovery from trial-and-error to the age of big data.”

In a related paper, published in Nature Communications, the team developed a machine learning approach that enables chemists to introduce precise transformations to pre-specified regions of a molecule, enabling faster drug design.

The approach allows chemists to tweak complex molecules – like a last-minute design change – without having to make them from scratch. Making a molecule in the lab is typically a multi-step process, like building a house. If chemists want to vary the core of a molecule, the conventional way is to rebuild the molecule, like knocking the house down and rebuilding from scratch. However, core variations are important to medicine design.

A class of reactions, known as late-stage functionalisation reactions, attempts to directly introduce chemical transformations to the core, avoiding the need to start from scratch. However, it is challenging to make late-stage functionalisation selective and controlled – there are typically many regions of the molecules that can react, and it is difficult to predict the outcome.

“Late-stage functionalisations can yield unpredictable results, and current methods of modelling, including our own expert intuition, aren't perfect,” said King-Smith. “A more predictive model would give us the opportunity for better screening.”

The researchers developed a machine learning model that predicts where a molecule would react, and how the site of reaction varies as a function of different reaction conditions. This enables chemists to find ways of precisely tweaking the core of a molecule.

“We trained the model on a large body of spectroscopic data – effectively teaching the model general chemistry – before fine-tuning it to predict these intricate transformations,” said King-Smith. This approach allowed the team to overcome the limitation of scarce data: relatively few late-stage functionalisation reactions are reported in the scientific literature. The team experimentally validated the model on a diverse set of drug-like molecules and was able to accurately predict the sites of reactivity under different conditions.
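The pre-train-then-fine-tune pattern King-Smith describes can be sketched generically as follows (placeholder data and a toy network, not the authors’ model, architecture or datasets):

```python
# Generic transfer-learning sketch: pre-train on a large auxiliary task, then fine-tune
# on a much smaller related task. Placeholder data; not the authors' model or datasets.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Shared backbone learns general structure from the large pre-training dataset.
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 64), nn.ReLU())
pretrain_head = nn.Linear(64, 1)   # predicts the plentiful auxiliary property
finetune_head = nn.Linear(64, 1)   # predicts the low-data target (e.g. site reactivity)

def train(model, X, y, epochs=200, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

# 1) Pre-train backbone + head on plentiful (here, random placeholder) auxiliary data.
X_big, y_big = torch.randn(2000, 32), torch.randn(2000, 1)
train(nn.Sequential(backbone, pretrain_head), X_big, y_big)

# 2) Fine-tune on the small target dataset, reusing the pre-trained backbone weights.
X_small, y_small = torch.randn(50, 32), torch.randn(50, 1)
train(nn.Sequential(backbone, finetune_head), X_small, y_small, epochs=100, lr=1e-4)
```

The key design choice is that the backbone’s weights carry over, so the small second dataset only has to teach the model the specifics of the new task.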

“The application of machine learning to chemistry is often throttled by the problem that the amount of data is small compared to the vastness of chemical space,” said Lee. “Our approach – designing models that learn from large datasets that are similar but not the same as the problem we are trying to solve – resolves this fundamental low-data challenge and could unlock advances beyond late-stage functionalisation.”  

The research was supported in part by Pfizer and the Royal Society.

References:
Emma King-Smith et al. ‘Predictive Minisci Late Stage Functionalization with Transfer Learning.’ Nature Communications (2023). DOI: 10.1038/s41467-023-42145-1

Emma King-Smith et al. ‘Probing the Chemical "Reactome" with High Throughput Experimentation Data.’ Nature Chemistry (2023). DOI: 10.1038/s41557-023-01393-w

Researchers have developed a platform that combines automated experiments with AI to predict how chemicals will react with one another, which could accelerate the design process for new drugs.

A deeper understanding of the chemistry could enable us to make pharmaceuticals and so many other useful products much faster.
Emma King-Smith
BlackJack3D via Getty Images
Digital Molecular Structure Concept


Feeling depressed linked to short-term increase in bodyweight among people with overweight or obesity

Person standing on white digital bathroom scale

The study, published today in PLOS ONE, found that this short-term weight increase was only seen among people with overweight or obesity, and found no link between generally having greater symptoms of depression and higher bodyweight.

Research has suggested a connection between weight and mental health – with each potentially influencing the other – but the relationship is complex and remains poorly understood, particularly in relation to how changes in an individual’s mental health influence their bodyweight over time.

To help answer this question, researchers at Cambridge’s Medical Research Council (MRC) Epidemiology Unit examined data from over 2,000 adults living in Cambridgeshire, UK, who had been recruited to the Fenland COVID-19 Study.

Participants completed digital questionnaires on mental wellbeing and bodyweight every month for up to nine months during the COVID-19 pandemic (August 2020 – April 2021) using a mobile app developed by Huma Therapeutics Limited.

Questions assessed an individual’s symptoms of depression, anxiety and perceived stress. A higher score indicated greater severity, with the maximum possible scores being 24 for depression, 21 for anxiety and 40 for stress. The team then used statistical modelling to explore whether having poorer mental wellbeing than usual was related to changes in bodyweight one month later.

The researchers found that for every one-point increase above an individual’s usual depressive symptoms score, their weight one month later was on average 45g higher. This may seem small, but it would mean, for example, that an individual whose score rose from five to 10 (an increase from ‘mild’ to ‘moderate’ depressive symptoms) could expect an average weight gain of 225g (0.225kg).
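The kind of within-person, ‘poorer than usual’ analysis described here can be sketched as follows. Everything below is simulated for illustration: the 45g-per-point effect is built into the fake data rather than estimated from real measurements, and the model is a generic mixed-effects regression rather than the study’s exact specification:

```python
# Illustrative within-person analysis on simulated data; not the study's model or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_people, n_months = 200, 8
df = pd.DataFrame({
    "person": np.repeat(np.arange(n_people), n_months),
    "depression_score": rng.integers(0, 25, n_people * n_months).astype(float),
})
# Simulate next month's weight with a 0.045 kg (45 g) rise per extra point, plus noise.
df["weight_next_month_kg"] = 75 + 0.045 * df["depression_score"] + rng.normal(0, 2, len(df))

# Split each person's score into their usual level (between-person) and this month's
# deviation from that level (within-person, i.e. 'poorer than usual').
df["usual_score"] = df.groupby("person")["depression_score"].transform("mean")
df["deviation_from_usual"] = df["depression_score"] - df["usual_score"]

# Random intercept per person; the deviation coefficient is the within-person effect.
model = smf.mixedlm("weight_next_month_kg ~ deviation_from_usual + usual_score",
                    data=df, groups=df["person"]).fit()
print(model.params["deviation_from_usual"])  # ~0.045 kg (45 g) per point, by construction
```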

This effect was only observed in individuals with overweight (defined as a BMI of 25-29.9kg/m2) or obesity (a BMI of over 30kg/m2). Individuals with overweight gained on average 52g for each one-point increase above their usual depressive symptoms score, and for those with obesity the comparable weight gain was 71g. The effect was not seen in individuals with a healthy weight.

First author Dr Julia Mueller from the MRC Epidemiology Unit said: “Overall, this suggests that individuals with overweight or obesity are more vulnerable to weight gain in response to feeling more depressed. Although the weight gain was relatively small, even small weight changes occurring over short periods of time can lead to larger weight changes in the long-term, particularly among those with overweight and obesity.

“People with a high BMI are already at greater risk from other health conditions, so this could potentially lead to a further deterioration in their health. Monitoring and addressing depressive symptoms in individuals with overweight or obesity could help prevent further weight gain and be beneficial to both their mental and physical health.”

The researchers found no evidence that perceived stress or anxiety were related to changes in weight.

Senior author Dr Kirsten Rennie from the MRC Epidemiology Unit said: “Apps on our phones make it possible for people to answer short questions at home more frequently and over extended periods of time, which provides much more information about their wellbeing. This technology could help us understand how changes in mental health influence behaviour among people with overweight or obesity and offer ways to develop timely interventions when needed.”

Although previous studies have suggested that poor mental health is both a cause and consequence of obesity, the research team found no evidence that weight predicted subsequent symptoms of depression.

The research was supported by the Medical Research Council.

Reference
Mueller, J et al. The relationship of within-individual and between-individual variation in mental health with bodyweight: An exploratory longitudinal study. PLOS ONE; 10 Jan 2024; DOI: 10.1371/journal.pone.0295117

Increases in symptoms of depression are associated with a subsequent increase in bodyweight when measured one month later, new research from the University of Cambridge has found.

Person standing on white digital bathroom scale


Mysterious missing component in the clouds of Venus revealed

Sunrise over Venus

What are the clouds of Venus made of? Scientists know they are mainly made of sulfuric acid droplets, with some water, chlorine, and iron, whose concentrations vary with height in the thick and hostile Venusian atmosphere. But until now, scientists have been unable to identify the missing component that would explain the clouds’ patches and streaks, which are only visible in the UV range.

In a study published in Science Advances, researchers from the University of Cambridge synthesised iron-bearing sulfate minerals that are stable under the harsh chemical conditions in the Venusian clouds. Spectroscopic analysis revealed that a combination of two minerals, rhomboclase and acid ferric sulfate, can explain the mysterious UV absorption feature on our neighbouring planet.

“The only available data for the composition of the clouds were collected by probes and revealed strange properties of the clouds that so far we have been unable to fully explain,” said co-author Paul Rimmer from the Cavendish Laboratory. “In particular, when examined under UV light, the Venusian clouds featured a specific UV absorption pattern. What elements, compounds, or minerals are responsible for this observation?”

Guided by Venusian atmospheric chemistry, the team synthesised several iron-bearing sulfate minerals in an aqueous geochemistry laboratory in the Department of Earth Sciences. By suspending the synthesised materials in varying concentrations of sulfuric acid and monitoring the chemical and mineralogical changes, the team narrowed the candidate minerals down to rhomboclase and acid ferric sulfate, whose spectroscopic features were then examined under light sources specifically designed to mimic the spectrum of solar flares (Rimmer’s FlareLab at the Cavendish Laboratory).

Researchers from Harvard University provided measurements of the UV absorbance patterns of ferric iron under extreme acidic conditions, in an attempt to mimic the even more extreme Venusian clouds. The scientists are part of the newly-established Origins Federation, which promotes such collaborative projects.

“The patterns and level of absorption shown by the combination of these two mineral phases are consistent with the dark UV-patches observed in Venusian clouds,” said co-author Clancy Zhijian Jiang, from the Department of Earth Sciences, Cambridge. “These targeted experiments revealed the intricate chemical network within the atmosphere, and shed light on the elemental cycling on the Venusian surface.”

“Venus is our nearest neighbour, but it remains a mystery,” said Rimmer. “We will have a chance to learn much more about this planet in the coming years with future NASA and ESA missions set to explore its atmosphere, clouds and surface. This study prepares the ground for these future explorations.”

The research was supported by the Simons Foundation, and the Origins Federation.

Reference:
Clancy Zhijian Jiang et al., ‘Iron-sulfur chemistry can explain the ultraviolet absorber in the clouds of Venus.’ Science Advances (2024). DOI:10.1126/sciadv.adg8826

Researchers may have identified the missing component in the chemistry of the Venusian clouds that would explain their colour and 'splotchiness' in the UV range, solving a longstanding mystery.

FreelanceImages/Universal Images Group/Science Photo Library via Getty Images
Sunrise over Venus


New trial of 'pill-on-a-thread' brings screening for oesophageal cancer closer

Pill-on-a-thread and capsule sponge

The capsule sponge, known as the pill-on-a-thread, is a quick and simple test for Barrett’s oesophagus, a change in the cells lining the food pipe that can be a precursor to cancer. Heartburn is a common symptom of the condition.

The BEST4 trial launched at Addenbrooke’s Hospital today is the final step to see if the capsule sponge can prevent oesophageal cancer when used to screen or monitor those most at risk of the disease. If so, it could become a national screening programme across the NHS, in the same way mammograms are used to screen for breast cancer.

The first stage of the trial, BEST4 Surveillance, is for people already diagnosed with Barrett’s oesophagus. It will look at whether the capsule sponge test could replace endoscopies to monitor their condition. Participants will receive both examinations during the trial with results used to assess their risk of developing oesophageal cancer. 

The second stage of the trial, BEST4 Screening, opens in the summer and will recruit 120,000 people aged over 55 on long-term treatment for heartburn.

The multi-million-pound trial is jointly funded by Cancer Research UK and the National Institute for Health and Care Research. It builds on decades of research led by Professor Rebecca Fitzgerald from the University of Cambridge. She and a team of scientists, clinicians and nurses at the Early Cancer Institute, University of Cambridge and Cancer Research UK Cambridge Centre, invented and refined the capsule sponge test.

Professor Fitzgerald said: “The capsule sponge, a quick and simple test for Barrett’s oesophagus, could halve the number of deaths from oesophageal cancer every year. Cases of oesophageal cancer have increased sixfold since the 1990s. On average only 12% of patients live more than five years after diagnosis. Most don’t realise there’s a problem until they have trouble swallowing. By then it is too late.

“The first phase of the trial looks at whether the capsule sponge can be used as a cancer early warning system for patients diagnosed with Barrett’s. Using the capsule sponge and a new set of lab tests, we will be monitoring patients to see if we can prevent more cases of cancer.”

Tim Cowper, 49, a brewer from Cambridge, has had acid reflux, or heartburn, every night since he was 16. A routine health check while he was at university resulted in the shock diagnosis of Barrett’s oesophagus, and he has been monitored ever since.

Tim said: “I was alarmed when I was told that having Barrett’s meant having pre-cancerous cells in my gullet. Cancer is never a nice word to hear, especially when you are so young, but luckily, I’ve had my condition monitored.

“Since my diagnosis, I’ve been going for an endoscopy at least once every three years to monitor my oesophagus. It is not pleasant at all. Each time I have a thick tube pushed down through my mouth and I can feel every single one of the biopsies taken by the camera. Swallowing a capsule sponge is a much better experience and I now get the test before my regular endoscopy appointment.”

Barrett’s oesophagus is currently identified via an endoscopy and a biopsy in hospital following a GP referral. It is time-consuming, unpleasant, and quite invasive for patients, as well as being expensive for the healthcare system.

The capsule sponge is a small, easy-to-swallow capsule on a thread that contains a sponge. The patient swallows the capsule, which dissolves in the stomach, and the sponge expands to the size of a 50p coin.

The sponge is then carefully pulled back up by the thread, collecting cells for laboratory testing. The test takes just 10 minutes and can be done in a GP surgery.

Cancer Research UK and others have funded several successful clinical trials to demonstrate that the test is safe, accurate and can detect 10 times more cases of Barrett’s oesophagus than standard practice.

The test is faster and cheaper than endoscopy, which is currently used to diagnose and monitor Barrett’s oesophagus and oesophageal cancer. It has been piloted in health services in England, Scotland and Northern Ireland for patients who are currently on waiting lists for endoscopy because they have long-term heartburn or have been diagnosed with Barrett’s oesophagus.

Executive Director of Research and Innovation at Cancer Research UK, Dr Iain Foulkes, said: “Around 59% of all oesophageal cancer cases are preventable. Yet endoscopy, the gold standard for diagnosing and treating this cancer, is labour-intensive. We need better tools and tests to monitor people most at risk.

“Backed by funding from Cancer Research UK, the capsule sponge has become one of the most exciting early detection tools to emerge in recent years. It’s a remarkable invention by Professor Fitzgerald and her team, and previous trials have shown how powerful it can be in identifying cancer earlier.

“There are 9,200 people diagnosed with oesophageal cancer in the UK every year, and the capsule sponge will mean they can benefit from kinder treatment options if their cancer is caught at a much earlier stage.”

The future Cambridge Cancer Research Hospital will bring together clinical and research expertise, including Professor Fitzgerald’s work, under one roof.  It will enable the development and discovery of more non-invasive devices like the capsule sponge, to detect cancer earlier, and save more lives.

The BEST4 Surveillance Trial is led from Cambridge University Hospitals NHS Foundation Trust and the University of Cambridge, with trial design, coordination and analysis of results by the Cancer Research UK Cancer Prevention Trials Unit at Queen Mary University of London.

Further information about the BEST4 trial.

Adapted from a press release from Cambridge University Hospitals NHS Foundation Trust

A man from Cambridge is the first to join the surveillance part of a clinical trial that could see routine screening for oesophageal cancer introduced into the NHS, potentially halving deaths from this cancer every year.

The capsule sponge, a quick and simple test for Barrett’s oesophagus, could halve the number of deaths from oesophageal cancer every year
Rebecca Fitzgerald
Cyted
Pill-on-a-thread and capsule sponge


Pioneering transplant surgeon Sir Roy Calne dies aged 93

Professor Sir Roy Calne

Professor Calne pursued a career as a transplant surgeon after his experience as a medical student at Guy’s Hospital in the 1950s, when he was told there was nothing that could be done for a man dying of kidney failure.

He was appointed to the position of Professor of Surgery at the University of Cambridge in 1965, where he remained until 1998. He established the kidney transplant programme at Addenbrooke’s Hospital, now part of Cambridge University Hospitals (CUH) NHS Foundation Trust.

On 2 May 1968, Professor Calne performed the first successful liver transplant in Europe. Almost two decades later, in 1986, he would go on to carry out the world’s first liver, heart and lung transplant together with Professor John Wallwork at Papworth Hospital in Cambridge. Professor Wallwork described Professor Calne as "a giant in the transplant world and an innovative surgeon".

Professor Calne was a pioneer in immunosuppression – the use of drugs to dampen the response of the immune system in order to prevent the body from rejecting transplanted organs, a potentially fatal complication. This would go on to revolutionise transplantation. He was among the first to introduce the immunosuppressant drug cyclosporin into routine clinical care, for which he shared the prestigious Lasker Award in 2012.

Despite retiring from the Chair of Surgery at the University of Cambridge in 1998, he continued to perform kidney transplants until well into his seventies, and remained active in research into his eighties.

Professor Deborah Prentice, Vice-Chancellor of the University of Cambridge, said: “Professor Calne was a true pioneer, driven by the desire to help his patients. His work here in Cambridge as a scientist and clinician has saved many thousands of lives and continues to have a major impact worldwide. We are saddened by his loss and pay tribute to his extraordinary achievements.”

Patrick Maxwell, Regius Professor of Physic at the University of Cambridge, added: “Sir Roy was a brilliant man who made a series of major breakthroughs in transplant surgery. His work has transformed the lives of countless patients around the world.”

Professor Calne was a Fellow of Trinity Hall, Cambridge, from 1965-1998. Following his retirement, he was made an Honorary Fellow. In 2018, he attended celebrations at the College to commemorate the 50th anniversary of his pioneering liver transplant surgery, where he was able to meet patients and colleagues from a career spanning six decades. To mark the anniversary, he helped launch a £250,000 appeal by Addenbrooke’s Charitable Trust to trial and run a new perfusion machine, which would allow more donated organs to be rendered suitable for transplantation. In 2021, the Addenbrooke’s Transplant Unit was named after him.

Dr Mike More, Chair of CUH, said: “Sir Roy leaves behind a truly amazing legacy and many of our staff will remember him with fondness for his vision and genuine kindness. We will all miss him very much.”

Professor Sir Roy Calne, the pioneering transplant surgeon who carried out the first liver transplant in the UK during his time at Cambridge, has died aged 93.

Cambridge University Hospitals NHS Foundation Trust
Professor Sir Roy Calne
