Too many kids face bullying rooted in social power imbalances — and educators can help prevent this

Being at school among peers and friends can be exciting and positive for many children and youth. But too many kids in Canada face the reality of being bullied because of some aspect of who they are.

This type of bullying — known as identity-based or bias-based bullying — is extremely harmful to kids’ sense of belonging at school, and has negative effects on their physical and mental health, their academic achievement and their social well-being.

As psychology researchers and directors of the Promoting Relationships and Eliminating Violence Network (PREVNet), we developed accessible learning modules for educators so they can learn to recognize identity-based bullying, and intervene to stop it.

While developed explicitly with education settings in mind, these modules may also help parents and other caring adults who influence children’s peer relationships. These modules will be available in French by the end of the year.

Harmful to kids’ well-being

Bullying has several key elements that make it so harmful to kids’ well-being.

Bullying is unwanted, aggressive behaviour that is often repeated over time. These behaviours can be verbal, social, physical, sexual and/or cyber in nature.

It happens in relationships where there is a power imbalance. In other words, the child who bullies holds more power than the child who experiences the bullying. In the case of identity-based bullying, this power imbalance is rooted in the types of power differences we see at a larger societal level.

Bullying behaviours can be verbal, social, physical or sexual, and can take place in person or online.
(Shutterstock)

Social power dynamics, identity-based bullying

It is well-documented that Indigenous youth, Black youth, 2SLGBTQIA+ youth and youth with disabilities experience discrimination in Canada.

But why? Put simply, these experiences of discrimination are rooted in Canada’s settler-colonial history, which institutionalized racialized, class-based and colonial norms and forms of social privilege. These institutionalized forms of privilege resulted in greater political, social and economic power being granted to groups as they more closely aligned with these norms, with the greatest power allotted to those at the top of this “civilized” ideal: people who are white (western European), Christian, wealthy, cisgender, heterosexual, settler men.

Read more:
Rethinking masculinity: Teaching men how to love and be loved

Groups who have been granted unearned power and privilege through these systems work to maintain their power through things like stigma, discrimination and other forms of oppression, while groups marginalized as “other” — less aligned with these dominant norms — continue to experience and hold less power across the socio-political-economic spectrum.

And, youth who hold more than one socially marginalized identity often experience even greater discrimination.

Schools as societal institutions

Since schools are societal institutions, the discrimination and other forms of oppression that are used by dominant groups to maintain power in larger society are mirrored within schools through identity-based bullying.

With identity-based bullying, the power imbalance that is a key feature of bullying behaviour is rooted in these larger social power imbalances.

Relationships can reflect power dynamics based on multiple social identities.
(Shutterstock)

Because we all hold multiple social identities, a social power perspective also explains how these identities interact. Take, for example, a situation where a white, queer student is bullying a Black, queer student. Although both students are marginalized based on their queer identities, the white student still benefits from the power and privilege afforded to whiteness. So, this situation still reflects a power dynamic based on social identities.

Read more:
Racism contributes to poor attendance of Indigenous students in Alberta schools: New study

Educator interventions

First, identity-based bullying is likely an issue in your neighbourhood school. In data we collected from 1,200 youth across Canada in 2023, one in three reported identity-based bullying because of their body weight, race or skin colour, disability, religion, sexual orientation and/or gender identity.

Second, identity-based bullying impacts kids’ experiences at school. For example, a recent study from the United States found that youth who experienced multiple forms of identity-based bullying were the most likely to report avoiding class or activities. This study also found that if these same students felt more supported by adults at their school, they reported less school avoidance. This means caring educators are a protective factor for youth experiencing identity-based bullying.

Our research has proposed ways educators specifically can prevent identity-based bullying in their schools:

1) Educators (or other adults engaged in a school community) could examine their school board policy on bullying, and make sure it specifically mentions the role of social identities. If it doesn’t, educators can work to change it. A great example of naming identities when defining bullying can be seen in the Northwest Territories’ Education Act.

2) Be self-reflective and aware. As a first step, educators can explore their own unconscious biases and reflect on how they may be influencing the classroom climate.

3) Be a positive role model. Students look to adults to see how to behave. Celebrate the strengths of all students and model how to be respectful and inclusive. Also model how to intervene helpfully when harmful behaviour occurs.

4) Actively create opportunities for positive peer dynamics in the classroom. Be intentional about creating groups to ensure that students who are excluded are given the opportunity to interact and work with students who are kind and prosocial, and who may have similar interests and abilities.

Educators can teach strategies that help all students learn how to be positive allies.
(Shutterstock)

5) Empower all students to intervene safely and effectively. Actively educate students on how to recognize identity-based bullying and provide strategies that will help all students to be positive allies.

6) Work at classroom, school and community levels to create a welcoming, inclusive environment for all children. For educators, this can include things like conducting curriculum review, actively incorporating learning about power, privilege and oppression, creating and supporting clubs like gay-straight alliances and working to create a trauma-informed classroom.

These strategies can be consolidated and deepened through engaging with our new anti-bullying training modules, which focus specifically on identity-based bullying.

In these ways, educators and other caring adults can help kids understand the difference between using power negatively and positively, and encourage its positive use to build inclusive, respectful and safe environments for all.

Transparency and trust: How news consumers in Canada want AI to be used in journalism

When it comes to artificial intelligence (AI) and news production, Canadian news consumers want to know when, how and why AI is part of journalistic work. And if they don’t get that transparency, they could lose trust in news organizations.

News consumers are so concerned about how the use of AI could impact the accuracy of stories and the spread of misinformation that a majority favour government regulation of how AI is used in journalism.

These are some of our preliminary findings after surveying a representative sample of 1,042 Canadian news consumers, most of whom accessed news daily.

This research is part of the Global Journalism Innovation Lab which researches new approaches to journalism. Those of us on the team at Toronto Metropolitan University are particularly interested in looking at news from an audience perspective in order to develop strategies for best practice.

The industry has high hopes that the use of AI could lead to better journalism, but there is still a lot of work to be done in terms of figuring out how to use it ethically.

Not everyone, for example, is sure the promise of time saved on tasks that AI can do faster will actually translate into more time for better reporting.

We hope our research will help newsrooms understand audience priorities as they develop standards of practice surrounding AI, and prevent further erosion of trust in journalism.

AI and transparency

Most survey respondents said newsrooms should be transparent about when and how they use AI.
(Author provided)

We found that a lack of transparency could have serious consequences for news outlets that use AI. Almost 60 per cent of those surveyed said they would lose trust in a news organization if they found out a story they thought was written by a human had been generated by AI, something also reflected in international studies.

The overwhelming majority of respondents in our study, more than 85 per cent, want newsrooms to be transparent about how AI is being used. Three quarters want that to include labelling of content created by AI. And more than 70 per cent want the government to regulate the use of AI by news outlets.

Organizations like Trusting News, which helps journalists build trust with audiences, now offer advice on what AI transparency should look like and say it’s more than just labelling a story — people want to know why news organizations are using AI.

Audience trust

Our survey also showed a significant contrast in confidence in news depending on the level of AI used. For example, more than half of respondents said they had high to very high trust in news produced just by humans. However, that level of trust dropped incrementally the more AI was involved in the process, to just over 10 per cent for news content that was generated by AI only.

In questions where news consumers had to choose a preference between humans and AI to make journalistic decisions, humans were far preferred. For example, more than 70 per cent of respondents felt humans were better at determining what was newsworthy, compared to less than six per cent who felt AI would have better news judgement. Eighty-six per cent of respondents felt humans should always be part of the journalistic process.

Most respondents had more trust in news reports produced by humans without the use of AI.
(Author provided)

As newsrooms struggle to retain fractured audiences with fewer resources, the use of AI also has to be considered in terms of the value of the products they’re creating. More than half of our survey respondents perceived news produced mostly by AI with some human oversight as less worth paying for, which isn’t encouraging considering the existing reluctance to pay for news in Canada.

This result echoes a recent Reuters study, where an average of 41 per cent of people across six countries saw less value in AI-generated news.

Concerns about accuracy

In terms of negative impacts of AI in a newsroom, about 70 per cent of respondents were concerned about accuracy in news stories and job losses for journalists. Two-thirds of respondents felt the use of AI might lead to reduced exposure to a variety of information. An increased spread of mis- and disinformation, something recognized widely as a serious threat to democracy, was of concern for 78 per cent of news consumers.

Using AI to replace journalists was what made respondents most uncomfortable, and there was also less comfort with using it for editorial functions such as writing articles and deciding what stories to develop in the first place.

There was far more comfort with using it for non-editorial tasks such as transcription and copy editing, echoing findings in previous research in Canada and other markets.

Most respondents agreed that human editors should always be part of the process.
(Author provided)

We also gathered a lot of data unrelated to AI to get a sense of how Canadians are tapping into news and the news they’re tapping into. Politics and local news were the two most popular types of news, chosen by 67 per cent of respondents, even though there is less local news to consume due to extensive cuts, mergers and closures.

A lot of people in our sample of Canadians, around 30 per cent, don’t actively look for news. They let it find them, something called passive consumption. And although this is proportionally higher in news consumers under 35, this isn’t just a phenomenon seen in the younger demographic. More than half of those who reported letting news find them were over 35 years old.

Although smartphones are increasingly becoming the likely access point for news for many consumers, including almost 70 per cent of those 34 and under and about 60 per cent of those between 35 and 44, television is where most news consumers in our study reported getting their journalism.

Respondents in our survey were asked to select all of their points of news access. More than 80 per cent of participants chose some form of TV, with some respondents picking two TV formats, for example, cable TV and smart TV. Surprisingly to us, half of 18- to 24-year-olds reported TV as an access point for news. For those 44 and under, though, it was more often through a smart TV. As shown in other Canadian studies, TV news still plays an important role in the media landscape.

This is just a broad look at the data we have collected. Our analysis is just beginning. We’re going to dig deeper into how different demographics feel about the use of AI in journalism and how the use of AI might impact audience trust.

We will also soon be launching our survey with research partners in the United Kingdom and Australia to find out if there are differences in perceptions of AI in the three countries.

Even these initial results provide a lot of evidence that, as newsrooms work to survive in a destabilized market, using AI could have detrimental effects on the perceived value of their journalism. Developing clear policies and principles that are communicated with audiences should be an essential part of any newsroom’s AI practice in Canada.

What you need to know about cold and flu season

As the fall months settle in, Canadians are being urged to take precautions against the upcoming flu season.

Flu season in Canada typically peaks between December and February, but the virus can circulate much earlier. Public health officials are advocating for early vaccination, emphasizing that the annual flu vaccine is the most effective way to protect against infection and reduce the severity of illness.

Clinics across Canada offer flu shots free of charge.

Influenza

Influenza, commonly known as the flu, is a respiratory illness caused by influenza viruses that spread easily from person to person. These viruses mainly affect the nose, throat and lungs. Flu symptoms typically include fever, chills, muscle aches, cough, congestion, runny nose, headaches and fatigue.

Unlike the common cold, which often develops slowly, the flu tends to hit suddenly and can lead to severe complications like pneumonia, bronchitis and even death, particularly in high-risk groups such as young children, seniors over 65, pregnant individuals, and those with chronic conditions like asthma, diabetes or heart disease.

Influenza spreads mainly through droplets when an infected person coughs, sneezes or talks. These droplets can land in the mouths or noses of people nearby, or they can linger on surfaces where the virus can survive for up to 48 hours. Preventive measures such as handwashing, mask-wearing and staying home when symptomatic help reduce the spread of the virus.

How the flu vaccine works

Each year, flu vaccines are updated to protect against the influenza viruses that research indicates will be most common during the upcoming season. The flu shot contains inactivated or weakened influenza viruses, which cannot cause the flu but help the immune system develop antibodies. These antibodies protect against infection when exposed to live flu viruses.

It typically takes about two weeks after vaccination for immunity to build up, which is why public health officials recommend getting vaccinated in the fall, before flu rates start to rise. This gives individuals enough time to develop immunity before influenza becomes more widespread.

Can you get flu and COVID-19 vaccines together?

Each year, flu vaccines are updated to protect against the influenza viruses that research indicates will be most common during the upcoming season.
(Shutterstock)

Public health experts have confirmed that it is safe to receive the flu vaccine and the COVID-19 vaccine at the same time. Doing so can provide protection against both illnesses and reduce the chances of severe complications from either virus. Administering both vaccines during the same visit is a convenient way to ensure you’re protected for the season, especially as COVID-19 continues to circulate alongside influenza.

Benefits of the flu shot

One of the key benefits of flu vaccination is that it significantly reduces the risk of severe illness, hospitalization and death from the flu. While flu vaccines aren’t 100 per cent effective at preventing infection, they greatly lessen the severity of the illness and reduce the spread of the virus in the community. This is especially important for protecting high-risk groups like seniors, children, pregnant people and individuals with chronic health conditions.

Additionally, widespread flu vaccination helps prevent the health-care system from becoming overwhelmed, especially in a year when other respiratory viruses like respiratory syncytial virus (RSV) and COVID-19 are still circulating. By reducing the overall number of flu-related hospitalizations, vaccines also free up health-care resources for other urgent needs.

Why get vaccinated every year?

One of the unique challenges of influenza is that the virus mutates constantly. Because of these frequent changes, immunity from last year’s vaccine won’t provide full protection this season. This is why the flu vaccine is updated annually to match the most prevalent strains of the virus.

Even if a person received a flu shot the previous year, it’s important to get vaccinated again to stay protected against new viral strains circulating in the population. Flu vaccines are reformulated each year based on global surveillance data collected by organizations like the World Health Organization (WHO) and the U.S. Centers for Disease Control and Prevention (CDC).

Misconceptions about the flu vaccine

Despite clear benefits, misconceptions about the flu shot continue to contribute to low vaccination rates.

Some people believe that the flu vaccine can cause the flu, but this is a myth. The inactivated viruses in the flu vaccine cannot cause illness. After receiving the vaccine, some people may experience mild side-effects like soreness at the injection site or a low-grade fever, but these symptoms are short-lived and far less severe than a full-blown flu infection.

Another misconception is that the flu shot is not necessary for healthy adults. While healthy people may have a lower risk of severe flu complications, they can still spread the virus to more vulnerable individuals, such as young children, seniors or immunocompromised family members. Getting vaccinated helps protect both the individual and the community through herd immunity.

No country still uses an electoral college − except the US

The United States is the only democracy in the world where a presidential candidate can get the most popular votes and still lose the election. Thanks to the Electoral College, that has happened five times in the country’s history. The most recent examples are from 2000, when Al Gore won the popular vote but George W. Bush won the Electoral College after a U.S. Supreme Court ruling, and 2016, when Hillary Clinton got more votes nationwide than Donald Trump but lost in the Electoral College.

The Founding Fathers did not invent the idea of an electoral college. Rather, they borrowed the concept from Europe, where it had been used to pick emperors for hundreds of years.

As a scholar of presidential democracies around the world, I have studied how countries have used electoral colleges. None have been satisfied with the results. And except for the U.S., all have found other ways to choose their leaders.

The Holy Roman Empire had seven electors: Three were members of the Catholic Church and four were significant members of the nobility. This image depicts, from left, the archbishop of Cologne, the archbishop of Mainz, the archbishop of Trier, the count palatine of the Rhine, the duke of Saxony, the margrave of Brandenburg and the king of Bohemia.
Codex Balduini Trevirorum, c. 1340, Landeshauptarchiv Koblenz via Wikimedia Commons

The origins of the US Electoral College

The Holy Roman Empire was a loose confederation of territories that existed in central Europe from 962 to 1806. Unlike in most other monarchies, the emperor was not chosen by heredity. Instead, emperors were chosen by electors, who represented both secular and religious interests.

As of 1356, there were seven electors: Four were hereditary nobles and three were chosen by the Catholic Church. By 1803, the total number of electors had increased to 10. Three years later, the empire fell.

When the Founding Fathers were drafting the U.S. Constitution in 1787, the initial draft proposal called for the “National Executive,” which we now call the president, to be elected by the “National Legislature,” which we now call Congress. However, Virginia delegate George Mason viewed “making the Executive the mere creature of the Legislature as a violation of the fundamental principle of good Government,” and so the idea was rejected.

Pennsylvania delegate James Wilson proposed that the president be elected by popular vote. However, many other delegates were adamant that there be an indirect way of electing the president to provide a buffer against what Thomas Jefferson called “well-meaning, but uninformed people.” Mason, for instance, suggested that allowing voters to pick the president would be akin to “refer(ring) a trial of colours to a blind man.”

For 21 days, the founders debated how to elect the president, and they held more than 30 separate votes on the topic – more than for any other issue they discussed. Eventually, the complicated solution that they agreed to was an early version of the electoral college system that exists today, a method where neither Congress nor the people directly elect the president. Instead, each state gets a number of electoral votes corresponding to the number of members of the U.S. House and Senate it is apportioned. When the states’ electoral votes are tallied, the candidate with the majority wins.
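The apportionment rule described above, where each state's electoral votes equal its House seats plus its two senators and a candidate needs a majority of the college to win, can be sketched in a few lines. This is an illustrative sketch only; the state names and seat counts below are a hypothetical subset, not official apportionment data.

```python
# Illustrative sketch of Electoral College arithmetic (not official data).
# Each state's electoral votes = its U.S. House seats + its 2 senators.
house_seats = {"Delaware": 1, "Pennsylvania": 17, "Virginia": 11}  # hypothetical subset

electoral_votes = {state: seats + 2 for state, seats in house_seats.items()}

def winner(results):
    """results maps candidate -> electoral votes won. A candidate needs a
    majority of all electoral votes cast to win outright."""
    total = sum(results.values())
    for candidate, votes in results.items():
        if votes * 2 > total:  # strictly more than half
            return candidate
    return None  # no majority: under the Constitution, the House decides

print(electoral_votes["Delaware"])       # -> 3
print(winner({"A": 300, "B": 238}))      # -> A
print(winner({"A": 269, "B": 269}))      # -> None (a 269-269 tie)
```

The `winner` helper also shows why a tie (269-269 in the modern 538-vote college) produces no winner and throws the election to the House.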

James Madison, who was not fond of the Holy Roman Empire’s use of an electoral college, later recalled that the final decision on how to elect a U.S. president “was produced by fatigue and impatience.”

After just two elections, in 1796 and 1800, problems with this system had become obvious. Chief among them was that electoral votes were cast only for president. The person who got the most electoral votes became president, and the person who came in second place – usually their leading opponent – became vice president. The current process of electing the president and vice president on a single ticket but with separate electoral votes was adopted in 1804 with the passage of the 12th Amendment.

Some other questions about how the Electoral College system should work were clarified by federal laws through the years, including in 1887 and 1948.

After the 2020 presidential election exposed additional flaws with the system, Congress further tweaked the process by passing legislation that sought to clarify how electoral votes are counted.

James Madison disliked the idea of an electoral college.
Chester Harding, via National Portrait Gallery

Other electoral colleges

After the U.S. Constitution went into effect, the idea of using an electoral college to indirectly elect a president spread to other republics.

For example, in the Americas, Colombia adopted an electoral college in 1821. Chile adopted one in 1828. Argentina adopted one in 1853.

In Europe, Finland adopted an electoral college to elect its president in 1925, and France adopted an electoral college in 1958.

Over time, however, these countries changed their minds. All of them abandoned their electoral colleges and switched to directly electing their presidents by votes of the people. Colombia did so in 1910, Chile in 1925, France in 1965, Finland in 1994, and Argentina in 1995.

The U.S. is the only democratic presidential system left that still uses an electoral college.

A ‘popular’ alternative?

There is an effort underway in the U.S. to replace the Electoral College. It may not even require amending the Constitution.

The National Popular Vote Interstate Compact, currently agreed to by 17 U.S. states, including small states such as Delaware and big ones such as California, as well as the District of Columbia, is an agreement to award all of their electoral votes to whichever presidential candidate gets the most votes nationwide. It would take effect once the signatory jurisdictions together control at least 270 electoral votes, a majority of the Electoral College. The current list reaches 209 electoral votes.

A key problem with the interstate compact is that in races with more than two candidates, it could hand the presidency to a candidate who won only a plurality of the popular vote, meaning more than half of all voters chose someone else.

When Argentina, Chile, Colombia, Finland and France got rid of their electoral colleges, they did not replace them with a direct popular vote in which the person with the most votes wins. Instead, they all adopted a version of runoff voting. In those systems, winners are declared only when they receive support from more than half of those who cast ballots.
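The runoff logic those countries adopted can be sketched simply: a candidate wins outright only with more than half of the ballots cast; otherwise the top two vote-getters advance to a second round. The candidate names and vote counts below are invented for illustration.

```python
# Minimal sketch of two-round (runoff) voting, as described above.
# A candidate wins outright only with more than half the ballots;
# otherwise the top two advance to a second round.

def runoff_round(votes):
    """votes maps candidate -> ballot count. Returns the outright winner,
    or the list of two candidates advancing to a runoff."""
    total = sum(votes.values())
    leader = max(votes, key=votes.get)
    if votes[leader] * 2 > total:  # strictly more than half
        return leader
    return sorted(votes, key=votes.get, reverse=True)[:2]

print(runoff_round({"A": 45, "B": 35, "C": 20}))  # no majority -> ['A', 'B']
print(runoff_round({"A": 55, "B": 45}))           # -> A
```

The second round then repeats the same check with only two candidates, so one of them is guaranteed a majority of the ballots cast.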

Notably, neither the U.S. Electoral College nor the interstate compact that seeks to replace it is a system that ensures presidents are supported by a majority of voters.

Editor’s note: This story includes material from a story published on May 20, 2020.

What is a communist, and what do communists believe?

Curious Kids is a series for children of all ages. If you have a question you’d like an expert to answer, send it to CuriousKidsUS@theconversation.com.

What is a communist? – Artie, age 10, Astoria, New York

Simply put, a communist is someone who supports communism. I study the history of communism, which is a political and economic view.

Communism has long been controversial, and in the U.S. today, reputable sources disagree about it. Some experts argue that communist views are well supported by historical evidence about the way societies have developed over time. Others suggest that history has shown communism not to work.

Many of those appraisals are based on examples of people who tried to establish communism. Communists have launched revolutions in many places including Russia and China. In five countries – China, North Korea, Laos, Cuba and Vietnam – communist parties control the current governments. The economic and political systems in those countries are not fully communist, but some might be working to transition from capitalism to communism.

In part because the U.S. has difficult relationships with these countries, many Americans have negative views of communists and communism. To evaluate those countries and to decide your own opinions about communism in general, it is important to first be clear about what the principles of communism are.

Communists believe that people should share wealth so that no one is too poor, no one is too rich, and everyone has enough to survive and have a good life.

A communist might be a member of a Communist party, which, like any political party, is a group of people who want to play a role in government.

The opening of the 2014 convention of the Communist Party of the United States of America.

In communism, people work together to produce and distribute the things they need to live, such as food, clothing and entertainment. That does not mean that everything is shared at all times.

In a communist society, individuals might still live in their own homes and have their own food, clothing and personal items such as televisions and cellphones. However, the places where these items were produced, such as factories and farms, would be owned by everyone.

Similarly, a person might still create artistic products such as works of literature or craftsmanship on their own. The goal would not be to make money, though, but instead to share for everyone to enjoy.

Communists support some form of collective ownership. Ownership by everyone would ensure that all members of society have equal rights to the products from the factories and farms because they would all be part owners of the enterprises.

In such a society, everyone would also have equal political rights and would participate in governance together. Theoretically, communism should entail some form of democracy.

What is Marxism?

German philosopher Karl Marx.
John Jabez Edwin Mayal via Wikimedia Commons

Throughout history, there have been many different views on what communism is, how it should be organized and how it might be achieved. The most famous theories about communism are probably the ones that were developed by a German philosopher named Karl Marx. His ideas are often called Marxism.

Marx studied history and observed that the way people produced goods and services was closely related to who held power. For example, in farming societies, those who owned the land had more power than those who did not.

Marx also noticed that people with less power had often risen up, usually violently, to overthrow the powerful people. He called this concept class struggle. He believed this process was how societies developed from one system of government and economy to another. He claimed that class struggle led societies through a progression toward greater efficiency in the production of goods and services, higher levels of technology and wider distribution of social and political power.

When Marx was alive in the 1800s, an economic and political system called capitalism had developed in many countries. In capitalist societies, the economy centered on factories. Factory owners had significant political and economic influence.

Marx observed that in countries such as Germany, England and the United States, factory owners hired laborers who worked long hours producing goods such as shirts or tables. While the factory owners sold these products at high prices, they paid the workers very little. As a result, the factory owners became richer, while many workers struggled to afford the goods they produced or even to provide food for their families.

Marx believed that this inequality would eventually lead to a worker uprising. During their revolution, Marx predicted, the workers would seize control of the factories and begin running them more fairly, leading to a new political system known as socialism.

Where does socialism fit in?

A campaign poster from 1976, spotlighting the candidates from the Communist Party of the United States of America.
Library of Congress

Of course, if the workers staged a revolution, the factory owners would fight back. Marx thought that, immediately after the revolution, the workers would first need to create a strong government to prevent the owners from reestablishing capitalism. During that phase, which Marx called socialism, the workers would run the government while they continued moving away from capitalism and trying to create a more equal society.

Marx thought people would eventually see that socialism was much better than capitalism because socialism would end exploitation while still allowing a society to continue moving toward better economic and political practices, but without inequality. Once that happened, a government would no longer be necessary.

The society would become communist. There would still be governance, but not a government that was separated from the people. Rather, in a communist society, the people would govern together, and everyone would do some of the work and receive what they needed.

There are Communist parties in many places, and many are currently working to move their countries toward communism. At this time, no country has yet made the transition to full communism, but many people still hope that transition will happen somewhere, sometime. Those people are communists. Communists are optimistic that humans can one day create a more fair and equal society.

Hello, curious kids! Do you have a question you’d like an expert to answer? Ask an adult to send your question to CuriousKidsUS@theconversation.com. Please tell us your name, age and the city where you live.

And since curiosity has no age limit – adults, let us know what you’re wondering, too. We won’t be able to answer every question, but we will do our best.

A devastating hurricane doesn’t dramatically change how people vote – but in a close election, it can matter

North Carolina and Florida are changing administrative rules and, in some cases, issuing emergency funding that is intended to make it easier for people in areas damaged by Hurricanes Helene and Milton to vote.

The recovery in both states is expected to extend far beyond the November 2024 election period. The majority of the people in the affected communities in North Carolina and Florida voted for Republican presidential nominee Donald Trump in 2020, making some election analysts wonder if some Trump supporters will be able to cast their ballots.

Amy Lieberman, a politics and society editor at The Conversation U.S., spoke with Boris Heersink, a scholar of voters’ behavior after a natural disaster, to better understand if and how the recent hurricanes could shift the results of the 2024 presidential election.

How can hurricanes create complications ahead of an election?

A massive hurricane disrupts people’s lives in many important ways, including affecting people’s personal safety and where they can live. Ahead of an election, there are a lot of practical limitations about how an election can be executed – like if a person can still receive mail-in ballots at home or elsewhere, or if it is possible to still vote in person at their polling location if that building was destroyed or damaged.

Another issue is whether people who have just lived through a natural disaster and will likely be dealing with the aftermath for weeks to come are focused on politics right now. Some might sit out the election because they simply have more important things to worry about.

Beyond practical concerns, how else can a natural disaster influence an election?

The other side of the equation, which is what political scientists like myself are mostly focusing on, is whether people take the fact that a natural disaster happened into consideration when they vote.

Two scholars, Christopher Achen and Larry Bartels, have argued that sometimes voters are not great at figuring out how to incorporate bad things that happened to them into a voting position. In some cases, it is entirely fair to hold an elected official responsible for bad outcomes that affect people’s lives. But at other moments, bad things can happen to us without that being the fault of an incumbent president or governor. And voters should ideally be able to balance out these different types of bad things – those it is fair to punish elected officials for, and those for which it isn’t fair to hold them responsible.

After all, a devastating hurricane is terrible, but it is not Kamala Harris’ fault that it happened. But Achen and Bartels argue that voters frequently still punish elected officials for random bad events like this.

Their most famous example is the consequences of a series of shark attacks off the New Jersey coast in the summer of 1916. As a result of those attacks, the New Jersey tourism industry saw a major decline. While these findings are still being debated, Achen and Bartels argue that Jersey shore voters subsequently voted against Woodrow Wilson in the 1916 presidential election at a higher rate than they would have had the shark attacks not happened. They argue that voters did this even though Wilson had no involvement in the shark attacks.

Kamala Harris visits a Hurricane Helene donation drop-off site for emergency supplies in Charlotte, N.C., on Oct. 5, 2024.
Mario Tama/Getty Images

How else do voters consider bad events when they vote?

Scholars like John Gasper and Andrew Reeves argue that voters mostly care whether elected officials respond appropriately to a disaster. So, if the president does a good job reacting, voters do not actually punish them at all in the next election. However, voters can punish elected officials if they feel like the response is not correct.

The fact that Hurricane Katrina hit Louisiana in 2005 was not the fault of then-President George W. Bush. But the perceived slowness of the government response is something a voter could have held him responsible for.

How do voters’ political affiliations affect where and how they lay the blame?

Colleagues and I have shown that how people interpret the combination of a disaster and the government response is likely colored by their own partisanship.

We looked at both the effects of Superstorm Sandy on the 2012 presidential election and natural disasters’ impact on elections more broadly from 1972 through 2004. One core finding is that when presidents reject state officials’ disaster declaration requests, they lose votes in affected counties – but only if those counties were already more supportive of the opposite party.

If there is a strong, positive government response, the incumbent president or their party can actually gain votes among voters affected by a disaster. So, Republicans affected by the hurricanes could become more inclined to vote against Harris if they feel like they are not getting the help they need. But it could also help Harris if affected Democrats feel like they are getting enough aid.

The major takeaway is that if the government responds really effectively to a natural disaster or other emergency, there is not a huge electoral penalty – and there could even be a small reward.

That is not irrelevant in a close election. If Republicans in affected areas in North Carolina feel the government response has been poor and it inspires them to turn out in higher numbers to punish Harris, that could matter. But if they feel like the response has been adequate, research suggests either no real effect on their support for Harris – or possibly even an increase in Harris voters.

Donald Trump speaks with owners of a furniture store that was damaged during Hurricane Helene on Sept. 30, 2024, in Valdosta, Ga.
Michael M. Santiago/Getty Images

How much influence can a politician have on people assessing a government response?

Scholars mostly assume that people affected can tell whether the government response was good or not. Trump and other Republicans are falsely saying that the response is slow and falsely claiming that Federal Emergency Management Agency money is being spent on immigrants who are not living in the country legally. There does not appear to be a slow government response to the hurricane in North Carolina, and there’s no evidence the response is insufficient in Florida, either.

So, the question now is whether voters affected by these hurricanes will respond based on their actual lived experiences, or how they are told they are living their experience.

From Swift to Springsteen to Al Jolson, candidates keep trying to use celebrities to change voters’ songs

It’s 2016 all over again. And 2020, for that matter. Democrats are staring at what looks to be another coin flip election between their party’s nominee and Donald Trump.

In an election that could come down to a few hundred thousand votes in a handful of states, every voter matters – no matter how you reach them. With that in mind, Democrats are communicating not just on matters of policy, but matters of pop culture.

Specifically, Democrats are embracing football and Taylor Swift. The Harris-Walz campaign trotted out endorsements from 15 Pro Football Hall of Famers and sells Swiftie-style friendship bracelets on its campaign website, among other overtures. Swift herself has endorsed Kamala Harris.

Tim Walz cited his experience as a football coach and mentioned Swift in the vice presidential debate.

Democratic challenger and former NFLer Colin Allred, who is running to unseat GOP Sen. Ted Cruz of Texas, has put out ads in which he appears moments from taking to the gridiron.

But how much does pop culture campaigning, if you will, matter? Does trying to link a campaign to a sport, or a culture, or a style of music actually influence elections? Looking to five different election campaigns in the past can give a sense of the effects, or lack thereof, of such campaigning.

An ad for Texas Democrat Rep. Colin Allred, a former NFL player, stresses his football past in his bid to unseat GOP Sen. Ted Cruz.

Reagan and Springsteen

Any discussion of the embrace of pop culture by candidates should probably start with Ronald Reagan’s Bruce Springsteen era.

Reagan, attempting to reach beyond his base, viewed 1984 as a vibes-based election and cited Springsteen as an exemplar of the hope his campaign wished to inspire. Springsteen rejected a request from Reagan’s camp to use his often-misunderstood “Born in the U.S.A.” on the campaign trail. The song’s lyrics describe a down-on-his-luck Vietnam War veteran, but if you don’t listen carefully to the lyrics, the song can sound like a celebration of veterans and being American.

While Reagan went on to win 49 states in that year’s election, perhaps the biggest long-term impact of his courtship of Springsteen fans was to turn Springsteen from a relatively apolitical performer to a staunch supporter of the Democratic Party.

In this way, Springsteen’s transformation mirrors that of Taylor Swift, with Marsha Blackburn, the Tennessee Republican senator, serving as her Reagan – the person who pushed the performer into the political arena after years on the sidelines.

Springsteen and Kerry

Springsteen’s foray into politics eventually led him to back Democratic presidential nominee John Kerry in 2004 with a series of concerts called the “Vote for Change” tour.

Democratic presidential candidate John Kerry greets the crowd with musician Bruce Springsteen while campaigning in Columbus, Ohio, on Oct. 28, 2004.
AP Photo/Laura Rauch

Kerry, meanwhile, undertook his own efforts at cultural turf claiming. His attempts to demonstrate his bona fides as a sports-loving everyman went awry at times, when he flubbed the name of “Lambeau Field,” home of Wisconsin’s Green Bay Packers, and referred to a nonexistent Boston Red Sox player, “Manny Ortez.” The ill-fated sports references arguably didn’t hurt his campaign – he won Wisconsin and Massachusetts – but he was ridiculed for a photo-op hunting trip late in the campaign and went on to lose rural Midwestern voters decisively – as well as the election.

Kerry’s dabbling with hunting imagery was perhaps an attempt to dull President George W. Bush’s advantage in perceived strength of leadership, which was in part burnished by his adoption of a cowboy persona.

Harding, Jolson and the Cubs

While Reagan’s attempt to woo 1980s rock fans is one of the best-known attempts to campaign through popular culture, it was far from the first.

Sen. Warren Harding’s 1920 front porch campaign for president was given a jolt of enthusiasm by a visit from singer and actor Al Jolson. Harding was also visited in his hometown, Marion, Ohio, by other actors and celebrities and the Chicago Cubs.

Harding’s strategy probably better serves as a template for things to come than a decisive move in the 1920 election: His victory with over 60% of the popular vote suggests no celebrity could have saved Democrat James Cox.

Bill Clinton and MTV

As the Harris-Walz campaign tries to draw votes from Swift’s young fans, parallels can be drawn to Democratic Arkansas Gov. Bill Clinton’s attempts to embrace youth culture in the 1992 presidential election. Among other appearances, Clinton took questions from young voters on MTV and played saxophone on “The Arsenio Hall Show.”

While the direct effect of Clinton’s forays into youth culture is difficult to measure, he did surge among young voters relative to Democrat Michael Dukakis’ 1988 presidential campaign.

In his 1992 campaign, Bill Clinton went on MTV to answer young people’s questions, which included ‘If you had it to do over again, would you inhale?’

Ford and football

Any discussion of politicians embracing football culture would be incomplete without a discussion of the American president best at playing football, Gerald Ford, the vice president who became the nation’s 38th president in 1974, when Richard Nixon resigned during the Watergate scandal.

Ford played center on two national championship teams at the University of Michigan. While not using his football player background to the same level as former football coach Walz did at the Democratic National Convention, Ford did make use of his football credentials on the stump during the 1976 presidential campaign and was joined on the campaign trail by Alabama football coach Paul “Bear” Bryant.

But the votes of football fans were apparently not enough to keep Ford in the White House for long. He lost the 1976 election to Democrat Jimmy Carter.

Potentially fruitful pickups

Will the Harris-Walz strategy of recruiting voters through pop culture be successful? Swift’s fans are largely young, suburban women, and NFL fans are strewn across the political spectrum. There are potentially fruitful pickups in both camps. The candidates certainly think it matters: Walz said he “took football back” from Republicans, a claim disputed by Trump.

Stressing pop culture credentials can also provide attention to a campaign, regardless of persuasion. Clinton’s pop culture appearances generated coverage beyond the appearances themselves and were cost-effective for a campaign short on funds.

This type of pop culture campaigning generates coverage, then, even if voters aren’t moved by thinking a candidate shares their love of football or pop music.

What does Springfield, Illinois, in 1908 tell us about Springfield, Ohio, in 2024?

Lying about Black people is nothing new in political campaigning.

Despite the thorough debunking of false rumors that Haitian immigrants were eating cats and dogs in Springfield, Ohio, former President Donald Trump and his GOP allies insist on repeating the lies.

“If I have to create stories,” admitted JD Vance, Trump’s running mate, “that’s what I’m going to do.”

While many political observers believe that these lies have, as The New York Times columnist Lydia Polgreen described, finally “crossed a truly unacceptable line,” in fact, white politicians have told brazen, fearmongering, racist lies about Black people for more than 100 years.

One of the more notorious lies occurred in 1908 in another Springfield, this one in Illinois. As a historian who studies the impact of racism on democracy, I believe that what happened there and in other cities helps to clarify what Trump and Vance are trying to do in Springfield, Ohio, today.

Lying when everyone knows you’re lying seems to be the point.

New target, old message

Springfield, Illinois, Abraham Lincoln’s hometown, was, in 1908, a working-class city of just under 50,000 people – about the same size as its modern counterpart in Ohio.

Because of the city’s manufacturing industries, Springfield was also an attractive place to live and work for Black men and women escaping the social oppression of the Deep South.

The Black population of Springfield had been growing by about 4% annually, and by 1908, roughly 2,500 Black people were living there to work in the city’s manufacturing plants. As the wealth of some Black families rose, so too did racist fears among whites that Black migrants were taking their jobs.

Rumors spread among white residents, fueled by false newspaper reports, that a Black man had raped a white woman.

As the story went, a Black man broke through the screen door of a modest house in a white neighborhood. He supposedly dragged a 21-year-old white woman by her throat into the backyard, where he raped her. Or so the woman said.

A couple of weeks after the incident, the woman admitted she lied. There was no Black man. There was no rape. But by then, telling the truth was too late. The rumor had triggered a wave of anti-Black violence.

William English Walling, a white, liberal journalist from Kentucky, reported that Springfield’s white folks launched “deadly assaults on every negro they could lay their hands on, to sack and plunder their houses and stores, and to burn and murder.”

For two days, the violence raged, while white “prosperous businessmen looked on” in complicit approval, Walling wrote. Several blocks in Black neighborhoods were burned, and at least eight Black men were killed.

One of the men killed was William K. Donnegan. The 84-year-old died after his white attackers slit his throat and then hanged him with a clothesline from a tree near his home.

As a dozen different rioters told Walling: “Why, the n—–s came to think they were as good as we are!”

Telling the truth about racist tropes

At the turn of the 20th century, racial tensions were most often expressed in sexual terms – Black men having sex with white women.

That sexual anxiety was part of what cultural historians call a “master narrative,” a symbolic story that dramatizes white nationalism and the belief that citizenship and its benefits were preserved for one racial group at the expense of all others.

One of the first to debunk this rape fantasy was Ida B. Wells, the Black editor and co-owner of the weekly “Memphis Free Speech.”

In 1892, a white mob lynched one of her good friends, Thomas Moss, and two others associated with his cooperative Peoples’ Grocery store. The Appeal Avalanche, a white Memphis newspaper, wrote that the lynching “was done decently and in order.”

Ida B. Wells was among the NAACP’s founders.
Library of Congress

In her May 21, 1892, editorial about Moss’ death, Wells told a different story about “the same old racket – the new alarm about raping a white woman.”

Wells explained that she worried that people who lived outside of the Deep South might believe the lies about Black people.

“Nobody in this section of the country,” she wrote, not even the demagogues spreading rumors, “believe the old thread bare lie that Negro men rape white women.”

Political fearmongering

What happened in Wilmington, North Carolina, in 1898 was based on a deliberate, cynical election strategy of lies.

At the turn of the 20th century, North Carolina’s disaffected, poor working-class white Populists joined forces with Black Republicans to form what were known as the Fusionists.

In Wilmington, then the largest city in North Carolina, the Fusionists were able to vote out the white-nationalist Democratic Party in the early 1890s and became a symbol of hope for a democratic South and racial equality.

They also became a target for Democrats seeking to regain power and restore white nationalism.

A political cartoon from the Raleigh News & Observer, Aug. 13, 1898.
North Carolina Collection, UNC Chapel Hill

The spark came in the summer of 1898 when Rebecca Felton, the wife of a Georgia congressman and a leading women’s rights advocate, gave an address to Georgia’s Agricultural Society on Aug. 11 that sought to protect the virtue of white women.

“If it needs lynching,” she said, “to protect a woman’s dearest possession from the ravening of beasts – then I say lynch; a thousand times a week if necessary.”

In response, Alexander Manly, the Black editor of The Daily Record, in Wilmington, followed the lead of Ida B. Wells and attacked the myths of Black men. Manly pointed out in his August 1898 editorial that poor white women “are not any more particular in the matter of clandestine meetings with colored men than are the white men with colored women.”

Democrats bent on stoking racial fears circulated Manly’s editorial throughout North Carolina before the November 1898 elections, decrying the “Outrageous Attack on White Women!” by “the scurrilous negro editor.”

If that wasn’t enough to stir up North Carolina Democrats, party officials sent the Red Shirts, their white nationalist militia, to Wilmington to overthrow the city’s biracial government, install all white officials and restore white rule.

To that end, a white mob destroyed Manly’s newspaper office, chased him and other Black leaders into exile, rampaged through Black neighborhoods and killed an untold number of Black men.

It was a white nationalist coup d’etat.

The great white protector

In his modern-day attempt to divide working-class white people from working-class Black people, Vance has urged his supporters to ignore “the crybabies” in the mainstream media.

“Keep the cat memes flowing!” he posted on X.

An estimated 67 million people watched the U.S. presidential debate on ABC and heard Trump angrily proclaim: “They’re eating the dogs. They’re eating the cats. They’re eating … the pets of the people that live there.”

Once again, the old narrative is resurrected.

Godzilla at 70: The monster’s warning to humanity is still urgent

The 2024 Nobel Peace Prize has been awarded to Nihon Hidankyo, the Japan Confederation of A- and H-bomb Sufferers Organizations. Many of these witnesses have spent their lives warning of the dangers of nuclear war – but initially, much of the world didn’t want to hear it.

“The fates of those who survived the infernos of Hiroshima and Nagasaki were long concealed and neglected,” the Nobel committee noted in its announcement. Local groups of nuclear survivors created Nihon Hidankyo in 1956 to fight back against this erasure.

Atomic bomb survivor Masao Ito, 82, speaks at the park across from the Atomic Bomb Dome in Hiroshima on May 15, 2023.
Richard A. Brooks/AFP via Getty Images

Around the same time that Nihon Hidankyo was formed, Japan produced another warning: a towering monster who topples Tokyo with blasts of irradiated breath. The 1954 film “Godzilla” launched a franchise that has been warning viewers to take better care of the Earth for the past 70 years.

We study popular Japanese media and business ethics and sustainability, but we found a common interest in Godzilla after the 2011 earthquake, tsunami and meltdown at Japan’s Fukushima Daiichi nuclear plant. In our view, these films convey a vital message about Earth’s creeping environmental catastrophe. Few survivors are left to warn humanity about the effects of nuclear weapons, but Godzilla remains eternal.

Into the atomic age

By 1954, Japan had survived almost a decade of nuclear exposure. In addition to the bombings of Hiroshima and Nagasaki, the Japanese people were affected by a series of U.S. nuclear tests in the Bikini Atoll.

When the U.S. tested its first deliverable hydrogen bomb in 1954, its devastation reached far outside the expected damage zone. Though it was well outside the restricted zone, the Japanese fishing boat Lucky Dragon No. 5 and its crew were doused with irradiated ash. All fell ill, and one fisherman died within the year. Their tragedy was widely covered in the Japanese press as it unfolded.

The Castle Bravo hydrogen bomb test on March 1, 1954, produced an explosion equivalent to 15 megatons of TNT, more than 2.5 times what scientists had expected. It released large quantities of radioactive debris into the atmosphere.

This event is echoed in a scene at the beginning of “Godzilla” in which helpless Japanese boats are destroyed by an invisible force.

“Godzilla” is full of deep social debates, complex characters and cutting-edge special effects for its time. Much of the film involves characters discussing their responsibilities – to each other, to society and to the environment.

This seriousness, like the film itself, was practically buried outside of Japan by an alter ego, 1956’s “Godzilla, King of the Monsters!” American licensors cut the 1954 film apart, removed slow scenes, shot new footage featuring Canadian actor Raymond Burr, spliced it all together and dubbed their creation in English with an action-oriented script they wrote themselves.

This version was what people outside of Japan knew as “Godzilla” until the Japanese film was released internationally for its 50th anniversary in 2004.

From radiation to pollution

While “King of the Monsters!” traveled the world, “Godzilla” spawned dozens of Japanese sequels and spinoffs. Godzilla slowly morphed from a murderous monster into a monstrous defender of humanity in the Japanese films, a transition that was also reflected in the later U.S.-made films.

In 1971, a new, younger creative team tried to define Godzilla for a new era with “Godzilla vs. Hedorah.” Director Yoshimitsu Banno joined the movie’s crew while he was promoting a recently completed documentary about natural disasters. That experience inspired him to redirect Godzilla from nuclear issues to pollution.

World War II was fading from public memory. So were the massive Anpo protests of 1959 and 1960, which had mobilized up to one-third of the Japanese people to oppose renewal of the U.S.-Japan security treaty. Participants included housewives concerned by the news that fish caught by the Lucky Dragon No. 5 had been sold in Japanese grocery stores.

At the same time, pollution was soaring. In 1969, Michiko Ishimure published “Paradise in the Sea of Sorrow: Our Minamata Disease,” a book that’s often viewed as a Japanese counterpart to “Silent Spring,” Rachel Carson’s environmental classic. Ishimure’s poetic descriptions of lives ruined by the Chisso Corp.’s dumping of methyl mercury into the Shiranui Sea awoke many in Japan to their government’s numerous failures to protect the public from industrial pollution.

The Chisso Corp. released toxic methylmercury into Minamata Bay from 1932 to 1968, poisoning tens of thousands of people who ate local seafood.

“Godzilla vs. Hedorah” is about Godzilla’s battles against Hedorah, a crash-landed alien that grows to monstrous size by feeding on toxic sludge and other forms of pollution. The film opens with a woman singing jazzily about environmental apocalypse as young people dance with abandon in an underground club.

This combination of hopelessness and hedonism continues in an uneven film that includes everything from an extended shot of an oil slick-covered kitten to an animated sequence to Godzilla awkwardly levitating itself with its irradiated breath.

After Godzilla defeats Hedorah at the end of the film, it pulls a handful of toxic sludge out of Hedorah’s torso, gazes at the sludge, then turns to stare at its human spectators – both those onscreen and the film’s audience. The message is clear: Don’t just lazily sing about imminent doom – shape up and do something.

Official Japanese trailer for ‘Godzilla vs. Hedorah’

“Godzilla vs. Hedorah” bombed at the box office but became a cult hit over time. Its positioning of Godzilla between Earth and those who would harm it resonates today in two separate Godzilla franchises.

One line of movies comes from the original Japanese studio that produced “Godzilla.” The other line is produced by U.S. licensors making eco-blockbusters that merge the environmentalism of “Godzilla” with the spectacle of “King of the Monsters.”

A meltdown of public trust

The 2011 Fukushima disaster has now become part of the Japanese people’s collective memory. Cleanup and decommissioning of the damaged nuclear plant continues, amid controversies around ongoing releases of radioactive water used to cool the plant. Some residents are allowed to visit their homes but can’t move back there while thousands of workers remove topsoil, branches and other materials to decontaminate these areas.

Before Fukushima, Japan derived one-third of its electricity from nuclear power. Public attitudes toward nuclear energy hardened after the disaster, especially as investigations showed that regulators had underestimated risks at the site. Although Japan needs to import about 90% of the energy it uses, today over 70% of the public opposes nuclear power.

The first Japanese “Godzilla” film released after the Fukushima disaster, “Shin Godzilla” (2016), reboots the franchise in a contemporary Japan with a new type of Godzilla, in an eerie echo of the damages of and governmental response to Fukushima’s triple disaster. When the Japanese government is left leaderless and in disarray following initial counterattacks on Godzilla, a Japanese government official teams up with an American special envoy to freeze the newly named Godzilla in its tracks, before a fearful world unleashes its nuclear weapons once again.

Their success suggests that while national governments have an important role to play in major disasters, successful recovery requires people who are empowered to act as individuals.

Scientists around the world report millions of new discoveries every year − but this explosive research growth wasn’t what experts predicted

Millions of scientific papers are published globally every year. These papers in science, technology, engineering, mathematics and medicine present discoveries that range from the mundane to the profound.

Since 1900, the number of published scientific articles has doubled about every 10 to 15 years; since 1980, it has grown by about 8% to 9% annually. This acceleration reflects the immense and ever-growing scope of research across countless topics, from the farthest reaches of the cosmos to the intricacies of life on Earth and human nature.
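As a quick sanity check on these figures, the doubling time implied by a constant annual growth rate r is ln 2 / ln(1 + r). The short calculation below (an illustration, not from the article) shows that 8% to 9% annual growth corresponds to doubling roughly every eight to nine years – faster than the 10-to-15-year doubling that characterized earlier decades.

```python
import math

def doubling_time(annual_growth_rate: float) -> float:
    """Years for output to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# 8-9% yearly growth implies doubling roughly every 8-9 years,
# quicker than the 10-15 year doubling typical before 1980.
print(round(doubling_time(0.08), 1))  # 9.0
print(round(doubling_time(0.09), 1))  # 8.0
```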

Derek de Solla Price wrote an influential book about the growth rate of science.
The de Solla Price family/Wikimedia Commons

Yet, this extraordinary expansion was once thought to be unsustainable. In his influential 1963 book, “Little Science, Big Science… And Beyond,” Derek de Solla Price – the founder of scientometrics, the quantitative study of scientific publications – famously predicted limits to scientific growth.

He warned that the world would soon deplete its resources and talent pool for research. He imagined this would lead to a decline in new discoveries and potential crises in medicine, technology and the economy. At the time, scholars widely accepted his prediction of an impending slowdown in scientific progress.

Faulty predictions

In fact, science has spectacularly defied Price’s dire forecast. Instead of stagnation, the world now experiences “global mega-science” – a vast, ever-growing network of scientific discovery. This explosion of scientific production made Price’s prediction of collapse perhaps the most stunningly incorrect forecast in the study of science.

Unfortunately, Price died in 1983, too early to realize his mistake.

So, what explains the world’s sustained and dramatically increasing capacity for scientific research?

We are sociologists who study higher education and science. Our new book, “Global Mega-Science: Universities, Research Collaborations, and Knowledge Production,” published on the 60th anniversary of Price’s fateful prediction, offers explanations for this rapid and sustained scientific growth. It traces the history of scientific discovery globally.

Factors such as economic growth, warfare, space races and geopolitical competition have undoubtedly spurred research capacity. But these factors alone cannot account for the immense scale of today’s scientific enterprise.

The education revolution: Science’s secret engine

In many ways, the world’s scientific capacity has been built upon the educational aspirations of young adults pursuing higher education.

Funding from higher education supports a large part of the modern scientific enterprise.
AP Photo/Paul Sancya

Over the past 125 years, increasing demand for and access to higher education has sparked a global education revolution. Today, more than two-fifths of the world’s young people ages 19 to 23 are enrolled in higher education, though with huge regional differences. This revolution is the engine driving scientific research capacity.

Today, more than 38,000 universities and other higher-education institutions worldwide play a crucial role in scientific discovery. The educational mission, both publicly and privately funded, subsidizes the research mission, with a big part of students’ tuition money going toward supporting faculty.

These faculty scientists balance their teaching with conducting extensive research. University-based scientists contribute 80% to 90% of the discoveries published each year in millions of papers.

External research funding is still essential for specialized equipment, supplies and additional support for research time. But the day-to-day research capacity of universities, especially academics working in teams, forms the foundation of global scientific progress.

Even the most generous national science and commercial research and development budgets cannot fully sustain the basic infrastructure and staffing needed for ongoing scientific discovery.

Likewise, government labs and independent research institutes, such as the U.S. National Institutes of Health or Germany’s Max Planck Institutes, could not replace the production capacity that universities provide.

Collaboration benefits science and society

The past few decades have also seen a surge in global scientific collaborations. These arrangements leverage diverse talent from around the world to enhance the quality of research.

International collaborations have led to millions of co-authored papers. International research partnerships were relatively rare before 1980: that year, they accounted for just over 7,000 papers, or about 2% of global output. By 2010, the number had surged to 440,000 papers, meaning 22% of the world’s scientific publications resulted from international collaborations.
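Those shares also imply how much total output grew over the same period. Dividing the collaborative paper counts by their share of all publications gives the implied global totals – a sketch using the article’s round figures:

```python
def implied_global_total(intl_papers: float, intl_share: float) -> float:
    """Total global output implied by the count and share of internationally co-authored papers."""
    return intl_papers / intl_share

# 1980: ~7,000 international papers at ~2% of output -> ~350,000 papers worldwide
print(implied_global_total(7_000, 0.02))
# 2010: 440,000 international papers at 22% of output -> ~2,000,000 papers worldwide
print(implied_global_total(440_000, 0.22))
```

By this back-of-the-envelope estimate, annual global output grew roughly sixfold over those three decades.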

This growth, building on the “collaboration dividend,” continues today and has been shown to produce the highest-impact research.

Universities tend to share academic goals with other universities and have wide networks and a culture of openness, which makes these collaborations relatively easy.

Today, universities also play a key role in international supercollaborations involving teams of hundreds or even thousands of scientists. In these huge collaborations, researchers can tackle major questions they wouldn’t be able to in smaller groups with fewer resources.

Supercollaborations have facilitated breakthroughs in understanding the intricate physics of the universe and the synthesis of evolution and genetics that scientists in a single country could never achieve alone.

The IceCube collaboration, a prime example of a global megacollaboration, has made big strides in understanding neutrinos, which are ghostly particles from space that pass through Earth.
Martin Wolf, IceCube/NSF

The role of global hubs

Hubs made up of universities from around the world have made scientific research thoroughly global. The first of these global hubs, consisting of dozens of North American research universities, formed in the 1970s. Such hubs expanded to Europe in the 1980s and most recently to Southeast Asia.

These regional hubs and alliances of universities link scientists from hundreds of universities to pursue collaborative research projects.

Scientists at these universities have often transcended geopolitical boundaries, with Iranian researchers publishing papers with Americans, Germans collaborating with Russians and Ukrainians, and Chinese scientists working with their Japanese and Korean counterparts.

The COVID-19 pandemic clearly demonstrated the immense scale of international collaboration in global megascience. Within just six months of the start of the pandemic, the world’s scientists had already published 23,000 scientific studies on the virus. These studies contributed to the rapid development of effective vaccines.

With universities’ expanding global networks, the collaborations can spread through key research hubs to every part of the world.

Is global megascience sustainable?

Despite the impressive growth of scientific output, however, this brand of highly collaborative, transnational megascience does face challenges.

On the one hand, birthrates are declining in many countries that produce a lot of science. On the other, many young people around the world, particularly in low-income countries, still have limited access to higher education, although there has been some recent progress in the Global South.

Sustaining these global collaborations and this high rate of scientific output will mean expanding access to higher education. That’s because the funds from higher education subsidize research costs, and higher education trains the next generation of scientists.

De Solla Price couldn’t have predicted how integral universities would be in driving global science. For better or worse, the future of scientific production is linked to the future of these institutions.