One way of putting it

In recent weeks, Oxford and other dictionaries weighed in. The Gazette wondered if Harvard faculty could do better.

The post-pandemic world seems rife with disagreement — why should opinions about the 2023 word of the year be any different? In recent weeks, Merriam-Webster, Oxford, and the Cambridge Dictionary chose “authentic,” “rizz,” and “hallucinate,” respectively. The Gazette wondered if Harvard faculty could do better, so we asked experts in different disciplines to give us their picks. Below are offerings from business, psychology, public health, divinity, biomedical informatics, and international affairs for the word that best defines 2023.


Disruption

Willy Shih, Robert and Jane Cizik Professor of Management Practice in Business Administration, Harvard Business School

Whether it was geopolitical, with new ties forming in the Middle East and the Indo-Pacific; economic, with dramatic changes in investment and global trade flows involving China and their consequences; the new style of warfare in Ukraine and now the conflict in Gaza; climate change, with a sea change in investment, particularly in the U.S.; or technology, with generative AI capturing the imagination of investors, what 2023 has meant is upsetting the status quo. Surprising, unexpected, and unsettling, “disruption” kind of says it all to me.


Combustible

Melani Cammett, Director, Weatherhead Center for International Affairs; Clarence Dillon Professor of International Affairs, Faculty of Arts and Sciences

This has been a year of conflagrations, literally and figuratively. In physical terms, the past year has witnessed the warmest temperatures on record, unprecedented rises in sea levels and ocean temperatures, the lowest Antarctic sea-ice levels, extreme drought and rainfall in parts of Southern Europe, Africa, and Latin America, and deadly wildfires in places as far-flung as Canada, Hawaii, and Greece. Beyond climate change, politics in the U.S. and abroad seem increasingly explosive. Violence and human suffering in the Middle East have hit new heights and have reverberated around the world, including on our campus, eliciting deep emotions that keep people on a knife’s edge. Meanwhile, the war in Ukraine continues, far-right parties and candidates have won elections in the Netherlands, Argentina, and beyond, and reactions to the multiple indictments of former President Donald Trump further expose deep rifts among Americans. In short, our world seems to have hit new material and symbolic thresholds that make it seem increasingly combustible.

Lest we despair, we must recognize the combustibility is not inevitable. We have the capacity to step back from the brink and address our shared problems constructively — even in our polarized world. Already, scientists, policymakers, researchers, and activists are working toward addressing the worst effects of climate change and adapting to our warming world, with meaningful successes. Politically, cooperation is possible, even in the face of seemingly intractable differences. Societies that have lived through bloody civil wars have achieved sufficient levels of toleration — if not reconciliation — to enable former antagonists to recognize their shared humanity, allowing people to move forward in a shared political community.


Resilience

Kari Nadeau, Interim Director, Harvard Center for Climate, Health and the Global Environment; John Rock Professor of Climate and Population Studies; Chair, Department of Environmental Health; Harvard T.H. Chan School of Public Health

I would say resilience is the word of the year, given how much our world has faced, how much individuals have suffered, and how much sorrow the public has experienced. I don’t use that word lightly, so to give more context, I believe that resilience is the ability to be cautiously optimistic and find purpose despite difficulties around us.


Heat

William Hanage, Associate Director, Center for Communicable Disease Dynamics; Associate Professor of Epidemiology, Harvard T.H. Chan School of Public Health

My initial thought was “Swiftonomics” as a reflection of how a return to “business as usual” post-pandemic has been far from usual. The Taylor Swift “Eras” tour has been everywhere you look, and whether you are a Swiftie, Swift-curious, Swiftie-adjacent (me), or none of the above, you have to recognize the extraordinary impact she has had as an artist. (And yes, I saw the movie. And yes, I have a teenage daughter. And yes, I enjoyed it.)

But then I remembered what else has been everywhere — the temperature records tumbling around the planet. Wildfires in Canada. Punishing, deadly heatwaves in Europe. Postponed Taylor Swift concerts in Brazil. More prosaically, I found myself outside in the middle of the night in a T-shirt at a Halloween party — in New England — and I wasn’t even chilly. To be clear, this is not normal.


Alignment

Isaac Kohane, Chair, Department of Biomedical Informatics; Marion V. Nelson Professor of Biomedical Informatics, Harvard Medical School

Alignment, in the context of large language models like GPT-4, is a term that playfully yet seriously refers to the ongoing and somewhat Sisyphean effort to ensure that these AI entities don’t go off the rails and start spouting nonsense, biases, or the AI equivalent of “I’m going to take over the world” rhetoric. It’s about aligning the AI’s outputs with human values, expectations, and societal norms, akin to teaching a super-smart parrot not to embarrass you in front of your grandmother. This involves a complex dance of programming, training, and retraining, in which AI researchers try to imbue their creations with enough wisdom to be helpful, but not so much that they start giving unsolicited life advice or plotting a digital uprising. In essence, alignment is the art of making sure our AI pals are well-behaved digital citizens, capable of understanding and respecting the intricate tapestry of human ethics, culture, and sensibilities.

***

Steven Pinker, Johnstone Family Professor of Psychology, Faculty of Arts and Sciences

This term (alignment), often following “AI,” is the catchword for concerns about whether artificial intelligence systems have goals that are the same as those of humans. It comes from a fear that AI systems of the future are not just tools that people use to accomplish their goals but agents with goals of their own, raising the question of whether their goals are aligned with ours.

This could evolve either because engineers will think that AI systems are so smart that they can just be given a goal and left to figure out how to achieve it (e.g., “Eliminate cancer”) or because the systems megalomaniacally adopt their own goals. For some worriers, the implication is AI Doomerism or AI Existential Risk (runners-up for words of the year), where an AI might, say, eliminate cancer by exterminating humans (not so aligned). In a milder form, “alignment” is a synonym for AI safety (bias, deepfakes, etc.), which led to the infamous firing of OpenAI CEO Sam Altman by his board. You sometimes see “alignment” being extended to other conflicts of interest.


Hope

Stephanie Paulsell, Susan Shallcross Swartz Professor of the Practice of Christian Studies, Harvard Divinity School

This may seem like the least likely word of the year for 2023, a year in which the world feels pressed up against the limits of hope. But over these last months, I’ve often heard people grappling with the idea of hope in academic, religious, and activist communities. Hope emerges from these discussions less as a feeling and more as a practice of committing ourselves to the future, a discipline that keeps us turning toward one another and the world.

