Nicholas Dirks: the ‘two cultures’ must finally be reconciled
Resistance to the knowledge generated by science will only be overcome with the help of the humanities. But what can universities do to bridge C. P. Snow’s famous divide between these fields, which endures to this day?
“Closing the gap between our cultures is a necessity in the most abstract intellectual sense, as well as in the most practical.”
When C. P. Snow spoke those words in his 1959 Rede Lecture at the University of Cambridge, he was speaking as a trained scientist who had become a successful novelist. His main target was the “literary intellectuals”, who, he claimed, discounted the role of science and scientists because of scientists’ supposed failure to understand the deep nuances of the human condition. While he acknowledged scientists’ occasional disregard of what he called the fundamentally tragic character of existence, he was convinced that science was necessary to deal not just with the challenges the UK faced in the post-war world but with the even greater challenges of underdevelopment and poverty across the globe.
His argument, famously reprinted in book form as The Two Cultures and the Scientific Revolution, remains highly relevant today. The place of science in universities has changed dramatically since the 1950s, but the arts and sciences continue to be regarded as occupying not just different parts of the campus but parallel and mutually incomprehensible universes of enquiry and understanding.
This is despite the fact that, broadly speaking, the arts and sciences were part of a single intellectual culture until the early 20th century. As Henry Cowles demonstrates in his recent book The Scientific Method: An Evolution of Thinking from Darwin to Dewey, scientists in the 19th century saw themselves as exploring the same forms of knowledge as humanists. Darwin exemplified this in his reliance on imagination in discovering the principles of the natural world, although he came to believe increasingly in the importance of testing scientific ideas when possible.
Cowles goes on to trace the trajectory whereby the scientific method not only became the catechism of school science classes over the course of the 20th century but was generalised across domains, from the laboratory to the factory floor. Yet the expansion of science also engendered a vigorous reaction.
The real campaign against Darwinism in America did not begin until the 1920s, and it was about far more than the literal truth of Genesis. For many, as Andrew Jewett shows in his recent book Science Under Fire: Challenges to Scientific Authority in Modern America, science came to be seen as authorising “a misguided, dangerous view of humanity. It delivers material progress but also sows moral degradation… In the 1950s and early 1960s, a remarkably broad array of mainline Protestants, humanities scholars, conservative political commentators, and even establishment liberals joined theological conservatives in arguing that science represented a moral, and even existential, threat to civilization.” Their concerns paved the way for, and were then exacerbated by, the political explosions of the late 1960s and 1970s, when the military-industrial complex – and, particularly, its expressions in the Vietnam War – was linked by radical theorists to big science and the growing influence of science and engineering in university life.
What we need to do, according to Jewett, is to adopt “a more charitable and nuanced assessment of science”, recognising it as “a messy, thoroughly human enterprise” that nonetheless “produces remarkable outcomes”.
Instead, scepticism about science has grown steadily alongside scientific advances. In recent years, increasingly visible evidence for climate change has raised concerns that the modern industrial age has sown the seeds of planetary destruction, while the promise of digital technology has brought with it growing worries about security, privacy and levels of disinformation that threaten democracy itself. During the Covid-19 pandemic, we have witnessed both the almost miraculous capacity of science to develop effective vaccines in record time and a deep resistance to public health measures ranging from wearing masks to taking the new vaccines.
We have seen vaccine resistance both from African Americans and other minorities, who perhaps look back to abuses such as the Tuskegee syphilis experiment, and from rural white men, whose distrust of experts has been exacerbated by the politicisation of science by the Trump administration and other right-wing voices. Meanwhile, the ranks of “anti-vaxxers” have been made up of people from across the political spectrum who invoke a wide range of bogus arguments, including the continued insistence by some self-styled scientists that vaccines are linked to autism.
The task of countering this widespread resistance to scientific knowledge, both in the US and globally, is daunting. My main point is that this task is made harder by the perpetuation of the two cultures delineated by Snow and still very much present on college campuses.
As science became increasingly central, securing greater and greater funding, the humanities began their slow decline. Today, there is a widespread view that they are both largely irrelevant to contemporary life and ill-suited to preparing students for careers.
This loss in prestige for the humanities has also been part of a general critique of the university in the US, driven by concerns about cost as well as relevance. Rising levels of student debt have coincided with a time when career opportunities in non-technical fields have declined, making the liberal arts appear to be at best a luxury, at worst an expensive waste of time. They have also been attacked for enshrining ideas of Western civilisation or American culture that give no place to the voices of those oppressed by imperialism, slavery and capitalism (and are now increasingly being condemned for doing the opposite).
At the same time, an understandable political interest in providing college-level skills across the population to prepare for future jobs has conspicuously shunted aside any serious conversation about the larger purposes of higher education. And yet, whether we look at the current threats to democracy, dangerous uses of technology or even the politicisation of public health, it is clear that we need a broad commitment to higher education for reasons that go well beyond career readiness.
Ironically, therefore, at a time when the culture of science is clearly in the ascendant – when Snow’s vision for the future has in some respects come to pass – science needs the arts more than ever. Regrettably, however, the two cultures have become in some ways even more incomprehensible to each other. The humanities and humanistic social sciences have understandably become more defensive about their place in the university and resist the idea that they should become mere “service” fields for STEM disciplines. Yet, in hunkering down with an eye to weathering the storm, they have too often retreated inside their own disciplinary shells rather than venturing into larger, if riskier, arenas that might invite a deeper conversation between the arts and sciences.
In schools and departments of public health, scientists have for years collaborated with social scientists to work on questions ranging from the epidemiology of infectious disease to the social factors surrounding health. Even as the sciences become more specialised, we have also seen a new openness to interdisciplinary collaboration as a result, for example, of the explosion of knowledge in the biological sciences. Yet these developments have only rarely been translated into the structural reorganisation of programmes and departments, which continue to reflect the categories of knowledge from the turn of the last century more than the forms appropriate for the 21st.
While the rapid development of vaccines is about as clear a demonstration of scientific accomplishment as one could ask for, the emergent crisis around climate change provides an equally clear example of the need to accept certain levels of uncertainty while serious scientists develop a consensus about trends, correlations, future prospects for the planet, and the steps needed to ameliorate the dangerous effects of massive fossil fuel use over many decades. The task of explaining “the science” behind all this has become increasingly challenging, as Steven Koonin demonstrates in his recent book Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters.
The “two cultures” paradigm is a particular obstacle here, since it means that many people fail to understand that science can be both uniquely valuable and what Jewett calls “a messy, thoroughly human enterprise”. It proceeds not only by the necessary if also serendipitous intertwining of observation and experiment, but by the zigs and zags – the debates, arguments and disagreements – that are vital components of all human knowledge, even the most fact-based.
While there is much we can do to improve the ways in which we communicate the findings of science to the public, the public face of science begins where science is made and taught, especially in the universities that sponsor high-level research and train advanced students in a broad range of fields. It must be possible to use the current public crises around science to help bring the two cultures together.
Here, I want to consider a few initiatives that I started at the University of California, Berkeley, when I was chancellor, before I try to draw out the central lesson.
One of the first investments I made in programme development was in neuroscience. We had the advantage of working closely with the clinical neuroscience group at the University of California’s flagship, free-standing medical school based at its San Francisco campus, while Berkeley itself had deliberately drawn on its core strength in engineering to supplement its own excellent neuroscience research cluster. Compared with Columbia University, where I had previously worked, the group was small and under-resourced, but it grew nimbly by establishing new links with fields ranging from biology to psychology to new imaging technologies. Those links even extended beyond the sciences; when a donor wanted to connect the work of neuroscience to an interest in Buddhist meditation, the group was ready and willing to do so.
I also initiated a far larger effort to bring together teaching and research in computer science and statistics with schools and departments across the whole university. The impetus for this was the flood of students wishing to take courses in computer science. One of the first meetings I had was with the chair of the department, who provided me with enrolment data and a proposal to double the size of the faculty. We could not do that even if we had wanted to, but the larger question was how to teach computational skills in ways that would connect with the discrete forms of knowledge that students were actually studying.
So we convened a committee made up of faculty from across the university – from computer science and statistics, but also from physics, public health, computational biology, urban studies, philosophy, history and literature – and asked them to design a new set of data science courses. They succeeded brilliantly in fashioning a core course that introduced students to computational methods and modes of thinking alongside a set of “plug-in” courses that connected those methods to datasets and questions emerging from other fields.
For example, students in public health could analyse epidemiological data about the spread of the Zika virus. Students in history could analyse mortality data around pandemics such as the Black Death. And students in literature could study debates about authorship and Shakespeare by evaluating patterns in word use across multiple texts. The courses were wildly popular and led to a new recognition on the part of faculty of how they could work together across departments to create opportunities for students and advance the work of disciplines across the arts and sciences. Fortunately, it appears that technology companies often prefer to hire graduates with these kinds of broad interdisciplinary backgrounds, since they combine the basics of computer science with grounding in fields that work with real-world data and provide serious contextual knowledge.
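To give a flavour of what such a “plug-in” exercise might look like in practice, here is a minimal sketch, in Python, of the kind of word-use comparison behind authorship studies of the Shakespeare sort described above. The snippets, the candidate authors and the word list are invented placeholders for illustration, not materials from the Berkeley courses.

```python
# A simplified word-use comparison in the spirit of authorship studies:
# profile how often each writer uses common "function words", then ask
# which known profile a disputed passage most closely resembles.
# All texts and the word list below are placeholders.

import re
from collections import Counter

FUNCTION_WORDS = ["the", "and", "of", "to", "in", "that", "it", "with", "but", "not"]

def profile(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(p: list[float], q: list[float]) -> float:
    """Euclidean distance between two word-frequency profiles."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

# Placeholder corpora standing in for works of known authorship.
known_texts = {
    "Author A": "The wind and the rain beat upon the walls of the keep, and it was not ...",
    "Author B": "It is a truth that the heart, with all its longing, does not bend to reason ...",
}

disputed = "The night was long, and it was not the rain but the silence that wore upon them ..."

disputed_profile = profile(disputed)
for author, text in known_texts.items():
    d = distance(profile(text), disputed_profile)
    print(f"{author}: distance = {d:.4f}  (smaller = more similar word use)")
```

Real stylometric work uses far larger word lists, longer texts and more careful statistics, but even a toy version like this conveys how a question in literary history becomes a question about data.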
Another of my initiatives was to encourage computer scientists working in areas such as machine learning and artificial intelligence to build into their programmes more attention to ethics, bias and social impact. It has become increasingly clear that algorithms are no more neutral than any other kind of text. Even when they are designed without any intention to introduce bias, they both embody their programmers’ unconscious biases and absorb social biases from the large real-world datasets they analyse, as a 2019 study by the AI Now Institute demonstrated. Addressing these issues urgently requires the analytical tools of the humanities.
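As a deliberately simplified sketch of the second point, that a model can absorb bias purely from the data it is given, consider the following toy example. It is not a description of any real system or of the AI Now study; every record in it is invented.

```python
# A toy illustration of how a "neutral" learning rule can reproduce bias
# that is present only in its training data. All data below are invented.

from collections import defaultdict

# Historical hiring decisions: (group, years_of_experience, hired).
# Candidates in both groups have identical experience, but past decisions
# favoured group "A".
history = [
    ("A", 5, True), ("A", 5, True), ("A", 5, True), ("A", 5, False),
    ("B", 5, True), ("B", 5, False), ("B", 5, False), ("B", 5, False),
]

# "Learn" the historical hire rate per group, as a naive model would.
totals, hires = defaultdict(int), defaultdict(int)
for group, _, hired in history:
    totals[group] += 1
    hires[group] += hired

def predict(group: str) -> bool:
    """Recommend 'hire' if the historical hire rate for that group exceeds 50%."""
    return hires[group] / totals[group] > 0.5

# Two new candidates with identical qualifications.
for group in ("A", "B"):
    print(f"Candidate from group {group}, 5 years of experience ->",
          "hire" if predict(group) else "reject")
```

Nothing in the code mentions prejudice; the disparity enters entirely through the training records, which is precisely why humanistic scrutiny of where data come from, and of what counts as a fair outcome, matters.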
These are not merely academic questions. Indeed, technological discoveries are decidedly outpacing our advances in evaluating their social, economic and ethical implications. Examples include not just new areas of exploration but also classic philosophical puzzles that suddenly take on urgent real-world meaning. Take the standard “trolley problem” in moral philosophy, which turns out to be relevant to designing self-driving “autonomous” vehicles. The problem refers to a host of “thought experiments” that pose questions about whether the driver of a runaway trolley (or tram, in UK parlance) should avoid hitting a particular person (who is known to them or is particularly young, for example) when the alternative is to hit and possibly kill a greater number of people (who are strangers or older). These once-abstract questions have become highly relevant to the programmers writing code for self-driving cars – as well as to the insurance companies that might have to assume liability for the coders’ decisions.
Questions of ethics also circulate around the development of new medical techniques and procedures, as described in Walter Isaacson’s recent book, The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race. No sooner had Doudna and a group of colleagues extended the use of CRISPR-Cas9 “gene-editing” technology to human cells than she called for the development of ethical guidelines and protocols for any human applications. In 2015, she helped convene a conference of leading biological scientists to explore the ethical implications of her scientific breakthrough (for which she shared the 2020 Nobel Prize in Chemistry).
While most scientists said that they were in favour of the use of the technique to cure disease and against it for any kind of human enhancement, Doudna was quick to provide examples of how difficult this distinction could be to make in practice. It was clear to her that scientists had to work with humanists to think about these challenging questions.
These are just a few examples to demonstrate the importance of bringing the two cultures of the arts and sciences not just into greater alignment, but ultimately into a larger, shared culture of intellectual enquiry and moral evaluation. Universities must lead the way. Once they do, the daunting task of communicating science to the public may not be easier, but it will at least be predicated on an understanding of the relationship between truth and facts, knowledge and interpretation, discovery and wisdom – art and science. And that is a relationship that will perforce play a critical role in making the work of science both more effective and more persuasive.
Nicholas Dirks is president of the New York Academy of Sciences. He was previously the 10th chancellor of the University of California, Berkeley, and before that executive vice-president and dean of the Faculty of Arts and Sciences at Columbia University. This essay is an edited version of a paper presented to a conference of university presidents in Switzerland as part of the 13th Glion Colloquium in June 2021, organised by Yves Flückiger, rector of the University of Geneva.