This interview appears in print in PALLADIUM 08: Scientific Authority.
Surrounded by the English wilderness in far-west Shropshire, a young Martin Rees discovered the cosmos. Born in 1942, he began his education in a school run by his parents in a repurposed Victorian mansion. Today, he recalls this home sparking childhood questions—like why high tides vary on the coast—that would only be answered when he began his scientific studies. After a few years at a boarding school away from home, he entered Cambridge University, where he discovered astrophysics and has since spent much of his career. From 2004 until 2012, he was Master of Trinity College.
Rees was fortunate to enter his field at a time when it was particularly creative and generative. Radio astronomy from Cambridge and elsewhere was generating evidence for the Big Bang theory of the universe’s origins. Relativistic astrophysics was delivering exciting new models of black holes and quasars. Rees’s work has helped to disprove the Steady State theory, update models of how galaxies form, and deepen our understanding of the earliest conditions of the universe. Throughout his decades-long career, he has interacted and collaborated with luminaries like Joseph Rotblat, Freeman Dyson, and Sir Roger Penrose.
In addition to his research, Rees has dedicated much of his career to bringing science to the general public. He has written numerous books, lectured around the world, and advised governments and other institutions. He was appointed Astronomer Royal in 1995 and became a Life Peer in the House of Lords in 2005. From 2005 to 2010, he served as President of the Royal Society, first established in 1660 and today the oldest scientific academy in continuous existence.
Rees’s latest book is If Science Is to Save Us. Looking to predecessors like the Great Exhibition and the Pugwash Conferences, he invokes an optimistic vision of science as a force that can improve society. But with increasing global turmoil, and with institutional obstacles holding back rising generations of scientists, realizing that vision is by no means guaranteed.
The Early Years of Science
In your new book, you take us through some of the early history of science. How would you characterize this earlier period when science is getting established as a discipline?
Lord Martin Rees: Well, back at that time, science wasn’t a profession. The Royal Society, until the mid-nineteenth century, was a place where most of the scientists were amateurs, though some of the leading ones were real polymaths. What they had in common was that they were wealthy enough to be independent, and they were lucky enough to have an education.
And so it was only after the mid-nineteenth century that science became professionalized as an academic subject. If I look at Cambridge University, the idea of an undergraduate major in science—to use American jargon—didn’t take root until the middle of the nineteenth century.
These days, everyone thinks in terms of scientific ideas being the impetus for new technology. But although that may be to some extent true now—there’s a symbiosis between them, anyway—in the earlier days there was very little connection. With the invention of the steam engine, and of all the technology involved in its construction, they wouldn’t have thought they were using science at all.
So technology, even sophisticated technology like shipbuilding, was in no sense based on science in the way that we would say that modern technology is based on science.
You mention Charles Babbage in your book, who in the nineteenth century attacked the Royal Society. It sounds like he thought math and physics had become stagnant. What was his critique? And as a former president of the Royal Society, do you think there was anything in it?
He was thinking about the UK. And it was certainly true in the UK that, throughout the eighteenth century, even the leading universities were in a pretty low state—no intellectual standards and so on. So I think the universities did, as he implies, sink into a torpor in the eighteenth century and in the early nineteenth century.
And indeed, he was quite right about the Royal Society. And what happened in the nineteenth century in England was that an interest in science and technology was revived in a big way. One thinks of the Great Exhibition of 1851, which was an amazing show of all these technological achievements. And the first half of the nineteenth century saw the foundation of specialized societies: the Linnean Society, the Geological Society, the Royal Astronomical Society, and many others. And also the British Association for the Advancement of Science, which had a sort of outreach program.
And I think those were founded mainly because there was growing public interest in science, but also because the Royal Society itself was not being at all effective.
What were the communities behind these generative, productive periods of science like?
It was all fairly amateurish. In Germany, the foundation of research universities is attributed to Wilhelm von Humboldt in about 1810. Thereafter, the Germans had the idea of a university where you would teach students, but also have people doing research. And of course, as I say in the book, that’s the model which we now have in the UK and in the US. And now, China has taken it up too.
But in mainland Europe—indeed, even in Germany—they don’t quite have that model now. Most of the best research in Germany is done at Max Planck Institutes, which are severed from universities. And in France, it’s done by people supported by the CNRS [the French National Centre for Scientific Research] and civil servants. So Germany developed an institutional structure for science that was independent of academies in the 1820s. And other countries, including the UK, took longer to develop the sort of scientific components based in universities.
Incidentally, I would recommend, if you don’t know it, The Age of Wonder by Richard Holmes, which is a fascinating book about science and culture in the late eighteenth and early nineteenth centuries. But the point here is that there was a sort of culture of science linked to the culture of the humanities. I mean, Shelley and Wordsworth were interested in science.
The Great Exhibition is part of this interesting turn that happened in the middle of the nineteenth century, the idea that science needed to have more public engagement. Until that point, as you’ve said, these were very restricted groups. The key thing was how peers within the scientific community received someone’s work. Why did science take this turn toward public recognition?
Well, I think it was a response to wider public interest. The early people who took an interest in science were limited to a small elite, educationally and financially. Whereas in the nineteenth century, there were lots of local organizations for following science and a far wider interest in it. And that’s the interest the British Association for the Advancement of Science (BA) met through its meetings around the country. They had 3000 people on a beach listening to a lecture by Adam Sedgwick, the geologist, in 1837.
And there were bodies like that because science was becoming more advanced and more technical. So there was a motive, from the mid-nineteenth century onwards, to actually have organized teaching in those subjects. And as I said, science only became part of the Cambridge University curriculum for undergraduates in the mid-nineteenth century—mathematics had done so already. And of course, this led to career openings for people to teach at these places, and also to do research.
The Challenges of Scientific Authority
You’re an admirer of the physicist Joseph Rotblat. You even met him later in his life. He left the Manhattan Project on moral grounds, and later ran the Pugwash conferences to promote nuclear disarmament. He also got political operators like Kissinger on board with his projects.
Rotblat seems to have been quite successful in his initiatives. We could contrast him with Leo Szilard, another physicist who was working on the same issues but got sidelined by politicians. What made Rotblat so effective as a scientific advisor in this period?
For the first half of the century, Germany and Britain were the major creators of science, with most of the origins of quantum theory, Einstein, and all that coming from Europe. America only became a leading scientific power after World War II. And that’s partly stimulated by the war itself, of course.
In World War II, even more than in World War I, it was realized that science was crucial for weaponry. And of course, the most conspicuous development was the atomic bomb, which was an extraordinary collective achievement of science. And then, of course, we had the space program. Politicians were aware of the dependence of our civilization on having good scientists around. And especially after World War II, there was great optimism about science, because science had clearly been crucial in the war’s outcome.
I think one shouldn’t overemphasize the role of Rotblat and Pugwash, and all that. But the reason I wrote about them in my book was that the first group of scientists to confront the real ethical dilemmas of science in a big way were, of course, the nuclear physicists who built the bomb. And I got to know several of them in their later years, in the 1970s, including people like Hans Bethe. Many of them were exceptional scientists, but also people of some ethical sensibility.
And they’re the people who went back to civilian life. But they had an obligation to try and optimally harness the forces they’d helped unleash during the war. And the Pugwash conference group founded by Rotblat and Bertrand Russell was an example of this.
The other feature of science, of course, is that it’s always been international. Mendeleev had the periodic table in the late nineteenth century, and there was lots of contact with the Germans in chemistry. So even when the world was divided during the Cold War, scientists wanted to keep in contact. And that was the motive for channels like the Pugwash conferences, where scientists from both sides who trusted and respected each other could get together. And bodies like that were, in fact, especially important when there were few other channels between the two sides. I think they were less important from the 1970s on, because there were a lot more channels after that.
The nuclear issue was not the only important one. But I think the Pugwash movement was important in getting scientists on both sides together and offering a back channel to governments rather than direct channels. And from what I’ve read about the history, which was before my time, they had a role in easing the path to the Anti-Ballistic Missile Treaty and all that.
What was the value of Pugwash to the political side? It doesn’t seem like they only found them to be useful messengers. Rotblat and his allies actually got important people on board with trying to curtail the nuclear threat. Why was the group so effective at that?
Well, they got politicians involved after the politicians’ retirement. There was this group, the Gang of Four, that was trying to move towards zero nuclear weapons. That was Henry Kissinger, George Shultz, William Perry, and Sam Nunn. And there were other groups like that. And of course, Robert McNamara, who was a great hawk at the time of Vietnam, spoke in his later years about the excessive risks that were taken during the Cuban Missile Crisis in 1962. And he said that the U.S. was lucky as well as wise that it didn’t lead to a nuclear confrontation.
So many of these people who’d been at the center of these decisions realized in their later years just how lucky they’d been, and how great the danger that they managed to avoid was. And therefore, they were prepared to engage with these groups at that time.
It’s hard to strike the balance in a situation where the science is uncertain but the state has to take action. You mentioned that when there is scientific controversy in a political question, it may be better for a scientist not to invoke their scientific authority. They should engage that point in public as a citizen. What is the difference between engaging as a citizen versus as a scientist?
Geoengineering is an interesting example. That’s an issue where everyone accepts it’s a sort of Plan B if climate change does turn out to be drastic and we don’t cut CO2 emissions. In fact, there was a campaign in Canada against any experiments that were at all relevant to geoengineering, whereas other people said, “well, let’s at least do some experiments so that we know it would work if we need it.” So this is a genuinely controversial debate, and I think it’s quite right that it should be.
One point which I make is that many of these decisions which benefit future generations are, of course, doing that at the expense of money we could spend straightaway. So there’s a tension between instant gratification and doing something which will benefit future generations. Of course, the issue here is that politicians have a focus on doing what’s right for their own constituents before the next election. And so they’re not the kind of people who will think very long-term unless their voters are happy with it. That’s why in my book, I quoted the senior European politician Jean-Claude Juncker: “We know what to do, but we don’t know how to get re-elected when we’ve done it.” And he was thinking of the measures you need to stem serious climate change in the far future.
The other point is that because politicians normally have a difficult agenda of short-term urgent problems—especially at the moment—they won’t take very much notice of the scientific adviser who tells them to worry about these long-term questions. They’ve got urgent things on their mind. And that’s just human psychology. That’s why one thing I noted in my book as being very important to change is public opinion—the opinion of voters. Because if the voters can be made to think long-term and to care about the world their grandchildren will live in, then the politicians will respond. And that’s why I say that we should appreciate all the demonstrations that are happening now in favor of action to prevent climate change.
I admire what I call the Disparate Quartet of charismatic individuals: Pope Francis, David Attenborough, Bill Gates, and Greta Thunberg—all very different from each other, but they’ve all done a great deal to make the public more aware of climate change. And they’ve affected the willingness to have legislation that favors clean energy. This, I think, is important because they change the opinion of voters, and the voters will then be happy. And then the politicians may make these long-term decisions. So that’s an example where public opinion is important.
Another example is the pollution of the oceans. When there was a bill passed in the UK to prohibit non-reusable plastic drinking straws and the like, that was because of David Attenborough’s programs that showed people the stomachs of fish full of bits of plastic. There’s also a scene [reported by Attenborough] of an albatross that returns to its nest and coughs up lumps of plastic for its brood. And that is an iconic image that millions in the UK saw. It became rather like the polar bear on a melting iceberg, you know. And that made a difference. British politicians certainly wouldn’t have used any of their political capital on a regulation like this had they not realized that lots of people now cared about ocean pollution.
And so I think the lesson there is that if the public cares about something which is long-term, then politicians will respond. That’s why it’s very important to make the public aware. It is also important that scientists should get through to the public. But in many cases—except people like Carl Sagan, who were masters of this—this is best done through intermediaries, like the four charismatic people I mentioned, who aren’t themselves scientists but people who listen to the science.
During the COVID-19 response, we saw how politics engulfed everyone—even those trying to do objective work. There was this battening down of the hatches on the part of many scientists. One quote that made some waves was Dr. Fauci saying that “attacks on me are attacks on science.” One side saw scientific authority under assault; the other saw it as cover for a political agenda.
In retrospect, do you think that battening down of the hatches was necessary? Or was it a bad move?
Well, let me say first, I think Fauci’s job when Trump was president must have been near impossible. I mean, the hard thing to get through to the public is that, very often, we just don’t know what the right thing is. And the question is, to what extent should we be precautionary in these measures one recommends? And the scientists have to illustrate that to the politicians and leave it to the politicians to decide.
There was a case in the 1980s—I think I mentioned it in my book—about Mad Cow Disease, which was a new kind of prion disease that wasn’t understood at all. And it killed up to 100 people. The government took rather excessive precautions against it, like banning beef on the bone. But the reason that was not irrational was that the Science Adviser, whom I knew at the time, said, “Well, we are going to have 100 deaths.” But if the politicians asked whether he could say the chance of a million deaths was less than one percent, he had to honestly answer no. And of course, a one percent chance of a million deaths is more worrying than a definite chance of 100 deaths. And so, given the ignorance at that time, it was right to be overcautious.
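The comparison here is, at bottom, an expected-value calculation. A minimal sketch using the interview’s round numbers (illustrative figures, not actual casualty data):

```python
# Expected-value comparison behind the precaution argument.
# The figures mirror the interview's illustrative numbers.

def expected_deaths(probability: float, deaths: float) -> float:
    """Expected number of deaths from a single-outcome risk."""
    return probability * deaths

certain_outbreak = expected_deaths(1.0, 100)    # a definite 100 deaths
tail_risk = expected_deaths(0.01, 1_000_000)    # a 1% chance of a million deaths

print(certain_outbreak)  # 100.0
print(tail_risk)         # 10000.0
```

In expectation, the one-percent tail risk is a hundred times worse than the certain outcome, which is why precautions that look excessive in hindsight can still be rational at the time.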
And so there are lots of cases like that, when you’ve got to do what is in retrospect an overreaction by preparing against the worst case. So that’s just an example. So I just think that scientists should do their best. And then the politicians have to decide on this, on regulations which may prove to be overly stringent.
But if you pay your fire insurance on your house, and the house doesn’t burn down, you don’t think it was a waste of money. You just accept that as a possibility. I think the public has to understand that there are cases that are analogous to that.
The nineteenth century kicked off the period when scientists wanted to engage the public. I wonder if we’re seeing the end of that cycle, partially as a result of these current conflicts. I do hear people say things like if people don’t have formal training, then they’re not going to understand how the precautionary principle works, or how statistics work.
So, the claim goes, maybe it’s better that these things are kept among qualified experts. Those who aren’t qualified should not comment. Is that a change you’ve seen at all?
Well, I would have thought that the fraction of the public that has got some basic feel for science is still too low. But it’s going up, not down. I mean, there is noise caused by fake news. But I would say that with more people going to university and having at least some classes, the number of people who know a bit of science is going up. It’s still too low. And I have to say it’s worse in America; only recently has a majority of Americans come to believe in evolution. And that’s far worse than in Europe. But I think, in general, the number is going up.
The other thing is to appreciate and not be bamboozled by statistics. One obvious case is if you have a test for some rare disease, then the false positives may outnumber the actual cases. That’s a well-known phenomenon, but one has to try and get it across to the public. And it’s not trivial, but it’s not too difficult. So that’s an example of something where one wants the public to actually understand what the uncertainties are.
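The false-positive point can be made concrete with a back-of-the-envelope calculation. All the numbers below are hypothetical, chosen only to show the base-rate effect being described:

```python
# Screening a large population for a rare disease: even a fairly
# accurate test produces more false positives than true positives.
# All figures are hypothetical.

population = 100_000
sick = population // 1_000        # prevalence: 1 in 1,000
healthy = population - sick

sensitivity = 0.99                # fraction of sick people the test catches
false_positive_rate = 0.05        # fraction of healthy people flagged anyway

true_positives = round(sick * sensitivity)
false_positives = round(healthy * false_positive_rate)

print(true_positives)   # 99
print(false_positives)  # 4995
```

Here roughly fifty out of every fifty-one positive results are false alarms, so a positive test by itself tells you far less than the test’s headline accuracy suggests.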
You’re optimistic, it sounds like.
I think things are improving. And I say in my book that scientists shouldn’t moan too much about ignorance on the part of the public, because it’s gratifying how many people are interested in things like space and astronomy, and all that. Kids love dinosaurs. So, the wonder of science is a separate thing. I think we should be grateful for that.
And I think, also, we should equally bemoan the public’s ignorance of other things. I think it’s just as bad if a low fraction of the public can identify Ukraine or South Korea on a map, or doesn’t know the basic history of their country. The results of a poll on those would, I suspect, be at least as depressing as when they do polls on scientific knowledge. If you’re going to be an informed citizen, I think you ought to know all those things.
I think the number of political decisions that have a scientific dimension is probably going up. And so the importance of scientific knowledge, relative to geographical knowledge, is probably going up a bit as well. But the public—in a proper democracy where it can actually make informed decisions—needs at least some basic understanding of these things, a background in history and geography, et cetera. I don’t think it’s too much of an aspiration to hope for that. Lifelong learning on the web, and all that, can help with this sort of thing.
The Future of Research
In your book, you point out that published papers today are often judged by the journal they appeared in, rather than on their own merit. You also say that the Nobel Prizes have a similar effect: scientific fields are seen as valuable because they happen to be represented in the Prize’s categories.
It seems like what you have in both cases is a compounding distortion effect on attention. It becomes harder and harder for an outsider not just to judge actual research, but to even pay attention to the correct things. How serious of a problem is this?
Well, I’ve been in academia for 40 years now. So I’ve come to realize the sort of randomness in the way these awards go, and the fact that they are confined to a certain set of fields. They don’t reflect how science is done, because so much is a team effort rather than an individual one. And even when work is done by an individual, it may be someone who is lucky, rather than especially brilliant. So for all those reasons, it’s a mistake to elevate Nobel Prize winners as being the great leaders and the great intellects of science, because that’s just not true. And so, what’s rather good is that there’s now a greater variety of awards and ways of recognizing scientists, and in some cases, of recognizing groups.
Going over to journals, this is an aspect of the way that I think that academia is getting a bit sclerotic in the way it operates. It needs to open up a bit. For instance, research is important, and it’s done in universities alongside teaching. But the way research develops may be better with lots of blogs, exchanges, and things like that. And the traditional model—where the only thing that affects your promotion as an academic is publication in good journals—is, I think, a mistake. It’s unfair because many people can make a bigger contribution by outreach, or by having a good blog, and things of that kind, and we ought to recognize that.
That’ll make academic careers more attractive. Because, being in this world myself, I do worry that becoming an academic is a less attractive career path now than it was when I was young. That’s because there’s more bureaucracy and more audit culture, and also other forces, demographic ones. Promotion is slower.
An American example of this is the NIH, where the average age at which you get your first grant as an investigator is now 43, or something like that. Shirley Tilghman, the former president of Princeton, was on the committee that found this. And she was worried! She contrasted this with when she was young—she’s about my age—when she got a PhD, then did a postdoc, and then got a grant and opened up her own lab. People can’t do that now. At that time—we’re talking about the late 1960s—the young outnumbered the old, because there’d been an expansion of higher education. And people also retired; they didn’t stay on after retirement.
This is worrying, really, because I feel that the people who are going to be most deterred by this difficulty of getting fast promotions are just the ones you want to keep in: the people who are enterprising, flexible, and would like to achieve something distinctive in their thirties. It was possible in the past, but is less so now, and so my worry is that they will go into something else. Well, we want some of them to go to start-ups and into other professions, but we want some of them to go into academia. Academia can’t depend just on the nerdish elements and people who are happy to spend most of their lives writing grant applications.
So I think the health of academia, and of academic culture and research culture, is a bit under threat for these reasons I just mentioned. So there’s got to be some ways of recognizing achievements, other than through those journals. And also a way of perhaps encouraging more independent scientists.
The thing is, people have been studying these problems for a long time. Everyone’s very aware of things like the replication crisis, the grant problem, the problems of peer review, and so on.
It seems like the institutions have a very hard time adapting even to what we know, or to quite well-established criticisms of the way they work. So how could such a thing actually happen? How could you actually change the momentum?
Many of the things I grumble about most are problems created by academics themselves. So they need to change. There’s some hope. In fact, I’m on a committee on research that was set up by the American National Academies. And I’m one of two Brits on it, as a foreign member of a National Academy. They’re trying to address this sort of thing. And I think collectively, they could have some influence on academia. So I do have some hope of changing these criteria.
But of course, there is a problem. It’s very easy for people who don’t have real credentials to get a lot of attention via social media, right? There’s got to be trade-offs to make sure that people who do solid work without much publicity get recognition, but at the same time avoid too much bureaucracy. And one related issue is whether pure research, which is clearly important as the basis for future technologies, is still best done in universities. Or should we shift more away from that model—which we in Britain and you in America still have—to a system where there are more standalone institutions?
The disadvantage is, then you don’t get a direct link between the professors and the students. But you would give people the opportunity for full-time, long-term research projects, which they can’t do now because there are more distractions and more administration than there used to be in academia. And because most grants are given with reviews—you’ve got to do something good every two or three years, you know, and can’t do long-term stuff.
So I think the balance is shifting a bit in favor of long-term institutes, separate from universities: especially in some health topics and in technology, for clean energy and things of that kind, where the social need is a mixture of pure and applied science, and for bridging the gap between university and commercial work. I think there’s a case for more standalone institutes with specially directed targets.
Do you think that the research paper is still a useful unit of research?
Well, I think it’s one option. But you know, someone could do a successful blog. And, of course, someone might do a long-term project and write a book. So I think, the idea that you’ve got to write a certain number of papers in a three-year period, that’s obviously a constraint on your choice of topic. You’ve got to choose a bite-size topic where you know you’d have good results within three years. And that may militate against working on really important long-term projects.
So I mean, there’s virtue in the research journal and the research paper. And in topics like philosophy, they probably have a bigger role. Although even there, I think it is a mistake to focus too much on them compared to blogs and papers in the wider literature.
Could finding more ways to fund independent research be a good idea? What strategies should people have on that front?
Those who currently aspire to academic careers face a nastily competitive and insecure environment, bedeviled by audit culture, where the requirement to meet short-term targets impedes a focus on long-term risky projects. My earlier generation was far luckier. Academia needs at least some of the people with ambition and flexible talent who hope to achieve something by their thirties.
It’s good, of course, if some of these people create start-ups. Better still if, having made money by their forties, they become “independent scientists” in the mold of the independently-wealthy Darwin and Rayleigh in the nineteenth century, and Edwin Land and James Lovelock in the twentieth. Indeed, we need more such people in order to avoid groupthink.
A related issue is how fastidious we—or universities—should be in accepting donations. This is an issue in the UK. It’s not just cases like the Sacklers [a U.S.-based pharmaceutical dynasty often criticized for their role in the opioid epidemic]. Donations from leading fossil-fuel companies are being declined, as are those from defense contractors and from countries like Saudi Arabia. But I think there are inconsistencies. For instance, those who make billions from “crypto” are surely socially damaging—the typical non-savvy “investors” lose just as those who engage in online betting do. But these “super-rich” are accepted as prime donors to ethically-sensitive organizations like the California-based “Effective Altruism” group.
Lessons From Public Service
From 2005 to 2010, you were president of the Royal Society. You’ve mentioned that, in retrospect, you should have been more “activist” as a president. What would you have done differently?
The Royal Society has a very broad remit: science itself, advice on government policies, and a strong international dimension. In 2005, I was elected to a five-year term as the Society’s President. It’s an honorary post, and therefore can only be part-time for anyone who is neither retired nor independently wealthy. But there were many activities—fundraising, engagement with Fellows, “representational” events, attendance at inter-academy meetings overseas, and so forth—where I felt the Society would have benefited from a full-time President.
I believe that all scientists should, as individuals, be politically engaged. When their own work is concerned they have a special obligation to foster its benign applications and to warn against its downsides.
It’s less clear whether academies and learned societies should become advocacy groups for specific policies. Clearly, they should offer assessments of scientific issues and policy options; they should make recommendations within their range of expertise; they should offer views on the curriculum of schools and colleges. And the need for, and scope of, regulations on dangerous pathogens, climate, and so on can best be addressed by inter-academy dialogue.
But academies should not adopt any collective stance that’s too controversial, either through being overtly party-political, or being ethically dubious in many people’s minds. For instance, should academies advocate the building of nuclear power stations? This is an issue where opinion in many countries is roughly equally split, both among people with genuine expertise, and among those with none. My line was that the Royal Society should not take a collective view on this, though I expressed my personal view in favor of R&D into improved fourth-generation nuclear reactors. And of course, there are other issues where there’s an ethical divide. For instance, the deployment of genetic techniques for human enhancement.
What about issues when there is a strong consensus among experts but some “dissidents” exist? This sharpened up for me in the context of the climate debate.
Our policy was that a collective Royal Society statement required endorsement by the Society’s Council, which includes the officers plus eighteen elected members. There are around 1000 UK-based members altogether, among whom there will obviously be proponents of “dissident” viewpoints, but they cannot expect to “veto” a statement. On this basis, the Society endorsed the UK’s Climate Change Act, which enshrined the goal of major cuts to CO2 emissions, despite opposition from some “climate deniers” among our members.
Apart from climate policy, another issue that aroused controversy, though fortunately one peripheral to the Society’s main agenda, stemmed from a vocal faction of “New Atheists”—best described, I think, as small-time Bertrand Russells. There was little in their views that he hadn’t expressed more eloquently decades earlier. My line was that the Society should be a secular organization but need not be anti-religious. Of course, we should oppose, as Darwin did, views manifestly in conflict with the evidence, such as creationism. But we should strive for peaceful coexistence with mainstream religions, which number many excellent scientists among their adherents.
This tolerant view would probably have resonated with Darwin himself, who wrote: “The whole subject is too profound for the human intellect. A dog might as well speculate on the mind of Newton. Let each man hope and believe as he can.” If teachers tell young people that they can’t have both God and Darwinism, many will choose to stick with their religion and be lost to science. My own perspective is that if we learn anything from science, it is that even something as basic as an atom is quite hard to understand.
This should induce skepticism about any claim to have achieved more than a very incomplete and metaphorical insight into any profound aspect of our existence. But this need not prevent us from appreciating the cultural traditions, rituals, and aesthetic accretions of religion, and its emphasis on common humanity in a world where so much divides us.
You also sit in the UK’s House of Lords as a crossbencher. What has your experience been like in this role? Where did it prove an asset, and where was it a hindrance?
In 2005, I became a member of the House of Lords in the category of “people’s peers.” This process involved being nominated, and then, if shortlisted, being interviewed by a panel. It’s important that peers in this category should be “crossbenchers,” with no party affiliation. And, because they don’t take a party whip, they can vote as and when they wish, without obligation to respond to a party’s call. I was an opponent of Brexit, for example—and feel sadly vindicated when I see the mess we’re now in.
Most new peers enter via a different route: nomination by the Prime Minister or party leaders. Numerous such appointments in recent years have been criticized as rewards for donors, or cronyism: there’s a widespread view that reforms are needed. But membership remains a privilege, even though perhaps less of an honor.
I speak in the Lords on educational and scientific issues, and on some topics I care about—legalization of assisted dying, for instance. But I honestly haven’t been very active except on the select committees and “special inquiries.” I’ve tried to use the Lords, along with other commitments, to raise awareness of a cause I’ve devoted much of the last few years to campaigning for: we need to prioritize the prevention of catastrophic risks. These include not just slowly emergent catastrophes like global warming, but those stemming from misuse—by error or by design—of ever-more powerful cyber, AI, and biotechnologies. I also helped set up the Centre for the Study of Existential Risk (CSER) in Cambridge.
Are there fields of knowledge today that you feel are not getting sufficient attention, whether from scientists or the public?
As science advances, its frontier with the unknown becomes more extensive. Among the most exciting areas on the current frontier are synthetic biology and robotics. But I think we need to keep a focus on the challenge of properly nourishing the 9 billion people who will be on Earth in 2050. Doing this without encroaching on or despoiling natural habitats will require novel technology: genetically modified crops, artificial meat, and so forth.
But let me put in a plug for astronomy, the grandest of the environmental sciences. Thanks to improved instruments on the ground and in space, this subject is becoming broader. For instance, we know that most stars are orbited by retinues of planets. There are billions of planets in the Milky Way that could be abodes of life—but are they?
Astronomy is also a “fundamental” science: understanding the very beginning of our expanding universe will require advances in physics that may take us further from our intuitive concepts than quantum theory and relativity already do. We must be open to the possibility that these concepts are simply too deep for human brains to grasp, just as quantum theory lies beyond the grasp of a monkey.