Vinoth Ramachandra

“When I use a word,” Humpty Dumpty said in rather a scornful tone, “it means just what I choose it to mean–neither more nor less.”

“The question is,” said Alice, “whether you CAN make words mean so many different things.”

“The question is,” said Humpty Dumpty, “which is to be master- that’s all.”

– Lewis Carroll, Through the Looking Glass (1872)

I was reminded of this well-known exchange when a friend showed me the cover of this week’s Economist celebrating the spread of assisted dying legislation. Such legislation asserts the “right” of terminally ill patients to demand from doctors help in killing themselves; doctors then have a legal obligation to honour that right.

It changes the meaning of “medical care”, turning doctors into partners in intentional killing.

A foreigner observing current Western public culture is struck by the glaring contradictions on display. Individual autonomy or “choice” is elevated to an absolute status when issues such as abortion, assisted suicide or gender and sexuality are discussed. We are regarded as solitary monads whose lives and developmental capacities are self-generated and self-possessed, and the choices we make do not affect others. Hence the putative “right to take one’s own life” or the “right to design my baby” or “the right to decide my sexual identity”.

At the same time, some celebrity scientists and popular science journalists never tire of intoning that free will is an illusion, the “autonomous self” a myth. We are all at the mercy of our neurones or genes. In the oft-quoted words of the late Francis Crick, “‘You’, your joys and your sorrows, your memories and your ambitions, your sense of identity and free will, are in fact no more than the behaviour of a vast assembly of nerve cells and their associated molecules.” (The Astonishing Hypothesis, 1994). While smuggling in a naturalist metaphysics, neurological or genetic reductionism is proclaimed as the scientific acid that dissolves all metaphysics. Of course, the purveyors of such narratives exempt themselves from such determinism. They do not really believe it themselves; for if they did, they could not logically take credit for their discoveries. We would have to hand over their Nobel awards, royalties and salaries to their genes and brain structures.

However, both the rhetoric of “choice” and that of naturalist determinism stand side-by-side within the contemporary Western medical profession, and the contradiction is blithely ignored.

Moreover, the advertising industry spends billions of dollars devising clever videos, slogans and sound bites that manipulate the choices consumers make. And we have learned from feminism that in a patriarchal society women’s choices are rarely free and well-informed, but shaped by men’s expectations, definitions and decrees.

The toxic polarization that is called “culture wars” in North America is spreading globally via social media. Eschewing respectful dialogue and argument, both sides seek power over the other through legislation. After all, if something is legal, the general public assumes it must be right. Legal rights can even be proclaimed as if they were universal human rights.

Behind the talk of respecting “diversity” and “inclusiveness” in a growing number of Western countries, a selective filter operates. Sexual identities and practices are regarded as fluid and amoral, defended either on the grounds of “freedom of choice” or of “this is who I am”. Objections to men claiming to be women are dismissed as “transphobia”, but a white man presenting himself as a person of colour would be ridiculed, even prosecuted. Polyamorous relations are okay, but polygamy is not.

In both the US and UK, economic class remains the most powerful determinant of a child’s educational attainment or of their taking to street crime, but class resists all deconstruction. The rich and the poor within nations are physically and socially segregated. The poor and the disabled are largely invisible and inaudible. Social “inclusiveness” doesn’t often apply to them. (Pressure to abort disabled fetuses is the other side of the “inclusive society”.)

Further, even as talk of “inclusiveness” and “equality” reigns supreme within these nations, so does the barrier of nationality. Observe the way nationalistic mind-sets have been so apparent during the Covid pandemic and the COP26 summit.

So, suppose that I were to claim: “I have a right to a British or American identity because I speak English better than most of the inhabitants of these nations and my Sri Lankan identity was not chosen but imposed on me at birth.” Would such a rights-claim be recognized? Clearly not. But why not, if I’m as uncomfortable in my Sri Lankan identity even as another may be uncomfortable in her female body?

Postmodernism has indeed lent a voice to some humiliated and marginalized groups, and for that we should be grateful. But in its more extreme manifestations, it undermines every effort at global resistance to the status quo by replacing the transformation of the material conditions in which people live with language-correction, and by mocking notions of objective knowledge and moral truths that transcend culture and context.

COP26 has reminded us that for most of the world’s human and non-human inhabitants, a “right to acquire resources necessary to live” is more fundamental than an alleged “right to die in the way I choose”. Those European Christians who encouraged the former right were the architects of human rights charters, the builders of national welfare states and health services, and pioneers in palliative care and the Hospice movement. Human dignity and inter-dependence were not prised apart, but seen as mutually constitutive. Chipping away steadily at its (predominantly) Judaeo-Christian moral heritage has thus left late modern Western culture oscillating between a Greco-Roman fatalism and a naked, Nietzschean will-to-power.

Either way, Humpty Dumpty’s conclusion is a foretaste of things to come.

What has artificial/machine intelligence (AI) to do with environmental destruction and global warming?

What goes into the making of such systems?

The language of “cloud computing”, “virtual reality” and “cyberspace” has lulled us into thinking that the web and AI systems are floating in an ethereal, other-worldly sphere that is divorced from physical bodies and their natural environments.

In previous posts I have written about how AI is not as artificial or intelligent as many imagine it to be. It is part of an extractive late-modern capitalist economy that strip-mines the Internet for our personal data, just as strip-mining for coal and minerals was the foundation of early-modern capitalism, and then feeds this mass of data into devices that manipulate and manage us in ways more powerful than all earlier methods of surveillance and social control. Data is the new Capital. Also, as with the nineteenth-century robber barons, AI development is largely in the hands of a few hi-tech giants in the US and China, who wield concentrated, unaccountable power.

The giddy “hype” that attended the dawn of the Information Age has now been replaced by sober soul-searching among the more reflective practitioners in the field. With regard to AI, the racist, sexist and other biases inherent in training data sets and many algorithms have been exposed. Employees of Amazon, Google and Facebook have publicly complained about the dehumanizing nature of much work in these companies and, given their scale of operations, the difficulty of regulating them. They are asking the basic questions: Who is making these AI systems and why? What are the effects on the planet as well as on “ordinary” people’s lives?

One such prophetic voice is Kate Crawford’s new, deeply researched book Atlas of AI: Power, Politics and the Planetary Costs of Artificial Intelligence. “Exploitative forms of work exist at all stages of the AI pipeline,” Crawford notes, “from the mining sector…to the software side, where distributed workforces are paid pennies per microtask…. Workers do the repetitive tasks that backstop claims of AI magic – but they rarely receive credit for making the systems function.”

Her fascinating tour of the world of AI begins with a reminder that strip-mining is more than a metaphor for the plundering of our data on the Internet: it literally is what supports the development of AI. For instance, all our smart phones and laptop computers depend on lithium in their batteries, and lithium reserves will disappear within the next twenty years.

The “cloud” takes up a vast amount of land. The world’s largest data farm is in Langfang, China, and covers 6.3 million square feet, the equivalent of 110 football fields. The obsessive drive to collect ever-larger data sets in order to “train” machine learning algorithms means that the computing industry is carbon intensive and could account for 14 percent of all greenhouse emissions by 2040 – about half of the emissions of the entire transportation sector worldwide. Researchers from the University of Massachusetts Amherst calculated that the carbon emissions required to build and train a single natural language processing system were about five times the lifetime emissions of the average American car.
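As a rough sanity check on that last comparison, here is a minimal back-of-the-envelope calculation in Python. The two figures are my own assumptions, taken from the numbers most commonly cited from that UMass Amherst study (Strubell et al., 2019): roughly 626,000 lbs of CO2-equivalent to train one large language model with neural architecture search, against roughly 126,000 lbs for the full lifetime (manufacture plus fuel) of an average American car.

# A hedged sanity check, not a claim made in the post itself: the figures
# below are the numbers commonly cited from the UMass Amherst study
# (Strubell et al., 2019), used here only as illustrative assumptions.
NLP_TRAINING_LBS_CO2E = 626_000   # training one large NLP model, incl. architecture search
CAR_LIFETIME_LBS_CO2E = 126_000   # average American car over its lifetime, manufacture plus fuel

ratio = NLP_TRAINING_LBS_CO2E / CAR_LIFETIME_LBS_CO2E
print(f"Training emissions are roughly {ratio:.1f} times a car's lifetime emissions")
# Prints roughly 5.0, consistent with the "about five times" figure quoted above.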

For an interview with Kate Crawford on her book, see https://www.youtube.com/watch?v=hfGxh2_Jkds

So, we tend to forget that, like everything else humans do, our “virtual” communications are not ethereal but embedded in physical objects: power stations, data centres, undersea cables, overhead satellites, batteries and cooling systems. Inside every wind turbine, smart phone, medical scanner and electric car are minerals known as rare earths.

This small group of 17 elements is in extraordinary demand, but the supply is largely limited to China and Australia. Extracting rare earths is a difficult and dirty business, typically involving the use of sulphuric and hydrofluoric acids and the production of vast amounts of highly toxic waste. Gold is also commonly used in smartphones, primarily in connectors. But gold mining is a major cause of deforestation in the Peruvian Amazon. Extraction of gold from the earth also generates waste rich in cyanide and mercury, two highly toxic substances that can contaminate drinking water and fish, with serious implications for human health.

It is unlikely that AI systems or smart phones will feature on the agenda of next month’s COP26 conference on climate change.

It is not only the environmental costs of our device-dependence that we forget. Jaron Lanier, one of the pioneers of virtual reality, laments the fact that “People degrade themselves in order to make machines seem smart all the time.”

He observes: “We have repeatedly demonstrated our species’ bottomless ability to lower our standards to make information technology look good… The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates. When people are told that a computer is intelligent, they become prone to changing themselves in order to make the computer appear to work better, instead of demanding that the computer be changed to become more useful.  People already tend to defer to computers, blaming themselves when a digital device or online service is hard to use. Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can’t afford to respect our own designs so much.” (You Are Not a Gadget: A Manifesto, 2010)

There is nothing new in the way engineers take the most advanced machines of their day as models and analogies for human functioning. But there is a short (though calamitous) step from modelling to identification. We then imagine that machines which help us perform certain functions have those functions themselves. When we speak of “clocks telling the time”, what we mean is just that they enable us (conscious human persons) to tell the time. The philosopher Raymond Tallis refers to the “fallacy of the displaced epithet”. Walking sticks don’t actually walk, and running shoes don’t run. The same applies to “radar searching for aircraft”, “telescopes discovering black holes” or “smart phones remembering our appointments”: they do not literally search, discover or remember. If there were no conscious human persons using these prosthetic tools, these activities would not happen.

The lesson: pay attention to language, and ask questions about technology. For every benefit to some, who bears the costs?

The closing decade of the last millennium saw a flood of books announcing the dawn of a new “global village” as the speed of transportation and information flows compressed space and time and gathered the peoples of the world into one happy family under the arch of liberal democracy and global capitalism.

“The world is flat”, intoned Thomas Friedman, the New York Times guru, as he jet-hopped from one luxury hotel to another, canvassing the tastes of local cultural elites.

The collapse of the Berlin Wall on 9 November 1989 (11/9 in American style) was possibly the first live global media event. Glued to their TV sets, millions shared the emotions of a continent liberated from brutal oppression by a popular tsunami of non-military resistance. For political pundits like Francis Fukuyama, a RAND Corporation protégé who became an overnight celebrity on the US lecture circuit, the imploding of Soviet communism ushered in an age of international integration. In a now infamous comment, he proclaimed:

“What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such- that is, the end point of mankind’s ideological evolution and universalization of Western Liberal Democracy as the final form of human government.” (National Interest, Summer 1989)

Since the whole world (or the world that really mattered) had now embraced free-market capitalism and liberal democracy, ideological conflict was a thing of the past. Of course there would be those awkward “trouble-spots” around the world (the “Iraqs and Ruritanias”, as Fukuyama put it in a later essay) which refused to accept the New World Order, and critical intellectuals everywhere who still indulged in Canute-like gestures to fend off the tidal waves of change. But they could be consigned to the scrapheap of history.

For those of us consigned to live in the “Iraqs and Ruritanias”, the world looks rather different.

If decisions in Wall Street can affect the lives of those in the Hindu Kush mountains of Afghanistan, then decisions made in the latter can affect Wall Street. Americans woke up to the reality of the dark face of globalization on that fateful morning of 11 September 2001. The collapse of the Twin Towers was, unlike the collapse of the Berlin Wall, an act of unspeakable horror. We were again glued to our TV sets and laptop screens, voyeurs of suffering, exposed to endless re-runs of what was to become the defining “media moment” for the next decade.

9/11 was media-packaged as an epochal “Event”, not simply another day in the long history of human brutality and suffering.

It was immediately located within a mythic narrative of “unprovoked terror” and the loss of American innocence (“They hated us for what we are”). For multitudes ignorant of their own history, let alone the history of other nations, the new normal of random, meaningless terror stalking their streets and airspace was enough to support their government’s shredding of liberal democracy’s constitutional safeguards against the abuses of power. National Security became the new god. The Patriot Act, torture, and the dragnet surveillance of entire communities became acceptable overnight. Terror was to be fought with terror.

The subsequent military adventures by US and UK governments left over a million Iraqis (who had nothing at all to do with 9/11) dead and the region awash in advanced weapons that fell into the hands of new militias, of which ISIL was the most dangerous. Indeed, ISIL could be called George W. Bush’s baby: his post-9/11 “war on terror” was the perfect global recruiting programme for new waves of Islamist terrorists.

I wrote in a Blog post on 16 June 2013: “In the United States, the massive surveillance apparatus built up since 9/11 is the domestic companion of the overseas drone killings. It spells the degradation of the liberal state. Unaccountable government is at one end of that spectrum of degradation, and an unaccountable financial sector at the other. Iraq has already been forgotten by the American and British public, and so will Afghanistan. Bush and Blair have retreated to their private havens and lecture circuits, while the people of Iraq continue to suffer the aftermath of their destructive and illegal political actions.”

Globalization – not to mention global warming – has transformed our understanding of who our neighbour is. It is not spatial proximity that defines neighbourliness now. We are indeed our “brother’s keeper” (Genesis 4:9) in a way that was unimaginable in previous ages. In the words of Jonathan Sacks, the late Chief Rabbi of the United Kingdom, “The scope of our interconnectedness defines the radius of responsibility and concern.” (Jonathan Sacks, The Dignity of Difference, London and New York: Continuum, 2002)

Despite its overwhelming military superiority, far greater than any previous imperial power, the US has not won a single major conflict since 1945. It has sought to fight communism and terrorism by making alliances with corrupt, despotic regimes and even “outsourcing” its engagement to shady US companies and military contractors.

Is it naïve to believe that the $6 trillion poured into Afghanistan over the past twenty years could have been better spent addressing global hunger, healthcare and education? Poverty does not lead to terrorism. But terrorists prey on the fears and anger of the socially excluded.

Liberal political institutions are fragile, especially in countries where large numbers go hungry and employment depends on the whims of politicians. We forget that in the West such institutions took centuries to establish; and, as recent events have shown, although Western democracies are unlikely to turn into outright tyrannies, they can easily slip into illiberal forms of democracy as economies shrink and acts of local terrorism increase.

Moreover, many of the foundational ideas behind liberal democracy have largely Christian theological roots. And there is a body of historical evidence that those non-Western nations that have been most exposed to Protestant forms of Christianity have been more hospitable to liberal/constitutional democracy. As those historical roots are forgotten, and as moral sensibilities are overridden by Western governments in the interest of realpolitik and purely “market-driven” policies, promoting liberal democracy globally by means of aggressive top-down “regime change” is bound to fail. If the majority of people are indifferent to freedom or prefer a type of society in which it is subordinated to other values, no rules or legal procedures can prevent democratic versions of tyranny from sprouting.

I have a chapter in a book on Artificial Intelligence and Robotics that has been published this week. I hope many of you will read the entire book, since next to climate change this may well be the biggest global political challenge we face!

The United States, which created the Internet as a defence department research project, now considers cyberspace a “domain” or potential battlefield equal in importance to land, sea, air, and outer space.

The last few weeks have seen a number of crippling cyber-attacks on the websites of prominent corporations, banks and government agencies around the world. Many of these originate in the republics of the former Soviet Union and China. And, following revelations of irresponsible Israeli exports of the malicious Pegasus software, Amnesty has highlighted surveillance technology as the biggest threat to human rights everywhere and called for a moratorium on its use.

Not too long ago, if the government wanted to know your most intimate secrets, it could seek to follow you around the clock. But that is very expensive and difficult, and could practically be deployed only against very high-priority suspects. It could search your home, but only if it first obtained a judicial warrant based on specific probable cause. It could interrogate you, but only if it had probable cause to arrest, and even then, you could assert your right to remain silent.

The Edward Snowden revelations in 2013 showed how easily the security services can today go to the commercial services we all use (the Internet service providers, phone companies, and social networks) and obtain from them detailed information on your every phone call, web search, e-mail, online chat, or credit card purchase, as well as your physical location whenever you are carrying your cell phone. Speech-recognition algorithms can simultaneously listen in to millions of phone calls. Computer vision algorithms can simultaneously watch millions of CCTV cameras. And natural language processing algorithms can simultaneously read millions of emails.

This technology has made possible the dragnet surveillance of whole communities, and that is perhaps the biggest moral challenge. The Chinese are doing this with the Muslim Uighur community in the western provinces. But the Chinese government’s data-gathering and face-recognition technology are targeting the entire population. The aim is to collect as much information as possible about every company and citizen, store it in a centralized database, and assign a credit score to each that indicates how “trustworthy” they are. This is a draconian form of social discipline, designed to identify and punish human rights activists, political dissidents, and other so-called “anti-social elements” by denying them and their family members employment, housing, banking services, and other social benefits. This is to date the most comprehensive effort to implement B.F. Skinner’s infamous programme of human “behaviour modification” through a conditioning system of rewards and punishments.

The big cats of the Internet industry (Google, Amazon, Facebook) regulate us more subtly, often invisibly. They mine and store our personal data in staggering quantities, and use it to customize our searches and “decide” what we see or find on the Internet. When Wall Street puts a value on Facebook or Google it is not for the services they provide, but for the data they collect and its worth to advertisers, among others.

We are learning that there are hidden costs to all those “free” services offered by the Internet giants. Indeed, we are starting to realise that when the product costs nothing, often you are the product. These are companies with some of the largest profit margins on the planet. They are not giving their services away without getting a lot in return.

Every click of the mouse, every app we choose to open, sends information about ourselves to thousands of invisible advertisers and, often, government watchers. We are living in what Al Gore called the “stalker economy” and what Shoshana Zuboff, in her best-selling book, calls “surveillance capitalism”. And there is no firewall between commercial surveillance and governmental surveillance anymore.

So China is not the only country to be worried about. Anti-trust laws in the United States are impotent to rein in monopolies like the hi-tech giants that offer customers “free” services. They’ve even been allowed to enlarge their monopolies by buying up potential rivals. Google, for instance, increased its monopoly over video by buying YouTube, while Facebook bought out Instagram and WhatsApp. Yet we’ve seen that these “free” services hide the real costs to the customer, and so are anything but free.

The impact on print journalism is a cost. Our addictive devices are a cost. The narrowing of choice is a cost. Our loss of privacy is a cost. Snowden told Britain’s Guardian newspaper that he had leaked the details of the US National Security Agency’s surveillance programme to “protect basic liberties for people around the world.”

Can we seek to redeem surveillance, quite apart from the necessary legal controls such as the various Data Protection Acts? Many surveillance initiatives stem from valid social concerns: for example, health surveillance programmes, tracking human smuggling operations, identifying paedophile rings or reducing crime on the streets. The only way, it seems to me, to prevent the reinforcing of an individualist conception of “autonomy” or “privacy” is to recover a relational view of ourselves. Privacy must be grounded in an ethic of mutual trust and care.

Yes, I have an inviolable, subjective “core” of thoughts, feelings and practices that must be protected from coercion or manipulation; but since such thoughts, feelings and practices are socially formed, I must be open to their being questioned and criticized. And this applies to the surveillors as much as to the surveilled. Such an ethic must enable mutual transparency (within obvious limits), responsibility as well as liberty, and not generate either fear or complacency born of ignorance.

Given our corrupted humanity, powerful technologies tend to be used first for evil; and then when the powerful themselves suffer the fall-out, there are calls for their “regulation”. That is the best argument in support of the non-malicious, non-commercial “hacking” community. But who will save them from themselves, once they have tasted their own power to humble the mighty?

Following the brutal onslaught on Gaza in 2009, hailed by the Israeli government as a “military victory”, one of the wisest political voices in Israel, Uri Avnery, wrote: “What will be seared into the consciousness of the world will be the image of Israel as a blood-stained monster, ready at any moment to commit war crimes and not prepared to abide by any moral restraints. This will have severe consequences for our long-term future, our standing in the world, our chance of achieving peace and quiet.  In the end, this war is a crime against ourselves, too, a crime against the State of Israel.”

I have written so often on this Blog (particularly, 26 November 2014 and 09 April 2019) about the indifference of Western governments and publics towards the colonial aggression of the Israeli state that it depresses me to address this topic again, in the light of the recent events in Gaza.

What I will do is remind readers of 7 important facts which I hope they will share with others in their circles of influence.

(1) Israel is the last remaining European colonial power, which is why its aggression is not treated in the same way as that, say, of China or Iran. Whenever fresh violence breaks out, the US and the EU issue bland appeals for a ceasefire without attending to the obvious questions: “What event(s) triggered this violence?” “What can the international community do to prevent such events from recurring?” “How many Palestinians were killed, and how many Israelis?” (The latter question will reveal just how disproportionate Israel’s military responses are, violating all “just war” principles encoded in international law and rules of military engagement.)

(2) Israel is the only country in the world that does not have internationally recognized borders. Just compare a map of Israel in 1948 with a present map. Israel continues to flout international laws with impunity (for instance, erecting permanent structures on lands seized by invasion). It does so because it enjoys the diplomatic, ideological and military protection of the United States.

The military occupation of Palestine encroaches on every area of people’s lives: restrictions on travel, high youth unemployment, poor healthcare and educational facilities, forcible annexation of houses and land.

Monoethnic, autocratic regimes in many countries (such as my own) look to Israel’s example in how to deal with their own intransigent ethnic and religious minorities. Seize land, re-settle it with members of the majority community, protect the latter by sending in an occupation army, label all attacks on the new settlers as “terrorist” or “extremist” acts and use them to justify further acts of repression.

(3) Far more Jews live outside Israel than within it, and many are outspoken critics of the Zionist project. There are also courageous rabbis and human rights activists within Israel who are opposed to the abuses heaped on the Palestinian people by the Israeli army and right-wing Jewish colonists. So to be anti-Zionist is not to be anti-Jewish. Yet some Western media and politicians regularly confuse anti-Zionism with anti-Semitism. (The latter is also a misleading term, as most Jews who settled in Israel have European rather than Semitic ancestry. Arabs are also Semites, yet attacks on Arabs, who include many Christians, are labelled anti-Muslim!)

(4) While insisting that everybody recognizes Israel’s “right to exist”, Israel will never recognize the Palestinians’ “right of return”, let alone their right to liberation and self-determination. Israel’s settlement and development programmes in the occupied territories (all illegal, as Israel was informed in 1967 by its own highest legal authorities and as the World Court later affirmed) are designed to undermine the possibility of a viable Palestinian state.

(5) Israel is not a democracy by any modern understanding of that term. It officially declares itself to be a “Jewish state”. Its Arab citizens, though they carry Israeli passports, are treated as second-class citizens; and as for the indigenous Palestinians, they are a beleaguered and segregated people in their own land. So Israel is no more a democracy than South Africa was under apartheid.

(6) Theodor Herzl, the Austrian journalist often credited with the label “founder of the Zionist movement”, was rightly concerned that assimilation and sporadic persecution were destroying Jewish culture in Europe. The Jews needed a “home” where they could preserve their traditional way of life. Herzl was not thinking of Palestine as the Jewish “home”, for Judaism had for the previous two millennia reconfigured itself around the study of the Torah rather than the Land and Temple. He initially toyed with the idea of Uganda as a safe haven.

It was “dispensationalist” Christians in the US and UK, following the teachings of John Nelson Darby and Edward Irving, the Moody Bible Institute and (later) the Texan Cyrus Scofield’s Bible commentary, who influenced the Zionist movement and the British colonial authorities to settle the Jews in Palestine. Wrenching Old Testament texts out of their historical contexts, they taught that the return of Jews to Palestine was foretold in biblical prophecy and would usher in the parousia, or “return”, of Christ.

I tell my British and American Christian friends that they can never be part of the solution to the Palestinian crisis until they recognize that they have been a huge part of the problem. And fundamentalist Christian preachers in the so-called American Bible Belt continue to be the problem, as they refuse to accept any other reading of “biblical prophecy” and spread misleading Zionist historiography around the world through the Internet and TV channels. Benjamin Netanyahu is a great friend of these American fundamentalist preachers and visits them on his trips to the US.

Ignorance of history has to be countered with historical facts. Bad theology has to be challenged with good theology. The Christian theologians of Palestine have come up with a Kairos theological statement similar to the seminal Kairos Document of South Africa in 1985 that countered Afrikaner state theology and mobilized the Church against apartheid. I commend it to you.

(7) As for the United States, there is a huge gap between public opinion and foreign policy. In relation to Israel, U.S. policy since the Johnson era has been dictated by the hugely influential pro-Israel lobby (AIPAC) and the corporate-military sector. Many conscientious Americans oppose U.S. government policy, and younger members of the Democratic Party are far more outraged over Palestine than Biden and his generation and call for a halt to military and economic support to Israel. If the U.S. too were to become a properly functioning democracy, in which an informed public had a meaningful voice in policy formation, things might well change.

Boris Johnson, the UK Prime Minister, last month attributed the development of Covid-19 vaccines to “capitalism” and “greed”.

Although he was reported to have later backtracked, with some of his aides claiming that his comments were made in jest (quoting the film Wall Street), Johnson’s comment is typical of a widespread myth, propagated by conventional economics, that capitalist “innovation”, funded by visionary private investors, is largely responsible for the scientific and technological progress on which our health and prosperity rest.

Johnson seems to have forgotten that his own government promised pharmaceutical companies that it would underwrite the risks attendant on vaccine development, and used public funds to place huge advance orders for Covid-19 vaccines. Thus the normal risks associated with vaccine development were almost completely removed from investors.

That’s how Big Business capitalism, typified by the pharmaceutical industry and the Internet giants, operates today. Capitalism preys on public funds and public trust. Corporations walk away with the profits, while the public bears the costs.

In a trenchant critique of typical fantasies about capitalism, David Whyte of Liverpool University points out that, prior to the current pandemic, vaccine development was extremely sluggish because previous viruses did not threaten rich nations’ economies. Earlier coronavirus diseases, Sars and Mers, had no vaccine. The Ebola vaccine was finally approved in 2019, sixteen years after it was first patented and a full six years after the start of the epidemic in West Africa, though the costs of Ebola to these countries were estimated at more than US$50b.

Whyte concludes that “There can be little doubt that racial capitalism and global economics has shaped our response to this virus… Most advanced economies stand to lose at least 4.5% of GDP as a result of this pandemic. So we needed COVID-19 vaccines to save these economies.”

He also reminds us that the “infrastructure that produced the COVID-19 vaccines was nurtured in publicly funded universities, in public institutes and in heavily subsidised private labs.” This is knowledge that is held in common. Universities “provide trained scientists and a foundation of knowledge that emerges over hundreds of years. It is in universities that the rules for clinical research are developed, and it is university researchers who publish results in academic journals which provide that knowledge foundation.” However, in the current economic models, such knowledge production counts “as an ‘externality’ that never shows up on a corporate balance sheet, because corporations never have to pay for them.”

Thus, even though the scientific research community is a global one, scientific priorities are skewed by rich nations’ interests.

Furthermore, many of the researchers at AstraZeneca, Pfizer and universities such as Oxford were born and educated at local taxpayers’ expense in the “developing” world. This is also what makes the current gross imbalance in vaccine distribution so unfair.

More than a year into the pandemic, three-quarters of the current vaccine supply has been secured and administered by 10 countries that account for 60 percent of global economic growth, while about 130 countries (home to 2.5 billion people) have not received a single dose. COVAX, the global initiative to coordinate the distribution of COVID-19 vaccines in an equitable way, has fallen far short of its aim to deliver 100 million doses by the end of March.

Brazil has been devastated by Covid-19, with infection rates second only to those of the USA. Yet its local pharmaceutical industry is hindered from manufacturing and distributing vaccines owing to patents held by the US and UK industry giants.

In October 2020, South Africa and India called on the World Trade Organization to suspend its agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) for the duration of the coronavirus pandemic. This would facilitate the transfer of technology and scientific know-how to developing countries to bolster global production. (The Developing Countries Vaccine Manufacturers Network, which includes the Serum Institute of India, the world’s largest vaccine maker, has been supplying some 3.5 billion vaccine doses to the world annually.)

However, several high-income countries (including the US, UK and many EU members) and pharmaceutical companies have rejected the idea of a waiver, claiming that it would deter private investment and hamper further innovation.  

This is to ignore the fact that vaccine developers received about $10bn in public and non-profit funding for their vaccine candidates, with the five top companies securing between $950m and $2.1bn in funding commitments, mostly from the Coalition for Epidemic Preparedness Innovations (CEPI) and the US government, as reported by the prestigious Lancet medical journal.

A group of more than 170 former world leaders and Nobel laureates has urged United States President Joe Biden to support the South African and Indian proposal, demanding the World Trade Organization (WTO) temporarily waive COVID-19 vaccine patents so that vaccine know-how and technology can be shared openly with all.

Christian theology has long held that the right to life trumps the right to private property. If I have food or life-saving drugs in my home that I don’t need for my survival, yet my poor neighbour is starving or seriously ill, then his breaking into my home to take what he needs for his survival is not an act of theft. Rather, it is I who am guilty of theft by withholding it from him.

Here is one representative quote from one of the Early Church Fathers, Basil of Caesarea (c.329 CE- c.379 CE):

“Will not one be called a thief who steals the garment of one already clothed, and is one deserving of any other title who will not clothe the naked if he is able to do so? That bread which you keep belongs to the hungry; that coat which you preserve in your wardrobe, to the naked; those shoes which are rotting in your possession, to the shoeless; that gold which you have hidden in the ground, to the needy. Wherefore, as often as you were able to help others, and refused, so often did you do them wrong.” (For more such arguments, see my Gods That Fail, Ch.4 or Subverting Global Myths, Ch.3)

Every March, the United Nations Human Rights Council meets in Geneva and, among its other business, passes resolutions calling on the government of Sri Lanka to implement mechanisms to ensure greater political accountability and respect for human rights. The government, in turn, protests with cliches about “national sovereignty”, promises to comply, and repeatedly fails to honour its promises.

Much of the pressure (but not all) on the UNHRC stems from militant members of the Sri Lankan Tamil ethnic diaspora in the West. Their exclusive concern seems to be bringing the President and military generals before the international criminal court to face charges of war crimes, especially during the final days of the war (May 2009- incidentally, when my Blog was birthed). If they were genuinely concerned about justice, rather than vengeance, they should also seek the prosecution of those who had financially supported the Tamil Tiger guerrillas during the protracted conflict. For the Tamil Tigers violated all rules of military engagement in using non-combatants as human shields and engaging in suicide bombings, political assassinations and the conscription of children. Many of the Tamils who fled as refugees to the West were fleeing not only the brutality of the army but also that of the Tamil Tigers. Indeed, the de facto government in the north of the island that the latter set up during the last decade of the conflict was more oppressive than what was experienced in the south.

For those of us who chose to stay in Sri Lanka during those harrowing decades of bloody conflict, having to combat the hypocrisy and double standards on both sides was as depressing as challenging the simplistic view propagated by Western media who reduced it to an “ethnic conflict”, ignoring all the complexities. There were rich Tamils who, while selling or renting their mansions in Colombo to foreign embassies or international companies, sought political asylum in the US, UK or Australia claiming to suffer economic discrimination. They hailed the Tamil Tigers as “our boys and girls”, while sending their own boys and girls to elite schools and universities in the West. As so often happens in such conflicts, the people least affected are the ones who determine outcomes.

In my experience, it is rare to find, among Asian diasporas in the West, advocates for human rights or economic justice, unless, of course, the victims happen to be from their own ethnic community. Many well-to-do Sri Lankans in the UK, both Sinhalese and Tamils, speak in disparaging ways of Afro-Caribbean Britons and East European migrant workers. They voted for Brexit, forgetting their own recent history.

This is why I view with some scepticism the righteous anger on the part of “Asians” in the USA as they become targets of white violence and hate-speech. Of course these acts have to be condemned publicly and unequivocally. But, when hundreds of black churches were burned every year by white supremacists, where were the Asian-Americans who protested in solidarity with their black brethren? Going further back, how many marched with Rev Martin Luther King and supported the Civil Rights Movement?

Tragically, the Asian-American ecclesiastical landscape is characterised by segregation and ethnocentrism, even though its churches may be predominantly English-speaking; and parents are typically horrified if their children choose to marry a dark-skinned person.

I have often stated on this Blog that India and China are probably the most racist countries in the world (although “race” is not a recognized socio-political category in either). Just look at the movies, TV advertisements, billboards, news anchors, quiz shows and pop stars, and see if you can identify a single dark face that is not that of a villain! And, if you still doubt me, ask any African student in, or visitor to, these countries.

In a recent op-ed in the New York Times, David Brooks interviewed a Christian theology professor on the distinctive Christian perspective on social justice. As the professor observed, there are rich resources which Christian theology and historical experience bring to the issue. When racism is seen as sin, and not just a social or political problem, we deal with it the way we do all sin: acknowledging it, confessing it, seeking forgiveness, changing direction (repentance) and (wherever possible) making restitution to those affected. 

I would add that we are all, in the Biblical light, both sinners and sinned-against, albeit to varying degrees.  This induces in us a certain self-critical humility: to address the sins in our own community even as we protest the sins committed against us.

Moreover, while social and political transformation involves systemic and institutional change, without a robust doctrine of the intrinsic and equal worth of every human person, coupled with a deeply relational view of such persons, all protest movements for change generate their own victims and new forms of oppression. This is happening with the “identity politics” and “culture wars” in North America which disfigure the contours of social justice.

If only we could isolate evil people like we do Covid-19 patients and inject them with drugs and vaccines! But, as the Russian Christian dissident and Nobel prize-winning author Aleksandr Solzhenitsyn reminded us, “The line dividing good and evil cuts through the heart of every human being.” (The Gulag Archipelago)

This year the global Church celebrates the birth centenary of one of its greatest twentieth-century leaders, John Stott (1921-2011).

Although I heard him as a speaker and read some of his books during my undergraduate years in London, it was only in the final year of my postgraduate study that I got to know him personally, when he invited me to join the reading group that met quarterly in his flat. One of my vivid memories of that group was going to watch a film (the title eludes me) by the renowned Swedish existentialist film-maker Ingmar Bergman. Stott was so deeply moved by the film that he insisted on taking us all to a nearby church, where he knelt before the Lord’s Table and poured out his soul in contrition over all his flawed relationships.

It is such integrity and vulnerability that leave an indelible impression on young people’s minds. And it is the memory of Stott’s character, far more than his books or preaching, that I recall whenever I grow discouraged by the hypocrisies or arrogance of so many in leadership positions today.

Much of Stott’s “British public school theology” was challenged by his visits to the non-Western world and his friendships with non-Western Christian leaders. He actually listened to us, unlike so many others who only came to propagate their views and to “train” us. Commitment to the poor, and a growing engagement with social and political ethics, came to the fore in his later writings, much to the consternation of his conservative friends. His eclecticism and willingness to engage in dialogue with Roman Catholics alienated him from many in his own country who believed that there was nothing they could learn from others in the global Body of Christ.

When Stott invited me to give the London Lectures in Contemporary Christianity of 1998 (lectures which eventually became the book Faiths in Conflict? Christian Integrity in a Multicultural World), he took me out to dinner to explain the aim of the lectures and urged me to “Please help us evangelical Christians to see our blind-spots.” Here was a 77-year-old man desiring to be taught by an obscure non-Westerner roughly half his age and with, hitherto, only two books to his credit! I was amazed. I have not met any other leader, before or since, who has expressed to me such a desire.

Stott shunned all adulation and the near-idolization that many heaped on him, not least in the USA. While continuing to hold him in great respect, I found there were, of course, aspects of his theology with which I disagreed. Some of these are common to the Western evangelical culture he inhabited, such as being too rationalistic in his reading of the Bible and a tendency to treat the apostle Paul almost as a “second incarnation”. His exposure to the Eastern Church Fathers and the best of the monastic tradition in the West was severely limited.

In one of his most important books, The Contemporary Christian, Stott called for a “double refusal” on the part of the Church. Both Escapism and Conformity should be replaced by a posture of “double listening”: listening both to the Word and to the World. This was central to the development of a Christian Mind. He wrote: “We listen to the Word with humble reverence, anxious to understand it, and resolved to believe and obey what we come to understand. We listen to the world with critical alertness, anxious to understand it too, and resolved not necessarily to believe and obey it, but to sympathise with it and to seek grace to discover how the gospel relates to it.” (John Stott, The Contemporary Christian: An Urgent Plea for Double Listening, pp. 27-29)

This is well said; but it does not go far enough. For, surely, the aim of our listening to the world is not only to find relevant ways of communicating the gospel to that world but also to learn from the world (or, more accurately, from God’s actions in the world) a fuller and deeper understanding of that gospel itself. The apostle Peter’s listening to Cornelius relating his personal journey (Acts 10 & 11) would be a paradigm example from the early Church. What is happening here is a “double conversion”: Cornelius to Christ and Peter to a deeper understanding of Christ.

As the Church historian Andrew Walls famously put it: “It is as though Christ himself actually grows through the work of mission… As he enters new areas of thought and life, he fills the picture. It is surely right to see the process as being repeated in subsequent transmission of the faith across cultural lines.” (Andrew F. Walls, The Missionary Movement in Christian History: Studies in the Transmission of Faith, p.xvii)

Listening to the world also involves more than reading influential secular texts. It includes deep personal encounters with men and women outside the Church and also being plunged into the pain, confusion and creativity of all humanity. This is where the hermeneutics of “double listening” must lead. And the development of a “Christian mind” cannot occur by leap-frogging the rich Christian intellectual traditions that have emerged in the world Church through prolonged conversation with all other human intellectual enquiries and a faithful immersion in wider human communities. So, it is not simply a matter of “the Word and the World” but “the Word in its long engagement with the World”.

So, in this centenary year, even as we give thanks to God for such a remarkable servant of the Church, we should neither pay mere lip-service to John Stott’s legacy nor idolize him. Perhaps the best way to honour him would be to imitate his integrity and teachability.

While the world breathes a collective sigh of relief at the departure of Donald Trump and his acolytes, celebrations may well be premature.

The USA remains a deeply polarized society, and the influence of the internet has exacerbated similar polarizations- moral, economic, political- all over the world.

Of the wide variety of people who voted for Trump, despite all his incompetence, blatant lies and narcissistic rants, the only ones with whom I can sympathise to some degree are those rural and urban working-class Americans who looked to him, both in 2016 and now, as one standing outside the conventional political system, whether Democrat or Republican, that had largely ignored their fears and concerns in recent decades. These are people who feel impotent, irrelevant, obsolete.

Of course Trump shamelessly stoked racist, misogynist, and xenophobic sentiments at every opportunity. But in 2016, and again last year, Trump offered hope to those who had been left jobless by the global flow of capital and who also wanted him to bring back their children from fruitless wars overseas. He promised to rebuild American industry and to continue his hard line on China. The latter policy was the only promise Trump honoured, and it won support from many Asian countries, as well as from Asian-Americans living in the US who are rightly angered by China’s repressive political regime.

But, at the same time, his choice for Labour Secretary, Andrew Puzder, was the boss of several big fast food companies and a fan of automated customer services: “They’re always polite, they always upsell, they never take a vacation, they never show up late, there’s never a slip-and-fall, or an age-, sex-, or race-discrimination case,” he is reported to have said soon after his nomination. Further, by some estimates nearly half of current jobs in the U.S. will be automated by 2033. Not only will 3D printers eliminate jobs in manufacturing, but “truck drivers” (the most common job in some American states) will also become obsolete in the age of driverless cars and trucks.

Those who drive this relentless surge to automate everything are the hi-tech giants and the owners of Big Business. They are found among Democrats as well as Republicans; and Joe Biden’s almost single-minded focus on Covid in his election campaign did little to reassure voters that he understood their desperate economic future. Apart from Bernie Sanders, no one seemed able to acknowledge how so-called “neoliberal” economic ideology had subverted liberal democracy and played into the hands of “far-right nationalists” – something also seen in Europe and parts of Asia.

Many political liberals are, along with rich conservatives, part of the ruling elite that treat poorer and less-educated folk with supercilious disdain. As for the latter’s moral and religious concerns, these too are summarily dismissed as antiquated and regressive, without public debate. They are jeered at by East Coast comedians and are often the butt of ridicule in Hollywood movies.

It is impossible, of course, to argue with conspiracy theorists and those who move only within their limited circles and see every issue in black-or-white terms. The latter include militant atheists as well as the militantly religious, the highly educated as well as the non-literate. One can only argue with those who believe in argument, reason with those who respect reason. But there are plenty of such men and women across the political and moral/religious divides in all nations. They may well be the “silent majority”; but, if so, their silence and neglect of genuine dialogue among themselves have ushered extremists on to the centre of the political stage.

About ten years ago, I spoke at an American university on the theme of Justice. I told my audience that the three most important contemporary justice challenges that American society faced were (a) the massive wealth inequalities (which translate into power inequalities); (b) global warming and climate change (which affect millions of people who are not responsible for greenhouse emissions); and (c) protecting the lives of foetal human children who are the most voiceless and vulnerable persons in our human community.

Many students told me afterwards that they had heard “conservative” speakers address the third topic, and “progressive” speakers address the first two, but that they had never heard anybody bring all three issues together in a single talk on justice. I found this both revealing and disheartening. It expresses the utterly unintelligible polarizations in American society.

And, far more tragically for me, it mirrors the same polarizations in the American church today, a church that while failing to live as a prophetic counter-culture within its own context, exports its divisions and prejudices to the rest of the world. Can the American church be an agent of reconciliation? Only if it practices humble repentance and the willingness to listen to, and learn from, others.

This gloomy year comes to a close with a glimmer of light in the form of remarkable vaccines developed and coming on board at an unprecedented rate. These vaccines are safe and offer hope to many.

However, there are serious questions about who will have access to them, and how soon; and lurking behind all this is the all-important question of whether the exclusive pursuit of “technological fixes”, apart from giving rise to new sets of problems, can ever be a substitute for addressing the deeper moral, ecological and political challenges which the world has been ignoring and which have exacerbated Covid-19.

As the global population waits for vaccines to become available, the human costs are mounting, in lives lost, long-term disability, economic collapse, children dropping out of schools, and lost livelihoods. Further, the WHO has repeatedly warned that several viruses likely to cause pandemics similar to the one we have been experiencing this year are on the horizon, unless we take preventive measures.

In the early months of the pandemic I pointed out that the rapid spread of Covid-19 was a result of our global interconnectedness combined with deteriorating global cooperation; and that it exposed the growing health and economic disparities within nations, with the people whom we typically ignore (because they are invisible to us in “normal” times) at the forefront of caring for the victims and helping the rest of us manage the effects of lockdowns. Once mass immunizations spread, and the threat of Covid-19 recedes, there should not be any return to such “business-as-usual”, whether within or between nations. It has to be a wake-up call to political, business and intellectual leaders.

The Intergovernmental Platform on Biodiversity and Ecosystem Services Workshop (27-31 July 2020, held virtually) warned that an estimated 1.7 million currently undiscovered viruses are thought to exist in mammal and avian hosts. Of these, 631,000-827,000 could have the ability to infect humans. Five new diseases emerge in people every year, any one of which has the potential to spread and become global.

The underlying causes of pandemics are the same global environmental changes that drive biodiversity loss and anthropogenic climate change.

The Report of the workshop highlighted several drivers of pandemic risk. Pandemics have their origins in diverse microbes carried by animal reservoirs, but their emergence is entirely driven by human activities. These include agricultural expansion and intensification, and wildlife trade and consumption. These bring wildlife, livestock, and people into closer contact, allowing animal microbes to move into people and lead to infections, sometimes outbreaks, and more rarely true pandemics that spread through road networks, urban slums and global travel. Land-use change is a significant driver of pandemics and includes deforestation, human settlement in primarily wildlife habitats, the growth of cash crops and livestock production, and urban sprawl.

Earlier this year I read of poor people in Kenya, whose hunger has worsened because of lockdown measures, resorting to eating giraffe meat and that of other endangered species.

However, it is our unsustainable global consumption habits, driven by demand in developed countries and emerging economies, as well as by demographic pressure, that have to change.

“The business-as-usual approach to pandemics is based on containment and control after a disease has emerged and relies primarily on reductionist approaches to vaccine and therapeutic development rather than on reducing the drivers of pandemic risk to prevent them before they emerge”, states the Report.

Scientific and economic analysis warns us that unless we make transformative changes in our taken-for-granted “lifestyles”, the costs of climate change coupled with more regular pandemics will prove disastrous for the entire human race. Notwithstanding technological breakthroughs, this will be a century of crises, many of them more dangerous than the one we are currently experiencing.

We now know what it’s like to have a full-on global-scale crisis, one that disrupts everything. The world has come to feel different, with every assumption about safety and predictability turned on its head.

How can faith, “seeking understanding” as always, direct our walk into the darkness of the future? The moral theologian Oliver O’Donovan raises this question and answers it in terms of Christian hope: “No act of ours can be a condition for the coming of God’s Kingdom. God’s Kingdom, on the contrary, is the condition for our acting; it underwrites the intelligibility of our purposes.” (Self, World, and Time, Vol.1, 2013)

And here is the novelist Marilynne Robinson, a sane public voice in the midst of religious and secularist obscurantism: “[B]y nature we participate in eternal things – justice, truth, compassion, love. We have a vision of these things we have not arrived at by reason, have rarely learned from experience, have not found in history. We feel the lack. Hope leads us toward them.” (“Considering the Theological Virtues”, What Are We Doing Here?, 2018)

This is not a time for nostalgia and myths of national sovereignty. If we ever needed globally-minded statesmen and stateswomen, as opposed to mere politicians, it is now.
