Vinoth Ramachandra

Archive for January 2011

With “whistle-blowing” once again the subject of legal and political harangues, surely it is only a matter of time before WikiLeaks and Julian Assange make it to Hollywood. The finest example from Hollywood of the genre of thriller-journalism is Michael Mann’s The Insider (1999), starring Al Pacino and Russell Crowe. More cleverly crafted than All the President’s Men (about Watergate), The Insider wrestles with the age-old question, going back perhaps to Plato’s Republic, of whether truth matters more than personal happiness and social harmony. It is these kinds of questions that the greatest fiction and the best movies raise for us; and, interestingly, the best movies of this kind are often based on real-life stories.

The Insider is about the long-running battle in the 1990s between various US state legislatures and the giant tobacco companies. Jeffrey Wigand (Crowe) is head of research at Brown & Williamson’s laboratories. Highly paid and at the peak of his career, Wigand is nevertheless troubled by the information he possesses. Not only is nicotine addictive (which the CEOs of seven cigarette companies, including his own, had denied under oath before Congress), but Wigand knows that additives are used to make it more addictive and that one of those additives is a known carcinogen. He is sacked for expressing his dissent over this practice.

Wigand finds himself coaxed into telling his story to Lowell Bergman (Pacino), a producer for “60 Minutes”, the CBS News program. He has, however, signed a confidentiality agreement with B&W, and Bergman somehow has to get around that promise if the truth is to be revealed. When B&W’s “gag order” on Wigand fails to dissuade him from giving evidence in a Mississippi legal deposition against the tobacco industry, the company resorts to the now familiar tactics of smear campaigns, police bribery and death threats against his family.

Halfway through the movie, there is a fascinating shift of focus from Wigand’s moral/legal struggle with breaching his confidentiality agreement to the ethical dilemmas of journalism. Bergman is caught up in a battle with his own bosses. CBS executives are afraid to air the “60 Minutes” interview with Wigand because they are threatened with a lawsuit by B&W which could destroy the network. Here emerge issues deeper than the lies of the tobacco industry. Should the truth be told even if it means destroying one’s own media company? As Mike Wallace, the celebrated interviewer of Wigand, puts it to Bergman (my own paraphrase): “I can retire after a lifetime of fame. But infamy lasts longer. I don’t want to be known as the man who brought down CBS.”

Pacino plays the journalist hero beautifully: stubborn, hoarse-voiced, faithfully running with his source, unrelenting in his manipulation of other networks to force his own organization to broadcast the interview with Wigand. “60 Minutes” did eventually find a way of running the story, after delays and soul-searching. We are told at the end of the film that Wigand’s exposé of the B&W laboratories led to a US $246 billion settlement of suits brought against the tobacco industry in all 50 American states.

How accurate the film is in its portrayal of Bergman, Wigand, CBS and others is beyond my competence to judge. But, in any case, I don’t watch movies to learn facts. What I enjoy about the film is its strong story and memorable characters. Like all such good movies, it has the ability not only to entertain but to rouse emotion about issues that matter in every society and in every age.

It has become fashionable in some academic circles (misleadingly dubbed “postmodernist”) to play down the notion of “truth”, even to suggest that it is but one language-game that vies with others for social power. Yet the judicial system, as well as most academic disciplines, presupposes the fundamental role of truthful testimony. And, despite all the self-serving rhetoric about “don’t rock the boat” and “be realistic” (a phrase regularly thrown at Bergman by his bosses), most of us still admire the martyr: that is, the man or woman who is willing to risk imprisonment, financial ruin, and ultimately death itself for standing up for the “truth”. Furthermore, in an age that tends to reduce all moral language to talk about “personal preferences”, why is it that the moral character of a witness, whether in a law-court or the laboratory, is still considered absolutely vital to the pursuit of “truth”?

Another interesting point to discuss in groups: will socialized medicine and education encourage more “whistle-blowers” to expose cover-ups that are in the public interest? Wigand’s personal struggle had as much to do with fear of losing medical benefits (he had an asthmatic daughter) as it had to do with his conscience.

There are also sinister political connotations. Whether in the West or East, governments try to persuade us that “national security” or “social harmony” decrees that religious communities surrender their claims to truth and higher loyalties. This is to accept definitions of “security” and “harmony” framed by those who happen to hold the reins of power. The cost of giving up talk about truth is to let power and violence have the last word. That would be the death of all civilization.

Finally, back to WikiLeaks. It would be tragic if the recently leaked US diplomatic cables (mostly uninteresting gossip) distracted global attention from the plight of the American soldier, Bradley Manning, whose release to WikiLeaks of videos and documents showing alleged war crimes in Iraq led to his imprisonment last April. It is these latter charges that American citizens, particularly Christians and others who care about truth, should be calling on their courts to investigate.

One of the characters in the film The Social Network delivers what is intended as a generation-defining line: “We lived on farms, then we lived in cities, and now we’re gonna live on the Internet.” A scary thought.

What kind of life is it that is lived on the Internet? In my post of 31 October 2009 (“Becoming Faceless?”) I sketched how we are changed as persons by the technologies we use, and also gave my reasons for being a recluse in the world of social media. Most of my colleagues in the organization I work with are more “plugged in” than I am. They have strange cords and gadgets semi-permanently attached to their ears, and many of their waking hours are spent on Facebook, Twitter, Skype and e-mail. With the click of a button they can send off comments and replies to a large and anonymous audience. But I have ceased to expect a thoughtful, considered reply to any of my own e-mails. It may be that my letters disappear under the sheer mass of “information” with which my colleagues are inundated. Or, as is more likely, they simply do not have the time to switch off and think before they click the reply button. This raises an important question: how is human communication suffering as a result of the widespread use of the new communication media?

If anybody is tempted to dismiss my comments as the nostalgic rants of a Luddite, let him read Jaron Lanier’s recent book You Are Not a Gadget: A Manifesto (Penguin, 2010). Lanier is no technophobe or ignoramus. One of the pioneers of virtual reality (indeed, he was the one who coined the term “virtual reality”), Lanier belongs to that rare breed of engineers who reflect philosophically on their work. He is cynical about the reductionist tendencies prevalent in the field of computer science (for example, reducing thinking to “information processing” and prostrating oneself before machines). He points out that every software program embodies a personal philosophy. “It is impossible to work with information technology without also engaging in social engineering.”

The slightest change in something as seemingly trivial as the ease of use of a button can sometimes completely alter behaviour patterns. For instance, Stanford University researcher Jeremy Bailenson has demonstrated that changing the height of one’s avatar in immersive virtual reality transforms self-esteem and social self-perception.

Lanier points out that “anti-human rhetoric” abounds in the world of computing. Kevin Kelly, founder of Wired magazine, has stated that we don’t need authors anymore, since all the ideas of the world, all the fragments that used to be assembled into coherent books by identifiable authors, can be combined into one single, global book. “People degrade themselves in order to make machines seem smart all the time,” writes Lanier. “Before the crash, bankers believed in supposedly intelligent algorithms that could calculate credit risks before making bad loans. We ask teachers to teach to standardized tests so a student will look good to an algorithm… The attribution of intelligence to machines, crowds of fragments, or other nerd deities obscures more than it illuminates… Treating computers as intelligent, autonomous entities ends up standing the process of engineering on its head. We can’t afford to respect our own designs so much.”

His most scathing comments are directed at the developers of what has come to be called Web 2.0. “It breaks my heart when I talk to energized young people who idolize the icons of the new digital ideology, like Facebook, Twitter, Wikipedia, and free/open/Creative Commons mashups.” In the preface to the book he states: “You have to be somebody before you can share yourself.” But for Mark Zuckerberg, sharing your choices with everybody (and doing whatever they do) is being somebody. When a human being becomes a set of data on a website like Facebook, he or she shrinks. We are squeezed into “multiple-choice identities”. Wikipedia obliterates context and personal perspective, without which information can be dangerously misleading. Yet research any topic on an Internet search engine, and the first site you will be referred to is Wikipedia.

Recent political and legal debates in the US over the WikiLeaks “revelations” have perhaps obscured other serious threats to freedom: those posed by the advertising industry and credit card companies that can buy and manipulate personal data for private profit. In an article entitled ‘Generation Why?’ the novelist Zadie Smith, who teaches English at Harvard and is only a decade or so older than Zuckerberg, complains that “our denuded networked selves don’t look more free, they just look more owned.” For her, Facebook reflects a sad reality: “500 million sentient people entrapped in the recent careless thoughts of a Harvard sophomore with a Harvard sophomore’s preoccupations. What is your relationship status? (Choose one. There can be only one answer. People need to know.) Do you have a ‘life’? (Prove it. Post pictures). Do you like the right sort of things? (Make a list. Things to like will include: movies, music, books, and television, but not architecture, ideas and plants.)”

Given the huge numbers of Christians involved in the world of computers and information technology, especially in India, South Korea and the USA, why is there so little critical and theological reflection of this nature emerging in our churches and seminaries? Christians, of all people, should be profoundly interested in communication, given that the self-communication of God in human flesh is at the heart of the Gospel. Why has it been left to secular humanists and others to articulate the prophetic insights that we desperately need in our technology-driven environment?
