
Sat Mag

Towards a new realism, or why American interventionism fails


By Uditha Devapriya

In hindsight the interventionists got it wrong: they relied too much on the middle classes, on civil society, and on the potential for permanent democracy in the countries they intervened in. After four years of entertaining doubts about interventionism, Condoleezza Rice argued, “A rising middle class also creates new centres of social power for political movements.” Today the middle class in the Third World continues to grow and civil society continues to flourish. Yet democracy has not, apart from a roller-coaster of regime change which never seems to end and always produces electoral backlashes. How come?

Charles Krauthammer called the US a “commercial republic.” If it is a commercial republic, it probably is the only one operating in a political system run by an oligarchy, maintained by an upper middle-class, tolerated by a lower middle-class, and suffered by a working class. In this it surpasses not only Athens, but also the European Powers.

What Clinton humanitarians and Bush realists failed to realise was that the rest of the world, barring Western Europe and South-East Asia, does not fit this description: we are not only averse to democracy maintained by a middle class, we don’t buy it either. Joe Biden may write that “[t]he world does not organize itself” and that for 70 years the US “played a leading role in… animating the institutions that guide relations among nations.” Yet those 70 years have ended in the present. Now is not 70 years back. Now is now. The world does not let one country, let alone one superpower, organise everything. The world does not wait.

In trying to intervene and impose democracy, America strove to create an order in its image. “We’ll win hearts and minds”, ran the refrain when US forces pulled down Saddam Hussein’s statue in Firdos Square in April 2003. On the eve of the invasion a Congresswoman went on television and declared, “We’ll go in there, take out Saddam, destroy his army with clean surgical strikes, and everyone will think it’s great.” A year earlier, Dick Cheney had made his case for regime change quoting Washington’s favourite Arab expert Fouad Ajami, who apparently believed Iraq would “erupt in joy in the same way the throngs in Kabul greeted the Americans.” But the throngs hadn’t greeted the Americans in Kabul, and they weren’t going to do so in Basra. What resulted from these interventions was an aberration.

US foreign policy has always been guided by national interest. Only once did it enjoy the status of a sole superpower, and that was between 1991 and 2001. Two camps surveying the world came up with two conclusions about the role of the US in the world order then: the end of history theorists and the clash of civilisations theorists.

The fall of the Soviet Union, which was less the work of Reagan Republicans and Thatcherite Conservatives than of Brezhnev’s entrenchment of the Nomenklatura and its contradictions with a supposedly egalitarian Communist state, was followed by ethno-nationalist-religious uprisings outside the West. Yet in the aftermath of Soviet collapse, the US gained supremacy. There was no one to challenge it. Not even Islamic fundamentalism.

The reaction in America was, as Charles Krauthammer put it, first of confusion, then of awe. Even so, not all political pundits believed that America should take over the running of the world: the isolationists, or the paleoconservatives as they would later be called, wanted out. But this was not the time of the paleoconservatives: that would come a quarter century later with the Tea Party Movement, Steve Bannon, and Donald Trump.

In the meantime, civilisation-states pressed their struggles for regional hegemony against one another: Japanese and Chinese in East Asia, Buddhists and Hindus (or rather Sinhalese and Tamils: the clash of civilisations theorists always get South Asia wrong) in Sri Lanka. Yet in the absence of a counterweight, the US attempted to carpet the world with the kind of order it had wanted to impose for over half a century. What resulted was the suppression of nationalism on the one hand and the frothing up of nationalist tensions on the other.

Accompanying all this was a shift to the neoliberal right by political parties associated with the left or the left-of-centre: the Democrats in the US, the Labourites in the UK, the Congress in India, and of course the SLFP in Sri Lanka. Underlying them was one simple, undeniable fact: for the first time since the beginning of it all, a disparity of power put one country at the helm of the world. “There is no comparison,” Paul Kennedy famously observed.

I’ve observed that American foreign policy has always been driven by national interest, and by the need to promote it. Yet Bill Clinton’s presidency brought an impasse: a Vietnam War dove who rejected Reagan’s neoconservatism yet embraced his promise of free markets, how was Clinton to reconcile his anti-conservatism with the imperative to exert US strength abroad? His administration did so by looking at the world through a new lens: liberal interventionism.

On four occasions under Clinton, liberal interventionism resorted to military action: Somalia, Haiti, Bosnia, and Kosovo. How did Cold War doves turn into humanitarian hawks there? Simple, Krauthammer argued: they saw in these countries a cause for intervention “devoid of raw national interest.” In other words these interventions were propelled by a call to promote US interests, but those interests weren’t national interests. They were values more than interests: they used the force of moral suasion, rather than the threat of war, to intervene.

The issue here was that while liberal interventionism operated on the premise of multilateral action, it didn’t need multilateralism to act. The truth was that US administrations, regardless of who was in power, could have gone in and bombed the living daylights out of an entire civilisation without requiring a by-your-leave from the UN or the EU. Reagan didn’t even inform Margaret Thatcher of his intention to invade Grenada at the tail-end of the Cold War, so why should Joe Biden defend the Chemical Weapons Convention on the grounds of the “moral suasion of the entire international community”?

Liberal interventionism, then as now, works best when one superpower rules them all. It does not work when two or more rising powers challenge the hegemony of that superpower. It does not work when multilateral institutions are used by those rising powers to pre-empt the choices the superpower has to make to impose its will upon us all. Thus it was perfectly suited to the 1990s, when Somalia, Bosnia, Haiti, and Kosovo could happen with consensus. It was ill-suited to the end of the 1990s and the 2000s, when tensions between civilisation-states no longer made “moral suasion” good enough grounds for US intervention.

What could a viable alternative be? Certainly not Kissingerian realism. Even realists had long abandoned Kissinger’s axioms about the irrelevance of morality in relations between nations. The world was too dangerous a place to try that out: unlike earlier, when nuclear powers banded themselves into one of the two main camps (Communist and capitalist, aligned and non-aligned), now they were left to their own devices. Indeed, the flip-side of US power over the world after the Cold War was the “nationalisation” of the nuclear programme: India and Pakistan would now wield the bomb not against the backdrop of detente between the US and the USSR, but against the backdrop of clashes between Hindus and Muslims. Amorality was no longer a card on the table because no one dealt that card any more: while nation-states were still guided by the force of authority, there was now a moral aspect to that authority.

Iraq would have wanted to bomb Israel to preserve its power, but it also would have wanted to do so because of Arab tensions with a Jewish settlement in their territory.

In other words, what worked for the world before 1939 – Wilsonian idealism – and the world before 1991 – Kissingerian realism – would not work for the world after 1991. Underlying this disjuncture between idealism and realism was another crucial one: between neoliberalism and neoconservatism. The two would meet in the Reagan presidency.

Under Reagan, neoconservatism reared its head (mainly but not only) on the foreign policy front, and neoliberalism (mainly but not only) on the domestic economic front: the administration promoted counterinsurgency in Latin America while cutting down the welfare state, empowering the Christian Right, and ballooning defence spending. It was a potpourri, and its relevance to post-Cold War US foreign policy lies in the fact that, despite their differences, both tended to justify action with ideals: neoliberalism with its belief in free markets against government intervention, and neoconservatism with its championing of a traditional American order against internal threats and external enemies.

The neoconservatives found their home in the Bush II presidency. For a while, they grappled with what doctrine they could invoke to justify action. Then 9/11 happened. To validate US interventions in West Asia, the Bush troupe – Cheney, Rumsfeld, Wolfowitz, and Perle – all resorted to what Charles Krauthammer called “democratic realism.” It strove to balance the concerns of liberal interventionism with the imperatives of realism; in other words, “We will support democracy everywhere, but we will commit blood and treasure only in places where there is strategic necessity – meaning, places central to the larger war against the existential enemy, the enemy that poses a global mortal threat to freedom.”

To say that democratic realism squared the circle would be putting it mildly: in one stroke it assuaged the concerns of liberal interventionists, who didn’t want to impose order without a humanitarian rationale for it, while alienating the isolationists, who didn’t want the US to get involved in any order imposing project, period. What it did was to substitute values for power as the overriding principle of national interest, thereby distancing itself from the Morgenthau-Carr-Kennan-Kissinger school of international relations thought.

Francis Fukuyama criticised this new realism on the basis that it was aimed at a threat far removed from Soviet Communism.

To that Krauthammer retorted: “So what?” The Islamists may not have had the means of “actualising their vision”, as Fukuyama claimed they didn’t, yet as Krauthammer observed, borrowing an analogy from Nazi Germany, Hitler did not have the means of actualising his goal of overrunning Europe when he marched into the Rhineland either. The argument was convincing, and it won over mild neoconservatives in the Bush II presidency. You see it crop up in Condoleezza Rice’s article, quoted above: she envisions an “American realism for a new world”, a fusion of power and principle. Instead of imposing our will upon them, she contends, an international order “that reflects our values” would be “the best guarantee of our enduring national interest.”

How prophetic those words were – not. Twelve years later, we’re gorging on the leftovers of the Trump presidency, which in the popular consciousness of Republicans got the United States out of Iraq and Afghanistan, reduced its military obligations in Europe, steered the country from globalist-internationalists and democratic realists to America First, and got the job done on the economy. Were it not for the “China virus”, his supporters declare, Trump would have won easily, a point they see confirmed in smaller-than-expected Democratic victories even in states where Democrats were expected to lead by wide margins.

If, for a brief moment at least, the paleoconservatives had their day in the US, it was because democratic realism, which combined the humanitarian ideals of the Clinton presidency with the neoconservatism of the Bush presidency, ended up promoting a variant of interventionism which could not really go or think beyond recreating the world in America’s image. In that sense one can say it differed in degree and not in substance from the Wilsonian idealism and the Kissingerian realism which it eschewed. Coming in the wake of the “unipolar moment” in human history, it could not survive China’s rise, Russia’s shift to the East, or the emergence of regional powers, unless it evolved a strategy of engagement with them.

America’s belief in messianism must end, not because it isn’t a superpower, but because it’s not the only major power. Therein lies the key difference between the bipolarity of the Cold War, the unipolarity of the 1990s, the nonpolarity of the early 2000s, and the multipolarity of now: this is a world which pits the US and China on a collision course with each other, with regional players siding with either side. Interventionism by the US, no matter how pure the intentions it exhibits to the world may be, will invariably end up as a failure because every country it intervenes in will oppose it. Thus the world, contrary to what President-elect Biden says, can and will organise itself, and it will do so to oppose any one power leading the race to sort everything out. Think of it as a more interlinked order in which Rome reigns in one corner and the Persians and Chinese reign in another. In that sense this is not a Cold War. It is hotter than any Cold War. And interventionism will only make it hotter. To prevent that from happening, the US needs a new doctrine. A new realism, for a new order.

The writer can be reached at udakdev1@gmail.com


Sat Mag

End of Fukuyama’s last man, and triumph of nationalism


By Uditha Devapriya

 

What, I wonder, are we to make of nationalism, the most powerful political force we have today? Liberals dream on about its inevitable demise, rehashing a line they’ve been touting since goodness-knows-when. Neoliberals do the same, except their predictions of its demise have less to do with the utopian triumph of universal values than with their undying belief in the disappearance of borders and the interlinking of countries and cities through the gospel of trade. Both are wrong, and grossly so. There is no such thing as a universal value, and even those described as such tend to differ in time and place. There is such a thing as trade, and globalisation has made borders meaningless. But far from making nationalism meaningless, trade and globalisation have in fact bolstered its relevance.

The liberals of the 1990s were (dead) wrong when they foretold the end of history. That is why Francis Fukuyama’s essay reads so much like a wayward prophet’s dream today. And yet, those who quote Fukuyama tend to focus on his millenarian vision of liberal democracy, with its impending triumph across both East and West. This is not all there is to it.

To me what’s interesting about the essay isn’t his thesis about the end of history – whatever that meant – but what, or who, heralds it: Fukuyama’s much ignored “last man.” If we are to talk about how nationalism triumphed over liberal democracy, how populists trumped the end of history, we must talk about this last man, and why he’s so important.

In Fukuyama’s reading of the future, mankind gets together and achieves a state of perfect harmony. Only liberal democracy can galvanise humanity to aspire to and achieve this state, because only liberal democracy can provide everyone enough of a slice of the pie to keep us and them – majority and minority – happy. This is a bourgeois view of humanity, and indeed no less a figure than Marx observed that for the bourgeoisie, the purest political system was the bourgeois republic. In this purest of political systems, this bourgeois republic, Fukuyama sees no necessity for further progression: with freedom of speech, the right to assemble and dissent, an independent judiciary, and separation of powers, human beings get to resolve, if not troubleshoot, all their problems. Consensus, not competition, becomes the order of the day. There can be no forward march; only a turning back.

Yet that future state of affairs suffers from certain convulsions. History is a series of episodic progressions, each aiming at something better and more ideal. If liberal democracy, with its championing of the individual and the free market, triumphs in the end, it must be preceded by the erosion of community life. The problem here is that like all species, humanity tends to congregate, to gather as collectives, as communities.

“[I]n the future,” Fukuyama writes, “we risk becoming secure and self-absorbed last men, devoid of thymotic striving for higher goals in our pursuit of private comforts.” Being secure and self-absorbed, we become trapped in a state of stasis; we think we’re in a Panglossian best of all possible worlds, as though there’s nothing more to achieve.

Fukuyama calls this “megalothymia”, or “the desire to be recognised as greater than other people.” Since human beings think in terms of being better than the rest, reaching a point where we no longer need to show we’re better leaves us in a state of restless dissatisfaction. The inevitable follows: some of us try to find ways of doing something that will put us a cut above the rest. In the rush to the top, we end up in “a struggle for recognition.”

Thus the last men of history, in their quest to find some way they can show that they’re superior, run the risk of becoming the first men of history: rampaging, irrational hordes, hell-bent on fighting enemies, at home and abroad, real and imagined.

Fukuyama tries to downplay this risk, contending that liberal democracy provides the best antidote against a return to such a primitive state of nature. And yet even in this purest of political systems, security becomes a priority: to prevent a return to savagery, there must be an adequate deterrent against it. In his scheme of things, two factors prevent history from realising the ideals of humanity, and it is these that make such a deterrent vital: persistent war and persistent inequality. Liberal democracy does not resolve these to the extent of making them irrelevant. Like dregs in a teacup, they refuse to dissolve.

The problem with those who envisioned this end of history was that they conflated it with the triumph of liberal democracy. Fukuyama committed the same error, but most of those who point to his thesis miss the all too important last part of his message: that built into the very foundations of liberal democracy are the landmines that can, and will, blow it up. Yet this does not erase the first part of his message: that despite its failings, liberal democracy can still render other political forms irrelevant, simply because, in his view, there is no alternative to free markets, constitutional republicanism, and the universal tenets of liberalism. There may be such a thing as civilisation, and it may well divide humanity. Such niceties, however, will sooner or later give way to the promise of globalisation and free trade.

It is no coincidence that the latter terms belong in the dictionary of neoliberal economists, since, as Kanishka Goonewardena has put it pithily, no one rejoiced at Fukuyama’s vision of the future of liberal democracy more than free market theorists. But could one have blamed them for thinking that competitive markets would coexist with a political system supposedly built on cooperation? To rephrase the question: could one have foreseen that in less than a decade of untrammelled deregulation, privatisation, and the like, the old forces of ethnicity and religious fundamentalism would return? Between the Berlin Wall and Srebrenica, barely six years had passed. How had the prophets of liberalism got it so wrong?

Liberalism traces its origins to the mid-19th century. It had the defect of being younger, much younger, than the forces of nationalism it had to fight and put up with. Fast-forward to the end of the 20th century, the breakup of the Soviet Union, and the shift in world order from bipolarity to multipolarity, and you had these two foes fighting each other again, only this time with the apologists of free markets to boot. This three-way encounter or Mexican standoff – between the nationalists, the liberal democrats, and the neoliberals – did not end up in favour of dyed-in-the-wool liberal democrats. Instead it ended up vindicating both the nationalists and the neoliberals. Why it did so must be examined here.

The fundamental issue with liberalism, which nationalism does not suffer from, is that it views humanity as one. Yet humanity is not one: man is man, but he is also rich, poor, more privileged, and less privileged. Even so, liberal ideals such as the rule of law, separation of powers, and judicial independence rest on an assumption of the equality of citizens.

So long as this assumption is limited to political theory, nothing wrong can come out of believing it. The problem starts when such theories are applied as economic doctrines. When judges rule in favour of welfare cuts or in favour of corporations over economically backward communities, for instance, the ideals of humanity no longer appear as universal as they once were; they appear more like William Blake’s “one law for the lion and ox.”

That disjuncture didn’t trouble the founders of European liberalism, be it Locke, Rousseau, or Montesquieu, because for all their rhetoric of individual freedoms and liberties they never pretended to be writing for anyone other than the bourgeoisie of their time. Indeed, John Stuart Mill, beloved by advocates of free markets in Sri Lanka today, bluntly observed that his theories did not apply to slaves or subjects of the colonies. To the extent that liberalism remained cut off from the “great unwashed” of humanity, then, it could thrive because it did not face the problem of reconciling different classes into one category. Put simply, humanity for 19th century liberals looked white, bourgeois, and European.

The tail-end of the 20th century could not have been more different from this state of affairs. I will not go into why and how, but I will say that between the liberal promise of all humanity merging as one, the nationalist dogma of everyone pitted against everyone else, and the neoliberal paradigm of competition and winner-takes-all, the winner could certainly not be the ideologues who believed in the withering away of cultural differences and the coming together of humanity. As the century drew to a close, it became increasingly obvious that the winners would be the free market and the nationalist State. How exactly?

Here I would like to propose an alternative reading of not just Fukuyama’s end of history and last man, but also the triumph of nationalism and neoliberalism over liberal democracy. In 1992 Benjamin Barber wrote an interesting, if controversial, essay titled “Jihad vs. McWorld” for The Atlantic, in which he argued that two principles governed the post-Cold War order, and that of the two, narrow nationalism threatened globalisation. Andre Gunder Frank wrote a reply to Barber in which he contended that, far from opposing one another, narrow nationalism, or tribalism, in fact resembled the forces of globalisation – free markets and free trade – in how both promoted the transfer of resources from the many to the few.

For Gunder Frank, the type of liberal democracy Barber championed remained limited to a narrow class, far too small to be inclusive and participatory. In that sense “McWorldisation”, or the spread of multinational capital to the most far-flung corners of the planet, would not lead to the disappearance of communal or cultural fragmentation, but would rather bolster and lay the groundwork for such fragmentation. Having polarised entire societies, especially those of the Global South, along class lines, McWorldisation becomes a breeding ground for the very “axial principle” Barber saw as its opposite: “Jihadism.”

Substitute neoliberalism for McWorldisation, nationalism for Jihadism, and you see how the triumph of one has not led to the defeat of the other. Ergo, my point: nationalism continues to thrive, not just because (as is conventionally assumed) liberal democracy vis-à-vis Francis Fukuyama failed, but more importantly because, in its own way, neoliberalism facilitated it. Be it Jihadism there or Jathika Chintanaya here, in the Third World of the 21st century, what should otherwise have been a contradiction between two forces opposed to each other has instead become a union of two opposites. Hegel’s thesis and antithesis have hence become a grand franken-synthesis, one which will govern the politics of this century for as long as neoliberalism survives, and for as long as nationalism thrives on it.

 

The writer can be reached at udakdev1@gmail.com


Sat Mag

Chitrasena: Traditional dance legacy perseveres


By Rochelle Palipane Gunaratne

Where would Mother Lanka’s indigenous dance forms be, if not for the renaissance of traditional dance in the early 1940s? January 26, 2021 marked the 100th birth anniversary of the legendary Guru Chitrasena who played a pivotal role in reviving a dance form which was lying dormant, ushering in a brand new epoch to a traditional rhythmic movement that held sway for over two millennia.

“There was always an aura that drew us all to Seeya and we were mesmerized by it,” enthused Heshma, Artistic Director of the Chitrasena Dance Company and eldest grand-daughter of the doyen of dance. She reminisced about her legendary grandfather during a brief respite from working on a video depicting his devotion to a dance form that chose him.

“Most classical art forms require a lifetime of learning and dedication as it’s also a discipline which builds character and that is what we have been inculcated with by Guru Chitrasena, who also left us with an invaluable legacy,” emphasized Heshma, adding that it makes everything else pale in comparison and provides the momentum even when faced with trials.

Blazing a dynamic trail

The patriarch’s life and times resonated with an era of change in Ceylon: here was an island nation almost overshadowed by a gigantic peninsula whose influence had been colossal, and colonisation by Western empires had meant further suppression for over four centuries. Yet, hidden in the island’s folds were artistes, dancers and others who held on almost devoutly to their sacred doctrines. The time was ripe for the harvest and the need for change was almost palpable. Into this era was born Chitrasena, who took the idea by the horns and led it all the way to the world stage.

He coaxed the hidden treasures of the island out of the Gurus of old, men whose birthright was the traditional dance forms and who had neither need nor desire for the stage. Their repertoire was confined to village ceremonies, peraheras and ritual sacrifices. The nobles of the time sometimes entertained themselves watching these ‘devil dancers.’ In fact, some of these traditional dancers are said to have been taken abroad as part of a ‘human circus’ act in the late 1800s.

But how did Chitrasena change that thinking? He went in search of these traditional Gurus, lived with them, learned the traditions and then re-presented them as a respectable dance art on the stage. He revolutionized the manner in which we, colonized islanders, viewed what was endemic to us; suffice it to say he gave it the pride and honour it deserved. It came at a supreme sacrifice: a lifetime of commitment to dancing, braving the criticism and other challenges constantly put up to deter him. Not only did he commit himself to this colossal task, but the involvement of his immediate family and his family of dancers was exceptional, bordering on devotion, as their lives revolved around dance alone.

Imbued in them is the desire to dance and to share their knowledge with others, which they do through various means, such as giving prominence to the Gurus of yore: hence the Guru Gedara Festival, which saw the confluence of many artistes and connoisseurs at the Chitrasena Kalayathanaya in August 2018. Moreover, the family has been heavily involved for over 75 years in inculcating a love for dancing in all age groups through dance classes, specially curated dance workshops, concerts and scholarships for students who are passionate about dancing.

While hardship is what strengthens our inner selves, Chitrasena posed questions that we, and the authorities, need to ask ourselves about the arts and their development in our land. “Yes, there is a burgeoning interest in expanding infrastructure in many different fields as part of post-war development. But what purpose will it serve if there are no artistes to perform in all the new theatres to be built, for instance?” queries Heshma. The new theatres we have now are not even affordable for most local artistes. “When I refer to dance I am not referring to the cabaret versions of our traditional forms. I am talking about the dancers who want to immerse themselves in a manner that refuses to compromise their art for any reason at all, not to cater to the whims and fancies of popular trends, vulgarization for financial gain or simply diluting these sacred art forms to appeal to audiences who are ignorant of their value,” she concludes. There are still a few master artistes, and some very talented young artistes, who care very deeply about our indigenous art forms and need to be encouraged and supported to pursue their passion, which in turn will help preserve our rich cultural heritage. But support for the arts is so minimal in our country that one wonders how their devotion will prevail in an unhinged world where instant fixes run rampant.

Yet the cry of the torchbearers of unpretentious traditional dance theatre in our land is for a respectable platform and the support it rightly deserves, and this is an important moment in time to ensure the survival of our dance. With this thought, one needs to pay homage to Chitrasena, whose influence transcends cultures and metaphorical boundaries, binds the connoisseurs of dance and other art forms, and leaves an indelible mark through the ages.

Amaratunga Arachchige Maurice Dias alias Chitrasena was born on 26 January 1921 at Waragoda, Kelaniya, in Sri Lanka. In India, Tagore had established his academy, Santiniketan, and his lectures during his visit to Sri Lanka in 1934 had inspired a revolutionary change in the outlook of many educated men and women. Tagore had stressed the need for a people to discover their own culture in order to assimilate fruitfully the best of other cultures. Chitrasena was a schoolboy at the time, and his father Seebert Dias’ house had become a veritable cultural confluence, frequented by the literary and artistic intelligentsia of the time.

In 1936, Chitrasena made his debut at the Regal Theatre at the age of 15 in the role of Siri Sangabo, the seed of the first Sinhala ballet, produced and directed by his father. Presented in Kandyan style, with Chitrasena in the lead role, it created a stir among the aficionados who noticed the boy’s talents. D.B. Jayatilake, who was Vice-Chairman of the Board of Ministers under the British colonial administration, Buddhist scholar, founder and first President of the Colombo Y.M.B.A., freedom fighter, Leader of the State Council and Minister of Home Affairs, was a great source of encouragement to the young dancer.

Chitrasena learnt the Kandyan dance from Algama Kiriganitha Gurunnanse, Muddanawe Appuwa Gurunnanse and Bevilgamuwe Lapaya Gurunnanse. Having mastered the traditional Kandyan dance, he underwent his ‘Ves Bandeema’, the ceremony of graduation in which the ‘Ves Thattuwa’ is placed on the initiate’s head, followed by the ‘Kala-eliya’ mangallaya, in 1940. In the same year he proceeded to Travancore to study Kathakali dance at Sri Chitrodaya Natyakalalayam under Sri Gopinath, court dancer in Travancore. He gave a command performance with Chandralekha (wife of the portrait painter J.D.A. Perera) before the Maharaja and Maharani of Travancore at the Kowdiar Palace. He later studied Kathakali at the Kerala Kalamandalam.

In 1941, Chitrasena performed at the Regal Theatre, in one of the first dance recitals of its kind, before the Governor Sir Andrew Caldecott and Lady Caldecott, with Chandralekha and her troupe. Chandralekha was one of the first women to break into the field of Kandyan dance, followed by Chitrasena’s protégé and soul mate, Vajira, who became the first professional female dancer. Thereafter, Chitrasena and Vajira continued to captivate audiences worldwide with their dynamic performances, which later included their children, Upeka and Anjalika, and their students. The matriarch, Vajira, took up the reins at a time when the duo was forced to separate physically, with the loss of the house in Colpetty where they had lived and worked for over 40 years. Daughter Upeka then continued to uphold the tradition, leading the dance company to all corners of the globe during a very difficult time in the country. At present, the grandchildren Heshma, Umadanthi and Thaji interweave their unique talents and strengths into the legacy inspired by Guru Chitrasena.


Sat Mag

Meat by any other name is animal flesh


In India most animal welfare people are vegetarians. We, in People for Animals, insist on that. After all, you cannot want to look after animals and then eat them. But most meat eaters, whether they are animal people or not, have a hesitant relationship with the idea of killing animals for food. They enjoy the taste of meat, but shy away from making the connection that animals have been harmed grievously in the process.

This moral conflict is referred to, in psychological terms, as the ‘meat paradox’. A meat eater will eat caviar, but he will refuse to listen to someone telling him that this has been made from eggs gotten from slitting the stomach of a live pregnant fish. The carnivorous individual simply does not want to feel responsible for his actions. Meat eaters and sellers try and resolve this dilemma by adopting the strategy of mentally dissociating meat from its animal origins. For instance, ever since hordes of young people have started shunning meat, the meat companies and their allies in the government, and nutraceutical industry, have deliberately switched to calling it “protein”. This is an interesting manipulation of words and a last-ditch attempt to influence consumer behaviour.

For centuries meat has been a part of people’s diet in many cultures. Global meat eating rose hugely in the 20th century, driven by urbanization, developments in meat production technology and, most importantly, the strategies used by the meat industry to dissociate the harming of animals from the flesh on the plate. Researchers say, “These strategies can be direct and explicit, such as denial of animals’ pain, moral status, or intelligence, endorsement of a hierarchy in which humans are placed above non-human animals” (using religion and god to amplify the belief that animals were created solely for humans, and had no independent importance for the planet, except as food and products). The French are taught, for instance, that animals cannot think.

Added to this is the justification of meat consumption based on spurious nutritional grounds. Doctors and dieticians, who are unwitting tools of the “nutritional science” industry, put their stamp on this shameless hard sell. 

The most important of all these strategies, and the one that has a profound effect on meat consumption, is the dissociation of meat from its animal origins. Important studies have been done on this (Kunst & Hohle, 2016; Rothgerber, 2013; Tian, Hilton & Becker, 2016; Foer, 2009; Joy, 2011; Singer, 1995). “At the core of the meat paradox is the experience of cognitive dissonance. Cognitive dissonance theory proposes that situations involving conflicting behaviours, beliefs or attitudes produce a state of mental discomfort (Festinger, 1957).” If a person holds two conflicting, or inconsistent, pieces of information, he feels uncomfortable. So the mind strives for consistency between the two beliefs, and attempts are made to explain or rationalize them, reducing the discomfort. In doing so, the person wilfully distorts his or her perception of the world.

The meat eater actively employs dissociation as a coping strategy to regulate his conscience, and simply stops associating meat with animals.

In earlier hunter-gatherer and agricultural societies, people killed or saw animals killed for their table. But from the mid-19th century the eater has been separated from the meat production unit. Singer (1995) says that getting meat from shops, or restaurants, is the last step of a gruesome process in which everything, but the finished product, is concealed. The process: the loading of animals into overcrowded trucks, the dragging into killing chambers, the killing, beheading, removing of skin, cleaning of blood, removal of intestines and cutting the meat into pieces, is all secret and the eater is left with neatly packed, ready-to-cook pieces with few reminders of the animal. No heads, bones, tails, feet. The industry manipulates the mind of the consumer so that he does not think of the once living and intelligent animal.

The language is changed to conceal the animal. Pig becomes pork, sausage, ham and bacon; cows become beef and calves veal; goat becomes mutton; and hens become chicken and white meat. And now all of them have become protein.

Then come rituals and traditions, which remove any kind of moral doubt. People often partake in rituals and traditions without reflecting on their rationale or consequences. Thanksgiving means turkey; Friday means fish. In India all rituals were once vegetarian. Now many weddings serve meat, and animal sacrifice to the gods is part of the ritual.

Studies have found that people prefer, or actively choose, to buy and eat meat that does not remind them of its animal origins (Holm, 2018; Te Velde et al., 2002). Evans and Miele (2012), who investigated consumers’ interactions with animal food products, show that the fast pace of food shopping, the presentation of animal foods, and the euphemisms used instead of the animal (e.g., pork, beef and mutton) reduced consumers’ ability to reflect upon the animal origins of the food they were buying. Kubberod et al. (2002) found that high school students had difficulty connecting different meat products to their animal origins, suggesting that dissociation was deeply entrenched in their consuming habits. Simons et al. found that people differed in what they considered meat: while red meat and steak were seen as meat, more processed and white meat (chicken nuggets, for example) was sometimes not seen as meat at all, and was often not counted when participants in the study reported the frequency of their meat eating.

Kunst and Hohle (2016) demonstrated how the process of presenting and preparing meat, deliberately turning it from animal to product, led to less disgust and empathy for the killed animal and higher intentions to eat meat. If the animal-meat link was made obvious – by displaying the lamb, for instance, or putting the word cow instead of beef on the menu – the consumer avoided eating it and went for a vegetarian alternative. This is an important finding: by interrupting the mental dissociation, meat eating immediately went down. This explains how, during COVID, the pictures of the Chinese eating animals in Wuhan’s markets actually put off thousands of carnivores, and meat sales went down. In experiments by Zickfeld et al. (2018) and Piazza et al. (2018) it was seen that showing pictures of animals, especially young animals, reduced people’s willingness to eat meat.

Do gender differences exist when it comes to not thinking about the meat one eats?

In Kubberød and colleagues’ (2002) study on disgust and meat consumption, substantial differences emerged between females and males. Men were more aware of the origins of different types of meat, yet did not consider the origins when consuming it. Women reported that they did not want to associate the meat they ate with a living animal, and that reminders would make them uncomfortable and sometimes even unable to eat the meat. In a study by Bray et al. (2016), who investigated parents’ conversations with their children about the origins of meat, women were more likely than men to avoid these conversations with their children, as they felt more conflicted about eating meat themselves. In a study by Kupsala (2018) female consumers expressed more tension related to the thought of killing animals for food than men. The supermarket customer group preferred products that did not remind them of animal origins, and showed a strong motivation to avoid any clues that highlighted the meat-animal connection. What emerged was that the females felt that contact with, and personification of, food producing animals would sometimes make it impossible for them to eat animal products.

What are the other dissociation techniques that companies and societies use to make people eat meat? For men, the advertising is direct: masculinity, the inevitable fate of animals, the generational traditions of the family. For women it is far more indirect: simply hiding the source of the meat and giving the animal victim a cute name to prevent disgust and avoidance.

Kubberod et al. (2002) compared groups from rural and urban areas but found little evidence of differences between them. Both urban and rural consumers in the study agreed that meat packaging and presentation functioned to conceal the link between the meat and the once living animal. Both groups of respondents also stated that if pictures of tied-up pigs, or pigs in stalls, were presented on the packaging of pork, or pictures of caged hens on egg cartons, they would not purchase the product in question.

Are people who are sensitive to disruptions of the dissociation process (or, in plain English, open to learning the truth about the lies they tell themselves) more likely to become vegetarians? Probably. Everyone has a conscience. The meat industry has tried to make you bury it. We, in the animal welfare world, should try to make it active again.

 

(To join the animal welfare movement, contact gandhim@nic.in or visit www.peopleforanimalsindia.org)
