
Sat Mag

The long, never-ending life of humanitarian intervention


By Uditha Devapriya

In 1999 Charles Krauthammer famously penned an obituary for humanitarian intervention. Such an idea, he argued, cannot last, especially not as a cornerstone in foreign policy, since it has no real plan, no real purpose. All it does is pursue utopian objectives which are “of the most peripheral strategic interest to the United States.” Americans may be willing to give up their lives in the cause of their country, but not “to allay feelings of pity.” Bringing peace to the world, which was humanitarian intervention’s aim, is practical and doable if all it takes is the bulldozing of enemy territory, as with Germany and Japan during World War II. Yet this is precisely what interventionists choose not to do.

As policy, humanitarian intervention dates back to the 19th century. As theory, it goes back even further. At the centre of its universe is the question of how, if not to what extent, the sovereignty of the State became secondary to the sovereignty of the individual. Since this is, by default, a key dilemma in international law, it was first addressed though not resolved by those associated with the development of international law: Francisco de Vitoria, Francisco Suarez, Alberico Gentili, Hugo Grotius, and Jean Bodin.

Though these were hardly, if at all, advocates of humanitarian intervention, they affirmed the legitimacy of interfering in another State’s affairs, even for purposes other than national interest. “[K]ings, and those who possess rights equal to those kings,” observed Grotius, had the right to demand punishments regarding “injuries committed against themselves or their subjects” and “injuries which… excessively violate the law of nature or of nations in regard of any person whatsoever [my emphasis].” Restated, rephrased, this is, in the words of one scholar, “the principle that exclusiveness of domestic jurisdiction stops when outrage upon humanity begins.” Yet even Grotius recognised the limits of such intervention, just as he did the limits of sovereignty: in De jure belli ac pacis (1625), he makes what Raymond J. Vincent calls “a remarkable concession” to sovereignty, by “denying his subjects the right to take up arms when wronged by him.” However, interestingly enough, while denying locals this right, he concedes the right of foreign players to intervene on their behalf.

Hersch Lauterpacht’s view that Grotius propounded “the first authoritative statement of the principle of humanitarian intervention” has been widely contested. To focus attention on one individual philosopher is to ignore, if not undermine, the trajectories of history that led to his formulating that statement. As Heraclides and Dialla (2015) have pointed out in their excellent book on the subject, the antecedents of humanitarian intervention go back, not to Renaissance Europe, but to Classical Antiquity, to a concept that, while not directly related to the issue of intervention today, nevertheless influenced it: Just War, or bellum justum, whose acknowledged founding father was Thomas Aquinas.

In the distinction Aquinas made between the innocent and the guilty, and in his admission that the innocent may well be killed in a conflict, he addresses a central dilemma for all those caught up in war: how justifiable is it? Three conditions, he wrote, can help us rationalise it: if war is declared by a proper authority, if it is embarked upon to punish wrongdoing States, and if military force is exerted “to secure peace, rather than out of lust for power.” The tenet that binds these together is proportionality: a war may have good and bad effects, but it is permissible so long as the good is intended and the bad is necessary to achieve the good. I grant this is a notoriously difficult thing to verify, even at a time when it is becoming harder to conceal the ill-effects of war. But in Aquinas’s day, and in later periods when the sanction of the clergy was considered necessary to embark on the Crusades against the Fertile Crescent, such points were conceded without too much debate.

Besides, this was the era of city-states. After the Peace of Westphalia when the focus shifted to nation-states, when the issue was of sovereignty and to what extent it could be intruded on, the founding fathers of international law revisited the principles of Just War to ascertain whether the State was absolute or not. To my mind four distinct historical trends had a say in shaping the principles of humanitarian intervention, apart from, and in addition to, this shift in international politics: Spanish conquests in the New World, the rise of naval power “from a medieval setting into an early modern framework”, the decline of Ottoman power at the end of the 18th century, and the clash between Hobbesian sovereignty and Lockean liberalism in the wake of revolution in France and elsewhere in Europe.

All these factors had a say, a profound one, in widening both the theory and the practice of intervention for purposes other than those of national interest and power, especially under its foremost proponent of the 18th century, Emer de Vattel. It is in de Vattel’s writings that we see, as one scholar puts it, the principle of sovereignty giving way to “a claim to freedom and independence.” This line of reasoning really goes back to the English liberals of the 17th century, in particular John Locke. Locke, while not justifying revolution, argued that subjects had a right to rebel against the State and sovereign if the latter turned tyrant: such rebellion symbolised, not an act of revolution, but an act of restoration, a return to what had been before. For Locke, by violating political order the sovereign automatically forfeited any right to hold on to power and govern.

 

In other words, like most proponents of individual sovereignty, he saw the State as a holder of trust rather than a wielder of power.

What de Vattel did was to extend this to the framework governing relations between, not just within, States: resistance to tyranny invariably called for intervention by other States if, and when, citizens found it difficult to rise up in arms against it. Obvious and self-evident as that may seem today, in de Vattel’s day it was an extreme doctrine to hold, just as in Locke’s day it was an extreme doctrine to hold that sovereigns were less rulers than trustees. But in spite of its radical overtones, interventionism caught on; far from becoming an exceptional feature of 19th century politics, it became very much a part and parcel of it, and was put into practice on three occasions: the Greek War of Independence (1821-1832), the Mount Lebanon Civil War (1860), and the Balkan Crisis (1875-1878).

As Heraclides and Dialla have cogently pointed out, each of these crises bolstered the use of humanitarian rhetoric relative to the one before, and each of them had as the adversary the Ottoman Turks. As important, in my opinion, was the use made of naval power, one which proved pivotal to the rise of Britain as a colonial powerhouse in the 19th century and of the United States as a regional and global imperium in the 20th. Indeed, insofar as humanitarian intervention is concerned, it was the United States, not Europe, which reshaped the rules of the game in intervention throughout the 20th century.

It is essential that we not underestimate the American aspiration to become a naval power when placing American involvement in Cuba at the tail-end of the 19th century in its proper historical perspective. As Jenny Pearce has noted, the publication of Alfred Mahan’s book on naval superiority as a determinant of global power did not go unnoticed in the US, and it was this that propelled it to build its first battleship. Notwithstanding the humanitarian impulses it touted later on, the primary motivation, as one Senator put it, was to “cover the ocean with our merchant marine” and “build a navy to the measure of our greatness.” Since its first target was its “backyard”, i.e. Central and Latin America, this new policy called for an overhaul of the Monroe Doctrine, which had limited American intervention in the region to preventing and pre-empting interference by European powers.

How soon that altered. And yet, as scholars point out, the decision to move into Cuba and support the local uprising against colonial rule followed years of extensive debate among US lawyers over the merits of humanitarian intervention. “The intervention in Cuba,” observe Heraclides and Dialla, “was to prove a turning point.” Jenny Pearce puts it better: Cuba, she argues, “emerged as a model for United States imperialism.”

But that model was, at least at the time, couched in purely humanitarian terms. What John Hay (US Secretary of State under McKinley and Roosevelt) called the “splendid little war” came to be garbed in the rhetoric of “big power protecting small player” for three main reasons: the shift of Big Business from opposition to support for intervention; the push in Congress for US support for local uprisings against colonial rule; and the widely shared belief that as a maritime power, America just had to project its greatness. Insofar as this is what exacerbated the push for intervention on humanitarian grounds, it has served to justify and prolong intervention even in cases where it has not been called for: an ironic perversion of what Spanish philosophers of the 16th and 17th centuries had intended. Perhaps it is not so much a coincidence that in Cuba, the US chose to combat the Spanish.

Charles Krauthammer was wrong, I think, in arguing that humanitarian intervention – as a doctrine of US foreign policy – would fade away in the new millennium. A vocal, trenchant critic of the Clinton administration, he simply saw no need for deploying the military to the far corners of the world to fight bloodless wars. War without bloodshed, he contended, was a self-contradiction, an oxymoron that prolonged war; the Serbian army, after all, chose to expel the Kosovar people from their homes after the NATO strikes. This was a great illusion, and for Krauthammer, it had to end: “it is an idea whose time has come, and gone.”

But illusions are great for a reason: they live on and they endure. Krauthammer may or may not have foreseen Libya and Iraq, yet even he fell under the spell of the gospel of intervention: barely three years after he wrote his critique of it, he began to enunciate a new doctrine, a new variation on it, in support of the Bush administration’s relentless pursuit of rogue states after 9/11. But as I wrote in this paper two months ago, that doctrine – what Krauthammer called “democratic realism” – proved to be even more of a disaster than what Clinton toyed with. An even bigger disaster cropped up a decade after 9/11, when Samantha Power, Susan Rice, Hillary Clinton, and Barack Obama “despatched” R2P to Tripoli. The West’s conception of humanitarian intervention suffers from one major flaw, and in Libya we saw it unfold only too clearly: a failure to oversee essential, vital post-war reconstruction.

Yet as the smoke and the ashes of the R2P fire wafted out, the evangelists of humanitarian intervention, like Oliver Twist, returned to keep asking for more, whether in Syria or Yemen. This is an idea that survived Antiquity, the Renaissance, Westphalia, the Triple Alliance, and two World Wars, not to mention a Cold War. It survived Clinton and Bush, and it survived Trump. It will survive Joe Biden, just as it will survive whoever comes after him.

The writer can be reached at udakdev1@gmail.com

 




End of Fukuyama’s last man, and triumph of nationalism


By Uditha Devapriya

 

What, I wonder, are we to make of nationalism, the most powerful political force we have today? Liberals dream on about its inevitable demise, rehashing a line they’ve been touting since goodness-knows-when. Neoliberals do the same, except their predictions of its demise have less to do with the utopian triumph of universal values than with their undying belief in the disappearance of borders and the interlinking of countries and cities through the gospel of trade. Both are wrong, and grossly so. There is no such thing as a universal value, and even those described as such tend to differ in time and place. There is such a thing as trade, and globalisation has made borders meaningless. But far from making nationalism meaningless, trade and globalisation have in fact bolstered its relevance.

The liberals of the 1990s were (dead) wrong when they foretold the end of history. That is why Francis Fukuyama’s essay reads so much like a wayward prophet’s dream today. And yet, those who quote Fukuyama tend to focus on his millenarian vision of liberal democracy, with its impending triumph across both East and West. This is not all there is to it.

To me what’s interesting about the essay isn’t his thesis about the end of history – whatever that meant – but what, or who, heralds it: Fukuyama’s much ignored “last man.” If we are to talk about how nationalism triumphed over liberal democracy, how populists trumped the end of history, we must talk about this last man, and why he’s so important.

In Fukuyama’s reading of the future, mankind gets together and achieves a state of perfect harmony. Only liberal democracy can galvanise humanity to aspire to and achieve this state, because only liberal democracy can provide everyone enough of a slice of the pie to keep us and them – majority and minority – happy. This is a bourgeois view of humanity, and indeed no less a figure than Marx observed that for the bourgeoisie, the purest political system was the bourgeois republic. In this purest of political systems, this bourgeois republic, Fukuyama sees no necessity for further progression: with freedom of speech, the right to assemble and dissent, an independent judiciary, and separation of powers, human beings get to resolve, if not troubleshoot, all their problems. Consensus, not competition, becomes the order of the day. There can be no forward march; only a turning back.

Yet that future state of affairs suffers from certain convulsions. History is a series of episodic progressions, each aiming at something better and more ideal. If liberal democracy, with its championing of the individual and the free market, triumphs in the end, it must be preceded by the erosion of community life. The problem here is that like all species, humanity tends to congregate, to gather as collectives, as communities.

“[I]n the future,” Fukuyama writes, “we risk becoming secure and self-absorbed last men, devoid of thymotic striving for higher goals in our pursuit of private comforts.” Being secure and self-absorbed, we become trapped in a state of stasis; we think we’re in a Panglossian best of all possible worlds, as though there’s nothing more to achieve.

Fukuyama calls this “megalothymia”, or “the desire to be recognised as greater than other people.” Since human beings think in terms of being better than the rest, reaching a point where we no longer need to show we’re better leaves us with a sense of restless dissatisfaction. The inevitable follows: some of us try to find ways of doing something that will put us a cut above the rest. In the rush to the top, we end up in “a struggle for recognition.”

Thus the last men of history, in their quest to find some way they can show that they’re superior, run the risk of becoming the first men of history: rampaging, irrational hordes, hell-bent on fighting enemies, at home and abroad, real and imagined.

Fukuyama tries to downplay this risk, contending that liberal democracy provides the best antidote against a return to such a primitive state of nature. And yet even in this purest of political systems, security becomes a priority: to prevent a return to savagery, there must be an adequate deterrent against it. In his scheme of things, two factors prevent history from realising the ideals of humanity, and it is these that make such a deterrent vital: persistent war and persistent inequality. Liberal democracy does not resolve these to the extent of making them irrelevant. Like dregs in a teacup, they refuse to dissolve.

The problem with those who envisioned this end of history was that they conflated it with the triumph of liberal democracy. Fukuyama committed the same error, but most of those who point to his thesis miss out on the all too important last part of his message: that built into the very foundation of liberal democracy are the landmines that can, and will, blow it up. Yet this does not erase the first part of his message: that despite its failings, it can still render other political forms irrelevant, simply because, in his view, there is no alternative to free markets, constitutional republicanism, and the universal tenets of liberalism. There may be such a thing as civilisation, and it may well divide humanity. Such niceties, however, will sooner or later give way to the promise of globalisation and free trade.

It is no coincidence that the latter terms belong in the dictionary of neoliberal economists, since, as Kanishka Goonewardena has put it pithily, no one rejoiced at Fukuyama’s vision of the future of liberal democracy more than free market theorists. But could one have blamed them for thinking that competitive markets would coexist with a political system supposedly built on cooperation? To rephrase the question: could one have foreseen that in less than a decade of untrammelled deregulation, privatisation, and the like, the old forces of ethnicity and religious fundamentalism would return? Between the fall of the Berlin Wall and Srebrenica, barely six years had passed. How had the prophets of liberalism got it so wrong?

Liberalism traces its origins to the mid-19th century. It had the defect of being younger, much younger, than the forces of nationalism it had to fight and put up with. Fast-forward to the end of the 20th century, the breakup of the Soviet Union, and the shift in world order from bipolarity to multipolarity, and you had these two foes fighting each other again, only this time with the apologists of free markets to boot. This three-way encounter or Mexican standoff – between the nationalists, the liberal democrats, and the neoliberals – did not end up in favour of dyed-in-the-wool liberal democrats. Instead it ended up vindicating both the nationalists and the neoliberals. Why it did so must be examined here.

The fundamental issue with liberalism, which nationalism does not suffer from, is that it views humanity as one. Yet humanity is not one: man is man, but he is also rich, poor, more privileged, and less privileged. Even so, liberal ideals such as the rule of law, separation of powers, and judicial independence tend to presuppose the equality of citizens.

So long as this assumption is limited to political theory, nothing wrong can come out of believing it. The problem starts when such theories are applied as economic doctrines. When judges rule in favour of welfare cuts or in favour of corporations over economically backward communities, for instance, the ideals of humanity no longer appear as universal as they once were; they appear more like William Blake’s “one law for the lion and ox.”

That disjuncture didn’t trouble the founders of European liberalism, be it Locke, Rousseau, or Montesquieu, because for all their rhetoric of individual freedoms and liberties they never pretended to be writing for anyone other than the bourgeoisie of their time. Indeed, John Stuart Mill, beloved by advocates of free markets in Sri Lanka today, bluntly observed that his theories did not apply to slaves or subjects of the colonies. To the extent that liberalism remained cut off from the “great unwashed” of humanity, then, it could thrive because it did not face the problem of reconciling different classes into one category. Put simply, humanity for 19th century liberals looked white, bourgeois, and European.

The tail-end of the 20th century could not have been more different from this state of affairs. I will not go into the whys and hows, but I will say that between the liberal promise of all humanity merging as one, the nationalist dogma of everyone pitted against everyone else, and the neoliberal paradigm of competition and winner-takes-all, the winner could certainly not be ideologues who believed in the withering away of cultural differences and the coming together of humanity. As the century drew to a close, it became increasingly obvious that the winners would be the free market and the nationalist State. How exactly?

Here I would like to propose an alternative reading of not just Fukuyama’s end of history and last man, but also the triumph of nationalism and neoliberalism over liberal democracy. In 1992 Benjamin Barber wrote an interesting if not controversial essay titled “Jihad vs. McWorld” in The Atlantic, in which he argued that two principles governed the post-Cold War order, and of the two, narrow nationalism threatened globalisation. Andre Gunder Frank wrote a reply to Barber in which he contended that, far from opposing one another, narrow nationalism, or tribalism, in fact resembled the forces of globalisation – free markets and free trade – in how they promoted the transfer of resources from the many to the few.

For Gunder Frank, the type of liberal democracy Barber championed remained limited to a narrow class, far too small to be inclusive and participatory. In that sense “McWorldisation”, or the spread of multinational capital to the most far-flung corners of the planet, would not lead to the disappearance of communal or cultural fragmentation, but would rather bolster and lay the groundwork for such fragmentation. Having polarised entire societies, especially those of the Global South, along class lines, McWorldisation becomes a breeding ground for the very “axial principle” Barber saw as its opposite: “Jihadism.”

Substitute neoliberalism for McWorldisation, nationalism for Jihadism, and you see how the triumph of one has not led to the defeat of the other. Ergo, my point: nationalism continues to thrive, not just because (as is conventionally assumed) liberal democracy vis-à-vis Francis Fukuyama failed, but more importantly because, in its own way, neoliberalism facilitated it. Be it Jihadism there or Jathika Chintanaya here, in the Third World of the 21st century, what should otherwise have been a contradiction between two forces opposed to each other has instead become a union of two opposites. Hegel’s thesis and antithesis have hence become a grand franken-synthesis, one which will govern the politics of this century for as long as neoliberalism survives, and for as long as nationalism thrives on it.

 

The writer can be reached at udakdev1@gmail.com



Chitrasena: Traditional dance legacy perseveres


By Rochelle Palipane Gunaratne

Where would Mother Lanka’s indigenous dance forms be, if not for the renaissance of traditional dance in the early 1940s? January 26, 2021 marked the 100th birth anniversary of the legendary Guru Chitrasena, who played a pivotal role in reviving a dance form that had lain dormant, ushering in a new epoch for a traditional rhythmic art that had held sway for over two millennia.

“There was always an aura that drew us all to Seeya and we were mesmerized by it,” enthused Heshma, Artistic Director of the Chitrasena Dance Company and eldest grand-daughter of the doyen of dance. She reminisced about her legendary grandfather during a brief respite from working on a video depicting his devotion to a dance form that chose him.

“Most classical art forms require a lifetime of learning and dedication as it’s also a discipline which builds character and that is what we have been inculcated with by Guru Chitrasena, who also left us with an invaluable legacy,” emphasized Heshma, adding that it makes everything else pale in comparison and provides the momentum even when faced with trials.

Blazing a dynamic trail

The patriarch’s life and times resonated with an era of change in Ceylon: here was an island nation almost overshadowed by a gigantic peninsula whose influence had been colossal, and colonization by western empires had meant further suppression for over four centuries. Yet, hidden in the island’s folds were artistes, dancers and others who held on almost devoutly to their sacred doctrines. The time was ripe for the harvest, and the need for change was almost palpable. Into this era was born Chitrasena, who took the idea by its horns and led it all the way to the world stage.

He literally coaxed the hidden treasures of the island out of the Gurus of old, to whom the traditional dance forms were a birthright and who had neither need nor desire for the stage. Their repertoire was confined to village ceremonies, peraheras and ritual sacrifices. The nobles of the time sometimes entertained themselves watching these ‘devil dancers.’ In fact, some of these traditional dancers are said to have been taken abroad as part of a ‘human circus’ act in the late 1800s.

But how did Chitrasena change that thinking? He went in search of these traditional Gurus, lived with them, learned the traditions and then re-presented them as a respectable dance art on the stage. He revolutionized the manner in which we, colonized islanders, viewed what was endemic to us; suffice it to say he gave it the pride and honour it deserved, though it came at a supreme sacrifice: a lifetime of commitment to dancing, braving the criticism and other challenges that were constantly put up to deter him. Not only did he commit himself to this colossal task; the involvement of his immediate family and his family of dancers was also exceptional, bordering on devotion, as their lives revolved around dance alone.

Imbued in them is the desire to dance and to share their knowledge with others, which they do through various means, such as giving prominence to the Gurus of yore: hence the Guru Gedara Festival, which saw a confluence of artistes and connoisseurs mingling at the Chitrasena Kalayathanaya in August 2018. Moreover, the family has been heavily involved in inculcating a love of dancing in all age groups for over 75 years, through dance classes, specially curated dance workshops, concerts and scholarships for students who are passionate about dancing.

While hardship is what strengthens our inner selves, there are questions posed by Chitrasena that we need to ask ourselves and the authorities concerning the arts and their development in our land. “Yes, there is a burgeoning interest in expanding infrastructure in many different fields as part of post-war development. But what purpose will it serve if there are no artistes to perform in all the new theatres to be built, for instance?” queries Heshma, noting that the new theatres are not even affordable to most local artistes. “When I refer to dance I am not referring to the cabaret versions of our traditional forms. I am talking about the dancers who want to immerse themselves in a manner that refuses to compromise their art for any reason at all, not to cater to the whims and fancies of popular trends, vulgarization for financial gain or the simple dilution of these sacred art forms to appeal to audiences who are ignorant of their value,” she concludes. There are still a few master artistes, and some very talented young artistes, who care very deeply about our indigenous art forms and who need to be encouraged and supported to pursue their passion, which in turn will help preserve our rich cultural heritage. But support for the arts is so minimal in our country that one wonders how their steadfast devotion will prevail in this unhinged world where instant fixes run rampant.

Yet the cry of the torchbearers of unpretentious traditional dance theatre in our land is for a respectable platform and the support it rightly deserves, and this is an important moment in time to ensure the survival of our dance. With this thought, one needs to pay homage to Chitrasena, whose influence transcends cultures and metaphorical boundaries and binds the connoisseurs of dance and other art forms, leaving an indelible mark through the ages.

Amaratunga Arachchige Maurice Dias alias Chitrasena was born on 26 January 1921 at Waragoda, Kelaniya, in Sri Lanka. Around the same time, in India, Tagore had established his academy at Santiniketan, and the lectures he delivered on his visit to Sri Lanka in 1934 inspired a revolutionary change in the outlook of many educated men and women. Tagore had stressed the need for a people to discover its own culture to be able to assimilate fruitfully the best of other cultures. Chitrasena was a schoolboy at the time, and his father Seebert Dias’ house had become a veritable cultural confluence frequented by the literary and artistic intelligentsia of the time.

In 1936, at the age of 15, Chitrasena made his debut at the Regal Theatre in the lead role of Siri Sangabo, the first Sinhala ballet, produced and directed by his father and presented in Kandyan style. The performance created a stir among aficionados, who noticed the boy’s talents. D.B. Jayatilake, who was Vice-Chairman of the Board of Ministers under the British colonial administration, a Buddhist scholar, founder and first President of the Colombo Y.M.B.A., freedom fighter, Leader of the State Council and Minister of Home Affairs, was a great source of encouragement to the young dancer.

Chitrasena learnt the Kandyan dance from Algama Kiriganitha Gurunnanse, Muddanawe Appuwa Gurunnanse and Bevilgamuwe Lapaya Gurunnanse. Having mastered the traditional Kandyan dance, his ‘Ves Bandeema’, ceremony of graduation by placing the ‘Ves Thattuwa’ on the initiate’s head, followed by the ‘Kala-eliya’ mangallaya, took place in 1940. In the same year he proceeded to Travancore to study Kathakali dance at Sri Chitrodaya Natyakalalayam under Sri Gopinath, Court dancer in Travancore. He gave a command performance with Chandralekha (wife of portrait painter J.D.A. Perera) before the Maharaja and Maharani of Travancore at the Kowdiar Palace. He later studied Kathakali at the Kerala Kalamandalam.

In 1941, Chitrasena performed at the Regal Theatre, in one of the first dance recitals of its kind, before the Governor Sir Andrew Caldecott and Lady Caldecott, with Chandralekha and her troupe. Chandralekha was one of the first women to break into the field of Kandyan dance, followed by Chitrasena’s protégé and soul mate, Vajira, who became the first professional female dancer. Thereafter, Chitrasena and Vajira continued to captivate audiences worldwide with their dynamic performances, which later included their children, Upeka and Anjalika, and their students. The matriarch, Vajira, took over the reins at a time when the duo was forced to physically separate, with the loss of the house in Colpetty where they had lived and worked for over 40 years. Daughter Upeka then continued to uphold the tradition, leading the dance company to all corners of the globe during a very difficult time in the country. At present, the grandchildren Heshma, Umadanthi and Thaji interweave their unique talents and strengths into the legacy inspired by Guru Chitrasena.



Meat by any other name is animal flesh


In India most animal welfare people are vegetarians. We, in People for Animals, insist on that. After all, you cannot want to look after animals and then eat them. But most meat eaters, whether they are animal people or not, have a hesitant relationship with the idea of killing animals for food. They enjoy the taste of meat, but shy away from making the connection that animals have been harmed grievously in the process.

This moral conflict is referred to, in psychological terms, as the 'meat paradox'. A meat eater will eat caviar, but he will refuse to listen to someone telling him that it has been made from eggs obtained by slitting open the belly of a live pregnant fish. The carnivorous individual simply does not want to feel responsible for his actions. Meat eaters and sellers try to resolve this dilemma by adopting the strategy of mentally dissociating meat from its animal origins. For instance, ever since hordes of young people started shunning meat, the meat companies and their allies in government and the nutraceutical industry have deliberately switched to calling it "protein". This is an interesting manipulation of words and a last-ditch attempt to influence consumer behaviour.

For centuries meat has been a part of people's diet in many cultures. Global meat eating rose hugely in the 20th century, driven by urbanization, developments in meat production technology and, most importantly, the strategies used by the meat industry to dissociate the harming of animals from the flesh on the plate. Researchers note that these strategies "can be direct and explicit, such as denial of animals' pain, moral status, or intelligence, endorsement of a hierarchy in which humans are placed above non-human animals", using religion and god to amplify the belief that animals were created solely for humans and have no independent importance to the planet except as food and products. The French are taught, for instance, that animals cannot think.

Added to this is the justification of meat consumption based on spurious nutritional grounds. Doctors and dieticians, who are unwitting tools of the “nutritional science” industry, put their stamp on this shameless hard sell. 

The most important of all these strategies, and the one that has a profound effect on meat consumption, is the dissociation of meat from its animal origins. Important studies have been done on this (Kunst & Hohle, 2016; Rothgerber, 2013; Tian, Hilton & Becker, 2016; Foer, 2009; Joy, 2011; Singer, 1995). At the core of the meat paradox is the experience of cognitive dissonance. Cognitive dissonance theory proposes that situations involving conflicting behaviours, beliefs or attitudes produce a state of mental discomfort (Festinger, 1957). If a person holds two conflicting or inconsistent pieces of information, he feels uncomfortable. The mind strives for consistency between the two beliefs, and attempts are made to explain or rationalise them, reducing the discomfort. In doing so, the person wilfully distorts his or her perception of the world.

The meat eater actively employs dissociation as a coping strategy to regulate his conscience, and simply stops associating meat with animals.

In earlier hunter-gatherer and agricultural societies, people killed, or saw killed, the animals destined for their table. But since the mid-19th century the eater has been separated from the meat production unit. Singer (1995) says that getting meat from shops or restaurants is the last step of a gruesome process in which everything but the finished product is concealed. The process, the loading of animals into overcrowded trucks, the dragging into killing chambers, the killing, beheading, removal of skin, cleaning of blood, removal of intestines and cutting of the meat into pieces, is all kept secret, and the eater is left with neatly packed, ready-to-cook pieces with few reminders of the animal. No heads, bones, tails, feet. The industry manipulates the mind of the consumer so that he does not think of the once living and intelligent animal.

The language is changed to conceal the animal. Pig becomes pork, sausage, ham and bacon; cow becomes beef and calf becomes veal; goat becomes mutton; hen becomes chicken and white meat. And now all of them have become protein.

Then come rituals and traditions, which remove any kind of moral doubt. People often partake in rituals and traditions without reflecting on their rationale or consequences. Thanksgiving means turkey; Friday means fish. In India all rituals were once vegetarian; now many weddings serve meat. Animal sacrifice to the gods is part of this ritual.

Studies have found that people prefer, or actively choose, to buy and eat meat that does not remind them of its animal origins (Holm, 2018; Te Velde et al., 2002). Evans and Miele (2012), who investigated consumers' interactions with animal food products, show that the fast pace of food shopping, the presentation of animal foods, and the euphemisms used instead of the animal (e.g., pork, beef and mutton) reduced consumers' ability to reflect upon the animal origins of the food they were buying. Kubberod et al. (2002) found that high school students had difficulty in connecting different meat products to their animal origins, suggesting that dissociation was deeply entrenched in their consuming habits. Simons et al. found that people differed in what they considered meat: while red meat and steak were seen as meat, more processed and white meat (e.g., chicken nuggets) was sometimes not seen as meat at all, and was often not counted when participants in the study reported the frequency of their meat eating.

Kunst and Hohle (2016) demonstrated how the process of presenting and preparing meat, deliberately turning it from animal into product, led to less disgust and empathy for the killed animal and higher intentions to eat meat. If the animal-meat link was made obvious, by displaying the lamb for instance, or by putting the word "cow" instead of "beef" on the menu, the consumer avoided eating it and went for a vegetarian alternative. This is an important finding: by interrupting the mental dissociation, meat eating immediately went down. This explains how, during COVID, the pictures of people eating animals in Wuhan's markets actually put off thousands of carnivores, and meat sales went down. Experiments by Zickfeld et al. (2018) and Piazza et al. (2018) showed that pictures of animals, especially young animals, reduced people's willingness to eat meat.

Do gender differences exist when it comes to not thinking about the meat one eats?

In Kubberød and colleagues' (2002) study on disgust and meat consumption, substantial differences emerged between females and males. Men were more aware of the origins of different types of meat, yet did not consider those origins when consuming it. Women reported that they did not want to associate the meat they ate with a living animal, and that reminders would make them uncomfortable and sometimes even unable to eat the meat. In a study by Bray et al. (2016), who investigated parents' conversations with their children about the origins of meat, women were more likely than men to avoid these conversations, as they felt more conflicted about eating meat themselves. In a study by Kupsala (2018), female consumers expressed more tension over the thought of killing animals for food than men did. The supermarket customer group preferred products that did not remind them of animal origins, and showed a strong motivation to avoid any cues that highlighted the meat-animal connection. What emerged was that the women felt that contact with, and personification of, food-producing animals would sometimes make it impossible for them to eat animal products.

What are the other dissociation techniques that companies and societies use to make people eat meat? For men, the advertising is direct: masculinity, the inevitable fate of animals, the generational traditions of their family. For women it is far more indirect: simply hiding the source of the meat, and giving the animal victim a cute name to prevent disgust and avoidance.

Kubberod et al. (2002) compared groups from rural and urban areas but found little evidence of differences between them. Both urban and rural consumers in the study agreed that meat packaging and presentation functioned to conceal the link between the meat and the once living animal. Both groups also stated that if pictures of tied-up pigs, or pigs in stalls, were presented on the packaging of pork, or pictures of caged hens on egg cartons, they would not purchase the product in question.

Are people who are sensitive to disruptions of the dissociation process (or, in plain English, open to learning the truth about the lies they tell themselves) more likely to become vegetarians? Probably. Everyone has a conscience. The meat industry has tried to make you bury it. We, in the animal welfare world, should try to make it active again.

(To join the animal welfare movement contact gandhim@nic.in, www.peopleforanimalsindia.org)
