The legacy of milk drinking has not come to us from Krishna or the Vedas – nowhere in any of the sacred texts does anyone drink milk – but from the British. Rabid milk drinkers, they brought this culture into India, started the first dairy farms and promoted milk widely. Then our own governments, filled with people who had aped the West for so long, carried on this advertising and gave it a religious connotation (it is not the holy cow but holy milk), credited it with virtues like calcium and protein, and promoted it as an essential food for children to have at least three times a day. But does milk deserve its halo as a food that “does a body good”?
Now scientists are retracting their rubber stamps. Writing in the world’s premier medical journal, The New England Journal of Medicine, in February 2020, Walter Willett, MD, DrPH, professor of nutrition and epidemiology at the Harvard T.H. Chan School of Public Health, and his co-author, David Ludwig, MD, PhD, a professor of paediatrics and nutrition at Harvard, say the science behind those dietary recommendations is almost nonexistent, and that eating dairy may harm both our bodies and the planet. Willett is the best-known scientist on dairy in the world.
According to them, the core reason why people drink milk – its supposed calcium benefits – is based on flawed evidence. And does milk justify its major adverse impact on the environment? Willett says no.
Their study comes at the same time as another prestigious one: Elizabeth Jacobs, PhD, professor of epidemiology, biostatistics and nutritional sciences at the University of Arizona College of Public Health in Tucson, recommends in Nutrition Reviews that milk be downgraded as a separate and essential food. She says it should be placed in a much lower category, as one of many foods that could provide protein.
Milk drinking in America has fallen by 40% since 1975, but milk production and consumption have risen by 9%. That is because people are eating more cheese and yoghurt, and it takes far more milk to make these products. India shows the same pattern: much less milk is being fed to children, but consumption of paneer, dahi and sweets made of milk is up.
According to the team led by Willett, the recommendations for milk as a major calcium source came from a few small studies that involved only a handful of people and ran for just a few weeks. Researchers measured how much calcium the subjects ate and drank, and compared it to how much they were excreting. The idea was to find out how much calcium the body needs to stay in balance.
In adults, the calcium balance should be net zero – i.e., the body should excrete the same amount as it ingests. Researchers of these small studies concluded that for Americans 741 milligrams of calcium a day was enough for balance.
But in countries where dairy was not a common food – Peru, for instance (and all of Asia and Africa before the British) – the amount needed for balance was much less, around 200 milligrams. As far back as 1951, Harvard University nutritionist Mark Hegsted did a study to find out whether calcium was really needed in such quantities by the body. He looked for a chronically calcium-deprived population and found one in the Central Penitentiary of Lima, Peru, where prisoners subsisted on a rice-and-beans diet and drank milk once a week. Hegsted monitored their calcium intake and compared it to the amount they excreted in their waste, to calculate how much calcium they retained. The average prisoner took in just 126 milligrams of calcium a day, yet tests still showed normal levels of calcium in his body. Willett says the body simply absorbs what it needs for balance, taking more from whatever food is eaten – whether green leaves, fruit and vegetables, or fish.
But what happens to the body when large amounts of dairy calcium are poured into it? Willett points to large population-based studies that have followed how people eat for years (not weeks) and measured what happens to their health. These studies consistently show that the populations that consume the most milk have the highest number of bone fractures (especially hip fractures), i.e., the weakest, most fragile bones – exactly the opposite of what parents want to achieve when they force their children to drink milk. The risk appears to be highest for men who drank a lot of milk in childhood. One huge study found that women who drank 2.5 or more glasses of milk a day had a higher risk of fractures than women who drank less than one glass a day.
What are the other claims that Willett has found to have no basis in fact?
That milk makes you lose weight.
That milk can help control blood pressure.
That dairy lowers the rates of cardiovascular disease: in fact, research shows that dairy has no effect on cardiovascular disease rates.
That dairy consumption lowers bone fracture rates: this goes against every study, which actually suggests the opposite.
That dairy lengthens lifespan: there was no link between lifespan and eating dairy.
All in all, every study done to date shows that “milk is not essential for health,” says Marion Nestle, PhD, professor of nutrition, food studies, and public health at New York University.
According to the data, there is good evidence that kids who drink cow’s milk grow taller than those who don’t (I would contest that: this has to do with genetics as well – Maneka Gandhi). But there is no scientific explanation of why or how milk accelerates growth. Willett offers a sound reason: cows are often pregnant when they’re milked, which increases hormones like estrogen and progesterone. Cows also produce more of another hormone, insulin-like growth factor (IGF), which increases milk production (and has been linked to cancer). These hormones may also promote growth in people.
Does milk create strong bones in children? Children need calcium for bone building, but do they need to get it from milk? The authors of this review showed that no studies have proven the link between dairy and strong bones.
A study, “The effect of dairy intake on bone mass and body composition in early pubertal girls and boys: a randomized controlled trial”, by Vogel, Martin, McCabe et al., published by the American Society for Nutrition in 2017 and done on 240 healthy 8- to 15-year-old children of different colour and weight, showed that feeding them three added servings of dairy, as against a control group that had none, had no effect at all. After 18 months the study found no difference in bone density between the children who had more dairy and the ones who didn’t. The U.S. recommends that children ages 4 to 8 get 1,000 milligrams of calcium in their diets; the U.K. recommends half that much, just 450 to 550 milligrams a day. The difference in recommendations is not because scientists differ, but depends on the political weightage of the dairy industry.
Humans get far more calcium from green vegetables, nuts and beans. When you eat a food that makes no difference to your calcium or protein but carries chemicals, artificial hormones, antibiotics and pesticides, you also bring disaster to the environment. Dairy farms consume masses of water and contribute to water pollution, and the cattle emit methane, which heats the atmosphere far more than carbon dioxide. Why don’t you do the world a favour and eat plastic instead?
(To join the animal welfare movement contact email@example.com, www.peopleforanimalsindia.org)
End of Fukuyama’s last man, and triumph of nationalism
By Uditha Devapriya
What, I wonder, are we to make of nationalism, the most powerful political force we have today? Liberals dream on about its inevitable demise, rehashing a line they’ve been touting since goodness-knows-when. Neoliberals do the same, except their predictions of its demise have less to do with the utopian triumph of universal values than with their undying belief in the disappearance of borders and the interlinking of countries and cities through the gospel of trade. Both are wrong, and grossly so. There is no such thing as a universal value, and even those described as such tend to differ in time and place. There is such a thing as trade, and globalisation has made borders meaningless. But far from making nationalism meaningless, trade and globalisation have in fact bolstered its relevance.
The liberals of the 1990s were (dead) wrong when they foretold the end of history. That is why Francis Fukuyama’s essay reads so much like a wayward prophet’s dream today. And yet, those who quote Fukuyama tend to focus on his millenarian vision of liberal democracy, with its impending triumph across both East and West. This is not all there is to it.
To me what’s interesting about the essay isn’t his thesis about the end of history – whatever that meant – but what, or who, heralds it: Fukuyama’s much ignored “last man.” If we are to talk about how nationalism triumphed over liberal democracy, how populists trumped the end of history, we must talk about this last man, and why he’s so important.
In Fukuyama’s reading of the future, mankind gets together and achieves a state of perfect harmony. Only liberal democracy can galvanise humanity to aspire to and achieve this state, because only liberal democracy can provide everyone enough of a slice of the pie to keep us and them – majority and minority – happy. This is a bourgeois view of humanity, and indeed no less a figure than Marx observed that for the bourgeoisie, the purest political system was the bourgeois republic. In this purest of political systems, this bourgeois republic, Fukuyama sees no necessity for further progression: with freedom of speech, the right to assemble and dissent, an independent judiciary, and separation of powers, human beings get to resolve, if not troubleshoot, all their problems. Consensus, not competition, becomes the order of the day. There can be no forward march; only a turning back.
Yet that future state of affairs suffers from certain convulsions. History is a series of episodic progressions, each aiming at something better and more ideal. If liberal democracy, with its championing of the individual and the free market, triumphs in the end, it must be preceded by the erosion of community life. The problem here is that like all species, humanity tends to congregate, to gather as collectives, as communities.
“[I]n the future,” Fukuyama writes, “we risk becoming secure and self-absorbed last men, devoid of thymotic striving for higher goals in our pursuit of private comforts.” Being secure and self-absorbed, we become trapped in a state of stasis; we think we’re in a Panglossian best of all possible worlds, as though there’s nothing more to achieve.
Fukuyama calls this “megalothymia”, or “the desire to be recognised as greater than other people.” Since human beings think in terms of being better than the rest, reaching a point where we no longer need to show we’re better leaves us with a sense of restless dissatisfaction. The inevitable follows: some of us try to find ways of doing something that will put us a cut above the rest. In the rush to the top, we end up in “a struggle for recognition.”
Thus the last men of history, in their quest to find some way they can show that they’re superior, run the risk of becoming the first men of history: rampaging, irrational hordes, hell-bent on fighting enemies, at home and abroad, real and imagined.
Fukuyama tries to downplay this risk, contending that liberal democracy provides the best antidote against a return to such a primitive state of nature. And yet even in this purest of political systems, security becomes a priority: to prevent a return to savagery, there must be an adequate deterrent against it. In his scheme of things, two factors prevent history from realising the ideals of humanity, and it is these that make such a deterrent vital: persistent war and persistent inequality. Liberal democracy does not resolve these to the extent of making them irrelevant. Like dregs in a teacup, they refuse to dissolve.
The problem with those who envisioned this end of history was that they conflated it with the triumph of liberal democracy. Fukuyama committed the same error, but most of those who point at his thesis miss out on the all too important last part of his message: that built into the very foundation of liberal democracy are the landmines that can, and will, blow it up. Yet this does not erase the first part of his message: that despite its failings, liberal democracy can still render other political forms irrelevant, simply because, in his view, there is no alternative to free markets, constitutional republicanism, and the universal tenets of liberalism. There may be such a thing as civilisation, and it may well divide humanity. Such niceties, however, will sooner or later give way to the promise of globalisation and free trade.
It is no coincidence that the latter terms belong in the dictionary of neoliberal economists, since, as Kanishka Goonewardena has put it pithily, no one rejoiced at Fukuyama’s vision of the future of liberal democracy more than free market theorists. But could one have blamed them for thinking that competitive markets would coexist with a political system supposedly built on cooperation? To rephrase the question: could one have foreseen that in less than a decade of untrammelled deregulation, privatisation, and the like, the old forces of ethnicity and religious fundamentalism would return? Between the fall of the Berlin Wall and Srebrenica, barely six years had passed. How had the prophets of liberalism got it so wrong?
Liberalism traces its origins to the mid-19th century. It had the defect of being younger, much younger, than the forces of nationalism it had to fight and put up with. Fast-forward to the end of the 20th century, the breakup of the Soviet Union, and the shift in world order from bipolarity to multipolarity, and you had these two foes fighting each other again, only this time with the apologists of free markets to boot. This three-way encounter or Mexican standoff – between the nationalists, the liberal democrats, and the neoliberals – did not end up in favour of dyed-in-the-wool liberal democrats. Instead it ended up vindicating both the nationalists and the neoliberals. Why it did so must be examined here.
The fundamental issue with liberalism, which nationalism does not suffer from, is that it views humanity as one. Yet humanity is not one: man is man, but he is also rich, poor, more privileged, and less privileged. Even so, liberal ideals such as the rule of law, separation of powers, and judicial independence tend to believe in the equality of citizens.
So long as this assumption is limited to political theory, nothing wrong can come out of believing it. The problem starts when such theories are applied as economic doctrines. When judges rule in favour of welfare cuts or in favour of corporations over economically backward communities, for instance, the ideals of humanity no longer appear as universal as they once were; they appear more like William Blake’s “one law for the lion and ox.”
That disjuncture didn’t trouble the founders of European liberalism, be it Locke, Rousseau, or Montesquieu, because for all their rhetoric of individual freedoms and liberties they never pretended to be writing for anyone other than the bourgeoisie of their time. Indeed, John Stuart Mill, beloved by advocates of free markets in Sri Lanka today, bluntly observed that his theories did not apply to slaves or subjects of the colonies. To the extent that liberalism remained cut off from the “great unwashed” of humanity, then, it could thrive because it did not face the problem of reconciling different classes into one category. Put simply, humanity for 19th century liberals looked white, bourgeois, and European.
The tail-end of the 20th century could not have been more different from this state of affairs. I will not go into the whys and hows, but I will say that between the liberal promise of all humanity merging as one, the nationalist dogma of everyone pitting against everyone else, and the neoliberal paradigm of competition and winner-takes-all, the winner could certainly not be the ideologues who believed in the withering away of cultural differences and the coming together of humanity. As the century drew to a close, it became increasingly obvious that the winners would be the free market and the nationalist State. How exactly?
Here I would like to propose an alternative reading of not just Fukuyama’s end of history and last man, but also the triumph of nationalism and neoliberalism over liberal democracy. In 1992 Benjamin Barber wrote an interesting, if controversial, essay titled “Jihad vs. McWorld” for The Atlantic, in which he argued that two principles governed the post-Cold War order, and that, of the two, narrow nationalism threatened globalisation. Andre Gunder Frank wrote a reply to Barber in which he contended that, far from opposing one another, narrow nationalism, or tribalism, in fact resembled the forces of globalisation – free markets and free trade – in how they promoted the transfer of resources from the many to the few.
For Gunder Frank, the type of liberal democracy Barber championed remained limited to a narrow class, far too small to be inclusive and participatory. In that sense “McWorldisation”, or the spread of multinational capital to the most far-flung corners of the planet, would not lead to the disappearance of communal or cultural fragmentation, but would rather bolster and lay the groundwork for such fragmentation. Having polarised entire societies, especially those of the Global South, along class lines, McWorldisation becomes a breeding ground for the very “axial principle” Barber saw as its opposite: “Jihadism.”
Substitute neoliberalism for McWorldisation, nationalism for Jihadism, and you see how the triumph of one has not led to the defeat of the other. Ergo, my point: nationalism continues to thrive, not just because (as is conventionally assumed) liberal democracy vis-à-vis Francis Fukuyama failed, but more importantly because, in its own way, neoliberalism facilitated it. Be it Jihadism there or Jathika Chintanaya here, in the Third World of the 21st century, what should otherwise have been a contradiction between two forces opposed to each other has instead become a union of two opposites. Hegel’s thesis and antithesis have hence become a grand franken-synthesis, one which will govern the politics of this century for as long as neoliberalism survives, and for as long as nationalism thrives on it.
The writer can be reached at firstname.lastname@example.org
Chitrasena: Traditional dance legacy perseveres
By Rochelle Palipane Gunaratne
Where would Mother Lanka’s indigenous dance forms be, if not for the renaissance of traditional dance in the early 1940s? January 26, 2021 marked the 100th birth anniversary of the legendary Guru Chitrasena who played a pivotal role in reviving a dance form which was lying dormant, ushering in a brand new epoch to a traditional rhythmic movement that held sway for over two millennia.
“There was always an aura that drew us all to Seeya and we were mesmerized by it,” enthused Heshma, Artistic Director of the Chitrasena Dance Company and eldest grand-daughter of the doyen of dance. She reminisced about her legendary grandfather during a brief respite from working on a video depicting his devotion to a dance form that chose him.
“Most classical art forms require a lifetime of learning and dedication as it’s also a discipline which builds character and that is what we have been inculcated with by Guru Chitrasena, who also left us with an invaluable legacy,” emphasized Heshma, adding that it makes everything else pale in comparison and provides the momentum even when faced with trials.
Blazing a dynamic trail
The patriarch’s life and times resonated with an era of change in Ceylon: here was an island nation almost overshadowed by a gigantic peninsula whose influence had been colossal, and colonization by western empires had meant a further suppression for over four centuries. Yet, hidden in the island’s folds were artistes, dancers and others who held on almost devoutly to their sacred doctrines. The time was ripe for the harvest and the need for change was almost palpable. To this era was born Chitrasena, who took the idea by its horns and led it all the way to the world stage.
He coaxed the hidden treasures of the island out of the Gurus of old, whose birthright the traditional dance forms were, and who had neither need nor desire for the stage. Their repertoire was relegated to village ceremonies, peraheras and ritual sacrifices. The nobles of the time sometimes entertained themselves watching these ‘devil dancers.’ In fact, some of these traditional dancers are said to have been taken abroad as part of a ‘human circus’ act in the late 1800s.
But how did Chitrasena change that thinking? He went in search of these traditional Gurus, lived with them, learned the traditions and then re-presented them as a respectable dance art on the stage. He revolutionized the manner in which we, colonized islanders, viewed what was endemic to us; suffice it to say he gave it the pride and honour it deserved. But it came with a supreme sacrifice: a lifetime of commitment to dancing, braving the criticism and other challenges constantly put up to deter him. Not only did he commit himself to this colossal task, but the involvement of his immediate family and his family of dancers was exceptional, bordering on devotion, as their lives revolved around dance alone.
Imbued in them is the desire to dance and share their knowledge with others and it is done through various means, such as giving prominence to Gurus of yore, hence the Guru Gedara Festival which saw the confluence of many artistes and connoisseurs who mingled at the Chitrasena Kalayathanaya in August 2018. Moreover the family has been heavily involved in inculcating a love for dancing in all age groups through various dance classes for over 75 years, specifically curated dance workshops, concerts and scholarships for students who are passionate about dancing.
While hardship is what strengthens our inner selves, there were questions posed by Chitrasena that we need to ask ourselves and the authorities concerning the arts and their development in our land. “Yes, there is a burgeoning interest in expanding infrastructure in many different fields as part of post war development. But what purpose will it serve if there are no artistes to perform in all the new theatres to be built for instance?” queries Heshma. The new theatres we have now are not even affordable to most of the local artistes. “When I refer to dance I am not referring to the cabaret versions of our traditional forms. I am talking about the dancers who want to immerse themselves in a manner that refuses to compromise their art for any reason at all, not to cater to the whims and fancies of popular trends, vulgarization for financial gain or simply diluting these sacred art forms to appeal to audiences who are ignorant about its value,” she concludes. There are still a few master artistes and some very talented young artistes, who care very deeply about our indigenous art forms, who need to be encouraged and supported to pursue their passion, which then will help preserve our rich cultural heritage. But the support for the arts is so minimal in our country that one wonders as to how their astute devotion will prevail in this unhinged world where instant fixes run rampant.
Yet, the cry of the torchbearers of unpretentious traditional dance theatre in our land, is to provide it a respectable platform and the support it rightly deserves, and this is an important moment in time to ensure the survival of our dance. With this thought, one needs to pay homage to Chitrasena whose influence transcends cultures and metaphorical boundaries and binds the connoisseurs of dance and other art forms, leaving an indelible mark through the ages.
Amaratunga Arachchige Maurice Dias alias Chitrasena was born on 26 January 1921 at Waragoda, Kelaniya, in Sri Lanka. In India, meanwhile, Tagore had established his academy, Santiniketan, and his lectures on his visit to Sri Lanka in 1934 inspired a revolutionary change in the outlook of many educated men and women. Tagore had stressed the need for a people to discover their own culture in order to fruitfully assimilate the best of other cultures. Chitrasena was a schoolboy at the time, and his father Seebert Dias’ house had become a veritable cultural confluence frequented by the literary and artistic intelligentsia of the time.
In 1936, Chitrasena made his debut at the Regal Theatre, at the age of 15, in the lead role of Siri Sangabo, the first Sinhala ballet, produced and directed by his father. Presented in Kandyan style, it created a stir among the aficionados, who noticed the boy’s talents. D.B. Jayatilake – Vice-Chairman of the Board of Ministers under the British colonial administration, Buddhist scholar, founder and first President of the Colombo Y.M.B.A., freedom fighter, Leader of the State Council and Minister of Home Affairs – was a great source of encouragement to the young dancer.
Chitrasena learnt the Kandyan dance from Algama Kiriganitha Gurunnanse, Muddanawe Appuwa Gurunnanse and Bevilgamuwe Lapaya Gurunnanse. Having mastered the traditional Kandyan dance, his ‘Ves Bandeema’, ceremony of graduation by placing the ‘Ves Thattuwa’ on the initiate’s head, followed by the ‘Kala-eliya’ mangallaya, took place in 1940. In the same year he proceeded to Travancore to study Kathakali dance at Sri Chitrodaya Natyakalalayam under Sri Gopinath, Court dancer in Travancore. He gave a command performance with Chandralekha (wife of portrait painter J.D.A. Perera) before the Maharaja and Maharani of Travancore at the Kowdiar Palace. He later studied Kathakali at the Kerala Kalamandalam.
In 1941, Chitrasena performed at the Regal Theatre, in one of the first dance recitals of its kind, before the Governor Sir Andrew Caldecott and Lady Caldecott, with Chandralekha and her troupe. Chandralekha was one of the first women to break into the field of the Kandyan dance, followed by Chitrasena’s protégé and soul mate, Vajira, who then became the first professional female dancer. Thereafter, Chitrasena and Vajira continued to captivate audiences worldwide with their dynamic performances, which later included their children, Upeka and Anjalika, and students. The matriarch, Vajira, took over the reins at a time when the duo was forced to part with the house in Colpetty where they had lived and worked for over 40 years. Daughter Upeka then continued to uphold the tradition, leading the dance company to all corners of the globe during a very difficult time in the country. At present, the grand-children Heshma, Umadanthi and Thaji interweave their unique talents and strengths into the legacy inspired by Guru Chitrasena.
Meat by any other name is animal flesh
In India most animal welfare people are vegetarians. We, in People for Animals, insist on that. After all, you cannot want to look after animals and then eat them. But most meat eaters, whether they are animal people or not, have a hesitant relationship with the idea of killing animals for food. They enjoy the taste of meat, but shy away from making the connection that animals have been harmed grievously in the process.
This moral conflict is referred to, in psychological terms, as the ‘meat paradox’. A meat eater will eat caviar, but he will refuse to listen to someone telling him that it has been made from eggs taken by slitting the stomach of a live pregnant fish. The carnivorous individual simply does not want to feel responsible for his actions. Meat eaters and sellers try to resolve this dilemma by adopting the strategy of mentally dissociating meat from its animal origins. For instance, ever since hordes of young people started shunning meat, the meat companies and their allies in the government and the nutraceutical industry have deliberately switched to calling it “protein”. This is an interesting manipulation of words and a last-ditch attempt to influence consumer behaviour.
For centuries meat has been a part of people’s diet in many cultures. Global meat eating rose hugely in the 20th century, driven by urbanization and developments in meat production technology – and, most importantly, by the strategies used by the meat industry to dissociate the harming of animals from the flesh on the plate. Researchers say, “These strategies can be direct and explicit, such as denial of animals’ pain, moral status, or intelligence, endorsement of a hierarchy in which humans are placed above non-human animals” (using religion and god to amplify the belief that animals were created solely for humans, and had no independent importance for the planet, except as food and products). The French are taught, for instance, that animals cannot think.
Added to this is the justification of meat consumption based on spurious nutritional grounds. Doctors and dieticians, who are unwitting tools of the “nutritional science” industry, put their stamp on this shameless hard sell.
The most important of all these strategies, and the one that has a profound effect on meat consumption, is the dissociation of meat from its animal origins. Important studies have been done on this (Kunst & Hohle, 2016; Rothgerber, 2013; Tian, Hilton & Becker, 2016; Foer, 2009; Joy, 2011; Singer, 1995). At the core of the meat paradox is the experience of cognitive dissonance. Cognitive dissonance theory proposes that situations involving conflicting behaviours, beliefs or attitudes produce a state of mental discomfort (Festinger, 1957). If a person holds two conflicting or inconsistent pieces of information, he feels uncomfortable. So the mind strives for consistency between the two beliefs, and attempts are made to explain or rationalize them, reducing the discomfort. In doing so, the person wilfully distorts his or her perception of the world.
The meat eater actively employs dissociation as a coping strategy to regulate his conscience, and simply stops associating meat with animals.
In earlier hunter-gatherer and agricultural societies, people killed or saw animals killed for their table. But from the mid-19th century the eater has been separated from the meat production unit. Singer (1995) says that getting meat from shops, or restaurants, is the last step of a gruesome process in which everything, but the finished product, is concealed. The process: the loading of animals into overcrowded trucks, the dragging into killing chambers, the killing, beheading, removing of skin, cleaning of blood, removal of intestines and cutting the meat into pieces, is all secret and the eater is left with neatly packed, ready-to-cook pieces with few reminders of the animal. No heads, bones, tails, feet. The industry manipulates the mind of the consumer so that he does not think of the once living and intelligent animal.
The language is changed to conceal the animal. Pig becomes pork, sausage, ham and bacon; cows become beef and calves become veal; goat becomes mutton; and hens become chicken and white meat. And now all of them have become “protein”.
Then come rituals and traditions, which remove any kind of moral doubt. People often partake in rituals and traditions without reflecting on their rationale or consequences. Thanksgiving is turkey; Friday is fish. In India all rituals were vegetarian; now, many weddings serve meat. Animal sacrifice to the gods is part of this ritual.
Studies have found that people prefer, or actively choose, to buy and eat meat that does not remind them of its animal origins (Holm, 2018; Te Velde et al., 2002). Evans and Miele (2012), who investigated consumers' interactions with animal food products, showed that the fast pace of food shopping, the presentation of animal foods, and the euphemisms used instead of the animal's name (e.g., pork, beef and mutton) reduced consumers' ability to reflect upon the animal origins of the food they were buying. Kubberød et al. (2002) found that high school students had difficulty connecting different meat products with their animal origins, suggesting that dissociation was deeply entrenched in their consuming habits. Simons et al. found that people differed in what they considered meat: while red meat and steak were seen as meat, more processed and white meat (e.g., chicken nuggets) was sometimes not seen as meat at all, and was often not counted when participants in the study reported the frequency of their meat eating.
Kunst and Hohle (2016) demonstrated how the process of presenting and preparing meat, deliberately turning it from animal into product, led to less disgust and empathy for the killed animal and higher intentions to eat meat. If the animal-meat link was made obvious – by displaying the lamb, for instance, or putting the word cow instead of beef on the menu – the consumer avoided the dish and chose a vegetarian alternative. This is an important finding: when the mental dissociation was interrupted, meat eating immediately went down. This explains why, during COVID, the pictures of animals being eaten in Wuhan's markets actually put off thousands of carnivores, and meat sales fell. In experiments by Zickfeld et al. (2018) and Piazza et al. (2018), showing pictures of animals, especially young animals, reduced people's willingness to eat meat.
Do gender differences exist when it comes to not thinking about the meat one eats?
In Kubberød and colleagues' (2002) study on disgust and meat consumption, substantial differences emerged between females and males. Men were more aware of the origins of different types of meat, yet did not consider those origins when consuming it. Women reported that they did not want to associate the meat they ate with a living animal, and that reminders would make them uncomfortable and sometimes even unable to eat the meat. In a study by Bray et al. (2016), who investigated parents' conversations with their children about the origins of meat, women were more likely than men to avoid these conversations, as they felt more conflicted about eating meat themselves. In a study by Kupsala (2018), female consumers expressed more tension at the thought of killing animals for food than men did. The supermarket customer group preferred products that did not remind them of animal origins, and showed a strong motivation to avoid any clues that highlighted the meat-animal connection. What emerged was that the women felt that contact with, and personification of, food-producing animals would sometimes make it impossible for them to eat animal products.
What are the other dissociation techniques that companies and societies use to make people eat meat? For men, the advertising is direct: masculinity, the inevitable fate of animals, the generational traditions of the family. For women it is far more indirect: simply hiding the source of the meat, and giving the animal victim a cute name, to prevent disgust and avoidance.
Kubberød et al. (2002) compared groups from rural and urban areas but found little evidence of differences between them. Moreover, both urban and rural consumers in the study agreed that meat packaging and presentation functioned to conceal the link between the meat and the once-living animal. Both groups also stated that if pictures of tied-up pigs, or pigs in stalls, were printed on the packaging of pork, or pictures of caged hens on egg cartons, they would not purchase the product in question.
Are people who are sensitive to disruptions of the dissociation process (or, in plain English, open to learning the truth about the lies they tell themselves) more likely to become vegetarians? Probably. Everyone has a conscience. The meat industry has tried to make you bury it. We, in the animal welfare world, should try to make it active again.
(To join the animal welfare movement contact email@example.com, www.peopleforanimalsindia.org)