Sunday 15 November 2015

Natural fats vs artificial substitutes

We are always caught up in the debate over which type of fat is better: natural fat or artificial substitutes. To understand this properly, below is an excerpt from the book, "The Complete Book of Chinese Health and Healing-- Guarding the Three Treasures" by Daniel Reid:

Medical myths about fats have steered millions of Western people into self-destructive dietary folly in recent years. Natural unadulterated fats are not only highly nutritious, gram for gram they contain far more energy than any other type of food on earth, which makes them the most efficient fuel for essence-to-energy food alchemy. Natural fats contain nutrients which are absolutely essential for proper functioning of the brain, heart, and immune system, but despite this fact, the Western medical establishment, along with the media and processed-food industry, have condemned natural fats as killers and suggest instead that we all switch over to 'low-fat' or 'no-fat' products in which natural fats have been replaced by hydrogenated vegetable oils.

First, let's discuss why the body requires fats and how it uses them, then take a look at the artificial substitutes.

Natural fats such as butter, nut oils, and fish oils contain important nutrients called 'essential fatty acids', which are required for many metabolic processes and vital functions. Among other things, fatty acids are required to build and repair cellular membranes, especially in brain, nerve, and white blood cells, and to keep blood vessels clean and well lubricated. Two of them--linoleic and linolenic acid-- cannot be synthesized in the body and must therefore be obtained from dietary sources. According to Dr Cass Igram, one of America's leading nutritional scientists, virtually all Americans are deficient in essential fatty acids.

Fats are about twice as efficient in producing energy as any other type of food, including complex carbohydrates and natural sugars. The essence-to-energy conversion of fats takes place in tiny power plants within each cell, called mitochondria, which prefer fat over all other fuels. But the fat must be natural and unadulterated in order to yield viable cellular energy. That means butter, meat, fish, nuts, seeds, and cold-pressed oils. The traditional Eskimo diet included mounds of raw fat from whales, seals, and fish, but Eskimos never experienced problems with arteriosclerosis and heart disease until they switched from natural fats to processed American foods made with hydrogenated vegetable oils, sugar, and starch. The Japanese also eat a lot of natural raw fish oils in the form of sashimi and sushi, which contain abundant supplies of essential fatty acids. Cold-pressed olive oil has been a mainstay in Mediterranean diets for thousands of years, and these countries are known for their relatively low incidence of cancer and heart disease. In China, people traditionally used cold-pressed sesame and peanut oil for cooking and making condiments, and in India, essential fatty acids are obtained through abundant use of clarified butter called ghee.

During World War II, when butter became scarce, American chemists fiddled around with vegetable oils to produce a butter substitute and came up with margarine and 'shortening'. They did this by heating various vegetable oils to over 500 degrees F, then pumping hydrogen through them and adding nickel as a catalyst to harden them. The result is a solid fat substitute with a molecular structure very similar to plastic.

When natural fats are eliminated from the diet in favour of hydrogenated-oil substitutes, the body is forced to use these denatured fat molecules in place of the natural fatty acids missing from the diet. White blood cells, which are pillars of the immune system, are particularly dependent on essential fatty acids. Here's how Dr Igram describes what happens to white cells when hydrogenated oils replace natural fats in the diet, excerpted from his book 'Eat Right or Die Young':
'These cells incorporate the hydrogenated fats you eat into their membranes. When this happens, the white cells become sluggish in function, and their membranes actually become stiff! Such white blood cells are poor defenders against infection. This leaves the body wide open to all sorts of derangements of the immune system. Cancer, or infections by yeasts, bacteria and viruses can more easily take a foothold... In fact, one of the quickest ways to paralyze your immune system is to eat, on a daily basis, significant quantities of deep-fried foods, or fats such as margarine... No wonder that a high consumption of margarine, shortening, and other hydrogenated fats is associated with a greater incidence of a variety of cancers.'

Besides cancer, regular consumption of hydrogenated-oil products, including non-dairy creamers and toppings and virtually all processed and packaged foods, is closely associated with an increased risk of arteriosclerosis, heart disease, autoimmune diseases, candidiasis, and high blood pressure.

The heart is particularly fond of natural fats as fuel, and heart cells specialize in the conversion of fats into energy. In order to do this, however, a nutrient called 'carnitine' is required to deliver fats into the cells for combustion. 'Fats cannot be properly combusted without adequate amounts of carnitine,' writes Dr Igram. Carnitine is an amino acid synthesized in the liver from two other amino acids-- lysine and methionine-- both of which must be obtained from dietary sources. If you have sufficient supplies of carnitine, you can eat all the natural fats you want, because carnitine helps burn fat, especially in the heart, which never rests.

The richest dietary sources of carnitine are fish, avocado and wheat germ. The best sources of essential fatty acids are deep-water ocean fish such as tuna and salmon, avocados, almonds, pecans, and pumpkin, pine, and sunflower seeds. The best choices in cooking oils are cold-pressed olive, corn, sunflower, sesame, and safflower oils. Clarified butter or ghee is better than ordinary butter for cooking because it can withstand higher temperatures without damage. Avoid all products made with hydrogenated or partially hydrogenated vegetable oils, including commercial mayonnaise, bottled salad dressings, margarine, shortening, and virtually all processed foods.

Is milk good for us?

After reading a lot of articles on the pros and cons of drinking milk, I stopped drinking milk about six years ago. Occasionally I would still have milk or cottage cheese or some milk sweets, but soon I started seeing the results of cutting dairy products from my diet-- less joint pain, clearer skin, etc. Recently one of my Chinese friends, who practices traditional Chinese medicine (TCM), gave me a book to read titled "The Complete Book of Chinese Health and Healing--Guarding the Three Treasures" by Daniel Reid. When I read the pages on 'Dairy', it resonated with what I had read earlier. Below is an excerpt from this book on dairy:

Cow's milk is meant for calves, and babies are meant to drink mother's milk until weaned from it. Nature has designed both types of milk and digestive systems accordingly. It is a scientifically documented fact that calves fed on pasteurized milk from their own mother cows usually die within six weeks, so it stands to reason that pasteurized cow's milk is not a wholesome, life-sustaining food for calves, much less for humans. Yet not only do adult humans feed this denatured animal secretion to their own infants, they also consume it themselves.

Cow's milk has four times the protein and only half the carbohydrate content of human milk; pasteurization destroys the natural enzyme in cow's milk required to digest its heavy protein content. This excess milk protein therefore putrefies in the human digestive tract, clogging the intestines with sticky sludge, some of which seeps into the bloodstream. As this putrid sludge accumulates from daily consumption of dairy products, the body forces some of it out through the skin (acne, blemishes) and lungs (catarrh), while the rest of it festers inside, forms mucus that breeds infections, causes allergic reactions, and stiffens joints with calcium deposits. Many cases of chronic asthma, allergies, ear infections, and acne have been totally cured simply by eliminating all dairy products from the diet.

Cow's milk products are particularly harmful to women. Milk is supposed to flow out of, not into, women's bodies. The debilitating effects of pasteurized cow's milk on women are further aggravated by the synthetic hormones cows are injected with to increase milk production. These chemicals play havoc with the delicately balanced female endocrine system. In 'Food and Healing', the food therapist Annemarie Colbin describes the dairy disaster for women as follows:
"The consumption of dairy products, including milk, cheese, yogurt, and ice cream, appears to be strongly linked to various disorders of the female reproductive system, including ovarian tumors and cysts, vaginal discharges, and infections. I see this link confirmed time and again by the countless women I know who report these problems diminishing or disappearing altogether after they have stopped consuming dairy food. I hear of fibroid tumors being passed or dissolved, cervical cancer arrested, menstrual irregularities straightened out...."

Many women, as well as men, consume dairy products because their doctors tell them it's a good source of calcium. This is fallacious advice. True, cow's milk contains 118 mg of calcium in every 100 g, compared to 33 mg/100 g in human milk. But cow's milk also contains 97 mg phosphorus/100 g, compared to only 18 mg in human milk. Phosphorus combines with calcium in the digestive tract and actually blocks its assimilation. Dr Frank Oski, chairman of the Department of Pediatrics at the State University of New York's Medical Center, states: "Only foods with a calcium-to-phosphorus ratio of two-to-one or better should be used as a primary source of calcium." The ratio in human milk is 2.35 to one; in cow's milk it is only 1.27 to one. Cow's milk also contains 50 mg sodium/100 g, compared with only 16 mg in human milk, so dairy products are probably one of the most common sources of excess sodium in the modern Western diet.

Besides, cow's milk is not nearly as good a source of calcium as other far more digestible and wholesome foods. Compare the 118 mg calcium/100 g cow's milk with 100 g of the following foods: almonds (254 mg), broccoli (130 mg), kale (187 mg), sesame seeds (1,160 mg), kelp (1,093 mg), and sardines (400 mg).

As for osteoporosis, it is caused not so much by a calcium deficiency in the diet as by dietary factors which leach calcium from bones and teeth, especially sugar. Sugar, meat, refined starch, and alcohol all cause a constant state of acidosis in the bloodstream, and acid blood is known to dissolve calcium from bones. The best way to correct osteoporosis is to consume the non-dairy calcium-rich foods mentioned above, while simultaneously cutting down on or eliminating these acidifying calcium robbers from the diet. A daily supplement of 3 mg of the mineral boron also seems to help bones assimilate and retain calcium.

From the traditional Chinese medical point of view, milk is a form of 'sexual essence'. For the human species to drink the sexual essence of another species can only lead to trouble, especially for females, because the hormones it contains will upset the sensitive balance of the human endocrine system.

If you insist on consuming dairy products, your best bet is goat's milk, which approximates the nutritional composition and balance of human milk. The only safe products made from cow's milk are fresh butter, which is a digestible fat, and fresh live-culture yogurt, which is predigested for you by lactobacteria, but even these should be consumed in moderation and preferably prepared from raw unpasteurized milk.

Friday 6 November 2015

Lecture me, really.

I resumed my teaching three weeks ago after the autumn break, and while trying to explain something to a group of students, I asked for paper and pen. None of the students in that group had them, and when I asked the whole class, only two students had paper and pen with them. Their reason was that since they do everything on their laptops, they see no need to bring paper and pen to the classroom. I then announced that from the next week, each of them was required to bring paper and pen and take down notes as I went through the lecture!

Perhaps my request for pen and paper was unusual. Isn't the old-fashioned lecture on the way out?

A 2014 study showed test scores in Science and Maths improved after professors replaced lecture time with "active learning" methods like group work, prompting Harvard physicist Eric Mazur, who has long campaigned against the lecture format, to declare that "it's almost unethical to be lecturing". In many quarters, the active learning craze is only the latest development in a long tradition of complaining about boring professors.

Today's fad for active learning is nothing new. In 1852, John Henry Newman wrote in The Idea Of A University that true learning "consists not merely in the passive reception into the mind of a number of ideas hitherto unknown to it, but in the mind's energetic and simultaneous action upon and towards and among those new ideas." So a good lecture class does just what Newman said: It keeps students' minds in energetic and simultaneous action. And it teaches a rare skill in our smartphone-app-addled culture: the art of attention, the crucial first step in the "critical thinking" that educationists prize.

Those who want to abolish the lecture course do not understand what a lecture is. A lecture is not the recitation of an encyclopedia article or of all the facts on a topic. Rather, a lecture places a premium on the connections between individual facts. Absorbing a long, complex concept or argument is hard work, requiring students to synthesize, organise and react as they listen. Today, when any reading assignment longer than a Facebook post seems difficult, students have little experience doing this.

But if we abandon the lecture format because students may find it difficult, we do them a disservice. Moreover, we capitulate to the worst features of the customer-service mentality that has seeped into educational institutions from the business world. The solution, instead, is to teach students how to gain all that a great lecture course has to offer them.

Many times I see my students drifting away, to their laptops or mobile phones, when I am in the midst of explaining something complex and crucial for their learning; and then I realize that I first need to teach them how to create space in their inner world, so they can take on a new concept with a clean slate-- basically, how to listen. The art of listening helps students learn to clear their minds and improve focus. This ability to concentrate is not just a study skill. Think of it in a larger perspective: Can they listen to a political candidate with an analytical ear? Can they listen to their minister with an attentive mind? Can they listen to one another? One of the things a lecture does is build that habit of listening.

Listening continuously and taking notes for an hour is an unusual cognitive experience for most young people. Professors should embrace lecture courses as an exercise in mindfulness and attention building, a mental workout that counteracts the junk food of non-stop social media. I usually ask my students to stop 'staring' at their laptops and write down their notes on paper as I explain the topic. Initially there is some resistance, but soon they start liking it. I think the students value a break from their multitasking lives. The classroom is an unusual space for them to be in: here is a person talking about complicated formulas and pathways, challenging their preconceived notions, not dumbing things down, not playing for laughs, and requiring 60 minutes of focused attention.

Holding students' attention is not easy. I lecture from detailed notes, which I assimilate before each class until I know the script well. I move around the class, wave my arms and ask questions to which I expect an answer. When the hour is done, I am exhausted but happy! A good lecturer is "someone who conveys that there's something at stake in what you are talking about". Good lecturers communicate the emotional vitality of the intellectual discourse. ("The way she lectured always made me make connections to my own body systems and previous topics," wrote one of my students in online feedback.)

But we also must persuade students to value that aspect of a lecture course often regarded as drudgery: note-taking. Note-taking is important partly for the record it creates, but the real power of good notes lies in how they shape the mind. Learning to take attentive and analytical notes can greatly help students present their PowerPoint slides clearly as well as defend their answers in front of the class. However, technology can sabotage note-taking. Studies suggest that taking notes by hand helps students master material better than typing notes on a laptop, probably because handwriting is too slow for verbatim transcription, forcing students to paraphrase and condense instead. Verbatim transcription of the lecture is never the goal: students should synthesize their own points as they listen.

A classroom lecture is not a "passive" learning experience, and it cannot be replicated by asking students to watch videotaped lectures online: the temptations of the Internet, the safeguard of the rewind button and the comforts of one's own room are deadly to the attention span. A lecture course teaches students that listening is not the same thing as thinking about what you plan to say next--- and that critical thinking depends on mastery of facts, not knee-jerk opinions.

In conclusion, lectures are essential for teaching students the most basic skills: comprehension and reasoning, skills whose value extends beyond the classroom to the essential demands of working life and citizenship. Such a student learns "when to speak and when to be silent", Newman wrote. "He is able to converse, he is able to listen."

The simplicity movement

In a world of rampant materialism and manifold opportunities, many people these days are apparently learning who they are by choosing what they can do without. I had just finished Diwali-cleaning of my home when I happened to read something on simplicity and its virtues. Being 'cleanliness-minded' myself, I decided to write down some of these thoughts here.

We are usually told to try new things and explore life's possibilities. As Oliver Wendell Holmes put it: "The chief work of civilisation is just that it makes the means of living more complex. Because more complex and intense intellectual efforts mean a fuller and richer life. That means more life. Life is an end in itself, and the only question as to whether it is worth living is whether you have enough of it."

This striving for fullness and variety has always provoked a counter-movement towards simplicity and naturalness. Many great thinkers and religious traditions have favoured ascetic living and high thinking as a way to clear out the material things that might distract from humility and grace, compassion and prayer, the soul and God.

Today's simplicity movements are different from those of the past. The most obvious simplicity impulse today is the movement to declutter the home. Marie Kondo's book, The Life-Changing Magic of Tidying Up, now ranks No. 2 on Amazon among the best-selling books of 2015. Magazines and websites are stuffed with tips on how to declutter our living areas: everything that can be folded should be folded; open the mail while standing over the recycling bin; and so on. Cleaning out the closets and paring down the wardrobe have become a religious ritual for many--- a search for serenity, and a blow against stress.

The second big tendency in today's simplicity movement involves mental hygiene: techniques to clean out the email folder and reduce the incoming flow. For example, Mailwise is a mobile email product that cleans out repetitive phrases so we can read our email more quickly. There is thus a mass movement to combat mental harriedness, the epidemic of attention deficit all around--- a struggle to regain control of our own attention, to set priorities about what we will think about, to see fewer things but to see them more deeply.

One of the troublesome things about today's simplicity movements is that they are often just alternative forms of consumption. Some magazines advise us to strip away our stuff so we can buy new, simpler stuff! Such simplification is not really spiritual or anti-materialist; it is just a more refined, status-conscious form of materialism.

Today's simplicity movements are not as philosophically explicit as older ones. Still, there's clearly some process of discovery here. Early in life we choose our identity by getting things. But later in an affluent life we discover or update our identity by throwing away what is no longer useful, true and beautiful. One simplicity expert advised people to take all their books off their shelves and throw them on the floor. Put back only the books that you truly value.

That's an exercise in identity discovery, an exercise in realizing and then prioritizing our current tastes and beliefs. People who do that may instinctively be seeking higher forms of pruning: being impeccable with our words, strong with our commitments, disciplined about our time, selective about our friendships, hence moving from fragmentation towards unity of purpose. There's an enviable emotional tranquility at the end of that road.

Thursday 5 November 2015

Read between the lines

The landscape of education, at every level, is changing rapidly, and traditional lines are blurring. Learning spaces without classrooms and unconventional teaching approaches are being experimented with widely, with a view to incorporating them into the education of the future. Besides learning to work in teams and connect with people from other cultures, students need to learn to read between the lines.

Almost every day I face this challenge in my class: how to coax or inspire my students to listen so attentively and 'deeply' that they are able to ask 'why'. Of course, there are a few exceptions who look for the reason why a particular process in our body happens the way it does (I teach Human Anatomy and Physiology at the undergraduate level), and these are the ones who score very well in the module and, of course, are the reason I continue to teach even after thirty-two years!

Often, it is easier for students to take what is taught in class at face value. Raising challenges and asking questions to know more are seen as taboo and irrelevant by the majority of them. This attitude results in memory work and regurgitation, which is no surprise: to understand knowledge truly and internalise skills, students need to learn the basis and rationale behind what they are learning. As a result, students often perform very badly when asked unconventional or application-type questions, even though these still belong to the prescribed syllabus.

Students need to be encouraged to be "cheeky" learners, ready to question and challenge, eager to understand the basis behind what is being taught. This inevitably pushes the teacher to level up as well, to ensure that he or she "knows his or her stuff". Essentially, teacher and student level up together as co-learners. Based on my teaching experience, it is sheer joy for me to be in a class of inquisitive and enthusiastic students where frequent constructive exchanges happen among us!

The Ministry of Education in Singapore has emphasized that for young people to go further in their chosen careers, classroom learning must be combined with deep skills and on-the-job experience. One of these deep skills is social-emotional processing, which empowers students to understand and manage their emotions, show empathy for others and solve problems constructively. This is lacking in our students. 

Making meaning of circumstances is not just a cognitive process. It requires an awareness of one's own attitudes and convictions, and of how others are feeling. Students are writing essays lacking conviction and penning narratives lacking emotion. Our education system needs to put emphasis on helping students find convictions and understand emotions. Essentially, learning to read between the lines is not just about success; it is about finding greater meaning and purpose in life. In the present world, it is becoming imperative to be responsive to changing needs, and hence to hone our social-emotional skills.

Can we become smarter?

You can increase the size of your muscles by pumping iron and improve your stamina with aerobic training. Can you get smarter by exercising-- or altering-- your brain?

This is an important question, considering that cognitive decline is a nearly universal feature of ageing. According to Richard A. Friedman, Professor of Clinical Psychiatry at Weill Cornell Medical College, US, starting at age 55, our hippocampus, a brain region critical to memory, shrinks 1 to 2 percent every year, and 1 in 9 people aged 65 and older has Alzheimer's disease. The number afflicted is expected to grow rapidly as the baby boom generation ages. Given these grim statistics, we are keen to try everything from supposed smart drugs and supplements to brain training, all of which promise to boost normal mental functioning or stem its all-too-common decline.

The very notion of cognitive enhancement is seductive and plausible. After all, the brain is capable of change and learning at all ages. Our brain has remarkable neuroplasticity; that is, it can remodel and change itself in response to various experiences and injuries. So can it be trained to enhance its own cognitive prowess? The multi-billion-dollar brain training industry certainly thinks so and claims that you can increase your memory, attention and reasoning just by playing various mental games.

A few years back, a joint study by the BBC and Cambridge University neuroscientists put brain training to the test with tasks involving reasoning, problem-solving, short-term memory and attention span. All subjects took a benchmark cognitive test, a kind of modified IQ test, at the beginning and at the end of the study. Although improvements were observed in every cognitive task that was practiced, there was no evidence that brain training made people smarter. There was, however, a glimmer of hope for subjects aged 60 and above. Unlike the younger participants, older subjects showed a significant improvement in verbal reasoning, which suggests that brain exercise might delay some of the effects of ageing on the brain.

There are also easy and powerful ways to enhance learning in young people. For example, there is growing evidence that the attitude young people have about their own intelligence---and what their teachers believe---can have a big impact on how well they learn. Carol Dweck, a psychology professor at Stanford University, has shown that kids who think that their intelligence is malleable perform better and are more motivated to learn than those who believe that their intelligence is fixed and unchangeable.

In one experiment, Prof Dweck and her colleagues gave a group of low-achieving seventh graders a seminar on how the brain works and randomly assigned the students to two groups. The experimental group was told that learning changes the brain and that students are in charge of this process. The control group received a lesson on memory, but was not instructed to think of intelligence as malleable. At the end of eight weeks, students who had been encouraged to view their intelligence as changeable scored significantly better (85%) than those in the control group (54%) on a test of the material they had learnt in the seminar.

These findings have profound implications for educating young people: they suggest that a relatively simple intervention--- teachers encouraging their students to think of their own cognitive capacity as a quality they can improve--- can have a powerful effect in enhancing learning and motivation. The adolescent brain is more malleable than the adult brain, so whether Prof Dweck's findings hold for adult learners is still an open question! Perhaps this is not the same as increasing innate intelligence, but helping young people reach their intellectual potential is critically valuable--- and apparently not so difficult to do.

So we can clearly enhance learning, but there is still more you can do for your brain. It turns out that physical exercise can also improve cognitive function and promote the growth and creation of neurons. For example, mice allowed to run on a wheel for just 45 days had more neurons in their hippocampus, a brain region critical for memory formation, than sedentary mice. Another study found that women who did weight training twice a week had less brain shrinkage than those who trained once a week or did stretching exercises, though the cognitive significance of this effect is not yet clear.

How might exercise exert these effects? Intriguingly, exercise in humans and animals increases the level of a protein called brain-derived neurotrophic factor, or BDNF, in the blood and brain. BDNF promotes the growth and formation of new neurons, and it may be responsible for a remarkable effect of exercise on the brain: an increase in the size of the hippocampus that is linked with improved memory. Conversely, adverse experiences like major depression can lower BDNF levels and are associated with hippocampal shrinkage, a phenomenon that helps explain some of the cognitive impairments that are hallmarks of depression. Aside from making people feel better, antidepressants can block the depression-induced drop in BDNF, so these drugs are, in a sense, neuroprotective.

The question, then, is whether a smart pill like Adderall or Ritalin will do the same work as exercise. We know that these stimulants increase focus and make the world feel more interesting by releasing dopamine in key brain circuits. But when it comes to their effects on memory and learning, the data are mixed. The only consistent cognitive benefit of stimulants is their effect on the consolidation of long-term memory, meaning that they strengthen the ability to recall previously learnt information. There is no evidence that any prescription drug, supplement or smart drink is going to raise your IQ!

But there is one thing that doesn't require a prescription and that seems to help preserve cognitive fitness: other people. There is strong evidence that people with richer social networks and engagement have a reduced rate of cognitive decline as they age. Lisa F. Berkman, a professor at the Harvard School of Public Health, and her colleagues examined data from the Health and Retirement Study, which followed a nationally representative sample of nearly 17,000 subjects aged 50 and older from 1998 to 2004. Subjects were cognitively assessed with a simple word-recall test at baseline and then at two-year intervals, and social integration was gauged by contact with family, friends and other social activities.

The results showed that people with the highest level of social integration had less than half the decline in cognitive function of the least socially active subjects. Also, the cognitive protective effects of socializing were greatest among those with fewer than 12 years of education. In conclusion, you can't exceed your innate intelligence. But that seems less important than the fact that there is much that you can do to reach your cognitive potential and to keep it. Forget the smart drugs and supplements; put on your shoes and go exercise or consider brain training. And better still, do it all with your friends!