Thursday 27 October 2016

Teachers are mentors, not vendors

On Thursday, 20 October 2016, an article titled "Teachers are mentors, not vendors" by Dr William Wan, the General Secretary of the Singapore Kindness Movement, appeared in The Straits Times. As I read the article, I felt as if he had written exactly what I see and feel around me about today's students and parents and the whole education system: that TEACHERS are MENTORS, not service providers for the students. Below is the article for you to read and see for yourself:

Dr William Wan writes: On taking up my current role as general secretary of the Singapore Kindness Movement six years ago, I was surprised that students and parents were considered by the Ministry of Education (MOE) as "customers". I expressed my surprise and concern in discussion with my friends in the MOE management team.

What we call a thing has a big impact on how people perceive the thing itself. Would it really affect our treatment of teachers and the education system if we called our students and their parents "customers", I wondered. The word "customer" is based on a commercial concept: by definition, a customer is a party to a transaction in which people buy and sell from each other. And in sales and service, an often-used cliche is that the customer is always right. Hence, the customer is likely to be the one in the stronger bargaining position.

This thorny problem was anticipated by a 2012 publication, Case Studies In Public Governance: Building Institutions In Singapore, edited by June Gwee. It was already recognised that "education and learning were not for profit-making, and the commercial notion of 'delighting customers' threatened to be a misnomer and a source of tension".

What then would be the implications for our education system? Should we press on with this customer-centric approach?

For one, instead of the MOE and educational institutions having control over our education system, this terminology shifts the balance of power to the students and parents. This does not bode well for the education sector of modern Singapore, which already has a worrying number of able teachers leaving the profession. For example, discipline, already a difficult area by many anecdotal accounts, would become even more difficult to enforce. It is not common or easy for vendors to discipline their customers.

In addition, when there is a "problem" in the food chain, be it homework or other schoolwork, which party is to be held responsible? In the marketplace, the "service provider" will have to shoulder the blame and correct his mistakes. Hence, if the student has a problem completing her homework, it need not be the student's fault. National education cannot be treated as a mere commercial transaction. Teaching is a very noble calling, and our teachers are the best providers, not of a service, but of education itself, an indispensable building block for nation-building. It should not be trivialised by commercialising it. Effective teachers do not just teach: They mentor, they listen to and help with the student's problems (be they academic or personal) and they walk the journey with the students in their personal growth.

So how do we make the education landscape right for everyone?

AN END TO BEING STUDENT-PLEASERS

First, let us get rid of the old, misplaced idea that students and parents are customers that the MOE, educational institutions, principals and teachers absolutely have to please. Education itself is the institutionalised dissemination of information, knowledge and skills to a future generation. The overall objective is certainly not for a commercial benefit nor to service a customer base. The intent, according to the MOE's website, is to help our students discover their own talents, to make the best of these talents and realise their full potential, and to develop a passion for learning that lasts through life.

Teachers are not transactional vendors, but transformational mentors. They are educators, whose duties include guiding students and helping them develop skills and acquire a variety of knowledge. Their work is transformational, not transactional, and they must be empowered to keep it so. If they had to please their "customers", their work would immediately be reduced to the transactional level.

Next, we should always give credit where it is due. I do not believe we can thank our educators enough for nurturing the future of our nation. Rather than critiquing how they should work their hours, we, as parents and grandparents, should support them and motivate them to continue nurturing the desire in our young to learn.

In any society, it takes a whole village to educate a child. All the more so in modern society. Today, the need for both parents to be at work makes the role of educators even more critically important in the lives of our children. We require more than just the parents to raise a child. We need able educators and a supportive workforce to understand one another, and to educate the child together as partners.

Finally, shouldn't we remember that we, as parents and extended family, have the prime responsibility of bringing up our children? Including, yes, educating them - perhaps less so in formal education, but certainly in general knowledge and most importantly, values.

On the latter, more important than any words uttered, our personal conduct and role-modelling do far more in educating our young. If we mistreat, or are otherwise excessively demanding of, and unreasonable with, our children's teachers, how do we think our children will regard them?

At the end of the day, in our relationship with our children's teachers, both we and they need to remember that kindness is, after all, up to each one of us. And then perhaps, in modelling this thought, despite all the other influences now readily available, we may yet succeed in making kindness and graciousness intrinsic values in the next generation and in our nation.

Monday 10 October 2016

Right Away is the Opposite of Now

I was reading this book called "Time and the Soul". It was a really good book. A part of the book which explains how doing something 'right away' is the opposite of 'now' is given below:

Some years ago, I was walking in downtown San Francisco with a great friend, a learned Tibetan scholar. I asked him about one of the most striking ways that the Tibetans express the uniqueness of the human condition. Imagine, they say, that deep in the vast ocean there swims a great and ancient turtle who surfaces for air once every hundred years. Imagine further that floating somewhere in the ocean is a single ox-yoke carried here and there by random waves and currents. What are the chances that when the turtle surfaces, his head will happen to emerge precisely through the center of the ox-yoke? That is how rare it is to be born as a human being!

In the middle of our conversation, I pointed to the crowds of men and women rushing by on the street and I gestured in a way to indicate not only them, but all the thousands and millions of people rushing around in the world. "Tell me, Lobsang," I said, "if it is so rare to be born a human being, how come there are so many people in the world?"

My friend slowed his pace and then stopped. He waited for a moment, taking in my question. I remember suddenly being able to hear, as though for the first time, the loud and frenetic traffic all around us. He looked at me and very quietly replied, "How many human beings do you see?"

In a flash, I understood the meaning of the story and the idea. Most of the people I was seeing, in the inner state they were in at that moment, were not really people at all. Most were what the Tibetans call "hungry ghosts." They did not really exist. They were not really *there*. They were *busy*, they were *in a hurry*. They -- like all of us -- were obsessed with doing things *right away*. But *right away* is the opposite of *now* -- the opposite of the lived present moment in which the passing of time no longer tyrannizes us. The hungry ghosts are starved for "more" time; but the more time we hungry ghosts get, the more time we "save", the hungrier we become, the less we actually *live*. And I understood that it is not exactly more time, more days and years, that we are starved for, it is the present moment.

Through our increasing absorption in busyness, we have lost the present moment. "Right away" is not now. What a toxic illusion!


(Excerpted from Jacob Needleman's book "Time and the Soul")

Saturday 20 August 2016

Active ageing vs dying

One of the biggest fears of mankind is dying, or death. Almost every one of us suffers from thanatophobia, the fear of death, at some time or other. "I'm not afraid of death," Woody Allen, the famous Hollywood director, once said. "I just don't want to be there when it happens." But death does come looking for us.

In Philip Larkin's great but chilling poem Aubade, a man wakes at 4 in the morning and agonises fearfully about "unresting death". At the crux of his terror is the annihilation of consciousness and awareness: "That this is what we fear - no sight, no sound,/No touch or taste or smell, nothing to think with,/Nothing to love or link with,/The anaesthetic from which none come round."

Several philosophers through the ages have, however, exhorted that none should fear this absolute dissolution since being dead is akin to a state of dreamless sleep or being unborn - a perpetual nothingness. The focus, hence, ought to be on living and that includes dying since dying, too, is an act of living. "True philosophers," Plato wrote, "are always occupied in the practice of dying."

In a 2014 essay in The New York Times, Dr Paul Kalanithi, a 36-year-old doctor who was on the cusp of finishing his training in neurosurgery, wrote of that moment of confirmation (he had suspected it for some time, with his excruciating backache, weight loss and fatigue) that he had Stage 4 lung cancer. As he methodically scrutinised the CT films that revealed the cancer mottling his lungs and eating into his liver and spine, he registered his initial feeling. "I wasn't taken aback. In fact, there was a certain relief," he wrote. "The next steps were clear: Prepare to die. Cry. Tell my wife that she should remarry, and refinance the mortgage. Write overdue letters to dear friends. Yes, there were lots of things I had meant to do in life, but sometimes this happens…" He spent the remaining 22 months of his life learning how to die - or in the words of journalist Christopher Hitchens, "living dyingly".

Dr Kalanithi did not divorce his wife; they chose to have a child. He distilled his experiences and thoughts on his own dying into an autobiographical book entitled When Breath Becomes Air, which was published posthumously in early 2016. It was the first and only book that he had written. He wrote it for his only child, a daughter, and for other people "to understand death and face their mortality", to put them in his shoes and "walk a bit, and say, 'So that's what it looks like from here… sooner or later I'll be back here in my shoes'… Not the sensationalism of dying, and not exhortations to gather rosebuds, but: Here's what lies up ahead on the road". After trying whatever treatments he could find tolerable and acceptable, and having made a decision together with his family and his attending doctors not to carry on any further, he died with his family at his bedside.

If there can ever be one, Dr Kalanithi's death could possibly be called "a good death", or at least a good enough death. When asked what a good death is, most people would say that it is a sudden and painless death - and probably would add that this is what they would want for themselves. But is it? Such a sudden and unexpected death would usually leave behind a detritus of unfinished and unresolved matters, and a clutch of traumatised survivors who had been denied the chance to express or hear what they meant to that person, robbed of any opportunity to express gratitude or regrets, and deprived of any hope of reconciliation.

If it is any consolation, most of us would not go this way; we would have to endure that variable period of dying. The intervention of modern medicine can drag this process out for months or even years with a progressive accretion of debilities and miseries. It might seem, then, that most of us would have the time to plan for our imminent death: to grieve, to come to terms with things, to provide for others, to try to live out the remaining time with some purpose and meaning, to voice our preference for life support or not, and plan for our funeral - but we often do not do many of these.

In mediaeval Christian Europe, it was widely believed that the preparation for one's earthly death and the celestial judgment that would follow were matters of immense importance. Such preparation was even celebrated in the arts and literature as Ars moriendi, the art of dying. The Ars moriendi provided practical guidance on reaffirming one's faith in God, remembering the right values and taking the right attitude in composing oneself to meet death fearlessly and stoically.

Today, we are a "death avoidance" society. Perhaps we are less religious now; maybe our blind faith in medical advances has given us that illusion that we can postpone death each time it comes threateningly close, and our various superstitions and cultural aversion towards death have certainly not made discussion of dying and death any easier. It is also very likely that the public still possess little information - let alone knowledge - of end-of-life options, including hospice and palliative care, and the legal rights to refuse or withdraw life-prolonging treatments.

We talk about active ageing but ageing, whether active or otherwise, would eventually lead to death - yet there is no talk of "dying well". Granted that it is difficult to attend to the thoughts and concerns of the dying; not to discuss it is to ignore - using that old phrase - that 800-pound gorilla in the room. Perhaps, together with active ageing, we should also start talking about our own version of the Ars moriendi.

Bipolar disorder vs genius - the dilemma

Bipolar disorder is a mental condition marked by alternating periods of elation and depression. March 30 is World Bipolar Day, and its aim is "to bring world awareness to bipolar disorders and eliminate social stigma", according to the website of The International Society for Bipolar Disorders. This particular date was chosen because it is the birthday of Vincent van Gogh, the famous painter, who was posthumously diagnosed with bipolar disorder and has since been turned into an icon of the tragic melding of genius and mental illness.

The strange association of creativity and extraordinary achievements with mental illness has long been a subject of popular fascination and scholarly studies. There is a fairly long list of individuals who have shaped history, science, culture and the arts who were thought to have been afflicted with bipolar disorder: Isaac Newton, Abraham Lincoln, Winston Churchill, Theodore Roosevelt, Florence Nightingale, Johann Goethe, Edgar Allan Poe, George Frideric Handel, Ludwig van Beethoven, Virginia Woolf and Ernest Hemingway.

Bipolar disorder is probably as old as humankind and has persisted generation upon generation, which suggests that it confers some evolutionary advantage. In a paper published in the British Journal Of Psychiatry in August 2015, researchers linked high childhood IQ to an increased risk of experiencing bipolar traits in later life. "There is something about the genetics underlying the disorder that are advantageous," said Daniel Smith, the lead investigator of the study. "One possibility is that serious disorders of mood - such as bipolar disorder - are the price that human beings have had to pay for more adaptive traits such as intelligence, creativity and verbal proficiency."

The finding of this study is consistent with previous research showing that people with an increased genetic predisposition to bipolar disorder are more likely to have a repertoire of intellectual and creative abilities, which can certainly be advantageous in leadership roles and in various artistic pursuits.

One of the earliest accounts of bipolar disorder comes from Aretaeus, a Greek physician who was believed to have practised in Alexandria and Rome in the second century, and who left behind a clear description of how excited and depressed states might alternate in an individual. However, the disorder was not clearly recognised nor given a name for centuries.

It was only in 1896 that Emil Kraepelin, a German psychiatrist, called it manic-depressive psychosis, having observed that the peaks of frenzied excitement and periods of abysmal melancholy were usually separated by intervals during which the person seemed normal. In 1957, another German psychiatrist, Karl Leonhard, introduced the word "bipolar" for people who experienced mania and depression, and "unipolar" for those with depression only.

The term "bipolar disorder" has since replaced "manic-depressive psychosis" in the lexicon of psychiatry.

One end of the polarity of this disorder is mania, which is an abnormally expansive and euphoric mood state that can unpredictably erupt into explosive anger. There is often a shedding of the person's normal inhibitions; an urge towards potentially harmful activities such as spending sprees, sexual indiscretions or foolish business ventures; an unbounded energy with decreased need for sleep; a loquacity that permits no interruption; and a sense of inflated self-worth that can sometimes morph into grandiose delusions of fabulous wealth or special powers.

There is a rich literature of autobiographical accounts of what it is like to live with bipolar disorder, and one of the most eloquent is Kay Redfield Jamison's An Unquiet Mind. A highly regarded clinical psychologist, Jamison writes of her first attack when she was a senior in high school: "I lost my mind rather rapidly… I raced about like a crazed weasel, bubbling with plans and enthusiasms, immersed in sports, and staying up all night, night after night, out with friends, reading everything that wasn't nailed down, filling manuscript books with poems and fragments of plays, and making expansive, completely unrealistic, plans for my future."

Being in the grip of a storm of such seething energy can give the illusion of power, brilliance or genius. "I felt not just great, I felt really great," writes Jamison. "I felt I could do anything, that no task was too difficult. My mind seemed clear, fabulously focused, and able to make intuitive mathematical leaps that had up to that point entirely eluded me. At that time, however, not only did everything make perfect sense, but it all began to fit into a marvellous kind of cosmic relatedness."

In a way, it can be so intoxicating that there are some patients who want this manic phase. They miss the pleasurable excitement, the preternatural elation and the apparent creativity of mania, and they resent the levelling effect of medication. But this tumultuous brainstorm and frenetic overdrive of mania are not sustainable. The mania is ultimately exhausting and alienates others. Mania gets those who experience it into messes which they regret when they come out of it, and wrecks their life and work. Van Gogh had lamented that "if I could have worked without this accursed disease, what things I might have done".

And there is the other end of the polarity: that same person who experiences such exultation of mood can plunge into a state of depression that brings in its wake abject misery, apathy, dejection and hopelessness that can make suicide seem the only way out.

Ten years ago, the British actor and comedian Stephen Fry, who has bipolar disorder, starred in a BBC television documentary called The Secret Life Of The Manic Depressive. In it, he spoke about his life with the disorder and went on to interview other celebrities, including British pop singer Robbie Williams and Hollywood actress Carrie Fisher, as well as other people with bipolar disorder.

A few years after the airing of this programme, psychiatrists Diana Chan and Lester Sireling reported in a 2010 issue of The Psychiatrist, a journal of the Royal College of Psychiatrists, on the phenomenon of a rising tide of people actively seeking out psychiatrists - either of their own accord or at the instigation of family members - and wanting to be diagnosed with bipolar disorder.

The two doctors speculated that the increased media coverage and the line-up of famous people of high social status talking about their own personal experiences have not only made bipolar disorder less stigmatising but possibly even desirable. They posited that beneath the quest for a diagnosis of bipolar disorder is the person's aspiration for a higher status which is vicariously attained through association.

Therein lies the potential treachery of such well-meaning efforts to enlighten and raise awareness of mental illness. In banishing those disparaging stereotypes and replacing them with positive ones, there lies the risk of romanticising and glamorising the condition. The onus lies with the psychiatrist to ensure that the right diagnosis is made. There are obvious dangers in misdiagnosing bipolar disorder: a person could end up being shunned by others, discriminated against by employers and insurance companies, and being prescribed medication with potential side effects. But it is just as harmful, if not more so, to miss a true bipolar diagnosis. All things considered, it is better for people who think they have this disorder to err on the side of safety and seek help.

In all likelihood, bipolar disorder - like all mental disorders - remains shrouded in ignorance, fear and embarrassment. And people with this disorder - as with those with other mental health issues - are likely to be avoided, mocked, misunderstood and discriminated against. That needs to change, and the call for change needs to be made again and again as long as this situation remains unchanged.

Friday 19 August 2016

Concerns about parenting today's kids

An article by Dr Chong Siow Ann, vice-chairman of the medical board (research) at the Institute of Mental Health, Singapore, appeared in today's The Straits Times. I liked it as I, too, experienced the type of childhood the writer describes, and hence thought of sharing parts of it on my blog:

"Growing up in the 1960s and 70s, I (Dr Chong) had what could be accurately called a carefree childhood. Mostly I was left to my own devices to entertain myself. I wasn't taught to read or write until I went to primary school nor was I enrolled in a kindergarten; I think now it was because my parents couldn't afford the additional expense (there were already four other children at school) and they probably felt that it was unnecessary.

My mother, who had a few years of formal education and could read and write only in Chinese, proceeded to coach me on that single subject and kept an eye out that I would complete whatever homework I was given - even though she couldn't understand most of the other subjects. Schooling was relatively straightforward then: You went through primary school, took the Primary School Leaving Examination - which had no aggregate score - and, having cleared that, you proceeded to secondary school which, in my case, was the one that was nearest to my home.

And if you did reasonably well by Secondary 2, you were expected to go to the "science stream". I made that decision myself as with all other decisions about my subsequent education: which extra-curricular activities to join, which junior college to go to, and what university degree to pursue.

I felt no pressure from my parents, though, of course, they were proud (and probably surprised) that I was admitted into medical school. What mattered to them was that I should at least have a university degree and thereafter a steady job, and be an honest, decent and useful person - and they tried to ensure all that in a rather instinctual way.

HELICOPTER PARENTS
Within two generations, Singapore has catapulted itself into the First World. Meritocracy has been the organising principle of that transformation; and for better or worse, it has also been imprinted into our psyche.

With growing affluence and with most couples having fewer children, the latter have become the most precious of all possessions and, in tandem, parenting has become a very deliberate, self-conscious and angst-riven activity - particularly with the so-called helicopter parenting which is that odd amalgam of pampering and achievement pressure. Overprotective, over-controlling and intrusive, these helicopter parents would hover and keep their children on their radar screen: orchestrating and monitoring their activities, and swooping to blast away any obstacles in their path.

Sheep-like, disempowered and bereft of any sense of agency, these children are ferried, guided and nudged along the highways and byways of a demanding terrain of academic and extra-curricular activities. Having imbibed the ambitions of their parents and squinting through the parental prism, they see only one narrow path to success in life. The consequence - as we are told by concerned scholars and educators in a slew of scholarly studies, best-selling books and newspaper and magazine articles - is that these children, who are consumed with the fear of not measuring up, don't learn to cope effectively with problems, nor do they know how to soothe themselves when they are distressed.

There is "declining student resilience" and "emotional fragility", according to the Boston College psychologist Peter Gray. "Students are afraid to fail; they do not take risks; they need to be certain about things," he wrote of the students in the United States and the growing mental health crisis among them. "For many of them, failure is seen as catastrophic and unacceptable. External measures of success are more important than learning and autonomous development."

A five-year study from the National University of Singapore published in the Journal Of Personality this year showed that local children of intrusive parents who have high academic expectations of them are likely to be more self-critical and more inclined to feel that they fall short. "The child may become afraid of making the slightest mistake and will blame himself or herself for not being 'perfect'," said the study's lead investigator Ryan Hong, who warned bleakly that "it increases the risk of the child developing symptoms of depression, anxiety and even suicide in very serious cases".

Other research elsewhere has shown that students with "helicopter" parents are more likely to be medicated for anxiety and depression.

TIGER MUMS
To a certain extent, some parents may feel as hapless as their children, compelled as they are by a meritocratic, elitist society where - so goes the popular narrative - the best chance of material success in later life lies in attaining the requisite academic credentials earlier in life. And which parent would not be beset by that raft of guilt, uncertainty and anxiety over not doing enough to secure that head start for their child?

But still there is a general feeling that such values and expectations are wrong. The tendency is to blame the education system for being that crucible of feverish competition and high pressure. There have been many calls for changes. As The Straits Times editorial of July 16 said, the recent revamp of the PSLE nurtures the hope that primary education should be for children "to develop their passion for learning, grow in values and character, and explore their strengths and interests".

That sounds intuitively and sensibly right but there is a salutary lesson to be learnt from the experience of the world's most powerful nation. Americans have been drilled to respect the individuality of their children, to support them in their self-chosen passions, and to boost their self-esteem which is supposed to make them learn better.

But as the American journalist Elizabeth Kolbert pointed out in her piece in The New Yorker a few years ago: "After a generation or so of applying this theory, we have the results. Just about the only category in which American students outperform the competition is self-regard." She highlighted a study by the Brookings Institution that compared students' own assessments of their abilities in maths with their actual scores on a standardised test. Nearly 40 per cent of American students declared that they usually do well in mathematics, but only 7 per cent of them actually did well enough on the test to qualify as advanced.

In contrast, 18 per cent of Singaporean students said they usually did well in maths; 44 per cent qualified as advanced on the test, with even the least self-confident Singaporean students outscoring the most self-confident Americans. As Ms Kolbert commented wryly: "You can say it's sad that kids in Singapore are so beaten down that they can't appreciate their own accomplishments. But you've got to give them this: At least they get the math right."

And it's not just maths - American students are far from the top in international rankings for excellence in science. This Western orthodoxy of nurturing the self-esteem of the children and allowing them unfettered expression is anathema to Amy Chua, Yale law professor and author of that controversial book, Battle Hymn Of The Tiger Mother, where she expounded her exacting Chinese child-rearing of her two high-achieving daughters.

She argued that the sort of parenting which emphasises self-esteem without an accompanying insistence on actual accomplishment will set the children up to accept mediocrity. And it has another darker implication - a society that nurtures and blithely accepts unearned self-esteem could turn out entitled narcissists and weaken its global competitiveness.

The changes to Singapore's own education system are made in the hope that our children will have a less burdened childhood. But there is, I think, another intent, which is to help them be more creative, more original and more imaginative as adults - attributes that are essential for a "knowledge economy".

Let's hope that it will achieve all that, though Amy Chua's stern assertion might be something to be borne in mind. However, being what we are, it is unlikely that our tiger mums and cubs would be an endangered species any time soon."

Sunday 31 July 2016

How to make a good teacher

I came across an interesting article in the June 11, 2016 issue of The Economist on 'How to make a good teacher'. It was quite relevant, and the points made there resonated with my thinking, so I have appended the article below:

FORGET smart uniforms and small classes. The secret to stellar grades and thriving students is teachers. One American study found that in a single year’s teaching the top 10% of teachers impart three times as much learning to their pupils as the worst 10% do. Another suggests that, if black pupils were taught by the best quarter of teachers, the gap between their achievement and that of white pupils would disappear.

But efforts to ensure that every teacher can teach are hobbled by the tenacious myth that good teachers are born, not made. Classroom heroes like Robin Williams in “Dead Poets Society” or Michelle Pfeiffer in “Dangerous Minds” are endowed with exceptional, innate inspirational powers. Government policies, which often start from the same assumption, seek to raise teaching standards by attracting high-flying graduates to join the profession and prodding bad teachers to leave. Teachers’ unions, meanwhile, insist that if only their members were set free from central diktat, excellence would follow.

The premise that teaching ability is something you either have or don’t is mistaken. A new breed of teacher-trainers is founding a rigorous science of pedagogy. The aim is to make ordinary teachers great, just as sports coaches help athletes of all abilities to improve their personal best. Done right, this will revolutionise schools and change lives.

Education has a history of lurching from one miracle solution to the next. The best of them even do some good. Teach for America, and the dozens of organisations it has inspired in other countries, have brought ambitious, energetic new graduates into the profession. And dismissing teachers for bad performance has boosted results in Washington, DC, and elsewhere. But each approach has its limits. Teaching is a mass profession: it cannot grab all the top graduates, year after year. When poor teachers are fired, new ones are needed—and they will have been trained in the very same system that failed to make fine teachers out of their predecessors.

By contrast, the idea of improving the average teacher could revolutionise the entire profession. Around the world, few teachers are well enough prepared before being let loose on children. In poor countries many get little training of any kind. A recent report found 31 countries in which more than a quarter of primary-school teachers had not reached (minimal) national standards. In rich countries the problem is more subtle. Teachers qualify following a long, specialised course. This will often involve airy discussions of theory—on ecopedagogy, possibly, or conscientisation (don’t ask). Some of these courses, including masters degrees in education, have no effect on how well their graduates’ pupils end up being taught.

What teachers fail to learn in universities and teacher-training colleges they rarely pick up on the job. They become better teachers in their first few years as they get to grips with real pupils in real classrooms, but after that improvements tail off. This is largely because schools neglect their most important pupils: teachers themselves. Across the OECD club of mostly rich countries, two-fifths of teachers say they have never had a chance to learn by sitting in on another teacher’s lessons; nor have they been asked to give feedback on their peers.

Those who can, learn

If this is to change, teachers need to learn how to impart knowledge and prepare young minds to receive and retain it. Good teachers set clear goals, enforce high standards of behaviour and manage their lesson time wisely. They use tried-and-tested instructional techniques to ensure that all the brains are working all of the time, for example asking questions in the classroom with “cold calling” rather than relying on the same eager pupils to put up their hands.

Instilling these techniques is easier said than done. With teaching as with other complex skills, the route to mastery is not abstruse theory but intense, guided practice grounded in subject-matter knowledge and pedagogical methods. Trainees should spend more time in the classroom. The places where pupils do best, for example Finland, Singapore and Shanghai, put novice teachers through a demanding apprenticeship. In America high-performing charter schools teach trainees in the classroom and bring them on with coaching and feedback.

Teacher-training institutions need to be more rigorous—rather as a century ago medical schools raised the calibre of doctors by introducing systematic curriculums and providing clinical experience. It is essential that teacher-training colleges start to collect and publish data on how their graduates perform in the classroom. Courses that produce teachers who go on to do little or nothing to improve their pupils’ learning should not receive subsidies or see their graduates become teachers. They would then have to improve to survive.

Big changes are needed in schools, too, to ensure that teachers improve throughout their careers. Instructors in the best ones hone their craft through observation and coaching. They accept critical feedback—which their unions should not resist, but welcome as only proper for people doing such an important job. The best head teachers hold novices’ hands by, say, giving them high-quality lesson plans and arranging for more experienced teachers to cover for them when they need time for further study and practice.

Money is less important than you might think. Teachers in top-of-the-class Finland, for example, earn about the OECD average. But ensuring that the best stay in the classroom will probably, in most places, mean paying more. People who thrive in front of pupils should not have to become managers to earn a pay rise. And more flexibility on salaries would make it easier to attract the best teachers to the worst schools.

Improving the quality of the average teacher would raise the profession’s prestige, setting up a virtuous cycle in which more talented graduates clamoured to join it. But the biggest gains will come from preparing new teachers better, and upgrading the ones already in classrooms. The lesson is clear; it now just needs to be taught.

Wednesday 15 June 2016

The smart products we are stupid enough to buy

The other day I was reading an article on how we are being lured into buying and using smart products nowadays. For example, all of us use a toothbrush, floss and interdental brushes, or at least one of them, to clean our teeth every day. Now comes the new Oral-B Genius 9000, the smartest of all smart toothbrushes. To use the toothbrush you have to attach your phone to your bathroom mirror at mouth level so that its camera can keep an eye on you as it takes you on a "28-day plaque journey".

As you brush, the screen lights up, telling you which bit of your mouth you are working on. This might have been smart, except that you already know the answer. It times your brushing (a service you don't require) and, while it does so, distracts you from the job by telling you (incorrectly) what the weather is like outside and what is happening in the world. "Impressive!" it says when you are done. Again, being an adult, you no longer need congratulating on having brushed your teeth.

The data from your brushing is duly logged, against which every future act of brushing can be compared - turning the oral hygiene routine into a fun competition against yourself. If you ask me, I shall never use this type of toothbrush app. The five minutes or so a day I spend cleaning my teeth are a time of relative calm, not a data-gathering activity. I am going to keep them that way.

Another smart invention is the smart clothes peg. Peggy, being tested in Australia by Unilever, is a plastic peg containing a thermometer and a hygrometer that sends messages to your phone saying: "Hi, rain clouds are on the way, let's dry the washing tomorrow." The company claims that Peggy will allow parents to spend more time with their children. This makes no sense, as the main thing that keeps parents from children is not drying the washing on rainy days - it's staring at their smartphones.

Fitbit and Jawbone have already turned half the population into competitive walking bores. The Oral-B smart toothbrush and Peggy take it one step further. Even more promising are smart umbrellas and smart wallets that discourage you from losing them by reminding you every time they stray too far from your phone. Yet these sound like a real nuisance - whenever you leave your umbrella inside your own front door and go to sit on the sofa, your phone tells you your umbrella is out of radius.

The most unwelcome "advance" of all is the smart tampon. This is a normal tampon attached to a wire that connects to a sensor clipped to your underpants. Every time the sensor thinks it's time for a new one it alerts your phone. It's difficult to imagine why anyone would want their body to be wired up in this way and, in any case, there is no need. Women already have two methods of knowing when to change tampons: looking at their watches and listening to their bodies.

The more we learn about the Internet of Things, the more I think we are slipping into a world of make-believe. The giddy growth of smart technology is both easy to understand and a mystery. The growing supply is no surprise. Manufacturers make this stuff because they can. The technology exists and it is quite cheap. Thanks to venture capitalists, there is no shortage of people to finance it.

On the demand side, it remains a puzzle. The fact that people are so willing to pay for non-solutions to non-problems is the best evidence of the irrationality of the consumer market.

If we want such smart gadgets, we must be dumb. And not only that: smart technology is making us dumber. If we no longer have to look at the sky before putting the washing out, and if our favourite conversation is who walked/brushed/squeezed for longest, our brains will soon be in far more urgent need of exercise than our gums or leg muscles.

Two questions to ask every day

A friend sent me an interesting post, and towards the end, there were two questions. These are not just any old questions, though, but the questions that the famous inventor, author and diplomat Ben Franklin used to ask himself each day.

What are the two questions? Well, in the morning he asked himself: What good shall I do this day?

And in the evening he asked himself: What good have I done today?

Simple but powerful. These two questions bring our focus back to doing the essential things during the day. That way we won't waste time on meaningless, redundant activities. When we take stock of our day's work in the evening, we take ownership of it, rather than complaining or blaming others for not achieving something. And if we have been successful in accomplishing what we wished to do in the morning, it gives us reason to feel satisfied and happy. So if you also wish to feel the same as I have started feeling, you need to ask yourself these two questions, borrowed from one of the most productive and creative people in the world!

Saturday 11 June 2016

Fitbit craze

Have you heard of the 'Fitbit'? It is a wearable device, worn like a watch strap, which keeps track of many of our body parameters, like heart rate, blood pressure, etc., including the steps we take during a day, converting them into miles/kilometres. Both of my daughters bought one as soon as it was launched some time ago.

However, I am not a big fan of such devices, as I feel there is no need to monitor such parameters every day or after every activity. See, our body responds to daily activities in many different ways, hence these parameters keep fluctuating over the day. But equally true is that our body is constantly trying to balance them all the time. And the good thing is that we can 'sense' these changes easily, as they are expressed quite well as bodily reactions like fever, tiredness, allergies, muscle soreness, cough, etc., and/or emotional responses like feelings of sadness, despair, anxiety, elation, etc.

Many users of these activity-trackers have also harboured suspicions about their accuracy. One recent research study found that the pulse-monitoring technology used in Fitbit's wrist-bound Surge and Charge devices was "highly inaccurate during elevated physical activity". Researchers from California State Polytechnic University, Pomona, had 43 subjects wear the devices as they ran, jogged and jumped rope, among other activities, and then compared the readings with those of an electrocardiogram. During moderate to high-intensity exercise, Fitbit's sensor was off by an average of about 19 beats a minute.
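For readers who wonder what being "off by an average of about 19 beats a minute" means, here is a minimal sketch (in Python, using made-up sample readings rather than the study's actual data) of how such a figure can be computed: it is simply the mean absolute difference between the tracker's readings and the ECG readings taken at the same moments.

```python
# Hypothetical readings taken at the same five moments during exercise;
# these numbers are illustrative, not from the study.
device_bpm = [150, 162, 148, 171, 155]  # wrist tracker
ecg_bpm = [168, 180, 170, 188, 176]     # electrocardiogram (reference)

# Mean absolute error: the average size of the gap between the two readings.
mae = sum(abs(d - e) for d, e in zip(device_bpm, ecg_bpm)) / len(ecg_bpm)
print(f"Device is off by an average of {mae:.1f} beats per minute")
```

With these sample numbers the tracker reads roughly 19 beats a minute away from the ECG, which is the size of discrepancy the researchers reported.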

In another study on Jawbone and Fitbit devices, it was found that both devices overcounted and undercounted as the activities intensified.

Neither of these studies was peer-reviewed or replicated, so no concrete conclusion can be drawn from them. However, they do suggest that the accuracy of these devices is hard to trust. Then what's the point of wearing them?

If you are checking your vital parameters very frequently through the day and notice changes, which are bound to occur, you are heading towards a downward spiral of obsession with 'perfect readings', which never happen for anyone! Our bodies are unique and always in flux, responding to different environments, both external and internal; hence the very concept of keeping a record of these vital statistics is stress-inducing. I would rather trust my intuition, sense my physical reactions and emotional state, and, if there is something unexplained, measure these with proper instruments!

Learn to develop emotional granularity

When people or situations get us down, do we feel just generally "bad"? Or do we have more precise emotional experiences, such as grief or despair or frustration?

In psychology, people with finely tuned feelings are said to exhibit "emotional granularity". Emotional granularity isn't just about having a rich vocabulary; it's about experiencing the world, and ourselves, more precisely. This can make a difference in our life. In fact, there is growing scientific evidence that precisely tailored emotional experiences are good for us, even if those experiences are negative.

According to a collection of studies, finely grained, unpleasant feelings allow people to be more agile at regulating their emotions, less likely to drink excessively when stressed and less likely to retaliate aggressively against someone who has hurt them. Perhaps surprisingly, the benefits of high emotional granularity are not only psychological. People who achieve it are also likely to have longer, healthier lives. They go to the doctor and use medication less frequently, and spend fewer days hospitalised for illness. Cancer patients, for example, have lower levels of harmful inflammation when they more frequently categorise, label and understand their emotions.

Lisa Feldman Barrett, a professor of psychology at Northeastern University in the US and author of the book 'How Emotions Are Made', discovered emotional granularity in the 1990s. Her lab asked hundreds of volunteers to keep track of their emotional experiences for weeks or months. Everyone they tested used the same stock of emotion words, such as "sad" and "angry" and "afraid", to describe their experiences.

Our brain, they showed, in a very real sense constructs our emotional states - in the blink of an eye, outside of our awareness - and people who learn diverse concepts of emotion are better equipped to create more finely tailored emotions. This is why emotional granularity can have such influence on our well-being and health: it gives our brain more precise tools for handling the myriad challenges life throws at us.

Suppose we're facing a city's troubles with water contamination. Suppose that each morning, as we turn on the tap, we experience an unpleasant feeling of general badness. It's important to note that we've created that vague feeling of badness. Neuroscience has shown that human brains are not "reactive" organs that merely respond to the world in some predetermined way, such as spiking our blood pressure when we see the words "non-potable water". Rather, our brain regulates our body's energy needs proactively, spiking our blood pressure in anticipation of what might come next, based on experience. This process is like keeping a budget for our body. And just like a financial budget, a body budget needs to be kept balanced in order to be healthy.

So in the above example of water contamination, our brain anticipates a threat and our cortisol level spikes, readying our body for action, but a feeling of general badness calls for no specific action. We merely feel awful because our brain has made a needless withdrawal from our body budget. And the next time we're in a similar situation, our brain goes through the same process. Again we feel lousy and trapped by our circumstances. Over time, a poorly calibrated body budget can pave the road to illness.

With higher emotional granularity, however, our brain may construct a more specific emotion, such as righteous indignation, which entails the possibility of specific actions. We might telephone a friend and rant about the water crisis, or we might Google "lead poisoning" to learn how to better protect our children. We are no longer an overwhelmed spectator but an active participant. We have choices. This flexibility ultimately reduces wear and tear on our body (for example, unnecessary surges of cortisol).

The good news is that emotional granularity is a skill, and many people can increase theirs by learning new emotion concepts. Instead of grouping all negative emotions as 'bad', we need to feel them precisely: is it awfulness, frustration, gloom, anxiety, helplessness or disgust? Schoolchildren who learn more emotion concepts have improved social behaviour and academic performance, as research by the Yale Centre for Emotional Intelligence shows.

If we incorporate such concepts into our daily life, our brain will learn to apply them automatically. Emotion concepts are tools for living. The bigger our tool kit, the more flexibly our brain can anticipate and prescribe actions, and the better we can cope with life. 

Monday 4 April 2016

How are micro nutrients different from macro nutrients?

The food we eat provides us with energy as well as with essential ingredients necessary for the growth and maintenance of our body. All these nutrients can be divided into two main categories, micro and macro nutrients.

Micro nutrients

These are present in our diets, but in very small amounts. They are found as vitamins, minerals and trace elements. Micro nutrients, just like water, do not provide energy; however, they are still needed in adequate amounts to ensure that all our body cells function properly. Even though they are present in minute amounts, they are very important to nutrition.

Most of the micro nutrients are essential nutrients, meaning they are indispensable to life processes and cannot be made by the body itself. In other words, these essential nutrients can only be obtained from the food we eat.
Micro nutrients can be found in:
· Vitamins
· Minerals
· Trace elements

Macro nutrients

These are present in our diets in large amounts and make up the bulk of what we eat.
They can be found in:
· Carbohydrates
· Fat
· Protein
· Water

Dalai Lama's 18 Rules for Living

For happy and healthy living, let's go through the 18 rules proposed by H.H. Dalai Lama:

1. Take into account that great love and great achievements involve great risk.

2. When you lose, don't lose the lesson.

3. Follow the three R's:
  • Respect for self
  • Respect for others
  • Responsibility for all your actions.

4. Remember that not getting what you want is sometimes a wonderful stroke of luck.

5. Learn the rules so you know how to break them properly.

6. Don't let a little dispute injure a great friendship.

7. When you realize you've made a mistake, take immediate steps to correct it.

8. Spend some time alone every day.

9. Open your arms to change, but don't let go of your values.

10. Remember that silence is sometimes the best answer.

11. Live a good, honorable life. Then when you get older and think back, you'll be able to enjoy it a second time.

12. A loving atmosphere in your home is the foundation for your life.

13. In disagreements with loved ones, deal only with the current situation. Don't bring up the past.

14. Share your knowledge. It's a way to achieve immortality.

15. Be gentle with the earth.

16. Once a year, go someplace you've never been before.

17. Remember that the best relationship is one in which your love for each other exceeds your need for each other.

18. Judge your success by what you had to give up in order to get it.

Sunday 3 April 2016

Laptops in classrooms: Learning aid or distraction?

Almost a decade ago, there was a news report which stated that Australia had introduced a programme to ensure every secondary school student in the country had a computer, as part of a so-called "digital education revolution". The A$2.4 billion programme was introduced in 2007, and the laptop was declared "the toolbox of the 21st century". But the push to roll out technology in classrooms is now facing a backlash, with some schools and teachers saying computers are a "distraction" and can hinder learning.

One of Australia's leading schools, Sydney Grammar School, which was attended by Prime Minister Malcolm Turnbull, has now banned students from bringing laptops to school. The elite private boys' school is also requiring students up to grade 10 - the third-last year of secondary school - to handwrite assignments. The school's headmaster, Dr John Vallance, says the use of laptops and iPads in the classroom is a distraction and prevents students from being able to express themselves by writing. Dr Vallance expresses his views: "We see teaching as fundamentally a social activity... It's about interaction between people, about discussion, about conversation. We find that having laptops or iPads in the classroom inhibits conversation - it's distracting".

Another school, St Paul's Catholic College in Sydney, has banned the use of laptops for one day a week to encourage students to play sport and to reduce reliance on the machines. "Computers have been oversold and there is no evidence that they improve outcomes," said the college's principal, Mr Mark Baker. "The problem is maturity. They (students) are very good at using technology for social interaction but not for learning." The Australian Education Union's president, Ms Correna Haythorpe, said schools should consider ways to effectively incorporate technology, including protocols to ensure computers were being used for education purposes.

The use of computers in classrooms has become a vexed topic among schools and educators around the globe. The Organisation for Economic Cooperation and Development has expressed concern about the potential overuse of technology in schools. Its research has found that as students use technology more intensively, their reading skills begin to drop substantially. "The reality is that technology is doing more harm than good in our schools today," the organisation's director for education and skills, Mr Andreas Schleicher, reportedly told the Global Education and Skills Forum in Dubai in February 2016.

According to OECD figures from 2012, Australia had the world's second-highest proportion of students using computers in school - 93.7 per cent, slightly behind the Netherlands. Singapore's figure was 69.9 per cent. But Singapore was at the top of the OECD global education rankings for maths and science released last May, with Australia in 14th place. Australia's place in the rankings has slipped in recent years, despite the promotion of technology in classrooms.

Experts in Australia have expressed mixed views about the technology roll-out. An expert on learning and technology, Professor Glenn Finger from Griffith University, said he did not agree with banning computers or requiring handwritten assignments but supported a "balanced" approach. "To go the other way and not use any technology at all may not be productive either. You can have a blended learning approach which takes advantage of the technology and of excellent teaching," he said. He likened banning computers to banning books. "For a student, it is dangerous to have a ban," he said. "Handwritten assignments are from a pre-1993 analogue world. It is not how most people in business or government or young people operate."

So we, the educators and policymakers, have to sit down and rethink our overemphasis on the laptop as the main mode of imparting education, which is reducing the role of the teacher to that of a facilitator. That overemphasis is already showing its negative effects on students, in the form of shorter attention spans, distraction, addiction to social media, and poor performance in class as well as in exams! We need to adopt a blended approach to learning and teaching, in such a way that technology complements the process of actual teaching in the classroom.

Saturday 2 April 2016

Managing our cholesterol levels

We often hear of people avoiding certain types of food, like eggs, because they think their cholesterol level will go up by eating them. For sure, high cholesterol levels put us at risk of coronary heart disease, heart attack and stroke, but dietary cholesterol does not raise our blood cholesterol as much as large amounts of saturated fats do.

Cholesterol is a waxy, fat-like substance that is made and used by our bodies to make some hormones (e.g., sex hormones), vitamin D, bile and other substances. It is mainly found in foods from animal sources, such as meat, poultry and full-fat dairy products. When we eat a diet high in saturated and trans fats, our livers will produce more cholesterol.

Many people do not know their cholesterol level is too high as high cholesterol levels do not cause symptoms. A blood test will tell us if our cholesterol level is too high. We can control our cholesterol level with a healthy diet and regular aerobic exercise, though some people will also need to take medications. Let's take a closer look.

How does high cholesterol affect us?

Excess cholesterol in our blood builds up as plaque in the walls of our arteries, making it harder for our heart to circulate blood. A heart attack or stroke can occur when a blood clot suddenly forms in these narrowed arteries. Cholesterol is transported through our bloodstream by carriers called lipoproteins, of which the two main types are low-density lipoprotein (LDL) and high-density lipoprotein (HDL).

LDL is the "bad" cholesterol as it carries cholesterol to the tissues, including the arteries, and elevated LDL levels are strongly linked to increased cardiovascular risk. HDL is considered "good" cholesterol because it transports cholesterol back to the liver, where it is removed from the body.

What types of food should we avoid?

Cholesterol levels can certainly be lowered by dietary changes, especially by avoiding red meat, butter, fried foods, cheese and other foods high in saturated fat. We also need to restrict our intake of sugar, sweets and refined grains, which are found in foods such as white bread, white rice and most forms of pasta. Eggs are fine in moderation; consumption of up to one a day is acceptable and safe for the heart.

What is a good diet to follow?

A Mediterranean diet appears to reduce the risk of cardiovascular events. This diet typically consists mainly of fruits and vegetables, whole grains, beans, nuts, seeds and olive oil as an important source of fat. It also usually includes low to moderate amounts of fish, poultry, dairy products and little red meat.

When do I need medication (e.g., statins)?

Statins are drugs that block a substance our body needs to make cholesterol. Several trials have unequivocally demonstrated the benefit of statins in patients with coronary artery plaque disease, especially in those who have suffered a heart attack. The decision to start statins is made after a personalised assessment of the patient's overall cardiovascular risk. So if, for instance, one's long-term risk of a heart attack or stroke is high, statins may help.

Although medications can rapidly lower our cholesterol levels, it often takes 6 to 12 months before the effects of lifestyle modifications are noticeable. The treatment of high cholesterol - and triglycerides, a type of fat that contributes partly towards our total cholesterol count - is a lifelong process. It is thus important to stick with the treatment plan once we begin to see results.

Many people ask to discontinue their statin treatment because of its side effects. Statins are generally very safe, though some people do not tolerate them well. Common side effects include muscle and joint pain, nosebleeds, sore throat, headache and digestive problems such as diarrhoea, according to Britain's National Health Service (NHS). Liver problems can occur but are rare. Cognitive impairment, such as memory loss and confusion, has also been reported, but the US Food and Drug Administration says these experiences are rare, and that the symptoms were not serious and were reversible within a few weeks after the patient stopped using the drug.

Statins may also confer "a small increased risk of developing diabetes" and this risk becomes slightly greater with high doses than moderate doses. However, there is overwhelming evidence from clinical trials that shows that statins reduce heart attacks in patients with and without diabetes. The beneficial effects of statins on cardiovascular protection thus far outweigh the increased risk.

How much would you pay to extend your life by a year?

If you have recently watched TV (in Singapore or New Delhi), you have likely seen the commercial in which somebody's future older self scolds his current younger self for not properly planning for the future. The commercial raises an interesting question about our ability to predict what our future self is likely to want.

A study published in the prestigious journal Science suggests that our predictions about the future are likely to be wrong. It presents something the authors call the "end-of-history illusion". The authors surveyed more than 19,000 adults, asking them to report how much they had changed in the past and to predict how much they would change in the future.

Results showed that regardless of how old they were, people generally responded that they had changed a lot in the past but were unlikely to change much in the future. In other words, in the moment, we all believe we know our true selves, but we are almost surely wrong. The illusion becomes clear when we are asked again at some point in the future: we are likely to make the same erroneous claim all over again.

The end-of-history illusion suggests that our current and future selves are likely to disagree on many issues, but it is the concept of 'present bias' that provides insight into which of the two selves will get its way. Present bias implies an irrational preference for current over future consumption, and therefore too little investment in the future. From the future self's perspective, the current self exercises too little, eats too much, and does not save for a rainy day. By the time the current self becomes the future self, it is too late, and all that is left to do is regret the decisions made by one's former self.
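
Economists often formalise present bias with a quasi-hyperbolic "beta-delta" discounting model. The short Python sketch below is my own illustration of that standard idea - it is not from the article, and its parameter values are invented purely for demonstration:

    # A minimal sketch of present bias using the quasi-hyperbolic
    # "beta-delta" model (illustrative only; numbers are made up).

    def present_value(reward, delay_years, beta=0.7, delta=0.95):
        # beta < 1 imposes an extra penalty on anything not immediate;
        # delta is the ordinary per-year discount factor.
        if delay_years == 0:
            return float(reward)
        return beta * (delta ** delay_years) * reward

    print(round(present_value(100, 0), 1))  # 100.0 : the present moment, valued in full
    print(round(present_value(100, 1), 1))  # 66.5  : next year, heavily marked down
    # Without present bias (beta = 1) the one-year value would be 95.0;
    # the gap between 95.0 and 66.5 is the "irrational" extra discount.

Seen through the future self's eyes, that marked-down value is precisely the under-investment in the future described above.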

There are many practical implications of these biases. Several studies were conducted at the Lien Centre for Palliative Care at Duke-NUS Medical School, Singapore, to explore the extent to which these and other biases influence treatment choices for life-limiting illnesses like cancer. In short, the results are alarming: they suggest that patients with life-limiting illnesses are unlikely to receive care consistent with their preferences.

As an example, over 500 healthy older Singaporeans and 320 cancer patients were surveyed to explore how much each group would be willing to pay for moderately life-extending treatments and other end-of-life services. Healthy older adults stated, on average, that they would pay less than $3,000 to extend their life by one year if diagnosed with a life-limiting illness such as advanced cancer. Cancer patients, on the other hand, were willing to pay roughly $18,000, six times what healthy adults thought they would pay if in the same situation. Clearly, the current and future selves are seeing things differently.

In Singapore, as with many other Asian countries, patients often defer to family members to make decisions as to which end-of-life treatments to receive. Anecdotally, up to one-third of cancer patients either do not know or pretend not to know that they have cancer. For them, all treatment decisions are made by the family, with input from the doctor.

This would not be problematic if patients and their family caregivers had similar views on end-of-life treatments. However, a second study conducted at the Centre reveals that this is unlikely to be the case, presumably because caregivers want to retain hope and avoid any regret for not doing everything within their power to extend the life of their loved one. In this study, cancer patients' willingness to pay for end-of-life treatments was compared with that of their family caregivers. The researchers found that caregivers were far more aggressive in their willingness to pursue treatments with only moderate survival benefits.

For a treatment that would extend the patient's life by one year, in contrast to the $3,000 stated by healthy adults for extending their own life and the $18,000 stated by patients, caregivers would pay over $61,000 - more than three times what patients would pay for themselves. As a result, patients who do not have a say in their treatment are likely to be overtreated compared with what they would receive if actively involved in the treatment decisions.
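
The ratios quoted above follow directly from the reported dollar figures. Here is a trivial sketch of the arithmetic, using the rounded numbers as stated in this post:

    # Willingness to pay for one extra year of life (rounded figures
    # reported in the studies described above).
    healthy    = 3_000    # healthy older adults, for their own extra year
    patients   = 18_000   # cancer patients, for the same extra year
    caregivers = 61_000   # family caregivers, on the patient's behalf

    print(patients / healthy)     # 6.0  : "six times" what healthy adults said
    print(caregivers / patients)  # ~3.4 : "more than three times" the patients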

One might hope that doctors would intervene to ensure overtreatment does not occur. However, a third study suggests this is unlikely to be the case. In this study, 285 local doctors were surveyed and given hypothetical scenarios describing patients with life-limiting illnesses but with characteristics that varied by age, expected survival, cognitive status and treatment costs. For each scenario, the physicians were asked whether or not they would recommend life-extending treatments.

Results showed a lack of consistency in physician recommendations. For example, for a 75-year-old patient who is not cognitively impaired and whose life could be extended by one year at a cost of $55,000, roughly 45 per cent of physicians stated they would recommend the life-extending treatment and the remainder said they would not. This is close to a coin toss and suggests that if a patient were to get a second or third opinion on the recommended course of treatment, it would almost surely differ from the first. This would clearly cause great anxiety on the part of the patient and family.
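
To see why a 45 per cent recommendation rate behaves like a coin toss, here is a back-of-the-envelope sketch. It assumes each physician decides independently with the same probability, which is my simplifying assumption, not the study's:

    # How likely is it that further opinions differ from the first?
    # Assumes each physician independently recommends treatment with
    # probability p = 0.45, as in the scenario above (independence
    # is my assumption for illustration).
    p = 0.45           # chance a physician recommends the treatment
    q = 1 - p          # chance a physician recommends against it

    second_differs  = 2 * p * q       # first two opinions disagree
    all_three_agree = p**3 + q**3     # three opinions all match

    print(round(second_differs, 3))        # ~0.50 : essentially a coin toss
    print(round(1 - all_three_agree, 3))   # ~0.74 : three opinions rarely all agree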

This suggests that physicians should educate patients and their families on the clinical benefits of the various treatment options. However, because clinical benefit is only one of many factors that influence treatment choices for patients with life-limiting illnesses, physicians are not in a good position to make treatment decisions on behalf of the patient. That is best left to the patient, with input from the family.

So what are the implications of the above?

In short, the results of the Singapore studies suggest that we cannot count on our current selves to properly forecast what our future selves would want if diagnosed with a life-limiting illness. Most likely, we will underestimate our future demand and therefore not plan appropriately. Conversely, our loved ones, not wanting to give up hope and wanting to avoid future regret, are likely to push us towards treatments that we would feel are not worth the expense, exposing the family to significant financial risk. Physicians, often spurred by a healthcare system that pursues aggressive treatments even with limited survival benefits, are unlikely to go against the wishes of the family if the care has any potential to extend life.

There are no easy fixes to these problems. Advance care planning and open discussions about treatment choices in the event of an advanced illness are the recommended first steps. These discussions should take place early and often, and should cover not only costs but also the trade-offs between quality of life and care that modestly extends life, potentially at a low quality.

One thing this research makes clear is that patients value dying in a place of their choice, avoiding severe pain, and receiving well-coordinated healthcare in which they are treated with dignity and respect far more than they value moderate increases in life expectancy. Caregivers, providers and policymakers need to understand what matters most to patients as they approach the end of life, and work to ensure those priorities are met.

Monday 28 March 2016

GPS: The road to ruin for the brain

I was reading an article by Greg Milner, author of the forthcoming book 'Pinpoint: How GPS Is Changing Technology, Culture And Our Minds', and it set me thinking about how reliance on GPS while driving could erode our cognitive maps, because we stop thinking for ourselves. In Human Evolution, we talk about the 'use and disuse theory', where disuse of an organ leads to its loss over a period of time!

As Greg mentions, after a couple of Swedes mistakenly followed their GPS to the city of Carpi (when they meant to visit Capri), an Italian tourism official dryly noted to the BBC that "Capri is an island. They did not even wonder why they didn't cross any bridge or take any boat". And an Upper West Side blogger's account of the man who interpreted "turn here" to mean onto a stairway in Riverside Park was headlined: GPS, Brain Fail Driver.

But some GPS mishaps have tragic endings - like the couple who ignored "Road Closed" signs and plunged off a bridge in Indiana last year. Disastrous incidents involving drivers following disused roads and disappearing into remote areas of Death Valley in California became so common that park rangers gave them a name: "Death by GPS". Last October, a tourist was shot to death in Brazil after GPS led her and her husband down the wrong street and into a notorious drug area.

If we're being honest, it's not that hard to imagine doing something similar ourselves. Most of us use GPS as a crutch while driving through unfamiliar terrain, tuning out and letting that soothing voice do the dirty work of navigating. Since the explosive rise of in-car navigation systems around 10 years ago, several studies have demonstrated empirically the downside of using GPS. Cornell researchers who analysed the behaviour of drivers using GPS found these drivers to be "detached" from the "environments that surround them". Their conclusion: "GPS eliminated much of the need to pay attention."

We seem so driven to fit our cars and other conveyances with the latest GPS that we fail to see that one consequence may be a diminution of our "cognitive map", a term introduced in 1948 by the University of California, Berkeley psychologist Edward Tolman. In a groundbreaking paper, Dr Tolman analysed several laboratory experiments involving rats and mazes. He argued that rats had the ability to develop not only cognitive "strip maps" - simple conceptions of the spatial relationship between two points - but also more comprehensive cognitive maps that encompassed the entire maze.

Could society's embrace of GPS be eroding our cognitive maps? For Dr Julia Frankenstein, a psychologist at the University of Freiburg's Centre for Cognitive Science, the danger of GPS is that "we are not forced to remember or process the information - as it is permanently 'at hand', we need not think or decide for ourselves". She has written that we "see the way from A to Z, but we don't see the landmarks along the way". In this sense, "developing a cognitive map from this reduced information is a bit like trying to get an entire musical piece from a few notes".

There is evidence that one's cognitive map can deteriorate. A widely reported study published in 2006 demonstrated that the brains of London taxi drivers have larger than average amounts of grey matter in the area responsible for complex spatial relations. Brain scans of retired taxi drivers suggested that the volume of grey matter in those areas decreases when that part of the brain is no longer used as frequently. "I think it's possible that if you went to someone doing a lot of active navigation, but just relying on GPS," said Dr Hugo Spiers, one of the authors of the taxi study, "you'd actually get a reduction in that area."

GPS is just one more way for us to strip-map the world, receding into our automotive cocoons as we run the maze. Maybe we should be grateful when we sometimes end up in an entirely different but beautiful place, even if by accident!

New take on the Midlife crisis

The phrase 'midlife crisis' refers to the stage in the middle of the journey of life when people feel youth vanishing, their prospects narrowing and death approaching, and so they start feeling anxious and nervous. But there's only one problem with the cliche: it isn't true.

"In fact, there is almost no hard evidence for midlife crisis at all, other than a few small pilot studies conducted decades ago," reporter Barbara Bradley Hagerty writes in her new book, Life Reimagined. The vast bulk of the research shows that there may be a pause, or a shifting of gears in the 40s or 50s, but this shift "can be exhilarating, rather than terrifying". Ms Bradley Hagerty looks at some of the features of people who turn midlife into a rebirth. They break routines, because "autopilot is death". They choose purpose over happiness - having a clear sense of purpose even reduces the risk of Alzheimer's. They put relationships at the foreground, as career often recedes.

Life Reimagined paints a portrait of middle age that is far from grim and decelerating. Midlife begins to seem like the second big phase of decision-making. According to the book, our identity has been formed; we know who we are; we've built up our resources; and now we have the chance to take the big risks precisely because our foundation is already secure.

The theologian Karl Barth described midlife in precisely this way. At middle age, he wrote: "The sowing is behind; now is the time to reap. The run has been taken; now is the time to leap. Preparation has been made; now is the time for the venture of the work itself." The middle-aged person, Barth continued, can see death in the distance, but moves with a "measured haste" to get big new things done while there is still time.

What Barth wrote decades ago is even truer today. People stay healthy and energetic longer. Greater longevity is changing the narrative structure of life itself. People between age 20 and their early 30s can now take a little more time to try on new career options, new cities and new partners. Another profound change is the altered shape of middle age. What was once considered the beginning of a descent is now a potential turning point - the turning point we are best equipped to take full advantage of. It is the moment when we can look back on our life so far and see it with different eyes. We begin to see how all our different commitments can be integrated into one meaning and purpose.

We might have enough clarity by now to orient our life around a true north on some ultimate horizon. Lincoln, for example, found in midlife that everything so far had prepared him to preserve the Union and end slavery. The rest of us don't have causes that grand, but plenty of people bring their life to a point. They dive fully into existing commitments, or embrace new ones.

Either way, with a little maturity, they're less likely by middle age to be blinded by ego, more likely to know what it is they actually desire, more likely to get out of their own way, and maybe a little less likely to care about what other people think. They get off that 'supervisor's perch' and put themselves in direct contact with the people they can help the most.

They achieve a kind of tranquility, not because they've decided to do nothing, but because they've achieved focus and purity of will. They have enough self-confidence to say no to some things, so that they can say yes to others. From this perspective, middle age is kind of inspiring. Many of life's possibilities are now closed, but the remaining possibilities can be seized more bravely, and lived more deeply.

Sunday 20 March 2016

What comes after the wedding matters more than the ceremony

Last week there was an article in The Straits Times on how people getting married in Singapore are spending huge amounts of time and money on planning and holding big banquets (whether dinners or lunches), often stretching beyond their means in the name of a 'one-time affair'. I have seen this in my home country, India, too, where people can spend an obscene amount of money just to show off their status and wealth, which places an unspoken burden on other sections of society.

I had been meaning to pen down my thoughts when I came across a letter today from a fellow reader echoing similar sentiments. The writer, Dr Patrick Liew, says that as a solemniser and counsellor, he has seen many couples spend more time preparing for the wedding than for what comes after it. These couples are often more excited about going on a honeymoon than about growing the values of a shared life. They are more focused on how to make a living than on how to enrich each other's lives.

More time is spent pursuing quantity of possessions than quality of contributions. Couples plan for, and are more prepared to handle, life's pleasures rather than the pressures of building a healthy family and treasures that will last a lifetime. It serves us well to remember that a house does not automatically become a home, and a marriage does not naturally become a union for the greater good.

Marriage can be likened to developing a fruitful farm. Couples need to invest time and effort to nurture and strengthen their relationship. At the same time, they will have to balance the time they spend between looking at each other and looking ahead to serve a higher calling and worthier cause. Their commitment to a healthy marriage is not just made in a ceremony, but in the heart. It has to be made again and again to develop a meaningful, exciting and fulfilling relationship.

By investing wisely in the marriage, they will be rewarded with fruits of happiness, well-being and achievement.

Saturday 19 March 2016

Real process of learning

The real process of education should be the process of learning to think through the application of real problems - John Dewey

I agree with the above statement by John Dewey and firmly believe that unless education is linked to real problems, it does not serve much purpose. Let me elaborate!

Based on my 32 years of teaching experience, mostly at the undergraduate level, I have seen that to inspire and motivate my students to learn enthusiastically in the classroom, it is imperative to show them how what they are learning is relevant to their lives, their future work and their own bodies! You see, I teach modules like Human Anatomy and Physiology, Biochemistry, Genetics, Nutrition, Dietary Supplements, Psychology, etc. Learning these highly technical subjects can be quite difficult for some students, more so for those who have not done much Biology in school. But as soon as I incorporate medical case studies and share my personal experiences with patients related to that day's topic, the subject starts coming alive for them. They then start sharing stories from their own families, and telling me how they are able to understand the material better now!

So rather than just memorising the structure and function of the heart, for example, we shift the focus to real-life situations and learn by understanding the processes behind them. When we discuss the chest pain associated with a heart attack, how to read an ECG and how it changes during a heart attack, the role of an artificial pacemaker, or the causes of hypertension, students are more interested in learning, as these are common issues they have come across; they are therefore better focused and motivated to grasp the finer details of the heart's structure and functioning. The moment of truth comes when they bring a grandparent's ECG or medical report to the next class to discuss with me! I then feel satisfied that my job as an educator is done, as I have ignited that spark of learning in them!!

Some teachings from Gita

Swami Vivekananda said, "Human is divine and our goal is to reach divinity." All of us want three things in life: happiness, peace and freedom. How to get these is a big question, and our sacred text, the Gita, offers us many, many insights and ways to get there. I attended the Annual Gita Forum in Singapore some time ago, and noted down some key points as explained by the learned speakers:

1. Listen first, talk later; you solve the problem by listening rather than by being smart.

2. Ask deep, incisive questions to listen and understand the real issues, and be humble before the facts.

3. To make fair decisions, have a clear head, tough hand and warm heart.

4. Find a mentor for yourself, and provide mentorship to others.

5. Live your life as a "yagya", or sacred offering, by practising giving. When you give more than you take, your life is fulfilled.

6. The best way to find yourself is to lose yourself in others' service.

7. Look at the intersection of life goals and career goals, and aim to achieve them to have a balanced life.

8. You will be treated as you act, whether as a victim or a leader.

9. You are an idiot if you don't believe in yourself.

10. Enjoy what you do rather than what you get.

11. Infuse your life with love and care. Spend time with people who love and inspire you.

12. Surrender completely through humility, love and compassion.

13. Find the unique strengths of the people around you, rather than looking at their weaknesses.

14. Try to emulate the way great people live their lives-- uncompromising, adhering to truth, simple, loving and compassionate.

15. Do regular introspection, think and reflect on what is happening outside as well as inside you.

16. You have to BE the person you want others to be.

Thursday 11 February 2016

Intellectual Enlightenment--engaging senior adults meaningfully

In Hawaii, people above the age of 60 are allowed to "audit" all courses in all publicly funded universities-- something we should seriously consider for our senior citizens. For those not familiar with the nomenclature of US universities, to "audit" means you can attend classes, listen and participate, but you will not be asked to do the "papers", and the professor will neither mark papers nor give you a grade. It is intellectual enlightenment without the pressure of writing papers to deadline and worrying about grades. Perfect for a sophisticated older gentleman or lady-- what more can one ask for?

Another interesting bit of information comes from professors who have taught such "audit" courses with many senior citizens present. The presence of seniors in the classroom changes the cultural dynamic of the course, especially during discussions, when the seniors contribute their longer experience and memories. Whereas many of the young adults might have been only toddlers when the Berlin Wall fell, the seniors would have watched it on television. So it is not just the seniors but also the young students who benefit from discussion that cuts across generational lines. Everyone wins when this kind of 'diversity' comes in-- particularly some of our aggressive, know-it-all students.

As far as I know, no such schemes exist in Asian universities, which cater mostly to those under 30 years old. As our population ages, educational institutions might have a vital role to play. The wealthy elite can benefit from courses on appreciating the opera or on "spirituality", or even join "light" physical activities such as yoga at the university gym. The elderly poor-- unfortunately, the majority fall into this category-- can be taught simple computer skills that will allow them, for example, to pay their electricity and phone bills online rather than having to line up at the nearby post office.

Another benefit is that the young and the old come to understand each other's needs better, be it declining moral values or the pain of unemployment (as a result of a competitive attitude). If enough seniors were hanging around classrooms and student canteens, reasoning with the young students before problems exploded in the public domain, then perhaps we might have more enlightened campuses - and better understanding between young and old.

Why few child prodigies grow up to be geniuses

They learn to read at age two, play Bach at four, breeze through calculus at six, and speak foreign languages fluently by eight. Their classmates shudder with envy; their parents rejoice at winning the lottery. But, to paraphrase T.S. Eliot, their careers tend to end not with a bang, but with a whimper.

Consider the most prestigious award in the US for scientifically gifted high school students, the Westinghouse Science Talent Search, called the Super Bowl of science by one American president. From its inception in 1942 until 1994, the search recognised more than 2,000 precocious teenagers as finalists. But just 1 per cent ended up making the National Academy of Sciences, and just eight have won Nobel Prizes. For every Lisa Randall who revolutionises theoretical physics, there are many dozens who fall far short of their potential.

Child prodigies rarely become adult geniuses who change the world. We assume that they must lack the social and emotional skills to function in society. When you look at the evidence, though, this explanation doesn't suffice: Less than a quarter of gifted children suffer from social and emotional problems. A vast majority are well adjusted - as winning at a cocktail party as in the spelling bee.

What holds them back is that they don't learn to be original. They strive to earn the approval of their parents and the admiration of their teachers. But as they perform in Carnegie Hall and become chess champions, something unexpected happens: Practice makes perfect, but it doesn't make new.

The gifted learn to play magnificent Mozart melodies, but rarely compose original scores. They focus their energy on consuming existing scientific knowledge, not producing new insights. They conform to codified rules, rather than inventing their own. Research suggests the most creative children are the least likely to become the teacher's pet, and in response, many learn to keep their original ideas to themselves. In the language of the critic William Deresiewicz, they become the excellent sheep.

In adulthood, many prodigies become experts in their fields and leaders in their organisations. Yet "only a fraction of gifted children eventually become revolutionary adult creators", laments psychologist Ellen Winner. "Those who do must make a painful transition" to an adult who "ultimately remakes a domain". Most prodigies never make that leap. They apply their extraordinary abilities by shining in their jobs without making waves. They become doctors who heal their patients without fighting to fix the broken medical system, or lawyers who defend clients on unfair charges but do not try to transform the laws themselves.

So what does it take to raise a creative child? One study compared the families of children who were rated among the most creative 5 per cent in their school system with those who were not unusually creative. The parents of ordinary children had an average of six rules, like specific schedules for homework and bedtime. Parents of highly creative children had an average of fewer than one rule.

Creativity may be hard to nurture, but it's easy to thwart. By limiting rules, parents encouraged their children to think for themselves. They tended to "place emphasis on moral values, rather than on specific rules", Harvard psychologist Teresa Amabile reports.

When psychologist Benjamin Bloom led a study of the early roots of world-class musicians, artists, athletes and scientists, he learnt that their parents didn't dream of raising superstar kids. They weren't drill sergeants or slave drivers. They responded to the intrinsic motivation of their children. When their children showed interest and enthusiasm in a skill, the parents supported them.

Even then, though, parents didn't shove their values down their children's throats. When psychologists compared the most creative architects in the US with a group of highly skilled but unoriginal peers, there was something unique about the parents of the creative architects: "Emphasis was placed on the development of one's own ethical code." Yes, parents encouraged their children to pursue excellence and success - but they also encouraged them to find "joy in work". Their children had freedom to sort out their own values and discover their own interests. And that set them up to flourish as creative adults.

Top concert pianists didn't have elite teachers from the time they could walk; their first lessons came from instructors who happened to live nearby and made learning fun. Mozart showed interest in music before taking lessons, not the other way around. Mary Lou Williams learnt to play the piano on her own; Itzhak Perlman began teaching himself the violin after being rejected from music school.

Even the best athletes didn't start out any better than their peers. When Bloom's team interviewed tennis players who were ranked in the top 10 in the world, they were not, to paraphrase Jerry Seinfeld, doing push-ups since they were a foetus. Few of them faced intense pressure to perfect the game as Andre Agassi did. A majority of the tennis stars remembered one thing about their first coaches: They made tennis enjoyable.

Since Malcolm Gladwell popularised the "10,000-hour rule" suggesting that success depends on the time we spend in deliberate practice, debate has raged about how the number of hours necessary to become an expert varies by field and person. In arguing about that, we've overlooked two questions that matter just as much.

First, can't practice itself blind us to ways to improve our area of study? Research reveals that the more we practise, the more we become entrenched - trapped in familiar ways of thinking. Expert bridge players struggled more than novices to adapt when the rules were changed; expert accountants were worse than novices at applying a new tax law.

Second, what motivates people to practise a skill for thousands of hours? The most reliable answer is passion - discovered through natural curiosity or nurtured through early enjoyable experiences with an activity or many activities.

Evidence shows that creative contributions depend on the breadth, not just depth, of our knowledge and experience. In fashion, the most original collections come from directors who spend the most time working abroad. In science, winning a Nobel Prize is less about being a single-minded genius and more about being interested in many things. Relative to typical scientists, Nobel Prize winners are 22 times more likely to perform as actors, dancers or magicians; 12 times more likely to write poetry, plays or novels; seven times more likely to dabble in arts and crafts; and twice as likely to play an instrument or compose music.

No one is forcing these luminary scientists to get involved in artistic hobbies. It's a reflection of their curiosity. And sometimes, that curiosity leads them to flashes of insight. "The theory of relativity occurred to me by intuition, and music is the driving force behind this intuition," Albert Einstein reflected. His mother enrolled him in violin lessons when he was five, but he wasn't intrigued. His love of music only blossomed as a teenager, after he stopped taking lessons and stumbled upon Mozart's sonatas. "Love is a better teacher than a sense of duty," he said.

Hear that, Tiger Mums and Lombardi Dads? You can't programme a child to become creative. Try to engineer a certain kind of success, and the best you'll get is an ambitious robot. If you want your children to bring original ideas into the world, you need to let them pursue their passions, not yours.


(This article by Adam Grant appeared in The Straits Times on 7th Feb, 2016. Adam Grant is a professor of management and psychology at the Wharton School of the University of Pennsylvania, and author of Originals: How Non-Conformists Move The World)

Tuesday 9 February 2016

Want to be happier? Start thinking more about death

Want a better 2016? Try thinking more about your impending demise.

Years ago, on a visit to Thailand, the writer Arthur C. Brooks was surprised to learn that Buddhist monks often contemplate photos of corpses in various stages of decay. The Buddha himself recommended corpse meditation. "This body, too," students were taught to say about their own bodies, "such is its nature, such is its future, such its unavoidable fate."

Paradoxically, this meditation on death is intended as a key to better living. It makes disciples aware of the transitory nature of their own physical lives and stimulates a realignment between momentary desires and existential goals. In other words, it makes one ask: "Am I making the right use of my scarce and precious life?"

In fact, most people suffer grave misalignment. In a 2004 article in the journal Science, a team of scholars, including the Nobel Prize winner Daniel Kahneman, surveyed a group of women to compare how much satisfaction they derived from their daily activities. Among voluntary activities, we might expect that choices would roughly align with satisfaction. Not so. The women reported deriving more satisfaction from prayer, worship and meditation than from watching television. Yet the average respondent spent more than five times as long watching TV as engaging in spiritual activities.

If anything, this study understates the misalignment problem. The American Time Use Survey from the Bureau of Labor Statistics shows that, in 2014, the average US adult spent four times longer watching television than "socialising and communicating", and 20 times longer on TV than on "religious and spiritual activities". The survey did not ask about hours spent surfing the Web, but we can imagine a similar disparity.

This misalignment leads to regret. Millions have resolved to waste less time in 2016 and have already failed. I imagine some readers of this article are filled with self-loathing because they just wasted 10 minutes on an article titled "Celebrities With Terrible Skin".

Some might say that this reveals our true preferences for TV and clickbait over loved ones and God. But I believe it is an error in decision-making. Our days tend to be an exercise in distraction. We think about the past and future more than the present; we are mentally in one place and physically in another. Without consciousness, we mindlessly blow the present moment on low-value activities. The secret is not simply a resolution to stop wasting time, however. It is to find a systematic way to raise the scarcity of time to our consciousness.

Even if contemplating a corpse is a bit too much, you can still practise some of the Buddha's wisdom by resolving to live as if 2016 were your last year. Then remorselessly root out activities, small and large, that don't pass the "last-year test". There are many creative ways to apply this test. For example, if you are planning a summer vacation, consider what you would do for a week or two if this were your last opportunity. With whom would you reconnect and spend some time? Would you settle your soul on a silent retreat, or instead spend the time drunk somewhere?

If this year were your last, would you spend the next hour checking social media, or read something uplifting? Would you compose a nasty comment on this article, or use the time to call a friend to see how she is doing? Some might think the last-year test is impractical. In a new paper in the science journal PLOS One, two psychologists looked at how people value money when contemplating death. One might assume that, when reminded of death, people would greatly favour current spending over future spending. But that's not how it turned out: considering death actually made respondents less likely to want to blow money now than other scenarios did.

Will cultivating awareness of the scarcity of your time make you grim and serious? Not at all. In fact, there is some evidence that contemplating death makes you funnier. Two scholars in 2013 published an academic paper detailing research in which they subliminally primed people to think about either death or pain, and then asked them to caption cartoons. Outside raters found the death-primed participants' captions to be funnier.

There's still time to rethink your resolutions. Forget losing weight and saving money - those are New Year's resolutions for amateurs. This year, improve your alignment, and maybe get funnier in the process: be fully alive now by meditating on your demise. Happy 2016!


(Adapted from an article by Arthur C. Brooks in The New York Times)