Lapidarium notes

Amira Skomorowska's notes

"Everything you can imagine is real."— Pablo Picasso

Lapidarium

Tags:

Africa
Age of information
Ancient
Anthropology
Art
Artificial intelligence
Astronomy
Atheism
Beauty
Biography
Books
China
Christianity
Civilization
Cognition, perception, relativity
Cognitive science
Collective intelligence
Communication
Consciousness
Creativity
Culture
Curiosity
Cyberspace
Democracy
Documentary
Drawing
Earth
Economy
Evolution
Friendship
Funny
Future
Genetics
Globalization
Happiness
History
Human being
Illustrations
Imagination
Individualism
Infographics
Information
Inspiration
Internet
Knowledge
Language
Learning
Life
Literature
Logic
Love
Mathematics
Media
Metaphor
Mind & Brain
Multiculturalism
Music
Networks
Neuroscience
Painting
Paradoxes
Patterns
Philosophy
Poetry
Politics
Physics
Psychology
Rationalism
Religions
Science
Science & Art
Self improvement
Semantics
Society
Sociology
Storytelling
Technology
The other
Time
Timeline
Traveling
Unconsciousness
Universe
USA
Video
Violence
Visualization



Archive

Sep 22nd, Thu

Timothy D. Wilson on The Social Psychological Narrative: ‘It’s not the objective environment that influences people, but their constructs of the world’


"In the mid-1970s, Tim Wilson and Dick Nisbett opened the basement door with their landmark paper entitled “Telling More Than We Can Know,” in which they reported a series of experiments showing that people are often unaware of the true causes of their own actions, and that when they are asked to explain those actions, they simply make stuff up. People don’t realize they are making stuff up, of course; they truly believe the stories they are telling about why they did what they did. But as the experiments showed, people are telling more than they can know. The basement door was opened by experimental evidence, and the unconscious took up permanent residence in the living room. Today, psychological science is rife with research showing the extraordinary power of unconscious mental processes. (…)

At the center of all his work lies a single enigmatic insight: we seem to know less about the worlds inside our heads than about the world our heads are inside.

The Torah asks this question: “Is not a flower a mystery no flower can explain?” Some scholars have said yes, some scholars have said no. Wilson has said, “Let’s go find out.” He has always worn two professional hats — the hat of the psychologist and the hat of the methodologist. He has written extensively about the importance of using experimental methods to solve real world problems, and in his work on the science of psychological change — he uses a scientific flashlight to chase away a whole host of shadows by examining the many ways in which human beings try to change themselves — from self-help to psychotherapy — and asking whether these things really work, and if so, why? His answers will surprise many people and piss off the rest. I predict that this new work will be the center of a very interesting storm.”

Daniel Gilbert, Harvard College Professor of Psychology at Harvard University; Director of Harvard’s Hedonic Psychology Laboratory; Author, Stumbling on Happiness.

It’s not the objective environment that influences people, but their constructs of the world. You have to get inside people’s heads and see the world the way they do. You have to look at the kinds of narratives and stories people tell themselves as to why they’re doing what they’re doing. What can get people into trouble sometimes in their personal lives, or for more societal problems, is that these stories go wrong. People end up with narratives that are dysfunctional in some way.

We know from cognitive behavioral therapy and clinical psychology that one way to change people’s narratives is through fairly intensive psychotherapy. But social psychologists have suggested that, for less severe problems, there are ways to redirect narratives more easily that can have amazingly powerful long-term effects. This is an approach that I’ve come to call story editing. By giving people little prompts, suggestions about the ways they might reframe a situation, or think of it in a slightly different way, we can send them down a narrative path that is much healthier than the one they were on previously. (…)

This little message that maybe it’s not me, it’s the situation I’m in, and that that can change, seemed to alter people’s stories in ways that had dramatic effects down the road. Namely, people who got this message, as compared to a control group that did not, got better grades over the next couple of years and were less likely to drop out of college. Since then, there have been many other demonstrations of this sort that show that little ways of getting people to redirect their narrative from one path down another is a powerful tool to help people live better lives. (…)

Think back to the story editing metaphor: What these writing exercises do is make us address problems that we haven’t been able to make sense of and put us through a sense-making process of reworking it in such a way that we gain a new perspective and find some meaning, so that we basically come up with a better story that allows us to put that problem behind us. This is a great example of a story editing technique that can be quite powerful. (…)

Social psychology is a branch of psychology that began in the 1950s, founded mostly by immigrants from Germany who were escaping the Nazi regime — Kurt Lewin being the most influential of them. What they had to offer at that time was largely an alternative to behaviorism. Instead of looking at behavior as solely the product of our objective reinforcement environment, Lewin and others said you have to get inside people’s heads and look at the world as they perceive it. These psychologists were very influenced by Gestalt psychologists who were saying the same thing about perception, and they applied this lesson to the way the mind works in general. (…) But to be honest, the field is a little hard to define. What is social psychology? Well, the social part is about interactions with other people, and topics such as conformity are active areas of research. (…)

Most economists don’t take the social psychological approach of trying to get inside the heads of people and understanding how they interpret the world. (…)

My dream is that policymakers will become more familiar with this approach and be as likely to call upon a social psychologist as an economist to address social issues. (…)

Another interesting question is the role of evolutionary theory in psychology, and social psychology in particular.  (…)

Evolutionary psychology has become a dominant force in the field. There are many who use it as their primary theoretical perspective, as a way to understand why we do what we do. (…)

There are some striking parallels between psychoanalytic theory and evolutionary theory. Both theories, at some general level are true. Evolutionary theory, of course, shows how the forces of natural selection operated on human beings. Psychoanalytic theory argues that our childhood experiences mold us in certain ways and give us outlooks on the world. Our early relationships with our parents lead to unconscious structures that can be very powerful. (…)

One example where evolutionary psychology led to some interesting testable hypotheses is work by Jon Haidt, my colleague at the University of Virginia. He has developed a theory of moral foundations that says that all human beings endorse the same list of moral values, but that people of different political stripes believe some of these values are more important than others. In other words, liberals may have somewhat different moral foundations than conservatives. Jon has persuasively argued that one reason that political discourse has become so heated and divisive in our country is that there is a lack of understanding in one camp of the moral foundations that the other camp is using to interpret and evaluate the world. If we can increase that understanding, we might lower the heat and improve the dialogue between people on opposite ends of the political spectrum.

Another way in which evolutionary theory has been used is to address questions about the origins of religion. This is not a literature I have followed that closely, to be honest, but there’s obviously a very interesting discourse going on about group selection and the origins and purpose of religion. The only thing I’ll add is, back to what I’ve said before about the importance of having narratives and stories to give people a sense of meaning and purpose, well, religion is obviously one very important source of such narratives. Religion gives us a sense that there is a purpose and a meaning to life, the sense that we are important in the universe, and that our lives aren’t meaningless specks like a piece of sand on a beach. That can be very powerful for our well-being. I don’t think religion is the only way to accomplish that; there are many belief systems that can give us a sense of meaning and purpose other than religion. But religion can fill that void.”

Timothy D. Wilson is the Sherrell J. Aston Professor of Psychology at the University of Virginia and a researcher of self-knowledge and affective forecasting. The Social Psychological Narrative — or — What Is Social Psychology, Anyway?, Edge, 6 July 2011 (video and full transcript). (Illustration: Hope Kroll, Psychological 3-D narrative)

See also:

Daniel Kahneman: The Marvels and the Flaws of Intuitive Thinking
Iain McGilchrist on The Divided Brain and the Making of the Western World
Dean Buonomano on ‘Brain Bugs’ - Cognitive Flaws That ‘Shape Our Lives’
David Eagleman on how we construct reality, time perception, and The Secret Lives of the Brain
David Deutsch: A new way to explain explanation
Cognition, perception, relativity tag on Lapidarium notes

Jul 16th, Sat

Keri Smith on ‘How To Be An Explorer of the World’



"Artists and scientists analyze the world in surprisingly similar ways."

Science – “The intellectual and practical activity encompassing the systematic study of structure and behavior of the physical and natural world through observation and experiment.”

— Oxford American Dictionary, cited in ibidem, p. 199.



"[The residual purpose of art is] purposeless play. This play, however, is an affirmation of life - not an attempt to bring order out of chaos nor to suggest improvements in creation, but simply a way of waking up to the very life we’re living, which is so excellent once one gets one’s mind and one’s desires out of its way and lets it act of its own accord.”

John Cage, cited in ibidem, p. 104



“Sometimes a tree can tell you more than can be read in a book.”

Carl Jung cited in ibidem, p. 138.

Keri Smith, author, illustrator, guerrilla artist, How To Be An Explorer of the World: Portable Life Museum, Penguin Books, 2008.

Jun 24th, Fri

The Neurobiology of “We”. Relationship is the flow of energy and information between people, essential in our development


"The study of neuroplasticity is changing the way scientists think about the mind/brain connection. While they’ve known for years that the brain is the physical substrate for the mind, the central mystery of neuroscience is how the mind influences the physical structure of the brain. In the last few decades, thanks to PET and MRI imaging techniques, scientists can observe what’s actually going on in the brain while people sleep, work, make decisions, or attempt to function under limitations caused by illness, accident, or war. (…)

Dr. Daniel Siegel, Clinical Professor of Psychiatry at the UCLA School of Medicine, co-director of the Mindful Awareness Research Center, and director of the Mindsight Institute (…) convinced that the “we” connection is a little-understood, but powerful means for individual and societal transformation that should be taught in schools and churches, and even enter into politics.

“Interpersonal neurobiology isn’t a form of therapy,” he told me, “but a form of integrating a range of scientific research into a picture of the nature of human reality. It’s a phrase I invented to account for the human effort to understand truth. We can define the mind. We can define mental health. We can base everything on science, but I want to base it on all the sciences. We’re looking for what we call ‘consilience.’ If you think of the neuroscientist as a blind man looking at only one part of an elephant, we are trying to discover the ‘whole-elephant’ view of reality.” (…)

“We is what me is!”

Our nervous system has two basic modes: it fires up or quiets down. When we’re in a reactive state, our brainstem signals the need for fight or flight. This means we’re unable to open ourselves to another person, and even neutral comments may be taken as fighting words. On the other hand, an attitude of receptivity activates a different branch of the brainstem as it sends messages to relax the muscles of the face and vocal cords, and normalizes blood pressure and heart rate. “A receptive state turns on the social engagement system that connects us to others,” Siegel explains in his recent book, Mindsight. “Receptivity is our experience of being safe and seen; reactivity is our fight-flight-freeze survival reflex.” (…)

He describes the brain as part of “an embodied nervous system, a physical mechanism through which both energy and information flow to influence relationship and the mind.” He defines relationship as “the flow of energy and information between people.” Mind is “an embodied and relational process that regulates the flow of energy and information, consciousness included. Mind is shared between people. It isn’t something you own; we are profoundly interconnected. We need to make maps of we because we is what me is!” (…)

[Siegel]: “We now know that integration leads to health and harmony. We can re-envision the DSM symptoms as examples of syndromes filled with chaos and rigidity, conditions created when integration is impaired. So we can define mental health as the ability to monitor ourselves and modify our states so that we integrate our lives. Then things that appeared unchangeable can actually be changed.” (…)

Relationships, mind and brain aren’t different domains of reality—they are each about energy and information flow. The mechanism is the brain; subjective impressions and consciousness are mind. The regulation of energy and information flow is a function of mind as an emergent process emanating from both relationships and brain. Relationships are the way we share this flow. In this view, the emergent process we are calling “mind” is located in the body (nervous system) and in our relationships. Interpersonal relationships that are attuned promote the growth of integrative fibers in the brain. It is these regulatory fibers that enable the embodied brain to function well and for the mind to have a deep sense of coherence and well-being. Such a state also creates the possibility of a sense of being connected to a larger world. The natural outcome of integration is compassion, kindness, and resilience.” (…)

“Everything we experience, memory or emotion or thought, is part of a process, not a place in the brain! Energy is the capacity to do stuff. There’s nothing that’s not energy, even ‘mass.’ Remember E=MC squared? Information is literally a swirl of energy in a certain pattern that has a symbolic meaning; it stands for something other than itself. Information should be a verb; mind, too—as in minding or informationing. And the mind is an embodied and relational emergent process that regulates the flow of energy and information.”

“We can be both an ‘I’ and part of an ‘us’”

[Siegel]: “Certain neurons can fire when someone communicates with you. They dissolve the border between you and others. These mirror neurons are a hardwired system designed for us to see the mind-state of another person. That means we can learn easily to dance, but also to empathize with another. They automatically and spontaneously pick up information about the intentions and feelings of those around us, creating emotional resonance and behavioral imitation as they connect our internal state with those around us, even without the participation of our conscious mind.” And in Mindsight: “Mirror neurons are the antennae that pick up information about the intentions and feelings of others.… Right hemisphere signals (are those) the mirror neuron system uses to simulate the other within ourselves and to construct a neural map of our interdependent sense of a ‘self.’ It’s how we can be both an ‘I’ and part of an ‘us.’” (…)

So how can we re-shape our brain to become more open and receptive to others? We already know the brain receives input from the senses and gives it meaning, he points out. That’s how blind people find ways to take in information and map out their world. According to Siegel, they do this on secondary pathways rather than the main highways of the brain. That’s a major key to how we can bring about change: “You can take an adult brain in whatever state it’s in and change a person’s life by creating new pathways,” he affirms. “Since the cortex is extremely adaptable and many parts of the brain are plastic, we can unmask dormant pathways we don’t much use and develop them. A neural stem cell is a blob, an undifferentiated cell in the brain that divides into two every twenty-four hours. In eight–ten weeks, it will become established as a specialized neural cell and exist as a part of an interconnected network. How we learn has everything to do with linking wide areas of the brain with each other.”

He calls the prefrontal cortex “the portal through which interpersonal relations are established.” He demonstrates, by closing his hand over his thumb, how this little tiny piece of us (the last joint of the two middle fingers) is especially important because it touches all three major parts of our brain: the cortex, limbic area, and brainstem as well as the body-proper. “It’s the middle prefrontal fibers which map out the internal states of others,” he adds. “And they do this not only within one brain, mine, but also between two brains, mine and yours, and even among many brains. The brain is exquisitely social, and emotions are its fundamental language. Through them we become integrated and develop an emergent resonance with the internal state of the other.” (…)

“Relationship is key,” he emphasizes. “When we work with relationship, we work with brain structure. Relationship stimulates us and is essential in our development. People rarely mention relationship in brain studies, but it provides vital input to the brain. Every form of psychotherapy that works, works because it creates healthier brain function and structure.… In approaching our lives, we can ask where do we experience the chaos or rigidity that reveal where integration is impaired. We can then use the focus of our attention to integrate both our brain and our relationships. Ultimately we can learn to be open in an authentic way to others, and to ourselves. The outcome of such an integrative presence is not only a sense of deep well-being and compassion for ourselves and others, but also an opening of the doors of awareness to a sense of the interdependence of everything. ‘We’ are indeed a part of an interconnected whole.””

— Patty de Llosa, author, The Neurobiology of “We”, Parabola Magazine, 2011, Daniel Siegel, Clinical Professor of Psychiatry at the UCLA School of Medicine, co-director of the Mindful Awareness Research Center. (Illustration source)

Mar 27th, Sun

Later. What does procrastination tell us about ourselves?

"Many of us go through life with an array of undone tasks, large and small, nibbling at our conscience. But George Akerlof saw the experience, for all its familiarity, as mysterious. He genuinely intended to send the box to his friend, yet, as he wrote, in a paper called “Procrastination and Obedience” (1991), “each morning for over eight months I woke up and decided that the next morning would be the day to send the Stiglitz box.” He was always about to send the box, but the moment to act never arrived. Akerlof, who became one of the central figures in behavioral economics, came to the realization that procrastination might be more than just a bad habit. He argued that it revealed something important about the limits of rational thinking and that it could teach useful lessons about phenomena as diverse as substance abuse and savings habits. Since his essay was published, the study of procrastination has become a significant field in academia, with philosophers, psychologists, and economists all weighing in. (…)

The term itself (derived from a Latin word meaning “to put off for tomorrow”) entered the English language in the sixteenth century, and, by the eighteenth, Samuel Johnson was describing it as “one of the general weaknesses” that “prevail to a greater or less degree in every mind,” and lamenting the tendency in himself: “I could not forbear to reproach myself for having so long neglected what was unavoidably to be done, and of which every moment’s idleness increased the difficulty.” And the problem seems to be getting worse all the time. According to Piers Steel, a business professor at the University of Calgary, the percentage of people who admitted to difficulties with procrastination quadrupled between 1978 and 2002. In that light, it’s possible to see procrastination as the quintessential modern problem.

It’s also a surprisingly costly one. Each year, Americans waste hundreds of millions of dollars because they don’t file their taxes on time. The Harvard economist David Laibson has shown that American workers have forgone huge amounts of money in matching 401(k) contributions because they never got around to signing up for a retirement plan. Seventy per cent of patients suffering from glaucoma risk blindness because they don’t use their eyedrops regularly.

Procrastination also inflicts major costs on businesses and governments. The recent crisis of the euro was exacerbated by the German government’s dithering, and the decline of the American auto industry, exemplified by the bankruptcy of G.M., was due in part to executives’ penchant for delaying tough decisions. (In Alex Taylor’s recent history of G.M., “Sixty to Zero,” one of the key conclusions is “Procrastination doesn’t pay.”)

Akrasia—doing something against one’s own better judgment

Philosophers are interested in procrastination for another reason. It’s a powerful example of what the Greeks called akrasia—doing something against one’s own better judgment. Piers Steel defines procrastination as willingly deferring something even though you expect the delay to make you worse off. In other words, if you’re simply saying “Eat, drink, and be merry, for tomorrow we die,” you’re not really procrastinating. Knowingly delaying because you think that’s the most efficient use of your time doesn’t count, either. The essence of procrastination lies in not doing what you think you should be doing, a mental contortion that surely accounts for the great psychic toll the habit takes on people. This is the perplexing thing about procrastination: although it seems to involve avoiding unpleasant tasks, indulging in it generally doesn’t make people happy. In one study, sixty-five per cent of students surveyed before they started working on a term paper said they would like to avoid procrastinating: they knew both that they wouldn’t do the work on time and that the delay would make them unhappy.

Most of the contributors to the new book agree that this peculiar irrationality stems from our relationship to time—in particular, from a tendency that economists call “hyperbolic discounting.” (…)

Hyperbolic discounters are able to make the rational choice when they’re thinking about the future, but, as the present gets closer, short-term considerations overwhelm their long-term goals. A similar phenomenon is at work in an experiment run by a group including the economist George Loewenstein, in which people were asked to pick one movie to watch that night and one to watch at a later date. Not surprisingly, for the movie they wanted to watch immediately, people tended to pick lowbrow comedies and blockbusters, but when asked what movie they wanted to watch later they were more likely to pick serious, important films. The problem, of course, is that when the time comes to watch the serious movie, another frothy one will often seem more appealing. This is why Netflix queues are filled with movies that never get watched: our responsible selves put “Hotel Rwanda” and “The Seventh Seal” in our queue, but when the time comes we end up in front of a rerun of “The Hangover.” (…)
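The preference reversal described above can be made concrete with the standard hyperbolic form (Mazur’s): a reward of amount A delayed by D periods is valued at V = A / (1 + kD), where k measures impatience. The following is a minimal sketch, with an illustrative k and dollar amounts chosen by us (they do not come from the article), showing that the larger-later reward wins when both options are distant, while the smaller-sooner one wins once it is imminent:

```python
def hyperbolic_value(amount, delay, k=0.08):
    """Hyperbolic discounting: V = A / (1 + k * D).

    `k` controls impatience; larger k means steeper discounting.
    """
    return amount / (1 + k * delay)

# Choice: $100 at day 10 ("smaller-sooner") vs. $110 at day 12 ("larger-later").

# Viewed from day 0, both rewards are distant, and the larger-later wins:
ss_far = hyperbolic_value(100, 10)   # ~55.6
ll_far = hyperbolic_value(110, 12)   # ~56.1  -> prefer larger-later

# Viewed from day 10, the smaller reward is immediate, and it takes over:
ss_now = hyperbolic_value(100, 0)    # 100.0
ll_now = hyperbolic_value(110, 2)    # ~94.8  -> prefer smaller-sooner
```

The reversal is the signature of the hyperbola: an exponential discounter (V = A·e^(−kD)) ranks the two rewards the same way no matter how far off they are, so their choice from day 0 and from day 10 would agree.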

Why does this happen? One common answer is ignorance. Socrates believed that akrasia was, strictly speaking, impossible, since we could not want what is bad for us; if we act against our own interests, it must be because we don’t know what’s right. Loewenstein, similarly, is inclined to see the procrastinator as led astray by the “visceral” rewards of the present. As the nineteenth-century Scottish economist John Rae put it, “The prospects of future good, which future years may hold out to us, seem at such a moment dull and dubious, and are apt to be slighted, for objects on which the daylight is falling strongly, and showing us in all their freshness just within our grasp.” Loewenstein also suggests that our memory for the intensity of visceral rewards is deficient: when we put off preparing for that meeting by telling ourselves that we’ll do it tomorrow, we fail to take into account that tomorrow the temptation to put off work will be just as strong.

"The planning fallacy”

Ignorance might also affect procrastination through what the social scientist Jon Elster calls “the planning fallacy.” Elster thinks that people underestimate the time “it will take them to complete a given task, partly because they fail to take account of how long it has taken them to complete similar projects in the past and partly because they rely on smooth scenarios in which accidents or unforeseen problems never occur.” (…)

A fuller explanation of procrastination really needs to take account of our attitudes to the tasks being avoided. A useful example can be found in the career of General George McClellan, who led the Army of the Potomac during the early years of the Civil War and was one of the greatest procrastinators of all time. When he took charge of the Union army, McClellan was considered a military genius, but he soon became famous for his chronic hesitancy. In 1862, despite an excellent opportunity to take Richmond from Robert E. Lee’s men, with another Union army attacking in a pincer move, he dillydallied, convinced that he was blocked by hordes of Confederate soldiers, and missed his chance. Later that year, both before and after Antietam, he delayed again, squandering a two-to-one advantage over Lee’s troops. Afterward, Union General-in-Chief Henry Halleck wrote, “There is an immobility here that exceeds all that any man can conceive of. It requires the lever of Archimedes to move this inert mass.”

McClellan’s “immobility” highlights several classic reasons we procrastinate. Although when he took over the Union army he told Lincoln “I can do it all,” he seems to have been unsure that he could do anything. He was perpetually imploring Lincoln for new weapons, and, in the words of one observer, “he felt he never had enough troops, well enough trained or equipped.” Lack of confidence, sometimes alternating with unrealistic dreams of heroic success, often leads to procrastination, and many studies suggest that procrastinators are self-handicappers: rather than risk failure, they prefer to create conditions that make success impossible, a reflex that of course creates a vicious cycle. McClellan was also given to excessive planning, as if only the ideal battle plan were worth acting on. Procrastinators often succumb to this sort of perfectionism.

“The divided self”

Viewed this way, procrastination starts to look less like a question of mere ignorance than like a complex mixture of weakness, ambition, and inner conflict. But some of the philosophers in “The Thief of Time” have a more radical explanation for the gap between what we want to do and what we end up doing: the person who makes plans and the person who fails to carry them out are not really the same person: they’re different parts of what the game theorist Thomas Schelling called “the divided self.” Schelling proposes that we think of ourselves not as unified selves but as different beings, jostling, contending, and bargaining for control. Ian McEwan evokes this state in his recent novel “Solar”: “At moments of important decision-making, the mind could be considered as a parliament, a debating chamber. Different factions contended, short- and long-term interests were entrenched in mutual loathing. Not only were motions tabled and opposed, certain proposals were aired in order to mask others. Sessions could be devious as well as stormy.” Similarly, Otto von Bismarck said, “Faust complained about having two souls in his breast, but I harbor a whole crowd of them and they quarrel. It is like being in a republic.” In that sense, the first step to dealing with procrastination isn’t admitting that you have a problem. It’s admitting that your “you”s have a problem.

If identity is a collection of competing selves, what does each of them represent? The easy answer is that one represents your short-term interests (having fun, putting off work, and so on), while another represents your long-term goals. But, if that’s the case, it’s not obvious how you’d ever get anything done: the short-term self, it seems, would always win out. The philosopher Don Ross offers a persuasive solution to the problem. For Ross, the various parts of the self are all present at once, constantly competing and bargaining with one another—one that wants to work, one that wants to watch television, and so on. The key, for Ross, is that although the television-watching self is interested only in watching TV, it’s interested in watching TV not just now but also in the future. This means that it can be bargained with: working now will let you watch more television down the road. Procrastination, in this reading, is the result of a bargaining process gone wrong.

"The extended will"

The idea of the divided self, though discomfiting to some, can be liberating in practical terms, because it encourages you to stop thinking about procrastination as something you can beat by just trying harder. Instead, we should rely on what Joseph Heath and Joel Anderson, in their essay in “The Thief of Time,” call “the extended will”—external tools and techniques to help the parts of our selves that want to work. A classic illustration of the extended will at work is Ulysses’ decision to have his men bind him to the mast of his ship. Ulysses knows that when he hears the Sirens he will be too weak to resist steering the ship onto the rocks in pursuit of them, so he has his men bind him, thereby forcing him to adhere to his long-term aims. Similarly, Thomas Schelling once said that he would be willing to pay extra in advance for a hotel room without a television in it. (…)

Kantian ethics

Not everyone in “The Thief of Time” approves of the reliance on the extended will. Mark D. White advances an idealist argument rooted in Kantian ethics: recognizing procrastination as a failure of will, we should seek to strengthen the will rather than relying on external controls that will allow it to atrophy further. This isn’t a completely fruitless task: much recent research suggests that will power is, in some ways, like a muscle and can be made stronger. The same research, though, also suggests that most of us have a limited amount of will power and that it’s easily exhausted. In one famous study, people who had been asked to restrain themselves from readily available temptation—in this case, a pile of chocolate-chip cookies that they weren’t allowed to touch—had a harder time persisting in a difficult task than people who were allowed to eat the cookies.

Dividing projects into smaller, more defined sections

Given this tendency, it makes sense that we often rely intuitively on external rules to help ourselves out. A few years ago, Dan Ariely, a psychologist at M.I.T., did a fascinating experiment examining one of the most basic external tools for dealing with procrastination: deadlines. (…)

Beyond self-binding, there are other ways to avoid dragging your feet, most of which depend on what psychologists might call reframing the task in front of you. Procrastination is driven, in part, by the gap between effort (which is required now) and reward (which you reap only in the future, if ever). So narrowing that gap, by whatever means necessary, helps. Since open-ended tasks with distant deadlines are much easier to postpone than focussed, short-term projects, dividing projects into smaller, more defined sections helps. That’s why David Allen, the author of the best-selling time-management book “Getting Things Done,” lays great emphasis on classification and definition: the vaguer the task, or the more abstract the thinking it requires, the less likely you are to finish it. One German study suggests that just getting people to think about concrete problems (like how to open a bank account) makes them better at finishing their work—even when it deals with a completely different subject. Another way of making procrastination less likely is to reduce the amount of choice we have: often when people are afraid of making the wrong choice they end up doing nothing. So companies might be better off offering their employees fewer investment choices in their 401(k) plans, and making signing up for the plan the default option.

It’s hard to ignore the fact that all these tools are at root about imposing limits and narrowing options—in other words, about a voluntary abnegation of freedom. (Victor Hugo would write naked and tell his valet to hide his clothes so that he’d be unable to go outside when he was supposed to be writing.) But before we rush to overcome procrastination we should consider whether it is sometimes an impulse we should heed. The philosopher Mark Kingwell puts it in existential terms: “Procrastination most often arises from a sense that there is too much to do, and hence no single aspect of the to-do worth doing. Underneath this rather antic form of action-as-inaction is the much more unsettling question whether anything is worth doing at all.” In that sense, it might be useful to think about two kinds of procrastination: the kind that is genuinely akratic and the kind that’s telling you that what you’re supposed to be doing has, deep down, no real point. The procrastinator’s challenge, and perhaps the philosopher’s, too, is to figure out which is which.”

James Surowiecki, American journalist, Later, The New Yorker, October 11, 2010. (Illustration Barry Blitt)

Eliezer Yudkowsky: Working hurts less than procrastinating, we fear the twinge of starting

"When you procrastinate, you’re probably not procrastinating because of the pain of working.

                

How do I know this? Because on a moment-to-moment basis, being in the middle of doing the work is usually less painful than being in the middle of procrastinating. (…) So what is our brain flinching away from, if not the pain of doing the work? I think it’s flinching away from the pain of the decision to do the work - the momentary, immediate pain of (1) disengaging yourself from the (probably very small) flow of reinforcement that you’re getting from reading a random unimportant Internet article, and (2) paying the energy cost for a prefrontal override to exert control of your own behavior and begin working.

Thanks to hyperbolic discounting (i.e., weighting values in inverse proportion to their temporal distance) the instant pain of disengaging from an Internet article and paying a prefrontal override cost, can outweigh the slightly more distant (minutes in the future, rather than seconds) pain of continuing to procrastinate, which is, once again, usually more painful than being in the middle of doing the work. (…)”

Eliezer Yudkowsky, American artificial intelligence researcher concerned with the Singularity and an advocate of Friendly Artificial Intelligence, Working hurts less than procrastinating, we fear the twinge of starting, Less Wrong, Jan 2, 2011

 

                                                   (Diagram source)
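The trade-off Yudkowsky describes can be made concrete with a toy calculation. The sketch below assumes the standard hyperbolic discount function V = A / (1 + kD), where a value A felt after a delay D is discounted at rate k; the function form matches the parenthetical definition in the quote, but all the specific numbers are illustrative, not taken from the text.

```python
def discounted_value(amount, delay, k=1.0):
    """Hyperbolic discounting: felt value falls off as 1 / (1 + k * delay).

    amount: undiscounted (dis)utility; delay: time until it is felt,
    in arbitrary units; k: discount rate (illustrative value).
    """
    return amount / (1.0 + k * delay)

# Illustrative numbers only: a small pain of disengaging, felt right now,
# versus a larger pain of continued procrastination felt minutes from now.
pain_of_starting = discounted_value(amount=2.0, delay=0.0)     # felt fully
pain_of_postponing = discounted_value(amount=5.0, delay=10.0)  # heavily discounted

# The immediate twinge looms larger than the objectively greater,
# but slightly delayed, cost of continuing to procrastinate.
print(pain_of_starting > pain_of_postponing)  # True
```

Under these assumptions the instant cost of starting (2.0, undiscounted) outweighs a procrastination cost more than twice as large (5.0, shrunk to about 0.45 by the delay), which is exactly the flinch the quote describes.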

See also:

☞ George Akerlof, Procrastination and Obedience, The American Economic Review
☞ D. Ariely, K. Wertenbroch, Procrastination, Deadlines, and Performance: Self-Control by Precommitment, MIT, Psychological science
☞ Piers Steel, The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure, Psychological Bulletin
☞ Dr. Bill Knaus, Break a Perfectionism and Procrastination Connection Now, Psychology Today, March 12, 2011.
☞ Procrastination, University of Cambridge Counselling Service
☞ Mark D. White, The Thief of Time: Philosophical Essays on Procrastination (table of contents and links)

Mar
21st
Mon
permalink

Jonah Lehrer on which traits predict success (the importance of grit)

                            

"The sweet spot: that productive, uncomfortable terrain located just beyond our current abilities, where our reach exceeds our grasp. Deep practice is not simply about struggling; it’s about seeking a particular struggle, which involves a cycle of distinct actions." — (Daniel Coyle, The Talent Code: Unlocking the Secret of Skill in Sports, Art, Music, Math, and Just About Everything Else, Bantam Books, 2009.)

"The intrinsic nature of talent is overrated – our genes don’t confer specific gifts. (There is, for instance, no PGA gene.) This has led many researchers, such as K. Anders Ericsson, to argue that talent is really about deliberate practice, about putting in those 10,000 hours of intense training (plus or minus a few thousand hours). Beethoven wasn’t born Beethoven – he had to work damn hard to become Beethoven. As Ericsson wrote in his influential review article “The Role of Deliberate Practice in the Acquisition of Expert Performance” (pdf): “The differences between expert performers and normal adults are not immutable, that is, due to genetically prescribed talent. Instead, these differences reflect a life-long period of deliberate effort to improve performance.” (…)

Talent takes effort. Talent requires a good coach. But these answers only raise more questions. What, for instance, allows someone to practice for so long? Why are some people so much better at deliberate practice? If talent is about hard work, then what factors influence how hard we can work?

And this leads me to one of my favorite recent papers, “Deliberate Practice Spells Success: Why Grittier Competitors Triumph at the National Spelling Bee.” (pdf) The research, published in the journal Social Psychological and Personality Science, was led by Angela Duckworth, a psychologist at Penn. (Anders Ericsson is the senior author.) The psychologists were interested in the set of traits that allowed kids to practice deliberately. Their data set consisted of 190 participants in the Scripps National Spelling Bee, a competition that requires thousands of hours of practice. After all, there are no natural-born spellers.

The first thing Duckworth et al. discovered is that deliberate practice works. Those kids who spent more time in deliberate practice mode – this involved studying and memorizing words while alone, often on note cards – performed much better at the competition than those children who were quizzed by others or engaged in leisure reading. The bad news is that deliberate practice isn’t fun and was consistently rated as the least enjoyable form of self-improvement. Nevertheless, as spellers gain experience, they devote increasing amounts of time to deliberate practice. This suggests that even twelve-year-olds realize that this is what makes them better, that success isn’t easy.

Why were some kids better at drilling themselves with note cards? What explained this variation in hours devoted to deliberate practice? After analyzing the data, Duckworth discovered the importance of a psychological trait known as grit. In previous papers, Duckworth has demonstrated that grit can be reliably measured with a short survey that measures consistency of passions (e.g., ‘‘I have been obsessed with a certain idea or project for a short time but later lost interest’’) and consistency of effort (e.g., ‘‘Setbacks don’t discourage me’’) over time using a 5-point scale. Not surprisingly, those with grit are more single-minded about their goals – they tend to get obsessed with certain activities – and also more likely to persist in the face of struggle and failure. Woody Allen famously declared that “Eighty percent of success is showing up.” Grit is what allows you to show up again and again. Here are the scientists:

Our major findings in this investigation are as follows: Deliberate practice—operationally defined in the current investigation as the solitary study of word spellings and origins—was a better predictor of National Spelling Bee performance than either being quizzed by others or engaging in leisure reading. With each year of additional preparation, spellers devoted an increasing proportion of their preparation time to deliberate practice, despite rating the experience of such activities as more effortful and less enjoyable than the alternative preparation activities. Grittier spellers engaged in deliberate practice more so than their less gritty counterparts, and hours of deliberate practice fully mediated the prospective association between grit and spelling performance. (…)
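The grit survey described above rates items on a 5-point scale and mixes positively worded items (“Setbacks don’t discourage me”) with negatively worded ones (“…but later lost interest”). A minimal scoring sketch follows; the reverse-coding of negatively worded items as 6 minus the rating is an assumption about how such Likert scales are conventionally scored, not a detail given in the text, and this is not Duckworth’s published instrument.

```python
def grit_score(responses):
    """Average 5-point Likert responses into a single grit score.

    responses: list of (rating, reverse) pairs, where rating is 1-5 and
    reverse marks negatively worded items, which are flipped as
    6 - rating so that a higher score always means more grit.
    Illustrative scoring only.
    """
    scored = [(6 - rating) if reverse else rating
              for rating, reverse in responses]
    return sum(scored) / len(scored)

# A hypothetical respondent: strongly agrees that setbacks don't
# discourage them (5, positively worded) and strongly disagrees that
# they lose interest in projects (1, negatively worded, flips to 5).
print(grit_score([(5, False), (1, True)]))  # 5.0
```

Averaging rather than summing keeps the score on the same 1-to-5 scale as the individual items, which makes scores comparable across surveys with different numbers of items.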

Success in the real world depends on sustained performance, on being able to work hard at practice, and spend the weekend studying the playbook, and reviewing hours of game tape. Those are all versions of deliberate practice, and our ability to engage in such useful exercises largely depends on levels of grit. The problem, of course, is that grit can’t be measured in a single afternoon on a single field. (By definition, it’s a metric of personality that involves long periods of time.) The end result is that our flawed beliefs about talent have led to flawed tests of talent. Perhaps that explains why there is no “consistent statistical relationship between combine tests and professional football performance.” We need a test that measures how likely people are to show up, not just how they perform once there.

The second takeaway involves the growing recognition of “non-cognitive” skills like grit and self-control. While such traits have little or nothing to do with intelligence (as measured by IQ scores), they often explain a larger share of individual variation when it comes to life success. It doesn’t matter if one is looking at retention rates at West Point or teacher performance within the Teach for America program or success in the spelling bee: Factors like grit are often the most predictive variables of real world performance. Thomas Edison was right: even genius is mostly just perspiration.

Taken together, these studies suggest that our most important talent is having a talent for working hard, for practicing even when practice isn’t fun. It’s about putting in the hours when we’d rather be watching TV, or drilling ourselves with notecards filled with obscure words instead of getting quizzed by a friend. Success is never easy. That’s why talent requires grit.”

Jonah Lehrer, Which Traits Predict Success? (The Importance of Grit), Wired Science, March 14, 2011. (Picture source)

See also: Malcolm Gladwell on success

Mar
16th
Wed
permalink

The power of lonely. What we do better without other people around

                    

"One ongoing Harvard study indicates that people form more lasting and accurate memories if they believe they’re experiencing something alone. Another indicates that a certain amount of solitude can make a person more capable of empathy towards others. (…)

Solitude has long been linked with creativity, spirituality, and intellectual might. The leaders of the world’s great religions — Jesus, Buddha, Mohammed, Moses — all had crucial revelations during periods of solitude. The poet James Russell Lowell identified solitude as “needful to the imagination;” in the 1988 book “Solitude: A Return to the Self,” the British psychiatrist Anthony Storr invoked Beethoven, Kafka, and Newton as examples of solitary genius. (…)

Sharing an experience with someone is inherently distracting, because it compels us to expend energy on imagining what the other person is going through and how they’re reacting to it.

“People tend to engage quite automatically with thinking about the minds of other people,” Burum said in an interview. “We’re multitasking when we’re with other people in a way that we’re not when we just have an experience by ourselves.” (…)

When we let our focus shift away from the people and things around us, we are better able to engage in what’s called meta-cognition, or the process of thinking critically and reflectively about our own thoughts. (…)

Spending a certain amount of time alone, the study suggests, can make us less closed off from others and more capable of empathy — in other words, better social animals.”

— Leon Neyfakh, The power of lonely, The Boston Globe, March 6, 2011

See also: William Deresiewicz on solitude, Lapidarium

Jul
3rd
Sat
permalink

Flynn effect


                                     Source: Alex Kurtagic, Intellectual Radioactivity

"The Flynn effect describes an increase in the average intelligence quotient (IQ) test scores over generations (IQ gains over time). Similar improvements have been reported for other cognitions such as semantic and episodic memory. The effect has been observed in most parts of the world at different rates.
The Flynn effect is named for James R. Flynn, a political scientist from New Zealand, who put forth this observation, although it was substantiated by various other psychologists and academicians. The term itself was coined by the authors of The Bell Curve." — (Wikipedia)

"This effect has been observed across cultures, although to varying degrees. You may have come across people who say—or noticed yourself—that nowadays children are more intelligent, or quicker to absorb a new concept. (…)

Each succeeding generation grows up with far more stimulation for the abstract mind, and hence a better interpretative ability to assimilate new ideas. This demands a great deal of thinking and reasoning from an average human brain. A simple example is scientific advancement, which has undergone a sea change. A person now in his 40s had limited access in childhood to technological inventions, the web, or mobile communication. In stark contrast, consider his son, born in the 1990s, who is quite adept and comfortable with these advancements. Even though he uses these technologies unreflectively (his brain comprehends more facts than his father’s did at his age), the average effort his brain puts in to understand a particular system is higher than his father’s. This can be due to a variety of reasons: better nutrition, large-scale exposure to many concepts at a relatively tender age, interactive media, and so on.” — (Prashant Magar, What is the Flynn Effect, Buzzle)
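The generational gains described above have a practical consequence for test norms. The sketch below uses Flynn’s widely cited figure of roughly three IQ points of gain per decade; that rate is an assumption brought in here for illustration (it varies by country, era, and test), not a number stated in the excerpts.

```python
# Back-of-the-envelope sketch of the Flynn effect on test norms,
# assuming the commonly cited gain of ~3 IQ points per decade
# (illustrative; the real rate varies by country, era, and test).
GAIN_PER_DECADE = 3.0

def iq_against_old_norms(current_iq, norm_age_decades,
                         gain_per_decade=GAIN_PER_DECADE):
    """IQ a person would score on a test normed norm_age_decades ago.

    Because the whole population has gained ground since the test was
    normed, the same raw performance looks better against older norms.
    """
    return current_iq + gain_per_decade * norm_age_decades

# Someone of exactly average ability today (IQ 100), scored against
# norms set 30 years (3 decades) ago:
print(iq_against_old_norms(100, 3))  # 109.0
```

This is why IQ tests are periodically re-normed: without it, scores would quietly inflate as the norms aged.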

                                          Possible projection of the Flynn effect through the ages

Curve of IQ evolution over history: a correlation of the Flynn effect and the observations of Dickens and Spensdale versus the limit proposed by Jensen; how intelligence has grown through the ages, and a possible current limit.

(…) By reverse-engineering the pattern of improvement in IQ tests, you can tell how mental priorities have changed over the century. It turns out that we, far more than our recent ancestors, take seriously the ability to find abstract similarities between objects. And we are better at applying logic to finding abstract patterns, as in Raven’s Progressive Matrices.

"At that point I began to get excited", says Flynn, "because I began to feel that I was bridging the gulf between our minds and the minds of our ancestors. We weren’t more intelligent than they, but we had learnt to apply our intelligence to a new set of problems. We had detached logic from the concrete, we were willing to deal with the hypothetical, and we thought the world was a place to be classified and understood scientifically rather than to be manipulated."

Flynn cites his own father, who was 50 when he was born in 1934: “He was a highly intelligent man. But he had only an intermediate school education; he and all his brothers had gone into factory work between the ages of 12 and 14. I don’t think he would have taken a ‘matrices’ problem seriously. He wouldn’t have had any practice in his everyday life at finding logical patterns in abstract shapes; he could do logic all right, but it was mainly applied to concrete situations.”

(…) "I reject the idea that either these are intelligence gains or else they’re insignificant," says Flynn. "They’re not in any simple sense intelligence gains, but they are still highly significant." His father was also rather unwilling to waste his time on profitless speculation. "I remember frustrating occasions when it was natural for me to take hypothetical situations seriously and he thought of this as silly. We might argue about race, and I would say: ‘What would you think if your skin turned black?’ And his response would be: ‘Who has ever heard of such a thing?’ Most moral argument cannot get off the ground unless you take the hypothetical seriously."

(…) this story makes me think of my own 18-month-old son, who, when I took him to his room the night before our conversation, had turned the light switch on and off, again and again, until finally, to his disgust, I had pulled him away to put him in his cot. “Exactly! Merely being surrounded by mechanical contrivances prepares you to have a different mindset. They are artificial causal networks. That in itself helps to free the mind.”

There is still the puzzle of how environmental differences can be so weak when we compare individuals born at the same time, but so strong over time. The key, which Flynn attributes to fruitful discussions with his collaborator, William Dickens, an economist at the Brookings Institution in Flynn’s home town of Washington, DC, lies in the observation that superior genes cause superior performance by co-opting superior environments.” — (The world is getting smarter, More Intelligent Life, The Economist, November 27th 2007)

According to Flynn, the environment will always be the principal determinant of whether or not a particular genetic predisposition gets to be fully expressed. “There is a strong tendency for a genetic advantage or disadvantage to get more and more matched to a corresponding environment.” — (EzraKlein Archive)

”(…) There is one way an individual can walk a personal path to enhanced cognitive skills. He or she must internalize the goal of seeking challenging cognitive environments — seeking intellectual challenges all the way from choosing the right leisure activities to wanting to marry someone who is intellectually stimulating. Better off still are those who develop a certain kind of character formation — a character such that I carry about within myself a stimulating mental environment I myself create. Then I would be relatively free of needing good luck to enjoy a cognitively enriched environment throughout life. I would have instant access to a portable gymnasium that exercises the mind. Books and ideas and conversation are easier to transport than a basketball court. No one can keep me from using mental arithmetic so habitually that my arithmetical skills survive.” James R. Flynn, Rethinking intelligence and what affects it (doc)

See also: James R. Flynn, The Flynn Effect: Rethinking intelligence and what affects it

Merrill Hiscock, The Flynn effect and its relevance to neuropsychology (pdf), University of Houston, Houston, TX, USA

Jun
1st
Tue
permalink

Edward Carr on the last days of the polymath

      

"A polymath (Greek πολυμαθής, polymathēs, “having learned much”) is a person whose expertise spans a significant number of different subject areas. In less formal terms, a polymath (or polymathic person) may simply be someone who is very knowledgeable. Most ancient scientists were polymaths by today’s standards.

The terms Renaissance man and, less commonly, Homo Universalis (Latin for “universal man” or “man of the world”) are related and used to describe a person who is well educated or who excels in a wide variety of subjects or fields. The idea developed in Renaissance Italy from the notion expressed by one of its most accomplished representatives, Leon Battista Alberti (1404–1472): that “a man can do all things if he will.” It embodied the basic tenets of Renaissance humanism, which considered humans empowered, limitless in their capacities for development, and led to the notion that people should embrace all knowledge and develop their capacities as fully as possible. Thus the gifted people of the Renaissance sought to develop skills in all areas of knowledge, in physical development, in social accomplishments, and in the arts.” (Wiki)

One moment, of the Well of Life to taste—
The Stars are setting, and the Caravan
Starts for the dawn of Nothing—Oh, make haste!

The Persian sage’s carpe diem, The Rubaiyat of Omar Khayyam, (translation by Edward FitzGerald)

"When Thomas Young was alive the world contained about a billion people. Few of them were literate and fewer still had the chance to experiment on the nature of light or to examine the Rosetta stone. Today the planet teems with 6.7 billion minds. Never have so many been taught to read and write and think, and then been free to choose what they would do with their lives. The electronic age has broken the shackles of knowledge. Never has it been easier to find something out, or to get someone to explain it to you. 

Yet as human learning has flowered, the man or woman who does great things in many fields has become a rare species. Young was hardly Aristotle, but his capacity to do important work in such a range of fields startled his contemporaries and today seems quite bewildering. The dead cast a large shadow but, even allowing for that, the 21st century has no one to match Michelangelo, who was a poet as well as a sculptor, an architect and a painter. It has no Alexander von Humboldt, who towered over early-19th-century geography and science. And no Leibniz, who invented calculus at the same time as Newton and also wrote on technology, philosophy, biology, politics and just about everything else. 

Although you may be able to think of a few living polymaths who rival the breadth of Young’s knowledge, not one of them begins to rival the breadth of his achievements. Over the past 200 years the nature of intellectual endeavour has changed profoundly. The polymaths of old were one-brain universities. These days you count as a polymath if you excel at one thing and go on to write a decent book about another. (…) 

[Today] so many scientists are publishing research in each specialism that merely to keep up with the reading is a full-time job. “The frontier of knowledge is getting longer,” says Professor Martin Rees, the president of the Royal Society, where Young was a leading light for over three decades. “It is impossible now for anyone to focus on more than one part at a time.”

Specialisation is hard on polymaths. Every moment devoted to one area is a moment less to give over to something else. Researchers are focused on narrower areas of work. In the sciences this means that you often need to put together a team to do anything useful. Most scientific papers have more than one author; papers in some disciplines have 20 or 30. Only a fool sets out to cure cancer, Rees says. You need to concentrate on some detail—while remembering the big question you are ultimately trying to answer. “These days”, he says, “no scientist makes a unique contribution.”

It is not only the explosion of knowledge that puts polymaths at a disadvantage, but also the vast increase in the number of specialists and experts in every field. This is because the learning that creates would-be polymaths creates monomaths too and in overwhelming numbers. If you have a multitude who give their lives to a specialism, their combined knowledge will drown out even a gifted generalist. And while the polymath tries to take possession of a second expertise in some distant discipline, his or her first expertise is being colonised by someone else. (…)

Just knowing about a lot of things has never been easier. Never before have dabblers been so free to paddle along the shore and dip into the first rock pool that catches the eye. If you have an urge to take off your shoes and test the water, countless specialists are ready to hold your hand.

And yet you will never get very deep. Depth is for monomaths—which is why experts so often seem to miss what really matters. Specialisation has made the study of English so sterile that students lose much of the joy in reading great literature for its own sake. A generation of mathematically inclined economists neglected many of Keynes’s insights about the Depression because he put them into words. For decades economists sweated over fiendish mathematical equations, only to be brought down to earth by the credit crunch: Keynes’s well-turned phrases had come back to life. (…)

Polymaths were the product of a particular time, when great learning was a mark of distinction and few people had money and leisure. Their moment has passed, like great houses or the horse-drawn carriage. The world may well be a better place for the specialisation that has come along instead. The pity is that progress has to come at a price. Civilisation has put up fences that people can no longer leap across; a certain type of mind is worth less. The choices modern life imposes are duller, more cramped.

The question is whether their loss has affected the course of human thought. Polymaths possess something that monomaths do not. Time and again, innovations come from a fresh eye or from another discipline. Most scientists devote their careers to solving the everyday problems in their specialism. Everyone knows what they are and it takes ingenuity and perseverance to crack them. But breakthroughs—the sort of idea that opens up whole sets of new problems—often come from other fields. The work in the early 20th century that showed how nerves work and, later, how DNA is structured originally came from a marriage of physics and biology. Today, Einstein’s old employer, the Institute for Advanced Study at Princeton, is laid out especially so that different disciplines rub shoulders. I suspect that it is a poor substitute.

Isaiah Berlin once divided thinkers into two types. Foxes, he wrote, know many things; whereas hedgehogs know one big thing. The foxes used to roam free across the hills. Today the hedgehogs rule.”  

—  Edward Carr, The last days of the polymath, More Intelligent Life Magazine, The Economist, Autumn 2009, (Picture Credit: MASA/Breed London)