Lapidarium notes

Amira Skomorowska's notes

"Everything you can imagine is real."— Pablo Picasso





Feb 20th, Wed

Albert Bandura on social learning, the origins of morality, and the impact of technological change on human nature


"Technology has changed the speed and the scope of social influence and has really transformed our realities. Social cognitive theory is very compatible with that. Other learning theories were linked to learning by direct experience, but when I look around today, I see that most of our learning is by social modeling and through indirect experiences. Errors can be very costly and you can’t afford to develop our values, our competences, our political systems, our religious systems through trial and error. Modeling shortcuts this process. (…)

With new technologies, we’re essentially transcending our physical environment and more and more of our values and attitudes and behavior are now shaped in the symbolic environment – the symbolic environment is the big one rather than the actual one. The changes are so rapid that there are more and more areas of life now in which the cyber world is really essential. One model can affect millions of people worldwide, it can shape their experiences and behaviors. We don’t have to rely on trial and error.

There’s a new challenge now: When I was growing up, we didn’t have all this technology, so we were heavily involved in personal relationships. Now the cyber world is available, and it’s hard to maintain a balance in the priorities of life. (…)

The internet can provide you with fantastic globalized information – but the problem is this: It undermines our capacity for self-regulation or self-management. The first way to undermine productivity is temporizing, namely putting off what we need to do until tomorrow, when we have the illusion that we’ll have more time. So we’re dragging the stuff with us. But the really big way is detouring, and wireless devices now give us an infinite detour. They create the illusion of busyness. I talked to the author of a bestseller and I asked him about his writing style. He said: ‘Well, I have to check my e-mails and then I get down to serious writing, but then I get back to the e-mails.’ The challenge of the cyber world is establishing a balance between our digital life and life in the real world. (…)

The origins of morality

Originally our behavior was pretty much shaped by control, by the external consequences of our lives. So the question is: How did we acquire some standards? There are about three or four ways. One: We evaluate reactions to our behavior. We behave in certain ways, in good ways, in bad ways, and then we receive feedback. We begin to adopt standards from how the social environment reacts to our behavior. Two: We see others behaving in certain ways and we are either self-critical or self-approving. Three: We have precepts that tell us what is good and bad. And once we have certain self-sanctions, we have two other potent factors that can influence our behavior: People will behave in certain ways because they want to avoid legal sanctions to their behavior or the social sanctions in their environment. (…)

Many of our theories of morality are abstract. But the primary concern about the acquisition of morality and about the modes of moral reasoning is only one half of the story, the less interesting half. We adopt standards, but we have about eight mechanisms by which we selectively disengage from those standards. So the challenge to explain is not why do people behave in accordance with these standards, but how is it that people can behave cruelly and still feel good about themselves. Our problem is good people doing bad things – and not evil people doing bad things. (…)

Everyday people can behave very badly. In the book I’m writing on that topic I have a long chapter on moral disengagement in the media, in the gun industry, in the tobacco industry, in the corporate world, in the finance industry – there’s fantastic data from the last few years – in terrorism and as an impediment to environmental sustainability. That’s probably the most important area of moral disengagement. We have about forty or fifty years, and if we don’t get our act together, we’ll have a very hard time. It’s going to be awfully crowded on earth and a good part of our cities will be under water. And what are we doing? We don’t have the luxury of time anymore. (…)

Human nature is capable of vindicating behavior. It isn’t that people are bad by nature. But they have a very playful and rewarding lifestyle, filled with gadgets and air conditioning, and they don’t want to give it up. (…)

Q: ‘The story of men is a story about violence, love, power, victory and defeat’ – that’s how poets talk about the course of history. But from an analytic point of view…

A. Bandura: That’s not true for all societies. We assume that aggression is inbred, but some societies are remarkably pacifistic. And we can also see large variations within a society. But the most striking example might be the transformation of warrior societies into peaceful societies. Switzerland is one example. Sweden is another: Those Vikings were out mugging everyone and people would pray for protection: “Save our souls from the fury of the Norsemen!” And now, if you look at that society, it’s hard to find child abuse or domestic violence. Sweden has become a mediator of peace.

Q: In German, there’s the term “Schicksalsgemeinschaft,” which translates as “community of fate”: It posits that a nation is bound together by history. Do you think that’s what defines a society: A common history? Or is it religion, or the language we speak?

A. Bandura: All of the above. We put a lot of emphasis on biological evolution, but what we don’t emphasize is that cultures evolve, too. These changes are transmitted from one generation to another. A few decades ago, the role of women was to be housewives and it was considered sinful to co-habit without being married. If you look at the role of women today, there’s a fantastic transformation in a short period of time; change is accelerated. Homogenization is important, picking things from different cultures, cuisines, music traditions, forms of behavior, and so on. But we have also polarization: Bin Laden’s hate of the West, for example. And there’s hybridization as well. (…)

And society is changing, too. Now it’s considered completely normal to live with your partner without being married. In California, it was only about 40 years ago that homosexuality was treated as a disease. Then people protested, and eventually they got the state to change the diagnostic category to sexual orientation rather than a disease. Psychiatry, under public pressure, changed the diagnostic system. (…)

Q: It’s quite interesting to compare Russia and China. Russia has a free internet, so the reaction to protests is very different than in China. If social networks become increasingly global, do you foresee something like a global set of values as well?

A. Bandura: Yes, but there is another factor here, namely the tremendous power of multinational corporations. They now shape global culture. A lot of these global forces are undermining the collective African society, for example. Society no longer has much control over the economy. In order to restore some power and leverage, societies are going to organize into unions. We will see more partnerships around the world. (…)

The revolutionary tendency of technology has increased our sense of agency. If I have access to all global knowledge, I have fantastic capacities to educate myself. (…) The important thing in psychology is that we need a theory of human agency, rather than arguing that we’re controlled by neural networks. In every aspect of our lives we now have a greater capacity for exercising agency. (…)

Q: But at the same time globalization removes us from the forces that shape our environment.

A. Bandura: The problems are powerful transnational forces. They can undermine the capacity to run our own society: Because of what happens in Iran, gas prices might soon hit five dollars per gallon in the US. That’s where the pressure comes from for systems and societies to form blocs or build up leverage to protect the quality of life of their citizens. But we can see that a global culture is emerging. One example is the transformation of the status of women. Oppressive regimes see that women are able to drive cars, and they cannot continue to deny that right to them. We’re really changing norms. Thanks to the ubiquity of television, we’re motivating them and showing them that they have the capability to initiate change. It’s about agency: Change is deeply rooted in the belief that my actions can have an effect in the world.”

Albert Bandura, a psychologist and the David Starr Jordan Professor Emeritus of Social Science in Psychology at Stanford University. For almost six decades he has contributed to many fields of psychology, including social cognitive theory, therapy, and personality psychology, and was influential in the transition from behaviorism to cognitive psychology. “We have transcended our biology”, The European, Feb 18, 2013. (Photo: Linda A. Cicero / Stanford News Service)

See also:

‘Human beings are learning machines,’ says philosopher (nature vs. nurture), Lapidarium notes
What Neuroscience Tells Us About Morality: ‘Morality is a form of decision-making, and is based on emotions, not logic’

Jan 21st, Sat

'Human beings are learning machines,' says philosopher (nature vs. nurture)


"The point is that in scientific writing (…) suggest a very inflexible view of human nature, that we are determined by our biology. From my perspective the most interesting thing about the human species is our plasticity, our flexibility. (…)

It is striking in general that human beings mistake the cultural for the natural; you see it in many domains. Take moral values. We assume we have moral instincts: we just know that certain things are right and certain things are wrong. When we encounter people whose values differ from ours we think they must be corrupted or in some sense morally deformed. But this is clearly an instance where we mistake our deeply inculcated preferences for natural law. (…)

Q: At what point with morality does biology stop and culture begin?

One important innate contribution to morality is emotions. An aggressive response to an attack is not learned, it is biological. The question is how emotions that are designed to protect each of us as individuals get extended into generalised rules that spread within a group. One factor may be imitation. Human beings are great imitative learners. Rules that spread in a family can be calibrated across a whole village, leading to conformity in the group and a genuine system of morality.

Nativists will say that morality can emerge without instruction. But with innate domains, there isn’t much need for instruction, whereas in the moral domain, instruction is extensive. Kids learn through incessant correction. Between the ages of 2 and 10, parents correct their children’s behaviour every 8 minutes or so of waking life. In due course, our little monsters become little angels, more or less. This gives us reason to think morality is learned.

Q: One of the strongest arguments for innateness comes from linguists such as Noam Chomsky, who argue that humans are born with the basic rules of grammar already in place. But you disagree with them?

Chomsky singularly deserves credit for giving rise to the new cognitive sciences of the mind. He was instrumental in helping us think about the mind as a kind of machine. He has made some very compelling arguments to explain why everybody with an intact brain speaks grammatically even though children are not explicitly taught the rules of grammar.

But over the past 10 years we have started to see powerful evidence that children might learn language statistically, by unconsciously tabulating patterns in the sentences they hear and using these to generalise to new cases. Children might learn language effortlessly not because they possess innate grammatical rules, but because statistical learning is something we all do incessantly and automatically. The brain is designed to pick up on patterns of all kinds.
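
The statistical-learning story Prinz sketches can be made concrete. In Saffran-style infant experiments, learners appear to segment continuous speech by tracking transitional probabilities between syllables: high inside words, low at word boundaries. Below is a minimal Python sketch of that idea; the syllable stream and the 0.9 boundary threshold are invented for illustration, not taken from any particular study.

    from collections import Counter

    # Toy unsegmented stream built from three nonsense "words"
    # (golabu, tupiro, bidaku), echoing Saffran-style infant studies.
    stream = ("golabu tupiro bidaku tupiro golabu bidaku "
              "bidaku golabu tupiro golabu bidaku tupiro").replace(" ", "")
    syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

    pair_counts = Counter(zip(syllables, syllables[1:]))
    syll_counts = Counter(syllables[:-1])

    def tp(x, y):
        """Transitional probability TP(x -> y) = count(x, y) / count(x)."""
        return pair_counts[(x, y)] / syll_counts[x]

    # Within-word transitions are consistent (TP = 1 here); between-word
    # transitions vary (TP well below 1). Posit a boundary at each dip.
    THRESHOLD = 0.9  # invented for illustration
    words, current = [], [syllables[0]]
    for x, y in zip(syllables, syllables[1:]):
        if tp(x, y) < THRESHOLD:
            words.append("".join(current))
            current = []
        current.append(y)
    words.append("".join(current))
    print(words)  # segments the stream back into golabu/tupiro/bidaku tokens

No grammar is supplied anywhere: the segmentation falls out of pattern statistics alone, which is exactly the alternative to innate rules that Prinz describes.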

Q: How hard has it been to put this alternative view on the table, given how Chomskyan thought has dominated the debate in recent years?

Chomsky’s views about language are so deeply ingrained among academics that those who take statistical learning seriously are subject to a kind of ridicule. There is very little tolerance for dissent. This has been somewhat limiting, but there is a new generation of linguists who are taking the alternative very seriously, and it will probably become a very dominant position in the next generation.

Q: You describe yourself as an “unabashed empiricist” who favours nurture over nature. How did you come to this position, given that on many issues the evidence is still not definitive either way?

Actually I think the debate has been settled. You only have to stroll down the street to see that human beings are learning machines. Sure, for any given capacity the debate over biology versus culture will take time to resolve. But if you compare us with other species, our degree of variation is just so extraordinary and so obvious that we know prior to doing any science that human beings are special in this regard, and that a tremendous amount of what we do is a result of learning. So empiricism should be the default position. The rest is just working out the details of how all this learning takes place.

Q: What are the implications of an empirical understanding of human nature for the way we go about our lives? How should it affect the way we behave?

In general, we need to cultivate a respect for difference. We need to appreciate that people with different values to us are not simply evil or ignorant, and that just like us they are products of socialisation. This should lead to an increase in international understanding and respect. We also need to understand that group differences in performance are not necessarily biologically fixed. For example, when we see women performing less well than men in mathematics, we should not assume that this is because of a difference in biology.

Q: How much has cognitive science contributed to our understanding of what it is to be human, traditionally a philosophical question?

Cognitive science is in the business of settling long-running philosophical debates on human nature, innate knowledge and other issues. The fact that these theories have been churning about for a couple of millennia without any consensus is evidence that philosophical methods are better at posing questions than answering them. Philosophy tells us what is possible, and science tells us what is true.

Cognitive science has transformed philosophy. At the beginning of the 20th century, philosophers changed their methodology quite dramatically by adopting logic. There has been an equally important revolution in 21st-century philosophy in that philosophers are turning to the empirical sciences and to some extent conducting experimental work themselves to settle old questions. As a philosopher, I hardly go a week without conducting an experiment.

My whole working day has changed because of the infusion of science.”

Jesse Prinz is a distinguished professor of philosophy at the City University of New York, specialising in the philosophy of psychology. He is a pioneer in experimental philosophy, using findings from the cognitive sciences, anthropology and other fields to develop empiricist theories of how the mind works. He is the author of The Emotional Construction of Morals (Oxford University Press, 2007), Gut Reactions (OUP, 2004), Furnishing the Mind (MIT Press, 2002), and Beyond Human Nature: How Culture and Experience Make Us Who We Are. ‘Human beings are learning machines,’ says philosopher, New Scientist, Jan 20, 2012. (Illustration: Fritz Kahn, British Library)

See also:

Jesse Prinz: Morality is a Culturally Conditioned Response
Human Nature. Sapolsky, Maté, Wilkinson, Gilligan, discuss on human behavior and the nature vs. nurture debate

Apr 28th, Thu

Isaac Asimov predicted the Internet of today more than 20 years ago (1988)



Bill Moyers: Can we have a revolution in learning?

Isaac Asimov: “Yes, I think not only that we can but that we must. As computers take over more and more of the work that human beings shouldn’t be doing in the first place - because it doesn’t utilize their brains, it stultifies and bores them to death - there’s going to be nothing left for human beings to do but the more creative types of endeavor. The only way we can indulge in the more creative types of endeavor is to have brains that aim at that from the start. (…)

In the old days, very few people could read and write. Literacy was a very novel sort of thing, and it was felt that most people just didn’t have it in them. But with mass education, it turned out that most people could be taught to read and write. In the same way, once we have computer outlets in every home, each of them hooked up to enormous libraries, where you can ask any question and be given answers, you can look up something you’re interested in knowing, however silly it might seem to someone else.

Today, what people call learning is forced on you. Everyone is forced to learn the same thing on the same day at the same speed in class. But everyone is different. For some, class goes too fast, for some too slow, for some in the wrong direction. But give everyone a chance, in addition to school, to follow up their own bent from the start, to find out about whatever they’re interested in by looking it up in their own homes, at their own speed, in their own time, and everyone will enjoy learning.

BM: What about the argument that machines, like computers, dehumanize learning?

IA: As a matter of fact, it’s just the reverse. It’s through this machine that for the first time, we’ll be able to have a one-to-one relationship between information source and information consumer. In the old days, you used to hire a tutor or pedagogue to teach your children. And if he knew his job, he could adapt his teaching to the tastes and abilities of the students. But how many people could afford to hire a pedagogue? Most children went uneducated. Then we reached the point where it was absolutely necessary to educate everybody. The only way we could do it was to have one teacher for a great many students, and to give the teacher a curriculum to teach from. But how many teachers are good at this? As with everything else, the number of teachers is far greater than the number of good teachers. So we either have a one-to-one relationship for the very few, or a one-to-many for the many. Now, with the computer, it’s possible to have a one-to-one relationship for the many. Everyone can have a teacher in the form of access to the gathered knowledge of the human species. (…)

BM: What would such a teaching machine look like?

IA: I find that difficult to imagine. It’s easy to be theoretical, but when you really try to think of the nuts and bolts, then it becomes difficult. I could easily have imagined a horseless carriage in the middle of the nineteenth century, but I couldn’t have drawn a picture of it. But I suppose that one essential thing would be a screen on which you could display things, and another essential part would be a printing mechanism on which things could be printed for you. And you’ll have to have a keyboard on which you ask your questions, although ideally I would like to see one that could be activated by voice. You could actually talk to it, and perhaps it could talk to you too, and say, “I have something here that may interest you. Would you like to have me print it out for you?” And you’d say, “Well, what is it exactly?” And it would tell you, and you might say, “Oh, all right, I’ll take a look at it.” (…)

BM: But the machine would have to be connected to books, periodicals, and documents in some vast library, so then when I want to look at Isaac Asimov’s new book Far as Human Eye Could See, the chapter on geochemistry, I could punch my keys and this chapter would come to me.

IA: That’s right, and then of course you ask - and believe me, I’ve asked - this question: “How do you arrange to pay the author for the use of the material?” After all, if a person writes something, and this then becomes available to everybody, you deprive him of the economic reason for writing. A person like myself, if he was assured of a livelihood, might write anyway, just because he enjoyed it, but most people would want to do it in return for something. I imagine how they must have felt when free libraries were first instituted. “What? My book in a free library? Anyone can come in and read it for free?” Then you realize that there are some books that wouldn’t be sold at all if you didn’t have libraries.

BM: With computers, in a sense, every student has his or her own private school.

IA: Yes, he can be the sole dictator of what he is going to study. Mind you, this is not all he’s going to do. He’ll still be going to school for some things that he has to know.

BM: Common knowledge for example.

IA: Right, and interaction with other students and with teachers. He can’t get away from that, but he’s got to look forward to the fun in life, which is following his own bent.

BM: Is this revolution in personal learning just for the young?

IA: No, it’s not just for the young. That’s another trouble with education as we now have it. People think of education as something that they can finish. And what’s more, when they finish, it’s a rite of passage. You’re finished with school. You’re no more a child, and therefore anything that reminds you of school - reading books, having ideas, asking questions - that’s kid’s stuff. Now you’re an adult, you don’t do that sort of thing any more. (…)

You have everybody looking forward to no longer learning, and you make them ashamed afterward of going back to learning. If you have a system of education using computers, then anyone, any age, can learn by himself, can continue to be interested. If you enjoy learning, there’s no reason why you should stop at a given age. People don’t stop things they enjoy doing just because they reach a certain age. They don’t stop playing tennis just because they’ve turned forty. They don’t stop with sex just because they’ve turned forty. They keep it up as long as they can if they enjoy it, and learning will be the same thing. The trouble with learning is that most people don’t enjoy it because of the circumstances. Make it possible for them to enjoy learning, and they’ll keep it up.

There’s the famous story about Oliver Wendell Holmes, who was in the hospital one time, when he was over ninety. President Roosevelt came to see him, and there was Oliver Wendell Holmes reading the Greek grammar. Roosevelt said, “Why are you reading a Greek grammar, Mr. Holmes?” And Holmes said, “To improve my mind, Mr. President.”

BM: Are we romanticizing this, or do you think that Saul Bellow's character Herzog was correct when he said that the people who come to evening classes are only ostensibly after culture. What they're really seeking, he said, is clarity, good sense, and truth, even an atom of it. People, he said, are dying for the lack of something real at the end of the day.

IA: I’d like to think that was so. I’d like to think that people who are given a chance to learn facts and broaden their knowledge of the universe wouldn’t seek so avidly after mysticism.”

Isaac Asimov (1920-1992), American author and professor of biochemistry at Boston University, best known for his works of science fiction and for his popular science books. Interviewed by Bill Moyers, World of Ideas, PBS, 1988.

Apr 14th, Thu

Language and Your Brain (infographic)



"For decades, research into the brain basis of language was limited to the study of the effects of neurological disease and brain lesions on human language processing and production. Nowadays, however, new techniques are allowing researchers to create a picture of a normal brain at work processing language - helping to shed light on the mysteries of language and the brain."

Language and Your Brain, Voxy Blog, Apr 5, 2011.

Mar 21st, Mon

Jonah Lehrer on which traits predict success (the importance of grit)


"The sweet spot: that productive, uncomfortable terrain located just beyond our current abilities, where our reach exceeds our grasp. Deep practice is not simply about struggling; it’s about seeking a particular struggle, which involves a cycle of distinct actions." — (Daniel Coyle, The Talent Code: Unlocking the Secret of Skill in Sports, Art, Music, Math, and Just About Everything Else, Bantam Books, 2009.)

"The intrinsic nature of talent is overrated – our genes don’t confer specific gifts. (There is, for instance, no PGA gene.) This has led many researchers, such as K. Anders Ericsson, to argue that talent is really about deliberate practice, about putting in those 10,000 hours of intense training (plus or minus a few thousand hours). Beethoven wasn’t born Beethoven – he had to work damn hard to become Beethoven. As Ericsson wrote in his influential review article “The Role of Deliberate Practice in the Acquisition of Expert Performance” (pdf): “The differences between expert performers and normal adults are not immutable, that is, due to genetically prescribed talent. Instead, these differences reflect a life-long period of deliberate effort to improve performance.” (…)

Talent takes effort. Talent requires a good coach. But these answers only raise more questions. What, for instance, allows someone to practice for so long? Why are some people so much better at deliberate practice? If talent is about hard work, then what factors influence how hard we can work?

And this leads me to one of my favorite recent papers, “Deliberate Practice Spells Success: Why Grittier Competitors Triumph at the National Spelling Bee” (pdf). The research, published in the journal Social Psychological and Personality Science, was led by Angela Duckworth, a psychologist at Penn. (Anders Ericsson is the senior author.) The psychologists were interested in the set of traits that allowed kids to practice deliberately. Their data set consisted of 190 participants in the Scripps National Spelling Bee, a competition that requires thousands of hours of practice. After all, there are no natural born spellers.

The first thing Duckworth et al. discovered is that deliberate practice works. Those kids who spent more time in deliberate practice mode – this involved studying and memorizing words while alone, often on note cards – performed much better at the competition than those children who were quizzed by others or engaged in leisure reading. The bad news is that deliberate practice isn’t fun and was consistently rated as the least enjoyable form of self-improvement. Nevertheless, as spellers gain experience, they devote increasing amounts of time to deliberate practice. This suggests that even twelve-year-olds realize that this is what makes them better, that success isn’t easy.

Why were some kids better at drilling themselves with note cards? What explained this variation in hours devoted to deliberate practice? After analyzing the data, Duckworth discovered the importance of a psychological trait known as grit. In previous papers, Duckworth has demonstrated that grit can be reliably measured with a short survey that measures consistency of passions (e.g., ‘‘I have been obsessed with a certain idea or project for a short time but later lost interest’’) and consistency of effort (e.g., ‘‘Setbacks don’t discourage me’’) over time, using a 5-point scale. Not surprisingly, those with grit are more single-minded about their goals – they tend to get obsessed with certain activities – and also more likely to persist in the face of struggle and failure. Woody Allen famously declared that “Eighty percent of success is showing up.” Grit is what allows you to show up again and again. Here are the scientists:

Our major findings in this investigation are as follows: Deliberate practice—operationally defined in the current investigation as the solitary study of word spellings and origins—was a better predictor of National Spelling Bee performance than either being quizzed by others or engaging in leisure reading. With each year of additional preparation, spellers devoted an increasing proportion of their preparation time to deliberate practice, despite rating the experience of such activities as more effortful and less enjoyable than the alternative preparation activities. Grittier spellers engaged in deliberate practice more so than their less gritty counterparts, and hours of deliberate practice fully mediated the prospective association between grit and spelling performance. (…)
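
The claim that practice hours “fully mediated” the grit-performance association has a precise statistical meaning, which a small simulation can unpack: grit predicts performance on its own, but once practice is added to the regression, grit’s own coefficient drops to roughly zero, because its influence flows entirely through practice. A minimal Python sketch with invented data and effect sizes (not the paper’s):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 190  # same n as the spelling-bee sample, purely cosmetic

    grit = rng.normal(size=n)
    practice = 0.8 * grit + rng.normal(size=n)          # grit -> practice
    performance = 0.7 * practice + rng.normal(size=n)   # practice -> performance
    # Note: grit affects performance only THROUGH practice (full mediation).

    def ols(y, *xs):
        """Least-squares coefficients, intercept first."""
        X = np.column_stack([np.ones_like(y), *xs])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta

    # Total effect: performance ~ grit (coefficient near 0.8 * 0.7 = 0.56).
    print(ols(performance, grit)[1])
    # Direct effect: performance ~ grit + practice (grit's coefficient near 0).
    print(ols(performance, grit, practice)[1])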

Success in the real world depends on sustained performance, on being able to work hard at practice, spend the weekend studying the playbook, and review hours of game tape. Those are all versions of deliberate practice, and our ability to engage in such useful exercises largely depends on levels of grit. The problem, of course, is that grit can’t be measured in a single afternoon on a single field. (By definition, it’s a metric of personality that involves long periods of time.) The end result is that our flawed beliefs about talent have led to flawed tests of talent. Perhaps that explains why there is no “consistent statistical relationship between combine tests and professional football performance.” We need a test that measures how likely people are to show up, not just how they perform once there.
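
A test of the kind Lehrer is asking for already exists in miniature: the short grit survey quoted a few paragraphs up. Here is a minimal Python sketch of how such a 5-point survey is typically scored: responses are averaged after reverse-coding items that cut against the trait. The two items are the ones quoted above; treating the “lost interest” item as reverse-keyed follows common Likert practice and is an assumption here, not a detail reported in the article.

    # Scoring a grit-style survey on a 1-5 scale (illustrative sketch).
    ITEMS = [
        ("I have been obsessed with a certain idea or project for a short "
         "time but later lost interest", True),   # reverse-keyed (assumption)
        ("Setbacks don't discourage me", False),  # positively keyed
    ]

    def grit_score(responses):
        """Average the 1-5 responses, flipping reverse-keyed items (6 - r)."""
        assert len(responses) == len(ITEMS)
        total = 0
        for (_, reverse), r in zip(ITEMS, responses):
            assert 1 <= r <= 5
            total += (6 - r) if reverse else r
        return total / len(ITEMS)

    # Someone who rarely loses interest (answers 1) and strongly agrees that
    # setbacks don't discourage them (answers 5) scores the maximum 5.0.
    print(grit_score([1, 5]))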

The second takeaway involves the growing recognition of “non-cognitive” skills like grit and self-control. While such traits have little or nothing to do with intelligence (as measured by IQ scores), they often explain a larger share of individual variation when it comes to life success. It doesn’t matter if one is looking at retention rates at West Point or teacher performance within the Teach for America program or success in the spelling bee: Factors like grit are often the most predictive variables of real world performance. Thomas Edison was right: even genius is mostly just perspiration.

Taken together, these studies suggest that our most important talent is having a talent for working hard, for practicing even when practice isn’t fun. It’s about putting in the hours when we’d rather be watching TV, or drilling ourselves with notecards filled with obscure words instead of getting quizzed by a friend. Success is never easy. That’s why talent requires grit.”

Jonah Lehrer, Which Traits Predict Success? (The Importance of Grit), Wired Science, March 14, 2011.

See also: Malcolm Gladwell on success

Feb 24th, Thu

Reflection Improves Learning


“A study done by MIT professors David Foster and Matthew Wilson is one that Sahar found rather convincing regarding the importance of reflection in our everyday lives, and specifically its effects on our cognitive abilities.

In this study researchers looked inside the brains of rats. They paid particular attention to the hippocampus, a brain structure that has been shown to be responsible for learning and memory in rodents, primates, and humans. They performed brain scans on rats as the animals went through a maze, and then again after the experience, during times of reflection.
“What the results suggest is that while there certainly is some record of your experience as it is occurring (in other words when the rats were running the maze), the actual learning – when you try to figure out: ‘What was important? What should I keep and throw away?’ – that happens after the fact, during periods of quiet wakeful introspection.”

Rats that were given a chance to relax and reflect showed better signs of learning than rats that were not. The scientists suggest that “replaying a sequence of behavioral events in our mind” may be an important mechanism in effective learning and memory retention.

All knowledge is processed knowledge. We don’t know things in the form of their raw sensory experience, but the form in which we conceptualize them. We then integrate these concepts into our representation of the world, just like the rat does when it makes a mental map of a maze.

As Sahar describes in his lecture, information is the sensory data of what we experience and transformation is the map we create from that data. When we reflect we are re-initiating this process of transformation by deriving new meaning from our memories.

During transformation we decide what parts of the experience were most important and worth paying attention to. When our mental schema doesn’t work, we can always reflect back, re-focus, and adjust our understanding of that experience.”

Steven Handel, Reflection Improves Learning, The Emotion Machine, Jan 17, 2010.
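
The rat result is neurophysiology, not an algorithm, but reinforcement learning offers a well-known computational analogy: a Dyna-style agent that “replays” remembered transitions between real moves typically learns a maze in far fewer real steps than one that learns only from live experience. The Python toy below makes that point; the corridor task and every parameter are invented, and this is an analogy to offline replay, not the study’s model.

    import random

    N, GOAL = 8, 7  # a corridor of states 0..7, with reward only at state 7

    def train(n_replays, episodes=20, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
        """Tabular Q-learning; after each real step, 'replay' n_replays
        remembered transitions, the way Dyna-style agents rehearse offline."""
        rng = random.Random(seed)
        Q = {(s, a): 0.0 for s in range(N) for a in (-1, 1)}
        memory, real_steps = [], 0
        for _ in range(episodes):
            s = 0
            while s != GOAL:
                # epsilon-greedy choice between stepping left (-1) and right (+1),
                # with random tie-breaking
                if rng.random() < eps or Q[(s, -1)] == Q[(s, 1)]:
                    a = rng.choice((-1, 1))
                else:
                    a = -1 if Q[(s, -1)] > Q[(s, 1)] else 1
                s2 = min(max(s + a, 0), N - 1)
                r = 1.0 if s2 == GOAL else 0.0
                memory.append((s, a, r, s2))
                # one update from real experience, plus replayed updates
                # drawn from memory: the offline rehearsal step
                updates = [(s, a, r, s2)] + [rng.choice(memory) for _ in range(n_replays)]
                for ms, ma, mr, ms2 in updates:
                    best = max(Q[(ms2, -1)], Q[(ms2, 1)])
                    Q[(ms, ma)] += alpha * (mr + gamma * best - Q[(ms, ma)])
                s = s2
                real_steps += 1
        return real_steps

    print(train(n_replays=0))   # no replay: more real steps needed overall
    print(train(n_replays=20))  # with replay: typically far fewer real steps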
Aug 29th, Sun

Baddeley’s Working Memory Model

A Modification of Baddeley’s Working Memory Model. Source: F. L. Coolidge & T. Wynn, The Rise of Homo sapiens: The Evolution of Modern Thinking, John Wiley & Sons, 2009, p. 43.

Alan Baddeley and Graham Hitch proposed a model of working memory in 1974, in an attempt to describe short-term memory more accurately.

The original model of Baddeley & Hitch was composed of three main components: the central executive, which acts as a supervisory system and controls the flow of information from and to its slave systems, the phonological loop and the visuo-spatial sketchpad. The slave systems are short-term storage systems dedicated to a content domain (verbal and visuo-spatial, respectively). In 2000 Baddeley added a third slave system to his model, the episodic buffer.

Baddeley & Hitch’s argument for the distinction of two domain-specific slave systems in the older model was derived from experimental findings with dual-task paradigms. Performance of two simultaneous tasks requiring the use of two separate perceptual domains (i.e. a visual and a verbal task) is nearly as efficient as performance of the tasks individually. In contrast, when a person tries to carry out two tasks simultaneously that use the same perceptual domain, performance is less efficient than when performing the tasks individually. (Wikipedia)
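
As a reading aid, the model’s structure can be sketched as Python code: a central executive routing material to two domain-specific stores, plus the episodic buffer added in 2000. The capacities and the routing rule below are invented for illustration; this is a structural sketch, not a cognitive simulation.

    from collections import deque

    class SlaveStore:
        """A limited-capacity short-term store; oldest items fall out (decay)."""
        def __init__(self, capacity=4):
            self.items = deque(maxlen=capacity)
        def hold(self, item):
            self.items.append(item)

    class CentralExecutive:
        """Supervisory system controlling information flow to the slave systems."""
        def __init__(self):
            self.phonological_loop = SlaveStore()       # verbal material
            self.visuospatial_sketchpad = SlaveStore()  # visual/spatial material
            self.episodic_buffer = SlaveStore()         # bound multi-domain episodes
        def attend(self, item, domain):
            store = (self.phonological_loop if domain == "verbal"
                     else self.visuospatial_sketchpad)
            store.hold(item)
        def bind(self):
            # The episodic buffer integrates the contents of both stores
            # into a single episode (very loosely modeled here as a tuple).
            episode = (tuple(self.phonological_loop.items),
                       tuple(self.visuospatial_sketchpad.items))
            self.episodic_buffer.hold(episode)
            return episode

    # Dual-task intuition: verbal and visual items land in different stores,
    # so neither overflows; two verbal tasks would compete for one store.
    wm = CentralExecutive()
    for word in ["stone", "knap"]:
        wm.attend(word, "verbal")
    wm.attend("maze-layout", "visuospatial")
    print(wm.bind())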

"A Memory Primer Cognitive psychologists have long distinguished between declarative memories like facts, details, strings of sounds and words, and procedural memories for learned motor movements like stone-knapping and visuospatial tasks like memorizing directions for locations. Both types appear to use relatively independent neural pathways; selective types of brain damage may affect one type of memory but not the other. There are varieties of declarative memory. An episodic memory is a coherent, story-like reminiscence for an event, often associated with a specific time and place, and a feeling signature. Episodic memory is sometimes labeled personal memory or autobiographical memory. The other major type of declarative memory is semantic, the memory for general facts. A reminiscence, of course, will include semantic details but its recall and subjective experience will be psychologically and neurologically different from the recall of the semantic components alone.”

— F. L. Coolidge & T. Wynn, The Rise of Homo sapiens: The Evolution of Modern Thinking, John Wiley & Sons, 2009, pp. 42-43.

Jan 29th, Fri