People often try to justify choosing one thing (such as what to eat) over another by convincing themselves that their choice is superior, even though both items seemed equally good a few seconds prior. This is our way of dispelling the cognitive dissonance choosing creates; if we don’t do this, we tend to fret about whether we decided correctly. A couple of researchers decided to test how this dissonance might be affected by the act of washing, which is sometimes linked with moral self-judgement. Washing hands is not only healthy, but it seems it may also put one’s mind at ease about recent decisions. Students were asked to choose between two objects out of several they had ranked. When they washed their hands after making the choice, they seemed to experience less cognitive dissonance, while students who did not wash their hands behaved as if they still needed to justify their choices to themselves. Physical cleanliness may have a broader impact on individual psychology than previously thought. So does this make doctors overconfident about their diagnoses because they wash their hands (at least I hope they do) after each?
Thinking, after a while, becomes the most pleasurable thing in the world.
—We Are What We Repeatedly Do
Feb. 14, 2011
It’s impossible to predict the behaviour of smarter-than-human intelligences with which (with whom?) we might one day share the planet. Maybe we’ll merge with them to become super-intelligent cyborgs, using computers to extend our intellectual abilities the same way that cars and planes extend our physical abilities. Maybe the artificial intelligences will help us treat the effects of old age and prolong our life spans indefinitely. Maybe we’ll scan our consciousnesses into computers and forever live inside them as software, virtually. Maybe the computers will turn on humanity and annihilate us. The one thing all these theories have in common is the transformation of our species into something that is no longer recognisable as such to humanity circa 2011. This transformation has a name: the Singularity. The difficult thing to keep sight of when you’re talking about the Singularity is that even though it sounds like science fiction, it isn’t, any more than a weather forecast is. It’s not a fringe idea; it’s a serious hypothesis about the future of life on Earth.
Until then, we’ll just have to live with what we’ve got. This posting collects articles related to the Mind and Brain.
The existentially indifferent appear to live a life of complacency, with few highs and little or no introspection. Without commitment to sources of meaning, life can be superficial. But superficiality is not necessarily a state of suffering. These people aren’t classified as having “psychological stress.” An existentialist would say they are “asleep”. But, then, existential philosophers and psychologists from Heidegger to Frankl have for years drawn distinctions between an authentic, complex life and a shallow, “everydayness” mode of existence. The existentially indifferent characterise this “everyday” mode and — as if to defy existentialism — are perfectly fine with it. To replace meaningful pursuits, they have a wide array of superficial weaponry. Surrogates for meaningful commitment abound, ranging from material possessions to pleasure seeking, from busy-ness to sexuality. An Austrian research psychologist surveyed perceived meaningfulness in a sample of 603 Germans. She found that 61% were “meaningful,” 4% suffered a “crisis of meaning,” and 35% were “existentially indifferent” (those who “neither experience their lives as meaningful nor suffer from this lack of meaning”). I’m okay with that — some of my best friends are existentially indifferent.
By definition, narcissists are impatient, vainglorious, easily insulted, and aggrieved. They’d never dream of making sacrifices on anyone else’s behalf, unless it simultaneously advances an agenda of their own. But the fact is, everyone is capable of narcissism in times of crisis — a typical response to feeling out of control, especially if the person has had plenty of control before (or at least the illusion of it) and especially if there’s still a means to express dissatisfaction. Similarly, if conditions are right, an entire culture can plunge into narcissistic behaviour. We’ve been there before. In The Culture of Narcissism, a 1979 classic about the spread and normalisation of self-absorption in the US, historian Christopher Lasch suggests that 70s rebellion culture was at once the result of too many constraints and too few. On the one hand, people felt powerless in the face of a changing economy and the expanding impersonal complexity of the modern world, a world that “made the individual dependent on the state, the corporation, and other bureaucracies.” At the same time, a sexual revolution was taking place, mass media was replacing the church and family as the main source of culture and values, and Madison Avenue was “undermining the horrors of indebtedness” — all of which gave people a sense of lawlessness and dizzying personal freedom. The result was a culture where people felt the same paradoxical combination experienced by angry children: powerlessness and a destructive, deceptive sense of might. And this very much describes people’s experience of life today. Our world combines extreme complexity with dehumanising, tumbling-down institutions and fast-dissolving social mores. When people become more powerless, they become more distrustful of those who have power or authority, so they want systems (like the TSA? Like God?) that protect them against someone else (like imagined terrorists? Like temptation?). This paradoxically disempowers them more.
From the Thai Health Coding Centre / International Statistical Classification of Diseases and Related Health Problems / Symptoms, Signs and Abnormal Clinical and Laboratory Findings Not Elsewhere Classified / Involving Cognition, Perception, Emotional State and Behaviour
In ethology, a fixed action pattern (FAP) is an instinctive behavioural sequence that is indivisible and runs to completion. FAPs are invariant, produced by a neural network known as an innate releasing mechanism, initiated in response to an external sensory stimulus known as a releaser (a signal from one individual to another). But apparently many such instinctive responses are, in fact, learned. For example, the newly-hatched herring gull chick’s supposed FAP of hitting the red spot on its parents’ beak for food is more complex: what is innate is only a tendency to peck at an oscillating object in the field of view. The ability to target the beak, and the red spot on the beak, develops quickly but is acquired experientially. Clearly, certain sensitivities must be innate, but the specifics of development into various behavioural acts depend on how the organism interacts with its surroundings and what feedback it receives. Especially for humans, it is not simply a matter of conditioning Response R to Stimulus S, but rather of evaluating as much input as possible. If we wish to understand why we often act in certain predictable ways, particularly if there’s a desire or need to change these behavioural responses, we should look for possible releasers that stimulate a FAP. The FAP might actually be a response learned over time, initially with respect to something quite basic, which now affects aspects of social interactions or quick decision-making processes in professional lives. Given an understanding of our own FAPs, and those of other individuals with whom we interact, we — as humans with cognitive processing powers — could re-think (and thus change) our behaviour habits. (I’d say this person definitely believes in free will.)
The earliest record of makeup use dates back to 3000 BC and the ancient Egyptians. Evidence suggests the origins of makeup may go back much further — Neandertals may have used coloured pigments on their skin 50,000 years ago. Makeup works because it’s a good lie. In much of the animal kingdom, females advertise youth, health and sexual availability through physical signals — girls of the animal world know sex sells. In humans, these signals are less pronounced. Makeup works because it exaggerates or even completely fabricates these signs of fertility and sexual availability, thus making a woman subliminally more appealing. Women tend to be naturally darker around the eyes than men. Eyeliner, eye shadow and mascara enhance this, making a face look more feminine. Women have darker mouths than men of the same skin tone, so manipulating lips to be darker increases femininity and attractiveness. When women ovulate, estrogen rises, enhancing vascular blood flow under the skin. When most fertile, women report they’re more easily turned on and have more interest in sex; they also tend to have redder lips (due to that enhanced blood flow), so by putting on red lipstick, women accentuate a natural fertility signal. Her red lips say she’s young, healthy AND interested. That increased blood flow also pinkens cheeks, so blush adds to the effect. As skin ages, it tends to discolour so even skin tone is seen as younger and more attractive (hence foundation makeup). When you poll men about makeup preferences, 20% say their significant other wears too much makeup while 10% wish women didn’t wear makeup at all. But actions speak louder than words — study after study finds that when shown pictures of women with and without makeup, men consistently rate images with makeup as more attractive, confident, and healthier. They also rate women wearing makeup as more intelligent, and as having higher earning potential and more prestigious jobs. It’s no wonder that more than $40 billion a year (in the US alone, I presume) is spent on cosmetics.
Walk into psychiatrist Drew Ramsey’s office in Manhattan and you’ll likely be greeted by Gus, a 4-year-old shih tzu. After escorting you through the waiting room, he may hop onto the ottoman and go to sleep or sit beside you on the couch. Some patients pat Gus while they talk to Dr Ramsey. A few talk to Gus instead. And if they get emotional, Gus provides physical comfort that therapists can’t offer. “We can’t hug patients, but patients can hug Gus,” says Dr Ramsey, who began bringing his dog to his office 2 years ago. Now, he says, “I think about Gus the way a cowboy thinks of his horse — he’s part of the job.” A growing number of private practice therapists bring their dogs to work, where they help patients calm down and cheer up, a happy distraction with a wagging tail. This is similar to what therapy dogs do when they visit hospitals or nursing homes, but these “canine therapy-assistants” often work full days and get to know the patients as well as the doctor does. Even some medical doctors have put their dogs to work, though they aren’t allowed in procedure rooms. Research shows that a few minutes of stroking a pet dog decreases cortisol, the stress hormone, in both the human and the dog. It also increases prolactin and oxytocin, hormones that govern nurturing and security, as well as serotonin and norepinephrine, neurotransmitters that boost mood. One study found that 5 minutes with a dog was as relaxing as a 20-minute break for hospital staffers. My thoughts? The temperament of the dog is crucial — these are the therapists’ own pets and people — therapists included — tend to think their children and their pets are much better than average. And the animal must never be forced on anyone. (Plus, it’s important for a therapist not to negatively evaluate a patient who doesn’t seem to like her Little Precious.)
Humans have roughly one million neurons for each one in a fly. And out of a human’s 100 billion neurons emerge some pretty remarkable things. With enough quantity (and also a greater degree of connectedness), you generate quality — sufficient for language, independent control of each finger, gratification postponement, long-term planning (to name a few) — and metaphors. [From J Ruth Gendler’s wonderful The Book of Qualities, “Compassion speaks with a slight accent. She was a vulnerable child, miserable in school, cold, shy. In 9th grade she was befriended by Courage. Courage lent Compassion bright sweaters, explained the slang, showed her how to play volleyball.” Despair has stopped listening to music. Anger sharpens kitchen knives at the local supermarket. Beauty wears a gold shawl and sells 7 kinds of honey at the flea market. Longing studies archaeology.] Symbols, metaphors, analogies, parables, synecdoche, figures of speech: we understand them. But the brain processes literal and metaphorical versions of a concept in the same brain region, which can sometimes cause interesting results. The same part of the brain is disgusted by rotted meat and moral failures. Stick a pin in our toe and we hurt. Stick a pin in our child’s toe and we also hurt — same part of the brain. We can metaphorically absolve our sins by washing our hands. Let us hold a warm cup of coffee and we like people better. (There are many ways in which the human brain isn’t all that fancy.)
An often-overlooked network of neurons lining our guts is so extensive that some scientists have nicknamed it our “second brain”. Although its influence is far-reaching, it isn’t the seat of conscious thought. Technically known as the enteric nervous system (ENS), this brain consists of sheaths of neurons embedded in the walls of the long tube of the gut, or alimentary canal, which measures about 9 metres end-to-end from the esophagus to the anus. The ENS contains 100 million neurons, more than in either the spinal cord or the peripheral nervous system. Equipped with its own reflexes and senses, it controls gut behaviour independently of the brain. 90% of the fibres in the primary visceral nerve, the vagus, carry information from the gut to the brain and not the other way around. The ENS uses more than 30 neurotransmitters, just like the brain, and, in fact, 95% of the body’s serotonin is found in the bowels. Because antidepressant selective serotonin reuptake inhibitors (SSRIs) increase serotonin levels, it’s little wonder that meds meant to cause chemical changes in the mind often provoke gut issues as a side effect. Irritable bowel syndrome afflicts more than two million Americans. It arises in part from too much serotonin in entrails. Serotonin made by the ENS plays a role in more surprising diseases: a drug that inhibits the release of serotonin from the gut counteracts osteoporosis in postmenopausal rodents. Serotonin seeping from the second brain might even play a part in autism — the same genes involved in synapse formation between neurons in the brain are involved in alimentary synapse formation. If these genes are affected in autism, it could explain why so many people with autism have gastrointestinal motor abnormalities. Researchers are currently investigating how the ENS mediates the body’s immune response; after all, at least 70% of our immune system is aimed at the gut to expel and kill foreign invaders. The trillions of bacteria found in the gut also appear to somehow “communicate” with ENS cells.
Bees can solve complex mathematical problems that keep computers busy for days. They learn to fly the shortest route between flowers discovered in random order, effectively solving the “travelling salesman problem”. Computers solve that problem by comparing the lengths of all possible routes and choosing the shortest. Bees manage to reach the same solution using a brain the size of a grass seed. Modern living depends on networks such as traffic flows, internet information and supply chains. Arriving at “shortest path” solutions as easily as a bee could be helpful. Michael O’Malley, author of The Wisdom of Bees, says it would be “phylogenic hubris to think we have nothing to learn from bees.” They have patience and restraint; they act today in anticipation of tomorrow by seeking new nectar fields while still exploiting rich ones. Almost everything a bee does is for the benefit of the hive. A hive typifies excellent communications, networking and enforced co-operation; it is well-managed and highly efficient. But bees are scarcely a model for financial success. Humans (at least Westerners) don’t submerge their identities to the greater good of the hive or nest as do social insects. This point was succinctly made by E O Wilson, Harvard natural historian and ant expert. Asked for his views about communism, Wilson said, “Great idea, wrong species.” Bee lovers take note.
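Just for fun, here is a minimal sketch of the brute-force approach described above: compare every possible route and keep the shortest. The flower coordinates are made up for illustration; nothing here comes from the bee study itself.

```python
from itertools import permutations
from math import dist

# Hypothetical flower positions (x, y), invented for this example.
flowers = {"A": (0, 0), "B": (3, 4), "C": (6, 1), "D": (2, 7)}

def route_length(route):
    """Total distance of visiting the flowers in order and returning to the start."""
    stops = list(route) + [route[0]]
    return sum(dist(flowers[a], flowers[b]) for a, b in zip(stops, stops[1:]))

# Brute force: enumerate every ordering. Fine for 4 flowers (24 routes),
# hopeless for many more, since the route count grows factorially. That is
# precisely why a bee's quick approximation is so impressive.
best = min(permutations(flowers), key=route_length)
print(best, round(route_length(best), 2))
```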
People’s minds wander 30% of the time during all activities except sex. A wandering mind often stumbles downhill emotionally. People spend nearly half their waking lives thinking about things other than what they’re actually doing, though these imaginary rambles frequently make them feel bad. Mind wandering serves useful purposes, such as providing a way to reflect on past actions, plan for the future, and imagine possible consequences of important decisions. We may tend to reflect on things that went poorly or are a cause for worry — not a recipe for happiness, even if it’s necessary. (Perhaps this is why philosophical and religious teachings assert that happiness is found by living in the moment, resisting mind wandering. And why people with a more realistic outlook tend to be a bit more depressed than the average.)
By John Paul Getty III’s 15th birthday he was partying hard, taking drugs and crashing expensive motorbikes and cars. Expelled from no fewer than 7 schools, he decided on an artistic career. He sold his paintings to local trattorias and supplemented what he made by modelling nude for life classes. In July 1973, he was 16 and living in a small apartment with two young painter friends. The long-haired youth was kidnapped by a Mafia gang who sent a ransom note demanding £17 million for his safe return. They blindfolded him and chained him to a stake in a cave for 5 months. His family suspected the kidnapping was a hoax to extract money from his notoriously tight-fisted grandfather, so refused to pay. A Roman newspaper received an envelope containing a hank of hair, a decomposing human ear and another note threatening the boy with further mutilation unless the ransom (now lowered to £3.2 million) was paid. Getty Sr paid, making it a loan that Getty Jr had to repay at 4% interest. Getty III (who apparently received no counselling after his ordeal and who knew his family had balked at paying any money to save his life) married his pregnant girlfriend (6 years his senior) as soon as he turned 18. When the senior Getty died in 1976, he left $500 to Getty Jr and nothing to Getty III. 8 years later, an overdose of alcohol, methadone and valium caused Getty III to have a stroke that left him in a coma for 6 months. Oxygen deprivation left him quadriplegic, almost blind, and confined to a wheelchair for the rest of his life. Unable to talk, he had to be spoon-fed, dressed, bathed and cared for around the clock; he had only peripheral vision, problems communicating, and could emit only high-pitched screams. Getty Jr, now a billionaire himself, refused to pay Getty III’s medical bills, running at £16,000 a month, until the courts forced him to. On his death in 2003, Getty Jr (by then Sir Paul Getty) left Getty III chattels worth only £50,000. No one indicated whether Getty III was glad he had lived or wished he hadn’t. Now it doesn’t matter — he has died after a long illness. Getty III’s only child, film actor and model Paul Balthazar Getty, lives in Los Angeles.
Alzheimer’s disease is the leading cause of dementia among the elderly, and with the ever-increasing size of this population, cases of Alzheimer’s disease are expected to triple over the next 50 years. Consequently, the development of treatments that slow or halt the disease’s progression has become imperative to both improve the quality of life for patients and reduce the health care costs attributable to Alzheimer’s disease. The active component of marijuana, Delta9-tetrahydrocannabinol (THC), competitively inhibits the enzyme acetylcholinesterase (AChE) as well as prevents AChE-induced amyloid beta-peptide aggregation, the key pathological marker of Alzheimer’s disease. Compared to currently approved drugs prescribed for the treatment of Alzheimer’s disease, THC is a considerably superior inhibitor of plaque aggregation, and cannabinoid molecules may directly impact the progression of this debilitating disease.
A 29-year-old man who had been deaf since infancy (we’ll call him Sal) also had Tourette’s. In most people, coprolalia (involuntary utterance) involves randomly blurting out obscenities. Sal, however, wasn’t shouting out obscenities — he was signing them. Researchers noted that Sal was particularly fond of making sexual signs when talking to women and was also known, as part of his tic, to spell out expletives, letter by letter, in sign language. The implications are fascinating to consider. Coprolalia isn’t a compulsion of vocalization, per se — it’s a compulsion of expression. Sal’s tics involved expletives, not other random signs. When Sal was in cooking class, rather than signing expletives, he tended to display the sign for “vomit.” In some ways, in that particular context, with classmates eagerly preparing their dishes, the word vomit is even more subversive than an obscenity would be. What the authors conclude from this accumulated evidence is that coprolalia comes from some sort of urge to disrupt or disturb others. In fact, they say, coprolalia is a kind of linguistic aggression: “The utterance of obscenities is a form of aggressive behaviour, and there may be failure in the control of these brief aggressive impulses in Gilles de la Tourette syndrome.”
The following was published in the November 2010 issue of Game Developer magazine… “The procedure is actually quite simple. First, you arrange things into different groups. Of course, one pile may be sufficient depending on how much there is to do. If you have to go somewhere else due to lack of facilities, then that is the next step. Otherwise, you are pretty well set. It is important not to overdo things; that is, it is better to do too few things at once than too many. In the short run, this may not seem important, but complications can easily arise. A mistake can be expensive as well. At first, the whole procedure will seem complicated. Soon, however, it will just become another facet of life.” Did the paragraph make any sense, or did it seem like a string of nonsense? Most likely, it was the latter, and the reason is that the text is completely devoid of context. Now, try reading the paragraph again, but think of this simple phrase first: “Dirty laundry”. Now, the information should read completely differently and actually mean something. The text is simply a set of instructions about how to wash one’s laundry. In fact, now that context has been established, reading the paragraph again without thinking about clothes is probably impossible. A schema is a mental framework centering on a specific theme, helping us process and classify new information. Schemas are only useful if they’re activated. The original paragraph was meaningless until the appropriate schema was triggered in the reader’s mind by the phrase.
In 1909, biologist Jakob von Uexküll introduced the concept of the umwelt to express a simple (but often overlooked) observation: different animals in the same ecosystem pick up on different environmental signals. In the blind and deaf world of the tick, the important signals are temperature and the odour of butyric acid. For the black ghost knifefish, it’s electrical fields. For the echolocating bat, it’s air-compression waves. The small subset of the world that an animal can detect is its umwelt. The bigger reality, whatever that might mean, is called the umgebung. Each organism presumably assumes its umwelt to be the entire objective reality “out there.” Why would anyone stop to think that there is more beyond what he can sense? In the movie The Truman Show, the eponymous Truman lives in a world completely constructed around him by an intrepid television producer. At one point an interviewer asks the producer, “Why do you think Truman has never come close to discovering the true nature of his world?” The producer replies, “We accept the reality of the world with which we’re presented.” People accept their umwelt and stop there. But human brains are tuned to detect a shockingly small fraction of surrounding reality. Consider the criticisms of policy, the assertions of dogma, the declarations of fact that you hear every day — and imagine if all of these could be infused with the proper intellectual humility that comes from appreciating the amount that goes by unseen or unconscious.
Scientists have discovered that people are more generous and willing to help when they are physically higher than others. Commuters give more money to the charity collector at the top of an escalator than the one at the bottom. Volunteers are more generous with their time if standing on a higher spot than others. People are less likely to inflict punishment on another person if they’ve just ascended steps — and are crueller if they’ve just come down stairs. The link between height and morality sounds unlikely, yet even everyday language links them. People talk about putting other people on pedestals, looking up to those they admire or taking the moral high ground. The use of metaphors linking height with good behaviour affects how people behave in real life. (Then could racism against skin colour be linked to the negative connotations attaching to yellow being used to mean cowardice or black implying evil or sinister?) People talk about sinking to new depths, scraping the bottom, and looking down on people. Perhaps employers could boost the helpful contributions of staff by holding meetings on the top floor? And remember to invite your friends to come upstairs before you ask them for an important favour.
It’s easy to be an airline industry critic these days. But consider this: A man in Los Angeles learned that his daughter’s live-in boyfriend had murdered her 3-year-old son. The boy was being taken off life support at 9 that night and his daughter wanted him to be there. He left for the airport at once. At LAX, the lines both to check a bag and to get through security were exceptionally long. He got to the airport two hours early but was still late getting to his plane. Every step of the way, he was on the verge of tears and tried to get assistance from both TSA and Southwest employees to get to his plane on time. According to him, everyone he talked to couldn’t have cared less. When he was done with security, he grabbed his computer bag, shoes and belt and ran to his terminal in his stocking feet. When he got there, the pilot of his plane and the ticketing agent both said, “Are you Mark? We’ve held the plane for you and we’re so sorry about the loss of your grandson.” The pilot held the plane that was supposed to take off at 11:50 until 12:02. As the man walked down the jetway with the pilot, he said, “I can’t thank you enough for this.” The pilot responded with, “They can’t go anywhere without me and I wasn’t going anywhere without you. Now relax. We’ll get you there. And again, I’m so sorry.” A representative from Southwest said the airline was “proud” of the way the pilot had held the flight. If I have a choice, I know which airline I’ll pick on my next flight.
Over the past 30 years, an increasing number of studies have highlighted the importance of creativity for individuals, organisations, and societies, stressing the potential and real benefits of creative thinking. For instance, creative products generate an average return significantly higher than that of “common” products, and investments in creativity and innovation positively impact organisational performance. Creativity is beneficial at the individual level, helping people manage their daily lives by finding creative solutions to problems. But creativity doesn’t always lead to “good.” New studies demonstrate that creativity may produce negative effects by leading individuals to more frequently engage in dishonest behaviour: there is a significant relationship between creative personality and dishonesty. (Creativity is a better predictor of dishonest behaviour than intelligence.) Participants primed with a creative mindset are more likely to cheat; those primed to think creatively are more likely to behave dishonestly due to an enhanced motivation to be creative. Creative people often cleverly (creatively?) justify such dishonest behaviour. Employees who are in positions that require creativity are more likely to do wrong in the workplace, perhaps due to an increased motivation to think outside the box. (There is a link between creativity and rationalisation.) The ability of most people to behave dishonestly might be bounded by their ability to cheat and at the same time convince themselves that they are behaving morally. To the extent that creativity allows people to more easily behave dishonestly and rationalise it away, it might be a driver of dishonesty and play a useful role in understanding unethical behaviour. I have a bit of a problem with this. Creative people are more dishonest because they CAN be? All of us are crooked down deep and some just don’t act on it because we have a higher chance of getting caught? This sounds too much like religious talk for my taste. I must seriously question the interpretation of these results. Via the New Shelton wet/dry.
Cars kill a lot more people than spiders, bats, snakes and wolves, but people don’t fear them in the same visceral way. Although some people fear snakes more than others, baby humans, chimps and monkeys seem equally jumpy when confronted with a black plastic snake. Some psychological traits are shared by all mammals. In experiments conducted in the 1960s, baby goats and baby humans were separately offered the opportunity to walk or crawl onto a transparent surface that gave the impression of walking off a cliff. Both declined. Fear of heights is so widespread and understandable that psychologists consider it a normal fear. Newborn rhesus monkeys are afraid of toy snakes and toy crocodiles — but not, say, toy rabbits. Humans have no fear of Twinkies and cheeseburgers — au contraire — although these foods have become more dangerous to health than anything that skitters, flits or crawls. Put a gun in front of a monkey or a baby and (unfortunately) they wouldn’t react like they do with a plastic snake. And why do some people appear addicted to fear, as evidenced by the popularity of increasingly horrifying horror movies? Some enjoy chilling enactments of instinctive ancient fears, while others would rather eat popcorn in front of anything else, even the Shopping Channel. Low-sensation-seekers don’t like to be aroused by unpleasant things. Humans are perhaps the only species that has such a wide variation in risk-taking. Interestingly, although people with varying levels of some personality traits, such as neuroticism, agreeableness and conscientiousness, intermarry at a seemingly random rate, high-sensation seekers — as well as those on the low end of the scale — tend to pick similarly arousable people as mates. To the extent that such traits are genetic rather than learned, they tend to get intensified by this inbreeding, resulting in a greater range in the population’s tolerance of risky behaviour. Via the New Shelton wet/dry.
An encompassing explanation of horror’s inherent appeal is that it helps us master our fears. This seems to be particularly important for youngsters who flock to scary media as an ultimately safe way to exercise their emotional chops and deal with real-life scary stuff. Watching a horror film gives people back some control. They can experience an adverse event through film and know that it will end. They’ll survive. They’ll go on with their lives. Interestingly, this co-opting of horror only really happens if the player or viewer knows that what they see is fake. In one famous experiment, researchers had subjects watch a movie featuring authentic scenes of live monkeys having their brains scooped out and of children having facial skin peeled away in preparation for surgery. The vast majority of the study’s participants refused to finish watching, even though movies playing at the theatre down the street could outdo those scenes in grotesquerie. People seem to need to know it’s fake. Excitation transfer theory, credited with enabling spooky soundtracks to be effective, has also been hypothesised to give a kind of “thank god that’s over” high. People become physically aroused due to the fear they experience during the media event — and then when the media event ends, that arousal transfers to the experience of relief and intensifies it. They don’t so much enjoy the experience of being afraid — rather, they enjoy the intense positive emotion that directly follows.
What makes people psychopaths is not an idle question. Prisons are packed with them. And so, according to some, are boardrooms. The combination of a propensity for impulsive risk-taking with a lack of guilt and shame (the two main characteristics of psychopathy) may lead, according to circumstances, to a criminal career or to a successful business one. That has provoked a debate about whether the phenomenon is an aberration or whether natural selection favours it, at least when it’s rare in a population. The boardroom, after all, is a desirable place to be — and before the invention of prisons, even crime might often have paid. (As if it no longer does?) Past work has established that psychopaths have normal levels of intelligence (they are only rarely Hannibal Lecter-like geniuses). Nor does their lack of guilt and shame seem to spring from a deficient grasp of right and wrong. Ask a psychopath what he is supposed to do in a particular situation, and he can usually give you what non-psychopaths would regard as the correct answer. It is just that he does not seem bound to act on that knowledge. Psychopaths don’t seem to possess the instinctive grasp of social contracts — the rules that govern obligations — that other people have, and they don’t seem bound to follow those rules even when they learn them, since for them that knowledge is intellectual rather than emotional.
Much of the discussion about torture concentrates on the moral and ethical dilemmas involved, but in fact these arguments and make-believe situations are irrelevant if torture doesn’t work in the first place. If those who advocate it can’t prove that it works, then they’ve already lost the debate. Many in the military and intelligence communities seem decidedly unconvinced about the effectiveness of torture. Ali Soufan, a former FBI special agent with considerable experience interrogating al-Qaeda operatives, points out, “When they’re in pain, people will say anything to get the pain to stop. Most of the time, they’ll lie — make up anything — to make you stop hurting them. That means the information you’re getting is useless.” A number of former intelligence people have expressed similar views, and these words are echoed by the US Army Training Manual’s section on interrogation, which suggests that the use of force is a poor technique, as it yields unreliable results, may damage subsequent collection efforts, and can induce the source to say whatever he thinks the interrogator wants to hear. Torture is used to satisfy the desire to hurt the person you assume has done something bad. (This is the same reason parents spank their children.) It is more punishment than prybar. (From one of the comments: “The value of torture is not so much in the information that you extract from your subjects but rather in the social effects upon your population at large, when they know that they can be tortured as an example to others. It makes little sense to keep [a willingness to torture / the fact of torture] completely secret — drop a few hints, nothing that can be proven, easy to deny, but enough to let the public know that it can happen to them if they draw attention to themselves.” Using that logic, torturing the individual’s children would probably work better. But surely no sane person would countenance that.)
Some 17,000,000 Americans with college degrees are doing jobs that the BLS says require less than the skill levels associated with a bachelor’s degree. There are 5,057 janitors in the US with PhDs, other doctorates, or professional degrees. As more and more people try to attend college, either college degrees will be watered down (something already happening) or drop-out rates will rise (also already happening). At a time when resources are scarce, when American governments are running $1.3-trillion deficits, when they face huge unfunded liabilities associated with commitments made to the growing elderly population, should they be subsidising increasingly problematic educational programmes for students whose prior academic record would suggest little likelihood of academic, much less vocational, success? Higher education is on the brink of big change, like it or not. A perceptive response from the comments section: “College degrees are basically an arms race. No, you may not need a degree to perform the task at hand, but you’re competing against people who have degrees. If you don’t have one too, you’ll be at a disadvantage. Ditto raises and promotions, in which education level is a significant factor. This is true across multiple industries, not just higher ed employment. Is it fair? Of course not. Should experience trump the framed piece of paper on the wall? In many cases, yes. But that’s not the way the world works, and national salary survey data back this up. I don’t know what it will take to restore the employment marketplace to sanity, but there’s no arguing with the facts. A young person may not find a professional job with only a bachelor’s degree, but good luck getting that job without it!” (Besides, more schooling means less fundamentalism — that’s a good reason to go to college right there.) Via Tywkiwdbi.
All college students face stress, but mental-health professionals say art students face particular, and particularly intense, kinds of stress that their peers in many other scholastic situations don’t. Nationally, some 15% of students entering college are already on a prescribed psychotropic medication (for anxiety, attention-deficit disorder, depression, or some other condition). Art-college students are no exception. Those numbers can rise as students deal with the stress of college life. It helps if the art-college therapist understands how the art-college experience and programme differ from those of other colleges. Studio-art classes may last 6 hours at a time, and a student may have 3 or 4 of those in a given week as well as academic courses that usually last an hour and assign homework. The sheer amount of work is greater than at other schools and students have less free time. Studio-art training can be far more stressful than other fields of study. A psychology major, for instance, learns about the field of psychology — its history, the variety of techniques, the manner in which research is done and written up — without any expectation of developing something markedly new and different. The psychology student, in other words, is trained to fit into the field, to be competent. Art students, in contrast, may receive a certain level of technical training — how to draw the human figure, how to cast bronze, how to render a design on the computer. But they are expected to produce something that is original almost from their first class. They have to be creative on demand and then handle public critique. Critiques can be quite harsh, far different from the experience of being handed back an assignment with a grade on it. Art school is traumatizing.
The human brain contains many regions that are specialized for processing specific decisions and sensory inputs. Many of these are shared with our fellow mammals (and, in some cases, all vertebrates), suggesting that they are evolutionarily ancient specializations. But innovations like writing have only been around for a few thousand years, a time span that’s too short relative to human generations to allow for this sort of large evolutionary change. In the absence of specialized capabilities, how has it become possible for such large portions of the population to become literate? Has literacy "stolen" areas of the brain that were once involved in other functions? Yes, apparently. Activity within the visual cortex is distributed differently in literate and non-literate individuals. Literate individuals seem to have reduced activity when looking at images of faces, but increased activity when looking at anything that vaguely resembles text, such as black-and-white images or checkerboard patterns with a horizontal orientation. The decreased activity in response to faces seems to occur in those who achieve literacy during childhood (it is one of the only differences between them and adult learners). The area that responds to faces normally expands with age, and learning to read may limit this expansion by putting nearby brain areas to other uses. This isn’t to say that you’re going to be worse with faces if you know how to read well, although researchers intend to check into that. But it does indicate that literacy involves a new specialization in some areas of the visual system. |
Four in 10 Americans believe in strict creationism. The belief in the evolutionary origin of humans is, however, slowly rising. Creationists believe God created humans in their present form about 10,000 years ago. 38% believe God guided the process by which humans developed over millions of years from less advanced life forms. Only 16%, up slightly from years past, believe humans developed over millions of years, without God’s involvement, the “secular evolution” view (that number was 9% in 1982). At the same time, the 40% of Americans who hold the “creationist” view that God created humans “as is” a mere 10,000 years ago is the lowest in Gallup’s history of asking this question, and down from a high point of 47% in 1993 and 1999. There has been little change over the years in the percentage holding the “theistic evolution” view that humans evolved under God’s guidance. Americans’ views on human origins vary significantly by level of education and religiosity. Those who are less educated are more likely to hold a creationist view. Those with college degrees and postgraduate education are more likely to hold one of the two viewpoints involving evolution. A full 85% of Americans have a religious identity. (And most of the rest believe in fairies.)
Who is the authentic self — the rude or bigoted person who may come out when drunk, enraged, or exhausted? Or the person one sees the other 99% of the time, when sobriety allows tamping down unsavory impulses? The Implicit Association Test (IAT) — a categorisation of words and faces — is based on the premise that harder tasks take longer to do. If a person is slower at pairing positive attributes with African Americans than with whites, it suggests an implicit bias against blacks. One study found that faster associations on the IAT between self and death predicted suicide attempts better than known risk factors like depression. People with high levels of such associations were 6 times more likely to attempt suicide within 6 months than people who did not exhibit the same bias. Doctors with high implicit bias against blacks were less likely to recommend clot-busting drugs to treat heart disease in hypothetical scenarios with black patients. Meanwhile, real-world research finds that doctors are twice as likely to recommend these potentially life-saving drugs to white patients as to black patients — bias could account for that difference. As people age, their ability to inhibit impulses is reduced, a loss of control that may increase expressions of racism. In one study, elderly participants whose mental focus was purposefully disrupted by a laboratory distraction task were found to be more likely to make biased remarks. However, providing rapid fuel to the brain decreases the expression of bias: adults who drank lemonade containing real sugar expressed fewer homophobic sentiments than those who were given Splenda-sweetened lemonade. This is no argument for drinking full-sugar beverages but does suggest that transient states — like drunkenness or hunger — can affect self-control. Basically, more brain power can mean less prejudice.
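The premise (harder pairings take longer) is simple enough to sketch. Here is a toy version with made-up latencies; the real IAT uses a more elaborate scoring algorithm (Greenwald’s D-score), so treat this only as an illustration of the idea.

```python
from statistics import mean

# Invented reaction times in milliseconds for the two pairing conditions.
congruent = [650, 700, 620, 680]     # pairings the subject finds "easy"
incongruent = [820, 900, 790, 860]   # pairings the subject finds "hard"

# Harder associations take longer: a large positive gap suggests an
# implicit association favouring the "congruent" pairing.
gap_ms = mean(incongruent) - mean(congruent)
print(f"Mean latency gap: {gap_ms:.0f} ms")
```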
If you are writing a drama in which the main characters are all of European extraction,
you can prove to the world that you are not a racist by specifying an African-American actor for some minor authority figure, such as a judge or mayor,
and giving him one line to speak.
— Dr H Albertus Boli, Master Satirist
The idea that other people’s expectations about us directly affect how we behave was examined in a classic social psychology study in 1977. Male students held conversations with female students they’d just met through microphones and headsets. One of the quickest ways that people who’ve just met stereotype each other is by appearance. People automatically assume others who are more attractive are also more sociable, humorous, intelligent and so on. To manipulate this, just before the conversation, along with biographical information about the person they were going to meet, the men were given a photograph. Half were shown a photograph of a woman who had been rated for attractiveness as an 8 out of 10 and half were given a photo of a woman rated as a 2 out of 10. Then the men talked to the women without seeing them, so they had no way of knowing that the woman they were talking to wasn’t the woman in the picture. Half expected to be talking to the attractive woman, half to the unattractive woman. When independent observers listened to the tapes of the conversation they found that when women were talking to men who thought they were very attractive, the women exhibited more of the behaviours stereotypically associated with attractive people: they talked more animatedly and seemed to be enjoying the chat more. What was happening was that the women conformed to the stereotype the men projected on them. The same can apply to standard stereotypes about class, race and nationality.
J Craig Venter, genome scientist: “I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. The recent discoveries of numerous Earth and super-Earth-like planets outside our solar system, including water worlds, greatly increases the probability of finding life. Sasselov estimates approximately 100,000 Earth and super-Earths within our own galaxy. The universe is young, so wherever we find microbial life there will be intelligent life in the future. Expanding our scientific reach further into the skies will change us forever.” Just the knowledge that life exists elsewhere will change us? Or will we need to have some meaningful contact with it? Since we can’t decode what dolphins or elephants say to each other, I find it unlikely that humans will be the leaders in any useful information exchange. But I hope I’m around to find out.
State lotteries have become a deeply regressive tax. On average, households that make less than $12,400 a year spend 5% of their income on lotteries. Some people are happy to spend $3 for approximately 15 seconds of irrational hope, for the pleasure of thinking about what might happen if they suddenly win millions of dollars. While most players know they won’t win — the odds are a joke — the latex-coated ticket is a cheap permission to daydream, to think about the possibility of a better life. Implicit comparisons with other income classes increase low-income individuals’ desire to play the lottery. Participants are more likely to purchase lottery tickets when they’re primed to perceive that their own income is low relative to an implicit standard. Participants also purchase more tickets when they consider situations in which rich people and poor people have similar advantages, implicitly highlighting the fact that everyone has an equal chance of winning. What to do? Cease marketing and advertising that targets the poor. Promote games that appeal to wealthier players (such as Powerball). Increase the number of moderate prizes. Issue investment instruments with lottery-like qualities (small amounts, available conveniently, small chance of much larger upside — such as “bonus bonds”). Or live with it.
Falling in love not only elicits the same euphoric feeling as using cocaine; it also affects intellectual areas of the brain. Researchers find falling in love takes only about a fifth of a second. When a person falls in love, 12 areas of the brain work in tandem to release euphoria-inducing chemicals such as dopamine, oxytocin, adrenaline and vasopressin. The love feeling also affects sophisticated cognitive functions, such as mental representation, metaphors and body image. Blood levels of nerve growth factor, or NGF, also increase, playing an important role in the phenomenon of “love at first sight.” By identifying the parts of the brain stimulated by love, doctors and therapists can better understand the pains of love-sick patients. Unconditional love, such as that between a mother and a child, is sparked by the middle of the brain. Passionate love is sparked by the reward part of the brain, and also associative cognitive brain areas that have higher-order functions, such as body image.
Entanglement is “spooky action at a distance”, as Einstein liked to say (he actually did not like it at all, but at some point he had to admit that it exists). In quantum physics, two particles are entangled when a change in one particle is immediately associated with a change in the other. Here comes the spooky part: we can separate our “entangled buddies” as far apart as we like and they still remain entangled. A change in one of them is instantly reflected in the other, even though they are physically far apart (and I mean different countries!). Entanglement feels like magic. Recent evidence suggests that entanglement may be more robust and more widespread than we initially thought. Photosynthesis may happen through entanglement, and recent brain data suggest that entanglement may play a role in the coherent electrical activity of distant groups of neurons in the brain. (This gets a bit flowery toward the end, so I cut it off early.)
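For anyone who wants one equation’s worth of precision, here is the textbook example in standard notation (my addition, not from the article): the simplest entangled state of two qubits is the Bell state

$$|\Phi^+\rangle = \tfrac{1}{\sqrt{2}}\big(|00\rangle + |11\rangle\big)$$

Measure either qubit and you get 0 or 1 at random, but the two outcomes always agree, however far apart the qubits are. (Strictly speaking, the correlation can’t be used to send a message, which is how the spookiness stays compatible with relativity.)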
Hassle Me! Not eating enough fruit? Forgot to feed the fish again? Need a little help keeping your New Year’s resolutions? Tell us what to hassle you about, and we’ll nag you via email at semi-unpredictable intervals. HassleMe is unique because you never quite know when your reminder will come along.
Why not set up a Hassle now for either yourself or a friend?
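HassleMe doesn’t say how its scheduling actually works, but “semi-unpredictable” could be as simple as jittering a base interval. A toy sketch (entirely my guess at the mechanism):

```python
import random

def next_hassle_delay(base_days: float = 7.0, jitter: float = 0.5) -> float:
    """Toy model of a semi-unpredictable reminder: the cadence averages
    base_days, but each delay is drawn uniformly within +/- jitter of it."""
    return base_days * random.uniform(1 - jitter, 1 + jitter)

# Roughly weekly nags that actually arrive anywhere from ~3.5 to ~10.5 days apart.
print([round(next_hassle_delay(), 1) for _ in range(5)])
```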
Questionable Content — quite a whimsical online comic — ostensibly about romance, indie rock, little robots, and the problems people have.
I know this is old — but I had filed a reminder to myself to use it — and put it in the wrong place. The topic is still relevant. If you haven’t seen it — it’s a good, short (fictional) example of how even well-intentioned events can get out of hand with today’s level of interconnectedness. Flash Mob Gone Wrong by Tom Scott.
Sherry Turkle is a professor of the “social studies of science” at MIT. In her latest book, Alone Together, she seems upset by the banalities of electronic interaction, as “our range of expression is constrained by our gadgets and platforms.” We aren’t “happy” anymore: we’re simply a semicolon followed by a parenthesis. Instead of talking on the phone, we send a text; instead of writing wistful letters, we edit our blog. As Turkle notes, these trends show no sign of abating, since people increasingly gravitate toward technologies that allow us to interact while inattentive or absent. Our excuse is always the same — we’d love to talk, but there just isn’t time. Send us an e-mail — we’ll get back to you. But obvious objections shouldn’t obscure the real mystery: If the Internet is such an alienating force, then why can’t we escape it? If Facebook is so insufferable, then why do hundreds of millions of people check their page every day? Why did I just text my spouse instead of calling? Frequent blogging apparently leads to increased levels of social support and integration, so perhaps it’s still too soon to tell what the final verdict will be.
The way in which people frantically communicate online via Twitter, Facebook and instant messaging can be seen as a form of modern madness, according to American sociologist and MIT professor Sherry Turkle. “A behaviour that has become typical may still express the problems that once caused us to see it as pathological.” She appeared last week on Stephen Colbert’s late-night comedy show, The Colbert Report. When Turkle said she had been at funerals where people checked their iPhones, Colbert quipped: “We all say goodbye in our own way.” Turkle feels that, under the illusion of allowing us to communicate better, technology is actually isolating us from real human interactions in a cyber-reality that is a poor imitation of the real world. “We have invented inspiring and enhancing technologies, yet we have allowed them to diminish us,” she writes. Another author, Nicholas Carr, has suggested that use of the internet is altering the way we think to make us less capable of digesting large and complex amounts of information, such as books. We Have Met the Enemy author Daniel Akst describes the problems of self-control in the modern world, of which the proliferation of communication tools is a key component. Defenders say theirs is just a different form of communication that people might have trouble getting used to. Professor William Kist, designated “education expert” at Kent State University, Ohio, says that the “real world” that many social media critics hark back to never really existed. Before everyone began travelling on the bus or train with their heads buried in an iPad or a smart phone, they usually just travelled in silence. “We did not see people spontaneously talking to strangers. They were just keeping to themselves.” (So all you nay-sayers are just WRONG! Neener neener.)
There is a common idea: because the mind seems unified, it really is. Many go only a bit further and call that unified mind a “soul.” For the believers in the soul (let’s call them soulists), the “soul” assumption appears to be only the smallest of steps from the existence of a unified mind. But today there isn’t even evidence for that place soulists step off from, the unified mind. Stroke victims who are paralysed sometimes think that the part of their body they can’t move must belong to someone else. This condition is called “neglect” and there are countless cases. How can we explain this? In general, the brain is divided into two hemispheres. The left one processes speech and also motor and sensory information for the right side. The right hemisphere processes nonverbal information and also representations from the left side. If the right hemisphere is non-functional, it can’t process information from the left arm. The left hemisphere isn’t set up to take over immediately. So the left side of the body essentially no longer exists. A neglect case only makes sense if you consider each hemisphere as its own separate entity. A unified perception relies on neuronal machinery humming in the background, far beneath conscious awareness. Your sense of unity, only perceptible to you, is a sheen on the surface, not a deeper layer of reality. Where does this leave the soul? Unsupported by the collected works of neurology and neuroscience. The way things seem isn’t always how they really are. |
Although it seems obvious that there is a single "you" inside your head, research from several subdisciplines of psychology suggests that this is an illusion. Three decades ago cognitive scientist Colin Martindale advanced the idea that each of us has several subselves, and he connected his idea to emerging ideas in cognitive science. Central to Martindale’s thesis are a few fairly simple ideas, such as selective attention, lateral inhibition, state-dependent memory, and cognitive dissociation. Although all the neurons in our brains are firing all the time, we’d never be able to put one foot in front of the other if we were unable to consciously ignore almost all of that hyperabundant parallel processing going on in the background. State-dependent memory helps sort out all that incoming information for later use, by categorising new info according to context — if you learn a stranger’s name after drinking a doppio espresso at the local java house, it will be easier to remember that name if you meet again at Starbucks than if the next encounter is at a local pub after a martini. In other words, we all have a number of executive subselves, and the only way we manage to accomplish anything in life is to allow only one subself to take the conscious driver’s seat at any given time. There is not a single information-processing organ inside our heads but, rather, multiple systems dedicated to solving different adaptive problems. One subself is dedicated to getting along with friends, one is dedicated to self-protection, one is dedicated to winning status, another to finding mates and a distinct one for keeping mates (a very different set of problems, as some of us have learned), and yet another to caring for offspring. Thinking of the mind as composed of several functionally independent adaptive subselves helps us understand many seeming inconsistencies and irrationalities in human behaviour.
In her autobiographical essay, “A Sketch of the Past”, Virginia Woolf claims that To the Lighthouse came out “in a great, apparently involuntary, rush. Blowing bubbles out of a pipe gives the feeling of the rapid crowd of ideas and scenes which blew out of my mind.” Like others gripped by mania, she didn’t feel herself the author of her own thoughts, but a puppet of another consciousness: “What blew the bubbles? ... I have no notion.” Woolf plays creatively with two prime forces behind her bifurcated self: a fragile sense of ego in relation to those of her parents, and self-doubt as an artist. When depressed, she took to bed and withdrew, viewing the world as meaningless and without hope. On the upswing to mania she wrote at breakneck speed, the words seeming to compose themselves. No two of her books are alike. “Not this, not that,” she seems to be saying as she rejects convention in a lifelong experiment to portray consciousness and the character of thought. Her ideas about the unreliability of language were prescient given what science now knows: the structure of the human brain allows language to introspect on only a fraction of consciousness. The brain is not a passive antenna for “objective data” impinging on it, as people often suppose. Each brain actively pursues what interests it, filtering the world in its uniquely subjective way. Genes prime a life while experience in a given culture contextualises it. Woolf’s portrayal of psychic fragmentation anticipated the discovery that brains do have multiple domains of consciousness. The idea was first suggested in 1844 by Arthur Wigan in The Duality of Mind, in which he describes the autopsy of someone he knew well, during which one brain hemisphere was discovered to be entirely missing! Wigan had wits enough to deduce that a single hemisphere was enough to be a person. This implied to him that the brain is not a single organ of two halves, but a closely apposed pair, just as the kidneys or lungs are paired one to a side. Wigan concluded that if one hemisphere was sufficient to have a mind, then the customary pair made having two minds inevitable. Being conscious of our actions does not mean we intended to cause them. Nonetheless, the subjective certainty that thoughts do cause actions is enough to override any amount of scientific preaching that their actual causes are generated unconsciously while memory-related circuits invent a narrative to explain matters after the fact. (Which, I think, also explains our interpretation of the so-called dreams we have, often brought about by certain physical sensations we experience while we sleep.) |
So I have multiple selves? Whether it feels that way or not? For the sake of discussion, let’s call these various selves of mine Anne1 through Zelda9 (I have no clue even roughly how many there might actually be). Sometimes Anne1 is in charge. Sometimes it’s Bella7. Once a year or so, Mandy4 is in charge — and so on. When Mandy4 is the boss in charge of consciousness, where is Anne1? Asleep? In suspension? Non-existent? Is the last time a player appears technically the date of his/her death? Can two or more of you gang up and assassinate one of you that none of you likes? Is there a moral component if that happens? You are no longer the person you were a decade ago. Is that person now (mostly) dead (new cast of “selves” now)? “You” are a similar person today, with similar memories — but perhaps some rather incompatible habits. This seems to blur the boundary of self — particularly when looked at across time. Am I missing something? |
Trend Spotting — The Top 9 Rises and Falls We See in the Year Ahead. Predicted are a rise in the celebration of science to encourage more youth to enter the field, more IPOs, more tablets, increased use of nuclear power, more electric vehicles, unexpected technology breakthroughs (that’s certainly vague enough to be correct no matter what), more robots, higher risk, and — my favourite — The Fall of Personal Privacy (and the Rise of Good Behaviour?): “Regardless of what privacy advocates say, between diplomatic hackers like WikiLeaks, industrial hackers like the Stuxnet virus, geo-tagging, real-time data, status updates, tweets, check-ins, and phone cams in the hands of everyone, the fact is that personal privacy has irreparably changed. Tech guru Kevin Kelly made a provocative statement: ‘Nobody really knows what any technology will be good for; the real question is what happens when everyone has one?’ Now compare modern technology to Southern gun culture. Some believe Southern gentility arose from a gun culture, where politeness was a survival tactic. If people assume they’re being watched all the time, what effect might it have on ethics and behaviour?” More importantly, what effect will it have on creativity, individuality, self-esteem, anger, openness, and stress levels (to name but a few)?
A radical pessimist’s guide to the next 10 years: It gets worse, feels weird and goes too fast (no surprises there). There’ll be freaky, extreme weather, you’ll feel anxiety, the middle class will dissipate, and if you can live near a subway entrance, you’ll save a lot on (probably) outrageously expensive fuel. Suburbs are doomed, people will continue to be instantly connected to everyone else in the world, old people will know more, you’ll be more deprived, there’ll be less fresh food and more preserved stuff, something smarter than you is coming, and you will burn out (or maybe burn up) just trying to maintain your individuality. North America may fragment into smaller countries, years will feel like hours, everyone will be in a similar situation, being alone will become a treat, relationships will get even shallower, you can live longer (or not, if you don’t see any point), more people will accept technology, you will feel nostalgic, stupid people will be in charge, language will grow slang like barnacles, people won’t emphasise personal looks as much (I doubt that), politics will become subliminal (perhaps bypassing democracy altogether, as it sometimes does today?), roads will deteriorate — and you’ll realise you brought most of this on yourself and deserve what you get. (So there. Suck it up.) |
You will see that this site is offline. But if you look at the page source, you will see something else besides — the following comment: <!-- |
A very shy man went into a bar, where he spotted a very beautiful woman sitting alone. After an hour of gathering up his courage, he finally went over to her and asked, tentatively, “Um, would you mind if I chatted with you for a while?” She responded by yelling, at the top of her lungs, “NO! I won’t sleep with you tonight!” Everyone in the bar stared at them. The shy man was hopelessly and completely embarrassed and slunk back to his table. After a few minutes, the woman walked over to him and apologised. She smiled and said, “I’m sorry if I embarrassed you. You see, I’m a graduate student in psychology, and I’m studying how people respond to embarrassing situations.” To this the man responded at the top of his lungs, “What do you mean $200?!”
What is the difference between a psychiatrist and a psychologist?
If you say to a psychiatrist, “I hate my mother,” he will ask, “Why do you say that?”
A psychologist will say, “Thank you for sharing that with us.”