o That multitaskers are superhumans, capable of peak performance while completing several tasks simultaneously
o That multitaskers have a highly developed ability to switch attention from one task to another in an orderly way
What in fact did the research reveal?
“We all bet that multitaskers were going to be stars at something. We were absolutely shocked. We lost all our bets. It turns out that multitaskers are terrible at every aspect of multitasking. They’re terrible at ignoring irrelevant information; they’re terrible at keeping information in their head nicely and neatly organized; and they’re terrible at switching from one task to another.”
In The Organized Mind: Thinking Straight in the Age of Information Overload, Daniel J. Levitin has this to say:
“We all want to believe that we can do many things at once and that our attention is infinite, but this is a persistent myth. What we really do is shift our attention rapidly from task to task. Two bad things happen as a result: We don’t devote enough attention to any one thing, and we decrease the quality of attention applied to any task. When we do one thing — uni-task — there are beneficial changes in the brain’s daydreaming network and increased connectivity…You’d think people would realize that they’re bad at multitasking and would quit. But a cognitive illusion sets in, fueled in part by a dopamine-adrenaline feedback loop, in which multitaskers [begin italics] think [end italics] they are doing great.”
When multitasking, we don’t get more work done. We get less work done and of a much lower quality. Now you know.
* * *
To read the complete report, Cognitive Control in Media Multitaskers, please click here.
Tragically, Clifford Nass died November 2, 2013, at Stanford Sierra Camp near South Lake Tahoe, after collapsing at the end of a hike. He was only 55.
Clifford Nass earned a B.A. cum laude in mathematics (1981) and a Ph.D. in sociology (1986), both from Princeton University. Before attending graduate school, Nass worked as a computer scientist at Intel Corporation. Nass focused on experimental studies of social-psychological aspects of human-computer interaction. He directed the Communication between Humans and Interactive Media (CHIMe) Lab. The four foci of the CHIMe Lab were: 1) Communication in and between Automobiles: Research on Safety, Information Technology, and Enjoyment (CARSITE); 2) Social and Psychological Aspects of Computing Environments (SPACE), which focused on mobile and ubiquitous technology; 3) Abilities of People: Personalization, Emotion, Embodiment, Adaptation, Language, and Speech (APPEEALS); and 4) human-robot interaction. He was also co-director of the Kozmetsky Global Collaboratory, which focuses on developing countries. To learn more about him and his important work, please click here.
Daniel J. Levitin is the James McGill Professor of Psychology and Music at McGill University, Montreal, where he also holds appointments in the Program in Behavioural Neuroscience, The School of Computer Science, and the Faculty of Education. An award-winning teacher, he now adds best-selling author to his list of accomplishments as “This Is Your Brain on Music” and “The World in Six Songs” were both Top 10 best-sellers, and have been translated into 16 languages. Before becoming a neuroscientist, he worked as a session musician, sound engineer, and record producer working with artists such as Stevie Wonder and Blue Oyster Cult. He has published extensively in scientific journals as well as music magazines such as Grammy and Billboard. Recent musical performances include playing guitar and saxophone with Sting, Bobby McFerrin, Rosanne Cash, David Byrne, and Rodney Crowell.
Dr. William (Bill) Seidman has worked as a manager or consultant with many large and small organizations including Hewlett-Packard, Jack in the Box, Intel, Tektronix, CVS Pharmacies, and Sears. As a recognized expert on leadership in high-performing organizations, he contributes an in-depth understanding of the processes required to discover and use expert wisdom to create extraordinary organizational performance. He is co-founder and chief executive officer of Cerebyte, Inc., co-author with Rick Grbavac of Strategy to Action in Ten Days and then The Star Factor: Discover What Your Top Performers Do Differently–and Inspire a New Level of Greatness in All, published by AMACOM in 2013. The Star Factor presents Affirmative Leadership, a methodology for discovering what your top performers do differently – and inspiring a new level of greatness in all.
Seidman earned his doctorate at Stanford University where he did a study of how management training affected the development of managers’ attitudes, cognitive patterns and behaviors. As part of this study, he developed a technique for analyzing management down to the single word and action level. This technique is the basis for understanding what makes a star performer so extraordinary and understanding the newest neuroscience for elevating everyone else’s performance to the level of the stars.
He lives in Lake Oswego, Oregon, with his wife. He enjoys traveling, golf and spending time with his three kids.
Here is an excerpt from my second interview of him. To read the complete interview, please click here.
* * *
Morris: When and why did you and Rick decide to write The Star Factor and do so in collaboration?
Seidman: It was the convergence of two factors. First, after years of development, the Affirmative Leadership process had reached a maturity where it was producing the same excellent results in terms of participant response and impact on business outcomes every time, regardless of the organization or industry.
We felt that the underlying methodology was now strong enough to share with others. Second, at about the same time, three books were published – DRiVE by Pink, Your Brain at Work by Rock and The Power of Positive Deviance by Pascale, Sternin and Sternin — that legitimized different aspects of the methodology. Although these came after we had proven the methodology, they independently supported the role of positive deviants (our stars), the importance of purpose and mastery and the connection of all of these to the neuroscience of learning.
By connecting all of these through an applied methodology, an organization could get performance that was literally beyond what they previously thought possible. Our compelling purpose became to share something we believed would improve people’s lives, organizations and ultimately society.
Morris: Were there any head-snapping revelations while writing it? Please explain.
Seidman: The single most “head-snapping” revelation that came from writing the book was the importance and value of self-directed learning. We realized that one of the most important characteristics of the stars was that they were fanatical learners. We also realized that the way we were doing the Launch Workshops and the Guided Practicum – particularly the emphasis on people adapting the learning tasks to generate enhanced value — were a significant learning and leadership breakthrough.
It was transformational for us to see people’s reactions to true self-directed learning. There was a real joy at the re-awakening of their natural, human desire to learn. Put together, we realized that the complexity of today’s world requires leaders to be great learners. You simply can’t be a great leader without being a great [begin italics] learner [end italics], which was a new idea to us and, as far as we could see, a new idea in the literature on leadership.
Morris: To what extent (if any) does the book in final form differ significantly from what you originally envisioned?
Seidman: About 75% of the book is how we originally envisioned it. There were two big changes. The first was the emergence of self-directed learning as a core theme. The second: we had planned to do a lot more on the implications of Affirmative Leadership programs for executive decision-making. One of the effects of Affirmative Leadership programs is to illuminate disconnects and other types of conflicts in organizations. This can be incredibly valuable for executive teams if they accept the information and use it for better decision-making. But it can be destructive if the executives reject, ignore or overtly suppress the information. We wanted to talk a lot more about how executives can use the issues that bubble up from Affirmative Leadership programs to be better leaders themselves. But this would have added 10,000 more words than we were allowed by the publisher. Maybe that will be our next book.
Morris: As I indicate in my review of the book for various Amazon websites, there are dozens of passages throughout your narrative that caught my eye. For those who have not as yet read the book, please suggest what you view as the most important point or key take-away in each of several passages.
First, The Affirmative Leadership Methodology (Pages 6-9)
Seidman: This is the only methodology for cultural development and change leadership that we know of that consistently and systematically works. As one executive put it: “You mean you can generate levels of performance in six weeks that I couldn’t achieve in five years?”
Yes, because of the synergy among all of the different elements; the underlying science tells us precisely how to drive these changes. This gives organizations capabilities that are quite revolutionary. The most important impacts are in some ways the least tangible, though. When an organization uses Affirmative Leadership for multiple roles, the culture visibly changes. It is just a better, more confident, more productive place at which to work. You can feel the difference and it feels great.
Morris: Your Stars (18-21)
Seidman: They are just great people. Not only are they consistently the top performers on a variety of metrics and perspectives, but they are just plain great people. We hesitated to use the term “stars” because that term is so often associated with egotistical, self-centered people. Our stars are invariably humble, gracious and considerate, in part because their deep commitment to achieving a significant purpose makes them very aware of how little they know and are able to accomplish. The beauty of the methodology is that it causes what is truly the best in people to surface and this then drives creating better places to work.
Morris: Unconscious Competence, and, Engaging Stars (24-28)
Seidman: We often see organizations trying to create best practices through observation and interviews focused on what people are doing. These approaches consistently miss what is most important, and unconscious, about the stars: how they think. Everything that makes them a star derives from an unconscious sense of deep purpose, so you have to start by understanding – and nourishing – that sense of purpose to learn what makes them extraordinary. Fortunately, if you ask them about their purpose in the way we do in the Wisdom Discovery, they just love talking about it. They become so engaged that we have to be very assertive to drive them to solidify their purpose into a written statement.
* * *
To read the complete interview, please click here.
Here is a link to my first interview of Bill.
Here is a link to my review of The Star Factor.
Bill cordially invites you to check out the resources at these websites:
YouTube video link
NeuroLeadership Institute link
Here is a portion of a mini-interview featured by Amazon during which David Shenk responds to questions about his latest book, The Genius in All of Us: New Insights into Genetics, Talent, and IQ.
Photo © Alexandra Beers
* * *
Q: You assert in the book that everything we’ve been told about genetics, talent, and IQ is wrong. Everything? How so?
It is a bold statement, and it reflects how poorly the public has been served when it comes to understanding the relationship between biology and ability. The clichés we’ve been taught about genetic blueprints, IQ, and “giftedness” all come out of crude, early-20th century guesswork. The reality is so much more interesting and complex. Genes do have a powerful influence on everything we do, but they respond to their environments in all sorts of interesting ways. We’ve now learned a lot more about the developmental mechanisms that enable people to get really good at stuff. Intelligence and talent turn out to be about process, not about whether you were born with certain “gifts.”
Q: You also state that the concept of nature versus nurture is over. Scientists, cognitive psychologists, and geneticists are moving towards an idea of ‘interactionism.’ What does this mean? If the battle of genes versus environment is over, who has won? Which is more important?
They both won, because they’re both vitally important. But the new science shows us that they do not act separately. Declaring that a person gets X-percent of his/her intelligence from genes and Y-percent from the environment is like saying that X-percent of Shakespeare’s greatness can be found in his verbs, and Y-percent in his adjectives. There is no nature vs. nurture, or nature plus nurture; instead, it’s nature interacting with nurture, which is often expressed by scientists as “GxE” (genes times environment). This is what “interactionism” refers to. A vanguard of geneticists, neuroscientists, and psychologists has stepped forward in recent years to articulate the importance of the dynamic interaction between genes and the environment.
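Shenk’s Shakespeare analogy can be made concrete with a toy calculation. Purely as illustration (the function and numbers below are invented, not from Shenk’s book), suppose a trait depends multiplicatively on a genetic factor and an environmental factor; then the question “what percent comes from genes?” has no fixed answer, because each factor’s contribution depends on the other:

```python
# Toy illustration of gene-environment interaction ("GxE").
# If a trait is a product of the two inputs rather than a sum,
# the genetic and environmental "shares" cannot be separated.

def trait(gene_factor: float, env_factor: float) -> float:
    """Hypothetical trait score driven by the interaction of both inputs."""
    return gene_factor * env_factor

# Same genes, different environments -> very different outcomes.
print(trait(2.0, 1.0))  # modest environment  -> 2.0
print(trait(2.0, 5.0))  # enriched environment -> 10.0

# Same environment, different genes -> also very different outcomes.
print(trait(1.0, 5.0))  # -> 5.0
```

In a purely additive model the two contributions could be cleanly partitioned; under interaction, the partition itself is meaningless, which is the point of the verbs-and-adjectives analogy.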
Q: You describe genes and environment as a sound board. How so?
In the past, we’ve been taught that each distinct gene contains a certain dossier of information, which in turn determines a certain trait; if you have the blue-eyed gene, you get blue eyes. Period.
It turns out, though, that the information contained inside genes is only part of the story; another critical part is how often genes get “expressed,” or turned on, by other genes and by outside forces. It’s therefore helpful to think of your genome as a giant mixing board with thousands of knobs and switches. Genes are always getting turned on/off/up/down by hormones, nutrients, etc. People actually affect their own genome’s behavior with their actions.
Q: How do these new findings affect the concept of the “The Bell Curve”–that we live in an increasingly stratified world where the “cognitive elite,” those with the best genes, are more and more isolated from the cognitive/genetic underclass? Is that idea now completely obsolete?
Yes, it is obsolete. The idea that there is a genetic super-class that has a corner on high-IQ genes is nonsense. This comes out of a profound misunderstanding of how genes work and how intelligence works, and also from a misreading of so-called “heritability” studies. I am not saying that genes don’t affect intelligence. Genes affect everything. But by and large I think the evidence shows that people with low intelligence are missing out on key developmental advantages.
Q: Lewis Terman invented the IQ test at Stanford University in 1916. He declared it the ideal tool to determine a person’s native intelligence. Are IQ tests accurate? What are the benefits and fallout of the IQ test?
IQ tests accurately rank academic achievement. That’s quite different from identifying innate intelligence, which doesn’t really exist. Tufts intelligence expert Robert Sternberg explains that “intelligence represents a set of competencies in development.” In other words, intelligence isn’t fixed. Intelligence isn’t general. Intelligence is not a thing. Instead, intelligence is a dynamic, diffuse, and ongoing process.
The IQ test has valid uses. It can help teachers and principals understand how well students are doing and what they’re missing. But the widespread belief that it defines what each of us is capable of (and limited to) is disabling for individuals and society. People simply cannot reach their full potential if they honestly believe that they are so severely restricted.
* * *
David Shenk is the national bestselling author of five previous books, including The Forgetting, Data Smog, and The Immortal Game. He is a correspondent for The Atlantic.com, and has contributed to National Geographic, Slate, The New York Times, Gourmet, Harper’s, The New Yorker, NPR, and PBS. His new book, The Genius in All of Us, has been called “engrossing” by Booklist (starred review) and “empowering…myth-busting” by Kirkus.
Shenk’s work inspired the Emmy-award winning PBS documentary The Forgetting, and was featured in the Oscar-nominated feature Away From Her. He has advised the President’s Council on Bioethics, and is a popular speaker. His original term “data smog” was added to the Oxford English Dictionary in 2004.
To visit David’s Amazon page, please click here.
How and why “the craftsman mindset is the foundation for creating work you love”
Curious, I checked on the etymology of the word “career” and learned this: Origin in 1530s, “a running, course” (especially of the sun, etc., across the sky), from M.Fr. carriere “road, racecourse.” Only centuries later (early 1800s), through the evolution of usage, did the word’s meaning emerge as the “course of a working life.” I mention all this because one of Cal Newport’s primary objectives is to help his reader select the most appropriate career course and remain on it while achieving near-, mid-, and long-term goals; then, if and whenever necessary, adjust the course, pace, and focus to accommodate unforeseen changes. Newport also views this journey as a “career mission” that serves as “an organizing principle to your working life. It’s what leads people to become famous for what they do and ushers in remarkable opportunities that come along with such fame.”
Years ago during a commencement address at Stanford, Teresa Amabile urged the new graduates to do what they love and love what they do. I think that is excellent advice. I also agree with Newport that it is also very important to develop capabilities, skills that will “trump passion in the quest for work you love.” That is why Newport focuses on what he calls “the craftsman mindset,” one that focuses on what you can offer to the world. Unlike “the passion mindset” that focuses on what the world can offer you, the craftsman mindset “asks you to leave behind self-centered concerns about whether your job is ‘just right,’ and instead put your head down and plug away at getting really damn good. No one owes you a great career, it argues; you need to earn it — and the process won’t be easy.”
Here are a few of the dozens of passages that caught my eye:
o Rule #1: “Don’t Follow Your Passion” (Pages 3-26)
o The Science of Passion: Three Conclusions (14-19)
o Craftsman Mindset vs. Passion Mindset (49-55)
o Rule #2: “Be So Good They Can’t Ignore You” (29-101)
[Note: Newport explains that this comment was made by Steve Martin during an appearance on "The Charlie Rose Show."]
o “The Career Capital Theory of Great Work” (42-57)
o Rule #3: “Turn Down a Promotion (Or, the Importance of Control)” (105-143)
o “Control Traps” (115-131)
o “The Law of Financial Viability” (137-141)
o Rule #4: “Think Small, Act Big (Or, the Importance of Mission)” (147-197)
Newport devotes the final chapter to a brief but revealing discussion of his own “quest” to (a) answer the question, “How do people end up loving what they do?” and (b) obtain a faculty appointment at a university. He explains how he achieved both objectives. Near the end of the book, he observes, “Once you build up the career capital that these skills generate [and others value highly], invest it wisely. Use it to acquire control over what you do and how you do it, and to identify and act on a life-changing mission. This philosophy is less sexy than the fantasy of dropping everything to go live among the monks in the mountains, but it’s also a philosophy that has been shown time and again to actually work.”
No brief commentary such as mine can possibly do full justice to the scope and depth of the information, insights, and counsel that Cal Newport provides. However, I hope that those who read this review will have at least a sense of what his purposes are and how well he serves them. Presumably he agrees with me that it would be a fool’s errand to attempt to act upon, immediately, all of his suggestions. Read strategically, highlight whichever passages are most important, formulate a “game plan,” and then proceed with both determination and patience during your own journey of self-discovery. Bon voyage!
“All procrastinators put off things they have to do. Structured procrastination is the art of making this negative trait work for you. The key idea is that procrastination does not mean doing absolutely nothing. Procrastinators seldom do absolutely nothing; they do marginally useful things such as gardening or sharpening pencils or making a diagram of how they will reorganize their files when they get around to it…The procrastinator can be motivated to do difficult, timely, and important tasks, however, as long as these tasks are a way of not doing something more important.
“Structured procrastination means shaping the structure of the tasks one has to do in a way that exploits this fact. In your mind, or perhaps written down somewhere, you have a list of things you want to accomplish, ordered by importance. You might even call this your priority list. Tasks that seem most urgent and important are on top. But there are also worthwhile tasks to perform lower on the list. Doing those tasks becomes a way of not doing the things higher up on the list. With this sort of appropriate task structure the procrastinator becomes a useful citizen. Indeed, the procrastinator can even acquire, as I have, a reputation for getting a lot done.”
One of these days, I may give some serious thought to these observations….
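In the spirit of Perry’s essay, his priority-list trick can even be sketched in a few lines of code. This is a playful illustration, not anything from the book, and the task names are invented:

```python
# A lighthearted sketch of "structured procrastination":
# keep a priority-ordered to-do list and, while avoiding the
# dreaded top item, accomplish the worthwhile tasks below it.

def structured_procrastinate(tasks: list[str]) -> list[str]:
    """Return the tasks actually done: everything except the top item."""
    return tasks[1:]  # skip the most urgent task, do the rest

todo = [
    "write grant proposal",  # most urgent -- being avoided
    "grade papers",
    "sharpen pencils",
    "reorganize the files",
]
print(structured_procrastinate(todo))
```

The procrastinator ends the day with three tasks done, a reputation for productivity intact, and the grant proposal safely untouched.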
* * *
John Perry is an emeritus professor of philosophy at Stanford University. His essay “Structured Procrastination” won a 2011 Ig Nobel Prize in Literature and he was then able to complete the book by putting off grading papers and evaluating dissertation topics.
Here is an excerpt from an article written by Jonathan Schlefer for the Harvard Business Review blog. To read the complete article, check out the wealth of free resources, and sign up for a subscription to HBR email alerts, please click here.
* * *
One of the best-kept secrets in economics is that there is no case for the invisible hand. After more than a century trying to prove the opposite, economic theorists investigating the matter finally concluded in the 1970s that there is no reason to believe markets are led, as if by an invisible hand, to an optimal equilibrium — or any equilibrium at all. But the message never got through to their supposedly practical colleagues who so eagerly push advice about almost anything. Most never even heard what the theorists said, or else resolutely ignored it.
Of course, the dynamic but turbulent history of capitalism belies any invisible hand. The financial crisis that erupted in 2008 and the debt crises threatening Europe are just the latest evidence. Having lived in Mexico in the wake of its 1994 crisis and studied its politics, I just saw the absence of any invisible hand as a practical fact. What shocked me, when I later delved into economic theory, was to discover that, at least on this matter, theory supports practical evidence.
Adam Smith suggested the invisible hand in an otherwise obscure passage in his Inquiry Into the Nature and Causes of the Wealth of Nations in 1776. He mentioned it only once in the book, while he repeatedly noted situations where “natural liberty” does not work. Let banks charge much more than 5% interest, and they will lend to “prodigals and projectors,” precipitating bubbles and crashes. Let “people of the same trade” meet, and their conversation turns to “some contrivance to raise prices.” Let market competition continue to drive the division of labor, and it produces workers as “stupid and ignorant as it is possible for a human creature to become.”
In the 1870s, academic economists began seriously trying to build “general equilibrium” models to prove the existence of the invisible hand. They hoped to show that market trading among individuals, pursuing self-interest, and firms, maximizing profit, would lead an economy to a stable and optimal equilibrium.
Leon Walras, of the University of Lausanne in Switzerland, thought he had succeeded in 1874 with his Elements of Pure Economics, but economists concluded that he had fallen far short. Finally, in 1954, Kenneth Arrow, at Stanford, and Gerard Debreu, at the Cowles Commission at Yale, developed the canonical “general-equilibrium” model, for which they later won the Nobel Prize. Making assumptions to characterize competitive markets, they proved that there exists some set of prices that would balance supply and demand for all goods. However, no one ever showed that some invisible hand would actually move markets toward that level. It is just a situation that might balance supply and demand if by happenstance it occurred.
* * *
To read the complete post, please click here.
Jonathan Schlefer is author of The Assumptions Economists Make (Belknap/Harvard, 2012). The former editor of Technology Review, he holds a Ph.D. in political science from MIT and is currently a research associate at Harvard Business School.
To read more blog posts by Jonathan Schlefer, please click here.