
Gaming the System One Click at a Time


“Be in the boardroom in 10 minutes,” reads the E-mail from Senior Vice President Alan Young. The CEO is out on his boat, and a storm has knocked out all communication. Worse, there has been a massive fire in the call center in South America. “We could lose billions,” Young says. The board has given senior staff emergency powers. You’re a top manager who has been called in to help. What do you do?


No, it’s not a bad TV movie but rather the final scenario in a new computer-based simulation called Virtual Leader. Mixing educational content with a dash of video-game spice, such computer simulations are the newest high-tech training ground for managers. “It’s not often that you can hit the rewind button in a business situation,” says Alfredo Herrera, a product engineering manager with Advanced Micro Devices in Austin, who took part in a company-sponsored simulation earlier this year. “Simulations allow you to do that.”


While hands-on learning has always appealed to students and educators, says E-learning expert Pat Galagan of the American Society for Training and Development, only recently have the technology and price been right for computer-based simulations to emerge from military and flight schools and enter the corporate classroom. According to Gartner Inc., a Stamford, Conn.-based research firm, within four years 70 percent of all corporate E-learning will include some kind of simulation.

Realistic learning experiences, delivered over the Internet or via CD, are at the heart of most simulations. San Francisco-based Ninth House even uses TV and movie actors, like Brian George (he played the failed Pakistani restaurateur on Seinfeld), in its streaming-video programs on how to hire employees and manage projects. Other computer-based training programs replicate the mundane details of real-life businesses. Applying to be a pharmacist at Walgreens? You can spend about two hours on a day-in-the-life module created for the drugstore chain by CognitiveArts in Evanston, Ill. Tasks include filling out work schedules, answering phones, and reprimanding employees.


SimuLearn, the Norwalk, Conn., company that created Virtual Leader and is now marketing it to potential clients, uses a database of over 200 body movements to construct characters that behave–sort of–like real people. Push too hard when negotiating with your virtual colleagues (all of whom look as if they walked off the set of a slightly out-of-date video game), and they might put their faces in their hands or even slam the table in anger. Things could be worse: Give poor advice to a saleswoman in a business gaming program created by London-based Imparta, and she may just give you the finger.


“People learn by practicing, by making mistakes,” says Clark Aldrich, executive vice president of SimuLearn. Most of the simulations allow users to repeat sessions so they can see the results of different choices if their decisions didn’t go well the first time around. Some programs, like Imparta’s, even have electronic “mentors” who guide users through rough patches. But it’s the true-to-life narratives that keep busy managers engaged. During his daylong session with a simulation created by Philadelphia-based Strategic Management Group, for instance, one of Herrera’s “employees” quit. Dealing with the departing employee’s workload made him realize how unprepared his own team was for such a situation. “The simulation pointed out weaknesses in the way I was managing my [group],” Herrera says. So he began to cross-train his employees to cover for one another in an emergency. A few months later when one of his employees quit in real life, Herrera says, he was ready.


Simulations are not without their drawbacks. Morgan McCall, a professor of management and organization at the University of Southern California, says there’s simply no substitute for real-world experience. An interactive computer program is better than a textbook, he says, “but it’s not the same as facing a troublesome subordinate across the table.” Simulations teach only limited interpersonal skills because they provide limited choices. “It’s emotional realism, not theoretical or intellectual, that really drives learning,” McCall says.


But for some, a little role-playing at the computer screen can yield valuable insights. Rahul Roy, a management director with the advertising firm J. Walter Thompson in Chicago, played a marketing director in an Imparta-created program last year. “It made me think from a different point of view,” he says. Roy’s clients are typically marketing directors, so he relished getting an up-close look at everything from their vocabulary to the problems they face in talking to their own research, sales, and production departments. “For a day,” he says, “I was my client.”

This article first appeared in US News and World Report.

Gardner Heist Anniversary: An update on investigative angles

A reader pinged me the other day asking for an update on the Gardner heist, and with the 24th anniversary of the theft coming up in a few days, here are a few developments. I’ve cribbed in places from earlier posts here and elsewhere.


The Bulger angle. Whitey Bulger was arrested in 2011. He does not appear to have made any mention of the art to prosecutors before or after his recent trial, and in the end, there are no concrete clues tying Bulger to the Gardner heist. All of Bulger’s old associates—Stevie Flemmi, Kevin Weeks, John Martorano—have turned state’s evidence, and not one of them has ever fingered Bulger for the museum robbery. In all of the Bulger wiretaps, court documents, and surveillance records, there is no mention of the paintings, either.

To be sure, I imagine it’s possible that Bulger made some phone calls after he learned of the theft, and it’s certainly possible that he knows–or thinks he may know–who did rob the museum. But in the end, I don’t believe he has any idea where the art is today.


The David Turner angle. In my book, I present evidence that Boston mobster David Turner was one of the Gardner thieves and suggest that George Reissfelder was his accomplice. Since the book came out, more evidence has tied Turner and the crew of Carmello Merlino’s TRC Auto to the robbery.

After the book was published, the Boston Herald interviewed George Reissfelder’s brother, Richard, who claims that he saw one of the stolen Gardner paintings in George’s apartment. I’ve also heard recently that Turner is shopping around a book proposal. There’s no new evidence on the whereabouts of the actual paintings, though.


The Robert Gentile angle. Among Gardner observers, there’s a theory that the paintings went from the Turner/Merlino/TRC crew to a bank robber named Bobby Guarente, and that Guarente then passed the paintings to Robert Gentile. Gentile is an organized-crime figure, and the FBI recently raided his home but found nothing. Gentile spent some time in prison; he got out in January, and he doesn’t seem to be giving up any information.

This is perhaps the most promising lead in recent years, but excitement has cooled as Gentile has not come forward with the art. This Hartford Courant article gives a great summary of Gentile and his alleged connection to the stolen paintings.

Last year, the FBI announced that it also believes the art went down to the Philadelphia area. (The idea is that the paintings moved through the organized crime network that runs up and down the Northeast.) The Boston Globe sums it up well: The FBI believes “the paintings have changed hands several times, making their way through organized crime circles from Boston to Connecticut and Philadelphia, where some of the art was offered for sale as recently as a decade ago.”

When it comes to the public playing a role in the case, this might be the best hope for recovering the art. In other words, we need to keep hoping that someone in the New Jersey/Philadelphia area calls in a new tip about the whereabouts of the stolen art.


In the Quest to Improve Schools, Have Teachers Been Stripped of Their Autonomy?

Over the past few years, an ever-growing chorus of pundits has argued that teachers have come to deeply dislike their jobs. These pundits claim that teachers are unhappy with their lack of control and freedom and that discouraged educators have been fleeing the profession in droves.

Take, for instance, teacher and education blogger Vicki Davis, who recently argued in The Washington Post that many educators are leaving schools because of cookie-cutter approaches to teaching and learning. “Many U.S. teachers don’t even have the authority to upgrade their web browser or fix a printer,” Davis wrote. Or consider UCLA education management expert Samuel Culbert, who wrote in a New York Times article last year that teachers need far more space to try new things. “If [teachers] are allowed to search for the best answers, they’ll find them.” And then there is Furman University education professor Paul Thomas, who argues that educators today are “teaching in a time of tyranny.”

But do teachers really lack autonomy and freedom? And more importantly: As a nation, have we struck the right balance of accountability and autonomy necessary for workplace innovation, career satisfaction, and overall results?

To gain a better handle on this issue, we examined a number of relevant data sets. First, we conducted an analysis of the 2011-12 Schools and Staffing Survey, or SASS, a nationally representative survey of teachers and principals administered regularly by the National Center for Education Statistics. These data are the most recent available. Second, we looked at various state surveys, including 2013 data from Kentucky and Tennessee, as well as other recent national polling data on teacher attitudes.

The data suggest something quite different from the conventional wisdom. In fact, teachers are far more autonomous—and far more satisfied—than most people believe. In many ways, the problem is how we think about educator autonomy. For years, we as a nation have believed that teachers should have day-in, day-out control over both what they teach (such as what students should know and be able to do by the end of high school) and how they teach (such as specific instructional strategies and methods). This mindset should change because the real problem in public education today is that many teachers have too much control over what they teach each day in their classroom—and it prevents them from perfecting how they teach.

Read more here

Excerpt from my recent report for the Center for American Progress. 

Sorry Legacy of the Founding Fathers

In 1784, five years before he became president of the United States, George Washington, 52, was nearly toothless. So he hired a dentist to transplant nine teeth into his jaw–having extracted them from the mouths of his slaves.

That’s a far different image from the cherry-tree-chopping George most people remember from their history books. But recently, many historians have begun to focus on the role slavery played in the lives of the founding generation. They have been spurred in part by DNA evidence made available in 1998, which almost certainly proved Thomas Jefferson had fathered at least one child with his slave Sally Hemings. And only over the past 30 years have scholars examined history from the bottom up. Works by Gore Vidal, Henry Wiencek, and Garry Wills reveal the moral compromises made by the nation’s early leaders and the fragile nature of the country’s infancy. More significant, they argue that many of the Founding Fathers knew slavery was wrong–and yet most did little to fight it.

More than anything, the historians say, the founders were hampered by the culture of their time. While Washington and Jefferson privately expressed distaste for slavery (Jefferson once called it an “execrable commerce”), they also understood that it was part of the political and economic bedrock of the country they helped to create.

Political capital. For one thing, the South could not afford to part with its slaves. Owning slaves was “like having a large bank account,” says Wiencek, author of An Imperfect God: George Washington, His Slaves, and the Creation of America. The southern states would not have signed the Constitution without protections for the “peculiar institution,” including a clause that counted a slave as three fifths of a man for purposes of congressional representation.

And the statesmen’s political lives depended on slavery. The three-fifths formula handed Jefferson his narrow victory in the presidential election of 1800 by inflating the votes of the southern states in the Electoral College. Once in office, Jefferson extended slavery with the Louisiana Purchase in 1803; the new land was carved into 13 states, including three slave states.

Still, Jefferson freed Hemings’s children–though not Hemings herself or his approximately 150 other slaves. Washington, who had begun to believe that all men were created equal after observing the valor of black soldiers during the Revolutionary War, overcame the strong opposition of his relatives to grant his slaves their freedom in his will. Only a decade earlier, such an act would have required legislative approval in Virginia. He suspected the country would eventually come to its moral senses and find the notion of owning other human beings repugnant, says Joseph Ellis, author of the bestselling Founding Brothers. “He knew his legacy depended on it. He knew that we were watching.”

Yet how should we view other framers of independence, such as Declaration of Independence signer Richard Henry Lee and Patrick Henry, who traded and whipped their slaves? Or James Monroe, who, as governor of Virginia in 1800, executed nearly 30 slaves after rushed trials following an attempted revolt? For some historians, such actions cloud the founders’ legacy. “The other founders resisted emancipation, not because it was a mad scheme but because they did not want to relinquish the wealth which slave sales poured into their coffers,” says Wiencek.

Other scholars believe the Founding Fathers can best be seen squarely within their time. “To contextualize is not to excuse,” says Rutgers University historian Jan Lewis. “It’s to show the complexity.” Understanding the early leaders’ severe lapse in judgment over slavery, say Lewis and other historians, makes their ability to found a new and democratic nation all the more incredible.


This first appeared in US News and World Report.

We’re All Lying Liars

Admit it: You’ve lied. You told a friend that his shirt looked stylish when you actually thought it was tacky and garish. Or maybe you said to your boss that her presentations were fascinating when in fact they were insipidly mindless. Or perhaps you told your landlord that the rent check was in the mail.

Don’t feel bad. You’re in good, dishonest company. A growing body of research shows that people lie constantly, that deception is pervasive in everyday life. One study found that people tell two to three lies every 10 minutes, and even conservative estimates indicate that we lie at least once a day. Such incessant prevarication might be a necessary social evil, and researchers have recently discovered that some fibbing might actually be good for you. “We use lies to grease the wheels of social discourse,” says University of Massachusetts psychologist Robert Feldman. “It’s socially useful to tell lies.”

Researchers have been studying deception for decades, trying to figure out why we tell lies. It turns out that we spin facts and make up fictions for all sorts of reasons. We might want to gain a raise or a reward, for example, or to protect friends or a lover. Our capacity for deceit appears nearly endless, from embroidering stories to wearing fake eyelashes to asking “How are you?” when we don’t actually care. We even lie to ourselves about how much food we eat and how often we visit the gym.

Small embellishments can have positive psychological effects, experts say. In a study released last year, researchers found that college students who exaggerated their GPA in interviews later showed improvement in their grades. Their fiction, in other words, became self-fulfilling. “Exaggerators tend to be more confident and have higher goals for achievement,” explains Richard Gramzow, a psychologist at the University of Southampton in England and one of the study’s coauthors. “Positive biases about the self can be beneficial.”

People who deceive themselves also tend to be happier than people who do not, some research suggests. There are social payoffs, too: Studies have shown that people who lie frequently are viewed as friendlier and more amiable than their more truthful counterparts. Still, lying is generally regarded as immoral and distasteful. “No one likes being lied to,” says former FBI agent and lying expert Joe Navarro. “We feel betrayed. When is it that they are telling the truth?” And people do really want to know the truth. A new Fox drama, Lie to Me, which features a steely British deception expert, has become one of the most popular shows on television.

Lying begins early. By the age of 3, most children know how to fib, and by 6, most lie a few times a day. Experts believe that children learn to lie by observing their parents do it—that they become practiced in the art of deception by imitating Mom and Dad. And parents sometimes explicitly encourage children to tell lies. Grandma Suzy will send some ugly wool socks or an itchy sweater, and parents will ask their son or daughter to say the item is lovely. As one study concluded, children “may learn to lie in the same way as they learn to speak.”

Many experts don’t see much difference between a little lie (telling Grandma you loved the ugly socks) and a big lie (covering up an extramarital affair). “Anything that is not accurate is a lie. You can argue that a lie done to make someone else feel better is relatively minor. But they have an effect. The bottom line is that a lie is a lie,” says Feldman. “That’s the great paradox here. I do believe the more lies, the more degradation. But you can’t stop lies entirely. Society would grind to a halt.”

Still, people act differently when they’re gilding a story and when they’re telling a massive whopper. When people tell a bold and blatant lie, they typically become tense and fidgety. Their heart rate speeds up. Their body temperature increases. But when telling white, or social, lies, they usually don’t feel any anxiety at all. In fact, electrodes attached to the bodies of students in Gramzow’s study revealed that the students who exaggerated their GPAs showed less nervous-system activity than students who were honest about their marks. “In certain situations, such as when someone asks you if you like the awful meal they just served you or the hideous outfit they are wearing, it probably takes less thinking to tell the expected polite lie than the more difficult truth,” explains University of California-Santa Barbara psychologist Bella DePaulo.

That doesn’t make it any easier for people to sort out fact from fiction. Studies have shown that people can identify lies only about 50 percent of the time, or about the same as chance. To be sure, researchers have been able to figure out some clues to uncovering deception. When people tell a significant lie, for instance, they typically gesture less and their arms may appear stiff. People telling lies also might have dilated pupils because they feel nervous about spinning an untruth.

Even with the development of such research, there’s no surefire way to catch a liar. But someone with a known track record of lying is likely to pay a price. “Lies add up,” says Feldman. “The more you know that someone is not telling you the truth, the less trustworthy they are. They’re just telling you stuff you want to hear, and you won’t listen to them anymore.”


This article first appeared in US News and World Report.