
NEW YORK (TheStreet) -- If you're inclined to lie and cheat -- and some days, it looks like just about everybody is -- it helps to distance yourself from your aberrant acts. Golfers, for example, will cheat more if they get a chance to nudge the ball closer to where they want by using a club, as opposed to picking it up with their hand.

And financial professionals -- well, we'll get to the kinds of people who swap tips about confidential Goldman Sachs board meetings in a bit. But there's good news and bad news in the research behind The (Honest) Truth About Dishonesty: How We Lie to Everyone -- Especially Ourselves, the latest book by Duke University's engaging behavioral economist Dan Ariely. The good news is that there are actually ways to minimize the chances people will cheat. The bad news is that in the U.S., the most popular economic philosophy is not, in Ariely's view, one that nurtures corporate honesty.

I met Ariely just after he'd finished giving a speech in Connecticut last month. Amid a crowd of gray-suited executives, the professor was easy to spot, clad in jeans, a Mountain Hardwear hoodie and a pair of blue, white and orange sneakers.

"What's interesting about the economic crisis is that it's not just cheating -- it's cheating with a religious belief," he told me as we were sitting down to chat. Proponents of a free-market economy have taken the philosophy of "individual as good, government as bad" to extremes, Ariely says. "This invisible hand idea is a very, very nice story, and in unique cases it can work," but Ariely says the approach is better suited to perfectly matched chess games where luck doesn't come into play than it is to global economies.

For starters, he doesn't buy the idea that the only way to incentivize people is money. "People say that if you need the best in the business, you have to pay them as an incentive, but is that true?" he asks. "There are other things people care about, like taking a vacation or doing social good."

To get an idea of when people cheat the most, and when they don't cheat at all, Ariely and his colleagues set up experiments that often used students as guinea pigs. Much of the research was a variation on this idea: An experimenter hands out a set of problems and tells the group that the more problems they solve, the more money they will be paid. Some subjects are deliberately set up with an easy chance to cheat, having been instructed to put their test sheet through a shredder after they're done but before anyone can check to see how well they did. Thinking that their correct scores are destroyed (though that's not true, because the shredder is rigged), they report their inflated scores to the experimenter and collect their money.

Ariely discovered in these sessions that not only can cheating be catchy -- seeing one person get away with cheating during an experiment could inspire cheating by others -- but it can also be altruistic. When cheaters learned that someone else would also benefit from their cheating, they became even more dishonest. But there are ways to cut back on cheating, and red flags to watch for:

Ambiguous rules create opportunities for cheaters.

Ariely says a lot of what sets off today's corporate scandals is that ambiguous regulations give cheaters lots of ways to rationalize what they do. "When the rules are not clear, you can play with them," he said. People in creative jobs tend to have more of what Ariely euphemistically refers to as "moral flexibility" -- meaning they drum up more rationalizations to cheat -- than do others. He asked a group of ad agency employees a series of questions about moral dilemmas and found that the higher the level of creativity, the more "flexible" the person. Accountants were the least "flexible."


Financial regulations based on disclosure are an invitation for trouble.

U.S. securities regulations rest on a philosophy that companies must disclose what they're doing on a timely basis, and it's up to investors to do their homework and analyze the mountains of documents. Ariely doesn't think that's working out so well. He'd rather see a financial version of the Food and Drug Administration, where products would be vetted to see if they had any value. It's an interesting idea, but would require a regulatory redo that isn't likely.


There's more cheating today because it's easier to do.

In an increasingly cashless world, financial cheaters are ever more tempted to do what the cheating golfer does: use a tool to break the rules at arm's length. If you're looking to steal money, you have plenty of opportunities these days that never require touching the cash. On top of that, money changes hands more frequently than ever, multiplying the opportunities to cheat.

In one of the experiments Ariely ran, some students got cash on the spot when they reported to the room monitor how many problems they'd solved correctly. Other students had to go through two steps: first reporting what they were owed and receiving plastic chips representing the amount, then walking over to a table to exchange the chips for money. The ones who had a step separating their lie from their receipt of the money cheated twice as much.

Honor codes probably won't put insider-trading rings out of business, but Ariely's research on them does yield some promising ideas. He tempted students from several universities to cheat, expecting that the ones from Princeton, which has a long-established and rigid honor code, would cheat less than the ones at MIT and Yale. Instead, he found that just two weeks after Princeton freshmen finished their ethics training, they cheated in his experiments as much as everyone else.

That, of course, is hardly inspiring, but along the way Ariely observed something useful: remind people of their ethical obligations right before they're exposed to temptation, and they're less likely to cheat. Remind them afterward -- like asking for a signature at the end of an exam -- and it's too late. The takeaway from the honor code research is that reminding people early and often to do the right thing actually makes a difference.