20180113

risk, reward

For a while I was obsessed with payoff matrices, those little boxes you see in game theory which tell you who wins and who loses. The concept itself is unremarkable, but once I learned about it, it was like I could see these numbers floating around every decision. I'd see people going through their day picking fights and getting angry, and I wanted to grab them and pull them aside and point at the numbers. They're losing out here. This isn't a good strategic move. You can do so much better.

I finally asked a friend who'd just finished a heated argument, and he said that it's not about winning, it's about being right. It doesn't matter if you convince the other person, you have to let them know they're wrong. "You should try it sometime," he told me. "It'll do you some good."

"Okay," I said, and decided to take his advice right there. "You weren't right in that argument. I think you just think that being angry is the same thing as being right. Most people do."

"Man, fuck you," he told me.

Neither of us, the numbers told me, was better off for this exchange. And that's the real problem with game theory: it assumes we're rational actors. That our behavior is always calculated to maximize our payoff. Risks are only taken if they come with great reward. Poor strategic decisions are never made, because they are poor decisions.

Eventually the numbers went away. They were largely useless anyway--of course I knew that humans are self-destructive creatures at heart. But now I'm starting to wonder if perhaps the numbers were wrong.
