Aurélien Baillon can read minds. Well, not exactly – but he gets pretty close. Baillon combines economics and psychology in a method he has dubbed the ‘economic truth serum’.
TEXT: Loes Singeling-van der Voort
PHOTOGRAPHY: Mark Uyl
Why do we need a truth serum?
"Usually we judge the answers people give by means of a score, of a grade, and then we give feedback. We reward them whenever they give us the right answer. But if we ask a question that doesn’t have a right answer, then that’s beyond our control. How do you score that accurately? We’re talking about subjective questions here, ones that only you know the right answer to. For example, whether you’ve ever cheated during an exam."
A truth serum – sounds kind of magical. What are the components?
"Our 'truth serum' is made up of two components: psychology and economy. Psychology helps us map out your thought processes – what you think or did, and what influences your notion of what others are thinking or have done. Students who have cheated in the past assume that others have also cheated. Students who've never cheated, however, assume that people rarely cheat.
The second component is economics. To settle on a score we usually make use of incentives. For example, I make a bet with you. If you say that you know for certain that one statement is the right one, then I’ll say: sure, I’ll bet you on it. If you win, you’ll get a reward. That’s how I can be certain that you’ve committed to your own answer and are reporting what you truly believe. The way you agree to a bet says something about you – and about your opinions of others. You reveal yourself, really. Put all of this together and there you have it – a truth serum. In betting on others you reveal the truth about who you are. We can make people tell the truth by asking them questions and making them take bets on what others might say."
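To make the betting idea concrete, here is a minimal Python sketch of how such a bet could be settled. It is only an illustration under assumed parameters – the question, the reward, the tolerance and all the names are made up here, and this is not Baillon’s actual scoring rule. Each respondent answers a yes/no question and bets on the share of the other respondents who will answer ‘yes’; the bet pays out if that prediction lands close to what the others actually said.

```python
from dataclasses import dataclass


@dataclass
class Response:
    answer: bool            # e.g. "Have you ever cheated during an exam?"
    predicted_share: float  # the bet: expected fraction of *others* answering True


def settle_bets(responses: list[Response], reward: float = 10.0,
                tolerance: float = 0.10) -> list[float]:
    """Pay each respondent whose bet about the others lands within `tolerance`
    of the observed share of 'True' answers among everyone else."""
    payouts = []
    for i, r in enumerate(responses):
        others = [o.answer for j, o in enumerate(responses) if j != i]
        observed_share = sum(others) / len(others)
        won = abs(r.predicted_share - observed_share) <= tolerance
        payouts.append(reward if won else 0.0)
    return payouts


# Example: three respondents answer and bet on what the others will say.
if __name__ == "__main__":
    sample = [
        Response(answer=True, predicted_share=0.6),
        Response(answer=False, predicted_share=0.2),
        Response(answer=True, predicted_share=0.5),
    ]
    print(settle_bets(sample))  # e.g. [10.0, 0.0, 10.0]
```

The payout rule is the point of the toy example: if you want to win the bet, your best move is to report what you genuinely expect others to say – and that expectation is exactly the information the method exploits.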
"We can make people say the truth by asking them questions and making them take bets on what others might say."
For whom is this truth serum beneficial?
"A website like TripAdvisor, for example, could make use of it. Instead of just piling up all the reviews and taking the median of that score as a reflection of the hotel, they should put more value on trusty reviews. But they can hardly investigate every review to see if the judgement is fair. What they can do, however, is make the review into a bet. As in: not just ask for a review, but also ask you to bet on whether you think others might say much of the same things. If you win the bet, you’ll get a reward: for example a status on the website or a gift card. That’s how they can estimate how 'truthful' your reviews are.
The truth serum can also be used in questionnaires about sensitive subjects, like political convictions. People who vote for extreme parties usually don’t want to admit as much. Using this method would make it possible to reward people who answer with care, and allows us to gauge what their belief system is. At the moment we’re mostly working on validating this method and showing that it works in controlled environments. But like I’ve said before: this method can also work for questions that we don’t have a definitive answer to yet, like climate change. Or, for example, think of a different kind of crisis – one where we’d have no idea who the experts are. In recent years we’ve developed models that we can use to figure out who potential experts might be, and whether they might be valuable in a certain field, even before we have a definitive solution for the issue at hand. It’s very exciting."