Your Finger Length Determines Your Ability to Make Money?


When (Not Very) Good Reporting Goes (Extra) Bad

So there’s this news story zooming around the world right now.

Maybe you’ve seen it or heard about it.

First, some high-profile researchers from Cambridge University in the UK last week published an article, “Second-to-fourth digit ratio predicts success among high-frequency financial traders,” in the journal Proceedings of the National Academy of Sciences (“PNAS”).

Two days later, The Economist ran the story as “Digitally Enhanced – Successful financial traders are born as well as made.”

Then, this morning, the Economist story was reprinted in the Star Tribune as “Looking for a winner? Check the ring finger – Cambridge study shows: successful financial traders are born as well as made.”

The study found that men working as “high-frequency” stock traders often had ring fingers that were long relative to their index fingers – a trait known to be a marker of high levels of prenatal testosterone exposure. The articles then drew the conclusion that a useful way to find out if someone is good at making money is to measure their fingers.

Here’s my beef with all of this.

I’ve read the Economist and Star Tribune articles and the abstract from PNAS, but full-text PNAS articles require a password, so I haven’t read the full version of the original study.

The Problem of Sample Size and Source

The study that claims to have drawn scientific conclusions regarding all “high-frequency” traders sampled a recruited group of 44 people from a single trading floor with 200 people total. That is, there is no indication that this group was randomized in any way, or representative of the makeup of any other trading floor in the world. Also, even if it were 44 random people, that doesn’t seem to me like enough people to support useful medical or statistical conclusions.
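To make that concern concrete, here is a minimal sketch of my own (not from the study, and with purely made-up variable names) showing how noisy a correlation measured on only 44 people can be. It repeatedly draws 44 pairs of values with no true relationship at all and checks how large the sample correlation can look by chance.

```python
# Sketch: how big can a correlation look with n = 44 and NO real effect?
import numpy as np

rng = np.random.default_rng(0)
n_traders = 44        # sample size used in the study
n_trials = 10_000     # number of simulated "studies"

chance_correlations = []
for _ in range(n_trials):
    digit_ratio = rng.normal(size=n_traders)   # fake 2D:4D ratios, pure noise
    profit = rng.normal(size=n_traders)        # fake P&L, unrelated by construction
    r = np.corrcoef(digit_ratio, profit)[0, 1]
    chance_correlations.append(abs(r))

chance_correlations = np.array(chance_correlations)
print(f"95th percentile of |r| under pure chance: {np.percentile(chance_correlations, 95):.2f}")
print(f"share of fake studies with |r| > 0.25:    {(chance_correlations > 0.25).mean():.1%}")
```

The exact numbers don’t matter; the point is that chance alone routinely produces correlations of headline-worthy size when the sample is this small.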

The study makes it very clear that trader experience “counted for a lot.” I can’t figure out how a study can use such a tiny sample size, find one strong correlative factor (experience), and still draw conclusive findings about the correlation of a second, far weirder factor.
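Here is another hedged toy example, again with invented numbers of my own: if experience drives profits and also happens to co-vary with the digit ratio, a small study that doesn’t control for experience can “discover” a finger-length effect that isn’t really there.

```python
# Sketch: a confounder (experience) can manufacture a digit-ratio "effect".
import numpy as np

rng = np.random.default_rng(1)
n = 44

experience = rng.uniform(0, 20, n)                                # years on the desk
profit = 3.0 * experience + rng.normal(0, 10, n)                  # P&L driven only by experience
digit_ratio = 1.0 - 0.004 * experience + rng.normal(0, 0.02, n)   # hypothetical link to experience

raw_r = np.corrcoef(digit_ratio, profit)[0, 1]

# Control for experience by regressing it out of both variables.
def residualize(y, x):
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

partial_r = np.corrcoef(residualize(digit_ratio, experience),
                        residualize(profit, experience))[0, 1]

print(f"raw correlation (ratio vs. profit): {raw_r:+.2f}")
print(f"after controlling for experience:   {partial_r:+.2f}")
```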

The Problem of Specialty Generalization

Next, though the original study makes clear that it analyzed only the limited sub-group of “high-frequency” traders, the news reports made the much broader claims that “successful financial traders are born as well as made” and that “making money comes naturally to some people — specifically to men exposed to high levels of testosterone before they were born.” It should be obvious to anyone that even a valid study of “high-frequency” traders doesn’t, by logical extension, support any claims about “successful traders” generally or, even worse, about people good at “making money.”

The Problem of Survivorship Bias

Survivorship Bias is the logical error of drawing conclusions about an activity based only on data from those who succeeded at it, while ignoring data from anyone who attempted the same activity but failed. Here, the researchers only studied current traders’ fingers, not the fingers of anyone who had failed in the same role. If they did the additional research and found that failed “high-frequency” traders also have long ring fingers, then maybe finger length/testosterone predicts an interest in that kind of work rather than success at it, as both the study and the articles claim.
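A quick simulation, once more using made-up numbers of my own, shows how that would look in practice: suppose the trait only predicts who tries the job, while staying in the job is pure chance.

```python
# Sketch: a trait that predicts ENTERING a profession, not SUCCEEDING at it.
import numpy as np

rng = np.random.default_rng(2)

population = 100_000
digit_ratio = rng.normal(0.96, 0.03, population)   # lower ratio ~ more prenatal testosterone

# Assumption for the sketch: low-ratio men are more drawn to trading...
p_becomes_trader = np.clip(0.01 - 0.3 * (digit_ratio - 0.96), 0.0, 1.0)
traders = digit_ratio[rng.random(population) < p_becomes_trader]

# ...but once on the desk, surviving is a coin flip, independent of the trait.
survives = rng.random(traders.size) < 0.5
survivors, washed_out = traders[survives], traders[~survives]

print(f"mean digit ratio, whole population:  {digit_ratio.mean():.4f}")
print(f"mean digit ratio, surviving traders: {survivors.mean():.4f}")
print(f"mean digit ratio, failed traders:    {washed_out.mean():.4f}")
```

Survivors and washouts end up with essentially the same finger measurements, and both differ from the general population; a survivors-only study cannot tell that pattern apart from “the trait predicts success.”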

The Problem of Hindsight Bias

Hindsight Bias is the logical error of drawing conclusions about future success based on past success. This concept has tremendous application in the field of finance. In this case, the study and the articles drew the conclusion that because these traders had been successful in the past, they would therefore be successful at the work in the future. That is, they explicitly claimed that 44 people who have been successful at this kind of trading must therefore be talented at it. Is it possible that “high-frequency” trading takes a tremendous amount of skill? Certainly. Is it also possible that “high-frequency” trading just takes a lot of luck and the “survivors” that were sampled happened to be the lucky few? Seems possible. The big problem is that this question is never addressed. We are just told, as fact, that “success” at this kind of work is a game of skill, not chance.
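A back-of-the-envelope simulation (my own assumptions, not the study’s) shows how easily pure luck manufactures apparent track records on a 200-person floor like the one sampled.

```python
# Sketch: how many "consistent winners" does pure luck produce on a 200-person floor?
import numpy as np

rng = np.random.default_rng(3)

floor_size = 200      # same size as the floor the study recruited from
years = 5
n_simulations = 10_000

lucky_streaks = []
for _ in range(n_simulations):
    profitable = rng.random((floor_size, years)) < 0.5    # each trader-year is a coin flip
    lucky_streaks.append(profitable.all(axis=1).sum())     # traders up every single year

print(f"average traders with an unbroken winning streak by chance alone: "
      f"{np.mean(lucky_streaks):.1f} of {floor_size}")
```

Even with coin-flip outcomes, a handful of traders post an unbroken run of profitable years, and those are exactly the people a survivors-only, hindsight-driven study would single out as “talented.”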

The Problem of Drawing Practical Recommendations from Scientific Research

Journalists know that most people who read their articles (especially in science reporting) will assume that the whole article is based on a valid scientific study. These journalists also know that very few readers will ever bother trying to find and read the original study. Yet here these articles try to convince people that maybe they need to start worrying about the finger length of their family’s financial advisor or banker. This crazy-generalized claim is never made in the original study, but it certainly helps a newspaper editor get excited about publishing the article. Shame on them.

This is all to say that (1) I’m highly skeptical of the original Cambridge study; (2) I’m disappointed in the mainstream media outlets that report on these findings and draw their own conclusions from them, without showing any skepticism themselves; and (3) it’s made worse when media outlets republish others’ flawed reporting without adding any original analysis of their own.

Am I being too harsh? Probably.

But if you’re interested in learning more about cognitive biases, logical errors, and financial trading, I highly recommend Nassim Nicholas Taleb’s book “Fooled by Randomness.”

For an overview of Taleb’s theories, check out Malcolm Gladwell’s New Yorker article, “Blowing Up.”

Should we be able to wager on terrorist attacks?

Freakonomics author Steven Levitt argues that the US government should reconsider its decision to prohibit predictive markets – a pilot project where individuals could purchase “futures” contracts on the likelihood of future events like terrorist attacks, global warming, election outcomes, etc. The idea is that self-interested economic participation in these markets might lead to new intelligence data and warnings about impending public events.

It’s a fascinating concept, and Levitt makes a strong argument that the public reaction against such concepts as “betting on a terrorist attack” is actually overblown given the potential value of such market data.
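For what it’s worth, the aggregation idea itself is easy to sketch. The toy model below is my own drastic simplification (not Levitt’s proposal or the actual pilot design): traders who each hold a noisy private estimate nudge a market price toward their belief, and the resulting price usually lands closer to the truth than a typical individual guess.

```python
# Sketch: a market price as a rough aggregator of many noisy private estimates.
import numpy as np

rng = np.random.default_rng(4)

true_probability = 0.12          # the event's actual (unknown) likelihood
n_traders = 500

# Each trader holds a noisy private estimate of the true probability.
private_estimates = np.clip(true_probability + rng.normal(0, 0.08, n_traders), 0.01, 0.99)

price = 0.5                      # market opens at "no information"
for belief in private_estimates:
    # A budget-limited trader profitably buys (or sells) "yes" contracts
    # whenever the price disagrees with their belief, nudging it along.
    price += 0.05 * (belief - price)

individual_error = np.abs(private_estimates - true_probability).mean()
market_error = abs(price - true_probability)
print(f"average individual error: {individual_error:.3f}")
print(f"final market price error: {market_error:.3f}")
```

That is the intuition behind treating market prices as an intelligence signal, even though real prediction markets use far more careful mechanisms and raise exactly the ethical questions the public reaction points to.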