Eat s**t, 300 trillion flies can’t be wrong.
[old joke punchline] “No, I dropped them in that dark alley, but I’d never find them there. That’s why we’re looking under the light post.”
I came across a recent rant by a financial consultant (http://www.littlebear.us/wp-content/uploads/ITCI-Little-Bear-July-2015-FINAL-WORD-PDF.pdf) in which they stated that a certain stock was a bad idea. The central claim in their ‘post’ was that a small pharmaceutical company should have reported percentage change, because everyone else does, and since they didn’t report percentage change, they must be hiding something. I don’t know whether percentage change is the standard for antipsychotic drugs, or whether the pharmaceutical company was hiding something. Frankly, I don’t care. As I stated in Blog 18, even if percentage change were the ‘industry standard’, I would recommend including percentage change only as a tertiary parameter (i.e., present the median with no p-values or confidence intervals). If they and the industry like a certain scale (the PANSS), excellent. If the raw metric is interpretable, see Blog 3 for assessing effect size. If the scale isn’t intuitively interpretable, or the study’s mean or sd is idiosyncratic, see Blog 4 for assessing effect size.
However, this investment firm imputed a percentage change by dividing the average change from baseline by the average baseline. That is simply incorrect math.
Let me review the pre-algebra you learned in grammar school. You probably remember the commutative, associative, and distributive laws.
Commutative law: a+b = b+a or a*b = b*a
Associative law: a+(b+c) = (a+b)+c or a*(b*c) = (a*b)*c
Distributive law: a*(b+c) = a*b + a*c
Let me focus on the distributive law. It works with multiplication, but it DOES NOT work with division. a/(b+c) ≠ a/b + a/c
24/(4+8) = 24/12 = 2, but
24/4 + 24/8 = 6 + 3 = 9
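The arithmetic above can be checked in a couple of lines of Python (purely illustrative, using the same numbers as the example):

```python
# The distributive law holds for multiplication over addition,
# but NOT for division: a/(b+c) is not a/b + a/c.
a, b, c = 24, 4, 8

lhs = a / (b + c)      # 24/12 = 2.0
rhs = a / b + a / c    # 6 + 3 = 9.0

print(lhs, rhs)        # 2.0 9.0 -- clearly not equal
```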
Why is this relevant? Percentage change divides each individual’s change from baseline by that individual’s own baseline (like 24/4 and 24/8, kept separate). That is quite different from dividing by a combined baseline (like 24/(4+8), where the denominators are lumped together first).
Let me illustrate the fallacy with a brief example. Say we had a ten-point scale and two patients. One patient, almost asymptomatic at baseline (1), got slightly worse (a change of -1: he went from 1 to 2). A second patient, severely ill at baseline (9), improved moderately (a change of 3: he went from 9 to 6).
-1/1 = -1.00 (or a percentage worsening of 100%)
3/9 = 0.33 (or a percentage improvement of 33%)
If we averaged the baselines, we would get an average baseline of 5. If we averaged the changes from baseline, we would get an average change from baseline of 1. Average change from baseline/Average baseline = 1/5 = 0.2, a pseudo percentage improvement of 20%.
The true average percentage change from baseline is the average of -1.00 and 0.33, which is -0.333, a percentage improvement of MINUS 33.3% — that is, a worsening of a third.
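The whole two-patient example can be reproduced in a few lines (a sketch with the same hypothetical numbers; variable names are mine):

```python
# Two hypothetical patients on a ten-point symptom scale.
# Changes are scored so that improvement is positive.
baselines = [1, 9]
changes = [-1, 3]  # patient 1 worsened by 1 point; patient 2 improved by 3

# WRONG: divide the average change by the average baseline.
avg_change = sum(changes) / len(changes)        # 1.0
avg_baseline = sum(baselines) / len(baselines)  # 5.0
pseudo_pct = avg_change / avg_baseline          # 0.2, a "20% improvement"

# RIGHT: compute each patient's own percentage change, then average.
pct_changes = [c / b for c, b in zip(changes, baselines)]  # [-1.0, 0.333...]
avg_pct = sum(pct_changes) / len(pct_changes)              # -0.333..., a 33.3% WORSENING

print(pseudo_pct, avg_pct)
```

The two answers don’t just differ in size; they differ in sign, which is exactly why the firm’s shortcut is not a harmless approximation.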
In sum, it is mathematically incorrect to compute percentage change by dividing an average change by an average baseline. I don’t care if you have no other way to compute average percentage change; it is still wrong. Just ask your 5th grade son. <rolling his eyes> “Oh, Dad!”