There has lately been a spate of very expensive, full-page colour advertisements in national South African newspapers touting a product called Bio-Strath® as a solution for ADD/ADHD and learning problems in children. No specific mechanism by which Bio-Strath is supposed to work for ADHD was suggested, other than its nutritional value. As a result of the advertisements, I received many enquiries from parents of children who are currently on medication such as Ritalin and Concerta. All wanted to know whether this "natural" product could be a suitable replacement. I referred them to their medical practitioners, but at the same time decided to look into the research evidence for Bio-Strath as a treatment for attention deficit and attention deficit hyperactivity disorder (ADD/ADHD).
The full text of the advertisement can be seen here.
The advertisements referred to a 2006 study in a European journal, Pädiatrie. A homeopath, Irma Schutte, made the following claims and comments about this study:
"A ground-breaking clinical trial recently revealed that Bio-Strath Elixir led to an 76% improvement in the ADD/ADHD symptoms of the children participating in the study... The clinical proof that Bio-Strath taken three times daily can solve almost 80% of these children's problems is staggering news! It would be a grave oversight if children were given medication which could carry risk and has known side effects when Bio-Strath could easily have been the answer."
Strong words indeed, but does the research referred to justify her confidence? I was unable to access the journal article through an Internet search. I then e-mailed SA Natural Products, the South African distributors of Bio-Strath, for more information. They had no problem supplying me with English copies of said article, as well as various other reprints of articles about Bio-Strath. The details of the article are:
König, S. & Joller, P. 2006. Influence of a food supplement on the behaviour of children with attention deficit disorders (ADD/ADHD): Application study with the herbal yeast preparation Bio-Strath® in children. In Pädiatrie 1/2006. (no more info on page numbers, volume, etc.).
Does the article support the strong claims in Bio-Strath's advertisements? Unfortunately not. Here is why:
Author not independent
At least one of the article's authors seemed to have links with Bio-Strath. Dr. Peter Joller authored numerous other articles on Bio-Strath. Another blogger, Frank Swain of SciencePunk, questioned Joller's independence and never received an answer. There was, however, no acknowledgement of a possible conflict of interest in the article.
Was it peer-reviewed?
The lack of scientific rigour evident from the article (see below) suggests that it may not have been peer-reviewed. A peer-review process would have given one more confidence in the findings.
Inadequate research design
Let us now consider the research itself. The research design was a simple pre-experimental one-group pretest-posttest design. This is considered a very weak design, prone to numerous threats to internal and external validity. Suffice it to say, this kind of research, without blinded, random assignment to experimental, control and placebo groups, is really not capable of proving anything. Bio-Strath has been on the market for more than 50 years and the first claims that it could improve concentration in children date back at least 40 years. Surely decent, so-called gold standard research is long overdue?
The participants in the research were selected based on scores on an ADD/ADHD rating scale of unknown reliability and validity. The rating scale was translated from English into German, but there is no indication of adequate translation controls (e.g. a back-translation process). The participants were subjected to pretests and posttests six weeks apart on the Integrated Visual and Auditory Continuous Performance Test (IVA CPT). The IVA CPT seems to be a valid, well-standardised test.
The small number of participants (18) and the large number of tested variables (19) increased the probability of chance effects. The uncontrolled test-retest design is prone to many confounding effects, including history, maturation, test-retest practice, and so forth. The fact that the experiment was not blinded made it prone to participation effects such as placebo and Hawthorne effects (and yes, I know that there is some doubt about the Hawthorne experiments).
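To get a feel for how easily chance alone produces "improvements" in a design like this, here is a minimal sketch. The significance level is my assumption for illustration (the article does not report one), and the tests are treated as independent, which they almost certainly are not; the point is only the order of magnitude.

```python
# Rough, illustrative arithmetic for the multiple-comparisons problem in an
# 18-participant x 19-variable pretest-posttest matrix.
# ASSUMPTION: a conventional alpha of 0.05 and independent tests; the article
# reports neither, so this is purely a back-of-the-envelope sketch.

participants = 18
variables = 19
alpha = 0.05  # assumed significance level, not taken from the article

total_tests = participants * variables            # 342 separate comparisons
expected_chance_hits = total_tests * alpha        # ~17 "significant" results by chance alone

# Chance that any single child shows at least one spurious "significant"
# change across the 19 variables, assuming independence.
p_at_least_one = 1 - (1 - alpha) ** variables     # ~0.62

print(f"Total comparisons:                {total_tests}")
print(f"Expected chance 'hits':           {expected_chance_hits:.0f}")
print(f"P(child shows >= 1 spurious hit): {p_at_least_one:.2f}")
```

Even under these charitable assumptions, roughly 17 of the 342 scores would be expected to come out "significant" by chance, and well over half the children would be expected to show at least one spurious improvement.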
Misleading reporting
The reporting protocol used in the research was certainly very strange and misleading. It seems very much as if they cherry-picked which results to report on and how.
The results were tabled in an 18 × 19 matrix, with the participants ranked by the number of variables on which they improved. König and Joller reported that 12 of the 18 subjects improved on at least one of the variables. According to their criteria, that meant that 66% of the participants improved! The reported improvement ranged from improvement on a single variable to improvement on 15 of the variables. Though not explicitly reported, the average improvement across all participants was on only four of the variables, or 21%. Some of the variables were composite scores, which adds to questions about the appropriateness of the reporting. The authors did not distinguish between ADD, ADHD, or HD in their analysis.
They further reported 116 statistically significant changes: 93 positive and 23 neutral or negative. There was no indication of how the statistical significance was determined. A neutral "change" makes no sense, and I fail to see how it can be statistically significant; maybe I'm missing something. According to their criteria, 93 out of 116 represents an 80% improvement. What they seem to be ignoring, however, are the 226 scores that showed no significant change {(18 × 19) - 116}! Taking those into consideration, the percentage change would be 27% (93/342). Yes, I know this kind of reporting makes no sense, but I followed their reporting procedure.
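To keep the arithmetic transparent, here is a short sketch that recomputes the percentages above from the figures reported in the article; no numbers beyond those already quoted are assumed.

```python
# Recomputing the percentages from the article's own reported figures.
participants, variables = 18, 19
improved_participants = 12       # children who improved on at least one variable
avg_improved_variables = 4       # average number of variables improved per child

significant_changes = 116        # reported statistically significant changes
positive_changes = 93            # reported positive changes

total_scores = participants * variables                  # 342 scores in the matrix
unchanged_scores = total_scores - significant_changes    # scores with no significant change

print(f"Participants improved:             {improved_participants / participants:.1%}")   # ~66.7%
print(f"Average per-child improvement:     {avg_improved_variables / variables:.1%}")      # ~21.1%
print(f"Positive / significant changes:    {positive_changes / significant_changes:.1%}")  # ~80.2%
print(f"Positive / all scores:             {positive_changes / total_scores:.1%}")         # ~27.2%
print(f"Scores with no significant change: {unchanged_scores}")                            # 226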
The "statistical" analysis was followed by an anecdotal discussion of the subjective improvements (as reported by the parents) shown by the two top ranked participants, who respectively improved on 15 and 13 scores. This discussion was clearly of little value, as the parents' report was subject to among other things, confirmation bias. The limitations of anecdotal evidence are well known.
These were just some of the limitations of the research. There are also other, older studies available, which seem to suffer from the same limitations. The study hyped in the newspaper advertisements is the one discussed here.
Why take the trouble to do such inadequate research about a product that has been presented as a solution for ADHD for more than 40 years? Steven Novella, on his blog Neurologica, explains this phenomenon better than I can (I've just replaced the word "acupuncture" with "Bio-Strath"):
"Poor studies that are virtually guaranteed to generate a positive result (like this one) are also useful for marketing propaganda. They create great headlines - and most of the public are not going to read much beyond the headlines and so will be left with the sense that there is more and building evidence that Bio-Strath works. As propaganda this study is very effective."
Conclusion
The advertisement was misleading, to say the least. The heading in huge, bold letters claims: "76% Improvement in ADD/ADHD symptoms". I've shown that to be incorrect - the study is of little value and does not support that claim. Even taking the study at face value, at most an improvement of 27% can be claimed, and even that is highly debatable. The ADD/ADHD symptoms referred to are those measured by the test used and may not directly translate to real-world problems.
I believe that Bio-Strath South Africa, knowingly or unknowingly, placed an advertisement that deceived many parents. The ethical thing for them to do would be to withdraw this advertisement immediately and publicly admit that there is very little evidence that Bio-Strath is effective for ADD/ADHD.
See a later post on Bio-Strath for cognitive enhancement: Bio-Strath is at it again.