I am currently writing a piece on genetic testing, basically arguing that genetic testing is still a research method and that whole genome sequencing is not ready for prime time, as now also summarized in JAMA:
In this exploratory study of 12 volunteer adults, the use of WGS was associated with incomplete coverage of inherited disease genes, low reproducibility of detection of genetic variation with the highest potential clinical effects, and uncertainty about clinically reportable findings. In certain cases, WGS will identify clinically actionable genetic variants warranting early medical intervention. These issues should be considered when determining the role of WGS in clinical medicine.
Twenty years ago, judging any scientific method was largely limited to experts. You had to know something about research, go to a library, find the relevant information, and eventually put it into the right context. Only a few people, and only a few journalists, could do that (and only the latter would ever publish their opinion).
This has completely changed now that so many research papers are published online. There is no more gate and no more gatekeeper. It also means, however, that research papers are frequently misinterpreted – by patient advocacy groups, companies, and medical doctors alike. I wish research papers carried a "For research use only!" label, as printed on many bottles of enzymes, antibodies and the like (medical information in Germany is otherwise still restricted to physicians, pharmacies and medical staff). Given this rather muddled situation in genetic testing, I think the new JAMA paper is a welcome recommendation for everybody!
incomplete … low reproducibility … uncertainty
I haven't found much time to update the blog during the past few months – there are too many attractions out there, and so many interesting things to do. The never-ending problem is that there is too much to read and too little time. This is, however, what other people find as well, for example genomeweb.com:
Pedro Beltrao at the Public Rambling blog says there never seems to be enough time to keep up with all the literature researchers keep churning out. In 2009, 848,865 papers were added to PubMed, he says — that’s something like 1.6 papers per minute. While there’s definitely no scarcity of outlets to publish, is anyone even paying attention?
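Beltrao's back-of-the-envelope figure is easy to verify; a minimal sketch of the arithmetic:

```python
# Papers added to PubMed in 2009, per Beltrao's count
papers_2009 = 848_865

# Minutes in a (non-leap) year
minutes_per_year = 365 * 24 * 60  # 525,600

papers_per_minute = papers_2009 / minutes_per_year
print(f"{papers_per_minute:.2f} papers per minute")  # about 1.6
```

So even reading around the clock, a new paper appears faster than most of us can skim an abstract.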
Or the Latest Everything blog:
From a half-forgotten Einstein quote to the complete works of J. S. Bach, everything is instantly available. But what can we really do with it all? A half-century ago Marshall McLuhan wrote: "We are today as far into the electric age as the Elizabethans had advanced into the typographical and mechanical age. And we are experiencing the same confusions and indecisions which they had felt when living simultaneously in two contrasted forms of society and experience."
which republishes the New Scientist article "Surfing the data flood: too much to read, too little time" (4 April 2011, pp. 1-3).
Nassim Taleb points in his black swan book (p. 181 in the 2008 German edition) to the "toxicity" of continuously added information. He cites experiments from the 1960s in which students were shown increasingly sharp pictures of water pipes. Students shown a series of gradually sharpening pictures had far more difficulty recognizing the pipe than students shown the same final picture without any interim pictures. Together with Stuart Oskamp's experiments, this suggests it is harder to spot real breakthroughs when you are too deeply involved (and obsessed with collecting the complete literature of a particular field). This even calls my daily PubMed alerts into question; I will now switch them to a monthly update.