Monday 13 May 2013

Technology is in the driver's seat and we're heading to petaland

There is little doubt that technology has had a large impact on the course of scientific discovery in the past. For example, it was Tycho Brahe's development of more precise measurement techniques for astronomical observation that paved the way for Johannes Kepler's identification of regularity in the motion of the planets (i.e. Kepler's three laws of planetary motion), which eventually led to Newton's formulation of the laws of celestial (and terrestrial) mechanics. While Brahe made his observations with the naked eye, Galileo Galilei's improvement of the telescope, and his subsequent observation of the moons of Jupiter, were important events that, together with other empirical evidence, eventually toppled the Aristotelian world-view. During the same period there was an improvement in observing not only the very distant, but also the very small. The first microscope appeared at the turn of the 17th century, and within 50 years the technology had improved to such an extent that single cells became visible to the human eye. In fact the word 'cell' was coined by Robert Hooke in his book Micrographia (1665). It would take another 200 years of observation before the dynamics of cell division were observed, and another 50 years until it was understood that the chromosomes being shuffled between the two daughter cells were the carriers of hereditary information.

Since the days of the scientific revolution in the 17th century, technology has advanced enormously and every aspect of human life is influenced by technological artefacts. Most of us don't have the faintest idea of how they are constructed or even how they operate, but this is not really an issue since someone knows how they are built and how to fix them if they fail. More disconcerting is the fact that we are often told that only the latest gadgets are worth owning, and that each epsilon change to a piece of technology will revolutionise its use. The need for the latest gadget might be a fundamentally human trait, and in itself rather a matter of economic and political interest, but what happens when this need for new technology enters into the scientific method and our ways of doing science?

The last 30 years of biological research have been heavily influenced by advances in technology, which have led to a massive increase in knowledge about the living world. DNA sequencing, RNA expression measurements (i.e. microarrays) and DNA methylation measurements, just to mention a few, have allowed biologists to address questions that had long remained unanswered. But technology doesn't just answer existing questions; it also poses new ones. Microarray measurements made it possible to map out the RNA expression levels of all the genes in the genome at once. Since all processes that occur within a human cell are affected by the RNA expression of one gene or another, it soon became standard practice within all branches of molecular biology to perform microarray measurements, and they became practically a requirement if you wanted to publish your findings in a respected journal. The data that emerged was high-dimensional and complex, and to this date we don't have a precise understanding of how RNA expression relates to gene regulation and protein expression. Completely ignorant of this lag between measurement on the one hand and theory and concept formation on the other, the biotech industry has continued to develop techniques with even higher coverage and greater depth. The scientific community has been drawn into this technological whirlwind, and today, when we still don't have a full understanding of microarray data, using it is basically frowned upon, and we are asked why we didn't make use of RNA-seq, or 'next-generation sequencing', in our study.

New technology lets us probe further and deeper into living matter, and doubtless this has helped our understanding of these systems. However, considering how little we have learned about human biology from the Human Genome Project (the sequencing of the entire human genome), it's tempting to speculate about where we would be today if all the effort and money that went into sequencing had instead been spent on dedicated experiments and hypothesis-driven inquiry.

Today we are producing terabytes upon terabytes of data from biological systems, but at the same time we seem to know less and less about what that data actually means. I think it's time to focus on the scientific problems at hand, and to try to understand and build theories of the data produced by current technology, before we rush into the next next-generation piece of technology, which in the end will just make us forget what we were initially asking. If not, it won't be long until we count our datasets in petabytes.


