December 16, 2011

Is the sequencing market saturated?

Since 2007, the cost of sequencing has declined by around 80% each year. This is amazing progress, a testament to the ingenuity of the sequencing industry - and, I'll argue in this post, increasingly irrelevant.

My claim is based on the observation that sequence generation (library preparation and running the sequencing machine) is only part of the cost of a sequencing project. In a recent paper, Mark Gerstein and coworkers estimated that currently around a third of the cost of a sequencing project is spent on sequence generation, whilst the rest of the money goes to sample collection, data management, data analysis and other tasks. In a few years, those other costs are likely to remain at a similar level, whilst the cost of sequence generation is likely to continue its massive decline. Less than $100 per genome is not unrealistic.

This means that the cost of sequence generation will become negligible in comparison to the other costs, and that any further decrease in that cost will be irrelevant compared to the potential savings that can be achieved elsewhere.
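To make the argument concrete, here is a back-of-envelope sketch under the assumptions above: sequence generation is a third of the total project cost today, it declines by 80% per year, and the other costs stay flat. The specific figures are illustrative, not from the Gerstein paper.

```python
# Illustrative assumption: generation = 1 unit, other costs = 2 units,
# so generation is a third of the total today.
generation, other = 1.0, 2.0

for year in range(4):
    share = generation / (generation + other)
    print(f"year {year}: generation is {share:.1%} of total cost")
    generation *= 0.2  # an 80% decline per year

# → year 0: generation is 33.3% of total cost
# → year 1: generation is 9.1% of total cost
# → year 2: generation is 2.0% of total cost
# → year 3: generation is 0.4% of total cost
```

Within three years of compounding, sequence generation falls from a third of the budget to well under one percent, at which point even halving it again saves almost nothing.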

[Figure (conceptual): The cost of library preparation and actually running a sequencing machine (sequence generation) is likely to decline as a proportion of the total cost of a sequencing project.]
In other words, very soon the market demand for ever cheaper sequencing may be saturated. Nevertheless, it seems that low cost is still the single most important thing established sequencing companies are aiming for.

My guess is that this will provide an opportunity for start-ups that concentrate on different performance metrics, such as speed, simplicity or ease of use. In part, this is happening already. Ion Torrent has started selling a sequencer that is more expensive to run than the competition's. However, it is also more convenient to use and takes up less space. It is reported to be selling well.

Do you agree that future decreases in the cost of sequencing will become less important, whilst other performance metrics will become more important? If yes, what are these performance metrics likely to be?


  1. Google Koomey's Law for an interesting new metric for the long-term trend of computing cost: computing operations per kilowatt-hour. In big compute farms, power and cooling are major issues that are likely to become even more important. In portable computing, battery weight is a major concern that is likely to become even more important.

  2. I agree that different metrics will be important, but I disagree that this is an opportunity for start-ups. The incumbents are more than willing to focus on simplicity/ease of use as you describe for the Ion Torrent, hence market disruption will not be easy.

    There is opportunity in downstream data processing as Christensen describes in the law of conservation of attractive profits. Whether or not the incumbents will pursue this is yet to be seen.

    As for market saturation, there is still a large unserved market: researchers who weren't genomics-oriented, but are looking into the field due to the decline in cost. There is room for a lot of innovation here, but again, I'm sure that the incumbents are going after this.