In today's digital age, a billion is the new million in data terms. Before the 1970s, the British billion was different from the US billion - a million million as opposed to a thousand million. Think what confusion that would cause in today's global business world. Thankfully, nobody needs to feel short-changed these days, as the two have been aligned, but for most of us a billion simply means an extremely large figure.
However, the data explosion has brought us a whole new vocabulary when it comes to numbers. High-performance computers have been with us since before the recalibration of the British billion, but suddenly they are becoming pivotal to unlocking the new knowledge big data can bring.
Bull has recently announced a new exascale programme aimed at designing and developing its next generation of supercomputers. It's believed that these will eventually perform more than one billion billion operations a second; that's the number 1 followed by no fewer than 18 noughts.
In case you're wondering, an exascale computer is one that achieves a performance of more than one exaflop. And in case you're still wondering, an exaflop is those billion billion operations per second. That's a thousand times more powerful than today's leading systems, and of the order of processing power often compared to the human brain.
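To put that scale in perspective, here's a minimal back-of-the-envelope sketch in Python. The petascale figure is an illustrative order of magnitude for today's leading systems, not a published specification of any particular machine:

```python
# Back-of-the-envelope arithmetic for exascale performance.
# All figures are illustrative orders of magnitude, not vendor specs.

EXAFLOP_PER_SEC = 10**18   # exascale: a billion billion operations per second
PETAFLOP_PER_SEC = 10**15  # petascale: the rough scale of today's top systems

# Exascale is a thousand times faster than petascale.
speedup = EXAFLOP_PER_SEC // PETAFLOP_PER_SEC
print(f"Exascale vs petascale: {speedup:,}x")  # -> 1,000x

# A workload that keeps a one-petaflop machine busy for a whole week...
week_seconds = 7 * 24 * 3600
workload_ops = PETAFLOP_PER_SEC * week_seconds

# ...would finish on an exascale machine in roughly ten minutes.
minutes = workload_ops / EXAFLOP_PER_SEC / 60
print(f"Same week-long job at exascale: ~{minutes:.0f} minutes")
```

In other words, a simulation that currently ties up a petascale system for a week could, in principle, run to completion at exascale in the time it takes to make a cup of tea.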
So what's the point of this brain-addling power? It's thought that most of the complex problems scientists and engineers will need to address in the future lie at the crossroads where big data meets high-performance computing. For instance, genomic therapy and drug innovation, full-scale simulations of aircraft and the modelling of entire weather systems all need this exascale power.
Bull's exascale programme relies on tight integration between this enormous computing power and the capacity to process massive volumes of data. It combines faster processors, more data capacity, ultra-fast connections, greater energy efficiency and enhanced cooling. It is the result of major investment in research and development, and of close co-operation with European laboratories such as the CEA (the French Alternative Energies and Atomic Energy Commission) and other partners.
However, it would be wrong to think that these computers will never leave the rarefied environment of academic research. The cloud has already created a new delivery model known as HPC-on-Demand. Although this is not a general-purpose cloud service, it does mean that businesses can buy access to today's supercomputers as they would any other cloud-based solution.
This is bringing the world of supercomputers to smaller but expanding organisations that want to take on more work quickly and burst capacity when required. We're not yet talking exaflops, but there's certainly enough power to reduce 'time to insight' - the time between the presentation of a problem and reaching an understanding of how to solve it.
Meanwhile, the exascale programme is progressing to help us solve the challenges of the 2020s. It seems we will soon have to start thinking in trillions and quadrillions - and watching out for the next quintillionaire.