Big data is being hailed as one of the forces that will transform Singapore into a Smart Nation, but a big crunch could throw a spanner in the works.
The amount of data created by the world annually is expected to balloon by more than 10 times in the next decade, as people fill their hard drives, USB drives and smartphones with thousands of photos and personal documents.
The boom is creating problems of its own. Scientists have predicted that, at this rate, the world will run out of data storage capacity in 181 years' time - even if every atom on earth were used to store data.
"Storage is on the way to becoming the next 'fossil fuel'," said Assistant Professor Anupam Chattopadhyay at Nanyang Technological University's School of Computer Science and Engineering, one of the scientists in an international team that did the calculations using 12 atoms to encode one unit of data.
Even the best-case scenario of cramming data into every subatomic particle to its theoretical limit - known as the Bekenstein Bound - gives just 345 years to saturation point. Their paper is under review in the journal Cognitive Computation.
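A rough back-of-envelope sketch shows how such a projection works. The atom count and growth rate below are illustrative assumptions, not figures from the team's model, so the sketch lands in the same ballpark rather than reproducing their 181-year result exactly:

```python
import math

# Illustrative assumptions - not the researchers' actual model:
ATOMS_ON_EARTH = 1.33e50      # commonly cited order-of-magnitude estimate
ATOMS_PER_BIT = 12            # encoding density cited in the article
ANNUAL_DATA_BITS = 16e21 * 8  # 16 ZB created last year, converted to bits
GROWTH_PER_DECADE = 10        # "more than 10 times in the next decade"

# Total bits the earth's atoms could hold at 12 atoms per bit
capacity_bits = ATOMS_ON_EARTH / ATOMS_PER_BIT

# Tenfold growth per decade compounds to about 1.26x per year
annual_growth = GROWTH_PER_DECADE ** 0.1

# Years until a single year's output alone would exceed total capacity
years = math.log(capacity_bits / ANNUAL_DATA_BITS) / math.log(annual_growth)
print(f"~{years:.0f} years to saturation under these assumptions")
```

Under these stand-in numbers the answer comes out in the low hundreds of years; the precise figure is sensitive to the assumed growth rate, which is why the team's own calculation is the one that matters.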
A report by global storage firm Seagate and IT research firm International Data Corporation last week showed that 16 zettabytes (ZB) of data were created last year.
A regular computer hard drive today holds around a terabyte (TB), while a zettabyte is a billion times that. By 2025, annual data creation will swell to 163ZB, and nearly 20 per cent of it will be critical to people's daily lives.
Data guzzlers include a telescope in Hawaii used by the United States National Aeronautics and Space Administration for scientific analysis, including detecting and avoiding future collisions between earth and space objects such as asteroids. The telescope has a 1.4-gigapixel camera that generates 1.4TB of data every night.
The researchers suggest a few ways to stave off the impending crisis, including aggregating data into useful knowledge that takes up less space than raw data, developing technology to fit more data into less space, and even storing data on other planets.
Associate Professor Biplab Sikdar, from the National University of Singapore's (NUS) Department of Electrical and Computer Engineering, said the latest in data storage is artificial DNA, which can hold more than 200,000TB in a single gram - a package far smaller than a regular hard drive.
However, DNA storage is nowhere near the theoretical maximum of the Bekenstein Bound, and is still too slow for practical use.
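To put that density in perspective, a quick calculation - treating the quoted 200,000TB-per-gram figure as exact, which it is not - shows how little DNA the world's projected 2025 output would need:

```python
# Illustrative arithmetic using figures quoted in the article
ZB_TO_TB = 1e9                   # one zettabyte is a billion terabytes
annual_data_tb = 163 * ZB_TO_TB  # projected annual data creation by 2025
tb_per_gram = 200_000            # quoted DNA storage density

grams_needed = annual_data_tb / tb_per_gram
print(f"{grams_needed / 1000:.0f} kg of DNA")  # roughly 815 kg
```

On those figures, a year's worth of the entire world's data would fit in under a tonne of DNA - which is why the speed of reading and writing it, not the density, is the practical obstacle.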
Humanity's hunger for data is already a problem today. The world's data centres consumed more than 400 terawatt hours - one terawatt equals one trillion watts - of electricity in 2015, with the same carbon footprint as the aviation industry, according to a report by the British news website The Independent.
Singapore's efforts to tackle this include an experimental Tropical Data Centre set up last year in a government-industry partnership to operate with less cooling, which could cut energy costs by up to 40 per cent.
NUS scientists are also developing an ultra-thin data storage medium that may use a tenth of the electricity of today's systems.
What would happen if the day comes when there's no more storage?
"It would be chaotic. As a user, you cannot even communicate since every new word that you type or every new phone call that you make requires some free storage," said Prof Anupam. "At the corporate or state level, it will come to a standstill."