The growth of data is the key driver enabling new solutions in the technology space. Data now comes in many forms and from many sources. For example, video is one of the main data sources in today's consumer and surveillance applications. Similarly, enterprises generate web data from many sources: customer behavior, A/B design testing, ad effectiveness, heat maps, and semantic analysis. Data storage technologies have evolved to match this data and its storage requirements.
Similarly, streaming data matters for trend-driven industries like fashion, food, social media, and entertainment, which rely on it to predict market trends. Moreover, today's businesses focus on ever greater specificity and granularity, which increases two vital characteristics of Big Data: volume and velocity.
For example, food retailers now track their supply chains from shipment all the way to who bought each package, down to the individual level. Furthermore, as storage costs continue to decline at 25% to 40% annually, more and more applications become economical, increasing the demand for storage.
Interestingly, the amount of data that needs to be stored over the coming decade will rise into the exabyte range, driven by streaming data, video, machine learning, artificial intelligence, IoT, and more, pushing even private data stores to exabyte scale. Furthermore, as AI grows more capable, it needs exponentially more training data, and therefore more storage. Not to mention, companies that better leverage opportunities in analytics and big data will lead the others.
Related post – Hot data storage 2020 technology trends
How have data storage technologies changed over the years?
Two decades ago, data storage technologies depended entirely on storage silos. Data strategists welded applications to the server OS and storage arrays. Upgrades to those storage solutions meant expensive new hardware and risky migrations, and handling usage spikes meant a chronically over-configured infrastructure.
Data storage solutions changed with OS virtualization, cloud integration, containers, and the scale-out architectures that support them. This may make us long for the days when we could walk into a data center and touch our storage. Nowadays, cloud gateways are integrated into enterprise storage arrays, and developers can spin up hundreds of terabytes for software testing. As a result, it is harder than ever to know who is using storage or why, and harder still to know whether a particular data storage technology is cost-effective.
So, what's the solution? Maybe it will eventually appear as a cross-vendor storage tracking and analysis application that leverages machine learning to advise admins on optimizing the total storage infrastructure for performance and cost. Such an application would know each storage option's cost, performance, availability, and reliability, and weigh them against one another. That is a long way off, so what futuristic data storage technologies are on the list today? Let's take a look at the following six.
1. Helium Drives
Helium-filled hard drives are, no doubt, pushing the capacity boundaries of hard drives, as they are sealed with helium rather than air. Several companies produce this data storage technology: Western Digital announced the first 10TB helium-filled hard drive, while Seagate announced an 8TB air-filled drive. Because helium is less dense than air, the platters face less drag and take less power to spin. The technology is still expensive, but these high-performance drives will likely get cheaper and even higher in capacity over time, perhaps affordable enough even for consumer use.
2. Shingled Magnetic Recording (SMR)
Among data storage technologies, Shingled Magnetic Recording (SMR) is a newer hard drive recording technique that allows higher capacity than traditional recording methods. How does this happen? As Seagate explains:
In SMR technology, the tracks are squeezed closer together until they overlap like shingles on a roof, achieving higher areal density and allowing more data to be written in the same space. When new data is written, the write head trims, or "shingles," the adjacent track. Because the reader element on the drive head is narrower than the writer, data can still be read off the trimmed track without compromising data integrity or reliability. Furthermore, SMR can use traditional reader and writer elements.
SMR is a cost-effective solution, as it requires no new production capital to build SMR-enabled HDDs. Seagate brought the first SMR hard drive to market in 2014, improving hard drive density by roughly 25%.
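The shingled overlap described above has a well-known cost: because each track partially covers the one written after it, rewriting a track in place forces a rewrite of every later track in the same band. This toy sketch (the 16-track band size is an illustrative assumption, not a real drive parameter) shows the resulting write amplification:

```python
# Illustrative sketch only, not a real drive model: in SMR, tracks within a
# band overlap like shingles, so updating one track means rewriting it plus
# every track shingled on top of it in that band.
def tracks_rewritten(band_size: int, target_track: int) -> int:
    """Tracks that must be rewritten to update `target_track` (0-indexed)
    in a fully written band of `band_size` shingled tracks."""
    if not 0 <= target_track < band_size:
        raise ValueError("target track outside band")
    # The target itself plus all tracks written after it in the band.
    return band_size - target_track

# Updating the first track of a 16-track band rewrites the whole band;
# updating the last track rewrites only itself.
print(tracks_rewritten(16, 0))   # 16
print(tracks_rewritten(16, 15))  # 1
```

This is why SMR drives are best suited to sequential or archival workloads, where data is appended rather than updated in place.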
3. DNA
Is it hard to believe that biological molecules can be part of data storage technologies? Strange as it sounds, DNA is a storage technology of the future. The molecules that store biological information can also store other kinds of digital data. In 2012, Harvard researchers encoded digital information into DNA, including a 53,400-word book in HTML, eleven JPEG images, and one JavaScript program.
With DNA, you can store 2.2 petabytes per gram, an incredible storage density. That means a DNA "hard drive" about the size of a teaspoon could fit all of the world's data. Beyond space savings, DNA is also ideal for long-term storage.
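The basic idea behind DNA storage is simple to sketch: DNA has four bases (A, C, G, T), so each base can represent two bits. This minimal example (the particular bit-to-base mapping is an illustrative assumption; real schemes add error correction and avoid long repeats) shows a round trip from bytes to a DNA-like strand and back:

```python
# Illustrative sketch: 2 bits per nucleotide. Real DNA storage schemes use
# more robust encodings with error correction; this mapping is hypothetical.
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {base: bits for bits, base in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Map each byte to four bases (two bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Invert the mapping: four bases back into one byte."""
    bits = "".join(BITS_FOR_BASE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(encode(b"Hi"))                  # CAGACGGC
print(decode(encode(b"Hi")))          # b'Hi'
```

At two bits per base, even this naive scheme packs four bases per byte, which hints at how DNA reaches petabytes per gram once strands are synthesized at molecular scale.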
However, the read/write time for DNA is high, and the technology is still too expensive for practical use. According to New Scientist, in one recent study the cost to encode 83 kilobytes was £1,000 (about $1,500). It sounds like a sci-fi story, but scientists have encoded information into artificial DNA and inserted it into bacteria. Not to mention, DNA could one day be the ultimate eternal drive.
4. Large memory servers – NVRAM
Intel was the first to introduce non-volatile random access memory (NVRAM). The magic of these memories is that they retain data through power cycles without batteries. Because NVRAM sits on the server's memory bus, it is significantly faster than SSDs or disks. Unlike SSDs, NVRAM can be accessed either as 4K storage blocks or as individual memory bytes, giving you maximum compatibility and performance.
NVRAM is commonly used in large memory servers. For example, the latest Xeon SP (Skylake) servers can support up to 1.5TB of memory per processor. Intel's Optane NVRAM DIMMs, priced as low as $625 per 128GB, consume much less power than equivalent DRAM. With Optane DIMMs, large databases can run entirely in memory, dramatically improving performance.
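The byte-versus-block distinction above is the heart of NVRAM programming. As a rough analogy only (not Intel's actual Optane API), a memory-mapped file behaves like a byte-addressable persistent region: software updates a single byte in place rather than rewriting a whole 4K block, as SSDs require. The file name below is hypothetical:

```python
import mmap
import os

# Analogy only: a memory-mapped file stands in for an NVRAM region.
PATH = "nvram_region.bin"   # hypothetical path, not a real device node
SIZE = 4096                 # one 4K "block" worth of persistent bytes

with open(PATH, "wb") as f:
    f.write(b"\x00" * SIZE)              # initialize the pretend region

with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), SIZE) as region:
        region[42] = 0xFF                # byte-granular update, no block rewrite
        region.flush()                   # analogous to flushing CPU caches

with open(PATH, "rb") as f:              # the change persisted to the "device"
    stored = f.read()[42]

os.remove(PATH)
print(stored)  # 255
```

With a real SSD, changing that one byte would mean reading, modifying, and rewriting the entire 4K block; on the memory bus, it is a single store instruction.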
5. Rack scale design
Another mind-blowing data storage concept, which Intel has been promoting for years, is Rack Scale Design (RSD). It has already been launched in the market, with more advances on the way. If you are looking for a way to cope with the differing rates of technology advance in storage, CPUs, GPUs, and networks, RSD is the answer.
The RSD concept is pretty simple: a rack of CPUs, memory, storage, and GPUs connected by a high-bandwidth, low-latency interconnect. Software then composes virtual servers with whatever combination of compute, memory, and storage a particular application requires. Hence, we can think of RSD as a highly configurable private cloud.
HP’s Synergy system is one implementation of the RSD concept. Similarly, Liqid Inc. offers a software version that supports multiple fabrics and commodity hardware.
6. 5D Optical storage
So far, we have seen many varieties of data storage technologies, from super-cold storage to biomolecular. Now for one that uses lasers to carve terabytes of data into tiny glass discs: optical storage. Researchers at the U.K.'s University of Southampton are developing this type of digital data storage, which can potentially survive for billions of years. They have created a recording and retrieval process based on femtosecond laser writing.
Their goal is to replace magnetic tape, and they chose quartz glass for its extreme durability: it can survive disasters like solar flares or fires, making it ideal for data centers. Furthermore, this storage encodes five-dimensional information in multiple layers: the usual three spatial dimensions plus two more degrees of freedom, encoded in the size and orientation of the imprinted structures.
As a result, 5D optical storage can hold hundreds of terabytes per disc, with thermal stability up to 1,800 degrees Fahrenheit. The research has attracted Microsoft's attention, as the company could exploit 5D optical storage in glass.
Conclusion
This is a data-centric computing era. There are almost 4.5 billion computers in use today, most of them mobile, and the growth of IoT is still ahead of us. Not to mention, data will be a top priority for technology and governance, for both economic and legal reasons.
Your business data is your competitive weapon, so keeping up with the latest data storage technologies is of utmost importance. Combined with new analytical tools, properly stored data offers real value. Fortunately, data storage is more cost-effective than ever, and this trend will continue in the years to come.