Inside backup and storage: The expert’s view
How much data do we create? How do we secure it? Store it? Retrieve it?
When the professional community Wikibon recently translated the amount of digital information estimated to be created in 2010 into more physical terms, it calculated that storing all that data would require 75 billion fully loaded 16 GB Apple iPads. It makes the mind reel, doesn’t it?
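As a quick sanity check on that comparison, the iPad figure converts back into raw capacity with a line of arithmetic (assuming the marketing convention of 1 GB = 10⁹ bytes):

```python
# Back-of-the-envelope check of the Wikibon iPad comparison.
# Assumes the marketing convention of 1 GB = 10**9 bytes.
ipads = 75_000_000_000             # 75 billion iPads
ipad_capacity_bytes = 16 * 10**9   # 16 GB each

total_bytes = ipads * ipad_capacity_bytes
total_zettabytes = total_bytes / 10**21

print(f"{total_zettabytes} zettabytes")  # 1.2 zettabytes
```

That works out to roughly 1.2 zettabytes, in line with the zettabyte-scale estimates that were circulating for 2010.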
It has also been noted that the amount of digital information created today exceeds the currently available storage capacity by 35 percent, and that the gap will only grow wider as the years pass.
If this statistic and prediction sound too wild to be credible, just pause a moment and think about how much content you yourself produce every day – at home and at work. Then multiply that number by the latest estimate of the number of Internet users (1,966,514,816 as of June 30, 2010, by the way). It doesn’t sound that far-fetched anymore, does it?
Well, the point that I really wanted to make with this brief introduction is that the human race seems to be pouring out massive amounts of data as if the world were going to end tomorrow.
Some of it will vanish into the far reaches of this global system of networks that we call the Internet – fragments of it stored in various places, but for all practical purposes lost because it will be unsearchable. And that’s all right, since most of it wasn’t meant to be saved anyway.
But what about the data we do want to save? The seemingly inexorable progress of the human race is tied closely to our learning capabilities and the fact that we can access the knowledge left to us by our ancestors – whether they used stone tablets, books, or data storage devices.
The decisions that we make daily are largely based on the information we have at our disposal. Whether these decisions concern our private or business life, we need information.
So now we come to the crux of the matter and of this article – what do we know, and what can we expect in the future, where data storage and the backup process are concerned?
The recently concluded bidding war between Dell and HP to acquire 3PAR has put the spotlight on the storage sector, and has indicated that cloud storage – however omnipresent the concept may be currently – is just one of the trends that drive this market, and that physical data centers are still very much in demand.
It may be that the time will come when cloud storage becomes the mainstream storage model, but that still isn’t the case. “Cloud storage definitely solves a major problem – that being hardware maintenance,” says Adrian Gheara, Project Manager, GFI Backup. “Very often small companies don’t have the resources for a strong hardware infrastructure required by a backup strategy (redundant hard-drives, dedicated servers, load balancers, an administrator that constantly monitors the health of hardware equipment). Cloud computing will ensure that for a decent fee they get the best possible reliable infrastructure for backups.”
Peter Airs, EMEA Storage Product Manager for Netgear, agrees. He thinks that cloud storage is ideal for smaller customers without a second site to replicate data to. “Cost and complexity are massively reduced compared to deploying a tape solution, and it is a ‘set it and forget it’ solution, shifting critical data off site as it gets saved locally. And although cloud backup like our embedded ReadyNAS Vault is not a replacement for application disaster recovery, it fits smaller customers looking for peace-of-mind protection for files while addressing capital expenditure with a pay-as-you-go model.”
Larger enterprises and mid-size organizations can also benefit from the cloud option, even if they have already implemented high availability or disk-to-disk backup, thinks Christian Willis, EMEA Technical Director for Vision Solutions. He believes that cloud storage and recovery can complement their existing strategy and further reduce recovery time and recovery point objectives.
As regards the matter of data security, he says that apart from defining and sticking to best practices such as encrypting information before it goes off-site and using secure networks to move the data, it is of crucial importance to specify what responsibilities the cloud provider will take on, and what will remain with the company.
Willis is convinced that the likes of Amazon and the other major cloud providers have such large estates and such established security procedures that data at rest is protected – the standards at which they work are comparable to, or better than, those a single organization can achieve on its own.
GFI’s Gheara believes that the trend that has seen many large enterprises moving to cloud backups will continue unabated. “A very important advantage of cloud storage is that your backup is remote,” he says. “If there’s a fire in the office, the backup will not be destroyed or damaged as well. The only disadvantage with a cloud-based solution is speed. If a restore is needed, the download will take some time. In the long term, downloads speeds will go up and costs will go down, so cloud backups will become easier and better. As to security, encryption and data distribution across multiple machines will cover these risks.”
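Gheara’s point about restore speed is easy to quantify: even a modest backup takes hours to pull back down over a typical office line. A rough estimate, using an illustrative 500 GB backup and a 100 Mbit/s connection (both figures are assumptions, not from the article):

```python
# Rough restore-time estimate for a cloud backup.
# The 500 GB size and 100 Mbit/s line speed are illustrative assumptions.
backup_gb = 500
line_mbit_per_s = 100

megabits = backup_gb * 8 * 1000        # GB -> gigabits -> megabits
seconds = megabits / line_mbit_per_s   # assuming full line speed throughout
hours = seconds / 3600

print(f"{hours:.1f} hours")  # 11.1 hours
```

In practice contention, throttling and protocol overhead push that figure higher still, which is exactly why restore speed is the weak point Gheara describes.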
What is interesting to note is that when it comes to backup, a lot of organizations focus on data protection, while the quality and speed of the recovery process – which is, after all, the reason they are doing it in the first place – tends to be overlooked.
Simplicity and ease of use are also of great importance. “Everybody knows how important backups are. Yet there are still epic tales of people losing all their files, sometimes going out of business in the process,” muses Gheara. “The truth is that people are lazy. They know they have to backup their data and want to do so, but because of the complexity to set up and create backups they tend to postpone, or avoid doing so. And disasters usually strike when you’re least prepared. Providing an easy-to-understand user-experience is a key factor to get people to actually create backups. With GFI Backup for Business, for example, usability is something we pay particular attention to. We are constantly trying to make the process simpler and easier.”
Netgear’s Airs concurs, especially when it comes to small businesses and mid-sized enterprises. He notes that backup always consists of a hardware and a software component, and that it is up to vendors to ensure that these components dovetail to provide a cost-effective, high-performance, yet trouble-free experience for the customer.
Vision Solutions’ Willis says that ease of use is especially important when it comes to backup being deployed across multiple different platforms. “Virtualization has made some elements of backup easier, but it has also introduced some new challenges to consider,” he says. “As an example, if a company has VMware within its main HQ, but is running Microsoft Hyper-V in its branch offices for reasons of cost, then it can have some problems in making sure that all its virtual machines are properly protected.”
And while Toshiba provides only consumer backup solutions, Manuel Camarena, product manager at Toshiba’s Storage Device Division, points out that while the majority of people do seem to be aware of the importance of regularly backing up their computers, a recent survey the company sponsored revealed that 54 percent of them say they simply forget about it. To try to change that situation for the better, Toshiba has released a line of portable hard disk drives that come with pre-loaded backup software offering an easy setup process and “set-it-and-forget-it” operation.
But while ease of use is (predictably) an important characteristic of backup solutions (business or otherwise), it is definitely not the only point on which my interviewees agree. When it comes to business backup, a centralized backup management solution also seems to be preferred.
“SMBs have particular resource challenges, but centralizing into a single, easy-to-manage platform that can take care of all of a business’s storage and backup needs makes sense from a financial and management-overhead point of view,” says Airs. He also notes that SMBs often rely on backup and disaster recovery policies being adhered to by staff whose primary function is elsewhere in the business, which means that backups don’t get done and tapes are not managed correctly. Many of these issues, he says, can be addressed by moving to a centralized disk storage system and an automated backup regime that requires minimal human intervention once set up.
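The kind of hands-off regime Airs describes can be sketched in a few lines: a pass that mirrors a source tree into a backup directory, copying only files that are new or have changed since the last run. This is a minimal illustration, not any vendor’s product; the paths are hypothetical, and scheduling it (via cron or Task Scheduler) supplies the “minimal human intervention” part.

```python
# Minimal sketch of an automated backup pass: mirror a source tree into a
# backup directory, copying only files that are missing or newer than the
# backup copy. Illustrative only; real products add retention, verification
# and reporting on top of this basic loop.
import os
import shutil

def backup(source_dir: str, backup_dir: str) -> int:
    """Run one backup pass; return the number of files copied."""
    copied = 0
    for root, _dirs, files in os.walk(source_dir):
        rel = os.path.relpath(root, source_dir)
        target_root = os.path.join(backup_dir, rel)
        os.makedirs(target_root, exist_ok=True)
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target_root, name)
            # Copy when the backup copy is missing or older than the source.
            if not os.path.exists(dst) or os.path.getmtime(src) > os.path.getmtime(dst):
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied += 1
    return copied
```

A second run over an unchanged tree copies nothing, which is what makes an automated schedule cheap to leave running.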
“For companies with multiple computers it is important to have an easy administration panel that allows centralized management of tasks; otherwise it’s very likely that problems will arise during the backup process and nobody will ever know,” concurs Gheara.
According to the results of a recent storage study by TheInfoPro, data de-duplication is a leading technology when it comes to backup on companies’ existing storage resources but, interestingly enough, online data de-duplication and data reduction appear to be waning technologies.
But what do these experts think about it?
“Data de-duplication is relevant only for large companies. It works very well with cloud storage for reduced traffic,” says Gheara. “In small companies, due to the lower volume of data that is transferred, de-duplication is not really necessary; and it may not be a viable option because of the need to set up the software and its cost.”
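The core idea behind de-duplication is simple to sketch: split the data into chunks, fingerprint each chunk, and store a chunk only the first time its fingerprint is seen; a backup then becomes just a list of fingerprints. A minimal fixed-size-chunk version follows – real products typically use variable, content-defined chunking and far more robust storage, so treat this purely as an illustration of the principle:

```python
# Toy chunk-level de-duplication: identical chunks are stored once,
# and each "file" is reduced to a recipe of chunk fingerprints.
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; illustrative only

def dedup_store(data: bytes, store: dict) -> list:
    """Split data into chunks, keep one copy per unique chunk,
    and return the fingerprint list needed to rebuild it."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store each unique chunk only once
        recipe.append(digest)
    return recipe

def rebuild(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its fingerprint recipe."""
    return b"".join(store[d] for d in recipe)
```

The saving Gheara mentions for cloud traffic follows directly: chunks whose fingerprints the remote store already holds never need to be uploaded again.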
Airs says that, so far, Netgear’s customers haven’t shown much inclination towards it, and prefer to ride the cost/capacity curve for the time being and to employ higher capacity storage systems. He thinks that its time will come, but that adoption is slower due to the lower capacity requirements, time and budget pressure in the small business and mid-sized enterprise space.
But let us return for a moment to the 75 billion fully loaded 16 GB Apple iPads from the beginning of the article and to the storage issue, and note that Toshiba recently made significant inroads with a new technology that will improve areal disk density and allow us to store five times the amount of data per square inch that we can store today.
“As perpendicular magnetic recording (PMR) – the current HDD industry standard – nears its fundamental capacity limit, the industry is investigating new technologies to increase areal density,” says Patty Kim, product manager at Toshiba’s Storage Device Division. “Bit-patterned media (BPM) is one approach. Two others that hold significant interest are heat assisted magnetic recording (HAMR) and microwave assisted magnetic recording (MAMR). Toshiba is evaluating these approaches, all of which have potential technical hurdles, but the developments we’ve made with BPM certainly make it a strong contender for future production.”
So far, Toshiba has managed to fabricate a hard disk with an areal density of 2.5 terabits per square inch and a practical servo pattern by using an etching mask made of a self-assembled polymer, but it still hasn’t managed to read or write data on the drives. Obviously, a considerable amount of time will pass before this technology becomes a standard, but the company predicts that a density of 5 Tbit/in² will be achievable in the lab by 2012.
And while Seagate seems to have opted for heat-assisted magnetic recording, and Hitachi GST for bit-patterned media, so far no drive manufacturer has put all its eggs in one basket.
Setting aside the issue of disk density, I also wondered whether the self-encryption capability of some of Toshiba’s drives was becoming a strong selling point, and asked whether the company had noticed an increase in demand.
“Absolutely. Many customers see them as the best – and most cost-effective – way to protect ‘data at rest’ on PCs and storage systems,” says Scott Wright, product manager for mobile storage with Toshiba’s Storage Device Division. “The interest stems not only from the desire to protect against the potential data and economic loss from a lost or stolen notebook, but also from the need for IT departments to manage their compliance with privacy laws and regulations governing data security. This is particularly true for highly regulated enterprises in such industries as health care and finance. However, regardless of the type of business, the simple fact is that all disk media eventually leaves a company’s control, whether it’s decommissioned, disposed of, sent for repair, misplaced or stolen.”
And when it comes to drives that are withdrawn from service and disposed of, Toshiba has also thought ahead and implemented a wipe technology that automatically erases the SED’s internal security key when the drive’s power supply is turned off – such as when a system is powered down or when the drive is removed from the system – instantly making all data on the drive indecipherable.
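The mechanism behind that instant wipe is worth spelling out: on a self-encrypting drive the platters only ever hold ciphertext, so destroying the internal key is equivalent to destroying the data. The toy construction below illustrates the principle only – the hash-based XOR stream stands in for the drive’s real AES hardware and must not be used as actual cryptography:

```python
# Illustration of crypto-erase: data is stored encrypted under an internal
# key, so discarding the key renders the stored bytes indecipherable.
# The keystream here is a TOY stand-in for a real AES engine.
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random byte stream from the key (toy construction)."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR with a key-derived stream; only this ciphertext ever hits "disk".
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse
```

With only the ciphertext on the platters, erasing the key leaves nothing that any key recovered later can decrypt – which is why the wipe can be instantaneous regardless of drive capacity.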
In the end, it seems to me that even though there are vast amounts of data that must be stored, and stored well, the good news is that we don’t lack options to choose from.
There may be glitches here and there, but no technology is or ever will be flawless. That is a fact that we must accept, and learn to always have a (no pun intended!) backup plan.