
10 Tips to Maximize the Value of Public Cloud Storage

By Drew Robb

Going slowly, doing your homework and making smart choices can ease your transition to cloud storage services.

Enterprises are using the public cloud for storage more than ever before. They are relying on services such as Amazon Web Services (AWS) Simple Storage Service (S3) and Microsoft Azure Storage as repositories for everything from backups and overflow storage to disaster recovery (DR) and mission-critical application data.

A recent IDG Research survey found that 59 percent of organizations are either currently investing in or are planning to invest in more public cloud over the next 12 months. For those using the public cloud for storage, the following tips may help to ease selection and deployment, reduce costs and improve overall service levels.

1. Gradually Does It

Be sensible about moving to the public cloud. Top management may have ordered you to dump everything into the cloud in their haste to cut storage costs, get out of the in-house IT business and maybe even trim back on staff. But temper their enthusiasm with some due diligence. Crawl, walk, run very much applies with the public cloud. There is a learning curve that will take time to master.

“Adopt increasing levels of services as your teams get up to speed, and understand how to leverage APIs and automate everything through code,” said Mark Bloom, director of product marketing, compliance and security at Sumo Logic, a cloud-native machine data analytics service.

2. Start with Backup and DR

One of the most effective use cases for cloud storage services such as Amazon S3 and Microsoft Azure Storage is off-site backup storage. In this scenario, backup copies are retained in the cloud in case disaster strikes the main data center.

However, just throwing backups to the cloud does not absolve IT of any further responsibility. It is up to IT to determine how easily it will be able to recover files and entire systems in the event of an outage, data loss incident or natural disaster. This means finding out from the provider just what guarantees they provide, what additional costs are involved should you need to recover your data, and how long recovery will take.

“In order to properly assess recovery time objectives (RTOs), organizations will need to be aware of the data retrieval, recovery and migration capabilities of the service provider,” said Goran Garevski, vice president of engineering at Comtrade Software, a provider of IT infrastructure management solutions for data protection, system, network and application performance.
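As a rough illustration of that assessment, restore time is dominated by two factors: how long the provider takes to make the data available, and how fast it can be pulled back over the wire. The sketch below is a back-of-envelope estimator, not a provider formula; the first-byte delay and throughput figures are assumptions you would replace with numbers from your own provider's documentation.

```python
def estimate_restore_hours(data_gb, retrieval_mbps, first_byte_delay_hours=0.0):
    """Rough restore-time estimate: the provider's time-to-first-byte
    (which can be hours for archive-class tiers) plus transfer time
    at the sustained retrieval rate. Inputs are assumptions, not SLAs."""
    megabits = data_gb * 8 * 1024            # convert GB to megabits
    transfer_hours = megabits / retrieval_mbps / 3600
    return first_byte_delay_hours + transfer_hours

# Hypothetical example: 2 TB of backups over a sustained 200 Mbps link,
# with an assumed 4-hour first-byte delay for archive-class storage.
print(round(estimate_restore_hours(2048, 200, first_byte_delay_hours=4.0), 1))
```

Running numbers like these against your stated RTO quickly shows whether an archive-class tier is even a candidate for a given data set, before any contract is signed.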

3. Know Your Vendor

One public cloud is not the same as another, and within the major providers there is a wide range of flavors to choose from. It pays to know your vendor and what services they offer. Azure offers Blob Storage, Queue Storage, File Storage and Data Lake Storage, among other services. Amazon offers S3 as well as Amazon Elastic File System (EFS), Amazon Glacier, Amazon Elastic Block Store (EBS) and AWS Import/Export Snowball. Do your homework so you know what is on offer and how the various service levels affect pricing.

4. Choose Carefully

Resist the temptation to deposit your data in the first service you come across or the one that seems cheapest. It pays to do your homework and find the right fit. Once you know the available services, flavors and associated costs, take more time to assess your own data sets and storage requirements.

Amazon, Azure and other public cloud storage vendors have a broad selection of storage services, and selecting the best-suited option for each data set can help when it comes to optimizing storage costs. In addition, taking advantage of built-in features for data management, monitoring and audits can reduce ongoing operational effort around storage.

“A precursor to selection of an optimum mix of services is clearly identifying the storage requirements for the different data sets in the organization,” said Deepak Mohan, research director, IDC. “This classification, combined with experimentation with various storage services (which is a cost-effective task in public cloud) will help organizations choose the best mix of storage services for their needs.”
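The classification Mohan describes can start as something very simple. The sketch below maps each data set's access frequency and retention period to a generic storage class; the thresholds and the inventory entries are illustrative assumptions, not provider recommendations.

```python
def classify_data_set(accesses_per_month, retention_months):
    """Toy classification rule: frequently read data stays hot,
    occasionally read data goes to an infrequent-access class, and
    long-retention cold data goes to archive. Thresholds are assumptions."""
    if accesses_per_month >= 30:
        return "hot"
    if accesses_per_month >= 1:
        return "infrequent-access"
    return "archive" if retention_months >= 12 else "infrequent-access"

# Hypothetical inventory: name -> (accesses per month, retention in months)
inventory = {
    "app-logs": (60, 3),
    "monthly-reports": (2, 24),
    "compliance-records": (0, 84),
}
mix = {name: classify_data_set(*attrs) for name, attrs in inventory.items()}
print(mix)
```

Even a crude first pass like this gives you a candidate mix of services to test with real data, which, as Mohan notes, is a cost-effective experiment in the public cloud.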

5. Use Appropriate Storage Tiers

Among the many flavors of cloud storage are a variety of storage tiers. These include tiers for frequently accessed (hot) data, infrequently accessed data and archive data.

“An important aspect of managing storage costs is tiering your data based on attributes like frequency of access and retention period,” said Sriprasad Bhat, senior program manager, Azure Storage.

Azure, for example, provides services for cool data: data that is infrequently accessed but requires latency and performance similar to hot data. Known as Cool Blob Storage, it is low-cost storage for cool object data such as backups, media content, scientific data, compliance and archival data. Depending on the region, costs for this service can be as low as $0.01 per GB, and Microsoft recently added new regions where the service is available, including much of the US and parts of Germany, Australia and Brazil.
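The trade-off behind tiering is that cooler tiers swap a lower capacity price for per-GB retrieval charges. The comparison below shows the break-even effect for 1 TB of data at different monthly read volumes; all prices are placeholder figures for illustration, not current Azure or AWS rates.

```python
def monthly_cost(gb_stored, gb_read, storage_price, retrieval_price):
    """Monthly bill = capacity charge + per-GB retrieval charge.
    Prices are hypothetical placeholders, not published rates."""
    return gb_stored * storage_price + gb_read * retrieval_price

gb = 1000  # 1 TB stored
for reads in (0, 500, 5000):
    # Assumed prices: hot tier $0.02/GB stored with free reads,
    # cool tier $0.01/GB stored plus $0.01/GB retrieved.
    hot = monthly_cost(gb, reads, storage_price=0.02, retrieval_price=0.0)
    cool = monthly_cost(gb, reads, storage_price=0.01, retrieval_price=0.01)
    print(reads, round(hot, 2), round(cool, 2))
```

With these assumed prices, the cool tier halves the bill for rarely read data but costs three times as much once reads dominate, which is exactly why tiering decisions should follow measured access patterns rather than the headline per-GB price.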

6. Reduce Data Transfer Costs

Mohan noted that some organizations balk at the high price tag for data transfer. If your foray into the public cloud has shifted from simply depositing data there to moving large amounts of information back and forth, there are ways to cut costs.

“As the volume of data transfer out increases, services like AWS Direct Connect can help reduce data transfer costs for organizations,” he said.
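A dedicated link typically trades a fixed monthly port fee for a lower per-GB egress rate, so it only pays off above a certain transfer volume. The sketch below compares the two models; every price in it is a hypothetical placeholder, not an AWS rate card figure.

```python
def egress_cost(gb_out, per_gb_internet=0.09, per_gb_direct=0.02,
                direct_monthly_port_fee=200.0, use_direct=False):
    """Illustrative monthly egress comparison: public-internet egress
    billed purely per GB, versus a dedicated link with a fixed port
    fee plus a lower per-GB rate. All prices are assumptions."""
    if use_direct:
        return direct_monthly_port_fee + gb_out * per_gb_direct
    return gb_out * per_gb_internet

for gb in (1000, 5000):
    print(gb, egress_cost(gb), egress_cost(gb, use_direct=True))
```

Under these assumed prices the dedicated link loses at 1 TB/month but wins at 5 TB/month; finding that crossover point for your own rates tells you when a service like Direct Connect starts to pay for itself.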

7. Combine Compute and Storage

Those looking at the cloud only for storage may be taking too limited a view, said Gary Watson, vice president of technical engagement, Nexsan. He pointed out that there are also cloud compute services which, when combined with storage, can provide greater value from the cloud and further reduce upfront costs for new in-house hardware.

“The public cloud can be the best solution when the compute layer is also in the cloud,” said Watson. “For example when you connect Amazon EC2 to S3 and use the rest of the surrounding cloud ecosystem.”

8. Use the Public Cloud for Shared Content

Another smart use of the public cloud is for content that is meant to be shared with the public. This can save the time and expense of erecting separate internal storage compartments and policing access controls. After all, you don’t want people browsing your publicly available files and then slipping into more sensitive areas of the enterprise. Placing all public data in the cloud is one way to ensure separation.

“If the output file is going to be shared with the public, it makes sense to put a copy in the public cloud,” said Michael King, senior director of marketing operations, DDN. “Not only does this make the file more accessible to the public, but you reduce the risk of inadvertently exposing other internal data.”

9. Enable Collaboration

King also called attention to collaboration with remote office workers, partners and suppliers outside the enterprise. In-house collaboration applications are likely to be more rigid and pricier than those that use the public cloud, particularly when security is not a major concern. Dropbox, he said, is one example, making it easy to share files and data with a specified group.

10. Use Technology to Improve Data Transfer

Services are available that make it possible to move data sets around the planet economically while continuing to harness the relatively cheap cost of public cloud storage. For example, Panzura makes use of caching appliances (with deduplication built in) in various locations that work in tandem with Azure.

Take the scenario of global software development, with development in one location and testing in another. In one such case, 50 GB files took 10 hours to transmit, and several were being relayed each day.

“This was brought down to less than a minute,” said Barry Phillips, chief marketing officer at Panzura.
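A back-of-envelope calculation shows why deduplication and caching change the picture so dramatically: once the appliances hold most blocks locally, only the changed blocks cross the WAN. The link speed and change rate below are assumptions chosen to match the 10-hour figure in the anecdote, not Panzura's published numbers.

```python
def sync_seconds(file_gb, changed_fraction, link_mbps):
    """Time to sync a file when only the changed, deduplicated blocks
    cross the WAN (caching appliances hold the rest locally).
    Inputs are illustrative assumptions."""
    changed_megabits = file_gb * 1024 * 8 * changed_fraction
    return changed_megabits / link_mbps

# A full 50 GB copy over the ~11 Mbps effective link implied by the
# 10-hour figure, versus shipping an assumed 0.1% of changed blocks.
print(round(sync_seconds(50, 1.0, 11.4) / 3600, 1))  # hours for the full copy
print(round(sync_seconds(50, 0.001, 11.4), 1))       # seconds for the delta
```

The arithmetic alone accounts for a ten-hours-to-under-a-minute improvement: the link did not get faster, the amount of data crossing it shrank by three orders of magnitude.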


This article was originally published on Thursday, Dec. 8, 2016.