Tiered and Policy Based Data Backup –
The new Paradigm (Part 1)
Tributary Systems Inc.
In this first part of a two-part series on data backup, TSI introduces the concepts of tiering and policy and highlights the importance both hold for any enterprise planning to do more than simply “making copies” as its sole fallback whenever disaster strikes. Selecting the right solution for an enterprise involves following a few basic steps, but all too often one or two of these steps are overlooked, which can lead to dire circumstances. What follows here has been incorporated into a feature article that will appear shortly in the NonStop community publication, Connect Converge. Check our web site for further information on how to access this article – www.tributary.com
Aside from personnel, the most irreplaceable asset of any business organization is data. Data is both the fuel – and the product – of the automated systems that support virtually all business processes today. Securing, protecting and ensuring immediate access to data in an efficient, cost-effective manner are the most important responsibilities assigned to contemporary business information technologists.
Unfortunately, the approach taken to providing such backups is, in too many circumstances, simply to “make a copy” – mission accomplished! No real consideration is given to the criticality of the data, appropriate backup locations for various data pools, required data retention periods, the cost per TB of backed-up data, or backup and restore windows. With the vast amount of data being backed up every second, this “one-size-fits-all” paradigm needs to change.
The Tiered Policy-Based Approach
Nothing in the above should surprise any CIO or data center manager. However, just as storage considerations have led to incorporating a number of different types of storage, each balancing an application’s need for immediacy against cost, so too has there been a rethinking of exactly which type of storage to use wherever data backups are implemented. The emergence of cloud computing, with storage in the cloud – public as well as private – has added another tier to the model, such that today, policies mandated by the business have led to a tiering approach for all types of data destined for backup.
Think of a tiered policy-based approach like this – no one would rent a room in the Four Seasons Hotel to store old magazines, holiday ornaments or Grandma’s used furniture! Conversely, no one would be foolish enough to rent a locker at the local bus station to store priceless first state Rembrandt etchings either! A thoughtful individual will determine the appropriate storage location based on the rarity, value, replacement cost and condition of the items being retained. So why should business data be any different? It shouldn’t!
A three-step approach to implementation
When it comes to pursuing a tiered policy-based approach to backup, it all starts with planning and then implementing the best approach, which involves three steps, as follows:
Intelligent Backup Data Analysis
Data management planning must be preceded by the due diligence of business process analysis. Determination of which business processes are mission critical and which are merely important is mandatory. Getting more granular in the assignment of management, protection and recovery services to data assets – creating policies for tiered data sets based on a thorough analysis of requirements – is the best way to rationalize expense while streamlining and optimizing data backup.
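The kind of policy assignment described above can be pictured with a short sketch. The tier names, backup targets, thresholds and business criteria here are purely illustrative assumptions – each enterprise’s own analysis sets the real values:

```python
from dataclasses import dataclass

# Hypothetical tier definitions: backup target, recovery time
# objective, and retention period per tier (illustrative values only).
TIERS = {
    "mission_critical": {"target": "replicated disk",      "rto_hours": 1,  "retention_days": 2555},
    "important":        {"target": "deduplicated disk",    "rto_hours": 8,  "retention_days": 365},
    "archival":         {"target": "cloud object storage", "rto_hours": 72, "retention_days": 3650},
}

@dataclass
class DataSet:
    name: str
    revenue_impacting: bool   # does an outage halt revenue-generating work?
    regulatory_hold: bool     # subject to compliance retention rules?

def assign_tier(ds: DataSet) -> str:
    """Map a data set to a backup tier using simple business criteria."""
    if ds.revenue_impacting:
        return "mission_critical"
    if ds.regulatory_hold:
        return "archival"
    return "important"

print(assign_tier(DataSet("order-db", True, False)))  # → mission_critical
```

The point of the sketch is that tier membership is decided by business criteria, not by where the data happens to live today.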
Building an Integrated Data Management Strategy
A business impact analysis should be performed to identify the current infrastructure and data associated with a given business process, and the impact of an interruption in the services and access to data associated with that process.
The results of the impact analysis will naturally drive the setting of retention and recovery objectives that define the criticality and restoration priority of the subject process and its “time-to-data” requirement (sometimes called a recovery time objective).
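One way to see how the impact analysis drives those objectives is a toy cost comparison. The dollar figures, tier list and simple expected-cost model below are assumptions for illustration, not a prescribed method:

```python
def pick_tier(hourly_outage_cost, expected_incidents_per_year, tiers):
    """tiers: list of (rto_hours, annual_cost) pairs.
    Return the tier minimizing expected annual cost =
    outage exposure (hourly cost x RTO x incident rate) + tier cost."""
    def expected_cost(tier):
        rto_hours, annual_cost = tier
        return hourly_outage_cost * rto_hours * expected_incidents_per_year + annual_cost
    return min(tiers, key=expected_cost)

# A process losing $50k/hour with one incident expected every two years
# justifies the most expensive tier with the tightest (1-hour) RTO.
print(pick_tier(50_000, 0.5, [(1, 120_000), (8, 40_000), (72, 5_000)]))  # → (1, 120000)
```

Run the same comparison for a low-impact process and the cheapest, slowest tier wins – which is exactly the tiering outcome the analysis is meant to produce.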
Of course, there is truth to the assertion that protecting data is a simple matter: make a copy of the data and move the copy off-site so it is not consumed by a disaster that befalls the original. Almost every data protection strategy provides protection as a function of redundancy, as replacing data following a disaster event is not a feasible strategy.
While there is universal agreement on this concept, different vendors seek to promote different technologies for making the safety copies of the data, each promoting their wares as the one true path to data continuity. We find ourselves locked in a perpetual battle over what is the best technology for the protection of our data.
It is our belief, based on our experience to date, that a “storage director” serving as the cornerstone of efficient, tiered policy-based data management is conceptually simple – almost common sense. With TSI Storage Director, enterprises can tier stored data and apply data policies down to individual data volumes on multiple host platforms, based on business criteria and importance to business resiliency and restoration. This is intelligent data management!
A successful strategy involves following the above steps in detail before decisions about a tiered policy-based approach are taken. For businesses that go down this path and determine that a storage director is what will meet their needs, TSI Storage Director may very well prove to be the ideal solution. For even more detail on the three steps outlined above, look for the second part of this article in the May issue of NonStop Insider.
For further information about the Storage Director Appliance, contact me, Glenn Garrahan, Director of HPE Business for Tributary Systems, Inc. at email@example.com or visit our website: www.tributary.com