






Challenges

Data ages more quickly than you would believe! Files classified as “hot” today are already “cold” tomorrow and no longer in use. Think about TV and media – nothing is older than yesterday’s news, yet we still keep the data. Medical data such as X-ray images is only hot when the patient has a check-up; in between, it remains cold for many years. Even in industry, companies do not access much of their data for months or years – it remains unused and merely managed. This costs money, because the data is mostly kept on costly primary storage. Measurements reveal the following:

  • 15% of data is hot data, actively used in the first month.
  • 35% of data is warm data, already used less actively in the period from 1 to 6 months.
  • 50% of data is inactive cold data, typically after about 6 months.

IT knows that most data is cold, but it lacks the tools to clean up primary storage regularly and move data to cheaper storage or to the cloud. Moreover, users want to access their data via their familiar data paths rather than searching for legacy data in separate archives. As a result, additional costly HDD- or SSD-based storage is bought as a short-term fix, creating a vicious circle.



The solution: Transparent file tiering for NetApp storage with DataOptimizer

DataOptimizer automatically and periodically relocates data that is identified as rarely used or unused. NetApp storage is thereby optimized, and users and applications can access all files as usual.



Functionality

With DataOptimizer, IT has a tool that addresses all of these challenges at once. NetApp storage can first be scanned and analyzed with DataAnalyzer (also a ProLion tool). Those results already reveal the potential savings achievable with DataOptimizer, which automatically relocates rarely used or unused files to cheaper storage. Relocation follows a policy-based set of rules, and the evaluation periods can be adjusted individually. Data is relocated according to the principle of file tiering: frequently used data is kept on fast storage such as SSDs, while less frequently used data is kept on cost-efficient storage based on SATA drives, or relocated to the cloud.
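
The following is a minimal sketch of what such a policy-based tiering decision could look like. The attribute names, thresholds and file types are hypothetical illustrations, not DataOptimizer’s actual rule engine or configuration format.

    # Illustrative sketch of a policy-based tiering decision.
    # All thresholds and file types below are hypothetical examples.
    import os
    import time

    POLICY = {
        "min_age_days": 180,                          # roughly "cold" after 6 months
        "min_size_bytes": 1 * 1024 * 1024,            # ignore very small files
        "extensions": {".dcm", ".tif", ".mp4", ".bak"},
    }

    def is_relocation_candidate(path: str, policy: dict = POLICY) -> bool:
        """Return True if a file matches the example tiering policy."""
        stat = os.stat(path)
        age_days = (time.time() - stat.st_atime) / 86400  # days since last access
        _, ext = os.path.splitext(path)
        return (
            age_days >= policy["min_age_days"]
            and stat.st_size >= policy["min_size_bytes"]
            and ext.lower() in policy["extensions"]
        )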

Access to files remains transparent and unchanged for users and applications. This is facilitated by the stubbing method: the file is first copied to the secondary storage, and the original file on the primary storage is then replaced by a link containing the header information (a stub file). Users and applications continue to access the desired file transparently through the original data path. Data access is forwarded to the secondary storage via the stub file, and the file is then delivered directly from the secondary storage to the user or the application (pass-through access).
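
To make the idea more concrete, the sketch below mimics stub-based relocation and transparent read access. The stub format, paths and helper names are assumptions for illustration only; they do not describe ProLion’s actual stub files or the pass-through mechanism inside the product.

    # Illustrative sketch of stub-based file tiering.
    import json
    import shutil
    from pathlib import Path

    def relocate_with_stub(primary_path: str, secondary_dir: str) -> None:
        """Copy a file to secondary storage and replace the original with a stub."""
        src = Path(primary_path)
        dst = Path(secondary_dir) / src.name
        shutil.copy2(src, dst)                        # 1. copy to secondary storage
        stub = {"stub": True, "size": dst.stat().st_size, "target": str(dst)}
        src.write_text(json.dumps(stub))              # 2. replace original with a stub

    def read_file(primary_path: str) -> bytes:
        """Transparent read: follow the stub to the secondary copy if present."""
        src = Path(primary_path)
        try:
            stub = json.loads(src.read_text())
            if isinstance(stub, dict) and stub.get("stub"):
                return Path(stub["target"]).read_bytes()   # pass-through access
        except (ValueError, UnicodeDecodeError):
            pass                                           # not a stub: a regular file
        return src.read_bytes()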







Price model

DataOptimizer is licensed per TB based on relocated capacities: only the volume of data moved by DataOptimizer from the costly storage to the cost-efficient storage counts towards the calculation of the licence costs.
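
As a simple illustration of how the licence basis is determined: the numbers below are made up, and the per-TB price in particular is a placeholder, not an actual ProLion list price.

    # Hypothetical example of the per-TB licence calculation.
    relocated_tb = 120        # capacity moved to secondary storage by DataOptimizer
    price_per_tb = 10.0       # placeholder price per relocated TB
    licence_cost = relocated_tb * price_per_tb
    print(f"Licence basis: {relocated_tb} TB relocated -> {licence_cost:.2f}")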



Advantages

  • NetApp storage is cleaned up regularly, thereby lowering costs and boosting performance.
  • DataOptimizer acts directly on the analysis results of DataAnalyzer.
  • File tiering considers many different file attributes, such as file age, size and type. The principle: important files on fast storage, the rest on cost-efficient storage.
  • With the stubbing method, users and applications can keep working and access their data transparently, without interruption.
  • Backup and restore times are optimized.
  • DataOptimizer’s automatic, rule-based process relieves the IT department of routine tasks.