By: Eric Polet
Modern enterprise data centers are some of the most technically sophisticated operations on earth. Ironically, they are also often fortresses of inefficiency, with some equipment utilized less than 10 percent of the time and servers sitting ineffective 30 percent of the time, consuming electricity while performing no useful information services. Storage administrators struggle to keep pace with rapid changes in computing equipment deployments and the ongoing costs of maintaining a responsive storage environment.
These problems have led organizations to focus more closely on improving their data center management. While almost every enterprise data center has taken steps to improve its operations, virtually all are less efficient, more expensive, and less flexible than they could be. Those shortcomings ultimately prevent data centers from delivering maximum business value to the organizations that own them.
Enter active archive enterprise storage solutions. A data center built around the core principles of an active archive gives an organization access to all of its data, all the time. Through unified management, regardless of tier, users get the most responsive archive implementation available, one that frees up valuable space on their Tier 1 storage and maximizes the value of their data. With the ability to respond to future growth and evolve as data expands, an active archive solution will grow with your organization, delivering a responsive archive built to adapt to your changing storage landscape.
As organizations continue to face the challenge of storing data in this age of exponential data growth, implementing an archive is the most affordable way to store data for the long term. Active archive solutions provide scalable data storage designed to meet the requirements of modern data centers. With integrated policy-based management that intelligently governs how, and on which tier, data is stored, organizations get the highest return on investment for their storage solution. Active archive solutions bring together leaders in the storage industry to construct a customized archive that meets the needs of today’s enterprise organizations.
By Peter Faulhaber
We are pleased to welcome our newest member, MT-C, to the Active Archive Alliance. MT-C simplifies access to content through its intelligent catalog and metadata, and brings a unique solution for an active archive that lowers storage total cost of ownership.
MT-C’s solutions are a great fit with the Alliance, which is focused on promoting active archives for simplified access to all of your data, all of the time. Active archive technologies include file systems, active archive applications, cloud storage, and high-density tape and disk storage.
Here’s a little background on MT-C and its core active archive product NODEUM:
- Belgium-based MT-C is a fast-growing software engineering company specializing in data storage and protection systems. The company provides solutions for companies that need guaranteed, flawless protection for their data, and serves a multitude of industries including biomedical and genetics research centers, cloud service providers, broadcasting and post-production, space centers, video surveillance, retail, banking and insurance.
- MT-C’s key solution, NODEUM, is a highly scalable, hybrid software-defined storage platform that virtualizes attached storage nodes such as flash, disk, high-capacity tape and cloud. Its embedded file management classification feature allows organizations to analyze their data content and predict their future needs in terms of ILM (information lifecycle management). Its integrated content catalog, custom metadata tagging and search engine provide the classification needed to store and retrieve massive volumes of structured and unstructured data. Storing and accessing data has never been easier. Furthermore, NODEUM lowers storage total cost of ownership with its high-density capacity and optimized active archival system, and offers optimal, secure, long-term management of data.
With today’s fast-paced proliferation of long-term data and the growth of data sources and types, active archives are seeing increasingly strong interest and demand. We continue to expand memberships in the Active Archive Alliance with companies whose products support active archiving and that enable organizations to implement the best solution to fit their specific long-term storage and archiving needs.
If your company is interested in becoming a member of the Active Archive Alliance, please visit: http://www.activearchive.com/membership.
A cognitive strategy to manage the data explosion
By Floyd Christofferson
When researching ways to contain rising storage costs and reduce the complexity of heterogeneous storage environments, it is natural to look at storage solutions for ways to solve these problems. Data lives on storage, so it seems reasonable to assume that the answers to managing the explosion of data would be found in various storage options.
The storage industry is naturally focused on storage-centric answers to the problem of data management. To a hammer, everything looks like a nail. But even the best storage products cannot do this alone. Storage-centric solutions simply do not have the intelligence about the data they store, nor were they designed to work across multi-vendor storage types, different file systems, or protocols.
The good news is all digital files contain multiple types of metadata that can drive a data-centric solution to the problem, and in the process bridge incompatible storage types and use cases. Metadata is literally data about the data. Think of it as a roadmap that gives you a bird’s eye view of everything about your data, and which can drive data management policies that transcend the storage layer.
As an example, data-aware management solutions can leverage the intelligence derived from multiple types of metadata to proactively plan for and implement storage optimization, data protection, workflow automation, business continuity, and other tasks.
There are many different types of metadata, starting with file system metadata, such as file name, create time, access time, modify time, etc. But most file types also include additional rich metadata in headers, such as geospatial coordinates or other information that can drive workflow policies. And then there is external metadata which organizations may have accumulated, which may live in other databases or records, such as project-related tags, or other information about their data that captures business value, retention policies and more.
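As a minimal sketch, the first metadata layer described above can be harvested with nothing but Python's standard library; the external tags (project name, retention class) are shown here as a hypothetical dictionary that would, in practice, come from a separate catalog or database:

```python
import os
import stat
import tempfile
import time

def filesystem_metadata(path):
    """Collect basic file-system metadata: the first metadata layer."""
    st = os.stat(path)
    return {
        "name": os.path.basename(path),
        "size_bytes": st.st_size,
        "modify_time": time.ctime(st.st_mtime),
        "access_time": time.ctime(st.st_atime),
        "is_regular_file": stat.S_ISREG(st.st_mode),
    }

# Demonstrate on a temporary file standing in for archive content.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"sample content")
    sample_path = f.name

# External metadata (invented for the example) merged into one record,
# the way an aggregated, data-aware catalog would coalesce it.
record = filesystem_metadata(sample_path)
record.update({"project": "survey-2016", "retention": "7y"})
os.unlink(sample_path)
```

Rich embedded metadata (EXIF headers, geospatial coordinates) would be extracted the same way, using format-specific parsers, and merged into the same record.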
As all of these metadata types are coalesced into an aggregated environment, a metadata-centric solution becomes data-aware, or intelligent enough to automate data and storage management without needing to alter the underlying storage infrastructure.
This is also crucial for implementing an active archive strategy in a heterogeneous environment. Metadata-driven policies can move data anywhere, on any storage type or location. So the active archive can truly be universally accessible by any authorized user.
So rather than trying to physically normalize all the data at the infrastructure level to overcome silos, a data-centric strategy does this in metadata to enable global management across all storage types, file systems, and locations. And as policies or use cases evolve over time, metadata-driven strategies ensure that the data lifecycle requirements can be implemented regardless of the storage types existing today.
In this way, metadata can drive data placement policies to virtualize any storage type including archive. So the active archive can be online and accessible for all users, whether it is on disk, tape, cloud, or a remote site. Such metadata-enabled systems also enable users to search on any metadata fragment, to find the data whether in active archive, or a disaster recovery site, or any other storage tier anywhere.
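A toy illustration of such a metadata-driven placement policy follows; the tier names, thresholds, and field names are all invented for the example rather than taken from any vendor's actual policy language:

```python
from datetime import datetime, timedelta

def place(meta, now=None):
    """Pick a storage tier from metadata alone, without touching the data.

    Rules are evaluated top-down: active projects stay on flash,
    recently accessed data on disk, long-retention data on tape,
    and everything else goes to cloud.
    """
    now = now or datetime.utcnow()
    age = now - meta["last_access"]
    if meta.get("project_active"):
        return "flash"
    if age < timedelta(days=30):
        return "disk"
    if meta.get("retention_years", 0) >= 7:
        return "tape"  # long-term retention lands on the economy tier
    return "cloud"

# A file untouched for over a year, with a 10-year retention requirement:
meta = {
    "last_access": datetime.utcnow() - timedelta(days=400),
    "retention_years": 10,
    "project_active": False,
}
tier = place(meta)  # -> "tape"
```

Because the decision depends only on the metadata record, the same policy can govern files on any storage type, file system, or location.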
Data is the lifeblood of every organization; but protection and accessibility also need to be cost effective and flexible. By looking at data management solutions that can leverage the power of metadata, a whole new horizon of possibilities opens up to customers to manage any data, on any storage anywhere.
By David Cerf
Today’s storage environments are composed of a complex mix of file and object storage and various file systems, each with its own unique behaviors. Due to the dependencies between clients and storage resources, users can find it difficult to deploy an active archive for migrating or tiering to cost-effective storage. Complex file storage infrastructure can result in low utilization and limit data mobility. The continued growth in unstructured data (files and objects) is increasing the complexity, costs, and operational overhead of data management.
How metadata is changing active archives
Embedded in every file is a treasure that can be used to improve data management and power an active archive: metadata. Metadata may be just a simple description of a file, but it carries highly relevant information about the data. When combined with other metadata, policy engines and artificial intelligence, metadata becomes an incredibly valuable source of hidden information that can drive data lifecycle management, improve workflows and enhance applications. Most importantly, metadata can drive an active archive architecture, acting as the trigger that enables data-defined tiering and data protection.
Data management is now “cognitive”
Most of today’s file-based workflows have no way of discovering or making use of this information in a simple, automated manner. But new cognitive data management solutions, with their policy engines, are designed to complement any existing storage environment and enable an automated active archive. This means any file system can now add tape and object storage easily, improving data preservation and retention, search and collaboration, and resulting in evergreen storage strategies for simplified migrations and tiering.
Over the past year, new storage technology innovations for active archives have enabled organizations to gain reliable access to all of their data, all of the time. As a result, these organizations have experienced increased cost savings, decreased energy consumption and improved storage administrator efficiency.
The key drivers that will impact the continued use of active archive in the future include the decreased cost of flash storage, greater automation, the rise of tiered storage workflows and the growth of tape in public and private cloud infrastructures.
Members of the Active Archive Alliance recently shared their perspective on the outlook for data storage and active archive in 2017. Here is a list of the top trends to watch:
Automated Policies and Artificial Intelligence Come to Storage
Greater automation and the use of artificial intelligence (AI) will simplify storage management and the use of active archives. New technology will help resolve two of the greatest challenges facing data management: data classification and storage classification. New software tools will use metadata to power automation, providing a simple solution for data management.
Tiered Storage Workflows on the Rise
As the cost of flash storage decreases and disk continues to struggle to maintain its cost and capacity curve, customers will look to adopt tiered storage workflows. Flash’s usable cost could drop below $1.00 per gigabyte, while tape in large systems will cross below $0.06 per gigabyte for nearline storage and below $0.03 per gigabyte for offline or cold storage.
Energy Consumption Challenges Drive Active Archive Adoption
Active archiving will get a boost as organizations continue to seek ways to reduce energy consumption, a significant and growing component of operating expenses for today’s rapidly expanding data centers. With servers and HDDs consuming more than 30% of the energy required to run IT hardware, data center managers will look for ways to reduce their use of power-intensive hard disk drive technology. Less frequently accessed data will be moved to an active archive where it will remain accessible but consume less power on more economical storage tiers such as automated tape libraries on premises or in the cloud.
Onramps Drive Growth in Tape Usage
Tape usage will grow substantially as a key component of public and private cloud infrastructures for cold and active archive data. Tape’s inherent attributes of low-cost, reliability and even portability combined with the increasing availability of file and object-based onramps to tape will accelerate adoption of tape beyond its historical role as a target of backup and recovery software. Solutions that offer data management intelligence and that integrate well with storage targets will alleviate historical management burdens associated with tape automation deployment, further fueling tape’s penetration of cloud infrastructures where long term preservation and access to that data are required.
Cloud and Object Storage Bring More Flexibility to Archived Data
Object storage software can transform an archive into an active archive that is positioned between high-performance storage and tape. A combined cloud and object storage infrastructure provides additional capabilities allowing users to collaborate at LAN speeds and access data from any device, anywhere in the world. As content continues to grow, an active archive can scale seamlessly to billions of objects in a single namespace. Flexible, user-defined data protection options will be key in making this a reality.
Ethernet Continues to Gain Market Share
Ethernet is winning. There are still a number of ways to connect storage and active archives to compute: Fibre Channel, Ethernet, SAS, SATA, and InfiniBand. Even though some of these connections are lighter weight from a protocol standpoint, Ethernet will continue to gain market share as the external host connection of choice.
The following Active Archive Alliance members contributed to this list: DDN Storage, Fujifilm Recording Media USA, Inc., Spectra Logic, Strongbox Data Solutions and Quantum.
By Rich Gadomski
I recently returned from the great city of Boston where I had a lot of fun with the locals who have a “wicked” sense of humor and a funny accent I can’t imitate being from New York. I was in town facilitating our 8th Annual Global IT Executive Summit. The theme this year was “Exploring the New World of Storage.” Boston was a fitting location given its history in shaping our country and the storage industry too. Speakers from the analyst, vendor and end user communities presented on the latest trends that are emerging in a new world of storage driven by so many innovations in flash, disk, tape, cloud and data management software.
One subject that kept popping up was the need to control runaway costs associated with unrelenting data growth that’s compounded by long-term retention requirements. A common solution that many speakers referenced was the need for a well-planned tiering strategy where data moves as it ages from expensive tiers of primary storage to secondary, tertiary and even the cloud as a fourth tier. Speakers presented on various solutions to manage data growth with long-term retention requirements stemming from compliance regulations, protection of business assets and content, disaster recovery and big data analytics.
A key concept that got a fair amount of the spotlight in the tiering conversations was active archiving solutions. In an active archive, long-term data remains cost effectively online and easily accessible by leveraging innovative and integrated solutions that intelligently manage data across flash, disk, tape and the cloud. This is really where cost savings can come into play by matching data types to the right tier of storage.
It was pointed out by more than one speaker that an active archive solution can be implemented with existing storage equipment and without changing workflows, while remaining transparent to the end user. Since we can’t predict when archival data will be needed again, and that data is being kept for longer and longer retention periods, what’s needed are solutions that are cost effective and that automatically migrate data from generation to generation of low-cost yet reliable media such as LTO tape.
It was also noted during the conference that with today’s advanced tape solutions, long-term affordable active archiving is a reality. Tape now offers: the lowest TCO for long-term storage thanks in part to its high capacity and very low energy consumption; the best reliability as measured in bit-error rate; transfer speeds greater than HDD at 360 MB/second; and a long archival life of 30+ years. Speaking of capacity, the potential for 220 TB on a single tape cartridge based on Barium Ferrite technology has already been proven. Barium Ferrite will enable achievement of the tape technology roadmap plans well into the future and will support active archiving for many decades to come.
Now if we can just get the Bostonians to pronounce the “r” in active archive, we’ll be all set!
Join us for an exciting webinar series presented by leaders from the Active Archive Alliance. Our goal is to align the education and technologies needed to meet the rapidly evolving requirements for data archive.
Alliance members strive to extend solutions beyond the high-end supercomputing and broadcast markets to the greater general IT audience that is in need of online data archive options. The following three-part series will help to educate and promote active archive strategies for data storage.
Thursday, June 2 at 8am Pacific/11am Eastern
Object Storage May be the Cloud You are Looking For
- Why use object storage?
- Learn more through examples from organizations currently using object storage in an Active Archive solution.
Hosted by: Quantum/HGST
Tuesday, June 7 at 8am Pacific/11am Eastern
Best Practices in Leveraging Active Archives to Solve Data Protection and Cloud Requirements
- What is block, file, object?
- How is this defining new approaches to active archive and object storage?
- What considerations should customers take into account when implementing?
Hosted by: Spectra Logic/DDN Storage
Tuesday, June 21 at 8am Pacific/11am Eastern
MLB Network Hits Home Run with Active Archive
- What is active archiving and why is it important today?
- A quick lesson: backup vs. archive - what’s the difference?
- Technologies used for archiving: tape, cloud, disk, flash and data movers.
Hosted by: Fujifilm/StrongBox Data Solutions
Register now to learn more about how an active archive can give you access to all your data, all the time: http://bit.ly/1R6nUk2.
In today’s media and entertainment industry, workflows need to seamlessly integrate primary storage and application platforms to deliver performance and accessibility. Active archive has become invaluable for repurposing assets that already exist and require large amounts of storage. And, we’re seeing continued innovation in the use of active archives for long-term data access and preservation.
All of the Active Archive Alliance members will be showcasing their solutions for active archive at NAB 2016 next week in Las Vegas. Here’s a peek at what each one has planned and where to find them:
DDN Storage (Booth SL8016)
DDN Storage will showcase how its solutions like the high-performance MEDIAScaler™ Converged Media Storage can modernize your entire workflow in one easy step and deliver much more value and profitability to your organization.
Fujifilm Recording Media USA, Inc. (Booth SL7613)
Fujifilm will be exhibiting its Dternity solution, including new features like Partial File Restore and the Dternity VM – the world’s first virtual machine active archive. Additionally, Fujifilm will be discussing their new Data Migration Services aimed at helping customers break their archives out of the past while integrating new technologies.
StrongBox Data Solutions, Inc. (Booth SL7613)
Visit the StrongBox team in the Fujifilm Dternity booth (SL7613) and discover how to simplify your workflow for long-term content preservation. StrongBox is giving away a PETABYTE of storage, so be sure to stop by the booth to enter.
Hewlett Packard Enterprise (Booth SL2425)
Hewlett Packard Enterprise will be demonstrating the HPE StoreEver Tape archive solution with the latest LTO-7 tape drives. This solution allows for M&E customers to easily integrate the cost effectiveness and reliability of LTO tape as if it were disk into their workflows for long-term archive of their media assets. Come see the solution in action in the StudioXperience booth (SL2425).
HGST (Booth SL9721)
HGST will showcase its 4.7PB Active Archive System, an object storage system that transforms silos of data storage into cloud-scale active archives, featuring its innovative helium-filled 8TB drives. Visit the booth to enter a daily drawing for a G-Technology 1TB G-Drive, or play the HGST Partner Passport Program game for a free t-shirt. Tweet a photo wearing your shirt to enter a daily drawing for an Amazon Echo.
Quantum is announcing new partner integrations with both StorageDNA and Marquis to deliver comprehensive AVID ISIS archival solutions. These active archive capabilities enable ISIS administrators to manage their storage effectively, offloading completed or stalled AVID projects to Quantum Artico, Scalar tape libraries, Lattus object storage and Q-Cloud for near-term and long-term content retention and protection. Stop by Quantum’s booth (#SL8416) to learn more about this and many other exciting demonstrations.
Spectra Logic (SL11816)
This year at NAB Show 2016, Spectra Logic is a finalist for the IABM Game Changer Awards, will be giving a presentation about ‘Genetic Diversity’ April 20th at 10:00am PST, and has products on display in several partner booths. Interested in hearing about our hybrid storage ecosystem for media and entertainment? Come by Booth #SL11816 to learn how Spectra’s deep storage solutions utilize active archive to grant users access to multiple tiers of highly scalable and affordable storage.
Be sure to visit these member booths and ask how an active archive can help you more easily manage and preserve your digital media assets.
by Shreyak Shah
Data from all industries is growing, with massive amounts of structured and unstructured data expected to quadruple in size by 2020. From life sciences, media and entertainment, video surveillance, and oil and gas to cloud providers, many organizations are seeking ways to add more storage capacity and performance while still keeping their ‘archived’ data accessible and secure. One approach that is seeing a lot of traction is active archive, which delivers accessibility, flexibility, security and scalability and is ideal for organizations with data that requires long-term retention and fast, easy retrieval.
Archives are undergoing a radical transformation, fueled by rapidly growing file sizes and new access requirements. Object storage enables IT managers to archive content sooner, reducing the cost of tier-one storage. This preserves opportunities to monetize content while increasing access and speed at a substantially lower price point than traditional RAID disk-based solutions (and not much more than tape).
All too often businesses have had to offload months to years of historical transactions to secondary or tertiary data storage repositories. With the increased volume of data sources and speed of new data creation, keeping pace with these repositories has become a daunting task for any size of organization. More and more organizations require an active archive that allows for easy access to all pools of storage permitting the management of limitless volumes of data for better intelligence, competitive advantage or for regulatory requirements.
Object storage software can transform an archive into an ‘active’ archive that is positioned between high-performance storage and offline tape. The active archive solution brings it all together -- cost-effective spinning disk storage or online tape libraries, a collaboration platform, online content distribution and data protection. It also provides the foundation for a resilient object storage platform that delivers the highest level of data durability in the industry, surviving an entire data-center outage when deployed across multiple sites.
CTOs do not wish to purchase two to three years of storage in advance, as doing so often leads to over-budgeting and under-utilization. Power, cooling and floor space are all wasted with this model. Object storage enables administrators to buy, manage and deploy exactly what is needed and to scale performance and capacity as the organization expands.
It also allows an organization to take control of its content and easily create and manage public, private or hybrid clouds that sit securely behind a firewall. Administrators can also economically and efficiently manage capacity of up to double digit Petabytes with just one full-time employee and enjoy the benefits of the system’s self-healing capabilities. As content continues to grow, an active archive can scale seamlessly to billions of objects in a single namespace. Flexible, user-defined data protection options are key in making this a reality.
Implementing an active archive creates incremental revenue opportunities and provides the ability to convert what has historically been a cost center into a revenue-producing asset. A combined cloud and object storage infrastructure provides additional capabilities, allowing users to collaborate at LAN speeds and access data from any device, anywhere in the world.
By Rich Gadomski
I recently attended the Storage Visions 2016 Conference to participate on a speaker panel entitled: “Saving Data Forever: Long Term Content Preservation and Archiving.” The panel was in agreement that “forever” is a long time especially in the world of IT storage. While we were hard pressed to predict what storage technologies would be available 5,000 years from now as one attendee asked, our advice was to put an active archive in place that can routinely manage the migration of data from performance tier to economy tier and from older storage formats to new ones. The benefits of old to new migration typically include better performance, reduced footprint from greater density and lower total cost of ownership.
In a typical active archive environment, data management software migrates data by policy from expensive primary storage tiers to more cost-effective tiers such as tape, while maintaining the convenience of online file access to all of the data. Data can also auto-migrate from older tape formats to new formats within a tier. Take LTO tape as an example: you can upgrade your LTO-5 drives to LTO-7 drives and auto-migrate data in this tier from LTO-5 media to LTO-7 media. In doing so, you get an easier conversion with a big jump in per-cartridge capacity from 1.5 TB to 6.0 TB and a much faster transfer rate, going from 140 MB/sec to 300 MB/sec, which reduces the number of drives and robotic library slots required and gives you more room to grow.
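Using the native figures quoted above, and an illustrative 300 TB archive chosen purely for the example, a quick back-of-the-envelope calculation shows why the upgrade shrinks both the cartridge count and the read-out time:

```python
import math

# Native (uncompressed) per-cartridge capacity and transfer rate.
LTO5 = {"capacity_tb": 1.5, "rate_mb_s": 140}
LTO7 = {"capacity_tb": 6.0, "rate_mb_s": 300}

archive_tb = 300.0  # illustrative archive size, not from the article

def cartridges(spec):
    """Cartridges (and library slots) needed to hold the archive."""
    return math.ceil(archive_tb / spec["capacity_tb"])

def read_hours(spec):
    """Hours to stream the full archive at native rate (1 TB = 1e6 MB),
    ignoring mount, seek, and robotics overhead."""
    return archive_tb * 1e6 / spec["rate_mb_s"] / 3600

print(cartridges(LTO5), cartridges(LTO7))  # 200 vs 50 cartridges
print(round(read_hours(LTO5)), round(read_hours(LTO7)))
```

The 4x capacity jump cuts the slot count to a quarter, and the faster drives cut the sequential read-out time by more than half, which is where the "fewer drives, more room to grow" benefit comes from.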
The ability to migrate is key to keeping up with relentless data growth for the long term. That is why having a reliable technology roadmap is so important. The LTO roadmap has been extended from eight generations to ten. Currently, the newest generation on the market is LTO-7, with a native capacity of 6.0 TB. Generations 9 and 10 have been added to the roadmap, where native capacity in generation 10 is expected to rise to an impressive 48 TB, eight times greater than LTO-7, with an impressive native transfer rate of 1,100 MB/sec. These new generations will provide the ongoing migration capability necessary to keep up with data growth and will, as usual, ensure backwards compatibility with the two previous LTO generations. No other data recording technology can present a roadmap with this much capability to look forward and plan for comprehensive data archiving.
Is generation 10 the end of the roadmap? Not likely, as IBM and Fujifilm announced back in April of 2015 the achievement of a new record of 123 billion bits per square inch in areal data density on linear magnetic tape using Barium Ferrite particle technology. This equates to a standard LTO cartridge capable of storing 220 TB of uncompressed data, 36 times greater than LTO-7 capacity! Given this achievement, the new LTO roadmap should be easily achieved and extended beyond generation 10.
This is good news for your migration strategy and long term, cost-effective active archiving!