Big Data has a variety of meanings today. For many, it is simply the challenge of managing out-of-control data growth. For others, Big Data is the starting point for deploying analytics solutions that extract valuable insight from the enormous volume, velocity, variety, and variability of the data their enterprise generates, in pursuit of the next competitive advantage. Whether backing up billions of files, protecting anywhere from a few databases to thousands, or leveraging data warehouse appliances, NetBackup's advanced integrated technologies can help simplify globally deployed Big Data and reduce its cost, risk, and time to value.
The Big Data experiment is evolving, and many organizations are realizing they can accelerate their success. Much of Big Data's promise lies in leveraging known, proven technologies. Databases and data warehouses have been used for decades to run businesses and mine data, and they are well understood. Now they are viewed as key components for architecting advanced analytics solutions. NetBackup Intelligent Database and Data Warehouse appliance agents integrate with and leverage the proven global management and scalability of the NetBackup Platform for Big Data solutions.
The final key component in architecting an advanced analytics solution is making sense of the data the enterprise generates today. The emerging trend is to use a Hadoop/MapReduce-style processing engine to create a Data Refinery, where the sheer volume, velocity, variety, and variability of millions of files can be standardized for import into either a data warehouse or database applications for analysis. To ensure integrated protection and scalability, customers are starting with Symantec and the NetBackup Platform to architect Big Data solutions.
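The Data Refinery pattern described above can be illustrated with a minimal MapReduce-style sketch: a map phase that normalizes heterogeneous raw records into a uniform schema, and a reduce phase that aggregates them for warehouse import. The record fields and normalization rules here are illustrative assumptions only, not part of any NetBackup or Hadoop API.

```python
# Hypothetical sketch of a MapReduce-style "Data Refinery" pass:
# standardize varied raw records, then aggregate per key before
# loading into a data warehouse. Field names are assumptions.
from collections import defaultdict

def map_standardize(raw_record):
    """Map phase: normalize one raw record into (key, value) pairs."""
    # Raw records arrive in varying shapes and casings; emit a
    # uniform (source, size_in_bytes) pair for each record.
    source = raw_record.get("src", "unknown").lower()
    size_bytes = int(raw_record.get("size", 0))
    yield (source, size_bytes)

def reduce_aggregate(pairs):
    """Reduce phase: sum values per key, ready for warehouse import."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Example input: inconsistent source names and string-typed sizes.
raw = [
    {"src": "WebLogs", "size": "1024"},
    {"src": "weblogs", "size": "2048"},
    {"src": "Sensors", "size": "512"},
]
pairs = [pair for record in raw for pair in map_standardize(record)]
refined = reduce_aggregate(pairs)
# refined == {"weblogs": 3072, "sensors": 512}
```

In a real deployment the map and reduce phases would run in parallel across a Hadoop cluster over millions of files, but the standardize-then-aggregate flow is the same.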