Gartner predicts that over the next two years, organizations that do not revise data retention policies to reduce the overall volume of data they hold, and by extension the data they back up, will face a significant risk of sanctions for noncompliance as well as the impact of an eventual data breach.
For instance, GDPR non-compliance carries regulatory fines of up to €20 million or 4% of annual global turnover, whichever is higher. The challenge for organizations is balancing security and business needs.
A recent panel discussion examined the data security dilemma. Featuring speakers from NetApp and Veeam, it argued that companies' view of backup needs to change.
The new privacy mindset
The panel discussion began by examining the impact of the General Data Protection Regulation 2016/679 or GDPR.
The panelists noted that the multi-faceted EU law changed the way companies viewed data privacy. It also reshaped the data security landscape by making compliance a global concern.
“The truth is that there is no perfect solution for testing GDPR compliance,” said Joseph Chan, regional director – Hong Kong, Macau & Taiwan, Veeam.
Chan noted that there are solutions that tackle some parts of the regulation. For example, Veeam DataLabs Staged Restore removes sensitive data before restoring backup data. It is a valuable feature for addressing the "right to be forgotten" rule in GDPR.
Vendors are also looking within, said Qinghong You, director of Technology and Solutions Group, NetApp. He acknowledged that NetApp, being a global organization, itself has invested in improving GDPR awareness.
“GDPR is now part of our sales, technical and professional services training. We also built our product portfolio with GDPR compliance in mind,” said You.
The virtual challenge
New regulations are one reason for changing attitudes. Another is the drive to become an online business, which changes the way companies see data backups.
In the past, IT teams focused on backup windows and costs. Companies bought faster backup systems and tiered data according to their value.
“Today, they focus on restoration times and service level agreements,” said Veeam’s Chan. The shift in focus is because “no one can tolerate a few seconds of delay when doing an online transaction.”
Customers also expect online businesses to operate 24/7. NetApp's You argued that it challenges the way companies design their infrastructure. “It needs to be elastic enough to meet peak demands."
Faced with limited budgets, many are choosing a “consumption model.” "So, they pay for what they actually use," he said. It also changes the way vendors develop and sell their software, he added.
How to not trust anyone
Many companies embrace the consumption model through the cloud. But many also overlook the need for their security approach to change as well.
The traditional “castle and moat” strategy becomes irrelevant, said NetApp’s You. “It considers anyone outside the castle a barbarian or intruder. The moat protects the castle from them. Anyone inside [the castle] is assumed to be trustworthy,” he explained.
This approach was fine when the internal environment had a defined perimeter. The cloud environment blurred these perimeters. Your internal employee or trusted partner may become your biggest threat. A third-party developer accidentally installing a backdoor can compromise your internal environment.
So NetApp is championing zero trust networks, a concept the company adopted internally. “In zero trust networks, you trust nobody. A segmentation gateway predefines everything you do. It is how we control our intellectual property,” said You.
In the zero-trust model, actions that are not defined "will be denied." You compared it to schoolroom wisdom, where students are taught to question using the 5 W's and 1 H: "who, what, when, where, why and how."
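The default-deny principle You describes can be sketched in a few lines. The rule fields below mirror a subset of the 5W1H conditions; the names, paths and policy set are illustrative assumptions, not NetApp APIs.

```python
# Hypothetical sketch of a default-deny ("zero trust") policy check.
# Field names and resource paths are illustrative, not a real product API.
from dataclasses import dataclass


@dataclass(frozen=True)
class Request:
    who: str   # identity of the caller
    what: str  # resource being accessed
    how: str   # operation, e.g. "read" or "write"


# Every allowed action must be predefined explicitly; anything else is denied.
ALLOWED = {
    Request(who="backup-svc", what="/vol/finance", how="read"),
    Request(who="dev-team", what="/vol/sandbox", how="write"),
}


def authorize(request: Request) -> bool:
    """Return True only if the action was predefined; the default is deny."""
    return request in ALLOWED


print(authorize(Request("backup-svc", "/vol/finance", "read")))   # True
print(authorize(Request("backup-svc", "/vol/finance", "write")))  # False
```

The key design point is that the policy is an allow-list: an action absent from the set is denied by default, rather than permitted by default as in a perimeter model.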
“A zero trust network defines every action explicitly based on these conditions. These capabilities are now built into our NetApp products,” said You. The company’s FPolicy framework for controlling data access is a prime example of this approach.
But solutions are only part of the answer. NetApp's You advised companies to look at how they handle data. Employing a chief data officer (CDO) could help, he added.
“The primary job of the CDO is to classify the data. If you classify a piece of data as toxic, you can take the right actions to ‘de-tox’ it by either encrypting or isolating it,” he said.
NetApp's You noted that such classifications could help companies to reduce their data liability. "Keeping data you do not need is an unnecessary liability," he said.
Becoming data smart
Companies now see data portability as being as important as data protection. It is especially so for those with different cloud environments, said Veeam's Chan.
“[Companies] are always looking to move their data securely and easily across different clouds. We are looking to be the bridge,” said Chan.
Veeam and NetApp are using APIs to make protection smarter. Both Chan and You saw the API integration effort between their two companies benefiting their clients.
Meanwhile, NetApp is using APIs to connect with other security vendors. The shared information can help to improve security readiness.
“Take for example ransomware. Sharing data can help companies to be aware of any internal threats that go unreported,” said You.
APIs also allow software developers to integrate NetApp capabilities. “You can now build your infrastructure using our capabilities,” You added.
Expanding the backup promise
Advances in backup and storage are opening new possibilities for companies. The panelists ended the discussion by looking at the future impact for companies.
One area of immense interest is automation. Economics is the primary motivation. "A lot of companies conduct disaster recovery drills, but they spend up to three months to prepare for one. It is a lot of man-hours spent," said Veeam's Chan.
His company’s SureBackup cuts those months down to weeks by automating these processes. “It also allows you to repurpose your human resources for other tasks,” he added.
Veeam is using backup data for setting up sandboxes. After all, the data offers accurate snapshots of the company's internal environment. Developers can use sandboxes to test and fine-tune their applications, said Chan.
Companies are also using the data in artificial intelligence and robotic process automation. Algorithms can use backup data for supervised training and optimizing processes.
“We have technology in place to train your software bots faster using your backup data,” said NetApp’s You.
Arguments for a revisit of security approaches
“External factors and security-specific threats are converging to influence the overall security and risk landscape,” noted Peter Firstbrook, research vice president at Gartner. He urged leaders in the space to properly prepare to improve resilience and support business objectives.
As organizations continue to push towards adoption of the cloud, in all its forms, he suggests that they address data security and data protection by starting with an assessment of the risks involved. Only then should the question of which technology to bring in come into play. Too many organizations take a technology-first approach and then try to adapt those tools to suit the business needs.