Data management deserves more airtime at board-level meetings
According to Veeam Software’s APJ Senior VP, Shaun McLagan, the company has a long history of strength in the small-to-medium business (SMB) and commercial space. Its move into the enterprise space came later, but not because of a gap in its technology offerings or an inability to address enterprise-level requirements, said the SVP.
With co-CEO Peter McKay’s recent restructuring of the company to align better with partners, technology alliances have become a key thrust of Veeam’s entry into the enterprise market.
Key alliance partners include Hewlett Packard Enterprise, Cisco, NetApp and Pure Storage, for a total of ten so far. Veeam has also delineated its markets regionally in much the same way as partners like NetApp delineate theirs, allowing joint sales motions to happen easily and seamlessly.
McLagan said, “We didn’t perceive a gap in the way we approached (enterprise), just that our channel focus was SMBs in the past. We now engage a high-touch sales force to direct opportunities back to Veeam, and (our partners) help us enter accounts.”
Alliance partners do not work exclusively with Veeam; the NetApp partnership, for example, begins in May. McLagan added that feedback from these partners has been that Veeam is taking market share from the other five data protection players in the Gartner Magic Quadrant, the likes of Commvault, Dell EMC and Veritas.
“Where I think we succeed, is with organisations that are looking to embrace modern IT, or to change something, or to do a data centre refresh, or are simply just looking to move off their legacy solutions.”
The SVP observed that most other data protection players in Gartner’s Magic Quadrant are actually based on legacy technologies. “But, we were born in the virtualisation space. And modern IT is definitely virtualisation, cloud-based, and lately, it’s about moving between clouds.”
He spoke of an earlier period, Cloud 1.0, when workloads were set and then forgotten about. Things have changed, however. “Now, with cloud as a vehicle of compute, we see data move between (different) clouds, and come back on-premises in some instances.”
Anticipating that this trend of data and workload mobility will become more prevalent, Veeam wants to be ready and able to protect any data in any cloud.
The future of data management
McLagan shared that in the APJ region, the enterprise contribution is still small. “We have a long history of SMB and commercial sales channels. With enterprise, the deals are larger, more complicated and take more time.
“However, every quarter, we are doing more 6-figure deals that tend to come out of the enterprise segment.”
Veeam has also moved from talking about product features to talking about enterprise requirements and outcomes.
“We don’t have a tech gap; for the past year and a half we only had a perception gap. Today, the fact that we are bigger in revenue than Commvault is a surprise to people.”
That isn’t the only thing that surprised the industry about Veeam. “Last year we did a couple of things differently – released our first agent for Windows to get access to standalone systems.” Veeam also acquired software from Cristie Software to add Unix agents.
“We didn’t do that because we think that there’s lots of workloads in Unix. We did it to allow data to be brought back to the Veeam platform,” McLagan explained, adding that Veeam takes an API approach to allow quick integration with other tech alliance partners like Pure Storage.
“This is especially so as we move up to the higher end. Organisations look to us to be integrated into their platform.”
Roadmaps and changing mindsets
What if the management of data could be more intelligent? Instead of relying on policy alone, with rigid schedules dictating what to do with data at specific times, might analytics and artificial intelligence provide a more responsive approach to data management? For example, quickly taking a snapshot of data and moving it elsewhere when malware is detected on the network.
McLagan explained that APIs can sit in front of the platform and accept calls from third-party systems to initiate data movement, failover, replication and so on.
“There is still a bit of policy and scheduling, but we can take more input from behaviour-based intelligence and third-party technologies, by integrating with the likes of Cisco and HyTrust, for example.”
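In rough terms, the behaviour-driven model McLagan describes could look something like the following sketch. This is purely illustrative: the class and method names are hypothetical stand-ins, not Veeam’s actual API, and the “snapshot” here is a simple in-memory record rather than a real backup operation.

```python
# Illustrative sketch of event-driven data protection: scheduled policy still
# runs as usual, but a high-severity security detection triggers an immediate
# snapshot call. All names here (SecurityEvent, BackupAPI, handle_event) are
# hypothetical, not any vendor's real API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class SecurityEvent:
    source: str    # e.g. an intrusion-detection or security platform
    severity: str  # "low", "medium" or "high"
    host: str      # machine on which the event was observed

@dataclass
class BackupAPI:
    """Stand-in for a data protection platform's API endpoint."""
    snapshots: list = field(default_factory=list)

    def snapshot(self, host: str) -> dict:
        # Record a snapshot request with a UTC timestamp.
        record = {"host": host, "taken_at": datetime.now(timezone.utc).isoformat()}
        self.snapshots.append(record)
        return record

def handle_event(event: SecurityEvent, api: BackupAPI) -> Optional[dict]:
    # High-severity events bypass the schedule and snapshot immediately;
    # everything else falls through to the normal policy-based jobs.
    if event.severity == "high":
        return api.snapshot(event.host)
    return None

api = BackupAPI()
result = handle_event(SecurityEvent("ids", "high", "db-01"), api)
```

The point of the sketch is the division of labour McLagan outlines: scheduling remains, but third-party signals can pre-empt it.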
McLagan also shared that these types of integration can be done, and are being done, in pockets. “I won’t say it’s ubiquitously used in data management and policy, but I see take-up being quite strong in 12 to 24 months.”
This is due to the increasing interest that McLagan observed coming from the security and risk side of organisations. “The concept of disaster recovery is quite ubiquitous, but incident recovery is more prevalent in cybersecurity scenarios.
“A lot of times, availability is the last step in an incident response scenario. As organisations build incident response plans, the two worlds of Chief Information Security Officer (CISO) and CIO try to understand how their respective teams – infrastructure and IT – can respond to incident planning.”
“Acknowledgement that we can protect data anywhere is more prevalent now than before, and organisations’ effort to reduce manual intervention is there.
“Our job is to offer that, not just in single silos but overall, and we will continue to invest in acquisitions and R&D, for this to happen,” McLagan concluded.
Data management, and outcomes like availability in the event of disaster, needs to be a far more senior-level topic. Press reports about organisations’ poor handling of data only strengthen this view.