Guest Article

Don’t Let Your Workloads Get Stuck in the Mud – Building for Portability in the Cloud

In today’s data-driven business world, moving large volumes of data is hindered by legacy or customised applications that resist migration and by concerns about vendor lock-in to specific cloud services. The persistent threat of cyberattacks such as ransomware adds pressure, forcing teams to balance cost and security for every workload. In response, IT teams are designing their environments for flexibility and adaptability – but there are key questions to answer first.
By Rick Vanover, Senior Director – Product Strategy, Veeam

As businesses look to optimise their costs to weather economic downturns, ramping up cloud spend can cause some headaches. While there are plenty of options to mitigate this, from moving workloads to a more cost-effective environment (or even back on-premises) to re-architecting to save costs, organisations often lack the technical agility to make the most of them.

With modern businesses carrying so much data, legacy or homegrown applications that don’t allow for transfer, and cloud lock-in all to contend with, it can quickly feel like trying to fit a thousand square pegs through a thousand round holes. All of this plays out against the backdrop of cyber threats like ransomware, so the right balance between cost and security needs to be found for every workload. To stay ahead of this, IT teams are increasingly designing and adjusting their environments with portability in mind – but there are some questions to ask yourself first.

Why move data at all?
To state the obvious for a second, modern enterprise IT environments are vastly complex. They can be monolithic and highly dispersed, and the growing data gravity of some environments has turned many companies into “digital hoarders.” This is problematic in itself, as holding on to data you don’t need exposes you to unnecessary cybersecurity and compliance risks. But data bloat in the cloud also brings severe financial consequences – and the dreaded “bill shock” when the invoice lands.

So, even though many companies moved to the cloud in the first place to optimise costs, the flexibility the cloud gives businesses can be something of a double-edged sword. The attraction of the cloud is that you only pay for what you use, but the flip side is that there is no “spending cap,” so costs can easily spiral out of control. Better data hygiene helps, but for the data you do need, it’s about picking the right platform for each workload – which may involve re-platforming or re-architecting to optimise costs. This is where data governance and hygiene come in: before looking to move data or improve processes, you need to know exactly what data you have, and where.
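
To illustrate the “no spending cap” point: most major clouds let you set budget alerts so runaway costs surface before the invoice lands. Below is a minimal sketch using AWS Budgets via the boto3 library; the account ID, budget name, limit, and email address are placeholder assumptions, not details from this article.

    import boto3

    # Minimal sketch: a monthly cost budget that emails an alert at 80% of
    # the limit. Assumes AWS credentials are already configured; every
    # name and value below is a placeholder.
    budgets = boto3.client("budgets")

    budgets.create_budget(
        AccountId="123456789012",  # placeholder account ID
        Budget={
            "BudgetName": "monthly-cloud-spend",
            "BudgetLimit": {"Amount": "5000", "Unit": "USD"},
            "TimeUnit": "MONTHLY",
            "BudgetType": "COST",
        },
        NotificationsWithSubscribers=[
            {
                "Notification": {
                    "NotificationType": "ACTUAL",
                    "ComparisonOperator": "GREATER_THAN",
                    "Threshold": 80.0,  # percent of the budget limit
                    "ThresholdType": "PERCENTAGE",
                },
                "Subscribers": [
                    {"SubscriptionType": "EMAIL", "Address": "finops@example.com"}
                ],
            }
        ],
    )

An alert is not a hard cap, but it turns a surprise bill into an early warning.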

What data can we move?
So, once you’ve established what data you should think about moving – whether to a different environment, server, or storage tier – the next, more difficult question is what data you can move. Unfortunately, this is where many organisations hit challenges. Data portability is crucial both for moving things around as needed and for simply maintaining data hygiene in the long term. But several factors can make it difficult to move or transfer workloads from one location to another. The first is “technical debt” – essentially the extra work and maintenance required to bring older or scratch-built applications to the point where they are transferable and compatible with other environments. The causes might be shortcuts taken, mistakes made, or standard procedures skipped during software development. But leaving that debt unpaid makes it impossible to optimise environments and can cause additional problems for things like backup and recovery.
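
As one concrete, vendor-neutral example of paying down that debt: a common first step is externalising hardcoded, environment-specific settings so the same application can run unchanged wherever it lands. A minimal Python sketch, with all names assumed for illustration:

    import os

    # Before: values baked into the code tie the app to one environment.
    # DB_HOST = "10.0.3.17"
    # STORAGE_PATH = "D:\\legacy\\data"

    # After: configuration is injected from the environment, with safe
    # defaults, so the same code runs on-premises or in any cloud.
    DB_HOST = os.environ.get("DB_HOST", "localhost")
    STORAGE_PATH = os.environ.get("STORAGE_PATH", "/var/app/data")

    def connect_string() -> str:
        # Build the connection string from injected configuration rather
        # than values fixed at build time.
        return f"postgresql://{DB_HOST}:5432/appdb"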

The other, perhaps more infamous, issue that can affect data portability is cloud lock-in. It is well known at this point that businesses can easily become locked into specific cloud providers. This can be due to dependencies like integrations with services and APIs that can’t be replicated elsewhere, the sheer “data gravity” a workload builds up in a single cloud, or a simple knowledge gap: teams know how to use their current cloud but lack the expertise to work with a different provider. Of course, lock-in only affects moving workloads out of that cloud, so it’s still possible to build for better portability within it, giving you more storage options and promoting better data hygiene. Essentially, where possible, businesses need to create some standardisation across their environments – making data more uniform and portable, and mapping and categorising it so they know what they have and what it’s for.
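
As a sketch of what that mapping and categorising can look like in practice, the snippet below inventories object storage buckets and applies a classification tag, using AWS S3 via boto3. The tag key and value are assumptions for illustration, not a prescribed scheme.

    import boto3

    # Minimal sketch: list buckets and tag each with an assumed
    # classification so data is mapped before any migration decision.
    s3 = boto3.client("s3")

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # Note: put_bucket_tagging replaces any existing tag set, so a
        # real script would merge with the bucket's current tags first.
        s3.put_bucket_tagging(
            Bucket=name,
            Tagging={"TagSet": [{"Key": "data-class", "Value": "internal"}]},
        )
        print(f"tagged {name}")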

The (constant) security question
Finally, it’s crucial when building and capitalising on data portability that security is not left behind. Improving security can (and should) be a motive for moving workloads in the first place, but if you’re migrating workloads to optimise costs, this must be balanced against security considerations. Security needs to be part of the data hygiene process, so teams need to ask: “What do we have?”, “What do we not need?” and “What are the critical workloads we absolutely cannot afford to lose?” Beyond this, continue to patch servers, and when moving data to colder storage, remove internet access when it’s not needed.
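
To make that last point concrete: on AWS, for instance, cutting off public access to a bucket holding cold data is a single call. A minimal boto3 sketch, with the bucket name as a placeholder assumption:

    import boto3

    # Minimal sketch: block all public access to a bucket of cold data,
    # shrinking its exposure while it sits in cheaper storage.
    s3 = boto3.client("s3")

    s3.put_public_access_block(
        Bucket="example-cold-archive",  # placeholder bucket name
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )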

Having backup and recovery processes in place is also key when moving workloads. And to come full circle, easy data portability matters for disaster recovery too. In a critical event like ransomware, the original environment – be it a cloud or an on-premises server – is often unavailable for recovering damaged workloads from backup: it is typically cordoned off as a crime scene, and may still be compromised. To recover quickly and avoid costly downtime, workloads sometimes need to be restored to a new temporary environment, such as a different cloud.

As organisations strive to manage their IT environments and avoid financial and cybersecurity surprises, it’s important to constantly assess what data and applications you have, and where they are kept. But to manage this and adjust as needed, businesses must build with portability in mind. By doing so, they can create a more agile and cost-effective cloud environment – and will find it easier to bounce back from disasters like ransomware.
