When we decompose our monolithic systems, we shift complexity from the software itself to operations: the software becomes simpler, but the infrastructure becomes more complex. How can we tackle this issue? DevOps is the answer.
What exactly is DevOps?
Let me tell you what it definitely is not – a job title. We’ve seen it way too many times: “Looking for DevOps people”, “Our DevOps team”, “The DevOps guys can take care of it”. The term has been twisted further by IT recruiters and has become just a buzzword these days.
What is DevOps, then? We would say it’s a mindset. It is about taking all the great practices developers have used for years and applying them to Operations. But it is also about developers having Operations in mind when building their apps. No more tossing software over the fence and saying: “Works in Dev, it’s Ops’ problem now”. To succeed, Developers and Operations need to work hand in hand and draw on each other’s experience. However, DevOps requires a specific team structure as well as a different way of thinking on both sides.
Over the years, engineers who live by DevOps standards have come up with many new tools that help manage and operate complex systems in a way that ensures security and reliability without compromising equally important aspects such as flexibility and speed.
You can spin up a whole Virtual Private Cloud (VPC) in a matter of minutes using tools like Terraform, and deploy your workloads onto it with Helm. What’s more, you can be certain it works exactly as you expect. However, those tools alone aren’t enough – you need infrastructure to run your VPC on. Don’t be fooled: even serverless apps need servers. We’d like to focus on public clouds, with emphasis on GCP. Why? Let’s explain briefly.
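To give a taste of what this looks like in practice, here is a minimal, hypothetical Terraform sketch that declares a custom-mode VPC with a single subnet on GCP. The project ID, names, region and CIDR range are placeholder assumptions, not values from any real setup:

```hcl
# Minimal sketch: a custom-mode VPC with one subnet on GCP.
# Project ID, names, region and CIDR below are placeholder assumptions.
provider "google" {
  project = "my-project-id"
  region  = "europe-west1"
}

resource "google_compute_network" "vpc" {
  name                    = "demo-vpc"
  auto_create_subnetworks = false # custom mode: we declare subnets ourselves
}

resource "google_compute_subnetwork" "subnet" {
  name          = "demo-subnet"
  ip_cidr_range = "10.0.0.0/24"
  region        = "europe-west1"
  network       = google_compute_network.vpc.id
}
```

Because Terraform is declarative and idempotent, running `terraform apply` against this file creates the network once, and running it again changes nothing – which is what makes the “it works just as you expect” claim possible.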
What is a public cloud?
Nowadays we try to time-share everything from scooters, cars and offices to apartments. We don’t need an electric scooter daily, we just want to get from A to B quickly. We don’t want to think about the maintenance of said scooter, or about getting it back to where we started our journey.
So, what is a public cloud? Think of it as renting a data center by the minute. A highly advanced, super-efficient, top-of-the-line data center that can accommodate your systems. And it is much more than the underlying infrastructure of servers, switches, load balancers, power supplies, and so on.
All major public cloud providers offer a ton of services – from managed database engines to pre-trained AI models. You have to know which ones can help you build better, faster and more resilient systems. And the best part is that you can use them right now and for as long as you’d like. You don’t have to make any up-front investments in infrastructure or buy hardware – you pay only for the computing power you actually use.
Is public cloud safe?
How “public” is the public cloud? From a physical point of view, you can’t really get in without a tank (please refer to the video). You’ve got multiple layers of security, access verification and a professional process for disposing of old hardware.
What about my data? That’s a different story: it is as safe as you make it. All major cloud providers are GDPR-compliant. They offer a wide range of solutions to protect your systems, such as at-rest data encryption, VPN connections, identity management and so on. It is your responsibility to use these tools wisely. Fortunately, they do come pre-configured with best practices in mind, so you would have to deliberately change them to weaken their security.
After you decompose your system, you might be hesitant to migrate every part of it to the cloud. Luckily, you don’t have to. You can set up your new architecture to run some parts of your system on on-prem infrastructure, some microservices in Azure and some in GCP. All parts of your complex system can connect securely using VPNs or other secure connections.
Keep in mind, though, that this adds another layer of complexity to your system: you are now managing multiple cloud providers and multiple VPCs. We’re not saying that it’s a showstopper.
Using a hybrid cloud can:
- ease your way into the public cloud
- help you use different services from each cloud provider
- save you from vendor lock-in
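To make the hybrid picture concrete, the secure link between an on-premises network and a GCP VPC can itself be declared as code. Below is a hedged Terraform sketch of a Classic VPN tunnel; the gateway names, peer IP, secret and CIDR ranges are all placeholder assumptions, and a real setup would also need ESP/UDP forwarding rules and routes:

```hcl
# Hypothetical sketch: Classic VPN tunnel from a GCP VPC to an
# on-premises gateway. All names, IPs and the secret are placeholders;
# a working setup also needs ESP/UDP forwarding rules and routes.
resource "google_compute_vpn_gateway" "gw" {
  name    = "onprem-gateway"
  network = "demo-vpc"        # assumed existing VPC
  region  = "europe-west1"
}

resource "google_compute_vpn_tunnel" "tunnel" {
  name               = "onprem-tunnel"
  region             = "europe-west1"
  peer_ip            = "203.0.113.10"  # public IP of the on-prem VPN device
  shared_secret      = "replace-me"    # pre-shared key, keep out of version control
  target_vpn_gateway = google_compute_vpn_gateway.gw.id

  local_traffic_selector  = ["10.0.0.0/24"]    # cloud-side subnet
  remote_traffic_selector = ["192.168.0.0/24"] # on-prem subnet
}
```

The point is not this particular tunnel but the workflow: the same declarative tooling that builds your VPC can also describe the connections between clouds and your data center, keeping the hybrid setup reviewable and reproducible.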
Benefits of cloud
The benefits of using the cloud are described in two recent publications:
- Google Cloud and DORA (DevOps Research and Assessment), in their 2021 Accelerate State of DevOps Report, focus on different features while sharing new insights, implementation best practices and the advantages of embracing them:
"Our research continues to illustrate that excellence in software delivery and operational performance drives organizational performance in technology transformations. (…) This year we saw that elite performers continue to accelerate their pace of software delivery, increasing their lead time for changes from less than one day to less than one hour.”
- The findings provided by Google in their report are complemented by Puppet’s 2021 State of DevOps Report:
"97% of respondents with highly evolved DevOps practices agree that automation improves the quality of their work. (…) Almost everyone is using the cloud, but most people are using it poorly. However, highly evolved DevOps teams are using it well (…). While cloud and automation are important, organizations also need to address organizational and team aspects, namely helping teams clarify their mission, primary customers, interfaces, and what makes for healthy interactions with others.”
These are examples of qualities which, together with agility and reliability, can be greatly improved by decomposing your current monolithic system, adopting DevOps practices and using public clouds.
The biggest concerns we encounter regarding cloud migration often stem from the idea that you have to migrate your whole business and all its components at the same time.
Once you understand that decomposition is the way to go, the whole process becomes far less complicated and easier to plan and execute. So take advantage of public clouds and DevOps practices, and succeed!