Cloud Native Trend Report: A Faster, More Scalable Approach To Building Applications
Highlights and Introduction
By Frank Eaves
Here at DZone, it’s always exciting to see new technologies emerge and to observe trends that spark evolution across industries.
We’re further inspired by the diverse groups of people and companies who join forces to innovate, create tools, and contribute to
the space. Because of that, such efforts cascade down to the individual developer and have a lasting impact.
Today, these initiatives are driving the adoption of a standardized approach for developing and deploying software applications to
the cloud — becoming ‘cloud native.’
In the past, a large piece of software contained many modules, usually tightly coupled. New principles like CI/CD emerged and
put an emphasis on continuous testing and security. As a result, testing became more efficient, the cost of software maintenance
declined, and the overall complexity of the SDLC decreased.
With the growing development and availability of cloud technology also came different ways of looking at and solving software
problems. These techniques and supporting tools shaped what’s broadly known today as cloud-native computing.
Microservices — an approach that’s taken over the software industry and its utilization of cloud resources — involves breaking
applications into smaller separate pieces of software, all running independently to solve a larger problem.
Both large and small software companies are moving in this direction, migrating their architectures to the cloud and adopting
microservices to manage applications. As part of this process, large software companies are breaking their monolithic
software down into microservices so that they can leverage cloud resources, offloading a significant portion of
software management to cloud providers.
Perhaps more significant is that doing so keeps them competitive with the smaller companies and startups that already use
these services to boost developer productivity and to strengthen the software products and services they have in the market.
This report highlights key findings from our original research on these trends and addresses the impact of cloud-native technology
across industries. It then explores what it takes to become cloud native and the business advantages of adoption.
Malvi Goyal kicks us off with a detailed look at what it takes to be “cloud-native,” exploring the business value generated by
cloud-native applications, including auto-scalability and reduced infrastructure investments.
In Viraj Phanse’s article, “Deploying Microservices and the Future of Cloud Native,” he dives deeper into the role of microservices
in cloud-native technology and how this will change development and deployment for the better.
According to Goyal, “the million-dollar question for today’s enterprises is less about why they need cloud-native and more about
how they become cloud-native!”
Since the early 2010s, IT organizations around the world have been transforming the way they build applications using an approach
known as “cloud native.” According to the Cloud Native Computing Foundation,
“Cloud native technologies empower organizations to build and run scalable applications in modern, dynamic environments such
as public, private, and hybrid clouds. Containers, service meshes, microservices, immutable infrastructure, and declarative APIs
exemplify this approach.”
To better understand the current state of cloud native almost a decade after its birth, we surveyed 396 software developers from
DZone’s global audience.
[Figure: Why do you use microservices? Top response: improve quality by having teams focus on just one piece of an app (51%).]
Now that we know why developers use microservices, how do they build them? For the majority of developers we surveyed on
DZone, the clear winner was Spring Boot; other tools included Java EE, AWS Lambda, and Prometheus.
Perhaps most revealing about the impact of microservices: 60% of developers believe it makes their jobs easier, twice the
number who felt it does not. The remaining group was unsure. So what challenges do developers experience with
microservices?
While almost everyone uses microservices today, that does not mean the technique comes without complication. Across the
board, developers face similar challenges, though one is unrelated to technology: changing the culture to be open to microservices,
a major barrier for nearly half of respondents. Communicating between microservices and changing API contracts were close
behind. Security, of course, was a noteworthy challenge for more than a third of respondents (Figure 1).
Inherent to innovation is the need to depart from traditional ways of working. Particularly for larger, more established companies,
switching from legacy architectures to microservices can take significant time and effort. In fact, when asked what challenges
they face while switching from legacy architecture to microservices, one-third of developers said that the time investment alone
is a major challenge. However, the biggest challenge developers face by far (64%) was determining how to break up monolithic
components of existing apps. Challenges following that included overcoming tight coupling (46%), bugs (39%), and incorporating
other technologies like containers (35%).
[Figure 1: What challenges do you experience with microservices? (bar chart; leading responses fall between roughly 30% and 45%)]
As with any change in software development and deployment, security is always an important consideration. The leading security
concerns related to cloud native for developers are sensitive data exposure (56%) and API security risks (55%). Secondary
concerns revolve around how security fits into day-to-day work. This result is not surprising given the constant pressure and
demand developers now face to create, test, and deploy new apps on a continuous integration and continuous delivery (CI/CD)
model — a critical component of being cloud native.
Microservices have been revolutionary in changing the way apps are developed and deployed. However, breaking down larger
applications into smaller microservices adds additional layers of complexity to the process. Managing increasing numbers of
microservices can become difficult if developers are not careful. This is the point at which containers come to the rescue. With the
explosion of microservices have come an equally widespread use of containers to manage them. Virtually every organization uses
containers — or is working to implement them — within their workflow to help manage their microservices architecture.
[Figure: Does your organization use containers? Response options: yes; no, but plan to in the next year; no, and no plans to.]
When organizations begin using containers, there are many moving parts right from the start. Of the respondents who said that
their organization has used containers for the past one to two years, two-thirds use them in both development and production. By
year three, the number jumps to 84%.
At the same time, the number of developers who said that their organization uses containers only in development differs
significantly between years 1 and 2 (22%) and years 3 or more (8%). This demonstrates that while organizations jump right into
container development and production, it can take several years to fully integrate containers into the DevOps process.
Like microservices, using containers adds another layer of complexity to the software development lifecycle. While
containers help to manage microservices, managing their deployment manually takes away precious time from developers. That
is why 60% of organizations use orchestration tools to automate certain tasks and another 30% plan to in the next year. Container
orchestration allows development teams to focus on building, deploying, and maintaining apps continuously and scaling apps
more quickly. For developers, Kubernetes is the clear orchestration tool of choice (53%) followed by AWS ECS (26%).
[Figure: Which container orchestration tool does your organization use? Kubernetes 53%, AWS ECS 26%, Docker Swarm 16%, Azure Container Service 12%, Other 9%, Google Cloud Container Engine 8%, and the remaining responses (Cloud Foundry Diego, CoreOS Fleet) at 4% and 1%.]
In their efforts to become cloud native, developers said that their organizations use a mix of cloud infrastructure models. There is a
slight edge in favor of public cloud, but there is not a clear dominant approach to date.
Figure 4: What Cloud Infrastructure Are You Using to Run Your Apps?
• Public: 42%
• Private: 33%
• Hybrid of public and private: 32%
• Cloud and on-premise: 31%
• Not sure: 4%
At the end of the day, many organizations are working toward building cloud native capabilities, but very few are actually there.
In fact, nearly half of all organizations represented in the survey have reached cloud native capabilities with less than one-fourth
of their existing apps. Only 12% of organizations can say 100% of their apps are cloud native. For many organizations, there is still
quite a ways to go toward becoming cloud native.
[Figure: What percentage of your organization's apps are cloud native? Responses bucketed as none, 1–24%, 25–49%, 50–74%, 75–99%, 100%, and not sure.]
For the next big development in cloud native — using serverless technology to run apps — organizations are even further away
from adoption. In fact, 43% of respondents said that none of their apps are serverless, and 37% said that less than one-fourth of
their apps are. Serverless technology is still very much in its early stages of development.
[Figure: What percentage of your organization's apps are serverless? 43% answered none; 37% answered 1–24%.]
Additionally, the idea that cloud native will put a spotlight on security was far more prevalent for developers in North America
(17%) compared to those anywhere else in the world. Lastly, serverless architecture was a significant trend among 25% of
developers in Asia-Pac, compared to just 6% and 10% for developers in North America and EMEA, respectively.
Executive Insights
Everything You Need to Know About Cloud-Native
Questions around how to efficiently manage microservices, accelerate deployments, and make applications scalable are all
answered through cloud-native technology.
Cloud-native is all about taking advantage of the cloud in every way possible. This results in faster, more efficient ways to run,
develop, and deploy applications — every aspect of your application infrastructure has been adopted and implemented with the
cloud in mind.
So now, instead of dealing with a monolithic architecture and massive amounts of code at a time, applications utilize
microservices to create a distributed architecture that’s much easier to manage.
The speed and efficiency at which cloud-native applications run allow businesses to get products and services to market faster.
And being able to scale and deploy applications faster than your competitors will serve as a huge win for your organization.
To learn more about why this technology matters — to both developers and executives — we sat down with experts in the cloud-
native space to discuss trends, challenges, and predictions for the upcoming year.
Based on our research findings, 90% of respondents currently use microservices or plan to adopt them in the next year. This
means a major increase in cloud-native applications, and in questions around managing microservices.
The migration from a traditional monolithic architecture to microservices helped application architecture evolve. A monolithic
architecture follows the traditional model for app development, where the business logic, UI, and database all sit under the same
umbrella, so to speak. The introduction of microservices allowed individual services, each with its own database, business logic,
and so on, to be managed within one application, all under the same UI.
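The split described above can be sketched in a few lines. This is an illustrative toy, not a real framework: each "service" owns its own data store and exposes only a narrow interface to the others, so one can change internally without disrupting the rest.

```python
# Toy sketch of two independent services, each owning its own data store.
# All names here are hypothetical and for illustration only.

class InventoryService:
    """Owns the inventory 'database'; no other service touches it directly."""
    def __init__(self):
        self._stock = {"widget": 5}

    def reserve(self, item):
        # Decrement stock if available; this logic can change freely
        # without any other service knowing or caring.
        if self._stock.get(item, 0) > 0:
            self._stock[item] -= 1
            return True
        return False

class OrderService:
    """Owns order records; talks to inventory only through its interface."""
    def __init__(self, inventory):
        self._orders = []
        self._inventory = inventory

    def place_order(self, item):
        if not self._inventory.reserve(item):
            return None           # out of stock: no order created
        self._orders.append(item)
        return len(self._orders)  # simple order id

inventory = InventoryService()
orders = OrderService(inventory)
print(orders.place_order("widget"))  # → 1
```

In a monolith, the equivalent logic would share one database and one codebase; here each piece could be deployed, scaled, and replaced on its own, which is the essential trade the architecture makes.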
When adopting microservices, the leading challenge is determining where to break up monolithic components, which is not
surprising based on the challenge many organizations face when finding a valid use case. This is the result of two things: trouble
understanding the market and a basic knowledge gap between microservices and what they can do for your business.
The post “Where Microservices Are Actually Useful: Two Types of Use-Case” provides thoughts on the most common use cases
for adopting microservices: “The most obvious use cases are those of a CPU- or RAM-intensive part of the application. That
normally goes into a separate deployment, offering an interface to the rest of the application.”
Services that consume a lot of RAM are impractical to run every time a developer starts the application, so moving them into a
distributed architecture can resolve some of these concerns.
But microservices aren’t limited to these two use cases. There are increasingly more use cases as these services become a larger
piece to the cloud-native puzzle.
Another important topic to consider when adopting cloud-native technologies are operators and operator patterns. According to
Kubernetes.io:
“Operators are software extensions to Kubernetes that make use of custom resources to manage applications and their components.
The Operator pattern aims to capture the key aim of a human operator who is managing a service or set of services.”
We talked with Director of Community Development at Red Hat, Diane Mueller, who further stressed the importance of executives
learning about operators sooner rather than later.
“If there’s anything to get across to executives right now, the new word that they’re going to hear, if they haven’t heard it already, is
operators in 2020,” said Mueller. “Operators take the next step beyond just installing and configuring a service or an application to
managing the day-two stuff, like operations and lifecycle management.”
This includes applying a patch, reconfiguring parts of the application, automatically installing updates, completing back-ups,
and more. All of these tasks were previous time-sinks for Ops teams, but now, they can all be automated through operators and
operator patterns, which reduce downtime and time spent testing.
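At the core of the day-two automation Mueller describes is a reconcile loop: compare the state you declared with the state you observe, and compute the actions needed to close the gap. The sketch below uses plain dicts and hypothetical names; a real operator would instead watch the Kubernetes API, typically via a framework such as Operator SDK.

```python
# Minimal sketch of the reconcile loop at the heart of the operator pattern.
# Plain dicts stand in for the cluster state; names are hypothetical.

def reconcile(desired, observed):
    """Compare desired vs. observed state; return the actions needed."""
    actions = []
    for name, spec in desired.items():
        if name not in observed:
            actions.append(("create", name, spec))   # missing entirely
        elif observed[name] != spec:
            actions.append(("update", name, spec))   # drifted: patch it
    for name in observed:
        if name not in desired:
            actions.append(("delete", name, None))   # no longer wanted
    return actions

desired = {"db": {"version": "12.4", "replicas": 3}}
observed = {"db": {"version": "12.3", "replicas": 3}}
print(reconcile(desired, observed))  # one "update" action: apply the patch
```

Running this loop continuously is what lets an operator apply patches, reconfigure components, and install updates without a human in the loop.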
Automation and speed are major components of successful cloud-native adoption, so being able to build upon these concepts
with the incorporation of operators will give your business a one-up.
Mueller explains: “‘Automate everything’ has been the key mantra of the DevOps movement for some time, and cloud-native
development is now a way of life for most organizations. Most organizations are building and operating cloud-native applications
and services and leveraging automation practices that integrate the concepts of DevOps, continuous delivery, microservices,
and containers.”
Containers play a major role in moving to a cloud-native architecture. The Linux Foundation defines cloud-native and the role of
containers as:
“[Cloud-native applications] use an open-source software stack to deploy applications as microservices, packaging each part into its
own container and dynamically orchestrating those containers to optimize resource utilization.”
As our research shows, 76% of organizations currently use containers, and 20% are not currently using containers but plan to in
the next year. So, essentially, nearly every organization is working to integrate containers into their workflow and microservices
management solutions.
Container registries are made up of container repositories, access control rules, indexes, and API paths. Offering continuous
availability within Kubernetes, container registries allow developers to have several versions of an individual container created at
various times.
According to Mueller, not only are container registries important, but she predicts more innovation around registries, scanning,
and security in cloud-native technology: “Everybody needs a container registry,” explained Mueller. “In terms of a container
registry, you’re going to see more people focusing on getting highly scalable container registries and worrying more about finding
trusted hubs.” We touch more on these security concerns in the next section.
Security
Our findings showed that 55.1% of respondents were most concerned with API security and data leaks, with concerns over finding
the balance between security and DevOps processes.
With the perpetual demand to develop, test, and deploy new apps on a continuous basis, and real consequences when code
fails or is breached, these challenges carry major stakes when the proper procedures and precautions are not taken. Most
organizations have adopted basic security hygiene since the Equifax breach back in 2017. Now, organizations are getting
more sophisticated and strategic about security. The basics, while just as important, will simply not cut it anymore.
In addition to tackling security with a DevSecOps approach, our findings suggested a lack of time for training and
refactoring, as well as a general lack of understanding around cloud-native development throughout organizations. Lack of time
and training were cited as key reasons not to build out a microservices and/or cloud-native architecture.
According to Mueller, one of the best ways to combat knowledge and training gaps is to become active in the cloud-native
community:
“It’s not the tech anymore. It’s the collaboration. That’s going to make us all successful and making sure that we have healthy
collaborations, healthy connections, and we have active and vibrant communities. This is really the essential component for the
continued success of the cloud-native ecosystem.”
Bottom Line
In an interview with Pivotal about cloud-native technologies and adoption, James McGlennon, Executive VP and CIO, Liberty
Mutual Insurance Group, said:
“One of the things we’ve learned is that if you can’t get it to market more quickly, there is no doubt that the market will have
changed and no matter how well you’ve engineered it or built it or deployed it or trained your folks, it’s not going to be quite right
because it’s just a little too late.”
This applies to cloud-native and microservices adoption now more than ever. Cloud-native is not going away. This approach to
software development is unparalleled in terms of speed, delivery, and scalability.
How Cloud-Native
Development Will Create
Future Business Value
By Malvi Goyal, Product Marketing Manager at DXchange.io
Building applications in the cloud-native world is a whole different ball game. Cloud-native applications provide superior
functionality, making systems more cohesive and able to adapt quickly to changes in a fast-paced environment.
Per Gartner, many organizations are now focused on cloud-first strategies as they turn their attention to advancing the use of
cloud services across the business. There are two approaches: migrating your existing applications to the cloud, which can be a
major hassle, or building cloud-native applications from the ground up. The difference between the two approaches will set
world-class organizations apart from the rest.
Unlocking the true powers of the cloud — resiliency, agility, ease of implementation, and scalability — will require enterprises to
adopt a cloud-native mindset. This article looks at how enterprises can implement a cloud-native deployment model
and the benefits of adoption.
Cloud-native development refers to how applications are built and deployed, not whether they sit on a public, private, or hybrid
cloud. In technical terms, a cloud-native application is one born in the cloud and built as microservices packaged into containers.
IDC reports that by 2022, 90% of new apps will feature microservices architectures that improve the ability to design, debug,
update, and leverage third-party code, and that 35% of all production apps will be cloud-native. The whole premise of using
cloud-native applications is to replace the investments and manpower required to maintain an enterprise's data centers, which
are not scalable and cannot adapt to the rapidly changing IT requirements of the enterprise.
Microservices architecture is a modern approach to developing software by creating multiple smaller software entities, better
known as microservices. Each service is independent of the others and designed to perform a specific task. This
independence ensures each service can be reworked or iterated on without disrupting the complete application or the experience
of end users. Each service is deployed in lightweight containers, which are faster to deploy and can be scaled further with
automation and orchestration processes.
A cloud-native microservices architecture is more scalable and faster than its traditional web model and IT framework
counterparts. Enterprises often use the terms “cloud-native” and “cloud-enabled” interchangeably, but there is a huge difference
between the two and their individual functionalities. A cloud-enabled application is made in a static environment on in-house
servers and is merely traditional enterprise software enabled for the cloud.
Let’s see how enterprises can prepare for the future with cloud-native application development.
Although moving to cloud-native application development seems lucrative and advantageous, meticulous planning is required to
carve out a roadmap for adoption. Below are important considerations for enterprises that aim to become cloud-native.
Moving IT assets to the cloud is a tedious and time-consuming process. For instance, after being bought by Facebook, it took the
Instagram team over a year to establish governance for the transfer of all its services from AWS to Facebook data centers. Hence,
a comprehensive change management plan is the first and foremost step to encapsulate the overhaul of traditional monolithic
applications and whether these need to be modernized or built from scratch.
Thorough risk assessment and mitigation plans, security and compliance issues, and smooth movement of on-prem applications
to a cloud ecosystem without causing bottlenecks/downtime all need to be addressed before starting development.
Microservices architectures are the backbone of cloud-native development. Applications need to be broken down into granular,
single microservices, which are loosely coupled and can be developed continuously without causing disruption or downtime to
the overall system.
Cloud-native development is much more than retitling programmers as DevOps engineers. Cloud-native applications are designed
to be hosted as multi-tenant instances and demand continuous delivery, automatic deployment, and support for frequent changes.
They are dynamically orchestrated in containers that are actively scheduled to optimize resource utilization.
A highly skilled and trained DevOps team drives cloud-native development by building, testing, and releasing software more
rapidly and reliably using automated processes for software delivery.
More than a technical change, cloud-native development is a cultural change that engulfs an entire enterprise. Successful
use cases for shifting to cloud-native infrastructure involve agile methodologies and collaboration between IT teams and
business groups, through which developers use continuous feedback from business groups and quickly iterate updates. Clear
communication involves sharing best practices and consistent feedback loops between IT teams and users to ensure business
continuity and to develop applications that meet user expectations.
In a nutshell, cloud-native developments create discrete elements/services that can be rapidly deployed. Future-ready enterprises
must evolve and inculcate the right culture to cope with the pace of innovation that comes with cloud-native development.
Cloud-native application development prepares enterprises to embrace the onset of digital technologies. It provides organizations
with an edge to perform in a competitive business environment. With a scalable architecture, they can focus more on
differentiating their underlying business proposition rather than investing in infrastructure.
1. Auto-Scalability – This is probably the most strategic benefit. Cloud-native applications are highly scalable;
real-time changes can be made to a single microservice, as requirements demand, without disrupting the entire application. A
real-life use case comes from IBM cloud solutions, which helped American Airlines move out of their legacy
architecture and enhance the customer experience.
2. Infrastructure Investments – Cloud-native application development is relatively cost-effective, as the costs are based
on the licenses and the storage required in the cloud for an application. It requires no software and/or hardware upgrades or
installations, cutting down on infrastructure investments.
3. Maintenance as a Service – Cloud-native applications are composed of microservices, which can be maintained easily and
improved incrementally. Further, changes can be applied to individual microservices without interruptions or downtime; they
can be updated, managed, and deployed individually.
4. Ease of Implementation – The implementation of cloud-native applications is fast and efficient as no hardware or
software configurations are required. The whole process of cloud-native development more accurately matches the speed
and innovation that is demanded by today’s changing business requirements.
The future belongs to cloud-native, as it impacts the complete SDLC in all aspects — design, implementation, deployment, and
operation. Oracle states that 80% of all enterprise workloads will move to the cloud by 2025. As digital disruption reaches
day-to-day life, enterprises will opt for tailor-made cloud applications to meet business needs and gain an edge over
competitors.
Cloud-based architecture is enabling enterprises to adapt and respond quickly to business changes through continuous delivery.
The power of cloud architecture lies in moving ideas to implementation in production as quickly as possible, making the most out
of available business opportunities.
So the million-dollar question for today’s enterprises is less about why they need cloud-native and more about how they become
cloud-native!
2020 will be the year that a vulnerability in runC, similar to CVE-2019-5736, will be exploited in a big way before it gets patched. This
event will expose the shortcomings of using legacy tools to secure cloud environments. Security teams who are not as intimate with
securing the cloud will begin to realize that “it’s running in a container” does not mean it is safe from threats and that legacy security
tools do not fully protect them in a dynamic cloud environment.
The risk shouldn’t result in missed opportunities because it causes companies to shy away from container adoption. Rather, DevOps
teams and security professionals just need to educate themselves on the actions they need to take to reduce risk exposure.
The responsibilities assigned to DevOps teams continue to grow. Similar to our first prediction, companies will evolve and create a
new position, a dedicated security engineer. This position will sit within the DevOps team, but they will be responsible for managing
the security of the Kubernetes platform. At least fifty percent of the Fortune 500 will have this position by the end of 2020.
Exploitable vulnerabilities are different every time, but attackers leave a recognizable trail of indicators: high resource usage,
access to specific data, execution of unusual child processes, etc. Machine learning (ML) lowers the barrier to entry for enterprises
implementing cloud security by automating threat detection, policy setting, and compliance reporting. Manually setting policies
introduces human error and is impossible at scale. Learning models enable security tools to protect common container images
out-of-the-box.
As more tools adopt ML, writing security policies will begin to be considered a bad practice. By the end of 2020, people will be
talking about integrating ML into the entire DevOps cycle, from being a part of the software development tools to being integrated
throughout the entire CI/CD pipeline. At this level, ML will be able to detect attack vectors even before the software is built.
Over the years, digital transformation has forced enterprise products and platforms to evolve toward a highly scalable architecture
that helps achieve simultaneous performance and availability.
In the late '90s and early 2000s, systems were traditionally set up as resource-intensive, monolithic client-server systems.
These systems were simple to maintain and were deployed via a single codebase, database, functionality, and process.
As organizations grew and business requirements became more complex, the need for large-scale, cloud-based systems became
evident. The tightly coupled nature of monolithic applications did not meet these new expectations of scale and deployment.
This gave rise to Service Oriented Architecture (SOA) in the late 1990s, a precursor of microservices, whereby an application is
divided into smaller components to make it modular and consumable.
SOA became widely understood and accepted as a paradigm that organizes software via distributed application components
called services to help manage, maintain, and deploy internet-scale applications.
• 2012: The word “microservices” was coined for the first time at 33rd Degree in Krakow. This new architecture took SOA to
the next level by making these services autonomous, disposable, and loosely coupled.
• 2015: Google, Intel, IBM, VMware, and other industry leaders formed the Cloud Native Computing Foundation (CNCF). The
CNCF led to the birth of a new SDLC paradigm, cloud-native application development, expanding the concepts of
microservices and containers.
The increasing complexity of enterprise platforms and products that demand performance, scalability, security, reliability, and
availability led to the popularity of a microservices-based software architecture.
In this architectural style, an application does not run as a single, monolithic codebase but is broken down into a collection
of smaller services that help developers build and modify apps in a rapid, agile manner. They can also focus on developing
dynamically orchestrated microservices to run and manage scalable, fault-tolerant systems responsively without worrying about
the underlying cloud infrastructure, whether public, private, or hybrid.
Containers
The power of microservices is best unleashed by encapsulating them in a lightweight, consistent, self-contained, and
isolated workload environment. Containers offer exactly this in a virtualized system. A container:
• Packages code to run a service isolated from its environment (libraries, settings, system tools, etc.).
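That packaging is usually expressed in a Dockerfile. A minimal sketch might look like the following, where the base image, file names, and entry point are all illustrative placeholders:

```dockerfile
# Hypothetical Dockerfile packaging one microservice together with its
# libraries and settings, isolated from the host environment.
FROM python:3.11-slim                  # language runtime baked into the image
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt    # the service's own library versions
COPY . .
ENV PORT=8080                          # settings travel with the container
CMD ["python", "service.py"]           # illustrative entry point
```

Because everything the service needs ships inside the image, the same artifact runs identically on a laptop, in CI, and in production, which is what makes containers the natural unit for orchestrators to schedule.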
Serverless
Serverless is another primary software architecture style that can be used as a vehicle for microservices. The serverless cloud
computing model:
• Breaks down applications into individual functions that run as a service, often referred to as Function as a Service (FaaS).
• Eliminates the headache of managing or hosting underlying cloud infrastructure and hardware.
• Scales functions instantly, as the underlying runtime is provisioned on a per-request basis.
Third-party cloud vendors such as Amazon AWS and Microsoft Azure take care of server and cloud management. This helps run
consumer and enterprise platforms in areas such as telecom, e-commerce, payments, and banking, which particularly need the
ability to scale at speed.
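A function in this model is just a handler the platform invokes per request. The sketch below follows the AWS Lambda calling convention for Python (an event dict in, a response dict out); the event shape and function names are illustrative, not tied to any real deployment:

```python
# Sketch of a Function-as-a-Service handler in the AWS Lambda style.
# The platform provisions a runtime per request and calls handler().

import json

def handler(event, context=None):
    """One small function; the platform scales it per request."""
    name = event.get("name", "world")       # illustrative event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation for testing; in production the platform supplies the event.
print(handler({"name": "cloud"}))
```

There is no server to manage in this model: the vendor provisions the runtime, invokes the handler, and tears it down, billing only for the invocations.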
Deploying Microservices
Cloud-native app development began with embracing PaaS (Platform as a Service). However, PaaS locked application developers
down to a single cloud platform and did not allow sharing of resources.
This limitation led to having only two orchestration approaches favorable to cloud-native app development — containerization and
serverless. These two modes of deploying microservices have their own advantages and disadvantages.
On one hand, containers give software developers more control over their environment and how services are packaged. This is
extremely helpful in digital transformation initiatives involving migration of legacy applications to the cloud. However, serverless
allows teams to focus more on the business capabilities and customer experience and is more applicable in digital transformation
initiatives around developing and deploying new-age applications in the cloud.
With the push to cloud-native app development, microservices, containers, and serverless have changed the IT landscape, acting
as a catalyst and the base to develop applications at scale in emerging areas like edge computing, Internet of Things (IoT) and
Industrial IoT (IIoT), and artificial intelligence (AI) that form the foundation of digital transformation.
Allowing computing near the edge or source of the data, edge computing typically deals with collecting and processing data from
thousands of IoT devices through tools such as Docker and Microsoft’s Azure:
• Docker containers help securely distribute software to the edge and run containerized applications on a lightweight,
isolated, and granularized framework.
• Azure Kubernetes Service (AKS) allows orchestration of containers on IoT edge devices.
AI requires the underlying architecture to scale elastically, with the functional programming paradigm central to the theme.
One possible way to achieve this is by combining the powers of Amazon's AWS Lambda and TensorFlow. Together, they can
help develop and deploy deep learning systems at scale.
Cloud-native app development has also transformed the software development management landscape. Well suited for agile
software development, this new methodology has paved the way toward modern SDLC management tools and practices such as
CI/CD and DevOps. It has helped leaders in traditional industries like banking and finance, retail, and manufacturing transform
into software companies and, paradoxically, helped transform software companies into real, hard-core businesses.
• Intesa Sanpaolo – One of Italy's largest banking groups embraced cloud-native app development, replacing their
incumbent technologies and systems with CI/CD, DevOps, and Kubernetes orchestration to provide customers with a
state-of-the-art experience.
Today, the bank runs more than 3,000 applications. Of those, more than 120 are now running in production using the new
microservices architecture, including two of the 10 most business critical for the bank.
• Netflix – Realizing that they weren't a data center operations provider, cloud-native application development helped
Netflix transform from a new-age software company into a new-age distributor and creator of entertainment content.
Cloud-native app development and its pillars like microservices and orchestration techniques like containerization aim to make
the digital transformation initiatives of enterprises successful by helping them transform their business models into highly
scalable, agile, and robust software products and platforms.