Third Wave Dev + Ops: Self-Learning DevOps for a Self-Learning World

DevOps expert Aruna Ravichandran on how the software engineering culture must adapt to meet businesses' future needs.

The technologies and methods that today's enterprises use to advance their digital transformations are inspiring. They will also soon be obsolete.

This looming obsolescence, driven by intense technological change, means that DevOps teams cannot be complacent. They must evolve with the times (and new technologies) in order to remain relevant—and to continue providing the apps and services that customers demand.

A Brief History of DevOps

To understand where DevOps is going, it helps to review where it has been.

The first generation of linkage between application development and IT operations occurred in the client-server world. An explosion in developer innovation—and in the business's appetite for that innovation—promised tremendous benefits for the enterprise.

Security must therefore become intrinsic to DevOps QA—not something for someone else to clean up after the fact.

— Aruna Ravichandran, VP of global marketing, CA Technologies

But we all soon discovered that what worked on developers' desktops didn't always work so well in production. IT leaders then began trying to improve collaboration between coders and the ops staff. The results included better applications, better performance, accelerated time to benefit, and improved economic efficiency. And so the foundation of DevOps was laid.

Then came mobile and cloud. Competition for compelling mobile engagement drove developers to roll out new capabilities even more quickly, while cloud offered fast, adaptive access to the infrastructure necessary for increased digital agility on a pay-by-the-drink basis.

This game-changing combination of demand and supply catalyzed the second generation of DevOps, which has been broadly embraced across vertical markets. Enterprises are building well-integrated pipelines to bring new digital value to market. They're also embracing APIs, microservices and containerization so they can more adaptively mix and match existing code in innovative ways.

Looking Ahead to the Self-Learning Enterprise

As awesome as second-generation DevOps may be, the new competitive challenges looming on the horizon will render it inadequate. AI—including machine learning, natural language processing (NLP), and other algorithmic data science—is creating opportunities for digitally aggressive enterprises to disrupt markets by automating and accelerating the OODA (observe, orient, decide, act) loop.

Enterprises are no longer under pressure to simply write better code faster and continuously deliver it into production. They now have to constantly fine-tune their algorithmic engagement with their customers, their supply chain, and their own internal processes. 

The exploding volume of data produced by the physical world (IoT) and the human world (social media, home assistants, increased mobile and digital engagement) is fueling AI-related disciplines. And another emerging technology, blockchain, has the potential to create meshed-data environments that will require enterprises to engineer autonomic real-time response mechanisms that bear little resemblance to what we have historically understood as "application code."

This introduction of AI, IoT, blockchain and other transformative technologies into the digital mix will render current DevOps approaches obsolete.

Key Attributes of Third-Generation DevOps

How will DevOps have to evolve over the next three to five years? What steps will enterprises need to take in order to successfully compete in a world of algorithmic business and pervasive data?

Here are some predictions:

1. SPEED: DevOps teams must tighten their own OODA loops—that is, the speed and accuracy with which IT can improve code and get it into production. You can't successfully bring a self-driving car to market if you can't quickly detect and correct even the slightest software design flaw that might render your vehicle unsafe. In fact, AI invariably creates downside risks in proportion to its upside opportunities.

To achieve this third-generation compression of the DevOps OODA loop, we must apply machine learning to the software development lifecycle. That means DevOps pipelines must provide rich, complete data about inputs and outputs, and dev teams need algorithms that accurately determine how those inputs and outputs relate to each other. Why? So that teams can address shortfalls in quality, speed and scalability with the utmost precision.
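To make the idea of relating pipeline inputs to outputs concrete, here is a minimal sketch in Python. The run records, feature names and threshold are hypothetical illustrations, not any particular product's data model; a real pipeline would feed far richer telemetry into far richer models.

```python
# Hedged sketch: relate a hypothetical pipeline input (did the change touch
# core code?) to a pipeline outcome (did the build fail?).
from collections import defaultdict

# Hypothetical pipeline history: input features per change, plus the outcome.
pipeline_runs = [
    {"files_changed": 3,  "touched_core": False, "tests_added": True,  "failed": False},
    {"files_changed": 42, "touched_core": True,  "tests_added": False, "failed": True},
    {"files_changed": 7,  "touched_core": True,  "tests_added": False, "failed": True},
    {"files_changed": 2,  "touched_core": False, "tests_added": True,  "failed": False},
    {"files_changed": 15, "touched_core": True,  "tests_added": True,  "failed": False},
]

def failure_rate_by_flag(runs, flag):
    """Compare failure rates for runs with and without a boolean input flag."""
    counts = defaultdict(lambda: [0, 0])  # flag value -> [failures, total]
    for run in runs:
        bucket = counts[run[flag]]
        bucket[0] += int(run["failed"])
        bucket[1] += 1
    return {value: fails / total for value, (fails, total) in counts.items()}

# Changes touching core code fail far more often in this toy history,
# pointing the team at the input most correlated with quality shortfalls.
print(failure_rate_by_flag(pipeline_runs, "touched_core"))
```

Even this crude conditional-failure-rate comparison shows the shape of the feedback loop: pipeline inputs and outcomes flow into an algorithm, and the algorithm tells the team where to look first.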

2. TESTING: Continuous testing must become the norm. Sure, shift-left QA efforts have already helped us to reduce costs and accelerate digital delivery by discovering issues early on, when they can be fixed quickly and inexpensively. But in five years, that won't be adequate. We need to detect issues even earlier in the software development lifecycle. We also need to close the latency between the time that our DevOps algorithms get data about developer behaviors and the time that those algorithms get data about testing outcomes.

DevOps AI success, in other words, may be largely contingent upon continuous testing. After all, we want our AI-powered recommendation engines to guide customers toward the right items while they're still on our websites. Why wouldn't we similarly want our AI-powered DevOps to guide developers toward the right behaviors while they're still on a feature or fix?
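One concrete way to shorten that feedback latency is to run the tests most likely to catch a problem first. The sketch below illustrates the idea; the module names, test names and failure counts are hypothetical, and a production system would mine this mapping from real CI history rather than hard-code it.

```python
# Hedged sketch: prioritize tests by how often they have historically
# failed when the files just changed were touched.

# Hypothetical history: per module, how often each test failed after a change.
historical_failures = {
    "payments.py": {"test_checkout": 9, "test_refund": 4, "test_login": 0},
    "auth.py":     {"test_login": 7, "test_checkout": 1, "test_refund": 0},
}

def prioritize_tests(changed_files, history):
    """Order tests by how often they failed for the files just changed."""
    scores = {}
    for path in changed_files:
        for test, fails in history.get(path, {}).items():
            scores[test] = scores.get(test, 0) + fails
    return sorted(scores, key=scores.get, reverse=True)

# A developer editing payments.py gets checkout tests run first,
# so the riskiest feedback arrives while the change is still fresh.
print(prioritize_tests(["payments.py"], historical_failures))
```

The point isn't the ranking heuristic itself but the loop it closes: data about developer behavior (which files changed) meets data about testing outcomes (which tests fail) while the developer is still on the feature or fix.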

3. SECURITY: Most IT organizations still treat cybersecurity as its own functional silo. In five years, this approach won't work—especially since next-generation architectures such as blockchain may run code in imperfectly secured environments beyond enterprise IT teams' control.

Security must therefore become intrinsic to DevOps QA—not something for someone else to clean up after the fact. Building security checks directly into the integrated development environment (IDE) helps to protect companies from increasingly sophisticated attacks designed to discover and exploit design flaws. Security-enabled DevOps also saves money and speeds time to market in precisely the same way that shift-left QA does.
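A minimal sketch of what such an intrinsic check might look like, assuming a simple pattern-based gate that runs before code ever leaves the developer's machine. The patterns and snippet are illustrative only; real teams would use a full static analyzer wired into the IDE or pre-commit hooks rather than a few regexes.

```python
# Hedged sketch: a shift-left security gate that flags risky patterns
# in source text. Patterns are hypothetical examples, not a complete ruleset.
import re

RISKY_PATTERNS = {
    r"\beval\(": "dynamic code execution",
    r"password\s*=\s*[\"']": "hardcoded credential",
    r"verify\s*=\s*False": "disabled TLS verification",
}

def security_findings(source):
    """Return (line_number, description) for each risky pattern found."""
    findings = []
    for number, line in enumerate(source.splitlines(), start=1):
        for pattern, description in RISKY_PATTERNS.items():
            if re.search(pattern, line):
                findings.append((number, description))
    return findings

snippet = 'password = "hunter2"\nresult = eval(user_input)\n'
for line_number, issue in security_findings(snippet):
    print(f"line {line_number}: {issue}")
```

Failing the commit on any finding gives security the same early, cheap feedback that shift-left QA gives functional defects.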

DevOps will probably have to evolve in other ways, too. But the imperatives above highlight why CA is investing so heavily now in the capabilities that we believe our enterprise customers will need tomorrow. Technological change continues to accelerate. With AI-enabled DevOps, continuous testing and a more integrated approach to security, enterprises can keep pace with that change—rather than being overwhelmed by it.

By Aruna Ravichandran | June 6, 2018
