
2025 has been a defining year for the team at Aster. The accelerated disruption of AI has been exhilarating yet overwhelming, and our customers and partners feel it, too.

This shift seems to have leveled the playing field, pushing me and the team at Aster to spend the year validating lessons from our decades of experience in the tech space. The verdict? AI cannot fix a broken process; it only amplifies it.

We watched this play out across different ecosystems like SAP, Salesforce, IFS, and even custom landscapes. Teams with clean pipelines, reliable tests, and proper documentation used AI to accelerate feedback. Teams without those basics just saw more noise. Back in 2024, Gartner predicted 30% of GenAI projects would fail to deliver value. Looking at the data, that prediction was right on the money.

The biggest successes came from teams that targeted specific pain points rather than trying to apply AI to everything, everywhere, all at once.

Just as importantly, the most effective teams kept human review front and center around customer data, security, and production changes, treating AI as a powerful assistant rather than an autopilot.

AI Isn’t A DevOps Silver Bullet

Over the last 20 years, working in software testing, product, and leadership, I have seen plenty of “next big things,” but AI is genuinely different, and Aster’s first year has only reinforced this.

AI isn’t going to magically fix DevOps, but when used with care, it can make good teams better and show you where weaker systems need fixing. Without clear guardrails, it will accelerate failure and increase risk for both your team and your customers.

AI has gone from a fun curiosity to an everyday tool for most techies. Used well, it’s immensely powerful. When you are weaving AI into your DevOps landscape, it is worth remembering Uncle Ben’s point about great power and responsibility.

Where AI Can Help DevOps Teams

You don’t need me to tell you that AI is excellent at many things. It grinds through your drudgery, lets you bounce ideas around, challenges your assumptions, and carries out a whole host of other amazing and entertaining functions. It is already firmly embedded in many people’s working practices.

When it comes to DevOps, the AI cat is well and truly out of the bag, too. It helps teams write and review code, generate tests (including valid data), spot anomalous patterns in logs, and reduce repetitive ops work. In fact, these days it’s harder to find an engineer who isn’t leaning on Cursor or another AI assistant.
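To make the log-pattern idea concrete, here is a toy sketch of the kind of check AI-assisted tooling automates: group log lines into templates by stripping out variable parts, then flag templates that appear rarely. The log lines, regexes, and threshold are all invented for illustration; this is not any particular product’s approach.

```python
import re
from collections import Counter

def template(line: str) -> str:
    """Collapse variable parts (hex ids, numbers) so similar lines group together."""
    line = re.sub(r"0x[0-9a-f]+", "<HEX>", line)
    return re.sub(r"\d+", "<N>", line)

def rare_templates(lines, max_count=1):
    """Return templates seen at most `max_count` times — crude anomaly candidates."""
    counts = Counter(template(l) for l in lines)
    return {t for t, c in counts.items() if c <= max_count}

logs = [  # invented sample logs
    "GET /api/orders/123 200 12ms",
    "GET /api/orders/456 200 9ms",
    "GET /api/orders/789 200 15ms",
    "panic: nil pointer at 0xdeadbeef",
]
print(rare_templates(logs))  # → {'panic: nil pointer at <HEX>'}
```

Real systems use far richer signals, but the principle is the same: routine traffic collapses into a handful of templates, and the one-off panic line stands out.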

If the basics are in place, such as solid CI/CD, decent tests, and reasonable monitoring, AI is genuinely helpful. It reduces noise, shortens feedback loops, and frees people up to work on more interesting problems.

Where Things Go Wrong With AI

We are still in the early days of AI, and it often feels more like a frontier than a mature, well-regulated space. This is not confined to DevOps; the whole AI landscape still has a strong “try it and see” vibe. People reach for their preferred tools haphazardly, as shortcuts around tedious but often essential daily tasks.

In DevOps specifically, if pipelines are flaky, tests are unreliable, and telemetry is a mess, AI mostly helps ship risky changes and generates more alerts that people learn to ignore. You end up with “AI‑enhanced” dashboards and bots sitting on top of the same old problems, just moving faster.

Underneath all of this are a few dangerous assumptions about what AI and DevOps are actually for, and that is where things really start to go off the rails.

What Thoughtful Teams Do Next

If you want your DevOps and AI to work well together, it starts with getting the basics right. Aster’s first year brought this into sharp focus. Teams that actually saw improvements invested up front in making their delivery and telemetry boringly reliable, so the data was trustworthy and the systems were ready for automation.

Next, they picked specific pain points (not just “AI everywhere”) and applied AI where it made the most sense: speeding up testing, catching build failures earlier, or reducing repetitive work in deployments. Early wins came from well-defined, contained use cases, not from blanket automation.

Moving From Automation to Autonomy

DevOps is evolving from simple automation to genuine autonomy. But that doesn’t mean taking shortcuts; you can’t just point AI at the problem and hope for the best. Autonomy is the summit of a maturity curve.

Over Aster’s first year, the most successful AI-based outcomes consistently stemmed from solid processes, governance, and organisation. Before even considering autonomy, you need to get this right. Only when you have solid foundations can you take the next step: automating repetitive tasks and implementing rigorous monitoring.

Finally, when all of this is in place, you can graduate to autonomy, where agents orchestrate complex workflows with situational awareness, grounded in high‑quality data and proper safety guardrails.

AI‑powered DevOps with autonomous agents is the future, but it’s hard to overstate how important it is to get the fundamentals right first. You need to crawl before you walk, and walk before you run. Fortunately, that journey doesn’t have to be long or arduous, and you don’t have to navigate it alone. This is what experts at Aster do in their day-to-day, and we’ll be more than happy to help you make the transition smoothly.

The Importance of Human Intelligence

As we move toward this autonomous future, the role of humans shifts from work producer to agent orchestrator. Human intelligence sits at the center, ensuring that every AI-generated decision (especially those involving customer data, security, or live systems) is backed by human judgment. AI surfaces problems and makes suggestions, but real progress happens when experts stay in the loop to make the final calls and set the guardrails.
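A guardrail like that can be as simple as a routing rule: AI suggestions touching sensitive areas get queued for human sign-off instead of being auto-applied. The paths and policy below are invented for illustration, not a prescription.

```python
# Invented example paths representing customer data, security, and production config
SENSITIVE_PREFIXES = ("infra/prod/", "security/", "data/customers/")

def route_suggestion(files_touched):
    """Auto-apply only when no sensitive area is touched; otherwise require review."""
    needs_human = any(f.startswith(SENSITIVE_PREFIXES) for f in files_touched)
    return "human_review" if needs_human else "auto_apply"

print(route_suggestion(["docs/readme.md"]))          # → auto_apply
print(route_suggestion(["infra/prod/deploy.yaml"]))  # → human_review
```

The point is less the code than the default: anything that can hurt customers or production falls back to a human decision.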

The most effective teams also pay close attention to feedback, tracking clear metrics such as failure rates, recovery time, and customer experience, rather than just the volume of AI output. If something isn’t improving after AI comes in, they tweak, scale back, or pivot as needed.
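One of those metrics, change failure rate, is just failed deployments over total deployments. A minimal sketch with invented deployment records:

```python
deploys = [  # invented records: (deploy id, failed?)
    ("d1", False), ("d2", True), ("d3", False), ("d4", False),
]
failure_rate = sum(failed for _, failed in deploys) / len(deploys)
print(f"change failure rate: {failure_rate:.0%}")  # → change failure rate: 25%
```

Tracked over time, a number like this tells you whether AI adoption is actually paying off, rather than just how busy the tools are.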

Key Takeaway: Fundamentals First, AI Next

Remember that AI for DevOps is a continuous journey, and we’ve only just set off. As tools and practices evolve, so do the risks and benefits.

After seeing Aster’s first-year lessons play out in real projects, the focus now is on using that experience to guide where AI belongs in real delivery pipelines. As we head into 2026, our team’s emphasis is on autonomous QA agents and smarter, context-aware test result analysis, grounded in boringly reliable delivery and telemetry.

What lessons have you learned while implementing AI in your DevOps pipeline? Share your thoughts and let’s connect!