Data Center Knowledge – Peter Waterhouse – 2/9/17
[Ed note: Author Peter Waterhouse is Senior Strategist, DevOps Solution Marketing, CA Technologies]
The days of managing monolithic-style applications running on a single platform are over. Organizations are now committed to delivering their customers a far richer variety of digital services across multiple channels. This means applications are more likely to run in the cloud, as a multitude of microservices interacting with virtualized resources, containers and software-defined networks.
In this new normal, teams can no longer afford to get bogged down in reactive firefighting and lengthy war-room sessions. But with so many moving parts, increased application complexity, and dizzying rates of software delivery, what strategies can IT operations employ to avoid being wiped out by waves of operational big data?
The traditional approach is to buy more monitoring tools, one for every new wave of technology adopted. But this approach doesn't scale, erodes margins, and provides only narrow views into the all-important customer experience. So, putting tools aside, where should organizations turn?