Four months after the announcement of the Red Hat acquisition, and some time before it is expected to officially close, I caught up with Dr. Jim Comfort, GM of Multicloud Offerings and Hybrid Cloud at IBM, to talk about the company’s recent multi-cloud-related news, and to get a sense of its updated view of multi-cloud. Jim joined IBM Research in 1988 and since then has held a variety of development and product management roles and executive positions. He has been closely involved with IBM’s cloud strategy, including acquisitions such as SoftLayer, Aspera and others. [Note: as a Forbes contributor, I do not have any commercial relationship with IBM or its staff.]
Dr. Jim Comfort; IBM
First, what is significant about the new announcements?
At its THINK 2019 conference, IBM announced Watson Anywhere, the latest evolution of its fabled AI platform (originally restricted to the IBM Cloud), which will now run across enterprise data centers and major public clouds including AWS, Azure, and Google Cloud Platform. My reading is that this is a potential sign of things to come with Red Hat: IBM seems to be coming to terms with its relative weakness as an IaaS player and its tremendous strengths as a platform and managed-service provider. Building the multi-cloud muscle with Watson now will serve IBM well when it comes to Red Hat OpenShift later on.

IBM also made a series of announcements regarding its hybrid cloud offerings, including a new Cloud Integration Platform, which connects applications from different vendors and across different substrates into a single development environment, and new end-to-end hybrid cloud services that bring IBM into the managed-multi-cloud space. Again, to me this looks like infrastructure-building for the day IBM turns on the multi-cloud ‘power button’, using its existing and new technology and its formidable channel assets.
How the Red Hat deal might change IBM
Jim agrees we haven’t seen an IT wave move as broadly and as quickly as containers have in decades—not even cloud, which took the best part of a decade to take off. This isn’t just due to Solomon Hykes’s eye-opening original Docker demo in 2013, or to the CNCF’s stellar community- and ecosystem-building efforts—the reasons are mainly technological and strategic.
Amongst their many advantages, containers truly offer the potential to decouple the underlying technology from the application layer. In a key insight, Jim suggested that while multi-cloud used to mean private+public, and then evolved into a term for a ‘pick & choose’ strategy (for example, Google Compute Engine for compute, with Amazon RDS for a database), containers and related orchestration systems are leading us to a more powerful definition. Multi-cloud now is about ‘build once, ship many’, on completely abstracted infrastructure. As I wrote in a previous post, ‘soft-PaaS’ is on the rise for many related reasons, but the added insight here is about a fuller realization of the multi-cloud vision. In this new stage of the market, the challenge shifts to areas such as managing data sources and making multi-cloud operations easier. In this sense, OpenShift as PaaS, together with IBM’s services capabilities, becomes a potentially powerful strategic asset.
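To make the ‘build once, ship many’ idea concrete: a single container image, described by a standard Kubernetes manifest, can be applied unchanged to any conformant cluster, whether it runs on AWS, Azure, Google Cloud, OpenShift, or on-premises hardware. The sketch below is a minimal, hypothetical example; the application name and image registry are assumptions for illustration, not anything IBM announced.

```yaml
# Hypothetical example: one container image, one manifest, deployable
# unchanged to any conformant Kubernetes cluster (EKS, AKS, GKE,
# OpenShift, on-prem). The image is built and pushed once.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: sample-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: sample-app
  template:
    metadata:
      labels:
        app: sample-app
    spec:
      containers:
        - name: sample-app
          image: registry.example.com/sample-app:1.0
          ports:
            - containerPort: 8080
```

The same `kubectl apply -f deployment.yaml` works against any cluster context; what differs per cloud is the surrounding data services and operational tooling, which is exactly where Jim sees the multi-cloud challenge shifting.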
How IBM sees its role in this cloud-native world
Jim being from the services side, his view was not that of a hardware vendor in a software-defined world; he claimed that IBM has always helped companies deal with massive change, and that this is still the main mission. The market, if you believe press releases, is constantly swinging like a Newton’s cradle from public cloud to private cloud and back again. Personally, every time I hear of a large company going ‘all-in’ on either side, I tend to chuckle: as I covered in a previous post, oversimplifying architecture and infrastructure has ended many an IT career. Remember the leading UK news outlet that declared it was going ‘all-in’ with AWS because it couldn’t realise its ambitions on DIY OpenStack? I was in the room when the former CIO refused vendor help in implementing that complex private cloud platform and, surprise, the experiment failed.

Jim suggested that in his experience, vendors can help customers focus on the required business objectives, using five areas for analysis and planning, as IBM already does:
- Infrastructure architecture
- Application architecture
- DevOps and automation
- Evolved operational models
- Evolved security models
Yes, the new tech is shiny, but even more important for customers are things like managing legacy-IT resistance, re-skilling an older workforce, and managing generational gaps (cloud-native devs can be much younger than their Ops counterparts). In a surprising statistic, Jim claimed that 50% of IBMers have been with the company for less than five years, and that the company runs specific programs for millennials.
IBM’s open source position gets a boost
A major advantage of the acquisition that has not received enough attention, in my opinion, is that in acquiring Red Hat, IBM has shot into the top three organizations as measured by open source software contributions. In a star-studded panel during the THINK conference, IBM seemed to embrace this position gladly. Analyst firm RedMonk’s co-founder Steve O’Grady correctly warned that “the future success of open source is neither guaranteed nor inevitable.” Similar to points I covered in a previous post, O’Grady outlined profiteering, licensing and business models as systemic challenges that must be addressed.

However, even if open source continues to thrive, it is a predominantly bottom-up IT phenomenon, which may be at odds with IBM’s more traditional CIO-downwards approach. To me this is one of the most interesting areas to watch: as Red Hat gets integrated into the family, will IBM be successful in changing its own culture, taking full advantage of its history (for example, its talent-hothouse lab, the Linux Technology Center), its new and existing technologies (Kubernetes-based offerings, free Linux OSs) and its newfound open source dominance?
(Originally posted on Forbes.com)