“In the midst of winter, I found there was, within me, an invincible summer.
And that makes me happy. For it says that no matter how hard the world pushes against me, within me, there’s something stronger – something better, pushing right back.”
As I caught a glimpse of the Bay Bridge from my plane on yet another gorgeous afternoon last Friday, I happened to remember Albert Camus’ inspirational words.
I have spent nearly half of September in California, mostly at AI-focused events. From a distance, it is a state which appears, especially after the tech meltdown last year, to be caught in a “doom loop”. But walking with my wife amongst its redwood giants and surveying the majestic Pacific from Pigeon Point, we felt nothing but awe. Even more, I was impressed by San Francisco, which has cleaned up and stepped up to host several of those AI-focused events. Quite a bounce back for a city which had descended to depressing lows. As Marc Benioff of Salesforce, ever a cheerleader for the city, tweeted: “San Francisco is the #1 AI City! SF will soon be sold out again! Never seen commercial real estate leasing at this rate. # of companies looking for huge amounts of space is incredible.”
You have to marvel at the power of people and places to self-heal.
Of course, as an analyst I hang around colleagues who slap me when my optimism gets carried away. I told a fellow analyst I love the term “joule” and SAP’s use of it to brand its recently announced Generative AI digital assistant. It conveys momentum, energy and work, I said. He reminded me “it is a tiny unit of measure.” He is right – a joule is just the work it takes for a force of one newton to move something about a meter.
That is a good metaphor for the state of Enterprise AI amid all the news coming out of Salesforce’s Dreamforce, Workday Rising, Google Next, Oracle OpenWorld, and SAP’s announcements, among others. Lots to be optimistic about, but realistically we are just getting started. Lots of moving and pushing ahead.
Let’s start with the positives first:
To me, the noticeable thing is how enterprise tech is threading the needle – taking advantage of the excitement generated by OpenAI, other LLM providers and NVIDIA GPUs, yet distancing itself from the “wild west” that depends on unstructured and often unreliable data from the consumer web.
The message from Rising was that their AI harnesses “the largest, cleanest data set from 65M+ users and 629B+ transactions a year.” Dreamforce used even more eye-popping numbers for their Data Cloud, the underpinning for their AI: “900 trillion records processed per quarter”.
I heard the word “Trust” over and over. Not just trust in the data, but trust in vendors’ ability to transition their customer bases across yet another architectural shift.
Aneel Bhusri of Workday expressed it pretty well in a session with analysts:
“From mainframe to client server and then to the cloud (the shifts were about) re-platforming for business processes. When cloud got started, very early on, we said this is not just about business processes, it's about data. And so really, since day one, we've been building the foundation for data … I can't say we knew this world was coming but we were architected for the way that this world has materialized and you can't do the AI stuff if you don't have a really strong foundation.”
Two SAP executives made an even more compelling case for their “shock-absorbing” capability in a book I helped them write last year. The preface said:
“In 2022, we celebrated SAP’s 50th anniversary. Fifty years in the software industry means that generations of technology have transformed enterprise computing—from mainframes to client/server architectures, to the internet, and into the cloud. And we see the crests of new technology waves on the horizon.
Our solutions support and enable the daily operations of more than 400,000 customers. And what we consider “business as usual” today once was revolutionary and very un-usual. Real-time computing with SAP R/2 was disruptive when most of the world ran yesterday’s data and last month’s analytics on homegrown software. SAP R/3, with its scalable client/server computing, displaced the monolithic mainframes. Integrated 24/7 internet-based businesses replaced the usual 9-to-5 weekday schedules. The cloud is paving the way to scalable and adaptable computing power, high-security data management, and continuous innovation, giving companies the agility and resilience they need for an increasingly volatile environment. The pace of change has increased dramatically in the last few years as enterprises navigate one shock after another resulting from COVID-19, the war in Ukraine, climate change concerns, and massive digital transformations.”
One caveat – the trust factor would be much higher if vendors shared metrics on how many explicit customer permissions they have gathered to use their data to train their machines.
The use cases I heard in the last month were not just about Generative AI. The vast majority of enterprise data is structured. Between SAP’s early Leonardo and Salesforce’s Einstein examples, we have early successes around demand forecasting, anomaly detection, next-best action and other scenarios. We need to turbocharge these by incorporating newly available external datasets. And when it comes to GenAI, a killer app will be around DevOps productivity, especially with hyperscaler platforms, as George Gilbert pointed out in this Burning Platform episode. Speaking of leveraging a wider group of developers, I liked Workday’s focus on AI in its Extend toolkit and its focus on startups in a new AI Marketplace.
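To make the structured-data point concrete, here is a minimal sketch of the kind of anomaly detection that has been an early win. The data, columns and thresholds are entirely hypothetical; this is an illustration of the technique, not any vendor’s implementation.

```
# Illustrative only: flag anomalous transactions in structured enterprise data.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical order data: amounts and supplier lead times.
orders = pd.DataFrame({
    "amount":    [1200, 1150, 1300, 9800, 1250, 1180, 1220, 150],
    "lead_days": [5, 6, 5, 4, 30, 5, 6, 5],
})

# Unsupervised anomaly detector; contamination is the assumed share of outliers.
model = IsolationForest(contamination=0.25, random_state=42)
orders["anomaly"] = model.fit_predict(orders[["amount", "lead_days"]])  # -1 = anomaly

print(orders[orders["anomaly"] == -1])
```

The same pattern, trained on far richer structured datasets, is what sits behind many of the demand forecasting and next-best-action scenarios mentioned above.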
I also liked Salesforce’s comments about “bring your own models” and commoditizing LLMs, and SAP’s investments in Aleph Alpha, Anthropic and Cohere. I want application vendors to build more use cases, collect more relevant datasets and build more domain expertise, not invest as much in the plumbing.
Specific to use cases, I saw many that should appeal to individual users – to help craft better email responses or job descriptions or career paths. I did not see them expressed enough in enterprise terms: “You can use the functionality for 10,000 instances a month and save xxx of $ in people time as a result”. We need that to show clear value from AI applications. In a recent conversation, I heard we are going through a phase of “AI fog” – a good way to describe that we have hundreds of potential AI use cases, but not enough value analysis around each for enterprises to easily prioritize them.
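To show what that enterprise framing could look like, here is a back-of-the-envelope sketch. Every number in it – instance counts, minutes saved, labor rates – is a made-up assumption, not vendor data; the point is the shape of the analysis, not the figures.

```
# Purely illustrative value math with hypothetical numbers - the kind of
# analysis I would like to see attached to each AI use case.
instances_per_month = 10_000   # e.g., drafted email responses (assumed)
minutes_saved_each = 4         # assumed time saved per instance
loaded_cost_per_hour = 60      # assumed fully loaded labor cost, in $

hours_saved = instances_per_month * minutes_saved_each / 60
monthly_value = hours_saved * loaded_cost_per_hour
print(f"~{hours_saved:,.0f} hours and ~${monthly_value:,.0f} saved per month")
```

Attach even a rough version of that calculation to each of the hundreds of candidate use cases and the “AI fog” starts to lift.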
And yet, the reality is that even after 25 years of cloud applications, if you chart a grid of available solutions by industry and global region, 75% of the squares have very little coverage. As a result, most vendors do not have the specific domain expertise or access to relevant data in those areas.
In a LinkedIn post about Oracle, I recently wrote:
“Brian believes Oracle’s earlier acquisitions - JDE, Siebel, MICROS, Primavera and others have brought it plenty of vertical knowledge. To me, those are all legacy applications, many from the 1980s. Made more contemporary at a cloud architectural level, but not always reflecting rapidly changing sectors. As is the core Cerner EHR. I have been particularly interested in new vertical applications which have become viable in the last 5 years”
My view is enterprises will find more AI value in the other “squares” on that grid. Vendors will need to move out of their functional comfort zones to help customers better understand these “unknown unknowns”. Their machines cannot be expected to provide truly valuable insights without curiosity about these unexplored domains. Even SAP, which covers more squares in that industry/country grid, tends to focus more on well-trodden horizontal areas. Over the next month we will hear at its events about use cases around HCM, procurement and CRM, when vertical and geographic operational use cases remain much more elusive.
Another area for improvement: too many software vendors have taken Marc Andreessen’s comment that “software will eat the world” too literally. Software in concert with satellites, sensors, robotics, drones and other physical automation delivers many times more value to customers.
Mercifully, I did not see too many demands for premium AI pricing at these events. In earlier months, I had heard talk of pricing that mimics the inflated prices of GPUs and AI talent, but that appears to have quieted down a bit. Express the value of an AI use case crisply, and the pricing discussion becomes much easier.
Over the last couple of decades we have seen consumer tech lead the enterprise through waves of cloud, social and mobile adoption. With this wave of AI, I see caution about blindly adopting consumer tech. That’s healthy. Next, we need to broaden our horizons to help enterprises with data and analytics around the far too many blind spots they continue to live with. But we need to do so quickly, at the speed of consumer tech.
PS – I am not a political analyst, but if I were I would deliver a similar objective scorecard on the state of the State of California 😊
Thoughts from SAP TechEd, Bangalore
SAP did a very nice job bringing TechEd in Bangalore, India, to analysts and other registered users around the world. They had a preview for analysts on Monday and keynotes on Thursday and Friday, conveniently timed for viewers several time zones away.
There was a slew of announcements, especially around GenAI and other AI, growing BTP product functionality, and a growing customer base for the development tools.
I was particularly fascinated by the Vector Engine, Datasphere and the Build Code discussions.
Juergen Mueller, the CTO, simplified vector talk – a term which, honestly, I mostly associate with pilot communications with air traffic controllers. We live in a 3-dimensional world, and vectors allow for representation of many more – hundreds and thousands of – dimensions. As Juergen explained, “The embedding function maps semantically similar text to vectors close to each other in a high dimensional space. Running a semantic search then simply becomes a nearest neighbor search in that vector space with 1536 dimensions. So where it differentiates is you’re not searching content, you’re searching for condensed semantic meaning.” It should hopefully allow us to go beyond today’s excitement around LLMs powering text and document-centric sources to include many more structured and unstructured data formats.
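For readers who, like me, are newer to vector talk, here is a minimal sketch of the nearest-neighbor idea Juergen described. The embed() function below is a hypothetical placeholder – a real system would call an actual embedding model, such as the 1,536-dimension one he referenced – so the match here illustrates the mechanics rather than real semantics.

```
# Sketch of semantic search as nearest-neighbor lookup in embedding space.
import numpy as np

def embed(text: str, dims: int = 1536) -> np.ndarray:
    """Hypothetical stand-in: a real system would call an embedding model."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=dims)
    return v / np.linalg.norm(v)          # unit-length vector

documents = [
    "overdue invoice from supplier",
    "open purchase order",
    "employee onboarding checklist",
]
doc_vectors = np.stack([embed(d) for d in documents])

query = embed("unpaid vendor bill")
# On unit vectors, cosine similarity is just a dot product; the highest score
# is the nearest neighbor - the "condensed semantic meaning" match.
scores = doc_vectors @ query
print(documents[int(np.argmax(scores))])
```

Swap the placeholder for a real embedding model and the same lookup works across documents, master data and other formats alike, which is what makes the approach interesting beyond text.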
Datasphere promises to unify all your data into a single semantic model. Over the years, customers have been exporting SAP data into data lakes. SAP’s delays in integrating many of its acquisitions encouraged some of that thinking. Competitors have also been encouraging customers to bring operational data into their supposedly “more user friendly” analytical tools. From my early conversations with SAP customers, there is plenty of excitement around Datasphere.
Build Code brings generative AI and Joule copilot capabilities to ease and speed up development of new applications and extensions to SAP applications. There was plenty of talk about how that would accelerate low-code and no-code development.
However, reality struck me when Juergen jokingly said “I didn't know if I will have a headache tonight thinking about the space that has more than 1500 dimensions”.
It’s tough to reconcile enterprise complexity with promises of citizen developers. I am hearing similar promises from hyperscalers and IBM that turbocharged DevOps will be the killer app to come out of GenAI. The rapid adoption of ChatGPT has fired up the imagination of every enterprise vendor – we have to make our tools way more consumer-friendly.
Like our expanding universe, enterprise landscapes continue to grow. After 25 years of cloud applications, you cannot find a decent choice for most cells if you chart a grid by country and by industry.
I wish I had gone in person to Bangalore. There were TechEd strategy sessions that were not streamed which I would have liked to watch. But even more, I would have liked to visit with many of the Indian outsourcers and try to understand how they are using GenAI on their own projects.
As an industry we have done several million ERP and CRM implementations, global rollouts and upgrades. A vast number are around SAP products. That means we have at least 10X the number of artifacts – test scripts, parameter configurations, data conversion code, training modules, etc. Can we populate LLMs with them and generate fairly decent first versions for the next 10,000 or 100,000 customers, rather than reinventing the wheel each time? Can you imagine the labor savings we could deliver?
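Here is a conceptual sketch of what that reuse could look like: retrieve the most similar artifact from prior implementations and use it to ground an LLM’s first draft. The artifact texts, the requirement and the prompt wording are all hypothetical; this illustrates the pattern, not anyone’s product.

```
# Illustrative only: ground a first-draft configuration in prior artifacts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical corpus of artifacts from earlier rollouts.
artifacts = [
    "Test script: three-way match tolerance settings for retail procurement",
    "Config notes: pricing procedure for consumer products, Latin America rollout",
    "Data conversion code: legacy customer master to business partner records",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(artifacts)

def most_relevant(requirement: str) -> str:
    """Return the prior artifact most similar to the new requirement."""
    scores = cosine_similarity(vectorizer.transform([requirement]), matrix)
    return artifacts[scores.argmax()]

requirement = "Configure invoice matching tolerances for a new retail customer"
context = most_relevant(requirement)
prompt = (f"Using this prior artifact as a template:\n{context}\n\n"
          f"Draft a first-pass configuration for: {requirement}")
# The prompt would then go to an LLM of choice to produce the first draft.
print(prompt)
```

Multiply that retrieve-then-draft loop across millions of accumulated test scripts, configurations and conversion programs, and the labor savings question above starts to answer itself.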
I did several trips to Bangalore and other Indian and Eastern European cities in the first few years at Deal Architect, which I founded in 2003. I took C-level customer executives on due diligence trips to many vendor campuses. Maybe it is time to start doing more of that.
We are staring at a massively different enterprise landscape. New DevOps tools. And very promising roles for everyday developers.
Time for new approaches.
November 06, 2023 in AI, ML, Industry Commentary