source link: https://diginomica.com/chatting-about-chatgpt-enterprise-tech-leaders-set-out-their-generative-ai-stalls

Chatting about ChatGPT - enterprise tech leaders set out their generative AI stalls

By Stuart Lauchlan

March 14, 2023



For vendors that didn't really make the leap to cloud, generative AI is going to be impossible.

A bold statement from Workday co-CEO Aneel Bhusri at last week’s Morgan Stanley Technology, Media & Telecom Conference, at which the potential/hype around generative AI and ChatGPT in particular was a consistent topic of conversation. So, is he correct? The Morgan Stanley event gave a number of enterprise tech players room to set out their various stalls, including Workday itself, with Bhusri making a pragmatic plea: 

We shouldn't just discard traditional AI and ML [Machine Learning]. It's funny calling it traditional AI, because that's relatively new. They're solving different use cases. Traditional AI and ML, you throw a bunch of data at it, it gives you a recommendation. It's either basically automating drudgery work, or it's giving you insight into a process that would take humans a decade to analyze and the machines can analyze it in minutes. That trend needs to continue to go forward and generative AI is not going to replace that.
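The distinction Bhusri draws is worth making concrete. "Traditional" AI in his sense is supervised machine learning: train on historical records, get a prediction or recommendation back, rather than generating new text. A minimal, purely illustrative sketch in Python (the fields and data below are invented for the example, not drawn from Workday):

from sklearn.ensemble import RandomForestClassifier

# "Traditional" ML in Bhusri's sense: train on historical records, get a
# recommendation back. Fields and data are invented for illustration.
# Features: [tenure_years, training_hours, engagement_score]
X = [[1, 10, 0.4], [6, 40, 0.9], [3, 5, 0.3], [8, 60, 0.95], [2, 15, 0.5]]
y = [0, 1, 0, 1, 0]  # 1 = flagged for promotion review in past cycles

model = RandomForestClassifier(random_state=0).fit(X, y)
print(model.predict([[5, 35, 0.8]]))  # e.g. [1] -> recommend a promotion review

The output is a recommendation derived from past data, not generated prose, which is the line Bhusri is drawing between the two families of technique.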

But generative AI is “really an amazing technology”, he added, suggesting that the first wave of use cases from Workday will be seen in around six or seven months. While he’s inevitably keeping his powder dry, Bhusri cited a relatively simple use case by way of example, in the shape of performance reviews: 

People hate writing performance reviews. The way that we're using generative AI is to take an input - maybe a combination of a review done over Zoom and all the data that you already track in Workday from a performance perspective - and then [have generative AI] actually write the performance report that nobody likes to write. In our early trials, it works really well. You would think it's written by a human, except there are no grammar errors. I mean, it's pretty flawless…our data is absolutely clean and it's normalized. So [with] ChatGPT, you can ask good questions and you get some wacky answers at times, because the Internet is the Wild Wild West, so maybe it's actually getting wilder. But our data is absolutely constrained, normalized, it's clean. And as a result, we can produce pretty powerful results.
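Workday has not published how this works under the hood, but the pattern Bhusri describes - feed clean, structured performance data to a generative model and have it draft the narrative - can be sketched against a general-purpose LLM API. Everything below (model choice, field names, prompt) is an assumption for illustration, not Workday's implementation:

from openai import OpenAI

# Hypothetical sketch: draft a performance review from structured HR data.
# Model, field names and prompt are illustrative assumptions, not Workday's code.
client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

employee_record = {
    "role": "Account Executive",
    "goals_met": "7 of 8 quarterly goals",
    "peer_feedback": ["strong collaborator", "responsive to customers"],
    "manager_notes": "Exceeded renewal targets; pipeline hygiene needs work.",
}

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model choice
    messages=[
        {"role": "system",
         "content": "Write a concise, balanced performance review based only on the structured data provided."},
        {"role": "user", "content": str(employee_record)},
    ],
)
print(response.choices[0].message.content)

The point Bhusri keeps stressing - clean, normalized input data - is what keeps output like this grounded: the model is summarizing records it is handed rather than free-associating over the open web.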

For Alphabet, CFO Ruth Porat said Large Language Models (LLMs) are going to reach across the entire product landscape of her firm’s various businesses, arguing that LLM tech began to appear around four years ago:

We've used AI that really helps advertisers achieve and maximize their objectives around ROI. We've used it in cloud. And so, we're building on a decade of extraordinary work and look forward to sharing more in the next weeks to months. We firmly believe that AI is transformative for each one of our businesses, our ability to connect more closely to customers, our ability to extract operating efficiency in our businesses, our ability to address some of the risk management requirements that we need.

Search is a good case in point, she added: 

Probably one of the most exciting new applications that really leverages Large Language Models is everything we're doing with multi-modal search. And to give you a sense of what that is, say anyone [who] wants to plan a great dinner tonight somewhere in San Francisco, and you've got your favorite Italian dish that you want, [such as] lasagne. You can take a screenshot or photo, combine it with text, with Google Search around Near Me, and you'll get a full listing of where you can go for that great lasagne dish that you've seen and that you want here in San Francisco.
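Google has not described how multi-modal search is implemented, but the user-facing idea - a single query that combines an image, text and a location - can be roughed out against any multimodal chat API. The model, image URL and prompt below are placeholders, not Google's stack:

from openai import OpenAI

# Illustrative only: combine an image with a text query in a single request.
# Uses a generic multimodal chat API as a stand-in for Google's search stack.
client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder multimodal model
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "What dish is in this screenshot, and what kind of restaurant in San Francisco would serve it?"},
            {"type": "image_url",
             "image_url": {"url": "https://example.com/lasagne-screenshot.jpg"}},
        ],
    }],
)
print(response.choices[0].message.content)

The Near Me listing itself would still come from a conventional local search index; the model's contribution is understanding the combined image-plus-text query.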

Inflection

This is an inflection point, according to Jeff Lawson, CEO of Twilio, even if it doesn’t necessarily feel like that:

There's a point of inflection [around] what these [LLMs] can do, but also what they can't do...I think it's still early for enterprises, to be frank. But when you look at the landscape of things, these Large Language Models are able to make incredible predictions of language, and to accomplish tasks that humans can do in terms of language.

So it's obvious to look at contact center conversations. Can they have sales conversations with my customers? Can they do support? Can they write marketing copy? Can they even generate the marketing images? All these types of things they can do and I think a lot of those things will come to fruition in the coming years.

Lawson predicts that LLMs will have three layers of knowledge: 

There's the knowledge about the English language in the world. And you get that by training these models on the Internet and Wikipedia and everything else, and that’s what OpenAI has [done]. The next layer is details about the company that AI is representing. That's where you feed it the website and the facts and the knowledge bases and all that kind of stuff. That will be easy, every company will go [and] solve that. There'll be a lot of ways to solve that, but it won't be super hard. 

But then the third layer on top of that is the world, the company, the customer. You're going to have to tell the model, who am I talking to? Here's how you pronounce it. Here's what she bought. She's a new customer. She's a life-long customer. She's a high-value customer. Oh no, she's about to churn. Here's our purchase history. Here's the prior interactions we've had, all that context for who [she] is. The bot's going to need to know, otherwise it can have a great conversation with you about Wikipedia, but it won't actually have any business value. 
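Lawson's three layers map fairly directly onto how such a bot tends to be assembled in practice: a general-purpose model (layer one), company knowledge pulled from a knowledge base (layer two), and per-customer context pulled from a CRM (layer three), all folded into the prompt. A hedged sketch of that assembly, with every name and data point invented:

from openai import OpenAI

# Hypothetical sketch of Lawson's three layers, assembled into one request.
# Layer 1: the base LLM's general language knowledge (the model itself).
# Layer 2: company knowledge - stubbed here, normally retrieved from docs/FAQs.
# Layer 3: customer context - stubbed here, normally pulled from a CRM.
client = OpenAI()

company_facts = [
    "Refunds are available within 30 days of purchase.",
    "Premium support is included on the Enterprise plan.",
]

customer_context = {
    "name": "Siobhan (pronounced shiv-AWN)",
    "segment": "long-time, high-value customer",
    "recent_purchase": "Enterprise plan renewal",
    "churn_risk": "elevated - two unresolved support tickets",
}

system_prompt = (
    "You are a support agent for Acme Co.\n"
    "Company facts:\n- " + "\n- ".join(company_facts) + "\n"
    "Customer context: " + str(customer_context)
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "Hi, I'm thinking about cancelling my plan."},
]

reply = client.chat.completions.create(model="gpt-4o", messages=messages)
print(reply.choices[0].message.content)

Strip out that third block and you get exactly what Lawson warns about: a bot that can chat fluently about anything in its training data but carries no business value.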

And there’s a lot of work to be done to achieve that value, he cautioned, citing a test demo from his own dev team, based around getting the price for an airline flight:

They were feeding in a bunch of data, pretending like you were interacting with an airline and saying, could you book a flight with a Large Language Model? They were trying to convince it that when you don't know the price of a flight, call this API to go get that knowledge. Instead, the Large Language Model insisted on making up a flight and its price. Like, yeah, it's $450 to fly on Flight 482. We're like, ‘That's just [BS], where did you get that?’. That's what Large Language Models are doing today. But…that will get solved in the coming years as these models get more sophisticated.
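What Lawson's team was trying to coax out of the model - "when you don't know the price, call this API" - is the pattern that LLM APIs now expose as function or tool calling: the model emits a structured request for the application to execute instead of inventing a fare. A minimal sketch, with the function name, schema and flight data invented for illustration:

import json
from openai import OpenAI

# Hypothetical sketch: give the model a tool for flight prices so it requests
# a lookup instead of hallucinating a fare. Names, schema and data are invented.
client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "get_flight_price",
        "description": "Look up the current price of a flight by flight number.",
        "parameters": {
            "type": "object",
            "properties": {"flight_number": {"type": "string"}},
            "required": ["flight_number"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model choice
    messages=[{"role": "user", "content": "How much is flight 482 tonight?"}],
    tools=tools,
)

# Rather than a made-up $450 fare, the model should return a structured tool call
# for the application to run against the real pricing API.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))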

Age of AI

As they do, the ‘Age of AI’ will kick in, according to Scott Guthrie, Microsoft’s EVP of Cloud and AI, who calls this “the most significant technology transformation in any of our lifetimes”. As to its enterprise readiness, he argues: 

I think we're still early innings. The thing that's been great about ChatGPT, and then also around Bing, is the fact that end users can do it. The number of people I've talked to who maybe haven't used all of the new products from all the tech companies, but seem to have tried those two, and [they've] said, ‘Hey, we're using it now. My children are using it for homework', which you're not supposed to, or ‘We're using it in a variety of use cases'. I hear more and more interesting ones. I think it has actually made what has been a very technical concept - large foundational AI models, transformer-based learning - [real].

Most people didn't know what the word even meant 12 months ago, and yet hundreds of millions of people have heard of ChatGPT and Bing now and tried them. I think that is actually making it much more real, which is giving us an opportunity to then say, ‘Hey, let's show you how you can use this in customer support. Let's show you how we can use it with developer productivity. Let's show you how we can use it for sales productivity'. It's a good conversation starter.

My take

The ChatGPT hype is real, but so is the potential for enterprise application. No vendor presentation in 2023 is complete without a generative AI angle to it, for better or worse. But the genie is out of the bottle. Last words to ServiceNow CEO Bill McDermott: 

We're big on AI, we’re big on generative AI. We do think this is a transformational moment. I know the hype cycle is high right now, but it's absolutely a transformational moment. When you think about Netflix, as an example, getting its first million users in 3.5 years, and Twitter doing it in two years, and ChatGPT doing it in five days, you know something's going on out there.

