
source link: https://diginomica.com/enterprise-hits-and-misses-shadow-ai-here-it-problem-autonomous-transport-hits-barriers-and-digital

Enterprise hits and misses - Shadow AI is here, but is it a problem? Autonomous transport hits barriers, and digital transformation gets a report card

By Jon Reed

June 26, 2023



Lead story - Shadow AI is here, but is it a big problem?

When I learned that upwards of 90% of developers (and marketers) are using ChatGPT, I started in on the phrase "Shadow AI." No surprise - this is a thing. As Chris writes, AI is causing a new wave of shadow IT, warns HCLTech's Ashish K Gupta.

First off, any IT leaders who need a warning about this should turn in their credentials today. News headlines already made it clear: your corporate data could wind up in these ChatGPT models, which are not built for enterprise privacy concerns (Samsung banned the use of these tools internally after such an incident; they are not the only ones).

These problems aren't going away. Chris quotes Gupta on the declining control of IT, on pretty much any type of software adoption:

There was a time when everything used to run through IT, or the CIO’s desk. But I can tell you that is now a rapidly falling curve. And as technology becomes more democratized, and as business becomes more intelligent about using it for competitive advantage, the percentage of IT department involvement is only going to decrease further.

When it comes to software purchase decisions, that may be healthy. But it's baffling that organizations wouldn't involve IT in vetting the security and privacy aspects of their chosen tools. If this isn't happening, it's a major gaffe. Where Chris and I don't agree fully is when he asserts:

AI is increasingly doing the creative work while humans feed the machines, forcing down the earning potential of skilled, creative professionals in the process.

Yes, some companies will do this - the cynical ones will double down on it. Companies would love nothing more than to reduce creative costs. But the savvy creatives (and developers) will employ such tools to increase their productivity. These tools simply cannot do peak creative work on their own, so it comes down to how much mediocre creative work a company can get away with crudding out in a competitive environment. Using AI to generate job and product descriptions at scale is another matter. Automating processes to free up workers for higher value activities, such as "face time" with important customers or incident escalations, remains a crucial goal of most companies with automation and AI. We can't settle that debate here, but there's no question that employees are feeding corporate data into these third party AI tools, for a variety of reasons.

Is this a problem? Yes, in the short term. Counting on these tools to protect corporate data would be nutso. IT managers need to issue Shadow AI guidelines, pronto. Samsung intends to solve this problem by providing its own tools. Other companies will wait until enterprise-grade LLMs come along (I anticipate the spring 2024 timeframe for generally available tooling of this kind).

Build or buy, it doesn't matter - there will be a "wild west" gap until generative AI tools with proper data guardrails are in place. Therefore, I see Shadow AI as a high priority, but short term, problem. The overall role of IT in the proper vetting of software tools is the bigger issue Chris rightly calls attention to, and that's not going away.

Diginomica picks - my top stories on diginomica this week

Vendor analysis, diginomica style. Here are my three top choices from our vendor coverage:

Fresh use cases from the road:

A few more vendor picks, without the quotables:

Jon's grab bag - Madeline addresses the debilitating experience of imposter syndrome in Women in tech - how to banish imposter syndrome and get comfortable with self-promotion. Mark Samuels takes on the problem of drug fakery in UNICEF partners to tackle fake medicine abuse internationally. Meanwhile, Chris finds flaws in autonomous vehicle narratives: Why technologists must recognize the cultural barriers to autonomous transport.

I would simply add: the technology conversation isn't done either. Technologists need to accept that the technology for fully autonomous vehicles isn't there yet. There are viable/interesting use cases in more controlled settings, but the urban use cases and road environments are too unpredictable for the current tech. One of many examples: interacting poorly with emergency vehicles.

San Francisco's fire chief is fed up with robotaxis that mess with her firetrucks. And L.A. is next https://t.co/CsfvBYfFRi

"putting more robotaxis on public streets even as they prove inept at dealing with firetrucks, ambulances and police cars."

- let's move fast, break stuff

— Jon Reed (@jonerp) June 25, 2023

Finally, I got lured into writing about the business of web scraping in The business of real-time data - is there an ethical approach to web scraping? I'm still unconvinced there is such a thing as "ethical web scraping" - what say you?

Best of the enterprise web

My top seven

  • Mass adoption of generative AI tools is derailing responsible AI - Joe McKendrick also flags the Shadow AI issue. But there is a "responsible AI" aspect in play also: "responsible AI programs 'should cover both internally built and third-party AI tools,' Renieris and co-authors urge." Also see: McKendrick's A thorny question: Who owns code, images, and narratives generated by AI?
  • Why Digital Transformations Are Failing at an Alarming Rate - Eric Kimberling with a thorough post on how digital transformation goes wrong. I hope to pick this up with him when I join his video show tomorrow.
  • Elevating The Voice of the Supply Chain Contrarian – Lora Cecere is on fire again: "As large manufacturers work the conference circuit in the industry to increase brand awareness for their supply chain efforts, they feed the perception that their stories demonstrate supply chain leadership. However, in the process, no company is held accountable for a performance standard, and as a result, we are learning glossy versions of stories from laggards." Ouch!
  • Uncertain Outcomes - Hank Barnes reflects on lessons from his recent Gartner event: "There is a group, your unacceptable profile, that you should avoid spending resources on unless there are clear, unique signals to the contrary."
  • Seven Rules For Internet CEOs To Avoid Enshittification - A (very) strongly-worded piece on doing business the right way: "If you’re charging for something that was once free, you’re taking away value from your community. You’re changing the nature of the bargain."
  • The New New Moats - This long-form read from Greylock's Jerry Chen is a "red line" update on a prior piece, accounting for the major AI developments since Chen originally issued it. I can't summarize all the points, but the "new new moats" is a provocative (and useful) concept - meaning that big tech companies turn out to have a major advantage in data-rich, large language model scenarios ("the old moats matter more than ever"). The implications for enterprise software are fascinating - including disruption in user licensing and product categories (if I ask my "AI" to carry out a workflow for me, I don't want to hear I don't have a license that covers your service cloud). Does the transaction layer go away? I'm not sure, but it certainly becomes a commodity. If so, that changes the enterprise software market considerably. I'll look to write about this soon.

Whiffs

Event apps can be handy - well, except when you can't verify your login details and the app vendor doesn't do a thing about it:

Looks like we can add "cvent customer success" to the list of business oxymorons... unless the goal was to ensure a bunch of attendees couldn't log into the event lol! cc: @CventSuccess https://t.co/E1ZXF6wuOx

— Jon Reed (@jonerp) June 24, 2023

I'm all for app security, but do we really need heightened security for browsing conference sessions? On the other hand, for health care data security, I'm all in:

Gotta love the wonderful contrast between

1. "We take your data privacy very seriously" and

2. The many steps now taken to bolster web security that should have been in place all along - and might have prevented a damaging ransomware attack...https://t.co/lZpEO95RwA pic.twitter.com/aKX4ICqSlr

— Jon Reed (@jonerp) June 19, 2023

I seem to be struggling to get that data security balance right, eh? On top of that, I'm coping with jealousy of diginomica contributor Brian Sommer, who is getting all kinds of leadership recognition in his inbox. He passed this one along, complete with typos:

We are glad to invite our Editorial Team has shortlisted Brian Sommer to feature in The 10 Most Inspiring Leaders to Follow in 2023.

Now why can't I get this kind of recognition in my inbox? Ah well, a guy can dream... See you next time - and I'll get to IBM's shopping spree next week, as well as Anaplan's layoffs.

If you find an #ensw piece that qualifies for hits and misses - in a good or bad way - let me know in the comments as Clive (almost) always does. Most Enterprise hits and misses articles are selected from my curated @jonerpnewsfeed.
