source link: https://meta.slashdot.org/story/23/06/19/2052231/meta-says-its-new-speech-generating-ai-model-is-too-dangerous-for-public

Meta Says Its New Speech-Generating AI Model Is Too Dangerous For Public (theverge.com) 60

Posted by BeauHD

on Monday June 19, 2023 @06:00PM from the don't-let-the-genie-out-of-the-bottle dept.
An anonymous reader quotes a report from The Verge: Meta says its new speech-generating AI model is too dangerous for public release. Meta announced a new AI model called Voicebox yesterday, one it says is the most versatile yet for speech generation, but it's not releasing it yet: "There are many exciting use cases for generative speech models, but because of the potential risks of misuse, we are not making the Voicebox model or code publicly available at this time." The model is still only a research project, but Meta says it can generate speech in six languages from samples as short as two seconds and could be used for "natural, authentic" translation in the future, among other things.

When you can no longer trust your eyes and ears, you now have to *gasp* do actual research and find reliable sources.

You can't look at a video and have a knee-jerk reaction, because you'll know there's a 50/50 chance it's fake.

People are going to learn cynicism. They're also going to have to learn how to evaluate sources. In other words, like it or not they'll have to learn to think critically. Otherwise they'll look like complete idiots again and again.

Yeah, there are the QAnon nutters, but those people have always existed, and you don't even need AI fakes to fool them; they'll believe anything.

The regular folks, though, the ones who have been letting corporate-owned media fool them by pushing their buttons, are about to be dragged kicking and screaming into the wonderful world of critical thinking. Whether they like it or not.
  • When you can no longer trust your eyes and ears, you now have to *gasp* do actual research and find reliable sources.

    The regular folks, though, the ones who have been letting corporate-owned media fool them by pushing their buttons, are about to be dragged kicking and screaming into the wonderful world of critical thinking. Whether they like it or not.

    Yeah, maybe people will learn to think for themselves, but I'm already quite cynical. I'm a firm believer that most people are naturally lazy and will look to optimize this extra work out of their lives as fast as possible -- they don't have the personal bandwidth to constantly research the validity of everything they see and hear from popular culture, so they'll outsource that task to "trusted sources" and instead focus their limited time on the things that matter most in their daily lives.

    It's even possible that the combination of this general lack of trust and the splintering of information sources, away from major media and toward the Internet, will just create a much larger and more diversified field of "belief bubbles".

    I hope your vision is the one that ultimately wins out. Mine feels like what we already have with cable news, only x1000 - i.e., kind of awful. Haha

    • I'm with you on this one. People keep saying that when automation takes over all the menial jobs, people will adapt and get the new jobs AI can't do. But I'm afraid the reason everyone isn't an engineer or a doctor is simply that not everyone can be an engineer or a doctor. Not because of a lack of intellect, but because of a lack of desire to seek that knowledge. It's the same reason tech companies always need more people in tech: the only people who want to sit in front of a computer 8 hours a day writing code at work, only to go home and spend MORE time in front of a computer... are already tech workers. It's not for everyone.
      And the same goes for the "pursuit of truth". Most people don't want the truth, just what's convenient. A minority wants to find the truth even if it goes against their beliefs, and those people already doubt what's out there. It's the same as with the people who are willing to be knee-deep in someone's guts, risking their careers to save someone's life: they're already surgeons.
      And let me close this with one of my favorite quotes:

      "What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny "failed to take into account man's almost infinite appetite for distractions."

      In 1984, Huxley added, "people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us".”

      Neil Postman, Amusing Ourselves to Death: Public Discourse in the Age of Show Business

      • Re:

        It definitely is a lack of intellect.
        Most people are morons, incapable of reason.
        Your observation that they are super lazy is also correct.

      • That makes it sound like a moral failing. Like laziness.

        What they lack is a single-minded obsession with one specific category of knowledge. When scientists study "smart" people, that's what they find: their brains can focus on one area and form a specific specialty around it, making them valuable experts in that field.

        Lots of folks don't have that obsessive single-mindedness, and this means that while they can do useful work, they can't become the kind of high end specialists that are goi
        • Re:

          You are giving AI and automation too much credit. Chess was a "big brain" task until we discovered that mechanical idiot savants can do it better. What we view as impressive is arbitrary, or simply whatever is challenging for most humans.

          Deciding what is a bus in a photo - THAT turns out to be much more difficult than winning at chess. The machines will need humans to do trivial tasks which are difficult and expensive for the machine to perform but relatively easy for our brains; designing these systems so an illiterate

    • I agree. Personally I blame both government and religion for dumbing people down.
  • Re:

    It's just marketing. They said the same thing about GPT-2, GPT-3, GPT-4... It's amazing they think this will work again. Who still believes this nonsense?

    • Re:

      And even if it worked exactly as advertised, and even if they somehow built in protection against nefarious uses... what would it be good for? It doesn't help with information, it only helps with "style", and style produced at zero cost is worth exactly that much.

      • Re:

        ... what's strange is that if you look at LinkedIn, most people are heaping praise on this stuff, as if that somehow makes them look more "professional".

  • Re:

    Pretty sure 2024 will be the first campaign where deepfakes are used to discredit opponents, because some of those MAGA Republicans are stupid enough to try to pull something like that and inevitably get caught.
    • Re:

      Didn't that already happen? A DeSantis ad with a deepfake Trump (kissing Fauci)?

  • When you can no longer trust your eyes and ears, you now have to *gasp* do actual research and find reliable sources.

    We already know that this is not what most people will do. Faced with the myriad lies and facts out there on the internet, people instead go with what their gut tells them is true. If whatever they are reading seems true to them, they believe it, even if it is a pack of lies. However, if it challenges their current view of the world and would require them to change their ideas, they don't believe it, even if it is absolutely factually correct.

    That's why modern society feels like it is breaking down. It was one thing when we had disagreements on politics and how to solve problems, but right now I don't think we can even agree on what is objectively real - indeed, we even get some idiots trying to tell us that each of us has our own objective reality!

  • People are going to learn cynicism. They're also going to have to learn how to evaluate sources. In other words, like it or not they'll have to learn to think critically. Otherwise they'll look like complete idiots again and again.

    Back in the 90s I was doing tech support for one of the first open-publishing sites, and we were pretty excited about the idea of news that didn't make it to the mainstream reaching the public for the first time.

    What ALSO happened, however, is that we started getting a lot of somewhat crazed conspiracy theorists (back then it was all about Bill Clinton's black helicopters, and lots of "Jews control the world" nonsense) posting frankly made-up nonsense, and we started debating internally whether we were doing harm by leaving it up.

    I argued strongly that having this stuff up teaches people that they shouldn't blindly trust even alternative media, that they would learn to think critically about what they read, and that this would help them consume regular media with a much more skeptical mindset.

    The end result was much worse. Instead, people we knew as solid, thoughtful people started repeating the nonsense in the conspiracy posts. These were smart people with degrees in STEM, analytical philosophy, and similar fields that place a high value on logical, reasoned thinking, and even they were getting bamboozled by it.

    The lesson here is: don't rely on common people to tell nonsense from sense. If even the people best equipped to do so fail, what hope do the rest have?

    • Re:

      I have a black-and-white TV series zebra I'd like to sell you...(Wwhhhiiilberrr.)

    • Re:

      If there were a secret cabal of lizardmen running the world, it would make sense for them to plant the idea that only the cleverest, bestest, smartest people have a knee-jerk dismissal response to such "obvious conspiracy nut rubbish." Lord knows there have certainly been true stories of late that would have read like the ravings of a lunatic had some reporter put them up on your site many years ago. I wish we could normalise news sites having wiki-style citations. More and more I find that those citations don
    • Re:

      Even if that is still the case, it is still a better option than instituting a "ministry of truth".

  • Re:

    I can't tell if you're talking about Facebook or AI.

  • Re:

    Well, I'd also say that's the general problem with the world as a whole. What's the sane and trustworthy source now? Big media is corporate-controlled; they care more about keeping their advertisers happy than about finding the truth. Then there's the internet, the wild west. Then you have "independent" news, which half the time is a puppet for a particular party that's hiding its real motives, or just a crazy guy talking out of his own ass with no real sources. Trusted journalism is darn near an oxymoron these days.
  • Re:

    The problem is that when it is easy to generate deepfakes, the information space may be flooded with them. There will be an arms race between creating undetectable deepfakes and systems to detect them, so there may not be any reliable tools to ensure authenticity.

    There is a good chance people will believe information that supports what they already believe and reject anything that doesn't. This could further reinforce the social bubbles that are causing so many problems.
  • When you can no longer trust your eyes and ears, you now have to *gasp* do actual research and find reliable sources.

    You do realize that the people watching Fox News, OANN, Sinclair stations and Infowars already think they are watching reliable sources, do you not?

  • What makes you think even the regular folks would want to do research, as opposed to believing whatever they want to believe? Most people already don't do research when confronted with something new. People are already quick to believe extremely dubious sources.
  • Re:

    Well, yes and no. Only something like 10-15% of all people can fact-check. For them, not much will change. For the rest, they will just get overwhelmed and fixate on the first stupid thing they like and then claim that obviously it is all true and verified and, yes, has Science on its side. You know, the usual insightless crap people with big egos and small skills do.

    Hence I think essentially nothing will change. If we get some nice AI-generated porn out of this I will call it an overall improvement, but I

    • Re:

      [citation needed]

  • Re:

    That's not the solution you think it is. Cynicism cuts in multiple ways. The reality is that the people who suffer the most are those who are most cynical about everything around them while also being incapable of research. Deepfakes won't fix the latter; they'll just make it more difficult.

    We're going to see more idiots disbelieving science and reality as a result of this, not less.

  • Re:

    Nah, they will just believe whatever the true leader says on whatever official accounts, and everything else is fake news.

  • Re:

    No, they won't; they will carry on the same as they always did - believing what they want to believe, and crying that it's someone else's fault and that the government needs to compensate them when they are conned.
  • Re:

    I think the last several years have proven that there are more than enough people perfectly happy to look like idiots over and over again for this to be troubling on a society-wide level. Granted, we're already spiraling down the toilet bowl and headed toward our doom, but it'd be nice to think we could maybe think about slowing our failure rather than accelerating it with bullshit like deepfake voices. Which we absolutely, 100% know will be used by media companies and others to fuck with elections and erode, furt

