
WhatsApp, Signal and Encrypted Messaging Apps Unite Against UK's Online Safety Bill

 1 year ago
source link: https://news.slashdot.org/story/23/04/18/1021231/whatsapp-signal-and-encrypted-messaging-apps-unite-against-uks-online-safety-bill

WhatsApp, Signal and other messaging services have urged the UK government to rethink the Online Safety Bill (OSB). From a report: They are concerned that the bill could undermine end-to-end encryption, which means a message can be read only on the sender's and the recipient's apps and nowhere else. Ministers want the regulator to be able to ask the platforms to monitor users in order to root out child abuse images. The government says it is possible to have both privacy and child safety. "We support strong encryption," a government official said, "but this cannot come at the cost of public safety. Tech companies have a moral duty to ensure they are not blinding themselves and law enforcement to the unprecedented levels of child sexual abuse on their platforms. The Online Safety Bill in no way represents a ban on end-to-end encryption, nor will it require services to weaken encryption." End-to-end encryption (E2EE) provides the most robust level of security because nobody other than the sender and the intended recipient can read the message. Even the operator of the app cannot unscramble messages as they pass across its systems; they can be decrypted only by the people in the chat. "Weakening encryption, undermining privacy and introducing the mass surveillance of people's private communications is not the way forward," an open letter warns.


  • But think of the children...

    But is Government always right? Sometimes it's Left...and sometimes it's just plain wonky.

    • There's three ways to do anything: The right way, the wrong way, and the government way.

      The problem is that the government has powers to make your life very, very miserable if you don't comply.

      • Re:

        The 4th way to do things is the Disney way...make the State of Florida and Gubernor Ron the Saint miserable.

    • Re:

      The government is made of politicians who, as shown here, can't help but lie, even when there is no need to and lying actively harms their goals.
      So no, government is never right; they can't seem to help it.

      All they have to do is throw the choice out there.
      Do you want communications protected at the expense of anyone seeing those communications?
      Or do you want to enforce laws on those communications by eliminating protected communications?

      I have a feeling a huge number of people would be i

      • Giving the majority what it wants is arguably why it's almost impossible to buy something which isn't a piece of shit which is broken before it even leaves the store - the majority wants cheap at the expense of everything else - despite that 'cheap is expensive'.

        Extrapolate to explain all the ills of the world.

      • Clearly no one can argue with preventing child exploitation (presumably the reason this benefit is highlighted).

        Can't they come up with anything else?
          * We're super-nosy and want to eavesdrop on everyone's private time
          * We want to go on a fishing expedition so we can determine how to get the best ROI on future plans to make X illegal
          ?

    • Re:

      They are thinking of the children. They want to make sure the grooming quota is being met.

    • Re:

      They can already ask for transparency.

      It is possible to have opacity, and transparency for child safety.

      We support opacity, except for transparency for "public safety".

      We don't support opacity. We want transparency and we want services to provide transparency.

      We want opacity, we don't want services to provide transparency.

    • Re:

      The British government lost all credibility when they protected one of the most prolific sex offenders of all time, Jimmy Savile. https://en.m.wikipedia.org/wik... [wikipedia.org]

    • Re:

      Maybe it will be like those scanners and printers that refuse to scan/print money.

      • Re:

        Depending on how the law is worded, this might actually be sufficient. If a regulator asks a platform to monitor its users for child sexual abuse, the platform turns on a switch in the app which causes every image to be compared to images of child sexual abuse provided by the regulator. If there is a match, the app refuses to send the image.
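        A minimal sketch of the kind of on-device check being described, assuming a hypothetical BLOCKLIST of SHA-256 digests distributed by the regulator (the name and the digest set are invented for illustration; this does exact matching only, unlike real perceptual-hash systems):

```python
import hashlib

# Hypothetical set of hex digests supplied by the regulator.
BLOCKLIST = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def allowed_to_send(image_bytes: bytes) -> bool:
    """Return False if the image's SHA-256 digest is on the blocklist."""
    return hashlib.sha256(image_bytes).hexdigest() not in BLOCKLIST
```

        Note that an exact digest match like this is defeated by changing a single pixel, a weakness raised further down the thread.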

        • Re:

          which, of course, means no end-to-end encryption, because your comms are sent to the tech giant's approval centre first.

          The government also quietly adds a clause saying "child porn, and other government-requested data for specified users" and thus the surveillance state is created, as the government checks up on anything anyone might send whenever they like, perhaps even with clauses saying the tech companies cannot inform anyone that they are even checking your comms.

          Meanwhile, the real child por

          • Re:

            No, the image to be tested would stay on the sender's computer and be compared there with images downloaded from the regulator.

            • Re:

              "No, the image to be tested would stay on the sender's computer and be compared there with images downloaded from the regulator."

              I don't think you've thought this through. You seem to be suggesting the regulator will send you illegal images, all of them, burying you in data and automatic crime.

              The way things like this are typically done is you send them a hash of your image and they perform a hash check and respond with a Yes/No. Ignoring the huge potential for misuse here, all it takes is a single p
              • Re:

                There's no reason that the local app couldn't have the hashes if all you really care about is blocking copies of existing material that's already been identified. There are also plenty of perceptual hashing algorithms that are robust to single-pixel edits and other common manipulations of this kind.

                It hardly matters either way. Criminals will always find a way to break the law. That and any kind of database of existing media to watch for won't stop
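                The manipulation-resistant hashes mentioned above can be illustrated with a difference hash (dHash), sketched here over a plain 2D list of grayscale values purely as an illustration (a real implementation would operate on decoded image data):

```python
def dhash(pixels, hash_size=8):
    """Difference hash: downscale to (hash_size+1) x hash_size samples,
    then record whether each sample is brighter than its right-hand
    neighbour. A single-pixel edit flips at most a few bits, so
    near-duplicates stay within a small Hamming distance."""
    h, w = len(pixels), len(pixels[0])
    rows, cols = hash_size, hash_size + 1
    # Naive nearest-neighbour downscale.
    small = [[pixels[r * h // rows][c * w // cols] for c in range(cols)]
             for r in range(rows)]
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")
```

                Matching then means checking whether the Hamming distance to any database hash falls below a threshold, rather than requiring exact equality.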
                • Re:

                  I would be interested in evaluating a hash that will correctly flag the vast majority of test images as belonging to the database, and will not incorrectly flag any images that do not belong to the database.

                  Apparently, new child sexual abuse images are expensive to produce. Most trading in such images is done with existing material, which is in the database.

                  • Re:

                    Apparently, new child sexual abuse images are expensive to produce. Most trading in such images is done with existing material, which is in the database.

                    Perhaps for now, recent examples of machine learning 'art' suggest this may not hold for long. In which case you might not want to publish any picture at all of your kids, anywhere.

                    snake

                    • Re:

                      Interesting. There is no way to avoid publishing pictures of one's children, short of living in a cave. I wonder what the law would do about a picture that was generated from a prompt like "Make a picture of a child being abused". The law forbidding pictures of child abuse gets around the First Amendment by saying that such pictures are evidence of child abuse, but a synthesized picture isn't evidence of anything. This is an area that the law will struggle with, I think.

              • Re:

                The images can be encrypted with the regulator's public key. The sender's computer would encrypt each of its images using the regulator's public key before comparing it to the downloaded encrypted images. Having these encrypted images would not be illegal because without the regulator's private key they cannot be viewed.

                Storage is cheap. It should be possible to store an encrypted copy of every image in the regulator's illegal image database on a standard cell phone. If not, the regulator can pay the e

                • Re:

                  You think all phones should have an encrypted folder containing all known CSAM images/videos?! Constantly updated, I assume. You say storage is cheap, but we'd be talking about many, many terabytes of space. This is the most asinine thing I have ever heard.

                  • Re:

                    I started using computers in 1963, when core memory was a dollar a word. Storage prices have fallen off a cliff since then. Even if the database does occupy terabytes of space, that amount of storage will soon be cheap. I just bought a 16 TB disk through eBay for $125 plus shipping.

                • Re:

                  ".... The images can be encrypted with the regulator's public key."

                  That brings absolutely no useful value. A simple hash would give the same result and be much smaller. Encryption only has value if there is decryption; without that, all you need is a hash. Why in the world would you want to send millions of pieces of regulator content rather than one piece of user content? That's crazy!

                  "It is possible to do a series of Fuzzy Hashes to determine near matches but that begs the question of exactly what is near."
                  "I don't think
                  • Re:

                    I suggested encryption rather than hashing because I want to avoid hash collisions. I agree that a hash of the image is good enough, provided the hash is long enough to provide a very low probability that a legal image will have the same hash as an illegal image.

                    I was disagreeing with the premise that such fuzzy hashing is possible without addressing the question of how to determine whether an image is "near enough" to an illegal image to also be regarded as illegal.
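                    On the collision question: assuming hashes behave like uniform random bit strings (an idealisation that fuzzy hashes deliberately give up), the chance of a legal image accidentally matching the database can be bounded with simple arithmetic. The database and photo-library sizes below are made-up illustrative numbers:

```python
def false_match_prob(db_size: int, user_images: int, hash_bits: int) -> float:
    """Union-bound estimate of the probability that any of a user's
    images shares a hash with any database entry, assuming uniformly
    random hash_bits-bit hashes."""
    return min(1.0, db_size * user_images / 2 ** hash_bits)

# A 256-bit hash makes accidental collisions negligible even for huge
# collections, while a short 32-bit hash would all but guarantee them.
```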

        • Re:

          Would that mean transmitting every known child porn image to the suspect's phone?

          • Re:

            Yes, though as noted in a previous response they could be encrypted.

            • Re:

              That might take quite a while, and necessitate expanding the phone's storage a bit. If they encrypt, changing even a single pixel on the image to be sent would make the comparison fail or they would have to use encryption so weak that the system might itself legally constitute distribution of child porn.

              If they're serious about stopping the problem, they're just going to have to put their coffee or tea down and go check on the wellbeing of children.

              Of course, they could also try following up on the evidence

              • Re:

                I suspect combating child porn is just an excuse--what they really want is to be able to monitor all communications. If that is true, a proposal like this one, that does not allow them to snoop on everyone, will be rejected, though they might have to think a bit to come up with a reason.

        • Re:

          How many gigabytes of child sexual abuse images have to be included in the app download to make this work?

          • Re:

            The download would happen only when the switch was turned on, and of course it would have to be kept up to date as the database of illegal images changes. I don't know how large the database is, but storage is cheap: it should be possible to store it on a cell phone. Perhaps the regulator could subsidize cell phones and forbid those without enough storage.

  • We want to assure no one has to risk their safety, liberties, data or communications, so put a complete and total black box over everything, that's hardened, and perfect. Oh, but please cut a peephole in it so you, and us, can watch everything, log everything and violate what should be fundamental rights of our citizens.

    Yes, illegal stuff happens, and the internet makes it easier in a lot of cases, but remove the internet from the equation. Laws exist, so to protect everyone else, should we monitor everyone, at all times, under the government's strict control and surveillance? Basically, if you want to go for a walk, you should first register the route with the government, just to make sure they approve.

    That's the level of violation we're talking about, frankly it's getting to the point they'll want to monitor your bedroom, to make sure your sheets consent to being slept in.
  • It's so easy to say we need to "protect the children", but how much child abuse would stop if these platforms allowed govt monitoring, versus how much damage would occur when the govt database is inevitably hacked and released? Surely if these platforms become monitored, people will find other ways to conduct criminal activity.
    • Re:

      This is really the key.

      Child abuse has been there since forever and is not new due to technology. Let's not forget the actual crime is the child being abused. If it's done in secret and only 10 perverts see it gathering in a basement on a projector, versus 1000 due to WhatsApp... the actual abuse of the child has not changed.

      While possession of child porn is a crime, we should always remember the metric for success of a policy should be on reducing the number of children being abused.

      I guess one really valid

      • Re:

        If they had the technology, they would make imagining child porn illegal. They'd make fantasising about committing a crime illegal. Thus annihilating much of literature and cinematography. But it's for the children, so meh.

    • Re:

      How much abuse would this stop? Probably none at all. This is about _pictures_ of child abuse being _sent_. It is not concerned with the creation of those pictures. And children that have their abuse not being documented (probably the vast majority) are apparently not a concern at all. The whole thing is a big, fat lie.

      • Re:

        The theory is that if there is a market for pictures of child abuse, there will be an incentive to create them, and the creation involves abusing children.

  • Why would any company oppose the above? And if they oppose, what's their alternative? It's like they're literally asking for permissions to steal messages. And the way they word their argument is like they're doing the world a favor.

    • Re:

      I think you're misunderstanding it. They're not asking for permission to steal messages. The "which means" clause defines "end-to-end encryption" for an audience unfamiliar with the concept; it does not describe "the bill".

      What's a little amusing here is that at least one of the entities opposed to this bill is Facebook (WhatsApp), and Facebook would, actually, like your messages. They built an entire business on destroying your privacy. But I guess they dislike the government doing it even more...

  • Most of these apps are meant for shortish messages. If you have kidde pr0n, you encrypt it with serious encryption before you share it. That takes the medium out of the equation. The dark web is where most of it is traded. So, this hole in the messaging apps is really aimed at drug dealers and other petty criminals. Maybe catch some cheating spouses, be able to leverage them. It won't catch many pr0n traders.
  • The best thing we can do for our kids - beyond ensuring the survival of our species and our civilization - is to say no to institutionalized, legalized, normalized privacy violation. So it's absurdly ironic that the Brits are using a "think of the children" argument here in an attempt to justify their authoritarian-bordering-on-dictatorial spying.

    Any government that insists on the right to routinely examine and read its citizens' private communications is an authoritarian regime in the making. It would be
