
Gmail users warned to opt out of new feature - what we know

68 comments
  • You have been automatically OPTED IN to allow Gmail to access all your private messages & attachments to train AI models.

    "Feature"

  • I'm not seeing where any of this gives Google permission to train AI using your data. As far as I can see it's all about using AI to manage your data, which is a completely different thing. The word "training" appears to originate in Dave Jones' tweet, not in any of the Google pages being quoted. Is there any confirmation that this is actually happening, and not just a social media panic?

    • The only option is Smart Features, on or off. That requires Google to read your email to categorize it and do a lot of other basic stuff. It doesn't let you keep more privacy on specific features; it's all or nothing, and if you get a lot of email and already rely on the categories, it's hard to turn off. Google always makes it all or nothing because they know people need some of it, same as with location: you have to allow precise location tracking to use anything, you can't just give a rough location.

      • Yes, but the point is that granting Google permission to manage your data with AI is a very different thing from training the AI on your data. You can do all the things you describe without also having the AI train on the data; in fact, training the AI on the data would be a fair bit of extra work on top.

        If the setting doesn't specifically say that it's to let them train AI on your data, then I'm inclined to believe that's not what it's for. They're very different processes, both technically and legally. I think there's just some click-baiting going on here with the scary "they're training on your data!" accusation; it seems to be baseless.

    • Not that I've seen, no.

    • I would opt out just in case. I remember using Adobe Acrobat at work and noticing that it read every single PDF and generated a few comments about it, even when you never asked it to. Meaning it was scanning through potentially confidential data. I have no doubt Google will do the same sooner or later.

    • I probably agreed to 'smart' something long ago, for spellcheck and the like. They've expanded those options a lot since. I no longer need Google's spellcheck, and I've been on Firefox since Chrome went downhill.

    • Understand that basically ANYTHING that "uses AI" is using you for training data.

      At its simplest, it's old-fashioned A/B testing, where you're used as part of a reinforcement/labeling pipeline. Sometimes it gets considerably more bullshit than that, as your queries themselves, and whatever led you to make them, get used to "give you a better experience" and so forth (there's a rough sketch of what I mean at the end of this comment).

      And if you read any of the EULAs (for the stuff that Google opted users into...) you'll see verbiage along those lines.

      Of course, the reality is that Google is going to train off our data regardless. But that's why it's a good idea to decouple your life from Google as much as possible. It takes a long-ass time, but... no better time than today.
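
      A rough sketch, in Python, of the kind of logging I mean; the variant names, fields, and file path here are hypothetical, not anything Google documents:

          import json
          import time

          # Hypothetical A/B test: a user is shown a reply suggestion from one of two
          # model variants, and whether they accept it becomes a label for later training.
          VARIANTS = ["suggestion_model_a", "suggestion_model_b"]

          def assign_variant(user_id: str) -> str:
              # Deterministic bucketing so each user keeps seeing the same variant.
              return VARIANTS[hash(user_id) % len(VARIANTS)]

          def log_interaction(user_id: str, shown_text: str, accepted: bool,
                              log_path: str = "labels.jsonl") -> None:
              """Append one (input, label) example for a later offline training run."""
              record = {
                  "ts": time.time(),
                  "variant": assign_variant(user_id),
                  "input": shown_text,        # what the user was shown
                  "label": int(accepted),     # did they click/accept it?
              }
              with open(log_path, "a") as f:
                  f.write(json.dumps(record) + "\n")

          # Every ordinary interaction quietly becomes a labeled example:
          log_interaction("user-123", "Sounds good, see you then!", accepted=True)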

      • Understand that basically ANYTHING that "uses AI" is using you for training data.

        No, that's not necessarily the case. A lot of people don't understand how AI training and AI inference work: they are two completely separate processes, and doing one does not entail doing the other. In fact, a lot of research is going into making it possible to do both at once, because learning from data while serving it would be really handy, and it can't really be done that way yet. The sketch at the end of this comment shows how separate the two paths are.

        And if you read any of the EULAs

        Go ahead and do so; they will have separate sections specifically about the use of data for training. Data privacy is regulated by a lot of laws, even in the United States, and corporate users are extremely picky about that sort of thing.

        If the checkbox you're checking in the settings isn't explicitly saying "this is to give permission to use your data for training", then it probably isn't doing that. There might be a separate one somewhere, or it might just be a blanket thing covered in the EULA, but "tricking" the user like that wouldn't make any sense; it doesn't save them any legal hassle.
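
        To make the distinction concrete, here's a minimal PyTorch sketch with a toy classifier (nothing to do with Gmail's actual systems); inference only reads the input, while training is a separate pipeline that has to be built on purpose:

            import torch
            import torch.nn as nn

            # Toy "classifier" standing in for any AI feature (e.g. sorting mail into categories).
            model = nn.Linear(8, 3)
            example = torch.randn(1, 8)

            # Inference: the model reads the input and produces an output.
            # No gradients, no weight updates - the data does not change the model.
            model.eval()
            with torch.no_grad():
                category = model(example).argmax(dim=1)

            # Training: a deliberately separate pipeline that needs labels, a loss,
            # an optimizer, and weight updates - extra work that does not happen by accident.
            model.train()
            optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
            label = torch.tensor([2])
            loss = nn.functional.cross_entropy(model(example), label)
            loss.backward()
            optimizer.step()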

    • Wait, wait. You want me to suspend disbelief and accept that Google isn't doing anything super shady here just because they said they totally aren't? bro, bro....BRUH!!

      • If you believe that Google's just going to brazenly lie about what they're doing, what's the point of changing the settings at all then?

        In fact, Google is subject to various laws, and to the scrutiny of big corporate customers, both of which could mean big trouble if they flagrantly and wilfully misuse data that's supposed to be private. So yes, if the feature doesn't say the data is being used for training, I tend to believe that it isn't. It at least behooves those who claim otherwise to come up with actual evidence for their claims.
