• thevoiceofra@mander.xyz
    2 months ago

    >put messages into someone else’s system

    >don’t read privacy policy

    >someone else uses your messages

    surprisedpikachu.jpg

    • octopus_ink@lemmy.ml
      2 months ago

      Seriously. What would be surprising is if they were not. Proprietary System gonna Proprietary System.

      • r4venw@kbin.social
        2 months ago

        Your idea doesn’t sound too difficult to implement, but I don’t know if people would want to store all these messages locally when the vast majority of people are used to having their shit stored elsewhere. Additionally, if you wanted to target enterprise users, they would likely want to have all their messages centralised for auditing purposes.

        Other than that, I think it’s a pretty neat idea.

        • sabreW4K3@lazysoci.al (OP)
          2 months ago

          I think that’s the issue. We’re all so used to the idea of free storage and we’re not cognizant of the consequences. If we start holding some of our chips in our own hands, all these corporations won’t be able to sell us out and abuse us so easily.

          Also thank you!

  • blabber6285@sopuli.xyz
    1 month ago

    This was definitely a fuckup from Slack but as I’ve understood it, the “AI training” means that they’re able to suggest emoji reactions to messages.

    Not sure how to think about this, but here’s some additional info from slack: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/

    Edit: Just to quote the main points from the article:

    Slack AI principles to guide us.

    • Customer data never leaves Slack.
    • We do not train large language models (LLMs) on customer data.
    • Slack AI only operates on the data that the user can already see.
    • Slack AI upholds all of Slack’s enterprise-grade security and compliance requirements.
  • Kekzkrieger@feddit.de
    2 months ago

    I know of a few security companies that use Slack to work together, which involves a shitton of private data, source code and confidential information.

    Guess whoever introduced the company to the Slack service fucked up by not reading their policies.

    • dubyakay@lemmy.ca
      2 months ago

      I’m working in fintech, and we share PII through DMs all the time (for investigation purposes). I’d be really surprised if the AI needed to train on that.

  • AlternateRoute@lemmy.ca
    2 months ago

    Interesting how MS is the reasonable one here: all their Copilot stuff clearly separates paying business customers from the free consumer tier for training/not-training purposes.

    Slack, however, has gone and said they will train on everything, and ONLY paying companies can request to opt out.

    Too bad so sad for all those small dev teams that have been using the “free” version of slack… No option to opt out.

  • Stay away from proprietary crap like Discord, Slack, WhatsApp and Facebook Messenger. There are enough FOSS alternatives out there:

    • You just want to message a friend/family member?
    • You need strong privacy/security/anonymity?
      • SimpleX
      • Session
      • Briar
      • I can’t really tell you which one is the best, since I’ve never used any of these (except Session) for an extended period of time. Briar seems to be the best for anonymity, because it routes everything through the Tor network. SimpleX allows you to host your own node, which is pretty cool.
    • You want to host an online chatroom/community?
    • You need to message your team at work?
    • You want a Zoom alternative?
  • tunetardis@lemmy.ca
    2 months ago

    We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
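    The watermarking joke above gestures at a real technique: hiding an invisible tag in text so a verbatim copy can be traced back to its origin. A minimal sketch, using zero-width Unicode characters as the carrier (the encoding scheme and the `origin:` tag format here are made up for illustration):

    ```python
    # Toy text watermark: append a hidden tag encoded as zero-width characters.
    # Invisible when rendered, but recoverable from an exact copy of the text.

    ZW0 = "\u200b"  # zero-width space        -> bit 0
    ZW1 = "\u200c"  # zero-width non-joiner   -> bit 1

    def embed(text: str, tag: str) -> str:
        """Append the tag to the text as invisible zero-width bits."""
        bits = "".join(f"{byte:08b}" for byte in tag.encode("utf-8"))
        return text + "".join(ZW1 if b == "1" else ZW0 for b in bits)

    def extract(text: str) -> str:
        """Recover the hidden tag by collecting the zero-width characters."""
        bits = "".join("1" if c == ZW1 else "0" for c in text if c in (ZW0, ZW1))
        data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
        return data.decode("utf-8")

    marked = embed("Totally normal post.", "origin:example")
    assert marked.startswith("Totally normal post.")   # reads unchanged
    assert extract(marked) == "origin:example"         # tag survives copying
    ```

    Of course, such a mark only survives exact copies: any normalisation that strips zero-width characters (which many text pipelines do) removes it, so this is a toy, not a robust provenance scheme.
    
    
    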