>put messages into someone else’s system
>don’t read privacy policy
>someone else uses your messages
surprisedpikachu.jpg
Seriously. What would be surprising is if they were not. Proprietary System gonna Proprietary System.
Just the other day, @rottingleaf@lemmy.zip and I “designed” a new messenger to combat things like this: https://lazysoci.al/comment/9619656
Your idea doesn’t sound too difficult to implement, but I don’t know if people would want to store all these messages locally when the vast majority are used to having their shit stored elsewhere. Additionally, if you wanted to target enterprise users, they would likely want all their messages centralised for auditing purposes.
Other than that, I think it’s a pretty neat idea.
I think that’s the issue. We’re all so used to the idea of free storage and we’re not cognizant of the consequences. If we start holding some of our chips in our own hands, all these corporations won’t be able to sell us out and abuse us so easily.
Also thank you!
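For what it’s worth, here’s a rough sketch of what the “hold your own chips” idea could look like in code: messages go into a plain SQLite file on your own machine instead of someone else’s server. All the names here (LocalMessageStore, chats.db, save, history) are made up for illustration and aren’t from any actual messenger, the one in the linked thread included:

```python
# Minimal sketch: messages stay in a local SQLite database on the user's
# machine, so no third-party provider ever holds (or mines) the history.
import sqlite3
from datetime import datetime, timezone


class LocalMessageStore:
    def __init__(self, path: str = "chats.db"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            " id INTEGER PRIMARY KEY,"
            " peer TEXT NOT NULL,"
            " sent_at TEXT NOT NULL,"
            " body TEXT NOT NULL)"
        )

    def save(self, peer: str, body: str) -> None:
        # The message never leaves this file unless the user exports it
        # themselves (e.g. for the auditing case mentioned above).
        self.conn.execute(
            "INSERT INTO messages (peer, sent_at, body) VALUES (?, ?, ?)",
            (peer, datetime.now(timezone.utc).isoformat(), body),
        )
        self.conn.commit()

    def history(self, peer: str) -> list[tuple[str, str]]:
        # Returns (timestamp, body) pairs for one conversation, oldest first.
        cur = self.conn.execute(
            "SELECT sent_at, body FROM messages WHERE peer = ? ORDER BY sent_at",
            (peer,),
        )
        return cur.fetchall()


# Usage:
# store = LocalMessageStore()
# store.save("alice", "hi!")
# print(store.history("alice"))
```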
This was definitely a fuckup from Slack, but as I understand it, the “AI training” means they’re able to suggest emoji reactions to messages.
Not sure how to think about this, but here’s some additional info from Slack: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/
Edit: Just to pick the main points from the article:
Slack AI principles to guide us.
- Customer data never leaves Slack.
- We do not train large language models (LLMs) on customer data.
- Slack AI only operates on the data that the user can already see.
- Slack AI upholds all of Slack’s enterprise-grade security and compliance requirements.
AI training to suggest emoji reactions? Really? 😂
I know of a few security companies that use Slack to work together, which involves a shitton of private data, source code and confidential information.
Guess whoever introduced the company to Slack’s service fucked up by not reading their policies.
Or they’re using the paid tier
I work in fintech, and we share PII through DMs all the time (for investigation purposes). I’d be really surprised if the AI would need to train on that.
Interesting how MS is the reasonable one here: all their Copilot stuff clearly separates paying business customers from free consumers for training / not training.
Slack, however, has gone and said they will train on everything, and ONLY the paying companies can request to opt out.
Too bad, so sad for all those small dev teams that have been using the “free” version of Slack… No option to opt out.
Wasn’t there a competitor named Mattermost?
a FLOSS competitor?
Stay away from proprietary crap like Discord, Slack, WhatsApp and Facebook Messenger. There are enough FOSS alternatives out there:
- You just want to message a friend/family member?
  - Signal is the way to go
- You need strong privacy/security/anonymity?
  - SimpleX
  - Session
  - Briar
  - I can’t really tell you which one is the best, since I never used any of these (except for Session) for an extended period of time. Briar seems to be the best for anonymity, because it routes everything through the Tor network. SimpleX allows you to host your own node, which is pretty cool.
- You want to host an online chatroom/community?
- You need to message your team at work?
- You want a Zoom alternative?
In a perfect world, you could convince your company to use anything other than MS Teams and your family would bother to use anything that isn’t WhatsApp or Telegram. Unfortunately, I don’t live in it 😭
Ok sure, it’s more complicated in a corporate environment. But you can easily convince your friends to switch to Signal. I got almost all of my friends and family to use Signal and it’s great.
The problem with Signal is that it’s just not very user-oriented.
Wdym? The user experience is basically 1:1 the same as on WhatsApp
> I got almost all of my friends and family to use Signal
That should be easy, since I’d have to convince one guy to do so. Won’t happen, though
> You just want to message a friend/family member?
We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.