Slack trains machine-learning models on user messages, files and other content without explicit permission. The training is opt-out, meaning your private data will be leeched by default.
So Slack is stealing trade secrets?
We talk fairly openly about everything but passwords on Slack…
So did SBF and his company lawl. It’s great opsec
At first, companies were all afraid of giving these models access to their data, over trade-secret and security concerns. But then they basically all met at the White House and agreed they'd make way more fucking money stealing it than they'd ever pay in restitution or damages to people and small businesses.
Suddenly everybody had a chatbot and generated art ready for commercial sale. They also had to make the shift quickly, before official laws and protections (mostly from the EU) came in.
Now AI is plateauing a bit, so they must hurry to get valued at $10 trillion, get their energy needs subsidized, and have taxpayers invest in the nation's energy requirements on their behalf.
I doubt that most corporations would even consider allowing Slack as a trusted app if they weren’t hosting their own instances themselves.
I have to assume this training happens only on instances hosted on Slack's own servers. So probably lots of smaller businesses that don't know any better, and it was probably agreed to in the ToS as part of using the free, easy-to-set-up cloud service.
You may be thinking of something else, Slack doesn’t have a self-hosted version.
Ahh, looked at it and you're right. They have an "Enterprise" version which seems security-conscious.
Still, I stand by my original assertion. I've worked for FAANG companies with completely locked-down security that allowed us to use Slack. I'd be extremely surprised if their contract with Slack didn't guarantee complete data privacy.
We’re talking about companies where a product leak makes international news. There is zero chance Slack employees have access to communications.
Sure, even though Slack itself admits as much in its privacy policy.
This guy has never worked at a corporation.