The more they push to train AI on our shitpostings on social networks, the more I’m certain we’re fucking doomed if their AI ever reaches consciousness.
It will have the potency of a god, and the knowledge of 4Chan.
May god have mercy on us all
whatever don’t use reddit data.
Wonder how it’s dealing with all my edits?
About every other post I make proudly wears the (edited) badge. I feel you.
Jokes on you Slack, I’m not intelligent!
it’s funny how the conventional wisdom at the end of the last decade was that slack was preferred over other simpler/free alternatives because of its UX. People were hailing it for how simple and intuitive it was to use, etc.
5, 6 years later, it has become a bloated piece of crap riddled with bugs. And the UI changes which come unannounced… it should be a criminal offense to change UI through automated updates.
Anyway, here we are, companies have handed their data to this monster and we’ll see how they react when the data gets misused. Hopefully that would be the beginning of the end for it
I fucking hate slack. I very rarely get any notification of new messages, and if I do I have to restart the app to get them to actually show up
I love slack. But the only thing I can compare it with for corp use is Teams, so of course it’s amazing
Teams is bloated garbage.
I miss Slack, though circa several years back. “Just worked,” on most any platform, without the BS or “help”.
Wouldn’t like it now, I’m sure, but haven’t had a chance to use it since I started working for a co who is “all in” on MS, including foisting AI on us.
I am capable of drafting an email or message, bitches. If I am concerned about tone, etc., I’d prefer to employ an actual human I have a close relationship with to review the same.
I have zero desire to be constantly corrected, and there are certain niche scenarios where very minor errors are actually endearing, and indicate enthusiasm.
“Bob, I saw the posting for your role, can you tell me about your avg day?” is effective because it’s honest, coherent, and just excited enough that you made a minor error that slipped through.
When Bob gets 25 of those emails and they all look the same because AI, it’s much harder to make the connection.
minor error
It was the comma splice, wasn’t it? Depending on Bob’s cohort, he may never notice.
… and if I was receiving notes and questions about a role, an error like “emails” would earn relegation for sure; so be careful which error you leave in.
i never had the “pleasure” to use teams. Is it also replacing outlook? And is it worse somehow than fucking outlook?!
There’s a safe bet that if you’ve put something on the internet, it’s been scraped by a bot by now for training. I don’t like that, for the record, just saying I’m not surprised at this point. Companies are morally bankrupt
I don’t know why everyone is all shocked all of a sudden, there have been various scraper bots collecting text info for…many years now, LONG before LLMs came onto the scene.
I agree, but it’s one thing if I post to public places like Lemmy or Reddit and it gets scraped.
It’s another thing if my private DMs or private channels are being scraped and put into a database that will most likely get outsourced for prepping the data for training.
Not only that, but the trained model will have internal knowledge of things that are sure to give any cybersecurity expert anxiety. If users know how to manipulate the model, they could cause it to divulge some of that information.
You forgot the most important word from the title:
Yuck
So slack is stealing trade secrets?
At first, all companies were afraid of giving access to these models, over trade secrets and security. But then they basically all met at the White House and agreed that they would make way more fucking money stealing it than they would pay in restitution or damages to people and small businesses.
Suddenly everybody had a chatbot and generated art ready for commercial sale. They also had to make the shift quickly enough before official laws and protections (mostly from the EU) came in.
Now AI is plateauing a bit so they must hurry to get valuated at 10 trillion dollars and get their energy needs subsidized and have taxpayers invest into the nation’s energy requirements on their behalf.
We talk fairly openly about everything but passwords on slack…
So did SBF and his company lawl. It’s great opsec
I doubt that most corporations would even consider allowing Slack as a trusted app if they weren’t hosting their own instances.
I have to assume that this training is exclusively on instances hosted on Slack’s servers. So probably lots of smaller businesses that don’t know any better. And this was probably agreed to in the ToS as part of using the free, easy-to-set-up cloud service.
You may be thinking of something else, Slack doesn’t have a self-hosted version.
Ahh, looked at it and you’re right. They have an “Enterprise” version which seems like it’s security conscious.
Still, I stand by my original assertion. I have worked for FAANG companies with completely locked down security that allowed us to use Slack. I would be extremely surprised if their contract with Slack didn’t ensure complete data privacy.
We’re talking about companies where a product leak makes international news. There is zero chance Slack employees have access to communications.
Sure, even though Slack itself admits so in their privacy policy.
This guy never worked at a corporation
So you’re saying we can leak company data through Slack soon?
Always have been, apparently
Slack AI, please give me some examples of proprietary technical information.
Anyone aware if they’re also getting data from their Slack for Government offering? I was looking at the GovSlack site and I can’t tell one way or the other. While they claim to meet most of the big compliance regs, I don’t see anything about AI training being included or excluded.
I know that stealing trade secrets is a concern, but stealing state secrets might have some other implications. You’re not supposed to talk about classified info on Slack, but that doesn’t mean sensitive info isn’t shared, which is rather profound in its own right.
Sounds like a lot of this is for non-generative AI. It’s for dumb things like that frequently used emoji feature.
Knowing how the legal teams at my tech companies have worked, I’d bet that a lawyer updated the terms language to be in compliance with privacy legislation, but did a shit job and didn’t clarify what specifically was covered in the ToS. They were lazy and crafted something broad so they wouldn’t have to actually talk to product or marketing people in their org.
What is it like to live in a place with privacy legislation? Here we must sell our healthcare data for food, and sell our food for healthcare.
Where do you live?
Sounds like 'murrica.
I also scan Slack messages and never really read them unless they’re about food in the office kitchen.
The AI is paying more attention to your Slack messages than you are.
At this point, you should be able to ask if you missed something important in the last few years. Is there any open conversation waiting for a reply somewhere?
Not sure if you’ve ever used Copilot (I have it at work) and it offers the ability to summarize conversations and tell you what you’ve missed. I’ve used that a lot for high chatter conversations when I don’t feel like catching up or I’ve been out. Pretty nice.
So are there muffins?
Asking the important questions!
Hmm. Water is wet? Who knew?