cross-posted from: https://lemmy.ml/post/15741608

They offer a thing they’re calling an “opt-out.”

The opt-out (a) is only available to companies that are Slack customers, not to end users, and (b) doesn’t actually opt you out.

When a company account holder tries to opt out, Slack says their data will still be used to train LLMs, but the results won’t be shared with other companies.

LOL no. That’s not an opt-out. The way to opt out is to stop using Slack.

https://slack.com/intl/en-gb/trust/data-management/privacy-principles

  • FaceDeer@fedia.io
    7 months ago

    > Also pretty sure training LLMs after someone opts out is illegal?

    Why? There have been a couple of lawsuits launched in various jurisdictions claiming that LLM training is copyright violation, but IMO they’re pretty weak, and none of them has reached a conclusion. The writer’s opt-out status doesn’t seem relevant if copyright doesn’t apply in the first place.

      • FaceDeer@fedia.io
        7 months ago

        Nor is it up to you. But the fact remains, it’s not illegal until there are actually laws against it. The court cases that might determine whether current laws apply are still ongoing.

      • Grimy@lemmy.world
        7 months ago

        If copyright applies, only you and Slack own the data. You can opt out, but 99% of users won’t. No users get any money. Google or Microsoft buys Slack, so only they can use the data. We only get subscription-based AI; open source dies.

        If copyright doesn’t apply, everyone owns the data. The users still don’t get any money, but they get free, open-source AI built off their work instead of closed-source AI built off their work.

        Having the website hold copyright over the content in the context of AI training would be a fucking disaster.