Published by Dushan on 3rd June 2024

Opt-in (AI training) should be the default, but it isn't...

There has been quite a furore recently regarding Slack: their use of customer data for AI model training, the lack of clarity around the specifics and boundaries of that use, and the fact that all users were opted in by default.

The direct issues related to this are of concern, but they have already been discussed at length elsewhere and Slack has responded to them.

What we want to talk about today is the final point: opting users in by default.

We believe this practice (and to be clear - Slack is far from the only culprit) does not match any generally accepted definition of consent in the physical world. Yet for some reason companies seem to think this is OK and continue to get away with it...

The situation highlights an important and often overlooked cost of SaaS software at large. It is obvious that in buying any SaaS product you are placing trust in the provider and trading off agency for convenience and speed. What many people don't realise is that the terms you agree to are only true at the point of signature.

Theoretically, any SaaS provider you use could decide tomorrow to modify their terms of use or privacy policy, start training their AI on all of their customers' data, and use the result in any way they choose.

The provider may send an email advising of this change (likely in lengthy legalese), which may or may not be read, or they may rely on an escape hatch in their contract that places the obligation on the user to periodically review the privacy policy for any changes.

In any case, the key issue is that when the new terms come into effect you are opted in; under this model, inaction equals consent.

If the tool holds a lot of your data, is deeply integrated into your workflow, or is difficult to export data from, migrating away from it is impractical, not least because of the time this takes away from your company's actual work. You are effectively coerced into acceptance by the cost of a hard break from the tool.

Companies adopting SaaS need to consider this risk, especially for their sensitive data. They should apply the principle of least privilege wherever possible, implement data minimisation controls, train staff, and actually read those emails from their SaaS providers.

Finally, larger companies with enough financial leverage to make a given provider care (the threshold will vary by provider) should push back on SaaS providers for stricter data privacy terms in their contracts and contractually demand active engagement and affirmative consent any time a change of terms is proposed.

Like any contract, negotiation is always possible...

What about other SaaS AI products?

As we said, this practice of opting consumers in by default is rife within the SaaS landscape; Slack is far from the only example.

It is also worth noting that for many providers, opting out is not possible on a free tier account and your rights may be limited if you are not a paid customer.

Privacy, and how the provider uses your data, need to be at the forefront of decision-making when a company chooses to use any product.

Epilogue - how do I opt out in the case of Slack?

The process is neither automated nor simple (by design, one might argue). It:

  • Must be performed by a workspace owner from their email account registered with Slack
  • Requires sending of a specifically formatted email (apparently in order to match mail filtering rules)
  • Requires then waiting an unspecified amount of time for Slack to action the request, with no timeframe given

For anyone needing to do it, the email format is as follows:

To: feedback@slack.com
Subject: Slack global model opt-out request
Body: <slack workspace url>
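
For anyone managing several workspaces, the request can also be scripted. Below is a minimal sketch in Python that composes and sends the opt-out email over SMTP: the recipient, subject and body follow the format above, while the owner address, workspace URL, SMTP host and credentials are placeholders to substitute with your own. Note the email must still originate from a workspace owner's registered address for Slack to action it.

# Minimal sketch: send the Slack global model opt-out email via SMTP.
# The To address, subject and body follow the format documented above;
# everything else (owner address, workspace URL, SMTP host, credentials)
# is a placeholder to replace with your own values.
import smtplib
from email.message import EmailMessage

OWNER_EMAIL = "owner@example.com"            # workspace owner's registered address (placeholder)
WORKSPACE_URL = "https://example.slack.com"  # your workspace URL (placeholder)

msg = EmailMessage()
msg["From"] = OWNER_EMAIL
msg["To"] = "feedback@slack.com"
msg["Subject"] = "Slack global model opt-out request"
msg.set_content(WORKSPACE_URL)               # body is just the workspace URL

# SMTP details are entirely environment-specific; shown for illustration only.
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login(OWNER_EMAIL, "app-password")  # placeholder credentials
    server.send_message(msg)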

References

  • Slack users horrified to discover messages used for AI training - Ars Technica
  • Slack AI training with customer data - Hacker News
  • How to stop your data from being used to train AI - Wired
  • How Slack protects your data when using machine learning and AI - Slack

Protect Your Data Today

Subrosa uncovers AI usage risks across your organisation and protects your business from shadow AI and data leaks.