
(title is bad, see comments) | ‘AI is reliant on mass surveillance’ and we should be cautious, warns head of messaging app

This was an interview on ABC (Australian public broadcaster) with Signal Foundation president Meredith Whittaker. It covered current events about Signal and encrypted messaging, with a small bit on AI at the end. The original title of the video is bad.

Key points in the video:

  • 1:30 - Should platforms be held responsible for [the content]?
  • 3:15 - (paraphrased) Governments want law enforcement to have access to encrypted communications; why not?
  • 4:15 - (paraphrased) What if people are using it for criminal behaviour?
  • 7:00 - (paraphrased) Random AI section
  • This part of the interview felt relevant to the fediverse (note that this was pasted from a transcript, and you might find it easier to watch the video than read the transcript):

    Australia's safety commissioner recently took on Elon Musk, for example, requesting the removal of vision of a stabbing in a church here in Sydney. It was unsuccessful. Should tech platforms be held responsible for spreading that sort of content?

    Well I think we need to break that question down and actually question the form that tech platforms have taken, because we live in a world right now where there are about five major social media platforms that are very literally shaping the global information environment for everyone. So we have a context where these for-profit surveillance tech actors have outsized control over our information environment, and present a very very attractive political target to those who might want to shape, or misshape, that information environment. So I think we need to go to the root of the problem. The issue is not that every regulator doesn't get a chance to determine appropriate or inappropriate content. The issue is that we have a one-size fits all approach to our shared information ecosystem, and that these companies are able to determine what we see or not, via algorithms that are generally calibrated to increase engagement; to promote more hyperbolic or more inflammatory content, and that we should really be attacking this problem at the root: beginning to grow more local and rigorous journalism outside of these platforms and ensuring that there are more local alternatives to the one-size fits-all surveillance platform business model.