Users are growing increasingly suspicious of their smart speakers, as more and more news stories reveal the extent to which these devices listen to the households around them - and how much of that private information makes its way back to their makers' servers. The idea of faceless employees listening in on our private lives without our knowledge is enough to make anyone doubt the benefits of voice-triggered technology.
But let's face it: we expect this sort of thing from Google and Amazon. Surely Apple is better? In this article we examine whether Apple's privacy practices match up with its policies, and whether Siri is spying on your conversations. For related advice, take a look at How to use Siri.
The privacy advocate
Apple has always presented itself as a champion of privacy for its users. "Privacy to us is a human right - it's a civil liberty," said CEO Tim Cook in an interview with MSNBC last year, whilst also criticising companies such as Facebook for their poor privacy practices in the wake of the Cambridge Analytica scandal. (Facebook's Mark Zuckerberg allegedly provided a suitably mature response by telling his staff not to use iPhones any more…)
Of course, Apple does gather a lot of information about its users - and especially users of its HomePod smart speaker, which uses the Siri voice assistant to listen to your conversations at home in order to answer questions and respond to commands. And you can use Siri on iPhones and iPads, as well as recent Mac models, and even the Apple Watch.
There have been some recent news stories about Amazon's rival Alexa speakers listening in on conversations - and other, er, non-verbal activities you might engage in at home - but Apple has always maintained that Siri is more secure and private than its rivals.
Apple's official privacy policies state that Siri running on a HomePod does not send any voice recordings to Apple until it hears the special command phrase "Hey Siri". It then sends your voice recordings to Apple's servers so they can use voice-recognition technology to respond to your questions and commands.
Other Apple devices, such as the Apple Watch, may send more detailed information to Apple - such as your location, if you ask for the weather report when you're going out for a jog - but that will depend on the specific activity that you're taking part in at the time.
Apple's policy also states that the recordings are sent to Apple using only an 'anonymous ID' - rather than identifying you by name, or your personal Apple ID - and that the recordings are encrypted to protect the privacy of your communications with Apple. And the company insists that it "doesn't gather your personal information to sell to advertisers or other organisations".
But in order to continually improve the quality of its voice-recognition and artificial-intelligence systems, Apple does keep your voice recordings for up to six months. After six months the company may create another copy of the recording that has no personal 'identifier' information, which it keeps for further analysis for up to two years.
However, there's a grey area that crops up with what Apple refers to as "a small subset of recordings" that may be kept beyond that two-year period "for ongoing improvement and quality assurance of Siri".
Do Apple employees listen to my conversations?
Keeping anonymous recordings to improve Siri's voice-recognition sounds reasonable enough - after all, the recordings aren't linked to your Apple ID, so they keep your identity safe, and they're not used for advertising or other commercial purposes. But Apple does submit some of these recordings to a process that it calls 'grading'.
Apple informed Macworld that "graders do not receive the user's random identifier", but a story in the UK's Guardian newspaper recently revealed that grading staff aren't always Apple employees, and can be external contractors working in locations all around the world.
The Guardian was approached by a whistleblower who stated that Siri can sometimes be activated accidentally - perhaps by voices on the television talking about something that sounds like "Siri" (something we've experienced on more than one occasion with our own HomePods). The Apple Watch seems to be particularly vulnerable here, as Siri can also be activated by users who accidentally press the 'crown' button on the watch, as well as by the "Hey Siri" spoken command.
As you can imagine, these 'accidental' recordings can contain very personal details, including the names of family members. The Guardian's informant also claimed that some recordings contain confidential medical or financial information - all of which is made available to grading staff who aren't even employed by Apple.
How can I stop Siri listening to me?
Of course, none of this is particularly unusual: it's known that both Amazon and Google also use human staff to listen to recordings from Alexa and Google Home speakers. But Apple's highly public stance on privacy makes this potential security flaw particularly embarrassing, prompting a spate of "Is Siri spying on me?" headlines across the internet.
To be fair, Apple has responded very quickly. Its traditional response to journalist enquiries is "no comment", but on this occasion Macworld received an entire paragraph in response via email.
"We are committed to delivering a great Siri experience while protecting user privacy," a representative told us. "While we conduct a thorough review, we are suspending Siri grading globally."
The Guardian also confirmed that grading staff employed by Apple in Ireland had been sent home - with no information about their future employment. As well as suspending the grading system, Apple says it will release a future software update that allows users to opt out of grading entirely.
And don't forget that you can turn Siri off altogether as well if you prefer - although there's not much point in buying an expensive HomePod speaker from Apple if you're not planning on using Siri (you can read our reviews of the best HomePod alternatives here).
In the short term, this flaw in Siri is certainly embarrassing for Apple - especially as it attempts to broaden the reach of Siri into home automation in order to catch up and compete against the market-leading Amazon Alexa. But if Apple is true to its word and adopts new, more transparent privacy policies that protect the privacy of its users, then it may be able to set an example that will be followed by Amazon, Google and other companies that are more cavalier in their approach to privacy. In fact, Amazon has already given Alexa users the option to opt out of its grading system, although it has not discontinued the system altogether.
And, as of this morning, our HomePod's response to the question - "Hey, Siri - are you spying on me?" - was a very terse "No!"