It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.
From the link:
Put simply: You take a photo; your Mac or iThing locally outlines what it thinks is a landmark or place of interest in the snap; it homomorphically encrypts a representation of that portion of the image in a way that can be analyzed without being decrypted; it sends the encrypted data to a remote server to do that analysis, so that the landmark can be identified from a big database of places; and it receives the suggested location again in encrypted form that it alone can decipher.
If it all works as claimed, and there are no side-channels or other leaks, Apple can’t see what’s in your photos, neither the image data nor the looked-up label.
It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.
What if I don’t want Apple looking at my photos in any way, shape or form?
I don’t want Apple exfiltrating my photos.
I don’t want Apple planting their robotic minion on my device to process my photos.
I don’t want my OS doing stuff I didn’t tell it to do. Apple has no business analyzing any of my data.
Well, they don’t. I don’t want to justify the opt-in by default but, again (cf my reply history), here they are precisely trying NOT to send anything usable to their own server. They are sending data that can’t be used by anything but your phone. That’s the entire point of homomorphic encryption: even the server it is sent to does NOT see the original data. It can only perform certain computations on it, and it can’t “revert” the result back to the original.
If they don’t look at my data, they don’t even have to encrypt it.
If they don’t try to look at my data, they don’t need to wonder whether they should ask my permission.
I don’t want Apple or anybody else looking at my data, for any reason, is my point.
I agree on permission. Yet I’ll still try to clarify the technical aspect because I find it genuinely interesting and actually positive. The point of homomorphic encryption is that they are NOT looking at your data. They are not encrypting data just to decrypt it. An analogy:
we are a dozen friends around a table,
we each have 5 cards hidden from the others,
we each photocopy 1 of our cards in secret,
we shred the copy, remove half of the shreds, put the rest in a cup, and write a long random number on that cup,
we place that cup in a covered bowl,
one of us, picked at random, draws a cup, counts how many red shards are in it, writes that count on the cup, and adds the number to the total written on the bowl; we repeat that process until every cup has been written on exactly once,
once that’s done, we each take back our own cup without showing it to the others.
Thanks to that process we know both something about our own card (the number of red shards) and something about all the other cards (the total number of red shards written on the bowl), without having actually revealed what our card is. We have done so without sharing our data (the uncut original card), and it’s not possible to recover its content, even if somebody were to take all the cups.
So… that’s roughly how homomorphic encryption works. It’s honestly fascinating and important IMHO, in the same way that cryptography and its foundations, e.g. one-way functions or computational complexity more broadly, are basically the basis for privacy online today.
You don’t have to agree with how Apple implemented it, but I’d argue that understanding how it works and when it can be used is important.
Let me know if it makes sense; it’s the first time I’ve tried to make an analogy for it.
PS: if someone working on HE has a better analogy or spots incorrect parts, please do share.
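The idea in the analogy above can also be made concrete in a few lines of code. This is not Apple’s scheme (per the thread, that one is the lattice-based BFV scheme and far more complex); it’s a toy demo of “computing on data without decrypting it”, using the multiplicative property of textbook RSA with deliberately tiny, insecure numbers:

```python
# Toy demo of a homomorphic property: multiplying two RSA ciphertexts
# yields a ciphertext of the product of the plaintexts.
# Textbook RSA at this size is NOT secure -- illustration only.

p, q = 61, 53                      # toy primes
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

c1, c2 = encrypt(5), encrypt(7)
# A server holding only c1 and c2 can multiply them...
c_product = (c1 * c2) % n
# ...and only the key holder can decrypt the result; the server
# never saw 5, 7, or 35.
assert decrypt(c_product) == 35
```

The same principle, with a much richer set of supported operations, is what lets the thread’s “cups and bowl” arithmetic happen without anyone seeing the cards.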
It makes sense, but you totally miss my point.
To go with your analogy, my point is:
I’m not interested in playing cards
That’s it.
I don’t care how fascinating the technology is or how clever Apple are: they are not welcome to implement it on my device. I didn’t invite them to set up a card game, and I expect them not to break into my house to set up a table.
I wish; sadly, that’s not how non-open-source, non-open-hardware devices work. You are running their software on their hardware, with their limitations. It’s not a PC or an SBC.
Edit: if we were to stick to the card game analogy, it’d be more like playing the card game in a hotel, in a room that you rented, rather than at home.
It’s funny how it feels like my money when I pay for the device at the cash register.
And it should be; unfortunately, it’s not. Maybe right to repair and other laws will, hopefully, change that, but for now it’s bundling, part pairing, and locks all the way down.
TLDR edit: I’m supporting the above comment - i.e. I do not support Apple’s actions in this case.
It’s definitely good for people to learn a bit about homomorphic computing, and let’s give some credit to Apple for investing in this area of technology.
That said:
Encryption in the majority of cases doesn’t actually buy absolute privacy or security, it buys time - see NIST’s criterion of ≥30 years of protection for AES. It will almost certainly be crackable one day, whether through a weakening of the scheme or other advances… How many people are truly able to give genuine informed consent in that context?
Encrypting something doesn’t always work out as planned, see example:
“DON’T WORRY BRO, ITS TOTALLY SAFE, IT’S ENCRYPTED!!”
Yes, Apple is surely capable enough to avoid simple, documented mistakes such as the above, but it’s also quite likely some mistake will be made. And note that Apple is also very likely capable of engineering a leak and concealing it or making it appear accidental (or, even if it were truly accidental, leveraging it later on).
Whether they’d take the risk, and whether their (un)official internal policy would support or reject that, is of course in the realm of speculation.
That they’d have the technical capability to do so isn’t at all unlikely. The same goes for any capable entity with access to Apple infrastructure.
The fact that they’ve chosen to act questionably regarding users’ ability to meaningfully consent, or even consent at all(!), suggests there may be some issues with assuming good faith on their part.
How hard is it to grasp that I don’t want Apple doing anything on my cellphone that I didn’t explicitly consent to?
I don’t care what technology they develop, or whether they’re capable of applying it correctly: the point is, I don’t want it on my phone in the first place, any more than I want them to set up camp in my living room to take notes on what I’m doing in my house.
My phone, my property, and Apple - or anybody else - is not welcome on my property.
Sorry for my poor phrasing; perhaps re-read my post? I’m entirely supporting your argument. Perhaps your main point aligns most with my #3? It could be argued they’ve already begun from a position of probable bad faith by taking this data from users in the first place.
Oh yeah, I kinda missed your last point. Sorry 🙂
What if I don’t want Apple looking at my photos in any way, shape or form?
Then you don’t buy an iPhone. Didn’t they say a year or two ago that they’re going to scan every single picture using on-board processing to look for images and videos that could be child porn and anything suspicious would be flagged and sent to human review?
It’s not that simple. If I don’t want any of my photos scanned, I would have to avoid every iPhone, iPad, and Mac with this feature turned on, effectively meaning I can’t send a photo to anyone using an Apple device.
Correct.
Well, the other cloud services did server-side CSAM scanning long before Apple, and they do it while respecting your privacy less than Apple does.
Apple wanted to improve the process the way the EU wants it, so that no illegal data could be uploaded to Apple’s servers, making Apple responsible. That is why they wanted to scan on devices.
But anyone who has used Spotlight in the last 4 years should have recognised how it finds pictures from words. This is nothing new; Apple Photos has been analysing photos with AI for a very long time.
“Opt out” of looking at my data ✅
Narrator: It doesn’t.
Wait, what?
So you take a pic, it’s analysed, the analysis is encrypted, encrypted data is sent to a server that can deconstruct encrypted data to match known elements in a database, and return a result, encrypted, back to you?
Doesn’t this sort of bypass the whole point of encryption in the first place?
Edit: Wow! Thanks everyone for the responses. I’ve found a new rabbit hole to explore!
Doesn’t this sort of bypass the whole point of encryption in the first place?
No, homomorphic encryption allows a 3rd party to perform operations on encrypted data without decrypting it. The resulting answer is in encrypted form and can only be decrypted by whoever has the key.
Extremely oversimplified example:
Say you have a service that converts dollar amounts to euros using the latest exchange rate. You send the amount in dollars, it multiplies by the exchange rate and then returns the euro amount.
Now, let’s assume the clients of this service do not want to disclose the amounts they are converting. What they could do is pick a large random number and multiply the amount by this number. The conversion service multiplies this by the exchange rate and returns the ridiculously large number back. Then you divide that number by the random number you picked, and you have converted dollars to euros without the service ever knowing the actual amount.
Of course, reality is much more complicated than that, but the idea is the same: you can perform operations on data in its encrypted form without knowing what the data is, nor the decrypted result of the operation.
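The blinding trick described above can be sketched directly. The service name, exchange rate, and numbers here are made up for illustration, and note this is blinding, a simpler cousin of true homomorphic encryption:

```python
# Sketch of the blinding example: the server multiplies whatever
# number it receives by the exchange rate without ever learning
# the client's real amount.
import random

RATE = 0.92  # hypothetical server-side dollar -> euro rate

def server_convert(blinded_amount):
    # The server sees only a meaningless huge number.
    return blinded_amount * RATE

def client_convert(amount):
    r = random.randrange(10**9, 10**12)  # client's secret blinding factor
    blinded = amount * r                 # hide the real amount
    result = server_convert(blinded)
    return result / r                    # unblind the answer

euros = client_convert(250.0)
assert abs(euros - 250.0 * RATE) < 1e-6
```

The server performed the useful computation, yet at no point could it recover the original 250.0 without knowing `r`.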
So homomorphic encryption means the server can compute on the data without actually knowing what’s in it. It’s counter-intuitive but better not think about it as encryption/decryption/encryption precisely because the data is NOT decrypted on the server. It’s sent there, computed on, then a result is sent back.
Wait, it’s called homomorphic encryption? All we’d have to do is tell MAGAs that Tim Apple just started using homomorphic encryption with all the iPhones, and the homophobic backlash would cause Apple to walk this back within a week.
I’m only half joking.
It might still be possible to compare ciphertexts and extract information from there, right? Welp I am not sure if the whole scheme is secure against related attacks.
I don’t think so, at least assuming the scheme isn’t actually broken… but then, arguably, that would also have far-reaching consequences for encryption more broadly, depending on what scheme the implementation relies on.
The whole point is precisely that one can compute without “leaks”.
Edit: they are relying on the Brakerski-Fan-Vercauteren (BFV) HE scheme, cf https://machinelearning.apple.com/research/homomorphic-encryption
IIRC, for this kind of guarantee you need CCA (chosen-ciphertext attack) security. I dunno if this scheme satisfies such a notion.
Dunno either; funnily enough, skimming through https://eprint.iacr.org/2012/144 I noticed the authors are from KUL https://www.esat.kuleuven.be/
The reason I say “funnily enough” is that, just like with e.g. IMEC for chips, some of the foundations of modern technology come from the tiny and usually disregarded country of Belgium.
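On the earlier question of simply comparing ciphertexts: modern schemes are randomized, so two encryptions of the same value look unrelated. A toy ElGamal sketch (tiny, insecure parameters, nothing like the lattice-based setup Apple actually uses) shows the idea:

```python
# Randomized encryption: encrypting the same plaintext twice gives
# different ciphertexts, so an observer can't match them up.
# Toy ElGamal over a tiny prime group -- demo only, NOT secure.
import secrets

p = 467            # small prime modulus
g = 2              # group generator
x = 127            # private key
h = pow(g, x, p)   # public key

def encrypt(m):
    k = secrets.randbelow(p - 2) + 1          # fresh randomness per call
    return pow(g, k, p), (m * pow(h, k, p)) % p

def decrypt(c1, c2):
    s = pow(c1, x, p)                          # shared secret g^(k*x)
    return (c2 * pow(s, p - 2, p)) % p         # divide it out (Fermat inverse)

a = encrypt(100)
b = encrypt(100)
# Same plaintext, (almost certainly) different-looking ciphertexts,
# yet both decrypt to the same value:
assert decrypt(*a) == decrypt(*b) == 100
```

This randomization is what “semantic security” buys: equality of plaintexts is not visible from the ciphertexts alone, which is a prerequisite for the no-leak claim, though, as noted above, it doesn’t by itself rule out chosen-ciphertext attacks.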
I’m not pretending to understand how homomorphic encryption works or how it fits into this system, but here’s something from the article.
With some server optimization metadata and the help of Apple’s private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically-encrypted embedding from the device, and performs the aforementioned encrypted computations on that data to find a landmark match from a database and return the result to the client device without providing identifying information to Apple nor its OHTTP partner Cloudflare.
There’s a more technical write up here. It appears the final match is happening on device, not on the server.
The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate by using high-level multimodal feature descriptors, including visual similarity scores; locally stored geo-signals; popularity; and index coverage of landmarks (to debias candidate overweighting). When the model has identified the match, the photo’s local metadata is updated with the landmark label, and the user can easily find the photo when searching their device for the landmark’s name.
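To make the quoted reranking step concrete, here is a hypothetical sketch. The field names and hand-tuned weights are invented for illustration; Apple’s write-up describes a learned multimodal model, not a fixed weighted sum:

```python
# Hypothetical on-device reranking: the server returns several
# candidate landmarks, and the device picks the best one by
# combining local signals. Fields and weights are illustrative.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    visual_similarity: float  # score from the encrypted nearest-neighbor query
    geo_score: float          # locally stored geo-signal (e.g. photo GPS)
    popularity: float         # prior for well-known landmarks

def rerank(candidates):
    # Toy linear scoring; a real system would learn these weights.
    def score(c):
        return (0.6 * c.visual_similarity
                + 0.3 * c.geo_score
                + 0.1 * c.popularity)
    return max(candidates, key=score)

best = rerank([
    Candidate("Eiffel Tower", 0.81, 0.10, 0.99),
    Candidate("Tokyo Tower", 0.84, 0.95, 0.70),
])
# Local geo-signals can override a near-tie in visual similarity.
assert best.name == "Tokyo Tower"
```

The point the quote makes is that this final choice, and the photo metadata update, happens on the device; the server only ever supplies encrypted candidates.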
That’s really cool (not the auto opt-in thing). If I understand correctly, that system looks like it offers pretty strong theoretical privacy guarantees (assuming their closed-source client software works as they say, with sending fake queries and all that for differential privacy). If the backend doesn’t work like they say, they could infer which landmark is in an image by finding the approximate minimum distance to embeddings in their DB, but with the fake queries they can’t be sure which query is real. They can’t see the actual image either way, as long as the “128-bit post-quantum” encryption algorithm doesn’t have any vulnerabilities (and the closed-source software works as described).
by using high-level multimodal feature descriptors, including visual similarity scores; locally stored geo-signals; popularity; and index coverage of landmarks (to debias candidate overweighting)
…and other sciencey-sounding technobabble that would make Geordi La Forge blush. Better reverse the polarity before the dilithium crystals fall out of alignment!
Heh, though that’s all legit, right?
That’s the point. It’s a list of words that may or may not mean something, and I can’t make an assessment of whether or not it’s bullshit. It’s coming from Apple, though, and it’s about privacy, which is not good for credibility.
I don’t know what a geo-signal is, but everything else listed there makes perfect sense given the context.
Maybe they “encrypt” it in jpg? XD