
At the end of last week, we reported on a notable announcement: a new system for detecting images depicting child abuse. Specifically, Apple will scan all photos stored in iCloud and, when such material is detected, report the cases to the relevant authorities. Although the system works "securely" on the device itself, the company still drew criticism for violating user privacy, a concern voiced among others by the well-known whistleblower Edward Snowden.

The problem is that Apple has long built its reputation on protecting user privacy under all circumstances, and this move directly contradicts that stance. Apple users are effectively presented with a fait accompli and must choose between two options: either allow a special system to scan all the pictures stored in iCloud, or stop using iCloud Photos altogether. The mechanism itself is fairly simple. The iPhone downloads a database of hashes and compares them against the user's photos. In addition, the system will also extend into Messages, where it is meant to protect children and alert parents to risky behavior in time. The concern stems from the fact that the database itself could be abused, or, worse, that the system might eventually scan not only photos but also messages and all other activity.

How it all works

Of course, Apple had to respond to the criticism as quickly as possible. It therefore released an FAQ document and has now confirmed that the system will scan only photos, not videos. Apple also describes its approach as more privacy-friendly than what other tech giants are using. At the same time, the company explained in more detail how the whole process actually works: if a photo on iCloud matches an entry in the database, a cryptographic safety voucher is created to record that fact.
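The matching step described above can be sketched roughly as follows. This is a simplified illustration only: Apple's real system uses a perceptual "NeuralHash" and private set intersection so that neither side learns non-matching hashes, whereas this sketch uses an ordinary SHA-256 digest and a plain set lookup purely to show the idea of comparing photos against a hash database.

```python
import hashlib

# Hypothetical stand-in for the hash database distributed to the
# device; real entries would be perceptual NeuralHash values, not
# SHA-256 digests of exact bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def matches_database(photo_bytes: bytes) -> bool:
    """Return True if the photo's hash appears in the database."""
    digest = hashlib.sha256(photo_bytes).hexdigest()
    return digest in KNOWN_HASHES

# On a match, the device would attach a cryptographic safety voucher
# to the upload; per Apple, only after a threshold number of matches
# can the vouchers be decrypted for human review.
print(matches_database(b"known-image-bytes"))  # True
print(matches_database(b"holiday-photo"))      # False
```

Note that an exact-bytes hash like SHA-256 would be trivially defeated by changing a single pixel, which is precisely why the real system relies on a perceptual hash that tolerates minor edits.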

As mentioned above, the system will also be relatively easy to bypass, which Apple itself has confirmed: simply disabling iCloud Photos sidesteps the verification process entirely. But that raises a question: is it worth it? In any case, the good news is that the system is being rolled out only in the United States, at least for now. How do you view this system? Would you support its introduction in the countries of the European Union, or is it too great an intrusion into privacy?
