
One of the biggest strengths of Apple's operating systems is their security and emphasis on privacy. At least, that is how Apple presents itself when it promises maximum protection to its users. And indeed, these systems offer a number of handy privacy features: Sign in with Apple, App Tracking Transparency, iCloud+, tracker blocking in Safari, secure password storage and more. iOS protection is, in fact, so robust that even Apple itself cannot break it.

Apple fans have known this since the December 2015 terrorist attack in San Bernardino, California, after which the American FBI asked Apple to develop a tool for unlocking any iPhone without knowing the passcode. Police had confiscated the iPhone 5C of one of the shooters, but had no way to get into the phone. Apple refused to develop such a tool: according to the company, creating a backdoor would open up numerous avenues for breaching the protection, effectively making every iPhone vulnerable.

Will Apple unlock the backdoor to iPhones?

In any case, Apple showed us years ago that it does not take its users' privacy lightly, and the incident strengthened the company's reputation in this regard. But did Apple do the right thing? The truth is that the situation is not exactly simple. On one side stands possible help with a criminal investigation; on the other, a possible threat to the entire iOS operating system. As we mentioned above, though, the Cupertino giant took a firm position on the matter and has not changed it. And the concerns are indeed justified. If the company itself could unlock literally any iPhone, regardless of the strength of the passcode or the biometric authentication settings (Face ID/Touch ID), that capability could easily be abused. A single small mistake would be enough for it to fall into the wrong hands.

That is why it is important that there are no backdoors in these systems. But there is a small catch. Many Apple fans complain that the introduction of a backdoor of sorts is approaching anyway, pointing to the rollout of CSAM protection. CSAM, or Child Sexual Abuse Material, is material depicting the abuse of children. Last year, Apple unveiled plans for a feature that would scan messages and detect such content; images stored in iCloud (in the Photos app) were to be scanned in the same way. If the system found sexually explicit material in the messages or photos of younger children, Apple would warn the parents if the children tried to send the material on. This feature is already running in the United States.

The introduction of this protection provoked a strong reaction from Apple fans

Protecting children or breaking the rules?

It was this change that sparked a heated discussion about security. At first glance, something like this seems like a great feature that can genuinely help children at risk and catch a potential problem in time. The scanning of photos is handled by a "trained" system that can detect the sexually explicit content in question. But what if someone abuses the system itself? They would then have a powerful weapon for persecuting practically anyone, and in the worst case, a convenient tool for suppressing specific groups.

In any case, Apple argues that it designed this feature with its users' privacy foremost in mind. Photos are therefore not compared in the cloud, but directly on the device, using encrypted hashes. But that is not the point at the moment. As mentioned above, even a well-intentioned idea can be easily misused. So is it possible that in a few years privacy will no longer be such a priority? For now, we can only hope that something like that never happens.
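To make the on-device comparison more concrete, here is a minimal sketch of the general idea: an image's hash is computed locally and checked against a set of known hashes, so only fingerprints are compared, never the photos themselves. This is an illustrative simplification, not Apple's implementation; Apple's actual system uses NeuralHash, a perceptual hash robust to resizing and re-encoding, combined with cryptographic matching techniques, whereas the plain SHA-256 used below only matches byte-identical files. The blocklist values and function names here are made up for the example.

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes (placeholder value:
# this happens to be the SHA-256 digest of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_fingerprint(data: bytes) -> str:
    """Stand-in for an on-device image hash (real systems use perceptual hashes)."""
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(data: bytes) -> bool:
    # The check runs entirely on the device: only the fingerprint is
    # compared against the list, the image content is never uploaded
    # for this comparison.
    return image_fingerprint(data) in KNOWN_HASHES

print(matches_blocklist(b"test"))           # matches the placeholder entry
print(matches_blocklist(b"holiday photo"))  # an ordinary photo does not match
```

The key design point the sketch illustrates is that a hash lookup reveals nothing about non-matching photos, which is the basis of Apple's privacy argument; the counter-argument in the text is that whoever controls the blocklist controls what gets flagged.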
