The FBI, however, could not get into the iPhone because of its security features. Sure, they could have tried to brute-force the passcode by guessing it over and over. However, a security feature was enabled on the phone that erased all of its data after ten incorrect passcode attempts. The FBI then handed the phone over to the NSA to see if they could unlock it, and even they could not.
So, the federal government asked Apple if there was some way to get into the iPhone without entering the passcode. Or, if that was not possible, whether Apple could develop a special version of its operating system that could be loaded onto the phone to disable the security features preventing the FBI from brute-forcing their way in. Apple then proceeded to basically tell the government to screw off. Funnily enough, the US government was not the only government Apple was squaring off against at the time, as it was also caught up in a heated legal battle in Europe over billions of dollars in taxes that regulators said had gone unpaid to Ireland.
But enough about Apple’s legal issues; the really interesting part of this story is the precedent Apple would have set if it had complied with the FBI and given them a backdoor (a way to bypass normal security to access a program or system) into the iPhone. Nothing like this had ever knowingly happened in the US. While it might not seem like a huge deal, the government was essentially asking a private company to hand over a method that could be used to access anything on its product. That is a pretty big favor to ask. If Apple complied, the federal government could then point to this case as an example of why more companies should just give it backdoors to all of their products. Quite a slippery slope.
See, the thing about backdoors is that they are great for developers and QA testers. A backdoor lets someone quickly reach any part of the program without going through a security check, so a developer does not have to log in to their account every time they want to check whether a new feature works or a bug was fixed, as that would just be a waste of time. The problem, however, is when someone who is not supposed to know about the backdoor finds out about it. For example, let’s say that in your application, a user must log in with a valid email and password and then set up two-step verification through their phone. So, every time they want to log in, they have to enter the right email, the right password, and the correct code that was sent to their phone. That gets pretty annoying if you are constantly trying to fix something in the program and have to keep logging in to see whether the fixes worked. So, instead of going through the entire process over and over, you set up a backdoor: just type “!=test932742” into the username field and press enter. And voilà, you are in the application with administrator privileges. Now, imagine a random user finding out that “!=test932742” grants administrator privileges over the application. Anyone want to guess how long it takes until a troll face appears somewhere in the application?
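To make that concrete, here is a minimal sketch of what such a hardcoded backdoor might look like. Everything in it (the function name, the fake user store, the credentials) is invented purely for illustration; only the magic string comes from the example above.

```python
# Hypothetical login check with a developer backdoor baked in.
# All names here are made up for illustration.

BACKDOOR_TOKEN = "!=test932742"  # the magic string from the example above

# Stand-in for a real user database and a real two-step code service.
USERS = {"alice@example.com": {"password": "hunter2", "code": "123456"}}

def check_login(email, password, code):
    """Return a privilege level for a login attempt, or None on failure."""
    # The backdoor: entering the magic string as the email skips every
    # check and grants admin rights. Handy for testing, disastrous if leaked.
    if email == BACKDOOR_TOKEN:
        return "admin"
    user = USERS.get(email)
    if user is None or user["password"] != password:
        return None  # wrong email or password
    if user["code"] != code:
        return None  # two-step verification failed
    return "user"
```

The danger is visible right in the code: anyone who learns the magic string gets administrator access without ever touching the password or two-step checks.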
But developer-created backdoors aren’t that big of a problem. A well-made backdoor that a developer knows about can easily be closed or maintained to ensure that no one else can use it to access the program. Externally created backdoors, however, can be a huge issue. Hackers and others wishing to infiltrate someone’s computer or a particular application can create their own backdoors into the system. Typically, they trick an unsuspecting user into installing a program that serves as a backdoor, because it contains code that weakens the application’s security settings or disables them altogether. This can be absolutely devastating if it lets the attacker gain privileges that allow them to view personal information, like credit card numbers or social security numbers, as that could lead to financial ruin and identity theft for many people. Much of the successful malware of the early 2000s established a backdoor on the victim’s system that gave the attacker ongoing access to the user’s Microsoft Outlook contacts. Emails containing the malware could then be sent to those contacts, spreading it further whenever someone opened the email and downloaded the attachment.
As you can see, backdoors are useful, but they can also be used maliciously. It is a safe bet that most major companies have some kind of backdoor that lets them access their device or program without going through normal security measures; backdoors save time and are great for troubleshooting. However, a developer must keep the backdoor confidential or remove it once it is no longer needed. On top of this, developers must also watch for loopholes or bugs in their code that can create unintentional backdoors for outside users. Interestingly, this is actually how the FBI ended up gaining access to that iPhone: they hired professional hackers to find a way to bypass the mechanism that erased the iPhone’s contents after ten failed passcode attempts.
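One common way to follow that advice, keeping a test backdoor confidential and out of users’ reach, is to gate it behind an environment flag so it simply does not work outside development. A minimal sketch, where the `APP_ENV` flag name is an assumed convention and the magic string is the one from the earlier example:

```python
import os

def backdoor_allowed(email):
    """Only honor the test backdoor when the app is explicitly in development."""
    # Assumed convention: APP_ENV is set by the deployment environment.
    debug_build = os.environ.get("APP_ENV") == "development"
    return debug_build and email == "!=test932742"
```

In production, where `APP_ENV` is anything else, the magic string is just another invalid username. Even better is stripping the backdoor code out of release builds entirely, so there is nothing for an attacker to discover.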
So what did we learn today? First, that the NSA and the FBI somehow can’t bypass an iPhone’s security but can bypass other governments’ security. Second, that Apple has been and continues to do some shady things with its taxes, which should surprise no one. And third, that backdoors are a great tool developers use when testing their software, but one that can be compromised if a developer is careless.