Apple-1 to iPhone: jailbreaking and firmware hacking
I have heard many times online that companies like Apple are greedy: that they deliberately slow their operating system down version by version, that they intentionally block users from downgrading to an older version so they can sell more hardware, and that they make it difficult to jailbreak their phones out of greed. So, how correct are these comments?
Let’s go back in time to the 1970s, when Apple was one of the pioneers in making computers for individuals. Before that, a computer was something you went to the office to use: you sat at a terminal connected to a central machine. And if you didn’t have a terminal, each person could only use the mainframe for a short, reserved slot of time, a scheme called time-sharing.
Back then, the idea of having a personal computer at home was a breakthrough. Steve Wozniak (Woz) was a brilliant electronics engineer who invented many ways to squeeze more out of fewer chips, and he built a personal computer that was very cheap yet very powerful for its time. Woz showed his computer off at the Homebrew Computer Club, along with all the schematics, so that anyone who wanted to build a second one could go buy the chips themselves. Steve Jobs, the less technical of the two, pushed Woz to commercialize and sell the machine. The two released the Apple I for $666.66, sold at the Byte Shop on El Camino Real in Mountain View, California (that is, in “Silicon Valley”).
Back then, commercial computers often came with schematics so that anyone could fix the machine themselves when it failed. The software was also very easy to understand, and you could copy as much of it as you liked. Computers shipped with little more than a BASIC interpreter (the Altair’s, famously, was written by Bill Gates) and a command line. Anyone who wanted to run a program had to type its source code in from a magazine and run it. With hardware and software like that, you could write anything for yourself and run whatever program you wanted. And if you wanted a faster, more powerful computer, or to understand its working mechanism, you could just buy a few more chips.
As computers became more commercialized, two things happened. First, to reduce size and cost, manufacturers moved from hand-wired boards to integrated circuits, so even a reasonably knowledgeable person could hardly understand how a computer worked anymore: its level of integration had soared. Second, with the arrival of video game consoles, software became fertile ground for business.
To stop software from being copied indiscriminately, companies tried all kinds of ways to prevent piracy. First they tried copy protection on the software itself, for example prompting the user to enter a serial number to unlock it. Then makers of video game hardware ran into the same problem: it turned out that even copying the chip containing a game’s program wasn’t too difficult. If anyone could copy a game, nobody would dare invest in writing games, and nobody would buy a console to play them. Nintendo solved this by putting a chip in each of two places: one in the game cartridge, the other in the console. Roughly, the chip on a genuine cartridge answers a random “question” posed by the chip in the console; only when that handshake passes does the game program run. Since only Nintendo could produce the chip that went on the cartridge, they simply collected money from the game makers, and the game makers had one less headache worrying about their software being copied.
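The cartridge-and-console handshake described above can be sketched in a few lines. This is a toy model only: the real Nintendo lockout chips ran matching programs in lock-step, while here a shared secret and an HMAC stand in for the “question and answer”, and all the names are made up for illustration.

```python
# Toy sketch of a lockout-chip handshake, loosely modeled on the idea
# behind Nintendo's cartridge/console chips. A shared secret plus HMAC
# stands in for whatever the real silicon computed.
import hashlib
import hmac
import secrets

SHARED_SECRET = b"only-the-console-maker-has-this"  # hypothetical

def cartridge_chip(challenge: bytes) -> bytes:
    """The chip on a genuine cartridge answers the console's challenge."""
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

def console_accepts(cartridge) -> bool:
    """The chip in the console asks a random 'question' and checks the answer."""
    challenge = secrets.token_bytes(16)
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(cartridge(challenge), expected)

print(console_accepts(cartridge_chip))          # True: genuine cartridge
print(console_accepts(lambda c: b"\x00" * 32))  # False: copied game, no real chip
```

A pirate who copies only the game ROM cannot answer the random challenge, so the console refuses to run the game even though the program bytes are identical.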
Once anti-copying mechanisms appeared, a new phenomenon emerged: a computer or game console could refuse to run a program it did not “believe” was authentic. Those were the ways hardware and software were protected 20–30 years ago, and they only applied where the software wasn’t tied to one piece of hardware (a game console, for example, runs many different game cartridges).
With other devices such as phones and PDAs, there was less to worry about, since each phone ended up having only one party writing software for it: the hardware manufacturer. Once the whole package was sold, copying the software it ran was not something the manufacturer cared much about. Furthermore, phones in the 1990s and early 2000s didn’t have an “app store”; they ran embedded software that made calls, received messages, and played games like Snake. The chips in those phones simply read the program from memory and ran it, without caring who wrote it.
The iPhone was a splash of cold water in the face of every other phone manufacturer. The phone suddenly became a treasure trove of applications, personal information, and the secrets of its owner. It turned out to be one of the most complex, powerful computing devices everyone now had in the palm of their hand. If a program with high privileges falls into the wrong hands, you could lose everything from money to personal information. So securing the information on a phone became one of the most important tasks for a phone manufacturer.
The problem is that the trust model of phones like the old Nokias and Siemens, running whatever they read from memory, becomes very dangerous here. If someone installs spyware on your smartphone, before it reaches you or while it’s in your hands without you knowing, it is a disaster not only for you but also for the phone manufacturer.
So these phones have a self-defense mechanism reminiscent of the old Nintendo console: a special component inside the processor. Its job is to check the digital signature of the software in the memory chip, and only when that signature is one it “believes” does the phone continue to run the program. This ensures your information is safe: no one other than the phone manufacturer can silently run malicious code on your phone. That a manufacturer can keep billions of phones maximally safe, regardless of price or location, with no one afraid of having their personal information exposed by malicious code, is a very respectable feat. All of it rests on the invention of the digital signature.
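The check can be sketched with textbook RSA. This is purely illustrative, with tiny primes that are wildly insecure; real phones use hardened asymmetric schemes (RSA-2048, ECDSA, and the like) with the public key baked into the silicon, and the exact details are not public. The key values and firmware string below are made up.

```python
# Toy sketch of how a boot ROM might verify a firmware signature before
# running it. Textbook RSA with tiny primes, for illustration only.
import hashlib

# Hypothetical manufacturer key pair.
p, q = 61, 53
n = p * q        # 3233: public modulus, baked into the boot ROM
e = 17           # public exponent, also in the boot ROM
d = 2753         # private exponent, kept on the factory's signing server

def digest(firmware: bytes) -> int:
    # Hash the firmware, reduced mod n so it fits our toy key.
    return int.from_bytes(hashlib.sha256(firmware).digest(), "big") % n

def sign(firmware: bytes) -> int:
    # Done once, by the manufacturer, with the private key.
    return pow(digest(firmware), d, n)

def boot_rom_check(firmware: bytes, signature: int) -> bool:
    # The chip only "believes" code whose signature opens under the public key.
    return pow(signature, e, n) == digest(firmware)

fw = b"OS build 17.0 (hypothetical)"
sig = sign(fw)
print(boot_rom_check(fw, sig))            # True: signed firmware is allowed to boot
print(boot_rom_check(fw, (sig + 1) % n))  # False: forged signature, refuse to boot
```

The crucial property is asymmetry: the phone only carries the public half (`n`, `e`), so even someone who fully dumps the chip cannot produce a signature that the boot ROM will accept.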
But the software that ships with a phone is never perfect, especially in a complex environment with many different software components. Sometimes the phone’s software has bugs that cause it to run code the manufacturer has no control over. One result is that you can use such a bug to run software of your own on the phone; this is called a jailbreak. The good side of a jailbreak is that you can run anything you want, change anything you want, and install your own software or anyone else’s. The bad side is that you are no longer running software inside the protective shell of the operating system the manufacturer wrote. Malicious code can pretend to be the phone itself or a legitimate program, and once that happens, every protection the manufacturer gave you is gone.
To prevent this, manufacturers release operating system updates periodically, or whenever new bugs are discovered. When you upgrade to a fixed version, manufacturers often add an extra layer of protection: blocking downgrades back to the old version. Because if the operating system could be downgraded, a bad actor could roll your bug-free operating system back to the buggy version and exploit the published bug to install malware on your phone or device.
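The downgrade block boils down to a monotonic counter: a minimum-version value that can only ever go up. On real hardware this is often stored in fuses or secure storage; the sketch below is a minimal model with made-up names, and the signature check that would precede it is omitted.

```python
# Minimal sketch of anti-rollback: the device keeps a counter that can
# only increase and refuses any firmware older than it. Illustrative
# only; not any vendor's actual scheme.
class Device:
    def __init__(self):
        self.min_version = 0  # on real silicon: eFUSEs or secure storage

    def install(self, fw_version: int) -> bool:
        # A signature check would happen first; omitted for brevity.
        if fw_version < self.min_version:
            return False      # downgrade to a known-buggy build: rejected
        self.min_version = fw_version
        return True

d = Device()
print(d.install(16))  # True: first install
print(d.install(17))  # True: upgrade, counter moves up to 17
print(d.install(16))  # False: downgrade blocked, even though 16 is validly signed
```

Note the subtlety: version 16 is still genuinely signed by the manufacturer, so the signature check alone cannot stop an attacker from reinstalling it. Only the one-way counter can.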
Blocking OS downgrades is thus a side effect of the operating system’s security mechanism, not usually some malicious intent of the manufacturer. What we have today is the result of decades of evolution, step by step, from open commercial machines to devices that secure the personal information of billions of people.
There are many people who wish to run self-written software on their computers, phones, and other smart devices. To satisfy this need, some manufacturers let you press a physical or electronic button, after which the machine runs 1984-style: it runs whatever you put on it, without checking the manufacturer’s signature. This is called unlocking the bootloader. Google’s phones and laptops often have this feature. To keep the user safe, whenever this function is turned on, the phone or computer greets you with an “intimidating” warning screen at start-up, and there is no way to turn that screen off, because it is possible you are running malicious code without knowing it. This is a relatively smart solution to the computer trust problem.
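The unlocked-bootloader behavior can be sketched as a single flag that changes what happens when the signature check fails. This is a rough simplification of what Android-style verified boot does; the HMAC below merely stands in for a real public-key signature, and every name here is illustrative.

```python
# Sketch of a bootloader honoring an "unlocked" flag: locked devices only
# boot signed images; unlocked devices boot anything, but always show a
# warning first. Simplified and hypothetical.
import hashlib
import hmac

OEM_KEY = b"oem-signing-key"  # stands in for the manufacturer's real key

def oem_sign(image: bytes) -> bytes:
    return hmac.new(OEM_KEY, image, hashlib.sha256).digest()

def boot(image: bytes, signature: bytes, unlocked: bool) -> str:
    if hmac.compare_digest(signature, oem_sign(image)):
        return "booting signed OS"
    if unlocked:
        # The "intimidating" screen the user cannot disable: they must
        # know the device may be running unverified code.
        return "WARNING: unverified OS, booting anyway"
    return "refusing to boot"

img = b"self-written OS"
print(boot(img, b"\x00" * 32, unlocked=False))  # refusing to boot
print(boot(img, b"\x00" * 32, unlocked=True))   # WARNING: unverified OS, booting anyway
print(boot(img, oem_sign(img), unlocked=False)) # booting signed OS
```

The design point is that unlocking never silently weakens the device: the warning is wired into the boot path itself, before any user-replaceable software runs, so malicious code cannot suppress it.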
Unfortunately, Apple’s phones do not allow users to unlock the bootloader; they only give users the option of running the software that Apple has signed. But that, in my opinion, is a good choice for most of the people using their phones.
One thing you might be wondering: how do Apple and other hardware makers write and test software on their own phones, when even their own engineers don’t have the signing keys? The usual modern solution is that the CPU contains a number of electronic “fuses,” called eFUSEs. On the real production line, a command burns the fuse, and from then on the phone will only run code carrying the right digital signature. During development, engineers work on devices whose fuse has not been burnt, so the hardware freely runs whatever it is given. These are called prototype units, and anyone who has one in hand can learn a great deal about the hardware and software from it. That’s why prototype hardware can be worth a lot of money.
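The eFUSE idea can be modeled as a one-time-programmable bit that gates the signature check. The sketch below is purely conceptual with invented names; on real silicon, blowing the fuse is a physical, irreversible change.

```python
# Toy model of an eFUSE gating secure boot: prototype units leave the
# fuse intact and run anything; blowing it on the production line
# permanently enables signature enforcement. Purely illustrative.
class EFuse:
    def __init__(self):
        self.blown = False

    def blow(self):
        self.blown = True  # physically irreversible on real hardware

class Cpu:
    def __init__(self):
        self.secure_boot_fuse = EFuse()

    def run(self, image_signed_ok: bool) -> bool:
        if not self.secure_boot_fuse.blown:
            return True            # prototype hardware: runs anything
        return image_signed_ok     # production hardware: signed code only

proto = Cpu()
print(proto.run(image_signed_ok=False))  # True: fuse intact, unsigned code runs

prod = Cpu()
prod.secure_boot_fuse.blow()             # done once, on the production line
print(prod.run(image_signed_ok=False))   # False: unsigned code refused
print(prod.run(image_signed_ok=True))    # True: signed code still runs
```

Because the fuse only moves in one direction, there is no software command that can turn a production phone back into a prototype, which is exactly why leaked prototype units are so sought after.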
The world of computers is a world of precision, of perfection. A design that is 99.999% secure is not good enough: for a product like the iPhone to serve billions of people, from politicians to ordinary civilians, a phone must aim to be 100% secure. When we don’t understand something, it is easy to accuse someone of deception and greed.