Introduction to Linux Operating System
UNIX, the “mother ship” of the Linux operating system, is like a continuous flow of hot magma: always evolving. Since its inception in the 1960s, it has undergone drastic changes that have made it a favorite among developers of both desktop software and mobile applications. Perhaps this is due to its open-source nature.
An operating system is the suite of programs behind the workings of your personal or work computer. There is a raging debate in the tech world as to whether there is any real difference between Linux and UNIX. Linux, or UNIX if you prefer, is a stable, multitasking, multi-user operating system for laptops, servers, and desktops. Additionally, if you are a Windows user (the Cain to Linux’s Abel), you will be relieved to learn that Linux has a GUI (graphical user interface) similar to the working environment of Microsoft Windows, which makes it easier for users to navigate and use. However, it is important to note that some operations are out of reach of this GUI. Knowledge of the UNIX command line is therefore necessary, both for commands and operations the GUI does not cover, and for situations, such as a telnet session, when no GUI is available.
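As a small taste of life beyond the GUI, here are a few of the most common commands you would type at a terminal prompt. These are standard on virtually every Linux and UNIX system:

```shell
# Print the path of the directory you are currently working in
pwd

# List the contents of the current directory, one entry per line with details
ls -l

# Print the name of the user you are logged in as
whoami

# Print the name and release of the operating system kernel
uname -sr
```

Each command prints its answer and returns you to the prompt, ready for the next one; chaining small tools like these is the heart of working on the UNIX command line.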
To get a better understanding of Linux and, in effect, UNIX, let us go back in time to the circumstances that led to the birth of this OS.
In our time machines
The year is 1960-something… Afros and bell-bottoms are the “in” thing. Computers are the preserve of big tech companies and, to make it worse, they are as big as Noah’s ark. Despite their size, this is by no means the most pressing problem that the “geeks” of the day are battling. Again, think of the computer as Noah’s Ark: the lions and the gazelles need different chambers, and so it was with computers back then. Each computer had a different operating system. This means that, unlike today, when you can own different computers all running the same system, back then each computer had to have its own operating system to serve a specific function or purpose (think of it this way: in today’s setting, you would need one computer to type on and another to watch movies on). To top it off, being an expert in one system did not automatically make you an expert in any of the others. It is in these difficult and tumultuous times that scientists from Bell Labs decided that enough was enough. In 1969, they set out to develop a new operating system with three goals:
#1 It had to be elegant and simple.
#2 It had to be written in a programming language called C rather than in the commonly used assembly code.
#3 It had to be able to recycle the code it generated.
After the development, the Bell scientists named their brainchild “UNIX”. UNIX was a “mass mover” mainly because it was the only system able to recycle code; every other system had been developed for a single machine. To run on a given machine, UNIX needed only one piece of special code, popularly known as the kernel.
The kernel is the base of the UNIX system and is the piece of code that is adapted to a specific computer and its functions. In essence, what UNIX did was revolutionize things as they were then: the operating system and all the other functionalities of a computer were written in the C language around the kernel.
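The kernel is not an abstraction; on any Linux machine you can inspect the one you are running right now. A quick sketch from the command line (note that the /proc entry is Linux-specific):

```shell
# Print the release number of the kernel currently running
uname -r

# On Linux, the /proc pseudo-filesystem exposes a fuller version string
# for the running kernel
cat /proc/version
```

The release number you see (for example, something like 5.15.0) identifies exactly which build of that machine-specific piece of code your whole system is written around.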
The C language was created specifically for the UNIX system. It proved flexible enough to allow operating systems to be built for different hardware. It is important to note that in its early days, UNIX was not so much a home system; it was used in big organizations with mainframes and minicomputers, such as government agencies and universities. It was also in this environment that smaller computers were developed. At this stage of development, several versions of UNIX were available, but they were slow and not really free, which led to the rise of MS-DOS on home computers.
We fast-forward to the 90s, when computers got powerful enough to run a full UNIX system. A young man called Linus Torvalds is studying computer science at the University of Helsinki, and he thinks to himself, “hmm… would it not be nice if there were a free academic version of UNIX?” Being a computer science student, and an inquisitive one at that, he started coding and asking many questions about UNIX. Most of his questions revolved around getting a UNIX system running on his own PC. There was much correspondence between him and his fellow “netlanders”, the most famous exchange being his post to comp.os.minix in 1991.
From that correspondence, we can see that from the beginning it was Linus’ goal to create a system compliant with the original UNIX, but free. We can tell this from the fact that he asked about the POSIX standards, POSIX being the standard interface for UNIX systems. Back in those days, plug and play was not yet “a thing”. However, this did not stop many people from showing a keen interest in owning and operating a UNIX system. What Linux is today is thanks to the people back then who were very keen on making sure that every new driver for new hardware was tested against Linux. This ended up causing new code to be released at an amazing speed.
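That POSIX compliance Linus aimed for is still easy to check today. On most Linux and UNIX systems, the getconf utility (itself specified by POSIX) will report which revision of the standard the system claims to support:

```shell
# Print the POSIX revision this system conforms to;
# a value such as 200809 indicates the POSIX.1-2008 edition
getconf _POSIX_VERSION
```

The exact number varies by system and age, but the fact that this one query works almost everywhere is the legacy of the standardization Linus asked about in that 1991 post.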
Now that we have looked at the somewhat unexpected birth of Linux, it is about time we got down to the inner workings of this operating system.