Operating System Design/Process

A process is more than just a copy of a program in memory. An executable program is like a recipe: a series of instructions that the computer follows in order to accomplish a task. When a cook prepares a dish, they read the recipe, gather all the ingredients it calls for, and combine them as the recipe dictates. This is similar to the way a program is run.

When you run a program on your computer, the operating system gathers resources (such as memory and CPU time) and begins following the instructions listed in the executable. The OS must keep track of which instruction it is executing at any given moment and of the program's current state. All of this information, including the program itself, the list of resources allocated to it, and its state, is part of a process.

In multi-tasking operating systems, such as Windows (95 and later), Linux, and macOS, multiple processes can run at the same time, with each process taking turns on the CPU. The operating system allocates a slice of CPU time to each process and switches between them so quickly that they appear to run simultaneously. MS-DOS, which preceded Microsoft Windows, ran only one process at a time.

Processes carry out every task the computer performs. The operating system itself is a process. It spawns (or starts) processes that run the display, interact with the keyboard, track the mouse, and access the disk drives. Other processes are started by the user, such as word processors, spreadsheets, web browsers, and e-mail clients. Some of these programs consist of multiple cooperating processes; for example, a web server might use several processes to handle multiple incoming requests at the same time.
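
As a small sketch of how one program can ask the operating system to spawn another process, the Java example below uses ProcessBuilder to start a child process and wait for it to exit. The command shown ("sleep", "2") is only an assumed example for a Unix-like system; any executable available on the machine could be used instead.

import java.io.IOException;

public class SpawnExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Ask the operating system to create a new process.
        // The command ("sleep", "2") is an arbitrary example for a Unix-like system.
        ProcessBuilder builder = new ProcessBuilder("sleep", "2");
        Process child = builder.start();   // the OS allocates resources and begins execution

        System.out.println("Spawned child with PID " + child.pid());

        // The parent process blocks until the child terminates,
        // then reads its exit status from the OS.
        int exitCode = child.waitFor();
        System.out.println("Child exited with code " + exitCode);
    }
}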

Interprocess Communication

Of particular interest when designing an operating system is inter-process communication, or IPC. When more than one process is running on the same computer, a process may need to communicate with another process, or several processes may want to share the same resource. When this happens, care must be taken to avoid certain dangerous conditions. A race condition occurs when two processes attempt to use the same resource at the same time: whichever is faster gets at it first, the other gets at it next, and so on, leaving the resource in a confused, inconsistent state. A deadlock occurs when two processes each wait for the other to finish before proceeding, so neither can ever complete. Other, rarer and less predictable errors, which are even harder to debug, can occur as well.
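
To make the race condition concrete, here is a minimal Java sketch (the counter and loop counts are illustrative, not taken from any particular system) in which two threads increment a shared variable with no coordination. Because counter++ is really a separate read, add, and write, one thread's update can overwrite the other's, and the final total usually falls short of the expected 200000.

public class RaceConditionDemo {
    // Shared resource: a plain integer field with no synchronization.
    static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++;   // read-modify-write: not atomic, so threads can interfere
            }
        };

        Thread a = new Thread(work);
        Thread b = new Thread(work);
        a.start();
        b.start();
        a.join();
        b.join();

        // Expected 200000, but lost updates typically make it smaller.
        System.out.println("counter = " + counter);
    }
}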

The main problem is that the operations that coordinate processes themselves take time, and during that time another process can attempt to use the same resource. Running many processes concurrently requires that no resource ever be accessed by more than one process at a time, and that while a resource is locked for access, another process cannot interrupt and interfere with it. One way of preventing this is to build atomic commands into the hardware that cannot be interrupted while they are critically involved with a resource. A number of mechanisms built on this idea have been developed, including locks, semaphores, mutexes, and monitors. A more recent technique, message passing, works well with object-oriented programming (OOP).
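
As a sketch of how an atomic, uninterruptible operation repairs the race shown above, the example below replaces the plain integer with java.util.concurrent.atomic.AtomicInteger, whose increment is carried out by an atomic hardware instruction. A mutex (for instance a synchronized block), a semaphore, or a monitor would give the same guarantee; the atomic type is simply the lightest-weight option here.

import java.util.concurrent.atomic.AtomicInteger;

public class AtomicCounterDemo {
    // AtomicInteger performs its updates with atomic hardware instructions,
    // so no thread can observe or interrupt a half-finished increment.
    static final AtomicInteger counter = new AtomicInteger(0);

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter.incrementAndGet();   // indivisible read-modify-write
            }
        };

        Thread a = new Thread(work);
        Thread b = new Thread(work);
        a.start();
        b.start();
        a.join();
        b.join();

        System.out.println("counter = " + counter.get());   // always 200000
    }
}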

Threads

Although threads are sometimes not considered real processes because they do not carry all the overhead of a full process, concurrency using threads is just as fraught with problems as inter-process communication. The primary difference is that a thread does not hold all the resources that a full process must; the specific difference lies in the context fields of the process. Instead of having a context of their own, threads inherit the context of the process they were spawned from, so care must be taken that they do not change variables used by other threads without some form of synchronization.

Threads are sometimes called lightweight processes. Both processes and threads provide an execution environment, but creating a new thread often requires fewer resources than creating a new process.

Threads exist within a process — every process has at least one. Threads share the process's resources, including memory and open files. This makes for efficient, but potentially problematic, communication.
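
Because threads share the process's memory, they can hand data to one another cheaply, provided something synchronizes the hand-off. The sketch below (queue capacity and messages are purely illustrative) uses java.util.concurrent.ArrayBlockingQueue so that one thread produces strings and another consumes them; the queue supplies the locking, and the arrangement is also an example of the message-passing style mentioned in the IPC section.

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class SharedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Both threads see the same queue object because they share the
        // process's memory; the queue handles the locking internally.
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(10);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 1; i <= 5; i++) {
                    queue.put("message " + i);   // blocks if the queue is full
                }
                queue.put("DONE");               // sentinel telling the consumer to stop
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                String msg;
                while (!(msg = queue.take()).equals("DONE")) {   // blocks if the queue is empty
                    System.out.println("received: " + msg);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}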

Multithreaded execution is an essential feature of the Java platform. Every application has at least one thread — or several, if you count "system" threads that do things like memory management and signal handling. But from the application programmer's point of view, you start with just one thread, called the main thread. This thread has the ability to create additional threads, as we'll demonstrate in the next section.
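
As a brief preview of that demonstration, a minimal sketch of the main thread creating, starting, and waiting for one additional thread might look like this:

public class MainThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // main() runs on the main thread; here it creates a second thread.
        Thread worker = new Thread(() ->
            System.out.println("Hello from " + Thread.currentThread().getName()));

        worker.start();   // the new thread begins executing the Runnable
        worker.join();    // the main thread waits for the worker to finish

        System.out.println("Back in " + Thread.currentThread().getName());
    }
}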