
CONCURRENCY IN OPERATING SYSTEM

Concurrency is the execution of multiple instruction sequences
at the same time. It occurs when several process threads run in
parallel. These threads communicate with other threads/processes
through shared memory or through message passing. Because concurrency
results in the sharing of system resources - instructions, memory,
files - problems such as deadlocks and resource starvation can occur.
(We will talk about starvation and deadlocks in the next module.)
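As a minimal sketch of the two communication styles, here is a Python
example using the standard threading and queue modules (the names and
values are illustrative, not part of any particular OS API):

import threading
import queue

shared_total = 0              # shared memory: visible to all threads
lock = threading.Lock()       # protects shared_total from concurrent updates
mailbox = queue.Queue()       # message passing: a thread-safe channel

def shared_memory_writer():
    global shared_total
    with lock:                # critical section around the shared data
        shared_total += 10

def sender():
    mailbox.put("hello from the sender thread")

def receiver():
    print(mailbox.get())      # blocks until a message arrives

threads = [threading.Thread(target=f)
           for f in (shared_memory_writer, sender, receiver)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("shared_total =", shared_total)   # 10

With shared memory, both parties touch the same variable and must
synchronize explicitly; with message passing, the channel itself
provides the synchronization.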

Principles of Concurrency:
With current technology, such as multi-core processors and parallel
processing, multiple processes/threads can be executed concurrently -
that is, at the same time. It is therefore possible for more than one
process/thread to access the same space in memory, the same declared
variable in the code, or even to attempt to read from or write to the
same file.

The amount of time a process takes to execute cannot be easily
calculated, so we are unable to predict which process will complete
first. This is why we must implement algorithms to deal with the
issues that concurrency creates. The amount of time a process takes
to complete depends on the following:

 The activities of other processes
 The way the operating system handles interrupts
 The scheduling policies of the operating system

Problems in Concurrency:

 Sharing global resources
Sharing global resources safely is difficult. If two processes both
make use of a global variable and both change its value, then the
order in which the changes are applied is critical (see the sketch
after this list).
 Optimal allocation of resources
It is difficult for the operating system to manage the allocation of
resources optimally.
 Locating programming errors
It is very difficult to locate a programming error because failures
are usually not reproducible; the shared components are in a
different state each time the code runs.
 Locking the channel
It may be inefficient for the operating system to simply lock the
resource and prevent its use by other processes.
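The first problem above, unsynchronized updates to a global variable,
can be demonstrated with a minimal sketch in Python (the counter and
iteration counts are illustrative):

import threading

counter = 0

def increment(times):
    global counter
    for _ in range(times):
        # "counter += 1" is really three steps (read, add, write);
        # another thread can run between them and overwrite this
        # thread's update, so increments are lost.
        counter += 1

threads = [threading.Thread(target=increment, args=(100000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 200000; the result may be lower when updates race.
# How often updates are lost depends on the interpreter and its
# thread-switching interval.
print("counter =", counter)

Guarding the increment with a threading.Lock restores the expected
result, at the cost of serializing the critical section.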
Advantages of Concurrency:

 Running of multiple applications
Concurrency allows the operating system to run multiple applications
at the same time.
 Better resource utilization
Concurrency allows resources that are not being used by one
application to be used by other applications.
 Better average response time
Without concurrency, each application has to run to completion before
the next one can start.
 Better performance
Concurrency can improve overall performance. When one application
uses only the processor and another uses only the disk drive, the
time to run both applications concurrently to completion is shorter
than the time to run them consecutively (see the sketch after this
list).
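A minimal sketch of that last claim in Python; time.sleep stands in
for a one-second disk operation and the loop is the processor-only
work (both durations are illustrative):

import threading
import time

def disk_task():
    time.sleep(1.0)   # stands in for a one-second disk operation

def cpu_task():
    total = 0
    for i in range(1_000_000):   # processor-only work
        total += i

start = time.perf_counter()
io = threading.Thread(target=disk_task)
io.start()
# time.sleep releases Python's global interpreter lock, so the CPU
# work genuinely overlaps the simulated I/O wait.
cpu_task()
io.join()
print(f"concurrent:  {time.perf_counter() - start:.2f}s")

start = time.perf_counter()
disk_task()
cpu_task()
print(f"consecutive: {time.perf_counter() - start:.2f}s")

The concurrent run takes roughly the length of the longer task, while
the consecutive run takes roughly the sum of both.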

Drawbacks of Concurrency:

 When concurrency is used, it is generally necessary to protect
multiple processes/threads from one another.
 Concurrency requires the coordination of multiple processes/threads
through additional sequences of operations within the operating
system.
 Additional mechanisms are necessary within the operating
system to provide for switching among applications.
 Sometimes running too many applications concurrently leads to
severely degraded performance.

Issues of Concurrency:

 Non-atomic
Operations that are non-atomic but interruptible by multiple
processes can cause problems. (An atomic operation is one that runs
to completion without being interrupted by any other process/thread;
an operation whose steps can be interleaved with those of other
processes/threads is non-atomic.)
 Race conditions
A race condition is a behavior that occurs in software where the
output depends on the timing or sequence of other uncontrollable
events. Race conditions commonly occur in software that supports
multithreading, runs in a distributed environment, or depends on
shared resources.
 Blocking
A process that is blocked is one that is waiting for some event, such
as a resource becoming available or the completion of an I/O
operation. A process could be blocked for a long period of time
waiting for input from a terminal; if the process is required to
periodically update some data, this would be very undesirable.
 Starvation
A problem encountered in concurrent computing where a process is
perpetually denied the resources it needs to make progress.
Starvation may be caused by errors in a scheduling or mutual
exclusion algorithm, but can also be caused by resource leaks.
 Deadlock
In concurrent computing, a deadlock is a state in which each member
of a group waits for another member, including itself, to take
action, such as sending a message or, more commonly, releasing a
lock. Deadlocks are a common problem in multiprocessing systems,
parallel computing, and distributed systems, where software and
hardware locks are used to arbitrate shared resources and implement
process synchronization (see the sketch below).
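The lock-ordering deadlock described above can be reproduced with a
minimal sketch in Python; the acquire timeout is there only so the
demonstration terminates instead of hanging, and all names are
illustrative:

import threading
import time

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(first, second, name):
    with first:
        time.sleep(0.1)   # let the other thread grab its first lock
        # Circular wait: each thread now needs the lock the other holds.
        if second.acquire(timeout=2):
            print(name, "acquired both locks")
            second.release()
        else:
            print(name, "gave up: deadlock")

t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "thread 1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "thread 2"))
t1.start()
t2.start()
t1.join()
t2.join()

The standard fix is to make every thread acquire locks in the same
global order, which breaks the circular wait.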
