Chapter 3 Naming and Threads
Naming in Distributed Systems
Outline
• Processes and Threads in DS
• Client and Server
• Naming Basics
• Name Resolution
• Flat Naming vs. Hierarchical Naming
• Name Services
• Name Spaces
Processes and Threads in Distributed Systems
• In distributed systems, processes and threads are fundamental
concepts that play a crucial role in enabling concurrent execution,
efficient resource management, and overall system performance.
• They provide the means to structure and manage the execution of
tasks and programs in a distributed environment.
• Here's an introduction to processes and threads in distributed
systems:
Cont’d…
• Processes:
• A process is an independent and self-contained program execution
unit within an operating system.
• Each process operates in its own isolated memory space and has its
own system resources, making it independent of other processes.
• Key characteristics of processes in distributed systems include:
• Isolation: Processes are isolated from one another, meaning that the
failure or misbehavior of one process does not affect others. This
isolation enhances system security and stability.
Cont’d…
• Resource Management: Each process has its own dedicated
resources, such as CPU time, memory, and file handles. This allows for
efficient resource management and prevents resource contention.
• Security: Processes can be individually protected and managed.
Access control mechanisms can be applied to restrict unauthorized
access and interactions between processes.
• Parallelism: Processes enable parallel execution on multi-core
processors, which is essential for leveraging the full potential of
modern hardware.
Cont’d…
• Processes are typically created using system calls provided by the
operating system. In distributed systems, processes can communicate
with each other through inter-process communication (IPC)
mechanisms like pipes, sockets, shared memory, or message queues.
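One of these IPC mechanisms can be sketched in Python using the standard `multiprocessing` module's pipes. This is a minimal illustration, not a full server: the `worker` function and the message strings are made up for the example.

```python
from multiprocessing import Pipe, Process

def worker(conn):
    # Child process: receive a request over the pipe and send a reply back.
    request = conn.recv()
    conn.send(f"processed: {request}")
    conn.close()

def demo():
    # Parent creates the pipe, starts a child process, and exchanges one
    # message with it. The two processes share no memory; the pipe is the
    # only channel between them.
    parent_conn, child_conn = Pipe()
    p = Process(target=worker, args=(child_conn,))
    p.start()
    parent_conn.send("task-1")
    reply = parent_conn.recv()
    p.join()
    return reply

if __name__ == "__main__":
    print(demo())  # processed: task-1
```

The same pattern applies to sockets or message queues; only the transport changes, while the request/reply structure stays the same.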
• Threads:
• A thread is a lightweight unit of execution within a process.
• Threads share the same memory space and resources within a
process and are sometimes referred to as "lightweight processes."
• Key characteristics of threads in distributed systems include:
Cont’d…
• Concurrency: Threads enable concurrent execution of multiple tasks
within a single process, taking advantage of multi-core processors.
This concurrency improves system responsiveness and throughput.
• Efficiency: Threads have lower overhead compared to processes
because they share resources and memory. This makes them more
efficient for tasks that require frequent context switching or
parallelism.
• Shared State: Threads within the same process can easily share data
and communicate with each other, making them suitable for
collaborative and closely related tasks.
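The shared-state property can be shown with a short Python sketch: several threads in one process write into the same thread-safe queue with no IPC needed (the `compute` function and the values are illustrative).

```python
import threading
import queue

results = queue.Queue()  # thread-safe structure shared by all threads

def compute(n):
    # Every thread writes into the same queue: one shared address space.
    results.put(n * n)

threads = [threading.Thread(target=compute, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(results.queue))  # [0, 1, 4, 9]
```

Contrast this with processes, which would need a pipe, socket, or shared-memory segment to exchange the same values.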
Cont’d…
• In distributed systems, processes and threads are essential for various
reasons, including:
• Handling concurrent requests: Processes and threads can handle multiple
incoming requests simultaneously, improving system responsiveness and
throughput.
• Resource utilization: By distributing tasks across processes or threads,
distributed systems can utilize system resources efficiently, reducing idle
time.
• Isolation and security: Processes provide strong isolation, while threads
can be useful for sharing data and resources within a controlled context.
• Parallelism and scalability: Processes and threads enable distributed
systems to take advantage of multi-core processors, improving
performance and scalability.
Clients and servers
• Clients and servers in distributed systems use processes and threads
to handle various tasks and provide services efficiently.
• Here's how clients and servers employ processes and threads to
achieve their respective roles:
• Servers:
• Process Per Connection Model:
• Servers often use a "process per connection" model, where a new process is
created for each incoming client connection.
• This model provides strong isolation, ensuring that client interactions do not
interfere with each other.
• Each process can be dedicated to handling a single client request or session,
allowing the server to maintain multiple concurrent connections.
Cont’d…
• Thread Per Connection Model:
• An alternative to the process per connection model is the "thread per
connection" model.
• In this approach, a new thread is created for each incoming client
connection within a server process.
• Threads share the same process address space but have their own
execution context.
• This model is more lightweight than creating separate processes and
is suitable for handling a large number of simultaneous connections.
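A minimal thread-per-connection echo server can be sketched in Python's standard `socket` and `threading` modules; the handler logic and the `echo:` reply are placeholders for real request processing.

```python
import socket
import threading

def handle(conn):
    # Each connection is served by its own thread, with its own execution
    # context but the same address space as the rest of the server.
    data = conn.recv(1024)
    conn.sendall(b"echo: " + data)
    conn.close()

def serve(sock, n):
    # Accept n connections, spawning one handler thread per connection.
    for _ in range(n):
        conn, _addr = sock.accept()
        threading.Thread(target=handle, args=(conn,)).start()

if __name__ == "__main__":
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    srv.listen()
    threading.Thread(target=serve, args=(srv, 1), daemon=True).start()

    cli = socket.create_connection(srv.getsockname())
    cli.sendall(b"hello")
    print(cli.recv(1024))  # b'echo: hello'
    cli.close()
```

The process-per-connection variant has the same accept loop but would start a new process instead of a thread for each client, trading extra overhead for isolation.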
Cont’d…
• Worker Threads or Processes:
• Servers often employ a pool of worker processes or threads to handle
incoming client requests. When a new client connection is established, the
server assigns the task to an available worker thread or process. This
approach balances the load and efficiently manages resources.
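The worker-pool pattern maps directly onto Python's `concurrent.futures`; here a fixed pool of four workers services a batch of hypothetical requests (the `handle_request` function and request names are illustrative).

```python
from concurrent.futures import ThreadPoolExecutor

def handle_request(req):
    # A worker thread picks this request up from the pool's internal queue.
    return f"done: {req}"

# A fixed pool of 4 workers services any number of requests,
# reusing threads instead of creating one per request.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(handle_request, ["r1", "r2", "r3"]))

print(results)  # ['done: r1', 'done: r2', 'done: r3']
```

Capping the pool size bounds resource usage: bursts of requests queue up rather than exhausting threads or memory.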
• Task Parallelism:
• Within a server, multiple threads or processes may be responsible for
different tasks, such as listening for incoming connections, processing client
requests, and handling database queries. This division of labor improves the
server's overall performance and responsiveness.
Cont’d…
• Shared State Management:
• When multiple threads or processes are used within a server,
mechanisms for managing shared state and data are critical.
• This may involve using synchronization techniques, such as locks or
semaphores, to prevent data corruption and ensure thread-safe
access to resources.
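The need for such synchronization can be demonstrated with a shared counter in Python: the lock serializes the read-modify-write sequence so no increment is lost (the counter itself is just an illustrative piece of shared state).

```python
import threading

counter = 0
lock = threading.Lock()

def increment(times):
    global counter
    for _ in range(times):
        with lock:  # serialize the read-modify-write on shared state
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 -- without the lock, some updates could be lost
```

Removing the `with lock:` line turns `counter += 1` into an unprotected read-modify-write, and interleaved threads can overwrite each other's updates.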
Clients
• Concurrent Requests:
• Clients may use threads to issue multiple concurrent requests to remote
servers.
• Each thread can send a request, receive a response, and manage its own
session with the server.
• This concurrency allows clients to efficiently use the network and reduce
latency.
• Asynchronous Operations:
• In some cases, clients use asynchronous programming models, such as
asynchronous I/O or event-driven architectures, to issue requests to servers.
These models allow clients to continue executing other tasks while waiting for
responses from the server, improving overall client responsiveness.
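The asynchronous model can be sketched with Python's `asyncio`: while one "request" is awaiting its response, the event loop runs the other, so both complete in roughly the time of the slowest one. The `fetch` coroutine and server names are stand-ins for real network calls.

```python
import asyncio

async def fetch(server, delay):
    # Simulated remote request: await yields control so the client
    # can make progress on other work while this one is pending.
    await asyncio.sleep(delay)
    return f"response from {server}"

async def main():
    # Issue both requests concurrently on a single thread.
    return await asyncio.gather(fetch("A", 0.1), fetch("B", 0.1))

print(asyncio.run(main()))  # ['response from A', 'response from B']
```

Unlike the thread-based client, this uses one thread and no locks; concurrency comes from the event loop interleaving coroutines at `await` points.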
Cont’d…
• Parallelism in Distributed Computing:
• In distributed computing scenarios, clients may employ threads to parallelize
data processing tasks or computation across multiple servers.
• This parallelism accelerates data analysis, simulations, or other computational
workloads.
• Load Balancing:
• Clients can use multiple threads or processes to interact with multiple server
instances or replicas, distributing the load and ensuring redundancy.
• Load balancing mechanisms help optimize resource utilization and enhance
system reliability.
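The simplest client-side load-balancing policy, round-robin, is a few lines of Python; the replica names here are hypothetical.

```python
import itertools

servers = ["replica-1", "replica-2", "replica-3"]  # hypothetical replicas
rr = itertools.cycle(servers)

def pick_server():
    # Round-robin: each successive request goes to the next replica.
    return next(rr)

assignments = [pick_server() for _ in range(6)]
print(assignments)  # each replica receives 2 of the 6 requests
```

Real clients often refine this with health checks or weighting, but the core idea of spreading successive requests across replicas is the same.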
Cont’d…
• Resource Management:
• Clients may create and manage threads to handle various tasks, such
as user interfaces, data processing, and communication with servers.
• Effective resource management is essential to ensure that client
applications remain responsive and efficient.
• Clients and servers collaborate to provide distributed services
effectively by using processes and threads for concurrent execution,
load distribution, and resource management.
NEXT…