Microsoft .NET provides support for threads at the language level, just as Sun's Java does. This article describes some important related terminology and discusses multithreading in .NET with some code examples.
Process and Thread
A process may be defined as a running instance of a program, characterized by changes of state and a set of attributes. Every process maintains its own Process Control Block (PCB). A thread is a lightweight process: it is the smallest unit of CPU utilization and a path of execution within a process. A task is an enhanced form of a thread. Every process has at least one thread, the primary or main thread of the application. All other user-created threads run in the background and are also called worker threads. When the application's main thread terminates, so does the application. Unlike processes, threads of the same process share the same address space, so inter-thread communication is faster than inter-process communication because less overhead is involved. Context switching between threads is also faster than between processes.
Single threaded Applications
Single-threaded applications are those in which one thread cannot execute until the previous thread has completed its execution. The MS-DOS operating system is an example of a single-threaded operating system. Such environments have no support for multithreading; a single task monopolizes the processor, and system throughput is low. Throughput is a measure of the amount of work done in unit time.
Multithreading
Multithreading is the ability of the operating system to keep multiple threads in memory at the same time and switch between them, providing a pseudo-parallelism in which all the tasks appear to run simultaneously. The operating system creates this illusion of concurrency by giving each thread a specific time slice and switching to another thread once that slice is over. This switching is very fast. Switching between threads involves a context switch, which in turn involves saving the current thread's state, flushing the CPU and handing control of the CPU to the next thread in the queue. Remember that at any point in time a single CPU can execute only one thread. Multiprocessing, by contrast, involves multiple processors, each executing one thread at any particular point in time.
Multithreading can be of the following two types:
· Cooperative
· Preemptive
In the cooperative mode of multithreading, a thread keeps control of the processor for as long as it needs; the operating system does not preempt it. In other words, in this type of multithreading control of the processor lies with the executing thread. In the preemptive mode of operation, however, the operating system controls the processor, decides the time slice for which each thread will execute, and preempts threads if and when required. Cooperative multithreading is used by Windows 3.11, while the preemptive mode is used by Windows 98 and Windows NT.
Advantages and Disadvantages of Multithreading
The following are the major advantages of using multithreading:
· Improved responsiveness
· Faster execution
· Better CPU and memory utilization
· Support for concurrency
The following are the drawbacks of using threads:
· Problems in testing and debugging due to the non-deterministic nature of execution
· Complexity
Multithreading in C#
The System.Threading namespace in the .NET class library provides the functionality for implementing threads in C#. The following are some of the important methods of the Thread class (a short example using a few of these members follows the lists below):
· Start
· Suspend
· Resume
· Join
· Abort
· GetCompressedStack
· SetCompressedStack
The following are the properties of the Thread class:
· ApartmentState
· CurrentCulture
· CurrentUICulture
· IsAlive
· IsBackground
· IsThreadPoolThread
· Name
· Priority
· ThreadState
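The following is a minimal sketch that exercises a few of the members listed above (Start, Join, Name, Priority, IsBackground, IsAlive and ThreadState); the class and method names are illustrative only.
using System;
using System.Threading;
class ThreadMembersDemo
{
    static void Worker()
    {
        // Report a few properties of the thread executing this method.
        Thread current = Thread.CurrentThread;
        Console.WriteLine("Name: {0}, Priority: {1}, IsBackground: {2}",
            current.Name, current.Priority, current.IsBackground);
    }
    static void Main()
    {
        Thread threadObj = new Thread(new ThreadStart(Worker));
        threadObj.Name = "DemoWorker";
        threadObj.Start();                                   // Begin executing Worker().
        Console.WriteLine("IsAlive: " + threadObj.IsAlive);
        threadObj.Join();                                    // Wait for the worker to finish.
        Console.WriteLine("Final state: " + threadObj.ThreadState);
    }
}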
Thread States
From operating system concepts, we can broadly classify a thread as being in one of the following states:
· Ready or runnable state
· Running state
· Wait state
When a thread is first created, it is put in the ready state. A thread in the ready state has all the resources it requires to execute except the processor; it waits in the runnable queue for its turn to come. It is scheduled from the ready (or runnable) state to the running state when its turn comes. However, a thread of higher priority is scheduled before other threads in the ready queue. A thread in the wait state is waiting for its I/O to complete.
A thread is moved from the runnable queue to the running state by a module of the operating system known as the scheduler. A thread in the running state has everything it needs, including the processor. Remember that at any point in time a single processor can execute only one thread.
Supported Thread States in C#
Both Sun's Java and Microsoft's .NET refine the states above and give them more specific names. The ThreadState enum in the System.Threading namespace contains the thread states supported in .NET. The following are the members of the ThreadState enum (a short example of inspecting the state follows the list):
· Unstarted
· Running
· Background
· StopRequested
· Suspended
· SuspendRequested
· WaitSleepJoin
· Aborted
· AbortRequested
· Stopped
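The following is a minimal sketch of how the reported state changes over a thread's lifetime; the class and method names are illustrative only.
using System;
using System.Threading;
class ThreadStateDemo
{
    static void Worker()
    {
        Thread.Sleep(1000);   // Simulate one second of work.
    }
    static void Main()
    {
        Thread threadObj = new Thread(new ThreadStart(Worker));
        Console.WriteLine(threadObj.ThreadState);   // Unstarted
        threadObj.Start();
        Console.WriteLine(threadObj.ThreadState);   // Typically Running or WaitSleepJoin
        threadObj.Join();
        Console.WriteLine(threadObj.ThreadState);   // Stopped
    }
}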
Creating threads in C#
The Thread class that belongs to the System.Threading
namespace contains the necessary members for creating, suspending, resuming and
aborting threads.
Let us take an example. Consider the following code.
Listing 1
//Some code
Thread threadObj = new Thread(new ThreadStart(MyWorkerThreadMethod));
threadObj.Start();
When the thread object is first created, it is in the Unstarted state. The Start() method is responsible for starting the thread. Remember that when we start a thread it might not begin executing immediately; to be precise, it is put in the ready or runnable state, and it is the responsibility of the operating system to schedule it from the runnable state to the running state. The method MyWorkerThreadMethod() contains the actual thread code that is executed once the call to Start() is made. Remember that the following must hold for a method to be used as a thread method with the ThreadStart delegate:
· It should have no parameters.
· It should have a void return type.
Generally a
thread method looks like the following:
Listing 2
public void MyWorkerThreadMethod()
{
    while(condition)
    {
        //Some code
    }
}
The following sample code shows how we can assign a name to
a thread object.
Thread currentThreadObject = Thread.CurrentThread;
currentThreadObject.Name = "PrimaryThread";
The following is a complete listing that shows how we can create threads in C#.
Listing 3: Creating threads in C#
using System;
using System.Threading;
class Test
{
    static void MyThreadMethod()
    {
        Console.WriteLine("This is the worker thread.");
    }
    static void Main()
    {
        Console.WriteLine("This is the main or the primary thread of the application.");
        Thread threadObj = new Thread(new ThreadStart(MyThreadMethod));
        threadObj.Start();
    }
}
Suspending threads in C#
A thread in the running state can be suspended by making a call to the Suspend() method. A thread in the suspended state waits for its suspension to be revoked. To be precise, the call to the Suspend() method puts the thread in the SuspendRequested state; the .NET runtime does not suspend a thread immediately after the call. The suspension takes effect only once the thread reaches a safe point, which is decided by the runtime, and the thread may be allowed to execute a few more instructions before it reaches such a point. A safe point is a point in the code at which the garbage collector can safely do its work, and waiting for one ensures better and safer operation of the garbage collector.
Listing 4
//Some Code
if (threadObject.ThreadState == ThreadState.Running)
{
    threadObject.Suspend();
}
//Some code
Resuming threads in C#
A suspended thread can be resumed by making a call to the Resume() method. If the thread on which Resume() is called is not in a suspended state, the request for resumption is simply ignored.
Listing 5
//Some Code
if (threadObject.ThreadState == ThreadState.Suspended)
{
    threadObject.Resume();
}
//Some Code
Making a thread sleep in C#
To put a thread to sleep, the Sleep() method is invoked. This is a blocking call: the thread resumes its execution once the time for which it was put to sleep has elapsed.
The following code makes the thread sleep for 5 seconds.
Listing 6
//Some code
Thread.Sleep(5000);
To make a thread sleep indefinitely, use the following code (Timeout.Infinite is the constant the framework provides for this purpose).
//Some code
Thread.Sleep(Timeout.Infinite);
Joining threads in C#
The Join() method allows one thread to wait until another thread has completed its execution. The following code shows how we can use this method to make the current thread wait until the other thread completes.
Listing 7
//Some Code
if (Thread.CurrentThread.GetHashCode() != threadObject.GetHashCode())
{
    threadObject.Join();
}
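Join() also has overloads that accept a timeout and return a bool indicating whether the thread finished in time. The following is a minimal sketch; the worker method and the timeout value are illustrative only.
using System;
using System.Threading;
class JoinDemo
{
    static void Worker()
    {
        Thread.Sleep(3000);   // Simulate three seconds of work.
    }
    static void Main()
    {
        Thread threadObject = new Thread(new ThreadStart(Worker));
        threadObject.Start();
        // Join(milliseconds) returns true if the thread terminated within the timeout.
        if (!threadObject.Join(2000))
        {
            Console.WriteLine("Worker did not finish within 2 seconds; waiting for it...");
            threadObject.Join();   // Now wait indefinitely.
        }
        Console.WriteLine("Worker has completed.");
    }
}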
Terminating threads in C#
The Abort() method stops a thread prematurely; it can be used to terminate the execution of a thread. This method raises a ThreadAbortException in the thread being aborted.
Listing 8
//Some Code
if (threadObject.IsAlive)
{
    threadObject.Abort();
}
}
//Some Code
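The ThreadAbortException can be caught inside the thread method, for example to run clean-up code; in the .NET Framework it is automatically re-thrown at the end of the catch block unless Thread.ResetAbort() is called. The following is a minimal sketch; the worker method and the timings are illustrative only.
using System;
using System.Threading;
class AbortDemo
{
    static void Worker()
    {
        try
        {
            while (true)
            {
                Thread.Sleep(100);   // Simulated work.
            }
        }
        catch (ThreadAbortException)
        {
            // Clean-up code can run here before the exception is re-thrown.
            Console.WriteLine("Worker thread is being aborted.");
        }
    }
    static void Main()
    {
        Thread threadObject = new Thread(new ThreadStart(Worker));
        threadObject.Start();
        Thread.Sleep(500);        // Let the worker run briefly.
        if (threadObject.IsAlive)
        {
            threadObject.Abort();
            threadObject.Join();  // Wait for the abort to complete.
        }
    }
}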
Thread Priorities
We can set the priorities of threads based on their importance, meaning we can give one thread a higher priority than another. Here is an example: suppose an application accepts user input on one thread while another thread displays a message indicating the time that has elapsed since the form was opened. We can give the thread responsible for accepting user input a higher priority than the other to improve responsiveness, because a thread with a higher priority is scheduled more often than one with a lower priority.
Thread priorities in C# are defined using the ThreadPriority enum. The following are the possible values (a short example of assigning them follows the list):
· Highest
· AboveNormal
· Normal
· BelowNormal
· Lowest
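The following is a minimal sketch of the scenario described above; the method names are hypothetical placeholders for the input-handling and timer work.
using System;
using System.Threading;
class PriorityDemo
{
    static void InputWorker() { /* Accept user input here. */ }
    static void TimerWorker() { /* Display the elapsed time here. */ }
    static void Main()
    {
        Thread inputThread = new Thread(new ThreadStart(InputWorker));
        Thread timerThread = new Thread(new ThreadStart(TimerWorker));
        // Favor the thread that keeps the application responsive to the user.
        inputThread.Priority = ThreadPriority.AboveNormal;
        timerThread.Priority = ThreadPriority.BelowNormal;
        inputThread.Start();
        timerThread.Start();
    }
}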
The ThreadPool class
Thread pooling is a technique in which tasks are stored in a queue and threads are created to handle them. The creation of these threads is taken care of by the thread pool itself; thus thread management is handled by the pool. Thread pooling is enabled by the ThreadPool class in the System.Threading namespace. It enables us to use resources efficiently by optimizing thread time slices on the processor. There is one thread pool per process, and in early versions of the .NET Framework the pool contains, at any point in time, a maximum of 25 worker threads per processor by default. The thread pool creates the worker threads and assigns each a task from among the pending tasks in the queue.
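Tasks are queued to the pool with ThreadPool.QueueUserWorkItem, which takes a WaitCallback delegate and an optional state object. The following is a minimal sketch; the work item and the number of tasks are illustrative only.
using System;
using System.Threading;
class ThreadPoolDemo
{
    static void WorkItem(object state)
    {
        Console.WriteLine("Task {0} running on a pooled thread (IsThreadPoolThread = {1}).",
            state, Thread.CurrentThread.IsThreadPoolThread);
    }
    static void Main()
    {
        // Queue a few tasks; the pool decides how many worker threads to use.
        for (int i = 1; i <= 5; i++)
        {
            ThreadPool.QueueUserWorkItem(new WaitCallback(WorkItem), i);
        }
        Console.ReadLine();   // Keep the process alive while the pooled threads run.
    }
}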
The Timer class
The Timer class in the System.Threading namespace can be used to run a task at periodic intervals. It automatically executes the task in the background at the specified interval; for example, we can use this class to back up files or databases at particular intervals of time.
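The following is a minimal sketch of a periodic task using System.Threading.Timer; the callback, due time and interval are illustrative only.
using System;
using System.Threading;
class TimerDemo
{
    static void Backup(object state)
    {
        // Placeholder for the periodic work, e.g. backing up a file or database.
        Console.WriteLine("Backup task executed at {0}", DateTime.Now);
    }
    static void Main()
    {
        // Start after 1 second, then fire every 5 seconds.
        Timer timer = new Timer(new TimerCallback(Backup), null, 1000, 5000);
        Console.ReadLine();   // Press Enter to exit.
        timer.Dispose();
    }
}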
Thread Synchronization
Thread Synchronization guarantees that only one thread can
access the synchronized block of code or synchronized object at any point of
time. Let us take an example. Consider the code snippet that follows.
Listing 9
Test obj = null;
//Some Code
if (obj == null)
    obj = new Test();
//Rest of the code
This implementation is not thread safe. Two threads could reach the condition at the same point in time; if both evaluate if (obj == null) and find it true, both will create an object. We can use a locking mechanism to ensure that at any point in time only one thread has access to the condition stated above.
Listing 10
Test obj = null;
//Some Code
lock (this)
{
    if (obj == null)
        obj = new Test();
    //Some code
}
//Rest of the code
The above block can be executed by only one thread at any point in time. However, from a performance perspective it is advisable not to lock on the current instance of the class. The following code snippet is the better choice in this context.
Listing 11
private static readonly object lockObj = new object();
//Some code
Test obj = null;
//Some Code
lock (lockObj)
{
    if (obj == null)
        obj = new Test();
    //Some code
}
//Rest of the code
The lock statement acquires a mutual exclusion lock (a mutex) on the object that is passed to it. When a thread tries to enter a block of code that has already been locked by another thread, it is simply blocked until the earlier thread completes its execution of the block and relinquishes the lock.
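The lock statement is, roughly speaking, shorthand for using the Monitor class with a try/finally block. The following sketch, reusing the lockObj, obj and Test names from Listing 11, shows approximately what the compiler generates.
// Roughly equivalent to lock (lockObj) { ... }
Monitor.Enter(lockObj);
try
{
    if (obj == null)
        obj = new Test();
}
finally
{
    Monitor.Exit(lockObj);   // Always release the lock, even if an exception is thrown.
}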
Thread synchronization should, however, be used judiciously, as it can impact the performance of the application. This is due to the overhead involved in acquiring and releasing locks; it slows down execution and consumes additional memory. The pitfalls of thread synchronization are deadlocks and race conditions, which are explained in the following sections.
Deadlocks
A deadlock is a condition that occurs when two threads each hold a lock that the other is waiting to acquire. Let there be two threads, T1 and T2, executing simultaneously, and two resources, R1 and R2, accessed by these threads. Let the following block of code be executed by the thread T1.
Listing 12
lock(R1)
{
    //Some code
    lock(R2)
    {
        //Some code
    }
}
Let the code associated with the thread T2 acquire the locks in just the reverse order:
Listing 13
lock(R2)
{
    //Some code
    lock(R1)
    {
        //Some code
    }
}
The first thread, T1, acquires a lock on the object R1 while the other thread, T2, acquires a lock on the object R2. After some time, T1 encounters the statement lock(R2) and goes to sleep, waiting for the lock on R2 to be released. Meanwhile, T2 encounters the statement lock(R1) and also goes to sleep. Neither lock will ever be released: the lock on R1 is owned by T1, which is waiting for T2 to release the lock on R2, and vice versa. The result is that the application hangs indefinitely. This is known as a deadlock. The situation could have been avoided if both threads had acquired the locks on these objects (R1 and R2) in the same order, as the sketch below shows.
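A minimal sketch of the fix: if T2 takes the locks in the same order as T1, one thread simply waits for the other to finish and no deadlock can occur.
// Code for thread T2, rewritten to acquire R1 before R2, matching T1's order.
lock(R1)
{
    //Some code
    lock(R2)
    {
        //Some code
    }
}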
Race Conditions
A race condition can occur when several threads try to access the same data at the same point in time. A race condition may be defined as a situation in which two threads access a shared resource simultaneously and, as a consequence, leave the resource in an undefined state. Improper thread synchronization in such situations results in a race condition. Race conditions are more common among threads than among processes because threads of the same process share memory, whereas separate processes normally do not.
Let there be two threads, T1 and T2, both accessing a shared variable x. Each reads the value of the variable and then tries to write a new value back at the same time. They race to see which thread writes its value last, and the last written value is the one that is saved. As another example, let these two threads perform opposite operations on the variable: T1 increments it while T2 decrements it at the same point in time. What is the result? It is unpredictable: depending on how the reads and writes interleave, the value may end up incremented, decremented, or unchanged. A proper mutually exclusive lock on this variable can prevent this condition. We can prevent it by ensuring that an object that is modifiable by multiple threads has one and only one mutex associated with it. Therefore, we may say that thread safety is the best defense against race conditions, as the sketch below illustrates.
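A minimal sketch of the increment/decrement scenario, with a lock protecting the shared variable; the iteration count and names are illustrative only. Without the lock statements the final value would be unpredictable; with them it is always 0.
using System;
using System.Threading;
class RaceDemo
{
    static int x = 0;
    static readonly object lockObj = new object();
    static void Increment()
    {
        for (int i = 0; i < 100000; i++)
        {
            lock (lockObj) { x++; }   // Without the lock, some updates could be lost.
        }
    }
    static void Decrement()
    {
        for (int i = 0; i < 100000; i++)
        {
            lock (lockObj) { x--; }
        }
    }
    static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Increment));
        Thread t2 = new Thread(new ThreadStart(Decrement));
        t1.Start();
        t2.Start();
        t1.Join();
        t2.Join();
        Console.WriteLine("Final value of x: " + x);   // 0 with the locks in place.
    }
}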
Conclusion
Even though multithreading is a powerful feature, it should be used carefully. Thread synchronization issues must be handled properly to avoid deadlocks and race conditions. This article has provided a detailed discussion of the concepts of processes, threads, tasks and multithreading, and of how to use threads in C#. Happy reading!