Why Should You Read This Book?
Multicore processors made a big splash when they were first introduced. Bowing to the physics of heat and power, processor clock speeds could not keep doubling every 18 months as they had been doing for the past three decades or more. In order to keep increasing the processing power of the next generation over the current generation, processor manufacturers began producing chips with multiple processor cores. More processors running at a reduced speed generate less heat and consume less power than single-processor chips continuing on the path of simply doubling clock speeds.
But how can we use those extra cores? We can run more than one application at a time, and each program could have a separate processor core devoted to its execution. This would give us truly parallel execution. However, there are only so many apps that we can run simultaneously. If those apps aren't very compute-intensive, we're probably wasting compute cycles, but now we're doing it in more than one processor.
Another option is to write applications that will utilize the additional cores to execute portions of the code that need to perform lots of calculations and whose computations are independent of each other. Writing such programs is known as concurrent programming. With any programming language or methodology, there are techniques, tricks, traps, and tools to design and implement such programs. I've always found that there is more art than science to programming. So, this book is going to give you the knowledge and one or two of the secret handshakes you need to successfully practice the art of concurrent programming.
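To make that idea concrete, here is a minimal sketch of my own (not an example from later chapters), assuming POSIX threads are available: two threads each sum half of an array, and because the two halves don't overlap, the computations are independent and can run on separate cores.

    /* Minimal sketch: independent partial sums on separate threads (POSIX threads). */
    #include <pthread.h>
    #include <stdio.h>

    #define N 1000000
    #define NUM_THREADS 2

    static double data[N];

    typedef struct {
        int start;    /* first index this thread handles */
        int end;      /* one past the last index */
        double sum;   /* partial result written only by this thread */
    } work_t;

    static void *partial_sum(void *arg)
    {
        work_t *w = (work_t *)arg;
        double s = 0.0;
        for (int i = w->start; i < w->end; i++)
            s += data[i];
        w->sum = s;   /* no sharing: each thread writes its own struct */
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[NUM_THREADS];
        work_t work[NUM_THREADS];

        for (int i = 0; i < N; i++)
            data[i] = 1.0;            /* something to add up */

        int chunk = N / NUM_THREADS;
        for (int t = 0; t < NUM_THREADS; t++) {
            work[t].start = t * chunk;
            work[t].end = (t == NUM_THREADS - 1) ? N : (t + 1) * chunk;
            pthread_create(&threads[t], NULL, partial_sum, &work[t]);
        }

        double total = 0.0;
        for (int t = 0; t < NUM_THREADS; t++) {
            pthread_join(threads[t], NULL);
            total += work[t].sum;     /* combine results after both threads finish */
        }

        printf("total = %f\n", total);
        return 0;
    }

Because neither thread reads or writes the other's half of the array, there is no coordination needed until the results are combined; that independence is exactly what makes a computation a good candidate for the extra cores.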
In the past, parallel and concurrent programming was the domain of a very small set of programmers who were typically involved in scientific and technical computing arenas. From now on, concurrent programming is going to be mainstream. Parallel programming will eventually become synonymous with programming. Now is your time to get in on the ground floor, or at least somewhere near the start of the concurrent programming evolution.
Who Is This Book For?
This book is for programmers everywhere.
I work for a computer technology company, but I'm the only computer science degree-holder on my team. There is only one other person in the office within the sound of my voice who would know what I was talking about if I said I wanted to parse an LR(1) grammar with a deterministic pushdown automaton. So, CS students and graduates aren't likely to make up the bulk of the interested readership for this text. For that reason, I've tried to keep the geeky CS material to a minimum. I assume that readers have some basic knowledge of data structures and algorithms and the asymptotic efficiency of algorithms (Big-Oh notation) that is typically taught in an undergraduate computer science curriculum. For whatever else I've covered, I've tried to include enough of an explanation to get the idea across. If you've been coding for more than a year, you should do just fine.
I've written all the code in C. Meaning no disrespect, I figured this was the lowest common denominator of programming languages that support threads. Other languages, like Java and C#, support threads as well, but if I had written this book using one of those languages and you didn't code with the one I picked, you wouldn't read my book. I think most programmers who will be able to write concurrent programs will be able to at least read C code. Understanding the concurrency methods illustrated is going to be more important than being able to write code in one particular language. You can take these ideas back to C# or Java and implement them there.
I'm going to assume that you have read a book on at least one threaded programming method. There are many available, and I don't want to cover the mechanics and detailed syntax of multithreaded programming here (since that would take a whole other book or two). I'm not going to focus on a single programming paradigm here, since, for the most part, their functionality overlaps. I will present a revolving usage of threading implementations across the wide spectrum of algorithms featured in the latter portion of the book. If there are circumstances where one method would differ significantly from the method used, these differences will be noted.
I've included a review of the threaded programming methods that are utilized in this book to refresh your memory or to serve as a reference for any methods you have not had the chance to study. I'm not implying that you need to know all the different ways to program with threads. Knowing one should be sufficient. However, if you change jobs or find that what you know about programming with threads cannot easily solve a programming problem you have been assigned, it's always good to have some awareness of what else is available; this may help you learn and apply a new method quickly.
What's in This Book?
Chapter 1 anticipates and answers some of the questions you might have about concurrent programming. This chapter explains the differences between parallel and concurrent programming, and describes the four-step threading methodology. The chapter ends with a bit of background on concurrent programming and some of the differences and similarities between distributed-memory and shared-memory programming and execution models.