A traditional computer simply executes a sequence of commands in one fixed order, like oxen plowing a field. But, according to Blaise Barney of Lawrence Livermore National Laboratory, parallel computing works by partitioning a problem into multiple tasks that can be done at the same time. For example, your graphics processor may render each section of the screen separately, or a signal-processing program might apply multiple filters to audio data simultaneously. Think of it as if each of the 1,024 chickens only had to plow a few feet of the field. IBM software engineer Paul McKenney notes that parallel software is exceedingly hard to write, much more difficult than a straightforward sequential program. But the performance benefits outweigh the programming difficulty.
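The partitioning idea can be sketched in a few lines of Python. This is a minimal illustration, not any particular library's API: the "filter" here is a made-up gain function, and the chunk size is arbitrary. The point is that the data is split into independent pieces, each processed by a separate worker at the same time.

```python
from concurrent.futures import ThreadPoolExecutor

def apply_gain(chunk, gain=2):
    # Hypothetical stand-in for an audio filter: amplify each sample.
    return [sample * gain for sample in chunk]

samples = list(range(8))

# Partition the problem: split the data into independent chunks.
chunks = [samples[i:i + 4] for i in range(0, len(samples), 4)]

# Each chunk can be processed simultaneously by a separate worker.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(apply_gain, chunks))

# Reassemble the chunks into the final result.
processed = [sample for chunk in results for sample in chunk]
print(processed)
```

Because the chunks do not depend on one another, the workers never need to coordinate mid-task, which is exactly the property that makes a problem easy to parallelize.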