parallel The simultaneous use of more than one computer to
solve a problem. There are many different kinds of parallel
computer (or "parallel processor"). They are distinguished by
the kind of interconnection between processors (known as
"processing elements" or PEs) and between processors and
memory. Flynn's taxonomy also classifies parallel (and
serial) computers according to whether all processors execute
the same instructions at the same time ("single instruction/multiple
data" - SIMD) or each processor executes different instructions
("multiple instruction/multiple data" - MIMD).
The processors may either communicate in order to be able to
cooperate in solving a problem or they may run completely
independently, possibly under the control of another processor
which distributes work to the others and collects results from
them (a "processor farm"). The difficulty of cooperative
problem solving is aptly demonstrated by the following dubious
reasoning:
If it takes one man one minute to dig a post-hole
then sixty men can dig it in one second.
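The flaw is that part of any real job is inherently serial and gains nothing from extra workers. Amdahl's Law quantifies this: with n processors and a serial fraction s, the overall speedup is 1/(s + (1-s)/n). A minimal sketch in Python, assuming a hypothetical job that is 10% serial:

```python
def speedup(n_workers, serial_fraction):
    """Amdahl's Law: overall speedup from n_workers when
    serial_fraction of the work cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

# Sixty workers on a job that is 10% serial: speedup is under 9,
# nowhere near 60. The post-hole resists parallel digging.
print(round(speedup(60, 0.10), 2))   # 8.7
```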
Processors communicate via some kind of network or bus or a
combination of both. Memory may be either shared
(all processors have equal access to all memory) or private
(each processor has its own memory, "distributed memory")
or a combination of both.
Many different software systems have been designed for
programming parallel computers, both at the operating system
and programming language level. These systems must provide
mechanisms for partitioning the overall problem into separate
tasks and allocating tasks to processors. Such mechanisms may
provide either implicit parallelism, where the system (the
compiler or some other program) partitions the problem and
allocates tasks to processors automatically, or explicit
parallelism, where the programmer must annotate his program to
show how it is to be partitioned. It is also usual to provide
synchronisation primitives such as semaphores and monitors
to allow processes to share resources without conflict.
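For instance, a counting semaphore from Python's standard threading module can cap the number of threads using a resource at once (the limit of two and the worker body are illustrative):

```python
import threading

slots = threading.Semaphore(2)     # at most 2 threads in the section
results = []
results_lock = threading.Lock()

def worker(i):
    with slots:                    # blocks while 2 threads hold a slot
        with results_lock:         # mutual exclusion on the shared list
            results.append(i)

threads = [threading.Thread(target=worker, args=(i,)) for i in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(results))             # [0, 1, 2, 3, 4]
```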
Load balancing attempts to keep all processors busy by
moving tasks from heavily loaded processors to less loaded
ones.
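A common dynamic scheme is a shared work queue: idle workers pull the next task, so faster or less-loaded workers naturally take on more work. A sketch using Python threads (the tasks and worker count are illustrative):

```python
import queue
import threading

tasks = queue.Queue()
for n in range(10):
    tasks.put(n)                   # ten independent tasks

done = []
done_lock = threading.Lock()

def worker():
    while True:
        try:
            n = tasks.get_nowait() # pull the next unclaimed task
        except queue.Empty:
            return                 # queue drained: this worker stops
        with done_lock:
            done.append(n * n)     # "process" the task

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(sorted(done))                # squares of 0..9, in some order
```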
Communication between tasks may be either via shared memory
or message passing. Either may be implemented in terms of
the other and in fact, at the lowest level, shared memory uses
message passing since the address and data signals which flow
between processor and memory may be considered as messages.
(1996-04-23)