dict.org

The DICT Development Group


2 definitions found for "parallel processing"
From WordNet (r) 3.0 (2006) :

  parallel processing
      n 1: simultaneous processing by two or more processing units
           [syn: multiprocessing, parallel processing]

From The Free On-line Dictionary of Computing (18 March 2015) :

  parallel processing
  multiprocessing
  multiprocessor
  parallel
  parallel computing
  
      (Or "multiprocessing") The simultaneous use of more
     than one computer to solve a problem.  There are many
     different kinds of parallel computer (or "parallel
     processor").  They are distinguished by the kind of
     interconnection between processors (known as "processing
     elements" or PEs) and between processors and memory.  Flynn's
     taxonomy also classifies parallel (and serial) computers
     according to whether all processors execute the same
     instructions at the same time ("single instruction/multiple
     data", SIMD) or each processor executes different
     instructions ("multiple instruction/multiple data", MIMD).
  
     The processors may either communicate in order to be able to
     cooperate in solving a problem or they may run completely
     independently, possibly under the control of another processor
     which distributes work to the others and collects results from
     them (a "{processor farm").  The difficulty of cooperative
     problem solving is aptly demonstrated by the following dubious
     reasoning:
  
     	If it takes one man one minute to dig a post-hole
     	then sixty men can dig it in one second.
  
     Amdahl's Law states this more formally.
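
     As a worked example (a sketch, not part of the original
     entry), Amdahl's Law bounds the speedup from n processors
     when only a fraction p of the work can be parallelised:

        def amdahl_speedup(p, n):
            """Best-case speedup with n processors when a
            fraction p of the work is parallelisable."""
            return 1.0 / ((1.0 - p) + p / n)

        # Sixty diggers help little if half the job is serial:
        print(amdahl_speedup(0.5, 60))    # ~1.97, nowhere near 60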
  
     Processors communicate via some kind of network or bus or a
     combination of both.  Memory may be either shared memory
     (all processors have equal access to all memory) or private
     (each processor has its own memory - "distributed memory")
     or a combination of both.
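
     A sketch of the shared-memory case, using Python's
     multiprocessing module (whose Value objects live in memory
     visible to all worker processes):

        from multiprocessing import Process, Value

        def bump(counter):
            with counter.get_lock():      # guard the shared word
                counter.value += 1

        if __name__ == "__main__":
            counter = Value("i", 0)       # an int in shared memory
            workers = [Process(target=bump, args=(counter,))
                       for _ in range(4)]
            for w in workers: w.start()
            for w in workers: w.join()
            print(counter.value)          # 4: all saw one memory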
  
     Many different software systems have been designed for
     programming parallel computers, both at the operating system
     and programming language level.  These systems must provide
     mechanisms for partitioning the overall problem into separate
     tasks and allocating tasks to processors.  Such mechanisms may
     provide either implicit parallelism, where the system (the
     compiler or some other program) partitions the problem and
     allocates tasks to processors automatically, or explicit
     parallelism, where the programmer must annotate the program to
     show how it is to be partitioned.  It is also usual to provide
     synchronisation primitives such as semaphores and monitors
     to allow processes to share resources without conflict.
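
     For instance, a counting semaphore lets at most a fixed
     number of processes hold a resource at once (a sketch using
     Python's threading module):

        import threading

        slots = threading.Semaphore(2)    # at most 2 users at once

        def use_resource(i):
            with slots:                   # blocks while 2 are inside
                print("worker", i, "has the resource")

        workers = [threading.Thread(target=use_resource, args=(i,))
                   for i in range(5)]
        for w in workers: w.start()
        for w in workers: w.join()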
  
     Load balancing attempts to keep all processors busy by
     allocating new tasks, or by moving existing tasks between
     processors, according to some algorithm.
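
     One simple dynamic scheme is a shared work queue: each idle
     processor pulls the next task, so faster processors do more
     of the work.  A sketch using Python's multiprocessing.Pool
     (which also acts as the "processor farm" described above):

        from multiprocessing import Pool

        def task(n):
            return n * n                  # stand-in for real work

        if __name__ == "__main__":
            with Pool(processes=4) as pool:
                # imap_unordered hands each free worker the next
                # task, balancing the load automatically.
                for result in pool.imap_unordered(task, range(20)):
                    print(result)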
  
     Communication between tasks may be either via shared memory
     or message passing.  Either may be implemented in terms of
     the other and in fact, at the lowest level, shared memory uses
     message passing since the address and data signals which flow
     between processor and memory may be considered as messages.
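
     A sketch of the message-passing style, where tasks share no
     memory and cooperate only by sending values through a
     channel (Python's multiprocessing.Queue here):

        from multiprocessing import Process, Queue

        def producer(outbox):
            outbox.put("result: 42")      # send a message

        if __name__ == "__main__":
            channel = Queue()
            p = Process(target=producer, args=(channel,))
            p.start()
            print(channel.get())          # receive the message
            p.join()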
  
     The terms "parallel processing" and "multiprocessing" imply
     multiple processors working on one task, whereas "concurrent
     processing" and "multitasking" imply a single processor
     sharing its time between several tasks.
  
     See also cellular automaton, symmetric multi-processing.
  
     Usenet newsgroup: news:comp.parallel.
  
     Institutions (http://ccsf.caltech.edu/other_sites.html),
     research groups (http://cs.cmu.edu/~scandal/research-groups.html).
  
     (2004-11-07)
  
