1.4 Does Computational Science have Language Requirements? continued...

The three main classes of parallel programming we can observe at this time also take different approaches to the language issues:

data parallelism
the easiest to learn, and arguably the easiest to write, debug, and tune, provided the problem maps well onto this style. Languages for this type of parallelism are simple extensions of the corresponding sequential language. Examples are C*, Dataparallel C, and MasPar's MPL as extensions of C; pC++ as an extension of C++; and the parallel array operations in HPF and Fortran 90.
parallel libraries
runtime libraries such as PVM, P4, Linda, and MPI, which provide procedures that can be called from any language. Users must explicitly parallelize their code and contend with synchronization problems. There are two main classes within this group -- shared memory and message passing -- but as far as programming language issues are concerned, both are implemented by augmenting an existing sequential language with library routines for creating and coordinating parallel tasks (a minimal sketch of this style appears after this list).
new high level languages with implicit parallelism
the functional and logic programming languages often fall into this category. This approach requires programmers to learn a whole new programming paradigm, not just a new language syntax, but its adherents claim the effort will be worth it in the long run.
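
As a concrete illustration of the library approach, the sketch below uses MPI from C. It is only a minimal example, assuming an MPI implementation is available and the program is compiled with mpicc and launched with mpirun; the value being passed around is arbitrary. The point is that the C language itself is untouched: creating, coordinating, and synchronizing the parallel tasks is expressed entirely through explicit library calls that the programmer must get right.

    /* Minimal sketch of the parallel-library style: plain C plus MPI calls. */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size, dest;
        double work = 3.14159;      /* arbitrary value to pass between tasks */
        MPI_Status status;

        MPI_Init(&argc, &argv);                 /* start the parallel tasks */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which task am I?         */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many tasks in all?   */

        if (rank == 0) {
            /* Task 0 sends the value to every other task ...              */
            for (dest = 1; dest < size; dest++)
                MPI_Send(&work, 1, MPI_DOUBLE, dest, 0, MPI_COMM_WORLD);
        } else {
            /* ... and each other task receives it.  Matching the sends
               and receives is the programmer's job, not the compiler's.   */
            MPI_Recv(&work, 1, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, &status);
            printf("task %d of %d received %f\n", rank, size, work);
        }

        MPI_Finalize();                         /* coordinated shutdown     */
        return 0;
    }

A data-parallel language would express the same kind of distribution implicitly through array operations, and the implicitly parallel languages would leave it to the compiler and runtime system; here it is all visible in the source.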

In summary, we observe the lack of a common programming style among computational scientists, and a resulting breakdown of efficient communication within the community.