Distributed Programming with Shared Data

Bibliographic Details
Published in: Computer Languages
Main Authors: Bal, H.E., Tanenbaum, A.S.
Format: Article in Journal/Newspaper
Language: English
Published: 1991
Online Access: https://research.vu.nl/en/publications/76b5059e-d59e-48a2-a83f-3d993a969edd
https://doi.org/10.1016/0096-0551(91)90003-R
https://research.vu.nl/ws/files/119276305/10984
http://www.scopus.com/inward/record.url?scp=0025894356&partnerID=8YFLogxK
http://www.scopus.com/inward/citedby.url?scp=0025894356&partnerID=8YFLogxK
Description
Summary: Until recently, at least one thing was clear about parallel programming: shared-memory machines were programmed in a language based on shared variables, and distributed machines were programmed using message passing. Recent research on distributed systems and their languages, however, has led to new methodologies that blur this simple distinction. Operating system primitives and languages for programming distributed systems have been proposed that support shared data without the presence of physical shared memory. We look at the reasons for this evolution, the resemblances and differences among these new proposals, and the key issues in their design and implementation. It turns out that many implementations are based on replication of data. We take this idea one step further and discuss how automatic replication can be used as the basis for a new model with semantics similar to those of shared variables. Finally, we discuss a new language, Orca, based on this model. © 1991.
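
The sketch below is an illustrative approximation, not Orca code and not taken from the paper: it shows, in Go, the general idea the abstract describes, in which a logically shared object is replicated on every node, reads are served from the local copy, and all updates flow through a single, totally ordered stream so that every replica applies them in the same order. All type and function names here are hypothetical, and a plain channel stands in for a real ordered-broadcast protocol.

```go
// Illustrative sketch only: a replicated shared counter updated through one
// totally ordered update stream, so every replica converges to the same value.
package main

import (
	"fmt"
	"sync"
)

// update is one write operation on the shared object.
type update struct{ delta int }

// replica holds one node's local copy of the shared object.
type replica struct {
	mu    sync.Mutex
	value int
}

// apply performs a write on the local copy.
func (r *replica) apply(u update) {
	r.mu.Lock()
	r.value += u.delta
	r.mu.Unlock()
}

// read is purely local: no communication is needed for reads.
func (r *replica) read() int {
	r.mu.Lock()
	defer r.mu.Unlock()
	return r.value
}

func main() {
	const nodes = 3
	replicas := make([]*replica, nodes)
	for i := range replicas {
		replicas[i] = &replica{}
	}

	// A single channel stands in for a totally ordered broadcast:
	// every replica sees the same updates in the same order.
	updates := make(chan update)
	var wg sync.WaitGroup

	wg.Add(1)
	go func() {
		defer wg.Done()
		for u := range updates {
			for _, r := range replicas {
				r.apply(u) // fan each ordered update out to all replicas
			}
		}
	}()

	for i := 1; i <= 5; i++ {
		updates <- update{delta: i} // writes go through the ordered stream
	}
	close(updates)
	wg.Wait()

	for i, r := range replicas {
		fmt.Printf("replica %d sees value %d\n", i, r.read()) // all print 15
	}
}
```

In this toy setting the ordered stream guarantees that every replica ends with the same value, which is the property that lets a replication-based system present the programmer with something that behaves like a shared variable; a real implementation would also handle node-to-node communication, failures, and the placement of replicas.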