Edinburgh Research Explorer

Stochastic Process Algebras and their Markovian Semantics

Research output: Contribution to journal › Article

Open Access permissions: Open

Documents

    Rights statement: Notice to Contributing Authors to SIG Newsletters. By submitting your article for distribution in this Special Interest Group publication, you hereby grant to ACM the following non-exclusive, perpetual, worldwide rights:

      • to publish in print on condition of acceptance by the editor
      • to digitize and post your article in the electronic version of this publication
      • to include the article in the ACM Digital Library and in any Digital Library related services
      • to allow users to make a personal copy of the article for noncommercial, educational or research purposes

    However, as a contributing author, you retain copyright to your article, and ACM will refer requests for republication directly to you.

    Final published version, 418 KB, PDF document

    License: Other

http://www.cs.ox.ac.uk/andrzej.murawski/siglog_news_16.pdf
https://dl.acm.org/citation.cfm?id=3212023
Original language: English
Pages (from-to): 20-35
Number of pages: 15
Journal: SIGLOG News
Volume: 5
Issue number: 2
DOIs
State: Published - 1 Apr 2018

Abstract

There is a long tradition of quantitative modelling of computer and telecommunication systems for performance evaluation and dependability analysis, dating back to the early work of Erlang on telephone exchanges at the beginning of the twentieth century [Erlang 1917]. These models are typically stochastic, not because the behaviour of the system under study is inherently random, but because the models usually abstract from data and because interactions with humans are often intrinsic to the system; both of these give rise to apparently stochastic behaviour.

Whilst systems remained of moderate size, models could readily be hand-crafted, and results such as Little's Law were derived and widely used, for example to optimise job shops [Little 1961]. Much work, like Little's Law, was based on a queueing abstraction, treating the resource in the system as the server of a queue, with users following an appropriate queueing discipline before gaining access to the resource. The more contention there is for the resource, the longer jobs are likely to spend in the queue waiting for access.

The queueing paradigm is based on an essentially sequential view of computation and communication, with jobs flowing between resources, completing one task after another. Nevertheless it functioned extremely well until the 1980s, and for many people performance modelling was synonymous with queueing theory. However, the arrival of distributed systems, which allowed jobs to access more than one resource at once, together with multicast and broadcast communication, broke the assumptions of basic queueing theory, making it much harder to carry out quantitative analysis of stochastic systems in order to predict non-functional properties such as performance, availability, reliability and dependability.
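The Little's Law mentioned above states that, in a stable queueing system, the mean number of jobs in the system L equals the arrival rate λ times the mean time W each job spends in the system: L = λW. The following is a minimal sketch illustrating this empirically with a single-server M/M/1 queue (Poisson arrivals, exponential service, FIFO discipline); the function name and all parameter values are illustrative assumptions, not taken from the article.

```python
import random

def mm1_littles_law(lam=0.5, mu=1.0, n_jobs=200_000, seed=42):
    """Estimate L = lam * W for an M/M/1 queue by simulating job sojourn times.

    Hypothetical example: lam (arrival rate), mu (service rate) and n_jobs
    are illustrative choices, not values from the source text.
    """
    rng = random.Random(seed)
    t_arrival = 0.0       # arrival time of the current job
    t_free = 0.0          # time at which the single server next becomes free
    total_sojourn = 0.0   # accumulated per-job time in system (wait + service)
    for _ in range(n_jobs):
        t_arrival += rng.expovariate(lam)     # Poisson arrival process
        start = max(t_arrival, t_free)        # FIFO: wait until server is free
        t_free = start + rng.expovariate(mu)  # exponential service time
        total_sojourn += t_free - t_arrival   # this job's time in system
    w = total_sojourn / n_jobs                # mean time in system, W
    return lam * w                            # Little's Law estimate of L

# For M/M/1 with utilisation rho = lam/mu, theory gives L = rho / (1 - rho),
# so with lam = 0.5 and mu = 1.0 the estimate should be close to 1.0.
print(mm1_littles_law())
```

Note that Little's Law itself makes no distributional assumptions; the M/M/1 choice here is only so that the simulated estimate can be checked against a closed-form value.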

ID: 58799718