David Blockley's Philosophy of Engineering Risk

This is a philosophy derived from engineering itself.

Background: Engineering Risk

In the last chapter of his popular book, David Blockley concentrates on the implications of designing ever more complex systems and the nature of the risks which then arise. Out of this comes a thoughtful and cogent philosophy of engineering design and practice [1].


He lays the foundation with the commonsense premise that engineering is practical - it is about creating tools that work properly. The engineering process that creates these tools is constrained by financial, social, political and cultural circumstances. The simple reality, he then notes, is that when we act, there is bound to be a gap between what we do and what might happen, and this gap is filled by risk.


Not all risk, he points out, is unacceptable. We tolerate risks with which we are familiar. So engineering risk levels which do not exceed natural and societal background levels would, arguably, be acceptable as well. How do engineers actually reduce risk? The answer is by including safety factors and, in case these fail, by designing for defence-in-depth. In very complex systems risk will need to be managed in [functional] layers. And our considerations ought not to be confined to single systems. In interconnected ones like the Internet, the electricity grid and traffic control, for example, small damage [in one area] could conceivably cascade to disproportionately large consequences [elsewhere]. We also need to factor in the [inevitable] interaction between hard-systems (physical, material objects and their constituent parts) and soft-systems (people).
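To make the safety-factor and defence-in-depth ideas concrete, here is a minimal Python sketch (my own illustration, not from Blockley's text; all numbers and function names are assumptions). It treats a safety factor as the ratio of capacity to expected load, and models defence-in-depth, simplistically, as independent protective layers whose failure probabilities multiply.

```python
# Illustrative sketch: safety factors and defence-in-depth.
# All numbers and names are hypothetical, for explanation only.

def safety_factor(capacity: float, expected_load: float) -> float:
    """Ratio of what the system can withstand to what it is expected to bear."""
    return capacity / expected_load

def residual_risk(layer_failure_probs: list[float]) -> float:
    """Probability that every protective layer fails at once, assuming
    (simplistically) that the layers fail independently."""
    p = 1.0
    for q in layer_failure_probs:
        p *= q
    return p

# A beam expected to carry 10 kN but designed for 25 kN has a safety factor of 2.5.
print(safety_factor(capacity=25.0, expected_load=10.0))  # 2.5

# Three independent layers of defence, each with a 1% chance of failing
# on demand, leave a residual risk of one in a million.
print(residual_risk([0.01, 0.01, 0.01]))  # 1e-06
```

The independence assumption is exactly what cascading failures in interconnected systems undermine: when layers share a common cause, the residual risk can be far higher than the product suggests.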


Steps in Deriving an Engineering-based Philosophy

David Blockley asks 'how do we judge the quality of information on which we depend to make decisions that could risk someone dying?' He points out that as engineers we have a duty of care and we want the information to be true. The relationship between truth and risk sheds light on an engineering philosophy.


He begins this analysis in pre-modern times in which there were two broad ways of arriving at a truth. One was 'mythos' which created emotional truth and was derived from story-telling with roots in the mystical, the religious, and the emotional. It required faith - that is, belief that could not be proved to the satisfaction of everyone. The other way of arriving at the truth was 'logos' which was about discerning facts and external realities and is the kind of reasoning used to get something done.


While to the modern mind engineering seems to be based in logos, the way we actually live our lives and find meaning and purpose in life springs from mythos. This meaning and purpose is not testable in any objective way. So what we do as human beings is based on what we think we know; on what we believe to be true. The gap between what we know and do and what we perceive to be the consequences is filled by faith, and it is from this gap that risk arises.


Engineers look for reliable, dependable information on which to build and test their models of understanding so as to minimise risk. In complex multi-layered systems, the characteristic behaviour of components within a layer can be a result of interactions with the layer below. Put another way, the system is the interaction of its parts, not just the sum of its parts. This extends to the interaction of hard-systems (equipment) and soft-systems (people).
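A toy sketch of this point (again my own construction; the component names and numbers are hypothetical): two pumps that each deliver 10 units of flow in isolation deliver only 16 together, because they interact through a shared power bus in the layer below.

```python
# Toy illustration: behaviour in one layer emerges from interaction
# with the layer below. Names and numbers are hypothetical.

class PowerBus:
    """Lower layer: a shared supply with limited capacity."""
    def __init__(self, capacity: float):
        self.capacity = capacity

    def allocate(self, demands: list[float]) -> list[float]:
        total = sum(demands)
        if total <= self.capacity:
            return demands
        # Under overload, every consumer is throttled proportionally.
        return [d * self.capacity / total for d in demands]

def pump_output(power: float) -> float:
    """Upper layer: a pump's throughput depends on the power it receives."""
    return 2.0 * power

bus = PowerBus(capacity=8.0)

# One pump alone: 5 units of power -> 10 units of flow.
print([pump_output(p) for p in bus.allocate([5.0])])       # [10.0]

# Two pumps together: the shared bus throttles both, so the system
# delivers 16 units of flow, not the 'sum of the parts' 20.
print([pump_output(p) for p in bus.allocate([5.0, 5.0])])  # [8.0, 8.0]
```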


The problem with soft-systems (people) is that they have multiple levels of intentionality which cannot be fully modelled. The motivation that drives the flow of change in soft-systems stems from a need or a want. So how are we to deal with risk in a complex multi-layered system made up of hard- and soft-systems?


David Blockley points out that [from a technical viewpoint] it comes down to monitoring the performance of a complex piece of equipment or system to detect changes that might indicate damage and potential harm before these become obvious or dangerous. In both hard- and soft-systems there may be inherent flaws or defects. In soft-systems these extend beyond human lapses or mistakes to organisational and cultural ones as well. Put another way, it is about recognizing the pre-conditions of failure.
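A minimal sketch of such monitoring (my own illustration; the smoothing constant, drift limit and readings are arbitrary assumptions): an exponentially weighted moving average tracks a sensor against its design baseline and raises an alarm once the smoothed signal drifts, flagging a pre-condition of failure before a hard limit on the raw signal is breached.

```python
# Illustrative condition monitoring: flag drift from a design baseline
# before it becomes an obvious failure. Parameters are hypothetical.

def monitor(readings, baseline, alpha=0.3, drift_limit=1.0):
    """Yield (reading, smoothed, alarm) triples; alarm is True once the
    exponentially weighted moving average drifts more than drift_limit
    from the baseline."""
    ewma = baseline
    for x in readings:
        ewma = alpha * x + (1 - alpha) * ewma  # exponential smoothing
        yield x, ewma, abs(ewma - baseline) > drift_limit

# Simulated vibration levels: nominal around 5.0, then a slow upward drift.
stream = [5.1, 4.9, 5.0, 5.2, 6.0, 6.8, 7.5, 8.3, 9.0]
for x, ewma, alarm in monitor(stream, baseline=5.0):
    print(f"reading={x:4.1f}  smoothed={ewma:5.2f}  alarm={alarm}")
```

The alarm keys on the smoothed trend rather than on any single reading, which is the point: the drift is detected as a pattern instead of waiting for a raw measurement to breach a hard limit.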


[From a professional viewpoint,] engineering is being asked to deliver ever more substantial systems with lower throughputs of materials and energy which are also durable, repairable, adaptable, robust and resilient. This is inducing tensions which are pulling apart the traditional divisions between engineering disciplines. In such an era, specialization has led to fragmentation and loss of overview. While there is a continued need for specialized expertise, engineers who take refuge in their own disciplines cannot contribute to meeting challenges that do not fit into traditional boxes.


Engineering has to achieve a balance between knowing and doing. While the advancement of knowledge (science) and action (engineering) have leap-frogged each other in the past, the engineer today has to act in a risk-minimising way and must value doing and knowing equally. An appreciation of the holistic or systems view is needed as well as knowledge itself. The truths of science are not enough to ensure a safe engineered future.


It is only in these ways that engineers can do their job of bringing risk down to acceptable levels.


References

[1] David Blockley, Engineering: A Very Short Introduction, Chapter 6: 'The age of systems - risky futures', Oxford University Press, New York, 2012. ISBN 978-0-19-957869-6.