On Cooperative Systems

If we are to evaluate a class of objects, it helps to have some understanding of what those objects are. I have described my research as "the evaluation of cooperative systems", and so need to give some consideration to the meaning of this term. At Lancaster, it simply denotes the computer programs sometimes called groupware or CSCW systems, and arose purely as an easier and less loaded term than those. Sommerville et al (1993) define the term as follows:
systems which [are] essentially cooperative in the sense that they [are] team-based

However, it is clear that this usage is a neologism, and that both 'cooperative' and 'systems' have a long history in various areas of academic and practical discourse. Therefore I shall attempt to discern something of the meaning of the term cooperative systems by considering the meaning of its constituent parts. It must be stressed that I do this for the sake of understanding, and that the synthesis (putting together) will be as important as the analysis (pulling apart). It is with the latter, however, that I shall start.

The Nature of Cooperation

It was Marx who first used the term "cooperative work" - although, as Hughes et al (1991) have more recently pointed out, to suggest that there could be such a thing as work that is not cooperative is to misunderstand the nature of the modern organisation and the fundamentally intertwined nature of all work. However, it is possible to make some sensible remarks about cooperation and its nature.

Cooperation literally means "working together" (from the Latin "co-", together, and "operari", to work). We can see this as the root of all modern uses of the term - it denotes some kind of activity conducted between two or more people, to common gain. The subject has been studied extensively in social psychology, and from that work Michael Argyle (1991, p.4) offers the following definition of cooperation:
acting together, in a coordinated way at work, leisure or social relationships, in the pursuit of shared goals, the enjoyment of the social activity, or simply furthering the relationship.

In this descriptive sense, then, cooperation refers to any sort of activity that two or more people conduct together. Studies of such situations have been made by sociologists (e.g. Heath and Luff, 1991), by anthropologists (e.g. Hutchins, 1991), and by social psychologists (e.g. Axelrod, 1984). A point must be made about the last of these three groups: studies of cooperation in social psychology have tended to use artificial laboratory situations in which only a limited amount of cooperation is permitted (such as the famous but vacuous Prisoner's Dilemma game, sketched below), whereas the sociologists and anthropologists study real situations of cooperation.
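To make concrete how narrow that laboratory notion of cooperation is, the Prisoner's Dilemma reduces it to a bare choice between two moves with fixed payoffs. The following Python fragment is a purely illustrative sketch, not drawn from any of the studies cited here: it plays an iterated version of the game using the standard payoff values from Axelrod (1984), and the strategy names tit_for_tat and always_defect are simply my labels for two well-known strategies.

```python
# A minimal sketch of the iterated Prisoner's Dilemma, the laboratory game
# referred to above. Payoff values (T=5, R=3, P=1, S=0) are the standard ones
# used by Axelrod (1984); the strategies shown are illustrative only.

# Payoffs to (player A, player B) for each pair of moves:
# 'C' = cooperate, 'D' = defect.
PAYOFFS = {
    ('C', 'C'): (3, 3),   # mutual cooperation: both rewarded
    ('C', 'D'): (0, 5),   # A is the 'sucker', B gets the temptation payoff
    ('D', 'C'): (5, 0),
    ('D', 'D'): (1, 1),   # mutual defection: both punished
}

def tit_for_tat(own_history, other_history):
    """Cooperate first, then copy the opponent's previous move."""
    return 'C' if not other_history else other_history[-1]

def always_defect(own_history, other_history):
    """Never cooperate."""
    return 'D'

def play(strategy_a, strategy_b, rounds=10):
    """Play an iterated game and return the total score of each player."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

if __name__ == '__main__':
    print(play(tit_for_tat, tit_for_tat))      # (30, 30): sustained cooperation
    print(play(tit_for_tat, always_defect))    # (9, 14): cooperation collapses
```

Even in this tiny model, sustained mutual cooperation yields more for both players over repeated rounds than exploitation does; but the game says nothing about communication, shared goals or relationships, which is precisely the limitation noted above.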

Nevertheless, the following general principles about cooperation arise:

Some of these points lead us on to a second way in which cooperation has been considered - not just to describe how it works, but rather to advocate cooperation as a good in itself. This has been particularly apparent among those concerned with personal growth and social justice. A Quaker poster illustrates this well. Two mules are shown with two piles of food. A rope joins the two, but it is too short to allow them both to eat from their own piles at once. They strain against each other for a while, with the result that neither eats anything; then they realise their folly and join together: first they both eat from one pile of food, then from the other. The caption reads: "Cooperation is better than conflict".

The same notion is found in training for assertiveness and negotiation skills. One is advised that an assertive transaction, or a successful negotiation, is not one where you get what you want and the other person gets nothing, but rather one where both, or all, parties get what they want (or at least the most important parts of it). In other words, the aim of such transactions is a 'win-win' situation. Many suggest that the world would be a happier place if conflicts were addressed in this manner, rather than aiming for one party to win and others to lose (or ignoring conflicts altogether, which is also destructive).

So we can see two meanings for cooperation: a description of how people tend to work together, and a prescription for how they should work together. When phrases like "cooperative system" and "cooperative work" are used, the former is the most immediate meaning; but we should not forget the latter vision of how things could be.

Systems and Systems Thinking

It is crucially important to distinguish between two uses of the word "system" in this context. Computer scientists and organisational theorists use the word to denote a collection of computer hardware, software and networks - one talks of a computer system and means this kind of technological mixture. Indeed, "information systems" is the name under which a large proportion of those who study computers in use gather. In computing circles, this overshadows an older and more general sense of the term, current since around 1940: that of systems thinking.

Systems thinking is an approach which views the world in terms of systems. These are models of real-world situations which have the common property that, in Aristotle's phrase, "the whole is greater than the sum of its parts" - there are properties of the entity viewed as a whole that are not to be found by considering its constituent parts. A good example is given by Lewis (1994:44), who considers a bicycle. This is composed of a number of pieces - two wheels, frame, handlebars, chain, saddle and so on - but taken separately none of these has any particular meaning. However, by combining the pieces in the right way, we create a system that affords transport. That is, the ability of a bicycle to carry me to work (given motive power from my legs) is an emergent property of the complete system. As Senge (1990:68) puts it,

systems thinking is a discipline for seeing wholes ... a framework for seeing interrelationships rather than things, for seeing patterns of change rather than snapshots.

This older notion of system - from which the computer sense arose - is intended as a general perspective on all kinds of entities. It arose in biology, through the work of Ludwig von Bertalanffy (1969), and while to some extent, as Morgan (1986:45) comments, it can be seen as a "biological metaphor in disguise", it is taken by its theorists to be considerably more general. Indeed, Kenneth Boulding, one of those working under the banner of "general systems theory", has written a book entitled The World as a Total System (1985), in which he identifies a hierarchy of eleven levels of system through which we can view the world:

mechanical systems (those governed by Newtonian physics);

cybernetic systems (based on the principle of feedback to maintain equilibrium);

positive feedback systems (those which are not in equilibrium);

'creodic' systems (those based on the carrying out of a plan, such as genetics);

reproductive systems (which are capable of reproducing themselves);

demographic systems (the behaviour of populations);

ecological systems (the interaction of species);

evolutionary systems (where the rules governing the system change);

human systems (the organisation of individual humans);

social systems (the interaction of human beings, and the use of artefacts); and  

transcendental systems (which we know through religious experience).

While Boulding gives examples of all kinds within the 'lower' systems, most of human collective experience falls within the category of social systems. Accordingly, social systems have been the object of study of most systems theorists. In particular, they have concerned themselves with the interactions of people within organisations, especially companies and public-sector institutions. Thus systems thinking has been substantially used within business schools.

However, two distinct kinds of systems thinking have arisen as the discipline has developed. The first is mathematically based: it likes to create formal models of situations, and finds it useful to draw analogies between human and social systems and the better-understood mechanical and cybernetic systems. This approach has been dominant in operational research, for example, which has been concerned to create mathematical models of situations requiring planning, the better to assist the planners in their task. Similarly, the work of the RAND Corporation in the USA on solving problems using a method derived from engineering, which it referred to as systems analysis, continues to be influential in the design of information systems.

Another strain can be found in the work of those who take a holistic perspective on the world, requiring that mind and nature not be separated, and likewise holding that situations are best understood by studying them from all sides. One of the best early proponents of this approach was Gregory Bateson (1972a), who made extensive studies in anthropology, psychiatry, learning and zoology, continually pushing against the assumption that things which can be considered together must be treated separately. Peter Checkland (1981) comes from a different angle, but his "soft systems" methodology takes an excellent approach to human systems, stressing the importance of holism in studying them and the vital need to bring multiple perspectives to bear.

Checkland also stresses a key point about systems: that they do not exist as such in the real world, but are rather ways of viewing the world - they belong to epistemology rather than ontology. This is crucially important, as it emphasises that different systems will be identified by different people, and we need to be clear what we mean. For example, if we consider the phrase "the computer system", some use this to mean a standalone PC (box containing chips plus monitor, keyboard and mouse); others include software on the PC (perhaps just the operating system - another use of the word! - or perhaps also the word-processor etc.); others include networking to other PCs; and others also include fileservers on a local network, the files on them, the other computers, and even (implicitly) the technicians that keep the system running!

Finally, mention must be made of the crucial phrase "socio-technical systems", which appears at several points in this report. Coined by Eric Trist of the Tavistock Institute of Human Relations, it refers to "the interdependent qualities of the social and technical aspects of work ... these aspects of work are always inseparable, because the nature of one element in this configuration always has important consequences for the other" (Morgan 1986:44). This combination of people and technology is vital to my purposes here: when I refer to a computer system, I most decidedly do include the people who keep the technology running, but also those who use it, those who built it, those who work with those who use it, and so on.

Putting the words together: what is a cooperative system?

So to summarise the above material - cooperation is a process of two or more people engaging in an activity for shared gain, supported by communication and coordination; and a system is a collection of objects with emergent properties, here involving people and technology. Putting these together, I suggest the following definition of 'cooperative system':

a combination of technology, people and organisations that facilitates the communication and coordination necessary for a group to effectively work together in the pursuit of a shared goal, and to achieve gain for all its members.

A few comments can be made about this definition:

Of course, like Checkland's remarks on systems being epistemology rather than ontology, this is just one view of what constitutes a cooperative system. Others may have different views, emphasising the technology more or the group dynamics more. For me, keeping these things in balance is a key to understanding the nature of cooperative systems.

References

Argyle, Michael (1991). Cooperation: The Basis of Sociability. London: Routledge.

Axelrod, Robert (1984). The Evolution of Cooperation. New York: Basic Books.

Bateson, Gregory (1972a). Steps to an Ecology of Mind. Chandler.

von Bertalanffy, Ludwig (1969). General system theory: foundations, development, applications. New York: Braziller.

Boulding, Kenneth (1985). The World as a Total System. Sage.

Checkland, Peter (1981). Systems Thinking, Systems Practice. Chichester: John Wiley.

Heath, Christian and Paul Luff (1991). Collaborative Activity and Technological Design: Task Coordination in London Underground Control Rooms. Proceedings of ECSCW '91.

Hughes, John, Dave Randall and Dan Shapiro (1991). CSCW: Discipline or Paradigm? A Sociological Perspective. Proceedings of ECSCW '91.

Hutchins, Edwin (1991). Organizing work by adaptation. Organization Science, 2(1): 14-39.

Lewis, Paul (1994). Information Systems Development: Systems Thinking in the Field of Information Systems. London: Pitman.

Morgan, Gareth (1986). Images of Organisation. Sage.

Senge, Peter (1990). The fifth discipline: the art and practice of the learning organization. New York: Doubleday.

Sommerville, Ian, Richard Bentley, Tom Rodden and Peter Sawyer (1993). Cooperative Systems Design. Report CSCW/10/93, Computing Department, Lancaster University.



Magnus Ramage 24 October 1996