
Introducing ethics in Information and Computer Sciences


1.7 Ideology

The notions of a final vocabulary and of ideology are closely related. Anthony Giddens defined ideology as ‘shared ideas or beliefs which serve to justify the interests of dominant groups’ (Giddens, 2006, p. 1020). There are all sorts of problems with this definition. One difficulty, for example, is that ideas and beliefs, if they have any kind of existence, are hidden away and have to be inferred from what people do and say. Another difficulty surrounds exactly how these things can be shared. Rorty, to some extent, avoids these issues by referring to word use, which can be directly experienced by observers or listeners, who then have some evidence enabling them to draw conclusions about the common word usage amongst a group. What Rorty cannot do, though, is provide a way of identifying whether or not a word is decisively part of a final vocabulary.

To illustrate this, let's look at another example taken from Shaw's Major Barbara. The main character states he will abide by ‘the faith of the armourer’, which, he says, is ‘to give arms to all men who offer an honest price for them, without respect of persons or principles’. The good reasons for selling arms are then rooted solely in an ‘honest price’. His successor proposes a different creed and asserts ‘I shall sell cannons to whom I please and refuse them to whom I please’. This subtly alters things, making ‘pleasing the armourer’ the foundation of the reasons for selling arms.

So various technologies might have ideologies, identifiable in the final vocabularies and rules that the technologists in the field share. Many of these premises about what qualifies as ‘good’ and ‘bad’ outcomes shift with changes in technology, so engineers need to be alert to avoid being trapped in a tradition that can no longer justify its maxims.

Example 6: Number of transistors in electronics design

In the electronics industry it used to be desirable to minimise the number of transistors in a design. For many designers this was accepted as an unquestionable rule. Others may have questioned the rule but would have gone on to support it when it was expressed in terms of costs and reliability. But now, with improved manufacture and millions of transistors on a chip, this is no longer the imperative it once was, and that can only be revealed by treating the rule as something to be analysed rather than obeyed. One of the things that ethicists can do is to look at the rules that people use and see if they can be broken down or analysed. This sounds like a call to question everything but, of course, unbounded questioning does not get things done, and we have to accept that to get things done we have to limit our questioning and mainly get on with the vocabulary we have. Delay, itself, can have bad consequences.

One thing that the study of ethics investigates is the consequences of adopting a particular final vocabulary or ideology, sometimes showing how it can be analysed or broken down into other terms. Often, when you look into texts on ethics, the final vocabulary is reduced to terms like ‘pleasure’ and ‘pain’, ‘duties’ or, perhaps, ‘virtues’ and ‘vices’. The final vocabulary in technology development is more often than not reduced to financial terms (‘money’) or to a set of quantified ‘risks’. But if you try to reduce it in this way, there are always things that will be missed out, and of course one of the dangers of ideologies is that important things are ignored.

Activity 4

You might like to read John Monk's text ‘Risk is not Ethics’, available by clicking the link below. The penultimate paragraph points out that people with different interests will have different views on how to apply the word ‘risk’. A contrast is bound to arise when a product or service provider and the user assess ‘risk’. Indeed, a service or product user may shy away from using the term ‘risk’ since it implies ‘bad’ consequences are possible.

Risk is not Ethics

Just to round off this section, I thought it would be worth mentioning that ethicists have developed a shared (even if the meanings are contested) terminology to categorise and deal with issues within the remit of their studies. Box 2 presents some of the basic terms you will find in academic texts on ethics, and I have included them here just to give you a flavour of what you will find if you follow an academic route.

Box 2: Some terminology

Deontological: related to the ‘rightness’ or ‘wrongness’ of actions, often expressed as duties such as the ‘duty of care’ that we might expect a factory manager would have towards factory workers.

Consequentialism: related to setting store by outcomes, however they are achieved.

Virtue ethics: concerns the formation of ‘good’ character and presumes ‘good’ people will bring about ‘good’ things.

Utilitarianism: concerned with maximising utility, which has been interpreted as maximising ‘happiness’ or ‘pleasure’. ‘Cost-benefit’ is consistent with this, but it expresses value in monetary terms.