5 Critical review of research approaches

A critical review involves looking at the research process with an impartial eye: stepping back from the detail and considering the research as a holistic investigation, together with the plausibility of the conclusions and recommendations drawn from it. In doing so you are assessing the construct validity of the research: whether there are sound operational definitions and measures, multiple sources of evidence, and a chain of evidence that links the research questions, research methods, data and conclusions. This is complemented by judging three aspects of the work: the academic rigour brought to bear on the investigation; the contribution to the understanding of technology policy and innovation; and the accessibility and coherence of the project reporting style.

Key to this will be the conceptualisation of the problem. As Mukherjee and Wuyts say, ‘...good conceptualisation is akin to an opening in a chess game: it sets the stage for the game but does not predetermine its final outcome’ (in Thomas, Chataway and Wuyts, 1998, p.257). Conceptualising your research question will determine your approach to the data. But even then, the data do not speak for themselves. You will need to approach and interrogate the data from different angles, some of which may lead nowhere. Even where results are disappointing or unexpected, it is worth thinking critically about why this was so and why they failed to support the idea you had. Indeed, this may suggest an alternative idea that you would not have uncovered but for this more interrogative approach. So all data are meaningful, both in terms of what they tell you and what they do not. As Mukherjee and Wuyts say,

‘You approach data from different angles because you want to investigate rival notions in the light of evidence each brings to bear on the problem at hand. Through contrastive inference you seek to arrive at that notion which appears more plausible in the light of the overall evidence. Your interest is not just to test a particular idea in isolation ... What matters is to compare ideas. The implications are that you do not merely accept an idea because some evidence points towards it, because it may support other ideas equally well, and you pay particular attention to evidence which goes counter to an idea, because it may point towards a superior notion.

In policy analysis, you may make extensive use of secondary data relevant to your research question. It is important, therefore, to reflect carefully on the limitations inherent in the data. This is not just a question of checking the quality of the data. What matters as well is to question the type of questions data allow you to answer and which questions cannot possibly be tackled with the data at hand. This point matters equally when you collect your own data (for example, through survey research). The design of a survey is predicated by the type of questions you seek to answer.

Analysing data in relation to a specific question almost invariably implies further data manipulation ... Data manipulations are the ‘bread and butter’ of good applied research. Good conceptualisation of research often reveals itself first by the quality of data manipulations and, conversely, careless thinking often shows itself in the way data are organized or manipulated.

The question you seek to answer, the type of data at your disposal, and the way you manipulate and transform them, jointly set the stage for the kind of techniques best suited to address the question. This is not a one-way street. Often data are manipulated in certain ways so as to make them amenable to the use of a particular technique. Nor is it a one-shot effort.

In general, trying out different techniques using the same data to answer the same question is a good thing. This way you avoid turning technique into a fetish: a tool to impress rather than to gain insights.

Numerical skills matter provided they are not used in a mechanical way. They should be used in combination with, and not as a substitute for, clear conceptualisation of the problem ... Numerical skill in itself does not provide magical answers to problems. Good conceptualisation of the problem determines whether numerical skills are used appropriately or foolishly ... Few patterns in the data can be seen by merely staring at them. It is through data manipulations, data transformations and the calculation of numerical summaries or the use of graphs that conceptualisation interacts with numerical skill to bring out patterns in data which either test or generate ideas. There is no easy shortcut, but fortunately much mileage can be obtained with relatively simple techniques and extensive practice.

Finally, good data analysis should be fun, although it may involve hard work. There is something very exciting about ‘finding out’. This is why data analysis should never just consist of testing old ideas with new data, but new data should also generate new ideas (Heckman, 1992, p.884). Do not hammer your data into submission in a mindless exercise of number crunching. Nor turn your back on them in an attack of data phobia. It is worth learning to converse with quantitative data in a genuine theory-driven dialogue’ (in Thomas, Chataway and Wuyts, 1998, pp.257–259).
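To make this advice concrete, here is a minimal sketch of interrogating one dataset with rival techniques. It is illustrative only, not a prescribed method: the data are simulated, and the variable names (firm size and R&D spend) are assumptions chosen to give the numbers a technology-policy flavour. The code is plain Python with NumPy.

```python
# A minimal sketch (illustrative, simulated data) of the quoted advice:
# interrogate the same data with more than one technique, and manipulate
# the data to make them amenable to a particular technique.

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: firm size and R&D spend, where R&D grows
# multiplicatively with size, so the raw relationship is skewed.
size = rng.lognormal(mean=3.0, sigma=1.0, size=200)
rnd_spend = 0.05 * size ** 1.2 * rng.lognormal(sigma=0.3, size=200)

def pearson(x, y):
    """Linear association on the raw scale."""
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    """Rank-based association: robust to skew and outliers."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# Contrastive inference in miniature: compare what rival summaries
# of the same data suggest, rather than trusting one in isolation.
print(f"Pearson (raw scale): {pearson(size, rnd_spend):.2f}")
print(f"Spearman (ranks):    {spearman(size, rnd_spend):.2f}")

# Data manipulation in service of a technique: a log transform makes
# the multiplicative relationship approximately linear, so the linear
# summary now agrees with the rank-based one.
print(f"Pearson (log-log):   {pearson(np.log(size), np.log(rnd_spend)):.2f}")
```

No single number here is ‘the’ answer: the gap between the raw-scale and rank-based summaries is itself informative about the shape of the relationship, and the log transform illustrates manipulating data to suit a technique rather than turning technique into a fetish.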

Mukherjee and Wuyts’ guidance relates to quantitative data, but I think it is sound for qualitative data too: evidence cannot speak for itself; it has to be explored, interrogated and cross-checked if meaningful results are to be derived. Finally, I would like to leave you with a checklist for evaluating research. It is useful both for evaluating the research of others (when carrying out a literature survey, for example) and for ensuring that your own research is robust and authoritative.

  1. Is the research topic or problem clearly stated, and is its significance for technology policy and innovation made clear?
  2. Are the research questions unambiguous and appropriate?
  3. Is the research design strong, with a range of meaningful and trustworthy methods to facilitate triangulation, and have the ethical dimensions been considered?
  4. Does ‘pragmatic opportunism’ inform the research in order to set manageable boundaries?
  5. Does the research demonstrate competent and confident critical reflection on research practice, with steps taken to minimise the effects of the researcher and the research process on respondents?
  6. Is the presentation of results innovative, interesting, unambiguous and free from distortion?
  7. Are the conclusions and recommendations plausible and convincing?