Cultural Audits: Multiple Mixed Methods in Action


The case study provides an analysis of the development of exploratory sequential mixed-methods research into issues of equal opportunities in the form of a cultural audit. The research was constructed from first principles, and the various stages are presented in detail. Its cornerstone was the construction and implementation of a questionnaire given to all members of the organization. This was informed by, and complemented with, a wide range of other methods, including interviews, focus groups, and documentary research. Some of the questions used were taken from earlier national surveys, and these facilitated the triangulation of the results of the cultural audit with known national patterns of values and attitudes. The case study also outlines how the cultural audit provided a powerful instrument for effective human resources management.

Learning Outcomes

By the end of this case, students should be able to

  • Understand how to implement exploratory sequential mixed-methods research
  • Understand the various stages in an empirical research project and how they complement each other
  • Recognize the difficulties in accessing a workforce for interviews, focus groups, and the administration of a questionnaire
  • Understand how to construct and administer a cultural audit of an organization
  • Understand how a cultural audit can inform management decision-making


This case study presents an example of exploratory sequential mixed-methods research. The case study presents the various stages involved in the creation of a cultural audit of equal opportunities. Such a tool can provide both a useful social scientific technique and a powerful resource for management. The case explains how the cultural audit was devised and the process for its use in the field. Central to the case is the notion of employing multiple methods for effective research. The case itself involved the use of interviews, documentary evidence, and focus groups as well as the development of a questionnaire. The processes whereby these were produced are covered in detail.

Mixed Methods

The present case study is an example of mixed-methods research. This approach to social research has become far more popular over the last 25 years. It involves an integrated combination of the collection and analysis of qualitative and quantitative data (see Creswell, 2015, for a succinct guide to this methodology). It is based upon an attempt by social researchers to move beyond the sterile arguments about the exclusive advantages of either quantitative or qualitative methods (see Bryman, 2006; Sieber, 1973). The specific type of mixed-methods research presented and analyzed in this case study involved what Creswell and Plano Clark (2011) term “exploratory sequential design.” This involves the initial exploration of phenomena using qualitative methods, followed by the development and deployment of a quantitative phase grounded in the earlier qualitative analysis.

The Context

The context for the development of the cultural audit involved a combination of internal and external factors. As a social scientist at Lancaster University I had developed an expertise in the areas of ethnicity and gender (see Penn, 1993, 1998, 1999). I was involved in teaching students about issues of ethnic diversity and wider issues of equal opportunities, including how it can be assessed empirically.

Lancashire Fire and Rescue Service (LFRS) approached Lancaster University for assistance and the request was passed on to me. I arranged a meeting with the Assistant Chief Fire Officer in my office at the university and he outlined their problem as an organization. The Home Office Fire Inspectorate (1999) had recently produced a Report that claimed that British fire services were “sexist,” “racist,” and “homophobic.” The Report had been presented to the entire workforce in a series of meetings and the reaction from the workforce had been hostile, disbelieving, and negative. The senior management had decided that they needed to assess the nature of attitudes to gender, ethnicity, and sexuality within their organization. They told me that they wanted to commission a cultural audit. I pointed out that university-based research was not cheap and that we needed certain guarantees. The most important was the issue of anonymity and confidentiality. It was made clear that any information collected at the individual level would be anonymous and that such data would not be passed on to the Fire Service. This stipulation was crucial for our impartiality and central to our academic integrity. They were also asked whether they were willing to accept the findings of our research irrespective of our results and conclusions. They provided such reassurances, and it became clear in these early discussions that they understood the basic principles of academic research.

Cultural Audits

The origins of cultural audits as research tools were rooted in the new managerialism within the public sector that developed during the 1980s. This originated in the field of financial accounting but spread into new areas and acquired a new set of meanings. This conceptual inflation produced “academic audits,” “health and safety audits,” and even “democratic audits.” All represented elements of a new regime of control that has been labeled “coercive accountability” (see Shore & Wright, 2000). This genealogy has meant that the dominant understanding of an audit is that it is a top-down instrument of managerial control. Many see audits as negative and punitive. However, there is an alternative way of looking at and conducting an audit which is positive in orientation. An audit can be a valuable way for employees in an organization to express their values, and it can provide an antidote to the solipsism so prevalent in modern management. By this I mean the tendency of managements to believe their own rhetoric and to regard how their employees actually behave as irrational and based on “miscommunication.” We shall see several examples of this in the detailed case study itself. The Chartered Institute of Internal Auditors (2014) has recently extended its remit to encompass cultural audits, but these remain highly formalistic and have nothing to say about equal opportunities. It is the contention of this case study that cultural audits of equal opportunities remain highly relevant in the contemporary context and that presenting the process whereby the cultural audit was developed and implemented remains both methodologically and substantively important.

Organizational Culture

The notion of organizational culture became central to management studies after the publication of Hofstede’s (1980) seminal research on IBM. Subsequently, the concept has been refined to encompass values and meaning systems within an organization (see Needle, 2004). Organizational culture signifies the taken-for-granted, “common sense” ways of doing things in any particular organization. It can be seen as the totality of cognitive interpretations of a workforce at any given time. These form a structure of cultural parameters that can (and often do) exist over time. Anyone who has changed universities for another job, for instance, will immediately understand that as a new person he or she must understand the implicit assumptions dominant within that environment to function successfully (see Ravasi & Schultz, 2006).

Organizational culture can be seen as a set of common assumptions that defines appropriate ways of behaving in a variety of contexts. In a more abstract sense, it can be seen as the prescribed script for behavior that new members must internalize through a process of socialization.

The Empirical Research Process and Design

The Previous Report

To conduct a cultural audit into LFRS, the first prerequisite was to read the Home Office Inspectorate’s Report itself. This Report, written by people with many years’ direct experience of the British Fire Service, was very critical of many aspects of equality and fairness within the Fire Service.

The Report found that the “overwhelming view of uniformed staff (i.e., firefighters) was strong opposition to women being employed in the operational fire service” (p. 23). Many reasons were adduced for this. It was often claimed that women were not capable of doing what was taken to be a “man’s job.” Some had genuine but misguided concerns about the health and welfare of women in the Fire Service over the longer term. Lack of physical strength and fitness to do the job were frequently cited alongside concerns that standards should not be dropped to allow women to get through the physical tests which are part of the recruitment process.

Sexual harassment of female employees within the Fire Service was noted as common, and the Report claimed that incidents of sexual harassment were often rationalized by Firefighters and senior officers on the basis that women were “not up to the job.” The Report further noted that “All the women who had taken cases to employment tribunal had left the service” (p. 24). It also concluded that there was a need for a “major change in the culture of a service that is so strongly opposed to women being able to join the uniformed service and perform operational duties” (p. 66). Overall, the Report concluded that the Fire Service was “institutionally sexist” (p. 68).

It also stated that members of the Fire Service had a “more balanced” opinion in relation to the right of men from ethnic minority backgrounds to join the service when compared to their general hostility toward female recruits. Nevertheless, some real concerns were noted in this area as well. In particular, the Report noted “a marked lack of understanding of the need for diversity in the service and the issues that need to be addressed to achieve diversity” (p. 25). Acceptance of ethnic minorities, as well as women, was based upon the notion of “fitting in,” with little or no account taken of differences in the traditions, backgrounds, values, norms, and preferences of those not from the dominant ethnic group. The Report concluded that there was “a surprising and worrying ignorance of matters relating to race and culture” and that on the basis of the interpretation provided by Sir William MacPherson’s Inquiry (1999) into the murder of Stephen Lawrence, the Fire Service “might be held to be ‘institutionally racist’” (p. 68).

The Report went on further to claim that the subject of homosexuality was an “absolute taboo” for most members of the service and that there was little understanding or tolerance of the issues involved. Some members of the Fire Brigades Union’s Gay and Lesbian Support Group reported that they had experienced “extreme opposition” and expressed particular concern about the attitudes of middle-aged, middle-ranking officers. It concluded that the Fire Service was “not currently capable of dealing positively with sexuality” (p. 69).

Overall, the HM Fire Inspectorate’s Report was highly critical of the current situation in relation to equal opportunities throughout the British Fire Service and portrayed the Fire Service as sexist, racist, and homophobic. However, the actual empirical evidence presented to support the Report itself was both anecdotal and highly selective.

Literature Review

A central part of any research project involves a literature review of pertinent previous work (see Fink, 2013, and Ridley, 2008, for the best texts on how to undertake an effective literature review). This proved to be complicated, as the terrain covered research on equal opportunities, on fire services themselves, and on cultural audits as a method of research. Social scientific research into fire services was virtually non-existent, and what there was had an exclusively U.S. focus. The literature on equal opportunities, on the other hand, was extensive, but there were no examples at that time of cultural audits into issues of gender, ethnicity, and sexuality. We concluded that we needed to administer a questionnaire to all employees at LFRS in order to develop a comprehensive assessment of their attitudes, beliefs, and practices associated with issues of equal opportunities. This decision was guided by a combination of practical considerations and by a range of general literature on research methods (see Bryman, 2015; Fowler, 2008).

The reasons for administering the questionnaire to the entire workforce rather than to a smaller sample were twofold. First, we wanted to maximize the legitimacy of the research. If everyone from the Chief Fire Officer to the cleaning staff had an opportunity to provide their responses, we felt that this would avoid the assumption that only a section of the workforce was relevant to an overall assessment of the culture. Second, we did not want to generalize from a smaller sample. In this case, the sample was the population—a technical term in statistics which means “everyone” and is often termed a “complete enumeration” or a “census” (see Fielding & Gilbert, 2006; Treiman, 2009). Such an approach increased the cost of the research, but only by a very small amount. LFRS employed 1,650 people, and we secured agreement that we could distribute a questionnaire to all members of the workforce at work and that respondents would be encouraged to complete it during working hours.

However, we had no template to guide us. We needed to construct a questionnaire from first principles. This was informed by the main current texts on questionnaire design (see Gillham, 2008; Oppenheim, 2008). Before constructing it, we needed to complete a series of related pieces of research to facilitate the construction of an appropriate instrument. As with all questionnaire construction, this involved a range of complementary research methods. The research therefore embodied multiple research methods, which included interviews, secondary data analysis, and focus groups in a specific combination and order.

Preliminary Research

The initial phase of the research involved interviews with senior management. These were based upon open-ended, semi-structured interviews involving the research team and respondents (the best current guides to interviewing are Arksey & Knight, 2015; Gubrium, Holstein, Marvasti, & McKinney, 2012; Holstein & Gubrium, 2015; Wengraf, 2009). Notes were taken during these interviews and transcribed immediately afterwards. These notes proved invaluable as a guide to the overall terrain to be examined. In particular, we learned about the history of equal opportunities in the fire service and how senior management had developed procedures to monitor these. From these interviews, we began to build up a picture of the social aspects of the organization and the issues surrounding equal opportunities.

Documentary Evidence

As a result of these interviews, we sought (and obtained) details of these procedures and how they had been functioning over the previous 5 years. This included detailed documentation of their procedures for dealing with complaints from staff about bullying and sexual harassment. The research team also obtained documentary evidence that provided information on the gender and ethnic characteristics of the existing workforce. There were no data collected at that time on sexuality within LFRS. Subsequently, we obtained documentary evidence on the characteristics of applicants for jobs within the fire service in the previous year, the characteristics of successful applicants and the process of attrition during the initial phase of training prior to deployment in the various roles.

The pattern of employment within the LFRS certainly indicated major imbalances within the workforce. Only 174 out of 1,650 employees were female: this represented 10.5% of the total. Of these, 42 worked in the Control Room where 999 calls were taken, while 6 out of 1,013 firefighters were female. Only 12 employees overall were from ethnic minority backgrounds. There were none in the Control Room and only 0.4% within the complement of firefighters.

Recruitment figures made even more stark reading. Between April 1, 1999 and March 31, 2000, there had been 1,180 applications to join the ranks of firefighters at the LFRS. Of these, 77 were female (75 were white females) and 47 were from ethnic minority backgrounds (including 11 Indians and 14 Pakistanis). None of the women were successful nor were any of those from ethnic minority backgrounds. The successful applicants comprised 59 white males.
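The imbalances described above can be verified with simple arithmetic. A minimal sketch in Python; the figures are those reported in the documentary evidence, while the variable names are my own:

```python
# Workforce composition figures reported in the cultural audit.
workforce_total = 1650
female_total = 174
firefighters_total = 1013
female_firefighters = 6

print(f"Female share of workforce: {female_total / workforce_total:.1%}")  # → 10.5%
print(f"Female share of firefighters: {female_firefighters / firefighters_total:.2%}")

# Recruitment figures, April 1, 1999 to March 31, 2000.
applications = 1180
female_applicants = 77
ethnic_minority_applicants = 47
successful = 59  # all successful applicants were white males

print(f"Female share of applicants: {female_applicants / applications:.1%}")
print(f"Overall success rate: {successful / applications:.1%}")
```

The calculation confirms the 10.5% figure given in the text and makes explicit how narrow the recruitment funnel was: an overall success rate of around 5%, with no successes at all among female or ethnic minority applicants.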

Focus Groups

The construction of a questionnaire as a research instrument is a complex process. In order to gauge how members of LFRS thought about issues of gender, ethnicity/race, and sexuality, a series of focus groups was organized (see Krueger & Casey, 2014; Liamputtong, 2012; Morgan & Krueger, 1997, for the best guides to focus groups). Focus groups have been defined as “a research technique that collects data through group interaction on a topic determined by the researcher” (Morgan, 2003, p. 324). They highlight respondents’ attitudes, language, and frameworks of understanding (see Kitzinger, 2003). Breen (2006) argued strongly that focus groups can be used effectively to provide insights into the language of descriptions used within an organization by its members. Focus groups can also be used to help identify topics for the main research to focus upon. Sometimes, a researcher will not know all the issues or even the right questions to pose. A focus group can be particularly useful at the initial stages of a research project. It can also be used to explore initial findings via another type of method.

They are by no means new. Bogardus (1926) and Merton and Kendall (1956) were early advocates of their use as a research technique, but they have certainly become more prevalent over the last 30 years. Focus groups have many advantages. They are particularly useful for probing group attitudes, feelings, and beliefs. They are also cheaper and more efficient than individual interviews. Attendance at the focus groups took place during working hours and was voluntary. The latter principle is essential for proper focus group research. Anonymity and confidentiality were guaranteed by the research team; this is also paramount. Indeed, only first names were used by participants, and these were not recorded. However, there is also a potential downside: dominant members of a group may influence others and/or render them reluctant to voice their views. This has to be carefully managed by the moderator.

A series of focus groups was organized prior to the administration of the questionnaire to familiarize the research team with issues of equal opportunities as seen by various parts of the LFRS. The focus groups were not taped due to the sensitive nature of the topics under review. Notes were taken by the assistant moderator, a detailed debriefing took place between the moderator and the assistant immediately afterwards, and a written summary was produced. The focus groups took place in two Divisions, at the Service Headquarters, among Control Room staff, and among a group of Retained Firefighters. Membership of these focus groups, apart from the Training Centre, was selected at random by the research team: a particular shift (or “watch”) was selected at random, and everyone working at the time of the meeting was included in the focus group. All sessions took place during working hours, which meant that all respondents were paid by their employer to attend. Being at work might have inhibited some participants but, in practice, this was not evident, as comments were forceful and uninhibited. The segmented nature of the sample reflected the organizational structure of the LFRS itself and the desire of the research team to scrutinize all parts of the organization.

The final focus group took place at the Training Centre. Unlike all the other focus group meetings, which had been well organized internally, the Training Centre focus group proved extremely difficult to arrange. Nobody contacted the research team, as had been arranged by the senior management at LFRS, and nobody met us when we arrived there at the time given to us. We gained a strong impression upon arrival that our presence was less than welcome. This was in marked contrast to our reception at all the other focus groups. This would appear to be a serious cause for concern: the Training Centre was where recruits first encountered LFRS in any systematic way. If our experiences were anything to go by, the Training Centre came over as an unwelcoming place to outsiders. The actual focus group provided at the Training Centre proved illuminating. The participants were helpful and friendly. However, none of them were actually involved in directly providing the main training courses for LFRS personnel! This was of concern to the research team, as was highlighted in the Final Report provided to the LFRS. The overall number of focus group interviews undertaken was six, which sat well with the norms of focus group research. Indeed, Morgan (2003) argued that “most projects consist of four to six focus groups” (p. 337). The numbers in the focus groups varied between 6 and 14. The largest focus group, at the Control Room, was hard to moderate, and a smaller group of between six and eight would have been optimal (see Krueger & Casey, 2014; Morgan & Krueger, 1997).

The moderator, who was skilled in group discussions, explained the purpose of the focus groups at the outset. Respondents were informed that the research team was conducting a cultural audit into issues of equal opportunities and was using the focus groups to gain an initial insight into how members of the LFRS saw these issues themselves, with a view to designing and administering an effective questionnaire to everyone working there. The moderator explained that there were no right or wrong answers, only differing points of view.

The sequence of each focus group followed the same broad trajectory. Issues surrounding gender were raised first, and these were followed by questions on ethnicity/race and sexuality. All three were contentious topics both within the organization and in the wider society. Each area was probed via open-ended questions such as “why do you think that there are so few female firefighters in the LFRS?”

All focus group participants were to a degree defensive initially. Many reported that they felt “insulted” by the HM Fire Service Inspectorate’s Report, and almost all felt resentful about its central claims. Indeed, this widened into a broader set of attitudes that included feelings that equal opportunities issues had become far too central to LFRS and that recent efforts to improve the ethnic/gender balance within the workforce had gone “too far” and had generated a “climate of fear,” particularly about the limits of acceptable language while at work. Many participants told the research team that they were very unsure about what they could and, conversely, what they could not say in the current situation. In addition, many respondents felt that the Special Awareness Courses for women and ethnic minorities at the Training Centre were themselves intrinsically “unfair” since they excluded white males. These results illustrate one of the central advantages of using focus groups: they can reveal issues that might not emerge from a questionnaire. Focus groups are particularly useful for uncovering unexpected findings.

The focus groups were cumulative and iterative. Their central advantage was that they helped the research team progressively to understand the issues of equal opportunities as perceived by the workforce itself. They also provided an insight into the everyday language used to discuss equal opportunities and, indeed, a route into the wider language of the organization. Every organization has a set of codes to describe itself that can prove obscure to outsiders. I had encountered this in several earlier pieces of research that I had conducted. The first involved participant observation among telephone maintenance technicians (see Penn, 1990). The language that they used to describe British Telecom and their jobs was, to put it mildly, opaque. Part of any successful fieldwork is “learning the language of descriptions” (see Garfinkel, 1967). I experienced very similar issues when I researched the changing skills of coal miners (see Penn & Simpson, 1986). I spent a great deal of time discussing with miners the occupational system at the time of the research as well as in previous periods. This proved extremely complicated to achieve.

The Questionnaire

The questionnaire was pivotal to the cultural audit. Questionnaires are an efficient method of asking the same questions of a large number of respondents (see Gillham, 2008, and Oppenheim, 2008, for excellent guides to questionnaire design). In this case, the number was 1,650. The questions were grounded in the previous hermeneutic probing of the world of the LFRS undertaken via interviews, focus groups, and documentary evidence. As such, it exemplified an “exploratory sequential” mixed-methods research design (see Creswell, 2015). The precise questions were based upon conjectures about the nature of their social and cultural world. Questionnaires can be seen as analogous to a conversation between respondent and researcher, and the same protocols of politeness apply. The questionnaire was approved by the senior management and by representatives of the three trade unions that operated within the LFRS. These acted as two sets of gatekeepers for the cultural audit. The questionnaire was given to respondents to fill in during working hours and was accompanied by a letter from the Chief Fire Officer and the local trade union representatives urging them to complete it. After completion, the questionnaires were placed in a sealed box (akin to a ballot box in elections) and subsequently picked up by the research team.

  • Cover Sheet

The questionnaire began with the cover sheet (see Questionnaire in Appendix A. Details of how to access this are provided at the end of the Case). This is crucial to any questionnaire and is central to achieving a good response rate. The cover sheet was printed in bold and gave Lancaster University in capitals and “Lancashire Fire and Rescue Service Cultural Audit” immediately below in the centre of the page. My personal details were written below that. At the bottom of the cover sheet was a box containing the following text:


This questionnaire is confidential and anonymous. All questionnaires will be processed and held at Lancaster University. NO information will be passed on to anyone that could allow for the possibility of identifying persons completing them.

This was very important as respondents had to trust the research team. As Lancaster was the local university, this was facilitated by the high prestige of higher education and served to differentiate the research team from external consultancy firms. The confidentiality and anonymity guarantees provided a strong ethical component to the research.

  • Types of Questions

The questionnaire contained a range of different types of question. This is good practice as it helps engage the respondent in the process of data collection itself (see Gillham, 2008, p. 39).

The early questions were factual ones such as gender and age:

1. Are you? (Please tick)

(i) Male

(ii) Female

2. What is your age? (Please tick)

(i) 16–25 years

(ii) 26–35 years

(iii) 36–45 years

(iv) 46–55 years

(v) Over 55 years

The adjective “factual” refers to the type of information sought, not the accuracy with which it is given (see Moser & Kalton, 1979, p. 315). The first question concerned gender and was binary in form. Clearly, the world of gender and sexuality is more complex than this, but the research team judged this simple binary question as the most acceptable to respondents. Probing further would have risked alienating respondents who would have, in all likelihood, regarded such questions as highly intrusive.

The second question on age used mutually exclusive and exhaustive age bands rather than specific age. Generally, respondents are more willing to answer such questions than to provide their precise age (see Oppenheim, 2008).

The section on Employment Details (Questions 4 and 5) proved the most controversial in terms of getting prior approval from the three trade unions operating within the LFRS. They were very wary about questions that they felt could potentially have led to an individual being identified from their answers, despite the guarantee of anonymity. Normally, it is bad practice to use acronyms in questionnaires, as they are often misunderstood or unknown to respondents. However, the terms “SO1” and “Scale 6” were well understood within the organization.

Question 6 used a filter to inform respondents that it was for retained firefighters only. This was designed to obtain additional information on retained firefighters who worked in the community and acted as part-time firefighters mainly in rural areas.

Questions 8 and 9 (“How did you originally find out about joining the LFRS?” and “If you had a job before joining LFRS, please give brief details below”) were both open-ended. This was because the research team was unsure how to provide a fixed set of categories (boxes) for the question. It is generally good practice to use closed questions where the potential outcomes are clear-cut (such as Question 11, “Please indicate the HIGHEST level of educational qualifications you hold…”) but to use open-ended questions where these choices are indeterminate (see Oppenheim, 2008). Open-ended questions involve much more work after the questionnaires have been completed. Each response needs to be coded, and this can be very time-consuming if respondents write a lot of information. Indeed, this was the main reason why the space allocated was relatively small. Coding open-ended responses is a form of content analysis (see Krippendorff, 2012, and Weber, 1990, for excellent guides to this technique).

The question on “highest educational qualifications” had four options, with the fourth being “other.” This is a standard device in questionnaire construction (see Gillham, 2008), and the space below allowed respondents to provide answers not covered by the first three previous options. These could have included respondents educated outside England and Wales.

The questionnaire also included a wide range of questions designed to probe respondents’ attitudes, beliefs, and values about gender, ethnicity/race, and sexuality. These involved eight questions (Q12–14, Q19–20, and Q22–24) on these topics where respondents were asked to evaluate a statement by selecting one of five categories ranging from “strongly agree” through to “strongly disagree.” Such questions are standard in social science practice and are known as Likert-type items. Question 13, for example, asked for opinions on the statement “women can handle job pressures as well as men.” Several questions were truncated in form and simply asked whether respondents agreed or disagreed with a statement (Q21 and Q29–35). Question 21 asked if “it is better for everyone in organizations like LFRS if homosexuals keep their sexuality a secret.” The choices were “agree” or “disagree.” There were two main reasons for adopting such a simple format. The first was to reduce the length of the questionnaire to make it easier for respondents to complete. The second was that we wanted respondents to make a binary choice to force an answer. Other questions sought attitudes to whether certain jobs were “particularly suitable for men only, women only or both men and women” (Q16).

There were also questions that had three choices for respondents (Q15, Q17–18, and Q25–28). Question 25 asked respondents if they were “very prejudiced against people of other races,” “a little prejudiced,” or “not prejudiced at all.” Several questions were open-ended (Q36 and Q37). These are an important resource as part of a cultural audit but need to be limited as their analysis is very time-consuming. The prior focus groups were very influential in our choice of both questions and language used.
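Before analysis, Likert-type items of the kind described above are conventionally converted to numeric scores. A minimal sketch of such a coding step in Python; the label of the middle category and the example answers are my own invention, not the audit's actual wording or data:

```python
# Map five-point Likert responses to numeric scores (5 = strongly agree).
# The endpoint labels follow the "strongly agree" ... "strongly disagree"
# scale described in the text; the middle label is assumed.
LIKERT_SCORES = {
    "strongly agree": 5,
    "agree": 4,
    "neither agree nor disagree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

def score(responses):
    """Return numeric scores for a list of Likert responses."""
    return [LIKERT_SCORES[r.strip().lower()] for r in responses]

# Hypothetical answers to an item such as Q13
# ("women can handle job pressures as well as men"):
answers = ["Strongly agree", "agree", "Disagree"]
print(score(answers))  # → [5, 4, 2]
```

Normalizing case and whitespace before lookup, as here, is a small but practical safeguard when responses are transcribed by hand from paper questionnaires.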

The questionnaire was organized thematically (see Gillham, 2008, on the importance of effective sequencing of questions). The successive fields were Personal Details, Employment Details, Education, Gender, Sexuality, Ethnicity/Race, and specific LFRS Issues. The earlier sections established the background of respondents in areas such as gender, type of job, and level of educational attainment, generating a set of “explanatory” variables. Subsequent sections produced the “response” variables, which were interpreted in terms of their relationships to these “explanatory” variables.
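The relationship between an “explanatory” variable and a “response” variable is typically examined through cross-tabulation. A minimal standard-library Python sketch of the idea, using invented records rather than the actual questionnaire returns:

```python
# Cross-tabulating a response variable (an attitude answer) by an
# explanatory variable (occupational group). The records here are
# hypothetical, purely for illustration.
from collections import Counter

# Each record: (occupational group, answer to an attitude question)
records = [
    ("Firefighter", "Agree"), ("Firefighter", "Agree"),
    ("Manager", "Disagree"), ("Control Room", "Disagree"),
    ("Firefighter", "Disagree"), ("Manager", "Agree"),
]

crosstab = Counter(records)  # counts of (group, answer) pairs

groups = sorted({g for g, _ in records})
answers = sorted({a for _, a in records})
for g in groups:
    row = {a: crosstab[(g, a)] for a in answers}
    print(g, row)
```

In practice a statistical package builds such tables directly, but the underlying operation is exactly this counting of (explanatory, response) pairs.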

Clearly, the questions in the questionnaire varied in format. This helped to make the instrument more attractive to respondents and so raised the overall response rate, which was 44.2%. This was very good, especially given the sensitivity of many of the questions.

Triangulation

The triangulation of the results of the cultural audit with prior research at the national level was critical to the appropriate and relevant interpretation of the results (see Denzin, 2017, for an extensive discussion of various types of triangulation). Nationally, there was evidence of sexist, racist, and homophobic attitudes at the time of the research. These had been presented in the annual British Social Attitudes Survey, as well as in the British Household Panel Study (now renamed Understanding Society) and the International Social Survey Program. The cultural audit replicated questions used previously by all three instruments (see Q12, Q15–16, Q17–18, Q22, Q25–26, Q28, and Q31–33). Using these questions had two great advantages: one methodological and the other hermeneutic. The questions taken from earlier surveys had been piloted and tested before use in the field as part of large-scale national surveys, so the research team could be confident in their quality. It was also possible to compare the responses within the LFRS with known national averages. For example, the research showed a very similar pattern of stated levels of “prejudice against people of other races” as had been evident in an earlier British Social Attitudes Survey (see Table 35 in the “Final Report to LFRS” in Appendix B). This led us to conclude that the LFRS reflected broader national patterns and that there were not pronounced levels of racist values among its workforce. However, behind this overall figure, it was clear that women working for the LFRS, particularly those in the Control Room, reported far higher levels of prejudice. This issue was highlighted in the Final Report.
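Benchmarking a sample figure against a known national proportion can be formalized as a one-sample proportion test. The sketch below uses only the Python standard library; the counts and the 28% national figure are invented for illustration (the actual percentages are in the Final Report), and the function name is our own.

```python
# One-sample z-test comparing an organizational proportion against a
# national benchmark (illustrative numbers, not the reported results).
import math

def proportion_z(successes: int, n: int, p_national: float) -> float:
    """z statistic for H0: the organization's true proportion equals
    the national benchmark p_national."""
    p_hat = successes / n
    se = math.sqrt(p_national * (1 - p_national) / n)
    return (p_hat - p_national) / se

# Suppose 90 of 300 respondents select "a little prejudiced" (30%)
# against a hypothetical national figure of 28%.
z = proportion_z(90, 300, 0.28)
print(round(z, 2))  # -> 0.77; |z| < 1.96, so no significant departure at 5%
```

A non-significant z is exactly the kind of evidence that supported the conclusion that the LFRS reflected broader national patterns.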

Data Analysis

The data from the completed questionnaires were entered into the statistical software package SPSS (see Field, 2017, for an excellent guide to using SPSS). This permitted the generation of tables and associated tests of statistical significance. These were the type of results that the audience for the cultural audit could understand. Clearly, more advanced statistical modeling of the data would have been possible, but the tabular results were sufficient for the recipients of our final report. It is always important to know the appropriate level of statistical knowledge among the intended audience.
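The tables and significance tests were produced in SPSS, but for readers who want to see what the chi-square test of independence actually computes, here is a self-contained Python sketch built from first principles. The counts are hypothetical, chosen only to resemble a three-by-three attitude table; they are not the figures from the report.

```python
# Chi-square test of independence for a contingency table, computed
# from first principles (hypothetical counts, not the report's data).

def chi_square(table):
    """Return the chi-square statistic and degrees of freedom for a
    rows x cols table of observed counts."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / total  # expected under independence
            stat += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

# Rows: three occupational groups; columns: "gone too far" /
# "about right" / "not far enough" (invented counts).
observed = [
    [40, 30, 10],
    [ 5, 20, 10],
    [ 4, 15, 11],
]
stat, df = chi_square(observed)
print(round(stat, 1), df)  # -> 23.3 4; well above 9.49, the 5% critical value for df=4
```

A statistic above the critical value indicates that attitudes and occupational grouping are associated, which is the form of result reported in tables such as Table 22 below.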

Table 22 from the report illustrates this style of analysis and presentation:

Table 22. “Do you feel that attempts to give equal opportunities to homosexuals in Britain …?” by Occupational Grouping. [The full table is reproduced in the Final Report in Appendix B; its response categories were “Have gone too far,” “Are about right,” and “Have not gone far enough.”]
The table compares attitudes to homosexuality across groups within the LFRS. Clearly, firefighters felt that “attempts to give equal opportunities to homosexuals in Britain” had “gone too far” to a significantly greater extent than managers or Control Room Staff (FCOs). This was highlighted in the Final Report.

The Final Report

The cultural audit fed directly into the final report to the LFRS (see Appendix B). This contained a short Executive Summary with a limited number of bullet points. These were necessarily short and punchy, as most readers would only read these and the recommendations at the end of the report. There were 14 main recommendations written in clear and simple language, and these provided a catalogue of issues that senior management could then prioritize. In fact, the first areas to be tackled as a result of the Cultural Audit were the issues of bullying and sexual harassment. The analysis revealed that these were common and had continued to feature over the previous 2 years (see Tables 49 and 50 in the Final Report; details of how to access this are provided at the end of the case).

This also illustrates that a properly conducted cultural audit can be an effective resource for management. It provided clear empirical evidence of problems within the organization that centered upon the attitudes and beliefs of staff. It can also act as an effective antidote to the classic management response to criticism, the claim that “we have procedures to deal with that.” It was abundantly clear from the Cultural Audit that these procedures did not have the confidence of many members of staff.

Management were very concerned about the high levels of bullying and sexual harassment reported in the Control Room. This was investigated subsequently and several members of staff were sacked as a result of these investigations.

Practical Lessons Learned

The main lessons of this case study involved the importance of integrating a wide range of different research methods and techniques for the successful implementation of a cultural audit. This involved the application of a mixed-methods perspective (see Cresswell, 2015). The approach adopted was exploratory sequential design, involving an initial exploration of the phenomena using qualitative methods followed by a subsequent quantitative phase. This entailed the construction, delivery, and analysis of a questionnaire given to the entire LFRS workforce. This was based upon prior research into the issues of equal opportunities using face-to-face interviews and focus groups. It also involved examining secondary data (including national surveys) and documents as well as conducting an extensive review of pertinent literature.

The other main lesson was that the specific cultural audit implemented within the LFRS could be contextualized and benchmarked against known national patterns of attitudes and beliefs through the process of triangulation.

Conclusion

The development of the cultural audit instrument outlined in this case study was innovative and exciting. In a very real sense, new territory was being explored. There were no templates for the research and the cultural audit was constructed from first principles. Central to the cultural audit was a questionnaire administered to all members of the organization. This proved important both for the legitimacy of the audit and for the overall response rate. The questionnaire itself was developed on the basis of other research techniques, including interviews, focus groups, and documentary research. Any successful mixed-methods research involves a multiplicity of related and complementary methods and techniques. The cultural audit presented in this case study provides a clear example of this in action.

Further Developments

The initial cultural audit for LFRS was well received by all parties. It received coverage in the industry’s main journal Fire (Penn, 2003), and this led to three further cultural audits in Durham, Northumbria, and Tyne and Wear. The instrument was refined, and the response rates improved considerably. Indeed, at Tyne and Wear, the response rate was 80.1%. Links to these subsequent cultural audits (all of which can be found easily on ResearchGate) are provided in the web resources listed below.


To gain access to Appendices A and B and for a copy of the Final Report, contact the author directly by email. They are also available via ResearchGate at the links provided in the Web Resources section.

Exercises and Discussion Questions

  • How important were the initial qualitative phases of the research for the subsequent development of the quantitative phase?
  • How could the cultural audit described be improved today? What new approaches could be used?
  • What are the ethical issues surrounding research into equal opportunities? How were they tackled in the cultural audit?
  • Which new data sets could be used for triangulation purposes in any new cultural audit?
  • How useful is a cultural audit? What other management strategies should complement it?

Further Reading

Arksey, H., & Knight, P. (2015). Interviewing for social scientists: An introductory resource with examples (2nd ed.). London, England: SAGE.
Field, A. (2017). Discovering statistics using IBM SPSS (5th ed.). London, England: SAGE.
Kreuger, R., & Casey, M. (2014). Focus groups: A practical guide for applied research (5th ed.). London, England: SAGE.
Krippendorf, K. (2012). Content analysis: An introduction to its methodology. London, England: SAGE.

References

Arksey, H., & Knight, P. (1999). Interviewing for social scientists: An introductory resource with examples. London, England: SAGE.
Bogardus, E. (1926). The group interview. Journal of Applied Sociology, 10, 372–382.
Brace, I. (2013). Questionnaire design: How to plan, structure and write survey material for effective market research (3rd ed.). London, England: Kogan Page.
Breen, R. (2006). A practical guide to focus group research. Journal of Geography in Higher Education, 30, 463–475.
Bryman, A. (2006). Integrating quantitative and qualitative research: How is it done? Qualitative Researcher, 1, 8–22.
Bryman, A. (2015). Social research methods (5th ed.). Oxford, UK: Oxford University Press.
Chartered Institute of Internal Auditors. (2014). Culture and the role of internal audit: Looking below the surface. London, England: Author.
Cresswell, J. (2015). A concise introduction to mixed methods research. London, England: SAGE.
Cresswell, J., & Plano Clark, V. (2017). Designing and conducting mixed methods research (3rd ed.). London, England: SAGE.
Denzin, N. (2017). Sociological methods: A sourcebook. London, England: Routledge.
Field, A. (2017). Discovering statistics using IBM SPSS (5th ed.). London, England: SAGE.
Fielding, J., & Gilbert, N. (2006). Understanding social statistics (2nd ed.). London, England: SAGE.
Fink, A. (2013). Conducting research literature reviews. London, England: SAGE.
Fowler, F. (2013). Survey research methods (5th ed.). London, England: SAGE.
Garfinkel, H. (1967). Studies in ethnomethodology. New York, NY: Prentice Hall.
Gillham, B. (2008). Developing a questionnaire (2nd ed.). London, England: Continuum.
Gubrium, J., Holstein, J., Marvasti, A., & McKinney, K. (2012). The SAGE handbook of interview research: The complexity of the craft (2nd ed.). London, England: SAGE.
Hofstede, G. (1980). Cultures and organizations: Software of the mind. New York, NY: McGraw-Hill.
Home Office Fire Inspectorate. (1999). Equality and fairness in the fire service. London, England: Home Office.
Kitzinger, J. (2003). The methodology of focus groups: The importance of interaction between research participants. In N. Fielding (Ed.), Interviewing (Vol. 1, pp. 347–364). London, England: SAGE.
Kreuger, R., & Casey, M. (2014). Focus groups: A practical guide for applied research (5th ed.). London, England: SAGE.
Krippendorf, K. (2012). Content analysis: An introduction to its methodology. London, England: SAGE.
Liamputtong, P. (2012). Focus group methodology: Principles and practice. London, England: SAGE.
MacPherson, W. (1999). The Stephen Lawrence inquiry (Cm 4261–1). London, England: Home Office.
Merton, R., & Kendall, P. (1956). The focused interview. New York, NY: Free Press.
Morgan, D. (2003). Focus groups. In N. Fielding (Ed.), Interviewing (Vol. 1, pp. 323–346). London, England: SAGE.
Morgan, D., & Kreuger, R. (1997). The focus group kit (Vols. 1–6). London, England: SAGE.
Moser, C., & Kalton, G. (1979). Survey methods in social investigations. London, England: Heinemann.
Needle, D. (2015). Business in context: An introduction to business and its environment (6th ed.). Boston, MA: Cengage Learning.
Oppenheim, A. (2008). Questionnaire design, interviewing and attitude measurement (3rd ed.). London, England: Pinter.
Penn, R. (1990). Skilled maintenance work at British Telecom. New Technology, Work and Employment, 5, 135–144.
Penn, R. (1993). Ethnicity, class and gender in the transition from education to employment: A report to Rochdale Training Enterprise. Lancaster, UK: Lancaster University.
Penn, R. (1998). Social exclusion and modern apprenticeship: A comparison of Britain and the USA. Journal of Vocational Education and Training, 50, 259–275.
Penn, R. (1999). The dynamics of decision-making in the sphere of skills’ formation. Sociology, 33, 619–638.
Penn, R. (2003, August). Cultural audit disputes “dinosaur service” tag. Fire, pp. 26–30.
Penn, R., & Simpson, R. (1986). The development of skilled work in the British coal mining industry, 1870–1985. Industrial Relations Journal, 17, 339–349.
Ravasi, D., & Schultz, M. (2006). Responding to organizational identity threats: Exploring the role of organizational culture. Academy of Management Journal, 49, 433–458.
Ridley, D. (2008). The literature review. London, England: SAGE.
Shore, C., & Wright, S. (2000). Coercive accountability. In M. Strathern (Ed.), Audit cultures: Anthropological studies in accountability, ethics and the academy (pp. 57–89). London, England: Routledge.
Sieber, S. D. (1973). The integration of fieldwork and survey methods. American Journal of Sociology, 78, 1335–1359.
Treiman, D. (2009). Quantitative data analysis: Doing social research to test ideas. San Francisco, CA: Jossey-Bass.
Weber, R. (1990). Basic content analysis. London, England: SAGE.
Wengraf, T. (2009). Qualitative research interviewing: Bibliographies, narrative and semi-structured methods. London, England: SAGE.