Diffusion of the Personal Computer Innovation in Public High Schools: A Mixed Methods Approach

Abstract

Based on the work of E.M. Rogers regarding the diffusion of innovations (i.e. change), the study described in this case was designed and conducted to test the relationships of key psychological variables to the behavior of the participants (high school teachers) regarding their degree of personal computer use. The merger of specific quantitative and qualitative methods allowed for the quantification and comparison of the key variables while providing a rich understanding of the history and current context relative to the innovation in question. The study led to the conclusion that an individual’s perceptions of and confidence with an innovation are clearly connected to the decision to use it. However, as Rogers noted, the conditions and characteristics of the decision-making unit also clearly impact the rate of diffusion of an innovation among teachers. The combination of research methods makes the mixed methods approach particularly useful for studying psychological issues.

Learning Outcomes

By the end of this case, students should be able to

  • Discuss why this case was enhanced by the mixed methods approach
  • Explain how qualitative data can be used to provide context for quantitative data
  • Discuss some of the limitations of quantitative survey data
  • Describe the parameters of mixed methods research in terms of research questions, methods, and outcomes

Project Overview and Context: Diffusion of Personal Computers among High School Teachers

The first time I employed a mixed methods research design was in the 1980s while completing my doctoral dissertation. When I started the program, the faculty had decided that their doctoral candidates needed more exposure to all available research methods. Consequently, they created a new course in which faculty experts spoke on a different research method each week. The lectures were followed by a 2-hour small-group session led by a faculty member, who would also guide the students through an exercise focused on the method under consideration. It was a very beneficial experience. However, at that time, no lecture was presented regarding mixed methods designs. John W. Creswell (2015) explains that the methodology emerged as a field of inquiry about 25 years ago (approximately 1990). My interest in the method stemmed from the need to develop a deeper understanding of the problem I was exploring, and I was concerned that the survey method alone would not be sufficient. In the end, these suspicions proved correct.

Given the pervasiveness of the personal computer (PC) and related communications systems today, the study described in this case may seem quite unwarranted. PCs, smartphones, tablets, and the like are very much an integral part of modern life. However, the fundamental problem for this study was not the PC itself, but education professionals’ attitudes and behavior regarding an innovation of that time. In short, the focus was on the dynamics of individual and organizational change.

Research Practicalities

To understand the nuances of this research, it is necessary to consider two aspects: mixed methods and change theory. Today, there are many mixed methods designs that combine quantitative and qualitative philosophies and methods. Creswell (2015) notes three basic designs: convergent, explanatory sequential, and exploratory sequential. Convergent designs allow data sets to be compared and to use one set of data to validate the other. Explanatory sequential designs use qualitative data to help explain the quantitative data. Exploratory sequential designs use qualitative data to help define the issue more clearly and to help design the relevant variables for the quantitative portion of the study. This case is about an explanatory sequential design.

This study was founded on diffusion of innovations theory, as well as individual and organizational change theory. At that time, I argued that the research to date had provided a detailed understanding of the diffusion of innovations and of individual and organizational change from systemic and ‘sociological’ points of view. The view at that time could be likened to studying the effect of throwing a rock into a body of water. That is, the rock represents the change being introduced. The research to date had focused on the nature of the rock and the state of the water. Furthermore, it had studied who threw the rock and with what force. And finally, it focused on the impact of the disturbance on the water and how long it took for the entire body of water to be affected by the disturbance. For this study, the argument was that we did not know enough about the individual molecules of water (i.e. the individuals being asked or forced to change).

According to E.M. Rogers (1983), the innovation-decision process is a developmental process that includes knowledge, persuasion, decision, implementation, and confirmation. His work led him to theorize about many aspects of the social unit and the innovation that impacted the rate at which an innovation diffuses through any social unit. In addition, he developed a detailed analysis of individuals in the social unit based on their typical reactions to innovations and the reasons for such reactions. By and large, I found that I agreed with Rogers’ findings and conclusions. However, it occurred to me that he was actually implying that an individual’s reaction to innovation was perhaps part of that individual’s personality. I argued that there was not enough evidence for that conclusion and that additional work from a psychological perspective was needed. However, Rogers also noted that the context in which the innovation is introduced is also very important. Specifically, the characteristics of the decision-making unit and the prior conditions were key in understanding the rate and nature of the diffusion of any innovation.

In a different but similar way, Kurt Lewin (1952) used the analogy of ice to describe the change process. He divided the change process into three phases: unfreezing, moving, and refreezing. The unfreezing stage is similar to Rogers’ knowledge and persuasion stages. During this stage, the individual learns of the change and is either persuaded or coerced to adopt it. During the moving stage, if the ‘change agent’ is successful, the individual learns the new way of doing things and with practice becomes more proficient. Refreezing occurs when the change has been adopted by the individual or organization and has reached the point of being the accepted way of doing things.

Gene E. Hall and Shirley M. Hord (1987) had also advanced the concerns-based adoption model, which addresses individuals’ ‘stages of concern’ as they progress through the ‘levels of use’ of an innovation. Focusing their work on teachers, they attempted to clarify what individuals were concerned about at each phase of the process of considering and implementing a change.

Given my desire to add to the understanding of change and the diffusion of innovations from a psychological perspective, this theoretical foundation caused me to realize that I was probably not going to be able to address all of the relevant issues with a quantitative survey of the individuals involved (in this case, teachers). To rely exclusively on a survey would have required a rather lengthy questionnaire that I feared the participants would not be willing (or perhaps even able) to complete.

Research Design

Therefore, the problem for this study was the need for additional empirical research that focused on change from the individual’s perspective and on the personal and environmental factors that may affect an individual’s willingness to adopt that innovation. The purpose of the study was to understand the context in which the participants worked and to determine the relationship of several factors to their degree of PC use. Specifically, the quantitative portion of the study (i.e. survey research) considered 11 independent variables (age, gender, educational level, years of teaching experience, subject taught, perception of value of computers, use of audio-visual equipment, level of computer anxiety, perception of self-efficacy, perception of innovativeness, and professionalism) and their degree of PC use (dependent variable). The qualitative portion of the study (i.e. modified case study research) provided data to address the history of computer use in each school, how computers were introduced to the participants, the history of innovation in each school, the computing environment in each school, and the nature of the perceived need for computers by teachers and administrators.

Population and Sample

The population for this study could be considered all high school teachers in the United States. For practical purposes, it was limited to teachers employed in public high schools in Illinois and, further, to a stratified random sample of Illinois high schools. School districts were selected using a stratified random sampling technique based on the size of the school (small, medium, and large) and its location (urban, suburban, and rural). The Illinois State Board of Education’s list of schools was used to select the six districts. There were a total of 10 high schools in the six districts, with a total of 605 teachers. All of the teachers were included in the quantitative phase of the study. This study was completed prior to the institution’s current requirement for human subjects review by its research review board. However, I secured permission from the districts’ superintendents, school board members, and building principals. Furthermore, the teachers were informed that participation was voluntary and that all data would be reported in aggregate and would be anonymous.
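The stratified selection described above can be sketched in code. This is a minimal sketch, assuming a hypothetical sampling frame of schools labeled by size and location; the school names, cell counts, and per-stratum draw are invented for illustration and do not reproduce the actual Illinois selection.

```python
import random
from collections import defaultdict

def stratified_sample(schools, per_stratum, seed=None):
    """Group schools into (size, location) strata, then draw a random
    sample of the requested size from each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for school in schools:
        strata[(school["size"], school["location"])].append(school)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

# Hypothetical frame: 4 schools in each of the 9 size-by-location cells
schools = [
    {"name": f"School {size}-{loc}-{i}", "size": size, "location": loc}
    for size in ("small", "medium", "large")
    for loc in ("urban", "suburban", "rural")
    for i in range(4)
]

selected = stratified_sample(schools, per_stratum=1, seed=7)
print(len(selected))  # one school from each of the 9 strata -> 9
```

Drawing from every cell is the point of stratifying: it guarantees that small rural schools are represented rather than being swamped by the more numerous large suburban ones, as could happen under simple random sampling.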

Data Collection and Analysis

Survey data were collected using a seven-page instrument developed and assembled by the researcher. In addition to the items and scales developed by the researcher, E.R. Oetting’s (1983) Computer Anxiety Scale Short Form was used to determine each individual’s level of computer anxiety. The survey was returned by 414 of the 605 teachers; 7 participants were removed from the pool because their questionnaires were incomplete, leaving 407 usable responses (usable return rate = 67%). Demographic data included age, gender, educational level, years of teaching experience, and subject taught. These variables were analyzed using frequencies and percentages. The other independent variables (perception of value of computers, use of audio-visual equipment, level of computer anxiety, perception of self-efficacy, and perception of innovativeness) were analyzed using means and standard deviations. The primary comparison was the Pearson product-moment correlation between the independent variables and degree of PC use. Additional comparisons were completed using a t test (gender and degree of PC use) and an analysis of variance (subject taught and degree of PC use).
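The primary statistic here, the Pearson product-moment correlation, is simple enough to compute by hand. The sketch below uses invented toy scores, not the study’s data, purely to show the calculation.

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    dx = [xi - mean_x for xi in x]
    dy = [yi - mean_y for yi in y]
    cov = sum(a * b for a, b in zip(dx, dy))   # sum of co-deviations
    sx = math.sqrt(sum(a * a for a in dx))     # square root of sum of squares
    sy = math.sqrt(sum(b * b for b in dy))
    return cov / (sx * sy)

# Invented scores: hypothetical self-efficacy ratings vs. degree of PC use
self_efficacy = [1, 2, 3, 4]
pc_use = [1, 3, 2, 4]
print(round(pearson_r(self_efficacy, pc_use), 2))  # 0.8
```

The coefficient runs from −1 (perfect negative association) through 0 (no linear association) to +1 (perfect positive association), which is how the correlations reported in the Findings section should be read.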

Case study data were collected by numerous interviews with teachers, students, and administrators; by direct observations; and by reviewing all relevant artifacts (e.g. course catalogue, bulletin boards, and school policies). In short, I worked to find every computer in each school and to talk with everyone who used those computers or controlled the education being delivered. All data were collected in the form of field notes. Triangulation (i.e. comparing the data from the three sources to ensure accuracy and prevalence) was used to ensure that all trends identified were confirmed by the other forms of data (more will be said about triangulation in the next section). All interviews were guided by a list of minimum questions that were developed to collect the data needed to address the research questions.

Method in Action

Prior to launching the study, the superintendents of each selected district were contacted to solicit their permission for the study. During the randomization process for districts, I made sure to select more districts than needed to ensure that there were at least two districts in each of the six stratification cells. The superintendents were informed that confidentiality would be maintained and that no one would be forced to either complete the questionnaire or talk with the researcher. Also, the districts were offered a copy of the final report and were informed that the researcher would provide in-service training (pro bono) if they were interested.

Working with the superintendents, each high school principal was informed of the study. I wrote a letter to the teachers in each school, and the principal and I co-signed it. The letters were sent to the teachers about 2 weeks before data collection.

For each school, I scheduled 2 days for data collection. I arrived the morning of the first day and placed a copy of the questionnaire with a cover letter in each teacher’s mailbox. I also placed a box on the counter in the main office near the mailboxes to collect the completed questionnaires. Throughout the 2 days, I conducted interviews with students, teachers, directors, and principals; I located all of the computers and collected relevant artifacts. Throughout the day, I made notes on my findings. Afterward, I would return to my hotel room and write as long as necessary to capture all of my impressions. The research questions, interview questions, and growing list of trends were used to guide note-taking and subsequent analysis. I refer to this process as a ‘brain dump’ and used this technique to make sure I captured all of the relevant data.

I knew at the outset that it would be important to the study to know how the teachers were introduced to PCs. However, I did not have a complete appreciation for how important this issue would be until I had an interesting encounter with a physical education teacher in one of the schools. I typically took lunch with the teachers in the teachers’ lounge so that I could continue my data collection informally. This day, I was in the largest high school in my sample. I was eating quietly when I noticed that there was a whistle dangling in front of me. I looked up and could see that a rather large person was standing across the table from me and clearly wanted my attention. He leaned over and stuck his finger in my face and said, ‘I will be danged if I am going to learn programming again!’ (by the way, he did not say ‘danged’). I quickly told him that I was not there to do anything to them. I was simply collecting data. But I could tell there was something important here, so I asked him to sit and tell me more. He went on to tell me that a couple of years before, the district administration held an all-faculty meeting, gave everyone a computer disk (the early version of a memory stick) and told them that they would have to learn how to program in Pascal. The teachers did what they were told, but few ever used what little bit of programming they learned. Furthermore, it served to create some rather negative attitudes toward PCs. Coincidentally, while I was in this district, the administration announced a new initiative (i.e. they would be eliminating the Apple computers and outfitting the schools with IBM computers). They again called an all-faculty meeting.

This is a clear example of how triangulation is used in qualitative research. One comment during a brief interview led to additional interviews and tracking down all relevant materials regarding this issue (artifact review) so that the accuracy and impact of that event could be confirmed. This incident caused me to ‘dig’ a bit deeper into the issue and learn as much as I could about the history at each school. This clearly was an important issue and would definitely impact the interpretation of the quantitative data.

Findings

The majority of the teachers were using a PC; however, they were primarily using it for one application, word processing. There was limited software available at that time and most of the teachers found that the PC clearly helped them make handouts and write tests and quizzes. This was contrary to the reasons given by the administrators and school boards for having PCs in the schools. They defended the costs of acquisition and use by referring to the needs of students to be computer literate by the time they graduated from high school. Consequently, the schools purchased PCs for specific classes (mathematics, business, and special education) and for the libraries. They were primarily for the teachers of those subjects, students, and librarians; they were not for teachers in general.

It was also the case that the innovation at that time was not very robust. There was little educational software that would have warranted use by teachers. The software at that time was primarily of the drill-and-practice type and of limited value to any high school teacher. Many of the administrators predicted a day when classrooms for all subjects would be computer laboratories with the majority of the instruction delivered automatically by the programming. Most of the teachers believed that the computer would never be able to deliver instruction as well as a trained teacher.

The questionnaire data show that 64.6% of the teachers used PCs. Furthermore, the data show that none of the demographic variables was correlated with degree of PC use. However, perceived value of computers (r = .57), use of other audio-visuals (r = .15), computer anxiety (r = −.61), perceived self-efficacy (r = .60), perceived innovativeness (r = .36), and professional orientation (r = .26) were significantly (p < .01) correlated with degree of PC use.
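With roughly 400 usable responses, even modest correlations clear the p < .01 threshold. The sketch below is a back-of-the-envelope check, assuming the standard large-sample t test for a correlation coefficient, t = r * sqrt((n − 2) / (1 − r²)), and taking n = 407 from the usable returns reported earlier.

```python
import math

def t_for_r(r, n):
    """t statistic for testing whether a Pearson correlation differs from zero,
    with n - 2 degrees of freedom."""
    return r * math.sqrt((n - 2) / (1 - r * r))

n = 407  # usable questionnaires
for r in (0.15, 0.36, 0.57):
    print(f"r = {r}: t = {t_for_r(r, n):.2f}")
# At about 405 degrees of freedom, |t| above roughly 2.59 corresponds to
# p < .01 (two-tailed), so even the weakest reported correlation clears it.
```

This is only a rough check of plausibility; the study itself would have reported significance from the full analysis, not from this shortcut.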

The mixed methods approach allowed me the opportunity to test a set of independent variables (i.e. recognized demographic and psychological constructs) against the degree of PC use by teachers. Furthermore, the modified case study portion of the study afforded the opportunity to explore the issue with teachers and administrators in more depth. It provided a context and reasons for the attitudes and behaviors of the participants. In short, the qualitative data created a rich picture of the history and current status of teaching and PC use in the sampled Illinois school districts in the latter half of the 1980s.

Data collected through this study led to the development of an understanding of individual decision making when introduced to an innovation of any sort. Figure 1 is a conceptual map of my understanding of this complex issue.

Figure 1. A conceptual understanding of the nature of innovation and an individual’s decisions regarding specific innovations.

The yellow portion of the model indicates that technology can be defined basically as the way humans do things (i.e. fulfill their needs and wants). There are pressures that drive continual development and constraints that limit the development possible. It can be argued that technological development continually produces new products, systems, and processes. It goes on whether or not anyone, other than those doing the developing, knows it is happening.

As noted by Rogers (1983), individuals do not need to make a decision until they learn of the innovation. Human perception involves screening new information through existing knowledge and our attitudes and emotions. In addition, it seems that individuals base their reactions to the innovation on what is typically called a cost–benefit analysis. This is consistent with Rogers’ conclusion that innovations with a clearly observable relative advantage enjoy a faster than usual rate of adoption. However, in most cases, the innovation does not constitute a large enough benefit to outweigh the immediate, real, and quite personal costs (e.g. fear of the future, actual costs of the innovations, training, incompetence, and loss of productivity). Furthermore, the benefits are often for others and merely promises of improvement (especially when the innovation is first introduced), whereas the costs are immediate and personal.

Practical Lessons Learned

First and foremost, the use of a mixed methods design clearly impacted the outcome of this study positively, so much so that I rarely design (or guide) a survey study without looking for ways to add context through qualitative methods. A survey tells only part of the story, and without contextual information, the interpretation is often an invention of the researcher. It is better to base interpretation on reality, not supposition.

A second practical lesson was that it was essential to gain the support of the administration of each school district. Once the superintendents and school boards agreed, it was relatively easy to gain the principals’ support. Their support clearly increased the return rate during the survey portion of the study. Furthermore, this administrative support allowed me open access to the schools, teachers, and students. They accepted me as an expert and a colleague.

Also, when one thinks of qualitative research, the idea of ‘very time consuming’ often comes to mind. However, in the case of mixed methods research, the case study portion is not conducted to gain everything one might learn from a particular case, but to collect essential data that provide context for the quantitative data and empower the researcher to make deeper observations. In short, the qualitative data helped me make sense of the quantitative data. When the study was completed, there was a sense that the reasons for the observed levels (and types) of PC use were clear.

Conclusions

Survey data are self-reported data, and self-reported data are biased data. That is, people tend to ‘put themselves in the best light’, they often tell the researcher what he or she wants to hear, and their opinions and perspectives shift as their emotional states change. Including qualitative data (observation and case study) can help overcome some of the shortcomings of self-reported data by providing more data points and the opportunity to explore issues in more depth. I knew that my survey, at seven pages in length, was at the upper limit of participants’ willingness to provide data. By employing qualitative methods (i.e. interviews, observations, and artifact review), I was able to develop a more complete picture of the phenomenon under consideration.

Exercises and Discussion Questions

  • What understandings would the author have missed if the study were designed exclusively as a survey?
  • What are some of the major limitations of survey research? How might these limitations be overcome?
  • There are inherent differences in philosophy for qualitative and quantitative research. That being the case, how might these varied philosophies complement one another to provide greater understanding?
  • Quantitative researchers use variables and collect data using reliable and valid means; qualitative researchers employ framing questions and collect data through observations, interviews, and artifact reviews that are validated through triangulation (i.e. cross-checking). Discuss how these techniques are similar and how they differ.
  • Industrial psychologists and others endeavor to understand job satisfaction. Researchers know a great deal about the subject, but more is needed. Imagine that you are the head of a research team who will design a study to research the effect of mid- and upper-level leaders’ leadership style on employee satisfaction. You have reliable and valid instruments to measure both leadership style and job satisfaction. Using mixed methods, develop a proposal abstract for this project to include a purpose statement, research questions for both aspects of the study, and methods for data collection and analysis.

Further Readings

Creswell, J. W. (2015). A concise introduction to mixed methods research. Thousand Oaks, CA: SAGE.
Creswell, J. W., & Plano Clark, V. L. (2010). Designing and conducting mixed methods research. Thousand Oaks, CA: SAGE.
Fowler, F. J., Jr. (2014). Survey research methods. Thousand Oaks, CA: SAGE.
Ivankova, N. V. (2014). Mixed methods applications in action research. Thousand Oaks, CA: SAGE.
Plano Clark, V. L., & Ivankova, N. V. (2015). Mixed methods research: A guide to the field. Thousand Oaks, CA: SAGE.

Web Resources

National Institutes of Health. (2011). Best practices for mixed methods research in the health sciences. Retrieved from http://obssr.od.nih.gov/scientific_areas/methodology/mixed_methods_research/section2.aspx

References

Creswell, J. W. (2015). A concise introduction to mixed methods research. Thousand Oaks, CA: SAGE.
Hall, G. E., & Hord, S. M. (1987). Changes in schools: Facilitating the process. Albany: State University of New York Press.
Lewin, K. (1952). Group decisions and social change. In G. E. Swanson, T. M. Newcomb, & E. L. Hartley (Eds.), Readings in social psychology (pp. 459–473). New York, NY: Henry Holt and Company.
Oetting, E. R. (1983). Manual for Oetting’s Computer Anxiety scale. Fort Collins, CO: Rocky Mountain Behavioral Science Institute.
Rogers, E. M. (1983). Diffusion of innovations (3rd ed.). New York, NY: The Free Press.