A Mixed-Methods Design in Educational Institutions: Gender, Teacher Praise and Criticism, and Student Motivation in St. Vincent and the Grenadines

Abstract

The gender achievement gap, where males demonstrate lower performance in academics, is not fully understood. This research was the first of its kind undertaken in St. Vincent and the Grenadines in 2012-2013. I employed a mixed-methods design to reduce the issues inherent in each type of method on its own, as well as to gather data from the most relevant stakeholders. Research questions, sample, and data collection modes were varied. Using a social psychological perspective, I investigated several questions under four research categories: (a) amount of praise and criticism, (b) types of praise and criticism, (c) interpretation of praise as criticism, and (d) source of praise and criticism. The sample consisted of Third Form secondary school students (aged 13-17 years). The data collection comprised classroom observations of four classes selected on specific criteria; a student survey given to nine classes (the four from the initial selection and five others randomly selected from secondary/high schools across the country); interviews with 48 randomly selected students; and end-of-term grades for Mathematics and English from the four initial classes. I concluded my research by creating a guide for St. Vincent and the Grenadines teachers complete with research findings and suggestions.

Learning Outcomes

  • Discuss justifications for the use of a mixed-methods design
  • Examine the merits of data collection instrument choices in relation to the mixed-methods design
  • Identify sample and methodological challenges of mixed-methods design
  • Provide solutions to address sample and methodological challenges of mixed-methods design
  • Evaluate sequential versus concurrent data collection phases within mixed-methods design
  • Discuss the pros and cons of using a mixed-methods design for current research interests

Project Overview and Local Context

My research sought to advance understanding of the gender achievement gap, where males demonstrate lower performance in academics, while addressing some of the limitations of previous studies identified in the literature review. First, I used a population in St. Vincent and the Grenadines (SVG) that is relatively homogeneous with respect to race and culture. SVG is a small island chain in the Caribbean with a population of approximately 110,000, the majority of whom are Afro-Caribbean. Norms, culture, and environment are also relatively uniform because the country encompasses only 150 square miles, and approximately two-thirds of the secondary schools are located on mainland St. Vincent. Consequently, the SVG demographics allowed me to investigate the extent to which praise and criticism affect student motivation without the complications and confounding factors that may arise in extensive multiracial settings. Second, this study facilitated my investigation of a global, macro-level problem, namely, males' lower academic achievement relative to females', within a micro-level context. To this end, the use of a mixed-methods design was important: the survey and interviews would largely provide the story aspect of how teacher praise and criticism affect student motivation, whereas the classroom observations and exam results would provide the numerical aspect of the investigation. In addition, the sample itself was unique: Through the use of a naturally occurring, relatively homogeneous student population, I expected that the roles of praise, criticism, and students' subsequent motivation in producing the gender achievement gap would be better understood, and that the study would contribute to this body of knowledge, especially regarding the interpretation of praise and criticism and the impact of teacher gender (or source) on praise and criticism.

Research Practicalities

Sample Justification

Several reasons informed my decision to use Third Form secondary students (aged 13-17 years), which corresponds to Grade 10 students in the United States. First, each school decides the number of subjects students take at the Third Form level, but the core subjects of English and Mathematics are included in all schools. Therefore, all Third Form classes in any selected school had a chance to be chosen to participate in the research because of the relative uniformity of the curriculum. Second, Third Formers are at the transition point from lower to higher secondary levels, and therefore at a crucial bridging stage of academic growth: the subject choices they make at the end of Form 3 determine what they study for the next 2 years. Also, Fourth and Fifth Formers (Grade 11 and 12 students, respectively) will already be working toward their Caribbean Secondary Education Certificate (CSEC) exams, which comprise various subjects set by the Caribbean Examination Council and are taken mainly by Fifth Formers. At the Fourth and Fifth Form levels, students may already be more focused and, consequently, more motivated; praise and/or criticism may therefore be less important at this later phase, where intrinsic motivation may be more apparent. Finally, Third Formers do not intermix for their subjects: each form remains separate, and each student remains in his or her assigned seat. This allowed the research to be more precise in terms of using seating arrangements and following students' participation and behaviors over the entire period of observation, making this situation practical for data collection.

In addition, past research shows that females outperform males during their early formative years in kindergarten and primary school (see Entwisle, Alexander, & Olson, 2005, 2007) and that biological factors play a role in this difference (see Gurian, Henley, & Trueman, 2001). This suggests that the gender achievement gap initially varies with maturation and education level. Biologically, males tend to exhibit a growth spurt, along with an increase in performance, around ages 13-16 years, an age range that places them directly in the Third Form level of secondary school. Accordingly, it is at this stage that male students' maturational level should be on par with that of their female counterparts. For all these reasons, Third Formers represent the group best suited for research comparing the praise/criticism and achievement motivation of male and female students.

Research Design: Mixed-Methods Data Collection and Procedures

According to John Creswell and Vicki Plano Clark (2007), a mixed-methods design in its simplest form combines qualitative (stories) and quantitative (numbers) data to give a more holistic picture of whatever is under investigation. One illustration is an exceptional basketball player on a winning team whose points scored did not, on their own, amount to a major positive contribution; when other aspects were observed, such as his ability to defend the hoop and his overall team effort, his contribution became obvious (see John Creswell, 2013). The bottom line here is that numbers do not always provide the complete story.

I undertook this mixed-methods research design for two main reasons: First, the classroom observations, survey, interviews, and exam results used can complement each other and provide a more accurate picture of the topic because different instruments can be more practical to solicit specific types of information. Second, the shortcomings inherent in each type of data collection method may be addressed via other data collection modes. For example, if there are social desirability answers (whereby persons do not answer questions honestly and instead try to make themselves look good) in the surveys and interviews, these can be compared with actual classroom data collection and exam results to provide a more complete description and interpretation of the data.

I based my data collection choices on two aspects that John Creswell (2003) highlighted:

  • The “match between problem and approach” (p. 21) because this research was the first of its kind and qualitative measures would provide the rich story aspect that was needed at this juncture;
  • “Personal experiences” (p. 22) because quantitative measures are established hallmarks for research and I was willing and able to invest the time needed to employ both types of approaches to gain the most from this research.

As this mixed-methods design utilizes a number of data collection methods, I will now provide a full description of the various aspects, issues, and procedures that I had to consider for each choice, given the fact that my research was to be conducted in educational institutions.

Classroom Observations

There are 26 secondary schools across SVG. However, from the onset of my research, eight secondary schools were excluded from consideration to guard against unique differences. These differences included student composition (single-sex schools), philosophy (religiously oriented schools), resource allocation/access (human qualifications/experience of teachers, caliber of students; physical aspects—such as access to laboratories and computer labs), and environment. Inclusion of these eight schools would have created several problems, including the primary problem of generalization of findings to the average student.

My goal was not to determine variance among the different schools. Therefore, four schools from the remaining 18 secondary schools that comprised the target population were deliberately selected to represent the average classroom in terms of socioeconomic status, primary school exam results, and human and physical resources based on specific selection criteria. To make the selections, I relied on interviews conducted with principals at each of the remaining 18 secondary schools to inform me about the teacher qualifications and length of service in the teaching profession of their Third Form Mathematics and English teachers.

In my research, the number of schools was less important than the level of class to be observed, the subjects under observation, and the gender of teacher. In fact, teacher gender was particularly important. Randomly selected schools may not have had available male and female teachers who teach either English A (i.e., English Language) or Mathematics and could have introduced extraneous variables that were not representative of the target population. Until I interviewed all principals, I had absolutely no idea which schools would actually fit the selection criteria, so deliberate sampling in this context was more theoretically useful.

The Selection Process, Issues, and Solutions

As highlighted in the above section, classroom observations were based on specific criteria which required deliberate sampling. To select the four schools for the study, I divided SVG into four zones to coincide with existing zoning boundaries established by the Government of SVG and selected one Third Form class per zone via the Ministry of Education’s list of schools that comprise each zone. The four zones used are as follows:

  • Kingstown (capital of mainland St. Vincent);
  • Leeward (west coast of mainland St. Vincent);
  • Windward (east coast of mainland St. Vincent);
  • The Grenadines (several small islands off mainland St. Vincent).

One school was selected from each zone to ensure that the sample was representative of SVG students and to avoid possible limitations if I had only concentrated on one specific zone. My goal was to determine whether male and female students have similar or different reactions to praise and criticism and by extension, motivation.

Mathematics and English were the subjects chosen for my class observations because these courses are mandatory for all students in SVG. Usually, schools would have both subjects (Mathematics and English) timetabled for the same day, 5 days a week, because they are core subjects.

My preliminary inquiries at the schools revealed a total of 31 Third Form Mathematics classes, taught by 11 male and 20 female teachers, from which to choose four classes. In comparison, for English, only four classes were taught by males and 26 by females. This imbalance in the numbers of male and female teachers across Mathematics and English raised a number of issues that demanded particular solutions.

First, to avoid subject–teacher gender conflation, I needed a total of two male and two female English teachers, and two male and two female Mathematics teachers. However, of the four male English teachers, only one had a degree. The other male English teachers were Qualified Assistant Teachers (QATs), meaning that they are teacher-trained and hold at least A Levels (i.e., they have completed 2 years of college education but have not yet attained an undergraduate degree). Therefore, I had to use one male English QAT in my sample (see Table 1). I felt that this solution was not only practical but also unlikely to negatively affect students' perceptions of their teachers, because this male QAT had over 10 years of experience teaching English and was rated as an excellent teacher by the school's principal, who maintained direct supervision of all teachers at the school.

Table 1. Final distribution of subjects, teacher gender, teacher qualifications, and school zones.

Subject       Grenadines       Leeward          Windward         Kingstown
Mathematics   Female, degree   Male, degree     Male, degree     Female, degree
English       Male, degree     Female, degree   Male, QAT        Female, degree

QAT: Qualified Assistant Teacher. Each cell gives the teacher's gender and qualification.

Second, I also decided to use the same Third Form class for both the Mathematics and English teachers in each school to reduce possible differences among students. If students reacted differently and were in the same class, I only had to contend with the possible impact of differences in student attitudes about Mathematics in comparison with English, rather than differences with both the subjects and the students themselves.

Third, equal teacher gender participation in the two subject areas was also important. How students interacted with the male and female teachers was crucial to the theoretical aspects of the study. It was also important that a school be chosen from each school zone bearing in mind that male English teachers are few.

My selection of schools was therefore based on specific criteria, but for three of the four zones, I was still able to make a "random" choice. Given the criteria and the issues previously raised, two school zones were easily settled: only one school in the Grenadines met the aforementioned criteria, whereas one of two possible schools was randomly chosen from the Windward zone. For the other two school zones, the only alternative that avoided subject–teacher gender conflation was to choose one Third Form with all male teachers from one school and another with all female teachers from a different school. The Kingstown and Leeward zones provided three and two possible choices, respectively, so I was able to randomly select one school from each of those zones. Therefore, although the schools had to satisfy specific criteria, it was only in one zone that I was unable to achieve a measure of "randomness" in the final selection.

It is important to reiterate that until I solicited the pertinent information from the principals and divided the schools into zones, I had no idea which of the 18 schools would be included or excluded based on the criteria. Altogether, several choices were available, but these substantially lessened with each successive selection of zone, school, and other criteria. For example, once a male graduate teacher for Mathematics and a female graduate teacher for English were chosen from the Leeward zone, I could not reuse similar combinations in the next selection, even though they existed within that zone and others, because I wished to avoid subject–teacher gender conflation while still having all zones represented. What is important to note is that despite the specific criteria used, the four classes selected share similar characteristics with other Third Form classes at the remaining 14 schools.
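The constrained selection described above can be sketched in code. This is a minimal illustration, not the author's actual procedure: the per-zone candidate counts come from the text (one eligible school in the Grenadines, two in Windward, three in Kingstown, two in Leeward), while the school codes themselves are placeholders.

```python
import random

random.seed(2012)  # fixed seed so this sketch is reproducible

# Hypothetical pools of schools per zone that met the selection criteria.
# Counts follow the text; the codes "G1", "W1", etc. are placeholders.
eligible = {
    "Grenadines": ["G1"],
    "Windward": ["W1", "W2"],
    "Kingstown": ["K1", "K2", "K3"],
    "Leeward": ["L1", "L2"],
}

# One school per zone: forced where only one candidate exists,
# randomly chosen where two or more candidates remain.
selected = {zone: random.choice(schools) for zone, schools in eligible.items()}

print(selected)  # one school code per zone; the Grenadines pick is forced
```

As in the study, randomness operates only within the pool that survives the criteria: three of the four zones allow a random draw, while the Grenadines choice is fully determined.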

Permission and Ethics

Not only did I have to obtain Institutional Review Board (IRB) approval from my university, but I also needed to obtain permission from a variety of sources within SVG. Before this research could take place in the SVG secondary school system, I had to first obtain permission from the Chief Education Officer; otherwise, principals would not have allowed me to conduct the research. Then, as a courtesy and to facilitate smooth interactions, I obtained oral permission from the secondary school principals and cooperation from relevant teachers. Finally, as most of these students were under the age of consent (18 years), I had to obtain written consent from parents/guardians/students. The main point here is that before research is conducted, researchers need to get the requisite approval from those in authority, especially as said permission may require quite a bit of time and effort that has to be factored into the time allotted for the research to be implemented. Furthermore, researchers who maintain ethics and a good rapport with authorities pave the way for other researchers to be accepted—unfortunately, the converse is true.

Classroom Observation Methodology

The school principals also gave their consent for me to conduct classroom observations at each of the four schools. Based on the timetables of the selected Third Form in each school, I rotated the days and times of the classroom observations. This rotation guarded against the effect of extraneous variables such as the time of day, or the day itself, when, for example, students may have just had a class they did not really like.

My projected timeline was from October to mid-November 2012. However, due to teacher absenteeism for various legitimate reasons, I extended my initial timeline beyond November 15 to maximize the number of observations while keeping upcoming examinations in mind. The smallest number of observations for a subject in any particular school was four; however, across the four schools, the total numbers of observations for Mathematics and English were relatively even. Each session lasted approximately 45 min.

For each classroom observation, I used an observation chart where student seating was verified by the form teacher to record attendance for each class and subject and to keep track of students’ participation and reactions in the classroom (see Figure 1 for an example of the observation instrument). Key descriptors on the chart include on-task behavior (o), distraction (d), positive feedback (p), and negative feedback (n) for each student. These descriptors for each of the four categories, adapted from research by Susan Capel, Marilyn Leask, and Tony Turner (2005), allowed for more precise standardization of observations. During each observation, I recorded the number of times students were on task, noted distractors and the number of times teachers used positive and negative feedback, and whether or not students responded positively or negatively to said feedback. Actual teacher comments and non-verbal gestures were recorded separately as well as the students’ subsequent behaviors, reactions, and comments.

Figure 1. Classroom observation form.
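The per-student running counts on the observation chart can be sketched as a simple tally. This is an illustrative data structure only, not the paper chart itself; the event log and student IDs are hypothetical, but the codes match the chart's descriptors (o = on task, d = distraction, p = positive feedback, n = negative feedback).

```python
from collections import Counter

# Hypothetical event log for one 45-minute session: (student_id, descriptor).
events = [
    ("S01", "o"), ("S01", "p"), ("S02", "d"),
    ("S02", "n"), ("S01", "o"), ("S02", "o"),
]

# Tally descriptors per student, mirroring the chart's running counts.
tallies = {}
for student, code in events:
    tallies.setdefault(student, Counter())[code] += 1

print(tallies["S01"])  # Counter({'o': 2, 'p': 1})
```

A paper chart keyed by seating position plays the same role as the dictionary here: one cell per student, with tick marks accumulating per descriptor across the session.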

The specifics of my research were not discussed with school authorities, teachers, or students. I informed both principals and teachers that I was researching student achievement and motivation. Students were told that I was observing what went on in classes at various schools. I did not supply information that I was specifically looking at praise and criticism so as to avoid demand characteristics and participant bias because I did not wish to have either teachers or students change how they interact in the classroom. All teachers in the classes under observation are teacher-trained and are accustomed to having others in the classroom to evaluate their teaching, so it was not unusual to have persons visiting periodically. However, I did not want teachers to feel that I was evaluating their teaching, so I spoke to each teacher before commencing my research to alleviate any fears they may harbor on that score. I found them to be extremely cooperative, and it was a pleasure to interact and work with them. I also explained to principals, teachers, and students that although I needed specific information on teachers and students, final data presentation would never be linked to any teacher, student, or school.

I also needed to minimize the effects of the disruption and possible change in student behavior caused by my presence in the classroom. Therefore, at my request, I was positioned at the back of the class so as to lessen distractions by my presence. I also avoided commenting on behavior and helping students with tasks to minimize my position as that of an authority figure. Casual dress in jeans and a shirt helped to cement my image as an observer as secondary school teachers are generally professionally dressed.

The Student Survey

A total of 217 Third Form students from nine secondary schools participated in the second phase of my research, which consisted of administering a student survey on praise, criticism, and motivation. The survey was executed following the completion of classroom observations.

Pretesting the Instrument

My survey was pretested in both Washington State (in the United States) and SVG before the final version was adopted. In Washington, I used a convenience sample of three high school students (two males and one female), the children of a classmate. I employed the method of cognitive interviewing to determine whether or not the questions and answer choices were clearly understood. Students were also asked to provide feedback on the general layout and ease of navigation. The survey was then refined based on the outcome of each cognitive interview. Questions, answer choices, and layout were revised and then rechecked by one of the foremost experts in survey methodology.

The survey was further tested in SVG, again using a convenience sample of four average Third Form students (two males and two females) selected by the principals of schools that did not comprise the sample. This was done to ensure that I did not inadvertently give the same students the survey twice, but that students similar to those in the actual sample could clearly understand the questions and easily navigate the survey. The major revision from these cognitive interviews pertained to survey appearance: the final two students were given a choice of surveys with different color schemes and felt that the more colorful version would motivate respondents to start and complete the survey. The final survey was therefore based on the feedback of seven students.

The Selection Process

I administered the student survey to nine Third Form classes including the same four classes in which I had conducted classroom observations. However, the other five classes represented a random selection of Third Form classes from the remaining 14 schools. It was during the survey period that students became aware for the first time that my focus was on issues of praise, criticism, and motivation.

Survey Administration, Problems, and Solutions

After contacting principals and receiving permission to conduct the surveys, I provided letters of consent and spoke to the students and their class teachers. I emphasized that students could not participate in the survey without parental/guardian signed consent, as well as their own assent, and explained the ethical issues. I returned during the following week to administer the survey and confronted a major problem, namely, unsigned consent forms. All students stated that they wanted to do the survey, but had either misplaced their letters or forgot to return them. Their main objection was that they felt that they did not need the signature of their parent/guardian to do the survey.

Overall, survey administration took about 2 months and was accomplished with the help of class teachers from the selected secondary schools. Teachers had to constantly remind the students to fill out and return their consent forms. Hundreds of replacement forms were required, and I had to return to the various schools several times to administer the survey in small batches; given the problem of unreturned consent forms, I did not allow students to take the survey home. Because the survey took longer than I had anticipated, some students who completed it later would already have heard about some of the questions. There is a definite possibility of bias here, in that they would have had more time to think about their answers than those who did the survey first. Overall, however, I think this would have had a positive rather than a negative effect, because answers from students who did the survey later may have been more representative of their experiences than those from the first students to complete it. A total of 88.6% (217/245) of Third Form students completed the survey.

For the purpose of more holistic research, I particularly wanted the original students from my classroom observations to do the survey. Therefore, I decided to attend each of those schools' Parents/Teachers' Day (PT Day), at which the schools provide parents with the students' reports and parents can speak with teachers. Two schools scheduled PT Day in December; however, few parents turned up, and PT Day had to be rescheduled for January. When I was finally able to meet parents during PT Day, I encountered absolutely no problem in obtaining signed consent forms. Following survey completion, thank you/debriefing letters were given to students.

Interviews and Exam Results: Selection Process, Problems, and Solutions

The third and final phase of my research involved interviewing a random selection of 48 students from the four classes under classroom observation (12 from each class, representing 36% of the total sample under observation). Interviews focused on end-of-term exam results in Mathematics or English, which teachers provided for all of the students in the initial four classes. Interviews were conducted primarily during each school's PT Day. As with the survey, I encountered the problem of unsigned parental/guardian consent forms, and although students were willing to do the interviews, this process, like the survey administration, was again quite lengthy for that reason. Constant reminders from teachers, coupled with personal phone calls to many of the parents, paid off, and I was finally able to interview all 48 randomly selected students.
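The stratified draw of 12 interviewees per class can be sketched as follows. The class rosters below are hypothetical placeholders sized so that 48 students amount to roughly 36% of the total observed sample, consistent with the figures in the text.

```python
import random

random.seed(42)  # fixed seed so this sketch is reproducible

# Hypothetical rosters for the four observed classes (IDs are placeholders);
# 33 + 33 + 33 + 34 = 133 students, so 48 is about 36% of the sample.
rosters = {
    "Grenadines": [f"GR{i:02d}" for i in range(1, 34)],
    "Leeward":    [f"LW{i:02d}" for i in range(1, 34)],
    "Windward":   [f"WW{i:02d}" for i in range(1, 34)],
    "Kingstown":  [f"KT{i:02d}" for i in range(1, 35)],
}

# Draw 12 interviewees per class without replacement: 48 students in total.
interviewees = {cls: random.sample(students, 12) for cls, students in rosters.items()}

total = sum(len(chosen) for chosen in interviewees.values())
print(total)  # 48
```

Sampling 12 from each class, rather than 48 from the pooled rosters, guarantees that every observed class is equally represented in the interview phase.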

Method in Action

I divided each type of data collection within the mixed-methods design into sequential phases. Classroom observations comprised the first phase of my data collection process. Only after all classroom observations were completed did I commence the second phase, consisting of the survey. Examination results were only available at the end of the term, and it was then that I conducted the interviews with the randomly selected students in the third and final phase.

My 27 years of teaching in SVG provided me with the knowledge and experience about how the education system in my country operates and allowed me to navigate the entire research process effectively. I divided my research into three phases based on my knowledge of how this system operates and the various timelines within schools in SVG. However, had I lacked such knowledge, I would have solicited the advice of experienced persons within the field before undertaking this research. Bear in mind that although school personnel and students may not mind allowing researchers to operate within their institutions, they will not appreciate having major disruptions. Therefore, researchers need to work very closely with school schedules. Also, mixed methods are time-consuming albeit informative. You need to ensure that you have the necessary time, resources, and permission to carry out your research.

Practical Lessons for Conducting a Mixed-Methods Design in Educational Institutions

  • Consider a mixed-methods design which elicits both qualitative and quantitative data for the study. A mixed-methods design can ultimately give a more comprehensive picture of the topic under investigation. This is especially important for first-time research. However, it usually entails more time and effort to gather and analyze the data. Your choice of one method or a mixed-methods design will therefore depend on a variety of considerations.
  • Use the literature as your foundation. Past research provides the best practices and highlights research issues and methods such as the strengths and weaknesses of various data collection instruments. Such information is vital to your prospective research. For example, the literature informed which descriptors I used on my classroom observation form. Furthermore, my choice of SVG was deliberate based on issues highlighted in the literature about heterogeneous samples (where, for example, race may be a confounding variable); consequently, I could then emphasize that the relatively homogeneous student population in SVG would address such issues.
  • Be flexible. As researchers, we must plan our research, but when we deal with educational institutions there are certain factors that we cannot control (e.g., permission from school authorities and parents/guardians, teacher absence, school schedules, student behaviors). These all require flexible timelines; certain research may simply not be doable under strict ones. For example, I experienced a number of problems with obtaining consent from parents and guardians and ended up spending a considerable amount of time, money, resources, and energy to ensure that consent was given. Had I been given only 3 months to collect all my data, I would have been "up the creek without a paddle."
  • Pilot surveys. Whenever we are immersed in research, we may get tunnel vision. It is therefore important to have others, especially those who represent the characteristics of your target population, provide you with feedback so that you can refine your data collection instrument (e.g., in terms of ease of navigation, length, phrasing, and clarity). A well-thought-out and designed survey instrument is essential for effective data collection.
  • Practice using the classroom observation data collection instruments. I actually practiced using the instruments to become fast and accurate. Classroom dynamics and incidents can be quite rapid. Although a video camera can be used, this method is invasive and can cause behaviors to change. Therefore, I determined that paper-based instruments were best in this situation. However, they would only be as effective as the person using them—here, my skills honed over the years in the classroom proved exceedingly useful.
  • Follow the rules and respect others. This refers to both the ethics of research and the rules of the educational institutions. Not only will you maintain a congenial relationship with the institutions, but you will also help ensure that future researchers can gain permission; the converse is also true. Show respect to everyone, from the "least" person in the school to the highest authority. This creates a more conducive research environment and elicits cooperation from all stakeholders.

Conclusion

I undertook this research as my PhD dissertation, and it was the most extensive research I had done at that time. All of the skills that I learned from my undergraduate to PhD programs were called into play along with my own experience in the classroom. Suffice to say, efficient and effective research is not for the faint of heart. However, if you ask questions of knowledgeable persons, listen to their answers, and critically think through your process based on the information you receive, you will be heading in the right direction. There are no shortcuts, just due diligence and honing of skills that you have either been taught or have naturally developed. In my case, a mixed-methods design allowed me to glean both qualitative and quantitative data, which provided a more holistic view of my research area. Based on your research question and target population among other factors, you can also determine whether it is best to use one type of data collection or a mixed-methods design.

Exercises and Discussion Questions

  • Why did the author decide to use a mixed-methods design? What other factors should be considered when developing a research design?
  • How is the mixed-methods design enhanced by the merits of each data collection instrument?
  • Identify three challenges based on the case sample and methodological issues. Giving reasons for your answers, state what changes you would make to the solutions employed by the author to address the sample and methodological issues.
  • Discuss advantages and disadvantages of the classroom observation data collection instrument provided in Figure 1.
  • Give one possible reason to explain why the author may have used sequential rather than concurrent data collection phases within this mixed-methods design. Discuss the pros and cons of each approach.
  • Can you apply any information (mixed-methods design/procedures/challenges/issues) from this study to your own research interest? Explain your answer.

Further Reading

Qiang, Z., & Derrick, T. (2016). Doing mixed methods research in comparative education: Some reflections on the fit and a survey of the literature. International Perspectives on Education & Society, 28, 165-191. doi:10.1108/S1479-367920150000028014

Web Resources

Creswell, J. W. (2013). Telling a complete story with qualitative and mixed methods research. Retrieved from https://www.youtube.com/watch?v=l5e7kVzMIfs

References

Capel, S., Leask, M., & Turner, T. (2005). Learning to teach in the secondary school: A companion to school experience (4th ed.). New York, NY: Routledge.
Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: SAGE.
Creswell, J. W. (2013). What is mixed methods research? Retrieved from https://www.youtube.com/watch?v=1OaNiTlpyX8
Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: SAGE.
Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2005). First grade and educational attainment by age 22: A new story. American Journal of Sociology, 110, 1458-1502.
Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2007). Early schooling: The handicap of being poor and male. Sociology of Education, 80, 114-138.
Gurian, M., Henley, P., & Trueman, T. (2001). Boys and girls learn differently! A guide for teachers and parents. San Francisco, CA: Jossey-Bass/John Wiley.