Evaluation and Program Planning, Vol. 16, pp. 131-142, 1993
Printed in the USA. All rights reserved. 0149-7189/93 $6.00 + .00
Copyright © 1993 Pergamon Press Ltd.

MIXED METHODS IN A STUDY OF IMPLEMENTATION OF TECHNOLOGY-BASED MATERIALS IN THE ELEMENTARY CLASSROOM

BERNADETTE E. RUSSEK and SHARON L. WEINBERG
New York University

Requests for reprints should be sent to Bernadette E. Russek, School of Education, Dowling College, Oakdale, NY 11769.

ABSTRACT

Ethnographic techniques and philosophy guided data collection and analysis to determine the extent and nature of implementation of technology-based materials in elementary school mathematics classrooms. However, unlike most ethnographic studies, quantitative data collection and analyses played an important role in the overall findings. Each method provided its own strength. In some cases a blend of both methods afforded a deeper insight than either could provide separately.

Despite the continued discussion on the incompatibility of the qualitative and quantitative paradigms (Howe & Eisenhart, 1990), researchers have come to realize how complementary these two approaches can be on an applied level. Such is the case in this study, where a combination of qualitative and quantitative methods was utilized in an investigation of the implementation of technology in the mathematics curriculum at the elementary school level. Because our initial questions were of the nature, "What is going on here?" and "What are the explanations for what we see happening?", we decided early on that a holistic-inductive paradigm would be most appropriate to our research agenda. We decided further that ethnographic techniques and their accompanying philosophy would guide our data collection and analysis.

Although traditionally ethnographic data almost exclusively have been qualitative, this study departs from that tradition by also employing methods of quantitative data collection and analysis. We expected and found that not only did each method provide its own strength to broaden the study, but in some cases a blend of both methods afforded a deeper insight into the nature and extent of technology implementation. Qualitative analysis predominated early on, giving way to quantitative methods as the study progressed. The qualitative data and analysis "opened up" many areas of inquiry, which were then confirmed or elaborated upon by our quantitative procedures. According to Solomon (1991),

    Such mutual acceptance is increasingly manifested in the use of quantitative methods by phenomenologically oriented researchers in social psychology (e.g., Kruglanski, 1989) and education (e.g., Nicholls & Thorkildsen, 1985), in the testing of a priori models and hypotheses by qualitative researchers (e.g., Goldenberg, 1989), and in the descriptive study of individuals' phenomenological perceptions, values, and beliefs by quantitative researchers (e.g., Miller, Leinhardt & Zigmond, 1988). (p. 10)

Patton (1980) and others view the guiding principles of ethnography as general principles, important to both quantitative and qualitative researchers alike. For example, the principle that measurement should not disturb or alter the system being measured is basic not only to the social sciences, but also to the physical sciences as embodied in the Heisenberg Uncertainty Principle. According to Patton (1980),
    The problem of how the observer affects what is observed is not unique to qualitative research methods. The Heisenberg Uncertainty Principle in physics expresses the same problem from the perspective of natural science. The Heisenberg Uncertainty Principle states that the instruments used to measure the velocity and position of an electron alter the accuracy of measurement. When the scientist measures the position of an electron, its velocity is changed, and when the focus of measurement is on the velocity, it becomes more difficult to measure accurately the electron's position. The process of observing affects what is observed. These are real effects, not just errors of perception or measurement. The situation is changed by the intrusion of the observer. (p. 189)

Each paradigm uses appropriate techniques to assure minimum disturbance of the system. According to Patton (1980), the pure or typical qualitative strategy is made up of a holistic-inductive research design of naturalistic inquiry, qualitative data, and content or case analysis. The pure or typical quantitative strategy is made up of a hypothetico-deductive research design that is experimental or quasi-experimental, quantitative data, and statistical analysis. McKerrow and McKerrow (1991) suggest that since a naturalistic research paradigm requires that inquiry be conducted without interference or manipulation of the research setting, qualitative designs often are best suited to minimize interference. Patton (1980) suggests, however, that there can be a number of mixed methodological forms. One such combination is to use naturalistic inquiry, collect quantitative data, and perform statistical analysis. Another possibility is to use naturalistic inquiry, collect qualitative data, and perform statistical analysis. As indicated earlier, this study utilized a number of strategies within the holistic-inductive paradigm of naturalistic inquiry, emphasizing quantitative data collection and analysis to a greater extent than is ordinarily the case in such situations.

THE TECHNOLOGY IMPLEMENTATION PROJECT

To provide the reader with a contextual framework for understanding the methods employed, we describe briefly the technology implementation project of which the current study is a part. For a more detailed account of the larger project, the reader is referred to Russek (1991) and Willoughby and Weinberg (1991).

In recent years it has been noted that although computers and calculators are pervasive in our society, there has been a reluctance by teachers and administrators to put this technology into use in the mathematics classroom (Becker, 1987; Hembree & Dessart, 1986; Leinwand, 1985; Suydam, 1982). Accordingly, in an attempt to promote progress in this area of curriculum development, the National Science Foundation (NSF) funded a four-year project entitled Supplementary Mathematics Materials for a Technological Society (SMMTS) (Goldberg & Sgroi, 1986; Willoughby, 1988). The initial goals of this project were to develop and field test technology-based elementary mathematics curriculum materials. The ultimate goals were to evaluate the effectiveness of these materials in the classroom setting.

The purpose of the current study was to determine, during the first year of field testing, the extent to which two sets of supplementary mathematics lessons, one utilizing the calculator and one utilizing the computer, were implemented by elementary school teachers, and the factors that governed such implementation.
Data were gathered from a suburban school district with approximately 7,000 students. The particular sample on which our study is based consists of 16 teachers from 6 elementary schools. These schools represent a wide cross-section of student abilities and cultural backgrounds.

Qualitative data were gathered from informal and in-depth interviews of teachers and administrators, informal classroom observations, school documents, and teacher-written responses to open-ended questions or questionnaires. Quantitative data were obtained from classroom observation checklists, Lesson Evaluation Forms (LEFs), Workshop Evaluation Forms (WEFs), and two self-report measures of teacher feelings, attitudes, and concerns. The last two were a Self-Evaluation Questionnaire (adapted from Spielberger, 1977) and the Stages of Concern Questionnaire (Hall, George, & Rutherford, 1977).

Qualitative procedures during data collection defined an interactive relationship between data collection and analysis, which included keeping a log in which strategies, hunches, and themes were recorded, testing of hunches, triangulation, alternative hypothesis testing, and the isolation of key events. Qualitative analysis after data collection consisted of making data displays in the form of tables, matrices, or diagrams (Huberman & Miles, 1984). On the quantitative side, analyses included the formulation and testing of statistical hypotheses regarding linear relationships and correlations, as well as the development of mathematical models to explain the data.

The extent of implementation was based on two complementary measures, one quantitative and one qualitative: the percentage of lessons completed, and the degree of match between teacher behaviors and the behaviors expected by the developers. The factors that influenced teacher implementation were uncovered largely through qualitative methods. Quantitative methods were then useful in determining the relationship between these factors and the extent of implementation.

ETHNOGRAPHIC PRINCIPLES

This study used ethnographic research methods and values. An ethnography can be defined as an in-depth analytical description of an intact cultural scene (Borg & Gall, 1983). The main characteristic of ethnography is that the researcher uses continuous observation, trying to record virtually everything that occurs in the setting being studied. The main elements of the value system are the following:

• Phenomenology, which requires the researcher to develop the perspectives of the group being studied, that is, to develop the "insider's" viewpoint.

• Holism, which places emphasis on attempting to perceive the big picture rather than focusing upon a few elements within a complex situation.

• Nonjudgmental orientation, which requires that researchers do not start with specific hypotheses, since such hypotheses, judgments, or preconceptions may distort what the researcher sees. The emphasis is on recording the total situation without superimposing one's own value system. Researchers may start with a theoretical framework or with tentative working hypotheses that provide general guidelines to the observer about what behavior may be important. In this study, ideas from innovation theory and learning theory were used to formulate the original thrust and direction of the research questions.

• Contextualism, which requires that all data be considered only in the context of the environment in which they were gathered (Borg & Gall, 1983, pp. 492-493).
TECHNIQUES OF INQUIRY

Several techniques of inquiry have been established to assure that ethnographic principles are upheld. They are naturalistic inquiry, the use of the role of the researcher as a participant observer, and multiple methods of data collection and analysis. Ethnography permits the use of quantitative methods, as long as they do not contradict the basic ethnographic research principles.

Naturalistic Inquiry

A naturalistic method assures that activities are recorded in an undisturbed, holistic environment that allows events to happen on their own, in their own complexity. In this study, ongoing observations and unobtrusive measures provided minimum disturbance to the classroom and the teacher. The researchers established rapport with the teachers and their students by planning advance visits to the schools and by holding informal chats. Teachers became familiar with the observers and interviewers and, as a result, to a great extent carried on their classroom activities in a normal manner. Open-ended interviews invited teachers to describe their feelings and experiences from their own perspectives, in their own language, and encouraged them to feel personally involved in the study.

Participant Observer

One of the most powerful tools available to the ethnographic researcher for ascertaining the insider's viewpoint is that of participant observer. In this study, the first author functioned primarily as an observer, participating only in a limited way to gain compatibility with the teachers.

Because the participant observer is the sole or primary instrument of measurement, the use of such a tool may lack validity (Bogdan & Biklen, 1982; Borg & Gall, 1983; Leithwood & Montgomery, 1987; King, Morris, & Fitz-Gibbon, 1987). The following criteria were used to judge the validity of the participant observer approach in this study (Smith, 1978).

• Quality of Direct On-Site Observation and Face-to-Face Interviews. People often attempt to "mask" their true feelings and actions from the researcher. As will be discussed later in this paper, results from this study suggest that such masking is made more difficult when data are collected via participant observation rather than via written questionnaires. As such, the use of a participant observer was viewed as a strength of this study.

• Freedom of Access. Although the first author had general freedom of access to the research site, such freedom was limited by the need to make appointments with the teachers to view their lessons. As a result, many teachers prepared for such visits, and a "normal, unbiased picture of what was going on" may not always have been provided. The limited freedom of access was viewed as a weakness of this study, mitigated somewhat by the intensity of observation.

• Intensity of Observation. "As the amount of direct observation increases, the chances improve of obtaining a valid and credible picture of the phenomena being studied" (Borg & Gall, 1983, p. 491), and the likelihood of "faking" or "putting on an act" is decreased. In this study, the entire cycle of a school year was observed, which helped the researchers to obtain a complete and valid picture of the SMMTS implementation. For example, had observations been carried out only in May, a number of developments during the early part of the year would have been missed, such as the reactions of teachers to the lateness of materials distributed in the fall and the tensions between other school commitments and the SMMTS project. The intensity and duration of observation are seen as a strength of this study.
• Purposeful Sampling. Because information in this type of study is examined in depth, the use of large samples is prohibitive. To procure a sample that is representative of the total data universe, a strategy of purposeful sampling, as suggested by Patton (1987), was used. In this study the unit of sampling was the teacher. In an attempt to achieve diversity, in the face of a necessarily small sample, teachers were selected purposefully from different sites and different grade levels. The final sample consisted of teachers with representative characteristics in terms of their own age, experience, and attitude, and in terms of the grade, ability, and socio-economic status of the children in their classes.

• Unobtrusive Measures. Unobtrusive cues can provide insight into the phenomena being studied. One such cue in this study was the reluctance on the part of many teachers to inform the researchers as to when a technology lesson would be taught. One must question whether such uncooperative behavior is a manifestation of these teachers' concerns for the integrity of their professional images and the lack of confidence they may have had in being able to master the materials to be taught. These unobtrusive cues provided new directions for questioning and observing the teachers.

Triangulation and Multimethods

This technique refers to the strategy of using multiple sources of data and multiple methods of data collection to achieve a more complete picture of empirical reality. Consistent with the idea of triangulation, data were collected from field observations, interviews with teachers and administrators, questionnaires, and other observers. The collection of both qualitative and quantitative data provided another means by which to confirm or disconfirm findings.

Most often triangulation is thought to provide corroborative or verificatory indicators of a finding, and as such, it is typically perceived to be a strategy for improving the validity of research findings. According to Miles and Huberman (1984), "triangulation is supposed to support a finding by showing that independent measures of it agree with it or, at least, don't contradict it" (p. 234).

Triangulation is essentially a strategy that helps reduce researcher bias and allows for the elimination of plausible alternative rival hypotheses in the search for the truth (Denzin, 1978, as cited by Mathison, 1988). Mathison (1988) delineates three possible outcomes of triangulation: convergence, inconsistency, and contradiction. To benefit from the use of triangulation when multiple methods are used for assessment, care must be taken to ensure that a change in method does not also change what is being measured. Examples of convergent and inconsistent findings relative to this study are discussed in a later section of this paper.

DATA COLLECTION AND ANALYSIS

Data collected from formal and informal interviews functioned to suggest new areas of inquiry, promote rapport with participants, provide in-depth contextual information, and provide input to help formulate analytic questions. Observational data provided context information, confirmed or contradicted teacher verbal reports, and provided a basis to create observation checklists.
Observation checklists, together with general observational data, provided a measure of the match between teacher behaviors and the expected innovation behaviors. Information from other observers supplemented researcher observations, thus providing a check for researcher bias. The five quantitative measures provided quick, uniform, and easy-to-analyze data.

Within the qualitative data collection phase, the researcher alternated between the tasks of data collection and data analysis in an interactive, dynamic fashion. Data were entered into logs and then analyzed according to implied strategies, hunches, and themes. At first, qualitative data collection was open-ended, attempting to record all events, attitudes, and context. After each observation or research session, the researcher rendered a description of the people, activities, conversations, and events in the logs. The researcher also recorded ideas, strategies, reflections, and hunches, and noted patterns or themes that emerged. More data were then collected, entered, and analyzed interactively until the data collection phase came to an end. At that point, the analysis of the qualitative data contained in the logs of field notes shifted to the creation of data displays in the form of tables, matrices, or diagrams. On the quantitative side, analysis began only after the end of the data collection phase. It consisted of the formulation of statistical hypotheses regarding linear relationships and correlations, and the development of mathematical models to explain the data.

A sample of the log is provided in the excerpts that follow. All items in parentheses represent notes or memos made in transcribing. The material in brackets represents a synopsis of statements made by the researcher during the session.

Log Excerpt 1. Teacher GB, Pre-Activity Interview, 9/19/88

    I worry about using the computer. I don't want to use it before I am ready. I am frustrated. Like the other day I tried using the computer, you know, those disks he had given us . . . (these were the software programs from the workshop). I didn't get too far with any of them. It's really frustrating because you think it's really simple and yet there are things that don't go right. When you are teaching children and things don't go right . . . you lose kids that way. They're waiting, they're waiting for you and you don't know what to do and that's where you lose your class sometimes. If it doesn't work right, what do you do next?

    [I suggested a dress rehearsal, a dry run before she presented the lesson to the class.]

    I did a dry run the first day of school. I never used a computer with a printer before so I had to go to my computer aide and she had to help me. You can see the printer, it's right there (she pointed to the printer in the corner of the room), and I will have a computer right in the room. And the computer room is right next door and if there's not a scheduled class, I could take them there. I'm very lucky, in that sense. So, the first day I took a dry run. I was afraid that I might do something wrong. It wasn't coming out right. It (the computer) said, 'no data for this.' I was wondering, did I erase something? I didn't know why it didn't have it (the data). I thought when we practiced at Jefferson School, that we had it. And the computer person came in and said, 'did you take out the disk when the red light was on?' I said, 'Ohh, I might have done that.' . . . I want to be very, very sure that I don't erase any programs on the computer. I'm not sure what's on there.
    For example, today they (her students) are going to measure themselves and we are going to use it for the first lesson on the computer, in the graphing. I wasn't sure when I was trying to put this in the computer. Just what goes on the x-axis and what goes on the y-axis? And it (the computer) says, 'give it a name' and I did that. And it says, 'Now choose a field' or something, I don't know exactly what the question was. But I had difficulty doing that because when I put the thing in and it says, 'Not enough information,' it might have been something very simple that I had to do, but I didn't know what.

Log Excerpt 2. An Analytic Memo by the Researcher, 10/26/88

    It seems that a number of teachers have not told us when they gave introductory lessons or activities. For example, I know the following:

    WW did three calculator lessons, but never called.
    CK did two calculator lessons, but never called.
    CY did three computer lessons, but never called. When I asked CY to inform us as to when he plans to give a lesson, he said that he likes to be 'spontaneous'; however, he will inform us from now on.
    SM did two calculator lessons, but never called. When I asked SM to inform us, she said that she also likes to be spontaneous. When I asked her for her teaching times, she couldn't/wouldn't give them to me. She said she is 'flexible.' However, she complained that she didn't have the equipment, and if we expected her to be in the project, she should be adequately equipped.

    Observations seem to indicate that teachers are very concerned about their image as an authority, as someone in control, in the classroom. New technology, particularly the computer, seems to threaten this image. Many of the teachers are very anxious, not sure of what they know and what they don't know.

As can be noted, the log was used to record observations and events as well as to provide a mechanism for analysis. The log allowed the researcher to keep track of the development of the research and to visualize how the direction of questioning was being affected by the data already collected. The researcher continually read and re-read the log, wrote memos, coded and looked for patterns, and formulated hunches and new analytic questions. This cycle of data collection and analysis continued during the fieldwork period until the study focused on particular phenomena, and a general description and explanation of these phenomena were established. As noted earlier, during this process various techniques were used for analysis: triangulation, saturation, the search for similarities (common themes) and dissimilarities, the resolution of inconsistencies, the isolation of key events, and the writing of memos.

Following guidelines set forth by Miles and Huberman (1984), once data collection was complete, the major activities of analysis were data reduction (by making summaries and partitions, identifying clusters, and writing memos) and the creation of data displays. According to Miles and Huberman (1984), a display is an organized assembly of information that clarifies what is happening and permits the drawing of conclusions. In this study, displays took the form of summarizing tables, matrices, and diagrams expressing linkages and interactive structures (models).
Theme clusters and interactive structures were identified, and grounded theories were verified with periodic re-entry into the field for needed information.

For example, the use of a data display matrix facilitated the determination of those factors that appeared to be most relevant to the degree to which teachers implemented the technology curriculum in their classrooms. Many factors from a variety of data sources were examined, including the teacher's age, sex, teaching experience, personal commitment to the project, skill/knowledge, pre-activity attitudes, beliefs about learning, perceived rewards, scheduling difficulties, other school commitments, priorities, and training, as well as collegial interaction, extent of school personnel support, and availability and complexity of the hardware. These factors were listed horizontally in the data matrix, while the names of the teachers were listed vertically in descending order according to the extent to which they implemented the technology curriculum in their classrooms (see Fig. 1 for an excerpt of this matrix). By studying the material in this reduced and well-organized form, we were aided in our task of uncovering the factors that appeared to be most salient to implementation, or the lack thereof.

[Figure 1. Excerpt of the data display matrix: implementation factors (columns) by teacher (rows).]
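To make the form of such a display concrete, the following is a minimal sketch of a factors-by-teachers matrix sorted by extent of implementation. The teacher codes, factor names, ratings, and completion percentages are hypothetical placeholders rather than the study's data, and pandas is used here only as a convenient tabulation tool.

```python
# A minimal sketch of a data display matrix like the one described above.
# All teacher codes, ratings, and percentages below are hypothetical.
import pandas as pd

factors = ["personal_commitment", "skill_knowledge", "pre_activity_attitude",
           "collegial_interaction", "scheduling_difficulties"]

# One row per teacher: condensed qualitative ratings plus the quantitative
# measure (percentage of lessons completed).
rows = {
    "T1": ["high", "high", "high", "high", "low", 90],
    "T2": ["high", "moderate", "high", "low", "moderate", 70],
    "T3": ["moderate", "low", "moderate", "low", "high", 40],
    "T4": ["low", "low", "low", "low", "high", 15],
}

matrix = pd.DataFrame.from_dict(
    rows, orient="index", columns=factors + ["pct_lessons_completed"]
)

# List teachers in descending order of implementation, as in the study's display.
matrix = matrix.sort_values("pct_lessons_completed", ascending=False)
print(matrix.to_string())
```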
A Focus on the Barriers to Implementation

From the analysis based on the data display matrix we were able to identify, preliminarily, what appeared to be barriers to implementation. Additional qualitative analysis, based on the observational and interview data recorded in the log, as well as on new discussions with the participating teachers, the elementary school principals, and the mathematics coordinator for the district, further illuminated the barriers to implementation. A summary display of such barriers appears in Table 1. Table 1 also contains information as to which factors, with the exception of skill/knowledge, were identified as barriers by each teacher. Skill/knowledge was identified as a barrier not by the teachers, but by the researchers, based on classroom observation.

[TABLE 1. BARRIERS TO IMPLEMENTATION (n = 15). Rows list the barriers: lack of skill/knowledge; teacher beliefs in conflict with SMMTS goals; materials/lesson difficulties (presentation demands, length of lessons/background, operational difficulties); lack of time/conflict with other school commitments; and lack of support/interaction. Columns list the 15 teachers in order of degree of implementation, from least to most, with an X marking each barrier identified for that teacher. Teacher AS has been omitted because this information was not obtained for her. Skill/knowledge was assessed by the researcher; all other barriers were identified by the teachers.]

To exemplify part of the process used to arrive at Table 1, we offer the following discussion with respect to the second barrier listed in Table 1, teacher beliefs in conflict with the goals of SMMTS. This barrier represents teacher beliefs that the materials were too difficult for a particular group of students, that the topics were not appropriate for the grade level, or that the use of calculators in the classroom reduces mathematics learning. The material from the log that supported the identification of this factor as a barrier is represented by the following excerpts.

    If I taught them something and they never think about it in their own minds? When do they learn how to multiply properly? (Teacher AS, Pre-activity Interview 1, Russek, 1991)

    I'm not sure I do understand why negative numbers are . . . rules that they have to observe. (Teacher DR, Interview, Russek, 1991)

    I've gotten up to Lesson 7, and no, I don't think I'm going to go on even though it is only February. I just think it's frustrating rather than illuminating . . . we're really trying to teach, you know, like you could add and subtract exponents to do calculations, and I think the estimating of products was too hard. It was above them. They weren't that interested in it. I think that might be a little bit premature for fourth graders, at least these fourth graders.

Because 9 of the 15 teachers gave similar such descriptions, and none of these 9 implemented the curriculum to the extent intended, we concluded that teacher beliefs and values did constitute a barrier to implementation when these beliefs and values were in conflict with the goals and materials of SMMTS.

Another barrier, lack of support/interaction, was identified in similar ways. The frequency with which teachers alluded to this factor as a barrier led us to investigate further the relationship between whether a teacher worked with a colleague at school and the number of lessons that teacher completed. This analysis was carried out quantitatively by computing a Pearson product-moment correlation between these two variables. The resulting correlation of 0.56, based on the sample of 16 teachers, suggested that a moderate correlation exists between these two variables; that is, in general, those teachers who tended to work with other teachers in their school completed more lessons than others.

The information in Table 1 motivated additional quantitative analyses as well. For example, from Table 1 it appears that those teachers with low skill/knowledge report more obstacles to implementation of the technology-based curriculum than teachers with high skill/knowledge. To verify this impression we computed the mean number of obstacles (excluding skill/knowledge) reported by teachers of low skill/knowledge and by teachers of high skill/knowledge. The respective means of 5.3 and 2.5 suggest that our impressions were correct: teachers with low skill/knowledge did report more difficulties implementing the curriculum than teachers with high skill/knowledge.
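As a sketch of this quantitative step, the snippet below computes a Pearson product-moment correlation and the two group means from hypothetical stand-in data; the study's own figures are the r = 0.56 correlation and the means of 5.3 and 2.5 reported above.

```python
# Illustrative only: the values below are hypothetical stand-ins for the
# study's 16 teachers, not its actual data.
from math import sqrt
from statistics import mean

works_with_colleague = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 0]  # 1 = yes
lessons_completed    = [14, 12, 5, 10, 6, 4, 13, 3, 9, 11, 7, 2, 15, 5, 8, 6]

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r = {pearson_r(works_with_colleague, lessons_completed):.2f}")

# The same style of check for the obstacle counts: compare the mean number of
# reported obstacles for low- versus high-skill/knowledge teachers.
obstacles_low_skill = [6, 5, 7, 4, 5, 5]    # hypothetical counts
obstacles_high_skill = [2, 3, 2, 3, 2]      # hypothetical counts
print(f"mean obstacles, low skill/knowledge:  {mean(obstacles_low_skill):.1f}")
print(f"mean obstacles, high skill/knowledge: {mean(obstacles_high_skill):.1f}")
```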
METHODOLOGICAL IMPLICATIONS

We anticipated at the beginning of the study that the quantitative measures would be used to validate the more subjective, qualitative ones. In fact, however, the reverse was the case. The qualitative measures, in a triangulation context, served either to validate or invalidate the quantitative measures used in this study. Based on this finding, it became clear that the use of both qualitative and quantitative measures provided important data for triangulation, and that, in general, the mixed method paradigm served this study well.

Triangulation: Convergence and Inconsistency

As noted earlier, triangulation efforts can lead to convergence, inconsistency, or contradiction. Although most of the data collected in this study converged to provide a consistent, overall picture of implementation, there were some instances of inconsistency. Case 1 presents a situation of convergence, whereas Case 2 presents a situation of inconsistency.

Case 1. One of the key variables of this implementation study was the degree to which each teacher implemented the lessons in her or his classroom, operationally defined in quantitative terms as the number (or percentage) of lessons taught by each teacher. Three different sources of information were used to assess this variable: teacher verbal reports, classroom observation, and lesson evaluation forms. A summary of this information is provided in Table 2. For seven teachers, there is complete agreement among data sources; for eight teachers, there is agreement between two of the three data sources; and, for one teacher, all three indicators are different from each other. Accordingly, in this case, there appears to be good agreement among indicators; triangulation has resulted in convergence.

[TABLE 2. TRIANGULATION: THE QUANTITATIVE DEGREE OF IMPLEMENTATION. For each of the 16 teachers, the degree of implementation (Low, Moderate, or High) as indicated by each of three data sources: teacher verbal reports, classroom observations, and LEFs.]
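The agreement counts reported for Table 2 can be tallied mechanically once each teacher's three ratings are in hand. The sketch below shows one way to do so; the teacher codes and ratings are hypothetical, not the table's actual entries.

```python
# Tally agreement among three data sources per teacher (hypothetical data).
from collections import Counter

# (verbal report, classroom observation, LEF) for each teacher
ratings = {
    "T1": ("low", "low", "low"),
    "T2": ("moderate", "moderate", "low"),
    "T3": ("high", "moderate", "low"),
    "T4": ("high", "high", "high"),
}

def agreement(triple):
    """Classify a teacher's three ratings by how many distinct values they hold."""
    labels = {1: "all three sources agree", 2: "two of three agree", 3: "all differ"}
    return labels[len(set(triple))]

tally = Counter(agreement(t) for t in ratings.values())
for outcome, count in tally.items():
    print(f"{outcome}: {count} teacher(s)")
```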
Case 2. A self-evaluation questionnaire, consisting of twenty Likert-scale items adapted from Spielberger (1977), was administered in December to all teachers to assess their anxiety level in the classroom when teaching the calculator and computer lessons. To tap this same construct through qualitative means, teachers were asked, in face-to-face interviews, questions that paralleled closely those on the self-evaluation questionnaire. Their levels of anxiety were also assessed via classroom observations. An analysis of data from all sources revealed that inconsistency existed between the quantitative and qualitative methods of assessment. Responses to the questionnaires were in sharp contrast to the attitudes expressed in face-to-face interviews and the attitudes observed through classroom observation. Teachers who voiced deep concerns in interviews and who appeared very anxious when observed in the classroom did not, in general, evaluate themselves as anxious on the written questionnaire. After weighing the two types of evidence, qualitative and quantitative, it became clear that the paper-and-pencil self-evaluation questionnaire, the quantitative measure, was the less valid of the two types.

There are several possible explanations for this observed inconsistency. The qualitative and quantitative assessment procedures were purportedly tapping anxiety, uncertainty, and perhaps, in the minds of the teachers, professional competence. The teachers appeared to be reluctant to put on paper what they freely revealed orally. Perhaps the paper-and-pencil questionnaire was more intimidating to teachers in this setting than in other settings where the scale has been used successfully. Perhaps the teachers perceived the questionnaire as a statement of incompetence that could be put into their files as a permanent record and become professionally damaging, whereas they perceived talking with someone as being far more benign. One might infer, if this is the case, that, in general, teachers exercise more caution in completing written self-evaluations than in responding orally to an interviewer; and, conversely, that open-ended, oral formats provide a more comfortable forum in which to reveal true feelings. We agree with Borg and Gall (1983) and Patton (1980) that although questionnaires are easier to administer and analyze, and are, in general, free from researcher bias, they must be used with caution, particularly if the construct to be measured is of a sensitive nature.

The Complementarity of Qualitative and Quantitative Procedures

The following two cases illustrate how a quantitative analysis of qualitative data can produce illuminating results.

Case 1. Although extent of implementation by teacher was operationally defined quantitatively as the percentage of lessons a teacher completes, it was also defined qualitatively as the fit between a teacher's actual behavior and her behavior as expected by the curriculum developers. Teachers were expected to display facility with the calculator or computer, knowledge of the principles that underlie the innovation, and effective classroom management while using the technology-based materials. It is obviously possible for a teacher to have completed all lessons, but to have done so poorly. Such a teacher would be rated as high on quantity, but low on quality, resulting in a poor fit between the quantity and quality measures of extent of implementation. Other combinations are possible as well. To determine how teachers were distributed in terms of these two measures of extent of implementation, the scatterplot of Figure 2 was constructed. According to Figure 2, teachers AA, SM, CB, and BC are rated as high in both quantity and quality of implementation, whereas teachers WW, ES, and CY are rated as high in quality of implementation, but low in quantity of lessons completed.

[Figure 2. Extent of Calculator Implementation: Quantity versus Quality.]

From a qualitative perspective, the scatterplot of Figure 2 generated a number of analytic questions, some of which led to re-entry into the field to gather the additional data that would answer these focused questions: Why did teachers WW, ES, and CY complete so few lessons when the quality of their implementations was so high? Why did teachers RB and BR complete so many lessons, but with low quality (in a way that was counter to the expectations of the developers)? Why did 11 of the 16 teachers complete less than 75% of the lessons? Could anything be done to help the eight teachers in the low quality group to perform more effectively?

From a quantitative perspective, an overall index, I, of calculator implementation was formulated,

    I = Q · P,

where P is the proportion of lessons completed and Q is a quality weighting factor describing the effectiveness of the teacher in fulfilling the objectives of the lessons completed. This numerical index enabled the researchers to rank order teachers in a simple and straightforward way according to their overall extent of calculator implementation and then, as a result, to identify more easily the factors salient to such implementation.
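A small sketch of how such an index can be computed and used to rank teachers follows. The formula is read here as the product of the quality weight and the proportion of lessons completed, with Q assumed to lie between 0 and 1; the teacher codes and values are hypothetical.

```python
# Overall implementation index, read as I = Q * P (a sketch with hypothetical data).
# Q: quality weighting factor, assumed here to lie in [0, 1];
# P: proportion of lessons completed.
teachers = {
    "T1": (0.90, 0.95),
    "T2": (0.80, 0.40),
    "T3": (0.50, 0.90),
    "T4": (0.30, 0.35),
}

index = {code: q * p for code, (q, p) in teachers.items()}

# Rank-order teachers by overall extent of calculator implementation.
for code, value in sorted(index.items(), key=lambda item: item[1], reverse=True):
    print(f"{code}: I = {value:.2f}")
```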
Case 2. Once the matrix of all possible factors related to implementation was constructed (see Fig. 1), the researcher looked for some way to link these factors to the actual extent of implementation. Teachers were ranked according to their value on the index, I, defined in Case 1. Ratings of high, moderate, and low were then assigned to each teacher according to statements they made regarding each factor. For example, with respect to the factor Expressions of Personal Commitment, the statement, "I said I would do it and I did," was rated as high, whereas the statement, "I haven't really examined the program yet," was rated as low (see Fig. 1).

Analyses of these ratings revealed that three factors appeared to be necessary for there to be maximal calculator implementation: strong teacher commitment, positive pre-activity attitudes, and calculator skill and knowledge.

Continuing in a quantitative mode of analysis, a Venn diagram was constructed to show how the teachers were distributed among these three factors. An extent of implementation scale was added for reference (see Fig. 3). The display of Fig. 3 reveals that all three factors were necessary for high implementation and that no one factor dominated. In general, the farther a teacher is from the intersection of all three factors, the lower that teacher is on the implementation scale. To verify that no one factor dominated, three conditional probabilities were calculated, one for each factor. If all such probabilities are approximately equal, then one could conclude that no one factor dominated. The conditional probability of high implementation given high skill and knowledge equals 0.50; the conditional probability of high implementation given high personal commitment also equals 0.50; and the conditional probability of high implementation given positive attitude equals 0.44. Accordingly, it appears that, as Figure 3 suggests, no one factor dominated.

CONCLUSIONS

According to Eisenhart (1988), numerous mathematics education researchers, particularly those interested in what teachers and students are thinking and doing in
