How to Conduct Empirical Research

Fundamental empirical research principles (version 1.0, 1/4/05)
Daniel K. Schneider, TECFA, University of Geneva
Research Design for Educational Technologists © TECFA

Menu
1. The logic of empirical research
2. Objectives
3. Conceptualizations
4. The measure
5. Interpretation: validity (truth) and causality
6. Conclusion

1. The logic of empirical research

1.1 Elements of a typical research cycle
• Details may change considerably within a given approach.
[Figure: the research cycle — 1. Objectives and theory (subject, literature review, research questions) → 2. Conceptualisations (analytical frameworks, hypotheses, analysis grids, operationalization) → 3. Artifacts (e.g. experimental material, implementation) → 4. Measures (sampling, data gathering) → 5. Analysis and conclusions (analysis, comparison of results with other work).]

1.2 Key elements of empirical research
For a given research question, you usually do: Conceptualizations → Artifacts → Measure(s) → Analysis & conclusions.
• Conceptualisations: make questions explicit, identify major concepts (variables), define terms and their dimensions, find analysis grids, define hypotheses, etc.
• Artifacts: develop research materials (experiments, surveys), implement software, etc.
• Measures: observe (measure) in the field or through experiments (using your artifacts).
• Analysis & conclusions: analyze the measures (statistically or qualitatively) and link them to theoretical statements (e.g. operational research questions and hypotheses).

2. Objectives

Research questions are the result of:
• your initial objectives (which you may have to revise)
• a (first) review of the literature
Everything you plan to do must be formulated as a research question.
• See the slides on "Finding a research subject".

3. Conceptualizations
• Elaborate and "massage" concepts so that they can be used to study observable phenomena.

3.1 The usefulness of analysis frameworks
E.g. activity theory. Quote:
"The Activity Triangle Model or activity system representationally outlines the various components of an activity system into a unified whole. Participants in an activity are portrayed as subjects interacting with objects to achieve desired outcomes. In the meanwhile, human interactions with each other and with objects of the environment are mediated through the use of tools, rules and division of labour. Mediators represent the nature of relationships that exist within and between participants of an activity in a given community of practices.
This approach to modelling various aspects of human activity draws the researcher's attention to factors to consider when developing a learning system. However, activity theory does not include a theory of learning." (Daisy Mwanza & Yrjö Engeström)
• Translation: it helps us think about a phenomenon to study.
• A framework is not true or false, just useful or useless for a given intellectual task.

3.2 Models and hypotheses
• These constructions link concepts and postulate causalities.
• Causalities between concepts (theoretical variables) do not "exist" per se; they can only be observed indirectly.
• Typical statements: "More X leads to more Y", "An increase in X leads to a decrease in Y".

Example 3-1: Causality between teacher training and quality
Hypothesis (often heard): continuous teacher training (cause X) improves teaching (Y). The chain from postulate to conclusion:
• Postulate: X = teacher training → Y = quality of teaching (cause ???)
• Observed dimension: X = amount of training → Y = grades of pupils
• Indicators: X = days of teacher training per year → Y = average grades per class per year
• Observations: correlation between the indicators across cases of N classes + statistical analysis → conclusions
• X is the explanatory (independent) variable, Y the explained (dependent) variable.

3.3 The importance of difference (variance) for explanations
Without variance, no differences... and no explanatory science: we'd like to know why things exist, why we can observe "more" and "less". Without co-variance, no correlations or causalities... and no explanation.

A. Quantitative example:
[Figure: scatter plot of average grades (y-axis) against training days (x-axis) for teachers a, b, c, d.]
• We got different grade averages and different training days,
• therefore variance on both variables.
• According to these data: more training days lead to lower averages.
• (Please consider this hypothetical example false.) A correlation sketch follows below.
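To make the variance and co-variance requirements concrete, here is a minimal Python sketch of the hypothetical example above. The numbers are invented (and, as the slide says, the negative relationship should be considered false); only the mechanics matter: both variables must vary, and Pearson's r summarizes how they co-vary.

```python
# Minimal sketch of the variance / co-variance idea. The four teachers and
# all values are invented, mirroring the (deliberately false) slide example.
import numpy as np

training_days = np.array([0, 2, 5, 8])            # X: days of training / year
average_grades = np.array([4.8, 4.5, 4.1, 3.9])   # Y: average grades / class

# Both variables must vary, otherwise there is nothing to correlate.
print("var(X) =", training_days.var(), " var(Y) =", average_grades.var())

# Pearson correlation: a negative r reproduces the slide's (hypothetical!)
# "more training days lead to lower averages" pattern.
r = np.corrcoef(training_days, average_grades)[0, 1]
print("r =", round(r, 2))
```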
B. Qualitative example
Imagine that we wish to know why certain private schools introduce technology faster than others. One hypothesis to test could be: "Reforms need external pressure."
Possible strategies of a school: (1) no reaction, (2) internal training programs are created, (3) a task force is created, (4) resources are reallocated. Observed reactions by type of pressure (N = number of observations, p = proportion within the row):
• letters written by parents: no reaction (N=4, p=0.8); internal training programs created (N=1, p=0.2)
• letters written by supervisory boards: internal training programs created (N=2, p=0.4); a task force created (N=3, p=0.6)
• newspaper articles: resources reallocated (N=1, p=1.0)
• Result (imaginary): increased pressure leads to increased action.

3.4 How can we measure general concepts?
A scientific proposition contains concepts (theoretical variables).
• Examples: "the learner", "performance", "efficiency", "interactivity".
An academic piece links concepts:
• empirical research requires that you work with data, find indicators, build indices, ...
• from observed correlations we can make statements at the theory level.

Example 3-2: "Collaborative learning improves pedagogical effect": ???
[Diagram: Concept X = collaborative learning (cause) → Concept Y = pedagogical effect (conclusion). Candidate measures of X: cost, observed collaborative activities; candidate measures of Y: motivation, task performance, meta-cognitive abilities; observed correlations (which ones ???) in between.]
• We have a real problem here: how could we measure "pedagogical effect" or "collaborative learning"?

A. The bridge/gap between theoretical concept and measure
There are 2 issues you must address:
(1) Going from "abstract" to "concrete" (theoretical concept → observables). Examples:
• measuring "student participation" by the "number of forum messages posted"
• measuring "pedagogical success" by the "grade average of a class in exams"
(2) "Whole - part" (dimensions). Examples from educational design, i.e. dimensions you might consider when you plan to measure the socio-constructiveness of some teaching:
• decomposition of "socio-constructivist design" into (1) active or constructive learning, (2) self-directed learning, (3) contextual learning, (4) collaborative learning, (5) teacher's interpersonal behaviour (Dolmans et al.)
• the Five Es socio-constructivist teaching model: Engagement, Exploration, Explanation, Elaboration and Evaluation (Boddy et al.)
Example from public policy analysis:
• decomposition of "economic development" into industrialization, urbanization, transport, communications and education.
Example from HCI:
• decomposition of usability into "cognitive usability" (what you can achieve with the software) and "simple usability" (can you navigate, find buttons, etc.).

Example 3-3: COLLES — Constructivist On-Line Learning Environment Survey (Taylor and Maor)
Dimensions (from survey studies on teacher education over the Internet):
• Relevance: How relevant is on-line learning to students' professional practices?
• Reflection: Does on-line learning stimulate students' critical reflective thinking?
• Interactivity: To what extent do students engage on-line in rich educative dialogue?
• Tutor Support: How well do tutors enable students to participate in on-line learning?
• Peer Support: Is sensitive and encouraging support provided on-line by fellow students?
• Interpretation: Do students and tutors make good sense of each other's on-line communications?
Each dimension is then measured with a few survey questions (items), each answered on the scale Almost Never / Seldom / Sometimes / Often / Almost Always (a scoring sketch follows the item list):
Items concerning relevance:
• my learning focuses on issues that interest me
• what I learn is important for my professional practice as a trainer
• I learn how to improve my professional practice as a trainer
• what I learn connects well with my professional practice as a trainer
Items concerning reflection:
• I think critically about how I learn
• I think critically about my own ideas
• I think critically about other students' ideas
• I think critically about ideas in the readings
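As a concrete illustration of going from items to dimension scores, here is a minimal Python sketch. The 1-to-5 coding of the answer categories and the per-dimension averaging are common survey conventions assumed here for illustration; they are not necessarily Taylor and Maor's exact scoring procedure, and the item wordings are abridged.

```python
# Sketch: turning COLLES-style Likert answers into per-dimension scores.
# The 1..5 coding and simple averaging are assumed conventions.
SCALE = {"Almost Never": 1, "Seldom": 2, "Sometimes": 3,
         "Often": 4, "Almost Always": 5}

ITEMS = {  # item -> dimension (wording abridged from the survey above)
    "learning focuses on issues that interest me": "Relevance",
    "what I learn is important for my practice": "Relevance",
    "I think critically about how I learn": "Reflection",
    "I think critically about my own ideas": "Reflection",
}

def dimension_scores(answers: dict[str, str]) -> dict[str, float]:
    """Average the coded answers of all items belonging to a dimension."""
    by_dim: dict[str, list[int]] = {}
    for item, answer in answers.items():
        by_dim.setdefault(ITEMS[item], []).append(SCALE[answer])
    return {dim: sum(vals) / len(vals) for dim, vals in by_dim.items()}

print(dimension_scores({
    "learning focuses on issues that interest me": "Often",
    "what I learn is important for my practice": "Almost Always",
    "I think critically about how I learn": "Sometimes",
    "I think critically about my own ideas": "Often",
}))
# -> {'Relevance': 4.5, 'Reflection': 3.5}
```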
Example 3-4: measure of economic development
• usage of official statistics
• (only part of the diagram is shown)
[Diagram: the concept to explain (e.g. economic development) is captured by a global index and decomposed into dimensions (e.g. industrialization, urbanization); each dimension gets an index built from indicators (e.g. GNP, private energy use, roads), which rest on concrete measures (GNP 2004, km of roads per km² in 2002).]

Example 3-5: measure of the strategic efficiency of a private distance teaching agency
• example taken from a French methodology textbook (Thiétart, 1999)
• concept: strategic efficiency
• dimensions: commercial performance, financial performance
• indicators: turnover, profits
• index: profits / turnover
A sketch of this concept → dimension → indicator chain follows below.
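The concept → dimensions → indicators → measures chain of Examples 3-4 and 3-5 can be sketched as a small computation. Every number below, the assumption that indicators are already normalized to the 0..1 range, and the equal weighting of indicators and dimensions are invented for illustration; a real study would have to justify each of these choices.

```python
# Hypothetical sketch: building a global index from dimensions and
# indicators. All values, the normalization and the weights are assumptions.
concept = {  # dimension -> {indicator: normalized measure in 0..1}
    "industrialization": {"GNP_2004": 0.62},
    "urbanization": {"roads_km_per_km2_2002": 0.48,
                     "private_energy_use": 0.55},
}

def dimension_index(measures: dict[str, float]) -> float:
    """One index per dimension: here simply the mean of its indicators."""
    return sum(measures.values()) / len(measures)

# Global index: mean of the dimension indices (equal weights assumed).
global_index = sum(dimension_index(m) for m in concept.values()) / len(concept)
print(f"global index = {global_index:.3f}")

# Example 3-5 in one line: strategic-efficiency index = profits / turnover
print(f"profits/turnover = {150_000 / 1_200_000:.3f}")  # 0.125
```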
B. Dangers and problems of concept operationalization
1. Gap between data and theory
• Example: measuring communication within a community of practice (e.g. an e-learning group) by the quantity of exchanged forum messages.
• (Students may use other channels to communicate.)
2. You forgot a dimension
• Example: measuring classroom usage of technology only by looking at the technology the teacher uses, e.g. PowerPoint or demonstrations with simulation or math software.
• (You don't take technology-enhanced student activities into account.)
3. Concept overloading
• Example: including "education" in the definition of development. (It could be done, but at the same time you lose an important explanatory variable for development; consider e.g. India's strategy, which "overinvested" in education with the goal of influencing development.)
• Therefore: never collapse explanatory and explained variables into one concept.
4. Bad measures
• (see below)

4. The measure
• Observe properties, attributes, behaviors, etc.
• Select the cases you study (sampling).

4.1 Sampling
As a general rule:
• Make sure that "operative" variables have good variance; otherwise you can't make any statements about causality or difference.
• Operative variables = dependent (to explain) and independent (explaining) variables.
Overview of sampling strategies (type of selected cases → usage):
• maximal variation: gives better scope to your results (but needs more complex models; you have to control more intervening variables, etc.)
• homogeneous: provides better focus and conclusions; "safer", since it is easier to identify explaining variables and to test relations
• critical: exemplifies a theory with a "natural" example
• according to theory (i.e. your research questions): gives better guarantees that you will be able to answer your questions
• extremes and deviant cases: test the boundaries of your explanations, seek new adventures
• intense: complete a quantitative study with an in-depth study
• Sampling strategies depend a lot on your research design.

4.2 Measurement techniques
• There are not only numbers, but also text, photos and videos.
• Not treated here; see the modules "Quantitative data acquisition methods (e.g. surveys and tests)" and "Qualitative data acquisition methods (e.g. interviews and observations)".
Principal forms of data collection (situation × articulation):
• informal: informal interviews (verbal, oral); text analysis, log file analysis, etc. (verbal, written); participatory observation (non-verbal)
• formal and unstructured / semi-structured: open interviews, semi-structured interviews, thinking-aloud protocols, etc. (verbal, oral); open questionnaires, journals, vignettes (verbal, written); systematic observation (non-verbal)
• formal and structured: standardized interviews (verbal, oral); standardized questionnaires, log files of structured user interactions (verbal, written); experiments, simulations (non-verbal)

4.3 Reliability of measure
Reliability = degree of measurement consistency for the same object:
1. by different observers
2. by the same observer at different moments
3. by the same observer with (moderately) different tools
Example: measuring boiling water.
• One thermometer always shows 92 °C: it is reliable (but not valid).
• Another shows between 99 and 101 °C: it is not very reliable (but valid).
Sub-types of reliability (Kirk & Miller):
1. circumstantial reliability: even if you always get the same result, it does not mean that the answers are reliable (e.g. people may lie)
2. diachronic reliability: the same kinds of measures still work after time has passed
3. synchronic reliability: we obtain similar results by using different techniques, e.g. survey questions with item matching, plus in-depth interviews
In short: can we reproduce and replicate, can we trust the data? (See the sketch after the "3 Cs" below.)

The "3 Cs" of an indicator
Are your data complete?
• Sometimes you lack data...
• Try to find other indicators.
Are your data correct?
• The reliability of indicators can be bad.
• Example: software ratings may not mean the same thing across cultures (sub-cultures, organizations, countries); people are more or less outspoken.
Are your data comparable?
• The meaning of certain data is not comparable.
• Examples: (a) school budgets don't mean the same thing in different countries (different living costs); (b) the percentage of student activities in the classroom doesn't measure the "socio-constructive" sensitivity of a teacher (since there are huge cultural differences between school systems).
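To close, a minimal Python sketch of the boiling-water example from section 4.3: repeated measurements of the same object separate reliability (low spread) from validity (mean close to the true value, here 100 °C). The readings are invented for illustration.

```python
# Repeated measurements of boiling water (true value: 100 C).
import statistics

thermometer_a = [92.0, 92.0, 92.1, 91.9, 92.0]    # consistent but wrong
thermometer_b = [99.2, 100.8, 99.9, 100.5, 99.4]  # roughly right but noisy

for name, readings in [("A", thermometer_a), ("B", thermometer_b)]:
    mean = statistics.mean(readings)      # close to 100 -> valid
    spread = statistics.stdev(readings)   # small -> reliable
    print(f"{name}: mean={mean:.1f} (validity), stdev={spread:.2f} (reliability)")
```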
