Key Takeaways
1. Select a Research Approach Guided by Worldview, Design, and Methods
In planning a study, researchers need to think through the philosophical worldview assumptions that they bring to the study, the research design that is related to this worldview, and the specific methods or procedures of research that translate the approach into practice.
Foundational choices. Every research project begins with fundamental decisions about its overall approach, which is a blend of philosophical assumptions, specific designs, and practical methods. These choices are not arbitrary but are deeply influenced by the researcher's worldview, the nature of the problem, personal experience, and the intended audience. Understanding this interconnectedness is crucial for a coherent and defensible research plan.
Worldviews shape inquiry. Four primary worldviews guide research: postpositivism (testing objective theories, reductionistic, empirical), constructivism (understanding subjective meanings, social/historical context, theory generation), transformative (political, power/justice-oriented, collaborative, change-oriented), and pragmatism (problem-centered, pluralistic, real-world practice). Each worldview dictates how knowledge is sought and what constitutes valid inquiry. For instance:
- Postpositivism: Often leads to quantitative designs like experiments.
- Constructivism: Favors qualitative designs such as ethnographies or case studies.
- Transformative: Integrates advocacy with research, often using mixed methods.
- Pragmatism: Embraces mixed methods, prioritizing what works to solve the problem.
Designs and methods. Once a worldview is established, it informs the choice of research design (e.g., survey, experiment, narrative, grounded theory, convergent mixed methods) and specific data collection methods (e.g., instruments, interviews, observations). This systematic progression ensures that the research approach is internally consistent and appropriate for the inquiry. The research problem itself often dictates the most suitable approach; for example, exploring a poorly understood phenomenon calls for qualitative methods, while testing causal relationships requires quantitative ones.
2. Anchor Your Study in a Thorough Literature Review
Studies need to add to the body of literature on a topic, and literature sections in proposals are generally shaped from the larger problem to the narrower issue that leads directly into the methods of a study.
Justify your research. A comprehensive literature review is essential for establishing the relevance and novelty of your proposed study. It demonstrates that you are aware of existing scholarship, helps identify gaps, extends prior work, and provides a framework for interpreting your findings. This process ensures your research contributes meaningfully to the ongoing academic dialogue.
Systematic search. Conducting a literature review involves a structured approach:
- Identify keywords: Use terms relevant to your topic.
- Search databases: Utilize academic resources like ERIC, Google Scholar, PsycINFO, EBSCO, and SSCI.
- Prioritize sources: Focus on recent refereed journal articles and scholarly books, then conference papers and dissertations.
- Abstract studies: Summarize key elements (problem, purpose, sample, results, flaws) for empirical works, or central themes and conclusions for conceptual pieces.
Visualize the landscape. A literature map, a visual representation of existing research, helps organize the literature hierarchically or thematically, positioning your study within the broader field. This map clarifies how your research extends, replicates, or fills gaps in previous studies, making a strong case for its contribution. Defining key terms, grounded in the literature, further enhances precision and clarity for the reader.
3. Integrate Theory Appropriately for Your Research Approach
In quantitative research, some historical precedent exists for viewing a theory as a scientific prediction or explanation for what the researcher expects to find.
Theory's diverse roles. The application of theory varies significantly across quantitative, qualitative, and mixed methods research, serving either as a guiding framework or an emergent outcome. Understanding these distinctions is crucial for its effective integration into a research proposal.
Quantitative theory use. In quantitative studies, theory is typically used deductively, providing a testable explanation for relationships between variables. Researchers identify independent, dependent, mediating, and moderating variables, then propose a theory (often from established social science disciplines) that explains why these variables are related. This theory can be presented as:
- A series of interconnected hypotheses.
- If-then logic statements.
- A visual causal model, illustrating variable relationships.
The theory is usually introduced early in the proposal, serving as a framework for the entire study.
Qualitative theory use. Qualitative research employs theory in more varied ways. It can be:
- A broad explanation: Like cultural themes in ethnography, guiding the inquiry.
- A theoretical lens: Such as feminist or critical theory, shaping questions and interpretations to address issues of power and social justice.
- An emergent outcome: As in grounded theory, where a theory or pattern is inductively developed from the data itself, appearing at the end of the study.
Some qualitative studies may even proceed without explicit theory, focusing purely on rich description.
Mixed methods theory use. Mixed methods research can incorporate theory both deductively and inductively. It may use a social science theory as an overarching framework, guiding both quantitative and qualitative components, or adopt a transformative framework. A transformative framework, for instance, explicitly addresses issues of discrimination and oppression, framing the entire study with an ethical stance of inclusion and a call for social change. This integration ensures that the theoretical underpinnings align with the mixed nature of the inquiry.
4. Structure Your Proposal with Clarity and Ethical Foresight
Ethical questions are apparent today in such issues as personal disclosure, authenticity, and credibility of the research report; the role of researchers in cross-cultural contexts; and issues of personal privacy through forms of Internet data collection.
Proposal as an argument. A research proposal is fundamentally an argument for conducting a study, requiring a clear, interconnected structure that addresses key questions about the topic, methods, and significance. Early planning of this structure is vital for a cohesive and persuasive document.
Tailored formats. Proposal formats vary by research approach:
- Qualitative: Often includes an introduction, procedures (worldview, design, researcher role, data collection/analysis, validation), ethical issues, preliminary findings, and expected impact.
- Quantitative: Typically follows a standard journal article structure: introduction, literature review, methods (design, population, instruments, data analysis), ethical issues, and pilot tests.
- Mixed Methods: Combines elements of both, specifying the rationale for mixing, the design type, and separate quantitative and qualitative data collection/analysis steps, often making it lengthier.
Writing with precision. Effective writing is crucial for readability and clarity. Key strategies include:
- Writing as thinking: Draft ideas quickly to clarify thoughts, then refine through multiple revisions.
- Consistent terms: Use the same terminology for variables or phenomena throughout.
- Narrative flow: Employ "umbrella thoughts," "big thoughts," and "little thoughts" to guide the reader, ensuring coherence through logical connections between sentences and paragraphs (the "hook-and-eye" technique).
- Concise language: Favor active voice and strong verbs, and eliminate unnecessary words ("trimming the fat").
Anticipate ethical issues. Ethical considerations must be woven throughout the research plan, from conception to reporting. This involves:
- Prior to study: Consulting professional codes of ethics, obtaining IRB approval, securing site permissions, and negotiating authorship.
- Beginning study: Identifying beneficial problems, disclosing purpose, ensuring voluntary participation, and respecting cultural norms.
- Data collection: Minimizing disruption, ensuring equitable benefits, avoiding deception, respecting power imbalances, and protecting against harmful disclosures.
- Analysis & reporting: Avoiding bias, reporting all results (positive and negative), protecting participant privacy, ensuring accurate reporting, avoiding plagiarism, sharing data responsibly, and clarifying data ownership.
5. Craft a Compelling Introduction and Precise Purpose Statement
The introduction is the part of the paper that provides readers with the background information for the research reported in the paper. Its purpose is to establish a framework for the research, so that readers can understand how it is related to other research.
Setting the stage. The introduction is the gateway to your research, designed to capture reader interest, establish the problem, contextualize the study within existing literature, and address a specific audience. It culminates in the purpose statement, the most critical sentence in your entire study.
The deficiencies model. A robust introduction often follows a "deficiencies model," comprising five key parts:
- Narrative hook: An engaging opening sentence that broadly introduces the topic and piques interest.
- Research problem: Clearly identify the issue or dilemma that necessitates the study, supported by relevant citations.
- Review of studies: Briefly summarize broad categories of existing literature related to the problem, highlighting what has been done.
- Deficiencies in literature: Point out gaps, limitations, or areas overlooked by previous research, and explain how your study will address these.
- Significance of study: Articulate the practical, theoretical, or policy implications of your research for various audiences.
The purpose statement. This concise statement, typically one or two sentences, declares the overall intent or objective of your study. It is distinct from the research problem (the need for the study) and research questions (what will be answered).
- Qualitative: Focuses on exploring a single central phenomenon, using action verbs like "understand," "explore," or "discover," and specifies participants and site.
- Quantitative: Identifies variables (independent, dependent, mediating, moderating), their relationships (e.g., "relate," "compare"), the theory being tested, participants, and site. Variables are typically ordered from independent to dependent.
- Mixed Methods: States the overall content aim, the specific mixed methods design, and the rationale for combining both quantitative and qualitative data.
6. Formulate Focused Research Questions or Hypotheses
Quantitative hypotheses, on the other hand, are predictions the researcher makes about the expected outcomes of relationships among variables.
Guiding the inquiry. Research questions and hypotheses serve as crucial signposts, narrowing the broad purpose statement into specific inquiries that the study will address. Their formulation differs significantly between qualitative and quantitative approaches, with mixed methods integrating both.
Qualitative questions. In qualitative research, questions are open-ended and exploratory, reflecting an emergent design:
- Central question: One or two broad questions exploring the central phenomenon (e.g., "What is the meaning of X for Y at Z?").
- Subquestions: Five to seven narrower questions that delve into aspects of the central question, guiding data collection (e.g., interview prompts).
- Key characteristics: Begin with "what" or "how," use exploratory verbs (understand, explore, discover), focus on a single phenomenon, and specify participants and site. Questions are expected to evolve during the study.
Quantitative questions and hypotheses. Quantitative studies use questions or hypotheses to examine relationships among variables:
- Research questions: Interrogative statements about relationships (e.g., "Does X explain the relationship between A and B?").
- Hypotheses: Predictive statements about expected outcomes, often used in experiments.
- Null hypothesis: States no significant difference or relationship (e.g., "There is no significant difference between Group A and Group B on Y").
- Alternative/Directional hypothesis: Predicts a specific outcome or direction (e.g., "Group A will have higher scores than Group B on Y").
- Nondirectional hypothesis: Predicts a difference but not its direction (e.g., "There is a difference between Group A and Group B on Y").
- Guidelines: Variables (independent, dependent, mediating, moderating) must be clearly defined and measurable, and typically ordered from independent to dependent. Descriptive questions for each variable can precede inferential questions or hypotheses.
Mixed methods questions. A strong mixed methods study includes quantitative questions/hypotheses, qualitative questions, and a unique mixed methods question that explicitly addresses the integration of the two strands. This mixed methods question can focus on:
- Methods: How one method informs the other (e.g., "Does qualitative data explain quantitative results?").
- Content: How themes from one method relate to variables in the other (e.g., "Does social support explain bullying?").
- Combined: Integrating both methods and content (e.g., "How do qualitative interviews further explain the quantitative relationship between X and Y?").
These questions are typically presented in the order they will be addressed in the study, with the mixed methods question often placed last to highlight the integration.
7. Design Rigorous Quantitative Methods for Surveys and Experiments
In an experiment, investigators may also identify a sample and generalize to a population; however, the basic intent of an experimental design is to test the impact of a treatment (or an intervention) on an outcome, controlling for all other factors that might influence that outcome.
Systematic inquiry. Quantitative methods provide a structured approach to describing trends, attitudes, or testing the impact of interventions, relying on numerical data and statistical analysis. A well-designed methods section for quantitative research details the specific procedures for surveys or experiments.
Survey method plan:
- Design: State the purpose of the survey (generalize from sample to population), rationale for its choice (e.g., economy, rapid data collection), and type (cross-sectional or longitudinal).
- Population & Sample: Identify the population size, sampling design (single-stage or multistage), selection process (random, systematic, or convenience), and whether stratification will be used. Specify sample size calculation (e.g., power analysis, margin of error).
- Instrumentation: Name the survey instrument, describe its development (designed, modified, or intact), and report established validity (content, predictive, construct) and reliability (internal consistency, test-retest) of scores. Include sample items and plans for pilot testing.
- Variables: Cross-reference variables with research questions/hypotheses and specific survey items, often using a table.
- Data Analysis & Interpretation:
- Report response rates and assess response bias.
- Provide descriptive analysis (means, standard deviations, range).
- Identify statistical procedures for scale development (factor analysis, Cronbach alpha).
- Specify inferential statistical tests (t-tests, ANOVA, regression) based on research questions, number/type of variables, and data distribution.
- Interpret results by addressing hypotheses, reporting statistical significance, confidence intervals, and effect sizes, and discussing implications for practice or future research.
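To make the sample size step concrete, here is a minimal sketch of the familiar margin-of-error calculation for estimating a proportion under simple random sampling. The function name, default values, the finite-population correction, and the use of scipy are illustrative assumptions, not procedures prescribed by the summary.

```python
# Hypothetical sketch: required survey sample size for a proportion at a
# chosen margin of error and confidence level (simple random sampling).
import math
from scipy import stats

def sample_size_for_proportion(margin_of_error=0.05, confidence=0.95,
                               p=0.5, population=None):
    """Return the minimum sample size for the given precision."""
    z = stats.norm.ppf(1 - (1 - confidence) / 2)        # e.g., 1.96 for 95% confidence
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population estimate
    if population is not None:                          # finite-population correction
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(sample_size_for_proportion())                     # roughly 385 respondents
print(sample_size_for_proportion(population=2000))      # fewer once the frame is finite
```

A formal power analysis follows the same spirit: precision or power is fixed in advance rather than letting convenience dictate the sample size.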
Experimental method plan:
- Participants: Describe selection (random or nonrandom), random assignment procedures, matching techniques, and sample size determination (power analysis).
- Variables: Clearly identify independent variables (treatment, measured, control) and dependent variables (outcomes).
- Instrumentation & Materials: Detail instruments for pre/posttests (validity, reliability) and materials for experimental treatment (e.g., programs, handouts), including pilot testing.
- Experimental Procedures: Identify the design type (pre-experimental, quasi-experimental, true experiment, single-subject), what is being compared (between-subject, within-group), and provide a visual diagram using standard notation (X for treatment, O for observation, R for random assignment).
- Threats to Validity: Identify and address potential internal (history, maturation, regression, selection, mortality, diffusion, compensatory effects, testing, instrumentation) and external (interaction of selection/setting/history with treatment) threats, citing relevant literature.
- Data Analysis & Interpretation:
- Report descriptive statistics.
- Specify inferential tests (t-tests, ANOVA, ANCOVA, MANOVA) and nonparametric tests if assumptions are violated.
- For single-subject designs, use line graphs.
- Interpret findings by addressing hypotheses, explaining significance, and discussing implications.
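As a concrete illustration of the analysis and interpretation step, the sketch below runs an independent-samples t-test and computes Cohen's d for two groups. The scores, group labels, and reliance on numpy/scipy are invented for illustration only.

```python
# Hypothetical sketch: comparing a treatment and a control group with a
# t-test and reporting an effect size (Cohen's d) alongside the p-value.
import numpy as np
from scipy import stats

treatment = np.array([78, 85, 90, 72, 88, 95, 81, 79])  # invented posttest scores
control   = np.array([70, 75, 80, 68, 74, 77, 73, 71])

t_stat, p_value = stats.ttest_ind(treatment, control)

# Cohen's d from the pooled standard deviation
pooled_sd = np.sqrt((treatment.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, d = {cohens_d:.2f}")
```

Reporting an effect size next to the p-value keeps the interpretation focused on practical importance rather than statistical significance alone.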
8. Employ Systematic Qualitative Methods for In-depth Exploration
Qualitative validity means that the researcher checks for the accuracy of the findings by employing certain procedures, while qualitative reliability indicates that the researcher’s approach is consistent across different researchers and different projects.
Understanding meaning. Qualitative methods offer a distinct approach to inquiry, focusing on understanding the meaning individuals or groups ascribe to social or human problems through text and image data. A robust qualitative methods section educates readers on this approach, details specific designs, and outlines rigorous data collection and analysis procedures.
Core characteristics:
- Natural setting: Data collected in participants' real-world environments.
- Researcher as key instrument: The researcher personally gathers data (interviews, observations).
- Multiple data sources: Triangulation of interviews, observations, documents, and audiovisual materials.
- Inductive/deductive analysis: Building themes from data, then checking against data.
- Participants' meanings: Central focus on their perspectives.
- Emergent design: Flexible, evolving research plan.
- Reflexivity: Researcher acknowledges biases and their influence.
- Holistic account: Developing a complex, comprehensive picture.
Qualitative designs: Choose a specific design and provide its background, definition, and rationale:
- Narrative: Stories of individuals' lives.
- Phenomenology: Essence of lived experiences.
- Grounded Theory: Developing a theory from data.
- Ethnography: Culture-sharing group in natural setting.
- Case Study: In-depth analysis of a bounded case.
Researcher's role: Explicitly state your background, biases, and how they might shape interpretations. Discuss gaining entry to the site, securing permissions (IRB, gatekeepers), and anticipating ethical issues (e.g., confidentiality, power imbalances).
Data collection procedures:
- Purposeful sampling: Select participants/sites that best inform the research problem.
- Sample size: Varies by design (e.g., 1-2 for narrative, 20-30 for grounded theory), aiming for saturation.
- Data types: Observations (field notes), interviews (unstructured, focus groups), documents (public/private), audiovisual materials (photos, videos, social media).
- Recording: Use observational and interview protocols, logs for documents, and systems for visual materials.
Data analysis and interpretation:
- General process: Organize the data, read through it, code segments (noting expected, surprising, and unusual codes), develop descriptions and themes (typically 5-7 themes), represent findings (narrative, visuals), and interpret them (lessons learned, comparison to the literature, action agenda); a brief coding sketch follows this list.
- Software: Consider qualitative data analysis programs (MAXQDA, ATLAS.ti, NVivo) for efficiency.
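To show what "code, then build themes" can look like once segments have been hand-coded, here is a minimal sketch. The participants, codes, and themes are hypothetical, and real projects would normally use dedicated software such as the programs named above.

```python
# Hypothetical sketch: tallying hand-assigned codes and rolling them up
# into broader themes during qualitative analysis.
from collections import Counter

# Each transcript segment has already been assigned codes by the researcher.
coded_segments = [
    {"participant": "P01", "codes": ["isolation", "peer support"]},
    {"participant": "P02", "codes": ["isolation", "coping strategies"]},
    {"participant": "P03", "codes": ["peer support", "coping strategies"]},
]

# Broader themes group related codes (a researcher's judgment call, not an algorithm).
themes = {
    "Social connection": {"isolation", "peer support"},
    "Resilience": {"coping strategies"},
}

code_counts = Counter(code for seg in coded_segments for code in seg["codes"])
for theme, codes in themes.items():
    total = sum(code_counts[c] for c in codes)
    print(f"{theme}: {total} coded segments ({', '.join(sorted(codes))})")
```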
Validity and reliability:
- Validity strategies: Triangulation, member checking, rich/thick description, clarifying researcher bias, presenting discrepant information, prolonged time in field, peer debriefing, external auditor.
- Reliability procedures: Check transcripts, ensure consistent code definitions (codebook), coordinate team coding, cross-check codes (intercoder agreement; see the sketch after this list).
- Generalization: Limited, focusing on particularity rather than broad applicability, though multiple case studies can generalize to theory.
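One common way to cross-check codes is to have two researchers code the same segments and compute Cohen's kappa, which measures agreement beyond chance. The sketch below assumes scikit-learn is available and uses invented codes.

```python
# Hypothetical sketch: intercoder agreement on a shared set of transcript
# segments, quantified with Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

coder_a = ["support", "isolation", "support", "coping", "isolation", "support"]
coder_b = ["support", "isolation", "coping",  "coping", "isolation", "support"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values around 0.80 or higher are often read as strong agreement
```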
Writing the report: Discuss how findings will be presented (e.g., chronological narrative, process model, detailed portrait), using quotes, varied narrative forms, and researcher voice.
9. Strategically Combine Methods for Comprehensive Understanding
The core assumption of this form of inquiry is that the combination of qualitative and quantitative approaches provides a more complete understanding of a research problem than either approach alone.
Beyond single methods. Mixed methods research is a distinct methodology that intentionally integrates both qualitative and quantitative data to achieve a more profound and comprehensive understanding of a research problem than either approach could offer in isolation. It leverages the strengths of both while mitigating their individual limitations.
Defining mixed methods. Key characteristics include:
- Collection and analysis of both qualitative (open-ended) and quantitative (closed-ended) data.
- Rigorous conduct of both qualitative and quantitative procedures.
- Integration of the two data forms through merging, connecting, or embedding.
- Incorporation into a distinct mixed methods design, considering timing (concurrent/sequential) and emphasis (equal/unequal).
- Often informed by a philosophical worldview or theoretical framework.
Rationale for mixing. Researchers choose mixed methods for various reasons, including:
- Comparing different perspectives from both data types.
- Explaining quantitative results with qualitative follow-up.
- Developing better measurement instruments through initial qualitative exploration.
- Understanding experimental outcomes by incorporating participant perspectives.
- Developing a more complete understanding of changes needed for marginalized groups.
- Evaluating intervention programs comprehensively over time.
Challenges and benefits. While offering a sophisticated approach, mixed methods research demands extensive data collection, time-intensive analysis, and proficiency in both qualitative and quantitative methods. Clear visual models are essential to navigate its complexity. However, the payoff is a richer, more nuanced understanding of complex phenomena, appealing to those at the forefront of research innovation.
10. Choose a Mixed Methods Design Based on Intent and Practicality
The choice of a particular mixed methods design is based on several factors that relate to the intent of the procedures as well as practical considerations.
Design selection. Choosing the right mixed methods design is critical and depends on the study's intended outcomes, how data will be integrated, the timing of data collection, the emphasis placed on each method, the field's inclinations, and the researcher's resources.
Basic mixed methods designs:
- Convergent Parallel (QUAN + QUAL): Collects quantitative and qualitative data concurrently, analyzes them separately, and then compares results to confirm or disconfirm findings.
  - Integration: Side-by-side comparison, data transformation, or a joint display (see the sketch after this list).
  - Validity challenges: Unequal sample sizes, incomparable concepts, lack of follow-up on divergent findings.
- Explanatory Sequential (QUAN → QUAL): Collects quantitative data first, analyzes the results, then uses the findings to inform a second, qualitative phase that explains the initial quantitative results in more detail.
  - Integration: Quantitative results inform qualitative sampling and question development.
  - Validity challenges: Not considering all follow-up options, different samples for each phase, inadequate sample sizes.
- Exploratory Sequential (QUAL → QUAN): Begins with qualitative data collection and analysis, then uses the findings to build a second, quantitative phase (e.g., developing an instrument or new variables).
  - Integration: Qualitative findings are used for instrument development, variable identification, or category formation.
  - Validity challenges: Inadequate instrument development, lack of rigor in the qualitative phase, using the same sample for both phases.
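As an illustration of integration in a convergent design, the sketch below builds a simple side-by-side joint display. The constructs, statistics, quotations, and use of pandas are entirely hypothetical.

```python
# Hypothetical sketch: a joint display placing quantitative results and
# qualitative themes for the same constructs side by side.
import pandas as pd

joint_display = pd.DataFrame({
    "Construct": ["Peer support", "Academic stress"],
    "Quantitative result": ["M = 4.1 on a 5-point scale", "r = .42 with burnout"],
    "Qualitative theme": ["'Friends keep me going' (12 of 15 interviews)",
                          "Deadlines described as 'relentless'"],
    "Assessment": ["Convergent", "Convergent"],
})

print(joint_display.to_string(index=False))
```

Reading across each row makes it easy to see where the two strands converge and where divergent findings would need follow-up.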
Advanced designs: These build upon the basic forms:
- Embedded (QUAN(qual) or QUAL(quan)): Nests one data form within a larger design (e.g., qualitative data within an experiment) to support or augment the primary method.
- Transformative (QUAL + QUAN or QUAN → QUAL, etc., within a transformative framework): Uses a social justice theory as an overarching framework to address issues of marginalized groups, framing problem, questions, data, and action.
- Multiphase (QUAN → QUAL → QUAN, etc.): Conducts several mixed methods projects over time, often in longitudinal evaluations, building on each other toward a common objective.
Factors influencing choice:
- Expected outcomes: What do you hope to achieve (e.g., convergence, explanation, instrument development, action)?
- Data integration: Will data be merged, connected, or embedded?
- Timing: Concurrent or sequential data collection?
- Emphasis: Equal or unequal priority for quantitative vs. qualitative data?
- Field inclination: Which designs are favored in your discipline?
- Researcher resources: Single researcher (sequential, embedded) vs. team (concurrent, multiphase).