J Mix Methods Res. Author manuscript; available in PMC Oct 10, 2012.
Published in final edited form as:
J Mix Methods Res. Oct 2012; 6(4): 317–331.
Published online Dec 28, 2011. doi:  10.1177/1558689811427913
PMCID: PMC3467952
NIHMSID: NIHMS346743

Mapping the Mixed Methods–Mixed Research Synthesis Terrain

Abstract

Mixed methods–mixed research synthesis is a form of systematic review in which the findings of qualitative and quantitative studies are integrated via qualitative and/or quantitative methods. Although methodological advances have been made, efforts to differentiate research synthesis methods have been too focused on methods and not focused enough on the defining logics of research synthesis—each of which may be operationalized in different ways—or on the research findings themselves that are targeted for synthesis. The conduct of mixed methods–mixed research synthesis studies may more usefully be understood in terms of the logics of aggregation and configuration. Neither logic is preferable to the other nor tied exclusively to any one method or to any one side of the qualitative/quantitative binary.

Keywords: mixed research synthesis, systematic review, aggregation, configuration

Among the newest arrivals on the mixed methods research scene is a form of systematic literature review in which the findings of completed empirical qualitative and quantitative observational and experimental studies are integrated via qualitative and/or quantitative methods (Sandelowski, Voils, & Barroso, 2006). Such studies may be seen to constitute a form of mixed methods research in that they entail the use of methodological approaches associated with both qualitative and quantitative research. In contrast to primary mixed methods studies in which the data subject to analysis and synthesis are derived from human subjects (via interviews, questionnaires, observations, laboratory assays, etc.), the data in mixed research synthesis studies are the results or findings researchers generate from these data and then present in the written reports of their qualitative, quantitative, or mixed methods studies. These studies are thus more precisely labeled as mixed methods–mixed research synthesis studies in that what is “mixed” are both the object of synthesis (i.e., the findings appearing in written reports of primary qualitative, quantitative, and mixed methods studies) and the mode of synthesis (i.e., the qualitative and quantitative approaches to the integration of those findings). The purposes of mixed methods–mixed research synthesis studies (hereafter simply referred to as mixed research synthesis studies) are as varied as those of primary mixed methods studies. Among these purposes are the development of evidence summaries, development of theory and adjudication of rival theories, and determination of the active ingredients, effectiveness, and weak links in the implementation chain of interventions, programs, and policies (Pawson, 2006).

Informed by the evidence-based practice movement and in response to the urgency felt, especially in the practice disciplines (e.g., education, medicine, nursing), to increase the utilization value of empirical research findings (Sandelowski, 2004), a spate of articles and books has appeared on the subject of integrating the findings of methodologically diverse studies (e.g., Boaz et al., 2006; Dixon-Woods, Agarwal, Jones, Young, & Sutton, 2005; Harden & Thomas, 2005; Hawker, Payne, Kerr, Hardey, & Powell, 2002; Lemmer, Grellier, & Steven, 1999; Pope, Mays, & Popay, 2007). In this literature, an array of methodological approaches is described, categorized, and compared. Although contributing much to the field of systematic review, these seemingly different approaches serve to draw hard lines between methods that are at best highly permeable and thereby to distract from the common defining logics that these approaches share. The “methodological plenitude” (A. C. Love, 2006, p. 455) complicating the mixed research synthesis field is comparable to that in the primary mixed methods research field, which is now characterized by an increasing number of terms for typologies and frameworks for conceiving, designing, conducting, disseminating the results of, and evaluating such studies (e.g., Creswell & Plano Clark, 2007; Tashakkori & Teddlie, 2010). The current impetus in the literatures of mixed methods research and mixed research synthesis is toward multiplicity rather than parsimony.

Accordingly, our purpose in this article is to advance a parsimonious conceptualization of the mixed research synthesis enterprise as fundamentally encompassing the synthesis of research findings by aggregation and the synthesis of research findings by configuration. Neither logic is superior or preferable to the other, nor even a matter of researchers’ choice per se; rather, the nature of the research findings to be synthesized plays a large role in whether synthesis can proceed by aggregation or configuration. Moreover, although presented separately here, aggregation and configuration are not wholly separable, as they may entail each other. Although we emphasize here the use of these logics in mixed research synthesis studies, they are not distinctive to such studies: aggregation and configuration are integral to the integration of qualitative and quantitative data in primary mixed methods research and to the integration of data in mono method studies, as the very production of empirical research findings depends on synthesis, that is, on researchers putting data together. To set the scene for our presentation of these defining logics, we begin with an overview of the problems we see in the way methods are presented for conducting mixed research synthesis studies.

An Excess of Method

An array of qualitative and/or quantitative methods has been advanced for synthesizing qualitative research findings alone, quantitative research findings alone, and qualitative and quantitative research findings together. These methodological approaches to research synthesis have themselves been variously conceived as philosophically and/or operationally qualitative or quantitative. For example, Pope et al. (2007) described (a) content analysis, quantitative case survey, the Bayesian approach, and qualitative comparative analysis as quantitative approaches to evidence synthesis; (b) comparative approaches (grounded theory and comparative case study) and translation-based approaches (meta-ethnography) as interpretive approaches to evidence synthesis; and (c) thematic analysis, realist synthesis, and narrative synthesis as mixed approaches to evidence synthesis. Dixon-Woods et al. (2005) described narrative summary, grounded theory, meta-ethnography, meta-synthesis, meta-study, realist synthesis, and the data analysis techniques in Miles and Huberman (1994) as “clustering towards the interpretive end of the spectrum” (pp. 46–47) of methods, and content analysis, case survey, qualitative comparative analysis, and Bayesian meta-analysis as “lying at the more integrative end of the spectrum” (Dixon-Woods et al., 2005, p. 47). Emphasizing philosophy as opposed to method, Suri and Clarke (2009) described positivist, interpretive, participative, and critically oriented research syntheses as distinctive approaches. Although these conceptions of methods represent laudable efforts to organize the mixed research synthesis field, they also serve to confuse, even with the caveats offered about the overlap of items among them.

The Binary That Continues to Bind

First, current classifications of research synthesis approaches (like classifications of methods for primary mono method and mixed methods research) tend to default to the qualitative/quantitative binary. Yet all these methodological approaches entail practices associated with both qualitative and quantitative research. For example, although theme identification tends to be seen as qualitative and enumeration as quantitative, those conducting any kind of research synthesis study would be hard-pressed to avoid either of these practices regardless of method (Sandelowski, Voils, & Knafl, 2009).

Although integral to the “iconography” (Law, 2004, p. 4) of research and particularly mixed methods research in the behavioral, social, and health sciences, “rhetorically unavoidable” (Becker, 1996, p. 53), and useful as a shorthand communication, the qualitative/quantitative binary is arguably also a major impediment to methodological advancement (Allwood, 2011). Vogt (2008) described the qualitative–quantitative distinction as a “category mistake” that reinforces false stereotypes, diverts attention away from the target phenomena under study, and emphasizes distinctions better deemphasized or even altogether abandoned. This binary diverts research synthesis efforts away from the application of solutions stereotypically viewed as qualitative to findings produced in quantitative studies, and of solutions stereotypically viewed as quantitative to findings produced in qualitative studies. The default to the qualitative/quantitative binary is evident in the persistent alignment of aggregation of findings with positivism and the quantitative reduction of data, and of interpretation of findings with constructivism and the qualitative amplification of data. Yet as we propose in more detail later in this article, aggregation and interpretation entail each other; no one-to-one correspondence exists between philosophical positions and methods, between aggregation and quantitative methods, or between interpretation and qualitative methods; and all research syntheses are by definition reductions of masses of data that were amplified, that is, subjected to further exploration and manipulation.

The Conflation of Analysis With Synthesis and Method With Technique

Second, the placement of such entities as thematic analysis and realist synthesis on the same plane conflates analysis, or those practices directed toward disassembling bits of information extracted from research reports (e.g., thematic analysis, calculation of effect sizes, creation of visual data displays), with synthesis, or those practices directed toward reassembling them (e.g., grounded theory, meta-analysis). Such placements also conflate overall methodological approach with technique as any one technique (e.g., thematic analysis, creation of data matrices) might be a component of more than one methodological approach (Grant & Booth, 2009). Realist synthesis overlaps with elements of grounded theorizing by virtue of its iterative approach to data collection and theory development. Both realist synthesis and grounded theory entail thematic analysis, which is in turn typically accomplished by enumeration and tabulation or other visual displays of information.

All research synthesis studies require defining research problems, purposes, and questions; setting parameters for the searching, retrieval, inclusion, and extraction of information from the literature; comparing and translating findings (e.g., concepts into each other, different statistical expressions of data into various effect size indexes); and performing some form of content or thematic analysis, counting, tabulating, plotting, diagramming, and narrating. Methods of research synthesis are not differentiated by these practices per se, but rather by how, why, and when in the course of the synthesis study these practices are executed.

The conflation of method with technique troubles the idea that different methods can be tested on the same set of studies to compare their relative merits and research synthesis results (Dixon-Woods et al., 2005; Lucas, Baird, Arai, Law, & Roberts, 2007), as it is difficult to imagine how such a test would be conducted of methods that have many techniques in common and as the research findings in the body of literature targeted for synthesis may not allow certain methods to be used (Voils, Barroso, Hasselblad, & Sandelowski, 2007). As evident in published reports of research synthesis studies, most approaches to conducting them are arguably better conceived in relation to each other as packages of overlapping techniques adapted to research purposes and the nature of the research findings to be synthesized.

The Conflation of Method Claims With Methods-in-Use

Third, current presentations of research synthesis methods gloss the difference between “method talk” (Gubrium & Holstein, 1997, p. 3)—the after-the-fact reconstruction of method that constitutes the “literary technology” (Shapin, 1984) known as the research report—and method practice, or what researchers actually do to conduct their studies. Although arguments about the right and wrong ways to use particular methods are commonplace, in the end, there is no one or perfect execution of any method. Within acceptable limits, methods are reinvented every time they are used to accommodate the real world of research practice. Methodological innovation is arguably the norm in conducting research, calling into question the lines often drawn between reinvention and innovation (Taylor & Coffey, 2009; Travers, 2009) and between the reinvention and “erosion” of method (Greckhamer & Koro-Ljungberg, 2005). In research synthesis studies, methods must always be accommodated to the actual reports of research under review and to the nature of the findings presented in them. Research findings cannot simply be read off the pages of these reports, extracted as given, and uniformly subjected to some method. Rather, these findings must be made pliable for synthesis: transformed to enable comparison and combination (Moreira, 2007; Sandelowski, 2008). Even so, there will be limits on what methods a body of research will allow. Just as all quantitative research findings do not lend themselves to meta-analysis, neither do all qualitative research findings lend themselves to meta-ethnography.

Because methods do not exist apart from specific user contexts, methods have no inherent strengths or weaknesses that can be compared as such a comparison always implies some a priori standard of strong and weak. If researchers select methods not designed to accomplish their stated purposes, it is their decision that is weak, not the method. A method cannot be designated as, for example, more or less transparent than another method (cf., Dixon-Woods et al., 2005) as transparency is not a characteristic of method; rather, it is a judgment rendered by the reader of a research report as to how a study was presented. Indeed, routinely touted as a quality criterion and defining attribute of systematic reviews, transparency has come under deserved criticism as itself constituting a discourse masking moves toward conformity and social control (e.g., MacLure, 2005; see also Neyland, 2007; Strathern, 2000).

In short, methods become what they are in the hands of users, a generic “practical epistemology” (Becker, 1996, p. 57) that is evident in the appropriately eclectic, mix-and-match and therefore novel combinations of approaches described in reports of research synthesis studies regardless of claims to having used or invented one particular method (Dixon-Woods et al., 2006). Meta-ethnography, for example, has been variously implemented in health-related research syntheses (e.g., Malpass et al., 2009; Pound et al., 2005), and these variations have at best only a passing resemblance to the method Noblit and Hare (1988) first described for comparing a small number of school ethnographies. Meta-ethnography has become a methodological approach for synthesizing volumes of health-related research reports of studies conducted with different qualitative methods.

Writing about ethnography, Agar (2004) observed that “a study always develops in ways unforeseen at the beginning … the story of a methodology is the story of the study” (p. 19). The same can be said of research synthesis studies. The lines drawn between methods-in-use will always be more indistinct than those drawn in textbook depictions of those methods owing to the need to draw distinctions when teaching methods, discipline-specific variations in the use of ostensibly the same methods (particularly qualitative research methods), and the constraints of research practice that make methodological rules honored more often in the breach than in the observance.

Diverse Orientations to Research Synthesis

A fourth source of confusion pertains to the philosophical orientation to research synthesis studies. Research synthesis studies are variously viewed as philosophically realist/positivist as opposed to idealist/constructivist with these positions themselves variously aligned and conceived (Harden & Thomas, 2010; Suri & Clarke, 2009). Harden and Thomas, for example, viewed realist synthesis—the methodological approach to research synthesis advanced by Pawson (2006)—as located on the idealist end of the continuum.1

The problem here is confusing the orientation to research synthesis with the orientation of the individual primary studies that produced the research findings to be synthesized. A realist/positivist orientation toward research synthesis connotes a view of the research report as an index of the study conducted and of the findings in that report as indexes of the phenomena under investigation. Researchers with this orientation see research findings as separable and therefore as extractable from other components of the research report regardless of the philosophical orientation in which those findings were located by the authors of those reports (Sandelowski & Barroso, 2007). That is, a realist/positivist research synthesis study is one that takes research reports as reasonably faithful proxies for the studies conducted and the findings in those reports as reasonably faithful proxies for the phenomena under study regardless of how those primary studies were themselves philosophically located. Indeed, research synthesis projects in the health sciences are typically realist in orientation in that they take reports of research to represent something about a “real world that exists independently of” researchers’ conceptions of it (Maxwell & Mittapalli, 2010, p. 146). Although this world may be populated with entities such as beliefs, social constructions, and discourses, these entities are treated as if they had real effects.

Researchers assuming a realist/positivist stance operate “as if truth holds still” (Kearney in Thorne, Jensen, Kearney, Noblit, & Sandelowski, 2004, p. 1354), even as they recognize that all knowledge is inescapably partial and socially constructed and that all knowledge claims are merely pauses (Glaser & Strauss, 1967) in the never-ending quest for knowledge. Notwithstanding the valid critical, postmodern, and other arguments advanced concerning the dangers of settling on any one knowledge claim or of subscribing to the notion of single truths, researchers assuming a realist/positivist position see research synthesis studies as a way to determine provisionally the best evidence available for action to improve the public health and welfare.

In contrast are positions on the idealist/constructivist end of the continuum. Promoted as alternatives and even as antidotes to realist/positivist reviews of literature, idealist/constructivist studies of bodies of empirical literature are typically directed toward revealing something heretofore hidden concerning a field of study and toward unsettling prevailing knowledge claims and the very notion that any claims can be settled (Eisenhart, 1998). As Sandelowski and Barroso (2007) described them, such studies may not necessarily be directed at all toward synthesizing findings per se in the evidence-based practice sense; indeed, the researchers conducting them may not subscribe to the notion that such an entity as “findings” exists at all and may even show an antipathy to the evidence-based practice movement.

In contrast to the realist/positivist orientation to research reports as indexes of the studies conducted are idealist treatments of research reports as constituting discourses concerning patients, diseases, therapeutics, and even research methodologies themselves. Such studies show a dominant interest in objectives such as tracking and interpreting over time the theoretical and methodological characteristics, dominant meta-narratives, and privileged discourses and silences in a body of literature (e.g., Greenhalgh et al., 2005; Suri & Clarke, 2009; Thorne, Joachim, Paterson, & Canam, 2002; Thorne, Paterson, et al., 2002). Such studies of bodies of research serve to place the findings of research synthesis studies in historical and cultural context and thereby offer alternatives to realist/positivist explanations of these findings (Paterson, Thorne, Canam, & Jillings, 2001).

In short, what needs better differentiation in debates about philosophical stances toward research synthesis is the stance of research synthesis studies as opposed to the stance of the primary studies to be reviewed, and the evidence-base agenda as opposed to the critique agenda of studies referred to as research syntheses. Moreover, the view that research synthesis studies ought to lead to some conclusions that can serve as the evidence base for practice is neither a naive capitulation to positivism nor at odds with studies directed toward contextualizing and even troubling those conclusions. Although both settling and unsettling objectives may be pursued in a single program of research, they constitute different agendas and takes on the research literature. The judgmental line that seems to be drawn between the realist/conclusion-drawing and idealist/critique stances is a troubling one. To arrive at a provisional conclusion about the experience or management of an illness or about the mediators that make a cognitive–behavioral intervention more effective than a psychoeducational one in improving medication adherence does not preclude critique. Indeed, it is critique that is precluded when the position is assumed that no conclusions can or even should be drawn from reviews of research. The difference between realist and idealist postures toward research synthesis is not that between “achiev(ing) closure” and “stimulat(ing) healthy debate” (Suri & Clarke, 2009, p. 410); indeed, although they are different postures, the relationship between them is complementary, as research conclusions themselves may stimulate debate and calls for further targeted research.

The Logics of Research Synthesis

To summarize our argument thus far, current efforts to differentiate one research synthesis method from another are ironically too focused on methods and not focused enough on the defining logics of research synthesis, each of which may be operationalized in a variety of ways. A preoccupation with method diverts attention away from understanding the nature and content of research findings and from the fact that methodologically diverse primary studies may yield thematically similar findings.

As shown in Table 1, we advance the view that the defining logics of mixed research synthesis (whether qualitative and quantitative findings are initially synthesized separately with these respectively synthesized findings subsequently themselves synthesized, or whether these findings are integrated earlier in the course of the project) are aggregation and configuration. Neither logic is tied exclusively to any one method, to any one particular design for mixed research synthesis (Sandelowski et al., 2006), or to any one side of the qualitative/quantitative binary. Neither logic is better or stronger than or preferable to the other but rather more or less allowable by the nature of the findings in the body of literature under review. These logics are, however, tied to a realist perspective on research syntheses, defined as those research synthesis projects that have as their primary focus the findings of studies and as their primary objective the synthesis of those findings. Although we analytically separate them here to feature their differences, these logics are not experientially distinct as synthesis by configuration may entail sets of aggregated findings and aggregation can be seen as a type of configuration.

Table 1
Comparison of Defining Attributes of Fundamental Synthesis Logics

Research Synthesis by Aggregation

Research synthesis by aggregation entails the assimilation of findings considered to address the same relationship or connection between two or more aspects of a target phenomenon, as, for example, when all findings about whether lack of trust in provider is related to medication nonadherence are pooled. Pooling the findings gives an indication of where the preponderance of evidence lies as to whether trust is associated with medication nonadherence, allowing for the possibility of insufficient or conflicting evidence, the limitations of all research, and the provisional nature of all knowledge claims. Aggregation is the primary logic of meta-analysis (Bayesian or frequentist) and vote counting, used to synthesize quantitative findings; of meta-summary, used to synthesize qualitative findings; and of various adaptations of Bayesian meta-analysis, used to synthesize qualitative and quantitative findings. These approaches can be extended further (e.g., meta-regression, moderation analysis in meta-analysis) to identify the study designs or populations under which the relationships are larger or smaller.
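
To make the aggregative logic concrete, the following minimal sketch in Python pools a set of entirely hypothetical effect estimates for the trust–nonadherence relationship using a fixed-effect, inverse-variance weighted average; the numbers and the fixed-effect choice are illustrative assumptions only, not a prescription for how a meta-analysis should be conducted.

import math

# Hypothetical (log odds ratio, standard error) pairs for the finding
# "lack of trust in provider is related to medication nonadherence."
findings = [(0.42, 0.18), (0.10, 0.25), (0.55, 0.30), (-0.05, 0.22)]

weights = [1 / se ** 2 for _, se in findings]  # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(findings, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled log OR = {pooled:.2f}, "
      f"95% CI [{pooled - 1.96 * pooled_se:.2f}, {pooled + 1.96 * pooled_se:.2f}]")

The preponderance of evidence is then read off the pooled estimate and its interval rather than off any single study.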

Research synthesis by aggregation depends on both qualitative and quantitative findings being conceived as potentially addressing the same factors or aspects of a target phenomenon regardless of the aims of the studies in which they were produced or differences in the way those findings were produced. This conception is in contrast to the view that qualitative and quantitative studies are distinguished by their addressing different aims in different ways, albeit in a shared domain of research (Barbour & Barbour, 2003), and that findings from studies with different aims and different approaches therefore cannot be pooled. Reviewers must, for example, be willing to see the reasons a group of people gave for missing medication doses produced from a thematic analysis of data generated from open-ended and minimally structured interviews with a purposefully selected sample of participants as potentially comparable to the predictors of missed doses produced from a regression analysis of data generated from closed-ended and highly structured questionnaires completed by a probability sample of participants. That is, they must be willing to focus on the content of findings apart from the motivations, methodological claims, or metrics of the individual studies in which they were produced. Reviewers will attend to these differences in post hoc sensitivity analyses by determining whether associations exist between findings and study characteristics, but none of these variations necessarily precludes aggregating findings on the basis of a perceived thematic similarity among them.

This very act of seeing two or more entities as “the same” is an interpretive move—an intervention on the part of reviewers—that allows findings to be combined. This move is a sine qua non of all research synthesis studies and the reason all research synthesis projects, and aggregation in particular, are interpretive. The “interpretive gesture” (K. Love, Pritchard, Maguire, McCarthy, & Paddock, 2005) is present in even the most “quantitative” of research synthesis efforts. Researchers interpret which factors are repeatedly identified as influential, which are repeatedly identified as having no or minimal influence, and the conditions under which each has a larger or smaller influence. Synthesis by aggregation is thus essentially an exercise in convergent validation as it rests on what researchers perceive as the repetition of the same relationship among findings (i.e., the confirmation of them) across primary studies regardless of their methodological pedigree. Judgments of similarity are “basic to thought and language” (Quine, 1969, p. 116) and to counting and theme or pattern identification (Sandelowski et al., 2009). The key mandate in research synthesis studies is that these judgments themselves be judged as appropriate and meaningful by the various consumers of research synthesis studies (Cooper, 2010).

Aggregation may be accomplished at the subject and/or study levels. Reviewers decide on the option likely to yield the most meaningful results (e.g., for practice, for policy, for future research) with the least amount of information loss. All research synthesis projects entail such a trade-off.

Aggregation at the subject level

Synthesis by aggregation at the subject level is an option if individual findings are linked to individual participants in research reports or this information can be obtained from the authors of those reports. Aggregation of quantitative findings at the subject level is the familiar logic of research synthesis by meta-analysis as the pooling of findings is dependent on sample size. Aggregation of qualitative findings alone or with quantitative findings at the subject level is an option only if the numbers of subjects linked to findings in qualitative research reports are available from the reports themselves or from the authors or if a reasonable range of values can be plausibly inferred from reports (Chang, Voils, Sandelowski, Hasselblad, & Crandell, 2009). A Bayesian approach to subject-level aggregation of qualitative and quantitative findings is described in Voils et al. (2009) and in Crandell, Voils, and Sandelowski (in press).
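
As a simplified illustration of subject-level aggregation only (and not of the Bayesian method described by Voils et al., 2009, or Crandell et al., in press), the sketch below pools hypothetical counts of participants linked to a finding across qualitative and quantitative reports with a conjugate Beta-Binomial update; the counts, the uniform prior, and the pooling of all reports into a single proportion are illustrative assumptions.

# Hypothetical (participants linked to the finding, total participants) per
# report; for qualitative reports such counts would have to be available in
# the report, obtained from the authors, or plausibly inferred.
reports = [(9, 15), (4, 20), (31, 120), (18, 60)]

alpha, beta = 1.0, 1.0  # uniform Beta(1, 1) prior on the proportion
for k, n in reports:
    alpha += k          # participants linked to the finding
    beta += n - k       # participants not linked to the finding

posterior_mean = alpha / (alpha + beta)
print(f"posterior mean proportion = {posterior_mean:.2f} "
      f"(Beta({alpha:.0f}, {beta:.0f}))")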

Aggregation at the study level

Because subject-level information is often not available or plausibly inferable from qualitative and sometimes quantitative reports, aggregation of qualitative and quantitative findings is typically possible only at the study level. Qualitative findings are usually presented “at the study level, with thematic and interpretive lines typically prevailing over frequency counts, and within-participant or between-thematic-lines comparisons prevailing over between- and cross-participant comparisons. In contrast, quantitative findings are typically presented as group-level statistics (e.g., odds ratio)” (Voils et al., 2009, p. 228) based on subject-level information available in or calculable from reports.

In study-level aggregation, each study provides support for or against a hypothesis; for example, each study may be assigned a value indicating whether support was present (1) or absent (0). The criteria for presence of support for a finding must be carefully determined. For example, quantitative studies might need to meet a certain significance level or effect size to be considered supportive of the finding, whereas qualitative studies might need a direct statement supporting the finding. To capture degree of presence and thereby to minimize the loss of analytic power resulting from the present/absent dichotomization, support for a hypothesis may be assessed ordinally according to a meaningful and consistently applied standard (e.g., assigning values according to the size of effect in quantitative studies or to the amount of space or time given over to a thematic line in qualitative studies; Onwuegbuzie & Teddlie, 2003).
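
A minimal sketch of this study-level logic, with hypothetical studies and scoring criteria, follows; the dichotomous votes, the 0–2 ordinal scale, and the sample-size weighting (which anticipates the point made in the next paragraph) are illustrative assumptions rather than recommended standards.

# Hypothetical study-level scores: "support" is 1/0 for presence/absence of
# support for the hypothesis, "degree" is an ordinal 0-2 rating, and "n" is
# the study sample size.
studies = [
    {"id": "qual_A",  "n": 15,  "support": 1, "degree": 2},
    {"id": "qual_B",  "n": 20,  "support": 1, "degree": 1},
    {"id": "quant_C", "n": 120, "support": 0, "degree": 0},
    {"id": "quant_D", "n": 60,  "support": 1, "degree": 2},
]

votes = sum(s["support"] for s in studies)
mean_degree = sum(s["degree"] for s in studies) / len(studies)

# Weighting votes by sample size gives larger studies more input, moving the
# tally closer to a subject-level analysis (see the next paragraph).
weighted_support = (sum(s["n"] * s["support"] for s in studies)
                    / sum(s["n"] for s in studies))

print(f"{votes}/{len(studies)} studies support the hypothesis")
print(f"mean ordinal support = {mean_degree:.2f}")
print(f"sample-size-weighted support = {weighted_support:.2f}")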

The line between study- and subject-level aggregations is not a hard one. The line may be crossed when standard meta-analysis approaches are used that approximate subject-level analysis through the use of standard errors from the original study (which depend on the sample size), or by weighting estimates according to the sample size, effectively giving more input to studies with more subjects.

Approaches to study-level aggregation of qualitative findings include meta-summary (Sandelowski & Barroso, 2007); of quantitative findings, vote counting (Bushman, 1994); and of qualitative and quantitative findings, adaptations of Bayesian meta-analysis (Berry & Stangl, 2000). Examples of the use of meta-summary include the Draucker et al. (2009) study of healing in sexual abuse victims and Williams’s (2006) study of spirituality at the end of life. An example of vote counting of quantitative findings is the Voils, Sandelowski, Barroso, and Hasselblad (2008) synthesis of factors associated with antiretroviral medication adherence. An example of a Bayesian aggregation of qualitative and quantitative findings at the study level is the Crandell, Voils, Chang, and Sandelowski (2011) study of antiretroviral adherence. The Roberts, Dixon-Woods, Fitzpatrick, Abrams, and Jones (2002) study is an example of Bayesian aggregation at both the subject and study levels as these investigators used study-level data from the qualitative reports as prior information in the Bayesian meta-analysis of quantitative reports at the subject level.

Research Synthesis by Configuration

Research synthesis by configuration entails the arrangement of thematically diverse individual findings, or sets of aggregated findings, into a coherent theoretical rendering of them. In contrast to the judgment of thematic similarity among findings required to aggregate findings, findings in configuration syntheses are conceived as thematically diverse and therefore as not amenable to pooling. Instead of confirming each other (by virtue of repetition of what are judged to be the same aspects or associations), thematically diverse findings may contradict, extend, explain, or otherwise modify each other. Although their relationship may not necessarily be immediately evident, such findings are viewed as potentially related. In configuration synthesis, researchers link findings, even though these links may not have been addressed in any of the primary studies reviewed. Configurations are “hindsight accounts of the connectedness of things … pieced-together patternings … of assorted materials … after the fact” (Geertz, 1995, p. 2).

An example is the linking of finding A indicating that individuals in one race/ethnicity group mistrust their health care providers with finding B indicating that individuals in this same group tend to have lower adherence to the medications prescribed for them. Here, the researchers conducting the synthesis study view provider mistrust as a possible explanation for lower medication adherence in a certain race/ethnicity group. An example of the configuration of aggregated findings occurs when each of these findings (i.e., the relationship between a race/ethnicity group and provider mistrust and the relationship between a race/ethnicity group and lower medication adherence) appears recurrently and the reviewer chooses to link these pooled sets of findings.

Although the pooling of findings in aggregation syntheses is also a pieced-together, after-the-fact reconstruction, the linking of findings in configuration syntheses is farther removed from the findings as given in primary studies. That is, as indicated in the mistrust/adherence example, an even greater degree of reviewer intervention is involved in configuration syntheses as they may entail the “mesh(ing)” (Mason, 2006, p. 20) as opposed to merging of findings that were never placed together in the research reports reviewed. The logic of configuration syntheses is the logic of abductive reasoning, which is also the reasoning of grounded theorizing (Reichertz, 2007), a major methodological approach to research synthesis by configuration. Abductive reasoning begins when reviewers intuit that a set of seemingly unrelated findings is in fact related and ends with models of these relationships that can themselves be formally tested. The abductive reasoning at the heart of configuration syntheses captures the intuitive and often inexpressible leaps of imagination that characterize them.

Referring to such syntheses as configurative rather than interpretive not only avoids the problem of assuming that any research synthesis (including meta-analysis) can be noninterpretive but arguably gets closer to what proponents of “interpretive” research synthesis both mean and want when they advocate more “qualitatively driven” (Mason, 2006) research synthesis efforts to offset the hegemony of positivism and the knowledge-by-accumulation logic associated with it (Hammersley, 2001; Noblit & Hare, 1988). Configuration better captures the ideas of pattern, design, and gestalt: the mosaic, big-picture, more-than-the-sum-of-parts, and novel-whole view of the way knowledge advances. Science is composed of theories and models, that is, knowledge configurations.

Synthesis by configuration may be top-down and/or bottom-up. Top-down approaches are not simply deductive as they always entail hunches derived from the data that certain concepts or models might be useful and generative ways to configure findings. Bottom-up approaches are not simply inductive as they always draw from prior understandings, theoretical leanings, and similar commitments concerning what factors might belong together and the order in which they should be arranged.

Top-down

These configurations begin with a concept, conceptual framework, or extant theory—drawn from the primary studies reviewed or from some other literature—by which individual or pooled sets of findings can be organized or mapped. Heretofore unseen connections among findings may come into view by translating them into the language of the concept or constructs of a theory. Becker and Schram (1994) described the use of research synthesis of quantitative findings to evaluate support for different paths in a theoretical model. Malpass et al. (2009) imported the concept of the illness career to synthesize findings on patients’ experiences with antidepressant medications.

Bottom-up

In contrast is the bottom-up approach whereby configuration is data-derived or accomplished from various novel assemblies of findings. Here, a theoretical rendering is created to encompass and represent a particular assembly or configuration of findings. In the Paterson (2001) study, the concepts of “illness in foreground” and “wellness in foreground” were derived from a body of empirical research and then used to model the “shifting perspectives” of chronically ill persons.

Methods for both top-down and bottom-up configurations of qualitative and/or quantitative findings encompass methods used also in primary qualitative and quantitative research both to develop and test theory, including grounded theory (Kearney, 2001b, 2007), variants of meta-ethnography (Noblit & Hare, 1988), realist synthesis (Pawson, 2006), and structural equation modeling (Cheung & Chan, 2005). Examples of the use of grounded theory include Kearney (2001a) and Kearney and O’Sullivan (2003); of meta-ethnography, Malpass et al. (2009) and Pound et al. (2005); of realist synthesis, Greenhalgh, Kristjansson, and Robinson (2007) and O’Campo et al. (2009); and of structural equation modeling, Cheung and Chan (2005, 2009).

Conclusion

Our map of the mixed research synthesis terrain is focused on differentiating the aggregation versus configuration logics of synthesis as opposed to methods, any one of which might be used in different ways to operationalize either logic. These logics pertain not only to research synthesis studies but also to primary mono method and mixed method studies as the production of “findings” in all empirical research inherently entails the merging or meshing of data. Although this focus acknowledges the qualitative/quantitative binary, it does not default to it. Instead, the focus on the logics of research synthesis favors emphasis on the research questions addressed by the synthesis and on the nature of the findings in the body of research targeted for review to answer those questions, consideration of all the synthesis approaches these findings allow, and selection of the approaches that will likely yield credible, meaningful, and actionable results.

By mapping the mixed research synthesis enterprise by the logics of aggregation and configuration, and by our very presentation of this map in the form of a two-column table, we risk the charge of merely substituting one binary for another and thereby reproducing the very binary thinking we intended to offset. By focusing on the logics of research synthesis, however, we hope to move away from the most entrenched aspects of the qualitative/quantitative binary. By showing how these logics might entail each other, we hope to avoid reifying them. We hope also to resist the a priori view that aggregation is a violation of the imperatives of qualitative research. At the very least, we advance the logics of subject- and/or study-level aggregation and top-down and/or bottom-up configuration as potentially more useful comparative reference points for mapping the mixed research synthesis terrain.

We acknowledge too that what we describe here as the confusing state of affairs of mixed research synthesis may be precisely the state of affairs that ought to exist in that it captures not the weakness of confusion but rather the strength of diversity. That said, our map can still be read as an effort to drill down to the differences that make a difference in the conduct of mixed research synthesis studies.

Acknowledgments

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: a National Institute of Nursing Research, National Institutes of Health grant (“Integrating Qualitative & Quantitative Research Findings,” 5R01NR004907, June 3, 2005 to March 31, 2011), with resources and facilities provided by the Veterans Affairs Medical Center in Durham, North Carolina.

Footnotes

1We recognize the diverse understandings and debates about what constitutes a realist versus positivist versus idealist versus constructivist versus any other philosophical orientation to inquiry. Within the domain of realism alone are lines drawn among hard, soft, experiential, critical, scientific, and naive realisms and between one or more of these realisms and one or more positivisms (e.g., Barnett-Page & Thomas, 2009; Crotty, 1998; Maxwell & Mittapalli, 2010). These words are variously used to designate ontological and epistemological positions with the terms ontology and epistemology themselves variously applied. We understand the extent to which any effort to communicate these differences simplifies them. The use of the joined form (e.g., realist/positivist) is simply to show that these words are seen in the systematic review and research synthesis literature at least in part as referring to overlapping stances. We use these terms here with great humility: to convey the -isms that populate the methodological discourse of this literature but primarily to communicate an important difference in orientation to research synthesis studies that tends to be glossed.

Views expressed in this article are those of the authors and do not necessarily represent the Department of Veterans Affairs.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

  • Agar M. We have met the other and we’re all nonlinear: Ethnography as a nonlinear dynamic system. Complexity. 2004;10(2):16–24.
  • Allwood CM. The distinction between qualitative and quantitative research methods is problematic. Quality & Quantity. 2011 doi: 10.1007/s11135-011-9455-8. Advance online publication. [Cross Ref]
  • Barbour RS, Barbour M. Evaluating and synthesizing qualitative research: The need to develop a distinctive approach. Journal of Evaluation in Clinical Practice. 2003;9:179–186. [PubMed]
  • Barnett-Page E, Thomas J. Methods for the synthesis of qualitative research: A critical review. BMC Medical Research Methodology. 2009;9:59. Retrieved from http://www.biomedcentral.com/1471-2288/9/59. [PMC free article] [PubMed]
  • Becker BJ, Schram CM. Examining explanatory models through research synthesis. In: Cooper H, Hedges LV, editors. The handbook of research synthesis. New York, NY: Russell Sage Foundation; 1994. pp. 357–382.
  • Becker HS. The epistemology of qualitative research. In: Jessir R, Colby A, Shweder RA, editors. Ethnography and human development: Context and meaning in social inquiry. Chicago, IL: University of Chicago Press; 1996. pp. 53–71.
  • Berry DA, Stangl DK, editors. Meta-analysis in medicine and health policy. New York, NY: Dekker; 2000.
  • Boaz A, Ashby D, Denyer D, Egan M, Harden A, Jones DR, Tranfield D. A multitude of syntheses: A comparison of five approaches from diverse policy fields. Evidence & Policy. 2006;2:479–502.
  • Bushman B. Vote-counting procedures in meta-analysis. In: Cooper H, Hedges LV, editors. The handbook of research synthesis. New York, NY: Russell Sage Foundation; 1994. pp. 193–214.
  • Chang Y, Voils CI, Sandelowski M, Hasselblad V, Crandell JL. Transforming verbal counts in reports of qualitative descriptive studies into numbers. Western Journal of Nursing Research. 2009;31:837–852. [PMC free article] [PubMed]
  • Cheung MW, Chan W. Meta-analytic structural equation modeling: A two-stage approach. Psychological Methods. 2005;10:40–64. [PubMed]
  • Cheung MW, Chan W. A two-stage approach to synthesizing covariance matrices in meta-analytic structural equation modeling. Structural Equation Modeling. 2009;16:28–53.
  • Cooper H. Research synthesis and meta-analysis: A step-by-step approach. 4. Thousand Oaks, CA: SAGE; 2010.
  • Crandell JL, Voils CI, Chang Y, Sandelowski M. Bayesian data augmentation methods for the synthesis of qualitative and quantitative research findings. Quality & Quantity. 2011;45:653–669. [PMC free article] [PubMed]
  • Crandell JL, Voils CI, Sandelowski M. Bayesian approaches to the synthesis of qualitative and quantitative research findings. In: Hannes K, editor. Worked examples of qualitative evidence synthesis. New York, NY: Wiley; (in press)
  • Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. Thousand Oaks, CA: SAGE; 2007.
  • Crotty M. The foundations of social research: Meaning and perspective in the research process. London, England: SAGE; 1998.
  • Dixon-Woods M, Agarwal S, Jones D, Young B, Sutton A. Synthesizing qualitative and quantitative evidence: A review of possible methods. Journal of Health Services Research & Policy. 2005;10:45–53. [PubMed]
  • Dixon-Woods M, Cavers D, Agarwal S, Annandale E, Arthur A, Harvey J, Sutton AJ. Conducting a critical interpretive synthesis of the literature on access to health care by vulnerable groups. BMC Medical Research Methodology. 2006;6:35. Retrieved from http://www.biomedcentral.com/1471-2288/6/35. [PMC free article] [PubMed]
  • Draucker CB, Martsolf DS, Ross R, Cook CB, Stidham AW, Mweemba P. The essence of healing from sexual violence: A qualitative metasynthesis. Research in Nursing & Health. 2009;32:366–378. [PMC free article] [PubMed]
  • Eisenhart M. On the subject of interpretive reviews. Review of Educational Research. 1998;68(4):389–397.
  • Geertz C. After the fact: Two countries, four decades, one anthropologist. Cambridge, MA: Harvard University Press; 1995.
  • Glaser B, Strauss A. The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine; 1967.
  • Grant MJ, Booth A. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Information and Libraries Journal. 2009;26:91–108. [PubMed]
  • Greckhamer T, Koro-Ljungberg M. The erosion of a method: Examples from grounded theory. International Journal of Qualitative Studies in Education. 2005;18:729–750.
  • Greenhalgh T, Kristjansson E, Robinson V. Realist review to understand the efficacy of school feeding programs. British Medical Journal. 2007;335:858–861. [PMC free article] [PubMed]
  • Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O, Peacock R. Storylines of research in diffusion of innovation: A meta-narrative approach to systematic review. Social Science & Medicine. 2005;61:417–430. [PubMed]
  • Gubrium JF, Holstein JA. The new language of qualitative method. New York, NY: Oxford University Press; 1997.
  • Hammersley M. On “systematic” reviews of research literatures: A “narrative” response to Evans & Benefield. British Educational Research Journal. 2001;27:543–554.
  • Harden A, Thomas J. Methodological issues in combining diverse study types in systematic reviews. International Journal of Social Research Methodology. 2005;8:257–271.
  • Harden A, Thomas J. Mixed methods and systematic reviews: Examples and emerging issues. In: Tashakkori A, Teddlie C, editors. Sage handbook of mixed methods in social & behavioral research. 2. Thousand Oaks, CA: SAGE; 2010. pp. 749–774.
  • Hawker S, Payne S, Kerr C, Hardey M, Powell J. Appraising the evidence: Reviewing the disparate data systematically. Qualitative Health Research. 2002;12:1284–1299. [PubMed]
  • Kearney MH. Enduring love: A grounded formal theory of women’s experience of domestic violence. Research in Nursing & Health. 2001a;24:270–282. [PubMed]
  • Kearney MH. New directions in grounded formal theory. In: Schreiber R, Stern PN, editors. Using grounded theory in nursing. New York, NY: Springer; 2001b. pp. 227–246.
  • Kearney MH. From the sublime to the meticulous: The continuing evolution of grounded formal theory. In: Bryant A, Charmaz K, editors. The Sage handbook of grounded theory. Thousand Oaks, CA: SAGE; 2007. pp. 127–150.
  • Kearney MH, O’Sullivan J. Identity shifts as turning points in health behavior change. Western Journal of Nursing Research. 2003;25:134–152. [PubMed]
  • Law J. After method: Mess in social science research. London, England: Routledge; 2004.
  • Lemmer B, Grellier R, Steven J. Systematic review of nonrandom and qualitative research literature: Exploring and uncovering an evidence base for health visiting and decision making. Qualitative Health Research. 1999;9:315–328.
  • Love AC. History, scientific methodology, and the “squishy” sciences. Perspectives in Biology and Medicine. 2006;49:452–456.
  • Love K, Pritchard C, Maguire K, McCarthy A, Paddock P. Qualitative and quantitative approaches to health impact assessment: An analysis of the political and philosophical milieu of the multi-method approach. Critical Public Health. 2005;15:275–289.
  • Lucas PJ, Baird J, Arai L, Law C, Roberts HM. Worked examples of alternative methods for the synthesis of qualitative and quantitative research in systematic reviews. BMC Medical Research Methodology. 2007;7:4. Retrieved from http://www.biomedcentral.com/1471-2288/7/4. [PMC free article] [PubMed]
  • MacLure M. “Clarity bordering on stupidity”: Where’s the quality in systematic review? Journal of Education Policy. 2005;20:393–416.
  • Malpass A, Shaw A, Sharp D, Walter F, Feder G, Ridd M, Kessler D. “Medication career” or “moral career”? The two sides of managing antidepressants: A meta-ethnography of patients’ experience of antidepressants. Social Science & Medicine. 2009;68:154–168. [PubMed]
  • Mason J. Mixing methods in a qualitatively driven way. Qualitative Research. 2006;6:9–25.
  • Maxwell JA, Mittapalli K. Realism as a stance for mixed methods research. In: Tashakkori A, Teddlie C, editors. Sage handbook of mixed methods in social & behavioral research. 2. Thousand Oaks, CA: SAGE; 2010. pp. 145–167.
  • Mays N, Pope C, Popay J. Systematically reviewing qualitative and quantitative evidence to inform management and policy-making in the health field. Journal of Health Services Research and Policy. 2005;10(Suppl 1):S1:6–S1:20. [PubMed]
  • Miles MB, Huberman AM. Qualitative data analysis: An expanded sourcebook. 2. Thousand Oaks, CA: SAGE; 1994.
  • Moreira T. Entangled evidence: Knowledge making in systematic reviews in health care. Sociology of Health & Illness. 2007;29:180–197. [PubMed]
  • Neyland D. Achieving transparency: The visible, invisible and divisible in academic accountability networks. Organization. 2007;14:499–516.
  • Noblit GW, Hare RD. Meta-ethnography: Synthesizing qualitative studies. Newbury Park, CA: SAGE; 1988.
  • O’Campo P, Kirst M, Schaefer-McDaniel N, Firestone M, Scott A, McShane K. Community-based services for homeless adults experiencing concurrent mental health and substance use disorders: A realist approach to synthesizing evidence. Journal of Urban Health. 2009;86:965–989. [PMC free article] [PubMed]
  • Onwuegbuzie AJ, Teddlie C. A framework for analyzing data in mixed methods research. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: SAGE; 2003. pp. 351–383.
  • Paterson BL. The shifting perspectives model of chronic illness. Journal of Nursing Scholarship. 2001;33:21–26. [PubMed]
  • Paterson BL, Thorne SE, Canam C, Jillings C. Meta-study of qualitative health research: A practical guide to meta-analysis and meta-synthesis. Thousand Oaks, CA: SAGE; 2001.
  • Pawson R. Evidence-based policy: A realist perspective. London, England: SAGE; 2006.
  • Pope C, Mays N, Popay J. Synthesizing qualitative and quantitative health evidence. Berkshire, England: Open University Press; 2007.
  • Pound P, Britten N, Morgan M, Yardley L, Pope C, Daker-White G, Campbell R. Resisting medicines: A synthesis of qualitative studies of medicine taking. Social Science & Medicine. 2005;61:133–155. [PubMed]
  • Quine WV. Ontological relativity and other essays. New York, NY: Columbia University Press; 1969.
  • Reichertz J. Abduction: The logic of discovery of grounded theory. In: Bryant A, Charmaz K, editors. The Sage handbook of grounded theory. Thousand Oaks, CA: SAGE; 2007. pp. 214–228.
  • Roberts K, Dixon-Woods M, Fitzpatrick R, Abrams K, Jones DR. Factors affecting uptake of childhood immunization: An example of Bayesian synthesis of qualitative and quantitative evidence. Lancet. 2002;360:1596–1599. [PubMed]
  • Sandelowski M. Using qualitative research. Qualitative Health Research. 2004;14:1366–1386. [PubMed]
  • Sandelowski M. Reading, writing, and systematic review. Journal of Advanced Nursing. 2008;64:104–110. [PMC free article] [PubMed]
  • Sandelowski M, Barroso J. Handbook for synthesizing qualitative research. New York, NY: Springer; 2007.
  • Sandelowski M, Voils CI, Barroso J. Defining and designing mixed research synthesis studies. Research in the Schools. 2006;13:29–40. [PMC free article] [PubMed]
  • Sandelowski M, Voils CI, Knafl G. On quantitizing. Journal of Mixed Methods Research. 2009;3(3):208–222. [PMC free article] [PubMed]
  • Shapin S. Pump and circumstance: Robert Boyle’s literary technology. Social Studies of Science. 1984;14:481–520.
  • Strathern M. The tyranny of transparency. British Educational Research Journal. 2000;26:309–321.
  • Suri H, Clarke D. Advancements in research synthesis methods: From a methodologically inclusive perspective. Review of Educational Research. 2009;79:395–430.
  • Tashakkori A, Teddlie C, editors. Sage handbook of mixed methods in social & behavioral research. 2. Thousand Oaks, CA: SAGE; 2010.
  • Taylor C, Coffey A. Special issue: Qualitative research and methodological innovation. Qualitative Research. 2009;9:523–526.
  • Thorne S, Jensen L, Kearney MH, Noblit G, Sandelowski M. Qualitative meta-synthesis: Reflections on methodological orientation and ideological agenda. Qualitative Health Research. 2004;14:1342–1365. [PubMed]
  • Thorne S, Joachim G, Paterson B, Canam C. Influence of the research frame on qualitatively derived health science. International Journal of Qualitative Methods. 2002;1(1) Retrieved from http://ejournals.library.ualberta.ca/index.php/IJQM/article/view/4611/3760.
  • Thorne S, Paterson B, Acorn S, Canam C, Joachim G, Jillings C. Chronic illness experience: Insights from a metastudy. Qualitative Health Research. 2002;12:437–452. [PubMed]
  • Travers M. New methods, old problems: A skeptical view of innovation in qualitative research. Qualitative Research. 2009;9:161–179.
  • Voils CI, Barroso J, Hasselblad V, Sandelowski M. In or out? Methodological considerations for including and excluding findings from a meta-analysis of predictors of antiretroviral adherence in HIV-positive women. Journal of Advanced Nursing. 2007;59:163–177. [PMC free article] [PubMed]
  • Voils CI, Hasselblad V, Chang YK, Crandell J, Lee EJ, Sandelowski M. A Bayesian method for the synthesis of evidence from qualitative and quantitative reports: The example of antiretroviral medication adherence. Journal of Health Services Research & Policy. 2009;14:226–233. [PMC free article] [PubMed]
  • Voils CI, Sandelowski M, Barroso J, Hasselblad V. Making sense of qualitative and quantitative research findings in mixed research synthesis studies. Field Methods. 2008;20:3–25. [PMC free article] [PubMed]
  • Vogt WP. Quantitative versus qualitative is a distraction: Variations on a theme by Brewer & Hunter (2006) Methodological Innovations Online. 2008 Aug 27;3(1) Retrieved from http://erdt.plymouth.ac.uk/mionline/public_html/viewarticle.php?id=71.
  • Williams AL. Perspectives on spirituality at the end of life: A meta-summary. Palliative and Supportive Care. 2006;4:407–417. [PubMed]