Received 29 October 2012, Accepted 3 November 2012

Published online in Wiley Online Library (wileyonlinelibrary.com) DOI: 10.1002/jrsm.1069

Article Alerts: Items from 2011, Part I

Adam R. Hafdahl a,b*†

The print component of this fifth ‘Article Alerts’ installment comprises 100 articles published in 2011. Since the preceding installment, more than 1,100 items disseminated before 1994 have been added to the archive component. To improve access, searching, and other aspects of users’ experience, items from the print and archive components are being added to an online library. Copyright © 2012 John Wiley & Sons, Ltd.

Supporting information may be found in the online version of this article.

Keywords: bibliography; research synthesis; systematic review; meta-analysis; methodology; literature search

Continuing the precedent set in previous installments of the ‘Article Alerts’ feature section, this fifth installment consists of both a print component and additions to the archive component. These two components now contain more than 7,200 articles, chapters, dissertations, and other types of work on methodology for research synthesis. This installment’s print component comprises 100 articles published in 2011, and we anticipate that the next installment’s print component will include more from 2011. An Excel workbook provided as supporting information on the journal’s website contains separate spreadsheets for the print and archive components.

All 1,129 items added to the archive component since the preceding installment[1] were published or otherwise disseminated before 1994. More than 75% of these (873) were disseminated during the 10 years after 1983, and 95% (1,074) were disseminated during the 20 years after 1973; the remaining 5% (55) were disseminated from 1937 to 1973. Among these newly added archive items are many seminal contributions whose authors laid the methodological foundations of meta-analysis, systematic review, and related approaches.

An endeavor is underway to improve access to this feature section’s print and archive items. To that end, the CiteULike reference management service was used to create a library devoted to methodology for research synthesis. CiteULike’s powerful search capability and other features facilitate finding and managing scholarly work. Those interested may access this library without paying or registering (http://www.citeulike.org/user/Meth4ReSyn). As of this writing, it contains 10% of the 6,018 print and archive items from the preceding installment[1]; more will be added as time and other resources permit. Many items in this library include an abstract and other metadata not in this feature section’s associated Excel file, such as tags (i.e., keywords) and links to full-text access. The following web page describes the library in some detail: http://adamhafdahl.net/bibliography. Readers are encouraged to try it and convey constructive feedback to this feature section’s editor.

Items for current installment

Each of the 100 items in the succeeding text was published in a scholarly journal during 2011. For consistency with previous installments, these items and their associated keywords are partitioned into the same seven categories, whose descriptions are repeated with minor changes. The online supporting information’s Excel file includes DOI names for most of these items.

a ARCH Statistical Consulting, Lawrence, KS, USA
b University of Kansas, Lawrence, KS, USA
*Correspondence to: Adam R. Hafdahl, ARCH Statistical Consulting, LLC, P.O. Box 3877, Lawrence, KS 66046, USA.
†E-mail: [email protected]

Copyright © 2012 John Wiley & Sons, Ltd.

Res. Syn. Meth. 2012, 3 325–331


Adam R. Hafdahl, ARCH Statistical Consulting, Lawrence, KS, and Center for Research Methods and Data Analysis, University of Kansas.


Proposal of novel or refined method: primarily non-statistical

Authors of the items in this first category proposed a novel procedure or strategy for a largely non-statistical aspect of research synthesis or a closely related task (e.g., appraising reviews). In cases where statistical methods were proposed, the contribution as a whole extends substantially beyond statistical considerations. Many of these items include worked examples or real-data illustrations.

401. Adami, H.-O., Berry, C. L., Breckenridge, C. B., Smith, L. L., Swenberg, J. A., Trichopoulos, D., et al. (2011). Toxicology and epidemiology: Improving the science with a framework for combining toxicological and epidemiological evidence to establish causal inference. Toxicological Sciences, 122, 223–234. [primary-study design, threat to validity, guidance for reviewing, literature search, primary-study quality, strength of evidence]

402. Christenson, R. H., Snyder, S. R., Shaw, C. S., Derzon, J. H., Black, R. S., Mass, D., et al. (2011). Laboratory medicine best practices: Systematic evidence review and evaluation methods for quality improvement. Clinical Chemistry, 57, 816–825. [guidance for reviewing, literature search, grey literature, inclusion/exclusion, primary-study quality, coding, strength of evidence, knowledge transfer]

403. Greenhalgh, T., Wong, G., Westhorp, G., & Pawson, R. (2011). Protocol – Realist and meta-narrative evidence synthesis: evolving standards (RAMESES). BMC Medical Research Methodology, 11, 115. [qualitative research, policy implications, guidance for reviewing, guidance for reporting, assessment of review practice, training]

404. Higgins, J. P. T., Altman, D. G., Gøtzsche, P. C., Jüni, P., Moher, D., Oxman, A. D., et al. (2011). The Cochrane Collaboration’s tool for assessing risk of bias in randomised trials. British Medical Journal, 343, d5928. [threat to validity, Cochrane Collaboration, guidance for reporting]

405. Nowak, P. (2011). Synthesis of qualitative linguistic research—A pilot review integrating and generalizing findings on doctor–patient interaction. Patient Education and Counseling, 82, 429–441. [qualitative research, guidance for reviewing, literature search, primary-study quality, coding]

406. Peterson, K., McDonagh, M. S., & Fu, R. W. (2011). Decisions to update comparative drug effectiveness reviews vary based on type of new evidence. Journal of Clinical Epidemiology, 64, 977–984. [updating reviews, literature search, primary-study design, assessment of review practice]

407. Rosenbaum, S. E., Glenton, C., Wiysonge, C. S., Abalos, E., Mignini, L., Young, T., et al. (2011). Evidence summaries tailored to health policy-makers in low- and middle-income countries. Bulletin of the World Health Organization, 89, 54–61. [dissemination, policy implications, knowledge transfer]

408. Shamliyan, T. A., Kane, R. L., Ansari, M. T., Raman, G., Berkman, N. D., Grant, M., et al. (2011). Development quality criteria to evaluate nontherapeutic studies of incidence, prevalence, or risk factors of chronic diseases: pilot study of new checklists. Journal of Clinical Epidemiology, 64, 637–657. [primary-study quality, primary-study design, threat to validity, measurement]

Proposal of novel or refined method: statistical

In contrast to methods proposed by the previous category’s authors, those in the succeeding text primarily involve statistical techniques, and their authors often included evaluations of the proposed method and competing approaches. Much of this work pertains to the data-analysis phase of a quantitative review, but some authors addressed broader issues as well (e.g., data repositories, updating reviews).

326

409. Alves, G., & Yu, Y.-K. (2011). Combining independent, weighted P-values: achieving computational stability by a systematic expansion with controllable accuracy. PLoS One, 6, e22647. [combining p values, method comparison]

410. Bagos, P. G. (2011). Meta-analysis of haplotype-association studies: comparison of methods and empirical evaluation of the literature. BMC Genetics, 12, 8. [genetic association, binary outcome, heterogeneity, multivariate technique, random effects, software, assessment of review practice]

411. Bagos, P. G., Dimou, N. L., Liakopoulos, T. D., & Nikolopoulos, G. K. (2011). Meta-analysis of family-based and case-control genetic association studies that use the same cases. Statistical Applications in Genetics and Molecular Biology, 10, 19. [genetic association, primary-study design, binary outcome, multivariate technique, missing data, software, method comparison]

412. Davidov, O. (2011). Combining p-values using order-based methods. Computational Statistics & Data Analysis, 55, 2433–2444. [combining p values, method comparison, power, Monte Carlo simulation]

413. Doi, S. A. R., Barendregt, J. J., & Mozurkewich, E. L. (2011). Meta-analysis of heterogeneous clinical trials: an empirical example. Contemporary Clinical Trials, 32, 288–298. [heterogeneity, primary-study quality, random effects, method comparison, Monte Carlo simulation]

414. Han, B., & Eskin, E. (2011). Random-effects model aimed at discovering associations in meta-analysis of genome-wide association studies. American Journal of Human Genetics, 88, 586–598. [genetic association, heterogeneity, binary outcome, random effects, power, Monte Carlo simulation, method comparison]

415. Li, J., & Tseng, G. C. (2011). An adaptively weighted statistic for detecting differential gene expression when combining multiple transcriptomic studies. Annals of Applied Statistics, 5, 994–1019. [microarray, combining p values, method comparison, power, resampling, Monte Carlo simulation]



416. Li, J. L., & Fine, J. P. (2011). Assessing the dependence of sensitivity and specificity on prevalence in meta-analysis. Biostatistics, 12, 710–722. [diagnostic test accuracy, correlation, moderator, Monte Carlo simulation]

417. Ma, Y., & Mazumdar, M. (2011). Multivariate meta-analysis: a robust approach based on the theory of U-statistic. Statistics in Medicine, 30, 2911–2929. [multivariate technique, random effects, assumption violation, missing data, method comparison, Monte Carlo simulation]

418. Meyer-Baron, M., Schäper, M., Knapp, G., Lucchini, R., Albini, E., Bast-Pettersen, R., et al. (2011). Statistical means to enhance the comparability of data within a pooled analysis of individual data in neurobehavioral toxicology. Toxicology Letters, 206, 144–151. [individual patient data, multilevel model, moderator, heterogeneity]

419. Miladinovic, B., Kumar, A., Hozo, I., & Djulbegovic, B. (2011). Instrumental variable meta-analysis of individual patient data: application to adjust for treatment non-compliance. BMC Medical Research Methodology, 11, 55. [individual patient data, missing data, selection bias, binary outcome, method comparison]

420. Noma, H. (2011). Confidence intervals for a random-effects meta-analysis based on Bartlett-type corrections. Statistics in Medicine, 30, 3304–3312. [interval estimation, random effects, method comparison, Monte Carlo simulation]

421. Politopoulos, I., Gibson, J., Tapper, W., Ennis, S., Eccles, D., & Collins, A. (2011). Composite likelihood-based meta-analysis of breast cancer association studies. Journal of Human Genetics, 56, 377–382. [genetic association, missing data, heterogeneity]

422. Sharma, G., & Mathew, T. (2011). Higher order inference for the consensus mean in inter-laboratory studies. Biometrical Journal, 53, 128–136. [interval estimation, heterogeneity, Monte Carlo simulation, method comparison]

423. Xie, M. G., Singh, K., & Strawderman, W. E. (2011). Confidence distributions and a unifying framework for meta-analysis. Journal of the American Statistical Association, 106, 320–333. [combining p values, random effects, Bayesian approach, outlier, method comparison]

424. Yao, H., Chen, M.-H., & Qiu, C. F. (2011). Bayesian modeling and inference for meta-data with applications in efficacy evaluation of an allergic rhinitis drug. Journal of Biopharmaceutical Statistics, 21, 992–1005. [random effects, moderator, dependence, Bayesian approach]

425. Zhou, B. Y., Shi, J. X., & Whittemore, A. S. (2011). Optimal methods for meta-analysis of genome-wide association studies. Genetic Epidemiology, 35, 581–591. [genetic association, survival analysis, heterogeneity, binary outcome, missing data, Monte Carlo simulation, method comparison, power]

Broad review of synthesis methods

Authors of items in this category described research synthesis methods in a fairly broad sense, often with particular attention to a specific research domain or type of data. They typically covered most major phases of review and often addressed the larger context in which syntheses are undertaken (e.g., evidence-based practice). A few authors focused specifically on meta-analysis or on qualitative methods, and some included real-data illustrations.




426. Denson, N., & Seltzer, M. H. (2011). Meta-analysis in higher education: an illustrative example using hierarchical linear modeling. Research in Higher Education, 52, 215–244. [overview of meta-analysis, multilevel model, coding, standardized mean difference, graphics, heterogeneity, random effects, moderator, sensitivity analysis, outlier, publication bias]

427. Fu, R. W., Gartlehner, G., Grant, M., Shamliyan, T., Sedrakyan, A., Wilt, T. J., et al. (2011). Conducting quantitative synthesis when comparing medical interventions: AHRQ and the Effective Health Care Program. Journal of Clinical Epidemiology, 64, 1187–1197. [guidance for reviewing, indirect comparison, effect size, binary outcome, standardized mean difference, rare events, heterogeneity, Bayesian approach, moderator, primary-study design, sensitivity analysis]

428. Haase, S. C. (2011). Systematic reviews and meta-analysis. Plastic and Reconstructive Surgery, 127, 955–966. [overview of research synthesis, publication bias, guidance for reviewing, problem formulation, literature search, inclusion/exclusion, primary-study quality, coding, guidance for reporting, implications for individuals]

429. Hannes, K., & Lockwood, C. (2011). Pragmatism as the philosophical foundation for the Joanna Briggs meta-aggregative approach to qualitative evidence synthesis. Journal of Advanced Nursing, 67, 1632–1642. [qualitative research, method comparison, guidance for practice, policy implications]

430. Hansen, H. P., Draborg, E., & Kristensen, F. B. (2011). Exploring qualitative research synthesis: the role of patients’ perspectives in health policy design and decision making. Patient, 4, 143–152. [qualitative research, implications for individuals, policy implications, technology assessment, literature search, primary-study quality, method comparison]

431. Huf, W., Kalcher, K., Pail, G., Friedrich, M.-E., Filzmoser, P., & Kasper, S. (2011). Meta-analysis: fact or fiction? How to interpret meta-analyses. World Journal of Biological Psychiatry, 12, 188–200. [assessment of review practice, appraising reviews, guidance for reporting, effect size, primary-study quality, missing data, publication bias, individual patient data, random effects, moderator]


432. Hussain, N., Bookwala, A., Sancheti, P., & Bhandari, M. (2011). The 3-min appraisal of a meta-analysis. Indian Journal of Orthopaedics, 45, 4–5. [appraising reviews]

433. Israel, H., & Richter, R. R. (2011). A guide to understanding meta-analysis. Journal of Orthopaedic & Sports Physical Therapy, 41, 496–504. [overview of meta-analysis, effect size, interval estimation, heterogeneity, random effects, binary outcome, graphics, software, moderator, individual patient data]

434. Katapodi, M. C., & Northouse, L. L. (2011). Comparative effectiveness research: using systematic reviews and meta-analyses to synthesize empirical evidence. Research and Theory for Nursing Practice, 25, 191–209. [overview of research synthesis, inclusion/exclusion, primary-study design, literature search, coding, heterogeneity, graphics, publication bias, guidance for reporting, cost analysis]

435. Major, C. H., & Savin-Baden, M. (2011). Integration of qualitative evidence: towards construction of academic knowledge in social science and professional fields. Qualitative Research, 11, 645–663. [qualitative research, overview of research synthesis, method comparison, assessment of review practice]

436. Nakagawa, S., & Hauber, M. E. (2011). Great challenges with few subjects: statistical strategies for neuroscientists. Neuroscience and Biobehavioral Reviews, 35, 462–473. [significance testing, effect size, overview of meta-analysis, multilevel model, Bayesian approach]

437. Navarese, E. P., Koziński, M., Pafundi, T., Andreotti, F., Buffon, A., De Servi, S., et al. (2011). Practical and updated guidelines on performing meta-analyses of non-randomized studies in interventional cardiology. Cardiology Journal, 18, 3–7. [overview of meta-analysis, guidance for reviewing, primary-study design, primary-study quality, publication bias, graphics, heterogeneity, moderator]

438. Petrokofsky, G., Holmgren, P., & Brown, N. D. (2011). Reliable forest carbon monitoring—Systematic reviews as a tool for validating the knowledge base. International Forestry Review, 13, 56–66. [evidence-based practice, policy implications]

439. Rew, L. (2011). The systematic review of literature: synthesizing evidence for practice. Journal for Specialists in Pediatric Nursing, 16, 64–69. [overview of research synthesis, problem formulation, literature search, inclusion/exclusion, coding, primary-study quality, dissemination]

440. Smith, V., Devane, D., Begley, C. M., & Clarke, M. (2011). Methodology in conducting a systematic review of systematic reviews of healthcare interventions. BMC Medical Research Methodology, 11, 15. [overview of research synthesis, review of reviews, literature search, inclusion/exclusion, review quality, dissemination]

441. Whiston, S. C., & Li, P. W. (2011). Meta-analysis: a systematic method for synthesizing counseling research. Journal of Counseling and Development, 89, 273–281. [overview of research synthesis, problem formulation, standardized mean difference, correlation, binary outcome, literature search, inclusion/exclusion, coding, dependence, random effects]

Exposition of specific method or issue

In contrast to items in the previous category, those in the succeeding text tended to focus on a fairly specific existing technique or issue, often including more detail than a broader review. Although many of these authors offered recommendations on the basis of their review of relevant methodology, their emphasis was not on introducing new methods.


442. Ades, A. E. (2011). ISPOR states its position on network meta-analysis. Value in Health, 14, 414–416. [network meta-analysis, indirect comparison, heterogeneity]

443. Aguinis, H., Gottfredson, R. K., & Wright, T. A. (2011). Best-practice recommendations for estimating interaction effects using meta-analysis. Journal of Organizational Behavior, 32, 1033–1043. [moderator, random effects, correlation, standardized mean difference, validity generalization, method comparison, guidance for reviewing, publication bias]

444. Al khalaf, M. M., Thalib, L., & Doi, S. A. R. (2011). Combining heterogeneous studies using the random-effects model is a mistake and leads to inconclusive meta-analyses. Journal of Clinical Epidemiology, 64, 119–123. [large study, heterogeneity, random effects, primary-study quality]

445. Botella, J., & Ponte, G. (2011). Effects of the heterogeneity of the variances on reliability generalization: an example with the Beck Depression Inventory. Psicothema, 23, 516–522. [reliability generalization, measurement, heterogeneity, moderator]

446. Bragge, P., Clavisi, O., Turner, T., Tavender, E., Collie, A., & Gruen, R. L. (2011). The global evidence mapping initiative: scoping research in broad topic areas. BMC Medical Research Methodology, 11, 92. [problem formulation, informing primary research, literature search, inclusion/exclusion, coding]

447. Bunn, F., & Sworn, K. (2011). Strategies to promote the impact of systematic reviews on healthcare policy: a systematic review of the literature. Evidence & Policy, 7, 403–428. [policy implications, dissemination, knowledge transfer]

448. Campbell, C. E., & Rukhin, A. L. (2011). Evaluation of self-diffusion data using weighted means statistics. Acta Materialia, 59, 5194–5201. [multivariate technique, random effects]

449. Combs, J. G., Ketchen, D. J., Jr., Crook, T. R., & Roth, P. L. (2011). Assessing cumulative evidence within ‘macro’ research: why meta-analysis should be preferred over vote counting. Journal of Management Studies, 48, 178–197. [vote counting, method comparison, significance testing, power, theory development, informing primary research]

450. Coyne, J. C., Hagedoorn, M., & Thombs, B. (2011). Most published and unpublished dissertations should be excluded from meta-analyses: comment on Moyer et al. Psycho-Oncology, 20, 224–225. [grey literature, inclusion/exclusion, power, primary-study quality]

451. Dent, L., Taylor, R., Jolly, K., & Raftery, J. (2011). “Flogging dead horses”: Evaluating when have clinical trials achieved sufficiency and stability? A case study in cardiac rehabilitation. Trials, 12, 83. [cumulative meta-analysis, practical significance, graphics]

452. Guyatt, G. H., Oxman, A. D., Vist, G., Kunz, R., Brozek, J., Alonso-Coello, P., et al. (2011). GRADE guidelines: 4. Rating the quality of evidence—study limitations (risk of bias). Journal of Clinical Epidemiology, 64, 407–415. [strength of evidence, threat to validity, primary-study quality, reporting bias, primary-study design, inclusion/exclusion]

453. Hoaglin, D. C., Hawkins, N., Jansen, J. P., Scott, D. A., Itzler, R., Cappelleri, J. C., et al. (2011). Conducting indirect-treatment-comparison and network-meta-analysis studies: report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: Part 2. Value in Health, 14, 429–437. [indirect comparison, network meta-analysis, guidance for reviewing, literature search, random effects, moderator, Bayesian approach, method comparison, sensitivity analysis, heterogeneity, software]

454. Huang, H.-Y., Andrews, E., Jones, J., Skovron, M. L., & Tilson, H. (2011). Pitfalls in meta-analyses on adverse events reported from clinical trials. Pharmacoepidemiology and Drug Safety, 20, 1014–1020. [adverse effects, threat to validity, missing data, rare events, moderator, guidance for reporting, guidance for reviewing]

455. Hurley, J. (2011). Meta-analysis of clinical studies of diagnostic tests: developments in how the receiver operating characteristic “works.” Archives of Pathology & Laboratory Medicine, 135, 1585–1590. [diagnostic test accuracy, graphics, binary outcome, multivariate technique]

456. Ioannidis, J. P. A. (2011). Commentary: adjusting for bias: a user’s guide to performing plastic surgery on meta-analyses of observational studies. International Journal of Epidemiology, 40, 777–779. [threat to validity, primary-study design, primary-study quality]

457. Jackson, D., White, I. R., & Riley, R. D. (2011). Rejoinder to commentaries on ‘multivariate meta-analysis: potential and promise.’ Statistics in Medicine, 30, 2509–2510. [multivariate technique]

458. Kwok, H., & Lewis, R. J. (2011). Bayesian hierarchical modeling and the integration of heterogeneous information on the effectiveness of cardiovascular therapies. Circulation: Cardiovascular Quality and Outcomes, 4, 657–666. [multilevel model, Bayesian approach, heterogeneity, random effects, method comparison, moderator, sensitivity analysis]

459. Morissette, K., Tricco, A. C., Horsley, T., Chen, M. H., & Moher, D. (2011). Blinded versus unblinded assessments of risk of bias in studies included in a systematic review. Cochrane Database of Systematic Reviews, 2011(9), MR000025. [primary-study quality, threat to validity]

460. Nikles, J., Mitchell, G. K., Schluter, P., Good, P., Hardy, J., Rowett, D., et al. (2011). Aggregating single patient (n-of-1) trials in populations where recruitment and retention was difficult: The case of palliative care. Journal of Clinical Epidemiology, 64, 471–480. [single-subject design, primary-study design, missing data, sample size, Bayesian approach, primary-study quality, implications for individuals]

461. Perrier, L., Mrklas, K., Lavis, J. N., & Straus, S. E. (2011). Interventions encouraging the use of systematic reviews by health policymakers and managers: a systematic review. Implementation Science, 6, 43. [knowledge transfer, policy implications, decision making]

462. Trespidi, C., Barbui, C., & Cipriani, A. (2011). Why it is important to include unpublished data in systematic reviews. Epidemiology and Psychiatric Sciences, 20, 133–135. [grey literature, publication bias, registry of trials]

463. Tsertsvadze, A., Maglione, M., Chou, R., Garritty, C., Coleman, C., Lux, L., et al. (2011). Updating comparative effectiveness reviews: current efforts in AHRQ’s Effective Health Care Program. Journal of Clinical Epidemiology, 64, 1208–1215. [updating reviews, literature search, guidance for reviewing, cumulative meta-analysis, guidance for reporting]

464. Weiß, B., & Wagner, M. (2011). The identification and prevention of publication bias in the social sciences and economics. Jahrbücher für Nationalökonomie und Statistik, 231, 661–684. [publication bias, graphics, method comparison, editorial peer review, grey literature, prospective meta-analysis]

465. Wilks, D. C., Mander, A. P., Jebb, S. A., Thompson, S. G., Sharp, S. J., Turner, R. M., et al. (2011). Dietary energy density and adiposity: employing bias adjustments in a meta-analysis of prospective studies. BMC Public Health, 11, 48. [primary-study design, primary-study quality, threat to validity, correlation]

466. Young, T., & Hopewell, S. (2011). Methods for obtaining unpublished data. Cochrane Database of Systematic Reviews, 2011(11), MR000027. [grey literature, missing data]

Evaluation of method




A major contribution of the following items is the evaluation of one or more methods for research synthesis or a related task. This often entailed analytic critique, Monte Carlo simulation, or application to one or more real-world data sets. Some of these authors also proposed a novel or refined method, but not as the main focus.


467. Bagheri, Z., Ayatollahi, S. M. T., & Jafari, P. (2011). Comparison of three tests of homogeneity of odds ratios in multicenter trials with unequal sample sizes within and among centers. BMC Medical Research Methodology, 11, 58. [heterogeneity, multi-site study, binary outcome, power, sample size, method comparison, Monte Carlo simulation, software]

468. Campbell, R., Pound, P., Morgan, M., Daker-White, G., Britten, N., Pill, R., et al. (2011). Evaluating meta-ethnography: systematic analysis and synthesis of qualitative research. Health Technology Assessment, 15, 43. [qualitative research, assessment of review practice, literature search, overview of research synthesis, method comparison, primary-study quality]

469. Chapman, K., Ferreira, T., Morris, A., Asimit, J., & Zeggini, E. (2011). Defining the power limits of genome-wide association scan meta-analyses. Genetic Epidemiology, 35, 781–789. [genetic association, replication, power, sample size, heterogeneity, Monte Carlo simulation]

470. Cooper, N. J., Peters, J., Lai, M. C. W., Juni, P., Wandel, S., Palmer, S., et al. (2011). How valuable are multiple treatment comparison methods in evidence-based health-care evaluation? Value in Health, 14, 371–380. [network meta-analysis, longitudinal data, Bayesian approach, method comparison, multivariate technique, decision making, economic valuation]

471. Oppe, M., Al, M., & Rutten-van Mölken, M. (2011). Comparing methods of data synthesis: re-estimating parameters of an existing probabilistic cost-effectiveness model. Pharmacoeconomics, 29, 239–250. [cost analysis, Bayesian approach, random effects, method comparison, sensitivity analysis, binary outcome]

472. Pearson, M., Moxham, T., & Ashton, K. (2011). Effectiveness of search strategies for qualitative research about barriers and facilitators of program delivery. Evaluation & the Health Professions, 34, 297–308. [literature search, qualitative research, method comparison]

473. Rücker, G., Carpenter, J. R., & Schwarzer, G. (2011). Detecting and adjusting for small-study effects in meta-analysis. Biometrical Journal, 53, 351–368. [publication bias, sample size, graphics, heterogeneity, random effects, method comparison, Monte Carlo simulation, binary outcome, sensitivity analysis]

474. Thorlund, K., Imberger, G., Walsh, M., Chu, R., Gluud, C., Wetterslev, J., et al. (2011). The number of patients and events required to limit the risk of overestimation of intervention effects in meta-analysis—A simulation study. PLoS One, 6, e25491. [sample size, binary outcome, Monte Carlo simulation, heterogeneity, sequential testing]

Evaluation of substantive application(s)

Items in this category primarily concerned evaluating one or more reported real-data applications of research synthesis or related methods, often with emphasis on the quality of implementation rather than on the performance or properties of the method(s). Some authors focused on specific phases of review or particular threats to validity.


475. Aguinis, H., Dalton, D. R., Bosco, F. A., Pierce, C. A., & Dalton, C. M. (2011). Meta-analytic choices and judgment calls: implications for theory building and testing, obtained effect sizes, and scholarly impact. Journal of Management, 37, 5–38. [assessment of review practice, theory development, validity generalization, method comparison, Monte Carlo simulation]

476. Banzi, R., Cinquini, M., Liberati, A., Moschetti, I., Pecoraro, V., Tagliabue, L., et al. (2011). Speed of updating online evidence based point of care summaries: prospective cohort analysis. British Medical Journal, 343, d5856. [guidance for practice, knowledge transfer, updating reviews, assessment of review practice]

477. Cruzes, D. S., & Dybå, T. (2011). Research synthesis in software engineering: a tertiary study. Information and Software Technology, 53, 440–455. [assessment of review practice, qualitative research, primary-study quality, review quality, dissemination]

478. da Silva, F. Q. B., Santos, A. L. M., Soares, S., França, A. C. C., Monteiro, C. V. F., & Maciel, F. F. (2011). Six years of systematic literature reviews in software engineering: an updated tertiary study. Information and Software Technology, 53, 899–913. [assessment of review practice, review quality, guidance for practice, primary-study quality]

479. Davey, J., Turner, R. M., Clarke, M. J., & Higgins, J. P. T. (2011). Characteristics of meta-analyses and their component studies in the Cochrane Database of Systematic Reviews: a cross-sectional, descriptive analysis. BMC Medical Research Methodology, 11, 160. [assessment of review practice, Cochrane Collaboration, sample size]

480. de Bot, C. M. A., Moed, H., Berger, M. Y., Röder, E., van Wijk, R. G., & van der Wouden, J. C. (2011). Sublingual immunotherapy in children with allergic rhinitis: quality of systematic reviews. Pediatric Allergy and Immunology, 22, 548–558. [assessment of review practice, review quality, primary-study quality]

481. Herbison, P., Hay-Smith, J., & Gillespie, W. J. (2011). Meta-analyses of small numbers of trials often agree with longer-term results. Journal of Clinical Epidemiology, 64, 145–153. [cumulative meta-analysis, heterogeneity, Cochrane Collaboration]

482. Korevaar, D. A., Hooft, L., & ter Riet, G. (2011). Systematic reviews and meta-analyses of preclinical studies: publication bias in laboratory animal experiments. Laboratory Animals, 45, 225–230. [assessment of review practice, review quality, publication bias]

Res. Syn. Meth. 2012, 3 325–331


483. Liu, Y. L., Yang, S. P., Dai, J. J., Xu, Y. T., Zhang, R., Jiang, H. L., et al. (2011). Risk of bias tool in systematic reviews/meta-analyses of acupuncture in Chinese journals. PLoS One, 6, e28130. [threat to validity, primary-study quality, assessment of review practice, Cochrane Collaboration]

484. Maggin, D. M., O'Keeffe, B. V., & Johnson, A. H. (2011). A quantitative synthesis of methodology in the meta-analysis of single-subject research for students with disabilities: 1985–2009. Exceptionality, 19, 109–135. [single-subject design, assessment of review practice, effect size, primary-study quality]

485. Mickenautsch, S., & Yengopal, V. (2011). Extent and quality of systematic review evidence related to minimum intervention in dentistry: essential oils, powered toothbrushes, triclosan, xylitol. International Dental Journal, 61, 179–192. [assessment of review practice, review quality, selection bias]

486. Papageorgiou, S. N., Papadopoulos, M. A., & Athanasiou, A. E. (2011). Evaluation of methodology and quality characteristics of systematic reviews in orthodontics. Orthodontics & Craniofacial Research, 14, 116–137. [assessment of review practice, review quality, dissemination, Cochrane Collaboration]

487. Pereira, T. V., & Ioannidis, J. P. A. (2011). Statistically significant meta-analyses of clinical trials have modest credibility and inflated effects. Journal of Clinical Epidemiology, 64, 1060–1069. [significance testing, assessment of review practice, updating reviews, Bayesian approach]

488. Pibouleau, L., & Chevret, S. (2011). Bayesian statistical method was underused despite its advantages in the assessment of implantable medical devices. Journal of Clinical Epidemiology, 64, 270–279. [Bayesian approach, assessment of review practice, dissemination, heterogeneity]

489. Savard, L. A., Thompson, D. R., & Clark, A. M. (2011). A meta-review of evidence on heart failure disease management programs: the challenges of describing and synthesizing evidence on complex interventions. Trials, 12, 194. [assessment of review practice, review quality, dissemination, heterogeneity]

490. Sequeira-Byron, P., Fedorowicz, Z., Jagannath, V. A., & Sharif, M. O. (2011). An AMSTAR assessment of the methodological quality of systematic reviews of oral healthcare interventions published in the Journal of Applied Oral Science (JAOS). Journal of Applied Oral Science, 19, 440–447. [assessment of review practice, review quality]

491. Sharma, R., Vannabouathong, C., Bains, S., Marshall, A., MacDonald, S. J., Parvizi, J., et al. (2011). Meta-analyses in joint arthroplasty: a review of quantity, quality, and impact. Journal of Bone and Joint Surgery-American Volume, 93, 2304–2309. [assessment of review practice, review quality, dissemination]

492. Tao, K.-M., Li, X.-Q., Zhou, Q.-H., Moher, D., Ling, C.-Q., & Yu, W.-F. (2011). From QUOROM to PRISMA: a survey of high-impact medical journals' instructions to authors and a review of systematic reviews in anesthesia literature. PLoS One, 6, e27611. [guidance for reporting, assessment of review practice]

493. Weed, D. L., Althuis, M. D., & Mink, P. J. (2011). Quality of reviews on sugar-sweetened beverages and health outcomes: a systematic review. American Journal of Clinical Nutrition, 94, 1340–1347. [review quality, assessment of review practice]

494. Willis, B. H., & Quigley, M. (2011). The assessment of the quality of reporting of meta-analyses in diagnostic research: a systematic review. BMC Medical Research Methodology, 11, 163. [guidance for reporting, assessment of review practice, diagnostic test accuracy, primary-study quality, technology assessment]

Other type of contribution

Each of the following items makes a unique type of methodological contribution relevant to research synthesis that is not readily classified into one of the preceding six categories.

495. Albright, D. L. (2011). [Review of the book Research synthesis and meta-analysis: A step-by-step approach, 4th ed.]. Research on Social Work Practice, 21, 119. [overview of research synthesis]

496. Barker, F. G., & Oyesiku, N. M. (2011). Improving the quality of research reports in NEUROSURGERY (R): the CONSORT, PRISMA, MOOSE, STARD, STROBE statements and the EQUATOR network. Neurosurgery, 68, 1–5. [guidance for reporting, primary-study quality, primary-study design]

497. Lai, N. M., Teng, C. L., & Lee, M. L. (2011). Interpreting systematic reviews: are we ready to make our own conclusions? A cross-sectional study. BMC Medicine, 9, 30. [appraising reviews, strength of evidence, knowledge transfer, Cochrane Collaboration]

498. Moher, D., Altman, D. G., Liberati, A., & Tetzlaff, J. (2011). PRISMA Statement. Epidemiology, 22, 128. [review protocol]

499. Petticrew, M., & McCartney, G. (2011). Using systematic reviews to separate scientific from policy debate relevant to climate change. American Journal of Preventive Medicine, 40, 576–578. [policy implications, guidance for reviewing, strength of evidence]

500. The PLoS Medicine Editors. (2011). Best practice in systematic reviews: the importance of protocols and registration. PLoS Medicine, 8, e1001009. [review protocol, registry of trials]

1. Hafdahl AR. Article Alerts: Items from 2010, Part II. Research Synthesis Methods 2011; 2: 279–286. DOI: 10.1002/jrsm.56


