Soc. Sci. Med. Vol. 35, No. 11, pp. 1321-1324, 1992. Printed in Great Britain. All rights reserved. 0277-9536/92 $5.00 + 0.00. Copyright © 1992 Pergamon Press Ltd

INTRODUCTION

RESEARCH CAPACITY BUILDING IN INTERNATIONAL HEALTH: DEFINITIONS, EVALUATIONS AND STRATEGIES FOR SUCCESS

JAMES TROSTLE

Applied Diarrheal Disease Research Project (ADDR), Harvard Institute for International Development, One Eliot Street, Cambridge, MA 02138, U.S.A.

The papers in this special issue describe research processes rather than products. They look behind research results to examine the inducements and constraints that shape those results, especially in less-developed countries (LDCs). All the papers in this issue concern aspects of research capacity building, a general term for a process of individual and institutional development that leads to higher levels of skills and a greater ability to perform useful research. This development is commonly directed at academic or other research institutions, and funded by international donors and/or local governments. The Commission on Health Research for Development has recently defined research capacity as comprising four main components: individual researcher competence; quality of institutional infrastructure; presence of research focusing on country-specific policy formation and action; and ability to contribute to global research and policy priorities [1].

While these papers touch on various aspects of international health and the structure of research in LDCs, many of the concerns they raise are equally relevant to research in industrialized countries. This relevance is, in fact, perfectly predictable, because so much health research produced in LDCs is built on a Western model of science as practiced in industrialized countries. We see similarities because we have built from a limited, and familiar, set of designs.

Most of these papers come from an invited session on research capacity building held at the 1989 Annual Meeting of the American Anthropological Association. The papers describe various training and grant support programs, including examples focused on clinical epidemiology [2]; studies of rational use of drugs [3]; tropical disease research [4]; diarrheal diseases [5-7]; and women's reproductive health [8].
These program descriptions and reports of personal and project histories are used to highlight a number of specific issues in international health research development: How can program designers build international consensus favoring new behavioral interventions [3]? What structural constraints interfere with research funding, and how can programs be organized to reduce these constraints [7]? What are the strengths and weaknesses of interdisciplinary science as practiced in LDCs, and how can local concerns be better integrated into international agendas [5]? What are the benefits and costs of using manuals as a substitute for more in-depth research training [6]? What are the post-graduate training needs of practicing medical social scientists [8]? How can social science concerns and methods be better integrated into training programs dominated by a clinical worldview [2]? What types of collaboration exist among scientific disciplines, and how have programs been organized to promote one or another type of collaboration [4]?

The other two papers in this issue are not explicitly based in programs; they discuss broader concerns about how research production is organized and how research is designed. Frenk's paper [9] describes how organizations can confront the classic struggle between locally relevant and internationally recognized research, and how they can minimize the movement of researchers toward administrative positions. Stanton and colleagues [10] lay out a strategy for increasing the use of behavioral theories in field studies, a problem noted by many international donors.

Taken as a whole, these papers summarize the major challenges facing many of the largest supporters of health and social science in LDCs, and the strategies they have adopted to attain success. The social science programs described in these manuscripts represent expenditures totalling more than $50 million over the past 5 years. Program planning, finance, common sense, and curiosity all motivate an evaluation of their efforts. The following pages briefly pose some background questions to contextualize and uncover assumptions behind these evaluations.

1. WHAT IS RESEARCH CAPACITY BUILDING?

Let us consider the phrase 'research capacity building' more closely, in order to unpack its multiple assumptions. To support research implies funding studies; but to support research capacity implies funding the multiple prerequisites to performing research, including technical skills and technology, career paths and computers, peer review and




publications. This is a goal far larger and more complex than that of funding a particular research proposal. Emphasizing research capacity building also carries a subtle implicit arrogance: that there is no capacity to begin with, that it needs to be 'built' rather than 'expanded' or 'supported.' So perhaps we should speak of 'supporting research institutions' or 'expanding research possibilities' rather than building research capacity. Whatever term we choose, we at least know what process we are talking about. But do we? 'Research capacity building' is a complex phrase, and a difficult goal to define. Donors sometimes measure the primary products of capacity building (trained scientists and reputable research institutions) as though they were products like improved strains of corn. How many units were produced? At what cost? How fertile are they, in terms of publications and presentations? Are they stable through time? Do they reproduce themselves? Research capacity is, however, a particular kind of development product with a special set of qualities that make it difficult to monitor, evaluate, and sustain:

(1) It is often a subjective attribute: what one person or institution defines as capacity, others may dismiss as incompetence. Different types of research questions require different levels of research sophistication, yet all too often the quality of a research project is evaluated according to its complexity.

(2) It is context-specific. Prevailing research standards differ across countries, disciplines, and topics. Research related to policy in LDCs (and elsewhere) may be useful even when study design or sample size limitations preclude publishing the results in elite, peer-reviewed academic journals. Standards for data analysis also differ across countries and disciplines, making assessment and comparison of results more problematic. In some countries and disciplines 'quantitative analysis' most often refers to producing univariate frequency tables; in others it refers to computer-based multivariate modelling.

(3) Research capacity is also mobile, and can be short-lived: well-trained researchers in LDCs are in short supply and are sought after by governments, universities, non-governmental organizations (NGOs), and international development agencies. Local research careers are often terminated when successful researchers become university administrators or international consultants, or when they take resident positions in other countries.

(4) Capable researchers and research institutions in LDCs are often over-committed: researchers often have multiple employers, and their various commitments add up to more than 100% of their time. Institutions sometimes have more projects than they can handle. Clinical researchers in public institutions often derive a significant proportion of their income from private practice, and their economic survival depends on their profits. Research, while a desirable goal, does not offer significant compensation to most. Thus excellence in research can lead paradoxically to overwork, declining quality, and increasing frustration.

Research capacity is therefore a fragile development goal: a sustained research program requires not only funding opportunities, but also a network of colleagues, a career path, a set of personal and financial incentives, and a commitment by the state to support, or at least tolerate, research as a legitimate and valued endeavour. Disappearance of even one of these supports may reduce or eliminate research opportunities.

2. HOW IS RESEARCH CAPACITY TO BE EVALUATED?

More conceptual attention must be focused on how best to evaluate the success of efforts to build research capacity. How much time must pass before training can be evaluated? Can the impact of money spent on training be evaluated apart from efforts to build a creative and well-equipped environment for researchers? How long must a researcher practice his or her craft to justify prior expenditures on training? Exercises such as calculating the cost-effectiveness of investments in research capacity should not proceed until these questions are answered. These are issues of policy as much as they are issues of measurement, and they lead us to questions about the larger goals of capacity building.

3. COLONIALISM OR INDEPENDENCE: TOWARD WHAT END IS RESEARCH CAPACITY TO BE ‘BUILT’?

Increasing numbers of LDC scientists are designing and performing their own scientific projects. Within the world of international research funding, however, their goals and designs are still often overshadowed by foreign scientists with better command of English, better connections to donors, and more sophisticated research skills. These LDC scientists seek true collaborative relationships with scientists and institutions in developed countries; they have for some time wanted to do more than implement other scientists’ study designs, or provide access to study populations (e.g. HIV positive heterosexuals) and diseases (e.g. cholera, malaria, schistosomiasis) unavailable to developed country scientists at home [11,12]. The systematic involvement of LDC scientists as implementors of research designs or purveyors of unique datasets can be called ‘scientific colonialism,’ for it is typified by extraction and export of knowledge rather than fertilization and indigenous growth. Scientific colonialism involves, among other activities, the gathering of data in LDCs, exportation of data from the LDC in raw form, and processing

(analysis and publication) outside the LDC. Unlike earlier stages of colonialism, however, the finished products (published papers) are often never marketed back to the countries that produced the raw data. Scientific journals are expensive and rare, and sometimes so are libraries. The most recent examples of scientific colonialism have accompanied the AIDS pandemic. For example, numerous superficial and rapid research projects on AIDS have taken place in Africa, and have therefore come to be called 'safari research' [13]. This type of research has led some countries to begin strictly enforcing regulations covering the export of data, and it has heightened the urgency of building, or better recognizing, local networks of competent researchers.

Research capacity can thus be built toward two different ends: LDC scientists can design, perform, analyze, and disseminate results from their own research projects; or they can carry out studies designed by others. Reduced responsibility for design results in reduced ability in analysis. Independent leadership of scientific projects is the more effective goal of research capacity building because it requires participation in all phases of the research process. Information produced in this fashion is more likely to be used, for it is produced by a strong and informed lobby. Important skills in communication and budgeting can also be acquired, and kept sharp through use.

4. STRATEGIES FOR ADDITIONAL SUCCESS IN CAPACITY BUILDING: EMPHASIZING PROBLEM FORMULATION, THEORETICAL FOUNDATIONS, AND APPROPRIATE DISSEMINATION

Research entails a number of steps that historically have received insufficient attention in both LDCs and developed countries. These steps involve the critical areas where science, ideology, and politics all play important roles, such as 'What is the right question?', 'So what?', and 'What is the right audience for my results?' Research methods texts usually focus on protocol development, design issues, and the collection and analysis of data (e.g. in anthropology, Pelto and Pelto [14], and Bernard [15]). Additional emphasis has recently been placed on these issues of knowing when the right question has been asked, and the right audience reached. These steps preceding and following project implementation are crucial aspects of training researchers and conducting useful research [16-18].

Before developing a research protocol, scientists need to be able to identify important problems, divide them into manageable questions, and specify the theory and models they will use to organize their research. Many research proposals on issues in international health pay scant attention to the theoretical foundations of the research problem or the research approach. This has been a common finding by many funders of research in international health. It has contributed to the low prestige


accorded to applied health research, for such research is often atheoretical, therefore outside the boundaries of core disciplinary advances in the social sciences, and therefore uninteresting. Lack of theory comes partly from insufficient library resources, inability to keep up with current scientific debates, and problems with research training. It also comes from discontinuities in research support: one project can be seen as the only opportunity to complete multiple goals, and attention to theory suffers when data collection itself becomes the research goal. These concerns motivated the ADDR project to sponsor a workshop on the use of theory in behavioral research, from which comes the paper by Stanton et al. [10] included here.

Insufficient attention to problem identification can cause researchers to investigate trivial problems, work toward unattainable research objectives, develop excessively complex designs, or produce results that are not or cannot be compared with other work done on the same topic. Knowing what level of detail is required for what application facilitates the design of efficient research projects.

For those studies with an applied orientation, more emphasis after data collection needs to be placed on presenting results to policymakers and/or scientific colleagues in an accessible and useful form (see the recent emphasis on this in van Willigen et al. [18]). Insufficient attention at these later stages can lead to unanalyzed datasets; an implicit understanding that research means little more than gathering data; findings that never reach policymakers in interpretable or usable form; and 'fugitive' findings that fail to build a cumulative research record. These problems are not, of course, unique to LDCs, but they loom larger in these countries. This is because applied research projects in LDCs are usually designed to ameliorate pressing health problems: when research projects have no impact on policy, programs, or individuals' health status, the cost of the lost opportunity appears far greater.

5. ISSUES FOR FUTURE DEBATE AND RESEARCH IN CAPACITY BUILDING

Though the papers in this issue cover new territory and ask fundamental new questions, they by no means exhaust the important topics for research in health research capacity building. First, there is as yet no anthropology of scientists or scientific elites in LDCs, though the anthropology of science as practiced in the United States, Europe, and Japan is progressing rapidly. Research capacity building is development work among relatively privileged sectors of a society, and thus has its own contradictions deserving additional exploration, especially using cross-cultural longitudinal study designs. Second, it would be interesting to look more broadly and historically at changes in research capacity building over time. How have funding



themes changed; how have funding strategies changed; and which strategies seem to have worked best? Third, it is critical to compare research capacity building efforts in health with efforts in other sectors such as agriculture, economics, and engineering. These other sectors, especially agriculture, have a longer history and more funding support, and therefore provide instructive parallels and divergences. Fourth, it is appropriate to ask why research itself has taken on such importance in the international funding agenda, and why health action programs are receiving less financial support and attention. These political concerns must be addressed if the context of research capacity building is to be fully understood. Finally, it will be important to include more examples from LDC authors in a future assemblage of papers about research capacity building. Consultants, faculty, and program managers convey important experiences and histories, but these must in the end be supplemented with, and calibrated against, the perspectives of the funded scientists themselves.

Acknowledgements: Thanks to Jon Simon and Lynn Morgan for their comments on an earlier draft. This paper was supported by means of a cooperative agreement (# DPE-5952-A-00-5073-00) between the Harvard Institute for International Development and the United States Agency for International Development. Neither institution bears any responsibility for the ideas expressed here.

REFERENCES

1. Commission on Health Research for Development. Health Research: Essential Link to Equity in Development, pp. 71-74. Oxford University Press, New York, 1990.
2. Higginbotham N. H. Developing a social science component within the International Clinical Epidemiology Network (INCLEN). Soc. Sci. Med. 35, 1325-1327, 1992.
3. Ross-Degnan D., Laing R., Quick J., Ali H. M., Ofori-Adjei D., Salako L. and Santoso B. A strategy for promoting improved pharmaceutical use: The international network for rational use of drugs. Soc. Sci. Med. 35, 1329-1341, 1992.
4. Rosenfield P. The potential of transdisciplinary research for sustaining and extending linkages between the health and social sciences. Soc. Sci. Med. 35, 1343-1357, 1992.
5. Good M. J. Local knowledge: Research capacity building in international health. Soc. Sci. Med. 35, 1359-1367, 1992.
6. Herman E. and Bentley M. E. Manuals for ethnographic data collection: Experience and issues. Soc. Sci. Med. 35, 1369-1378, 1992.
7. Trostle J. A. and Simon J. Building applied health research capacity in developing countries: Problems encountered by the ADDR Project. Soc. Sci. Med. 35, 1379-1387, 1992.
8. Pelto P. J. and Pelto G. H. Developing applied medical anthropology in third world countries: Problems and actions. Soc. Sci. Med. 35, 1389-1395, 1992.
9. Frenk J. Balancing relevance and excellence: Organizational responses to link research with decision making. Soc. Sci. Med. 35, 1397-1404, 1992.
10. Stanton B., Black R., Engle P. and Pelto G. H. Theory-driven behavioral intervention research for the control of diarrheal diseases. Soc. Sci. Med. 35, 1405-1420, 1992.
11. Stavenhagen R. Decolonializing applied social sciences (with comments and reply). Hum. Organiz. 30, 333-357, 1971.
12. Martinez-Palomo A. Science for the Third World: An inside view. Perspect. Biol. Med. 30, 546-560, 1987.
13. Palca J. African AIDS: Whose research rules? Science 250, 199-201, 1990.
14. Pelto P. J. and Pelto G. H. Anthropological Research: The Structure of Inquiry. Cambridge University Press, London, 1978.
15. Bernard H. R. Research Methods in Cultural Anthropology. Sage, Newbury Park, CA, 1988.
16. Ratcliffe J. and Gonzalez-del-Valle A. Rigor in health-related research: Toward an expanded conceptualization. Int. J. Hlth Services 18, 361-392, 1988.
17. Brownlee A. T. Applied research as a problem-solving tool: Strengthening the interface between health management and research. J. Hlth Admin. Educ. 4, 31-44, 1989.
18. Van Willigen J., Rylko-Bauer B. and McElroy A. Making Our Research Useful: Case Studies in the Utilization of Anthropological Knowledge. Westview Press, Boulder, CO, 1989.

