J Interprof Care, 2014; 28(3): 194–199. © 2014 Informa UK Ltd. DOI: 10.3109/13561820.2013.874982. ISSN: 1356-1820 (print), 1469-9567 (electronic). http://informahealthcare.com/jic

THEMED ARTICLE

Citizen social science: a methodology to facilitate and evaluate workplace learning in continuing interprofessional education

Ann Dadich

School of Business, University of Western Sydney, Parramatta, Australia

Abstract

Workplace learning in continuing interprofessional education (CIPE) can be difficult to facilitate and evaluate, which can create a number of challenges for this type of learning. This article presents an innovative method to foster and investigate workplace learning in CIPE – citizen social science. Citizen social science involves clinicians as co-researchers in the systematic examination of social phenomena. When facilitated by an open-source online social networking platform, clinicians can participate via computer, smartphone, or tablet in ways that suit their needs and preferences. Furthermore, as co-researchers they can help to reveal the dynamic interplay that facilitates workplace learning in CIPE. Although yet to be tested, citizen social science offers four potential benefits: it recognises and accommodates the complexity of workplace learning in CIPE; it has the capacity to both foster and evaluate the phenomena; it can be used in situ, capturing and having direct relevance to the complexity of the workplace; and by advancing both theoretical and methodological debates on CIPE, it may reveal opportunities to improve and sustain workplace learning. By describing an example situated in the youth health sector, this article demonstrates how these benefits might be realised.

Keywords

Continuing interprofessional education, crowdsourcing, social media, workplace learning, youth health

Correspondence: Dr. Ann Dadich, University of Western Sydney, School of Business, Locked Bag 1797, Penrith 2751, Australia. E-mail: [email protected]

History

Received 22 January 2013; Revised 21 October 2013; Accepted 10 December 2013; Published online 9 January 2014

Introduction

Despite endorsement for and the rise of interprofessional care, continuing education (CE) for many health professions often remains siloed and not situated within the workplace (ICSC, 2007; Neville, 2010; Reeves, 2012). With few exceptions (Phelan, Barlow, & Iversen, 2006), employers of and professional bodies for clinicians typically offer discrete training opportunities with limited integration with the complexity of health services. As the Institute of Medicine (2010) has reported:

There are major flaws in the way CE is conducted, financed, regulated, and evaluated. As a result, the health care workforce is not optimally prepared to provide the highest quality of care to patients or to meet public expectations for quality and safety (p. 2, emphasis in original).

While it can be difficult to foster continuing interprofessional education (CIPE), it can be equally difficult to evaluate (Gillan, Lovrics, Halpern, Wiljer, & Harnett, 2011). This is largely because many research methodologies and methods cannot adequately capture the multitude of ways that clinicians access, exchange and co-create information and resources on evidence-based practices. Consider, for instance, conventional approaches to CE research whereby change in clinician knowledge and/or skill is gauged using surveys or clinical audits, devoid of a consideration of other determinants like organisational factors, professional identities or patient expectations (Clark, 2013; Grimshaw, Eccles, Walker, & Thomas, 2002; McMichael, 2000; Ockene & Zapka, 2000; Reeves et al., 2010; Suter et al., 2013). This in turn leaves CIPE at risk of remaining under-studied and poorly understood.

To address these practical and methodological issues, this article describes an approach to both facilitate and evaluate workplace learning in CIPE – namely, citizen social science. Citizen social science involves the examination of social phenomena, the systematic collection and analysis of related data, as well as the dissemination and translation of these activities to practice by researchers on a primarily avocational basis. More specifically, interprofessional clinicians are invited to become co-researchers within a community of practice in situ. While in the workplace, they are invited to exchange and critique knowledge to inform their own and each other's professional practices – they are also invited to reflect on, examine and address the factors that shape CIPE. Given their embeddedness within their respective organisations and their professions, inviting interprofessional clinicians to collaborate as co-researchers in this community of practice opens opportunities to detect nuances that may otherwise remain beyond the purview of the academic gaze – this in turn enhances the theoretical relevance of CIPE research.

As a methodology, citizen social science aims to both facilitate and investigate workplace learning in CIPE. Given the inherent complexity of CIPE, citizen social science requires an adaptable research method like social media, which also has the capacity to connect users to each other and capture revealing data in real-time.




This article initially offers a description of citizen social science and justifies its potential through reference to a similar effort – namely, Sermo.1 Using an example situated in the highly interprofessional context of the youth health sector, it then demonstrates how citizen social science can be operationalised using social media as a research method. The article then concludes by summarising the potential benefits of citizen social science for CIPE and related research.


Citizen social science

Although citizen social science is a relatively new methodology, it draws heavily from two established approaches – namely, crowdsourcing and citizen science. Both are described with reference to examples as well as lessons learnt from their use.

Crowdsourcing

Crowdsourcing is the process of "posing a question or problem to a large group of people to try to get to the best answer quickly" (TopRank® Online Marketing, 2013). According to Wexler (2011), it essentially involves the following five steps:

(1) recognition of a problem or opportunity that can be effectively and/or efficiently addressed by a group of individuals;
(2) an appeal to a group of individuals for contributions to address the problem or opportunity, guided by rules or expectations;
(3) the collection of contributions;
(4) the filtering of contributions, which may be performed by the crowdsourcer, the crowd and/or an external party; and
(5) determination of the future crowdsourcer–crowd relationship.

Although hardly a new concept (Dash, 1999), crowdsourcing is becoming synonymous with online activities given the opportunities afforded by technology, particularly for clinicians (Glasziou, 2012; Munro & Peacock, 2005). Individuals are sometimes recompensed for their contributions or have the opportunity to be recompensed through a prize – however, this is not a prerequisite; sometimes personal motivation suffices (Busarovs, 2011; Zheng, Li, & Hou, 2011).

Crowdsourcing is underpinned by three assumptions. First, it assumes "The crowd is . . . wiser than the individual" (Wesson, 2010, p. 109), and as such involving the masses will give rise to the "best answer" (Howe, 2006). Second, it assumes that, when conceived by a representative of the masses, this answer is likely to be deemed credible and be more palatable to the target audience (Wexler, 2011). Third, it assumes the process is efficient and effective for both those who source the crowd as well as those in the crowd – for instance, crowdsourcers have access to affordable labour, while the crowd can shape products, services and policies and be recompensed for their efforts (Behrend, Sharek, Meade, & Wiebe, 2011; Felstiner, 2011; Kee, 2009; Siddique, 2011; Van Buskirk, 2009).

In the private and not-for-profit sectors, crowdsourcing is proving to be particularly effective at engaging large groups of individuals. For example, the UNAIDS Secretariat's new youth-led policy project has used crowdsourcing to enable young people from around the world to develop recommendations to work more effectively with young people in the AIDS response (CrowdOutAIDS, 2011). Crowdsourcing may also be effective at optimising value for money, as its use can lower costs for advertisers (Fayolle, Basso, & Legrain, 2008; Hempel, 2006).

However, crowdsourcing can have its limitations. For example, an American grocery manufacturing and processing conglomerate ran a costly crowdsourcing exercise whereby the resulting name for a new product version was largely deemed unpalatable by the masses (Lee, 2009). Recent work may help to understand and potentially avoid such drawbacks. Afuah and Tucci (2012) argue that five conditions render crowdsourcing a helpful approach to problem-solving:

(1) the problem is easy to delineate and broadcast to the crowd, (2) the knowledge required to solve the problem falls outside the focal agent's knowledge neighborhood (requires distant search), (3) the crowd is large, with some members of the crowd motivated and knowledgeable enough to self-select and solve the problem, (4) the final solution is easy to evaluate and integrate into the focal agent's value chain, and (5) information technologies are low cost and pervasive in the environment that includes the focal agent and the crowd (p. 356).

Citizen science

When managed within a framework of scientific research, crowdsourcing becomes citizen science. This is the "systematic collection and analysis of data; development of technology; testing of natural phenomena; and the dissemination of these activities by researchers on a primarily avocational basis" (Open Scientist, 2011, para. 12). There are two key motivations for citizen science. The first is people power; for example, Galaxy Zoo is an online astronomy project that invites the public to explore and map the universe, and Foldit is an initiative of the University of Washington to gamify protein folding (Clery, 2011; Cooper et al., 2010; Hand, 2010), whereby "game-thinking and game mechanics [are used] to engage users and solve problems" (Zichermann & Cunningham, 2011, p. xiv). More specifically, online gamers are invited to fold proteins to determine how linear chains of amino acids curl with limited strain; solutions that receive the highest scores are then examined by scientists to determine relevance to bona fide proteins. Results to date are impressive and include treatment opportunities to combat the AIDS virus (Khatib et al., 2011).

The second key motivation for citizen science is engagement, whereby science is packaged and promoted as having relevance and practical value. Consider Operation Spider, a 12-month project that engaged students from low socio-economic schools in scientific endeavours for both educational and research purposes (Paige et al., 2012). More specifically, school and university staff worked collaboratively to actively involve students in ecological observations in their schoolyards and homes. The program fostered a "motivational context for learning science" (p. 20), which was associated with "a marked improvement in student behaviours".

Citizen social science

When combined, crowdsourcing and citizen science give rise to citizen social science. As noted, this methodology involves the examination of social phenomena, the systematic collection and analysis of related data, as well as the dissemination and translation of these activities to practice by researchers on a primarily avocational basis.2

1 A US-based, physician-only online social networking platform.
2 Similarly, crowdsourced health research studies are said to blend crowdsourcing and citizen science (Swan, 2012).


Citizen social science is particularly relevant to CIPE. This is because it has the capacity to overcome some of the practical and methodological issues noted earlier. More specifically, this approach invites a group of interprofessional clinicians to become co-researchers within a community of practice to "learn with, from, and about each other to improve collaboration and the quality of care" in situ – that is, within the workplace (Reeves, 2009, p. 143). While fulfilling their usual roles and responsibilities, it invites these co-researchers to share and critique different forms of knowledge, including evidence-based practices and experiential wisdom. Furthermore, it encourages them to reflect on, examine and address the factors that shape CIPE. As such, reflecting the demonstrated benefits of crowdsourcing and citizen science, this approach has the potential to: engage a group of interprofessional clinicians in interactive processes; draw on clinician expertise to elicit innovative solutions to potentially complex issues, efficiently and effectively; and optimise the relevance of these solutions. This is largely because citizen social science helps to connect CIPE to the workplace. It recognises CE as encompassing different forms of knowledge, and it recognises the need to connect different professionals with each other and with their workplace. This recognition bolsters the relevance and sustainability of CIPE as well as the robustness of related research.

Given the increasing use of social media by clinicians, citizen social science can be aptly facilitated by an online social networking platform (Bosslet, Torke, Hickman, Terry, & Helft, 2011). In addition to its ease of access for time-poor clinicians, such a platform provides the opportunity to investigate interprofessional exchanges in real-time – this includes an examination of the type of professionals who engage in dialogue, the regularity and nature of their exchanges, what and how they learn from each other, how their workplaces influence their capacity to change their clinical practices, the degree of influence on each other's practices and, ultimately, how interprofessional exchanges influence patient engagement, patient health and potentially public health.

Despite the demonstrated value of crowdsourcing and citizen science, the capacity of citizen social science to facilitate and evaluate workplace learning in CIPE is yet to be firmly established. Whether this methodology can engage qualified clinicians from different professions with each other and with their workplace remains to be seen. Also unknown are the conditions required to optimise the effectiveness of this approach. For instance, would knowledge of the expertise and status of fellow clinicians influence clinician communication; would junior clinicians be more active online in the absence of their senior counterparts? The potential of this approach was argued in the 1970s with the advent of the Delphi technique and its application to complex issues (Linstone & Turoff, 1975). This technique requires four elements – namely, "some feedback of individual contributions of information and knowledge; some assessment of the group judgment or view; some opportunity for individuals to revise views; and some degree of anonymity for the individual responses" (1975, p. 3, emphasis added). Using social media, the Delphi technique has recently been adapted to healthcare by the corporate firm Sermo.
As illustrated in the following section, the seeming success of this initiative reinforces the role that social media can play in facilitating and understanding workplace learning in CIPE – more specifically, it reinforces the role of social media in citizen social science. For this reason, the Sermo model and the ways it has managed the challenges of online clinical networks are described.

Social media in healthcare: the Sermo platform

Sermo, which is Latin for conversation, is a US-based, physician-only online social networking platform that "seeks to facilitate valuable conversations – the sharing of observations and knowledge – about healthcare and medical practices" (Bray, Croxson, Dutton, & Konsynski, 2008, p. 2). It endeavours to bring tacit knowledge to the fore by focusing on "highly contextualized, experienced-based knowledge surpassing taught verbatim". As such, Sermo represents a knowledge ecosystem – that is, a dynamic system that promotes knowledge flow into, and knowledge exchange within, the network, "with the net result that the whole community is continuously involved in the knowledge life cycle thus supporting the steady growth of the ecosystem" (Slavazza, Fonti, Ferraro, Biasuzzi, & Gilardoni, 2006, p. 367). This ecosystem is facilitated by survey posts proffered by physicians, some of whom are paid for their contributions, and by fee-paying companies that wish to tap into the collective wisdom of the clinical community.

Sermo is proving to be effective at engaging physicians, with some 700 new members per week (as at October 2007); 15–20% of these physicians participate weekly, with active participants averaging more than one hour of online activity per week. This might be partly explained by the ways Sermo has managed some of the challenges associated with online clinical networks. For instance, to balance the need for a closed community of experts with a need for diversity, Sermo allows physicians to vote on particular issues to optimise diverse opinion while ranking physician expertise based on their online contributions (March, 1991; Jansen, van den Bosch, & Volberda, 2006). Similarly, to balance the need for anonymity with that of disclosure, participants who are authenticated by Sermo are encouraged to maintain their anonymity; however, the level of self-disclosure is managed by participants (Clippinger, 1999; Wade-Benzoni, Tenbrunsel, & Bazerman, 1996). Furthermore, participants are asked to disclose all conflicts of interest when participating in the online community, and physician rankings help participants to verify the level of expertise of fellow participants.

The Sermo platform represents a novel knowledge ecosystem that has a demonstrated capacity to facilitate exchange and instil a sense of community among participants. Although these capacities are relevant to interprofessional learning processes (Reeves, 2009), the potential value of this model for workplace learning in CIPE and related research remains unknown, as does the potential value of a non-commercial model in which clinicians do not receive monetary gain. Given the evidence for crowdsourcing and citizen science, the increasing use of social media among clinicians and the importance of CE that connects different professions with each other and their workplace, these unanswered questions represent missed opportunities (Bosslet et al., 2011; Newton, Billett, & Ockerby, 2009; Reeves, 2009; Rosen, Hunt, Pronovost, Federowicz, & Weaver, 2012). These might be readily addressed by harnessing existing technologies. Using an example situated in the highly interprofessional context of the Australian youth health sector, one such approach is described in the following section to reveal how citizen social science might be used to facilitate and evaluate workplace learning in CIPE.
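To make the vote-based filtering and contributor ranking described above more concrete, the following minimal Python sketch aggregates votes on hypothetical posts into a simple per-contributor score. The data, field names and scoring rule are illustrative assumptions only; they are not Sermo's actual algorithm.

```python
from collections import defaultdict

# Hypothetical records of posts in an online clinical network: each post has
# an anonymised author, the author's profession, and the votes it received.
posts = [
    {"author": "clinician_01", "profession": "nursing", "votes": 12},
    {"author": "clinician_02", "profession": "general practice", "votes": 5},
    {"author": "clinician_01", "profession": "nursing", "votes": 3},
    {"author": "clinician_03", "profession": "social work", "votes": 8},
]

def rank_contributors(posts):
    """Order contributors by total votes received, an assumed proxy for peer-rated expertise."""
    scores = defaultdict(int)
    for post in posts:
        scores[post["author"]] += post["votes"]
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for author, score in rank_contributors(posts):
        print(f"{author}: {score} votes")
```

In practice, any such ranking would sit alongside the safeguards noted above, such as managed self-disclosure and declared conflicts of interest.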

Citizen social science: imagining its application in the youth health sector

In Australia, the youth health sector supports and promotes the well-being of young people aged 12–24 years inclusive (NSW Health, 2010). The sector largely includes government and not-for-profit organisations comprised of different professionals, all of which aim to promote the well-being of young people (NAYH, 2010b). They provide:

developmentally appropriate programs as well as multiple informal or 'soft' points of access to health and related services, acknowledging the varying needs, referral pathways and engagement preferences of young people. Examples . . . include counselling, health promotion, primary health care clinics, alcohol and other drug services, case management, arts based and drop in health services (NAYH, 2010a, p. 5).

To facilitate and evaluate workplace learning in CIPE within this interprofessional sector, citizen social science as a methodology could be facilitated by an existing online social networking platform as a research method. This would involve collaborating with clinicians as co-researchers in the collection and interpretation of research material pertaining to interprofessional collaboration and the quality of care delivered to young people in situ. For practical and financial reasons, this could be achieved with a platform like HAweb. HAweb is a virtual professional network and dashboard developed by the College of Dutch General Practitioners and the National General Practitioner Association (LHV & NHG, nd). It was specifically built to mediate interaction and knowledge development between healthcare professionals who deliver primary care and is available in both Dutch and English. Members can develop a profile, form a group, contribute to discussions, organise events, share documents and cooperate on wikis, among other activities. HAweb undergoes continuous improvement and can support the development of clinical guidelines. HAweb is built in the open-source content management system Drupal, which makes it relatively inexpensive and flexible to adapt – this includes the development of applications for smartphones and tablets. Furthermore, it is devoid of commercial interests to ensure the transparent and non-biased sharing of medical knowledge. Although clinicians may be relatively more familiar and comfortable with other platforms like Facebook, regular access might be hindered by firewalls, as is the case in many government services; additionally, data may inappropriately become the property of a third party.

In the context of workplace learning in CIPE, a platform like HAweb could facilitate: user-engagement, for example through the use of user-friendly functions found in well-known platforms like Facebook, which are therefore likely to be familiar to the co-researchers; user-connectedness, for instance by enabling users to communicate with individual users, user-groups and all users; information sharing, for example by enabling users to share different resources in different media; knowledge exchange, for instance by enabling users to share experiences, observations and perceptions; data collection, for example by collecting user-contributions, mapping platform-use and tracking platform-activity; and data management, for instance by enabling data to be categorised, stored and searched.

To demonstrate how a platform like HAweb might be used to facilitate and evaluate workplace learning in CIPE, the following four interrelated stages are presented.

First, clinicians working interprofessionally within the youth health sector would be invited to collaborate as co-researchers in this endeavour. This would involve active use of HAweb by computer, smartphone and/or tablet, particularly within the workplace, to: access and critique information on evidence-based practices; exchange perceptions of and experiences with healthcare delivery; identify sources of information and knowledge that shape the translation of evidence-based practices into patient care; and co-create resources to promote evidence-based practices. Given the limited workforce capacity of the youth health sector and the time required to cultivate a network, the time in which to extend this invitation to clinicians should ideally be generous (Drucker, 1992; NAYH, 2010b). To fulfil their roles as co-researchers, the clinicians would receive online training – this includes instructions on HAweb-use, examples, as well as responses to frequently asked questions. An online discussion board would also enable co-researchers to support each other as they familiarise themselves with the platform. As such, training would be a continuous rather than a discrete process.

Second, to understand some of the conditions that shape workplace learning in CIPE, the co-researchers would be allocated to one of several virtual environments. Although each would involve clinicians who represent different professions, a distinct feature, informed by the research reviewed earlier, would offer comparative value. Examples include the involvement of clinicians from similar or diverse workplaces, as indicated by governance arrangements, culture, workloads and/or leadership, among other factors; the presence or absence of existing working relationships; the involvement of both junior and senior clinicians or a relatively more homogenous group; as well as the concealment or disclosure of clinician expertise and status.

Third, the co-researchers would then be invited to create a personal profile for the purpose of collecting demographic information and, depending on the online environment to which they are allocated, reveal selected details to fellow co-researchers. They would also be invited to comment on information that presents evidence-based practices; share their experiences with and observations of evidence-based practices and youth healthcare; share related information and resources; as well as co-create resources to promote evidence-based youth healthcare. This would involve the use of user-friendly functions – for instance, a question-and-answer function like that of Stack Overflow (http://stackoverflow.com/) would enable clinicians to post queries about engaging with young people and delivering evidence-based youth healthcare. Fellow co-researchers would be invited to offer responses, noting any conflicts of interest, and to vote on all of the responses offered. During this process, responses that receive the most votes would rise to the top of the list, providing co-researchers with ready access to these without the need to review each response. This process would also help to identify co-researchers who regularly offer useful responses and are deemed to be most credible by fellow co-researchers. Similarly, using a social plugin like the Like Button (https://developers.facebook.com/docs/plugins/like-button/), clinicians would be invited to vote on new resources on evidence-based youth healthcare, with particular reference to content, wording and format. Another example is the use of a virtual whiteboard like Prezi (http://prezi.com/), which would enable clinicians to co-create resources that help to communicate evidence-based practices. They would be invited to adapt the language and presentation of existing resources and clinical guidelines to optimise perceived relevance and user-friendliness. Data would be collected and managed within HAweb and would include: co-researcher contributions, including responses to open and closed items on the perceived value of the platform and reasons for use; network maps among co-researchers; as well as platform-activity, including click-through rates. These data would be accessible to all co-researchers within a given virtual environment, who would be invited to synthesise, critique and interpret the data to help reveal the dynamic interplay that facilitates or, conversely, hinders workplace learning in CIPE.
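As a rough indication of how the network maps and platform-activity data listed above might be assembled for analysis, the following Python sketch uses the networkx library to build an interaction network from a hypothetical reply log and to summarise which professions exchange with one another. The log structure and field names are assumptions for illustration; they do not describe HAweb's actual data model.

```python
import networkx as nx
from collections import Counter

# Hypothetical interaction log: who replied to whom, and each clinician's profession.
replies = [
    ("clinician_01", "clinician_02"),
    ("clinician_02", "clinician_03"),
    ("clinician_01", "clinician_03"),
    ("clinician_03", "clinician_01"),
]
professions = {
    "clinician_01": "youth worker",
    "clinician_02": "general practitioner",
    "clinician_03": "psychologist",
}

# Build an undirected network map of co-researcher exchanges, weighting repeated contact.
G = nx.Graph()
for source, target in replies:
    if G.has_edge(source, target):
        G[source][target]["weight"] += 1
    else:
        G.add_edge(source, target, weight=1)

# Simple engagement indicators: who is most connected, and which
# profession pairs exchange most often.
centrality = nx.degree_centrality(G)
pair_counts = Counter(
    tuple(sorted((professions[a], professions[b]))) for a, b in replies
)

print("Degree centrality:", centrality)
print("Exchanges by profession pair:", pair_counts)
```

Such summaries could then be returned to the co-researchers within each virtual environment for the collective interpretation described above.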
Fourth, to further current debates on CIPE, these complementary datasets, comprised of qualitative and quantitative data, would be triangulated to understand key areas (Denzin, 2012). These would potentially include an examination of: how "matured" uses of theory help to explain workplace learning in CIPE (Reeves & Hean, 2013, p. 2); the development and activity of communities of practice, both on- and offline (Kilbride, Perry, Flatley, Turner, & Meyer, 2011); how knowledge intermediation – that is, the interaction between multiple forms and sources of knowledge – might be facilitated (Davies, Nutley, & Walter, 2008); the organisational factors, including power dynamics, that shape workplace learning in CIPE (Baker, Egan-Lee, Martimianakis, & Reeves, 2011); the relationship between workplace learning in CIPE and institutional change (Clark, 2013); and how workplace learning in CIPE gives rise to clinician consensus.

Further to these theoretical areas, the use of citizen social science also unveils opportunities to explore methodological areas – for instance: how clinicians who do not typically engage with CIPE might participate as co-researchers; how research integrity might be ensured when collaborating with clinicians who have limited research experience; how the acceptance and/or use of resources co-created by the co-researchers might be enhanced among others, including the wider community of clinicians as well as training and accreditation bodies; and how co-researcher contributions to CIPE might be duly acknowledged.
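A minimal sketch of how the triangulation described in this fourth stage might look in practice is offered below, assuming the analysis were done in Python with pandas: hypothetical quantitative platform metrics are joined with hypothetical qualitative theme codes so that the two strands can be read against each other. All column names and values are illustrative.

```python
import pandas as pd

# Hypothetical quantitative platform metrics per virtual environment.
metrics = pd.DataFrame({
    "environment": ["A", "B"],
    "weekly_posts": [34, 12],
    "click_through_rate": [0.42, 0.18],
})

# Hypothetical qualitative themes coded from co-researcher contributions.
themes = pd.DataFrame({
    "environment": ["A", "A", "B"],
    "theme": ["peer support", "time pressure", "time pressure"],
})

# Count how often each theme appears per environment, then join with the metrics
# so the qualitative and quantitative strands can be inspected side by side.
theme_counts = (
    themes.groupby(["environment", "theme"]).size().reset_index(name="mentions")
)
triangulated = theme_counts.merge(metrics, on="environment")
print(triangulated)
```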

Concluding comments

Limited opportunities for CIPE, particularly within the workplace, can have considerable implications for patient care and the use of limited healthcare resources (Institute of Medicine, 2010). However, workplace learning in CIPE is vexed by both practical and methodological issues. It can be difficult to facilitate among time-poor, under-resourced clinicians, particularly if CE is deemed irrelevant to workplace practices. It can be equally difficult to design and conduct research that has the elasticity required to accommodate the complexity of health services and interprofessional dynamics.

Guided by crowdsourcing and citizen science, citizen social science draws on the collective expertise of clinicians who represent diverse professions. It invites clinicians to collaborate as co-researchers to learn with, from and about each other and to enhance CE resources. Furthermore, when operationalised by an existing social media platform, it can offer immediacy and value-for-money. Akin to just-in-time medicine, clinicians have the opportunity to use existing technology, including computers, smartphones and tablets, to participate if, when and how they prefer (Chueh & Barnett, 1997; McGowan, Hogg, Campbell, & Rowan, 2008). In addition to recognising the unpredictable nature of service delivery, this approach aligns with adult learning principles, which acknowledge the importance of self-directed learning (Kaufman, 2003; McNeil, Hughes, Toohey, & Dowton, 2006). Furthermore, open-source software that can be readily adapted to suit clinician needs and preferences is likely to be attractive to "cash-strapped" health services (Aston, 2012; Boyce, 2010).

Declaration of interest

The author reports no declarations of interest. The author alone is responsible for the writing and content of this article.

References

Afuah, A., & Tucci, C.L. (2012). Crowdsourcing as a solution to distant search. Academy of Management Review, 37, 355–375.
Aston, S.A. (2012). Sustainable health care: How can you help? InnovAiT, 5, 633–636.
Baker, L., Egan-Lee, E., Martimianakis, M.A., & Reeves, S. (2011). Relationships of power: Implications for interprofessional education. Journal of Interprofessional Care, 25, 98–104.
Behrend, T.S., Sharek, D.J., Meade, A.W., & Wiebe, E.N. (2011). The viability of crowdsourcing for survey research. Behavior Research Methods, 43, 800–813.
Bosslet, G.T., Torke, A.M., Hickman, S.E., Terry, C.L., & Helft, P.R. (2011). The patient-doctor relationship and online social networks: Results of a national survey. Journal of General Internal Medicine, 26, 1168–1174.
Boyce, R.A. (2010). Thriving in the cash strapped organisation. In R. Jones & F. Jenkins (Eds.), Managing money, measurement and marketing in the allied health professions (pp. 52–62). Oxon, OX: Radcliffe Publishing.
Bray, D., Croxson, K., Dutton, W., & Konsynski, B. (2008). Sermo: A community-based knowledge ecosystem. Paper presented at the OII (Oxford Internet Institute) distributed problem-solving networks conference, Oxford. Retrieved from: http://ssrn.com/abstract=1016483 or http://dx.doi.org/10.2139/ssrn.1016483 [last accessed 20 December 2013].
Busarovs, A. (2011). Crowdsourcing as user-driven innovation, new business philosophy's model. Journal of Business Management, 4, 53–60.
Chueh, H., & Barnett, G.O. (1997). "Just-in-time" clinical information. Academic Medicine, 72, 512–517.
Clark, P.G. (2013). Toward a transtheoretical model of interprofessional education: Stages, processes and forces supporting institutional change. Journal of Interprofessional Care, 27, 43–49.
Clery, D. (2011). Galaxy evolution. Galaxy zoo volunteers share pain and glory of research. Science, 333, 173–175.
Clippinger, J. (Ed.). (1999). The biology of business: Decoding the natural laws of enterprise. San Francisco, CA: Jossey-Bass.
Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., Leaver-Fay, A., et al. (2010). Predicting protein structures with a multiplayer online game. Nature, 466, 756–760.
CrowdOutAIDS. (2011). About CrowdOut AIDS. Retrieved from: http://www.crowdoutaids.org/wordpress/about [last accessed 1 June 2012].
Dash, J. (1999). The longitude prize: The race between the moon and the watch-machine. New York, NY: Farrar, Straus and Giroux.
Davies, H., Nutley, S., & Walter, I. (2008). Why 'knowledge transfer' is misconceived for applied social research. Journal of Health Services Research & Policy, 13, 188–190.
Denzin, N.K. (2012). Triangulation 2.0. Journal of Mixed Methods Research, 6, 80–88.
Drucker, P. (1992). The new society of organizations. Harvard Business Review, 70, 95–104.
Fayolle, A., Basso, O., & Legrain, T. (2008). Corporate culture and values: Genesis and success of L'Oreal's entrepreneurial orientation. Journal of Small Business and Entrepreneurship, 21, 215–229.
Felstiner, A. (2011). Working the crowd: Employment and labor law in the crowdsourcing industry. Berkeley Journal of Employment & Labor Law, 32, 143–214.
Gillan, C., Lovrics, E., Halpern, E., Wiljer, D., & Harnett, N. (2011). The evaluation of learner outcomes in interprofessional continuing education: A literature review and an analysis of survey instruments. Medical Teacher, 33, e461–e470.
Glasziou, P. (2012). Health technology assessment: An evidence-based medicine perspective. Medical Decision Making, 32, E20–E24.
Grimshaw, J.M., Eccles, M.P., Walker, A.E., & Thomas, R.E. (2002). Changing physicians' behavior: What works and thoughts on getting more things to work. Journal of Continuing Education in the Health Professions, 22, 237–243.
Hand, E. (2010). People power. Nature, 466, 685–687.
Hempel, J. (2006). Crowdsourcing: Milk the masses for inspiration. Business Week, 9/25, 38–39.
Howe, J. (2006). Crowdsourcing: A definition. Retrieved from: http://crowdsourcing.typepad.com/cs/2006/06/crowdsourcing_a.html [last accessed 1 May 2011].
ICSC (Interprofessional Care Steering Committee). (2007). Interprofessional care: A blueprint for action in Ontario. Toronto, ON: HealthForceOntario. Retrieved from: http://www.healthforceontario.ca/upload/en/whatishfo/ipc%20blueprint%20final.pdf [last accessed 20 December 2013].
Institute of Medicine. (2010). Redesigning continuing education in the health professions. Washington, DC: National Academies Press.
Jansen, J., van den Bosch, F., & Volberda, H. (2006). Exploratory innovation, exploitative innovation, and performance: Effects of organizational antecedents and environmental moderators. Management Science, 52, 1661–1674.
Kaufman, D.M. (2003). Applying educational theory in practice. British Medical Journal, 326, 213–216.
Kee, T. (2009). Will crowd-sourcing ruin James Patterson's best-seller streak? Retrieved from: http://paidcontent.org/tech/419-is-james-pattersons-crowd-sourced-book-the-future-of-publishing/ [last accessed 1 May 2012].
Khatib, F., DiMaio, F., Foldit Contenders Group, Foldit Void Crushers Group, Cooper, S., Kazmierczyk, M., Gilski, M., et al. (2011). Crystal structure of a monomeric retroviral protease solved by protein folding game players. Nature Structural & Molecular Biology, 18, 1175–1177.
Kilbride, C., Perry, L., Flatley, M., Turner, E., & Meyer, J. (2011). Developing theory and practice: Creation of a community of practice through action research produced excellence in stroke care. Journal of Interprofessional Care, 25, 91–97.
Lee, J. (2009). Unhappy little Vegemites vent their fury over iSnack 2.0. Retrieved from: http://www.theage.com.au/business/media-and-marketing/unhappy-little-vegemites-vent-their-fury-over-isnack-2020090928-g997.html [last accessed 12 May 2012].
LHV (Landelijke Huisartsen Vereniging) & NHG (Nederlands Huisartsen Genootschap). (nd). Over HAweb. Retrieved from: https://haweb.nl/
Linstone, H.A., & Turoff, M. (Eds.). (1975). Delphi method: Techniques and applications. Boston, MA: Addison-Wesley.
March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2, 71–87.
McGowan, J., Hogg, W., Campbell, C., & Rowan, M. (2008). Just-in-time information improved decision-making in primary care: A randomized controlled trial. PLoS One, 3, e3785.
McMichael, A. (2000). Professional identity and continuing education: A study of social workers in hospital settings. Social Work Education, 19, 175–183.
McNeil, H.P., Hughes, C.S., Toohey, S.M., & Dowton, B. (2006). An innovative outcomes-based medical education program built on adult learning principles. Medical Teacher, 28, 527–534.
Munro, K.M., & Peacock, S. (2005). Improving access to learning in the workplace using technology in an accredited course. Nurse Education in Practice, 5, 117–126.
NAYH (NSW Association for Youth Health). (2010a). Best practice health service delivery in youth health. Sydney, NSW: NAYH (NSW Association for Youth Health).
NAYH (NSW Association for Youth Health). (2010b). NAYH youth health sector survey: Summary of results. Sydney, NSW: NAYH (NSW Association for Youth Health).
Neville, K.A. (2010). Continuing education reform: Are we throwing the baby out with the bathwater? Clinical Pharmacology & Therapeutics, 87, 385–388.
Newton, J.M., Billett, S., & Ockerby, C.M. (2009). Journeying through clinical placements – An examination of six student cases. Nurse Education Today, 29, 630–634.
NSW Health. (2010). NSW youth health policy 2011–2016: Healthy bodies, healthy minds, vibrant futures. Sydney, NSW: NSW Health.
Ockene, J.K., & Zapka, J.G. (2000). Provider education to promote implementation of clinical practice guidelines. Chest, 118, 33S–39S.
Open Scientist. (2011). Finalizing a definition of "citizen science" and "citizen scientists". Retrieved from: http://www.openscientist.org/2011/09/finalizing-definition-of-citizen.html [last accessed 1 May 2012].
Paige, K., Lloyd, D., Zeegers, Y., Roetman, P., Daniels, C., Hoekman, B., Linnell, L., et al. (2012). Connecting teachers and students to the natural world through Operation Spider: An aspirations citizen science project. Teaching Science, 58, 13–20.
Phelan, A.M., Barlow, C.A., & Iversen, S. (2006). Occasioning learning in the workplace: The case of interprofessional peer collaboration. Journal of Interprofessional Care, 20, 415–424.


Reeves, S. (2009). An overview of continuing interprofessional education. Journal of Continuing Education in the Health Professions, 29, 142–146.
Reeves, S. (2012). The rise and rise of interprofessional competence. Journal of Interprofessional Care, 26, 253–255.
Reeves, S., & Hean, S. (2013). Why we need theory to help us better understand the nature of interprofessional education, practice and care. Journal of Interprofessional Care, 27, 1–3.
Reeves, S., Zwarenstein, M., Goldman, J., Barr, H., Freeth, D., Koppel, I., & Hammick, M. (2010). The effectiveness of interprofessional education: Key findings from a new systematic review. Journal of Interprofessional Care, 24, 230–241.
Rosen, M.A., Hunt, E.A., Pronovost, P.J., Federowicz, M.A., & Weaver, S.J. (2012). In situ simulation in continuing education for the health care professions: A systematic review. Journal of Continuing Education in the Health Professions, 32, 243–254.
Siddique, H. (2011). Mob rule: Iceland crowdsources its next constitution. Retrieved from: http://www.guardian.co.uk/world/2011/jun/09/iceland-crowdsourcing-constitution-facebook [last accessed 1 May 2012].
Slavazza, P., Fonti, R., Ferraro, M., Biasuzzi, C., & Gilardoni, L. (2006). Towards a knowledge ecosystem. In S. Staab & V. Svátek (Eds.), Managing knowledge in a world of networks (Vol. 4248, pp. 366–380). Berlin: Springer-Verlag.
Suter, E., Goldman, J., Martimianakis, T., Chatalalsingh, C., DeMatteo, D.J., & Reeves, S. (2013). The use of systems and organizational theories in the interprofessional field: Findings from a scoping review. Journal of Interprofessional Care, 27, 57–64.
Swan, M. (2012). Crowdsourced health research studies: An important emerging complement to clinical trials in the public health research ecosystem. Journal of Medical Internet Research, 14, e46. doi:10.2196/jmir.1988.
TopRank® Online Marketing. (2013). Social media marketing glossary of terms [Website]. Retrieved from: http://www.toprankmarketing.com/resources/social-media-marketing-glossary/ [last accessed 20 December 2013].
Van Buskirk, E. (2009). Sneak preview: A fantastic new way to find hot music. Retrieved from: http://www.wired.com/business/2009/04/playable-music [last accessed 12 May 2012].
Wade-Benzoni, K., Tenbrunsel, A., & Bazerman, M. (1996). Egocentric interpretations of fairness in asymmetric, environmental social dilemmas: Explaining harvesting behavior and the role of communication. Organizational Behavior and Human Decision Processes, 67, 111–126.
Wesson, R. (2010). Information security, ensembles of experts. In R. Ragaini (Ed.), International seminar on nuclear war and planetary emergencies: 42nd session (pp. 109–114). Singapore: World Scientific Publishing.
Wexler, M.N. (2011). Reconfiguring the sociology of the crowd: Exploring crowdsourcing. International Journal of Sociology and Social Policy, 31, 6–20.
Zheng, H., Li, D., & Hou, W. (2011). Task design, motivation, and participation in crowdsourcing contests. International Journal of Electronic Commerce, 15, 57–88.
Zichermann, G., & Cunningham, C. (2011). Gamification by design: Implementing game mechanics in web and mobile apps. Sebastopol, CA: O'Reilly Media.
