Research

Crowdsourcing Can Deliver Real Time Support for Physicians

Apr. 29, 2014

A new study shows that physicians can successfully harness the power of crowdsourcing to help diagnose and treat patients in real time. The pilot project, the results of which appear in the Journal of Hospital Medicine, could help providers make more informed decisions and improve the quality of care.

The findings were the result of an eight-month field test of a mobile software application involving 85 health care providers with UR Medicine. The project was led by Marc Halterman, M.D., Ph.D. and Max Sims with the Department of Neurology, and Jeffrey Bigham, Ph.D. and Henry Kautz, Ph.D. with the Department of Computer Science.

While crowdsourcing has been successfully employed in a range of scientific endeavors – searching for extraterrestrial life, tracking bird migrations, folding proteins, and mapping the location of every tree in Britain – significant practical barriers have so far prevented its application to clinical medicine.

Several practical barriers keep clinicians from problem-solving at the point of care. Physicians are often busy and don't have time to stop what they are doing to find a colleague with the appropriate expertise. And in some instances, physicians may shy away from asking for help if they think it will reflect poorly on their medical skills and knowledge.

The team developed an application called DocCHIRP for mobile (iOS and Android) and desktop use. Devices using the software were encrypted and password protected. The program allowed a provider to send inquiries to individuals or groups of physicians and nurses who were part of the 85-person "crowd." The questions could require either a written or an agree/disagree response.
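The query model described above – a provider addressing a question to individuals or groups, with either a free-text or an agree/disagree answer – can be sketched as a minimal data structure. This is purely illustrative; the class names, fields, and methods below are assumptions for the sake of the sketch, not part of the actual DocCHIRP software.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class ResponseType(Enum):
    # The study's questions could require either a written answer
    # or a simple agree/disagree vote.
    FREE_TEXT = "free_text"
    AGREE_DISAGREE = "agree_disagree"

@dataclass
class Response:
    responder_id: str
    body: str            # free text, or "agree"/"disagree"
    minutes_elapsed: float  # minutes after the query was posted

@dataclass
class Query:
    author_id: str
    question: str
    response_type: ResponseType
    recipients: List[str]  # individual providers or a whole group
    responses: List[Response] = field(default_factory=list)

    def add_response(self, response: Response) -> None:
        # Only providers the query was addressed to may answer.
        if response.responder_id not in self.recipients:
            raise ValueError("responder was not among the query's recipients")
        self.responses.append(response)

    def first_response_minutes(self) -> Optional[float]:
        # Time to first response -- the latency metric the study reports
        # (median of 19 minutes across queries).
        if not self.responses:
            return None
        return min(r.minutes_elapsed for r in self.responses)
```

A query is created, routed to its recipients, and the time to the first arriving answer is what the study measured as response latency.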

The most common inquiries related to the effective use of medication (e.g., common side effects), navigating complex medical decisions, guidance regarding standard of care, the selection and interpretation of diagnostic tests, and patient referrals. The fastest response arrived in 4 minutes, and the median time to first response was 19 minutes.

Although physicians have at their disposal a wealth of resources for researching medical questions, the authors found that many providers feel the opinion and guidance of trusted peers is just as valuable, if not more so.

The team is developing a new crowdsourcing platform with the goal of involving a larger cadre of participating providers – which could improve response times and help minimize unnecessary interruptions – and allowing doctors to send and receive information anonymously when necessary.