Tuesday, January 29, 2013

Implementing the Learning Healthcare System Can Be Facilitated Using the Principles of Evidence-Based Medicine

The enthusiasm for big data, and for the use of analytics and business intelligence with that data, is reaching fever pitch. I share that enthusiasm, but also know from both my clinical and my informatics experience that knowledge will not emanate just by turning on the data spigot from the growing number of electronic health record (EHR) systems now in operational use. However, if we approach the problem properly, I believe we can achieve the goals of the learning healthcare system as eloquently laid out in various reports from the Institute of Medicine (IOM) [1, 2].

One sensible approach was published recently in Annals of Internal Medicine [3]. The authors were from Group Health Cooperative in Seattle, a leader in the use of data and information systems to improve the quality and outcomes of care. The paper is summarized well by a figure that shows a continuous cycle of design-implementation-evaluation-adjustment of improved care, with interaction with the external environment through scanning for identification of problems and solutions and dissemination to share what has been learned in their setting.

A complementary approach to learning from EHR and other clinical data can be to apply the basic approach of evidence-based medicine (EBM) [4]. In some ways, EBM is antagonistic to EHR data analytics, with the former giving the most value to evidence from controlled experiments, especially randomized controlled trials (RCTs), while the latter makes use of real-world observational data that may be incomplete, incorrect, and inconsistent.

But I maintain that we can look to the process of EBM to guide us in how to best approach the "evidence" of EHR data analytics and the learning health system. EBM is not just about finding RCTs. Rather, it uses a principled approach to find and apply the best evidence to make clinical decisions. In particular, EBM done most effectively uses four steps:
  1. Ask an answerable question
  2. Find the best evidence
  3. Critically appraise the evidence
  4. Apply it to the patient situation
When I teach EBM, I emphasize that the first step of asking an answerable question may be the most important. It is not enough, for example, to ask if a test or treatment works. Rather, we need to know at a minimum whether it works relative to some alternative approach in a particular patient population or setting. This same approach is obviously necessary in the learning health system. Just as RCTs do not inform us passively, neither will EHR data analytics approaches.

In the second step, the principle from EBM is very much the same, even if the techniques of obtaining evidence are very different. The "evidence" in the case of the learning health system is the data in EHR and other systems that, as noted above, may be incomplete, incorrect, and inconsistent. We therefore need to determine if we have the proper data and, if so, whether it can be applied to answer our question.

For the third step, just as with EBM, we must critically appraise our evidence. Can we trust the inferences and conclusions from the data? Are there confounding variables of which we may not be aware? This may be critical with EHR data, where assignment of cause and effect can be difficult, if not impossible. The solution likely comes back to asking the right question, i.e., one in whose answer we can have confidence.

Finally, we have to ask, can the data be applied in our setting? Just as some RCTs answer questions in patient populations very different from those of the clinician making decisions, we must ascertain whether the results obtained from this approach can be applied to a specific patient or setting.

The growing quantity of clinical data in operational clinical systems provides a foundation for the learning healthcare system. However, we must approach the questions we ask and how we answer them with caution and a sound methodology. The approach of EBM offers a framework for carrying out this very different but complementary work.

References

1. Eden J, Wheatley B, McNeil B, and Sox H, eds., Knowing What Works in Health Care: A Roadmap for the Nation. 2008, Washington, DC: National Academies Press.
2. Smith M, Saunders R, Stuckhardt L, and McGinnis JM, Best Care at Lower Cost: The Path to Continuously Learning Health Care in America. 2012, Washington, DC: National Academies Press.
3. Greene SM, Reid RJ, and Larson EB, Implementing the learning health system: from concept to action. Annals of Internal Medicine, 2012. 157: 207-210.
4. Straus SE, Glasziou P, Richardson WS, and Haynes RB, Evidence-Based Medicine: How to Practice and Teach It, 4th ed. 2010, New York, NY: Churchill Livingstone.

Sunday, January 27, 2013

The Health IT "Grand Experiment": Mid-Study Check-Up

It seems that whenever there is something negative about health information technology (HIT) in the popular press, I get emails from people inside and outside the field, asking "what's wrong?" A case in point is a recent article in the New York Times [1], reporting on a "negative" point of view from two RAND Corp. researchers that was published in the journal Health Affairs [2]. One of the interesting twists of the Health Affairs piece was that it was written by researchers from RAND, the same organization that published a modeling study in 2005 concluding that investment in HIT could provide potential annual savings to the healthcare system of $142-$371 billion [3]. About the same time, another model-based study from the Center for Information Technology Leadership (CITL) found similar potential savings [4]. These findings in part motivated the Health Information Technology for Economic and Clinical Health (HITECH) Act, the program from the American Recovery and Reinvestment Act (ARRA) of 2009, also known as the economic stimulus bill, that invested up to $29 billion in the adoption and "meaningful use" of the electronic health record (EHR) and other HIT [5].

What can we conclude from this recent publication and reporting about it in the popular press? As always, it is best to look at exactly what has been claimed, what evidence supports it, and where it fits in the larger picture of this topic.

The 2005 RAND study modeled savings that could occur from HIT adoption [3]:
  • Reduced adverse drug events that extend hospital length of stay in the inpatient setting and avoid hospitalization in the outpatient setting
  • Increased use of cost-effective immunizations and screening interventions
  • Improved efficiency of chronic disease management
Their model, however, noted that HIT adoption alone would not be enough; also required would be "interconnected and interoperable systems" that were "adopted widely" and "used effectively." This had an implicit assumption of change in the healthcare delivery system away from payment for volume toward payment for value. The paper describing this work was published in Health Affairs, along with several dissenting views [6-8]. An analysis by the Congressional Budget Office also took issue with the conclusions [9].

The CITL study used a somewhat similar modeling approach and drew similar conclusions. The CITL model focused on different types of health information exchange (HIE), from simple transmission of documents to full semantic interoperability of EHR systems. The latter approach was shown to achieve the most benefit, up to $77 billion per year.

Can we assess the correctness of these modeling studies, now that we have substantially increased EHR adoption through HITECH? The recent paper from RAND noted that the question is not simple to answer, but that HIT probably has fallen short of its promises, especially in terms of reducing costs [2]. Of course, one of the challenges in answering the question of cost reduction is that it is difficult to attribute avoidable cost in the healthcare system. We do know that the rate of growth of healthcare costs has slowed in the last few years, probably mainly due to the economic recession [10]. But we cannot know for sure how much of that slowing might be due to HIT adoption.

But an even bigger reason why we cannot know if the modeling studies are true is that we have not achieved the kind of HIT environment that these studies assumed in the development of their models. The original RAND study assumed, as noted above, interconnected and interoperable systems that were adopted widely and used effectively. The authors of the new RAND paper note that HIT has fallen short in large part because of failure to realize those assumptions. In particular,
  1. We do not have interconnected and interoperable systems. In part, this is because many EHR systems are still closed and proprietary. In addition, HIE efforts are still nascent.
  2. We also do not have wide adoption yet of systems, especially advanced systems. While HITECH has led to increased adoption, there is still a long way to go.
  3. And probably the biggest shortcoming has been lack of EHRs being used effectively. The adoption incentives in Stage 1 of meaningful use focus (by design) on building the data foundation. More effective use will come based on that foundation in Stage 2 and beyond.
The RAND authors conclude that the potential of HIT to reduce costs is still very real, but that interoperability, patient-centeredness, and usability must be prioritized.

Therefore my view echoes that of the RAND researchers in the new Health Affairs piece: yes, HIT has not yet delivered on its promise to improve efficiency and reduce cost in the healthcare system. But neither has it been established that HIT is inherently unable to do so. As such, if we hope for that improvement, the grand experiment should go on. There is no question that the required time will be longer, the resources required will be larger, and the cultural change will be more difficult. There is also quite valid concern about some unintended consequences of the staged approach in HITECH, which may be locking clinicians and hospitals into monolithic systems that are difficult to use and expand. I sympathize with the notion of current market-leader systems locking us into an "EHR trap," where the EHR should not be a monolithic application but instead a platform on top of which we can build apps that provide innovative functions and/or make new use of the data [11].

Over the last few years, I have ended many a talk on informatics noting that a "grand experiment" in our field was taking place, with the complete results likely years away. This study can be viewed as a mid-study assessment, and we can conclude that the benefits have not yet accrued, but that it may be too early to conclude that they will not occur. Although I agree that we probably need some mid-course correction in our approach, I also argue that we cannot go back, nor should we end the experiment prematurely. We also must remember the motivation for implementing HIT and reforming healthcare in the first place: the error-prone and financially dysfunctional existing system, which both undermines the global competitiveness of US companies through high employee healthcare costs and threatens to bankrupt the US government through unsustainable Medicare cost increases.

References

1. Abelson R and Creswell J, In Second Look, Few Savings From Digital Health Records, New York Times. January 10, 2013. http://www.nytimes.com/2013/01/11/business/electronic-records-systems-have-not-reduced-health-costs-report-says.html.
2. Kellermann AL and Jones SS, What will it take to achieve the as-yet-unfulfilled promises of health information technology? Health Affairs, 2013. 32: 63-68.
3. Hillestad R, Bigelow J, Bower A, Girosi F, Meili R, Scoville R, et al., Can electronic medical record systems transform health care? Health Affairs, 2005. 24: 1103-1117.
4. Pan E, Johnston D, Walker J, Adler-Milstein J, Bates DW, and Middleton B, The Value of Healthcare Information Exchange and Interoperability. 2004, Center for Information Technology Leadership: Boston, MA.
5. Blumenthal D, Launching HITECH. New England Journal of Medicine, 2010. 362: 382-385.
6. Himmelstein DU and Woolhandler S, Hope and hype: predicting the impact of electronic medical records. Health Affairs, 2005. 24: 1121-1123.
7. Goodman C, Savings in electronic medical record systems? Do it for the quality. Health Affairs, 2005. 24: 1124-1126.
8. Walker JM, Electronic medical records and health care transformation. Health Affairs, 2005. 24: 1118-1120.
9. Orszag P, Evidence on the Costs and Benefits of Health Information Technology. 2008, Congressional Budget Office: Washington, DC, http://www.cbo.gov/ftpdocs/91xx/doc9168/05-20-HealthIT.pdf.
10. Hartman M, Martin AB, Benson J, and Catlin A, National health spending in 2011: overall growth remains low, but some payers and services show signs of acceleration. Health Affairs, 2013. 32: 87-99.
11. Mandl KD and Kohane IS, Escaping the EHR trap--the future of health IT. New England Journal of Medicine, 2012. 366: 2240-2242.

Thursday, January 17, 2013

What Do Twenty-First Century Healthcare Professional Students Need to Learn About Informatics?

I am increasingly involved in efforts to determine the content and competencies in informatics for 21st century clinicians. These include not only medical students at Oregon Health & Science University (OHSU) but also students in other healthcare professions, such as nursing, physical/occupational therapy, and nutrition, at OHSU and some other health science universities.

This effort is congruent with the growing push for interprofessional education. The rationale behind interprofessional education is that if the healthcare system is to embrace the vision of team-based, coordinated care, then clinicians of the future must have at least some of their education together. The 21st century clinician needs to understand that the care they provide will be monitored for quality, safety, and cost. This will hopefully be done in a constructive and self-improving way, while also making sure that mistakes and waste are not swept under the proverbial rug of the paper chart (or not documented at all).

Informatics can be viewed as the ultimate interprofessional activity. There is really very little about informatics that is specific to any healthcare professional. Yes, physicians, nurses, and others need to learn the informatics applications specific to their work. But the underlying concepts of informatics, i.e., the use of information to improve health and healthcare, really apply to any healthcare professional (not to mention the patient, the researcher, and others!).

So what does the 21st century clinician need to know about informatics? Rather than provide a list, I will explain my thoughts in narrative form, highlighting the key informatics concepts that might later be distilled into such a list.

A first critical concept is that informatics is not the same as computer literacy. Computer literacy is one of many requirements to use informatics successfully, but knowing how to use a computing device (PC, tablet, or smartphone) is not the same as having skills in informatics, i.e., using that device to improve health, healthcare, public health, or research.

Certainly one fundamental skill for 21st century clinicians is something we began teaching in the late 20th century, which is how to find information to apply to patient care. This is not just knowing what terms to enter into a search engine, but the whole process of asking answerable questions, finding information to answer them, critically appraising that information, and applying it to patients (or populations). The skills of the 21st century clinician start, not stop, with the knowledge of how to enter simple queries into Google or PubMed. The 21st century clinician should be a power searcher, a skill we often associate with librarians or informaticians. Not that there are no roles for librarians and informaticians, but they should be more teachers and consultants when it comes to finding information.

Starting from the beginning, the 21st century clinician must be skilled in information retrieval, what some might term search and others knowledge management. Whatever we call it, the 21st century clinician must know how to formulate a clinical question as an answerable one, and then be able to select the appropriate resource and make optimal use of it. This use needs to include knowing what content is in different search systems. This clinician must know that Google indexes almost all pages on the "visible" Web but not the part that is hidden from its crawlers, while PubMed is a bibliographic database that indexes the biomedical and clinical journal literature. The 21st century clinician must know about specialized resources such as the AHRQ National Guideline Clearinghouse and the CDC Travelers' Health site. He or she should also have an understanding of the major commercial publishers as well as what their professional societies offer. Once they know how to use a search site, they must be able to phrase an appropriate query. Even a site like Google, with its ultra-simple interface, has additional features that can greatly enhance its retrieval capabilities. PubMed has a myriad of features of great value to clinicians, probably the most important being the Clinical Queries interface that helps focus the content of the search on more evidence-based articles. But it also offers much more.
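Clinical Queries illustrates a general pattern: take the clinician's question and AND onto it a filter that favors stronger study designs. A minimal sketch of that pattern, built against NCBI's public E-utilities search endpoint, follows. The helper function and the simplified evidence filter are my own illustrations; the real Clinical Queries filter strings are considerably more elaborate:

```python
from urllib.parse import urlencode

# Base URL for NCBI's E-utilities ESearch endpoint (a real, public API).
EUTILS_ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_query_url(question_terms: str,
                     evidence_filter: str = "randomized controlled trial[Publication Type]") -> str:
    """Build a PubMed search URL that ANDs an evidence filter onto a
    clinical question, in the spirit of the Clinical Queries interface.
    The default filter is a deliberate simplification for illustration."""
    term = f"({question_terms}) AND {evidence_filter}"
    return EUTILS_ESEARCH + "?" + urlencode({"db": "pubmed",
                                             "term": term,
                                             "retmax": 20})

url = pubmed_query_url("atrial fibrillation AND warfarin AND stroke prevention")
```

Fetching that URL returns an XML list of matching PubMed identifiers; the point of the sketch is simply that a well-formed question plus an explicit evidence filter is what turns a search box into an EBM tool.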

Finally, once information is retrieved, the 21st century clinician must know how to critically appraise it and apply it to the patient (or population). As with searching, the type of appraisal varies with the search engine used. With output from general search engines like Google, the clinician must be able to assess the trustworthiness of the information. Google's approach of ranking pages by the number of other pages that link to them actually does a pretty good job of promoting reputable sites to the top of the output. But it is not perfect, and the user must be discriminating. (Back in the 1990s, we used to teach clinicians to avoid using general search engines, since they did not discriminate well between good and poor sites, but that is less of an issue now, not only because search engines are better but also because people are more savvy about the Web.)

Of course, clinician competency in informatics in the 21st century goes well beyond searching. The modern clinician must also know how to make optimal use of patient data and information. He or she must know how to use informatics to strive for Berwick's triple aim of better health, better care, and lower cost. In my mind, the best vision for this approach comes from the recent Institute of Medicine (IOM) report, Best Care at Lower Cost. This report creates a framework within which essentially all informatics competencies can be contextualized. It presents a compelling vision for a healthcare system that is patient-centered, learning, and population-based, concepts explained more fully below.

This also means an understanding that the patient record is more than "charting," and that its value goes beyond being able to read it. Certainly the 21st century clinician must be facile with all aspects of the electronic health record (EHR), being able to easily move from one system to another, and to understand why it is critical for health information exchange (HIE) to make any one record as complete as possible. But the EHR is more than looking up information about a patient. It becomes critically important as healthcare moves from a focus on quantity to one of value. The notion of value includes quality, patient safety, and attention to cost. This requires coordination of care, and not just providing medical procedures, nursing interventions, therapies, etc. in isolation. Coordination requires teamwork and communication.

In this context, the health record is no longer a passive collection of information used mainly to justify billing. Rather, it is a source of data, organized into coherent information, that allows the healthcare team to deliver the best, safest, and most cost-effective care. As such, the 21st century clinician must have a basic understanding of informatics issues, such as capturing data that is correct and complete as well as consistent in its expression. He or she must be able to work in partnership with informatics professionals to achieve what we know is so critical in the application of informatics, such as adhering to standards, achieving system interoperability, appropriately and optimally implementing clinical decision support, and maintaining security to assure privacy and confidentiality.
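To make the notion of correct, complete, and consistent data concrete, here is a minimal sketch of the kind of automated checks an informatics team might run on captured data. The record fields and plausibility limits are hypothetical, chosen only for illustration:

```python
def quality_issues(record: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    issues = []
    # Completeness: required fields must be present.
    for field in ("patient_id", "visit_date", "systolic_bp"):
        if record.get(field) is None:
            issues.append(f"missing {field}")
    # Correctness: values must be physiologically plausible.
    bp = record.get("systolic_bp")
    if bp is not None and not 50 <= bp <= 300:
        issues.append("implausible systolic_bp")
    return issues

clean = {"patient_id": 7, "visit_date": "2013-01-17", "systolic_bp": 128}
dirty = {"patient_id": 8, "visit_date": "2013-01-17", "systolic_bp": 1280}
# quality_issues(clean) → []
# quality_issues(dirty) → ["implausible systolic_bp"]
```

The clinician need not write such checks, but understanding that they exist, and that sloppily captured data will trip them, is part of working in partnership with informatics professionals.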

This view also requires that the 21st century clinician have some understanding of areas like quality measurement and improvement. If nothing else, he or she should understand quality measures because his or her work will increasingly be measured, used to assess quality and how to improve it, and perhaps even used to influence level of pay. But clinicians should also understand the rationale for measuring quality, including how inconsistent the quality of care is now, and how to work with clinical leaders to select, implement, and improve measures.

Another important area of safe, effective, and coordinated care is patient engagement. Not only is patient engagement the best thing to do from a healthcare standpoint, but 21st century patients, especially aging and Internet-savvy baby boomers, will demand it. Patients will want healthcare that adapts the online features of other modern industries, such as being able to view their own data and interact with their clinicians and healthcare system (e.g., online scheduling of appointments, prescription refills, and even consultations appropriate for the online setting). These will likely take the form of a personal health record, accessible from a patient portal, that allows access to all of a patient's information, not just that from the system of one provider organization.

The 21st century clinician must also have some knowledge and understanding of the appropriate use of telemedicine and telehealth, delivered both to remote locations and more locally in patients' homes and other settings.

Complementary to the patient-centric view, the 21st century clinician must also understand population-based care and the informatics underlying it. The clinician and their team will be caring for populations of patients. They must be able to view their care needs and results across their patient population. When a new test or treatment comes along that is determined to be highly effective, they must be able to quickly identify patients who are candidates for it. They must also be able to identify outliers in their populations who require intervention, such as those with excessively high blood pressure or blood sugar, missed appointments or screening tests, or those at risk for hospital (re-)admission.
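The outlier identification described above is, at its core, a query over a patient registry. A minimal sketch of the idea follows; the registry rows, field names, and clinical thresholds are all hypothetical, chosen only to illustrate the shape of such a query:

```python
# Hypothetical patient registry rows; field names are illustrative only.
registry = [
    {"id": 1, "last_a1c": 9.4, "last_systolic": 152, "missed_appts": 2},
    {"id": 2, "last_a1c": 6.8, "last_systolic": 124, "missed_appts": 0},
    {"id": 3, "last_a1c": 8.1, "last_systolic": 170, "missed_appts": 1},
]

def needs_outreach(pt, a1c_limit=9.0, systolic_limit=160, missed_limit=2):
    """Flag patients whose latest values fall outside target thresholds."""
    return (pt["last_a1c"] >= a1c_limit
            or pt["last_systolic"] >= systolic_limit
            or pt["missed_appts"] >= missed_limit)

outliers = [pt["id"] for pt in registry if needs_outreach(pt)]
# → [1, 3]
```

A real population-health tool would draw these rows from the EHR and let clinical leaders tune the thresholds, but the clinician's competency is in knowing that such queries are possible and asking for them.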

I also believe there are other areas where 21st century clinicians should have an understanding. One of these is bioinformatics, especially as it relates to personalized medicine. No, the modern clinician need not understand complicated gene sequencing algorithms, but he or she should have an understanding of how genomics and related areas are transforming our understanding of maintaining health, diagnosing disease, and treating it. If the vision of personalized medicine comes to pass, the 21st century clinician will need the help of decision support and other tools for help in applying it to individual patients. He or she should at least have a basic understanding of genomewide association studies and their ramifications.

The 21st century clinician must also understand the strengths and limitations of clinical research. He or she must understand the differences between, and the value contributed by, experimental and observational studies. Ideally, the student will have participated in research during training. But even if not, he or she should understand issues like data quality, study design, and the limitations that come from the sharply focused perspective of a clinical study. The 21st century student should more generally be able to participate well in the learning health system laid out in the vision of groups like the IOM.

There is certainly a great deal of informatics for the 21st century clinician to learn and be able to apply. From the pedagogical standpoint, there is also the issue of how to deliver it. One way not to deliver it is in its own separate course, isolated from the rest of the curriculum. There will be a need for educators who are specialists in informatics to design the learning (collaboratively with clinical educators) and to deliver that which is appropriate for lectures, group discussions, and other didactic settings. But informatics is one of those topics that is best infused throughout the curriculum, especially in clinical settings where it is being used.

Curriculum change can be hard. Academia can be one of the most tradition-bound settings, resistant to change. But just as healthcare must change, so must the education of its clinicians. Informatics is one excellent means of fostering interprofessional learning and interaction.

Friday, January 11, 2013

Eligibility for the Clinical Informatics Subspecialty

(Postscript: This posting was originally published on January 11, 2013. Thanks to two colleagues, Ted Shortliffe and Ben Munger, a few minor corrections and clarifications have been made, and it is being re-posted on January 13, 2013.)

One of the most common email messages I receive these days is an inquiry from a physician about his or her eligibility for the clinical informatics subspecialty. I am writing this posting in part to have a link to send to people in reply to those emails. But before I go further, let me make one vital disclaimer clear: I am not the decision-maker! That will be the exclusive role of the American Board of Preventive Medicine (ABPM), which is the administrative home for the subspecialty. I can give educated guesses based on what the ABPM has said and written, but ultimately it is their decision whether or not someone is eligible.

Shortly before the end of 2012, the ABPM released a one-page document on the clinical informatics subspecialty certification exam and qualifications for eligibility. The first exam will be available during a two-week window between October 7-18, 2013, with registration opening in March. The registration process will include a determination of whether an applicant is eligible for board certification, i.e., be allowed to take the exam.

The next most common questions I am asked are (a) what educational programs at Oregon Health & Science University (OHSU) will make me eligible and (b) what educational programs at OHSU or elsewhere will best prepare me for the exam? I will address those questions after providing what I know about the primary question on eligibility.

The eligibility requirements are clearly laid out in the ABPM document, so those who want to determine if they are eligible should read them carefully. The first three requirements are relatively straightforward. In short, they are:
  1. Primary certification by one of the 23 member boards of the American Board of Medical Specialties (ABMS)
  2. Graduation from a US, Canadian, or other medical school deemed acceptable by the ABPM
  3. Unrestricted license to practice medicine in the US or Canada
The fourth requirement, which is the "pathway" by which one is eligible during the first five years of the subspecialty (also known as the "grandfathering" era), is more challenging to interpret. There are two pathways for eligibility in the first five years, after which only a formal clinical informatics fellowship accredited by the Accreditation Council for Graduate Medical Education (ACGME) will allow eligibility for certification. These pathways must be completed in the first five years to be eligible to take the certification exam under the "grandfathering" criteria.

The first of the two pathways is the "practice pathway." Those who have been working in informatics professionally for at least 25% time during any three of the previous five years, and who can have a supervisory individual attest to it, are eligible for this pathway. "Working" in informatics not only includes "practice" (e.g., serving as a Chief Medical Information Officer), but also teaching and research.

The second pathway is the "non-traditional fellowship," which is any informatics fellowship of 24 or more months duration deemed acceptable by ABPM. At the November 4, 2012 panel at the American Medical Informatics Association (AMIA) Annual Symposium, Dr. William Greaves of ABPM stated this would be composed of informatics educational programs that were listed in the proposal submitted to ABPM by AMIA in 2009. This list, which has not been made public by ABPM or AMIA, included programs that were funded by training grants from the National Library of Medicine (NLM) or were members of the AMIA Academic Forum at the time the proposal was submitted by AMIA to ABMS in 2009. (I can say that OHSU was definitely on the list, since we were both NLM-funded and a member of the Academic Forum at that time, and we still are both.) Dr. Greaves also said that ABPM would review applicants trained in other fellowships for eligibility on a case-by-case basis.

The ABPM document also states that time spent in training in informatics can be applied to the practice pathway at one-half the value of practice time. In other words, someone in an educational program for at least 50% time during the previous five years would be eligible to take the certification exam. My interpretation of this (remember, I do not make the rules, and they may change!) is that someone in a master's degree program that involves the equivalent of one and a half years of full-time study would thus be eligible. If my interpretation is correct, this would mean that completing the Master of Biomedical Informatics (MBI) Program at OHSU within a five-year time span would make one eligible, since it requires six academic quarters of full-time study. The OHSU Graduate Certificate Program, on the other hand, which is a subset of the MBI requiring about nine months of study if done full-time, thus would not be enough. Presumably one can mix and match to achieve eligibility, i.e., with some practice and some education.
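The arithmetic of this half-credit rule, combined with the mix-and-match presumption, can be sketched as follows. To be clear, this is only an illustration of my interpretation; the linear combination is the presumption stated above, the 25% threshold comes from the practice pathway, and the ABPM makes the actual determination:

```python
# Practice pathway threshold: at least 25% professional time in informatics.
PRACTICE_THRESHOLD = 0.25

def practice_equivalent(practice_pct: float, education_pct: float) -> float:
    """Combine practice time with education time, the latter counted at
    one-half the value of practice time (per the ABPM document).
    Linear mixing is the posting's presumption, not a stated ABPM rule."""
    return practice_pct + education_pct / 2

# Education alone: 50% time in coursework equates to the 25% threshold.
# Mix and match (presumed): 10% practice plus 30% education also reaches it.
combined = practice_equivalent(0.10, 0.30)   # 0.25
meets_threshold = combined >= PRACTICE_THRESHOLD
```

This is why the MBI (roughly a year and a half of full-time study) would plausibly suffice on its own, while the nine-month Graduate Certificate would need to be supplemented by practice time.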

As for the common question I get about which OHSU program would make one eligible, the answer then depends on how much practice pathway eligibility one has. If one completely meets the practice pathway criteria, then how much education one has is a moot point; one is eligible on the basis of professional work. But for those who have not worked in the field enough to qualify by the practice pathway, it likely means they would need to complete the MBI or have enough education to make up for the "shortfall" in their practice time, perhaps with a Graduate Certificate.

The second common question I get asked is, what educational programs at OHSU and elsewhere will best prepare one for the exam? This answer is also part guess, as ABPM has not released any information about the exam content beyond saying it will likely reflect the core content outline that was submitted with the ABMS proposal and published in JAMIA in 2009. Last year I created a matrix that mapped each element of the core content outline to an OHSU course. The sum of OHSU courses pretty much covered the outline, but the problem is that it required 23 courses to do so. This is about 50% more than required for the MBI. (Remember that OHSU is on an academic quarter system, so the number of courses is larger than in programs on a semester system.) However, it is also clear from the matrix that a relatively constrained set of courses could cover a fairly large portion of the core content outline. Furthermore, we are undertaking a process to reorganize the curriculum with an eye to creating a set of courses that will be "core" for the core content, i.e., cover a substantial portion of the outline, while still maintaining the balance that we believe is important to learn in informatics. (The core content outline is likely to be fairly similar for the certifications that emerge for other healthcare professionals and PhDs that AMIA is now planning to propose. In fact, it is not very specific to "physician" informatics, and actually provides a good overview of the critical content necessary for mastery by all who work in clinical informatics.)

I believe that the eight courses required to obtain a Graduate Certificate can be fashioned in a way to prepare one well for the certification exam. Indeed, I can also see where the content of the Graduate Certificate program could form the basis of the didactic portion of a clinical informatics fellowship (perhaps allowing the practice time and fellowship project to add enough credits to qualify one for a master's degree).

I am also sometimes asked if the 10x10 ("ten by ten") course (which is the equivalent of one course, the introductory course, in our Graduate Certificate and MBI programs), or the AMIA Board Review course I will be directing, will cover enough to enable someone to pass the exam by just taking one or both. I believe it is unlikely that these courses alone, without any other formal training, would give one enough knowledge to pass the exam (although I suspect some will try, and perhaps succeed). It should be noted that achieving a sufficient grade on the optional final exam in the OHSU 10x10 course will provide credit to those eligible for study in the OHSU Graduate Certificate or MBI programs, which gets one course under one's belt and provides a trajectory for more.

(By the way, for those who are wondering: The AMIA Board Review course details will be announced in February. The current working plan is to offer the course three times between June and September, after the ABPM registration period opens but far enough in advance of the actual exam. There will likely be East Coast, Midwest, and West Coast sites for the three offerings. The courses in the first year will be all face-to-face, although online versions will be developed for subsequent years. A brief interview of me about the course from the AMIA Symposium is available, as is an interview of Dr. Greaves.)

Ironically, one type of training for which there is no guarantee of adequate preparation is a traditional research-oriented fellowship, such as those funded by NLM training grants. If one's course of study in such a fellowship includes a good deal of practical clinical informatics coursework, then that preparation should be excellent. However, not all informatics fellowships offer such coursework, as the primary purpose of the NLM-funded fellowships is to train future researchers. There may also be individuals in these fellowships who are extremely well-trained in other areas of informatics, such as bioinformatics or imaging informatics, who will technically be eligible for certification though not really well-prepared to pass an exam focused on practical clinical informatics. Indeed, even those whose curriculum is clinically oriented but highly theoretical may not be able to pass an exam that will have a very practical and applied focus.

I should also reiterate that this eligibility process only applies to the first five years of the subspecialty. After that time, the only way to achieve eligibility for certification will be in an ACGME-accredited clinical informatics fellowship. No details about these fellowships have been released, other than their proposed requirements in the ABMS proposal that was also published in JAMIA in 2009, and a statement by ABPM that such fellowships will be required to be 24 months (two years) in duration. I have previously raised some concerns about what these fellowships might look like, how they will be funded by healthcare organizations, and what will be the ramifications for the way many physicians train in informatics now, which is through graduate programs, often online. A lack of flexibility in these fellowships could limit clinical informatics training mainly to those at the beginning of their careers, which is currently not how most physicians train in informatics.

Let me summarize the answers to the questions of (a) am I eligible in the first five years of the subspecialty, (b) can OHSU make me eligible, and (c) can OHSU help me pass the certification exam?
  • Any US or Canadian physician who has a primary board certification, has a license to practice medicine, and was educated at an acceptable medical school is eligible.
  • Further eligibility requires either having "practiced" informatics at least one-quarter time in three of the last five years or having completed an informatics fellowship at an institution on a to-be-released list.
  • Although the rules are not clear, educational time in an approved informatics program will count at one-half the value of practice time, i.e., having been in an educational program at least half-time over the last five years. This may allow those who have insufficient practice time to obtain eligibility through educational programs. (For those with no practice time at all, I interpret this to mean one could be eligible through the OHSU master's program but not the Graduate Certificate Program.)
  • The curriculum of the OHSU Graduate Certificate Program as it now stands can be tailored to cover a substantial fraction of the core content likely to be on the exam, and will be reorganized in the next 1-2 years to allow it to do this even more efficiently.
  • The 10x10 and AMIA Board Review courses are unlikely to enable one with no other formal training in informatics to pass the exam (though anyone is able to try!).
A final question I am sometimes asked is whether I will be eligible for the exam and, if so, whether I plan to take it. I believe I am eligible (although this is for the ABPM to decide!), since I was certified by the American Board of Internal Medicine (ABIM) at a time when ABIM granted lifetime certification, i.e., re-certification will never be required, even though I would re-certify if I ever returned to patient care. In addition, while I have not seen patients for over a decade, I still maintain an "administrative" medical license in the state of Oregon, which makes me a licensed physician. I also have no trouble meeting the practice pathway time requirements, since I live, eat, and breathe informatics at least full time (some would say well more than that!). Therefore, if I am indeed eligible, I certainly plan to sit for the exam, and hope later this year to be among those who are certified clinical informatics subspecialists.