
Current status of discrete data capture in synoptic surgical pathology and cancer reporting

Authors Williams C, Bjugn R, Hassell L

Received 3 February 2015

Accepted for publication 7 April 2015

Published 9 June 2015 Volume 2015:7 Pages 11–22

DOI https://doi.org/10.2147/PLMI.S64378




Christopher L Williams,1 Roger Bjugn,2 Lewis A Hassell1

1Department of Pathology, University of Oklahoma Health Sciences Center, Oklahoma City, OK, USA; 2Department of Pathology, Oslo University Hospital, Oslo, Norway

Abstract: The current status of synoptic pathology reporting is presented in its historical context. Awareness of additional audiences and users has made the presentation and capture of pathology data, particularly cancer data, of broad importance. Current models of adoption in the US, Canada, Norway, and the Netherlands are noted. Key terms, benefits, and stakeholders central to the implementation and advancement of these capabilities, particularly with regard to the capture of discrete data elements, are presented. Important barriers to be overcome include fiscal constraints; technologic barriers such as interconnectivity and legacy systems; and social and organizational obstacles.

Keywords: quality assurance, integrated disease reporting, clarity, completeness, pathology report, cancer registry, biorepository

Introduction

It has been nearly three decades since the College of American Pathologists (CAP) began advocating for standardized reporting of certain cancers1 and more than 20 years since the Archives of Pathology and Laboratory Medicine published a call for pathologists to begin using checklists in their cancer reports.2 However, despite encouragement from CAP, the Centers for Disease Control and Prevention (CDC), the Association of Directors of Anatomic and Surgical Pathology, and other societies, universal adoption of synoptic checklists at their highest reporting utility remains elusive. While the majority of US accredited institutions use some form of synoptic reporting for cancer resections, relatively few of them have the capability to capture and archive the separate data elements digitally. And while this may sound like “moving the goalposts”, the value of such enhanced, coded data from these reports is so much greater as to make it a “no brainer” kind of choice. Why does what we do seem to lag behind what we think we should (all) do in this arena? Answering this question and seeing a path forward requires an understanding of where we have come from, where we are presently, and more particularly, the current value of and barriers to greater application of this approach to surgical pathology reporting. This forms the key message of this paper. We will address the various stakeholder groups who stand to benefit, as well as those who may have to bear the costs in terms of both effort and capital, and the processes that can lead toward a favorable outcome. The focus is on cancer reporting, but the principles discussed are valid independent of sample type.

Terms and definitions

The words “checklist”, “synoptic reporting”, and “template” are frequently used in the context of improving quality by standardizing the histopathology cancer report. Definitions of these terms in this context are given below:

  • Checklist: a checklist can be defined as a pre-defined list of informational elements considered relevant for performing a particular task. The intention of a checklist is to ensure consistency and completeness in carrying out the task. A checklist can, for example, contain information on how to do gross pathology on a particular resection specimen, or contain information on which parameters a histopathology report on a particular resection specimen should address.3
  • Synoptic histopathology report: a synoptic histopathology report can be defined as a report where the information elements are presented in a pre-defined tabular form.3 As such, a synoptic report can vary from a pre-defined text copied into the report to an advanced electronic scheme with pull-down menus and automatic coding, and where the individual information elements are stored as “discrete data elements” in a database. In the latter case, data elements can be automatically extracted and transferred electronically to other information systems.4
  • Template: a template can be defined as a pre-defined text outlining how a histopathology report can be phrased. A template will contain the parameters to be reported on (the checklist) and alternatives/suggestions for how to report them, and usually also a suggestion for the graphical layout of the report.3,5

Purpose and value

The primary purpose of the histopathology report on a cancer resection specimen is to provide the clinician with sufficient information to ensure adequate patient care. Accuracy, timeliness, completeness, and proper information transfer are four keywords when evaluating the quality of histopathology reports with respect to this primary purpose.6,7 However, histopathology reports also have secondary purposes and additional audiences. The information provided by the reports is crucial not only for cancer registries, and thereby for long-term health care planning, but also for the creation of new knowledge related to cancer and human biology.

Electronic data submission to cancer registries and other governmental agencies has expanded since the first and second Reporting Pathology Protocols Projects pioneered the effort over a decade ago.8,9 Currently, the widespread use of Health Level 7 messaging and the standardization of software solutions from the CDC and various vendors have dramatically improved workflow for cancer registries, but effective integration of electronic data from pathology sources has lagged on this front.

Well-vetted pathology data are invaluable for predicting prognosis, stratifying patients by risk, assigning treatment, and validating and modifying staging protocols, and resources are continuously spent on the manual extraction of such data from pathology reports. Why is a comparable level of resources not being spent on developing electronic laboratory information systems that allow such data to be added, stored, and extracted as discrete data elements?
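To make the contrast concrete, the short Python sketch below shows how a handful of synoptic answers might be held as discrete, coded elements rather than buried in free text. The field names and question/answer codes are illustrative placeholders, not actual CAP checklist items or SNOMED CT codes.

# Minimal sketch: synoptic answers captured as discrete, coded elements.
# Field names and question/answer codes are illustrative placeholders only,
# not actual CAP checklist items or SNOMED CT codes.
from dataclasses import dataclass, asdict
import json


@dataclass
class DiscreteElement:
    question: str        # human-readable checklist item
    question_code: str   # identifier for the question (placeholder)
    answer: str          # the selected value
    answer_code: str     # coded value for the answer (placeholder)


report_elements = [
    DiscreteElement("Histologic type", "Q-0001", "Adenocarcinoma", "A-0101"),
    DiscreteElement("Histologic grade", "Q-0002", "G2: moderately differentiated", "A-0202"),
    DiscreteElement("Margin status", "Q-0003", "Margins uninvolved by carcinoma", "A-0303"),
]

# Because each element is discrete and coded, it can be serialized and sent to
# a registry or data warehouse without re-parsing report text.
print(json.dumps([asdict(e) for e in report_elements], indent=2))

Stored this way, the same elements can be indexed, queried, and transferred without any downstream text mining.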

How should one then estimate the value of collecting discrete data elements, and is it worth the cost in terms of both budget dollars and any inconvenience to the pathologists who generate the bulk of the data? Who is going to collect and review these data? Clearly, data collection happens best and most efficiently prospectively at the point of creation of the data, meaning in the pathology suite.10 But the downstream users are not always willing to assume a share of the costs, nor are they always engaged in structuring the data or the collection tools. Ready electronic databases would allow departments to review their own data and collect information from collaborating institutions for multi-center reviews. National and governmental organizations (CAP, CDC, etc), as well as interested commercial entities, could direct studies; collect, refine, and analyze the data; and play a role in its warehousing.

Historical development

USA

In the 1970s, the American College of Radiology aimed to improve the quality of patient care by establishing guidelines, developed through peer consensus, for the best current management in radiation oncology. Recognizing the role of pathologists as consultants in the practice of medicine, CAP subsequently established the Pattern of Care Study in an effort to determine what information other physicians involved in the management of cancer patients need from the histopathology report. The intention was to establish guidelines for the minimal amount of data to be included in routine pathology reports.11,12 The first guidelines, published in 1986, were for reporting breast cancer, bladder cancer, and Hodgkin’s disease.1 Five years later, Markel and Hirsch reiterated the call.13 In 1992, Zarbo published results of a multi-institutional Q-Probes study assessing the adequacy of surgical pathology reporting in colorectal cancer. The single most important factor identified in consistently providing complete data was the use of a standardized report form or checklist.14 In an accompanying editorial, Kempson advocated for the use of checklists in surgical pathology reporting.2 The following year, Rosai published examples of standardized reporting forms for major tumor types developed at the Memorial Sloan-Kettering Cancer Center.15 A key argument in his call for broad use of the templates was one of “foresight”, in that he recognized that future questions of both a research and a clinical nature could be answered more easily from retrospective data with that bit of added prospective effort to synoptically include it in the initial report. Ironically, it is this feature of what to include or exclude from a synoptic report (and how often that changes) that remains one of the biggest challenges today. Another multi-institutional Q-Probes study, assessing pathology reporting in lung cancer, was published in 1996. Again, the use of a standard report form or checklist was associated with an increased likelihood of key data elements being reported.16

In 1998, CAP released the first edition of its cancer reporting protocols, consisting of those features found to be valuable in predicting prognosis or guiding therapy.17 These protocols were updated and expanded in both 2000 and 2003, the latter in conjunction with the sixth edition of the Cancer Staging Manual of the American Joint Committee on Cancer. This edition also first introduced the use of discrete data elements coded with SNOMED CT (Systematized Nomenclature of Medicine Clinical Terms).17 However, at this time the checklist was provided exclusively on paper, and it depended on effort from laboratory information system (LIS) vendors and pathologists to implement a system in which this information was stored in a useful electronic format. In 2004, use of the CAP Cancer Checklists was added to the CAP laboratory checklist used for accreditation. However, the use of these cancer checklists was included as a recommendation rather than a requirement. Shortly thereafter, the American College of Surgeons mandated that accredited cancer programs must be served by pathology reporting mechanisms that included all the relevant synoptic data elements,17 though not necessarily in synoptic format.

Realizing the limitations of a paper-based checklist, CAP released an electronic version of the Cancer Checklist in 2007. This electronic version was based on a Microsoft Access database and encoded the answers using SNOMED CT. The following year, CAP established the Diagnostic Intelligence and Health Information Technology Committee with a mission “to advance the implementation of the CAP Cancer Checklists using health information technology”.18 In 2009, several important advancements were made. First, the number of available checklists was further expanded to 55. More importantly, Extensible Markup Language (XML) versions of the checklists were made available. This was a huge step forward in improving the compatibility and portability of the checklists across a variety of platforms, including the possibility of natively incorporating the checklists into a laboratory information system.17 Currently (in 2015), the number of checklists has been expanded to a total of 69.
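As an illustration of why an XML representation improves portability, the short Python sketch below parses a hypothetical checklist fragment into discrete fields that an LIS could store or index. The element and attribute names are invented for illustration; the actual CAP electronic Cancer Checklist (eCC) schema differs.

# Sketch: reading a hypothetical XML checklist fragment into discrete fields.
# Tag and attribute names are invented; the real CAP eCC XML schema differs.
import xml.etree.ElementTree as ET

checklist_xml = """
<checklist name="Colon and Rectum Resection" version="illustrative-2015">
  <item key="histologic_type" code="Q-0001">
    <answer code="A-0101">Adenocarcinoma</answer>
  </item>
  <item key="margin_status" code="Q-0003">
    <answer code="A-0303">Margins uninvolved by carcinoma</answer>
  </item>
</checklist>
"""

root = ET.fromstring(checklist_xml)
for item in root.findall("item"):
    answer = item.find("answer")
    print(item.get("key"), item.get("code"), "->", answer.get("code"), answer.text)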

In parallel with the work undertaken by CAP, the CDC National Program of Cancer Registries (NPCR) funded the Reporting Pathology Protocols (RPP 1) pilot study in 2001 to evaluate the use of discrete data elements in report submission to cancer registries.8 This study focused on reporting between two pathology groups and their respective cancer registries. The data encoding standards used in the study included Logical Observation Identifiers Names and Codes (LOINC) for the identifying names of the data elements and SNOMED CT codes for the values of the data elements, with communication consisting of Health Level 7 encoded messages. One key observation from the study was that implementation of a new electronic reporting system required significant investments of both time, due to a steep learning curve, and financial resources, which did not result in identifiable gains in efficiency or increased reimbursement. It was recommended that LIS developers and vendors begin supporting the storage of discrete elements of the cancer checklist in addition to free text used to provide supplemental information not captured by the checklist. In 2004, a second study (RPP 2) was commissioned by NPCR to evaluate whether data from the CAP Cancer Checklists are more accurate and complete than data derived from narrative reports, as well as to identify additional barriers to implementation.9 This study focused on pathology reports for breast, prostate, and melanoma submitted to three participating cancer registries in California, Maine, and Pennsylvania over a 12-month period, and targeted the specific goal of electronic transfer of the synoptic data elements as individual fields. The results showed opportunities to enhance workflow, productivity, and timeliness of reporting using electronically encoded reports, primarily from the cancer registry perspective. Comparison of the reports revealed that the synoptic report generally agreed with the narrative portion; however, the synoptic report was rarely more informative than the narrative text. In the instances where the reports did not contain the same information, it was usually the narrative portion that contained additional or more complete information. Lack of available granularity and flexibility in the checklist was the most often cited reason for these differences, where the narrative text relayed more nuanced information incapable of being captured by the checklist. This issue is at least partially addressable by increasing the complexity of the checklist and its possible answers; however, this also potentially compromises the usability of the checklist. The study also highlighted some of the social and structural barriers to implementation on the part of pathologists, including resistance to adopting data-entry roles, loss of autonomy in structuring and composing reports, and the time demands of completing electronic checklists.19,20
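The encoding approach described above pairs a LOINC code naming each data element with a SNOMED CT code for its value, carried in an HL7 message. A minimal Python sketch of assembling one such observation (OBX) segment is shown below; the codes and values are placeholders, and a production HL7 v2 ORU message would carry many additional required segments (MSH, PID, OBR, and so on) with codes drawn from the published mappings.

# Sketch: one HL7 v2-style OBX segment pairing a LOINC-style question code
# with a SNOMED CT-style answer code. Codes and values are placeholders; a
# real ORU^R01 message also requires MSH, PID, OBR, and other segments.

def obx_segment(set_id, question_code, question_text, answer_code, answer_text):
    fields = [
        "OBX", str(set_id), "CE",
        f"{question_code}^{question_text}^LN",   # OBX-3: observation identifier (LOINC-style)
        "",                                       # OBX-4: observation sub-ID (unused here)
        f"{answer_code}^{answer_text}^SCT",       # OBX-5: observation value (SNOMED CT-style)
        "", "", "", "", "",                       # OBX-6 through OBX-10 left empty in this sketch
        "F",                                      # OBX-11: result status = final
    ]
    return "|".join(fields)

# Placeholder codes, not the actual LOINC or SNOMED CT identifiers.
print(obx_segment(1, "12345-6", "Histologic type", "999999999", "Adenocarcinoma"))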

An informal survey of 28 pathology leaders in the US conducted by the authors in late 2014 confirmed that synoptic reporting of cancer cases is widely applied as the standard procedure, with 98% of the labs sampled employing this method for at least all cancer resections. However, only a minority (20%) had the capability to capture synoptic data into discrete data fields for subsequent purposes.

Canada

In its first comprehensive provincial cancer plan, published in 2004, Cancer Care Ontario in Canada included real-time capture of pathology information in its action plan.21,22 From 2005 to 2008, background data were collected, contacts with pathology departments were established, infrastructure was put in place, and pilot projects were implemented. In this phase, it was decided to implement electronic versions of the CAP Cancer Checklists in all laboratory information systems being used by the more than 50 pathology laboratories in the province. In the following stage (2008–2012), electronic synoptic reporting was systematically implemented across the province. In the first phase (2008–2010), the target was that >90% of all pathology reports for the five most common forms of cancer should be reported in this way. In the second phase (2010–2012), the target was that >90% of all pathology cancer reports should be reported using the electronic CAP Cancer Checklists. Data from the Ontario Cancer Registry are now available no more than 30 days after the date of reporting.

This implementation of many of the concepts put forward in RPP 1 and 2 was more than just a demonstration project: it showed the added value to patients and society that organizations can realize when the resulting data are managed centrally. The project was also able to show a dramatic improvement in the completeness of reports from a host of anatomical sites following implementation of synoptic reporting. The report by Srigley et al described how this was implemented within the pathology departments22 and suggested a staged approach to implementation that still applies today (Figure 1).

Figure 1 Stages in the development of synoptic reporting and discrete data capture.

The Ontarian project was deemed so successful that a pan-Canadian initiative for electronic pathology synoptic reporting has been established.23

Norway

In 2003, a project on electronic synoptic cancer reporting was initiated by the Cancer Registry of Norway and the Norwegian Society of Pathology.24 The goal was similar to that in Ontario, but the project did not have the same solid organizational platform and long-term financing as Cancer Care Ontario. Just one structured electronic synoptic histopathology reporting template was developed before the project had to close down due to lack of funding. However, several follow-up studies on the implementation and use of the colorectal form have shown that using it improves quality,24,25 that the compliance rate is generally high in departments that have implemented the form,24,26 and that both individual pathologists and departments have a positive attitude toward electronic synoptic reporting.26

The Netherlands

The Dutch National Pathology Registry (PALGA) was established in 1971 as an independent foundation. PALGA operates a national database containing information on all histological, cytological, and autopsy examinations in the Netherlands and provides data for the Dutch Cancer Registry and national cancer screening programs. Although it is an independent foundation, PALGA is government-funded. All pathology laboratories in the Netherlands are electronically linked to PALGA.

In 2007, PALGA started working on electronic synoptic histopathology reporting. This initiative was based on already established national cancer guidelines outlining “minimum datasets” to be reported on by pathologists for various cancers. The aim was accordingly both to improve the quality of the histopathology reports by introducing an electronic checklist and to facilitate the electronic transfer of such data to the national registry. The first two protocols were introduced in 2008. Currently, 16 electronic synoptic cancer protocols have been developed, and the aim is to have approximately 30. All such protocols are formally approved by the Dutch Society of Pathology before being implemented.

All 55 pathology departments in the Netherlands use electronic laboratory information systems. PALGA’s electronic synoptic reporting system is fully integrated with all vendor systems being used, and in 2015 all 55 pathology departments will be linked to the PALGA electronic synoptic cancer reporting system. Cancer information is entered into the synoptic reports as part of routine reporting, and the information is both stored in a local database and submitted to the national database.

PALGA’s system allows non-mandatory parameters to be added to the national synoptic reports. Such non-mandatory parameters can be “switched on/off” as decided by the individual departments. The system also allows pathology departments to develop their own synoptic reports, but local protocols will be overruled if similar national ones are later developed. Synoptic reporting is also used for the national screening programs on colorectal and cervical cancer. In these instances, discrete data field information is sent directly to the National Institute for Public Health and the Environment (Paul Seegers, PALGA, personal communication, January, 2015).

Experience gained using checklists, templates, and/or synoptic reporting

As mentioned earlier, i) accuracy, ii) timeliness, iii) completeness, and iv) proper information transfer (or clarity) are four key areas for evaluating the quality of histopathology reports.6,7 To our knowledge, there are no studies on diagnostic accuracy or timeliness with respect to various ways of reporting. One study did, however, state that electronic synoptic reporting was considered time efficient by the pathologists as they could complete the template themselves and sign out the report immediately, but there was no hard data on this.27 With respect to secondary users of pathology data, such as cancer registries, electronic synoptic reporting greatly improves timeliness. Both Cancer Care Ontario and PALGA will have raw data available within 24 hours after the histopathology report has been signed out.

With respect to completeness of the histopathology reports on cancer, a number of studies have been undertaken.22,24,25,28–36 The main conclusion from these studies is that the use of a checklist or template improves the completeness of histopathology reports as evaluated against guidelines published by pathology associations. One early intervention study is of particular interest. Cross et al compared the percentage of colorectal cancer reports containing a given number of data elements when no standard was provided, when guidelines were provided in text format, when guidelines were formatted as a flow chart, and when a checklist containing all the relevant items from the guidelines was attached to the request form. While a few common elements were universally reported, such as tumor type, grade, and lymph node status, the reports prepared using a checklist were complete every time, while those not using a checklist were complete in only 30%–60% of cases.31 In a study on the quality of histopathology reports for renal cell cancer published in 2010, Shuch et al stated that insufficient data in the reports “[…] does not permit the use of prognostic systems, and may hinder enrollment into adjuvant trials and the selection of systemic therapy.” In their opinion, efforts should be made to improve reporting practices by methods such as standard template reporting to enable improved patient care.37 Synoptic cancer data completeness has also facilitated inter-institutional collaborative clinicopathologic studies of numerous entities, such as gastrointestinal stromal tumors,38 prostate cancer,39 and colorectal cancer,40 among others. Proper information transfer can be translated into how easy it is for the receiver of the histopathology report to understand it and extract relevant information for further use. To our knowledge, just two studies have investigated these aspects. Both concluded that a standardized histopathology report was deemed more satisfactory than narrative text by clinicians.41,42 Anecdotal evidence from cancer registrars and from RPP 2 indicates that data extraction and categorization proceed much more easily when they can be managed entirely digitally. Additionally, our informal survey alluded to above revealed that over 60% of labs using synoptic reporting formats had received verbal or written compliments from clinical colleagues (surgeons, oncologists, or others) or from site inspectors.

Technology today

In the US, the current generation of electronic health records (EHRs) does not support direct integration of the CAP checklists. This must be accomplished through third-party software solutions, which have variable support from the EHR vendors. An advantage of this arrangement is that these third-party solutions are typically updated more frequently than the EHR; however, it is the EHR that ultimately determines whether discrete, coded data fields can be stored and in what format. These third-party solutions typically replicate a traditional paper-based checklist, utilizing checkboxes or pull-down lists for discrete elements as well as having an option for additional free text as needed.

This use of a stand-alone program interfaced to the EHR allows for potentially interesting applications. These applications could be run on non-traditional platforms, such as smartphones and tablets, allowing pathologists to become untethered from their desks. Applications can be linked to and pull data from external sources, such as PubMed, and outside ancillary studies such as genomic data. Natural language processing could be used to extract meaning from the narrative text and pre-populate the required discrete elements. Although these and many other opportunities exist to aid in report generation, they are not yet in common practice. Further development of these kinds of “ease of use” applications should be pursued as potentially key in overcoming the resistance barriers discussed below.
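As a toy illustration of the pre-population idea, the Python sketch below uses a simple pattern match (far short of true natural language processing, and with invented narrative phrasing and field names) to pull a candidate tumor size out of gross description text so that a discrete field could be offered to the pathologist for confirmation.

# Toy sketch of pre-populating a discrete field from narrative text.
# This is simple pattern matching rather than full natural language
# processing; the narrative phrasing and field name are invented.
import re

narrative = "Sectioning reveals a tan, firm mass measuring 2.4 cm in greatest dimension."

proposed_fields = {}
match = re.search(r"measuring\s+(\d+(?:\.\d+)?)\s*cm", narrative)
if match:
    # Offer the extracted value as a suggestion; the pathologist confirms it.
    proposed_fields["tumor_size_cm"] = float(match.group(1))

print(proposed_fields)  # {'tumor_size_cm': 2.4}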

The North American Association of Central Cancer Registries has championed the effort to build meaningful coding links for pathology report data, and specifically for the synoptic checklist type of data, beginning with its role in RPP and more recently with the publication of Standards for Cancer Registries, Volume V.43 This ongoing effort to enhance the completeness, timeliness, consistency (quality), and cost-effectiveness of cancer registries, with the attendant benefits to public health and cancer awareness, serves as a useful roadmap for LIS vendors and developers in constructing and formatting pathology data to foster utility for this important downstream client. The challenge, of course, has been getting health care providers and pathologists to be aware of or attentive to the interests of this mandated use case. However, a large amount of the detailed “grunt work” of mapping codes to enable facile data transfer has been done through this work, creating a true foundational tool.

Current and future prospects

CAP has developed a software application, electronic Forms and Reporting Module (eFRM), with commercial partner mTuitive, Inc., to provide a streamlined solution to fill out and report the cancer checklists. The application can be interfaced to the most popular anatomic pathology LISs and can also be used as a stand-alone product to submit reports to cancer registries or other clients. It stays up-to-date by automatically checking for and downloading the latest versions of the cancer checklist.

Nuance/Voicebrook has developed a software product that combines voice recognition with a software automation tool that allows users to control the standard functions of the LIS with their voice. This includes navigation to various areas of the report, free text dictation, and completing the checklist elements. The speech recognition software has a pathology specific dictionary enabling reasonable accuracy when used in this context. The automation software should be compatible with the most popular AP LISs on the market. The automation can also store commonly used combinations of commands and descriptions (macros) allowing multiple fields to be completed simultaneously using a single command. If a significant number of nearly identical specimens are handled, this can potentially offer dramatic increases in productivity.

Integration of learning-capable software solutions (artificial intelligence) may further help to solve the challenge, either from the data-entry aspect or possibly from the search and classification standpoint. A forthcoming report describes a “secretary-mimicking” interface between the pathologist and the LIS that uses scanning of pre-analytic data coupled with voice commands to complete, check, and file reports.44 While not yet adapted specifically to cancer checklists or complicated resections, the ability of such endeavors to improve workflow for the pathologist could potentially further reduce the resistance from that stakeholder group.

Another prospect on the horizon is the capability to do meaningful image-based searches.45 This element illustrates the divergent types of data that begin to enter the data stream, and quickly multiply its complexity and size. However, it becomes important as one begins to consider the utilities described earlier. Imagine a researcher wanting to tailor a study according to specific demographic, stage, and histologic image content features, perhaps along with a particular treatment profile. Providing linkages between textual discrete data, clinical or demographic information, pharmacologic data, and the visual data in radiology or pathology image files seems a not too distant additional demand upon our systems.

Toward a fully integrated, evolving disease report

Several efforts have come forward from the CAP that build further on the ability to draw discrete data elements from synoptic reports and other elements of the clinical record. A number of recent webinars and other venues have discussed this capability and opened forums for discussion.46 As proposed by the Diagnostic Intelligence and Health Information Technology Committee, the CAP plans to use structured data that may be continuously updated from within multiple pathology reports (including non-traditional sources of data such as next generation sequencing) to auto-generate portions of an integrated, evolving disease report. This appears poised to become the new horizon, or “really” meaningful use, primarily for its benefits in ongoing patient management and patient safety – clearly touchstones of high impact. It may therefore be a fruitful avenue for alliance for those seeking to leverage resources for the development of discretely coded synoptic data to serve other business, public health, or clinical purposes.46

Web-based pathology reporting methods are not new, but certainly offer advantages in many settings, by allowing easier linkages to pertinent therapies, and expandable access to primary data, such as tumor synoptic data, digital slide data, next generation sequencing, or other sources (Figure 2).

Figure 2 Mock-up of web-based diagnostic report with collapsible sections for synoptic data elements, links for further information or treatment/management issues, connectivity to digital slide or image data, and/or other data sources.

Confronting barriers

Antagonists and protagonists with respect to changes in medical practice, such as the way of histopathology reporting, can be present at all levels of the health care system: the individual pathologist, the health care team, the health care organization, and/or the wider environment.47–49 Such possible stakeholders are illustrated in Figure 3.

Figure 3 Illustration of various stakeholders affecting or being affected by the introduction of electronic synoptic histopathology reporting.
Abbreviation: LIS, laboratory information system.

Studies on the implementation of synoptic histopathology reporting indicate that all levels must be addressed in order to achieve long-term success.20,22,26,27 Implementing change in this arena needs to take into account the case for change as seen from the viewpoint of all the various stakeholders. Such possible benefits are summarized in Figure 4. To our knowledge, only the province of Ontario, Canada and the Netherlands have succeeded in implementing electronic synoptic reporting on a wider scale. Both efforts are characterized by a long-term process (5–10 years), involvement of all key stakeholders, sufficient resources, and having one organization (Cancer Care Ontario and PALGA, respectively) taking the formal lead with respect to coordination and implementation. In our opinion, this indicates that wider societal success with electronic synoptic reporting can only be achieved if a key organization in the health care system is driving the change. Much as the “meaningful use” mandates in the US have sought to drive greater information technology connectivity and use of electronic medical records via upfront incentive payments, and subsequent penalties, as a means of overcoming the status quo in terms of adoption, stakeholders beyond the conventional patient–pathologist–clinician triad will need to be willing to incent adoptive behaviors, and likely also consider operational support to defray launch and ongoing costs. Given the potential benefits for quality, and reduced costs overall to the system, it may be that some regional endeavors such as Accountable Care Organizations may be able to see value in these capabilities as well. Some users in our survey, for example, were using the costs of duplicate data entry, or of data extraction on the part of other non-pathology stakeholders, to justify acquisition of systems with discrete coded data capture.

Figure 4 Reporting format.
Note: Benefits to various stakeholders of various levels of reporting of synoptic data. However, in order to succeed independent of scale, knowledge of the particular barriers in a given setting is needed in order to design optimal strategies for change.
Abbreviations: IDR, integrated disease report; MD, medical doctor; ICD, International Classification of Diseases; SNOMED, Systematized Nomenclature of Medicine.

Quality marker profiling, such as the Physician Quality Reporting System program of the Centers for Medicare and Medicaid Services (CMS) in the US, as implemented for pathology, has depended highly on the identification of selected facets within the pathology report indicative of “quality”.50 But the burden has been to correctly identify and report to CMS the cases meeting their criteria. Not infrequently, organizations miss pertinent cases that are included in their denominator of candidate cases. Discretely coded and archived synoptic data make case identification and compliance verification a simpler task. Comparable quality metrics derived from encoded synoptic pathology data, for example margin status, may become pertinent for other practitioners, whether for CMS reimbursement issues or for health care system quality or credentialing purposes.
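To illustrate how discretely coded synoptic data simplify this kind of case finding, the sketch below filters a small set of records on a margin-status field. The field names, codes, and measure definition are placeholders, not an actual CMS/PQRS specification.

# Sketch: identifying candidate cases for a hypothetical margin-status
# quality measure from discretely coded synoptic data. Field names, codes,
# and the measure definition are placeholders, not a CMS/PQRS specification.
cases = [
    {"accession": "S15-0001", "site": "colon", "margin_status_code": "A-0303"},
    {"accession": "S15-0002", "site": "colon", "margin_status_code": None},
    {"accession": "S15-0003", "site": "breast", "margin_status_code": "A-0303"},
]

# Denominator: all colon resections; numerator: those with a coded margin status.
denominator = [c for c in cases if c["site"] == "colon"]
numerator = [c for c in denominator if c["margin_status_code"] is not None]

print(f"measure compliance: {len(numerator)}/{len(denominator)}")
for c in denominator:
    if c["margin_status_code"] is None:
        print("needs review:", c["accession"])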

Bio-specimen banking, for both individual patient purposes and for research, whether institutional or biopharma-based, depends heavily on vetted pathology data that are complete and, on the research side, generally de-identified. This kind of data transfer is extraordinarily tedious to perform manually from text-based reports, but becomes much easier with encoded discrete data, just as it has been demonstrated to be for the cancer registry recipients of such data. But there are potential advantages to individual patients as well in having their specimen data encoded in this manner. Software tools (such as “PatientLocate”)51 exist to facilitate matching patients to particular treatments or trials based upon particular characteristics, the kinds of elements to be captured in synoptic discrete data fields. When patients are part of such a dataset, their access to care options may expand.

Technical barriers within organizations

LIS

While being able to obtain the cancer checklist in a versatile format such as XML is helpful, many LISs were developed before the widespread adoption of this standard and do not support it. Hence, even if a checklist is used, it may have to be entered into the database as free text. This situation is less than ideal when it comes to sharing data with cancer registries or searching through the database, since the information cannot be easily indexed and must be decoded and then reformatted before being shared. Progressive vendors have attempted to build in the capability to acquire and store synoptic data as discrete elements, based on their experience collaborating with RPP or similar efforts, but installing, upgrading, validating, and implementing these new capabilities has not always risen to a top priority among even motivated customers. Only a quarter of those in our survey who were reporting synoptically were capturing these data using their LIS. Another 10%–15% were using some form of middleware solution interposed between the pathologist and the LIS. The majority of these were forced to use a database separate from their LIS to store the discrete data elements, potentially further fragmenting the patient record and introducing the potential for inconsistencies or omissions.
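The sketch below illustrates, under assumed table and column names, how a middleware layer might hold discrete checklist answers in its own database keyed to the LIS accession number. This is the kind of separate store described above: it makes the elements indexable and queryable, but it is also where fragmentation can arise if the two systems are not reconciled.

# Sketch of a middleware store for discrete checklist answers, kept in its
# own database and keyed to the LIS accession number. Table and column names
# are assumptions for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE synoptic_element (
        accession     TEXT,
        question_code TEXT,
        answer_code   TEXT,
        answer_text   TEXT
    )
""")
conn.execute(
    "INSERT INTO synoptic_element VALUES (?, ?, ?, ?)",
    ("S15-0001", "Q-0003", "A-0303", "Margins uninvolved by carcinoma"),
)

# Because the element is discrete, it can be queried directly, without
# re-parsing report text.
for row in conn.execute(
    "SELECT accession, answer_text FROM synoptic_element WHERE question_code = ?",
    ("Q-0003",),
):
    print(row)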

Combined technical and social challenge

The increasing frequency with which checklist data elements are modified is a big challenge, even for those using only text-based synoptic reporting. The expense of re-programming to update with each (almost annual) revision is a strain on the resources of those doing the work, whether the department itself, the LIS vendor, or a middleware vendor. These costs are particularly difficult to budget for, given uncertainties about the degree and scope of coming revisions. While the CAP has attempted to deal with this by offering target implementation dates, not infrequently different users within an institution may want to move to new versions, or even to their own customized versions, sooner than the target. Almost two-thirds of our survey respondents mentioned above noted that the technical demands, and the financial means to accomplish them, were limiting factors in their deployment or decision to upgrade. It is interesting to note that in the two settings where electronic synoptic reporting has been widely implemented (Ontario, Canada and the Netherlands), one provincial/national organization has sole responsibility for upgrading the synoptic forms. This organization ensures the content of the synoptic form and bears the cost of the upgrade.

Professional resistance

Data on professional resistance are mixed. The pathologist perspective on the RPP 2 work noted that resistance to change, restriction on autonomy or creativity in reporting, and the time demands of doing synoptic data entry were significant barriers.20 Two-thirds of our survey respondents in 2014 noted that pathologist resistance had been encountered in their implementation. Related to this, it is pertinent to note that three-fourths (75%) of our survey respondents also reported that the pathologist or a comparable individual was directly entering the synoptic data. In a study on the implementation of electronic synoptic reporting in Norway, Casati et al also reported strong resistance to synoptic reporting from individual pathologists. On the other hand, some pathology departments had a long-term compliance rate for synoptic reporting of around 90% in an environment without control of individual usage. In their discussion, the authors proposed that the “silent” majority may adopt a new system independent of opinion leaders if the reasons for doing so are compelling for the individual pathologist.26 Such reasons can include enhanced simplicity and efficiency in use.24 Thus, it seems imperative that pathologists understand the rationale for the change, and any incidental added cost in time they will bear, if one is to obtain compliance. Still, interventions that do not take proper account of the asymmetry between who bears the “cost” of the product (discrete data) and who reaps its benefits will face significant challenges in adoption.

Regulatory mandate

A further method to achieve change, while often not popular with many who might be subject to enforced or coerced adoption of a particular improved practice, is the imposition of regulatory or accreditation standards. Such measures can be effective in changing behavior, even if change occurs grudgingly. This is not quite the “carrot first, stick later” approach that CMS has taken with the Physician Quality Reporting System measures described above, but is more akin to the accreditation requirements set forth by various bodies with such authority, such as the National Cancer Institute, which designates selected cancer centers as affiliates; CAP, which accredits the moderately and highly complex laboratories providing the bulk of anatomic pathology in the US; and the American College of Surgeons Commission on Cancer (CoC), which accredits cancer centers. The move by the latter organization to require CAP Checklist synoptic data to be included in all pathology reports in their centers, and by the CAP to include a cancer checklist requirement in its inspection checklists, has done much to make this practice as routine as it is at present (mostly at the level 2 or 3 stage illustrated in Figure 1). Movement to discrete data capture could be catalyzed by these kinds of dictated standards as well, if they were implemented in a staged manner.

It is appropriate to ask whether such efforts to raise the bar have an impact on care or patient safety. The evidence to date has centered on overall quality measures and patient experience measures in centers that meet the standards of one of these organizations, so the connection to the use of synoptic checklists per se is tenuous, and only an association. Recently, Merkow et al reported that both process and patient experience measures were predominantly more favorable for National Cancer Institute or CoC institutions, but these trends did not hold true for outcome measures, although disease severity was not controlled for in the study.52 Abundant data from Clinical Laboratory Improvement Amendments-mandated proficiency testing at CAP-accredited labs likewise show a higher level of performance improvement compared with non-accredited testing sites, but this does not implicitly transfer to diagnostic accuracy in anatomic pathology or to cancer-report-related enhancements of patient safety.53

Summary

While the use of synoptic templates for cancer reporting has reached near universal adoption in parts of the world, discrete data capture of synoptic data from pathology or cancer reports continues to be largely the exception. However, encouraging large-scale efforts, such as those in Canada and the Netherlands, offer models for others to follow. Technical tools such as the North American Association of Central Cancer Registries standards, and new middleware solutions that can be wrapped around legacy LIS systems, exist now to make the effort feasible, and more and more vendors are including discrete synoptic data element capabilities within their systems. The next steps will require collaborative efforts on the part of broader stakeholder groups, along with recognition of the asymmetric distribution of costs and benefits, in order to overcome the personal, social, and organizational barriers.

Acknowledgment

The help of Paul Seegers in obtaining data on The Dutch National Pathology Registry (PALGA) is greatly appreciated.

Disclosure

The authors report that they have no conflicts of interest, financial or otherwise, pertinent to the content of this article.


References

1. College of American Pathologists. Guidelines for data to be included in consultation reports on breast cancer, bladder cancer, and Hodgkin’s disease. Pathologist. 1986;40:18–23.
2. Kempson RL. The time is now. Checklists for surgical pathology reports. Arch Pathol Lab Med. 1992;116(11):1107–1108.
3. Leslie KO, Rosai J. Standardization of the surgical pathology report: formats, templates, and synoptic reports. Semin Diagn Pathol. 1994;11(4):253–257.
4. Ellis DW. Surgical pathology reporting at the crossroads: beyond synoptic reporting. Pathology. 2011;43(5):404–409.
5. Cowan DF. How templates improve quality and efficiency in surgical pathology. Lab Med. 1997;28:263–267.
6. Nakhleh RE. What is quality in surgical pathology? J Clin Pathol. 2006;59(7):669–672.
7. Valenstein PN. Formatting pathology reports: applying four design principles to improve communication and patient safety. Arch Pathol Lab Med. 2008;132(1):84–94.
8. Centers for Disease Control and Prevention. Report on the Reporting Pathology Protocols for Colon and Rectum Cancers Project 2005. Atlanta, Georgia: Centers for Disease Control and Prevention; 2005. Available from: http://www.cdc.gov/cancer/npcr/pdf/rpp_report_121605.pdf. Accessed April 20, 2015.
9. Centers for Disease Control and Prevention. Report on the Reporting Pathology Protocols Project for Breast and Prostate Cancers and Melanomas Executive Summary. Atlanta, Georgia: Centers for Disease Control and Prevention; 2009. Available from: http://www.cdc.gov/cancer/npcr/pdf/rpp2_summary.pdf. Accessed April 20, 2015.
10. Ministry of Finance Norway. Nasjonale medisinske kvalitetsregistre. Rapport fra Arbeidsgruppe 4: Forslag til fellesløsninger for registrene [National medical quality registries. Report from Working Group 4: Proposal for common solutions for the registries]. Oslo, Norway: Ministry of Finance Norway; 2005. Available from: http://omega.regjeringen.no/upload/kilde/hod/hdk/2006/0022/ddd/pdfv/287992-vedlegg_6_-arbeidsgruppe_4.pdf. Accessed April 20, 2015.
11. Hammond ME, Compton CC. Protocols for the examination of tumors of diverse sites: introduction. Arch Pathol Lab Med. 2000;124(1):13–16.
12. Hutter RV, Henson DE. The pathologist as a consultant in cancer patient management: a pattern of care study in pathology. Int J Radiat Oncol Biol Phys. 1984;10 Suppl 1:45–47.
13. Markel SF, Hirsch SD. Synoptic surgical pathology reporting. Hum Pathol. 1991;22(8):807–810.
14. Zarbo RJ. Interinstitutional assessment of colorectal carcinoma surgical pathology report adequacy. A College of American Pathologists Q-Probes study of practice patterns from 532 laboratories and 15,940 reports. Arch Pathol Lab Med. 1992;116(11):1113–1119.
15. Rosai J. Standardized reporting of surgical pathology diagnoses for the major tumor types. A proposal. The Department of Pathology, Memorial Sloan-Kettering Cancer Center. Am J Clin Pathol. 1993;100(3):240–255.
16. Gephardt GN, Baker PB. Lung carcinoma surgical pathology report adequacy: a College of American Pathologists Q-Probes study of over 8300 cases from 464 institutions. Arch Pathol Lab Med. 1996;120(10):922–927.
17. Amin MB. The 2009 version of the cancer protocols of the College of American Pathologists. Arch Pathol Lab Med. 2010;134(3):326–330.
18. Birdsong GF, Foulis P. The CAP Cancer Case Summaries: from Paper to PC. College of American Pathologists; 2012. Available from: http://www.pathologyinformatics.com/sites/default/files/2012Powerpoints/45Birdsong_FoulisThurs.pdf. Accessed April 20, 2015.
19. Hassell L, Aldinger W, Moody C, et al. Electronic capture and communication of synoptic cancer data elements from pathology reports: results of the Reporting Pathology Protocols 2 (RPP2) project. J Registry Manag. 2009;36(4):117–124.
20. Hassell LA, Parwani AV, Weiss L, Jones MA, Ye J. Challenges and opportunities in the adoption of College of American Pathologists checklists in electronic format: perspectives and experience of Reporting Pathology Protocols Project (RPP2) participant laboratories. Arch Pathol Lab Med. 2010;134(8):1152–1159.
21. Cancer Care Ontario. Ontario Cancer Plan 2005–2008. Cancer Care Ontario; 2004. Available from: https://www.cancercare.on.ca/common/pages/UserFile.aspx?fileId=13606. Accessed April 20, 2015.
22. Srigley JR, McGowan T, Maclean A, et al. Standardized synoptic cancer pathology reporting: a population-based approach. J Surg Oncol. 2009;99(8):517–524.
23. Canadian Partnership Against Cancer [homepage on the Internet]. The Partnership launches Electronic Synoptic Pathology Reporting Initiative (ESPRI) to advance pan-Canadian standardized cancer pathology reporting; 2012. Available from: http://www.partnershipagainstcancer.ca/the-partnership-launches-electronic-synoptic-pathology-reporting-initiative-espri-to-advance-pan-canadian-standardized-cancer-pathology-reporting/. Accessed April 20, 2015.
24. Casati B, Bjugn R. Structured electronic template for histopathology reporting on colorectal carcinoma resections: five-year follow-up shows sustainable long-term quality improvement. Arch Pathol Lab Med. 2012;136(6):652–656.
25. Haugland HK, Casati B, Dorum LM, Bjugn R. Template reporting matters – a nationwide study on histopathology reporting on colorectal carcinoma resections. Hum Pathol. 2011;42(1):36–40.
26. Casati B, Haugland HK, Barstad GM, Bjugn R. Implementation and use of electronic synoptic cancer reporting: an explorative case study of six Norwegian pathology laboratories. Implement Sci. 2014;9:111.
27. Bjugn R, Casati B, Norstein J. Structured electronic template for histopathology reports on colorectal carcinomas: a joint project by the Cancer Registry of Norway and the Norwegian Society for Pathology. Hum Pathol. 2008;39(3):359–367.
28. Beattie GC, McAdam TK, Elliott S, Sloan JM, Irwin ST. Improvement in quality of colorectal cancer pathology reporting with a standardized proforma – a comparative study. Colorectal Dis. 2003;5(6):558–562.
29. Buchwald P, Olofsson F, Lorinc E, Syk I. Standard protocol for assessment of colon cancer improves the quality of pathology. Colorectal Dis. 2011;13(3):e33–e36.
30. Chan NG, Duggal A, Weir MM, Driman DK. Pathological reporting of colorectal cancer specimens: a retrospective survey in an academic Canadian pathology department. Can J Surg. 2008;51(4):284–288.
31. Cross SS, Feeley KM, Angel CA. The effect of four interventions on the informational content of histopathology reports of resected colorectal carcinomas. J Clin Pathol. 1998;51(6):481–482.
32. Karim RZ, van den Berg KS, Colman MH, et al. The advantage of using a synoptic pathology report format for cutaneous melanoma. Histopathology. 2008;52(2):130–138.
33. Messenger DE, McLeod RS, Kirsch R. What impact has the introduction of a synoptic report for rectal cancer had on reporting outcomes for specialist gastrointestinal and nongastrointestinal pathologists? Arch Pathol Lab Med. 2011;135(11):1471–1475.
34. Rigby K, Brown SR, Lakin G, Balsitis M, Hosie KB. The use of a proforma improves colorectal cancer pathology reporting. Ann R Coll Surg Engl. 1999;81(6):401–403.
35. Siriwardana PN, Pathmeswaran A, Hewavisenthi J, Deen KI. Histopathology reporting in colorectal cancer: a proforma improves quality. Colorectal Dis. 2009;11(8):849–853.
36. Gill AJ, Johns AL, Eckstein R, et al. Synoptic reporting improves histopathological assessment of pancreatic resection specimens. Pathology. 2009;41(2):161–167.
37. Shuch B, Pantuck AJ, Pouliot F, et al. Quality of pathological reporting for renal cell cancer: implications for systemic therapy, prognostication and surveillance. BJU Int. 2010;108(3):343–348.
38. Bischof DA, Kim Y, Blazer DG 3rd, et al. Surgical management of advanced gastrointestinal stromal tumors: an international multi-institutional analysis of 158 patients. J Am Coll Surg. 2014;219(3):439–449.
39. Partin AW, Kattan MW, Subong EN, et al. Combination of prostate-specific antigen, clinical stage, and Gleason score to predict pathological stage of localized prostate cancer: a multi-institutional update. JAMA. 1997;277(18):1445–1451.
40. Ueno H, Hase K, Hashiguchi Y, et al. Novel risk factors for lymph node metastasis in early invasive colorectal cancer: a multi-institution pathology review. J Gastroenterol. 2014;49(9):1314–1323.
41. Lankshear S, Srigley J, McGowan T, Yurcan M, Sawka C. Standardized synoptic cancer pathology reports – so what and who cares? A population-based satisfaction survey of 970 pathologists, surgeons, and oncologists. Arch Pathol Lab Med. 2013;137(11):1599–1602.
42. Yunker WK, Matthews TW, Dort JC. Making the most of your pathology: standardized histopathology reporting in head and neck cancer. J Otolaryngol Head Neck Surg. 2008;37(1):48–55.
43. Klein WT, Havener L, editors. Standards for Cancer Registries Volume V: Pathology Laboratory Electronic Reporting. Version 4.0. Springfield, IL: North American Association of Central Cancer Registries, Inc.; 2011.
44. Ye JJ. Artificial intelligence for pathologists is not near; it is here: description of a prototype that can transform how we practice pathology tomorrow. Arch Pathol Lab Med. Epub 2015.
45. Hipp JD, Cheng JY, Toner M, Tompkins RG, Balis UJ. Spatially invariant vector quantization: a pattern matching algorithm for multiple classes of image subject matter including pathology. J Pathol Inform. 2011;2:13.
46. De Baca ME, Birdsong G. Integrated disease reporting: order from (almost) chaos. CAP Webinar; December 16, 2014.
47. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362(9391):1225–1230.
48. Casati B, Haugland HK, Barstad GM, Bjugn R. Factors affecting the implementation and use of electronic templates for histopathology cancer reporting. Pathology. 2014;46(3):165–168.
49. Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.
50. College of American Pathologists. Pathology performance measure included in CMS 2015 PQRS. College of American Pathologists; 2014. Available from: http://www.cap.org/apps/docs/advocacy/overview_measures.pdf. Accessed April 20, 2015.
51. patientprecision.com [homepage on the Internet]. PatientLocate – patient identification through protocol-specific clinical data. Available from: http://patientprecision.com/technology/. Accessed April 20, 2015.
52. Merkow RP, Chung JW, Paruch JL, Bentrem DJ, Bilimoria KY. Relationship between cancer center accreditation and performance on publicly reported quality measures. Ann Surg. 2014;259(6):1091–1097.
53. Howerton D, Krolak JM, Manasterski A, Handsfield JH. Proficiency testing performance in US laboratories: results reported to the Centers for Medicare and Medicaid Services, 1994 through 2006. Arch Pathol Lab Med. 2010;134(5):751–758.
