July 21, 2020
HIGHLIGHTS
- Identifies how accreditation standards enhance quality of care
- Describes the qualities of effective site reviewers
- Provides examples of process improvements developed from reviewer feedback
- Summarizes the College’s initiative to align ACS Quality Programs to enhance client experience
Editor’s note: Because of the coronavirus disease 2019 (COVID-19) pandemic, the American College of Surgeons (ACS) Quality Programs are exploring innovative virtual options for the traditional in-person site visit. Each ACS Quality Program suspended on-site visits during the pandemic and has developed a proposed plan and agenda for conducting these reviews in the future. Pilot sites for alternative approaches are being identified this summer. The ACS will provide updates on these programs as they become available.
Each of the ACS quality improvement (QI) accreditation and verification programs is a singular entity tasked with the responsibility of verifying compliance with the standards established for that particular specialty. All of the programs feature the same general structure: application, pre-review questionnaire (PRQ), site visit, assessment, and facility report.
This article focuses on requirements for the site reviewer role, characteristics shared by productive reviewers, and real-world examples of process improvements tied to reviewer assessment.
At present, 388 surgeons serve as site reviewers for the following ACS accreditation and verification programs, collectively known as the ACS Quality Programs: the Accredited Education Institutes, the Children’s Surgery Verification (CSV) Quality Improvement Program, the Commission on Cancer (CoC), the Metabolic and Bariatric Surgery Accreditation and Quality Improvement Program (MBSAQIP), the National Accreditation Program for Breast Centers (NAPBC), the National Accreditation Program for Rectal Cancer (NAPRC), and the Trauma Verification and Review Committee (VRC) (see Figure 1, page 20). The number of standards ranges from 20 to more than 200, depending on the specialty (CSV and Trauma have the most standards because of the complex nature of those specialties).
According to the Optimal Resources for Surgical Quality and Safety manual (also known as the Red Book), “In high reliability organizations (HROs), the focus is on development and implementation of effective systems, transparency, and teamwork. The intent is to bring process failures and systemic issues to light and to solve them in a nonpunitive way. Lessons learned from analysis of errors are shared as best practices in order to mitigate future errors.”1
The Red Book also identifies specific concepts that reinforce the principles of high reliability, including standardization of best practices, which reduces unwarranted variation and optimizes reproducible outcomes.1 “When care is standardized, variation arises only because of differences in patient needs or resources,” according to the manual. “Standardization accentuates deviations from best practices, making them easier to spot than if every health care provider used a different approach to deliver care.”1
Site reviewers, previously referred to as “surveyors,” are trained members of the health care team who assist facilities in identifying gaps in their adherence to standards, enabling HROs to continue to provide consistent, high-quality care to the surgical patient.
The ACS Quality Programs collectively accredit and/or verify more than 3,000 hospitals. Studies examining the effectiveness of these programs suggest that the levels of care necessary to meet accreditation standards lead to improved quality of care.
For example, a study published in the Journal of Trauma examined the effect of preparing for and achieving ACS Level I trauma verification on patient outcomes and hospital performance. After evaluating 1,098 trauma patients admitted to a facility in 1994, and 1,658 patients admitted in 1998, the authors concluded, “Trauma system improvement as related to achieving ACS Level I verification appeared to have a positive impact on survival and patient care,” with a notable decrease in mortality for severely injured patients, a marked decrease in average length of stay, and an estimated cost savings for 1998 of more than $4,000 per patient.2
Another study published in the Journal of the American College of Surgeons compared bariatric surgery outcomes in U.S. accredited versus nonaccredited centers based on a review of 13 studies that covered more than 1.5 million patients. Researchers found that “10 of the 13 studies identified a substantial benefit of Center of Excellence accreditation for risk-adjusted outcomes, and six of the eight studies reported a considerable reduction in mortality in patients operated on in Centers of Excellence.”3
ACS cancer accreditation programs also have been linked to improved outcomes. A survey of CoC and NAPBC program participants revealed that more than 90 percent of respondents displayed a “high level of agreement that accreditation is regarded as important in improving oncologic outcomes through compliance with standards that include continuous quality improvement.”4
Although the literature suggests that accreditation standards are linked to improved quality of care, the Red Book notes a “paucity of research that evaluates accreditation status and surgical quality” and states, “…further research to specifically address outcomes at accredited institutions will better illuminate the specific structural components of care that may be associated with improved outcomes.”1
The minimum qualifications for the site reviewer role vary by program, but generally the requirements are organized into three main components: credentials, affiliations, and skills and knowledge.5-8 A site reviewer must be in active practice in a clinical, academic, or administrative role and employed by or affiliated with the corresponding ACS-accredited program. As for the skills and knowledge component, reviewers should have an extensive and demonstrable knowledge of the current standards, significant knowledge of specialty registries where applicable, and strong verbal and written communication skills.5-8 The site visit team typically includes one to five members, depending on the program and site request (the average is a three-person team), with one individual designated as the lead reviewer.
“Certain requirements are in place that are unchangeable,” including that the reviewer is active in the specialty, according to Daniel Margulies, MD, FACS, professor of surgery, Cedars-Sinai Medical Center; chief, section of trauma, emergency surgery, and surgical intensive care, University of California-Los Angeles, David Geffen School of Medicine; and Chair, ACS VRC Program Committee. “An older surgeon who is no longer active will not be allowed to continue to do reviews. We actually remove them as reviewers within one year of retirement. In terms of the young reviewers, there’s a requirement that they have to have been a trauma director or director of a service like the surgical intensive care unit, and generally a very young, inexperienced person is not going to be in that role,” said Dr. Margulies, who became a VRC reviewer in October 2011.
Teresa LaMasters, MD, FACS, FASMBS, DABOM, medical director, bariatric surgery, UnityPoint Clinic Weight Loss Specialists, Des Moines, IA, and a reviewer for MBSAQIP since September 2014, noted that capable reviewers generally have been in practice seven to 15 years.
Someone who has been in practice for five to seven years and is invested in learning about quality could definitely be a good site reviewer, Dr. LaMasters added. “But a lot of surgeons early in their careers are focused on practice building. And some of them have not really learned the principles behind quality improvement. I think that seven-year mark is ideal because these physicians have some experience, they’re excited to learn, and they have some time to commit to doing this.”
Site reviewers must be willing to commit six to 12 hours per site visit, with an additional two to four hours for preparation and final review. Depending on the program, each reviewer should plan on participating in four to eight site visits a year as specified in the individual specialty site reviewer agreement.
Linda Farkas, MD, FACS, a site reviewer for NAPRC, and professor of surgery with University of Texas-Southwestern, Dallas, noted that time-related barriers are a significant challenge in recruiting new reviewers. “There are a lot of surgeons whose salaries are based on revenue, or they’re in a busy practice where they’re just not afforded the time to take off to be on a site visit,” explained Dr. Farkas. “In those instances, surgeons would have to use their vacation time, or the revenue’s going to be down. If we had more reviewers, especially if we could get more surveyors distributed across the country, then maybe the time commitment wouldn’t be so bad—so the reviewer in Florida doesn’t have to fly to Washington if we can have a reviewer who lives in Oregon.”
Site reviewers are compensated for their time and effort. The College provides an honorarium to each reviewer, which generally is used to cover travel expenses. Although the honorarium is a practical benefit, reviewers are motivated to take on this role for a variety of reasons—from an interest in refining the accreditation/verification process to learning best practices employed at other institutions.
“I had undergone a site review in a previous system, and that site review was really unsatisfying for me,” said Dr. LaMasters. “The previous system would isolate each member of the team in something like an interrogation room in an effort to identify problems rather than work with the whole team together to talk through processes and understand the reasoning behind the processes. When I came in as a site reviewer, I felt like that approach should be completely flipped on its head. The team should be kept together. The physician leadership should be there, and it should be an open, collaborative process for learning rather than an isolating process.”
Peter Hopewood, MD, FACS, a site reviewer for both the CoC and NAPBC, and a surgeon with Cape Cod Healthcare Cancer Programs, Falmouth, MA, said he became a reviewer to gain exposure to innovative quality improvement practices.
“I’m in Cape Cod, so I’m a little out of the mainstream,” Dr. Hopewood said. “I’m not in an academic hospital, I’m in a community hospital, and the initiatives (clinical trials, community outreach activities) that are happening in the academic teaching centers can take years to filter down into the community hospitals. However, as a reviewer, I’m visiting these centers and I’m observing cancer conferences as part of the site visit, and I’m talking to all the specialists. I learn a lot. So, it’s helping my practice and improving the care I give to my patients.”
Effective site reviewers share specific personality and leadership traits, including a collaborative approach to and an innate interest in quality improvement starting at the local level. For example, site reviewers often are involved in their hospital’s QI committee or are peer reviewers or case reviewers for a medical board or journal before taking on the site reviewer role.
“When selecting a reviewer, we look for someone who is actively doing this in their own center,” said Douglas C. Barnhart, MD, MSPH, FACS, FAAP, who started his role as a CSV reviewer in January 2017. Dr. Barnhart is a professor, division of pediatric surgery, University of Utah School of Medicine, and medical director for surgical patient safety and quality, Primary Children’s Hospital, Salt Lake City, UT. “Reviewers should know from their own experience which standards are hard and which ones are aspirational. Effective reviewers will recognize the struggles involved in solving some of these problems and will appreciate the work that’s been done, rather than focusing only on the gap. I always introduce the fact that I come from a center that’s been through the verification part, and that we appreciate all the work that they’ve gone through. Reviewers should also be actively practicing clinicians. Our foremost goal is improving patient care, and in order to do that, reviewers should have ongoing, direct experience in caring for patients.”
“As a site reviewer, you have to be patient and a little charismatic,” said Dr. Farkas. “Team members at the site may be a little nervous when you come in because you’re a surgeon and a site reviewer, and they may be program nurse coordinators, for example, and feel like their job is on the line with this survey.” She added, “The purpose of the survey is not to be punitive; it is to give hospitals feedback so that they can get over any impediments necessary for accreditation.”
According to Dr. LaMasters, “You have to be able to articulate the entire vision of the accreditation process and goals and spirit on the positive side, not just the negative side, and to do it in a way that I think really inspires them to want to continue to improve their quality, to see where they can go next, to not be complacent or content simply because they passed.”
“It’s just not about checking a box, indicating the hospital was compliant or not compliant, and that’s why it’s so critical for us, across all of our programs, that we pick the right people to do these site visits, be they surgeons, nurses, other physician specialists, or others,” said Teresa Fraker, MS, RN, Program Administrator, MBSAQIP. “Because that’s what our sites are paying for, more than data registries or anything else—the education, the consultation, the mentoring, and the sharing of best practices that our reviewers offer them.”
ACS accreditation/verification programs share some similar practices based on compliance with patient-centered standards. For example, each program has a manual that outlines the optimal resources for care of the patient served in each program.9-11
Each program also employs varying business processes for verifying and accrediting participating facilities. For example, during the verification review process, a trauma center is assessed on criteria outlined in the ACS Trauma Programs’ standards manual, including volumes of severely injured patients, 24-hour availability of trauma surgeons and other specialists, surgical capabilities, and availability of specialized equipment. Based on the review, the trauma center is categorized as Level I, II, or III.12
The site review process generally occurs in four phases: reviewer selection, pre-site visit preparation, site visit, and report generation.
A notable component of the pre-site verification process is the completion of a pre-review document. For most programs, such as trauma and CSV, this document is the PRQ. These questionnaires inform reviewers of the existing care capabilities of the hospital or center before the on-site review.13
“These site visits aren’t intended to be a surprise inspection with some element for which people aren’t prepared,” said Dr. Barnhart. “For example, when we had our site visit here, I knew we had some areas of weakness, having read the criteria and having been on some of these site visits. I knew we had some problem areas that we were in the process of solving but had not completely solved. And in my introductory comments to our site visit, I said, ‘Look, I want to tell you about our place. I want to tell you about our strengths, and I want to show you two things that we’ve struggled with. I’m going to show you our audit data.’ That’s really what we’re looking for places to do. If you know your weaknesses, and you’re committed to performance improvement, you’ll solve your weaknesses,” Dr. Barnhart said.
“We’re not saying that you have to be perfect, but we are saying you should have a process in place to look at your mistakes, think critically about them, and work to make a sustainable change in your program that addresses problems more generally than just looking at outcomes,” added Dr. Margulies.
“I tell the team that throughout the day I’m going to look for things that they can do better, and that doesn’t mean that they’re not doing a great job. We all can learn to do things better,” said Dr. LaMasters. “I try to engage each person in the team around the standards and how they apply to this part of the site visit. It’s not just about the case that we’re auditing, it’s really about how you develop the team culture to get everybody behind a process improvement that came from this adverse event.”
Site reviewers identify patient care challenges across a spectrum of clinical topic areas and offer suggestions for improvements in administrative processes related to equipment needs, scheduling and job sharing, and time management.
“One program was able to arrange one-stop shopping for its patients so that they can get their CEA [carcinoembryonic antigen], their CAT [computerized axial tomography] scan, and their MRI [magnetic resonance imaging] in one day,” said Dr. Farkas, noting that such strengths are institution-specific and may be difficult to achieve at every site.
Dr. Margulies offered another example of a process improvement, one that a site reviewer to his institution sought to replicate at his center. “The use of whole blood is new in trauma. We developed a mobile refrigerator so that the blood that is sent out to trauma is kept at a specified temperature, rather than in a cooler where the blood warms up and is basically wasted if it is not used,” Dr. Margulies said.
Dr. Hopewood said a process improvement his institution is adopting—based on what he encountered as a site reviewer—involves decreasing emergency room (ER) visits and unexpected admission for chemotherapy patients. “Some programs preemptively call patients early in the morning to ascertain how they are feeling. Certain chemotherapy regimens result in potential diarrhea or neutropenia, and targeting those patient populations and leaving room for empty appointments later in the day can reduce visits to the ER, resulting in better quality of care,” Dr. Hopewood said.
Dr. LaMasters said process improvements often entail understanding why the standard was created and realizing the difference between “the letter of the law and the spirit of the law.”
“The letter of the law might state that you have to have equipment that is weight-based appropriate for our patients, but the spirit of the law means understanding that you can’t have only one chair in your waiting room that is appropriate for a person of size because often family members may also be of size,” explained Dr. LaMasters. “Understanding that the concept of this standard is to help the entire institution recognize and be sensitive to patients of size and to be prepared to care for those patients wherever they occur in the institution—even though I’m specifically reviewing the sites where bariatric surgery patients will go,” she said.
To unify the ACS Quality Programs, the College launched a project in July 2017 to enhance both the site reviewer and the client experience. Although each program was created with the same general aim of improving surgical care, the models used to achieve that goal varied by specialty.
The ACS alignment team identified three goals to unify ACS Quality Programs, including the development of a shared information technology platform, similar site visit and performance reports, and a common standards framework and template, including both the formatting and branding of the standards manuals. This updated standards framework comprises nine standard domains:
In 2019, the MBSAQIP was the first program to translate its standards into the new framework, successfully meeting one of three goals the ACS established to unify these programs. Other programs migrating to the new standards format this year include:
In March 2019, the ACS Quality Portal (QPort) was completed, successfully meeting another goal in the alignment process. The MBSAQIP was the first to migrate to QPort, and other quality programs will be migrating to the portal in the near future.
Alignment efforts are intended to provide consistent, high-quality experiences to all our hospitals participating in ACS Quality Programs. By providing a consistent framework for the standards manuals, portal, and reports for all ACS verification programs, the ACS aims to bring QI and leadership teams at participating hospitals together to work toward surgical quality within their institutions. The objective is to provide a road map of strengths and opportunities for improvement.
With more than 105 years of experience in QI, the ACS continues to define the standards necessary to provide optimal patient care across all surgical specialties. The College’s ongoing commitment to this goal continues with the development of a new standards-based program—a verification program based on the Optimal Resources for Surgical Quality and Safety, with pilot visits occurring over the last year.14,15
Site visits are an essential component of achieving quality in all phases of surgical care. Standardizing the College’s approach to verification and accreditation processes for both established and new Quality Programs will likely enhance the effectiveness of site reviews.
“Participation in accreditation programs does not guarantee high-quality care,” note the authors of the Red Book, “but it does demonstrate a commitment to such aims.”1 The authors assert that “clearly defined roles and responsibilities, coupled with appropriate resources and support” can lead to improved patient outcomes, shorter lengths of stay, and a reduction in costs.
References