By Dana Richman
Are your customers getting the quality service you think they are getting? How do you know? Cooper Pest Solutions, Lawrenceville, N.J., put this question to the test in the fall of 2017 using third-party verification, gaining the insights they desperately wanted without introducing their own bias and establishing a baseline for the entire company.
Third-party verification utilizes an independent organization to review service and to check whether it complies with very specific standards set by the pest management company. A portion of the verification process includes on-site inspections for ongoing compliance. This may include company records audits or tracing steps taken from start to finish. Through the audit, the pest management company gains a unique view of its operations, with independent measures of the items it views as critical from a customer experience and compliance perspective. Ultimately, third-party verification lets a company know where its quality has gone off track and where it is knocking it out of the park.
The leadership at Cooper Pest Solutions understood the value of this type of evaluation. As such, they sought out my services (Richman Consulting Services [RCS]) to get answers to important questions associated with their rapid growth in both sales and service delivery. Cooper measures its overall customer experience using the Net Promoter Score (NPS) system (see box on page 116) and has scored more than 83 for two consecutive years, which suggested customer service was in good shape.
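For readers unfamiliar with how an NPS figure like Cooper's is produced, here is a minimal sketch of the standard calculation. The 0-10 scale and promoter/passive/detractor cutoffs are the conventional NPS definitions; the response mix below is purely hypothetical, not Cooper's data.

```python
def net_promoter_score(ratings):
    """Return NPS (-100 to 100) for a list of 0-10 survey ratings.

    Standard NPS convention: 9-10 = promoter, 7-8 = passive,
    0-6 = detractor. Score = % promoters minus % detractors.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical sample: 90 promoters, 3 passives, 7 detractors
# out of 100 responses.
ratings = [10] * 90 + [8] * 3 + [4] * 7
print(net_promoter_score(ratings))  # 83.0
```

Note that passives count toward the total number of responses but neither add to nor subtract from the score, which is why a high NPS requires both many promoters and few detractors.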
But CEO Phil Cooper wasn’t convinced. “Our sales growth was in the double digits and we were bringing in technicians at a pace we had never seen before to deliver service. I sensed a general concern within my staff that with so many new ‘green’ technicians, we might not be delivering a consistent product in our service delivery. We needed to be certain that service delivery was of the highest quality and performed in a consistent fashion by experienced and new technicians alike.”
GOALS & OBJECTIVES. Cooper’s leadership team knew the only way to truly evaluate their customer service was to use a third party to conduct an evaluation. The first step was to pick one of the firm’s services to audit. Cooper chose the service with the greatest concentration of clients: residential home service. The audit assessed each technician’s ability to carry out home services in a manner consistent with service protocols and with the company’s customer experience expectations.
Cooper leadership and I worked to develop an evaluation method that covered the firm’s service from soup to nuts: the ability to follow very specific directions on the work order, how technicians greeted the customer, the physical service and the communication at the end of service (verbal, written and electronic). The evaluation form itself was separated into five sections: arrival activities, inspection and service activities, customer service activities, conducive conditions found, and “above and beyond.” Questions were weighted based on importance to the company.
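A weighted evaluation like the one described above can be sketched simply. The section names mirror the five sections of the form, but the weights and per-section scores below are illustrative assumptions, not Cooper's actual rubric.

```python
# Illustrative weights summing to 1.0 -- NOT Cooper's actual values.
SECTION_WEIGHTS = {
    "arrival": 0.15,
    "inspection_and_service": 0.35,
    "customer_service": 0.25,
    "conducive_conditions": 0.15,
    "above_and_beyond": 0.10,
}

def weighted_score(section_scores):
    """Combine per-section scores (0-100) into one weighted total."""
    return sum(SECTION_WEIGHTS[name] * score
               for name, score in section_scores.items())

# Hypothetical technician result.
scores = {
    "arrival": 100,
    "inspection_and_service": 80,
    "customer_service": 90,
    "conducive_conditions": 70,
    "above_and_beyond": 50,
}
print(weighted_score(scores))  # 81.0
```

The point of the weighting is that a slip in a high-importance section (such as the inspection itself) pulls the total down more than the same slip in a lower-importance one.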
Next, we decided how to conduct the evaluations. We knew they would have to be done in person; the best way to determine how personnel were performing was to evaluate them on-site. We decided to use a “dummy” residence to recreate an actual service visit for each employee. We chose Phil Cooper’s house in Yardley, Pa., and I was able to evaluate four technicians per day. We scheduled employees so there would be no overlap among them. Approximately 75 percent of Cooper’s managers and technicians were evaluated, and they ranged in experience from new employees (less than one month on the job) to seasoned veterans. We had them conduct service on the outside only.
The house is a single-family, two-story residence with approximately a quarter acre of yard space around it, which included a deck area and a shed. Employees were to inspect and treat the house as normal except they were told not to make any changes or add anything (such as bait or station labels), and employees used either a backpack or handheld sprayer to deliver water in place of chemicals around the house as treatment. We wanted each employee to see the same conditions as his/her predecessor.
I remained in character acting as the homeowner for most of the evaluation. I started each evaluation when the truck parked outside the house and concluded when the employee left.
We made a few notes in the service file for each employee to find. For example, the notes told them to knock on the side door and not the front door. Any employee who used the front door was assumed to have not read the service location notes thoroughly. After I opened the door, the technician was supposed to make an introduction, ask me about pest problems I’ve noticed and let me know what he/she was going to do around the house as part of the service. I listened to each technician for these important parts of his/her pre-inspection activities. The employee then inspected and treated (with water) as needed, and then came back to the door to give me a post-inspection report. I was in homeowner mode the entire time. My Academy Award-winning performance ended when the employee gave me the service paperwork or told me I would receive an emailed report. Now out of homeowner character, I invited the employee to do a walk around the house to show me any other issues he/she would have fixed.
WHAT DID THEY FIND? Some conducive conditions and pest evidence were staged for the evaluations, and all evaluations were timed. I even inspected their vehicles. Employees were told to act as though they were performing a normal service; however, they were not to change, fix or replace anything.
For example, we purposely left mouse bait stations empty for the technicians to find but they were not to replace any bait for this evaluation. Instead, they should have noted bait replacement product and volume on their service reports and told me about it while I was in character as the homeowner.
Another staged conducive condition included items that collected water around the yard, since as the homeowner I complained about the mosquitoes bothering my family outside the house. Technicians should have left the standing water alone during the evaluation and told me about the issue while I was in homeowner character. After the “normal” service was complete, I did a walk-around with each employee so he/she could point out everything around the house that would have been fixed or replaced (such as pouring standing water out of containers), or conducive conditions that should be remediated. Some employees found more issues than we had staged, and I made note of all their comments.
Employees were briefed beforehand. Cooper leadership guaranteed all employees that the assessment results would be viewed at the company level and not the individual level. The purpose was to see if the training provided to technicians was meeting the demands of the rapid growth the company was experiencing. Moreover, everyone participating in the assessment was assured individual results would not result in any disciplinary actions but instead were only for self-improvement; if a technician scored low on the evaluation, he/she would not be fired or demoted.
Instead, results were to be used to drive training strategy for the entire pest services team. Individual evaluations were not even shared with managers or any technicians until several months later when each manager received the results for their team so they knew which areas to focus on with each of their technicians.
THE RESULTS. The unbiased verification I provided gave Cooper leadership the answers they needed. The good news is that I was able to show that Cooper’s managers and technicians were highly knowledgeable about proper home inspections, conducive pest conditions, proper treatment protocol and pest behavior. “The data was so rich and good we could pluck off acute issues to make individual changes,” Phil Cooper said. “We learned we don’t have to change the system.”
Richard Cooper, co-owner and longtime technical director at the firm added, “It was great news to all of us to see that the training was making it through to our many new technicians and that our veteran staff had not gotten lax in their ways. More importantly we were able to identify one or two things for every technician that would make them even better in the job, which should lead to even greater client satisfaction and increased client retention.”
Richard Cooper agreed that the third-party evaluation process was invaluable to his company, enabling a truly non-biased view of how the company was doing and where they could improve. This type of evaluation could also be used for office staff — customer interactions, accounts payable or any other facet of the company’s operations where a certain script should be followed.
I helped Cooper Pest Solutions learn where they need to focus their attention moving into the future to ensure their education process yields high-quality technicians. “This process was invaluable and I would recommend it to any company committed to delivering an excellent customer experience,” Phil Cooper said.
Dana Richman, Ph.D., MBA, is a project management professional and president of Richman Consulting Services. Email her at firstname.lastname@example.org.