This February NASCIO proposed several actions states could take to improve procurements, including removing unlimited liability clauses and introducing more flexible terms and conditions. One idea absent from the list is an approach being piloted in California: creating vendor performance scorecards on IT projects for use in future procurements.
In June 2014, the California Department of Technology (CDT) began work on a Contractor Performance Evaluation Scorecard. A workgroup made up of staff from the State Technology Procurement Division within the Technology Department, the Department of General Services, other state departments and volunteer members of the vendor community met several times to provide input and work out the details. Pilot projects are expected to begin this year.
In an in-depth interview with Public CIO, Carlos Ramos, who stepped down as CIO of California in March, described the genesis of the project and the progress to date.
As the CIO’s office fulfilled its responsibility to provide oversight of the state’s IT portfolio and projects, it started to see similar issues or factors whenever an IT project would go off the rails, Ramos said. There are many reasons why projects go bad, but one of them is a lack of performance on the part of contractors. “For instance, at the time of bidding, we would work with well-qualified candidates — the A team,” Ramos said. “When it came time to start an engagement, however, we were often not getting the A team anymore. Now some of that was on us, because procurements take so darn long, but sometimes it was the contractors.”
Aiming for Improvement

California isn’t the only state interested in vendor scorecards. Arizona is in the early stages of setting up a scorecard system to rate all types of suppliers, including IT vendors. Judy Wente, Arizona’s state procurement administrator and a former Intel Corp. supply chain executive, said that in the private sector, scorecards are the norm.

“To come to the state and not have scorecards seems like a disconnect because suppliers and the state need to have a way to communicate with each other and gain alignment on how work is being performed,” she said. It’s very similar to the relationship a company has with its own employees. “You have to be able to communicate with each other, and know what the baseline is and address continuous improvement. They should realize that status quo year after year just will not cut it.”

Wente said Arizona would have to work through the issues that the California Department of Technology did to identify how vendors would be rated, whether they have a right to appeal and how the information might be used in future procurements. “I agree there is complexity, but there is also complexity in the private sector, so this is not something that should be unfamiliar to IT vendors,” she said. Arizona has signaled its direction to vendors and hopes to roll out a version of its scorecard by this fall.
Besides providing performance feedback during a large project, the scorecards are also expected to give the state a way to take previous performance into account in future procurements, something that has been difficult to do in the past because evaluations were based on the requirements built into the procurement vehicles. “We had no systemic way of measuring performance and taking it into account,” Ramos said. “We saw that as a gap.”
While conceptualizing the scorecard, the workgroup decided it had to include consistent performance measures: the same things would be measured on each contract, reporting would be done regularly and consistently, and the results would be public.
“We have to give contractors the opportunity to dispute or rebut a rating that they may not agree with,” Ramos explained.
In addition, it was important to include vendors in the design process. “We went out to the bidding community and said, ‘We want to find a way to hold you accountable. We want to be able to rate your performance, report on it and use it the next time you bid, so help us come up with something,’” Ramos said. “And to their credit, they did.”
As the state looks for pilot projects to test the scorecard on, Ramos described how it will work: once per quarter, a project manager will rate the vendor on a scale of A through F on key performance indicators aligned with scope, schedule, cost and quality as defined in the contract, noting whether any shortcomings are the fault of the state or the vendor.
If the vendor wants to dispute the rating, it will have the opportunity to appeal to the project’s steering committee. If that committee agrees with the vendor, the rating would be changed. But if not, the rating would be confirmed and made public. “This creates 360-degree accountability,” Ramos said.
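To make those mechanics concrete, here is a minimal illustrative sketch in Python of how a quarterly rating and its appeal path might be represented. The A-through-F scale, the four KPI areas, the fault attribution and the steering-committee confirmation step come from Ramos’ description; every class, field and name in the code is a hypothetical illustration, not part of CDT’s actual system.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class Grade(Enum):
    """Letter grades a project manager can assign each quarter."""
    A = 4
    B = 3
    C = 2
    D = 1
    F = 0


class Fault(Enum):
    """Who is responsible for any shortcoming noted in the quarter."""
    STATE = "state"
    VENDOR = "vendor"
    NONE = "none"


@dataclass
class KPIRating:
    # KPI areas follow Ramos' description: scope, schedule, cost, quality.
    area: str
    grade: Grade
    fault: Fault = Fault.NONE
    disputed: bool = False
    confirmed: bool = False  # set once the steering committee has ruled


@dataclass
class QuarterlyScorecard:
    vendor: str
    project: str
    quarter: str
    ratings: List[KPIRating] = field(default_factory=list)

    def appeal(self, area: str, committee_agrees_with_vendor: bool,
               new_grade: Optional[Grade] = None) -> None:
        """Vendor disputes a rating; the steering committee changes or confirms it."""
        for rating in self.ratings:
            if rating.area == area:
                rating.disputed = True
                if committee_agrees_with_vendor and new_grade is not None:
                    rating.grade = new_grade  # committee sides with the vendor
                rating.confirmed = True       # either way, the result becomes final and public


# Hypothetical usage: one quarter's ratings on a pilot project.
card = QuarterlyScorecard(
    vendor="Example Systems Inc.",
    project="Hypothetical Pilot Project",
    quarter="Q3",
    ratings=[
        KPIRating("scope", Grade.B),
        KPIRating("schedule", Grade.D, fault=Fault.VENDOR),
        KPIRating("cost", Grade.C, fault=Fault.STATE),
        KPIRating("quality", Grade.A),
    ],
)
card.appeal("schedule", committee_agrees_with_vendor=False)  # committee confirms the rating
```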
The CDT will give state agencies flexibility in terms of how much to weigh these performance scores in future procurements. For example, one performance question is: Did the vendor deliver the scope it contracted to deliver on time? “Those factors may be of different levels of importance on future procurements,” said Ramos, “so we are going to give folks the flexibility to weight them differently in a procurement evaluation.”
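Below is an equally hedged sketch of how an agency might fold those grades into a future procurement evaluation using its own weights. The letter-to-point mapping and the weight values are assumptions made for illustration; the article says only that agencies will be able to weight the factors differently.

```python
# Illustrative only: fold one quarter's letter grades into a weighted
# past-performance score. The point scale and weights below are assumptions;
# the article says only that agencies may weight the factors differently.

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}


def weighted_past_performance(grades: dict, weights: dict) -> float:
    """grades: KPI area -> letter grade; weights: KPI area -> relative importance."""
    total_weight = sum(weights.values())
    score = sum(weights[area] * GRADE_POINTS[grade]
                for area, grade in grades.items() if area in weights)
    return score / total_weight


# A schedule-sensitive procurement might weight on-time delivery most heavily.
grades = {"scope": "B", "schedule": "D", "cost": "C", "quality": "A"}
weights = {"scope": 0.25, "schedule": 0.40, "cost": 0.20, "quality": 0.15}
print(f"Weighted past-performance score: {weighted_past_performance(grades, weights):.2f}")
# prints 2.15 on a 0-4 scale under these assumed weights
```

The point of the sketch is simply that the same underlying grades can produce different scores depending on what a given procurement cares about most.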
Despite Ramos’ departure from the state CIO role, the scorecard is proceeding as planned, according to Teala Schaff, a CDT spokeswoman. “The CDT continues to engage its customers and the vendor community in the development of the policies and procedures of the scorecard and will continue this path forward throughout the planned pilot phase,” she said. “The CPES pilot will include a diverse representation of new reportable IT projects in California; however, the state has not approved any new projects for procurement. Therefore, no projects have yet been identified to participate in the pilot phase, although our commencement is expected this year.”
Not everyone is ready to embrace the scorecard plan — Ramos admitted that many vendors have reservations about the idea.
Josh Nisbet, director of government clients and markets for Deloitte, participated in workgroup sessions and supports the efforts to improve the delivery of IT projects in the state. But he expressed concern that when it comes to measuring the success of projects, the state is still really only looking at vendor performance.
“At the end of the day, if a project is over budget or late, most of the rear-view-mirror look-back is evaluating the vendor, as opposed to measuring the performance of the overall team, which includes both the state and the vendor,” he said. “Those familiar with the complexity of large projects know there are a lot of moving parts and interdependencies that impact cost and schedule.”
Nisbet’s primary concern with California’s plan is that the project director is the person making the evaluation. If a vendor or consultant wants to appeal that evaluation, the process dictated by the state takes that appeal to the same governance council or executive body responsible for the project. “Most vendors would be uncomfortable seeking an appeal from the boss of someone who gave them a bad mark; they are not going to view that process as fair and impartial,” he said.
Nisbet suggested the appeal process take place somewhere removed from the project. “Some states that have implemented vendor performance scorecards, such as Texas, use state agencies removed from day-to-day management of the projects to evaluate and appeal vendor performance grades, infusing some third-party objectivity into the process.”
Gathering the right data to rate vendors in a way that would be meaningful in future procurements is very difficult, said Dean Kashiwagi, director of the Performance Based Studies Research Group at Arizona State University. (Kashiwagi developed the Best Value Performance Information Procurement System, which is designed to identify an expert vendor to handle a project and have the non-expert customer avoid micro-managing the project.) He has seen examples of public-sector entities in Europe trying to include vendor past performance as part of procurements. “They never get the data valid enough to actually disqualify somebody,” he said, predicting that California will also find that the approach doesn’t work.
Dugan Petty, a senior fellow at the Center for Digital Government and former CIO of Oregon, served on a task force that made several IT procurement recommendations to California in 2013, including the scorecard idea. He said that particularly in the area of IT procurement, past performance must be factored in somehow. “You’d do it in your home. If you hired painters to paint your house three years ago and they did a horrible job, you wouldn’t just give them the job the next time your house needed painting without taking that into consideration.”
But Petty said it’s important that California make it clear upfront how the ratings are going to be performed and used going forward. “You also have to give vendors a chance at recourse on information you capture,” he said. “You can’t just go behind closed doors and decide they didn’t do very well on this or that element of a project, and use that against them if they ever bid again on a project.”
Petty, who also previously served as Alaska’s chief procurement officer, said the vendors that have a strong reputation of being successful are likely to be a lot more supportive of the concept. “Fly-by-night outfits that come in and low-bid things will get weeded out,” he said. “You are likely to get higher quality in terms of performance from contractors with this in place. But the challenge is getting it set up and being inclusive and transparent in how you engage the vendors.”