Audit of Performance Against Service Standards Reporting

3. Findings and Recommendations

Objective 1: To assess the appropriateness of the data being tracked for reporting performance against service standards for both regulatory and non-regulatory fees

Interpretation of the Standards for Measurement Purposes

The service standards, both regulatory and non-regulatory, vary in how clearly they state performance expectations. Although some service standards specify when the time period for tracking starts and ends (e.g., "Results are provided to the client within two business days of the sample being received at the regional laboratory"), others are less specific and therefore open to interpretation. For example, some standards say that results will be reported "within x business days of receiving the sample." A customer may consider the sample "received" when they leave it at a regional office or service centre, but samples must often be transported to the Grain Research Laboratory in Winnipeg before testing can start. Results must then be reported back to the regional office so they can be prepared for the client. Altogether, the total number of business days required for processing may exceed what the customer expects based on the service standard target.

In addition to interpretation of the start and end times for tracking a service, certain service standards also require an interpretation of what should be measured (for example, "Grades are accurate" does not explain how the Canadian Grain Commission will determine whether its grading is accurate). Although some units, such as Licensing, have specifically defined and documented their interpretation of the service standards, most have not. Audit and Evaluation Services tested the Performance Against Service Standards results for regulatory fee service standards and found instances where the Western Region and the Eastern Region interpreted how the same service standard should be tracked differently. In all of these situations, complete documentation would promote consistency, demonstrate that the Canadian Grain Commission follows a systematic approach to gathering accurate performance data, and support responses to public or Central Agency inquiries.

We noted that, in many cases, service standards required interpretation, or assumptions had to be made, because of a mismatch between the expectation set out in the service standard and current Canadian Grain Commission operating procedures. Since the service standards were developed, changes to Canadian Grain Commission operations, technology, and related practices have created a gap. Because management does not intend to implement new or updated service standards until the next refresh of the user fees in 2018, the service standards themselves were specifically out of scope for this audit (see Scope). Audit and Evaluation Services has provided management with our observations and suggestions related to the service standards to assist with their upcoming review.

Recommendations:

1) Establish start and end times for tracking performance

To support the Canadian Grain Commission's performance reporting, we recommend that, for each service standard (regulatory and non-regulatory), management establish and document consistent methods for determining when the tracked time period starts and ends, along with the rationale for the start and end times chosen. The interpretation of start and end times implied by a service standard may vary by fee code but should be as consistent as possible across all services.

(Impact: medium)

2) Document assumptions

For service standards that require additional explanation to clarify performance measurement, we recommend that management document:

  • the basis of the measurement numbers (numerator and denominator of the percentage calculation, where applicable)
  • how the numbers are obtained
  • the rationale for the values chosen

In cases where both regions provide input into the same measurement, management should ensure values are determined and reported using the same methodologies to increase consistency in the results.

(Impact: medium)

Objective 2: To assess the accuracy, integrity, and reliability of service standard reporting for regulatory fees.

Ability to Verify Results to Actual Performance

The Canadian Grain Commission's performance against service standards is reported externally, and it is important that results be accurate and reliable. To verify accuracy and reliability, Audit and Evaluation Services analyzed reported results and the methods used to obtain results for all of the regulatory fees. For some fees, the analysis included verifying a sample of reported results. We were unable to reasonably verify the accuracy of results for the following service standards:

Fee: Outward official inspection – ships
Service standard: "When grain being loaded is other than grade ordered, the Canadian Grain Commission will inform the elevator staff by form IW-7."
Explanation: Due to the high volume, reporting limitations in the inspection system, the subjectivity involved in determining whether an IW-7 form (a non-conformance report submitted to the terminal elevator by a Canadian Grain Commission inspector) is required, and the specialized inspection knowledge needed to interpret each situation, the reported performance results could not be re-created or verified through a sample. Further, because the two regions were using different methodologies to calculate their results, Audit and Evaluation Services could not conclude whether the results were reasonable.

Fee: Full term licence, short term licence
Service standard: "Licensee inquiries will receive a response within one business day."
Explanation: Because the Canadian Grain Commission does not record phone calls or retain completed email correspondence related to inquiries, Audit and Evaluation Services could not verify that all inquiries were captured in the recorded results. We also observed that not all staff in the Licensing and Compliance area were using the tracking spreadsheet needed to maintain accurate performance results.

The performance results for service standards related to submitted samples and producer car applications are generated by system reports that calculate the time difference between data being entered and a certificate or letter being issued from the system. The performance results are therefore verifiable. However, the accuracy of these results depends on the timeliness with which Canadian Grain Commission staff enter the source data into the system, which could not be verified. Through discussion, various staff and management independently confirmed to Audit and Evaluation Services that their established procedures support timely processing and data entry for the activities identified. For these service standards, the benefit of a system-generated report for quarterly reporting likely outweighs the risk of potential minor inaccuracies in the results; however, management should be aware of the exposure to potential delays in data entry that would not be captured in the performance data.
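For illustration, the calculation performed by such a system report can be sketched as follows. This is a minimal sketch in Python, assuming a Monday-to-Friday business-day definition and ignoring statutory holidays; the dates and the two-day target are illustrative, not actual Canadian Grain Commission data.

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count business days (Mon-Fri) from start, exclusive, to end, inclusive.

    Statutory holidays are ignored in this simplified sketch.
    """
    days = 0
    current = start
    while current < end:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days += 1
    return days

# Hypothetical example: sample data entered on a Friday,
# certificate issued from the system the following Tuesday.
entered = date(2015, 3, 6)   # Friday
issued = date(2015, 3, 10)   # Tuesday
elapsed = business_days_between(entered, issued)   # 2 business days
meets_standard = elapsed <= 2   # e.g., a "within two business days" standard
```

As the paragraph above notes, a report built this way measures only the interval between two system timestamps; any delay between the service actually occurring and the data being entered is invisible to the calculation.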

Other than the system reports available for the service standards mentioned above, most performance must be tracked manually. Due to relatively low volume, there are no formal tracking mechanisms for some service standards, such as those related to reinspection of submitted samples and publishing on the Canadian Grain Commission web site (a total of seven service standards). In addition, the performance measure for issuing documentation for outward inspection and weighing within specified time frames is recorded by individual vessel but not tracked in a consolidated manner. Because reliable source data exists, it was possible to re-create the performance results for these three groups of service standards; however, despite the low volume and availability of information, omissions were identified in the reported results during the audit. The results would have been more accurate had they been tracked consistently as the service took place rather than compiled at the end of the quarter.

Overall for regulatory fees, taking into account the service standard interpretations and various assumptions as noted in Objective 1, Audit and Evaluation Services concluded that the Canadian Grain Commission has established appropriate methods for collecting performance data, and the majority of the data can be verified against actual performance. Conclusions and comments for each regulatory service standard have been provided to Canadian Grain Commission's Executive Management Committee and the Planning and Reporting unit for reference.

Recommendations:

3) Review Measurement Methodology

We recommend that management review the measurement methodology for service standard results that could not be verified and implement improved methods of tracking performance to increase the reliability and verifiability of data.

Management should also monitor service standards that rely on system reports to ensure operating procedures continue to support the accuracy of the performance results reported.

(Impact: medium)

4) Formalize Tracking

We recommend that more formal tracking methods be implemented for service standards related to reinspections, publishing materials on the Canadian Grain Commission web site, and issuance of documentation for outward inspection and weighing. Tracking can be a simple spreadsheet or table to record transactions throughout the quarter and facilitate accurate reporting.
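To illustrate how lightweight such tracking can be, the following sketch shows one possible structure for a quarterly tracking table and the resulting performance calculation. The column names, records, and values are hypothetical, not the Canadian Grain Commission's actual fields.

```python
import csv
from io import StringIO

# Hypothetical columns for a quarterly tracking table.
FIELDS = ["date_received", "transaction_id", "service",
          "date_completed", "business_days_elapsed", "standard_met"]

# Each transaction is recorded when the service takes place,
# not reconstructed at the end of the quarter.
transactions = [
    {"date_received": "2015-01-12", "transaction_id": "R-001",
     "service": "reinspection", "date_completed": "2015-01-14",
     "business_days_elapsed": 2, "standard_met": "yes"},
    {"date_received": "2015-02-03", "transaction_id": "R-002",
     "service": "reinspection", "date_completed": "2015-02-09",
     "business_days_elapsed": 4, "standard_met": "no"},
]

# Quarterly result: percentage of transactions meeting the standard.
met = sum(1 for t in transactions if t["standard_met"] == "yes")
performance = 100 * met / len(transactions)

# The same records can be written out as a simple CSV tracking sheet.
buffer = StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(transactions)
```

Recording each transaction as it occurs gives the reviewer a complete audit trail from individual records to the reported quarterly percentage.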

(Impact: low)

Review and Monitoring of Performance Results

All regulatory fees are generated in the Industry Services division, and Industry Services accordingly owns all but four of the related service standard performance results. An Industry Services administrative assistant has been designated to receive and record quarterly Industry Services performance results. The results are then reviewed by Industry Services management prior to submission to the Planning and Reporting unit for further consolidation and reporting to the Executive Management Committee for approval.

During the audit, Audit and Evaluation Services tested a sample of service standards for selected quarters in 2014-2015 to confirm reported performance results. Despite the levels of review, testing by Audit and Evaluation Services identified, or prompted the business unit to identify, errors in the previously reviewed and approved third quarter Performance Against Service Standards report. In total, nine service standards were affected by errors that likely would not otherwise have been identified. While some of these errors would be difficult to identify without a second person re-calculating the result from source data, others were more apparent. For example:

  • Performance data was reported for two inspection and weighing fees for the first quarter of the year, while in the second and third quarters no transactions were recorded. Because management had decided to stop offering the service the previous fiscal year, transactions for those fee codes were very unlikely, a fact confirmed by a review of financial results showing that no revenue had been earned from those fees in the quarter. The numbers had been entered in the report by mistake.
  • For two separate fees, the reporting methods that were used in the first quarter were later found to be inaccurate, and changes were made to improve the data for subsequent quarters. The first quarter data was not corrected even though the information was readily available.

It is evident that a stronger control function needs to be in place to monitor, review and verify reported results.

Recommendations:

5) Management Review of Results

We recommend that review of Performance Against Service Standards results for accuracy and reasonability be strengthened, beginning at the business unit management level. Review measures should include discussion with key data providers about the quarter's results and periodic verification against supporting data where appropriate for a particular service standard. Management should then sign off to confirm that it has carried out a thorough review of the results and to acknowledge accountability for the Performance Against Service Standards report.

(Impact: medium)

Administration of Performance Against Service Standards Report

As the unit responsible for Central Agency and external reporting for the Canadian Grain Commission, Planning and Reporting collects and reports on Canadian Grain Commission's Performance Against Service Standards results. The unit has a unique independent perspective as it does not generate any service standard results itself but is in contact with all the service standard owners and has the earliest access to the consolidated results. Service standards cover a breadth of services from across the Canadian Grain Commission, from the regions to headquarters, from the Industry Services analytical labs to the Grain Research Laboratory. Because there are so many contributors to, and owners of, performance against service standards, it is difficult for any one operational unit to have a good understanding of the impact that each result has and its relationship to other service standards. The need for a central control point is suggested by the audit results, which include errors, undocumented (and, in some cases, previously unknown) assumptions, and inconsistencies in the tracking and reporting of service standards.

Until the audit, Planning and Reporting had mainly focused on collecting results and had not provided a significant control function. Control activities would include questioning and challenging unusual or unexpected results or reports of "no service requested," assessing reasonableness based on operational events in the quarter, and periodically verifying (either through discussion or sample testing) the accuracy of performance results and supporting data. As of the fourth quarter of 2014-2015, the Planning and Reporting unit started performing a high-level comparison of service standard results to user fee revenue and initiated follow-up with management regarding an unexpected result. These are positive steps toward creating an effective control function in the Performance Against Service Standards reporting process.

Recommendations:

6) Enhance the Control Function

We recommend that the Planning and Reporting unit be clearly assigned a control role over service standards, and that the unit establish the steps required each quarter to confirm the Performance Against Service Standards results. As part of the control function, we recommend that Planning and Reporting maintain a centralized record of the interpretation and assumption documentation discussed in Recommendations 1 and 2.

(Impact: medium)

Objective 3: To review the preliminary performance data for non-regulatory fees and advise on the effectiveness of the reporting methods proposed.

Non-Regulatory Service Standards

Although some work was performed on the third audit objective, the audit focused mainly on regulatory fees. Regulatory fees are more significant in terms of revenue generated and potential consequences if service standards are not met. The Canadian Grain Commission strives to meet all of its service standards, but there are no financial consequences for not meeting non-regulatory standards.

At the end of the third quarter (the in-scope period for the audit), service standard owners were still determining the most appropriate methods for collecting and reporting performance data for non-regulatory standards. Audit and Evaluation Services participated in discussions with various business units as well as Planning and Reporting and provided advice on methods where appropriate. Complete non-regulatory performance data was reported internally for the first time at the end of the fourth quarter. Although Audit and Evaluation Services did not review the non-regulatory results in detail, we observed that the same overall recommendations outlined in this report can be applied to the non-regulatory service standards.

Recommendation:

None.