Audit of the Management Control Framework of Grants and Contributions (2014-2015) - Focus on the General Assessment Process

September 2015


 


Acronyms

AANDC: Aboriginal Affairs and Northern Development Canada
CFO: Chief Financial Officer
DTP: Treasury Board Directive on Transfer Payments
FNIHB: First Nations and Inuit Health Branch (of Health Canada)
FSO: Funding Service Officer
GA: General Assessment
G&Cs: Grants and Contributions
GCIMS: Grants and Contributions Information Management System
HC: Health Canada
HQ: Headquarters
MCF: Management Control Framework
PTP: Treasury Board's Policy on Transfer Payments
TPCOE: Transfer Payments Centre of Expertise

 

 

Executive Summary

Background

In order to support compliance with the Treasury Board Policy on Transfer Payments (PTP) and the management of funding agreements that have been negotiated between Aboriginal Affairs and Northern Development Canada (AANDC or the Department) and funding recipients, a General Assessment (GA) tool was implemented in the Fall of 2010. This tool provides an annual snapshot of a funding recipient's past performance and identifies strengths and emerging risks that may impact the manner in which AANDC manages transfer payments to the recipient.

The GA is a living tool. Version 1 of the GA was implemented and used in the Fall of 2010 for funding agreements for the 2011-2012 funding year. Since that time, Version 2 was developed and has been used for subsequent funding years.

The results of the GA are expected to affect the duration of funding agreements, the frequency and type of departmental monitoring activities (and related reporting requirements), the flexibility of the funding arrangement and a recipient's eligibility for certain funding approaches. As such, the Department must be able to defend the consistency and transparency of the assessment results. Since Part A GA results are to be made public starting in January 2016, there is the potential for additional scrutiny by recipients and other stakeholders, further increasing the need for consistency and defensibility across the Department.

Audit Objective and Scope

The objective of this audit was to provide assurance to senior management of the adequacy and effectiveness of the application of the GA tool across Aboriginal Affairs and Northern Development Canada to support the management of funding agreements with funding recipients.

The scope of this audit focused on an assessment of the adequacy and effectiveness of the GA tool (Part A, Part B, and Part C) examined horizontally across a sample of seven (7) regions for the fiscal year 2014-2015 in order to evaluate the consistency of the application of the GA tool.

Statement of Conformance

The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the quality assurance and improvement program.

Conclusion

The audit found that GAs are being performed as required by the Department's Directive 410 – General Assessments. However, the audit noted specific inconsistencies, across regions and sectors, in the processes established to develop, review and approve GAs, as well as the documentation and analysis maintained to support assessment results.

In order to improve consistency in the development, review and approval of GAs, as well as their scoring throughout the Department, the audit identified opportunities to improve the tools, guidance and training associated with the GA process.

Further, the audit found that GA scores are not consistently impacting the level of administrative requirements imposed on recipients. Although the intention of the Policy on Transfer Payments was to allow for a risk-based approach to reporting and monitoring, current limitations in program reporting and management regimes mean that regions and sectors are unable to adjust either the reporting requirements or the monitoring performed on recipients based on GA scores.

Recommendation

The audit team identified areas where the adequacy and effectiveness of the GA tool should be improved, resulting in four (4) recommendations as follows:

  1. The Grants and Contributions Management Oversight Committee should develop and coordinate a Department-wide quality assurance/monitoring program in order to provide the Department with a level of assurance that GAs are being completed in a consistent manner and assessment results are supported by adequate justification. The quality assurance/monitoring program should include the review and assessment of Part A, Part B and Part C GAs.
  2. The Grants and Contributions Management Oversight Committee should ensure the standardization of program-specific checklists/templates used to support the risk analysis across all regions. These checklists/templates should be assessed for their alignment to the risk considerations and benchmarks outlined in the GA Workbook prior to implementation.
  3. The Grants and Contributions Management Oversight Committee should improve alignment of the reporting requirements and monitoring performed on recipients, as well as the flexibility in funding approaches available to recipients based on a recipient's level of risk.
  4. The Grants and Contributions Management Oversight Committee should address the recommendations laid out below to address gaps identified with Directive 410 – General Assessments, and the GA Workbook.
    1. Directive 410 – General Assessments should be updated to further define:
      1. The role, responsibilities and accountabilities of Funding Services and program representatives in the development, review and challenge of the programs' components of a Part A GA; and
      2. The role, responsibilities and accountabilities of regional offices in the review and challenge of the impact of Health Canada's input into the GA scores.
    2. The GA Workbook should be updated to include benchmarks for the low-medium and medium-high risk ratings for each risk consideration based on relevant criteria, such as what is already included for the low, medium and high risk ratings.
    3. In order to ensure consistency in the scoring of GAs, Directive 410 and the GA Workbook should be updated to provide additional guidance on how special considerations, such as those identified during the audit, are to be considered/addressed when completing GAs.
    4. Directive 410 should be updated to clarify roles and responsibilities over the review and approval of GAs, specifying the appropriate authority level for approvals for Part A and B GAs, and ensure their consistent application across regions/sectors.

Management Response

Management is in agreement with the findings, has accepted the recommendations included in the report, and has developed a management action plan to address them. The management action plan has been integrated into this report.

 

 

1. Background

1.1 Grants and Contributions

Aboriginal Affairs and Northern Development Canada (AANDC or the Department) makes funding available to First Nations and other recipients through grants and contributions (G&Cs) for the delivery of programs and services such as education, land management, social development and community infrastructure. Total departmental spending on G&Cs for the 2014-2015 fiscal year was $6.3 billion (Footnote 1).

AANDC's transfer payment programs are administered in accordance with the Treasury Board Policy on Transfer Payments (PTP) and Directive on Transfer Payments (DTP), which took effect on October 1, 2008. The objective of the PTP and DTP is to manage transfer payment programs with integrity, transparency and accountability, taking into account the risks, and to ensure that programs are effectively focused on citizens and beneficiaries, and are designed to achieve various Federal Government priorities and expected results. More specifically, the PTP outlines the expectations that risk-based approaches are adapted to the design of: transfer payment programs; program terms and conditions; funding agreements; recipient monitoring; and, recipient auditing.

Within AANDC, the initial implementation of the PTP and DTP took place in March 2011. In order to meet the expectations of the PTP, the Chief Financial Officer (CFO) Sector established the Transfer Payments Centre of Expertise (TPCOE), which has put in place the Management Control Framework (MCF) for Grants and Contributions to ensure effective management and monitoring of G&C programs and compliance with the PTP and DTP. AANDC's CFO is accountable for the overall management of transfer payment funds and, as such, is the custodian of AANDC's MCF for G&Cs. The MCF establishes roles and responsibilities for the delivery of G&Cs, specifically for program management (the design and implementation of a program) and transfer payment operations (the operation of a program with recipients), and sets the Department's expectations of how G&Cs are to be managed across regions and at Headquarters (HQ).

Due to the significance of G&C funding to the overall departmental budget and mandate, an internal audit of the MCF is conducted every year.

1.2 The General Assessment

The General Assessment (GA) is a tool that was implemented in the Fall of 2010 to support compliance with the PTP and the management of funding agreements that have been negotiated between AANDC and funding recipients. The tool provides an overview of a funding recipient's past performance and identifies strengths and emerging risks that may impact the manner in which AANDC manages transfer payments to the recipient. The GA allows AANDC to assess the capacity of recipients to access a variety of funding approaches, including multi-year agreements. The results of the GA may also result in adjustments to AANDC's reporting requirements of the recipient.

Since 2013-2014, AANDC has collaborated with Health Canada (HC) in the development of ongoing multi-program agreements in order to provide funding to common recipients of both AANDC and the First Nations and Inuit Health Branch (FNIHB) of Health Canada. For the 2014-2015 fiscal year, Part A GAs developed for recipients under joint, ongoing multi-program agreements included an assessment of the recipients' program management activities associated with both AANDC and FNIHB programs.

The GA is a living tool. Version 1 of the GA was implemented and used in the Fall of 2010 for funding agreements for the 2011-2012 funding year. Since that time, Version 2 was developed and has been used for subsequent funding years. Version 2 contains refinements to the scoring scale of the considerations and the weightings assigned to each risk factor, as well as a streamlining of the risk factors.

In support of improving the consistency of the completion of GAs, a GA Workbook was created. It outlines the various "considerations", and associated benchmarks, to be assessed for each risk factor (governance, planning, financial management, program management, and other considerations) during the completion of the GA. The GA Workbook supports an equitable and consistent approach to managing funding agreements.

The GA Workbook is divided into two parts: Part A, for more complex funding relationships, and Part B, for less complex funding relationships. Part A is completed for recipients who have an ongoing relationship with AANDC and is completed prior to entering into a funding agreement and once annually thereafter for multi-year agreements in which the recipient is assessed as medium risk or higher. Part A GAs are typically completed by regional offices, with only a limited number of Part A GAs completed by HQ sectors. The Part A GAs include an assessment of the following risk factors: Governance, Planning, Financial Management and Program Management (applicable for each eligible program the recipient is receiving funding for).

In contrast to Part A GAs, Part B GAs are used to assess recipients of a one-time project or an ongoing specific service with its own funding agreement. Part B GAs include an assessment of the governance, performance history, financial stability and planning of the recipient, as well as project complexity for that program/project. Part B also contains an "Other Considerations" risk category, which is used when specific program requirements result in the need to evaluate additional areas of risk. For 2014-2015, 30% of the Part B GAs were completed by HQ sectors.

The Part C GA tool is used to assess the risk and capacity of an ongoing, multi-program recipient in support of the decision to move or renew the recipient for a block funding agreement. As a result, Part C GAs contain much more detailed criteria than Part A and B GAs and provide a checklist that determines a numeric assessment score for each risk category. Part C GAs are completed by regional offices.
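Directive 410 and the GA Workbook define the actual scoring methodology, which is not reproduced in this report. The minimal sketch below, using entirely hypothetical weights, rating values and thresholds, is intended only to illustrate the general shape of the weighted roll-up described above, in which consideration ratings by risk factor are combined into an overall risk level.

```python
# Hypothetical illustration only: the weights, rating values and thresholds
# below are assumed for demonstration and are not AANDC's actual GA methodology.

# The five-point rating scale referred to in this report, mapped to
# illustrative numeric values.
RATING_VALUES = {
    "low": 1.0,
    "low-medium": 2.0,
    "medium": 3.0,
    "medium-high": 4.0,
    "high": 5.0,
}

# Assumed weightings for the Part A risk factors; the report notes only that
# the programs' component accounts for roughly a third of the total score.
FACTOR_WEIGHTS = {
    "governance": 0.25,
    "planning": 0.20,
    "financial_management": 0.20,
    "program_management": 0.35,
}


def overall_ga_score(factor_ratings: dict[str, str]) -> float:
    """Roll up risk-factor ratings into a weighted score (illustrative only)."""
    return sum(
        FACTOR_WEIGHTS[factor] * RATING_VALUES[rating]
        for factor, rating in factor_ratings.items()
    )


def overall_risk_level(score: float) -> str:
    """Map a numeric score to an overall risk level using assumed thresholds."""
    if score < 2.0:
        return "low"
    if score < 3.5:
        return "medium"
    return "high"


ratings = {
    "governance": "low",
    "planning": "medium",
    "financial_management": "low-medium",
    "program_management": "medium-high",
}
score = overall_ga_score(ratings)
print(f"GA score = {score:.2f}, overall risk = {overall_risk_level(score)}")
```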

Since the results of the GA are expected to affect the duration of funding agreements, the frequency and type of departmental monitoring activities (and related reporting requirements), the flexibility of the funding arrangement and a recipient's eligibility for certain funding approaches, the Department must be able to defend the consistency and transparency of the assessment results. In an effort for increased transparency, the Department has made the decision to publish all Part A GA results starting in January 2016, resulting in the potential for additional scrutiny by recipients and other stakeholders. Due to these factors, the GA was determined to be the focus of this audit.

 

 

2. Audit Objective and Scope

2.1 Audit Objective

The objective of this audit was to provide assurance to senior management on the adequacy and effectiveness of the application of the GA tool across the Department to support the management of funding agreements with funding recipients.

2.2 Audit Scope

The scope of the audit focused on an assessment of the adequacy and effectiveness of Part A, Part B, and Part C of the GA tool. This included:

  • The oversight associated with the development and finalization of the assessment results and scores;
  • The selection of an appropriate funding mechanism based on the results of the GA, as applicable (Footnote 2);
  • The activities and participation in the completion of the annual assessment per recipient; and,
  • The governance, tools and guidance in place to ensure the consistent application of the assessment across all regions.

The audit scope included examining a sample of Part A, Part B, and Part C GAs for the 2014-2015 fiscal year that covered site visits to three (3) regional offices, as well as a desktop exercise for an additional four (4) regions, in order to evaluate the consistency of the application of the tool.

The scope of the audit did not include an assessment of the GA results completed by Health Canada. The audit scope only covered the review and approval process of AANDC over the completed GA, which includes the components completed by Health Canada.

 

 

3. Approach and Methodology

The audit was conducted in accordance with the requirements of the Treasury Board's Policy on Internal Audit and followed the Institute of Internal Auditors' International Standards for the Professional Practice of Internal Auditing.

The audit team examined sufficient, relevant evidence and obtained sufficient information to provide the appropriate level of assurance in support of the audit conclusion.

The principal audit techniques included interviews, the examination of documentation, and detailed testing and analysis of a sample of Part A, Part B and Part C GAs.

In order to satisfy the audit criteria, as established in Appendix A, a sample of regions and recipients was selected for testing, as noted above. The following outlines the approach used to determine the sample.

Selection of Regions for Site Visits

The sampling methodology considered the following factors in the selection of the regional offices to be visited:

  • Size (number) of GAs completed by regional offices;
  • Size (number) of Low, Medium, and High risk GAs by regional office; and,
  • Feedback obtained during the planning phase from individuals interviewed at HQ and in the regions.

Based on the above analysis and with the objective being to assess the adequacy and effectiveness of the application of the GA tool across the Department, the following regional offices were selected for on-site field testing:

  • Atlantic;
  • Alberta; and,
  • British Columbia.

To gain a comprehensive understanding of the adequacy and effectiveness of Part A, Part B, and Part C of the GA tool, we expanded our testing to include GAs completed in the Ontario, Manitoba, Saskatchewan and Quebec regions. The audit also included interviews in some program areas at HQ focusing on the Part B of the GA Workbook. These included program areas within the Northern Affairs Organization, Lands and Economic Development Sector, and the Treaties and Aboriginal Government Sector.

 

 

4. Conclusion

The audit found that GAs are being performed as required by the Department's Directive 410 – General Assessments. However, the audit noted specific inconsistencies, across regions and sectors, in the processes established to develop, review and approve GAs, as well as the documentation and analysis maintained to support assessment results.

In order to improve consistency in the development, review and approval of GAs, as well as their scoring throughout the Department, the audit identified opportunities to improve the tools, guidance and training associated with the GA process.

Further, the audit found that GA scores are not consistently impacting the level of administrative requirements imposed on recipients. Although the intention of the Policy on Transfer Payments was to allow for a risk-based approach to reporting and monitoring, current limitations in programs' reporting and management regimes are preventing regions and sectors from adjusting either the reporting requirements or the monitoring performed on recipients as a result of the GA scores.

 

 

5. Findings and Recommendations

Based on the evidence gathered through the examination of documentation, analysis and interviews, each audit criterion was assessed by the audit team and a conclusion for each was determined. Where a significant difference between the audit criterion and the observed practice was found, the risk of the gap was evaluated and used to develop a conclusion and to document recommendations for improvement.

5.1 Roles, Responsibilities and Accountabilities

Directive 410 – General Assessments (Directive 410), developed by TPCOE and effective April 1, 2011, outlines certain key roles, responsibilities and accountabilities in the completion, review and approval of GAs, as well as the maintenance of the GA framework, including the development of training and working tools. The User Guide, revised in August 2011, also includes the Terms of Reference for the Regional/Sector Transfer Payment Management Committee.

Specifically, Directive 410 assigns responsibility for completing the GAs to Regional Funding Services or the program equivalent, with oversight provided by Regional Directors General and/or Directors General of the program branch, the Chief Financial Officer (CFO) Sector and the regional and/or sector Transfer Payment Management Committees, or equivalent. The audit observed that although the assigned roles, responsibilities and accountabilities of certain key personnel involved in the GA process are clearly defined, there was a lack of a consistent understanding of some assigned roles. The audit noted that some improvements are necessary to ensure a consistent approach to the completion of programs' components of Part A GAs and HQ sector-completed Part B GAs, as well as the review and challenge of Health Canada's assessments.

5.1.1 Accountability over Programs' Components of Part A General Assessments

Through the conduct of regional site visits, the audit noted that accountabilities for the completion, review and challenge of programs' components of the Part A GAs were inconsistently assigned. Specifically, the completion of the programs' portion of the GA was, at times, assigned directly to regional program representatives with little to no challenge by the assigned FSO and, at other times, completed solely by the FSO with little to no consultation with program representatives. Directive 410 does not stipulate the level of accountability programs should have in the completion of the GAs; rather, it outlines that "in Regions, the Lead Officer (FSO) will coordinate with Regional Program Responsibility Centre Manager and specialists, and where appropriate, with HQ Responsibility Centre Manager to complete the Program Management component of the assessment". Since the programs' components of the GA account for approximately a third of the total GA score, discrepancies in the level of engagement of both program and funding services representatives in the assessment have led to inconsistencies in both the scoring and the level of justification provided for scores. For example, the audit noted that some programs were providing little to no justification for the scores given, while others were providing justifications not aligned to the risk considerations outlined in the GA Workbook.

5.1.2 Roles and Responsibilities for HQ Sector Completed Part B General Assessments

The audit noted inconsistencies in the assigned roles and responsibilities within HQ sectors for the completion, review and approval of Part B GAs. Although Directive 410 and the User Guide do not differentiate between the roles and responsibilities for the completion of Part A, Part B or Part C GAs, the audit noted that Part B GAs are, at times, being reviewed, challenged and approved by a Responsibility Centre Manager, and at other times, the assessments are only subject to a peer review.

Without the consistent application of a review and approval process to all Part B GAs, which includes the review and approval of the GAs by an individual or oversight committee with sufficient authority, there is an increased risk that assessments with inconsistent scoring or inadequate justification for scoring will not be identified and amended as required.

5.1.3 Role and Responsibilities in the Review and Challenge of Health Canada's Assessments

Part A GAs developed for recipients under joint, ongoing multi-program agreements for the fiscal year 2014-2015 included an assessment of the recipients' program management activities associated with both AANDC and FNIHB programs. Health Canada was responsible for providing the assessment results for FNIHB programs, which feed into the overall final GA score used for decision-making by AANDC.

Through the conduct of interviews with regional representatives, the audit noted an inconsistent understanding regarding the level of review and challenge required of regional personnel over the scoring and the level of justification provided by HC. In one of the regions visited, the ratings provided by HC were entered directly into the Grants and Contributions Information Management System (GCIMS) with no review or challenge by AANDC; however, in another region, the assigned FSO reviewed and challenged the assessment ratings provided by HC. Regional interviewees noted a lack of direction from TPCOE regarding regional responsibilities in reviewing and/or challenging the assessment results provided by HC.

Because the ratings provided by Health Canada can have a significant impact on the Program Management consideration score given to a recipient with a joint, ongoing multi-program agreement, it is critical that clear expectations be set for regional responsibilities regarding the review and challenge of the impact of HC's assessment on the Department's overall scoring of recipients.

5.1.4 Departmental Quality Assurance Program over GAs

With the expectation that Part A GA results will be made public, it is important to ensure that regions are completing the assessments in a consistent manner, and that scoring is consistent across regions. Directive 410 outlines that the CFO Sector is responsible for "providing periodical oversight and activities to ensure compliance with the Directive". In order to meet this requirement, there is an expectation that TPCOE develop and implement a quality assurance/monitoring program across the Department. A quality assurance/monitoring program would include the review and challenge of the assessment results across regions/sectors.

Interviews with a sample of regional and TPCOE representatives revealed that no quality assurance/monitoring program has been developed and implemented Department-wide. Rather, to date, regional offices have been responsible for developing and implementing their own quality assurance/monitoring processes to identify inconsistencies in the scoring and the level of justification provided within the region; however, no holistic review of the consistency of scoring has been performed.

Without a robust quality assurance/monitoring program which assesses the consistency of scoring across regions and challenges the level of justification provided for scores, there is an increased risk that regional differences in the completion and scoring of GAs will not be identified and addressed in a timely manner.

The audit was informed that Regional Operations is considering the implementation of a peer review process, whereby a sample of regionally completed Part A GAs would be reviewed and challenged by another region. This proposed peer review process could be a key component of a Department-wide quality assurance/monitoring program, whereby the results or issues associated with the review are escalated and addressed by TPCOE.

Recommendations

  • For recommended actions regarding clarifying roles and responsibilities in Directive 410, please refer to recommendation four (4), bullet A, under Section 5.5.

1. The Grants and Contributions Management Oversight Committee should develop and coordinate a Department-wide quality assurance/monitoring program in order to provide the Department with a level of assurance that GAs are being completed in a consistent manner and assessment results are supported by adequate justification. The quality assurance/monitoring program should include the review and assessment of Part A, Part B and Part C GAs.

5.2 Training, Tools and Guidance

The audit team noted gaps in the level of training provided to those assigned with responsibility for completing, reviewing or approving GAs. Additionally, the audit identified opportunities for improving the objectivity of the GA process through enhancements to the GA Workbook.

5.2.1 Training

TPCOE developed and provided training to HQ sectors and regional offices on how to complete GAs using the GA Workbook and GCIMS when the GA process and tools were first rolled out. Since that initial training, only online training on how to use the GA tool in GCIMS (i.e. the keystrokes required to complete the assessments) has been made available. This training does not provide guidance on how to interpret the risk considerations outlined in the Workbook nor on the level of justification required for scoring. Further, through the conduct of interviews with a sample of HQ sector and regional office representatives, the audit noted that many key individuals responsible for completing the GAs were unaware of the availability of the online training and expressed a desire for substantive GA training.

TPCOE has recently acknowledged the need for additional training on the GA process and issued a Statement of Work to hire a contracted resource to review and revise existing training materials, as well as provide "Train the Trainer" sessions to all regional offices starting in the Fall of 2015.

5.2.2 Tools and Guidance

Those tasked with completing the GAs are required to follow the requirements outlined within the Workbook. The Workbook outlines the risk categories and considerations to be used in the assessments, as well as the benchmarks for the low, medium and high rating scales for each risk consideration. The benchmarks outline the elements that a recipient needs to have in place in order to reach that risk rating.

Through interviews with FSOs, the audit team noted that the degree to which the Workbook was actively referenced during the completion of the GA Part A varied between FSOs. Generally, newer FSOs would rely on the Workbook to a greater extent than more experienced FSOs. The risk with an inconsistent use of this tool is that FSOs may begin to stray from the key benchmarks laid out in the Workbook, and apply different assessment criteria than other FSOs. To address this risk, the audit team strongly suggests that TPCOE, as part of the "Train the Trainer" sessions, emphasize to FSOs and GA reviewers the importance of applying the benchmarks and guidance as included in the GA Workbook to ensure that consistent assessment criteria are being applied to General Assessments.

Generally, the audit team noted no concerns regarding the relevancy and appropriateness of the benchmarks established for Part B GAs, as well as the first three risk categories (Governance, Planning and Financial Management) for Part A GAs. However, certain regional program representatives noted concerns regarding the relevancy and application of the benchmarks outlined in the Workbook for the Program Management risk factor and its considerations. For example, program representatives noted requiring additional guidance on: what constitutes disruptions, delays, and gaps in service/project delivery; the expectations regarding program staff qualifications and capacity; and, what policies and plans are required for sound management of the program.

While the national Workbook can only provide general program guidance as it is used across multiple programs, certain program staff have found the need for more customized program specific guidance, and as such, have developed their own checklists/templates which feed into the Part A and Part B GAs. For example, in one region a checklist/template was created and is consistently used in the assessment of the Program Management risk factor for Community Infrastructure. Their checklist/template stipulates expectations for each risk consideration, such as the listing of policies required by the Capital Program for assessment against the risk consideration Service/Project Plans and Policies. The results of these checklists are incorporated into the overall GA results.

Because the development and implementation of program-specific checklists/templates is generally performed at the regional-level, the audit noted inconsistencies in the scoring against the risk considerations and the level of justification provided across the regions for the same programs.

The audit team further noted that in addition to the use of the low, medium and high risk ratings, the GA tool allows the use of the low-medium and medium-high ratings (i.e. a five point rating scale), which have not been formally defined through the establishment of benchmarks in the Workbook. Without the establishment of criteria for the low-medium and medium-high risk ratings for each risk consideration, the audit team identified inconsistencies in their use across programs, regional offices and individual FSOs.

Recommendations

2. The Grants and Contributions Management Oversight Committee should ensure the standardization of program-specific checklists/templates used to support the risk analysis across all regions. These checklists/templates should be assessed for their alignment to the risk considerations and benchmarks outlined in the GA Workbook prior to implementation.

  • For recommended actions regarding establishing additional benchmarks in the GA Workbook, please refer to recommendation four (4), bullet B, under Section 5.5.

5.3 GA Assessment Results

5.3.1 GA Assessment Results

Through the conduct of detailed testing on a sample of Part A, B and C GAs across the Department, the audit team noted inconsistencies in the level of justification provided for the scoring, as well as discrepancies in the scoring against the risk considerations for Part A and Part B GAs. For Part C GAs, the audit team noted a consistent and adequate level of justification for the scoring against the risk considerations.

In attempting to re-perform the assessment for a sample of GAs using solely the information contained within the GA report and the benchmarks outlined in the Workbook, the audit team had difficulty arriving at the same risk score for certain risk considerations and had to follow up with either a program representative or the FSO in order to obtain sufficient justification to support certain scores. In certain circumstances, the audit team was either unable to obtain sufficient justification to support the assessment rating assigned by the assessor, or would have assessed the recipient higher or lower based on the information provided as justification.

Where the audit team disagreed with the rating provided to a recipient against certain risk considerations, the disagreement would generally not have moved the recipient to a higher or lower overall risk rating (i.e. the recipient would have remained low, medium or high); however, for two (2) recipients (out of 30 Part A GAs reviewed), the audit team would have increased the overall risk rating of the recipient, from either a low to a medium overall rating or a medium to a high overall rating.

5.3.2 Special Considerations

In addition to the above-noted concerns regarding the level of justification provided for ratings and the challenges with certain ratings assigned, the audit noted inconsistencies in how assessors were addressing specific circumstances. Specifically, the audit noted that, due to a lack of guidance/standards on how to address specific situations, scoring against certain risk considerations was inconsistent. These circumstances included:

  • How to address risk considerations that require information/documentation not currently requested of recipients per the funding agreements. For example, the audit noted that, at times, when a program did not have a formal requirement for the submission of financial reports, policies or operational/strategic plans, these documents were not requested or reviewed. In other cases, some assessors did request and review documents that were not required of the recipient per the funding agreement but were necessary to assess the recipient against the risk consideration.
  • How to address risk considerations when the delivery of a specific program usually tasked to the community has been delegated to a third party. For example, the audit noted, at times, the assessment included a recipient's ability to administer programs even if responsibility had been assigned to a third party, and at other times, the assessment was performed taking into consideration the ability of the third party to support the delivery of the program (rather than the community's ability).
  • How to address the weighting of programs when a program is less "significant" than its assigned weighting. The Part A GA tool pre-populates the assigned weighting of programs for the assessment of the Program Management risk factor. The audit noted circumstances where, for example, the Education Program accounted for 30% of the overall risk rating; however, the recipient was only receiving approximately $30K in funding for students who live on reserve but attend provincial schools off-reserve (rather than delivering a full elementary/secondary education program). Any scoring assigned to this category could potentially have an inflated impact due to the weighting, although the funding provided in this category would not warrant it (see the illustrative sketch following this list).
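The following sketch illustrates the weighting concern described in the last bullet above. Apart from the 30% Education weighting taken from that example, the second program, the dollar amounts and the ratings are assumed for demonstration only.

```python
# Hypothetical figures illustrating how a fixed, pre-populated weighting can
# inflate the impact of a small funding arrangement on the weighted score.
# Only the 30% Education weighting comes from the example above.

programs = {
    # program: (pre-populated weighting, annual funding in $, rating on a 1-5 scale)
    "Education": (0.30, 30_000, 5),                     # small arrangement, rated high risk
    "Community Infrastructure": (0.70, 2_000_000, 1),   # large program, rated low risk
}

weighted_score = sum(weight * rating for weight, _, rating in programs.values())
print(f"Program Management component score: {weighted_score:.2f}")
# 0.30 * 5 + 0.70 * 1 = 2.20: the $30K Education arrangement alone contributes
# 1.5 points, dominating the component even though it represents a small
# fraction of the recipient's total funding.
```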

Additionally, the audit noted variations in the timeframe considered as the scope of the assessment. At times, the GA report represented solely an assessment of a recipient's performance during the last fiscal year, while at other times, poor past performance, even if dating back a few years, affected the score. In one instance, the audit team noted that consideration was given to something the recipient was expected to accomplish in the future.

Lastly, the audit noted an inconsistent understanding across regions and HQ sectors on whether or not the Part A and B GAs needed to be evidence-based, as per the GA benchmarks and criteria. Current procedural documentation does not provide clarity regarding whether the assessor is responsible not only for asking if key documents such as policies and plans exist, but also for obtaining a copy of them to review for completeness, relevancy and appropriateness.

Recommendation

  • For recommended actions regarding clarifying the treatment of these special circumstances, please refer to recommendation four (4), bullet C, under Section 5.5.

5.4 Impact on Monitoring and Reporting Activities

Per the PTP, administrative requirements on recipients, which are required to ensure effective control, transparency and accountability, need to be proportionate to the level of risks specific to the program, the materiality of funding and the risk profile of recipients. The DTP further articulates that the level of monitoring of recipients and the reporting required from recipients should also be impacted by the same risks.

Through the conduct of interviews with representatives from a sample of regional offices and HQ sectors, the audit noted that GA scores are not being consistently used to determine the level of administrative requirements applied to recipients. More specifically, due to limitations imposed by certain programs' control frameworks, the level of monitoring of, and reporting required from, recipients cannot be tailored based on the results of the GA.

This observation is aligned to a finding outlined in the 2013-2014 Audit of the Management Control Framework for Grants and Contributions – Focus on Program Control Frameworks and Recipient Reporting. That audit had noted that regions and programs were generally not implementing risk-based reporting and management regimes to target limited departmental resources on projects and recipients of highest risk. Similarly, the 2012-2013 Audit of the Management Control Framework for Grants and Contributions – Funding Approaches found that the level of recipient risk was not always adequately considered in the establishment and selection of funding approaches and compliance activities within the Department.

Per Directive 410, the GA was developed to provide a standardized process for assessing a recipient for the purpose of identifying potential issues that impact delivery of AANDC-funded programs in order to provide flexibilities in funding agreement management regimes. As such, the GA should be the main tool used in determining the level of reporting and other administrative requirements imposed on recipients; facilitated by program management control frameworks that provide sufficient flexibility to allow regions to select the appropriate reporting and monitoring requirements based on the GA scores and other relevant factors.
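As a minimal sketch of the intent described above, and assuming hypothetical risk bands, reporting frequencies and monitoring activities (none of which are drawn from Directive 410 or any program control framework), a risk-based regime could map a recipient's GA risk level to its administrative requirements as follows.

```python
# Hypothetical mapping only: the reporting frequencies and monitoring
# activities below are assumed and do not come from Directive 410 or any
# AANDC program control framework.

REGIME_BY_RISK = {
    "low": {"reporting": "annual", "monitoring": "desk review"},
    "medium": {"reporting": "semi-annual", "monitoring": "desk review plus targeted follow-up"},
    "high": {"reporting": "quarterly", "monitoring": "on-site compliance review"},
}


def administrative_requirements(ga_risk_level: str) -> dict[str, str]:
    """Return the assumed reporting and monitoring regime for a GA risk level."""
    return REGIME_BY_RISK[ga_risk_level]


print(administrative_requirements("medium"))
```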

Recommendation

3. The Grants and Contributions Management Oversight Committee should improve alignment of the reporting requirements and monitoring performed on recipients, as well as the flexibility in funding approaches available to recipients based on a recipient's level of risk.

5.5 Review and Approval Process

Directive 410 outlines that GA reports are subject to a multi-level review process to ensure the integrity of individual decision documents and consistency across recipients. The Directive outlines that completed GAs are first subject to a peer group review, then review by a Funding Services Manager or equivalent, and finally, the review and approval by the Chair of the Transfer Payment Management Committee or equivalent.

The GA User Guide – Annex B indicates that GA results should be approved by an established regional committee, or delegated to a sub-committee when appropriate. A risk-based approach should be conducted to determine if circumstances exist where GA approval should be escalated to a Transfer Payment Management Committee.

The audit identified a robust and consistent approach for the review and approval of Part C GAs, with all Part C GAs being reviewed and approved by an established regional oversight committee, comprised of individuals with sufficient authority. However, inconsistencies were noted in the review, challenge, and approval of Part A and Part B GAs across the sampled regional offices and HQ sectors. Of the three regional sites visited, the following was noted:

  • In one region visited, no formal peer, supervisory or oversight committee review is performed on GAs (beyond a manager approval in GCIMS). Only specific GAs that meet a pre-determined variance threshold (a 5% increase/decrease against the previously completed risk rating) are reviewed and questioned by the Director of Funding Services. No Part A or B GAs are reviewed or approved by the Regional Operations Committee or a sub-committee.
  • In another region, all Part A and B GAs are reviewed first by a Field Services Manager and then by the region's Band Audit and Allegations Management (BAAM) unit. GAs are reviewed and approved either at the Audit Review Committee, or through a BAAM desk review (if very low risk), and then approved in GCIMS by a member of BAAM. Medium and high risk GAs, along with a roll-up of all GA results, are discussed at the Regional Operations Committee.
  • In a third region visited, the results of all Part A and B GAs were reviewed by the Funding Services Manager and the region's Risk Assessment Committee. For any Part A and B GAs assessed as Medium-High or High risk, the results were also reviewed by the Transfer Payment Management Committee.

For the three sampled HQ programs, none had established a Transfer Payment Management Committee to review the completed Part B GAs. In all programs, the GAs were reviewed and approved at a Manager-level.

Without a consistent approach for the review and approval of Part A and B GAs, there is an increased risk that unsubstantiated or incomplete GAs are not identified and addressed in a timely manner. Additionally, without a comprehensive review process, which includes the review and challenge of GAs by a committee or individual with sufficient oversight over the completion of all regional/sectoral GAs, inconsistencies in the scoring against risk considerations may not be identified.

Recommendation

4. The Grants and Contributions Management Oversight Committee should address the recommendations laid out below to address gaps identified with Directive 410 – General Assessments, and the GA Workbook.

  1. Directive 410 – General Assessments should be updated to further define:
    1. The role, responsibilities and accountabilities of Funding Services and program representatives in the development, review and challenge of the programs' components of a Part A GA; and
    2. The role, responsibilities and accountabilities of regional offices in the review and challenge of the impact of Health Canada's input into the GA scores.
  2. The GA Workbook should be updated to include benchmarks for the low-medium and medium-high risk ratings for each risk consideration based on relevant criteria, such as what is already included for the low, medium and high risk ratings.
  3. In order to ensure consistency in the scoring of GAs, Directive 410 and the GA Workbook should be updated to provide additional guidance on how special considerations, such as those identified during the audit, are to be considered/addressed when completing GAs.
  4. Directive 410 should be updated to clarify roles and responsibilities over the review and approval of GAs, specifying the appropriate authority level for approvals for Part A and B GAs, and ensure their consistent application across regions/sectors.
 

 

6. Management Action Plan

For each recommendation below, the planned actions are listed, followed by the responsible manager (title) and the planned implementation date (month and year).
1. The Grants and Contributions Management Oversight Committee should develop and coordinate a Department-wide quality assurance/monitoring program in order to provide the Department with a level of assurance that GAs are being completed in a consistent manner and assessment results are supported by adequate justification. The quality assurance/monitoring program should include the review and assessment of Part A, Part B and Part C GAs.

The ADM G&C Committee, led by the CFO, is currently updating the department's compliance and assurance framework towards more integration between its various compliance and assurance activities with a risk management overlay. An overhaul of the current GA score methodology is envisioned as part of this work given its importance as a foundational element of the framework.

The revision to the methodology will strive to make the GA score assessment more objective through the use of external evidence for each of the inputs contributing to the GA score. This initiative, combined with the department's intent to make the GA scores of FNs public (thereby enabling a challenge function of the GA score by the FNs themselves) and a review of GA scores of FNs for comparability purposes by the G&C Management Oversight Committee, will address the need for quality assurance and monitoring.

Responsible Manager (Title): Grants and Contributions Management Oversight Committee
Planned Implementation Date: Revised GA process implemented by September 2016.
2. The Grants and Contributions Management Oversight Committee should ensure the standardization of program-specific checklists/templates used to support the risk analysis across all regions. These checklists/templates should be assessed for their alignment to the risk considerations and benchmarks outlined in the GA Workbook prior to implementation.

The revision to the methodology will strive to make the GA score assessment more objective through the use of external evidence for each of the inputs contributing to the GA score. As a result, the set of descriptions or benchmarks that describes what a low, medium and high risk situation looks like for each consideration will be revised to increase objectivity. In order to accomplish this task, a Regional GA Working Group will be created to address the situation based on past experience and lessons learned. The recommendations and outcomes will be shared with the Grants and Contributions Management Oversight Committee for approval.

Responsible Manager (Title): Grants and Contributions Management Oversight Committee
Planned Implementation Date: Revised GA process implemented by September 2016.
3. The Grants and Contributions Management Oversight Committee should improve alignment of the reporting requirements and monitoring performed on recipients, as well as the flexibility in funding approaches available to recipients based on a recipient's level of risk.

Work currently being initiated to update the department's compliance and assurance framework, including the revision to the GA score methodology, will define the level of monitoring required by the department (as well as any reporting requirements by the FN in response to them) based on the risk level ascribed to the FN by its GA score. Given flat-lined resource levels within the department, the objective will be to focus more staff time and effort on high-risk FNs and less on low-risk FNs.

The framework will better define the funding flexibilities available to FNs based on their GA scores. A more aggressive use of grant and block contributions to FNs is envisioned for those with a demonstrated track record of low GA scores, as well as a more robust value-added engagement with those FNs that are high-risk.

Responsible Manager (Title): Grants and Contributions Management Oversight Committee
Planned Implementation Date: Revised GA process implemented by September 2016.

4. The Grants and Contributions Management Oversight Committee should address the recommendations laid out below to address gaps identified with Directive 410 – General Assessments, and the GA Workbook.

A. Directive 410 – General Assessments should be updated to further define:

i. The role, responsibilities and accountabilities of Funding Services and program representatives in the development, review and challenge of the programs' components of a Part A GA; and

ii. The role, responsibilities and accountabilities of regional offices in the review and challenge of the impact of Health Canada's input into the GA scores.

The GA Directive will be revised to further define the role, responsibilities and accountabilities of Funding Services and program representatives in the development, review and challenge of the programs' components of a Part A GA.

The GA Directive will be revised to further define the review and approval process of GA Part B.

The GA Directive will be revised to further define the role, responsibilities and accountabilities of regional offices in the review and challenge of the impact of Health Canada's input into the GA scores. Efforts will continue towards having one GA score for common recipients in line with the whole of government approach. In the meantime, AANDC will continue to honor its commitment in working closely with HC on the GA process.

Responsible Manager (Title): Grants and Contributions Management Oversight Committee
Planned Implementation Date: Revised GA process implemented by September 2016.

B. The GA Workbook should be updated to include benchmarks for the low-medium and medium-high risk ratings for each risk consideration based on relevant criteria, such as what is already included for the low, medium and high risk ratings.

The GA Workbook will be revised, and the addition of benchmarks for the low-medium and medium-high risk ratings for each risk consideration, based on relevant criteria, will be discussed for possible implementation.

C. In order to ensure consistency in the scoring of GAs, Directive 410 and the GA Workbook should be updated to provide additional guidance on how special considerations, such as those identified during the audit, are to be considered/addressed when completing GAs.

The GA Directive and Workbook will be updated to provide additional guidance on how special considerations, such as those identified during the audit, are to be considered/addressed when completing GAs.

D. Directive 410 should be updated to clarify roles and responsibilities over the review and approval of GAs, specifying the appropriate authority level for approvals for Part A and B GAs and ensure their consistent application across regions/sectors.

The GA Directive will be revised to clarify roles and responsibilities over the review and approval of GAs, specifying the appropriate authority level for approvals for Part A and B GAs and ensure their consistent application across regions/sectors.

 

 

Appendix A: Audit Criteria

To ensure an appropriate level of assurance to meet the audit objective, the following criteria were developed:

Audit Criteria
1.0 The roles, responsibilities, and accountabilities of key personnel involved in the GA process have been clearly defined and are well understood.
1.1 Roles, responsibilities, and accountabilities for key personnel within TPCOE involved in the GA process have been clearly defined and are well understood.
1.2 Roles, responsibilities, and accountabilities for key personnel within regional offices and program areas regarding the GA process have been clearly defined and are well understood.
1.3 Sufficient and appropriate oversight activities have been established to ensure the consistency of the completion of GAs across all regions.
2.0 Adequate tools, guidance and training are provided to those tasked with completing, reviewing, and approving the GAs in support of consistent, reliable, complete, accurate, and timely information.
2.1 Adequate guidance and training are provided to program management and regional offices to adequately complete, review and approve GAs consistently across regions.
2.2 Adequate templates and supporting tools have been developed for the consistent, reliable, complete, accurate and timely completion, review and approval of GAs.
3.0 Assessment results are adequately supported and justified through proper documentation and analysis.
3.1 Formal requirements have been established for the gathering of sufficient documentation and the conduct of adequate analysis in support of justified assessment results.
3.2 Assessment results are supported by comprehensive analyses and adequate documentation.
4.0 An appropriate process has been established for the review and approval of the GAs.
4.1 GAs are consistently reviewed and approved by appropriate authorities.
4.2 A standardized process has been established to identify and assess the impact of changes in recipients’ environment and the need for re-performance of the General Assessment.
4.3 The selection of an appropriate funding mechanism is based on the results of the GA.
 
 

 

Appendix B: Applicable Legislation, Regulations and Policies

The following authoritative sources (i.e. legislation/regulations/policies) were examined and used as a basis for this audit:

  1. Treasury Board of Canada Policy on Internal Audit
  2. Treasury Board of Canada Policy on Transfer Payments
  3. Treasury Board of Canada Directive on Transfer Payments
  4. General Assessment (GA) User Guide
  5. General Assessment (GA) Workbook - August 2013
  6. GA Workbook: Part C – Ongoing, Multi-program Recipients for Block Contribution Funding or Other Flexible Funding Approaches
  7. Directive 410 – General Assessment
  8. AANDC Management Control Framework for Grants and Contributions
 
 
