State of Performance Measurement of Programs in Support of Evaluation at Indian and Northern Affairs Canada

Date: September 2009


1. Introduction

Canadians want to know that their Government is getting value for money when public funds are expended and that the results of government programs are achieved at a reasonable cost. The Government of Canada is committed to responsible spending as a cornerstone of an accountable government. In light of this commitment, the Government of Canada views evaluation as one of the principal means by which Canadians, parliamentarians, ministers, central agencies, and deputy heads receive credible, timely, and neutral information on the relevance and performance of federal government programs.

The information garnered through the evaluation process has a number of uses. It is expected to support evidence-based decision making regarding programming and resource allocation, to inform priority setting, and to demonstrate accountability for the results achieved by each program. Evaluation is critical to ensuring that spending supports programs that are relevant, effective and efficient. For evaluations to be robust, sound performance information is essential.

Performance measurement is the process and systems of selection, development and ongoing use of performance measures to guide decision making[Note 1]. Simply stated, performance measurement is about measuring results. Performance measures, also called performance indicators, provide the foundational pieces of information necessary to evaluate a program or policy in order to determine its ongoing relevance, efficiency, effectiveness, and economy. An ongoing cycle, performance measurement uses performance information from all sources to set goals and objectives, plan activities to accomplish them, allocate resources to programs, monitor and evaluate results to determine whether progress is being made, and modify program plans as necessary to enhance performance. Performance measurement also supports multiple reporting requirements, strategic review, planning, demonstrating accountability, and communicating meaningful outcomes to communities and stakeholders.


2. Purpose

The purpose of this report is to provide an assessment of the state of performance measurement of programs in support of evaluation at Indian and Northern Affairs Canada (INAC). In addition to the clear value of ensuring high quality performance measurement, this report will ensure compliance with the Treasury Board Secretariat's (TBS) Directive on the Evaluation Function. The directive requires the departmental head of evaluation to submit to the Departmental Evaluation Committee an annual report on the state of performance measurement of programs in support of evaluation.

3. Background

The renewed TBS Policy on Evaluation requires the creation of a comprehensive and reliable base of evaluation evidence to support policy and program improvement, expenditure management, Cabinet decision-making, and public reporting. Departments are expected to gather credible, timely and neutral information on the ongoing relevance and performance of their direct program spending. This will then be made available to ministers, central agencies and deputy heads to support evidence-based decisions. Ultimately, this information will be made available to Parliament and Canadians to support government accountability for results achieved by programs.

The Policy on Evaluation gives the Deputy Minister (DM) the responsibility of ensuring that ongoing performance measurement is implemented throughout the Department. In addition and as noted above, TBS has instituted a complementary Directive on the Evaluation Function with elements directly related to performance measurement. The directive states that it is program managers who are responsible for developing, implementing and monitoring ongoing performance measurement strategies for their programs.

There are additional components to the policy framework in which performance measurement plays an essential role. Performance measurement provides the foundation for the information required by the Performance Measurement Framework (PMF) and the Management, Resources and Results Structure (MRRS). For the Department to demonstrate that its programs are achieving expected results and are aligned with its Strategic Objectives, a credible performance measurement system must be operating at the ground level.

In anticipation of getting that information from the ground level, the Policy on Transfer Payments requires that a performance measurement strategy be established at the time of program design. The strategy is to be maintained and updated throughout the program's life cycle so that it can effectively support the evaluation of each transfer payment program.

In developing performance measurement strategies, there are additional influences at play. These include:

Within these parameters, it is useful to define a quality performance measurement framework and subsequently measure the Department's performance against that yardstick, on an annual basis.


4. Key Attributes of a Quality Performance Measurement System

To assess the state of performance measurement in the Department, it is instructive to identify the attributes consistently found in quality performance measurement systems and evident in high-performing organizations. Drawing on sources such as the Auditor General of Canada, central agencies and the current literature, the following attributes can be identified.

4.1 Leadership

Leadership from the senior levels of an organization is critical to the success of performance measurement. The executive level needs to be involved and needs to be seen as being involved, and needs to actively support a culture of performance measurement throughout the Department. Commitment at the senior level is needed before program managers can be expected to take ownership of evaluation results and embrace performance measurement as a means of continuous improvement.

4.2 Clear Accountability

Clear roles and responsibilities need to be well articulated and understood by all individuals involved. Individuals at all levels, from recipients, managers, regional offices and internal services to executives, need to know and understand their roles and accountabilities. Some organizations tie financial and non-financial incentives to performance measurement.

4.3 Community needs

The needs and capacity of the community, which is the target audience for programs and activities, must be integrated into the planning process. Designing programs that have incorporated community input can be expected to resonate with the audience. Community involvement should also mean that realistic performance measures and targets can be established at the outset and that they will be clearly understood by all parties before the programming activities begin. Communities will be more motivated to participate in the performance measurement processes if they can see a community focus in the programming and the value of their participation in the performance measurement and reporting processes (i.e. measuring what matters to them).

4.4 Alignment with strategic direction

Performance measures need to be aligned with the strategic direction of an organization in order for the organization to demonstrate the extent to which it has achieved its strategic objectives. Supporting systems also need to be aligned. For example, the Information Technology/Information Management (IT/IM) systems must be aligned to support the strategic results of the organization.

4.5 Performance information is credible

For a performance measurement system to be of value, there must be confidence in the resulting information. Users will be confident in the information if it is credible, and obtaining credible information requires effective planning. The performance measurement strategy or framework provides the means for identifying and gathering the performance measures required for results-based management. The strategy must support the ongoing collection and tracking of performance information, which requires the co-ordination and collection of a substantial amount of information from many different sources, both inside and outside an organization. Ultimately, the performance measurement strategy must effectively support the evaluation of the program for which it was designed.

The performance measurement strategy must include the following elements (an illustrative sketch of these elements as a structured record follows the list):

  1. clear objectives that are defined, realistic and determinable, and that are aligned with the strategic objective;

  2. performance measures (indicators) that are aligned with decision making authority and accountability, are of an appropriate number, and are accessible, accurate and meaningful;

  3. clear outcomes;

  4. clear and realistic performance targets;

  5. data collection mechanisms or plans;

  6. baseline data that is used to set realistic performance targets;

  7. an approach to monitoring that is risk-appropriate;

  8. reporting requirements that are realistic and supported by reporting systems that are in place; and

  9. plans for performance measurement, evaluation and reporting.
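To make these elements concrete, the following is a minimal, purely illustrative sketch in Python of how a performance measurement strategy could be represented as a structured record and checked for completeness before approval. The field names simply mirror the nine elements above; neither the schema nor any such tool is part of an existing departmental system.

```python
from dataclasses import dataclass, field, fields


@dataclass
class PerformanceMeasurementStrategy:
    """Hypothetical record mirroring the nine required elements listed above."""
    objectives: list = field(default_factory=list)         # 1. clear, realistic, aligned objectives
    indicators: list = field(default_factory=list)         # 2. accessible, accurate, meaningful measures
    outcomes: list = field(default_factory=list)           # 3. clear outcomes
    targets: dict = field(default_factory=dict)            # 4. realistic targets, keyed by indicator
    data_collection_plan: str = ""                         # 5. data collection mechanisms or plans
    baseline_data: dict = field(default_factory=dict)      # 6. baselines used to set the targets
    monitoring_approach: str = ""                          # 7. risk-appropriate monitoring
    reporting_requirements: str = ""                       # 8. realistic, system-supported reporting
    evaluation_plan: str = ""                              # 9. measurement, evaluation and reporting plans

    def missing_elements(self):
        """Return the names of required elements that are still empty."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]


# Example: a draft strategy with only objectives filled in is flagged as incomplete.
draft = PerformanceMeasurementStrategy(objectives=["Improve on-reserve housing quality"])
print(draft.missing_elements())  # prints every element name except 'objectives'
```

A completeness check of this kind could support the challenge function recommended later in this report, though any such implementation would be a design choice for the Department.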

4.6 Implementation

The performance measurement strategies must be fully implemented for the benefits to be realized. Performance information needs to be collected effectively and regularly from all identified sources. The approach must take into account the twin focuses of managing responsibility and balancing capacity.

4.7 Capacity

Employees and other stakeholders need to have the capacity to fulfill the requirements for performance measurement. Capacity issues include:

4.8 Performance information is used

The performance information that is gathered needs to be used to fulfill policy requirements, support evidence-based decision making and meet various reporting requirements. Most directly, performance information is used to monitor progress on programs and inform evaluation work. More broadly, performance information is used as part of a continuous improvement process in quality management.

4.9 Communications

Ongoing communication among all those involved, at all levels and areas of responsibility, internal and external, is important. Key performance information needs to be cascaded through an organization so employees understand its significance and their role in achieving expected results.

4.10 Culture

A culture that focuses on results, where the purpose and value of performance measurement is understood and employees have the required skills, is needed in order to create a supportive operating environment.


5. Where is INAC in Relation to These Attributes?

5.1 Leadership

Status

The DM is the Chair of the newly established Evaluation, Performance Measurement and Review Committee (EPMRC). The involvement of the DM should ensure that the significance of performance measurement is conveyed throughout the Department.

The EPMRC also includes three senior assistant deputy ministers and three external experts, including the chair of the audit committee. Its roles include advising the DM on performance measurement strategies and making recommendations to the DM on matters related to performance measurement systems and managerial accountability. The EPMRC is relatively new, having met on just three occasions to date. Once it has established itself and communicated its role more broadly, the EPMRC should be able to provide the profile necessary to fulfill a leadership role.

Workshops recently held on 'Measuring What Matters' were attended by senior management and representatives from various sectors across the Department, raising the awareness and understanding of many participants. Since then, there has been an increase in the number of requests for performance measurement support and tools made to the Evaluation, Performance Measurement and Review Branch (EPMRB).

Planned Activity/Future Action

5.2 Accountability

Status

The accountabilities with respect to performance measurement are clearly articulated in the Policy on Transfer Payments and the Evaluation Policy.

However, a review of the 30 evaluations the Department has conducted since 2005 reveals that seven recommended greater clarity in articulating roles and responsibilities.

The interview process provides further evidence that greater clarity is needed. There are some areas of responsibility where there appears to be overlap or 'fuzzy boundaries'. There are also cases where program managers request performance information or data relevant to their program from other parts of the Department, the result of a misconception that someone else is collecting and storing the information on their behalf. Again, this supports the finding that greater clarity is needed regarding roles and responsibilities in all areas: policy, programs and internal services.

To provide oversight for these accountability requirements, the Department has established the EPMRC, as noted above. Through the EPMRC, the DM involves senior management and external members in planning and establishing evaluation and review priorities; examining and approving terms of reference for evaluations, special studies and reviews, including assessment of related actions taken; and promoting effective management and performance monitoring of departmental programs, services and operations. The focus of the Committee is to identify program relevance and performance issues, and ensure that sectors effectively resolve them.

Planned Activity/Future Action

5.3 Community needs

Status

The Department is engaging in a number of initiatives to ensure community needs are reflected.

The Aboriginal Information Management Committee (AIM) serves as a forum for discussion and information sharing on activities of interest related to information management matters affecting First Nations, Inuit, Non-status Indian and Métis peoples within federal departments and agencies. The Committee will also serve as a conduit for exchanging information and ideas on Aboriginal information management matters, strategies and any new data initiatives with representatives of national Aboriginal organizations.

The EPMRB has developed an Engagement Policy that will serve as a framework for ensuring Aboriginal involvement in evaluations. The policy acknowledges that Aboriginal engagement is critical to the process of planning for quality evaluations. The policy contemplates various methods of engagement in order to ensure Aboriginal input and increase communication. Although the policy's focus is specifically on evaluation, and it is not yet fully implemented, the links made through implementing this policy may aid performance measurement.

In addition, a recent study resulted in the report "Measuring What Matters: Assessing the Quality of Indigenous Community Life" (2009). This report reflected on the Department's PMF and was supportive of the Department's efforts to raise the profile of community voices, as contemplated in the proposed use of a governance assessment tool. The report also supported the PMF's articulation of "sustainable" outcomes, as this expresses hope for 'enduring change'. The report made the point that community engagement in performance measurement can be expected to reinforce accountability within communities and move toward an accountability structure that communities can "call their own".

Planned Activity/Future Action

5.4 Alignment with strategic direction

The Department's Program Activity Architecture (PAA) articulates how resources are managed and allocated, and how activities are organized to achieve results. The PAA is organized into five strategic outcomes that establish the Department's strategic direction. Policy requirements for the PAA include the establishment of performance measures, which have been developed for most of the Department's program activities, sub-activities and sub-sub-activities. The Department's tool for assessing performance measurement strategies (formerly Results-based Management and Accountability Frameworks (RMAFs)) requires that clear linkages be made to the Strategic Objectives articulated in the PAA. This requirement is expected to ensure that each program aligns with the strategic direction of the Department.

The evergreen departmental PMF, and the process by which it has been created and managed, is expected to be an effective tool for ensuring that program measures and objectives align with the Department's Strategic Objectives. The indicators used in the PMF are expected to guide the development of performance measures/indicators at the program level. However, the Management Accountability Framework (MAF) assessment (Round VI) commented that the performance indicators identified for the PMF are "not clear and cannot be used for data collection to provide reliable insight into program effectiveness". Interviewees noted this to be particularly acute with respect to intermediate indicators. In addition, interviewees did not think that efforts to make clear the linkages to the PAA and PMF were successful.

In addition to the policy requirements noted above, guidance from the EPMRB for program managers regarding performance measurement strategies includes the requirement for clear links to the PAA to be articulated in the program profile and reflected in the program's logic model.

Planned Activity/Future Action

5.5 Performance information is credible

Performance information availability has been identified as a risk in INAC's Evaluation Plan 2009-2010 to 2013-2014. Having performance measurement information available for analysis is imperative; the following elements identify the performance information that should be available in order to demonstrate results. Ensuring that the performance information is credible depends on the processes followed to collect it and on the accuracy of the data gathered, which in turn relies on the knowledge and capacity of the individuals gathering the data.

The tool used to plan for and collect performance information has been the RMAF, introduced by TBS in the 2001 Policy on Transfer Payments. In 2008, the EPMRB undertook an RMAF Special Study ('the RMAF Study'), which assessed the quality of 59 departmental RMAFs and the degree to which they had been implemented. The key strengths of the Department's RMAFs were that, generally, those reviewed were of high quality: 80 percent were assessed as 'excellent' based on the project's assessment criteria (the RMAFs were reviewed against TBS criteria). Areas of strength included clear objectives, expected results and logic models. Evaluation plans were assessed as 'generally good', but often lacked solid data collection plans or methodologies.

The key shortcomings of the RMAFs related primarily to gathering performance information. When compared against the attributes set out in section 4 above, the following observations can be made:

5.5.1 Clear Objectives

As noted, the RMAF Study found the objectives, overall, acceptable with respect to clarity. However, a review of evaluation studies reveals that eight of the 30 evaluations recommended that the programs either establish clear program objectives or clarify existing ones.

5.5.2 Performance measures that are aligned, appropriate in number, accessible, accurate, and meaningful

There are several criticisms related to indicators including: there are 'too many indicators'; they are 'vague and difficult to measure'; and they are too focused on outputs and not sufficiently focused on outcomes (RMAF Study).

Several evaluations have also criticized indicators for various reasons, including that they 'lack meaning'; they are not comparable with other measures (e.g. for the purposes of provincial comparability); and they are not 'appropriate' or 'measurable', or do not allow for gender equality analysis (Evaluations, various, 2005 - 2009).

Audits have noted that several programs had not identified performance measures as part of their performance measurement strategy, and that other programs need to improve the measures they have identified.

Input from the interview process raised a number of issues. For years, there has been a reliance on administrative data; a great deal of work has gone into the identification of performance measures, yet there remains the need to shift the focus from activities to outcomes. A more global, integrated view may help reconcile performance information at the program level with the broader, high level performance measures identified at the strategic level.

There were also concerns that people were identifying measures without fully understanding them or without considering issues of attribution. Specifically, a program might be operating as expected, yet the selected measures may not demonstrate 'success'. Reasons for this are often entirely unrelated to the program's activities or are due to external influences beyond the program's control (or even the Department's).

There is a sense, however, that progress is being made in developing better performance measures. There was strong support for identifying good, measurable performance measures at the policy and program design stage so program managers will know that the required information can be collected over the program's lifecycle.

5.5.3 Clear Outcomes

The RMAF Study noted that 80 percent of the RMAFs were 'excellent' with respect to having clear objectives. There were, however, objectives that were not well linked to the Strategic Outcomes, and some objectives that were actually outputs. The Study, as well as nine of the evaluations, criticized some program outcomes as weak, unclear, or not supported by the indicators that had been identified.

5.5.4 Clear performance targets

Seven of the 30 evaluations reviewed recommended improved or clearer performance targets for the programs evaluated. Two of the audits reviewed recommended clear targets be set. The RMAF Special Study did not comment on the issue of targets.

5.5.5 Data collection mechanisms or plans

Data collection mechanisms or plans are often lacking in performance measurement strategies, as reported by the RMAF Study. Specifically, in 15 of the 59 RMAFs assessed, data collection sources were not clearly defined, the frequency of data collection was missing or unclear, and the responsibility for data collection was unclear. Results from the surveys sent to managers as part of the Study revealed that data is being collected for approximately 42 percent of the performance indicators listed in RMAFs. The input from the majority of informants is that there is 'no data' and that there is a lack of meaningful data sources, which supports the identification in the Corporate Risk Profile of the risk "Information for Decision Making". However, the view was also expressed that there is actually a lot of data, but it is not the right data, it is not in useable formats, or it is not credible.

Issues with data collection ranged from indicators being reassessed for their usefulness during the lifecycle of the program to more serious issues, including a lack of capacity with both program recipients and program managers to collect, report and analyze data. Other issues in data collection may be linked to a problem noted above: too many indicators are listed for measurement, many of which are output-focused.

The lack of credible data is a key weakness facing the Department's ability to effectively evaluate some of its programs. TBS's MAF assessment (Round VI) noted that the evaluations reviewed rarely address questions of program relevance, success and effectiveness, primarily because of a lack of reliable performance information. This current lack of performance information is expected to impact future evaluation unless the issue of baseline data and data sources is addressed.

Various workshops over the last year or so, such as strategic outcome planning meetings and 'Measuring What Matters' workshops, have facilitated progress as the Department is beginning to identify and map out meaningful indicators, which reflect the objectives and expected results of its programs.

As noted above, the mandate of AIM includes discussing information management matters that affect First Nations, Inuit, Non-status Indian and Métis peoples. Information management is defined very broadly to include such items as data requirements, data collection mechanisms, data sharing, analysis, research, management of information, etc. Participants in the committee include representatives from federal government departments and agencies, and national Aboriginal organizations. As well as serving as a forum for information sharing, objectives for the AIM include looking for opportunities to streamline data collection and avoiding duplication.

EPMRB developed, and tabled at the Data Experts Workshop in December 2008, an extensive list entitled "Potential External Sources Relevant to INAC Performance Measurement", which includes a variety of sources and various methodologies. Accessing information that is already available and routinely collected is an efficient approach and can be expected to contribute to consistency in performance measurement.

Progress has been noted in strengthening the internal processes that support performance measurement. The First Nation and Inuit Transfer Payment System has contributed to the reduction in the reporting burden and streamlined the payment process. Ongoing efforts to connect research, planning and forecasting activities appear to be positioning the Department for improved performance measurement with the understanding that adjustments will be required along the way.

5.5.6 Baseline data

The Department has recognized a lack of accurate and consistent baseline data that affects the evaluation of some of its programs. Most of the 30 evaluations reviewed indicated the need for baseline data or performance data. As baseline data is necessary to set performance targets and ongoing performance data is needed to measure progress, the lack of data is a serious shortcoming. As noted in the TB MAF assessment (Round VI), the lack of baseline data will create difficulties in substantiating future evaluation work.

5.5.7 Monitoring that is risk appropriate

TB requirements have changed from requiring each program to prepare a Risk-Based Audit Framework to the less prescriptive requirement that transfer payments be managed in a manner sensitive to risks. Only three of the 30 evaluations reviewed address risk, which is unsurprising given that the RMAFs reviewed for the RMAF Study did not include risk assessments. This issue is difficult to assess at this time because the policy changes are recent.

5.5.8 Reporting requirements

The Auditor General (AG) first reported in 2002, and highlighted again in 2006, the reporting burden faced by First Nations in meeting their reporting obligations to the federal government. The AG also noted that some of this reporting was not used to support decision-making. This issue was also the subject of a recommendation of the Blue Ribbon Panel on Grants and Contributions, which proposed a dramatically simplified reporting and accountability regime that would reflect the circumstances and capacities of recipients, and the real needs of the Government. It has also been supported by the Department and recipients.

Of the 30 evaluations reviewed, some made precise recommendations ("simplify reporting forms to avoid double counting") and many more made general ones, such as "reporting practices should be enhanced". Good reporting processes are required to demonstrate the outcomes achieved and meet accountability requirements; however, this must be balanced against the reporting burden.

The Department embarked on a strategy to reduce the reporting burden as part of its efforts to manage for results and measure what matters. The Department launched its SMART reporting initiative in 2007, under which unnecessary and low-utility information is re-examined. Through this initiative, the Department has achieved close to a 50 percent reduction in reports.

As noted, the EPMRB has prepared draft guidance on the preparation of, and requirements for, performance measurement strategies in line with the expectations of the revised Policy on Transfer Payments. Clear guidance for reporting requirements will be developed and will be aligned with the Government of Canada's Reporting Principles:

Principle 1: Focus on the benefits for Canadians, explain the critical aspects of planning and performance, and set them in context;

Principle 2: Present credible, reliable, and balanced information;

Principle 3: Associate performance with plans, priorities, and expected results, explain changes, and apply lessons learned; and

Principle 4: Link resources to results.

5.5.9 Performance measurement, evaluation and reporting plans

Twelve of the 30 evaluations reviewed recommended that the program either required a performance measurement strategy or needed a number of improvements to its existing strategy. Although most of the shortcomings were in the areas noted above (reporting burden, performance measures, data collection), some recommendations called for more program-focused strategies, better alignment with departmental strategies and clearer outcomes for the program. In other cases, the strategies did not include evaluation issues and methodologies in their evaluation plans.

Although the RMAFs have been dropped as a requirement in the revised Policy on Transfer Payments, there remains a requirement for a performance measurement strategy, which serves much the same purpose. The EPMRB has prepared draft guidance on the preparation of, and requirements for, performance measurement strategies to ensure they are in line with the expectations of the revised Policy on Transfer Payments.

Planned Activity/Future Action

5.6 Implementation

The RMAF Study found that 40 percent to 60 percent of RMAFs were fully implemented. One of the key obstacles to full implementation has been, as noted above, the difficulty in instituting the performance measurement plan component of the framework. The RMAF Study found that data was being collected for 43 percent of performance indicators in 22 of the RMAFs sampled in a survey of program managers.

The Department devised (April 2009) an implementation approach for performance measurement strategies to reflect the requirements of the revised Policy on Transfer Payments. The implementation approach includes engaging senior management, building capacity and understanding, and developing a more integrated approach overall.

The EPMRB has identified the risks associated with the development and implementation of performance measurement strategies. They include:

Planned Activity/Future Action

5.7 Capacity

Another finding made by the RMAF Study was the lack of capacity to collect all of the performance indicators listed in the RMAF. The limited capacity of both the Department and the First Nations recipients was reported to be a barrier to effective data collection. This lack of capacity includes, but is not limited to, insufficient resources, low ability among some recipients to fill out the data collection forms, lack of internet access for data reporting, lack of departmental personnel to perform data analysis, and a lack of information systems to collect, store and analyze data.

The TB MAF Assessment (Round VI) also noted the impact staff shortages had on the Department's evaluations, specifically as it related to the data collection required to provide baseline measures and performance reporting. Staff shortages also caused the time frame for evaluations to be shorter than ideal. Eight of the evaluations conducted since 2005 specifically recommended training of departmental staff in order to build capacity.

In discussing capacity issues, and in light of the challenges First Nations have in meeting reporting requirements, the AG has recommended that the Department provide training to ensure that First Nations communities have adequate financial administration capacity.

Input from the interview process was consistent in noting a lack of capacity for performance measurement, ranging from program managers not knowing what constitutes good performance measures, to First Nations not having the capacity to collect performance information in their communities, to regions not having the capacity or the information systems to input data collected.

To address capacity issues, the EPMRB has supported workshops and meetings that provide departmental employees with a learning opportunity around performance measurement. The EPMRB has also developed tools to support program managers in specific areas, such as the development of performance measurement strategies. Also, the Department has increased the resources for the evaluation function to support improved performance measurement as a means of improving the quality of its evaluations.

Planned Activity/Future Action

5.8 Performance information is used

It appears that, to the extent that it is available, performance measurement information is used to inform all evaluations undertaken by the Department. TB has registered its concern that departmental evaluations contain conclusions that are often not supported by the performance information contained in the body of the report. Internal assessments of the Department's program evaluations concur that evaluations would be of higher quality if the performance measurement information were more credible.

In addition, the Capacity Assessment Survey, which informed the TB MAF assessment, reveals that the Department's evaluations are almost always brought into consideration in Memoranda to Cabinet and TB Submissions and, on occasion, the Reports on Plans and Priorities and the Departmental Performance Reports.

Some participants in the interview process noted that there is a perception that decisions are not based on performance information. Decisions are made for other reasons that often have nothing to do with performance. It was thought that this perspective may contribute to the generally poor assessment of performance measurement at the Department.

The recent policy changes make it imperative that evaluations be completed prior to program renewal and that programming decisions be based on those evaluations. Evaluations will also be used to inform the development of new programs at the policy concept stage through the Memorandum to Cabinet process. Solid performance measurement information is therefore essential, and the Branch reports that program managers of audited programs are taking action to improve performance management practices in their programs.

Planned Activity/Future Action

5.9 Communications

Many of the Department's programs use a decentralized and devolved delivery model with multiple and diverse partners. To be successful, communications, both internal and external, will have to be effective.

The workshops and meetings supported by the EPMRB serve as a means of communicating the value of performance measurement to departmental employees. Motivated people facilitate the collection and reporting of performance information.

There are other committees that will serve as forums for communicating the value of performance measurement and the key role it plays in evaluation to a wider audience. Committees include those noted above, such as the EPMRC, the AIM and the proposed Strategic Research and Data Advisory Forum.

Planned Activity/Future Action

5.10 Culture

The level of success that the Department has in infusing a performance measurement and results-based culture into the organization will have to be assessed in the future. It emerged through the interview process that the Department is in the early stages of building a performance measurement culture. The Department's shift from direct delivery to a funding agency reportedly still has an impact on its operations. In addition, there is a natural reluctance to have one's performance assessed on the basis of outcomes over which an employee has no control. However, it was noted that a results-based culture within the Department is growing and that there is increasing acceptance that it is 'here to stay'. Building such a culture requires an approach that is proactive and clear, but also accepting of failure on the road to success, given the perception that the Department is not yet at an optimal state of readiness.

Planned Activity/Future Action


6. Cross-cutting Issues/Findings

6.1 Collaboration and Co-ordination – Data requirements, data collection, and data sharing

Given the number of departments delivering programs to First Nations, Inuit, Non-Status Indian and Métis peoples, a collaborative approach is necessary to co-ordinate reporting, with a view to reducing the burden while ensuring the rigour of the data collected. For example, programs delivered by INAC complement programs delivered by other departments such as Human Resources and Skills Development Canada (HRSDC) and Health Canada. Specifically, the Assisted Living program (Social Development) complements Health Canada's Home and Community Care Program. Together these programs fund and support the home and community care foundations of the First Nation continuing care system on-reserve. Program managers are in various stages of establishing joint working groups to address common issues and improve the effectiveness of both programs. There is potential through these co-operative approaches to streamline reporting and share data.

There are several formal forums either planned or in place to co-ordinate issues across departments, including:

The Department plans to participate in an "APS Consortium" that will include HRSDC, Health Canada, Canada Mortgage and Housing Corporation, and the Department of Canadian Heritage.

A significant barrier to solid performance measurement information has been the challenge of getting on-reserve information. Statistics Canada has the mandate for the national statistical system that gathers census data, yet a gap remains in gathering data for the on-reserve Aboriginal population. The new strategy contemplates taking into account the needs of the Department and other stakeholders, including other federal departments and Aboriginal groups. The data gathered is expected to be relevant to all groups. It is expected that there will be new linkages with the First Nations Regional Health Survey as an expanding model of collaboration and tool of data collection. The goal is to produce timely, reliable and representative socio-economic statistics at the community, provincial and national levels to meet the short- and long-term information needs of INAC and other stakeholders. The three-pillar approach will include:

(1) theme-based general social surveys;

(2) timely social surveys; and,

(3) supporting First Nation governance.

The strategy will require an amendment to the existing Memorandum of Understanding with Statistics Canada.

This existing interdepartmental committee will serve as a forum that, along with the activities noted above, examines areas where First Nations, Inuit, Non-status Indian, and Métis data and information can be shared to minimize overlap and duplication, creates opportunities for partnering on key initiatives, and examines how data and information collection can be consolidated and streamlined. Given the number of programs that the Department operates in concert with other federal departments, co-ordination of this kind could be expected to improve efficiency.

Internally, there is a need to ensure that all branches in the Department that have a role to play in performance measurement are connected, so the Department will benefit from the expertise each can bring to the process. As noted above, the Department has established the EPMRC to ensure collaboration and co-operation across all the Department's sectors. The Department's performance story will be more complete with input and participation from each sector.

An additional internal mechanism that has been planned is:

This internal forum will discuss data requirements, set priorities and make decisions on data needs. Its purpose is to co-ordinate the Department's data needs. The forum is one component of the new strategy to conduct on-reserve social surveys discussed above and will be chaired by the Director General of Strategic Planning, Policy and Research.

6.2 Costs

The cost of effective performance measurement was a concern identified by a number of interviewees. The anticipated significant costs of a department-wide, enterprise IT/IM system were seen as a barrier by a number of interviewees, primarily because of a lack of belief that the funds would be made available. However, other interviewees thought that smaller systems would be sufficient, perhaps organized by sector and similar to what currently exists. Although there is a need to invest funds in performance measurement systems, these costs should be offset by the savings and efficiencies that can be expected from:


7. Conclusions

Achieving the goal of building a results culture throughout the Department cannot be done overnight; mistakes will be made and adjustments will undoubtedly be required. There needs to be clarity and leadership from the executive level, but there are also issues "on the ground" that need to be addressed. This may include assessing all pieces of program delivery and agreeing on a common vocabulary. One example cited was the need for a clear definition of the 'provincial comparability' term used in several programs.

The major issue to address in improving performance measurement is the need to communicate the importance of performance measurement as the foundation of results-based management and the key to effective evaluation at the Department. The roles and responsibilities of all employees need to be communicated to ensure understanding and promote engagement in the process. Both internal and external stakeholders need to appreciate the value of performance measurement in order to engage in the process.

The key challenges, performance measures/indicators, the data collection process and the reporting burden, are all issues that the Department has begun to address. Given the magnitude of the data collection and capacity issues, co-ordinated efforts are required within the Department, across national and regional levels, with external partners and with First Nations. Progress made, for example, in identifying and tracking fewer yet better performance measures (indicators) will improve the process for all stakeholders. Focusing on a few key and/or common outcomes, supported by solid performance measurement data, will likely improve the robustness of evaluations. To do this successfully, investments must be made in the Department's capacity to incorporate a comprehensive performance measurement system across all sectors. This investment will have to address the data warehousing issue and the ongoing maintenance of data.

8. Recommendations

There are a number of recommendations that could be incorporated into a formal action plan for moving the performance measurement file forward and building on the accomplishments to date:

8.1 Leadership

There is a need for strong leadership in order for a results culture to take hold across the Department, and this leadership needs to come from the highest levels. Employees need to take ownership of their individual contributions to performance measurement, and this is more likely to occur if they can see the necessary leadership from the executive level. Therefore, it is recommended that each Assistant Deputy Minister be personally responsible and accountable for performance measurement within their respective sector.

8.2 Harmonization of activities

There is a need to harmonize the activities currently underway to ensure a cogent approach to performance measurement and address the criticisms of a lack of integration across the Department. The Strategic Research and Analysis Directorate is advancing the three-pillared approach to social surveys and developing the Community Well-Being Index to measure socio-economic conditions. The Strategic Planning and Priorities Directorate has been the lead for the MRRS policy, the PAA and the PMF, as well as having extensive planning and reporting responsibilities. The Strategic Management Review and Analysis Directorate recently led the Strategic Review that assessed performance results from all INAC programs. The EPMRB has been increasing awareness of performance measurement and developing the tools to build capacity within the Department. Other connections also need to be made with other knowledge areas, such as regional representation, IT/IM, and policy functions. Whether a working committee needs to be created to facilitate the process or this work falls within the ambit of the EPMRC, addressing the issues raised in this status report will require a co-ordinated effort.

The initial steps should include:

8.3 Collaboration with the regional level

There appears to be a gap in the identification of, access to and use of data from the regions. There is also a need to collaborate more extensively with the regions in terms of developing performance measures that can be collected in the regions. There is potential for this information to add to the performance story of the Department. An assessment of what data is currently being collected in the regions should be undertaken and used to supplement data needs.

8.4 Improving program design

The Department may wish to require that each program, while in the concept and development stage, identify its expected outcomes and its indicators. Program designers should also be required to include in their business case exactly how they plan to measure success and where the performance information will come from. The program's theory should be set out in a clear and concise logic model – the development of which will require program managers to articulate precisely what they expect their program to achieve. Although such detail has been expected in the past, there appears to have been a lack of rigour in ensuring all elements were included in the RMAFs. In this way, issues can be addressed and baseline data can be captured at the outset of program delivery. Additional adjustments may include a challenge function within the Department to ensure these issues are addressed.

8.5 Training / Orientation

Capacity has emerged as an issue, both with respect to the competencies of individuals and to adequate resources. Training for current employees, and orientation for new employees, is vital for ensuring that employees have a clear understanding of the Department's results-based management and performance measurement culture. This understanding will also help to clarify the roles, responsibilities and expectations for employees, and promote a common understanding of what performance measurement is.

In view of the increased demand for performance-related information following the performance measurement workshops, the Department may consider developing a common course or tools on performance measurement that is readily available to all levels of staff, at Headquarters and in the regions.

In addition, given the requirements to consult, engage and otherwise involve Aboriginal people and organizations in evaluations, there is an opportunity to build, share or provide capacity building tools in support of performance measurement that will help ensure a common understanding of terms and expectations. Working from a common frame of reference can be expected to result in better performance measurement data and more rigorous evaluations.

8.6 Integration

There are a number of points where performance measurement related activities could be better integrated. For example, the discrete performance measurement and planning cycles of the Department present an opportunity to rationalize the gathering of performance information by the various directorates and branches that require the information. This will require collaboration between the directorates and branches to find common information needs and to streamline the process for program managers. Better integration of planning, information sharing and reporting between all branches and directorates that have a role to play or an interest in better performance measurement, supported by the Department's information systems, can be expected to result in more efficient management within the Department.

8.7 Standards

Where possible and practicable, standards could be developed and shared across the Department. This might include developing a list of 'approved' indicators and existing data sources for program managers to incorporate into their performance measurement strategies, along with a checklist to ensure that the right form as well as the right content is being planned for. The EPMRB has started this process with the Thematic Indicators Research Project.
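As a hedged illustration only, the following Python sketch shows how a shared list of 'approved' indicators might be applied mechanically when a draft strategy is reviewed. The indicator names, and the idea of a machine-readable approved list, are hypothetical and not an existing departmental tool.

```python
# Hypothetical check of a draft strategy's indicators against a shared,
# department-approved list; all names and data below are illustrative only.
APPROVED_INDICATORS = {
    "high school completion rate",
    "housing units meeting standards",
    "community water quality compliance",
}


def unapproved(draft_indicators):
    """Return draft indicators that do not appear on the approved list."""
    return [i for i in draft_indicators if i.lower() not in APPROVED_INDICATORS]


draft = ["Housing units meeting standards", "number of meetings held"]
print(unapproved(draft))  # ['number of meetings held'] -- flagged for review
```

A check of this kind would not replace the judgment of evaluators; it would simply make departures from the shared standard visible early.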

8.8 Guidance and Tools

Performance measurement strategies are required for each program, and the EPMRB has plans to develop guidance and tools to assist program managers, and to ensure the requirements of the Department are met. Checklists could also be employed by EPMRB staff to ensure all the required elements are included in the performance measurement strategy.

8.9 Communication strategy

Developing a communications strategy to address the revised TBS policy requirements and the next steps that the EPMRB plans to take can be expected to result in better uptake of the information. Communicating to employees the importance of results based management and performance measurement will help infuse these concepts into the departmental culture. Employees must be aware of how these issues are all linked together and support one another.

8.10 Monitoring the implementation of performance measurement strategies

Performance measurement strategies need to be fully implemented. In view of the number of strategies that are prepared but reportedly never fully implemented, there should be a method established to ensure implementation and to provide assistance when barriers to implementation are encountered. Having assistance available for the implementation of strategies will underscore the importance of performance measurement. Ensuring the strategies are fully implemented will improve the quality of evaluations.

8.11 Planning for Future Annual Reports

Plans could be developed that would assist in gathering information to inform future annual reports assessing the state of performance measurement of programs in support of evaluation at INAC. Putting into place tools that would gather that information would ensure consistency of input. Methods might include:


9. Sources

Departmental Evaluation reports, 2005 – present

Departmental Audit reports, 2005 – present

Measuring What Matters: Performance Measurement Guidelines
Prepared for the Audit and Evaluation Division of the Department of Indian and Northern Affairs Canada, Prepared by Kishk Anaquot Health Research (2009)

Performance Measurement, Reporting and Accountability: Recent Trends and Future Directions SIPP Public Policy Paper No. 23 (2004)
Dr. Paul G. Thomas, Duff Roblin Professor of Government
St John's College, University of Manitoba

Reports of the Auditor General, 2005 – present

RMAF Special Study
Indian and Northern Affairs Canada, Audit and Evaluation, (2008)
Banting, J., et al.

Treasury Board's MAF assessment (Round VI) (2009)

Trend Analysis of OAG Audit and DAEB Audit and Evaluation Recommendations and Action Plans Since 2000
Prepared for INAC, DAEB
Prepared by Consulting and Audit Canada, Project Number 572-0335 (2005)

  1. Results Based Management Lexicon, Treasury Board Secretariat

