Archived - System Under Development Audit of Education Information System


Date: November 2010


List of abbreviations and technical definitions

List of abbreviations

AES
Audit and Evaluation Sector

COTS
Commercial off-the-shelf

DCI
Data Collection Instruments

EIS
Education Information System

EPA
Effective Project Approval

EDW
Enterprise Data Warehouse

FNITP
First Nations and Inuit Transfer Payment (system)

INAC
Indian and Northern Affairs Canada

IRS
Indian Registration System

IT
Information Technology

OGC
Office of Government Commerce

PDF
Portable Document Format

PPA
Preliminary Project Approval

TB
Treasury Board

List of technical definitions, as used in this report

Data Collection Instruments (DCI) - recipient reports that are part of the terms and conditions of various Contribution Agreements, representing the sole source of data collected for INAC-funded education programs.

Go-live - date on which EIS is moved to production (system is on-line, users are granted access and can create/submit proposals and run reports, and in-service support of EIS begins)

Integration - assembling system components (or subsystems) into a single system such that the subsystems function together and go on to operate as a single system

Interface - a point of interaction/communication between systems or system components, which may include both automated and manually-enabled data exchanges

Stabilization - transition period following go-live when the project team is ramped down and EIS operations are handed off


Executive summary

Background

Indian and Northern Affairs Canada (INAC or "the Department") spends approximately $1.7B per year on Aboriginal education programs and has primary responsibility over education for First Nations people on reserves. In May 2008, INAC's initiative to support foundational change in First Nations education was approved. A performance measurement system, termed the Education Information System (EIS), was identified as an essential tool to support this reform. EIS aims to support improved accountability for education programs, inform changes to policy and program development, and improve service delivery.

Funding of $26.6M for EIS system development and implementation and $450K for ongoing annual maintenance and support was approved for the project. Starting in 2008, external consultants were hired by INAC to manage the EIS project and develop the system. From 2008-2010, $5.5M was devoted to preliminary project planning and documentation, $0.8M of which was provided by existing Departmental funds. In June 2009, EIS was granted Preliminary Project Approval (PPA) for INAC's proposed EIS design. If Effective Project Approval (EPA) is granted, funding of $21.7M will be released to develop and implement the approved EIS design.

INAC's Audit and Evaluation Sector (AES) engaged Ernst & Young to conduct this audit of EIS while it was in its early pre-EPA developmental stages from June to October 2010. The audit results, which include the findings and recommendations in this report, are based on information made available to us for review between June and August 2010.

Objective and scope

The audit [Note 2] objective was to provide assurance that the EIS project is on track to deliver a system, processes and controls in September 2012 that will securely and reliably administer a comprehensive national education information resource for school/institution-based learning. The scope included the following three areas of risk identified during the planning phase of the audit:

• Implementation schedule
• Design feasibility
• Performance measurement outcomes

Methodology

A risk-based audit program was developed during the planning phase of the audit to focus on areas of greatest risk. The audit program was structured to include documentation reviews and interviews. All audit fieldwork was conducted at INAC national headquarters in Ottawa/Gatineau. The audit planning took place in June 2010, followed by the audit fieldwork from July to August 2010. Reporting took place from September to October 2010.

To allow INAC management to consider risks identified throughout the audit in a timely manner, we shared preliminary findings with INAC senior officials in mid-September 2010.

Findings and conclusions

Three audit findings were identified; to summarize, we found a lack of evidence to demonstrate that:

• risks can be effectively mitigated under the planned project timeline using a single-release implementation;
• the proposed implementation plan is strategically, technologically and economically feasible; and
• reliable and complete data will be collected to leverage the EIS technology and achieve the stated business outcomes.

While the EIS project appears to be on track to deliver a system that consolidates data, provides electronic reporting and increases the maturity and experience of the IT function at INAC, the complex functionality being invested in to enable performance measurement cannot be leveraged without a Departmental ability to collect complete and reliable data. EIS will be unsuccessful if INAC lacks the tools and information to leverage the technology, even if the system is built according to the technical specifications. Moreover, without a complete understanding of the performance measurement requirements and data, EIS requirements and expectations are likely to evolve, creating significant risk for the project.

As a result of the audit findings, we cannot conclude that the EIS project is on track to deliver a system, processes and controls in September 2012 that will securely and reliably administer a comprehensive national education information resource for school/institution-based learning.

Recommendations

We recommend that INAC examine its performance measurement expectations and validate that reliable and complete data will be gathered in a consistent manner to appropriately leverage EIS technology, prior to investing in system capabilities. A timeline of expected performance measurement outcomes and an estimate of the complete cost of ownership should then be developed. Finally, the EIS implementation approach, timeline and mitigation strategies should be re-examined objectively and independently; this examination should validate that the project is appropriately aligned to Departmental and Treasury Board expectations and best-suited to the business requirements.


1. Background

Indian and Northern Affairs Canada (INAC) supports province-like social programs, including education, in First Nation communities. In support of its mandate, INAC has a legal responsibility to ensure that these services compare in quality to those available to other Canadians. [Note 3] The Department spends approximately $1.7B per year on Aboriginal education programs and has primary responsibility over education for First Nations people on reserves.[Note 4]

In May 2008, an initiative to support foundational change in First Nations education, entitled Reforming First Nations Education, was approved. A performance measurement system, termed the Education Information System (EIS), was identified as an essential tool to support this reform. Funding of $26.6M for EIS system development and implementation, as well as $450K per year for its ongoing maintenance and support, was dedicated to the project.

The EIS project is an effort to develop a comprehensive national education information system for school/institution-based learning, in which INAC and First Nations have a shared interest and responsibility. The new system aims to support improved accountability for education programs, inform changes to policy and program development, and improve service delivery.

The development of EIS is one of the largest and most technically complex Information Technology (IT) projects the Department has undertaken. Starting in 2008, external consultants were hired by INAC to manage the EIS project and develop the system. From 2008-2010, $5.5M was devoted to preliminary project planning and documentation, $0.8M of which was provided by existing Departmental funds.

In June 2009, Preliminary Project Approval (PPA) for INAC's proposed EIS design was granted. According to the PPA TB submission, EIS will provide the opportunity to make a very significant contribution to understanding and improving education outcomes for First Nation students, improve the management of the education programs, and increase accountability by providing the following capabilities:

If EPA is granted, funding of $21.7M will be released to develop and implement the approved EIS design, scheduled for delivery in September 2012. The Audit and Evaluation Sector (AES) engaged Ernst & Young to conduct an audit of EIS while it was in its early developmental stages, prior to EPA.

To allow INAC management to consider risks identified throughout the audit in a timely manner, we shared preliminary findings with INAC senior officials in mid-September 2010.

This audit report provides further context and details the audit findings and recommendations.


2. Objective and scope

2.1 Objective

The objective of the audit was to provide assurance that the EIS project is on track to deliver a system, processes and controls that will securely and reliably administer a comprehensive national education information resource for school/institution-based learning.

2.2 Scope

The scope of the audit included the following broad areas:

This report is organized according to the following three areas of risk identified during the planning phase of the audit:

• Implementation schedule (Section 5.1)
• Design feasibility (Section 5.2)
• Performance measurement outcomes (Section 5.3)

The observations and findings in each of the three main areas of risk above are detailed in Section 5.

2.3 Scope limitations

Given that the EIS project is in its early planning stages, with system development to commence following EPA, some areas of review had to be excluded. These included the review and testing of system controls, disaster recovery plans, and interfaces. We reviewed documentation available between June and August 2010, which included the detailed requirements, the EIS Business Case and Project Charter.

Due to the ongoing nature of the EIS project, we understand that some of the issues identified in this report may have already been identified and addressed with an action plan by the project team. The observations in this report are based on information available during the fieldwork phase of the audit. Relevant information provided in management meetings during the reporting phase of the audit is incorporated where appropriate to provide further context to the findings and recommendations in this report. However, no further fieldwork was performed during the reporting phase to examine new areas of project progress and development.


3. Statement of assurance

Sufficient work was performed and the necessary evidence was gathered to support the findings, recommendations and conclusions contained in this report. The work was conducted according to a risk-based audit program developed collaboratively with INAC AES.

The risk-based audit program was based on Control Objectives for Information and related Technology, version 4.1 (COBIT 4.1) and the Project Management Institute's Project Management Body of Knowledge, version 4 (PMI PMBOK 4).

The audit was executed in conformity with the Internal Auditing Standards of the Government of Canada. The audit procedures were also aligned to the TB Policy on Internal Audit and related policy instruments, as well as to the International Standards for the Professional Practice of Internal Auditing. This audit does not constitute an audit or review in accordance with any Generally Accepted Auditing Standards (GAAS).


4. Methodology

4.1 Approach and timeline

The audit was conducted in three distinct phases:

• Planning (June 2010)
• Fieldwork (July to August 2010)
• Reporting (September to October 2010)

Given the size, complexity, and the current developmental stage of the EIS project, we recognize the importance of timely feedback. Accordingly, we communicated preliminary observations, risks and recommendations to senior INAC officials in September 2010.

4.2 Audit approach

A risk-based audit program was developed during the planning phase of the audit to focus on areas of greatest risk. The audit program was structured to include documentation reviews and interviews. All audit fieldwork was conducted at INAC national headquarters in Ottawa/Gatineau.

Because the audit was conducted pre-EPA, prior to the commencement of system development, no testing was performed. We did leverage research and leading practices in the following areas to assess the project plans and develop the audit program:


5. Findings and recommendations

EIS is an important tool to enable INAC's performance measurement objectives and to consolidate currently unmanageable educational reports in a "mineable" format. We recognize the strategic importance of this system to the Department and the inherent challenges in the field of performance measurement. We also recognize that significant work has been performed in the pre-EPA phase, with a considerable degree of project planning. We understand from the project team that over 4,500 planning documents have been developed. Several of the planning documents reviewed incorporate leading practices in project management.

This section identifies risks, gaps and areas of potential improvement to increase the probability of EIS success. The findings and recommendations are organized below according to the following three main areas of risk identified during the audit:

• Implementation schedule (Section 5.1)
• Design feasibility (Section 5.2)
• Performance measurement outcomes (Section 5.3)

5.1 Implementation schedule

EIS is scheduled to go live in September 2012, a deadline that is considered imperative for the project. Specifically, because educational reports are tied to school years, the new system must be implemented at the start of the annual Departmental reporting period (September). If the target year is missed, the next viable window will be a year away. We were informed by the project team that the September 2012 target must be met because funds for project resources cannot be sustained until the next annual reporting cycle. INAC has also made a public commitment to implement EIS at the start of the 2012 school year.

The EIS project team selected a single-release implementation approach, scheduled to go live in September 2012, as illustrated in Figure 1 below. The single-release implementation model introduces the finalized system into production at one time, as opposed to a phased model (where each implementation phase entails an "upgrade") or an evolutionary model (where a pilot, or partial implementation, is conducted prior to full implementation to mitigate risks).


Figure 1 - High-level project schedule [Note 6]

This graphic illustrates the EIS development cycle timeline in four-month intervals (from October 2010 to December 2012), indicating the key points in the cycle.

In phased and evolutionary implementations, results and "lessons learned" from production usage can feed into subsequent phases or releases. In contrast, a single-release implementation presents a finalized solution to an entire user base at once - similar to what is termed "big bang adoption" with respect to software implementations.

Big bang adoption is riskier than other adoption types because there are fewer learning opportunities incorporated in the approach, so significant preparation is required to get to the big bang. [Note 7] In addition to providing learning opportunities, phased implementations generally have the following advantages:

While the EIS Business Case uses the term pilot with respect to a production go-live in 2012, there is no evidence to demonstrate that this pilot will be used to leverage the above-listed advantages and learning opportunities. Rather, the Business Case suggests that deployment activities will begin in July 2012, with the system going live in September 2012 and stabilizing by December 2012.

Because EIS will be implemented in a single release, it is imperative that appropriate planning and testing has been conducted prior to going live, and that all critical issues are identified and addressed in advance of the go-live date. The following areas of risk associated with the implementation approach and schedule are discussed below:

• External dependencies (Section 5.1.1)
• Milestone scheduling (Section 5.1.2)
• Mitigation strategy (Section 5.1.3)

5.1.1 External dependencies

EIS is highly dependent on several factors outside of the project team's control. The following are aspects of the project that entail critical dependencies, which may determine whether the implementation targets and business outcomes are met:

Our observations regarding the above-listed project dependencies are detailed in Appendix A, with associated risk assessments based on pre-EPA project plans.

Some of the external project dependencies directly determine whether mandatory requirements are met, such as decommissioning legacy systems, while others play an important role in determining whether the project remains on schedule and within budget, such as cleansing the legacy data prior to conversion.

Given the single-release implementation approach and the September 2012 go-live date, the reliance on external dependencies is an area of significant risk. While external dependencies are inevitable in any project, the following EIS project planning and scoping decisions have elevated the associated risks:

The EIS project team has developed strategies to mitigate risks associated with project dependencies; accordingly, $3.26M has been allocated to a contingency fund. However, there is a lack of evidence to demonstrate that the strategies and contingency fund are sufficient to address the cumulative effect of multiple risk events (for example, the combination of multiple regional offices requiring additional resources to cleanse legacy data, timeline delays, and a significant degree of paper-based reporting once EIS goes live). This is of considerable importance given the number of project dependencies and the reliance on Departmental resources and infrastructure to support a pioneering performance measurement system.
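To make the cumulative-effect concern concrete, the following is a minimal Monte Carlo sketch. The $3.26M contingency figure comes from the project plans; every probability and cost impact in the hypothetical risk register below is an illustrative assumption, not a figure from the audit or the project documents.

```python
# Illustrative sketch (assumptions only): estimate how often several
# independent risk events, each plausible on its own, combine to exceed
# the $3.26M contingency fund.
import random

CONTINGENCY = 3.26e6  # contingency fund allocated in the project plans

# Hypothetical risk register: (probability of occurring, cost impact in $)
risks = [
    (0.30, 1.2e6),  # regions need extra resources to cleanse legacy data
    (0.25, 1.5e6),  # timeline delay extends consultant contracts
    (0.40, 0.9e6),  # higher-than-planned paper-based reporting after go-live
    (0.20, 1.0e6),  # FNITP integration resources unavailable on schedule
]

def simulate(trials=100_000):
    overruns = 0
    for _ in range(trials):
        cost = sum(impact for p, impact in risks if random.random() < p)
        if cost > CONTINGENCY:
            overruns += 1
    return overruns / trials

print(f"Share of trials exceeding contingency: {simulate():.1%}")
```

Under these assumed figures, no single risk event exhausts the fund, but combinations of two or more often do; this is the cumulative effect for which no sufficiency evidence was available.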

Due to the number of external dependencies, and the associated implications, the September 2012 go-live date requires adequate time buffers to allow for scheduling slippage outside of the project's control. However, as discussed in Section 5.1.2 below, there is a lack of evidence to demonstrate that the schedule includes sufficient buffering.

5.1.2 Milestone scheduling

Key milestones per the EPA project Business Case are illustrated below in Figure 2 with higher risk scheduling activities highlighted in red. Between July 2012 and September 2012, the project plans to conduct deployment activities, which include:


Figure 2 - High-level project schedule highlighting areas of higher risk in red [Note 9]

This graphic illustrates the EIS development cycle and project progression timeline, indicating the main points in the cycle and emphasizing the critical phase of the project (the critical deployment activities and EIS go-live).

One of the top reasons for government IT project failures, as outlined in the United Kingdom Office of Government Commerce (OGC) leading practices, is too little attention being paid to breaking project development into manageable work steps. There is a lack of evidence to demonstrate that this risk of undertaking unmanageable work steps prior to go-live is being effectively mitigated for the EIS project, particularly with respect to data conversion activities.

Specifically, within a three-month window prior to go-live, key tasks have little buffering to allow for project delays or unexpected events associated with external dependencies. This is particularly risky because several aspects of the data conversions from legacy systems are dependent on business owners, as detailed in Appendix A. Data conversion is a challenging aspect of the implementation, as the project team has highlighted in their risk analysis. However, there is a lack of evidence to demonstrate that the likelihood and impact of data conversion risks are "moderate", as qualified by the project team.

Data conversion and consolidation projects can impact organizational effectiveness and efficiency, which is especially true for projects involving "legacy" databases. Based on research and leading practices, data conversion usually requires more time than anticipated. As a result, conversions are often rushed, resulting in the identification of problems after the conversion is complete and the data quality has already suffered. To create a proper timeline and list of conversion expectations, preliminary data-profiling and a data quality assessment are required. [Note 10] With data integrity and data cleansing outside of the EIS project scope, the planned conversions to commence in July 2012 represent a significant project risk, further amplified by typical program and operational resource unavailability over summer months.
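As an illustration of the preliminary data-profiling that the cited leading practices call for, the following sketch profiles a hypothetical legacy extract. The file name, column names and duplicate-key rule are all assumptions for illustration; the real extracts would come from legacy systems such as the Nominal Roll.

```python
# Minimal data-profiling sketch for a legacy extract, of the kind a
# conversion timeline should be based on. All names are hypothetical.
import pandas as pd

df = pd.read_csv("legacy_extract.csv")  # hypothetical legacy extract

profile = pd.DataFrame({
    "null_rate": df.isna().mean(),      # share of missing values per column
    "distinct_values": df.nunique(),    # cardinality per column
    "dtype": df.dtypes.astype(str),
})
print(profile)

# Duplicate-key check: a conversion blocker if the target system assumes
# one record per student per school year (hypothetical key columns).
dupes = df.duplicated(subset=["student_id", "school_year"]).sum()
print(f"Duplicate student/year records: {dupes}")
```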

5.1.3 Mitigation strategy

Interviews with management resulted in inconsistent information regarding the impact of missing the September 2012 go-live date and the mitigation strategy. In particular, subsequent to communicating preliminary audit findings during the reporting phase of the audit, we were informed that the go-live date could be delayed by up to one month with little to no impact on the project delivery. Following this one month delay, the implementation could proceed with paper-based reporting and manual data entry. This information differed from that provided during the fieldwork phase of the audit.

Furthermore, it is not evident how EIS can be successfully deployed following a month-long delay with paper-based manual data entry; specifically, paper-based reporting is currently highly time-consuming for regional offices and a key driver for this system. There is no evidence to demonstrate that current resource levels, already insufficient, will be augmented in 2012 to accommodate a paper-based EIS system that meets the reporting timeliness objectives set forth in project plans.

In addition, the mitigation strategy communicated during the reporting phase of the audit involves the option of switching to a phased approach mid-development whereby components of the application would be deployed in advance of go-live, with the proposal/recipient and performance measurement reporting components being deployed in subsequent iterations (as opposed to one that "phases" over annual reporting cycles). No evidence of this potential plan was made available in interviews and documents (including the options-analysis documentation) during the fieldwork phase of the audit. At the time of the audit, the following was not evident:

Finding #1: There exists a lack of evidence to demonstrate that risks can be effectively mitigated under the planned project timeline using a single-release implementation. Specifically, the following issues introduce a significant level of risk to INAC:

• the reliance on numerous external dependencies outside of the project team's control (Section 5.1.1);
• the lack of scheduling buffers between key milestones, particularly for data conversion activities (Section 5.1.2); and
• inconsistent information regarding the impact of missing the September 2012 go-live date and the associated mitigation strategy (Section 5.1.3).

Recommendation #1: In consultations with project stakeholders and the Education Performance Measurement Strategy team, assess whether preventive controls can be implemented to mitigate the risks. In particular, consider the following potential activities to manage the scheduling risks:

1. Expand the EIS scope to include key external dependencies, particularly those for which INAC will eventually absorb the associated cost independent of the EIS project
2. Re-examine the requirement for a September 2012 go-live date and consider doing the following, with appropriate adjustments to the project plan, budget and communication strategy:
  a. Inserting greater scheduling buffers between key milestones, particularly for data conversion
  b. Introducing a pilot implementation prior to the single-release go-live or a phased roll-out to identify and address critical issues prior to introducing the system to all users

5.2 Design feasibility

As previously discussed in Section 5.1, it is unclear whether the proposed schedule and go-live date are feasible. Similarly, there is a lack of evidence to demonstrate that the proposed design and implementation strategy are best-suited to INAC's business requirements. Because an independent feasibility study has not been conducted to synchronously assess the project from all relevant viewpoints (specifically, from technological, performance-measurement-strategic and economic perspectives), it is difficult to ascertain whether EIS can meet the stated objectives and whether the proposed implementation strategy is optimal in terms of risk management, resource efficiency and approach effectiveness. [Note 11]

The EIS project team started with a fixed budget and a fixed timeline, prior to finalizing the requirements and objectives for EIS. As a result, once analysis was performed to estimate cost and complexity for the requirements, the project scope was adjusted to conform to the budget and timeline constraints. Therefore, as previously discussed, many project dependencies are outside of the EIS project scope, but entail additional costs to INAC and represent areas of strategic risk to the EIS project. Three aspects of feasibility are examined in the following sections:

• Technological feasibility (Section 5.2.1)
• Performance measurement strategic feasibility (Section 5.2.2)
• Economic feasibility (Section 5.2.3)

5.2.1 Technological feasibility

The single-release implementation approach was selected by the project team based on an "options-analysis" process conducted prior to PPA that considered five options. During the PPA phase, the project team determined that the only solution capable of meeting business requirements was a "custom-built Web-forms on-line" solution. Two implementation strategies were considered for this design:

• a single-release implementation, introducing the finalized system into production at one time; and
• a multi-phased implementation, deploying the system in successive releases.

During the PPA phase, the project team selected the single-release implementation approach using a team of consultants - a decision re-affirmed by project team consultants during the EPA phase.

While the Business Case states that a phased implementation would reduce risk and provide ability to learn and improve the system prior to final deployment, the team concluded that a multi-phased approach "would add little value and would introduce additional overhead by extending the length of the project to implement the same capabilities". However, the options-analysis made available for our review lacks information to justify the single-release implementation decision. Specifically:

In addition, no evidence was available to demonstrate why a team of consultants (to be retained under various procurement instruments) was selected over a single third-party service provider for the development and implementation of the custom-built Web-forms on-line system.

A second area of technological feasibility involves alignment with the Umbrella Education Performance Measurement Strategy. As further described in Section 5.3, if the business outcomes or requirements evolve, technical rework may be required, impacting EIS timeline and budget.

Due to the lack of information available, combined with the lack of an independent feasibility study, we were unable to validate that the implementation approach selected is optimal for INAC. More importantly, the lack of a high-level feasibility study to examine alignment between the EIS design and INAC's performance measurement strategic objectives creates significant enterprise risks for INAC.

5.2.2 Performance measurement strategic feasibility

As further discussed in Section 5.3, there is no evidence to demonstrate when, or if, all mandatory requirements and business outcomes will be met.

As well, there is no evidence to demonstrate what degree of electronic adoption by institutions is required for success. Specifically, the EIS system is being designed to accommodate paper-based reporting, which will require manual input at INAC regional offices. However, no evidence was made available to demonstrate what volume of manual input is feasible for INAC with current regional resources, regional operating budgets and project budget. While manual input is the current method of managing many educational reports, INAC has found that the volume of reporting results in inadequate analysis due to resource shortages. There was a lack of evidence available to demonstrate that the EIS hard-copy reports will be less time-consuming to enter into the EIS system than the current reports or that EIS adoption rates will be sufficient to reduce manual data entry.
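A back-of-the-envelope sketch of the missing analysis follows. Every figure in it (report volume, entry effort per report, staff capacity) is a hypothetical assumption, shown only to illustrate how the residual manual-entry workload scales with the electronic adoption rate.

```python
# Illustrative workload sketch (all figures are assumptions): regional
# manual data-entry effort remaining at a given electronic adoption rate.

REPORTS_PER_YEAR = 10_000    # hypothetical volume of recipient reports (DCIs)
MINUTES_PER_REPORT = 45      # hypothetical manual-entry effort per paper report
HOURS_PER_FTE_YEAR = 1_600   # hypothetical productive hours per regional FTE

for adoption in (0.5, 0.7, 0.9):
    paper_reports = REPORTS_PER_YEAR * (1 - adoption)
    ftes = paper_reports * MINUTES_PER_REPORT / 60 / HOURS_PER_FTE_YEAR
    print(f"Adoption {adoption:.0%}: {paper_reports:,.0f} paper reports, "
          f"about {ftes:.1f} regional FTEs for manual entry")
```

An estimate of this form, populated with actual report volumes and regional capacity, is what the audit found no evidence of.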

A more significant aspect of EIS adoption involves data quality and data integrity, which are integral to performance measurement. We were informed of significant data integrity issues embedded in the current educational data at INAC. Interviews indicated that EIS will act as an incentive for First Nations and institutions to provide higher-quality data and will reduce manual data-entry errors, thus addressing the current data quality challenges. However, this approach is inherently dependent on EIS adoption.

If data integrity is impaired following EIS implementation due to a lack of adoption and a high volume of paper-based reporting and manual data entry, the reports will not be reliable. An unreliable system may diminish the incentive for users to provide high-quality data, thus perpetuating the cycle of poor information. Both data integrity and data completeness are fundamental to the achievement of INAC's strategic objectives and to the success of EIS and are discussed further in Section 5.3.

5.2.3 Economic feasibility

Several costs to INAC are omitted from the EIS scope (including cleansing legacy data and negotiating and documenting internal and external data-sharing agreements), which will increase the total project cost to the Department. While these costs may be reasonable and accepted by key stakeholders, there was no evidence available to demonstrate that they have all been estimated or taken into account in the EIS pre-EPA stage, as detailed in Appendix A.

Another area of cost concern involves data quality and integrity, which is an inherent challenge for performance measurement across several industries, including education. In particular, significant data integrity issues involving the current Nominal Roll system and educational data at INAC were reported in a 2009-10 internal evaluation and communicated in audit interviews. [Note 12] No evidence was made available to demonstrate that the total cost associated with addressing data integrity issues, which includes both data cleansing and ongoing data quality maintenance, has been estimated within a reasonable degree of precision. Moreover, the uncertainties regarding the degree of system adoption discussed in Section 5.2.2 may increase these costs and lead to ongoing data-cleansing requirements.

As outlined in Section 5.1, there is a lack of evidence to demonstrate that the current mitigation strategies are sufficient to manage the high number of project dependencies and Departmental changes required to introduce EIS in a manner such that business outcomes are achieved.

5.2.4 Summary

In summary, we found no evidence to demonstrate that several aspects of project feasibility have been independently assessed with a Department-wide view of strategy, indirect costs and technology. Given the size and complexity of EIS, a feasibility study could mitigate risks by validating the following success indicators, in advance of significant resource and financial investment.

Based on pre-EPA project plans, we did not find sufficient evidence to ascertain the existence of the above-listed success indicators. To achieve a sufficiently broad perspective, the feasibility study should be conducted independently of the project team; this approach also avoids the possibility of a conflict of interest.

Finding #2: No independent feasibility study was conducted to assess the EIS implementation approach and to confirm results of the EIS options-analysis. In particular, given EIS size, complexity and number of project dependencies, there is a lack of evidence to demonstrate that the implementation plan is strategically (with respect to performance measurement), technologically and economically feasible.



Recommendation #2: INAC should assess, independently of the EIS project team, whether the EIS system design is strategically (with respect to performance measurement), technologically and economically feasible. The Department should estimate the total cost of EIS ownership, based on all business and system requirements and stated outcomes, within a reasonable degree of precision.
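A minimal sketch of the recommended total-cost-of-ownership estimate follows, using only funding figures stated in this report; the five-year horizon is an assumption, and the out-of-scope items are deliberately left as unestimated placeholders that the recommended estimate would need to fill in.

```python
# Total-cost-of-ownership sketch using figures from this report; the
# planning horizon and the placeholder items are assumptions.

development = 26.6e6          # approved development and implementation funding
maintenance_per_year = 450e3  # approved ongoing maintenance and support
years_of_operation = 5        # assumed planning horizon

# Out-of-scope costs identified by the audit but not yet estimated:
out_of_scope = {
    "legacy data cleansing": None,
    "data-sharing agreement negotiation": None,
    "First Nations hardware/software": None,
    "ongoing data quality maintenance": None,
}

known = development + maintenance_per_year * years_of_operation
print(f"Known {years_of_operation}-year cost: ${known / 1e6:.1f}M")
print("Unestimated items:",
      ", ".join(k for k, v in out_of_scope.items() if v is None))
```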



5.3 Performance measurement outcomes

EIS's success depends on its ability to support the Department's performance measurement objectives. Key elements of EIS success, beyond the achievement of project timelines, budget, and technical specifications, are outlined in Appendix B.

Of the approximately 119,000 elementary and secondary school students who reside on reserves in Canada, approximately 60% attend schools on reserves and 40% attend off-reserve schools, the majority of which are under provincial authority. In addition to measuring performance for primary and secondary schools, EIS aims to measure performance for students attending post-secondary schools as well as for youth wage subsidy programs, SchoolNet sites, and cultural centres that are funded through INAC. The project intends to collect data according to 18 performance indicators specified by the Umbrella Education Performance Measurement Strategy group.

While EIS is being designed to meet the mandatory requirements for performance measurement, we have not found sufficient evidence to demonstrate that the data-related aspects of achieving the mandatory requirements can be accomplished. That is, while the system may have the technological capabilities to measure stated metrics, the business outcomes cannot be achieved if the data needed to leverage these capabilities are not available and reliable. This is particularly important given that two key performance indicators for EIS success are data integrity and data completeness, as stated in project plans made available for review. These aspects of EIS project success are discussed in the following sections, followed by a summary of progress toward stated business outcomes.

5.3.1 Data integrity

"Successful adoption" is considered a critical success factor throughout the documented project plans. To employ Departmental lessons learned from FNITP, where system adoption targets were not met, the EIS project devoted a significant portion of their preliminary funding to attaining better adoption rates. However, there is no evidence to demonstrate how adoption "success" is quantified; specifically, we found no evidence to demonstrate that First Nations "adoption" of EIS has been sufficiently defined to resolve the following varying degrees of adoption:

To gather consistently high-quality data, user buy-in is essential. Given the current challenges around data integrity communicated in interviews and previous education evaluations, the uncertain adoption rates imply significant risk and impact the stated EIS business outcomes, as detailed further in Section 5.3.2 and in Appendix C.

Another risk associated with adoption involves the mandatory requirement to reduce data-entry workload for First Nations, regions and headquarters. This requirement is dependent on adoption and the degree of paper-based reporting; therefore, without quantified adoption targets, there is a lack of evidence to demonstrate this will be achieved. Other mandatory requirements and their dependence on data completeness are discussed in Section 5.3.2 below.

A further aspect of data integrity involves EIS data exchanges with the IRS. A mandatory EIS requirement is to verify the student's IRS number at data entry, including matches on birthday and last name. We have noted several data integrity issues regarding the IRS in an ongoing audit of the Secure Certificate of Indian Status (SCIS). Therefore, while the data validation with EIS may provide an opportunity to improve data consistency across systems, there may be significant operational challenges to overcome, particularly with a planned upcoming IRS development initiative. While not directly in support of findings and recommendations in this report, we recommend that data-sharing agreements be established with IRS owners without delay and that appropriate escalation protocols be established and documented to manage data differences in a timely manner.
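The following is a hedged sketch of the entry-time check described above (verify the IRS number and confirm that birth date and last name match the registry record). The lookup function, field names and escalation messages are hypothetical; the actual exchange would depend on the data-sharing agreements and escalation protocols recommended here.

```python
# Sketch of an entry-time IRS validation check; all names are hypothetical.
from datetime import date
from typing import Optional

def validate_against_irs(irs_number: str, last_name: str,
                         birth_date: date, irs_lookup) -> Optional[str]:
    """Return None if the record validates, else a reason for escalation."""
    record = irs_lookup(irs_number)  # hypothetical registry query
    if record is None:
        return "IRS number not found"
    if record["last_name"].casefold() != last_name.casefold():
        return "Last name does not match IRS record"
    if record["birth_date"] != birth_date:
        return "Birth date does not match IRS record"
    return None  # validated

# Example with a stubbed registry standing in for the real IRS exchange:
registry = {"1234567890": {"last_name": "Example",
                           "birth_date": date(2001, 9, 1)}}
print(validate_against_irs("1234567890", "example", date(2001, 9, 1),
                           registry.get))  # prints None (validated)
```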

5.3.2 Data completeness

In addition to hindering data integrity, low adoption rates may impact data completeness. Specifically, subsets of data supporting the 18 performance indicators specified by the Umbrella Education Performance Measurement Strategy were split by the program into three tiers as follows:

Of 45 subsets of data supporting the performance indicators, 20 are in tiers 2 and 3. There is a lack of evidence to demonstrate that Tier 2 data will be effectively collected by institutions that do not adopt EIS when the system goes live in September 2012. Similarly, even if INAC determines how to collect Tier 3 data in the future, there is a lack of evidence to demonstrate it can be effectively collected from institutions that do not adopt EIS.

It is unclear what percentage of institutions and percentage of students must be tracked by EIS to achieve the requirements and fulfill the project objectives. Provincial data-sharing agreements are not within the scope of the EIS project and are not expected to be in place when EIS is implemented. Provincial institutions encompass approximately 40% of the primary and secondary student population that INAC seeks to track. This directly impacts INAC's ability to meet mandatory requirements listed in the EIS requirements document.
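The following worked calculation uses only the student figures stated in this report (approximately 119,000 students, 60% attending on-reserve schools, 40% attending off-reserve schools) to illustrate how provincial data-sharing and the unquantified adoption rate bound data completeness; the adoption rates shown are illustrative assumptions.

```python
# Coverage arithmetic from figures stated in this report; adoption rates
# are illustrative assumptions.

students_total = 119_000        # elementary/secondary students residing on reserve
share_on_reserve = 0.60         # attend schools on reserve
share_off_reserve = 0.40        # attend off-reserve (mostly provincial) schools

# Without provincial data-sharing agreements, off-reserve students are
# untracked; on-reserve coverage then depends on the EIS adoption rate.
for adoption in (0.6, 0.8, 1.0):
    covered = students_total * share_on_reserve * adoption
    print(f"Adoption {adoption:.0%}: {covered:,.0f} of {students_total:,} "
          f"students tracked ({covered / students_total:.0%})")
```

Even at full on-reserve adoption, coverage is capped at roughly 60% of the target population until provincial agreements are in place.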

In addition to challenges with data being available and reliable, there was a lack of evidence to demonstrate how students would be individually tracked. The EIS project plans to leverage the IRS registration numbers assigned to Status Indians to track students; however, non-Status Indians, including Inuit, are not registered in the IRS. While plans for a unique student identifier are discussed in project documentation, there was no evidence available to demonstrate how such a tracking mechanism would operate (including how and when students are assigned a number).
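Purely for illustration, the following sketch shows one possible fallback from IRS registration numbers to surrogate identifiers. It does not reflect any design from the project documents, which, as noted, provide no evidence of how the planned mechanism would operate.

```python
# Hypothetical identifier fallback; not the project's design.
import uuid
from typing import Optional

def student_identifier(irs_number: Optional[str]) -> str:
    if irs_number:                    # Status Indian: reuse the IRS number
        return f"IRS-{irs_number}"
    return f"SUR-{uuid.uuid4()}"      # otherwise mint a surrogate identifier

print(student_identifier("1234567890"))
print(student_identifier(None))
```

Any such scheme would still need rules for when a surrogate is assigned and how it is reconciled if a student is later registered, which is precisely the kind of operational detail the audit found undocumented.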

While mandatory requirements may pertain specifically to system functionality, the business outcomes listed in the EIS requirements document are dependent on requirements being met at both a system-functionality level and at the programmatic level. That is, it is not sufficient that EIS be technologically capable of meeting requirements; the overarching education program must have the means to leverage this technology with complete, high-quality data to provide reliable and timely reports.

For example, a mandatory requirement is to track students and standardized test results between schools, both on and off reserve. However, there is no evidence to demonstrate that this can be measured (and the requirement met) if data is not gathered from all institutions, including provincial institutions. Because this data is currently not available, there is no evidence to demonstrate this requirement can be met without data-sharing agreements, sufficient EIS adoption and an ability to track individual students. Similarly, there is a high likelihood that the following mandatory requirements will not be met without sufficient data completeness:

The impact of not meeting these mandatory requirements on business outcomes is outlined in Table 1 below and further detailed in Appendix C.

Table 1 - Business outcomes impacted by data completeness and integrity, which depend on provincial data sharing and EIS adoption rates (further detailed in Appendix C)

Business outcomes at risk of not being met due to a lack of data completeness and integrity, with the assessed likelihood of each outcome not being achieved:

• Provide a complete picture of all education program results by school and community (where privacy permits), by region and nationally, as well as by geographic zone. Likelihood: Very high (no evidence the outcome can be met)
• Provide the context to meaningfully measure student outcomes (e.g. socio-economic, school infrastructure), which is not currently possible. Likelihood: Very high (no evidence the outcome can be met)
• Track students and cohorts over time from kindergarten through secondary school and potentially through post-secondary education, which is not currently possible. Likelihood: Very high (no evidence the outcome can be met)
• Implement INAC's First Nation education performance measurement strategy to provide more information on results, and contribute towards informed improvement and increased accountability for both INAC and First Nations. Likelihood: High (achievement unlikely)
• Improve management of the education programs and increase accountability by reducing the reporting burden, simplifying the reporting process, and providing timely reports on program management and performance measurement. Likelihood: High (achievement unlikely)
• Improve data quality. Likelihood: Moderate (unknown likelihood of achievement)
 

Data availability and integrity are critically important; without end-user buy-in, EIS reports may become unreliable, with entire student populations omitted from comparative analyses. Based on performance measurement research, the relationship between inputs and outputs is generally easy to measure, whereas the relationship between inputs and outcomes is much more challenging. [Note 13] It is not evident that performance can be effectively measured without a representative population of students tracked. For instance, students that transfer between provincial and band-operated schools, or that move on and off reserve, may skew performance indicator results, as it would be unknown whether these students dropped out.

Inconsistent awareness of performance measurement objectives was observed in interviews; specifically, management had differing expectations for EIS usage following implementation. While the system functionality expectations were communicated consistently, the expected ability and associated timeline for leveraging the functionality (in terms of usable reports) was unclear.

5.3.3 Summary

If it is determined that mandatory requirements and business outcomes cannot be met, at any point in the project development lifecycle, we would expect to find a reassessment of the project, objectives and proposed design. Specifically, it should be determined whether INAC's performance measurement goals can be met with partial student populations, and what degree of system adoption is required to successfully measure educational performance (i.e. statistical relevance). If the original business outcomes and mandatory requirements are considered essential to the Department, we would, similarly, expect to find further investments being considered to address these external dependencies. There was no evidence available to demonstrate that this analysis was conducted or that the mandatory requirements and business outcomes were revised to reflect data constraints.

A broad area of concern noted throughout audit interviews was the evolution of performance measurement requirements and the inconsistent expectations for EIS usage. Effective performance measurement is dependent on consistent reporting; trends can only be established if historical data is reliable and captured consistently on a periodic basis. Therefore, it is important that the data be of primary focus; data availability, reliability and expected usage should be well understood prior to developing a system. There was a lack of evidence available to demonstrate that the EIS plans are founded on stable performance measurement requirements and expected outcomes. The OGC indicates that a lack of co-ordination between project teams and departmental strategic objectives is the first of eight root causes for government IT project failures, highlighting the relevance of these strategic alignment risks.

In summary, it is not evident what performance measurement requirements can be met under the current project plans. While EIS appears to be on track to deliver a system that consolidates data, provides electronic reporting and increases the maturity and experience of the IT function at INAC, the complex functionality being invested in to enable performance measurement may not be leveraged without a programmatic ability to collect complete and reliable data.

Finding #3: There is a lack of evidence to demonstrate that EIS will result in the achievement of INAC's performance measurement objectives. While the technological capabilities may be in place to meet EIS requirements, there is no evidence to demonstrate that reliable and complete data will be in place and collected to leverage the technology and achieve the business outcomes.



Recommendation #3: In conjunction with Recommendation #2, INAC should clearly define what adoption rates and data are required to meet the mandatory requirements and stated business outcomes. Focus should be placed on the ability to collect reliable data prior to designing a system to house the data. Specifically, the performance measurement outcomes and associated timelines should be clearly defined to prevent overspend or rework on a system that cannot be effectively leveraged within a relevant timeline.


6. Conclusion

We cannot conclude that the EIS project is on track to deliver a system, processes and controls that will securely and reliably administer a comprehensive national education information resource.

Given the importance, size and complexity of EIS, we found a lack of evidence to demonstrate that the proposed EIS design and implementation strategy are feasible given the constraints and external dependencies. More importantly, there was no evidence available to demonstrate how the project will achieve the mandatory requirements and business outcomes, enabling the Department's performance measurement objectives. In particular, without data-sharing agreements with provinces and specified adoption targets by First Nations institutions, there is a high-likelihood risk that EIS will not meet mandatory project requirements and performance measurement objectives.

While EIS appears to be on track to deliver a system that consolidates data, provides electronic reporting and increases the maturity and experience of the IT function at INAC, the complex functionality being invested in to enable performance measurement may not be leveraged without a programmatic ability to collect complete and reliable data. EIS will be unsuccessful if INAC lacks the tools and information to leverage the technology, even if the system is built according to the technical specifications. Moreover, without a complete understanding of the performance measurement requirements and data, EIS requirements and expectations are likely to evolve, creating significant risk for the project.

We recommend that INAC examine its performance measurement expectations and validate that reliable and complete data will be gathered in a consistent manner to appropriately leverage EIS technology prior to investing in system capabilities. A timeline of expected performance measurement outcomes and an estimate of the complete cost of ownership should then be developed. Finally, the EIS implementation approach, timeline and mitigation strategy should be re-examined independently of the project to validate that they are appropriately aligned to Departmental and TB expectations and best-suited to the business requirements.


7. Management Action Plan

INAC's Audit and Evaluation Sector (AES) engaged Ernst & Young to conduct an audit of EIS while it was in its early pre-Effective Project Approval (EPA) development stages. The audit was conducted while the EPA Submission documents and associated project deliverables were being completed, and progress relevant to the audit findings has been made since, including approval by the Treasury Board Ministers. The management response reflects the progress the project has made since the documents reviewed by the audit team were drafted.

Based on the Audit Findings, the Audit Report made three (3) specific recommendations that are addressed below.

Table 2: Audit Recommendations

# | Audit Recommendations | Action Plan | Planned Date | Owner
1. In consultations with project stakeholders and the Education Performance Measurement Strategy team, assess whether preventive controls can be implemented to mitigate the risks. In particular, consider the following potential activities to manage the scheduling risks:
  1. Expand the EIS scope to include key external dependencies, particularly those for which INAC will eventually absorb the associated cost independent of the EIS project
  2. Re-examine the requirement for a September 2012 go-live date and consider doing the following, with appropriate adjustments to the project plan, budget and communication strategy;
    a. Inserting greater scheduling buffers between key milestones, particularly for data conversion
    b. Introducing a pilot implementation prior to the single-release go-live or a phased roll-out to identify and address critical issues prior to introducing the system to all users
The preventive controls recommended by the audit to mitigate scheduling related risks were identified and addressed during the finalization of the EPA Submission.

Key related activities during the EPA finalization process included:

  • The draft EIS scope was reviewed, and the resultant scope, cost and related project dependencies were approved by the ADM-ESDPP, the CFO and Treasury Board.
  • The need for a September 2012 go-live date was re-examined, validated and discussed in the finalized EPA submission. The updated project schedule calls for development of the EIS application to be complete in the spring of 2012, leaving significant schedule buffer for the September 2012 go-live target.
  • The principles of the Department's System Development Lifecycle (SDLC) will be fully leveraged to ensure the components of EIS software development deemed to be higher risk are included in early development iterations, thereby allowing for longer periods of system, integration and user acceptance testing.
  • A conversion strategy has been developed and reflected in the EPA Submission project plan. The conversion activity will be divided into 4 iterations over a period of 16 months starting in spring 2011. Converted data will be used in the 3 user test phases, providing an opportunity to validate the conversion process and data well in advance of implementation.
  • A detailed test plan was developed based on the existing test strategy that ensures all key stakeholder groups are represented in User Acceptance Testing and each of the four solution components (iterations) is extensively tested prior to go-live.
   
Additionally, the project will:

1. Develop a data cleansing plan based on the current conversion strategy that details roles/responsibilities, timelines and costs and is signed off by impacted stakeholder groups

2011-03-31 DG-Educ
2. Develop an implementation strategy based on the current implementation approach that:
  1. Includes appropriate risk mitigation strategies including the potential to release the solution in iterations
  2. Includes identification of opportunities to pilot or prototype specific solution components or features, as they become available
2011-03-31 CIO
2. INAC should assess, independently of the EIS project team, whether the EIS system design is strategically (with respect to performance measurement), technologically and economically feasible. The Department should estimate the total cost of EIS ownership based on all business and system requirements and stated outcomes, within a reasonable degree of precision. The recent completion of the EPA submission and approval by Senior INAC management and the Treasury Board provided the project with a review of technological, strategic and economic alignment. In order to optimize additional review opportunities, the project will ensure the two (2) independent validation and verification reviews that have been built into the project schedule include assessment activities to determine if the project is still on track and has adequately addressed the key audit findings.
  • This approach respects the validity of the Audit's recommendation for independent assessment and reflects the activities that have occurred during the EPA process that address immediate risk areas.
  • The project has incorporated two independent validation and verification activities into the schedule/budget to provide timely checks on progress and to identify any corrective actions that may be warranted.
  • A review of the draft project scope and all associated costs was undertaken during the EPA planning phase and the resultant scope and budget approved by ESDPP and IMB.
  • While work continues to refine responsibility and funding authority for key project dependencies, the project has developed mitigation strategies that will ensure the project can be delivered successfully without dependence on funding outside of ESDPP.
   
Additionally, the project will undertake the following actions:

1. Update the project forecast and identify project dependencies that can be funded by the Project

2010-11-30 DG-Educ
2. Confirm funding source for remaining dependencies 2010-12-31 ADM-ESDPP / CIO
3. Adjust the project plan to accommodate any shortfall related to the funding of project dependency activities 2011-02-28 DG-Educ
3. In conjunction with Recommendation #2, INAC should clearly define what adoption rates and data are required to meet the mandatory requirements and stated business outcomes. Focus should be placed on the ability to collect reliable data prior to designing a system to house the data. Specifically, the performance measurement outcomes and associated timelines should be clearly defined to prevent overspend or rework on a system that cannot be effectively leveraged within a relevant timeline. The project is actively addressing the ability to collect reliable data and will continue to address the related concerns by:

1. Obtaining Departmental approval of Education's Performance Measurement Strategy

2010-11-30 DG-Educ
2. Identifying key success criteria including adoption rates 2010-12-31 DG-Educ
3. Finalizing the current data mapping activities (tracing the availability of data required to support the planned performance measures) 2010-12-31 DG-Educ
4. Finalizing the changes to Data Collection Instruments required to support EIS performance measurement.

Notes:

  1. The Sponsor recognizes that while data cleansing activities will improve the quality of historical data, it is likely that some quality issues will remain and that there will be gaps in the historical information available.
  2. Additionally, the Sponsor recognizes that some data that EIS will capture in future may not be available at go-live due to dependence on the successful completion of tri-partite negotiations that are outside the project scope.
    • Currently the Nominal Roll, Education's primary source of information, is available for FN students funded by INAC and attending provincial schools. Tri-partite discussions are underway that have the potential to make new provincial information available (e.g. literacy and numeracy).
    • HQ and Regional Working Groups to improve data integrity will continue
2010-12-31 DG-Educ

Appendix A - Summary of key external dependencies

This Appendix lists critical EIS project dependencies discussed in Sections 5.1 and 5.2, ordered by the party responsible for the majority of associated costs (i.e. the "owner"). The audit observations and risks pertaining to each dependency are detailed in the table below, as well as a risk likelihood assessment pertaining to the following project areas of constraint:

• Schedule delays
• Additional costs
• Quality impaired

To summarize, we have identified 14 external dependencies which create risks in the three main areas of project constraint: schedule, cost and quality. Six of the dependencies are "owned" by the EIS project, while the remaining eight are the responsibility of First Nations and INAC business owners. Based on current project plans, several factors outside of the project team's control may impact project success; in addition, the project plans entail additional costs to the Department that are not included in the project budget.

Each entry below gives the dependency and its owner, followed by the observations, risks and reasoning for the likelihood assessment, and then the likelihood assessment for each area of constraint (schedule delays, additional costs, quality impaired).

IRS data exchange (Owner: EIS)
  • Several mandatory requirements involve IRS data exchange, including data quality checks against the IRS upon entering information into EIS
  • IRS may be undergoing an upcoming development project - no evidence that EIS project has taken this into account or what the impact on EIS development will be
  • May involve privacy implications due to data aggregation between multiple sources of personal information
High Moderate High
Alignment with Education Performance Measurement Strategy EIS
  • EIS is heavily dependent on the finalization of the performance measurement strategy and confirmation of the performance measures
  • Strategy changes will affect EIS development and timeline
  • The strategy itself may have dependencies, including data-sharing agreements and privacy considerations
High Moderate High
Development of FNITP system integration (result of changes to project plan following PPA) EIS and INAC
  • EIS is dependent on FNITP resources being available for integration design and development
  • Due to other initiatives in the Department that are competing for FNITP resources, there is a risk that analysts and developers will not be available according to EIS project schedule, which amplifies the following risks:
    • EIS compromised due to technological weaknesses pertaining to FNITP integration
    • FNITP compromised due to technological weaknesses pertaining to EIS integration
    • Potential costs to be absorbed by INAC
High Moderate Moderate
Resource transition between 2 separate procurement phases ("tiers") of contracting EIS
  • Multiple procurement vehicles will be used to secure third-party resources for project development and implementation
  • The timing and success of these contract commencements will impact the project; in particular, resource changes and discontinuities are a risk mid-EIS-development
High Moderate Low
EDW data exchange EIS
  • EIS is dependent on data from other systems being available in the EDW

  • No evidence that internal data-sharing agreements are in place to enable data exchange with the EDW
  • Development on EDW and on systems feeding EDW could impact EIS schedule and cost
High Moderate Low
Privacy constraints EIS
  • Privacy implications are a significant project dependency, as they may determine how data is collected, used (including aggregation between systems) and shared, and what agreements must be in place in order to do so
  • Privacy Impact Assessment (PIA) is underway, the outcome of which may require changes to the current EIS plans (plans assume a certain degree of Protected B data)
Moderate Moderate Moderate
First Nations' hardware and software First Nations
  • EIS may require some First Nations to acquire new hardware and software (costs of which may be covered under independent funding programs), which increases costs outside of EIS and could contribute to on-site technical challenges
  • May contribute to lower adoption rates and increased paper-based reporting
Moderate High Moderate
First Nations' Internet connectivity First Nations
  • A stand-alone connectivity solution for communities without Internet access was removed from EIS project scope following PPA
  • May increase paper-based reporting and impact EIS adoption incentives
Low Moderate High
First Nations adoption of EIS, including methods of use First Nations and INAC
  • System adoption is central to EIS success. Adoption includes several aspects, which include:
    • Willingness to track the required key performance measures
    • Data-sharing negotiations and agreements
    • School system compatibility, which may require some First Nations to invest in new hardware and software
    • Ability to provide electronic reports, which includes Internet connectivity
  • The degree of adoption, and the methods of usage, may impact data integrity (affecting performance measurement), manual data-entry workload (reduction of which is a mandatory requirement) and project schedule
Low High High
Data cleansing INAC
  • Cleansing legacy data, which must be conducted prior to converting data from legacy systems to EIS (in accordance with mandatory requirements), is not within EIS scope
  • The cost of cleansing is to absorbed by business owners; however, no evidence was available to demonstrate cost estimates or agreement from business owners
  • Could impact quality of data imported to EIS
High High High
Region-specific DCIs INAC
  • No evidence to demonstrate how the EIS project will ensure consistent measurement techniques
  • Potential added financial burden to INAC regions
  • Ten mandatory requirements pertaining to region-specific DCIs may be at risk of not being achieved
Moderate Moderate High
Data-sharing negotiations and establishment of documented data-sharing agreements INAC
  • No evidence of data-sharing agreements in place with:
    • Provincial education institutions
    • First Nations
    • INAC system business owners
  • Data from provinces, First Nations and other INAC systems are required to meet several mandatory requirements and business outcomes and to enable effective performance measurement
  • While provincial data-sharing agreements are considered independent of the EIS project, they determine whether mandatory requirements can be met
Moderate Moderate High
Data entry for paper-based reporting into EIS INAC
  • The following paper-based data-entry risks are highly dependent on EIS adoption rates and methods of reporting:
    • Increased costs to INAC regional offices (no evidence of cost estimates)
    • Potential back-log of data entry
    • Data errors impacting integrity of performance measurement reports
    • Mandatory requirement to reduce data-entry workload for First Nations and INAC may not be achieved
Not applicable* Moderate High
Ongoing support costs in excess of $450K annual funding - determined to be inadequate during PPA phase INAC
  • Ongoing support costs include the following, which are assumed to be covered by the Department and the First Nation Student Success Program:
    • Infrastructure and operations - software licenses, hardware, and maintenance
    • Application support and architecture - hardware and software maintenance
    • EDW - hardware and software maintenance
    • Ongoing training - support infrastructure, processes and resources
    • First Nation EIS power-users - training travel and annual accommodations
  • Cost estimates and assumptions for all of the above could not be confirmed; however, cost for training alone were estimated by the project to be $3.3M
Not applicable* High Moderate
Decommissioning of legacy systems INAC
  • No evidence of commitment from business owners to decommission legacy system within planned timeline
  • Decommissioning is a mandatory requirement that is outside of the EIS project scope and may not be achieved
  • Could result in "black-book" systems being used in conjunction with EIS post-go-live
Not applicable* Low High

* Considered "not applicable" to the project delivery schedule because these activities are to begin after the project delivery target date.
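For illustration only: the first dependency above notes that mandatory requirements call for data quality checks against the IRS when information is entered into EIS. The sketch below shows, in Python, what such a check could look like in principle. The record layout, field names (irs_number, surname, date_of_birth) and the stub registry are assumptions made for this sketch, not details taken from EIS or IRS specifications.

# Illustrative sketch only: field names and matching rules are hypothetical,
# not taken from EIS or IRS specifications.
from dataclasses import dataclass
from typing import Optional

@dataclass
class StudentEntry:
    irs_number: str        # hypothetical registry identifier
    surname: str
    date_of_birth: str     # ISO format, e.g. "2001-09-30"

def validate_against_registry(entry: StudentEntry,
                              registry: dict[str, dict]) -> list[str]:
    """Return a list of discrepancies between an EIS entry and registry data."""
    issues: list[str] = []
    record: Optional[dict] = registry.get(entry.irs_number)
    if record is None:
        issues.append("identifier not found in registry")
        return issues
    if record["surname"].casefold() != entry.surname.casefold():
        issues.append("surname does not match registry")
    if record["date_of_birth"] != entry.date_of_birth:
        issues.append("date of birth does not match registry")
    return issues

# Usage with a stub registry standing in for an IRS data exchange:
registry = {"1234567890": {"surname": "Smith", "date_of_birth": "2001-09-30"}}
entry = StudentEntry("1234567890", "Smith", "2001-09-30")
print(validate_against_registry(entry, registry))   # -> []

The pattern (look up the identifier, compare the fields, report discrepancies rather than silently correcting them) applies whether the exchange is automated or manually enabled; reporting, rather than auto-correcting, keeps data quality decisions with the business owner.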

 

 

Appendix B - Key elements of Education Information System success

The success of EIS involves not only IT project success, which can be defined as delivering a system to specification, but also the organizational ability to leverage the technology and achieve strategic objectives related to performance measurement. Based on the information made available during the audit, the criteria below are those we consider essential EIS success factors, beyond the achievement of project timelines, budget and technical specifications.

Key EIS success factors

Success in accordance with the key performance indicators for EIS outcomes stated in project plans, as follows: effective support of the Departmental education performance measurement strategy, including achievement of all mandatory requirements and the 11 business outcomes stated below:
  1. Provide a complete picture of all education program results by school, community, region, geographic zone, and nationally
  2. Provide the context to meaningfully measure student outcomes (e.g. socio-economic, school infrastructure)
  3. Track students and cohorts over time from kindergarten through secondary school and potentially through post-secondary education
  4. Implement INAC's First Nation education performance measurement strategy to provide more information on results, and contribute towards informed improvement and increased accountability for both INAC and First Nations
  5. Improve management of the education programs and increase accountability by:
    • Reducing reporting burden by minimizing redundancy and simplifying the reporting process
    • Improving data quality by streamlining information-gathering
    • Enabling early detection of anomalies to trigger automatic review of reports
    • Linking education program results with INAC's financial information to better assess expenditures relative to results
    • Providing timely reports on program management as well as performance measurement
  6. Improve data quality
  7. Work with First Nations to develop performance indicators that will be the basis for determining enhanced data requirements for EIS
  8. Support auditing functions to log activity (who, what and when) in accessing EIS information (an illustrative sketch of such logging follows this list)
  9. Improve the ability to train new users on an ongoing basis, and to train users on system changes
  10. Make EIS user-friendly to encourage First Nations usage of the system
  11. Accommodate varying levels of uptake by First Nations by providing user interfaces and data capture alternatives that reflect the varying levels of connectivity, user computer skills and comfort with the technology, thus reducing training requirements and minimizing technical and help-desk support
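Business outcome 8 above calls for logging who accessed what EIS information, and when. As a purely illustrative sketch (the record structure, field names and file-based storage are assumptions for this example, not EIS design details), such logging reduces to appending one time-stamped record per access:

# Illustrative sketch of "who, what, when" audit logging; structure and
# field names are hypothetical, not drawn from EIS design documents.
import json
from datetime import datetime, timezone

def log_access(log_file: str, user_id: str, action: str, resource: str) -> None:
    """Append one audit record capturing who did what, to which resource, and when."""
    record = {
        "who": user_id,
        "what": action,          # e.g. "view-report", "submit-proposal"
        "resource": resource,
        "when": datetime.now(timezone.utc).isoformat(),
    }
    with open(log_file, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")   # one JSON record per line

# Usage: record that a regional user viewed a school report.
log_access("eis_audit.log", "user-042", "view-report", "school/ON-0123/2010")

In practice such a log would also need to be tamper-resistant and retained in accordance with records management policy; the sketch shows only the who/what/when capture the outcome describes.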
 

 

Appendix C - Business outcomes dependent on sufficient adoption and data sharing

The table below provides further details on the business outcomes (outlined in Table 1 of Section 5.3) that depend on mandatory requirements that may not be met. Specifically, without sufficient data sharing with First Nations and provincial schools, there is a lack of evidence to demonstrate that EIS is on track to achieve these outcomes. While the system may be designed to provide the required technological capabilities, the business outcomes cannot be achieved unless data is available to leverage those capabilities.

Legend (based on pre-EPA status and project plans) - business outcomes at risk of not being met due to a lack of data completeness and integrity:

  • Very high likelihood of outcome not being achieved (no evidence the outcome can be met)
  • High likelihood of outcome not being achieved (achievement unlikely)
  • Moderate likelihood of outcome not being achieved (unknown likelihood of achievement)
Business outcome: Provide a complete picture of all education program results by school and community (where privacy permits), by region and nationally, as well as by geographic zone.
Observation and assessment: Dependent on data-sharing with all educational institutions, including provincial institutions. In addition, certain First Nations have negotiated special agreements or self-government and, in some cases, do not have the same reporting obligations (either not at the same level of detail, or not at all) in certain areas of the education program. This may also hinder the ability to provide a "complete" picture, in line with the business outcome.
Likelihood: Very high likelihood of outcome not being achieved (no evidence the outcome can be met)

Business outcome: Provide the context to meaningfully measure student outcomes (e.g. socio-economic, school infrastructure), which is not currently possible.
Observation and assessment: Dependent on advanced performance measurement techniques, as well as on the completeness and reliability of data, which are impacted by First Nations adoption rates and provincial data sharing.
Likelihood: Very high likelihood of outcome not being achieved (no evidence the outcome can be met)

Business outcome: Track students and cohorts over time from kindergarten through secondary school and potentially through post-secondary education, which is not currently possible.
Observation and assessment: Dependent on the completeness and integrity of data, which depend on First Nations adoption rates and provincial data-sharing, as well as on the capability to track students (and cohorts) who migrate on and off reserves.
Likelihood: Very high likelihood of outcome not being achieved (no evidence the outcome can be met)

Business outcome: Implement INAC's First Nation education performance measurement strategy to provide more information on results, and contribute towards informed improvement and increased accountability for both INAC and First Nations.
Observation and assessment: Dependent on the completeness and integrity of data, which depend on First Nations adoption rates and provincial data-sharing.
Likelihood: High likelihood of outcome not being achieved (achievement unlikely)

Business outcome: Improve management of the education programs and increase accountability by reducing the reporting burden, simplifying the reporting process, and providing timely reports on program management and performance measurement.
Observation and assessment: Dependent on adoption rates and methods of data collection (paper-based, PDF, on-line data entry, electronic extracts mailed to INAC or local system uploads).
Likelihood: High likelihood of outcome not being achieved (achievement unlikely)

Business outcome: Improve data quality.
Observation and assessment: Dependent on adoption rates and methods of data collection (paper-based, PDF, on-line data entry, electronic extracts mailed to INAC or local system uploads), as well as on effective data cleansing (an illustrative sketch of a basic cleansing check follows this table).
Likelihood: Moderate likelihood of outcome not being achieved (unknown likelihood of achievement)
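To illustrate the kind of pre-conversion cleansing check implied by the last outcome above, the sketch below flags duplicate identifiers and incomplete records in a batch of legacy data. The record layout and field names (student_id, school, grade) are hypothetical assumptions for this sketch; actual legacy data structures were not made available to the audit.

# Minimal sketch of a pre-conversion cleansing check: flags duplicate
# identifiers and missing mandatory fields in legacy records.
# The record layout is hypothetical, not taken from any INAC legacy system.
from collections import Counter

REQUIRED_FIELDS = ("student_id", "school", "grade")   # assumed mandatory fields

def cleansing_report(records: list[dict]) -> dict:
    """Summarize duplicate identifiers and records with missing fields."""
    ids = [r.get("student_id") for r in records if r.get("student_id")]
    duplicates = [i for i, n in Counter(ids).items() if n > 1]
    incomplete = [r for r in records if any(not r.get(f) for f in REQUIRED_FIELDS)]
    return {"total": len(records),
            "duplicate_ids": duplicates,
            "incomplete_records": len(incomplete)}

# Usage with a small batch of made-up legacy records:
legacy = [
    {"student_id": "A1", "school": "S-01", "grade": "7"},
    {"student_id": "A1", "school": "S-02", "grade": "8"},   # duplicate identifier
    {"student_id": "",   "school": "S-03", "grade": "6"},   # missing identifier
]
print(cleansing_report(legacy))

A report of this kind quantifies the cleansing effort before conversion begins, which is the kind of cost-estimate evidence the audit found lacking.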
 

 

Footnotes:

  1. Also referred to as "the Department" in this report
  2. The audit was executed in conformity with the requirements of the Treasury Board Policy on Internal Audit and followed the Institute of Internal Auditors' Standards for the Professional Practice of Internal Auditing. It does not constitute an audit or review in accordance with Generally Accepted Auditing Standards (GAAS).
  3. INAC Social Programs
  4. INAC Education
  5. Based on interview, this capability refers to storing data such that users can generate reports on annual results as well as on historical trends for cross-sectional populations (e.g. student cohorts)
  6. Timeline is as described in the EIS Business Case
  7. K. Eason (1998), "Information Technology and Organisational Change", Taylor & Francis; M. A. Ardis and B. L. Marcolin (2001), "Diffusing Software Product and Process Innovations", Kluwer Academic Publishers; Organisation for Economic Co-operation and Development (2007), "Performance Budgeting in OECD Countries"
  8. A. Maydanchik (1997), "Data Quality Assessment" (Data Quality for Practitioners Series), Technics Publications LLC
  9. Timeline is as described in the EIS Business Case
  10. A. Maydanchik (1997), "Data Quality Assessment" (Data Quality for Practitioners Series), Technics Publications LLC
  11. H. Kerzner (2004), "Advanced Project Management: Best Practices on Implementation", John Wiley and Sons
  12. "Formative Evaluation of the Elementary/Secondary Education Program On Reserve"
  13. Institute of Public Administration of Canada (IPAC), 2008
 
 
