Press Release – New Zealand Treasury
Media Statement: Treasury releases second annual benchmarking report
7 Mar 2012
Today, the Treasury released the second annual administrative & support (A&S) services benchmarking report, which provides findings of the cost, efficiency, and effectiveness of A&S services across the State sector.
Findings are based on data from two reporting periods (Financial Years 2009/10 and 2010/11) and results cover six A&S service functions across 31 agencies: Human Resources (HR); Finance; Information and Communications Technology (ICT); Procurement; Property Management; and Corporate and Executive Services.
This year’s report shows the 31 agencies spent $1.722 billion on A&S services in 2010/11 – a reduction of $20.4 million in real terms, or 1.2%, from 2009/10.
The report responds to Government demands for better, smarter public services for less and stronger performance management across the State sector. One part of delivering better public services is ensuring money is not unnecessarily spent on back office administration, when redirecting it to frontline services would yield better results.
While some agencies reported significant savings, the report shows substantial opportunities remain to reduce spending and improve the efficiency and effectiveness of services. It finds that lifting agency performance to moderate efficiency targets would realise gross savings of over $250 million annually.
Andrew Kibblewhite, Treasury’s Deputy Chief Executive, said that “fragmentation is a systemic obstacle to efficiency that can be overcome with cross-agency collaboration. Key opportunities to realise efficiency gains include leveraging knowledge and scale across agencies; streamlining, automating, and standardising processes; and having more common systems.”
There are a number of agency-specific and all-of-government initiatives that are acting on opportunities for collaboration, and the Treasury expects to see improvements in future reports as a result of this activity. Mr Kibblewhite said, “We need to recognise that improvement activities take investment and time to yield results. We are using this report to track results of programmes across government to see if these investments are making a difference and to learn from what works and what doesn’t.”
Chief executives face increasing public expectation and fiscal constraint, and delivering ‘better for less’ requires high quality advice from A&S services. “The pursuit of efficiencies and savings cannot undermine A&S service quality. Having a low cost, low performing Finance, Procurement, ICT or Human Resources function is a false economy. This year’s report shows that the maturity of A&S service functions has improved across most functions, and that some functions are getting more strategic, for example, more agencies reported having long-term workforce plans,” Mr Kibblewhite said.
The Treasury will continue to benchmark A&S services annually to build valuable trend information and to track the results of improvement programmes across government.
Administrative & Support Services Benchmarking Report for Financial Year 2010/11 (7 March 2012)
Data: bnchmrk201011.xls

1. How many benchmarking reports have been published?
This is the second annual administrative and support (A&S) service benchmarking report for the New Zealand (NZ) State sector. In December 2010, Cabinet directed selected larger agencies to undertake an annual A&S service benchmarking exercise.[The Treasury, Better Administrative and Support Services Programme: Report on Phase One Findings and Proposal for Phase Two, Wellington. CAB Minute (10) 38/4B directed departments with more than 250 FTEs to submit performance data to the Treasury each year.] Measured agencies are a mix of larger departments and Crown Entities. The first report was published in April 2011. This second report uses the same metrics as the first (with limited exceptions) to enable time series analysis. Background, Executive Summary
2. What does the report provide?
This report provides information on the cost, efficiency, and effectiveness of A&S services in the State sector. Consistent performance information across agencies gives transparency over a significant area of expenditure and provides an evidence base for assessing performance. Purpose, Executive Summary
3. What is the purpose of the report?
This report identifies gross savings possible if agencies reach a range of efficiency targets by function. For example, for the Property function, $34 million could be saved if agencies met a target of 16m2 per full-time equivalent (FTE) and the surplus accommodation could be sub-let or released back into the market, and over $62 million could be saved if agencies met a target of 13m2 per FTE. It is important to note that these scenarios use illustrative targets, that agency-specific targets may differ from these, and that gross savings should not be confused with net savings. Purpose, Executive Summary
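The scenario arithmetic behind these figures can be sketched as follows. All input figures below (occupancy cost, footprint, FTE count) are invented for illustration; the report’s $34 million and $62 million estimates come from actual agency data.

```python
# Hypothetical illustration of the gross-savings scenario logic for the
# Property function. Every input figure here is invented, not from the report.
cost_per_m2 = 300.0        # assumed annual occupancy cost, $ per m2
current_m2_per_fte = 20.0  # an agency's current office footprint
target_m2_per_fte = 16.0   # illustrative efficiency target
ftes = 1000                # FTEs accommodated

# Gross saving is realised only if the surplus space can be sub-let or
# released back into the market, and it is not the same as net saving.
surplus_m2 = max(0.0, (current_m2_per_fte - target_m2_per_fte) * ftes)
gross_saving = surplus_m2 * cost_per_m2

print(f"${gross_saving:,.0f} per year")  # $1,200,000 per year
```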
4. Why was this report written?
This report responds to Government demands for better, smarter public services for less. The current economic climate drives the Government’s focus on delivering services more efficiently and effectively and redirecting resources from A&S services to higher priorities, including services to the public, where possible. The performance information in this report helps agencies better understand the cost and quality of their internal services and make sound resource allocation decisions. This report also responds to Government demands for stronger performance management practices in the State sector. Performance management involves using performance information to agree to targets; allocate and prioritise resources; and track, report, and learn from success. Performance information also identifies top performers and opportunities to share knowledge and practices. Performance management is desirable in any economic climate and is applicable to both A&S services and services to the public. Background, Executive Summary
5. What data is the report based on?
Findings are based on data from two reporting periods (Financial Years 2009/10 and 2010/11), and results cover six A&S service functions across 31 agencies. Functions include Human Resources (HR); Finance; Information and Communications Technology (ICT); Procurement; Property Management; and Corporate and Executive Services (CES). Background, Executive Summary
6. What are the findings in this report based on?
Findings about changes in service performance are based on data from two reporting periods: FY 2009/10 and FY 2010/11. Appendix 3 has information on the scope of the benchmarking study for each reporting period. While some information is available for FY 2008/09 from a pilot measurement exercise, it is not used in this report: the limited number of agencies that participated in the pilot, and changes to metrics and definitions since then, limit the value of time series analysis. Scope of the report, Context
7. Does the report prescribe agency specific targets?
This report does not make agency-specific findings or recommendations, and it does not prescribe targets for agencies. Agencies across the State sector are working to lower the cost and strengthen the efficiency and effectiveness of A&S services. While this report identifies general opportunities across agencies, agencies set their own targets based on their understanding of their operations, including the costs, benefits, and risks of pursuing specific targets. Purpose, Executive Summary
8. Which administrative and support service functions are discussed in the report?
Results cover six administrative and support (A&S) service functions. This report features a chapter specific to each of the following functions: Human Resources (HR), Finance, Information and Communications Technology (ICT), Procurement, Property Management, and Corporate and Executive Services (CES). The latter includes but is not limited to Legal Services, Communications, and Information Management. Function definitions are in Appendix 4. Scope of the report, Context
9. Who wrote this report?
The Treasury is responsible for providing an annual benchmarking service across the public service and for compiling this report. This role involves providing practical support to measured agencies during data collection, validating and analysing data, producing a summary report, and working with practitioners to strengthen the metric set based on lessons learnt. The Treasury completes most work in-house and draws on third parties such as the American Productivity & Quality Center (APQC) and The Hackett Group for comparator data and specialist analysis as required. It also liaises with other governments to access comparator data and lessons learnt from similar exercises overseas. Measurement and benchmarking approach, Context
10. Have practitioners had input into the development of the report?
Metrics have been selected with practitioners’ input, and leading State sector practitioners provide insight into the findings for each function. Metric result findings in each chapter are prefaced by expert commentary from senior managers in government playing a lead role in initiating or executing cross-agency reform programmes for a specific function. They are in a unique position to observe the key trends in findings across agencies and provide an update on current improvement initiatives that can have an impact on future performance. Scope of the report, Context
11. What methodology has been followed?
The Treasury’s approach to benchmarking is adapted from established international methodologies. Rather than building a bespoke methodology, the New Zealand agency benchmarking exercise adopted metrics and methods from the UK Audit Agencies (UKAA) and two leading international benchmarking organisations: APQC and The Hackett Group. Measurement and benchmarking approach, Context
12. Was there consistent measurement practice across agencies and international comparator groups?
Agencies used common definitions and data collection practices, and these definitions and practices are aligned with those used by three main sources of comparator data: UKAA, APQC, and The Hackett Group. This consistency is foundational to the comparability of results and usefulness of management information. Quality of management information, Context
13. What is the quality of data submitted?
Overall, data quality is high.
Where there are concerns with data quality, the underlying problems stem from the limited maturity of measurement methods and are common in the private and public sectors around the world. Two functions in the benchmarking exercise are particularly difficult to measure:
• Procurement: The highly devolved nature of the Procurement function makes it hard to measure consistently because measurement only captures costs where procurement activities make up more than 20 percent of a person’s time. While these data collection practices are consistent with international practice, they lead to an understatement of the cost of Procurement in agencies with a devolved function and are less reliable for comparison between agencies and over different reporting periods.
• CES: Organisations around the world undertake a wide range of activities within this function without standard definitions, and it is not common for them to benchmark these services. When they do benchmark, the quality of management information is impaired by data inconsistency and a limited pool of reliable comparator data in New Zealand or internationally.
14. Are the results comparable?
Results are broadly comparable, but they need to be understood within the context of each organisation. While agencies have common features, each has its own functions and cost drivers. Benchmarking results are a guide to relative performance, and conclusions regarding efficiency and effectiveness should be made in light of each agency’s operational context. Quality of management information, Context
15. How much did the measured agencies spend on Administrative and Support services?
The 31 agencies spent $1.722 billion on A&S services in Financial Year (FY) 2010/11, and the distribution of A&S service expenditure shows that ICT continues to make up the bulk (57%) of expenditure. Agencies that were measured in both FY 2009/10 and FY 2010/11 reported a nominal A&S spending increase of nearly $19 million, which is a reduction of over $20 million when adjusted for inflation. A&S nominal spending was $1.704 billion in FY 2009/10 and $1.722 billion in FY 2010/11, an increase of $18.8 million or 1.1 percent. When adjusted for inflation, the $1.704 billion spent on A&S services in FY 2009/10 is $1.743 billion in FY 2010/11 dollars, representing a $20.4 million (or 1.2 percent) reduction.[Inflation-adjusted costs are based on the annualised average Consumer Price Index (CPI) increase of 2.3 percent, excluding the Goods and Services Tax (GST) increase.] Findings, Executive Summary
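The inflation adjustment can be reproduced from the published figures. Because the published figures are rounded, this sketch gives a reduction of roughly $21 million rather than the reported $20.4 million, which is based on unrounded data.

```python
# Real-terms comparison of A&S spending using the published rounded figures.
CPI_INCREASE = 0.023  # annualised average CPI increase, excluding the GST rise

nominal_0910 = 1.704e9  # FY 2009/10 nominal A&S spend
nominal_1011 = 1.722e9  # FY 2010/11 nominal A&S spend

# Restate FY 2009/10 spending in FY 2010/11 dollars.
adjusted_0910 = nominal_0910 * (1 + CPI_INCREASE)
real_change = nominal_1011 - adjusted_0910  # negative => real-terms reduction

print(f"FY 2009/10 spend in FY 2010/11 dollars: ${adjusted_0910 / 1e9:.3f}b")
print(f"Real-terms change: ${real_change / 1e6:.1f}m")
```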
16. How is total office accommodation (m2) per FTE calculated?
Total office accommodation (m2) per FTE is calculated as the net leasable area of office buildings divided by the number of FTEs accommodated in those buildings; it is not workstation size. Net leasable area (or net lettable area) is defined as the area of the building over which rents are usually charged.
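The metric is a simple ratio; a minimal sketch with hypothetical figures:

```python
# Illustrative calculation of total office accommodation (m2) per FTE.
# Both input figures are hypothetical.
net_leasable_area_m2 = 8000.0  # area over which rents are usually charged
ftes_accommodated = 500        # FTEs housed in those buildings

m2_per_fte = net_leasable_area_m2 / ftes_accommodated
print(m2_per_fte)  # 16.0
```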
17. How were the NZ cohorts identified?
Measured agencies are grouped into three NZ agency cohorts. To support comparisons of agencies with the greatest operational similarities, agencies are grouped using the following criteria: size of operating budget, number of organisational FTEs, agency type by primary function, and distribution of people/services. Using these criteria, measured agencies fell into three groups of roughly equal size, each with a profile sharing at least three of the four criteria; the cohorts are described in Appendix 3 of the report.
18. How were the metrics decided?
Metrics were selected with measured agencies. Three principles guided metric selection:
• Metrics reflect performance – they provide meaningful management information.
• Results can be compared – they are comparable across NZ agencies and comparator groups.
• Data is accessible within agencies – the measurement costs are reasonable.
The final selected metrics were those most relevant and measurable in the New Zealand State sector environment. Measured agencies used a consistent underlying taxonomy based on definitions from the UK Audit Agencies, the American Productivity & Quality Center, and The Hackett Group. Measurement and benchmarking approach, Context
19. How were the agencies selected for measurement?
In December 2010, Cabinet agreed that departments with more than 250 FTEs be required, and Crown Agents be expected, to make an annual submission of A&S service performance data to the Treasury. Note that the Ministry for Culture & Heritage, the Ministry of Transport, and the State Services Commission all have fewer than 250 FTEs and participated in the measurement exercise on a voluntary basis.
The 31 Public Service Departments, Non-Public Service Departments and Crown Agents that participated in the FY2010/11 exercise are listed alphabetically below.
• Department of Building and Housing
• Department of Conservation
• Department of Corrections
• Department of Internal Affairs
• Department of Labour
• Housing New Zealand Corporation
• Inland Revenue
• Land Information New Zealand
• Ministry for the Environment
• Ministry of Agriculture and Forestry
• Ministry for Culture & Heritage
• Ministry of Economic Development
• Ministry of Education
• Ministry of Fisheries
• Ministry of Foreign Affairs and Trade
• Ministry of Health
• Ministry of Justice
• Ministry of Social Development
• Ministry of Transport
• New Zealand Customs Service
• New Zealand Defence Force
• New Zealand Fire Service Commission
• New Zealand Police
• New Zealand Qualifications Authority
• New Zealand Tourism Board
• New Zealand Trade and Enterprise
• New Zealand Transport Agency
• State Services Commission
• Statistics New Zealand
• Te Puni Kokiri (Ministry of Maori Development)
• The Treasury