Benchmarking in Australasian Botanic Gardens and Zoos

Stephen Forbes

Senior Assistant Director,
Royal Botanic Gardens, SYDNEY NSW AUSTRALIA




Abstract

The Australasian Botanic Gardens and Zoo Horticulturists Forum held in Alice Springs in 1997 resulted in a commitment to undertake a benchmarking exercise for horticultural programs and plant collections within botanic gardens, zoos and significant historic or amenity gardens. The initial objective was to prepare a profile of participants, to provide an overview of the business environment and to progress the development of industry benchmarks. A benchmarking profiling survey was developed by a working group from the Forum, and forwarded to participating agencies through Parks Victoria's Benchmarking and Best Practice Group.

The structure of the profiling survey considers the following areas:

Review of the profiling survey data has stimulated a dialogue between active participants in relation to defining and measuring outputs, and ultimately in relation to the development of best practice through process benchmarking.

The benchmarking survey represents a milestone for horticultural and plant collections management. The results allow refinement of key processes for benchmarking, and highlight the paucity of output measures in botanic gardens. Future directions for botanic gardens benchmarking programs are proposed.


Introduction

The Australasian Botanic Gardens and Zoo Horticulturists Forum held in Alice Springs in 1997 resulted in a commitment to undertake a benchmarking exercise for horticultural programs and plant collections within botanic gardens, zoological gardens and significant historic or amenity gardens. The initial objective was to prepare a profile of participants to provide an overview of the business environment and to progress the development of industry benchmarks. The profiling survey was developed by a working group from the Australasian Botanic Gardens and Zoo Horticulturists Forum, and forwarded to State and regional botanic gardens, zoological gardens and significant public historic or amenity gardens in late 1997 through Parks Victoria's Benchmarking and Best Practice Group.

The benchmarking survey represents a milestone for horticultural and plant collection management. An outline of the results of the initial profiling survey is presented in this paper. The results illustrate the importance of identifying and defining processes to facilitate comparisons. Nevertheless, analysis of the results assists in furthering our understanding of objectives, processes and outcomes, and provides the basis of an essential dialogue leading towards the achievement of best practice. Review of the profiling survey data has stimulated this dialogue between active participants in relation to defining and measuring outputs, and ultimately in relation to the development of best practice through process benchmarking.


What is Benchmarking?

A dictionary definition of benchmarking provides a reasonable starting point. A benchmark is defined literally as a 'mark cut in rock etc. by surveyors to mark a point in a line of levels', and figuratively as a 'criterion or point of reference'. The use of the term benchmarking in business processes is analogous - Xerox defines benchmarking as 'the continuous process of measuring our products, services and practices against our toughest competitors or those companies known as leaders'. The simpler definition of benchmarking as 'the search for and implementation of best practices' is widely accepted (Camp 1995).

Benchmarking in business focuses on:

The detail of benchmarking methodology is effectively presented in Robert Camp's Benchmarking: the search for industry best practices that lead to superior performance (Camp 1989).


Why Benchmark?

Botanic gardens and zoos, in common with any business, are concerned with the effective and efficient application of resources to allow them to achieve outcomes reflecting their mission. Without measuring performance against the achievements of other businesses, it is difficult to understand internally how well a botanic garden or zoo is performing a particular process. Benchmarking is a process which allows participants to measure how they are performing in relation to others in the same field, or in relation to comparable industries, and allows analysis of opportunities for improving performance in individual products, services and practices.

Externally, benchmarking programs provide a powerful illustration of an agency's accountability. The ability to demonstrate effective performance in relation to industry benchmarks provides clear evidence of good internal governance, and can also provide a significant support for public and private fundraising programs. The link between accountability and advocacy is a particularly important one in increasingly competitive environments.

In most botanic gardens and zoos there is a strong commitment to comprehensive financial reporting, often demonstrated by the investment of considerable resources in financial management. Whilst statutory requirements, or at least a concern for transparent financial management, may drive financial reporting, the same commitment is rarely demonstrated in performance measurement. Yet financial management is a support activity; the core business that delivers a botanic garden or zoo's mission requires at least a similar commitment to ensuring effective, efficient and appropriate delivery.


Do market testing and effective performance indicators provide an alternative to benchmarking?
Benchmarking has been criticised in the public sector as '… backward looking' and '… unable to reveal the scope of new or innovative solutions' (Schapper 1995). This criticism is perhaps driven by a government commitment to privatisation, but may be fairly directed at benchmarking programs that avoid lateral or innovative solutions. However, effective benchmarking has the potential to deliver more radical solutions than the market. The key issue for benchmarking is the delivery of best practice - in this context the application of market testing strategies and performance indicators is better viewed as complementary to an effective benchmarking strategy, rather than as an alternative.


Market Testing
Market testing (or competitive tendering), which relies on the market to deliver cost effective solutions to providing individual products, services and practices, may be viewed as a simple alternative to benchmarking. Market testing, and subsequently competitive tendering, has been applied in many botanic gardens and zoos for processes considered as being outside core business. For example, many botanic gardens and zoos out-source waste management, cleaning and signage production. However, market testing solutions should be reviewed in the context of an agency's mission and strategic direction. Financial gains achieved in competitive tendering of individual processes require review in the context of a strategic approach leading towards the achievement of long term goals.


Performance Indicators
Performance indicators provide a valuable tool for reporting trends in effectiveness and efficiency on a regular basis. However, performance indicators per se are a measurement rather than an analytical tool, and fail to test the fundamental processes delivering a business product, service or practice. Accordingly, analysis of performance indicators is unlikely to reveal new opportunities for improving performance. Nevertheless, performance indicators reflecting outputs in key result areas can demonstrate continuing improvement, and provide benchmarks which may be effectively utilised in a benchmarking strategy.
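
As an illustration only, the minimal sketch below (plain Python, with an invented indicator and invented figures, not drawn from any participating agency) shows how a simple output indicator such as annual visitation might be reported as a year-to-year trend.

# Hypothetical indicator series for one agency (figures invented for illustration).
visitors_per_year = {1995: 610_000, 1996: 640_000, 1997: 675_000}

def year_on_year_change(series):
    """Return the percentage change in an indicator from the previous year."""
    years = sorted(series)
    return {year: round(100.0 * (series[year] - series[prev]) / series[prev], 1)
            for prev, year in zip(years, years[1:])}

print(year_on_year_change(visitors_per_year))   # {1996: 4.9, 1997: 5.5}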


Where to Start?
Camp (1995) summarises the formal benchmarking process (Camp 1989) as follows:

Communicate benchmarking findings and gain acceptance

Whilst Camp (1989) provides an explicit methodology for benchmarking, careful planning and input from experienced benchmarking practitioners will facilitate an effective benchmarking program. Nevertheless, benchmarking is a heuristic process, and experience will improve understanding of opportunities for improving performance. The key issues are identified in the first two points above - determining which business processes to benchmark, and against whom.


Profiling Survey Design
As with other businesses, the fundamental questions for botanic gardens, zoos and historic and amenity gardens are what to benchmark, and with whom. In order to answer these questions, a profiling survey was used as a starting point to provide a scan of the industry, identifying key business processes and potential partners.

The intention of the profiling survey was to provide the basis for a broad dialogue to establish what the industry sees as key processes, and to establish the commitment of potential partners. A draft profiling survey was prepared and circulated for review by the Australasian Botanic Gardens and Zoo Horticulturists Forum Benchmarking Working Group and Parks Victoria's Benchmarking and Best Practice Group. The survey was refined and forwarded to participants from the Forum and additional participants identified by Parks Victoria. The organisations that participated by completing and returning the profiling survey are listed in Appendix 1. The structure of the profiling survey considers the following areas:


Profiling Survey Results and Analysis
The results of the profiling survey present an overview of participating agencies. The next step is to determine which processes to benchmark with which partners. Analysis of the profiling survey provides basic data that allows participants to identify potential partners for a benchmarking program as a whole, and for benchmarking individual processes. Agencies can, for example, identify others with common concerns and programs, reflected in visitation levels, horticultural programs, revenue streams or environmental management.

As the profiling survey is only intended to provide a scan of participating agencies, application of the profiling survey data per se is limited. However, some worked examples analysing the survey data may facilitate an understanding of benchmarking methodology. A number of such examples are presented below.
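
To make the worked examples that follow easier to reproduce, the sketch below sets out a minimal, hypothetical profiling dataset in plain Python. The agency codes, revenue streams and figures are invented for illustration and do not reproduce the actual survey returns; the later sketches reuse this 'survey' dictionary.

# Hypothetical profiling-survey returns for two agencies (all figures invented).
# Revenue and expenditure in dollars; landscape areas in hectares.
survey = {
    "A1": {
        "visitors": 800_000,
        "entrance_fees": 0,
        "revenue": {"retail": 250_000, "functions": 120_000, "grants": 400_000},
        "expenditure": 3_500_000,
        "accessions": 22_000,
        "horticulturists": 18,
        "landscape_ha": {"lawns": 12.0, "garden beds": 8.5, "trees and shrubs": 15.0},
    },
    "A2": {
        "visitors": 300_000,
        "entrance_fees": 900_000,
        "revenue": {"retail": 150_000, "functions": 60_000, "grants": 100_000},
        "expenditure": 2_100_000,
        "accessions": 9_500,
        "horticulturists": 9,
        "landscape_ha": {"lawns": 5.0, "garden beds": 3.0, "trees and shrubs": 6.5},
    },
}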

Example 1 - Revenue breakdown

Aim: To facilitate comparisons of success in revenue generation (excluding entrance fees) for participating agencies.

Analysis: The chart illustrates the percentage of individual components contributing to total revenue for participating agencies - the code along the horizontal axis identifies each agency. Accordingly, review of the chart assists in determining which participants are most effective (in terms of percentage contribution to total revenue) in utilising particular strategies for generating revenue. A number of agencies have a broad revenue base, whilst others have a narrow base.

Outcome: Benchmarking of revenue generation strategies is most effectively focused on individual revenue streams. Analysis of relative contribution and total revenue for participating agencies will provide a useful basis for determining opportunities for process benchmarking.
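
Continuing from the hypothetical 'survey' dictionary sketched above, a minimal calculation of the percentage contribution of each revenue stream (entrance fees excluded) might look like this:

def revenue_breakdown(agency):
    """Percentage contribution of each revenue stream (entrance fees excluded)."""
    total = sum(agency["revenue"].values())
    return {stream: round(100.0 * amount / total, 1)
            for stream, amount in agency["revenue"].items()}

for code, agency in survey.items():
    print(code, revenue_breakdown(agency))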

Example 2 - Revenue per visitor

Aim: To facilitate comparisons of revenue per visitor for participating agencies.

Analysis: The chart illustrates that revenue per visitor varies significantly between participating agencies - the code along the horizontal axis identifies each agency. Entrance fees distort analysis of revenue from other sources - revenue per visitor exclusive of entrance fees may provide a useful alternative. Review of the data indicates that revenue per visitor is most strongly influenced by the institution of entrance fees.

Outcome: These data indicate the importance of careful construction of surveys and appropriate analysis in delivering meaningful results. However, the data still provides a valuable basis for selecting partners for process benchmarking.
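
Again using the hypothetical dataset, revenue per visitor can be calculated both with and without entrance fees, which makes the distortion noted above easy to isolate:

def revenue_per_visitor(agency, include_entrance_fees=True):
    """Annual revenue divided by annual visitation."""
    total = sum(agency["revenue"].values())
    if include_entrance_fees:
        total += agency["entrance_fees"]
    return round(total / agency["visitors"], 2)

for code, agency in survey.items():
    print(code,
          revenue_per_visitor(agency),
          revenue_per_visitor(agency, include_entrance_fees=False))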

Example 3 - Landscape structure

Aim: To facilitate comparisons between landscapes for participating agencies.

Analysis: The chart illustrates the area and components of the landscape for participating agencies - the code along the horizontal axis identifies each agency. A clear understanding of landscape structure may be desirable to assist in determining benchmarking partners or in developing benchmarks for landscape maintenance and management. Whilst comparisons between landscape structures may be interesting, differences in climate, soils, technical skills and plant collections policy may restrict benchmarking opportunities.

Outcome: Benchmarking of landscape structure is most effectively focused on individual components. Analysis of relative contribution of individual components for participating agencies will provide a useful basis for determining opportunities for process benchmarking.
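
The percentage-breakdown approach used for revenue in Example 1 applies equally to landscape components; a sketch using the invented landscape figures from the hypothetical dataset:

def landscape_breakdown(agency):
    """Percentage of total managed area contributed by each landscape component."""
    total_ha = sum(agency["landscape_ha"].values())
    return {component: round(100.0 * ha / total_ha, 1)
            for component, ha in agency["landscape_ha"].items()}

for code, agency in survey.items():
    print(code, landscape_breakdown(agency))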

Example 4 - Area/ horticulturist

Aim: To facilitate comparisons between landscape maintenance and management efficiency for participating agencies.

Analysis: The landscape area maintained per horticulturist provides an indication of staffing requirements for participating agencies - the code along the horizontal axis identifies each agency. Climate, landscape structure, collections policy, presentation standards and management strategies all influence maintenance and management outputs. Whilst comparisons between organisations require cautious interpretation, benchmarking may still provide a valuable tool for analysing successful landscape management strategies.

Outcome: Comparisons between agencies should be further analysed at an individual collection level.

Example 5 - Accessions/ horticulturist

Aim: To facilitate comparisons between plant collections maintenance and management efficiency for participating agencies.

Analysis: The number of plant accessions maintained per horticulturist provides an indication of staffing requirements for participating agencies - the code along the horizontal axis identifies each agency. Climate, landscape structure, collections policy, presentation standards and management strategies all influence maintenance and management outputs. Whilst comparisons between organisations require cautious interpretation, benchmarking may still provide a valuable tool for analysing successful collection management strategies.

Outcome: Comparisons between agencies should be further analysed at an individual collection level.
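
The two staffing ratios discussed in Examples 4 and 5 (area per horticulturist and accessions per horticulturist) can be computed in the same way from the hypothetical dataset:

def area_per_horticulturist(agency):
    """Hectares of managed landscape per horticulturist."""
    return round(sum(agency["landscape_ha"].values()) / agency["horticulturists"], 2)

def accessions_per_horticulturist(agency):
    """Living plant accessions maintained per horticulturist."""
    return round(agency["accessions"] / agency["horticulturists"], 1)

for code, agency in survey.items():
    print(code, area_per_horticulturist(agency), accessions_per_horticulturist(agency))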

Example 6 - Expenditure/ accession & expenditure/ visitor

Aim: To facilitate comparisons between plant collections maintenance and management efficiency and visitor services efficiency.

Analysis: Expenditure per visitor and expenditure per accession reflect visitation and the total number of accessions respectively. Although direct comparison between visitors and accessions is perhaps meaningless, the data provides a useful tool for discussions on visitor focus and the purpose of plant collections.

Outcome: Whilst caution is required in comparing expenditure per accession and expenditure per visitor, analysis of the data provides a useful starting point for dialogue on the purposes of plant collections in botanic gardens and zoos.
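
Finally, expenditure per accession and expenditure per visitor, as in Example 6, computed from the same hypothetical figures:

def expenditure_per_accession(agency):
    """Total expenditure divided by the number of plant accessions."""
    return round(agency["expenditure"] / agency["accessions"], 2)

def expenditure_per_visitor(agency):
    """Total expenditure divided by annual visitation."""
    return round(agency["expenditure"] / agency["visitors"], 2)

for code, agency in survey.items():
    print(code, expenditure_per_accession(agency), expenditure_per_visitor(agency))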




Future Directions

The results of the initial benchmarking profiling survey provide useful baseline data. However, the survey also highlights both opportunities and constraints for process benchmarking. Agreement is required on the identification of key processes and on methodologies for benchmarking them.

Clearly, measurement of outputs in some form is essential. However, benchmarking allows a successful process to be analysed without necessarily measuring inputs. For example, if visitors rate a particular garden as outstanding, the visitor reaction itself requires assessment (i.e. some form of measurement), but the processes that deliver the outstanding garden can be analysed without necessarily being measured. Process benchmarking answers how a result is achieved - the actual level of improvement that can be achieved at another garden reflects both the appropriateness of the process and the effectiveness of process transfer.

Accordingly, identification of key result areas is essential. Further analysis of the initial benchmarking profiling survey identifies a number of areas where process benchmarking may be effectively implemented.

Processes suitable for consideration include:


Conclusion

Underlying benchmarking is a philosophy of continuous improvement - a benchmarking program develops a focus on how an agency is performing in relation to its competitors, as well as from year to year. Key issues in the implementation of a benchmarking program are a commitment to a systematic approach to the achievement of best practice, and compatibility in relation to objectives and operations. The next challenge for the Australasian Botanic Gardens and Zoo benchmarking program is to undertake investigations of individual processes.

References

Appendix 1

Participants:

 
