Blog posts tagged with 'pacific infrastructure'

Solomon Islands National Infrastructure Priority Pipeline 2023 launched

The Solomon Islands National Infrastructure Priority Pipeline 2023 consolidates the most important nationally significant infrastructure projects and programs into a single document. These infrastructure projects have been selected and packaged to transform how Solomon Islanders live, work and thrive in the future. These projects and programs will be critical to the achievement of the goals and objectives of the Solomon Islands National Development Strategy. 

This Priority Pipeline represents a new way of looking at infrastructure in the Solomon Islands. Instead of a single list of national projects and programs, we have presented the Pipeline as individual packages of sequenced infrastructure projects within each of the nine Provinces. This is a new ‘place-based’ planning approach that recognises the contexts, unique challenges and aspirations of the people in each of our Provinces.

We have also recognised that transport and accessibility are the foundation of development. They form the backbone of the infrastructure ecosystem, as set out in the National Transport Core Initiative (NTCI) policy. Transport projects therefore sit at the heart of the sequencing in the Pipeline.

The document summarises our analyses of each of the Provinces, highlighting what we perceive as the challenges and opportunities. The projects and programs in each of the Provinces are tabulated and mapped where appropriate. We believe this approach makes it easier for a broad audience of stakeholders to appreciate the plans and intentions behind the Pipeline.


  • Launch the Solomon Islands National Infrastructure Priority Pipeline 2023 Report.
  • Inform development partners, business and supply chains of the Solomon Islands’ longer-term infrastructure intentions.
  • Disseminate the 2023 Solomon Islands Priority Pipeline for implementation.

Click to view the complete SINIPP 2023 document below:

The Solomon Islands National Infrastructure Priority Pipeline 2023

Testing times for labs

The importance of materials testing cannot be overstated. Across every major industry, manufacturers, developers and operators of critically important systems, products and components need assurance that the materials used in their equipment are up to their intended tasks. Reliable materials testing remains the most diligent way to verify that the materials companies use in manufacturing will perform to expectations and adhere to all applicable regulations.

Baseline or Benchmark? Reviewing The Pacific Infrastructure Maintenance Benchmarking Report

Honiara International Port (photo credit: Solomon Islands Ports Authority)


(PT Columnist, David Jacobs Spring, Melbourne) 

Those who work in the Pacific know the predicament of attempting to make scant data look robust. The Pacific Infrastructure Maintenance Benchmarking Report, launched in February 2022 by the Pacific Region Infrastructure Facility (PRIF), is a case study of this quandary.

The aim of the report is modest – to “raise the profile of infrastructure maintenance.” No doubt it has already achieved this through the engagement of the various authorities and ministries in the self-assessments that form the raw data for the study. The launch and subsequent presentations of the report will promote that profile further.

The report attempts to analyse the current level of ‘maturity’ of maintenance planning, funding and practices in the Pacific against an objective standard. It does this by drawing selectively on three international standards, then collecting data on 37 measures from voluntary respondents to surveys sent out by the authors. The results are aggregated to draw conclusions and a series of recommendations for further studies, national governments and donors. These are grouped under three themes: accounting, planning and budgeting, and funding capital maintenance. The common thrust is apparent.
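To make the methodology concrete, the survey-and-aggregate approach described above can be sketched in a few lines of code. This is a hypothetical illustration only: the measure names, theme groupings and 1–5 scoring scale below are assumptions for the sake of the example, not details taken from the PRIF report.

```python
# Hypothetical sketch: aggregating self-assessment survey responses
# into theme-level maturity scores. Measures, themes and the 1-5
# scale are illustrative, not drawn from the actual report.
from statistics import mean

# Each respondent scores a measure from 1 (ad hoc) to 5 (optimised).
responses = {
    "maintenance expenditure coded separately": {
        "theme": "accounting", "scores": [2, 1, 3]},
    "asset registers kept current": {
        "theme": "planning and budgeting", "scores": [3, 2, 2]},
    "capital maintenance funded in budget": {
        "theme": "funding capital maintenance", "scores": [1, 2, 1]},
}

def theme_maturity(responses):
    """Average the per-measure means within each theme."""
    by_theme = {}
    for measure in responses.values():
        by_theme.setdefault(measure["theme"], []).append(mean(measure["scores"]))
    return {theme: round(mean(means), 2) for theme, means in by_theme.items()}

print(theme_maturity(responses))
```

A sketch like this also makes the review’s later criticism easy to see: averaging a handful of voluntary responses per theme produces a tidy number whose apparent precision far exceeds the quality of the underlying data.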

The report tacitly acknowledges the difficulty of its undertaking. Phrases such as “studies have validated”, “it is accepted” and “experience tells us that” are indicative of the scarcity of historical data or relevant studies into infrastructure maintenance in the Pacific.

The study was conducted during the COVID-19 pandemic, which no doubt made primary, in-country data difficult to obtain. Data collection was presumably limited to remote sources, which is a hindrance to effective engagement in the Pacific.

What the report accomplishes

The great strength of the report is in the development of a maintenance maturity assessment tool. Drawing on elements of three international assessment frameworks, the authors derive their own tool, one well suited to the infrastructure challenges faced in the Pacific region. While the framework is recognisably a developed-nation-style construct – taking a universal approach to definitions, aspirations and requirements – its value will be the opportunity it presents for long-term benchmarking.

The report draws on a bank of reports into the various sectors over the past decade. Some of these are familiar within the industry and ministries – for example, the National Infrastructure Investment Plans (NIIPs), Medium-Term Expenditure Frameworks (MTEFs) and national strategic plans. This familiarity assists recognition and thus lends legitimacy to the report.

The report relies heavily on the analysis of financial statements and records. This is a rational approach, as effective infrastructure maintenance funding is a function of finance flows. It is also likely a result of the relative availability of this data in public records.


This is a PRIF report; its underlying assumptions and purpose are known. It strikes a paternal tone at times, consistent with the implicit reference to Australian or international standards as a benchmark to which “PICs” can and should aspire. This criticism can be overlooked as a function of PRIF culture.

However, there is a mismatch between the quality, scope and scale of the data assessed and the certainty of the recommendations. The self-assessment surveys were conducted primarily in Micronesia and the Solomon Islands, with some contributions from Samoa, Tonga and Fiji. Summarising the results as applying across “the Pacific” in such circumstances is not representative.

The report relies on references from as far back as 1988. Not that infrastructure maintenance principles have changed much in 30 years, but the perception is of a report that lacks relevance to today’s maintenance challenges and innovations. In addition, the use of GDP data only up to 2019, and of government reports from 2014 and 2015 to support case studies, obscures current status and true progress.

While it is tempting, and desirable in some ways, to aggregate data and findings across the “Pacific”, and even across all types of “infrastructure” or sectors, the value of doing so is limited and may even be counter-productive. State-owned enterprises (SOEs) operate under different incentives and governance schemes than public works and infrastructure ministries. The problems faced in Kiribati may not be at all similar to those faced in Tonga. Vanuatu’s Ministry of Infrastructure and Public Utilities may not consider findings based on Micronesian SOEs particularly useful. Ultimately, national governments and SOEs will use the information as needed.

The recommendations broadly miss the operational context in which they would supposedly be implemented. The first recommendation for action is to improve the coding of maintenance expenditure across infrastructure (Section 5.1.2 (a)). Evidently, the order of the recommendations does not reflect their relative importance, because there are diminishing returns to be gained from ‘more accurate’ accounting. More accounting codes can give the appearance of accuracy, providing false confidence without making next year’s forecasts or budgets any more accurate. Worse, the increased administrative burden of this practice will discourage the diligent cost coding of actuals, moving forecasts even further from reality. The rationale – to “convince politicians … that the amount being spent is not sustainable” (i.e., it’s too low) – is noble but unlikely to be compelling.

The other side of the same unconvincing coin is the recommendation to increase funding for capital maintenance (Section 5.3). Pacific country infrastructure ministries hold relatively low political prestige, so funding limitations will remain. These realities are a reflection of cultural values and are wisely approached with due respect.

The opportunity

Conducting an extensive, high-profile, publicised report such as this is a key opportunity to not only raise the profile of infrastructure maintenance, but to be a constructive part of that ongoing conversation. Some aspects of maintenance across the Pacific missing from the report that would improve its relevance are:

  • Recognising cultural values that place a greater emphasis on social relationships than financial efficiencies
  • The private sector is mentioned as a service provider of maintenance, but it can play a much more active role in the planning and quality of outcomes
  • The benchmarking of maintenance costs. While this can’t be accurately obtained from the publicly available financial statements, the data is known and available. The wisdom of publishing it is debated amongst the procurement establishment. On balance, greater transparency of these metrics would assist in estimating and identifying procurement anomalies
  • Consideration of administrative burden – the trade-off between redirecting teams to do more data management and the value derived from it
  • Many infrastructure ministries in the Pacific are ‘supported’ by programs funded by PRIF members such as DFAT and ADB. Discussion of the successes and failures of these past interventions would balance the implication that the “areas for improvement” can be fixed by further interventions
  • Discussion on how to achieve quality standards. The specification, testing and verification of construction (maintenance) materials and equipment is an area of inconsistency, requiring expertise and funding support
  • Gender equality. Every opportunity should be taken to pursue public assets, spaces and workplaces that are inclusive and raise the profile of the value of women’s involvement and participation in the planning and delivery of infrastructure
  • As noted above – innovations in maintenance practice to be explored and applied, not just in the area of technology and digital asset management systems

The report notes the ‘perverse incentives’ that legitimise build-neglect-rebuild logic. These are only increasing as geopolitical competition across the Pacific intensifies. Even when aid money is flowing, the volatility of donor funding can make it impossible for recipient governments and businesses to engage in long-term planning and sustainable spending.

Nonetheless, this dynamic can be leveraged by national government ministries/SOEs, if some of the recommended planning practices are implemented. Use of donor funds for capital expenditure (as recommended) would enable a backlog of un-maintainable assets to be brought to a maintainable standard. If governments/SOEs can combine this planning with a reckoning of the level of routine maintenance funding that national coffers and tariffs (not donors) can actually afford, a consistent level of service is achievable into the future.

A benchmark?

So, is it a benchmark report on Pacific infrastructure maintenance? If the report were positioned as a starting point, that would be a more accurate description. The report’s Preface refers to it as a ‘baseline’ study, and that best reflects its character. It does provide a viewport, albeit a limited one, into the status of maintenance practice across parts of the Pacific as of 2021.

Benchmarks are stable, standard, triangulated data points for surveyors to rely upon. To achieve this metaphorical benchmark status, the maturity assessment will require up to three more years’ consistent data collection, using the self-assessment and additional, more robust methods. The data to objectively evaluate the 37 key requirements for maturity does exist and could be obtained, through a similarly engaging approach as that employed for the self-assessments.

This would establish the assessment as a central repository of data and enable meaningful comparisons – that is, benchmarking.