This opinion piece was originally published in The Age.

An independent inquiry led by Professor Peter Shergold has been announced. Its job is to unpack Australia’s response to the COVID-19 pandemic. The panel charged with collating and distilling the lessons from the public submission process includes serious health and medical input from Professor Sharon Lewin of the Doherty Institute, foreign affairs and trade expertise from Peter Varghese, and members who bring broader business and youth perspectives.

Although it is unusual for such an inquiry to be called by philanthropic sources rather than government, it is great to see it established. Without it, I fear we might move on before we’ve learned all we can from these past two years, and miss out on having the insight we need to shape future pandemic preparedness.

Previous inquiries into aspects of our response have at times seemed like blame exercises, which risks shutting down the flow of information as attention turns to finding scapegoats or evading responsibility. With the pandemic still a polarising topic, an independent panel standing apart from politics may have more hope of delivering practical recommendations. Our bureaucracies will need to co-operate and consider its conclusions and recommendations.

There have been lessons learnt already that will form part of the hard-earned positive legacy of this pandemic, but there were also policies and actions that were not evidence-based and that have yet to be evaluated. Just because we came through a wave does not mean we should not look back to see where we might have managed it in more timely and effective ways, with less collateral damage.

Pandemic responses, by their very nature, require urgency, before complete evidence can be gathered on the risks of infection or how to control transmission. Early in a pandemic, where the consequences of inaction are deemed to be greater than any risks associated with the interventions themselves, we apply the “precautionary principle” and implement reasonable public health measures without waiting on the evidence to support them.

What happens next is the critical part when it comes to evidence-based practice. In public health, every intervention is expected to have built-in evaluation processes so that the investment can be continually reviewed and refined. This becomes even more important when the context of an intervention changes, and we must question whether it remains the best way to address a particular public health issue or is still required at all. The same principle should apply in a pandemic. Evidence accrued along the way should be continually reviewed to determine whether decisions made under the precautionary principle are still fit for purpose or whether refinement is required.

The measure of success of an intervention is not just that an end point is reached; it is also whether we achieved this by the most cost-effective, ethical and equitable means, in the broadest sense of the word "cost", avoiding unintended consequences. Quarantine protocols for household contacts are a current discussion point. We estimate less than 50 per cent of current infections are recorded as cases. We therefore ask the household contacts of only a fraction of cases to isolate in case they develop an infection. Meanwhile, very high vaccination rates and recent infections in the population dramatically reduce the risk of another Omicron infection for those household contacts, yet they are still asked to isolate. That is a marginal risk reduction in only a fraction of the infected households.

This may also mean even fewer people test, worried about the imposition on their household. The result is a person who actually has an infection out and about mixing, along with household contacts who do not know there is a case among them.

We have to assess the direct and indirect risks associated with these policies. Keeping a system like this in place is not necessarily making the difference we think it is; unless we assess it, we may be undermining our testing for little gain in households with a recorded history of infection. A quarantine amnesty for the months following an infection might actually lift our case ascertainment above 50 per cent and increase the proportion of infectious people who are isolating.

By most measures, Australia has done well in navigating its way through this pandemic, but that does not mean we could not have done better.

For example, the partnerships built between health departments and community leaders helped bring Delta transmission under control, but it was not until midway through the Delta wave that we realised vaccination rates lagged in the same communities that were vulnerable in previous outbreaks – those with the highest transmission potential due to population density, household mixing and occupation.

Prevention and resilience should have been an ongoing focus off the back of previous waves that had identified our most at-risk communities, those populations where a future outbreak would again accelerate, putting everyone at risk.

By the end of 2021, Victoria had achieved the smallest gap in vaccine uptake between metropolitan and rural areas, and between high and low socioeconomic areas. A great outcome, but if this had been achieved before Delta, that outbreak may have run a very different course statewide.

Australia’s response has been characterised by “listening to the science” right from the start, and yet quite different approaches have been employed across jurisdictions, or even within jurisdictions. This does raise questions about the role of evidence in our public health policy, but it also tells us that we now have lived experiments on a grand scale that can be reviewed and compared in detail to understand the strengths and weaknesses of each in managing the dynamics of disease transmission, the entire health response, and the wider impacts of these approaches.

This is feasible in Australia because of the extensive and detailed case and contact tracing conducted throughout most of the pandemic, which provides rich data that can tell us not just whether interventions worked in terms of gross case numbers, but which components worked, and how this differed across our communities.

The inquiry is unlikely to provide detailed answers in its report later in the year, but it can identify those areas where more work should be done to extract a cohesive guide to pandemic prevention and preparedness. We owe that to every Australian who has been impacted by this pandemic, directly and indirectly.