Safe AI innovation needs robust regulation, warns IChemE roundtable

Article by Aniqah Majid

INDUSTRY leaders have urged the UK government not to neglect legislative support in the race for advances in AI.

Digital technologies are at the forefront of the UK government’s “Invest 2035” industrial strategy, a ten-year plan aimed at boosting investment in the UK’s key growth areas, including AI technology.

However, at IChemE’s recent roundtable, industry leaders highlighted the lack of regulation to inform safe practice, pointing out the need for chemical engineers to be part of the conversation to ensure future AI frameworks are fit for purpose.

The discussion focused on the use of AI and machine learning in plant operation and control and involved insights from experts in academia and industry, including Simon Rogers, the global technology pre-sales leader at KBC Global, Mehmet Mercangöz, an ABB reader in autonomous industrial systems at Imperial College London, and Vira Jogia, a process safety consultant at VRJ Consultancy.

“Do we have the right policy framework for AI to be trialled?” asked Jin Xuan, the associate dean of research and innovation at the University of Surrey, another of the speakers. “I do not think it is a technology issue in the UK; it is really a policy and regulation issue.”

Regulatory changes

Current AI safety regulation in the UK is minimal; the government only established the AI Safety Institute in February.

In the absence of a framework for the safety and application of new AI technologies, the industry has relied on existing regulation, which is not always flexible enough to accommodate them.

Jogia said: “I have applied and used HAZOP because that is the most commonly used hazard identification tool in the industry.

“We were in collaboration with the provider of the technology we were assessing, and after a few weeks, we found that it was not safe, as chlorine gas could be generated, and the provider said they did not consider this. But we would not have been able to do such an in-depth investigation if the provider was not open.”

Jogia said that for regulation to effectively reflect the practicalities of trialling new technology, industry needs to work with the technology providers.

And Rogers said the foundations for this regulatory framework have already been established in chemical engineering with similar technologies: “We have done many projects with machine learning, albeit not widespread. But in regard to reinforcement learning, it is very similar technology to multivariable control.”

Immediate changes

For the government to cultivate international interest in the UK’s AI capabilities, the roundtable suggested a shift in strategy.

“Policymakers are way too opportunities-focused,” said Nicole Mwananshiku, another speaker and a policy adviser in the Royal Society’s data policy team. “They are not even wanting to dip their toes into risks, as they are focused on growth and innovation.”

Along with the new strategy, the UK government has introduced the AI Opportunities Action Plan, which will focus on growing the AI industry specifically, bringing together academics, industry, and regulators.

Looking at past regulation, Bhavik Mehta, another speaker and senior applications engineer at Siemens, said: “Policy cannot be generic, because every industry has specific needs and regulations that they need to follow.

“I go back to pharma because it is one of the most highly regulated industries in the world, whereas the policies for oil and gas for example are not sufficient, so the AI tools would not be either. It does not need to be specific to industries, but the policies need to tell low to highly regulated industries specifically what they need to do.”

Aniqah Majid is a staff reporter at The Chemical Engineer.
