AI: Oracle or ogre?

09 October 2023

In an exclusive chat, we explore the buzz around artificial intelligence technologies and what AI means for UK manufacturing with the University of Sheffield AMRC’s director of industrial digitalisation, Professor Rab Scott.


Article featured in the latest issue of the AMRC Journal.


‘I warned you guys in 1984. And you didn’t listen.’

That was the stark warning from Terminator filmmaker James Cameron, the director and writer behind the cult movie about a cyborg assassin sent from the future and the risk posed to humanity by the ‘rise of the machines’.

You know a topic is hot when filmmakers like Cameron line up alongside tech experts, media magnates, academics and researchers to join the global conversation about artificial intelligence (AI). Interest in the technology has exploded of late, due in no small part to the emergence and popularity of platforms like ChatGPT.

So, it’s official. AI is the latest buzzword. And while many might not truly understand what AI is or what it actually does, the chances are they’ve talked about it with friends and work colleagues, heard it mentioned down the pub, or even at family gatherings. It’s the tech talk on everyone’s lips. And we’re all riding the AI wave, whether we like it or not. 

But what actually is AI? It is defined as ‘the ability of machines to replicate or enhance human intellect, such as reasoning and learning from experience’. Artificial intelligence has been used in computer programs for years, but is now applied to many other products and services. And although the concept of AI has been around since the mid-20th century, when Alan Turing first proposed an ‘imitation game’ to assess machine intelligence, it only became feasible to achieve in recent decades due to the increased availability of computing power and data to ‘train’ AI systems.

AI seems to be everywhere - it is talked about in the news, on TV and touches almost every part of our lives. Most people use it every day without thinking about it. How? Well, when you reach for your smartphone and unlock it using face recognition - that’s a form of AI called machine learning, built on algorithms trained on data. Emails sent to your spam folder - AI. Google search - enabled by AI. Security and fraud checks on your banking transactions - AI. The commentary you hear on non-show-court matches at Wimbledon? AI. Those recommendations you get on Amazon? You guessed it - AI.

Even celebrities can now be brought back from the dead as digital clones powered by AI. In fact, it is this development that has landed Hollywood in hot water, opening up a dark void of uncomfortable questions about who owns the rights to a face, voice or persona - questions that sit behind some of the concerns of the actors and screenwriters who recently went on strike. And it’s not just the dead who are affected, but the living too.

And these ideas and fears are creeping into the wider social consciousness through pop culture. Take the new series of TV show Black Mirror - spookily released just weeks before the standoff between actors and studios - where Hollywood A-lister Salma Hayek unwittingly signs away her AI-generated digital likeness to a studio.

AI continues to pose the global question of whether the technology is friend or foe. The rapid progress being made by AI seems to raise fear and excitement in equal measure, and answers seem only to bring more questions: how do we define AI? How do we use it? What is the impact of automation on industry, and on society? And that’s without touching on the moral and ethical dilemmas circling the subject of AI.

As digital technologies continue to transform the world we live in, should we see AI as an opportunity or a threat for manufacturing and the UK? We put that question to the AMRC’s director of industrial digitalisation, Professor Rab Scott, who shared his thinking around this hot topic - starting with the current use of AI technologies by UK businesses.

Prof Scott says that, according to Capital Economics - which was commissioned by the government to report on the current and future use of AI by UK businesses - the use of the technology is limited to a minority of firms, but is more prevalent in certain sectors and among larger businesses.

“Overall about 15 per cent of the UK’s 2.8 million private businesses have adopted at least one AI technology - that equates to 432,000 companies - spending a total of £16.7 billion,” says Prof Scott.

“As businesses grow, they are more likely to adopt AI. About two per cent of businesses - 62,000 - are piloting AI while ten per cent - 292,000 - say they plan to adopt at least one AI technology in the future. That could see spending grow to as much as £35bn by 2025.”
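Those headline figures are easy to sanity-check. The short Python sketch below is a back-of-the-envelope illustration only, using nothing but the company counts and the 2.8 million base quoted above; it confirms that the reported counts round to the percentages Prof Scott cites.

    # Back-of-the-envelope check of the Capital Economics figures quoted above.
    uk_private_businesses = 2_800_000  # base figure cited by Prof Scott

    reported = {
        "adopted at least one AI technology": 432_000,
        "piloting AI": 62_000,
        "planning to adopt AI": 292_000,
    }

    for label, count in reported.items():
        share = count / uk_private_businesses * 100
        print(f"{label}: {count:,} firms = {share:.1f} per cent")

    # The implied shares (15.4, 2.2 and 10.4 per cent) round to the 15,
    # two and ten per cent figures quoted in the article.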

Currently, AI solutions for data management and analysis are the most prevalent, followed by natural language processing and generation; machine learning; AI hardware; and computer vision, image processing and generation. The IT and telecommunications and legal sectors have the highest rates of adoption, hovering at about 29 per cent, followed by finance and accounting, and by media, marketing and sales. Health and retail are the lowest, nudging 12 per cent.

So where does manufacturing fit into all of this? 

“The Capital Economics report showed that about 17 per cent of the manufacturing sector has adopted AI technology, with a further 14 per cent saying they plan to do so in future and about two per cent already piloting AI technologies,” says Prof Scott.

“What this shows is a huge opportunity for growth. Government is investing heavily in support for AI and in developing an appropriate regulatory framework, with the aim of increasing the type, frequency and scale of AI discoveries developed and exploited in the UK, and of diffusing AI across the whole economy to drive the greatest possible economic and productivity growth.

“There is a clear opportunity for manufacturers to harness the power of AI in several ways, enhancing their operations and driving innovation.”

Prof Scott outlines a number of ways AI tools might be used, including to:

  • Aid product design and development: by training models on existing product information, customer preferences and market trends, manufacturers can utilise AI to generate new product concepts, optimise designs and simulate product performance, accelerating the innovation process and reducing time-to-market;
  • Support quality assurance and predictive maintenance by identifying patterns and anomalies to predict potential issues, recommend preventive maintenance and improve quality control processes (see the sketch after this list);
  • Create comprehensive knowledge bases, training materials and interactive tutorials; 
  • Process and analyse vast amounts of textual data to gain valuable insights. AI applications can help with sentiment analysis, trend identification, market research, and competitive intelligence, empowering manufacturers to make data-driven decisions;
  • Automate customer support with chatbots and virtual assistants. These applications can improve customer experience, streamline communication and provide personalised assistance;
  • Optimise supply chains by processing and analysing unstructured data like supplier contracts, purchase orders, and logistics documents. By automating document processing, AI can help identify bottlenecks, optimise inventory management, and improve overall supply chain efficiency;
  • Deliver better market intelligence and customer insights by analysing vast amounts of publicly available data, including news articles, social media posts, and online reviews to deliver real-time market intelligence, track consumer sentiment, and identify emerging trends. This information can drive product development, marketing strategies, and customer engagement initiatives.
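To make the quality assurance and predictive maintenance bullet concrete, here is a minimal sketch of the kind of pattern-and-anomaly detection described above, using scikit-learn’s off-the-shelf IsolationForest. The sensor readings, baseline values and failure signature are all invented for illustration; a real deployment would train on live machine data rather than simulated numbers.

    # Minimal sketch of anomaly detection for predictive maintenance.
    # The "sensor" data below is synthetic; a real system would stream
    # readings (vibration, temperature, spindle load...) from the machine.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(seed=42)

    # Simulate a healthy baseline: vibration (mm/s) and temperature (deg C).
    healthy = np.column_stack([
        rng.normal(2.0, 0.3, 1_000),   # vibration centred on 2 mm/s
        rng.normal(65.0, 2.0, 1_000),  # temperature centred on 65 deg C
    ])

    # Train on normal operation only, so deviations stand out.
    model = IsolationForest(contamination=0.01, random_state=0)
    model.fit(healthy)

    # New readings: two normal, one drifting towards failure.
    new_readings = np.array([
        [2.1, 64.5],
        [1.9, 66.0],
        [4.8, 78.0],  # elevated vibration and heat - a warning sign
    ])

    # predict() returns 1 for "looks normal" and -1 for "anomalous".
    for reading, flag in zip(new_readings, model.predict(new_readings)):
        status = "OK" if flag == 1 else "FLAG: schedule a maintenance check"
        print(reading, "->", status)

The same shape of pipeline - train on healthy history, flag deviations - underpins the preventive maintenance recommendations described in the list.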

According to Prof Scott, these are all valuable functions, but it is important to remember that AI is not a panacea.

“Even the most sophisticated machine learning tools will lack true understanding or consciousness. They will also carry any shortcomings or biases from the data on which they are trained into the outputs they deliver,” he says.

“More than that, there may also be ethical considerations to bear in mind. Generative AI models will confidently produce inaccurate, plagiarised or biased results, without any indication that their outputs may be problematic. That’s because the models have been trained on data often sourced from the internet, which is hardly a universally reliable source.”

So - artificial intelligence - oracle or ogre? Prof Scott has a view.

“As the author Mark Twain once said: ‘To the man with a hammer, everything looks like a nail’ - and to some extent that is where it feels we are with AI. It is just a tool, but everyone wants to solve all their problems with it. But, as with all tools, training in their use is needed, and there must be recognition that tools used wrongly can cause harm.

“However, as with every tool in manufacturing, used responsibly and in the right way and in the right place, AI can be a tremendous aid for good, allowing us to move towards net zero, improving productivity and making lives better.”


Know the lingo:

Artificial intelligence (AI) is the ability of machines to replicate or enhance human intellect, such as reasoning and learning from experience. Artificial intelligence has been used in computer programs for years, but it is now applied to many other products and services. Although the concept of AI has been around since the mid-20th century, when Alan Turing first proposed an “imitation game” to assess machine intelligence, it only became feasible to achieve in recent decades due to the increased availability of computing power and data to ‘train’ AI systems.
Machine learning is a form of artificial intelligence based on algorithms that are trained on data. These algorithms can detect patterns and learn how to make predictions and recommendations by processing data and experiences, rather than by receiving explicit programming instruction. The algorithms also adapt in response to new data and experiences to improve their efficacy over time. The volume and complexity of data that is now being generated, too vast for humans to reasonably reckon with, has increased the potential of machine learning, as well as the need for it.
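A few lines of code make that distinction concrete. In this sketch the running-hours and tool-wear figures are invented purely for illustration; the point is that nobody writes a wear rule - a scikit-learn model infers the pattern from example data and uses it to predict.

    # Minimal illustration of learning from data rather than explicit rules.
    # The numbers are made up: hours a machine tool has run vs measured wear.
    from sklearn.linear_model import LinearRegression

    hours_run = [[100], [200], [300], [400], [500]]   # input feature
    tool_wear = [0.11, 0.19, 0.31, 0.42, 0.50]        # observed outcome (mm)

    model = LinearRegression().fit(hours_run, tool_wear)  # the "training" step

    # The model has inferred the pattern itself - no one programmed the rule.
    print(model.predict([[600]]))  # predicted wear after 600 hours: ~0.61 mm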
Deep learning is a type of machine learning that can process a wider range of data resources (images, for instance, in addition to text), requires even less human intervention, and can often produce more accurate results than traditional machine learning. Deep learning uses neural networks to process data through multiple iterations that learn increasingly complex features of the data. The neural network can then make determinations about the data, learn whether a determination is correct, and use what it has learned to make determinations about new data. For example, once it “learns” what an object looks like, it can recognise the object in a new image.
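The image-recognition example in that definition can be sketched in a few lines. Here scikit-learn’s bundled 8x8 handwritten-digit images stand in for a real vision dataset - a substitute chosen purely for illustration - and a small neural network learns what each digit looks like, then recognises digits it has never seen.

    # A small neural network "learns" what handwritten digits look like,
    # then recognises digits held back from training.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    images, labels = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        images, labels, test_size=0.2, random_state=0
    )

    # Two hidden layers: each learns progressively more complex features.
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    net.fit(X_train, y_train)

    print(f"accuracy on unseen images: {net.score(X_test, y_test):.2%}")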
Generative AI describes algorithms that can be used to create new content, including audio, code, images, text, simulations and videos. Following huge media interest, ChatGPT is perhaps the best-known generative AI tool, and it has prompted a noisy public debate about the extent to which AI tools could drastically change the way we approach content creation and how a range of jobs are performed.
