Artificial intelligence, or A.I., was a huge theme at this year’s Festival of the Future City. At this week’s lunchtime talk, part of Digital Week, we were lucky enough to be joined by four experts in the field to answer some of the burning questions.

Meet the panellists:

Mike Lloyd runs club.org and believes A.I. is within everybody’s grasp: if you can use WordPress, you’re on your way to using A.I. Mike wants to embed it into everyday learning.

Charles Radclyffe is a ‘data philosopher’ who questions the ethics and societal impact of A.I. and automation systems. Charles is a serial entrepreneur who has focused his career on solving tough technology challenges for some of the world’s largest organisations.

Becky Sage is the CEO of Interactive Scientific and a former Studio resident. She’s exploring how A.I. could benefit schools and make learning more inclusive, while recognising that the ethics around A.I. are still unclear.

The panel was introduced by BBC Academy producer Mel Rodrigues, who makes a variety of factual films and runs inspirational events aimed at connecting people with creativity and technology. Mel kicked off the panel with this question.

1. What is A.I.?

Mike: A.I. is the application of maths to data. The field has been making progress for a long time, but what’s making it such a popular buzzword now is the huge influx of data coming in from our smartphones, social media and the internet. The ‘A.I. machine’ could be likened to a harvester, where machine learning is the engine of the harvester and the data is the wheat. This machine can look for anomalies, or for clusters where there is better growth, and from these you can make predictions. There are a hundred ways to process the data. One real-life example of this is Siri: you speak to Siri and it recognises you because it has been taught to using algorithms. After several iterations, it becomes more and more accurate.
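Mike’s ‘harvester’ idea can be made concrete with a toy sketch. The snippet below is a minimal illustration in plain Python, using made-up numbers (none of this comes from the talk): it flags anomalies in a list of readings, one of the simplest patterns a machine can look for in data.

```python
import statistics

# Hypothetical sensor readings; two values stand out from the rest.
readings = [10.2, 9.8, 10.5, 10.1, 3.2, 10.3, 9.9, 22.7, 10.0]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)

# Flag anything more than one sample standard deviation from the mean.
anomalies = [r for r in readings if abs(r - mean) > stdev]
print(anomalies)  # → [3.2, 22.7]
```

Real systems use far more sophisticated models, but the principle Mike describes is the same: feed in data, let the maths find the structure, then use that structure to predict or flag.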

2. Who actually owns this technology? 

Charles: We own the data; however, with Facebook we’ve unwittingly spent over ten years training it to read faces every time we tag a photo. Google and Facebook don’t own the maths, but it’s the application of how it’s being used that matters. A.I. belongs to all of us, but the threat is companies gaining unfair control over how it is used.

3. How can we use this data in a more educational way?

Becky: A.I. can help with personalised learning. Science learning is measured by box-ticking and exam results, but we should be teaching curiosity and allowing students to learn without fear. What if we could create worlds to explore and learn in? However, there is a balance to be struck, as schools still need to measure learning in some way. Could we use A.I. to track how different people learn?

4. What is exciting about using A.I. in education?

Mike: The hard thing about STEM (science, tech, engineering and maths) is teaching it in schools, as it’s all fragmented into different subjects with little crossover. Metacognition, or ‘thinking about thinking’, has enormous potential as a catalyst for STEM learning.

5. What are the real dangers of A.I.? How do we safeguard against a robot uprising?!

Charles: There are going to be much bigger problems before we even reach that point, such as automation and the impact this will have on jobs. Truck drivers in particular are under threat, as we are so close to driverless trucks. What’s our responsibility in helping them retrain to support their families? Would the implementation of a Universal Basic Income help? Would it break the stigma of not having a job and allow us to find purpose separate from work? Our jobs shouldn’t be connected to who we are; that is a very 19th-century notion.

Becky: We should question who is driving this A.I. technology. That is the first step in understanding how unconscious bias works. There’s so much focus on STEM jobs, but why aren’t we teaching compassion or ethics? There is a real concern, as there’s a very specific group of people driving STEM and A.I. forward.

Mike: If you work for Deliveroo you’re already working for an app; A.I. is already here. However, it also has its limitations. A.I. can’t be creative (Google’s DeepDream is only one example of artificial creativity), and even if it could, what would be its purpose? Would we want to buy art created by A.I.?

If you'd like to find out more about this year's Festival of the Future City programme, click here.