Kate Devlin was blunt in her assessment that the UK was probably not going to be the world leader in the development of artificial intelligence.
“That will happen in Silicon Valley, where the money is,” the senior lecturer in social and cultural artificial intelligence at King’s College London said during Times Higher Education’s THE Live. “There is already a brain drain from higher education to industry. Universities can’t compete with the salaries, the free beer and the beanbag chairs.”
However, there was one space where both the UK and academia could lead: ethics.
“The ethics community in Europe is strong,” Dr Devlin told a panel debate, adding that the ethical use of AI could become a policy consideration for universities in the future, in the way that sustainable development goals and climate change are now. Recent documents such as the General Data Protection Regulation and the House of Lords’ 2018 report on AI’s impact pointed to greater awareness about the moral issues of using powerful technology, she said.
“Ethics is not owned by one party,” said Nathan Lea, senior research fellow at the UCL Institute of Health Informatics, who urged greater engagement with the public, industry and government. “We have to humanise this somehow.”
There are a host of ethical problems with how technology is being used today, from misused personal data to “deepfake” digitally manipulated videos. The panel speakers said that there was an ethical challenge in ensuring that datasets and algorithms were “unbiased”. This was particularly important given that the outcomes of AI work were not always predictable.
Dr Devlin used the example of an app developed by Stanford University, which claimed it could identify if someone was gay without their knowledge or consent. She asked what could potentially happen if that technology fell into the hands of a nation that punished citizens for homosexuality.
Even within university administration itself, there were not always adequate policies in place on using AI responsibly. The speakers expressed concern about using AI to monitor student progress, which they felt should still be done mostly in a personal, one-on-one capacity, especially if well-being issues were involved.
“What if there is a mental health issue, and what if that data is wrong? It horrifies me that we would use that uncritically,” Dr Devlin said, saying that the practice potentially raised a “red flag”.
“Having technology involved in that way makes me nervous,” Dr Lea agreed. “You lose that human contact. Would I trust student profiling based on an AI algorithm?”
Dr Devlin and Dr Lea both began their academic careers in the arts and humanities, only to switch to studying computer science at postgraduate level. They are now working in the emerging field of digital humanities, which attempts to bridge the gap between the scientific and human sides of technology.
“The idea that ‘tech is neutral’ is a STEM approach, not a humanities approach,” Dr Devlin concluded.
joyce.lau@timeshighereducation.com