Better technology collaboration with local communities is needed to ensure the benefits of digital transformation and post-Covid economic growth are distributed across all parts of UK society, according to a panel of experts.
Panellists speaking at the TechUK Digital Ethics Summit on 10 December told attendees that achieving “inclusive economic growth” after Covid-19 will rely on bringing a much wider range of voices to the table, especially in the context of deciding how new technologies are developed and deployed.
Highlighting how the pandemic has “shone a very stark spotlight on a lot of disparities that were there already in the UK and globally” – from regional inequalities to the exclusion of people along racial, gender or class lines – Priya Guha, a venture partner at Merian Ventures, said there is “a huge opportunity as [a] society to take this as a realisation that we cannot continue with the status quo”.
“We are essentially shooting ourselves in the foot if we’re not allowing everyone to participate in our economic future,” she added.
Part of the solution, according to Jonathan Dowden, small business product manager at Sage, is getting small businesses across the country to collaborate on developing their own proprietary technologies that will allow them to “digitally enable” and therefore serve local communities in ways that big multinational technology companies cannot.
“Small businesses in particular have to come together and have to build clusters and build communities, and it has to be led by those businesses themselves,” he said, adding that smaller businesses currently lack the capital resources to make initial investments in that direction.
“There could well be, with some support, some economies of scale there that would help small businesses invest in technologies and own those technologies, and make them work for them in a much better way,” he said.
He added that while technological advances are to be welcomed, they must be accessible to a wide range of groups to address the risk of people “being left behind”.
“These divisions that have always been there, and that Covid has shone a light on, are going to get bigger, and that is going to create a really serious problem,” he said. “Technology isn’t the solution in its own right.”
Christina Colclough, founder of Why Not Lab, similarly warned against “techno-solutionism” and the temptation of falling back on the idea that technology, in its ability to increase efficiency and productivity through automation, holds all of the answers to what are fundamentally social and political problems.
She said the increasing dependency of governments on private sector bodies for data collection and interpretation is creating “a very unilateral way of looking at the world” that does not necessarily reflect the views of many ordinary people.
“We have all these fancy AI [artificial intelligence] principles now and one of them is the principle of fairness, but then we have to ask for whom? What’s fair for managers is not necessarily fair for workers, for men for women, that’s why we need multi-stakeholder governance,” she said, adding that people should come together in their localities to “demand that we have joint access and control over the data that’s produced, so we can interpret it together”.
However, to make these kinds of demands requires citizens to mobilise themselves through political organising.
“We mustn’t kid ourselves that data [alone] is sufficient – it’s necessary but not sufficient – because we’ve known about data on climate change for years and years, there’s overwhelmingly clear data on poverty and destitution, but that doesn’t necessarily mean we’re going to change anything, so there’s more that’s required in terms of political action,” said Dowden.
Private companies and technological development
In terms of big private technology companies and their role in promoting fairness and inclusive growth, many attending the summit were sceptical of their ability to do so.
Speaking on a separate panel about the ethical development of AI, Allyn Shaw, president and chief technology officer at Recycle Track Systems, said: “Whenever you start to integrate human rights and discrimination with capitalism, unfortunately capitalism always wins.
“If it’s a matter of ‘Do you get your product out on time or do you ensure it does not have bias or that it’s not discriminatory?’, I’m concerned that those capitalistic systems may not err on the side of non-bias,” he added.
Kanta Dihal, a senior research fellow and project lead at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge, said there are very few incentives for private companies to take actions that are “not directly lucrative”, adding that “legal interventions have always been necessary to prevent a capitalist system from exploiting human beings in the worst possible way”.
She also contested the increasingly popular view that it is in the best interests of for-profit companies to act responsibly on the basis that better consumer trust will bring greater returns down the line, positing that “Facebook wasn’t supposed to be massively unethical and it hasn’t been pulled off the market”.
Renée Cummings, a data activist in residence at the University of Virginia, said we need to be highly critical of the ways that we use new technologies.
“It comes back to advocacy, activism and evangelism … I think because we have so much fun with technology and we use it for so many things that entertain, we don’t look at risks such as privacy and security, and how these things can impact your life,” she said.
“It comes to really building the knowledge level, building public education and public awareness, and getting communities involved in the conversation.”