Enabling trust online

Category: Artificial Intelligence, Communication, Future PR, Leadership, Public Relations

It was my privilege this month to speak at the International Public Relations Summit in Bali and I thought it might be useful to share some of my presentation here – particularly the thoughts on DALL-E, the reliability of data and the bias inherent in AI. The text and some of the images are below – I hope they give you some food for thought.

“Kia ora, good morning everyone, and thank you, Ibu Elizabeth, for your kind introduction. I would like to take this opportunity to thank you for all the work you have done to elevate the profession over many years, particularly through this event. I am very grateful to be here and to have the opportunity to speak with you today.

Before I begin I would like to offer my condolences for the devastating loss of life caused by the earthquake this week. My heart goes out to all those who have lost loved ones, to those still searching for friends and family, and to those involved with the rescue and recovery work. It is pertinent that we are going to be discussing trust this morning, as the enormous effort to help those so deeply affected by this disaster highlights that we can still trust the goodness, bravery and care of our fellow humans.

As a society – global or local – we can’t function if we don’t trust each other, and the ways in which those bonds of trust are formed have shifted and changed. As we know, trust is central to the work we do, which is building and sustaining the relationships we need to maintain our licence to operate – the permission we are given to do the things we do. Trust is one of the agreed and measurable components of our relationships – the others being satisfaction, commitment, loyalty and mutuality – and I add reputation, because a reputation, good or bad, will either help or hinder the start of a relationship, or hasten its demise.

Image shows Arrow’s PR Atom, which visualises the elements of public relations: four spheres, with the relationship at the centre and the other elements – behaviour, understanding and communication – orbiting the central purpose.

So I thought a really good place to start would be with some sheep. As you may know, I am coming to you from New Zealand, where we have our fair share of sheep, scenery and sparkling seas. The sheep I’m going to introduce you to are not any old sheep, because they pose the question: are they trustworthy sheep? You see, they were created by DALL-E, an artificial intelligence system that creates art and images from text. I simply told the AI engine what I wanted to see and it presented me with options. Pretty cute, eh – a couple of sheep enjoying the sea view underneath a rainbow. Where’s the harm? The next ‘image’ I requested was of public relations professionals at work – and I got these.

AI-generated image showing sheep on a hillside with a rainbow

But my final request was for public relations leaders at work – and look what DALL-E served me: images based on the data provided and loaded with bias.

AI-generated images of ‘public relations leaders’
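
For anyone curious about the mechanics behind those sheep: the request is nothing more exotic than a short text prompt sent to an image-generation service, which replies with a handful of candidate images. The sketch below is illustrative only – it assumes OpenAI’s DALL-E 2-era images endpoint, a placeholder API key and invented parameter values, so treat it as the rough shape of the request rather than a recipe.

```python
# Illustrative sketch only: how a plain-language prompt becomes a set of
# AI-generated images. The endpoint, fields and sizes reflect the DALL-E 2-era
# OpenAI images API as I understand it; the API key is a placeholder.
import os
import requests

API_KEY = os.environ.get("OPENAI_API_KEY", "sk-placeholder")  # placeholder key


def generate_images(prompt: str, n: int = 2, size: str = "512x512") -> list[str]:
    """Send a text prompt and return the URLs of the generated options."""
    response = requests.post(
        "https://api.openai.com/v1/images/generations",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt, "n": n, "size": size},
        timeout=60,
    )
    response.raise_for_status()
    # Response shape assumed: {"data": [{"url": "..."}, ...]}
    return [item["url"] for item in response.json()["data"]]


if __name__ == "__main__":
    # The same kind of request described in the talk.
    for url in generate_images("two sheep enjoying a sea view beneath a rainbow"):
        print(url)
```

The point is how little effort the request takes – and how little the person making it sees of the training data that shapes what comes back.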

The pandemic accelerated digital transformation in organisations, with much automation and reliance on data. Learning engines have been deployed across sectors – everything from DALL-E-style images to recruitment – but with that deployment comes bias, burrowing its way into systems and data. The AI we get is only as good as the data on which it is based, and if that data is preloaded with bias, the actions taken as a result will be neither trustworthy nor accurate. This poses something of a quandary for our leaders, as they will be guiding us on the basis of inaccurate data capable of creating stereotypes, with the potential for societal harm.

Back in 2002 I had high hopes for digital engagement – blogs were the primary form of online connection (along with forums, message boards and other tech now considered archaic). The embryonic online world removed the filters and barriers to communication and allowed us to connect directly with our communities, our customers and our stakeholders. Leap forward to 2012, when Elizabeth began this series of events: leaders had realised the power of digital engagement, and authenticity and trustworthiness online were approaching their peak. We moved beyond direct communication, binding our lives to the cloud and using it to meet many of our needs and wants. But from 2015 the souring of tone, the bullying behaviour and the exploitation of others began to accelerate. We saw the trolls come out in force, we saw bad actors manipulate data for profit and gain, and we saw political candidates launch an onslaught of hateful speech, unleash the curse of misinformation and open the doors wide to conspiracy theorists. Some have deliberately exploited the loopholes and shortcomings of social media, reducing trust to rubble or inspiring their followers to take to the streets. The digital environment is now a place of great contradictions, where we are capable of amazing or terrible things depending on the choices we make and the values we hold. Digital transformation accelerated in 2020 with the arrival of COVID-19 – a Deloitte study reported that 77% of CEOs had pushed forward digital initiatives as a result of the pandemic. We became used to connecting like this and had to learn to trust the strength of our connections – both literal and figurative.

Today, as people seek to connect with trustworthy information, organisations and people, the online environment is once more fragmenting – this time into closed communities of interest, run on platforms like Substack or community platforms like Guild. Research into trust is regularly undertaken – indeed, Adrian has taken us through some of Edelman’s trust findings this morning – and they are not alone. Ongoing monitoring of trust by researchers at Our World in Data demonstrates the correlation between levels of interpersonal trust – how much we trust each other – and other areas of trust, such as trust in government and media. The levels we saw in 2020 are probably very different in 2022, but one constant result is that countries with high levels of interpersonal trust were also more stable and safer, and civil discourse was still – well, civil.

Sadly, online engagement has undermined societal trust significantly in the last decade, and this distrust has been thrown into sharp relief in the last two years as we have navigated the pandemic.
The ‘infodemic’ that resulted from COVID-19 and its management created deep cracks in social cohesion, even in countries with previously high levels of civil discourse, resulting in unrest and violence. Ongoing developments on the many social networks have worsened the situation – the Elon Musk takeover of Twitter does nothing to increase hope for change, and the Facebook controversies concerning privacy, accuracy, bias, hate speech and other issues fan the flames of division.

So what is our part in all this? As public relations and communication professionals we have an ethical duty of care to ensure that the data used by our organisations is clean, accurate and unbiased. We have a duty of care to equip our leaders with the tools they need to speak and act honestly and transparently in the digital environment, and we have a duty of care to help our organisations, our stakeholders and our communities of interest navigate the turbulent waters that await them each time they are online – which, for most people, is most of their time.

This may seem a little bleak when my topic is enabling trust in the digital environment, but we have to understand where trust is being disabled if we are to create or encourage the circumstances in which trust can take root and thrive.

The people who operate the platforms and networks we use bear a great responsibility, but we have traded our privacy – and in some cases our civility – for access to those platforms. As has been said so often, the product peddled by the networks is us. But the new, evolved product isn’t simply ‘us’: it is our attention, our emotion, our state of mind, gripped and manipulated by algorithms that seem to know us better than we know ourselves – algorithms that are tweaked and tinkered with so they serve us the controversy that prompts us to engage, allowing our attention to be gobbled up by advertisers. Given this known manipulation, should our organisations follow the lead of those who ran a Facebook advertising boycott? Withdrawing advertising may not seem too great a blow to these commercial companies, but it certainly has a significant impact on reputation and stock prices – and, hopefully, on how they moderate their operations.

As organisations seemingly move into a new ‘golden age’ of purpose – having forgotten all about it for a few decades – and attempt to align their purpose and values, perhaps a good way to demonstrate this renewed dedication is to clean up their act online and stop using technology for dubious purposes. For example, there’s a row here this morning concerning a supermarket chain that has deployed facial recognition technology in a bid to combat shoplifting, but there has been a lack of transparency about that deployment, the use of the data and the long-term consequences of its acquisition – and quite rightly the chain has been called out for this behaviour.

Instead of capturing faces, how about building trust through communities – we have the technology to do this. How about using data responsibly and redrawing our digital terms of engagement?
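
Coming back to the point about biased data: the sketch below uses an entirely invented toy dataset – not any real recruitment system – to show how a naive screening model trained on historically skewed hiring decisions simply learns and repeats that skew.

```python
# Toy illustration with invented data: a 'screening model' trained on
# historically biased hiring decisions learns and repeats that bias.
from collections import Counter

# Historical records: (years_experience, group, hired?). The outcomes are
# skewed against group 'B' despite comparable experience.
history = [
    (5, "A", True), (5, "B", False),
    (3, "A", True), (3, "B", False),
    (7, "A", True), (7, "B", True),
    (2, "A", False), (2, "B", False),
]


def train(records):
    """'Learn' a hire rate per group - the naive statistic a model would absorb."""
    hired, seen = Counter(), Counter()
    for _, group, outcome in records:
        seen[group] += 1
        hired[group] += outcome  # True counts as 1, False as 0
    return {group: hired[group] / seen[group] for group in seen}


def score(model, group):
    """Score a new candidate using only what the historical data taught us."""
    return model[group]


model = train(history)
# Two equally experienced candidates receive different scores.
print(score(model, "A"))  # 0.75
print(score(model, "B"))  # 0.25
```

Nothing in the code is malicious – the skew comes entirely from the historical records, which is why clean, unbiased data is an ethical duty rather than a technical nicety.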

Because enabling trust online isn’t about shiny new tech, artistic artificial intelligence or great reviews on Google – it is, as it has always been, about the way we behave, the choices we make and the genuine desire to build a fair and equitable society. Enabling trust online is up to us – let’s discuss how we can make a start today.”