Better Business Outcomes
Dr Stephanie Hare: Are you using technology, or is it using you?
On this episode of Better Business Outcomes, Stephen Waddington from Wadds Inc. welcomes author and geopolitical risk researcher Dr Stephanie Hare.
They discuss:
· Why technology has such an outsized share of voice and the range of issues it is blocking out of the news agenda
· Why the enthusiasm and optimism that characterised the web in the eighties and nineties has given way to a much bleaker perspective
· Regaining control of data and technology, and looking ahead to a post-social media era of the web
· The need for clarity on environmental, social and governance (ESG) metrics as they relate to business
· How ethics is driving the latest wave of innovation in artificial intelligence and why that’s a good thing
· The future of Twitter. Is it a technology platform, a product, or a hive mind?
· Why integrity drives better business outcomes
Presented by Sarah Waddington and Stephen Waddington
For more information visit https://www.wadds.co.uk/
With thanks to our production partners at What Goes On Media
Stephen Waddington:
Welcome to Better Business Outcomes. The podcast where we discuss how communication can transform and grow organisations with a series of global leaders who have set the standard for what great looks like.
I'm Stephen Waddington from Wadds Inc, and in this podcast you'll hear from leaders and senior communicators about their leadership journey and how they create social impact.
You’ll also understand the areas you should be focusing on to build personal and organisational resilience, find out how public relations can unlock value for your business, and enjoy a great listen along the way.
Today I'm joined by Dr. Stephanie Hare, a researcher in the sphere of technology and geopolitical risk. Her book ‘Technology is Not Neutral’, was shortlisted by the Financial Times as one of its best books of 2022.
Welcome to the show Stephanie.
Stephanie Hare:
Thank you so much for having me Stephen.
Stephen Waddington:
So Stephanie, when we spoke recently we talked about tech and why, at the moment, it seemingly dominates the media agenda so much, blocking out oxygen from so many other important topics. Why has it got such an outsized share of voice at the moment?
Stephanie Hare:
Well, it's also a question of what kind of tech is dominating the media discourse and what kind of tech is being shut out. So it's not just that tech as a whole is shutting out so many other topics; even within technology there are certain stories that we seem to be hearing, dare I say, almost too much of for some of us, and some that we're just not hearing very much of at all. There's the fact of the matter that you've had your big American technology giants, FAANG, so your Facebook, Amazon, Apple, Netflix and Google, for a really long time. They have dominated quarterly financial results because of their stellar performance, and any time their performance slackens, we hear about that. We hear about their leaders a lot; everybody knows their names and what they're up to. And we're not necessarily hearing about other types of companies that aren't like an Amazon, which is kind of doing everything, or a social media company.
So at the moment, Elon Musk, the world's richest man is taking up a lot of oxygen in the tech news cycle because of his recent acquisition of Twitter and the terms of that and why he might be doing it and the performance thereof and how it relates to his many other businesses, which are also tech businesses and getting a lot of headlines, Tesla and SpaceX.
There's the crypto story of FTX and Binance and what that means. And again, the majority of people on this planet are not involved in cryptocurrency and they're not on Twitter, and yet we're hearing a hugely disproportionate amount about them.
And what we're not necessarily hearing about is that we're living in a cost of living crisis. We've got people who really need to take energy efficiency measures to make their homes and businesses more energy efficient to keep their costs down. We're not hearing a lot about how to do that. We've got climate change as a sort of dominant concern. This is the second week of COP27, and yet you could really be forgiven for not even knowing that it's happening. You know, you can follow it if you want to, but if you don't want to, you can also very easily ignore it. And I suspect many people are, just because it's not flashy and exciting and entertaining. I mean, Elon Musk said 'Are you not entertained?' recently.
Stephen Waddington:
Yeah, there are so many issues to unpick there. Is tech a threat, do you think, to conversation in the public sphere? We've had so much conversation since 2016 around the impact of tech on Brexit, then the elections in the US, and we've seen it most recently in the midterms. Is it good or bad? How do you characterise it?
Stephanie Hare:
I think, and I have to be really careful that I might be bringing personal bias into this because I focus so much on technology. But that caveat aside, I feel like there was a spirit of optimism about the technology sector that wasn't simply about 'this is a great way to make money for a lot of people'. It was genuine enthusiasm about the start of the internet and the dot com boom in the late 1990s and early two thousands. And then I would say even when social media came on board in the mid-2000s, up to about 2015, let's say, it was exciting. It was changing commerce, it was changing connection. There was this idea that it was going to help democracy and give more people a voice than ever before and hold power to account. And I think we can bookend that little halcyon period for many people and say that it ends in a sort of darker phase, which begins, I guess, from 2015 to 2016 for people in the United States or the United Kingdom who saw some of the social media companies basically become weaponised.
And of course around the world there's the role of these companies in facilitating genocide in Myanmar, in the case of Facebook, or auctioning off child brides in South Sudan, also in the case of Facebook. Just the trolling. And we saw the effect on democracy in terms of misinformation and disinformation, with certain politicians, particularly people of color, women and of course women of color, getting it worst of all: being really harassed and sometimes chased off these platforms, or feeling that they had to censor themselves to avoid getting that kind of aggro.
It's turned a bit darker. And also because of Edward Snowden's revelations and the Cambridge Analytica story with Facebook again in terms of election interference in the US 2016 presidential contest, we've realised, I think everyone has realised that any time you are going on any of these platforms or using these tools, they are harvesting data about you.
So you've got your Amazon Alexa at home, it's listening to you while you're having sex, it's listening to you while you're having dinner with your kids. It is taking all of that data and it is doing something. So there's this sense that it's not just that you are using technology. Technology is using you. The human, it's using us. And I think that leaves a sort of dirty feeling for a lot of people.
So you hear people thinking about taking tech Shabbats, taking the night off, taking the weekend off from tech, stepping away because of the mental health implications, the way that we know these firms know that it hurts people's mental health but they keep doing it anyway. Particularly with kids, getting all that surveillance tech even into schools, so that school's not a safe space anymore. I just think that we've probably all lost our innocence when it comes to tech, if we ever had it. I think some of us did though, and now it's difficult to feel that sort of unbridled optimism. It's very much a tempered optimism; if you are optimistic, it's with caveats.
Stephen Waddington:
I grew up in the eighties and nineties when the web was a very optimistic place, it was full of promise. And we're seeing a little bit of that come back with the revolt against Twitter and the move to Mastodon, but it's terribly complicated. And open source doesn't seem to be a solution either, right?
Stephanie Hare:
No, I mean Mastodon, I don't think, is a model that will scale easily in its current form for most people. Most people don't have time to be messing around with this stuff. They want things that are easy. So the more friction you have in adopting a product or a service, the fewer people are gonna do it.
I mean, there's also this question of, maybe for some people, they're like, 'I kind of hope Twitter does tank because I'd love to just be off it. I'd love to just turn off that tap, first of all of noise, but also of data collection about me in using it'. And I think that's really the question: is the party over for social media, or are people just going to use it and understand that we're in a fractured environment now, and we make these trade-offs and we accept them in terms of privacy versus convenience and fun, or it's good for your business or whatever? Or are we gonna potentially start to have to think about the new phase, a sort of post-social media phase: what happens next? It doesn't necessarily have to mean more of the same. It could be something totally new, and we could look back on this in 20 years and be like, why did we ever use those products? <laugh>
Stephen Waddington:
I know you are a fairly active user of social media. Where do you draw the boundaries in your own personal and professional use of these networks?
Stephanie Hare:
Yeah, so I'm only on Twitter and LinkedIn. LinkedIn is obviously for the sort of professional world. If people want to contact me for things. Twitter, I feel that because I do so much work in technology and also with the media, I'm on it because that's where a lot of journalists and political leaders are and where I can get stories fast. I found it a really useful tool. Back when I first joined, I think in 2011, back in the day when I was a political risk analyst covering the Euro crisis, I would see information being released much faster on Twitter than I would ever get from reading the traditional media. And I could contact people directly and people would chat. I thought it was a really useful tool. Now I try to use it still to keep up with what's going on, but I'm also spending a lot more time away from social media in general and reading a lot more books and attending a lot of talks here in London and just trying to get away because I think Twitter is very fast. I think social media can be very fast and it's curated because of these algorithms of course. And it just makes me a bit nervous about what that might be doing to my thinking and analysis. So I've just noticed that very naturally I've changed how I use it and I don't post personal information at all. You should not be knowing anything about my life based on Twitter. You might know about my work interests or stuff I'm following, but nothing about me.
Stephen Waddington:
Just develop that point for a minute then. You talked about the overreach of technology companies in collecting and harvesting data. The UK at the moment is looking potentially to weaken data protection legislation. What's your view on GDPR and what the UK is trying to do?
Stephanie Hare:
That's a really tricky one. On the one hand, GDPR is better than what came before, so we want to acknowledge that. I don't particularly feel that it's been enforced very well, which is a different point. I would like to see our regulators be much more active in enforcing GDPR. And I still think oftentimes you have to be a lawyer or a tech policy person to even really understand it. If you're just an average person, or particularly a kid, a teacher, somebody working with kids, trying to understand your responsibilities and also your options in case you do get into trouble, who can you turn to for help? It's still very opaque. The UK with its Online Safety Bill has had a lot of criticism in some quarters and a lot of commendations in others. And the fact is, it's just so hard; grinding through data protection legislation is such a grind. My concern with these things always is just how relevant does this feel to the ordinary person, stepping away from the specialists and experts? If you go on any British street today and ask people to explain what the GDPR is, or what they should do if they're having problems with a data violation, I bet you most of them will not be able to answer you. And yet that legislation has been in effect since 2018.
Stephen Waddington:
You've talked a little bit about big issues that we're missing related to climate, and this is something that really concerns me, that while there's all this focus on tech and the drama of Facebook, Twitter, and so forth, we're missing so many big issues. I wanted to ask you, what do you think are the economic and political issues that are occupying your mind and concerning you at the moment that you think organisations are missing because of so much oxygen being pulled out of the news agenda by tech?
Stephanie Hare:
So one of the things that's been on my mind, and that I wrote about for a magazine called The Wired World in 2023, that's where you make a prediction for the following year, is the ESG market. So environmental, social and governance investing. One of the things I thought was really fascinating about that, because I'm very pro-business and I like the idea of business having a positive role to play in society, as it so often does. But what I didn't know about ESG until I started working on it was how the metrics for accountability in ESG investing actually work. If you wanted your pension, for instance, to be invested in an ESG fund or a green fund or an ethical fund, the way that I had understood it was I thought it meant 'how is the company's performance impacting the environment, impacting society or impacting governance?
Is it a force for good in the world or not?’ And I wanted to be able to reward or punish companies with my custom, if you will, with my investment <laugh>.
What I learned looking into it was that's not actually how it works at all. ESG metrics are all about how the environment or society or governance affects the company. So how does the world affect the company, rather than how does the company affect the world? And that is something that our friends over on the continent in Europe had also cottoned onto. Legislation has been passed and is underway to address this. It's called, I think, double materiality: companies have to report on both how the world affects them and how they affect the world according to these metrics. So I thought that was really cool and I'd love to see more about it, because I think so many people are desperate to, if not do the right thing, at least do better in their lives: to decarbonize their lives, to decarbonize their companies as much as possible, to play a better, more positive role in society.
And again, it's so grinding, it's so opaque, it's really confusing. Are most people studying legislation in Brussels or looking at the SEC's position on this back in the US? No! <laugh>, right? So my challenge as a researcher is constantly: how do we take these things that are happening in certain fora and translate them so that the ordinary person reading a newspaper, listening to a radio broadcast, going to a conference, or just having a chat with their clients can do better things with their lives? It means I'll never be out of business, because unfortunately that problem is yet to be solved. But I do think there's a communication piece, and it's a real responsibility for anybody trying to make change: are you taking people with you on that change and empowering them?
Stephen Waddington:
I want to just talk a little bit about artificial intelligence, if we can. We seem to have been around the hype cycle of innovation several times: computers are gonna take our jobs, technology completely overreaching ethically, and then there's a reset. Where do you think we are right now in terms of the processing power of computers, the data being collected, and the application of machine intelligence to act on those data sets?
Stephanie Hare:
I guess you can say in one sense we're in kind of a golden age of AI in that we have an enormous amount of data that is being generated and collected - probably more than ever in human history. Algorithms have evolved to levels of great sophistication to be able to do all sorts of things.
And then the final piece, which is what we were really missing before, is we have the computer processing power because of the computer chips that we've been building since really the mid-20th century up until now. They are just so impressive, and the science behind them is incredibly impressive, but so is the manufacturing and productising of them, to say nothing of the global supply chain that's required to produce them. So all of these factors have met in this perfect storm. And it's also a case of there's not a lot of legislation governing the use of artificial intelligence, or really even data, to be completely honest.
It still feels to me very much like a wild west with lots of room for improvement and lots of room for innovation. So I think it's probably a pretty exciting time overall. And I also think what's hopeful, if I may be unusually optimistic, is that there's a lot of people thinking about the ethics of all of this: just because we have something doesn't mean we should use it, and how do we wanna use it? How do we make it more transparent, explainable, accountable, all of that stuff. So I think a number of factors are coming together to make this a really interesting time for the field of artificial intelligence.
Stephen Waddington:
There's a point you make in your book about ethical red lines for tech, which picks up on this last point about where you draw the line. Where do you draw it, do you think, between the sentience of technology and the sentience of human beings? What's your current thinking about that? And how do we ensure that we have the governance in place to prevent harm from machine intelligence?
Stephanie Hare:
One of the things I explore in my book is what it means to be conscious. And this is an unresolved question. We're still debating it. We still don't really know. There's no agreed upon definition of what consciousness even means. So in the book, I started by thinking, let's just start at the basics. We'll look at plant life and we'll look at animal life, then we'll look at humans, and then we'll take on machines, because you just want to sort of take people through who've maybe never thought about it before. To be like, 'That's true, actually: is a plant on its own sentient?' That's a question. But you can also ask, 'Is a forest sentient?', a group of plants working together <laugh>, providing an ecosystem. And then there's just plant life, flora around the world. And these are questions that are centuries old, by the way.
So it's like there's nothing new under the sun. It's fascinating. And ditto for animals: we've really evolved as humans. It's not that animal consciousness has necessarily changed; it's our wisdom and knowledge and understanding. What we think about animals and what we think about animal rights and consciousness has evolved substantially and no doubt will continue to do so. As I say, my big fear as a historian is that historians in a hundred years' time are gonna be looking back at us, even this chat we're having right now, and being like, 'Ugh, so basic. How did they not know? How did they think that? It's so medieval'.
And that's before we even approach machines. You could say Twitter is a technology, it's a tool. Does it become conscious, though, as something that you could study as a conscious entity when all of us are using it? So what happens if everybody just stopped using Twitter? Can you imagine what a peaceful day that would be?
Stephen Waddington:
Twitter is an interesting point - I challenged you in the preparation for this call that we wouldn't get into a discussion about Elon Musk, but here we are. We're gonna talk about Elon Musk.
Don't you think that Elon Musk works in a product environment and can design brilliant rockets and brilliant cars, but actually what he's got in Twitter is a network of human relationships? It's a hive mind. It's a living thing, exactly to your point, and now trying to create governance around that is just showing up the challenges of doing so.
Stephanie Hare:
Yes, which is really weird because just a few days ago he said something to the effect that Twitter was a software company, I think it was a services and a software company. And I just thought, ‘Wow, that's such an interesting way for you to see that’. And he's obviously entitled to his opinion, he as a person, but also as the owner.
I'm not sure other people would necessarily agree with that, though. I don't see it as a software company or as a services company at all. I think its value is in that hive mind mentality and what you could potentially study and learn from something like a Twitter. And it doesn't have to just be Twitter; frankly, any social network is fascinating.
So is it sentient, though? Is it conscious? I mean, ultimately I still think no, because as I said, if everybody just stopped using it and had a little holiday, it would just be, what? <laugh> It would just be lots of code, a bunch of engineers sitting around waiting for all of us to come back on and infuse it. So we are the oxygen and the blood that makes it alive. And then when we're not there, what is it? It's just a bunch of bones and meat and skin. To make the analogy, it's the shell, but it doesn't have the animating force; the animating force in that case is all of us. <laugh>
Not a shareholder. I should just like to say <laugh>. Just a user. Just a user.
Stephen Waddington:
Likewise. Honestly, it's so refreshing to hear your perspective. It seems that in so many spheres of research, cycles repeat themselves over history, and here we are; the same is happening again.
We always ask guests on our podcast one final question, and I want to put it to you.
Simple question: what's the one thing you think leads to better business outcomes in organisations?
Stephanie Hare:
<laugh> Integrity would be a great place to start <laugh>. Yeah, integrity. I think if you start with that at your core, that's gonna affect how you view everything and how you act and conduct yourself. I could really expand upon that, but I don't wanna waffle and fill up our time. But I would say integrity <laugh> is a great place to start. You get that right, and I think the rest of it'll look after itself.
Stephen Waddington:
That's a brilliant place to end. Thank you so much, Stephanie, for joining me.
Stephanie Hare:
Thank you <laugh>.
Stephen Waddington:
Well, that’s the perfect wrap to today’s Better Business Outcomes podcast. My thanks to Stephanie Hare for joining me.
Please don’t forget to subscribe wherever you usually find your podcasts and if you enjoy what you hear please also leave us a review.
I’ll see you next time.