Reinventing yourself to stay relevant with Simon Waller
At CIO Edge, Digital Champion, Author and Advisor Simon Waller’s presentation focused on the evolving IT function. He sat down with ADAPT’s Marketing Director Kylie Bonassi afterwards to discuss how to stay relevant today and ways of reinventing yourself and your organisation.
So, “The future is already here, it’s just not evenly distributed yet.” A quote by William Gibson that’s on the homepage of your website, and it’s what sort of gave me the topic of our conversation today, which I want to talk to you about: what you think is coming in the future. And I’d also love to talk to you about, how do you stay relevant today? How do you pivot and shift when the world is constantly changing? And how do you keep your confidence while doing so, as well? So if we start with, let’s get a bit wacky, what will 2050 be like?
Okay, so 2050. Part of my background is in scenario planning. Scenario planning is a strategic planning approach that, at its heart, says that the biggest mistake we can make is to pick just one future. Because if you pick one future, you’re guaranteed to be wrong. It’s like trying to pick lotto numbers, but actually harder. So imagine trying to pick lotto numbers in 2050. Instead I’d say to you, rather than pick a future, what are some examples of what the future might look like? Let’s just talk about what a positive version might look like, or a slightly negative version. So let’s talk about the influence of things like artificial intelligence, this new kind of breed of technology.
So artificial intelligence, machine learning, augmented reality, virtual reality. I believe, or I hope, my hopeful view of the future is that what we realise in these technologies is what we are not. Throughout human history our identity has evolved and changed. Prior to the Industrial Revolution it was one of the artisan, the maker of things. And then we created machines that were better at making stuff than us, and that was a tumultuous time in human history, and at the end of that we actually redefined ourselves in this current incarnation as cognitive beings.
So what defines us is our ability to think. And here we are now on the cusp of this next revolution, where we’re creating machines that are smarter than us in that traditional sense of intelligence, and then you go, well, what’s next? The conclusion I come to is that when we automate things, when we automate intelligence, we just value it less, because we mass produce it. Just like we mass produced goods and services and we now value them less, those are the rules of supply and demand. Let’s imagine the same thing happened to intelligence. What if we just valued it less? And if we valued it less, what would we value more? What are the things that are still, maybe, in short supply?
And so my positive, hopeful view of the future is that we evolve into a version of ourselves that is more focused on creativity, compassion, love, these kinds of soft skills which, at the moment, we know deep down to be important.
We just really struggle to articulate how, or why, or what their value is.
We’ve got caught up in the convenience. There’s a lot of convenience with automation and things just happening.
Yeah. I think there’s some stuff happening at the moment which are positive signs. It used to be about the fear of missing out, but now we’re talking about the joy of missing out. And what if we were just to do better stuff with fewer people? We were just talking previously about Facebook, and I was saying that I deleted my Facebook account along with my Twitter and my Instagram. And the one fear I had was not really about keeping up with everybody, that didn’t worry me, it was about not being invited to a party, an event. People would organise an event on Facebook, and because I wasn’t on it, I wouldn’t find out about it. But then I thought about it afterwards and realised, if it was an event that someone really wanted me to be at, they would have called me. If it was one of my close friends, they wouldn’t have left it to the chance of Facebook. They might have put it on Facebook and they might have invited me there, but they would have also picked up the phone and called me. And that was the event I wanted to go to, not the one that everyone got invited to.
So my hopeful view of the future is that we evolve into this version of ourselves which actually has more time for this soft side of ourselves, and that as a society we learn to value that stuff more. And I think there are certain examples of that starting to emerge. I do a lot of work, or have done a lot of work, with tourism, and this idea of experiences: we value experiences more than we’ve ever valued things. And how do you measure an experience? It’s really hard for us to measure that, and yet that’s something we are gravitating towards. The slightly more dystopian view is that at the moment, when we talk about the growth of technology, and that technology is fed by data and data-driven decision making, we come to look at data as being the answer to everything. And unfortunately, those things that we talked about, compassion, creativity, love, are notoriously difficult to measure. If we had to measure them, maybe we’d actually stop valuing them.
There are businesses who are fitting intelligence to, it turns out–
No, no, no, I completely get it. So if you’re looking at a shorter term now, say 2030, I’d say that’s the problem. In the next 10 years, there are genuine question marks over whether AI is actually going to be very good, and one of the reasons for that is bias. There’s a great example from Amazon, who created an artificial intelligence algorithm, or machine learning algorithm, to assess job applicants. And what they found was it threw up a disproportionate number of middle-aged white people, middle-aged white men, specifically. And they had to can it. Initially they were going, “What’s wrong with the algorithm?” And then they realised it wasn’t anything wrong with the algorithm, it was the data. So to me, following that timeline, I think by 2030 we have real question marks over the efficacy of some of those types of technologies.
I’m expecting that by 2050 we’ve resolved some of that stuff. But I think that the identity shift is actually a long process. I was talking to you earlier about the Industrial Revolution and the Luddite movement. The Luddites were people who would break into factories after hours and sabotage machinery. And that issue is often framed in terms of people’s loss of jobs.
And you made a really interesting point about loss, this idea of loss and how we respond to it. I would argue, though, that more powerful than the loss of the job is actually the loss of identity. And I think that’s going to be a real challenge. Think about some of the people we aspire to in society, the people we think of as smart: doctors, engineers, lawyers, maybe not lawyers. We think of these jobs as smart things; being smart is something to aspire to. And what if that aspiration is gone?
So I guess in a world of disempowered change, how do you constantly reinvent yourself?
So I think one of the great privileges of my job is the time I get to think. I feel that we’re in a world where people are so busy, and we almost wear busyness as a badge of honour. Someone says, “How are you?” And you go, “I’m busy.” And then they go, “That’s great.” At what point did just being busy become a good outcome?
How do you not have this fear of loss when things are always changing?
Look, I don’t think it’s possible not to have that fear. It doesn’t matter how often you change or how good you think you are with change, I believe everyone’s immediate reaction to change is a sense of fear and loss. It could be a split second, but that is always our first reaction. It’s kind of like an animalistic instinct for us. So I think that’s not so much the challenge; that’s okay, we’ve got to be able to embrace that. But it is something we can get better at. And the way we get better at it is by doing it, ultimately. Theorising it is not the answer.
Channel the fear.
Yeah. I think the problem we have is that we don’t necessarily provide environments inside organisations where we can use that, which you’ve kind of alluded to before. You know, we talk about fail-safe environments, experimentation. One of the keynotes I do looks at what it is that really disruptive companies do differently. What can we learn from the Amazons, the Googles, the Facebooks, the Spotifys of the world, and how do they do things differently? And there’s one thing that Jeff Bezos said: “If you double your number of experiments, you double your inventiveness.”
That’s the secret, double your experiments.
So an experiment here is a structured approach that allows you to fail safely. We need to be providing our people with those. We need people who are willing to take advantage of that, but in a lot of organisations I think we’ve actually trained people to just be safe, without the fail.