With a background in law and politics, what led you into a career in AI?
It’s been a bit of a journey. I studied Law because I was interested in advocacy, analysis, philosophy, and sociology. But when I actually started my career as a lawyer, I found myself increasingly drawn towards politics. It had a lot of the same aspects as law, but it also allowed me to look at some of the big questions and challenges that we grapple with in society, which I didn’t have much access to as a capital markets lawyer!
I left law after 6 years of studying and working and got a job in Westminster as a political adviser and campaigns manager on the Shadow energy and environment brief. It opened a whole new world of politics, policy, regulation, and communications for me, which I loved instantly. It was fascinating being there just as the huge socio-economic debate about climate change was exploding and new and emerging technologies were fundamentally reshaping energy systems and markets. It felt like a real time of challenge and disruption, which I’ve become quite addicted to in my career.
Technology and systems became a big passion for me, especially those on the cusp of, or undergoing, radical change. Telecoms felt like a natural evolution from energy, especially as the two were dovetailing through smart systems and smart cities. I loved learning about everything – from infrastructure through to the economics, and of course the rapidly evolving world of digital policy.
This was again an amazing moment, because we were starting to talk politically and as a society about “taming the digital wild west”. The internet as we know it had been around for nearly two decades, with unbridled growth, very little regulation, and not enough mind paid to dangers and risks. But because of issues like online harms, digital exclusion, and cybersecurity, the conversation was hotting up about ethics and where technology could lead us if we didn’t start to put appropriate safeguards and boundaries in place.
When the opportunity at Visa came along, I was really excited. Like energy and telecoms, payments is a major system in which digital transformation is happening incredibly fast. It’s the glue that holds together economies and Visa has been a leader in the space from the start, so it was a real privilege to get a front row seat for that. Alongside digital payments, Visa was also leading on a host of other areas of digital that I was interested in – from data science and AI, to cyber and fintech.
In my first year at Visa, I learned how the payments network and corporate side of the business worked. But I was increasingly drawn to the opportunities the business was exploring with data, especially looking at how we can use the power of data to tackle some of the big socioeconomic challenges we face in the world today. This has become a huge passion for me.
What kind of improvement, change or growth are you trying to achieve right now?
Visa is an extraordinary innovator as a business. They’ve created a global payments network that powers global commerce for businesses (large and small), and helps drive convenience, choice, security and a host of other benefits for consumers. Cyber security and fraud protection is particularly important. Most consumers wouldn’t immediately think of that when they think of Visa, but it’s at the core of everything we do.
The core business is processing transactions (around 150 billion every year), which is a fundamental service we provide in a digital economy. But it also means that we have an extraordinary opportunity and responsibility to explore what other value we can deliver from that transaction data for consumers, businesses and society.
I think every business should be looking at how they can use data to make better decisions. We use it to make ourselves a better business for the payments ecosystem – everything from operational efficiency and network resilience to cyber security and fraud protection. We increasingly look at how data can be used to help our clients grow their businesses, and better understand and serve their customers.
And finally, something I’m particularly passionate about: using the power of Visa data ‘for good’ in areas like sustainability, financial inclusion, disaster response and economic recovery. Data might just be the most powerful asset we’ve ever had to tackle some of these problems, which are becoming more and more urgent. During Covid-19, data has been an unbelievable tool in helping predict and track the spread of the virus and develop vaccines, which I think has really opened people’s eyes to the potential benefits.
All of this data exploration and innovation needs to be done with a deep respect for privacy and the highest standards of responsible best practice. So it’s fascinating from a tech and innovation standpoint, but also from a policy, regulatory and ethical perspective, which is a core part of my role.
What’s been the most memorable moment in your career to date?
Probably the cyber-attack at TalkTalk. That was quite extraordinary. I’d been at the company for 5 weeks when it fell victim to what was at the time the UK’s largest cyber-attack. This was pre-GDPR, and the company took an unprecedented decision to voluntarily notify all their customers within a couple of days.
Back then, companies often simply did not tell you within any reasonable timeframe (if at all) that your data had been compromised, so it was an incredibly courageous and controversial decision for the company to take. There was a lot of commentary in the crisis comms world about how they should have “kept quiet” about it, because it had serious repercussions reputationally and financially. But from the inside, having been told by security experts that they could not immediately tell what data was taken, nor could they say when they would know, it was the only ethical thing they could have done. Warning everybody as soon as possible was the only protection the company could give customers against potential scams.
What unfolded was like being in the eye of a storm for 6 months, with a huge number of stakeholders to manage – from police and security services to investors, regulators and politicians, and of course the media. It was an extraordinary experience and the team I worked with were absolutely incredible.
We came out the other side with two choices: we could never talk about data again, or we could try to raise awareness and share what we’d learned – because that was the other thing: companies to whom this had happened never talked about it. The management made what I think was a great call and let us run a series of initiatives to help businesses and consumers understand the risks and how to protect themselves. We launched a campaign with The Sun called ‘Beat the Scammers’, and published a thought leadership report on cybercrime with chapters written by the Secretary of State, hackers, GCHQ, and cyber-security experts from BAE Systems, among others. Our hope was that bold action and transparency would be useful to others, and the response we received was fantastic.
In terms of crisis management, it was the most invaluable experience I’ve had – and frankly I hope I never have to go through one like that again. It was a lesson that I will keep with me for my whole career: every data point represents something about someone’s life or business. So, if you are using data, you must keep those people front and centre in your mind with everything you do. Even if you have to make really hard decisions which might impact your business, you put them first. If we don’t have trust in the companies that hold and use our data, the whole digital enterprise collapses. So, it was a hard lesson, but invaluable and seminal in my career, and I will always be grateful for what it taught me about leadership and doing the right thing.
What advice do you have for future talent wanting to work in data and AI?
My advice for future talent is don’t ever think of yourself as “not the type of person” who could, or should, be working in data and AI. Ignore the stereotypes. Not every data scientist needs to be a maths genius, and not everyone working on data and AI needs to be technical – my role, for example, is focused on policy, ethics, privacy and risk. People from minority or excluded groups should be aware that while they may not see enough people like them in leadership roles, they are badly needed and will bring huge value. We need diversity and inclusion and fresh thinking in this sector, so if you’re interested and inspired by the power of data, go for it.