The Business of Data Podcast

Business of Data by Corinium

The Business of Data Podcast is dedicated to providing a voice to the Global Data & Analytics community. Each episode is focused on a specific topic area, uncovering the most pertinent issues facing global data & analytics leaders.

All Episodes

Ross Simson, Global Chief Data Officer at CDP, shares how he’s advancing the non-profit’s data strategy and tackling the unique challenges facing charity sector data leaders

With climate change in the spotlight following the UN Climate Change Conference (COP26), Ross Simson, Global Chief Data Officer at environment-focused non-profit CDP, joins us this week on the Business of Data podcast. In this episode, he talks about developing a successful data strategy at CDP, building a data team with the right skills and the need for a global standard for data management accreditation.

Futureproofing CDP’s Data Strategy

CDP helps organizations and governments measure and disclose their environmental impact. To develop CDP’s data strategy, Simson had to take a holistic approach that considered a range of factors, including people, resource allocation and the business environment.

“Being a charity, and as we get more organizations disclosing to us, we can’t just add more staff,” Simson explains. “We have to invest in technology where we can automate certain tasks and still use data science and machine learning.”

One of the methods Simson uses when planning CDP’s data strategy is McKinsey’s Three Horizons Model. The model aims to help businesses manage innovation and growth goals by grouping initiatives according to when they will come to maturity.

He says: “We’re facing this dilemma where you’ve got horizon thinking, but everybody wants everything done now. For example, ‘data-driven’ seems to be the newest buzzword. But to be a truly data-driven organization, you need to understand the value of data. We have all these tools and technologies, but we aren’t well organized where data’s concerned.”

“One of our challenges now [at CDP] is that we’re updating our data model,” he continues. “But there are 15 global standards that we can adhere to and each of those can have up to 25 data points. So the question becomes, how do you create a model which underpins not only what you need now, but for the future?”

“You want to develop a data strategy that’s as futureproof as possible,” he adds. “The past year has shown us that you can’t ever be truly futureproof. We must become more nimble [and] change the way we think because, when it comes to technology, one size doesn’t fit all.”

Investing in Data Team Talent

Simson would like to see a common global standard for accreditation in the various data fields, as well as a focus on nurturing young talent. While there are many organizations working on creating global data standards, he says the industry needs to focus more on accreditation.

He chairs the Customer Data Council in the UK, an advisory arm of the Data and Marketing Association. The association provides recognized accreditation and training in areas such as data management. The challenge going forward, he says, is in setting a global accreditation standard for the various data fields.

Nov 24

29 min 22 sec

Alwyn Thomas, Head of Data Strategy at the Financial Times, discusses the importance of collaboration and buy-in when embarking on organizational transformation

Data leaders agree that people are key to implementing business transformation projects successfully. The challenge often lies in communicating the vision to employees and fostering collaboration to see that vision through.

Speaking on the latest episode of the Business of Data podcast, Alwyn Thomas, Head of Data Strategy at the Financial Times, says collaboration is fundamental to digital transformation. The people developing a company’s transformation strategy should be talking to those on the ground who’ll be implementing it.

“If the people don’t understand why the process is taking place or why we’ve chosen a specific technology, it creates friction,” Thomas says. “We won’t be moving in the same direction.”

He says speaking to each department in its own language will help everyone understand how the transformation strategy applies to them. The best way to do this, he says, is to first walk in a person’s shoes.

He gives two examples of this. The first is of a data strategist who joined Deliveroo, the food delivery service, and spent a day delivering goods to get a sense of the business challenges. The second is from Thomas’ time at the Bank of Ireland.

“We initiated an undergrad program where the graduates would do one of their cycles with the data team,” he recalls. “They learned where the data’s coming from and how to use tools such as Tableau so that, when they finished their rotation, they could join any team and know how to access and use the data.”

“As a data strategist, these are the types of things I should be focusing on – advising departments on a roadmap that will address problems and enable them to make the right decisions moving forward,” he says.

Tech Won’t Solve Every Challenge

Even as someone passionate about technology, Thomas admits that tech won’t solve all the problems organizations face. He says collaboration and breaking down silos is what helps companies make connections where none previously existed, resulting in new product offerings.

He mentions, for example, the recent partnership struck between competing data visualization companies Tableau and Looker. The agreement will make visualization and analysis projects more accessible and easier to collaborate on.

“It’s a symbiotic relationship between the two and you want that type of relationship between data and other parts of your business,” he concludes. “The innovation is in understanding how we can help make the data available so business can see what’s going on and understand it.”

Key Takeaways

Communicate with stakeholders on their level. To establish trust between data and business teams, meet teams on their level and ‘walk in their shoes’

Tech won’t solve everything. Innovation depends on breaking down organizational silos, as well as data ones

Focus on business needs. A data strategist’s job is to enable the right decisions and processes for the organization to move forward

Nov 18

28 min 24 sec

Paul Morley, Executive, Group Data Services at Nedbank, shares his experiences of building company-wide data communities and how he believes organizations can benefit from them

For companies to truly embrace data and analytics, data leaders need to move conversations around data literacy beyond their departments and into the broader organization.

For Paul Morley, Executive, Group Data Services at South African bank Nedbank, one of the best ways to build data literate organizations is to make data conversations a part of daily business operations. In this week’s Business of Data podcast episode, he shares his experiences around building organization-wide data communities and how companies can benefit from them.

“I focus a lot on internal education and collaboration to build awareness and create enthusiasm around data,” Morley says. “I probably spend about two or three hours a day doing just that.”

“If I could do more, I would,” he adds. “It’s very important to make people who don’t understand data understand it because, for an organization, working with data is like a team sport. As much as we might naturally want to focus on just the data team, it’s actually not about us. It’s about taking the whole company with you and inculcating that knowledge.”

Three Tips for Promoting Data Literacy

Gartner’s 2020 Execution Gap Survey found that 67% of employees don’t understand their role when new growth initiatives are rolled out. To address this challenge and drive the value of data literacy all the way to the grassroots, Morley recommends the following:

Embrace repetition. The only way to instill a data literate business culture is repetition, repetition, repetition. It’s like playing a new sport, Morley says. It takes practice to get good at it

Appoint the right leaders. Morley says it’s important for data leaders to cultivate good communication skills. He attributes his own success in part to his natural extroversion, and recommends building teams full of leaders who are strong communicators

Get down in the trenches. Morley encourages data leaders to spend more time talking to the people at the coalface than sitting in leadership strategy sessions. If frontline staff don’t buy into the journey, you’re going to fail. With them on-side, you’ll have a better chance of enacting the change you’d like to see

Be Mindful of the Headhunter Threat

Building a strong internal data community may be about more than those working directly with data. But the challenge of finding and retaining staff with the right skills persists. Talent poaching remains a reality for many companies.

Morley explains: “Two other local banks are actively hunting our employees. We’ve lost about 30% of our staff this year alone in our group, across professions. It is concerning and it’s something we’re discussing at the executive level. But we’re also still attracting a lot of new blood; that’s testament to Nedbank’s culture.”

Seeing off this threat is about making your company as attractive a place to work as possible. Morley says providing opportunities for training and personal development has a role to play here.

Nov 5

25 min 41 sec

Ian Wallis, Deputy Director, People Analytics and Insight for HM Revenue and Customs, talks about his new book and how organizations can use data-driven insights to better serve their HR functions

The success of an organization depends on the people who work in it. But while data analytics is becoming central to many business functions, most companies could be doing more to use data-driven insights to enhance their HR departments.

This week’s Business of Data podcast guest, Ian Wallis, Deputy Director, People Analytics and Insight for the UK’s HM Revenue and Customs (HMRC) department, specializes in exactly that. He believes people analytics is a “great, untapped way” to transform organizations and drive better CX through more engaged employees.

“Getting the most out of our staff falls into my domain,” Wallis says. “Anything I can do to improve their experience ultimately leads to a difference for our customers.”

Helping Staff to Harness the Power of Data

Wallis argues that executives don’t typically appreciate that HR is at the heart of a business’ operations. As a result, people analytics is still an underrated discipline in modern business.

“Looking at HR through the employee lens, there’s a very direct relationship between good customer experiences and engaged, well-trained employees who are equipped for their roles,” Wallis says.

To ensure its staff have the skills they need, Wallis helped HMRC to develop a voluntary information literacy program for its 65,000 employees. The program covers topics including why GDPR matters, how to deliver analytics and what data ownership is.

“We’re living in an information literacy era and people need to be comfortable using information in their daily tasks,” he quips. “That’s a philosophy we’re trying to embed.”

Ensuring Continuity and Retaining Data Talent

After more than 30 years working in data analytics, Wallis recently published Data Strategy: From Definition to Execution, a book sharing his experiences in planning, developing and implementing data strategies. In it, Wallis argues that a key issue when working with in-demand skills is ensuring a sense of continuity as employees come and go.

“It becomes important to entrench a level of understanding beyond a few people,” he says. “One of the themes I cover in the book is the importance of building bridges with stakeholders to have a common understanding, and then linking the corporate strategy to the data strategy, so that it’s not only enduring but also perfectly aligned.”

Considering his own career, Wallis says one of the best ways to retain and develop talent is by creating opportunities for lateral career moves.

“I’ve built a number of analytics and insight teams from scratch,” he says. “Here at HMRC, there are 14 of us that span everything from master data, data governance and data quality, all the way through to touching on data science.”

“There are great opportunities to move sideways and broaden your career,” he says. “It makes you a well-rounded employee, allowing you to learn how this broad spectrum comes together.”

Key Takeaways

Encourage employee growth and career development. Fulfilled employees deliver better customer experiences

HR is an underutilized resource. Partnering with HR to roll out data literacy initiatives can help to drive business transformation efforts

Provide opportunities for lateral career movement to retain valuable data and analytics talent

Oct 28

29 min 39 sec

Kassim Hussein, Head of Enterprise Analytics at Cleveland Clinic London, shares how he overcame an internal skills shortage to set up the clinic’s data and analytics function

When Kassim Hussein was offered the chance to build the analytics function for US-based hospital group Cleveland Clinic’s new UK arm, he jumped at the opportunity. As Head of Enterprise Analytics at Cleveland Clinic London, he’d gain leadership experience while sinking his teeth into an exciting new challenge.

“I realized this is a chance to start a new healthcare system, from scratch, in London,” Hussein recalls. “You’re not going to get this opportunity many times in your life!”

But as Hussein says in this week’s Business of Data podcast episode, the COVID-19 pandemic created unexpected challenges as he set to work establishing the new data and analytics unit.

Overcoming a Skills Shortage with Vendor Support

Cleveland Clinic is a US-based hospital group that also specializes in medical research. Its outpatient center was opened in Marylebone in September. This will be followed by a 184-bed hospital next year.

Before the center could open its doors, one of Hussein’s first priorities was to set up an enterprise data warehouse (EDW). This would help the organization capture key strategic data sources in one governed location that would also feed self-service analytics tools.

Hussein says working virtually with different vendors was a challenge. But he also says doing so was essential to address the limited resources his team had in-house.

“Because of GDPR, we had to build our own processes, working with vendors to make sure we have the right capabilities from day one,” he recalls. “Remember, I joined the team virtually and this came with many challenges. I was lucky that we had a big project support team (based in Cleveland, USA) and some contractors. But finding the right local skills was the biggest challenge.”

“We use [e-health cloud-based software] Epic for our medical records management system,” Hussein continues. “It’s an American tool. So, only a few healthcare providers use it in the UK. Finding someone with the standard BI skills and experience plus knowledge about Epic is near impossible.”

“Because of the nature of our situation, one has to get their hands dirty to cover the internal skills gap,” Hussein adds. “It’s a bit of a hybrid role, right now. So, I have to do SQL scripting [and] Tableau visualizations. I’m leading big meetings with stakeholders, but I’m loving it.”

“If we want to make London more attractive to BI analysts, we would have to invest in training and development,” Hussein adds. “The analysts also need to be interested in upskilling themselves.”

Next Steps for Analytics at Cleveland Clinic London

With the hospital’s basic analytics systems up and running, Hussein says the next step is laying the foundations Cleveland Clinic London will need to branch out into AI and machine learning.

“The data quality has to be really accurate; if we want to build models, the data has to be fit for purpose,” Hussein says. “We want to understand the data lineage and we need a genuine u

Oct 21

29 min 10 sec

Gary Goldberg, Chief Data Officer, Trading and Shipping at BP, talks about the role data will play as the firm transitions into an ‘integrated energy company’

With the 2021 UN Climate Change Conference just weeks away, the pressing need for the global economy to transition to greener energy sources is ‘top of mind’ for leaders across the globe. This need will be a defining feature of business and data strategies for energy companies including BP in the coming years.

In this week’s Business of Data podcast, BP Chief Data Officer, Trading and Shipping Gary Goldberg outlines the role data will play in helping BP to realize its vision of becoming an “integrated energy company”.

“BP is transitioning into an integrated energy company,” Goldberg says. “That, especially from a data perspective, creates a whole load of opportunities. When we look at new sources of energy, when we look at production techniques, there’s a need for data to support those initiatives.”

“[This] also means that the company is moving from a historically siloed approach to different commodities to (as the title would imply) an integrated approach,” he adds. “It puts data more and more at the heart of everything we’re trying to do.”

“That integration of data and what that means for the company and what we can do is incredibly inspiring,” he continues. “It means bringing together data from across our commodities, from across the new products we’re bringing to market, and getting the interrelationship insights that are gained in offering a portfolio of products to our customers.”

BP’s Data-Driven Business Transformation

For any historic organization, achieving such an ambitious goal is like turning an oil tanker at sea. Creating a truly data-driven organization will take time. But Goldberg highlights several key milestones his team has already reached on that journey.

“We’ve got some pockets of real excellence and are trying to scale up,” he says. “For me, the transformation on the data comes down to fundamentally allowing people to understand the value of data.”

“When we can articulate what [our data] assets are, we can then have a conversation about how you want to manage them,” Goldberg continues. “For us, one of the first [milestones] was just to establish that inventory.”

He notes: “[It was about] changing the conversation from an ethereal one around, ‘Let’s just make it better’, to, ‘This is what we have. This is what we could have. It will generate a return if we use it for the following purposes. How much would you spend to get that return?’”

“For me, when I changed that conversation to where the conversation was one of investment return, that’s the transformative moment,” Goldberg concludes. “It’s when the business was properly engaged.”

Key Takeaways

Data integration can enable business innovation. Breaking down data silos is essential for uncovering insights that stem from the interconnectedness of data

Articulating the value of data is key for securing buy-in. Reframe governance conversations around increasing the value of the company’s data asset and achieving business outcomes

Cataloging data is the first step to quantifying its value. Before leaders can talk about how to increase the value of their data assets, they should understand what data their businesses have

Oct 15

26 min 31 sec

Sathya Bala, Head of Global Data Governance at Chanel, shares why she launched My Skin My Story, a global non-profit community connecting diversity and inclusion leaders with data-focused professionals

Inclusion and diversity has shot up the corporate agenda at many companies in recent years. For Sathya Bala, Head of Global Data Governance at Chanel, it’s a topic that will one day receive the same attention as sustainability does in business today.

Awareness is growing of the role diversity and inclusion plays in guarding against decision-making bias, and of the role data plays in ensuring organizations treat people fairly. But collaboration between diversity-focused and data-focused teams is generally very limited.

In this week’s Business of Data podcast episode, Bala shares how she is working to address this disconnect through My Skin My Story, a non-profit global community she founded to connect, empower and elevate women of color to succeed on their own terms.

“I wanted to champion diversity and inclusion,” she says. “I was doing work as a very enthusiastic amateur, training myself, going to webinars, going to events around diversity and inclusion, just trying to educate myself and complement my personal journey that I’ve been on.”

“I was finding that data was coming up a lot in these diversity and inclusion events,” she continues. “But there were no data people at those events.”

“Over time, [I was] hearing from data leaders at [industry] events that they cared about diversity and inclusion and hearing from diversity and inclusion leaders saying how important data is,” she adds. “I very quickly thought, ‘Why are we not in the same place talking about this?’”

Where Data, Diversity and Inclusion Collide

Bala has been bringing professionals from these communities together since November 2020. In that time, she’s identified several areas where these historically separate business units can learn from each other.

She argues that it’s the responsibility of data leaders to help diversity and inclusion teams overcome challenges around data collection, data governance and data analysis. Meanwhile, data teams should draw on diversity and inclusion leaders’ expertise to ensure they are diverse and aware of issues around diversity and decision-making bias.

“These are things that [are not in the wheelhouse of] diversity and inclusion professionals,” Bala says. “That’s not their job. We have that expertise and we can also make sure that, as organizations, we’re not wholly reliant on [external] consultants.”

“Should we be trying to tackle our posture on data ethics without diversity and inclusion input?” she adds. “Probably not, because there are things that they can provide a lens on, in terms of equity, transparency [and] debiasing.”

“We just don’t have the relationships built, historically, between the data profession [and] the diversity and inclusion profession,” she continues. “Do we have to wait for our diversity and inclusion teams to think of us and tap us on the shoulder? Or can we be the instigators?”

“As a Head of Data Governance, I had to figure out, what are those business-critical projects that I want to advise on, so that we build data by design into those projects?” she concludes. “Diversity and inclusion is one of those initiatives. It is a business-critical initiative just like perhaps sustainability was a few years ago.”

Oct 7

33 min 25 sec

Andrie Galaktiou, former Head of Data at John Lewis Partnership, discusses the role she played in establishing John Lewis’ data management team during her 13-year tenure at the retail giant

For a retail giant like John Lewis Partnership, the growth of digital commerce in recent years has created big opportunities to drive sales and optimize processes with data and analytics. But as former John Lewis Partnership Head of Data Andrie Galaktiou says in this week’s Business of Data podcast episode, few in the industry are in a position to take full advantage of these opportunities.

“It’s exciting, because I think the retail industry’s definitely moving forward,” she says. “What we need to understand is, the foundational pieces need to be done first, before you can start to think about being innovative.”

After starting her career at John Lewis in its merchandising department, Galaktiou moved into the company’s data function and played a key role in establishing its central data management team before starting a new role at Publicis Groupe in September 2021.

Kickstarting Data Investment at John Lewis

While the Global Financial Crash kickstarted data governance investment in the financial services sector, the retail industry has had no parallel event to kickstart its path to data maturity. As a result, it’s taken longer for retail executives to prioritize strategic data investments.

“The first [challenge] will always be getting the buy-in and the funding within that space,” Galaktiou says. “Sometimes it helps to start things small or start things with specific projects or pieces of work.”

“If you don’t understand what your stakeholders need or want, you’re not always going to get the right results from them”
Andrie Galaktiou, former Head of Data, John Lewis Partnership

“It’s really hard to try to move forward with some of these things if people don’t get it,” she adds. “So, educating and training the business on what data is, how it works, what it can do for them, is absolutely key and one of the biggest challenges.”

For Galaktiou, starting with key programs that showcased what data could do for John Lewis helped her to secure buy-in for further investment. She also believes her background in merchandising meant she understood what mattered to stakeholders in the business.

“Having worked in that space and really understanding how systems work from their perspective – how they use the data, what they need to do with it – made it a lot easier,” she argues.

Building a Data Management Team for the Retail Sector

Once Galaktiou and her colleagues had secured executive support and signed off the company’s data strategy, her focus became building a team with the right skills to drive those plans forward.

“I don’t believe there is a textbook answer for what a data team should look like, what the roles should be and what people should do,” she says. “You’ve got to really understand your organization, bring in the expertise from the organization as well as the expertise from understanding data and data management to really drive that forward.”

“Providing you’ve got the backing and the funding to do it, it’s about really understanding the business and where the business is at and where it’s going”
Andrie Galaktiou, former Head of Data, John Lewis Partnership

“For me, it’s [about] finding people who can speak the data language, who can really understand the business,” Galaktiou continues. “The data team often sits between the business and technology, and so they have to speak three different languages. You need to be able to find people who can speak all those languages and really bring it together.”

She adds that ensuring data team roles have enough breadth for staff members to work on a variety of projects and choose the projects and focus areas that interest them most is important for staff retention.

“It doesn’t all just happen really quickly,” she concludes. “Getting the people in place is one step of that process. You’ve then

Sep 30

26 min 32 sec

Handelsbanken UK Head of Data Quality Mark Wilson offers his advice on getting to the root of data quality problems with frontline staff

Data quality has a profound impact on the daily work of frontline staff. And when frontline staff identify data quality problems, they expect them to be dealt with promptly. Failure to do so can seriously drain morale.

In this week’s episode of the Business of Data Podcast, Mark Wilson, Head of Data Quality UK at Handelsbanken, argues that fighting back against apathy is essential to driving business improvements using data and analytics.

To create a grass-roots view of data governance at Handelsbanken, Wilson focused on working with frontline staff to improve data quality as well as data governance structures.

“We’re working with the front-end staff on how we collaborate with them more in their everyday work with customers,” Wilson says. “It’s really about being in that real-life world, away from that ‘ivory tower head office’ mentality.”

Ascending from the Ivory Tower

Understanding the realities of how businesses generate and use data and analytics on the ground is critical to fixing problems and improving results. And there are few better ways to learn than by getting your hands dirty.

For this reason, Wilson recommends that head office staff spend time working in the field to discover how frontline staff are using data and analytics, and what their challenges are.

“They’ll tell you the problems that really need solving and what’s causing the problems,” Wilson says. “Whereas if you sit too far back in the business, you just see the results.”

This distance can sometimes lead head office staff to misdiagnose problems, blaming careless staff when the problems may in fact lie in processes and technology.

“A lot of our early wins [came from] speaking to our branches,” Wilson recalls. “You’re doing a disservice to not flush this out and talk about these things.”

He continues: “And as always, by enabling communication about them, you could find there is already something in place that a small, slight shift on a project path to factor something in might solve things that were never anticipated in the first place.”

Fighting Apathy in the Workforce

Data problems, once reported, should be resolved promptly. Failure to do so can seriously drain morale, and this effect can spread quickly throughout teams.

“I think this is something we consciously have to think about. What is the message we’re sending when we don’t respond?” Wilson remarks. “I think that [responding] is important for morale, for people’s wellbeing or belief that they can make a change. And for the company to show that it’s listening.”

Creating structures and feedback loops is therefore essential to providing frontline staff with agency – and showing that the company cares about their input.

“We should [have] data quality issue management processes in place where any employee can go into a place and record a data quality problem,” Wilson says. “You should be reviewing that, digging into the root cause, doing the evaluation, and perhaps then identifying who in the business has the responsibility to take the corrective action.”

He continues: “We should have a data governance committee in place who are keeping track of these open data quality issues. And that should be part of the management structure of an organization so you’ve got an escalation point.”

Ultimately, data and analytics teams are there to help companies meet their goals. Therefore, fostering trust amongst staff at all levels is essential.

“We’re here to help the company grow and be better through better data management,” Wilson concludes. “So we need to word things in a way that says, ‘let us know if you’ve got a problem. This is who you contact. These are the processes in place to help you.’”

Key Findings

Leave the ‘ivory tower’. By working with the frontline staff you can gain a true understanding of how data is generated and wh

Sep 15

31 min 34 sec

Roger Halliday, Chief Statistician for The Scottish Government, shares how public service organizations across Scotland are ‘joining the dots’ between their datasets to help the most vulnerable members of society In the UK today, much of the data public sector organizations collect about British citizens lives in silos. But as pioneering countries such as Estonia have shown in recent years, governments can greatly improve the quality and efficiency of the services they provide by breaking down those silos and working toward a 360-degree view of their citizens. In this week’s Business of Data podcast episode, Roger Halliday, Chief Statistician for The Scottish Government, talks about the work he’s doing to help Scotland provide better services to its citizens with data. “I’m responsible for whatever numbers come out of public bodies across Scotland,” Halliday explains. “There are 40 or so organizations, from schools to prisons to the health service and so on.” “I’m [here] to tell the story of a nation in an objective way and in an open and transparent way,” he continues. “I’m responsible for making sure that the numbers are trusted, that they’re high quality and that they’re actually used to improve the lives of people and improve decisions that are taken.” Two Ways Scotland is Improving Society with Data The COVID-19 pandemic is one obvious example of how curating and sharing valuable datasets can help governments provide better services and make more informed policy decisions. Indeed, Halliday says this has been a significant focus for him over the past 18 months. “[For] the last year, for example, I was leading up the COVID-19 analysis team for the Scottish government,” he says. 
“So, we were modelling the epidemic, getting evidence together for the difficult decisions that governments around the UK [and] the world have had to make.” But Halliday also highlights an initiative geared toward providing essential services to homeless people to illustrate some of the more strategic ways Scotland’s government is harnessing the power of data. “We’ve been collecting data on homelessness for many years,” he says. “When [we] put it together, we found that 8% of people in Scotland have been homeless at one time or another over the last 15 years.” “We thought, if you put that data together with other bits of information, then maybe we’ll be able to better help people who are in that situation,” he continues. “So, they’re able to link that data on homelessness with data on the health services that people that are homeless receive and, not surprisingly, found that [these] people have difficulty accessing health services and that their health is a lot poorer.” Through analyzing these connected datasets, Scotland’s public service organizations have developed new ways for people to reach key services they might otherwise have struggled to access if they were homeless. Perhaps more interestingly, they have also identified ‘trigger events’ that frequently cause people to become homeless. This is helping them develop ways to predict which citizens are at risk
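The kind of record linkage Halliday describes is, at its core, a join on a shared identifier followed by a group comparison. A minimal sketch in pandas might look like this (the person IDs, column names and figures below are invented for illustration, not drawn from the Scottish Government’s actual datasets):

```python
import pandas as pd

# Hypothetical person-level records: has this person ever been homeless?
homelessness = pd.DataFrame({
    "person_id": [1, 2, 3, 4, 5],
    "ever_homeless": [True, False, False, True, False],
})

# Hypothetical health-service records for the same people
health = pd.DataFrame({
    "person_id": [1, 2, 3, 4, 5],
    "missed_appointments": [6, 1, 0, 4, 2],
})

# Link the two datasets on the shared identifier
linked = homelessness.merge(health, on="person_id", how="inner")

# Compare average outcomes between the two groups
by_group = linked.groupby("ever_homeless")["missed_appointments"].mean()
print(by_group)
```

In practice, government-scale record linkage also involves probabilistic matching, de-identification and strict governance controls that a toy example like this leaves out.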

Sep 8

30 min 1 sec

Soeren Lueders PhD, VP, Effectiveness and ROI Modeling at SevenOne Media, outlines his team’s ‘questions first’ data science strategy and why focusing on the right data beats traditional ‘big data’ approaches Marketing analytics is a key focus for German broadcasting company SevenOne Media. But while many marketing analytics teams prioritize gathering huge volumes of data to micro-target potential customers, SevenOne Media is bucking this trend. As Soeren Lueders PhD, VP, Effectiveness and ROI Modeling at SevenOne Media, says in this week’s Business of Data podcast episode, this isn’t necessarily the best approach for all companies. “As soon as you talk about ‘data-driven’, [people think] it’s about data collection and it’s still about collecting as much data as possible,” Dr Lueders says. “I think it’s slowly changing. And what we’re doing as well is to look at, ‘Is all this data really necessary for what we’re doing? Or, what we’re doing with all this data, is it really getting us where we want to go?’” Having the Right Data Beats Having ‘Big Data’ Dr Lueders argues that many companies are drawn to approaches such as microtargeting because the tech companies that sell the necessary data and tools generally have compelling sales pitches. However, he notes that research suggests focusing narrowly on ‘ideal’ customer segments can be counterproductive. “Traditional digital targeting is all about data collection,” he says. “So, you try to get as many data as possible. You try to build groups upon that data and to try to target niche markets.” “[But] when you look closely at your marketing campaigns,” he continues, “niche targeting or surgically targeting certain groups is not really necessary for most companies because, most of the time, your product is really available for a broad audience.” “You can easily analyze this by yourself, if you just look at, ‘Who are you aiming at?’ and then at the end, ‘Who is buying your product?’” he adds. 
“If you make this analysis [and] you see that there’s a big difference, then you should think, ‘Maybe this approach doesn’t really make sense.’” For Dr Lueders, unnecessary microtargeting causes many companies to neglect large portions of their true customer base. In the end, it’s customers who lose out, with some audience segments receiving too many ads and others being served none at all. A ‘Questions First’ Approach to Data Science Ultimately, the key to avoiding this kind of trap is to flip the approach that many companies take to data science. Rather than collecting lots of data first and then working out what to do with it, data-focused executives should start by asking questions about the problems they want to solve. “The right data is the data which is necessary for the project to fulfill the task or to get the results,” says Dr Lueders. “It’s very common in the market to collect as much data as possible. And then, once you’ve got data, you kind of decide what to do with it.” “[This is], I would say, the wrong approach,” he concludes. “You should really focus on what kind of question you want to answer, and then you look at, OK
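The self-check Dr Lueders suggests – comparing the audience a campaign targets with the audience that actually buys – can be sketched with simple set arithmetic. The segment labels below are hypothetical, purely for illustration:

```python
# Hypothetical audience segments: who the campaign targets vs. who actually buys
targeted_segments = {"18-24 urban", "25-34 urban"}
buying_segments = {"25-34 urban", "35-49 suburban", "50+ rural"}

# How much of the real customer base does the targeting actually cover?
overlap = targeted_segments & buying_segments
coverage = len(overlap) / len(buying_segments)

# Which buying segments receive no ads at all?
missed = buying_segments - targeted_segments

print(f"Targeting covers {coverage:.0%} of buying segments")
print(f"Segments served no ads: {sorted(missed)}")
```

A large gap between the two sets is exactly the warning sign he describes: the targeting strategy and the real customer base don’t line up.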

Sep 2

30 min 48 sec

Simon Jones, Head of Data Science and Advanced Analytics at Saga, talks about building a remote-first data science team to help Saga recruit the talent it needs to modernize and engage its increasingly digitally savvy audience Providing seamless digital customer experiences wasn’t always a priority for British ‘over 50s’ insurance specialist Saga. But as a new cohort of digitally savvy consumers enters middle age, the firm’s attitude toward the need for modernization has changed. As Saga Head of Data Science and Advanced Analytics Simon Jones explains in this week’s Business of Data podcast episode, the company is now reimagining itself in light of the changing needs of its customers. “A lot of people are moving into the ‘over 50s’ category, which is where Saga’s footprint begins, and they don’t necessarily think of themselves as the sort of person who signs up with Saga,” he says. “Trying to understand exactly how we can penetrate into that demographic group was a really important thing. “And what [we] recognized very early on was that a lot of it was down to our relationship with technology.” Saga is now developing new technological capabilities with these customers in mind, and Jones believes embracing a ‘remote-first’ model for data science will give the company an advantage as it pursues this aim. The Benefits of Being Remote-First Jones joined Saga’s insurance arm in May 2021 with a remit to build a data science team to help the company get the most out of its data asset. He says the role is the first he’s held that has empowered him to truly embrace remote working. Jones argues that this approach makes it easier for Saga to recruit top-quality talent and makes a career at the company more attractive to data scientists who enjoy the flexibility that comes with remote working. “I’ve got a recruitment function right now, to build out a remote-first team, trying to find top talent in data science and bring them on board to Saga,” he explains. 
“That means our talent pool is anywhere in the UK.” “If somebody wished to explore a bit more of the country by basing themselves in different spots over the course of a working month, I have no problems with that,” he adds. “As far as I’m concerned, you’re always working in the same location: The cloud, online, with me. “That makes it possible for us to reach out to talent which, for particular reasons, have based themselves outside the areas we’d normally be recruiting in.” What’s Next for Data Science at Saga In the near-term, Jones’ priorities include building out his team, helping Saga build out its data lake and sourcing new “exotic” datasets to provide staff with insights they don’t have access to currently. But looking to the future, he sees his priorities shifting toward helping to drive the adoption of data-driven technologies across the organization and creating processes that help his team get data science products into production efficiently. “It’s all part of serving the broader agenda of helping Saga advance,” he concludes. “There’s going to be an awful

Aug 18

35 min 27 sec

Stefanie Costa Leabo, Chief Data Officer for the City of Boston, shares how her team led an open data project to help manage the impact companies like AirBnB are having on the city’s property rental market Short-term rental companies including AirBnB have transformed housing markets across the globe. But while many tourists and property owners have benefited from these services, they have also made life harder for long-term renters in some parts of the world. As Stefanie Costa Leabo, Chief Data Officer for the City of Boston, reveals in this week’s episode of the Business of Data podcast, her team is playing a key role in managing this phenomenon in the city. “In certain parts of the city, properties were being bought up by large developers and they were being run as almost de facto hotels,” Leabo recalls. “[That’s] problematic for a couple of reasons. One is that there’s a reason that hotels are regulated and have to hold certain licenses.” “There are health and safety standards that we apply to businesses and those regulations are there for a reason,” she continues. “The second issue is that it was changing the character of neighborhoods and taking long-term housing out of the housing market.” The city decided it needed to find the right balance between allowing residents to generate an income from their spare rooms or properties and ensuring long-term housing remains available to rent. So, it decided to set new regulations for short-term renters in 2018. Then, it enlisted Leabo and her team to deliver an ambitious open data project to assist with the enforcement of these new rules. An Open Dataset for the Boston Property Market Involving Boston’s data office in this project from the outset was pivotal to its success. Leabo’s team collaborated with other departments to evaluate what impact short-term rentals were having on Boston’s housing market. Then, it created a new open data portal that the legislation mandated. 
“The first thing we needed to do was create a new dataset from scratch that would determine the eligibility of every single residential housing unit in the entire city,” Leabo says. “There were three different types of license. So, we had to be able to tell, ‘Is this unit eligible for each different type?’” “We were brought in from day one and were able to be part of the policy discussion, be part of the implementation and enforcement team,” Leabo continues. This project involved gathering data from at least six city departments including inspectional services, public works and non-emergency services hotline 311. These sources were then merged to create a unified short-term rental eligibility dataset. “Many of these data sources had never been joined before,” Leabo notes. “At least, not at such scale.” “The biggest challenge, that we both saw coming but also surprised us at times, was data quality,” she adds. “[When] working with data that is being managed by six or seven different departments, you have different levels of data quality [and] different levels of data standards, in some cases.”
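The data quality challenge Leabo describes often shows up as inconsistent join keys across departmental systems. A minimal sketch of standardizing a key before merging, assuming hypothetical departments, address formats and license names (Boston’s real pipeline is not detailed in the episode):

```python
import pandas as pd

# Hypothetical extracts from two city departments with inconsistent address formats
inspections = pd.DataFrame({
    "address": ["12 Main St.", "34 Elm Street"],
    "passed_inspection": [True, False],
})
licensing = pd.DataFrame({
    "address": ["12 MAIN ST", "34 ELM ST"],
    "license_type": ["limited-share", "owner-adjacent"],  # invented license names
})

def standardize(addr: str) -> str:
    """Normalize case, punctuation and common street-suffix variants."""
    addr = addr.upper().replace(".", "")
    return addr.replace("STREET", "ST")

# Derive a clean join key in each department's extract
for df in (inspections, licensing):
    df["address_key"] = df["address"].map(standardize)

# Merge the cleaned records into a single eligibility view
eligibility = inspections.merge(licensing, on="address_key", how="inner")
print(eligibility[["address_key", "passed_inspection", "license_type"]])
```

Normalizing the join key first is what makes records from differently managed systems line up at all; without it, the inner join above would return no matches.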

Aug 12

30 min 54 sec

Matteo Secchi, Director of Product Analytics at HelloFresh, discusses how the company is ensuring its analysts are driving business impact as the company scales Meal-kit company HelloFresh has enjoyed fantastic growth in recent years. Demand for its services remained strong through the pandemic, even when companies in other sectors suffered. As Matteo Secchi, HelloFresh’s Director of Product Analytics, explains in this week’s Business of Data podcast episode, this rapid expansion has created challenges for the company’s analytics function. Not only has the product analytics team grown since Secchi joined HelloFresh in 2018, so too has the number of staff he and his colleagues must provide with easy-to-digest data-driven insights. “It’s really challenging, especially when the organization is changing so much, going from 1,000 employees to 10,000 employees,” Secchi says. “What is the right form of communication? What is the right level? That has been my most interesting challenge which I face at HelloFresh.” “I don’t have the perfect recipe,” he continues. “It’s a combination of things. You have to constantly change not just the way [we communicate, or] the frequency, the tool, the tone of voice, the type of content. We have to change everything and be ready to change everything.” In response to this challenge, Secchi says he carefully monitors the traffic HelloFresh’s analytics portals get to gauge which types of insight business stakeholders engage with the most. He then uses this information to tweak his approach to ensure staff can access the most timely, relevant insights. Secchi Sees Upskilling as a Weapon for Retention For Secchi, creating processes that enable staff to continually improve their skills and share knowledge has also been fundamental to the analytics team’s success. He says this is about more than building teams that deliver results; it’s also a weapon for staff retention. 
“You have to try to make it part of your ongoing ceremonies,” he recommends. “So, you finish a task, a job, an analysis, a research, and then attached to this research is also the sharing part with the rest of the team. It [must be] part of the normal lifecycle of every task.” Secchi says there are two sides to upskilling analytics staff at HelloFresh. One is technical and the other relates to ‘soft skills’ such as data storytelling. “If I had to choose one single coding language,” he says, “that’s Python, for us, because it allows us to automate a lot of processes and we upskill literally every analyst which we have in the team.” “Without a very advanced knowledge of Python, we would never have been able to do software engineering, but also more data science, more business intelligence,” he adds. “Everything we are doing today is based on Python.” While staff can easily find courses on the technical side of data visualization from any of the major platform providers, Secchi says learning how to use those charts to tell a compelling story requires domain expertise. It’s something analysts must learn ‘in the field’. “You need to know the principles of storytelling,” he explains.

Aug 5

30 min 26 sec

Thierry Grima, Group Chief Analytics Officer at ENGIE, reveals how he’s driving engagement with the company’s analytics strategy on the Business of Data podcast Global electricity utility company ENGIE is transforming in more ways than one. Not only is it transitioning toward an operating model built on renewable energy sources, it’s reimagining its business practices for the age of data and analytics. In this week’s Business of Data podcast episode, ENGIE Group Chief Analytics Officer Thierry Grima shares his experiences of leading the company on this journey and galvanizing staff around his analytics vision. “We established a CDO community with 30+ members and we help them to provide data sources,” he says. “What we are here for, for me, is really to bring the ‘glue’ that will help people to connect together and see value in those connections to IT.” For Grima, building awareness of what ENGIE is doing with data and how staff can use it to drive value for their own business units is an integral part of a Chief Analytics Officer’s role. Driving Change with Gamification and eLearning “Data is the new ‘sexy’, and it’s really important for us to ensure that our people know that, and they can actually play their role,” says Grima. “They all have to play a role.” One of the most attention-grabbing ways ENGIE is driving awareness of its analytics strategy is through the creation of a ‘data game’. The company created a mobile app that allowed people to challenge their colleagues to ‘duels’ where they answered questions about the company’s data and analytics initiatives. Winners were awarded points and competed for prizes on a company-wide leaderboard. “Over three weeks we played something like 200,000 games,” Grima recalls. 
“It was really interesting to see how people were playing and gain some understanding of what ENGIE does with its data and how it helps to bring value to reduce cost, to find new revenue streams.” Alongside the ‘data game’, the company is also delivering online training courses for staff across the organization to improve their data literacy levels. Grima says engagement with these courses has been particularly good throughout the pandemic. Creating a Repository for Analytics Use Cases Alongside working to raise awareness of what ENGIE’s staff can do with data, Grima’s team has created a ‘data marketplace’ to act as a central repository for the company’s analytics use cases. “What you can find there is quite simple,” he says. “You find the definition of the program itself. So, what do we do? Why are we here? And what are the different parts that we cover from a strategy and organization standpoint, but also from a technology or data science standpoint?” “We also gather all the use cases depending on their state in the lifecycle; so, the development state they are in,” he adds. “At the moment we have more than 300 use cases in this repository.” “We share the use cases so that everyone knows exactly what other [staff are doing],” he continues. “But also, if they want to understand something, they [will] know exactly where it’s been developed already or anything that comes a bit close to what they want to build.” Through breaking down the silos between ENGIE’s different business units in this way, Grima hopes to catalyze greater analytics innovation across the organization. His strategy is to couple this with his team’s broader data literacy and awareness initiatives to drive analytics adoption across the group.

Jul 30

28 min 35 sec

How 300+ C-Level Data Executives in the Americas, EMEA and Asia Pacific are Managing Enterprise Data Assets to Fuel Reliable Data-Driven Decision-Making. Effective business decision-making depends on providing staff, business intelligence (BI) tools and AI or analytics models with data that’s accurate, consistent and framed with the right context. We call this ‘data integrity’, and our first Data Integrity Trends survey seeks to measure how effectively enterprises are doing this across the world. This report summarizes what we discovered to paint a unique picture of how successfully enterprises are establishing and maintaining bases of high-integrity data to fuel their data-driven transformations. Key Findings: Data teams spend an average of 40% of their time on data cleaning, integration and preparation. 35% of respondents say staff will trust a data-driven insight that conflicts with their own intuitions. 88% say a lack of staff with the right skills is creating challenges for their data integration projects. 88% have started building automation into their data management processes. 82% say data quality concerns represent a barrier to their data integration projects. 80% find it challenging to ensure data is enriched consistently at scale. To download the full report, please visit:

Jul 21

26 min 5 sec

Joe DosSantos, Chief Data Officer at business analytics platform Qlik, outlines why he believes the company’s cloud-based offering is the future of business analytics The advent of cloud-based software as a service (SaaS) lowered the barrier to entry for all manner of business analytics capabilities. The cloud makes it much easier for companies to experiment with and acquire data-driven tools by removing the need to build and maintain software products in-house. But as Joe DosSantos, Chief Data Officer at business analytics platform Qlik, notes in this week’s Business of Data podcast, it’s only recently that many data-focused executives have started to get comfortable with cloud-based technologies. “People were a little bit nervous about the cloud,” he says. “Generally speaking, people have been born and raised in the data area to be very afraid of moving data anywhere where they can’t control it.” In recent years, attitudes toward cloud-based platforms and services have changed dramatically. For DosSantos, this is partly down to how greatly these technologies have matured. “The tools are out there, now,” he says. “People have known and loved Qlik for a long time. But what’s new and different is, ‘How do I start to get comfortable with the idea of my data being somewhere ‘out there’?’” Qlik’s ‘Italian Cooking’ Approach to Business Analytics DosSantos says a key benefit of doing analytics in the cloud is that it makes it easier for company stakeholders to access the data they need and connect datasets to uncover valuable business insights. “It’s all about ‘time to value’, at the end of the day,” he says. “How do I take this data and make sense of it more quickly? So, SaaS is fundamentally a way to get there faster.” To illustrate his views about how best to approach this, DosSantos uses the analogy of French versus Italian cooking. French recipes are sophisticated and require detailed knowledge from the chef about their ingredients. 
But Italian food is about simplicity and the quality of the ingredients. “In the data lake era, we kind of let everyone fend for themselves,” he says. “We said, ‘Go and grab raw data and figure it out.’ It was French cooking.” He adds: “What we’re trying to do is roll out this idea that what you want to do is put the best data that’s already been finely curated out there, so people can get the answers quickly.” By focusing on taking high-quality data and making it available to people at the right time to inform key decisions, DosSantos believes companies can maximize the value they drive with business analytics in the cloud. “Decisions must be part of one’s calculus,” he concludes. “At Qlik we call that active intelligence. It’s not good enough to know something. One must do something with that which you know.” Key Takeaways The future of analytics is in the cloud. SaaS is helping enterprises make data available to company stakeholders, so they can use it to uncover valuable business insights. Provide access to high-quality data. DosSantos argues that data leaders should focus their efforts on making high-quality data available to as many company stakeholders as possible. Active business intelligence is the key. Enterprises must integrate insights with business processes, so they inform the decisions staff members make. Other Quotes “People have known and loved Qlik for a long time. But what’s new and different is, how do I start to get comfortable with the idea of my data being somewhere ‘out there’? And the tools are all there, now... and I think now the expectation is there.” “Analytics is fundamentally about the discovery of new things and the connecting of new data... 
so, one of the things that we had to do was to make sure that the security was super intuitive, clear, understandable, and that we offered people a really complete way to understand what kind of data assets were being made available.”

Jul 14

30 min 20 sec

Austen King, Global Head of Data and Analytics at Clyde & Co, argues that building a collaborative culture is key to successful digital transformation For many businesses, digital transformation is a catalyst for recovery as we emerge from the pandemic. However, for too many people, digital transformation is seen as a job for the IT department. In this week’s episode of the Business of Data Podcast, Global Head of Data and Analytics at Clyde & Co Austen King argues that successful digital transformation requires everyone to take ownership of the process. “You can't just outsource the ownership of your business,” King says. “The data is the business.” Law firm Clyde & Co was founded almost 90 years ago. For King, transforming a business with a long cultural memory and entrenched legacy systems was not without challenges. “We have a lot of processes and procedures,” King notes. “It can be challenging to migrate [to the cloud] because they are legacy systems and there's a lot of things to do.” However, the long cultural memory is an advantage in other ways. King and his team were able to infuse their transformation initiatives with the experience of the business. “There's a maturity of instinct within the firm to do things in certain ways. The challenge is to try and take those elements of instinct and then apply those to the new system,” he says. “When you distill that down, you can actually get some great insight from people.” However, creating enthusiasm for the digital step-change is not always easy. King recommends creating a formal structure for feedback – this improves performance and gives staff a sense of ownership. “It’s largely about trying to communicate, being transparent, letting people know what you're doing and why you're doing it in a particular way, and giving the opportunity for people to give their feedback,” King advises.

Jul 8

28 min 17 sec

Jordan Levine, MIT Lecturer and Partner at Dynamic Ideas, outlines why he believes executives and regulators must do more to combat AI bias – and what they can do about it. When the EU announced its proposed new AI legislation in April 2021, the bloc touted the new laws as a necessary step to ensure Europeans can trust AI technologies. But for Jordan Levine, Partner at consulting firm Dynamic Ideas, the proposals are something of a ‘blunt instrument’. In this week’s Business of Data podcast, Levine argues that this kind of legislation is, at best, a starting point. It’s up to AI-focused executives to sit down and implement practical frameworks for ensuring AI is used responsibly in their organizations. “I'm 100% supportive of the government getting involved in establishing the rules,” he says. “[But] I hope that both academics and business [and] society-conscious individuals get excited and say, ‘OK, how do we refine this?’” In Levine’s experience, there are many things that can cause ethical issues when enterprises put AI or analytics models into production. That’s why much of the work he does at Dynamic Ideas is geared toward educating people about AI bias challenges. He says it’s important for businesses to have both clear mitigation strategies to combat ethical issues such as biased decision-making and the right tools or technologies to orchestrate those strategies in practice. “What I try to do is show how to mitigate those issues and then show actual techniques that exist today, [so] that you can leverage open-source software to do the processing,” he says. Levine argues that business leaders must use a framework like the one he’s developed to make sure they are aware of the ethical issues that may arise from the ways they’re using AI and analytics. This will allow them to take steps to make sure these issues are addressed. “I hope they can use this framework to actually challenge their analytics groups,” he says. 
“To actually sit down with the individuals writing the algorithms and confirming whether the issue does or does not exist.” However, Levine concedes that no framework for combatting AI bias can ever really be complete. Technology is constantly evolving, and enterprises are constantly innovating with it. So, AI-focused executives must be vigilant and reevaluate their AI practices regularly with an ethics lens. Levine concludes: “The more precise that we can get in terms of bias and ethics, and the more discrete issues we can identify and then think through how to mitigate them and show examples of mitigation, I think, the better we all are.” Key Takeaways · Regulatory compliance is not the same as ethical behavior. Enterprises must go beyond what’s required of them by law to ensure their AI practices are ethical · Executives must be aware of potential ethical issues. If executives don’t know the specific risks that come with adopting AI technologies, they will struggle to ensure the right processes are in place to mitigate them · AI ethics frameworks must be updated regularly. AI-focused executives must constantly reevaluate their AI ethics strategies to ensure their teams are following current industry best practices

Jul 1

26 min 28 sec

AXA Investment Managers Global Head of Customer Insight, Web Experience and Analytics Brian Stewart argues that the time has come for asset management firms to innovate their customer experiences The asset management industry is not famous for its innovative customer experiences. However, in a competitive market even asset management firms can’t be complacent about fast-changing customer expectations, argues AXA Investment Managers Global Head of Customer Insight, Web Experience and Analytics Brian Stewart in this week’s episode of the Business of Data Podcast. Helping AXA Investment Managers to understand not only what their customers buy, but why they buy it is at the core of Stewart’s mission, and data is the key. “[The industry is] now really starting to understand the importance of this customer data, because [customer] interests are probably changing much faster than the industry can change, traditionally,” Stewart says. “It’s leading to a transformation within our industry to become more agile and to really start to try to understand the things that people want to invest in,” he continues. Like many organizations, AXA Investment Managers accelerated their digital transformation initiatives because of the pandemic. As customers began to interact with the firm more online, it gave them access to richer and more plentiful customer data. “We've been able to collect an awful lot of information and data which we never had pre-pandemic,” Stewart notes. “Whether that be through webinars or online events, through our websites, our fund center, and so forth.” Previously, data from different sources had been siloed in different systems and deployed in various models. Stewart’s team centralized their datasets and linked their marketing automation tools to a new CRM to create a fuller picture of their customers’ behavior. At the same time, Stewart’s team helped to refresh the customer experience on their websites with a focus on easy access. 
This led to a massive increase in online traffic and subscriptions, a key metric for the firm. “That has led to more people coming to our website, but then more people go into our fund center,” Stewart says. “The fund center is the key part of our website where [you get] information on the funds that we sell and how they're performing.” According to Stewart, the key to the success of their strategy has been breaking down data siloes and connecting tools to a centralized CRM. “Do not do things in silos,” Stewart warns. “It really just makes a rod for your own back. So, map it out, think it through, and then go for it.” Key Findings Follow your customers’ expectations. Even industries that have lagged in the past need to keep up with rapidly changing customer behavior. Break down data siloes. If you can’t link your data then you won’t be able to build a complete picture of your customer. Link your tools. Connecting the dots between your internal systems will produce more meaningful, data-driven insights.

Jun 25

29 min 52 sec

Lloyd’s Chief Data Officer Simon Asplen-Taylor shares his tips on realizing the potential of data to drive business decision-making When putting together their strategy to become more data-driven, Lloyd’s did something unusual. They published it. In this week’s episode of the Business of Data Podcast, Simon Asplen-Taylor, Chief Data Officer at Lloyd’s, argues that the publication of their data-driven strategy was an essential first step to turning theory into practice. “When you're doing something across a market, what you have to watch out for is that everyone understands what the overall strategy is, so we wrote a data-driven strategy called Blueprint Two,” says Asplen-Taylor. “And that's unusual. I think most data strategies are internal. This is very much external.” As they refined the strategy, Asplen-Taylor and his team crowdsourced feedback using a tool to promote engagement, encourage feedback and create a more complete product. The result was a document that accounts for the priorities of multiple stakeholders in the Lloyd’s universe. “If I said to you, ‘I know the answer’ to something, you might well then start questioning me, but if we work together on an answer, it feels a bit more inclusive,” Asplen-Taylor says. “You have to be prepared to learn new things and understand that there may be challenges you didn't know about.” Building confidence in the initiative is another crucial step towards success. To do this, Asplen-Taylor recommends focusing on overall objectives, especially if the person is non-technical. It’s a process that he compares to watching a movie. “[When] you watch a [movie] you don't necessarily know how it was all put together. But if someone forced you to watch the ‘making of’ the movie before you watched it, then actually it wouldn't be so exciting,” Asplen-Taylor quips. “Start with the story and explain it in their language.” Key Findings Collaborate on your strategy. Getting feedback and engagement from key stakeholders will build trust and produce a better result. 
Take a consultative approach to implementation. Understanding what drives your stakeholders may uncover issues you hadn’t considered. Speak the language of your audience. If CEOs care about increasing revenue, reducing costs, and improving customer satisfaction, then tell them how your initiative will achieve that.

Jun 17

29 min 2 sec

Business of Data Podcast

Jun 9

33 min 22 sec

Louise Maynard-Atem, Data Insights Lead at GBG, shares her tips on implementing agile methodology to drive innovation in the wake of the pandemic Born in the realm of software development, agile methodology has been growing in popularity across a wide range of business functions in recent years. In this week’s episode of the Business of Data Podcast, Louise Maynard-Atem, Data Insights Lead at identity verification, location intelligence and fraud prevention company GBG, argues that an iterative, collaborative approach to data and analytics will help to drive innovation and demonstrate business value as we emerge from the pandemic. “[Agile] helps us innovate faster. It helps us to surface the problem quicker and utilize data more effectively,” says Maynard-Atem. “But it wasn't until we really had to put agility into practice quickly, because necessity meant that we had to, that we realized the importance of it.” The Pandemic Highlighted the Importance of Business Agility If there’s one thing we’ve learned in the last 12 months, it’s that you never know when you might need to transform the way your business operates. “Agility really is king. It’s king because you never know when you are going to have to make a pivot, make changes to your business model, make changes to your ways of working and make changes to what you're doing with data,” says Maynard-Atem. “It’s taken the global pandemic, I suppose, to really bring the need for agility into clear focus.” The advantages of rapid action in a turbulent market have demonstrated the value of agile thinking to business leaders. Driving Innovation with Agile Methodology However, as things begin to settle, Maynard-Atem says that agile thinking and, more specifically, agile methodology, will drive innovation in data and analytics. 
“I think innovation, agile thinking and agile practices go hand-in-hand because innovation is ultimately [about] trying to do something new,” says Maynard-Atem. She continues: “We want to make sure that we're not just taking a waterfall approach. We're taking small incremental steps and pulling in the feedback loops – and that’s ultimately what agile teaches you.” However, for organizations used to long development cycles and multi-year digital transformation initiatives, the fast-paced iterative nature of agile can seem like an unlikely partner. “It seems as though a lot of organizations feel like they're under pressure to deliver a big transformation program, but I don't think that's the best way to deliver in terms of data and analytics,” says Maynard-Atem. “And certainly not from an agile perspective.” Instead, Maynard-Atem recommends looking for manageable, well-defined experiments to test hypotheses, and pulling in feedback loops. “It's just breaking it down to those manageable chunks and being really specific about what each experiment is going to deliver, what that value means, and then how [you will define] success,” she says. Key Findings Agility was a critical success factor for businesses during the pandemic. As companies rapidly pivoted their operations, the ability to think and act quickly was a key differentiator. Agile methodology empowers innovation. Experimentation and rapid iterations are the hallmarks of both innovative thinking and agile methodology. Break down large initiatives into manageable chunks. By looking for the smallest experiment possible you can demonstrate value more quickly.

Jun 3

29 min 18 sec

eBay Head of UK Analytics Amit Agnihotri explains how eBay is improving customer experiences by using NLP to tap into a vast trove of previously unused data Selling items on eBay is supposed to be convenient for both buyers and sellers – but should a dispute arise, eBay needs to know when to step in, and how to resolve it. In the past, users would have to manually create a ticket and wait for a response. Now, eBay is using natural language processing (NLP) technology to analyze member-to-member (M2M) messages to predict outcomes and tailor its remediation. In this week’s episode of the Business of Data Podcast, eBay Head of UK Analytics Amit Agnihotri explains how they are doing this, and the big plans they have for the future of NLP at eBay UK. Tapping into unstructured data Over the past 20 years, eBay has radically improved its analytics capabilities. However, this has been primarily based on numeric data. Using advances in AI technology, eBay is now targeting the vast amount of unstructured data its platform generates – specifically, M2M messages. “When we do analytics, we are mostly dealing with numbers. How many users are there? What is the length of the time? So, it's very numeric,” says Agnihotri. “What we have ignored is the data available on natural language, which is almost a hundred times bigger.” By tapping into this vast repository of unstructured data, Agnihotri’s team can provide analytics to enable eBay to act more effectively to resolve disputes, make inferences about customer satisfaction, and reduce the rates of customer churn. Taking account of cultural differences Language and communication are highly culturally specific, and as a result, any approach to NLP must consider cultural differences in communication styles before inferring meaning. “One of the challenges that I see is the culture-specific challenge,” says Agnihotri. 
“For example, the same sentence from a German customer and a UK customer could mean something very different.” Understanding the nuances in how people communicate dissatisfaction is essential to preventing customer churn, a process Agnihotri compares to ‘death by a thousand cuts’. “We call them paper cuts. The smallest paper cuts occur, and they add up,” Agnihotri says. “And then one day, [the customer] decides to search for some other platform.” Getting started with NLP The path to NLP involves a certain amount of trial and error. For those who want to experiment with NLP, Agnihotri recommends an iterative process focusing on step-by-step improvement. “If anyone is trying to get into NLP and using that as a learning tool, don't expect that you will get a very quick answer,” Agnihotri advises. “Be patient on this. This is a huge tool, like 10 times, or a hundred times more data than we have on natural language compared to the numbers. It will take time.” One strategy that Agnihotri suggests as a starting point for NLP is monitoring social media to ‘listen’ for feedback about your company. “On Facebook, Twitter, a
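As a toy illustration of the idea Agnihotri describes – turning free-text messages into a churn-risk signal – here is a minimal keyword-based sketch. The term list, function names and scoring scheme below are invented for illustration only; eBay's production systems use trained NLP models, not keyword matching.

```python
# Illustrative dissatisfaction detector for member-to-member messages.
# A real NLP pipeline would use a trained model; this toy version only
# shows the concept of turning free text into a churn-risk signal.

# Hypothetical list of terms that might signal dissatisfaction
NEGATIVE_TERMS = {"refund", "broken", "never arrived", "disappointed", "complaint"}

def dissatisfaction_score(message: str) -> int:
    """Count how many negative terms appear in a single message."""
    text = message.lower()
    return sum(1 for term in NEGATIVE_TERMS if term in text)

def paper_cuts(messages: list[str]) -> int:
    """Small negative signals – 'paper cuts' – accumulate across a
    customer's message history, hinting at eventual churn."""
    return sum(dissatisfaction_score(m) for m in messages)
```

Even a crude signal like this makes the 'paper cuts' metaphor concrete: no single message is decisive, but the running total identifies customers drifting towards churn.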

May 20

29 min 10 sec

HSBC is modernizing its fraud and money laundering detection capabilities by rolling out algorithms designed to identify ‘bad apples’ in the most efficient way possible Digital transformation is not unique to businesses. Since the onset of the COVID-19 pandemic, criminals have also moved their activities online. Now, HSBC is using data to fight back. In this week’s episode of the Business of Data Podcast, Francisco Mainez, HSBC’s Head of Data and Analytics, Business Financial Crime Risk, explains the motivations behind this global initiative. “We need to find something that will divert our attention to the customers that we really wanted to analyze. The ones that have the potential [to be] the bad apples in your basket,” says Mainez. “And also, that’s going to have a knock-on effect on cost efficiency.” Using Data to Identify High-Risk Customers Historically, customer-focused assessments looking for fraud and money laundering might involve the manual review of thousands, if not tens of thousands, of people and accounts. This process was costly, inefficient and time-consuming. By employing an algorithm to give individual customers a personal score based on predetermined risk factors, HSBC can quickly identify high-risk accounts. “What the algorithm does is embed different key risk indicators,” says Mainez. “Are you moving countries? Are you transacting with virtual currencies? Are you over a certain age?” He continues: “In the old world, we would be looking for things that we know for a fact from previous experience could be suspicious. With this [algorithm], you’re scoring customers because you’re actually measuring customer behavior.” Improving the Efficiency of Fraud Investigations By using data to identify high-risk accounts, HSBC is making sure that its investigative resources are being used as efficiently as possible. 
“You don’t want to spend 80% of your time, energy, budget and resources, especially on the operation segment, checking a false positive. Then 20% of the time rushing to find out if those customers are the ones that you’re looking for,” Mainez says. “We wanted to reverse that.” He continues: “We’re going to spend minimal time because the machine is going to help us make a decision on which customers we need to review. And then we’re going to spend the rest of the time properly analyzing the customers.” Mainez points out that this initiative is designed to assist human decision-making, not replace it. The human element of fraud detection is still essential, especially when it comes to adapting a global initiative to local realities. Taking Stock of Cultural Factors Big financial institutions typically work in a very decentralized way. To make this global initiative successful, Mainez knew that the algorithm would have to take account of local and cultural factors. “You also need to take into consideration cultural factors,” he says. “Every country is going to have to worry about their own typologies, not the ones from [any other] country because that’s going to produce false positives that they’ll have to review.” By asking individual regions to specify the cultural or regional typologies that best indicate risk, Mainez can tailor the algorithm to that region. “You’re going to tell me which are the typologies that are keeping you awake at night,” he says. “We want to help you by configuring the system in a way that can detect those types of behaviors.” The new initiative has already been rolled out in several regions, but the future has plenty in store for Mainez and his team. “Over the next few months, we’ll be deploying in more markets and continually tweaking those typologies,” Mainez says. “Of all possible times, we started to roll this out in the middle of COVID.” He concludes: “[Criminals] are adapting to a more digital and remote environment. 
That’s reflected in the data, and we need to be able to figure out those typologies and how they are evolving.”
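The indicator-based scoring Mainez describes can be sketched in a few lines. This is a hypothetical illustration only: the indicator names, weights and review threshold below are invented, not HSBC's actual risk factors, and a production system would calibrate such weights against measured customer behavior.

```python
# Hypothetical sketch of key-risk-indicator scoring for customer review.
# Indicators and weights are illustrative inventions, not HSBC's.
RISK_INDICATORS = {
    "recently_moved_country": 2.0,
    "transacts_in_virtual_currency": 3.0,
    "over_age_threshold": 1.0,
}

def risk_score(customer: dict) -> float:
    """Sum the weights of every indicator the customer triggers."""
    return sum(w for k, w in RISK_INDICATORS.items() if customer.get(k))

def flag_for_review(customers: list[dict], threshold: float = 3.0) -> list[dict]:
    """Return only customers whose score meets the review threshold,
    so analysts spend their time on the highest-risk accounts."""
    return [c for c in customers if risk_score(c) >= threshold]
```

The point of the sketch is the workflow inversion Mainez describes: instead of analysts checking every account, the score pre-filters the population so that investigative time goes to the accounts the model ranks highest.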

May 12

30 min 34 sec

The co-founders of Black in Data join us to discuss why they founded a collaborative movement to promote equal opportunities and representation for people of color in data and analytics Our guests for this week’s podcast are on a mission. Devina Nembhard and Sadiqah Musa, both senior analysts at the British newspaper The Guardian, are co-founders of a newly-launched community, Black in Data, designed to accelerate the careers of people of color in data and analytics. Born of the turbulent events of 2020, including the murder of George Floyd in the US, Black in Data exists to provide mentorship, inspiration and a community to people of color seeking a career in data. “The idea is that we get data professionals of color together in one place to network, for example, to meet each other, to share ideas, tips, and hints about what they’re doing in their data world,” says Musa. “Overall, the idea is for us to increase ethnic representation within the data industry.” “I think it’s always been time for an organization like Black in Data,” adds Nembhard. “And it’s clear from people’s reactions when we invite them to the group that it’s something that everyone’s been really thirsty for.” Accelerating the Careers of People of Color It’s no secret that people of color are underrepresented in a range of professional and academic fields, in particular those that draw on graduates of science, technology, engineering and math (STEM). The trend manifested in a very personal way for Sadiqah Musa as she embarked on her career in data. “I had been working for well over 10 years and I had never worked with another black female. I just felt like I did not belong in any of the workspaces that I’ve been at,” says Musa. “And it’s not because of anything that I was doing wrong or anything that my colleagues were doing wrong. I’ve worked with some really amazing people. 
But something was just missing.” Black in Data provides a ready-made network for people of color to make connections, receive advice and support and even find employment opportunities. “If you want to access the diverse pool of candidates, you have to go to the right place, that’s why with Black in Data we have set up a jobs board,” says Musa. “We’ve got a fantastic group of people that are super qualified. We are here. Find us.” Providing Training and Mentorship Black in Data is about more than simply networking and finding new roles. Musa and Nembhard are also passionate about helping their members develop their data and analytics skills. “The mentoring, for me, I think is the part of the organization that I feel most passionate about,” says Musa. “I found when I started out I had nobody to reach out to, to ask questions.” She continues: “So, we are running a three-month mentoring program where we asked the mentors and the mentees to meet at least once every month for an hour. And it’s completely mentee-led.” “We offer a training program as well,” adds Nembhard. “The whole point of the training scheme is to give them those skills. Teach them SQL and Python, teach them advanced analytics, teach them how to visualize data. And then they can just make the leap into the data world a bit easier.” If you would like to join the Black in Data community, or if you are looking to support their initiative, you can find them at Black in Data. Key Findings Black in Data is a newly-created movement. Its mission is to support the careers of under-represented communities in data and analytics. It’s a place where you can develop your skills. From the ‘data visualization challenge’ to training and mentorship, Black in Data can help you develop your career. A ready-made community for people of color. It’s also a place to network, share tips and ideas, and make new friends.

May 5

31 min 30 sec

Former Executive Director of Enterprise Information Strategy and Risk Management & Global Data Protection Officer at Bristol-Myers Squibb on what it takes to emerge stronger from the pandemic In this week’s episode of the Business of Data Podcast, Kamayini Kaul, the former Executive Director of Enterprise Information Strategy and Risk Management & Global Data Protection Officer for Bristol-Myers Squibb, joins our host Catherine King for a wide-ranging conversation about how data and analytics teams and individuals can put their best foot forward as we emerge from the pandemic. “Sometimes you have to go slower to go faster,” Kaul says. “Knowing when to slow down before you can accelerate with the synergies that are arriving as part of a new team, new company, new organization, I think is a huge learning for me personally.” How data teams can emerge stronger after the pandemic The path to recovery for organizations severely affected by the pandemic will differ from business to business. Having the right data strategy will be critical to success, regardless of industry, as Kaul notes. However, it will be up to data and analytics leaders to define the correct path. “Data strategy is going to be front and center for all industries and all data and analytics [teams],” she says. “But, for data and analytics to drive that recovery, I think every leader at their level is going to need to introspect and say, ‘what is our data strategy?’” In addition, Kaul believes that the pandemic has highlighted the benefits of cooperation between individuals, companies and even countries – a lesson that can benefit data teams and their practice. 
“The pharma [industry], medical device manufacturers and provider networks are cooperating on an unprecedented scale, not just within countries, but globally to bring to bear both treatments as well as vaccines,” Kaul notes. She continues: “The data space, and the ecosystems of the seamless exchange of data, quality data, trusted data and the ways in which data and analytics professionals enable that for their enterprises is going to be another focus.” Facing new challenges as an individual Of course, for many people, the pandemic has meant drastic personal change. A change of location, a new job, or even a new career. For those people, Kaul has some advice. “Get comfortable with tech, digital and data,” she says. “It is very much a part of the next industrial revolution. If you happen to be in the field [already], find ways to make sure that you're bringing everyone else and their level of proficiency along with you for the ride.” She concludes: “I think that that could be the biggest differentiator in all of us as data and analytics professionals, trying to make a dent with being a data-driven organization, culture, or a society.” Key Findings Slow down to go faster. Getting the basics right will help you scale faster when the time comes. Cooperation is key. The pandemic has highlighted the benefits of cooperation between companies, industries and even countries. Help raise standards of data literacy. Understanding the world of data will be instrumental to success in the information age.

Apr 29

35 min 7 sec

Wendy Zhang, Director of Governance and Data Strategy at Sallie Mae, discusses why companies must put the right culture, value and quality into their AI initiatives Since Gartner’s famous proclamation that 85% of AI projects end in failure, the maturity of enterprise AI functions has improved dramatically. But the high number of projects that continue to end in failure suggests that many companies are still getting the basics of AI development wrong. In this week’s Business of Data podcast, Wendy Zhang, Director of Governance and Data Strategy at banking company Sallie Mae, shares her views on why so many AI projects don’t deliver results. For Zhang, common issues like poor data quality, trouble identifying valuable applications for AI and lack of buy-in for investing in or adopting AI technologies are symptoms of a more basic problem. “There are a lot of different reasons [AI projects fail],” she says. “But it all starts with a lack of fundamental understanding of AI, what it is and what it can or cannot do.” AI Success Starts with Asking the Right Questions Zhang warns against doing AI for the sake of AI. She argues that companies must start with the business challenges they need to solve before considering what value AI might bring to these initiatives. “The next [thing] you have to really assess is, is this something that AI can actually do?” she continues. “Is this appropriate for the business problem you’re going to solve?” Once an AI-focused executive has identified projects that could benefit from AI-driven technologies, they must consider what they need to deliver these projects successfully. This includes assessing what resources, funding, datasets and support they’ll need for each project. “It’s really got to become the company’s DNA,” Zhang adds. 
“It requires people to really look at a lot of your business processes and to think about different possibilities, and that requires mindset change.” “It’s not so much working and doing the same things over and over and just automating a few things and having AI on the side,” she concludes. “If you really want to get a massive benefit, you have to be able to experiment and fail and also incorporate that into your core business.” Simpler is Often Better for AI Beginners When companies are new to AI, they typically don’t have fully formed strategies for adopting these technologies. It’s more common for enterprises to begin experimenting through trial and error to discover the types of AI systems that are relevant to them. When starting out on this journey, it’s good practice to start with projects that can be delivered using data the company already has. The sooner they are implemented and delivering value, the better. Similarly, Zhang notes that simpler AI models can be easier for fledgling AI teams to deliver. Even for more advanced AI functions, she warns against overcomplicating AI systems unnecessarily. “I think of simple models as, in plain terms, you get more bang for your buck,” she quips. “The more complicated the models are, the harder it is to have a higher interpretability.” “The other component is having the right people,” she adds. “It’s important to build AI capabilities in-house. However, when you first start out with a pilot project, it might be beneficial to get external help, just so that you can get the ball rolling and gain some momentum.” “You really have to go through a lot of trial and error,” she concludes. “Start with pilot projects to score some small wins to get some buy-in and build your credibility to get faith for your team.” Key Takeaways • Start with the right questions. Only embark on AI projects when they’re the best answer to a pressing business question • Find use cases you can deliver with what you have. 
Identify what data, resources and support you’ll need before you begin • Simpler is often better. As any engineer will tell you, the more parts something has, the more bits there are that can break

Apr 21

32 min 28 sec

Director of Data Science and Analytics Guy Taylor shares his tips on scaling data and analytics initiatives from solid foundations and developing a sound data culture Scaling data and analytics initiatives successfully can be a challenge - even for businesses with a rich data culture. In this week’s episode of the Business of Data Podcast, Director of Data Science and Analytics Guy Taylor argues that scaling such initiatives successfully relies on strong data foundations, tying data and analytics initiatives to business incentives, and understanding the unique data context of your organization. “One of the big learnings that I’ve had is that having this kind of cookie-cutter, one-size-fits-all strategy really doesn't work,” Taylor says. “It’s really important to understand your current state and your current context. I think that is the thing that I’m pointing to which I hadn't fully taken into account. Context is absolutely everything.” How Data Culture and Data Context Interact To scale data and analytics initiatives successfully, Taylor recommends developing a data culture that focuses on breaking down traditional silos and democratizing data use. This can be a challenge for many organizations, especially given that data contexts vary widely across industries. “It all comes back to the culture,” Taylor explains. “In the banking environment, for example, because of the regulation and because of the way that data is really considered to be a key asset, what you see is power dynamics built up around data fiefdoms and people really wanting to hold on to control of the data.” He continues: “What I’m seeing in the start-up culture, with its culture of high growth and rapid acceleration, is the exact opposite. It’s that everybody has access, and everyone can do everything within the regulatory frameworks that do exist.” Building on Strong Data Foundations The work of building a strong data culture and shoring up data foundations never stops. 
Indeed, because the data landscape is constantly evolving, so too must data culture evolve. However, striking a balance between driving value through data and analytics initiatives while continuing to build strong data foundations can be tricky. Taylor says that communicating effectively with key stakeholders on the importance of solid foundations to the ultimate success of an initiative is imperative. “It’s about figuring out what the incentives are. Because without aligning with those objectives, you’re dead in the water,” Taylor says. “You need to figure out what the incentives are on a business level, at a social level and at a personal level, and align to those.” He concludes: “If you can figure out how you can inject your ‘how’ into their ‘why’ then you're both winning.” Key Takeaways Data culture is everything. The success, or failure, of data and analytics initiatives relies on a democratized data culture. Build your data foundations. To scale initiatives successfully, data and analytics initiatives must be built on solid foundations. Align your ‘how’ with their ‘why’. Demonstrating how data and analytics initiatives will achieve business goals is the best way to win support.

Apr 15

32 min 57 sec

Adrian Pearce, Group Chief Data Officer at Credit Suisse, outlines how he balances consistency with flexibility while advancing his data strategy across the firm’s many and diverse business units For organizations with tens of thousands of employees, getting everyone pulling in the same direction on data strategy can be a huge challenge. Orchestrating a group-wide vision of the future requires a delicate balance of consistency, transparency and flexibility. In this week’s episode of the Business of Data podcast, Credit Suisse Group Chief Data Officer Adrian Pearce shares his approach to striking this balance to achieve the firm’s data strategy goals. “If you’re overly prescriptive, you end up with 80% of the people telling you why it doesn’t work for them,” he says. “The challenge is being flexible enough while making sure you drive a common direction.” Balancing Data Strategy Consistency with Flexibility Today, Credit Suisse is focusing on three data strategy objectives: 1) fixing data quality issues and democratizing the data, 2) industrializing data management processes and 3) ensuring data is sourced from the right places and used correctly. While these goals are simple, executing them is not. Pearce gives the example of the firm’s investment banking division and its retail operation in Switzerland to illustrate the differences between how the company’s many divisions and business units use data. “In an organization like Credit Suisse, data isn’t the same for everybody,” he says. “The way we interact with both of those client sets is just completely different.” “You have to do [things] in a careful way,” he adds. “You can’t change direction. You can’t come up with a bigger, better goal every 10 minutes. You need to really be giving consistent information.” For Pearce, the key to success lies in balancing the “non-negotiable” steps toward achieving these consistent organizational goals with flexibility in other areas. 
This helps divisional CDOs to buy into these big projects without compromising their ability to serve the needs of their units. To illustrate this idea, he gives the example of Credit Suisse’s organization-wide data quality initiative. “We have a tool called Data Quality Issue Management,” he says. “It’s non-negotiable. Everybody has to enter their data quality issues in it.” “We’ve managed to drive that consistently across the firm,” he continues. “By being able to explain to the organization the benefits of fixing [data quality issues], the individual CDOs of each divisional function have clearly bought into it.” Key Takeaways Consistency is key in large enterprises. It takes time for a big ship to turn. So, data leaders should pick clear goals that aren’t going to change or move too much Don’t be too prescriptive. Group data leaders must allow divisional or functional data teams the flexibility to meet the needs of different stakeholder groups across the enterprise Secure buy-in for key strategic projects. Affording data leaders flexibility in some areas can make it easier to secure buy-in for ‘non-negotiable’ objectives that will have tangible business benefits

Apr 7

29 min 8 sec

Syed Sameer Rahman, Director of Insight and Data Science at The Royal Mint, discusses how companies can start developing better business models with data Over the course of his 17-year career, The Royal Mint Director of Insight and Data Science Syed Sameer Rahman has used data-driven techniques to solve a wide range of challenges. One of his key learnings in that time is that data is most useful when you look at it through the right lens. This week on the Business of Data podcast, we invite him to share his views on why businesses must use these techniques to build propositions that are based on data-driven insights. “You have a piece of data and you develop your business around that,” he explains. “That is what I call pivoting your business around data.” Rahman gives the example of Klarna, the ‘buy now, pay later’ company, to illustrate how doing this successfully can allow businesses to identify and profit from gaps in the market. He recounts how Klarna rose to prominence using the insight that the supply of consumer credit was drying up in the wake of the 2007 crash. Its founders noticed that consumers still wanted to buy small things to cheer themselves up. So, Klarna identified a market gap – consumers with low risk of default who are interested in buying things now. He says: “That’s a good example of [a company] using market insight, consumer insight and industry insight to identify the market gap and to develop a business out of it.” To achieve this, he says enterprises must understand the business challenges they are trying to solve and build their data strategies around uncovering these insights. As he puts it: “The main barriers, I think, is really in understanding, in data literacy, self-awareness and triangulation.” He argues that insufficient data literacy is the greatest barrier to this kind of thinking in the business world today. 
“One of the barriers really is the interpretation of data, which is linked to data literacy,” he says. “We see, quite often, people using data to manipulate data to get to the decision that they want.” “A good data person will help with triangulation, which is, they’ll look at the various different lenses that we have talked about and then come about in a very unbiased way to a conclusion.” Key Takeaways Data helps companies find their competitive edge. Rahman says businesses should aim to pivot their business models around data Use the right data lens for the job. Business leaders should be clear on the problem they’re trying to solve to ensure they select the right analytical methods to solve it. Promote company-wide data literacy. Rahman argues that poor data literacy is the top barrier to data-driven thinking in business today.

Mar 31

29 min 32 sec

ZestMoney Chief Data Officer Natalia Lyarskaya explains how cutting-edge technology is helping consumers in India access credit where it was previously unavailable The appetite for credit is growing in India. However, compared to developed credit markets like the US, India is underserved. There are only three credit cards per 100 people in India, compared with 32 per 100 in the US. This may be starting to change. In this week’s episode of the Business of Data Podcast, ZestMoney Chief Data Officer Natalia Lyarskaya explains how ZestMoney is using AI and machine learning technology to create a transparent and trustworthy credit solution where traditional banks have been unwilling or unable to do so. “There is a kind of chicken and egg problem where someone needs access to the credit products but has never been in the financial sector before and other banks, traditional banks, cannot evaluate their creditworthiness,” Lyarskaya says. She continues: “We believe that using data and technology, we can build this affordable, transparent, financial product for the Indian people that can be used by everyone and can increase also the trustworthy population in this new credit segment.” Evaluating customers using data, machine learning and AI ZestMoney has created a 100% digital user experience that uses an array of data coupled with machine learning and AI technologies to evaluate new credit lines in a matter of milliseconds. “Based on the A/B testing that we've done, we have collected quite a good amount of data,” Lyarskaya says. 
“[We built] some predictive models that allow us to differentiate between different groups of users, so we can propose different journeys and different options for users to apply for our product.” While the technology behind ZestMoney’s model evaluates new credit applications and makes the final decision on credit approval, it also guides the user on a personalized journey, assessing and modifying questions during the application process based on personal and historical data.

“This [model] is basically behind every decision that we take along the journey,” she explains. “Like, what kind of questions we want to ask a user, or do we want to ask this question in one way or the other?”

She concludes: “There is a model that stands behind that tells us what exactly we need to do and who is the user that we see in front of us. So that is all based, not just on our assumptions, but on what the data has been telling us.”

Key Takeaways
• Machine learning and AI are helping financial firms reach new credit markets in India. Where traditional banks have been slow to react, tech upstarts have been able to capitalize
• Balance privacy and personalization for a better user experience. Understanding how much data an individual feels comfortable sharing is an important first step to creating outstanding user experiences
• AI and machine learning solutions enable better products but do not create them. Human critical thinking is needed to make sure a system works. AI and machine learning make sure that the system is efficient
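To make the idea of a model routing applicants down different journeys concrete, here is a minimal, purely illustrative sketch. The feature names, weights and thresholds below are invented for this example and are not ZestMoney's actual model: a simple logistic score decides how many questions an applicant is shown.

```python
# Hypothetical sketch: routing credit applicants to different application
# journeys based on a hand-weighted logistic score.
import math

WEIGHTS = {"months_of_bank_history": 0.08,
           "monthly_income_thousands": 0.05,
           "prior_defaults": -1.2}
BIAS = -1.0

def completion_score(applicant: dict) -> float:
    """Estimate the probability the applicant suits a short-form journey."""
    z = BIAS + sum(WEIGHTS[k] * applicant.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def choose_journey(applicant: dict) -> str:
    """Route higher-confidence applicants to journeys with fewer questions."""
    p = completion_score(applicant)
    if p >= 0.7:
        return "short_form"    # skip optional questions
    elif p >= 0.4:
        return "standard_form"
    return "guided_form"       # extra questions, more hand-holding

new_user = {"months_of_bank_history": 2, "monthly_income_thousands": 20,
            "prior_defaults": 0}
print(choose_journey(new_user))  # → standard_form
```

In a real system the weights would be learned from application data rather than set by hand; the routing idea, though, matches the "different journeys and different options" described above.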

Mar 24

26 min 30 sec

Danone Global Data and Analytics Transformation Director Camilla Schwartz-Björkqvist explains how she is creating the next generation of data and analytics evangelists at Danone

Large businesses have digitized rapidly over the last few years. As a result, the data and analytics function is more important to business success than ever. However, some people in businesses of all sizes are still skeptical, even fearful, of the ongoing information revolution. In this week’s episode of the Business of Data Podcast, Danone Global Data and Analytics Transformation Director Camilla Schwartz-Björkqvist argues that to address the fear, businesses must help their staff see the benefits of digital transformation by improving their data literacy.

“There's this population, I think, that's in every company [that] will have a healthy skepticism,” she says. “I think the first challenge is [addressing] the skepticism and the fear.” She continues: “You have to make data and analytics accessible to people [because] it creates a lot of fear in the organization when you hear ‘oh, we're going to automate, we're going to introduce machine learning, and AI is coming.’”

Raising Awareness About Data and Analytics

A key part of the transformational work that Schwartz-Björkqvist is doing at Danone is getting more business partners interested in data and analytics. To do this, she created a ‘data bar’ on Danone’s internal social media where she could share educational data and analytics content. “That was one of the key cornerstones of what we did first, [creating] a space where people could come and find us, and find information that they're looking for,” she says. “We wanted to create a place where people wanted to come and hang out.” At the data bar, team members at Danone can listen to podcasts, do some light reading, or even take a masterclass on key data and analytics topics.
“When I set up the first masterclass, I was expecting we would have 20 to 40 people attend, but we had 200!” she recalls.

Creating Unique Training Journeys

While raising awareness about the positive benefits of data and analytics is an important first step, realizing those benefits requires raising the level of data literacy across businesses. For a company with tens of thousands of employees like Danone, this is a considerable challenge. Schwartz-Björkqvist realized that she would need to create a platform that could provide bespoke training journeys for staff regardless of their geographical location or seniority. Thus, Danone’s data academy was born. “The data professionals will get the deep expertise training they need around data governance, and around data science,” she explains. “We’ll have the Python training, and we’ll have really in-depth cool stuff where they also get certified externally – so that is that little extra spice.” Of course, Danone has a large population of staff who are not data professionals. The data academy has a course for them too. “We'll be taking them through a combination of e-learning and workshops depending on where you are in the organization, and how much we believe that [they] will be impacted by data enablement,” she says. She concludes: “Of course, let's not forget the executives, they need to get it – they need to really grasp it – so they are the third population.”

Mar 18

26 min 47 sec

Laura Spencer: How to Scale Data and Analytics Initiatives More Successfully with AnalyticsOps

Mar 12

21 min 49 sec

Pranav Kapoor, Global Head of Decision Analytics Audit Innovation at Manulife, discusses how he’s evolving the insurance firm’s audit function to support continuous auditing and advanced analytics

Automation promises to revolutionize the internal auditing process by enabling teams to continually gather process data that supports auditing activities. As Manulife Global Head of Decision Analytics Audit Innovation Pranav Kapoor notes, this will enable auditors to provide their businesses with more regular assurance about risk management, governance and their internal control processes. In this week’s episode of the Business of Data podcast, he talks about the work his team is doing to make this vision of the future a reality.

“The biggest opportunity we believe is to provide continuous assurance to the business,” he says. “If you can use automation to run these audits pretty much when you desire, or even in real-time, I think that’s the piece where continuous auditing processes become very interesting.” “You can really see a high demand in internal audit teams to push in that direction,” he adds. “Everyone in the business sees the value around it.”

Pranav Kapoor, Global Head of Decision Analytics Audit Innovation, Manulife: “We need to drive the innovation culture and embed digital skills and knowledge into all our auditors, and not just a small team that will be aware of these skills”

As a business function, internal audit (IA) is evolving rapidly. Companies including Manulife are looking at how IA can stop focusing purely on risk discovery and start using automation and analytics to drive innovation. “We want to be the innovative function in the audit group,” says Kapoor.
“In my utopia, the auditors will have analytics skills and the data analytics group, which is my group, will become the innovation function.” To achieve this, Kapoor has been working to ensure Manulife’s auditors have a common definition of what analytics is and educate them about the power of analytics to improve their productivity. Of course, educating staff about the benefits of automation and securing buy-in for analytics projects is the first step in a much larger journey. Kapoor sees these efforts as a starting point for the more ambitious goal of enabling continuous auditing and assurance in the long-term.

Key Takeaways
• Automation is the future of IA. Continuous auditing will allow IA teams to provide the business with audit assurance more regularly
• Data literacy is key. Data-focused leaders must equip non-data staff with the right skills to drive business transformation
• Quick wins come first. Delivering smaller projects that make auditors’ lives easier is helping Kapoor secure buy-in for larger initiatives
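As a rough illustration of what "running audits pretty much when you desire" could look like, the sketch below applies simple control rules to transaction records and returns exceptions for an auditor to review. The rules and field names are hypothetical examples, not Manulife's actual controls.

```python
# Illustrative continuous-auditing check: automated rules that can run
# on demand (or on a schedule) over transaction records.
APPROVAL_LIMIT = 10_000  # transactions above this need a second approver

def audit_transactions(transactions: list) -> list:
    """Return a list of exception messages; an empty list means nothing to review."""
    exceptions = []
    for tx in transactions:
        if tx["amount"] > APPROVAL_LIMIT and not tx.get("second_approver"):
            exceptions.append(f"{tx['id']}: over limit without second approval")
        if tx.get("approver") == tx.get("submitted_by"):
            exceptions.append(f"{tx['id']}: self-approved")
    return exceptions

# Two sample records: T1 is clean, T2 breaks both rules
sample = [
    {"id": "T1", "amount": 2_500, "submitted_by": "ana", "approver": "ben"},
    {"id": "T2", "amount": 15_000, "submitted_by": "ana", "approver": "ana"},
]
for line in audit_transactions(sample):
    print(line)
```

Because the check is a pure function of the data, it can be rerun in near real-time as new records arrive, which is the essence of the continuous-assurance idea Kapoor describes.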

Mar 4

25 min 7 sec

Maria Tarasidou, Global Data Program Manager at Facebook, argues that legacy companies should follow the example of big tech to succeed with data-driven business transformation

Enterprises are increasingly open to investing in new data-driven technologies that are shaping the future of business. But as Facebook Global Data Program Manager Maria Tarasidou argues in this week’s Business of Data podcast, technology doesn’t drive business transformation by itself. “You have to also be prepared to bring in the right people with the right mindset,” she says. “Everyone needs to understand data. Everybody needs to use it. Everybody needs to be able to go back and retrieve and extract the data that they need in the way that they need it and visualize it.”

In recent years, many companies established hubs that are separate from their legacy business to kickstart their data strategies or innovation projects. While this can make sense in the short-term, Tarasidou notes that data-driven ways of working must become embedded across an entire organization before meaningful transformations can occur. “What happens in big tech companies is that there’s no role that is actually a Data Analyst role,” she says. “Everyone is an analyst.” This chimes with the stories we hear from our wider data and analytics community. It’s those companies that invest in data literacy and integrate data-driven ways of working into the roles of staff across the business which get the most value out of data and analytics.

Maria Tarasidou, Global Data Program Manager, Facebook: “If we say, ‘In 10 years do you expect for the current Data Analyst role to exist?’ I would say, ‘No’”

Tarasidou predicts that integrating data with business processes in this way will become so widespread within a decade that Data Analyst roles as we know them will cease to exist. “If you want to force it, you bring in the right people and the right talent and you educate the business accordingly,” she suggests.
“But it’s going to happen. It’s where we’re heading. This is the age of information.” Enterprises that want to make the most of futuristic technologies such as the ‘data mesh’ must ensure their staff are committed to upskilling and changing how they work to drive successful data-driven business transformations.

Key Takeaways
• Technology by itself is not enough. Executives that focus their investments on shiny new tools will not succeed in driving meaningful business transformations
• Data literacy fuels digital transformation. Enterprises should focus on empowering staff to work with data efficiently to accelerate their data-driven business transformations
• Data skills are the future of business. The Data Analyst role could one day be phased out as data analysis skills become an integral part of all jobs in the workforce of the future

Feb 25

27 min 13 sec

Carlos Rivero, Chief Data Officer of the Commonwealth of Virginia, discusses how his team built a better data governance framework to help address the State’s opioid epidemic

Drug overdose deaths in the United States have accelerated during the COVID-19 pandemic, according to the CDC. Synthetic opioids are driving this increase: nearly 40% more opioid-related deaths were reported year-on-year in May 2020. In this week’s episode of the Business of Data Podcast, Carlos Rivero, Chief Data Officer of the Commonwealth of Virginia, discusses how improved data sharing and governance has helped address the State’s worsening opioid epidemic.

“When you think of the opioid problem, it isn't one-dimensional. It isn't just a law enforcement problem, it isn't just a health science problem, it isn't just a community problem. It's an overall problem that has multiple facets to it,” says Rivero. “So, being able to connect with a council that has multiple representatives from each of these different industries participating in it, one of the biggest concerns was how do we share data?”

Creating a Data Governance Framework

Rivero is responsible for 63 executive branch agencies and 133 localities in the State. A top priority when he joined the agency in 2018 was building a data governance framework to make data sharing easier. Rivero’s first task was to establish communication between stakeholders at all levels in the data management cycle to address complex multidisciplinary issues that one agency cannot address alone. “The number one [priority] was to establish a governance framework that allowed people to participate in the discussion of how we best leverage our data assets,” he says. After that, Rivero focused on improving data discoverability and creating a data trust model that could be implemented across the State.
“The Commonwealth data trust is all about [creating a] legal framework that facilitates confidence and trust in our ability to manage these restricted use sensitive data assets,” Rivero explains.

How Data Use Evolved to Address Statewide Health Problems

One of Rivero’s biggest successes in the Commonwealth is a substance use disorder project focused on addiction analysis and community transformation. Starting in Winchester, Virginia, a small community in the Northwest of the State, Rivero’s team implemented a pilot program that aimed to demonstrate the efficacy of data to address the region’s opioid problem. “We were looking at that [community] as a microcosm for what happens in the larger scale across the Commonwealth with regards to data sharing, but then deriving intelligence from the data assets that are being collected from a wide variety of different organizations,” says Rivero. Ultimately, the success of the project in facilitating data sharing and making intelligence available has seen it rolled out across four other regions of the Commonwealth. Not only that, but the systems that Rivero’s team built were also implemented into the State’s pandemic response. “We took all of that and implemented it for the COVID-19 pandemic response,” Rivero concludes. “So, what you're seeing is a very fast evolution of the data trust, the governance framework, the technology platforms, and all of the components that go together to make data sharing, analytics and intelligence possible.”

Key Takeaways
• Increasing communication amongst stakeholders is key. Implementing a data governance framework requires efficient cross-team communication
• Creating a data trust increases confidence in data. The legal framework of a data trust increases confidence around the use of sensitive data
• Apply your experience to new problems. Governance frameworks and technology platforms can be used to address new challenges

Feb 18

29 min 59 sec

Matt Lovell, Former Data, Analytics & Insight Director at Eurostar, explains how automation transformed its customer experience in the wake of the pandemic

On March 13, 2020, after two years of hard work, Eurostar replaced its 50-year-old ticketing system with a modern, data-driven platform. On March 15, 2020, COVID-19 caused Eurostar’s passenger numbers to crash. In this week’s episode of the Business of Data Podcast, Matt Lovell, former Data, Analytics, and Insight Director at Eurostar, explains why he reprioritized his data projects to improve customer experiences as pandemic disruption hit. “At the moment all of the projects that we would normally work on are largely on hold. So, it does give you the options to do a bit of a reset, whether it’s adding rigorous processes, fixing systems, or restructuring data in a way that we want it,” he says. “These are things that normally wouldn’t get looked at.”

Reacting to Customer Demand in Real-Time

As lockdowns began, Eurostar customers needed a way to easily reschedule or cancel their journeys. Unfortunately, the company’s voucher-based compensation system was not designed to deal with a pandemic. “That created a whole new management scenario that we hadn’t necessarily planned for,” Lovell says. “There were a lot of things we had to systematically work through.” The first job, he explains, was to quickly take stock of the situation and prioritize key projects. Then, the team rapidly iterated on system modifications and introduced automation designed to improve customer experiences. “We started to [ask] how we could gradually move to a point where as much of this was automated as possible and as much of this was visible to the customer as possible.” Automating key parts of the process helped Lovell to implement a convenient system for customers to switch tickets and claim refunds online. It also proved the value of automation to the business.
“The resource that was needed for us to do it manually at the beginning was so substantial,” he says. “[Now] we can build this in a way where it barely has any of that.” “Not only is that reducing the stress on the business but it’s also improving the customer experience, so it’s really a win-win,” he concludes.

Key Takeaways
• Use ‘downtime’ to reevaluate data priorities. If your regular projects are on hold, take the opportunity to take a fresh look at your priorities
• Iterate for success. Even if a system is not perfect immediately, by iterating over time you can make incremental improvements
• Automation can create a win-win. By making systems more efficient, data leaders can improve customer experiences and prove business value at the same time

Feb 11

35 min 2 sec

Thanassis Thomopoulos, Head of Global Marketing and Commercial Analytics at eBay Classifieds Group, outlines how Apple’s ‘transparency framework’ and the looming death of cookies will affect his teams’ approach to personalization

Data privacy regulations have been gradually ratcheting up globally since the EU’s General Data Protection Regulation (GDPR) came into effect two years ago. As we move into 2021, two looming developments will transform the way companies provide personalized customer experiences. In this week’s episode of the Business of Data podcast, eBay Classifieds Group Head of Global Marketing and Commercial Analytics Thanassis Thomopoulos outlines what they are and how his company is preparing for them. “It’s becoming more and more difficult to recognize people online,” he says. “What this has in terms of a second wave impact is, if you can’t recognize people online, then you will have more challenges in providing personalized experiences and also being able to measure whatever you’re doing online.”

Why eBay Classifieds Group is Preparing for a Cookie-Free World

After some initial disruption, European businesses have largely mastered the art of GDPR compliance. However, legislators are now moving to address the widely hated ‘cookie walls’ that have popped up on many websites as an unintended consequence of the regulations. “A few months from now, the world will be cookie-less,” Thomopoulos predicts. “That’s very different from what we knew.” Today, cookies are the main way companies including eBay Classifieds Group recognize people across websites to pass information between websites and provide joined-up experiences.
Thomopoulos warns: “This is something that’s going to be disappearing and, frankly, not everyone has all the answers as to how we’re going to be able to function after that.”

Customer Trust is Essential to the Future of Personalization

A second challenge Thomopoulos highlights is specific to the ‘transparency framework’ outlined in Apple’s iOS 14. “In their own way, they will give a very obvious and vocal choice to the user on whether they are willing to share their identifier for advertising,” Thomopoulos says. “We’ve been preparing for this at eBay Classifieds Group and we’ve run a few tests,” he adds. “What we can see is, there’s a sizeable chunk of people who will decline their consent.” Companies will likely deliver campaigns to communicate the benefits of personalization to customers in response to this new challenge. But eBay Classifieds Group will also be focusing its efforts on getting more users to create and log into profiles on its website. “To do that, you need to build trust,” Thomopoulos notes. “If I’m a shady website or a website that is well-known for, let’s say, having subpar practices around their information sharing, then I would be very reluctant to do that.” He concludes: “If it’s a business that I trust – that I love – then I would be totally OK with giving some of my data to in exchange for a better experience. I will do this very gladly.”

Key Takeaways
• Prepare for a cookie-free world. European companies should be planning for a world without advertising or cross-site cookies
• Adapt to Apple’s transparency framework. Consider focusing on getting users to create customer accounts to enable personalization
• Consumer trust is more important than ever. Changing attitudes around data privacy mean companies must work hard to earn their customers’ trust

Feb 8

26 min 37 sec

Anne Merel Oosterbroek, Head of Data & Analytics, Financial Restructuring and Recovery at ABN AMRO Bank NV shares her tips on creating data literacy campaigns for senior executives

Successful digital transformations require careful planning and significant investments of time and money. Demystifying data and analytics for senior leadership is an essential part of winning that financial investment, argues Anne Merel Oosterbroek, Head of Data and Analytics, Financial Restructuring and Recovery for ABN AMRO Bank NV in this week’s episode of the Business of Data Podcast. Data teams steeped in data and analytics know their power and potential. However, educating senior leaders who may be less aware of the benefits is an important step toward successful digital transformations. “Start with the top,” she recommends. “It’s very important to create those believers, not only at the lower levels of an organization but [also] at the top. If you have a few believers, [then] it is a lot easier to have those investments approved.” By creating a data literacy campaign specifically for senior executives, data and analytics leaders can demonstrate how digital transformation will help their businesses to achieve their goals. “We shouldn’t assume that our leadership team understands what we can do with data and analytics,” she says. “We should really start by answering the very basic questions. As soon as we’ve done that, we can then explain what is possible, so they actually get enthusiastic.” Communicating data and analytics success stories is also an important part of winning hearts and minds across the business, Oosterbroek says. “Invest your time in understanding the possibilities of data and analytics, and don’t be shy to make your colleagues [feel] enthusiastic about this fantastic world of data analytics,” she concludes.

Key Takeaways
• Don’t assume that senior executives understand the potential of data and analytics. Start with the basics and build toward more complex topics from there
• Educate senior leadership on the benefits to secure buy-in. Their support is essential to achieve successful digital transformations
• Become a data and analytics evangelist. Establish data and analytics literacy programs and prioritize education across business units to get people excited about the future

Jan 28

27 min 20 sec

Dan Marzouk, Senior Vice President of Data Science at Aegis, explains how data science is shaping their approach to insurance Wildfires are difficult to predict, grow rapidly and have the potential to cause damage worth tens of billions of US dollars. This is a problem for insurers trying to price risk. The solution? Using data to develop a more complete understanding of risk, argues Aegis Insurance Senior Vice President of Data Science Dan Marzouk in this week’s episode of the Business of Data Podcast. When evaluating, for example, the chances of a wildfire affecting a suburban home, there are a wide range of data points to consider and a variety of data sources to include. However, not all sources are of equal quality. “The challenges are similar to comparing a Google review, a Yelp review and a Facebook review for a business. Each of those [reviews] have their pros and cons,” Marzouk notes. “Each of our data sources also have their pros and cons.” The differing quality of data sources can lead to discrepancies in the data. That’s where data science comes in. Creating a consistent risk assessment requires building a model that quantifies the accuracy of input data. “Over time we start to learn and utilize what we think is accurate from one dataset and continue on that path to build our own data integration system that understands what we believe to be the most accurate system,” says Marzouk. Of course, weighing tens of thousands of data points takes time. However, as Marzouk explains, in the age of instant everything it is crucial to provide insights to decision-makers quickly. “To do that, we have to both understand how to aggregate that data quickly and cull out what’s not as important or useful,” he says. “And be able to develop something that the underwriter can make a decision on quickly.” Ultimately, to meet the business need the data must help to create a product that is appealing to the customer. 
That means that data scientists must also maintain a commercial awareness. “Customers don’t buy things because you told them that the model says [they’re] going to buy it,” Marzouk quips. “That’s my advice to the data science community. Take a step back and say, ‘I know the data’s telling me this, but does it make sense?’”

Key Takeaways
• Understand the data you have. Is it granular enough? How reliable is the source? The answers should inform your model
• Maximize your data points. Innovative technologies like image recognition can dramatically increase the number of data points available
• Take a step back. Remember to evaluate what the data is telling you in the light of all other available information
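The idea of a model that "quantifies the accuracy of input data" can be illustrated with a minimal sketch: each source gets a reliability weight (assigned by hand here; in practice it would be estimated from past agreement with ground truth), and conflicting values are reconciled as a weighted average. The source names, weights and attribute are invented and are not Aegis's actual system.

```python
# Reliability weights per data source; higher means more trusted.
RELIABILITY = {"source_a": 0.9, "source_b": 0.6, "source_c": 0.3}

def reconcile(readings: dict) -> float:
    """Reliability-weighted average of one attribute reported by several sources."""
    num = sum(RELIABILITY[s] * v for s, v in readings.items())
    den = sum(RELIABILITY[s] for s in readings)
    return num / den

# Three sources disagree on, say, a property's distance to vegetation (meters):
estimate = reconcile({"source_a": 30.0, "source_b": 45.0, "source_c": 60.0})
print(estimate)  # → 40.0, pulled toward the most trusted source
```

The estimate lands closest to the most reliable source's reading, which mirrors the idea of learning "what we think is accurate from one dataset" and weighting it accordingly.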

Jan 21

37 min 38 sec

Rapid advances in data-driven technology and a precipitous rise in catastrophic flood events in the US presented an opportunity for this InsureTech startup

There are 62 million homes at moderate or extreme risk of flooding in the US, according to insurance risk assessment firm Verisk. Homeowners insurance does not typically cover flood damage and up to 50% of homes in high-risk areas have no flood insurance at all. This amounts to a serious problem, argues Jim Albert, founder of InsureTech startup Neptune Flood Insurance, in this week’s episode of the Business of Data Podcast. In the past, most flood insurance in the US was provided by the National Flood Insurance Program (NFIP). Now, powered by innovative technologies, nimble insurgent companies are shaking up the status quo. “The NFIP has done an exceptional job over the years, but as with most government programs, technology has started to outstrip what has happened within the flood space,” says Albert. “And so, what I tried to create with Neptune when I founded it in 2016 was an ‘Amazon-like’ buying experience in flood insurance.” “You can get one-click buying for virtually everything else you do in life,” he continues. “So, we tried to make it easy to buy flood insurance in the US through the use of data analytics and a really simple online quoting platform.” The game-changing, automated approach championed by Neptune Flood Insurance was not without its skeptics. In 2016 when the company was founded, the idea of digital insurance was even more revolutionary than it is today. “There was a lot of skepticism about digital insurance [back then]. Could a digital model actually replace the traditional back room full of underwriters?” Albert recalls. “[Especially] when I explained that we don’t have any underwriters. In fact, the underwriter is the computer.” What sets Neptune Flood Insurance apart from its competition is the speed at which customers can get a quote and buy their flood insurance online.
“We’ve proved the model at this point,” Albert says. “We pull in about a hundred different data elements in one second when you enter the address, and we do the full evaluation right then and there.” The application of this technology could not be timelier. Not only are flood events likely to occur more often in the US, but due to the pandemic no one wants to have an inspector in their home, nor to wait weeks for an estimate. “Do [customers] want to sign on to a days or weeks-long slog to finally get the information that they need?” Albert concludes. “Or [do they] want to go to one site that has seemingly all the information with a really good price and great coverage options? That’s what we see happening.”

Key Takeaways
• Many homes at high risk of flooding in the US are uninsured. A lack of awareness of the risks is one cause, but catastrophic damage can take years to recover from
• Data has paved the way for a better solution. By pulling together data from a multitude of sources, Neptune Flood Insurance can provide a policy in seconds
• Hyper-personalization is on the way. Other types of insurance companies will soon take advantage of advanced, data-driven technology to provide highly personalized policies to their customers
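To illustrate how roughly a hundred data elements might be pulled "in one second", here is a hedged sketch of concurrent lookups: the total wait approximates the slowest single call rather than the sum of all of them. The element names and lookup stub are invented, not Neptune's actual pipeline.

```python
# Illustrative sketch: fetching many independent data elements in parallel.
import time
from concurrent.futures import ThreadPoolExecutor

def lookup(element: str):
    """Stand-in for a call to an external data provider."""
    time.sleep(0.01)     # simulated network latency
    return element, 1.0  # dummy value for the element

ELEMENTS = [f"element_{i}" for i in range(100)]

def gather_elements() -> dict:
    """Fetch all elements concurrently; total time ≈ one lookup, not 100."""
    with ThreadPoolExecutor(max_workers=len(ELEMENTS)) as pool:
        return dict(pool.map(lookup, ELEMENTS))

start = time.perf_counter()
data = gather_elements()
print(f"{len(data)} elements in {time.perf_counter() - start:.2f}s")
```

With sequential calls the same 100 lookups would take around a second of simulated latency alone; parallelizing them is one plausible way a quoting platform could evaluate an address "right then and there".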

Jan 14

29 min 33 sec

The best way for companies to provide premium experiences to their customers is a cloud-enabled platform, argues USAA Assistant Vice President and Head of Information Management for P&C Allen Crane in this week’s podcast

USAA Assistant Vice President and Head of Information Management for P&C Allen Crane has a simple message for those companies yet to begin their cloud transformation journey: start now. USAA built its cloud infrastructure from the ground up to provide services with the ‘wow’ factor, as Crane explains in this week’s episode of the Business of Data podcast. However, cloud transformation initiatives are complex, challenging and require careful planning – a process that Crane compares to a pilot building an aeroplane in flight. “We’re building a new plane in the sky that has to fly higher and faster than the one we’re already in,” Crane says. “And once we get that other plane flying, we have got to get all of the passengers off this plane and onto the new plane while it’s still in the air.” In addition, Crane says, it is essential to obtain the support of senior leadership for such a long and complex transition to be a success. “The most important thing in my mind is that the support starts at the top,” Crane says. “If you don’t have that level of support from the top you really won’t be successful. You can’t do something at this scale at the grass-roots level.” Companies must be able to provide their customers with premium experiences to remain competitive, argues Crane. Cloud transformation is an essential first step to achieving this. “The world is moving to the cloud. Your user experiences will be enabled by the cloud. Machine learning and AI and all of that will be dependent on the cloud to deliver the kind of expectations that you want to deliver for your customers,” emphasizes Crane. “The sooner you get there the better off you will be.”

Key Takeaways
• Plan the transition carefully. Migration to the cloud requires a retooling of the foundations of your data infrastructure
• Win the support of senior leadership. Long and complex cloud transformation initiatives require the support of those at the top to be a success
• Get started early. The sooner your company starts the cloud transformation journey, the sooner your customers will reap the benefits

Jan 11

27 min 11 sec

Sherene Jose, VP and Chief of Staff, Cyber and Intelligence Solutions at Mastercard explains how they reimagined their fraud detection teams as revenue-generating innovation machines

You might not have heard of Mastercard’s cyber and intelligence solutions team, but you have probably used their technology. Chip and PIN, contactless payments and even biometric-secured purchases are all part of the growing arsenal of payment solutions they oversee at the financial services giant. In fact, creating innovative ways to make shopping safer and easier for their customers is the team’s core purpose, explains Mastercard VP and Chief of Staff, Cyber and Intelligence Solutions Sherene Jose in this week’s episode of the Business of Data podcast. “Theoretically, the best way to achieve zero fraud loss is to just reject every transaction, right?” quips Jose. “[To prevent that] we have to intelligently find ways to navigate the consumer experience and minimize any security risks.”

The Birth of Cyber and Intelligence Solutions at Mastercard

Prior to 2014, Mastercard had fraud detection and management teams dotted around the business. These decentralized teams were primarily seen as a function of cost control, designed to minimize fraud losses for customers. Then came the big idea: consolidate these departments with external expertise and create a new, revenue-generating cyber intelligence unit for the business. This unit is now responsible for protecting Mastercard’s payment ecosystem from fraud, creating innovative solutions for its customers and differentiating their core offerings. Of course, patching together a newly conceived cyber intelligence unit from a combination of disparate teams and newly acquired startups is easier said than done. “There was an evolution where teams working in specific verticals of authentication and fraud management and so on learned to come together and think across different verticals,” says Jose.
“I could immediately sense the excitement, the sense that things are possible because of this paradigm shift. That mood continues to this day.” Now, the cyber and intelligence solutions unit is at the forefront of innovation and fraud prevention for the company. In the first eight months of last year alone, their AI-powered cybersecurity system ‘Safety Net’ blocked over $113 million in fraudulent transactions in the US. Innovating Payments While Maintaining Customer Security The uptake of technologies like contactless payments, spearheaded at Mastercard by Jose’s team, has skyrocketed during the pandemic. For Jose, the goal is to continue to create seamless and safe ways for their customers to shop, whether that’s online or in-store. “An example of this would be digital wallets, right? You don’t have to key in your password or your PIN to just go ahead and [make] transactions,” she explains. “That’s the kind of seamless experience that we are trying to recreate in every channel.” To do this, it is vital that Jose’s team understands rapidly changing customer needs. By leveraging data and analytics, they are able to build a more complete picture to work from as they create highly secure and innovative payment solutions. “Mastercard as an organization has a very conservative and consumer-centric approach to data and analytics,” she explains. “We never want to store any personally identifiable data. The insights that we get from data in aggregate is what powers our solutions.” “What is top of mind for us is how do we keep our ecosystem safe and how do we keep our stakeholders safe in this environment?” she concludes. “There’s a lot more that we can do and we’re working hard towards it by leveraging the power of data and analytics.” Key Takeaways • Fraud management need not only be a ‘cost’. By leveraging their expertise, fraud teams can be turned into innovation engines • Seamless and secure payments are the heart of the customer experience.
Seamless and safe transactions make for happy customers • AI is a powerful tool against fraud

Dec 2020

31 min 39 sec

Di Mayze: WPP’s Community-Led Approach to Data and Analytics

Dec 2020

30 min 14 sec

Lisa Allen, Head of Data and Analytical Services at UK mapping agency Ordnance Survey, reveals how its data is helping the government respond to COVID-19 Ordnance Survey, Great Britain’s state-owned mapping agency, has a data culture that stretches back to its founding nearly 230 years ago. It supplies geospatial data and services to hundreds of customers, from insurance companies to the police and local councils. Innovation and data science are at the heart of everything Ordnance Survey does, as Ordnance Survey Head of Data and Analytical Services Lisa Allen says in this week’s episode of the Business of Data podcast. “We manage one of the key national data assets for Great Britain,” Allen says. “The original purpose of [Ordnance Survey] was to collect [data] for cartographic purposes. But actually, now we want it [used] for analytical purposes.” “The [Ordnance] Survey has been supplying data during the outbreak and we’ve been in great demand,” she continues. “We’ve really seen [the agency] come into its own.” The Data Informing the UK’s COVID-19 Response Thanks to its long heritage, Ordnance Survey boasts a world-class approach to geospatial data science. Its data stores contain more than 500 million geographical features and are updated 20,000 times a day. Keeping such a crucial dataset up to date is a huge responsibility and requires close collaboration between data scientists and surveyors, as well as the use of third-party data and machine learning techniques. The events of 2020 have underscored how vital this work is. Thanks to the data at its fingertips, Ordnance Survey has been able to provide the British government with data and insights throughout the pandemic. “COVID-19 has really shown the importance of data,” Allen remarks. “This epidemic is about, ‘Where are the outbreaks?’ And all the information you need to know is based on location.” “What I’ve really seen during the epidemic is the OS come into its own,” she adds.
“We’ve been asked questions about our mapping. We’ve been asked, ‘Where are the care homes? Where are the supermarkets? Where are the GP surgeries?’” “During an emergency we’re available 24 hours a day, every day of the year at no cost,” Allen says. Ordnance Survey has a contract with the British government that sees it provide geospatial data and location data to public services organizations. It also provides services ranging from basic maps and identifying ‘points of interest’ on them to data matching. “This is especially important for things like addressing,” says Allen. “So, during the pandemic, making sure the letters went out to the vulnerable [and] making sure those addresses were right.” Following the news that the British government has become the first to authorize a COVID-19 vaccine for use, an end to the pandemic may be on the horizon. But Ordnance Survey’s work is far from over. The agency will continue providing world-class data-driven services long after the crisis is over, just as it has for hundreds of years.

Dec 2020

27 min 20 sec

Meena Thanikachalam, Head of Data Architecture at Ally Bank, explains how building a world-class data platform in the cloud will transform the customer experience and build loyalty Traditional high-street banks were not at the forefront of the digital revolution. However, customers today demand instant access to high-quality digital experiences – a trend that has only been accelerated by the pandemic. Banks must use their data to develop a better understanding of their customers’ needs, argues Meena Thanikachalam, Head of Data Architecture at online bank Ally, in this week’s episode of the Business of Data podcast. Thanikachalam heads up the team responsible for creating an innovative cloud-based data and analytics platform for the bank that is designed specifically with the customer experience in mind. “We are building a world-class data platform that will help improve our customer experience,” says Thanikachalam. “And will also help deepen our customer relationships and increase customer loyalty.” A core element of this customer relationship is creating an experience that feels bespoke. That is why Ally Bank has done the work to understand what its customers need and when they need it. “This platform is also looking at integrating omni-channel data and also data that we have collected about customer preferences,” she says. “Based on that we would provide a targeted and personalized experience for them.” Ally Bank is also using AI initiatives like cognitive computing and conversational AI to further enrich the customer experience and enable customers to do more without needing to speak to an agent. “In banking specifically, cognitive computing is used predominantly to have human-like conversations,” Thanikachalam says. “That is one area [in banking] where I see AI penetrating a lot.” Key Takeaways • Develop a 360-degree view of the customer. Understanding what your customers need and when they need it will help you shape your strategy • Write the data strategy to inform the customer experience. Identify what data is needed and which metrics will most effectively influence the customer • Don’t forget the guiding principles. Scalability, reliability, performance efficiency, and operational excellence should guide your architectural work

Dec 2020

24 min 13 sec

Gillian Tomlinson, Director of Data and Analytics at Three UK, outlines how the telco is partnering with its holding company on futuristic data monetization projects that will harness the power of 5G Public attitudes towards the UK’s 5G rollout may be mixed. But for data and analytics leaders at British telecoms companies, the technology represents an opportunity to explore new data monetization opportunities. As Three UK Director of Data and Analytics Gillian Tomlinson explains in this week’s Business of Data podcast, this is something Three built into its digital transformation plans years ago. “[We decided] we had to move our network and our IT stack into the cloud as soon as possible,” she recalls. “We [needed] the processing power in the future. [We] needed to be able to support all that 5G brings. That was really the spur for us.” In recent years, the company has partnered with cloud providers to ensure its data infrastructure would meet its changing needs as it scaled its ambitions and integrated analytics capabilities. “The ultimate goal is, you’ve got to compete,” she says. “You’ve got to understand how you’re going to compete in future and how the nature of the industry’s competition is going to change because of digitization.” Tomlinson argues that success in today’s business landscape requires perpetual digital transformation. Companies must constantly innovate to keep step with their competitors, who are also perpetually innovating. “There’s no such thing as an end-goal anymore,” she notes. “You’ve got to be constantly improving, testing, proving, scaling up [and] looking for how you’re going to make that next big leap.” Three UK’s Data Monetization Plans Enterprises that are sitting on valuable troves of proprietary data are increasingly exploring data monetization opportunities. 
While the regulatory protections that safeguard consumer privacy make sharing sensitive information a challenge, these projects are seen as a logical ‘next step’ for those looking to drive revenue with data. For Three UK, this means partnering with CK Delta, a holding company within CK Hutchison Group, to provide data support on experimental monetization projects and initiatives. “We’re incredibly mindful of anonymized customer information being absolutely critical and the compliance requirements around GDPR,” she says. “But the information we sit on, specifically from a network perspective, is incredibly valuable.” “Our ability to understand, let’s say, people’s movements and provide that in an anonymized way to the City of London is a valuable insight,” she continues. “It’s got a real monetary value attached to it.” She adds: “More recently, we have also looked at a proof-of-concept that we’ve been working with around flying taxis.” This kind of innovation may be out of reach for enterprises that are still laying the foundations for advanced analytics. But the promise of data monetization is a tantalizing one we expect to see more and more companies exploring in future. Key Takeaways • Enterprises are eyeing data monetization opportunities. Using data to create valuable new products is a logical next step for advanced data-using companies • Data monetization depends on proprietary data. Companies that can curate their own unique and valuable datasets will be best positioned to succeed with data monetization • Enterprises will struggle without the right ‘data foundations’. Data leaders should ensure their organizations have the right level of data maturity before pursuing these projects

Dec 2020

28 min 6 sec