
Intoware's Digital Transformation Bytes Podcast

2 Apr 2025


Rosanne Werner joined Brad Flook, CRO of Intoware, on the Digital Transformation Bytes podcast to discuss a question many organisations struggle to answer: why do expensive AI and digital transformation initiatives so often fail to deliver? The answer, it turns out, has less to do with the technology itself and far more to do with the people expected to use it.


This episode focuses on the simple premise that tools don't transform organisations - people do. Werner and Flook explore what it takes to build real adoption, why data quality begins with frontline behaviour, and how prompt engineering might be the most overlooked skill in modern business.


Key topics discussed include:

💡The gap between technological advancement and organisational adoption

💡Why data silos remain the biggest obstacle to AI readiness

💡The importance of data quality and governance at the source

💡How to build essential habits and behaviours for data fluency

💡The value of prompt engineering as a critical skill for the future


Rosanne shared insights from her experience leading the Global Data and AI Mindset and Culture programme at Coca-Cola, highlighting how companies can use behavioural science principles to create lasting cultural change.


Both speakers stressed that while AI tools offer substantial opportunities, success depends on human capabilities, critical thinking, and collaboration.


You can find out more about Intoware at https://www.intoware.com/ and see the full podcast episode below:




 

Ready to empower your people on the Data and AI Journey? 



Highlights

00:45 – Why digital transformation is a people problem, not a tech problem 

08:12 – The biggest barrier to AI adoption isn't data, it's fear

15:30 – How siloed data mirrors siloed teams 

22:45 – Why businesses chase shiny AI tools instead of solving real problems 

29:10 – The role of prompt engineering in future-proofing skills 

36:20 – Bias in AI models - and why diverse teams matter 

42:15 – Human-in-the-loop: when you shouldn't trust the algorithm 

48:30 – Advice for individuals and businesses entering the AI space


Technology Is Easy. People Are Hard.


Werner opened by addressing a reality many organisations overlook: digital transformation is rarely about the technology itself.


🗣 Rosanne Werner

"Digital transformation is about the technology, about the tools - but the people aspect is often overlooked. Very little gets invested into building those capabilities. Just giving people a tool doesn't mean they're able to use it. If you don't have the people driving the cars, the tools will just sit there."


Flook echoed this from his own work deploying digital workflows in manufacturing, oil and gas, and infrastructure.


🎙️ Brad Flook

"People are key. We can all work out the return on investment, but if the people don't pick the tool up, it doesn't get the adoption needed. That's one of the biggest hurdles we come across."


Werner described it as a tension between Martec's Law - technology accelerating at alarming speed - and organisational capacity to absorb that change.


🗣 Rosanne Werner

"There's this rate of technology accelerating, but organisations don't change at that speed. Habits and behaviours don't change overnight. You need thinking time. You need to build skills and confidence, and address the fears - the fear of change, the fear of uncertainty."


Leadership alignment, she noted, isn't optional. It's the difference between tools gathering dust and tools transforming work.


---


The Barrier Isn't Data, It's Behaviour


Flook raised a challenge familiar to anyone working in operations: many businesses still rely on pen, paper, and Excel spreadsheets. They haven't built the data foundations needed for AI to function.


Werner agreed - but pointed to something deeper.


🗣 Rosanne Werner

"Many companies dive into AI initiatives because they want to stay up to date, keep on top of competitors. But you can't adopt everything. You've got to be strategic. Start with: what is the business value? What specific problem are we trying to solve? Is AI the most suitable tool for the job?"


She used a sharp analogy: don't use a chopping knife to slice an apple. Sometimes traditional programming or automation is simpler and more cost-effective.


🗣 Rosanne Werner

"You've got to ask: what are the high-impact initiatives that drive value? What do we pursue, ignore, or postpone? If the data isn't ready or accessible, you get garbage in, garbage out."


Flook pressed further: even when data exists, it's often inconsistent. He gave the example of regional language differences - how a bread roll is described entirely differently across the UK.


🎙️ Brad Flook

"If everyone in your team answers it differently, how do you get consistency of data? That's where things like multiple choice questions give you that consistency."


Werner confirmed it's a widespread issue - one recently identified by Harvard Business Review as the single biggest hurdle to AI readiness.


---


Silos, Chefs, and Missing Ingredients


Werner used a vivid analogy to describe the problem of siloed data.


🗣 Rosanne Werner

"Imagine an AI system as a team of professional chefs. They're brilliant at what they do - but they're locked in separate kitchens. They only have their own ingredients. None of those recipes can be shared. They can't create a cohesive meal."


A unified data platform, she explained, is like a central kitchen where everyone has access to the same ingredients, the same tools, the same giant recipe book.


🗣 Rosanne Werner

"That's when you can create a symphony of flavours. But siloed data leads to multiple collections of the same data because teams aren't aware it's already been collected elsewhere."


She also highlighted a deeper issue: even with the right processes and policies, does the workforce have the data awareness to treat it with care?


🗣 Rosanne Werner

"Do they know how to manage data as an asset? Do they have the right habits and behaviours? Do they know the implications down the line if they enter something incorrectly - like putting ABC@hotmail.com instead of the real customer email?"


When field sales teams enter incorrect data, marketing campaigns fail. Decisions get made on faulty information. The cost is more than just technical - it's strategic.
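Catching bad entries at the point of capture, rather than cleaning them later, can be as simple as a validation step like the sketch below, built around Werner's ABC@hotmail.com example. The blocklist and regex are illustrative assumptions, not a production-grade validator.

```python
import re

# Known throwaway placeholders (illustrative list).
PLACEHOLDER_ADDRESSES = {"abc@hotmail.com", "test@test.com", "a@a.com"}

# Deliberately simple shape check: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_email(value: str) -> bool:
    """Reject malformed addresses and known placeholder entries."""
    value = value.strip().lower()
    return bool(EMAIL_RE.match(value)) and value not in PLACEHOLDER_ADDRESSES
```

A check like this at the form level costs seconds; the same error discovered after a failed marketing campaign costs far more, which is the asymmetry Werner is describing.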


🗣 Rosanne Werner

"If we can influence the frontline workers who capture that data at source, we spend less time cleaning it. We can focus on what the data tells us, how we interpret it, and how we tell that story to guide decisions that bring value."


---


AI Didn't Create the Data Problem - It Exposed It


Flook asked whether businesses had simply missed the importance of data governance over the past five years.


Werner's answer was direct.


🗣 Rosanne Werner

"AI has shown a real spotlight on data. Data's always been there, but AI has highlighted all the holes. If you want AI to perform well, your data needs to be of really good quality - and it needs to be relevant to the problem you're solving."


She described AI as forcing a reckoning: organisations must sort their foundations before they can build on top of them.


🗣 Rosanne Werner

"AI is getting a lot of attention from leadership. They can see the benefits. But you need a good foundation with that data before you get results. You need to sort your house out before AI comes in."


---


Prompt Engineering: The Literacy Skill That Is Being Ignored


Flook shifted the conversation to something increasingly critical: prompt engineering.


🎙️ Brad Flook

"There's a certain way of asking these models to get real value. It isn't just a two-line sentence. What are your thoughts?"


Werner called it a skill that's now essential but rarely taught.


🗣 Rosanne Werner

"Prompt engineering isn't something we've been brought up with at school. It's a skill set that's so necessary these days. The AI can do so much - but it's only as good as how you prompt it."


She described common mistakes: people start with a single line and wonder why the output disappoints.


🗣 Rosanne Werner

"Have you put the role in there? The task? The outputs? I've got a list of 'do not use' words because I don't want ChatGPT telling me something's 'a tapestry of life.'"


Flook laughed in agreement, banning phrases like "deep dive" and "take a few lives."


Werner also raised a more nuanced point: how do you train a model to reflect a company's values, communication style, and approach?


🗣 Rosanne Werner

"How you communicate to supply chain teams could be very different than how you communicate to commercial teams. Training the model to have domain expertise and insights the engine wouldn't normally have - that's where the value is."


But she also issued a warning.


🗣 Rosanne Werner

"When we're building models based on our own data, we need to be conscious about whether we're feeding our own bias into them. Are we building a supercharged bias model? We need diverse teams, diverse feedback loops, and we need to test the logic."


---


AI and the CV Debate: Tool or Cheat?


Flook brought up a contentious topic: companies rejecting CVs because applicants used AI to write them.


🎙️ Brad Flook

"I'm a big believer that AI doesn't replace humans - but it can automate manual tasks dramatically. So where do you see the cut-off point?"


Werner's response was unequivocal.


🗣 Rosanne Werner

"AI is a tool. Why would you not use it? A Gallup report says only 13% of the workforce is engaged at work. When you see what AI can do - take on repetitive, mundane tasks - it frees up time for the things that give us fire in the belly. Time to be curious, creative, time to spend with families and friends."


She compared it to the arrival of calculators.


🗣 Rosanne Werner

"It's a tool to help us do things better and more quickly. Why wouldn't you use it?"


She extended the argument to education.


🗣 Rosanne Werner

"If we're not teaching AI in education now, the generations coming up will graduate and be asked to use AI at work. Why are we not prepping them with those cognitive capabilities?"


Flook agreed, describing AI as a tool that closes knowledge gaps and accelerates learning.


🎙️ Brad Flook

"What AI does is close the gap between something you don't understand and someone who does. If you understand the right prompts to ask, you can accelerate that gap very quickly."


Werner highlighted a medical example.


🗣 Rosanne Werner

"Scientific research shows that the more time doctors spend by a patient's side with good bedside manner, the more likely they are to stick to a treatment plan. Why wouldn't we use AI to help doctors take the grunt work of writing reports so they can focus on things that actually save lives?"


---


Bias: The Overlooked Aspect in Every Model


The conversation turned to one of the most contentious issues in AI: bias.


Flook voiced his frustration with models that lean politically, socially, or geographically based on the data they were trained on.


🎙️ Brad Flook

"We've gone through such a media-driven world where media is so biased. I thought, this isn't person-driven, this is data-driven - but it still leans one side or the other."


Werner ran an experiment around International Women's Day.


🗣 Rosanne Werner

"I asked three different models to give me an image of a civil engineer, a mathematician, an actuary. They all came back with men. Men with beards."


Flook laughed - then pointed out the implications.


🎙️ Brad Flook

"Neither of us are going to do well then" and rubs his chin.


🗣 Rosanne Werner

"It's not just identifying the bias in the AI. We need critical thinking skills - not just taking outputs as truth. If you're building models, what data are you putting in? Have you got a diverse team that can provide feedback?"


She stressed the importance of human judgement.


🗣 Rosanne Werner

"You still need the human in the loop for those really complex grey areas. And we need to be conscious about whether we're feeding our own bias into these models."


Flook asked whether bias could ever be truly controlled in large frontier models.


Werner returned to education and awareness.


🗣 Rosanne Werner

"What we can change now is the education side. How do we fact-check AI outputs? What sources do we use? We need fairness checks, standards. We need to share use cases and say, 'This has come out - are you aware of these biases?'"


She also emphasised psychological safety.


🗣 Rosanne Werner

"If we've got different perspectives, are we creating a safe space for everyone to voice their opinions and concerns if they're seeing bias? You might not see a bias that I see. Do we have a safe space to talk about it openly?"


---


Advice for Individuals and Businesses


To close, Werner offered guidance for anyone looking to move into AI or lead digital transformation.


🗣 Rosanne Werner

"Build a continuously learning habit. Be curious. Build your technical skills - but know that the half-life of skills is between two and a half and five years. Every two and a half years, your technical skills are out of date."


She also stressed resilience and adaptability.


🗣 Rosanne Werner

"It's AI now - what's the next emerging technology? You're building your adaptability muscles for whatever comes next."


But her strongest emphasis was on human skills.


🗣 Rosanne Werner

"Build the skills AI can't replace. Empathy. Emotional intelligence. Creative problem-solving that connects different dots. Communication. Collaboration. People are craving human connection. These are the things that matter."


For businesses, her message was equally clear.


🗣 Rosanne Werner

"Technology is changing so quickly. You need to invest in your people. We're in an era where users drive adoption - not the tech giants. Do your people have the capabilities? Do they feel safe to test and learn? Do you have collaborative spaces where people can share best practices?"


Flook reinforced the point.


🎙️ Brad Flook

"Human-in-the-loop is important. How people approach it, how they put empathy on it, how they use their different skill sets to translate what's happening - that's what makes the difference."


Closing Thoughts

 

This conversation doesn't pretend AI transformation is simple. It's messy, human, and requires far more than a subscription to a platform. The tools exist. The models are powerful. But transformation doesn't happen in the cloud - it happens in the habits, confidence, and trust of the people expected to use them.

 

Data quality, adoption, and behaviour change don't happen by accident. They require investment, leadership alignment, psychological safety, and a willingness to face uncertainty and adapt. But when organisations get it right, the return is more than just efficiency - it's engagement, capability, and long-term resilience.
