What do the Phoenix Suns, a regional healthcare plan, a commercial HVAC software company, and a Fortune 500 bank have in common? They all struggle with data migration headaches.
This revelation – while not entirely surprising to me as someone who's spent years in data migration – might shock many readers: every single organization, regardless of industry or size, faces the same fundamental data conversion challenges.
With over 3,000 IT executives gathered under one roof, I was able to test my hypotheses about both interest in AI for data migrations and data migration pain points across an unprecedented cross-section of organizations in just three days. The conversations I had during networking sessions, booth visits, and between keynotes consistently reinforced that data migration remains one of the most pressing challenges facing organizations today – regardless of whether they're managing player statistics for a professional sports team or customer data for a local bank with three branches.
The conference opened with Dr. Tom Zehren's powerful keynote, "Transform IT. Transform Everything." His message struck a chord: IT leaders are navigating unprecedented global uncertainty, with the World Uncertainty Index spiking 481% in just six months. What resonated most with me was his call for IT professionals to evolve into "Enterprise Technology Officers" – leaders capable of driving organization-wide transformation rather than just maintaining systems.
This transformation mindset directly applies to data migration across organizations of all sizes – especially as every company races to implement AI capabilities. Too often, both large enterprises and growing businesses treat data conversion as a technical afterthought rather than the strategic foundation for business flexibility and AI readiness. The companies I spoke with that had successfully modernized their systems were those that approached data migration as an essential stepping stone to AI implementation, not just an IT project.
Malcolm Gladwell's keynote truly resonated with me. He recounted his work with Kennesaw State University and Jiwoo, an AI Assistant that helps future teachers practice responsive teaching. His phrase, "I'm building a case for Jiwoo," exemplified exactly what we're doing at Zengines – building AI that solves real, practical problems.
Gladwell urged leaders to stay curious when the path ahead is unclear, make educated experimental bets, and give teams freedom to challenge the status quo. This mirrors our approach: taking smart bets on AI-powered solutions rather than waiting for the "perfect" comprehensive data management platform.
John Rossman's "Winning With Big Bets in the Hyper Digital Era" keynote challenged the incremental thinking that plagues many IT initiatives. As a former Amazon executive who helped launch Amazon Marketplace, Rossman argued that "cautious, incremental projects rarely move the needle." Instead, organizations need well-governed big bets that tackle transformational opportunities head-on.
Rossman's "Build Backward" method resonated particularly strongly with me because it mirrors exactly how we developed our approach at Zengines. Instead of starting with technical specifications, we worked backward from the ultimate outcome every organization wants from data migration: a successful "Go Live" that maintains business continuity while unlocking new capabilities. This outcome-first thinking led us to focus on what really matters – data validation, business process continuity, and stakeholder confidence – rather than just technical data movement.
Steve Reese's presentation on "Addictive Leadership Stories in the League" provided fascinating insights from his role as CIO of the Phoenix Suns. His central question – "Are you the kind of leader you'd follow?" – cuts to the heart of what makes technology transformations successful.
Beyond the keynotes, Day 2's breakout sessions heavily focused on AI governance frameworks, with organizations of all sizes grappling with how to implement secure and responsible AI while maintaining competitive speed. What became clear across these discussions is that effective AI governance starts with clean, well-structured data – making data migration not just a technical prerequisite but a governance foundation. Organizations struggling with AI ethics, bias detection, and regulatory compliance consistently traced their challenges back to unreliable or fragmented data sources that made it difficult to implement proper oversight and control mechanisms.
The most valuable aspect of Info-Tech LIVE wasn't just the keynotes – it was discovering how AI aspirations are driving data migration needs across organizations of every size. Whether I was talking with the CIO of a major healthcare system planning AI-powered diagnostics, a mid-market logistics company wanting AI route optimization, or a software development shop building AI solutions for their clients, the conversation inevitably led to the same realization: their current data challenges couldn't support their AI ambitions.
The Universal AI-Data Challenge: Every organization, regardless of size, faces the same fundamental bottleneck: you can't implement effective AI solutions on fragmented, inconsistent, or poorly integrated data. This reality is driving a new wave of data migration projects that organizations previously might have delayed.
Throughout the three days, the emphasis was clear: apply AI for measurable value, not trends. This aligns perfectly with our philosophy of solving specific, well-defined problems.
Info-Tech's theme perfectly captures what we're seeing: organizations aren't just upgrading technology – they're fundamentally transforming operations. At the heart of every transformation is data migration. Organizations that recognize this shift early – and build migration capabilities rather than just executing migration projects – will have significant advantages in an AI-driven economy.
Zengines isn't just building a data migration tool – we're building an enduring capability for business transformation. When organizations can move data quickly and accurately, they can accelerate digital initiatives, adopt new technologies fearlessly, respond to market opportunities faster, and reduce transformation costs.
Malcolm Gladwell's thoughts on embracing uncertainty and making experimental bets stayed with me. Technology will continue evolving rapidly, but one constant remains: organizations will always need to move data between systems.
Our mission at Zengines is to make that process so seamless that data migration becomes an enabler of transformation rather than a barrier. Based on the conversations at Info-Tech LIVE, we're solving one of the most universal pain points in business technology.
The future belongs to organizations that can transform quickly and confidently. We're here to make sure data migration never stands in their way.
Interested in learning how Zengines can accelerate your next data migration or help you understand your legacy systems? Contact us to discuss your specific challenges.
Data lineage is the process of tracking data usage within your organization. This includes how data originates, how it is transformed, how it is calculated, its movement between different systems, and ultimately how it is utilized in applications, reporting, analysis, and decision-making. This is a crucial capability for any modern ecosystem, as the amount of data businesses generate and store increases every year.
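To make that definition concrete, here is a minimal sketch (in Python, with illustrative names rather than any particular tool's data model) of lineage represented as a graph you can walk back to a field's origin:

```python
from dataclasses import dataclass, field

@dataclass
class LineageNode:
    """A dataset, column, or calculation tracked in the lineage graph."""
    name: str
    system: str                                  # e.g. "mainframe", "warehouse"
    upstream: list["LineageNode"] = field(default_factory=list)

    def trace_origins(self) -> list[str]:
        """Walk upstream edges to find where this data originally came from."""
        if not self.upstream:
            return [f"{self.system}:{self.name}"]
        origins: list[str] = []
        for parent in self.upstream:
            origins.extend(parent.trace_origins())
        return origins

# A three-hop lineage: source file -> transformed table -> report field.
source = LineageNode("CUST-MASTER", "mainframe")
staged = LineageNode("customers_clean", "warehouse", upstream=[source])
report = LineageNode("active_customers_kpi", "bi-report", upstream=[staged])

print(report.trace_origins())  # ['mainframe:CUST-MASTER']
```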
As of 2024, 64% of organizations manage at least one petabyte of data — and 41% have at least 500 petabytes of information within their systems. In many industries, like banking and insurance, this includes legacy data that spans not just systems but eras of technology.
As data volumes grow, so does the need to give the business trusted access to that data. Thus, it is important for companies to invest in data lineage initiatives to improve data governance, quality, and transparency. If you're shopping for a data lineage tool, there are many cutting-edge options. The cloud-based Zengines platform, for example, uses an innovative artificial intelligence-powered model that includes data lineage capabilities to support clean, consistent, and well-organized data.
Whether you go with Zengines or something else, though, it’s important to be strategic in your decision-making. Here is a step-by-step process to help you choose the best data lineage tools for your organization’s needs.
Start by ensuring your selection team has a thorough understanding of not just data lineage as a concept but also the requirements that your particular data lineage tools must have.
First, consider core data lineage tool functionalities that every company needs. For example, you want an at-a-glance visualization of the relationships among data across programs and systems. Impact analysis is another must-have, giving you a clear picture of how a proposed change will ripple through your current data environment.
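To give a rough feel for what impact analysis does under the hood, here's a tiny sketch using a made-up dependency map; real tools operate on far larger graphs harvested automatically from your systems:

```python
from collections import deque

# Hypothetical downstream-dependency map: asset -> assets that consume it.
downstream = {
    "legacy.CUST-BAL": ["warehouse.balances", "api.account_summary"],
    "warehouse.balances": ["report.daily_recon", "ml.churn_features"],
    "api.account_summary": ["mobile.app_view"],
}

def impact_of_change(asset: str) -> set[str]:
    """Breadth-first walk of the dependency map: everything returned here
    would need retesting if `asset` changed."""
    affected: set[str] = set()
    queue = deque([asset])
    while queue:
        for consumer in downstream.get(queue.popleft(), []):
            if consumer not in affected:
                affected.add(consumer)
                queue.append(consumer)
    return affected

# All five downstream assets are flagged (set order varies when printed).
print(impact_of_change("legacy.CUST-BAL"))
```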
In addition, review technology-specific data lineage needs, such as the ability to ingest legacy codebases written in languages like COBOL. Compliance and regulatory requirements vary from one industry to the next, too, and they change often. Make sure you're aware of both business operations needs and what is expected of the business from a compliance and legal perspective.
Also, consider future growth. Can the tool you select support your data as you scale? Don't hamstring momentum down the road by short-changing your data lineage capabilities in the present.
When you begin to review specific data lineage tools, you want to know which features to prioritize. Based on the requirements above, six key areas stand out: automated lineage capture, clear visualization, impact analysis, integration with your existing (and legacy) systems, scalability, and compliance support. Keep these factors in mind and make sure whatever tool you choose satisfies these basic requirements.
Along with specific features, you want to assess how easy the tool is to implement and how easy it is to use.
Start with setup. Consider how well each data lineage software solution can be implemented within, and configured for, your systems. If your business built technology solutions before the 1980s, you may have critical business operations that still run on mainframes. Make sure a data lineage tool can integrate smoothly with a complex environment like that before signing off on it.
Consider the learning curve and usability, too. Does the tool have an intuitive interface? Are there complex training requirements? Is the information it surfaces accessible to everyday business users?
When considering the cost of a data lineage software solution, there are a few factors to keep in mind: licensing or subscription fees, implementation and integration effort, training, and ongoing maintenance and support. Weigh those costs against the expected benefits, and assess both total cost of ownership (TCO) and return on investment (ROI) when comparing your options.
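To show how those pieces fit together, here's a back-of-the-envelope TCO/ROI comparison; every figure is hypothetical, so substitute your own vendor quotes and estimates:

```python
# Back-of-the-envelope TCO/ROI -- every figure here is hypothetical.
years = 3
license_per_year   = 60_000   # annual subscription
implementation     = 25_000   # one-time setup and integration
training           = 10_000   # one-time staff enablement
annual_maintenance = 5_000    # support, upgrades, administration

tco = implementation + training + years * (license_per_year + annual_maintenance)

# Benefit estimate: analyst hours no longer spent tracing lineage by hand.
hours_saved_per_year = 1_200
loaded_hourly_rate   = 85
benefit = years * hours_saved_per_year * loaded_hourly_rate

roi = (benefit - tco) / tco
print(f"3-year TCO: ${tco:,}  benefit: ${benefit:,}  ROI: {roi:.0%}")
# 3-year TCO: $230,000  benefit: $306,000  ROI: 33%
```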
If you’re looking for a comprehensive assessment of what makes the Zengines platform stand out from other data lineage solutions, here it is in a nutshell:
Our automated approach creates a frictionless, accelerated experience that reduces risk, lowers costs, and makes data lineage more accessible.
As you assess your data lineage tool choices, keep the above factors in mind. What are your industry and organizational requirements? Focus on key features like automation and integration capabilities. Consider implementation, training, user experience, ROI, and comprehensive cost analyses.
Use this framework to help create stakeholder buy-in for your strategy. Then, select your tool with confidence, knowing you are organizing your data’s past to improve your present and lay the groundwork for a more successful future.
If you have any follow-up questions about data lineage and what makes a software solution particularly effective and relevant in this field, our team at Zengines can help. Reach out for a consultation, and together, we can explore how to create a clean, transparent, and effective future for your data.
Your new core banking system just went live. The migration appeared successful. Then Monday morning hits: customers can't access their accounts, transaction amounts don't match, and your reconciliation team is drowning in discrepancies. Sound familiar?
If you've ever been part of a major system migration, you've likely lived a version of this nightmare. What's worse is that this scenario isn't the exception—it's becoming the norm. A recent analysis of failed implementations reveals that organizations spend 60-80% of their post-migration effort on reconciliation and testing, yet they're doing it completely blind, without understanding WHY differences exist between old and new systems.
The result? Projects that should take months stretch into years, costs spiral out of control, and in the worst cases, customers are impacted for weeks while teams scramble to understand what went wrong.
Let's be honest about what post-migration reconciliation looks like today. Your team runs the same transaction through both the legacy system and the new system. The old system says the interest accrual is $5. The new system says it's $15. Now what?
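The mechanical half of that parallel run is easy to automate. This sketch, using made-up transaction data, flags the breaks; notice that nothing in it can explain why TXN-1001 differs, which is exactly the gap the rest of this piece is about:

```python
# Parallel-run reconciliation with made-up transaction data: run the same
# transactions through both systems and flag amounts that don't match.
legacy_results = {"TXN-1001": 5.00, "TXN-1002": 250.00, "TXN-1003": 19.95}
new_results    = {"TXN-1001": 15.00, "TXN-1002": 250.00, "TXN-1003": 19.95}

TOLERANCE = 0.01  # acceptable rounding difference, in dollars

breaks = {
    txn: (old, new_results.get(txn))
    for txn, old in legacy_results.items()
    if abs(old - new_results.get(txn, 0.0)) > TOLERANCE
}

for txn, (old, new) in breaks.items():
    # The comparison finds THAT a break exists -- never WHY it exists.
    print(f"{txn}: legacy={old:.2f} new={new:.2f} -> investigate why")
# TXN-1001: legacy=5.00 new=15.00 -> investigate why
```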
"At this point in time, the business says who is right?" explains Caitlin Truong, CEO of Zengines. "Is it that we have a rule or some variation or some specific business rule that we need to make sure we account for, or is the software system wrong in how they are computing this calculation? They need to understand what was in that mainframe black box to make a decision."
The traditional approach looks like this: the business writes up a question, sends it to the mainframe SMEs, and waits while they trace the logic by hand through module after module of COBOL. Then the cycle repeats for every new break.
The real cost isn't just time—it's risk. While your team plays detective with legacy systems, you're running parallel environments, paying for two systems, and hoping nothing breaks before you figure it out.
Here's what most organizations don't realize: the biggest risk in any migration isn't moving the data—it's not understanding the why behind the data.
Legacy systems, particularly mainframes running COBOL code written decades ago, have become black boxes. The people who built them are retired. The business rules are buried in thousands of modules with cryptic variable names. The documentation, if it exists, is outdated.
"This process looks like the business writing a question and sending it to the mainframe SMEs and then waiting for a response," Truong observes. "That mainframe SME is then navigating and reading through COBOL code, traversing module after module, lookups and reference calls. It’s understandable that without additional tools, it takes some time for them to respond."
When you encounter a reconciliation break, you're not just debugging a technical issue—you're conducting digital archaeology, trying to reverse-engineer business requirements that were implemented 30+ years ago.
One of our global banking customers faced this exact challenge. They had 80,000 COBOL modules in their mainframe system. When their migration team encountered discrepancies during testing, it took over two months to get answers to simple questions. Their SMEs were overwhelmed, and the business team felt held hostage by their inability to understand their own system.
"When the business gets that answer they say, okay, that's helpful, but now you've spawned three more questions and so that's a painful process for the business to feel like they are held hostage a bit to the fact that they can't get answers themselves," explains Truong.
What if instead of discovering reconciliation issues during testing, you could predict and prevent them before they happen? What if business analysts could investigate discrepancies themselves in minutes instead of waiting months for SME responses?
This is exactly what our mainframe data lineage tool makes possible.
"This is the challenge we aimed to solve when we built our product. By democratizing that knowledge base and making it available for the business to get answers in plain English, they can successfully complete that conversion in a fraction of the time with far less risk," says Truong.
Here's how it works:
AI algorithms ingest your entire legacy codebase—COBOL modules, JCL scripts, database schemas, and job schedulers. Instead of humans manually navigating 80,000 modules, pattern recognition identifies the relationships, dependencies, and calculation logic automatically.
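To give a feel for the kind of static analysis involved (a deliberately simplified sketch, not a description of Zengines' actual implementation), here's how call and perform relationships might be pulled out of COBOL source to seed a dependency graph:

```python
import re

# Toy COBOL fragment; a real codebase spans tens of thousands of modules.
cobol_source = """
       PROCEDURE DIVISION.
           CALL 'ACCRUAL' USING WS-ASSET WS-RATE.
           PERFORM 2000-LOOKUP-RATE.
           CALL 'POSTBAL' USING WS-BALANCE.
"""

CALL_PATTERN = re.compile(r"CALL\s+'(\w+)'")
PERFORM_PATTERN = re.compile(r"PERFORM\s+([\w-]+)")

def extract_dependencies(source: str, module: str) -> list[tuple[str, str]]:
    """Return (caller, callee) edges for called subprograms and performed
    paragraphs -- the raw material of a module dependency graph."""
    edges = [(module, callee) for callee in CALL_PATTERN.findall(source)]
    edges += [(module, para) for para in PERFORM_PATTERN.findall(source)]
    return edges

# "INTCALC" is a made-up module name for this example.
print(extract_dependencies(cobol_source, "INTCALC"))
# [('INTCALC', 'ACCRUAL'), ('INTCALC', 'POSTBAL'), ('INTCALC', '2000-LOOKUP-RATE')]
```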
The AI doesn't just map data flow—it extracts the underlying business logic. That cryptic COBOL calculation becomes readable: "If asset type equals equity AND purchase date is before 2020, apply special accrual rate of 2.5%."
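Once extracted, that logic can be captured in a structured, human-readable form. Here's a minimal sketch of the article's example rule, with illustrative field names:

```python
from datetime import date

# The article's example rule, captured as data instead of buried in COBOL.
# Field names are illustrative, not a real schema.
rule = {
    "conditions": [("asset_type", "==", "equity"),
                   ("purchase_date", "<", date(2020, 1, 1))],
    "action": ("accrual_rate", 0.025),
}

def to_plain_english(rule: dict) -> str:
    conditions = " AND ".join(f"{f} {op} {v}" for f, op, v in rule["conditions"])
    target, value = rule["action"]
    return f"If {conditions}, set {target} to {value:.1%}."

print(to_plain_english(rule))
# If asset_type == equity AND purchase_date < 2020-01-01, set accrual_rate to 2.5%.
```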
When your new system shows $15 and your old system shows $5, business analysts can immediately trace the calculation path. They see exactly why the difference exists: perhaps the new system doesn't account for that pre-2020 equity rule embedded in the legacy code.
Now your team can make strategic decisions: Do we want to replicate this legacy rule in the new system, or is this an opportunity to simplify our business logic? Instead of technical debugging, you're having business conversations.
Let me share a concrete example of this transformation in action. A financial services company was modernizing their core system and moving off their mainframe. Like many organizations, they were running parallel testing—executing the same transactions in both old and new systems to ensure consistency.
Before implementing AI-powered data lineage, every reconciliation break meant a written question to overloaded mainframe SMEs and weeks of waiting for an answer. After implementing the solution, business analysts could trace each break to its root cause themselves, in minutes rather than months.
"The business team presents their dashboard at the steering committee and program review every couple weeks," Truong shares. "Every time they ran into a break, they have a tool and the ability to answer why that break is there and how they plan to remediate it."
The most successful migrations we've seen follow a fundamentally different approach to reconciliation:
Before you migrate anything, understand what you're moving. Use AI to create a comprehensive map of your legacy system's business logic. Know the rules, conditions, and calculations that drive your current operations.
Instead of hoping for the best, use pattern recognition to identify the most likely sources of reconciliation breaks. Focus your testing efforts on the areas with the highest risk of discrepancies.
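One simple way to operationalize that prioritization, assuming your lineage analysis can surface complexity signals such as conditional branch counts and writer counts (the scoring below is a naive illustration, not a product feature):

```python
# Hypothetical complexity signals per field, e.g. surfaced by lineage analysis:
# how many conditional branches shape the value and how many modules write it.
field_signals = {
    "interest_accrual": {"branches": 14, "writers": 6},
    "customer_name":    {"branches": 1,  "writers": 1},
    "fee_schedule":     {"branches": 9,  "writers": 4},
}

def break_risk(signals: dict) -> int:
    """Naive score: fields shaped by more conditional logic and touched by
    more modules are more likely to reconcile differently across systems."""
    return signals["branches"] * 2 + signals["writers"]

ranked = sorted(field_signals, key=lambda f: break_risk(field_signals[f]),
                reverse=True)
print(ranked)  # ['interest_accrual', 'fee_schedule', 'customer_name']
```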
When breaks occur (and they will), empower your business team to investigate immediately. No more waiting for SME availability or technical resource allocation.
Transform reconciliation from a technical debugging exercise into a business optimization opportunity. Decide which legacy rules to preserve and which to retire.
"The ability to catch that upfront, as opposed to not knowing it and waiting until you're testing pre go-live or in a parallel run and then discovering these things," Truong emphasizes. "That's why you will encounter missed budgets, timelines, etc. Because you just couldn't answer these critical questions upfront."
Here's something most organizations don't consider: this capability doesn't become obsolete after your migration. You now have a living documentation system that can answer questions about your business logic indefinitely.
Need to understand why a customer's account behaves differently? Want to add a new product feature? Considering another system change? Your AI-powered lineage tool becomes a permanent asset for business intelligence and system understanding.
"When I say de-risk, not only do you de-risk a modernization program, but you also de-risk business operations," notes Truong. "Whether organizations are looking to leave their mainframe or keep their mainframe, leadership needs to make sure they have the tools that can empower their workforce to properly manage it."
Every migration involves risk. The question is whether you want to manage that risk proactively or react to problems as they emerge.
Traditional reconciliation approaches essentially accept risk—you hope the breaks will be manageable and that you can figure them out when they happen. AI-powered data lineage allows you to mitigate risk substantially by understanding your system completely before you make changes.
The choice is yours: keep accepting that risk and reacting to breaks as they surface during testing or parallel runs, or get ahead of them by understanding your system completely before you change it.
If you're planning a migration or struggling with an ongoing reconciliation challenge, you don't have to accept the traditional pain points as inevitable. AI-powered data lineage has already transformed reconciliation for organizations managing everything from simple CRM migrations to complex mainframe modernizations.
Schedule a demo to explore how AI can turn your legacy "black box" into transparent, understandable business intelligence.