Press

Zengines CEO Caitlyn Truong Recognized as 2025 Info-Tech Women Leading IT Award Winner

November 12, 2025
Todd Stone

BOSTON, MA – November 12, 2025 – Zengines is pleased to announce that the company's CEO and Co-Founder, Caitlyn Truong, has been recognized as a winner of the 2025 Info-Tech Awards by Info-Tech Research Group, a global leader in IT research and advisory.

Truong has been named a winner in the Women Leading IT award category.

The Info-Tech Awards celebrate outstanding achievements in IT, recognizing both individual leaders and organizations that have demonstrated exceptional leadership, innovation, and impact. The Women Leading IT Award celebrates exceptional women whose strength of leadership is driving innovation and transformation in their organizations and the IT industry.

Since founding Zengines in 2020, Truong has led the development of AI-powered solutions that address two of the most pressing data management challenges facing enterprise organizations: data migration and data lineage. Under her leadership, Zengines has partnered with some of the largest enterprises to accelerate and de-risk their most critical business initiatives—from customer onboarding and system modernization to M&A integration and compliance requirements. The company's innovative approach helps organizations complete data conversions up to 80% faster while significantly reducing risk and cost, transforming processes that traditionally required large teams of specialists and months of manual work into streamlined operations achievable in minutes or days through AI-driven automation.

"I'm deeply honored by this recognition from Info-Tech and applaud their commitment to celebrating women in tech," says Caitlyn Truong, CEO of Zengines. "At Zengines, we're solving some of the most complex challenges the industry hasn't been able to crack: helping companies understand, modernize, and move their most valuable asset - their data. We're succeeding thanks to our incredible teammates - including women leaders who earned their place through grit and skill. When we amplify this power between women in tech - sharing knowledge, championing success, staying in the fight - we create leaders who know how to do hard things. That's the future worth building."

The 2025 Info-Tech Award winners were selected from a competitive pool of hundreds of candidates. The Women Leading IT Award winners were determined by their track record of innovation, leadership, and business impact, and their contribution to the advancement of women in technology through mentorship, advocacy, or initiatives that support diversity in IT.

"Women Leading IT within the 2025 Info-Tech Awards celebrates leaders whose vision and execution have driven measurable progress in innovation, inclusion, and organizational performance," says Tom Zehren, Chief Executive Officer at Info-Tech Research Group. "Congratulations to this year's honorees for strengthening their organizations through strategic leadership and opening doors for the future generation of IT leaders. Each Women Leading IT winner for 2025 exemplifies the strength of inclusive leadership that is shaping IT's next chapter."

To view the full list of winners and learn more about the Info-Tech Awards, please click here.

About Zengines

Zengines is a technology company that transforms how organizations handle data migrations and mainframe modernization. Zengines serves business analysts, developers, and transformation leaders who need to map, change, and move data across systems. With deep expertise in AI, data migration, and legacy systems, Zengines helps organizations reduce time, cost, and risk associated with their most challenging data initiatives. Learn more at zengines.ai.

About Info-Tech Research Group

Info-Tech Research Group is the world's leading research and advisory firm, proudly serving over 30,000 IT, HR, and marketing professionals. The company produces unbiased, highly relevant research and provides industry-leading advisory services to help leaders make strategic, timely, and well-informed decisions. For nearly 30 years, Info-Tech has partnered closely with teams to provide them with everything they need, from actionable tools to analyst guidance, ensuring they deliver measurable results for their organizations.

To learn more about Info-Tech Research Group or to access the latest research, visit infotech.com.

You may also like

TL;DR: The Quick Answer

LLM code analysis tools like ChatGPT and Copilot excel at explaining and translating specific COBOL programs you've already identified. Mainframe data lineage platforms like Zengines excel at discovering business logic across thousands of programs when you don't know where to look. Most enterprise modernization initiatives need both: data lineage to find what matters, LLMs to accelerate the work once you've found it.

---------------

When enterprises tackle mainframe modernization and legacy COBOL code analysis, two technologies dominate the conversation: Large Language Models (LLMs) and mainframe data lineage platforms. Both promise to reveal what your code does—but they solve fundamentally different problems.

LLMs like ChatGPT, GitHub Copilot, and IBM watsonx Code Assistant excel at interpreting and translating code you paste into them. Data lineage platforms like Zengines excel at discovering and extracting business logic across enterprise codebases—often millions of lines of COBOL—when you don't know where that logic lives.

Understanding this distinction determines whether your modernization initiative succeeds or stalls. This guide clarifies when each approach fits your actual need.

What LLMs and Data Lineage Platforms Actually Do

LLM code analysis tools provide deep explanations of specific code. They rewrite programs in modern languages, optimize algorithms, and tutor developers. If you know which program to analyze, LLMs accelerate understanding and translation.

Mainframe data lineage platforms find business logic you didn't know existed. They search across thousands of programs, extract calculations and conditions at enterprise scale, and prove completeness for regulatory requirements like BCBS 239.

The overlap matters: Both can show you what calculations do. The critical difference is scale and discovery. Zengines extracts calculation logic from anywhere in your codebase, even when you don't know where to look. LLMs explain and transform specific code once you identify it.

Most enterprise teams need both: data lineage to discover scope and extract system-wide business logic, LLMs to accelerate understanding and translation of specific programs.

How Each Tool "Shows You How Code Works"

The phrase "shows you how code works" means different things for each tool—and the distinction matters for mainframe modernization projects.

Traditional (schema-based) lineage tools show that Field A flows to Field B, but not what happens during that transformation. They map connections without revealing logic.

Code-based lineage platforms like Zengines extract the actual calculation:

PREMIUM = BASE_RATE * RISK_FACTOR * (1 + ADJUSTMENT)

...along with the conditions that govern when it applies:

IF CUSTOMER_TYPE = 'COMMERCIAL' AND REGION = 'EU'

This reveals business rules governing when logic applies across your entire system.
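To picture what that extraction looks like when it is gathered at scale, here is a minimal, purely illustrative Java sketch of one way an extracted rule could be represented. The record shape, program names, and the second rule variant are assumptions for illustration, not Zengines' actual output format.

import java.util.List;

// Illustrative only: one possible representation of business rules extracted
// by a code-based lineage tool - the formula, the condition that governs when
// it applies, and the program it was found in.
public record ExtractedRule(String program, String targetField, String formula, String condition) {

    public static void main(String[] args) {
        // Hypothetical results of a codebase-wide search for premium calculations.
        List<ExtractedRule> premiumRules = List.of(
                new ExtractedRule("PREMCALC.cbl", "PREMIUM",
                        "BASE_RATE * RISK_FACTOR * (1 + ADJUSTMENT)",
                        "CUSTOMER_TYPE = 'COMMERCIAL' AND REGION = 'EU'"),
                new ExtractedRule("PREMCAL2.cbl", "PREMIUM",
                        "BASE_RATE * RISK_FACTOR",
                        "CUSTOMER_TYPE = 'RETAIL'"));

        // Reviewing how the same calculation varies across programs.
        premiumRules.forEach(rule -> System.out.printf("%s: %s = %s  [when %s]%n",
                rule.program(), rule.targetField(), rule.formula(), rule.condition()));
    }
}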

LLMs explain code line-by-line, clarify algorithmic intent, suggest optimizations, and generate alternatives—but only for code you paste into them.

The key difference: Zengines shows you calculations across 5,000 programs without needing to know where to look. LLMs explain calculations in depth once you know which program matters. Both "show how code works," but at different scales for different purposes.

When to Use LLMs vs. Data Lineage Platforms

The right tool depends on the question you're trying to answer. Use this table to identify whether your challenge calls for an LLM, a data lineage platform, or both.

Notice the pattern: LLMs shine when you've already identified the code in question. Zengines shines when you need to find or trace logic across an unknown scope.

| Your Question | Use an LLM When... | Use Zengines When... |
| --- | --- | --- |
| Scope | "Explain what Program_X does" | "What programs are in scope for this modernization initiative?" |
| Discovery | "I'm looking at InterestCalc.cbl - explain the algorithm" | "Find all interest rate logic across the codebase - I don't know which programs contain it" |
| Extraction | "Take this one formula and optimize it" | "Extract all premium calculation formulas across 200 programs and show me the variations" |
| Dependencies | "Refactor this code to handle the new data structure" | "What breaks if I change this copybook? Show me the actual code that will fail." |
| Data Flow | "Walk me through the logic within this single program" | "Trace how data flows from File A through all programs to Report Z" |
| Business Rules | "Explain this nested IF-THEN-ELSE logic and suggest a cleaner approach" | "What business rules govern when calculation X applies vs calculation Y across the entire system?" |
| Root Cause | "Why does this specific function return unexpected values? Debug this." | "Why do System A and System B produce different results? Show me where the calculations diverge." |
| Compliance | "Document what this legacy code does for knowledge transfer" | "Prove to auditors complete data lineage with actual business logic for this regulatory metric" |

LLM vs. Data Lineage Platform: Feature Comparison

Beyond specific use cases, it helps to understand how these tools differ in design and outcomes. This comparison highlights what each tool is built for—and where each falls short.

| Dimension | LLM Code Analysis | Zengines Data Lineage |
| --- | --- | --- |
| Core Use Case | Explain, translate, or refactor specific code you've already identified | Discover, trace, and document data flows across the entire enterprise codebase |
| User Experience | Interactive Q&A - paste code, get explanations, iterate | Query-based research - search indexed codebase, visualize dependencies |
| Primary Output | Code explanations, translations, refactored snippets | Complete lineage maps, impact analysis, dependency graphs, regulatory docs |
| Success Outcome | Faster understanding and porting of known programs | Comprehensive scope, validated completeness, regulatory compliance proof |
| What You Must Know First | Which programs/files to analyze | Nothing - designed for discovery when you don't know where logic resides |
| Proves Completeness? | No - limited to what you ask about; may hallucinate details | Yes - systematic indexing enables audit trail; deterministic extraction |

How to Use LLMs and Data Lineage Together

Successful enterprise modernization initiatives use both tools strategically. Here's the workflow that works:

  1. Zengines discovers scope: "Find all programs touching customer credit calculation"—returns 47 programs with actual calculation logic extracted.
  2. Zengines diagnoses issues: "Why do System A and System B produce different results?"—shows where logic diverges across programs.
  3. LLM accelerates implementation: Take specific programs identified by Zengines and use an LLM to explain details, generate Java equivalents, and create tests (see the sketch after this list).
  4. Zengines validates completeness: Prove to auditors that the initiative covered all logic paths and transformations.
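To make step 3 concrete, here is a minimal, hedged sketch of the kind of Java equivalent an LLM might draft for the premium rule shown earlier. The class name, parameter names, rounding, and the fallback for non-commercial or non-EU customers are illustrative assumptions, not output from any actual Zengines extraction or LLM session.

import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical Java equivalent of the COBOL premium rule discussed above:
//   PREMIUM = BASE_RATE * RISK_FACTOR * (1 + ADJUSTMENT)
// applied only when CUSTOMER_TYPE = 'COMMERCIAL' AND REGION = 'EU'.
// Names, rounding, and the non-EU fallback are illustrative assumptions.
public final class PremiumCalculator {

    public static BigDecimal premium(String customerType, String region,
                                     BigDecimal baseRate, BigDecimal riskFactor, BigDecimal adjustment) {
        BigDecimal result = baseRate.multiply(riskFactor);
        if ("COMMERCIAL".equals(customerType) && "EU".equals(region)) {
            // Apply the adjustment only under the extracted governing condition.
            result = result.multiply(BigDecimal.ONE.add(adjustment));
        }
        return result.setScale(2, RoundingMode.HALF_UP);
    }

    // Tiny smoke test of the kind an LLM might generate alongside the translation.
    public static void main(String[] args) {
        System.out.println(premium("COMMERCIAL", "EU",
                new BigDecimal("100.00"), new BigDecimal("1.20"), new BigDecimal("0.05"))); // 126.00
        System.out.println(premium("COMMERCIAL", "US",
                new BigDecimal("100.00"), new BigDecimal("1.20"), new BigDecimal("0.05"))); // 120.00
    }
}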

Why Teams Confuse LLMs with Data Lineage Tools

Many teams successfully use LLMs to port known programs and assume this scales to enterprise-wide COBOL modernization. The confusion happens because:

  • 80% of programs may be straightforward—well-documented, isolated, known scope.
  • LLMs work great on this 80%—fast translation, helpful explanations.
  • The 20% with hidden complexity stops initiatives—cross-program dependencies, undocumented business rules, conditional logic spread across multiple files.

Teams don't realize they have a system-level problem until deep into the initiative when they discover programs or dependencies they didn't know existed.

The Bottom Line: Choose Based on Your Problem

LLM code analysis and mainframe data lineage platforms solve different problems:

  • LLMs excel at code-level interpretation and generation for known programs.
  • Data lineage platforms excel at system-scale discovery and extraction across thousands of programs.

The critical distinction isn't whether they can show you what code does—both can. The distinction is scale, discovery, and proof of completeness.

For enterprise mainframe modernization, regulatory compliance, and large-scale initiatives, you need both. Data lineage platforms like Zengines find what matters across your entire codebase and prove you didn't miss anything. LLMs then accelerate the mechanical work of understanding and translating what you found.

The question isn't "Which tool should I use?" It's "Which problem am I solving right now?"

See How Zengines Complements Your LLM Tools

If you're planning a mainframe modernization initiative, regulatory compliance project, or enterprise-wide code analysis, we'd love to show you how Zengines works alongside your existing LLM tools.

Schedule a demo to see our mainframe data lineage platform in action with your use case.

For nearly a decade, global banks have treated BCBS 239 compliance as an aspirational goal rather than a regulatory mandate. That era is ending.

Since January 2016, the Basel Committee's Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS 239) have required global systemically important banks to maintain complete, accurate, and timely risk data. Yet enforcement was inconsistent, and banks routinely pushed back implementation timelines.

Now regulators are done waiting. According to KPMG, banks that fail to remediate BCBS 239 deficiencies are "playing with fire."

At the heart of BCBS 239 compliance sits data lineage - the complete, auditable trail of data from its origin through all transformations to final reporting. Despite being mandatory for nearly nine years, it remains the most consistently unmet requirement.

The Data Lineage Challenge: Why Banks Deferred Implementation

From 2016 through 2023, comprehensive data lineage proved extraordinarily difficult to verify and enforce. The numbers tell the story: as of November 2023, only 2 out of 31 assessed global systemically important banks fully complied with all BCBS 239 principles. Not a single principle has been fully implemented by all banks (PwC).

Even more troubling? Progress has been glacial. Between 2019 and 2022, the average compliance level across all principles barely moved - from 3.14 to 3.17 on a scale of 1 ("non-compliant") to 4 ("fully compliant") (PwC).

Throughout this period, banks submitted implementation roadmaps extending through 2019, 2021, and beyond, citing the technical complexity of establishing end-to-end lineage across legacy systems. Many BCBS 239 programs were underfunded and lacked attention from boards and senior management (PwC). For seven years past the compliance deadline, data lineage requirements remained particularly challenging to implement and even harder to validate.

The Turning Point: Escalating Enforcement and Explicit Guidance

The Basel Committee's November 2023 progress report marked a shift in tone. Banks' progress was deemed "unsatisfactory," and regulators signaled that increased enforcement measures - including capital surcharges, restrictions on capital distribution, and other penalties - would follow (PwC).

Then came the ECB's May 2024 Risk Data Aggregation and Risk Reporting (RDARR) Guide, which provides unprecedented specificity on what compliant data lineage actually looks like - requirements that were previously open to interpretation (EY).

Daily Fines on the Table

In public statements, ECB leaders have hinted that BCBS 239 could be the next area for periodic penalty payments (PPPs)—daily fines that accrue as long as a bank remains noncompliant (KPMG). These penalties can reach up to 5% of average daily turnover for every day the infringement continues, for a maximum of six months (European Central Bank).
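For a rough sense of scale, the sketch below works through that cap with a hypothetical turnover figure and an approximate six-month day count; it is a back-of-envelope illustration, not an actual penalty calculation.

import java.math.BigDecimal;

// Back-of-envelope illustration only: upper bound on periodic penalty payment
// exposure at the published cap of 5% of average daily turnover per day of
// infringement, for at most six months (taken here as roughly 183 days).
// The turnover figure is hypothetical, not any real bank's data.
public final class PppExposureSketch {
    public static void main(String[] args) {
        BigDecimal avgDailyTurnover = new BigDecimal("100000000");               // hypothetical: EUR 100 million per day
        BigDecimal dailyCap = avgDailyTurnover.multiply(new BigDecimal("0.05")); // 5% per day -> EUR 5 million
        BigDecimal sixMonthCap = dailyCap.multiply(new BigDecimal("183"));       // approx. six months -> EUR 915 million

        System.out.println("Max daily penalty:     EUR " + dailyCap.toPlainString());
        System.out.println("Approx. six-month cap: EUR " + sixMonthCap.toPlainString());
    }
}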

This enforcement mechanism is no longer theoretical. In November 2024, the ECB imposed €187,650 in periodic penalty payments on ABANCA for failing to comply with climate risk requirements—demonstrating the regulator's willingness to deploy this tool (European Banking Authority).

Capital Consequences Are Already Here

European enforcement now includes ECB letters with findings, Pillar 2 requirement (P2R) add-ons, and fines (McKinsey & Company). These aren't hypothetical consequences.

ABN AMRO's Pillar 2 requirement increased by 0.25 percentage points to 2.25% in 2024, with the increase "mainly reflecting improvements required in BCBS 239 compliance" (ABN AMRO). That's a tangible capital cost for risk data aggregation deficiencies.

The ECB's May 2024 RDARR Guide goes further, warning that banks must "step up their efforts" or face "escalation measures." It explicitly states that deficiencies may lead to reassessment of the suitability of responsible executives—and in severe cases, their removal (EY).

U.S. Regulators Taking Similar Action

American regulators have demonstrated equal resolve on data management failures. The OCC assessed a $400 million civil money penalty against Citibank in October 2020 for deficiencies in data governance and internal controls (Office of the Comptroller of the Currency). When Citi's progress proved insufficient, regulators added another $136 million in penalties in July 2024 for failing to meet remediation milestones (FinTech Futures).

Deutsche Bank felt the consequences in 2018, failing the Federal Reserve's CCAR stress test specifically due to "material weaknesses in data capabilities and controls supporting its capital planning process"—deficiencies examiners explicitly linked to weak data management practices (CNBC, Risk.net).

Data Lineage: Explicit Requirements and Rigorous Testing

The ECB's May 2024 RDARR Guide exceeds even the July 2023 consultation draft in requiring rigorous data governance and lineage frameworks (KPMG). The specificity is unprecedented: banks need complete, attribute-level data lineage encompassing all data flows across all systems from end to end—not just subsets or table-level views.

The ECB is testing these requirements through on-site inspections that typically last up to three months and involve as many as 15 inspectors. These examinations often feature risk data "fire drills" requiring banks to produce large quantities of data on short notice (KPMG). Banks without comprehensive automated data lineage simply cannot respond adequately.

The regulatory stance continues to intensify. The ECB has announced targeted reviews of RDARR practices, on-site inspections, and annual questionnaires as key activities in its supervisory priorities work program (EY). With clearer guidance on what constitutes compliant data lineage and explicit warnings of enforcement escalation, deficiencies that were difficult to verify in previous years have become directly testable.

Solving the Hardest Part: Legacy Mainframe Lineage

BCBS 239 data lineage requirements are mandatory and now explicitly defined in regulatory guidance. But here's the uncomfortable truth: for most banks, the biggest gap isn't in modern cloud systems with well-documented APIs. It's in the legacy mainframes that still process the majority of core banking transactions.

These systems—built on COBOL, RPG, and decades-old custom code—are the "black boxes" that make BCBS 239 compliance so difficult. They hold critical risk data, but their logic is buried in thousands of modules written by engineers who retired years ago. When regulators ask "where did this number come from?", banks often cannot answer with confidence.

Zengines' AI-powered platform solves this specific challenge. We deliver complete, automated, attribute-level lineage for legacy mainframe systems - parsing COBOL code, tracing data flows through job schedulers, and exposing the calculation logic that determines how risk data moves from source to regulatory report.

This isn't enterprise-wide metadata management. It's targeted, deep lineage for the systems that have historically been impossible to document—the same systems that trip up banks during ECB fire drills and on-site inspections. Zengines produces the audit-ready evidence that satisfies examination requirements, with the granularity regulators now explicitly demand.

For banks facing P2R capital add-ons, the cost of addressing mainframe lineage gaps is minimal compared to ongoing capital charges for non-compliance - let alone the risk of periodic penalty payments accruing at up to 5% of daily turnover.

The Time to Act Is Now

BCBS 239 has required comprehensive data lineage since January 2016. With the May 2024 RDARR Guide providing explicit requirements and regulators signaling enforcement escalation, banks can no longer defer implementation—especially for legacy systems.

Zengines provides the proven technology to shine a light into mainframe black boxes, enabling banks to demonstrate compliance when regulators arrive with data requests and their enforcement toolkit.

Learn more today.

The "I" in CIO has always stood for Information, but in 2026 that responsibility takes on new urgency.

As the market pours resources into AI and enterprises face mounting pressure to manage it - whether deploying it internally, partnering with third parties who use it, or satisfying regulators who demand clarity on its use - the CIO's priority isn't another technology platform. It's data lineage and provenance as an unwavering capability.

This is what separates CIOs who treat technology management as an operational function from those who deliver trustworthy information as a strategic outcome.

Three Industry Drivers Making Data Lineage Urgent

Three industry drivers make this priority urgent:

First, AI's transformative impact on business: Gartner reports that, despite an average spend of $1.9 million on GenAI initiatives in 2024, less than 30% of AI leaders report their CEOs are happy with AI investment return—largely because organizations struggle to verify their data's fitness for AI use.

Second, the massive workforce retirement in legacy technology: 79% of respondents cited acquiring the right resources and skills to get work done as their top mainframe-related challenge, according to Forrester Research, as seasoned experts retire and take decades of institutional knowledge about critical data flows with them.

Third, the ever-increasing regulatory landscape: Cybersecurity vulnerabilities, data governance, and regulatory compliance are three of the most common risk areas expected to be included in 2026 internal audit plans, with regulators demanding verifiable data lineage across industries.

As the enterprise's Information Officer, the CIO must be accountable for the organization's ability to produce and trust information - not just operate technology systems. Understanding the complete journey of data, from origin through every transformation to final use, supports every strategic outcome CIOs need to deliver: enabling AI capabilities, satisfying regulatory requirements, and partnering confidently with third parties. Data lineage provides the technical foundation that makes trustworthy information possible across the enterprise.

The Burning Platform: Why CIOs Must Act Now

Three forces converge to create a burning platform:

First, regulatory compliance demands now span every industry - from BCBS 239 and DORA in financial services to HIPAA in healthcare to SEC analytics requirements across public companies. Regulators are enforcing data lineage mandates with substantial penalties.

Second, every business needs to demonstrate AI innovation, yet AI initiatives succeed or fail based on verified training data quality and explainability.

Third, in a connected world demanding "always on," enterprises must be agile enough to globally partner with third parties, whether serving customers through partner ecosystems or trusting data from their own vendors and service providers.

The urgency intensifies because mainframe systems house decades of critical business logic while the workforce that understands these systems is retiring, making automated lineage extraction essential before institutional knowledge disappears.

What Enterprise-Wide Data Lineage Capability Requires

Given these converging pressures, CIOs need enterprise-wide data lineage capability that captures information flows across the entire technology landscape, including legacy systems. This means automated lineage extraction from mainframes, mid-tier applications, cloud platforms, and third-party integrations - creating a comprehensive map of how data moves and transforms throughout the organization.

Manual documentation fails because it can't keep pace with system complexity and depends on human compliance. The solution requires technology that captures lineage at the technical level where data actually flows, then makes this intelligence accessible for business understanding.

For mainframe environments specifically, this means extracting lineage from COBOL and RPG code before retiring experts leave. The strategic outcome: a single, verifiable source of truth about data provenance that serves regulatory needs, AI development, and partnership confidence simultaneously.

From Operational Execution to Strategic Accountability

This shift elevates the CIO's accountability from operational execution to strategic outcomes. Rather than simply providing systems, CIOs become accountable for the infrastructure that proves information integrity and lineage.

This transforms conversations with boards and regulators from "we operate technology systems" to "we can verify our information's complete journey and quality"—a fundamentally stronger position.

The CIO role expands from technology delivery to information assurance, directly supporting enterprise risk management, innovation initiatives, and strategic partnerships through verifiable capability.

Three Strategic Business Outcomes from Data Lineage

Ultimately, data lineage capability delivers three strategic business outcomes:

  1. Regulatory compliance transforms from expensive fire drills into routine capability—examiners receive complete, accurate lineage documentation on demand across multiple industry requirements.
  2. AI and analytics initiatives launch faster with confidence because teams can verify training data quality, understand transformations, and explain model inputs to stakeholders and regulators.
  3. Third-party partnerships expand safely because the enterprise can verify data quality across organizational boundaries, whether integrating partner data to serve customers or trusting vendor information for operations.

The enterprise moves from defensive compliance postures to offensive information leverage, with the CIO providing infrastructure that turns data into a strategic asset rather than a regulatory liability.

For CIOs in 2026, owning Information means proving it - and data lineage is what makes that promise possible.

To learn more about how Zengines can support your data lineage priorities, schedule a call with our team.

Subscribe to our Insights