Janet George, EVP of AI at Mastercard and MARA board member, discusses how MARA’s flexible Bitcoin mining and immersion cooling address AI’s critical energy and infrastructure challenges.
Artificial Intelligence is a powerful tool transforming global industries. At the center of this transformation sits an AI expert who has driven over $1 billion in growth through cutting-edge cloud and data center innovations: Janet George. Now serving as Executive Vice President of Artificial Intelligence at Mastercard and a board member at MARA, George brings decades of experience building high-performance computing infrastructures for manufacturing, energy, healthcare, and finance. Her unique vantage point spans AI, data centers, and energy optimization, core to MARA’s mission of deploying digital energy technologies to advance the world’s energy systems.
The following is a Q&A with Janet George, exploring her visionary insights into AI’s rapidly evolving landscape and how MARA is uniquely positioned to address industry challenges.
Can you share a bit about your journey and what excites you most about bringing your expertise to MARA’s board?
My journey in the digital infrastructure and AI space spans decades. I often say I was born into cloud, born into data, and born into AI. Early in my career, we were already handling massive volumes of data: 15, 20, sometimes 30 petabytes. To put that in perspective, that’s like managing the Library of Congress’s entire digital collection. It was clear we needed to rethink infrastructure, and that’s how cloud came to be. I’ve also spent years in the data center business, so watching how these emerging workloads have evolved has been incredibly exciting. MARA sits right at the intersection of data centers, cloud, Bitcoin, and energy; it’s the ideal mix for someone like me.
What were some of the biggest hurdles you’ve faced working in AI data centers?
In the data center business, power, cooling, and space are constant concerns. The challenge is that many existing data centers were built for older, less demanding workloads, not for the high-performance computing (HPC) demands we’ve seen emerge over the past five years. The rise of generative AI brought a seismic shift. These large language models are incredibly energy-intensive, and most data centers simply weren’t designed with them in mind. We’re talking about models with billions of parameters, which require a completely different approach to infrastructure. Unlike cloud environments, where some idle capacity can be tolerated, energy and heat are different: you can’t afford to waste them, because once they’re gone, they’re gone. That’s why we need to advance both our cooling technologies and power systems to meet the demands of this new generation of AI workloads.
Have the challenges shifted for AI operators in the current wave of AI? If so, how?
The challenge has shifted to minimizing energy waste. Today, power has become the gating factor for AI workloads. It’s no longer just about the technology or even the compute; it’s about power. When you’re running large GPU clusters, understanding how power is consumed and where it’s being wasted is absolutely critical. If we’re not efficient, we end up with idle power and unpredictable, spiky energy usage. The challenge now is about accurately forecasting demand, avoiding waste, and engaging in what I like to call retrospective planning: looking back at usage patterns to better plan for the future.
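The retrospective planning George describes can be illustrated with a minimal sketch: aggregate past power readings by hour of day to produce a naive demand forecast, then compare it against site capacity to spot idle power. All names, numbers, and the forecasting method here are illustrative assumptions, not MARA's actual systems.

```python
# Hypothetical sketch of "retrospective planning": look back at past
# GPU-cluster power readings to forecast demand and flag idle capacity.
from statistics import mean

def forecast_by_hour(readings):
    """readings: list of (hour_of_day, megawatts) samples from past usage.
    Returns the average draw observed for each hour -- a naive forecast."""
    by_hour = {}
    for hour, mw in readings:
        by_hour.setdefault(hour, []).append(mw)
    return {hour: mean(vals) for hour, vals in by_hour.items()}

def idle_power(forecast, capacity_mw):
    """Estimated unused (idle) power per hour against fixed site capacity."""
    return {hour: capacity_mw - mw for hour, mw in forecast.items()}

# Example: spiky usage at hour 9 vs. a quiet hour 2, at a 100 MW site.
history = [(9, 80.0), (9, 90.0), (2, 20.0), (2, 30.0)]
fc = forecast_by_hour(history)          # {9: 85.0, 2: 25.0}
print(idle_power(fc, capacity_mw=100))  # {9: 15.0, 2: 75.0}
```

A production system would use far richer models, but even this simple hourly average makes "unpredictable, spiky energy usage" visible as a number that can be planned against.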
What makes Bitcoin mining such a compelling solution for scaling AI?
What makes Bitcoin mining so intriguing for scaling AI is its flexibility. It’s incredibly good at powering up and powering down on demand. You can mine Bitcoin at virtually any time, which makes it an ideal tool for balancing energy loads. When you’re training or running large language models, there are often moments when excess power is available, and Bitcoin mining lets you put that surplus to work. It acts like an on-demand power user, helping absorb peaks and fill in the gaps during downtimes. That kind of dynamic energy balancing is a big win for AI infrastructure.
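The "on-demand power user" behavior described above can be sketched as a simple setpoint rule: allocate to mining whatever headroom the AI cluster leaves, capped by the mining fleet's maximum draw. This is an assumed illustration of the concept, not MARA's actual control software.

```python
# Minimal sketch (assumed logic): miners ramp up when the AI cluster
# leaves surplus power and throttle down when training demand peaks.
def mining_setpoint_mw(site_capacity_mw, ai_load_mw, miner_max_mw):
    """Power to allocate to mining: the headroom AI leaves, capped by
    the mining fleet's maximum draw, never negative."""
    surplus = max(0.0, site_capacity_mw - ai_load_mw)
    return min(surplus, miner_max_mw)

# Training peak: AI uses 95 of 100 MW, so miners throttle to the 5 MW gap.
print(mining_setpoint_mw(100.0, 95.0, 40.0))  # 5.0
# Training lull: AI drops to 30 MW; miners absorb up to their 40 MW cap.
print(mining_setpoint_mw(100.0, 30.0, 40.0))  # 40.0
```

Because mining can interrupt and resume at any point without losing work, this setpoint can be adjusted continuously, which is what makes it a useful balancing load.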
Why do you feel that MARA is better equipped to assist AI than other companies?
MARA brings some truly amazing technology to the table. Its cutting-edge immersion cooling systems can support GPUs running at extremely high power levels, and as I mentioned earlier, Bitcoin mining can serve as a dynamic load balancer for data centers. When you colocate mining with AI infrastructure, you get a powerful synergy: adaptable, software-defined energy consumption that responds in real time to workload demands.
What sets MARA apart is that it’s not just a Bitcoin company; it’s also a technology company that’s been investing in and developing forward-looking infrastructure for years. Now, as AI workloads push the limits of power and thermal management, MARA is ready to bring those innovations into modern data center environments and make a real impact.
How can MARA help solve the problem of AI data centers needing to be near water?
One of the biggest challenges for new AI data centers is their dependency on being near large water sources for cooling. Today’s cooling infrastructure doesn’t offer much adaptability, which limits where these facilities can be built. That’s where MARA’s 2PIC technology comes in; it offers an adaptable cooling solution that eliminates the need for water-based systems and drastically reduces waste. And it’s not just water waste; it reduces waste across the board: water, energy, and power.
In what ways can MARA’s operations support and benefit electric grid operators?
This is a critical area of focus, because much of our existing grid infrastructure needs to be preserved, not replaced. The challenge lies in the mismatch between how power is delivered and how compute is consumed. Power is homogeneous; it flows in a steady, uniform way. Compute, on the other hand, is highly heterogeneous—it varies in demand, intensity, and timing, especially with today’s AI workloads.
MARA is working to bridge that gap. By developing technologies and operational strategies that align compute demand with available power supply, MARA helps stabilize the grid. This creates a more flexible, responsive energy ecosystem that benefits both data centers and grid operators alike.
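One way to picture aligning compute demand with available supply is a demand-response rule: when the grid signals stress (represented here by a high price signal), flexible compute curtails so the grid keeps its power; when the grid has slack, compute ramps back up. The threshold, signal, and function names are illustrative assumptions, not a description of MARA's grid programs.

```python
# Hedged sketch of grid-responsive scheduling via a price signal.
def target_load_mw(grid_price_per_mwh, flexible_mw, curtail_above=200.0):
    """Shed all flexible load when power is scarce/expensive; otherwise
    run the flexible fleet at full draw."""
    return 0.0 if grid_price_per_mwh > curtail_above else flexible_mw

print(target_load_mw(350.0, 25.0))  # 0.0  (grid stressed: shed load)
print(target_load_mw(40.0, 25.0))   # 25.0 (grid has slack: run full)
```

Real programs curtail gradually rather than all-or-nothing, but the principle is the same: heterogeneous compute bends to match homogeneous power.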
Can you explain AI’s growing need for data centers “on the edge” and how MARA can step in?
AI’s evolution, especially with the rise of agentic applications, has created a growing demand for data centers “on the edge.” These are smaller, distributed facilities located closer to where data is generated and used, rather than relying solely on massive, centralized hubs. This shift is driven by the need for lower latency and faster inference.
Right now, there’s no clear leader in the edge data center space, which creates a huge opportunity, and MARA is stepping in to fill it. MARA is focusing on deploying infrastructure tailored for AI inference workloads, with an emphasis on efficiency. It's really a massive paradigm shift.
What are you most excited to accomplish on MARA’s board?
One of the things I love about MARA is the company’s commitment to zero-cost energy and zero waste. In this world of AI, there’s a growing amount of clutter, and with that clutter comes inefficiency. So, responsible AI isn’t just about better models or smarter algorithms; it’s about thinking beyond computation to the broader ecosystem we operate in. We can’t afford to let energy, power, or compute sit idle, because once they’re gone, they’re gone for good. MARA has the potential to meaningfully reduce that waste, and I’m thrilled to help drive that forward.