Due Diligence
Your due diligence is probably wrong
Global Advisors: a consulting leader in defining quantified strategy, decreasing uncertainty, improving decisions, achieving measurable results.
Our latest perspective – What's behind under-performing listed companies?
Outperform through the downturn
Experienced hires
We are hiring experienced top-tier strategy consultants
Quantified Strategy
Decreased uncertainty, improved decisions
Global Advisors is a leader in defining quantified strategies, decreasing uncertainty, improving decisions and achieving measurable results.
We specialise in providing highly analytical, data-driven recommendations in the face of significant uncertainty.
We utilise advanced predictive analytics to build robust strategies and enable our clients to make calculated decisions.
We support implementation of adaptive capability and capacity.
Our latest
Thoughts
Global Advisors’ Thoughts: Outperforming through the downturn AND the cost of ignoring full potential
Press drew attention last year to a slew of JSE-listed companies whose share prices had collapsed over the past few years. Some were previous investor darlings. Analysis pointed to a toxic combination of decreasing earnings growth and increased leverage. While this might be a warning to investors of a company in trouble, what fundamentals drive this combination?
In our analysis, expansion driven by the need to compensate for poor performance in the core business is a typical driver of exactly this outcome.
This article was written in January 2020 but publication was delayed due to the outbreak of Covid-19. Five months after South Africa’s first case, we update our analysis and show that core-based companies outperformed diverse peers by 29% over the period.
Management should always seek to reach full potential in the core business. Attempts to expand should target a clearly logical set of adjacencies to which existing capabilities can be applied through a repeatable business model.
In the article “Steinhoff, Tongaat, Omnia… Here’s the dead giveaway that you should have avoided these companies, says an asset manager,” (Business Insider SA, Jun 11, 2019) Helena Wasserman lists a number of Johannesburg Stock Exchange (JSE) listed shares that have plummeted in recent years.
In many cases these companies’ corresponding sectors have been declining. However, in most of the sectors there is at least one company that has outperformed the rest. What is it about these outperformers that distinguishes them from the rest?
The outperformers have typically shown strong financial performance – be that growth, ROE, ROA, RONA or asset turnover – and varying degrees of leverage. However, performance against these metrics is by no means consistent – see our analysis.
What is consistent is that the outperformers all show clearly delineated core businesses and ongoing growth towards full potential in these businesses alongside growth into clear adjacencies that protect, enhance and leverage the core. In some cases, the core may have been or is currently being redefined, typically through gradual, step-wise extension along logical adjacencies. Redefinition is particularly important in light of the digital transformation seen in many industries. The outperformers are very seldom diversified across unrelated business segments – although isolated examples such as Bidvest clearly exist in other sectors.
Analysis of the over- and underperformers in the sectors highlighted in the article shows that those following a clear core-based strategy have typically outperformed peers through the initial months of the downturn caused by the Covid-19 outbreak.
Strategy Tools
PODCAST: Effective Transfer Pricing
Our Spotify podcast discusses how to get transfer pricing right.
We discuss effective transfer pricing within organizations, highlighting the prevalent challenges and proposing solutions. The core issue is that poorly implemented internal pricing leads to suboptimal economic decisions, resource allocation problems, and interdepartmental conflict. The hosts advocate for market-based pricing over cost recovery, emphasizing the importance of clear price signals for efficient resource allocation and accurate decision-making. They stress the need for service level agreements, fair cost allocation, and a comprehensive process to manage the political and emotional aspects of internal pricing, ultimately aiming for improved organizational performance and profitability. The podcast includes case studies illustrating successful implementations and the authors’ expertise in this field.
Read more from the original article.

Fast Facts
Fast Fact: Great returns aren’t enough
Key insights
It’s not enough to just have great returns – top-line growth is just as critical.
In fact, S&P 500 investors rewarded high-growth companies more than high-ROIC companies over the past decade.
While the distinction was less clear on the JSE, what is clear is that getting a balance of growth and returns is critical.
Strong and consistent ROIC or RONA performers provide investors with a steady flow of discounted cash flows – without growth, effectively a fixed-income instrument.
Improvements in ROIC through margin improvements, efficiencies and working-capital optimisation provide point-in-time uplifts to share price.
Top-line growth provides a compounding mechanism – ROIC (and any improvements to it) compounds each year, leading to ongoing increases in share price.
However, without acceptable levels of ROIC, the benefits of compounding will be subdued and share-price appreciation will be depressed – and when ROIC is below WACC, value is destroyed.
Maintaining high levels of growth is not as sustainable as maintaining high levels of ROIC – while both typically decline as industries mature, growth is usually more affected.
Getting the right balance between ROIC and growth is critical to optimising shareholder value – the sketch below illustrates the trade-off.
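A minimal illustration of this balance uses the standard key value driver formula, value = NOPAT × (1 − growth/ROIC) / (WACC − growth); the NOPAT, WACC, ROIC and growth figures in the sketch below are illustrative assumptions, not drawn from any company in our analysis.

```python
# Sketch: how growth and ROIC interact to drive value (illustrative inputs only).
# Standard key value driver formula: value = NOPAT * (1 - g / ROIC) / (WACC - g)

def enterprise_value(nopat: float, roic: float, growth: float, wacc: float) -> float:
    """Perpetuity value of a business growing at `growth` while earning `roic`."""
    assert growth < wacc, "formula only holds when growth is below WACC"
    reinvestment_rate = growth / roic            # share of NOPAT reinvested to fund growth
    free_cash_flow = nopat * (1 - reinvestment_rate)
    return free_cash_flow / (wacc - growth)

NOPAT, WACC = 100.0, 0.10                        # hypothetical: NOPAT of 100, 10% cost of capital

for roic in (0.08, 0.10, 0.15, 0.25):
    for growth in (0.00, 0.03, 0.06):
        value = enterprise_value(NOPAT, roic, growth, WACC)
        print(f"ROIC {roic:.0%}, growth {growth:.0%} -> value {value:,.0f}")

# With ROIC below WACC (8%), growth destroys value; at ROIC equal to WACC, growth adds
# nothing; above WACC, the same growth rate compounds into materially higher value.
```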
Selected News
Quote: Trevor McCourt – Extropic CTO
“We need something like 10 terawatts in the next 20 years to make LLM systems truly useful to everyone… Nvidia would need to 100× output… You basically need to fill Nevada with solar panels to provide 10 terawatts of power, at a cost around the world’s GDP. Totally crazy.” – Trevor McCourt – Extropic CTO
Trevor McCourt, Chief Technology Officer and co-founder of Extropic, has emerged as a leading voice articulating a paradox at the heart of artificial intelligence advancement: the technology that promises to democratise intelligence across the planet may, in fact, be fundamentally unscalable using conventional infrastructure. His observation about the terawatt imperative captures this tension with stark clarity—a reality increasingly difficult to dismiss as speculative.
Who Trevor McCourt Is
McCourt brings a rare convergence of disciplinary expertise to his role. Trained in mechanical engineering at the University of Waterloo (graduating 2015) and holding advanced credentials from the Massachusetts Institute of Technology (2020), he combines rigorous physical intuition with deep software systems architecture. Prior to co-founding Extropic, McCourt worked as a Principal Software Engineer, establishing a track record of delivering infrastructure at scale: he designed microservices-based cloud platforms that improved deployment speed by 40% whilst reducing operational costs by 30%, co-invented a patented dynamic caching algorithm for distributed systems, and led open-source initiatives that garnered over 500 GitHub contributors.
This background—spanning mechanical systems, quantum computation, backend infrastructure, and data engineering—positions McCourt uniquely to diagnose what others in the AI space have overlooked: that energy is not merely a cost line item but a binding physical constraint on AI’s future deployment model.
Extropic, which McCourt co-founded alongside Guillaume Verdon (formerly a quantum technology lead at Alphabet’s X division), closed a $14.1 million Series Seed funding round in 2023, led by Kindred Ventures and backed by institutional investors including Buckley Ventures, HOF Capital, and OSS Capital. The company now stands at approximately 15 people distributed across integrated circuit design, statistical physics research, and machine learning—a lean team assembled to pursue what McCourt characterises as a paradigm shift in compute architecture.
The Quote in Strategic Context
McCourt’s assertion that “10 terawatts in the next 20 years” is required for universal LLM deployment, coupled with his observation that this would demand filling Nevada with solar panels at a cost approaching global GDP, represents far more than rhetorical flourish. It is the product of methodical back-of-the-envelope engineering calculation.
His reasoning unfolds as follows:
From Today’s Baseline to Mass Deployment:
A text-based assistant operating at today’s reasoning capability (approximating GPT-5-Pro performance) deployed to every person globally would consume roughly 20% of the current US electrical grid—approximately 100 gigawatts. This is not theoretical; McCourt derives this from first principles: transformer models consume roughly 2 × (parameters × tokens) floating-point operations; modern accelerators like Nvidia’s H100 operate at approximately 0.7 picojoules per FLOP; population-scale deployment implies continuous, always-on inference at scale.
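A rough reconstruction of that first-principles estimate is sketched below; only the 2 × parameters × tokens FLOP rule and the ~0.7 pJ/FLOP figure come from the text, while the model size, per-person token rate and population are our own illustrative assumptions.

```python
# Back-of-envelope reconstruction of the ~100 GW baseline for an always-on text assistant.
# From the text: FLOPs ~= 2 * parameters * tokens; ~0.7 pJ per FLOP on a modern accelerator.
# Assumed for illustration: model size, per-person token rate, population.

PARAMS = 1e12                  # assumed model size: ~1 trillion parameters
TOKENS_PER_SEC_PER_USER = 10   # assumed always-on throughput per person
POPULATION = 8e9               # roughly 8 billion people
JOULES_PER_FLOP = 0.7e-12      # ~0.7 picojoules per FLOP (from the text)

flops_per_second = 2 * PARAMS * TOKENS_PER_SEC_PER_USER * POPULATION
power_watts = flops_per_second * JOULES_PER_FLOP

print(f"Continuous power draw: {power_watts / 1e9:.0f} GW")  # ~110 GW, on the order of 100 GW
```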
Adding Modalities and Reasoning:
Upgrade that assistant to include video capability at just 1 frame per second (envisioning Meta-style augmented-reality glasses worn by billions), and the grid requirement multiplies by approximately 10×. Enhance the reasoning capability to match models working on the ARC-AGI benchmark – problems of human-level reasoning difficulty – and the text assistant alone requires roughly 10× the current US grid: 5 terawatts. Push further to expert-level systems capable of solving International Mathematical Olympiad problems, and the requirement reaches 100× the current grid.
Economic Impossibility:
A single gigawatt data centre costs approximately $10 billion to construct. The infrastructure required for mass-market AI deployment rapidly enters the hundreds of trillions of dollars—approaching or exceeding global GDP. Nvidia’s current manufacturing capacity would itself require a 100-fold increase to support even McCourt’s more modest scenarios.
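The cost arithmetic is equally direct; the sketch below simply combines the ~$10 billion per gigawatt and 10-terawatt figures quoted above.

```python
# Cost arithmetic for the 10-terawatt scenario, using the figures quoted in the text.

COST_PER_GW_USD = 10e9   # ~$10 billion per gigawatt of data-centre capacity
TARGET_GW = 10 * 1_000   # 10 terawatts expressed in gigawatts

total_cost_usd = TARGET_GW * COST_PER_GW_USD
print(f"Implied build-out cost: ${total_cost_usd / 1e12:,.0f} trillion")  # ~$100 trillion

# That is on the order of annual global GDP, before counting generation, transmission
# or the 100-fold increase in chip supply.
```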
Physical Reality Check:
Over the past 75 years, US grid capacity has grown remarkably consistently—a nearly linear expansion. Sam Altman’s public commitment to building one gigawatt of data centre capacity per week alone would require 3–5× the historical rate of grid growth. Credible plans for mass-market AI acceleration push this requirement into the terawatt range over two decades—a rate of infrastructure expansion that is not merely economically daunting but potentially physically impossible given resource constraints, construction timelines, and raw materials availability.
McCourt’s conclusion: the energy path is not simply expensive; it is economically and physically untenable. The paradigm must change.
Intellectual Foundations: Leading Theorists in Energy-Efficient Computing and Probabilistic AI
Understanding McCourt’s position requires engagement with the broader intellectual landscape that has shaped thinking about computing’s physical limits and probabilistic approaches to machine learning.
Geoffrey Hinton—Pioneering Energy-Based Models and Probabilistic Foundations:
Few figures loom larger in the theoretical background to Extropic’s work than Geoffrey Hinton. Decades before the deep learning boom, Hinton developed foundational theory around Boltzmann machines and energy-based models (EBMs)—the conceptual framework that treats learning as the discovery and inference of complex probability distributions. His work posits that machine learning, at its essence, is about fitting a probability distribution to observed data and then sampling from it to generate new instances consistent with that distribution. Hinton’s recognition with the 2024 Nobel Prize in Physics for “foundational discoveries and inventions that enable machine learning with artificial neural networks” reflects the deep prescience of this probabilistic worldview. More than theoretical elegance, this framework points toward an alternative computational paradigm: rather than spending vast resources on deterministic matrix operations (the GPU model), a system optimised for efficient sampling from complex distributions would align computation with the statistical nature of intelligence itself.
Michael Frank—Physics of Reversible and Adiabatic Computing:
Michael Frank, a senior scientist now at Vaire (a near-zero-energy chip company), has spent decades at the intersection of physics and computing. His research programme, initiated at MIT in the 1990s and continued at the University of Florida, Florida State, and Sandia National Laboratories, focuses on reversible computing and adiabatic CMOS—techniques aimed at reducing the fundamental energy cost of information processing. Frank’s work addresses a deep truth: in conventional digital logic, information erasure is thermodynamically irreversible and expensive, dissipating energy as heat. By contrast, reversible computing minimises such erasure, thereby approaching theoretical energy limits set by physics rather than by engineering convention. Whilst Frank’s trajectory and Extropic’s diverge in architectural detail, both share the conviction that energy efficiency must be rooted in physical first principles, not merely in engineering optimisation of existing paradigms.
Yoshua Bengio and Chris Bishop—Probabilistic Learning Theory:
Leading researchers in deep generative modelling—including Bengio, Bishop, and others—have consistently advocated for probabilistic frameworks as foundational to machine learning. Their work on diffusion models, variational inference, and sampling-based approaches has legitimised the view that efficient inference is not about raw compute speed but about statistical appropriateness. This theoretical lineage underpins the algorithmic choices at Extropic: energy-based models and denoising thermodynamic models are not novel inventions but rather a return to first principles, informed by decades of probabilistic ML research.
Richard Feynman—Foundational Physics of Computing:
Though less directly cited in contemporary AI discourse, Feynman’s 1982 lectures on the physics of computation remain conceptually foundational. Feynman observed that computation’s energy cost is ultimately governed by physical law, not engineering ingenuity alone. His observations on reversibility and the thermodynamic cost of irreversible operations informed the entire reversible-computing movement and, by extension, contemporary efforts to align computation with physics rather than against it.
Contemporary Systems Thinkers (Sam Altman, Jensen Huang):
Counterintuitively, McCourt’s critique is sharpened by engagement with the visionary statements of industry leaders who have perhaps underestimated energy constraints. Altman’s commitment to building one gigawatt of data centre capacity per week, and Huang’s roadmaps for continued GPU scaling, have inadvertently validated McCourt’s concern: even the most optimistic industrial plans require infrastructure expansion at rates that collide with physical reality. McCourt uses their own projections as evidence for the necessity of paradigm change.
The Broader Strategic Narrative
McCourt’s remarks must be understood within a convergence of intellectual and practical pressures:
The Efficiency Plateau:
Digital logic efficiency, measured as energy per operation, has stalled. Transistor capacitance plateaued around the 10-nanometre node; operating voltage is thermodynamically bounded near 300 millivolts. Architectural optimisations (quantisation, sparsity, tensor cores) improve throughput but do not overcome these physical barriers. The era of “free lunch” efficiency gains from Moore’s Law miniaturisation has ended.
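The physics of this plateau can be made concrete with the dynamic switching-energy relation E ≈ ½CV² per logic transition; the capacitance and voltage values in the sketch below are illustrative assumptions, chosen only to show that once C and V stop shrinking, energy per operation stops falling.

```python
# Why per-operation energy has stalled: dynamic switching energy E ~= 0.5 * C * V^2.
# Illustrative values only; the point is that with capacitance plateaued and voltage
# bounded from below, this term no longer shrinks with each process node.

def switching_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    return 0.5 * capacitance_farads * voltage_volts ** 2

C_EFFECTIVE = 1e-15  # assumed effective switched capacitance per logic event (~1 fF)

for voltage in (0.9, 0.7, 0.5, 0.3):
    energy_fj = switching_energy_joules(C_EFFECTIVE, voltage) * 1e15
    print(f"V = {voltage:.1f} V -> ~{energy_fj:.2f} fJ per switching event")

# Below roughly 0.3 V, reliable digital switching breaks down, so with capacitance fixed
# the energy per operation is pinned near these values regardless of further miniaturisation.
```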
Model Complexity Trajectory:
Whilst small models have improved at fixed benchmarks, frontier AI systems—those solving novel, difficult problems—continue to demand exponentially more compute. AlphaGo required ~1 exaFLOP per game; AlphaCode required ~100 exaFLOPs per coding problem; the system solving International Mathematical Olympiad problems required ~100,000 exaFLOPs. Model miniaturisation is not offsetting capability ambitions.
Market Economics:
The AI market has attracted trillions in capital precisely because the economic potential is genuine and vast. Yet this same vastness creates the energy paradox: truly universal AI deployment would consume resources incompatible with global infrastructure and economics. The contradiction is not marginal; it is structural.
Extropic’s Alternative:
Extropic proposes to escape this local minimum through radical architectural redesign. Thermodynamic Sampling Units (TSUs)—circuits architected as arrays of probabilistic sampling cells rather than multiply-accumulate units—would natively perform the statistical operations that diffusion and generative AI models require. Early simulations suggest energy efficiency improvements of 10,000× on simple benchmarks compared to GPU-based approaches. Hybrid algorithms combining TSUs with compact neural networks on conventional hardware could deliver intermediate gains whilst establishing a pathway toward a fundamentally different compute paradigm.
Why This Matters Now
The quote’s urgency reflects a dawning recognition across technical and policy circles that energy is not a peripheral constraint but the central bottleneck determining AI’s future trajectory. The choice, as McCourt frames it, is stark: either invest in a radically new architecture, or accept that mass-market AI remains perpetually out of reach—a luxury good confined to the wealthy and powerful rather than a technology accessible to humanity.
This is not mere speculation or provocation. It is engineering analysis grounded in physics, economics, and historical precedent, articulated by someone with the technical depth to understand both the problem and the extraordinary difficulty of solving it.

Polls
What determines your success?
We need your help!
We’re testing how people think about their successes and failures. It would be great if you would take two minutes to give a simple multiple-choice answer and share the poll with your friends.
Take the poll – “What determines your success?”
Then read So You Think You’re Self Aware?
Services
Global Advisors is different
We help clients to measurably improve strategic decision-making and the results they achieve through defining clearly prioritised choices, reducing uncertainty, winning hearts and minds and partnering to deliver.
Our difference is embodied in our team. Our values define us.
Corporate portfolio strategy
Define optimal business portfolios aligned with investor expectations
BUSINESS UNIT STRATEGY
Define how to win against competitors
Reach full potential
Understand your business’ core, reach full potential and grow into optimal adjacencies
Deal advisory
M&A, due diligence, deal structuring, balance sheet optimisation
Global Advisors Digital Data Analytics
14 years of quantitative and data science experience
An enabler to delivering quantified strategy and accelerated implementation
Digital enablement, acceleration and data science
Leading-edge data science and digital skills
Experts in large data processing, analytics and data visualisation
Developers of digital proof-of-concepts
An accelerator for Global Advisors and our clients
Join Global Advisors
We hire and grow amazing people
Consultants join our firm based on a fit with our values, culture and vision. They believe in and are excited by our differentiated approach. They realise that working on our clients’ most important projects is a privilege. While the problems we solve are strategic to clients, consultants recognise that solutions primarily require hard work – rigorous and thorough analysis, partnering with client team members to overcome political and emotional obstacles, and a large investment in knowledge development and self-growth.
Get In Touch
16th Floor, The Forum, 2 Maude Street, Sandton, Johannesburg, South Africa
+27 11 461 6371
