ARTIFICIAL INTELLIGENCE
An AI-native strategy firm
Global Advisors: a consulting leader in defining quantified strategy, decreasing uncertainty, improving decisions and achieving measurable results.
A Different Kind of Partner in an AI World
AI-native strategy
consulting
Experienced hires
We are hiring experienced top-tier strategy consultants
Quantified Strategy
Decreased uncertainty, improved decisions
Global Advisors is a leader in defining quantified strategies, decreasing uncertainty, improving decisions and achieving measurable results.
We specialise in providing highly analytical, data-driven recommendations in the face of significant uncertainty.
We utilise advanced predictive analytics to build robust strategies and enable our clients to make calculated decisions.
We support implementation of adaptive capability and capacity.
Our latest
Thoughts
Strategy Tools
Strategy Tools: The GE Matrix
The GE matrix is a nine-cell portfolio matrix first developed by General Electric in the 1970s as a tool for screening large portfolios of business units or product lines. It is based on the idea that the appropriate level of investment for a business depends on both the attractiveness of the market and the business's current capability in that market.

Industry attractiveness and business unit strength are each calculated by identifying a number of criteria and applying a weighting to each, producing a combined score that positions the unit on the graph. The tool is similar to the growth-share matrix in that it maps strategic business units by their position within the industry: the axes of industry attractiveness and business unit strength are comparable to the growth-share matrix's market growth and market share axes.

The tool can be used to decide which products or business units should be added to or removed from a portfolio, which markets to enter or exit, and, as a result, how investment should be prioritised across the business.
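The weighted-scoring step behind the two axes can be sketched in code. The criteria, weights and 1-5 ratings below are hypothetical examples chosen purely for illustration, not a prescribed set:

```python
# Illustrative GE matrix scoring sketch. Criteria, weights and ratings
# are hypothetical examples, not a recommended methodology.

def weighted_score(criteria):
    """Combine weighted 1-5 ratings into a single axis score."""
    total_weight = sum(w for w, _ in criteria.values())
    assert abs(total_weight - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * rating for w, rating in criteria.values())

# Industry attractiveness: {criterion: (weight, rating on a 1-5 scale)}
industry = {
    "market growth": (0.4, 4),
    "market size":   (0.3, 3),
    "profitability": (0.3, 5),
}
# Business unit strength
strength = {
    "relative share": (0.5, 2),
    "brand equity":   (0.3, 4),
    "cost position":  (0.2, 3),
}

def cell(score):
    """Map a 1-5 score to the low/medium/high thirds of the matrix."""
    return "low" if score < 2.33 else "medium" if score < 3.67 else "high"

y = weighted_score(industry)   # vertical axis
x = weighted_score(strength)   # horizontal axis
print(f"attractiveness={y:.2f} ({cell(y)}), strength={x:.2f} ({cell(x)})")
```

A unit scoring high on attractiveness but only medium on strength, as above, would typically sit in an "invest selectively to build strength" cell.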
Fast Facts
3,6% of South African retirement funds make up 80% of total value
The South African retirement industry is highly concentrated with 80% of the total fund value being held by less than 4% of registered retirement funds.
Of these, approximately 3,000 are active, most of them small: 70% of funds have assets of less than R6m.
Membership in the system is voluntary, with only around half of formally-employed workers participating, and balances are low, partly because few members preserve their funds for retirement.
There has been a substantial move to umbrella funds due to the focus on retirement fund costs and the audit requirements of underwritten funds.
Underwritten funds used to be exempt from submitting audited returns to the Pension Funds Registrar, as they were effectively registered by the insurance division of the FSB.
This exemption has now been revoked, so underwritten funds are also required to submit audited results, which incurs significant compliance costs.
Selected News
Quote: Trevor McCourt – Extropic CTO
“If you upgrade that assistant to see video at 1 FPS – think Meta’s glasses… you’d need to roughly 10× the grid to accommodate that for everyone. If you upgrade the text assistant to reason at the level of models working on the ARC AGI benchmark… even just the text assistant would require around a 10× of today’s grid.” – Trevor McCourt – Extropic CTO
The quoted remark by Trevor McCourt, CTO of Extropic, underscores a crucial bottleneck in artificial intelligence scaling: energy consumption outpaces technological progress in compute efficiency, threatening the viability of universal, always-on AI. The quote translates hard technical extrapolation into plain language—projecting that if every person were to have a vision-capable assistant running at just 1 video frame per second, or if text models achieved a level of reasoning comparable to ARC AGI benchmarks, global energy infrastructure would need to multiply several times over, amounting to many terawatts—figures that quickly reach into economic and physical absurdity.
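The order of magnitude behind the extrapolation can be reproduced with simple arithmetic. Every input below is an illustrative assumption chosen for the sketch (user count, per-user power draw, grid size), not a figure from McCourt or Extropic:

```python
# Back-of-envelope sketch of the scaling claim.
# ALL inputs are assumptions for illustration only.
USERS = 1e9          # assumed number of always-on assistant users
KW_PER_USER = 25.0   # assumed sustained datacentre power (kW) per assistant
                     # reasoning over video at 1 FPS
GRID_TW = 2.5        # assumed average global electricity generation (TW)

required_tw = USERS * KW_PER_USER * 1e3 / 1e12   # kW -> W -> TW
multiple = required_tw / GRID_TW
print(f"required ~ {required_tw:.0f} TW, roughly {multiple:.0f}x today's grid")
```

Under these assumed inputs the demand works out to about ten times today's grid; the point of the exercise is that any plausible per-user energy figure for always-on multimodal reasoning lands at terawatt scale.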
Backstory and Context of the Quote & Trevor McCourt
Trevor McCourt is the co-founder and Chief Technology Officer of Extropic, a pioneering company targeting the energy barrier limiting mass-market AI deployment. With multidisciplinary roots—a blend of mechanical engineering and quantum programming, honed at the University of Waterloo and Massachusetts Institute of Technology—McCourt contributed to projects at Google before moving to the hardware-software frontier. His leadership at Extropic is defined by a willingness to challenge orthodoxy and champion a first-principles, physics-driven approach to AI compute architecture.
The quote arises from a keynote on how present-day large language models and diffusion AI models are fundamentally energy-bound. McCourt’s analysis is rooted in practical engineering, economic realism, and deep technical awareness: the computational demands of state-of-the-art assistants vastly outstrip what today’s grid can provide if deployed at population scale. This is not merely an engineering or machine learning problem, but a macroeconomic and geopolitical dilemma.
Extropic proposes to address this impasse with Thermodynamic Sampling Units (TSUs)—a new silicon compute primitive designed to natively perform probabilistic inference, consuming orders of magnitude less power than GPU-based digital logic. Here, McCourt follows the direction set by energy-based probabilistic models and advances it both in hardware and algorithm.
McCourt’s career has been defined by innovation at the technical edge: microservices in cloud environments, patented improvements to dynamic caching in distributed systems, and research in scalable backend infrastructure. This breadth, from academic research to commercial deployment, enables his holistic critique of the GPU-centred AI paradigm, as well as his leadership at Extropic’s deep technology startup.
Leading Theorists & Influencers in the Subject
Several waves of theory and practice converge in McCourt’s and Extropic’s work:
1. Geoffrey Hinton (Energy-Based and Probabilistic Models):
Long before deep learning’s mainstream embrace, Hinton’s foundational work on Boltzmann machines and energy-based models framed learning and inference as sampling from complex probability distributions. These early probabilistic paradigms anticipated both the scaling difficulties and the algorithmic challenges that underlie today’s generative models. Hinton’s recognition, including the 2024 Nobel Prize in Physics shared with John Hopfield for foundational work on neural networks such as the Boltzmann machine, cements his stature as a theorist whose footprints underpin Extropic’s approach.
2. Michael Frank (Reversible Computing)
Frank is a prominent physicist in reversible and adiabatic computing, having led major advances at MIT, Sandia National Laboratories, and elsewhere. His research investigates how the physics of computation can reduce the fundamental energy cost of information processing, which is directly relevant to Extropic’s mission. Frank’s focus on low-energy computing provides the conceptual groundwork for approaches like TSUs.
3. Chris Bishop & Yoshua Bengio (Probabilistic Machine Learning):
Leaders like Bishop and Bengio have shaped the field’s probabilistic foundations, advocating both for deep generative models and for the practical co-design of hardware and algorithms. Their research has stressed the need to reconcile statistical efficiency with computational tractability—a tension at the core of Extropic’s narrative.
4. Alan Turing & John von Neumann (Foundations of Computing):
While not direct contributors to modern machine learning, the legacies of Turing and von Neumann persist in every conversation about alternative architectures and the physical limits of computation. The post-von Neumann and post-Turing trajectory, with a return to analogue, stochastic, or sampling-based circuitry, is directly echoed in Extropic’s work.
5. Recent Industry Visionaries (e.g., Sam Altman, Jensen Huang):
Contemporary leaders in the AI infrastructure space—such as Altman of OpenAI and Huang of Nvidia—have articulated the scale required for AGI and the daunting reality of terawatt-scale compute. Their business strategies rely on the assumption that improved digital hardware will be sufficient, a view McCourt contests with data and physical models.
Strategic & Scientific Context for the Field
- Core problem: The energy that powers AI is reaching non-linear scaling—mass-market AI could consume a significant fraction or even multiples of the entire global grid if naively scaled with today’s architectures.
- Physics bottlenecks: Improvements in digital logic are limited by physical constants: capacitance, voltage, and the energy required for irreversible computation. Energy per operation in digital logic has largely plateaued since around the 10nm node.
- Algorithmic evolution: Traditional deep learning is rooted in deterministic matrix computations, but the true statistical nature of intelligence calls for sampling from complex distributions—as foregrounded in Hinton’s work and now implemented in Extropic’s TSUs.
- Paradigm shift: McCourt and contemporaries argue for a transition to native hardware–software co-design where the core computational primitive is no longer the multiply–accumulate (MAC) operation, but energy-efficient probabilistic sampling.
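The sampling primitive contrasted with MAC operations above can be illustrated in software. The sketch below runs Gibbs sampling over a tiny Boltzmann-style energy model; the couplings and biases are arbitrary examples, and nothing here reflects Extropic's actual TSU hardware or algorithms:

```python
# Gibbs sampling from a tiny Boltzmann-style energy model.
# J and b are arbitrary illustrative values, not from any real system.
import math
import random

random.seed(0)

# Four binary spins s_i in {-1, +1}, with symmetric couplings J and biases b.
# Energy: E(s) = -1/2 * sum_ij J[i][j] s_i s_j - sum_i b[i] s_i
# States are sampled with probability proportional to exp(-E(s)).
J = [[0, 1, 0, -1],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [-1, 0, 1, 0]]
b = [0.5, -0.5, 0.0, 0.2]

def gibbs_sweep(s):
    """Resample each spin in turn from its exact conditional distribution."""
    for i in range(len(s)):
        field = b[i] + sum(J[i][j] * s[j] for j in range(len(s)) if j != i)
        p_up = 1.0 / (1.0 + math.exp(-2.0 * field))  # P(s_i = +1 | rest)
        s[i] = 1 if random.random() < p_up else -1
    return s

# Draw samples and estimate each spin's mean under the model.
s = [1, -1, 1, -1]
samples = []
for t in range(5000):
    s = gibbs_sweep(s)
    if t > 500:              # discard burn-in
        samples.append(list(s))

means = [sum(col) / len(samples) for col in zip(*samples)]
print([round(m, 2) for m in means])
```

The point of the contrast: each update here is a cheap local randomised decision rather than a dense multiply-accumulate, which is the style of computation that thermodynamic hardware aims to perform natively.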
Summary Insight
Trevor McCourt anchors his cautionary prognosis for AI’s future on rigorous cross-disciplinary insights, from physical hardware limits to probabilistic learning theory. By combining his own engineering prowess with the legacy of foundational theorists and contemporary thinkers, McCourt’s perspective is not simply one of warning but also one of opportunity: a new generation of probabilistic, thermodynamically inspired computers could rewrite the energy economics of artificial intelligence, making “AI for everyone” plausible without grid-scale insanity.

Polls
Services
Global Advisors is different
We help clients to measurably improve strategic decision-making and the results they achieve through defining clearly prioritised choices, reducing uncertainty, winning hearts and minds and partnering to deliver.
Our difference is embodied in our team. Our values define us.
Corporate portfolio strategy
Define optimal business portfolios aligned with investor expectations
Business unit strategy
Define how to win against competitors
Reach full potential
Understand your business’ core, reach full potential and grow into optimal adjacencies
Deal advisory
M&A, due diligence, deal structuring, balance sheet optimisation
Global Advisors Digital Data Analytics
14 years of quantitative and data science experience
An enabler to delivering quantified strategy and accelerated implementation
Digital enablement, acceleration and data science
Leading-edge data science and digital skills
Experts in large data processing, analytics and data visualisation
Developers of digital proof-of-concepts
An accelerator for Global Advisors and our clients
Join Global Advisors
We hire and grow amazing people
Consultants join our firm based on a fit with our values, culture and vision. They believe in and are excited by our differentiated approach. They realise that working on our clients’ most important projects is a privilege. While the problems we solve are strategic to clients, consultants recognise that solutions primarily require hard work – rigorous and thorough analysis, partnering with client team members to overcome political and emotional obstacles, and a large investment in knowledge development and self-growth.
Get In Touch
16th Floor, The Forum, 2 Maude Street, Sandton, Johannesburg, South Africa
+27 11 461 6371

