
ARTIFICIAL INTELLIGENCE

An AI-native strategy firm

Global Advisors: a consulting leader in defining quantified strategy, decreasing uncertainty, improving decisions and achieving measurable results.


A Different Kind of Partner in an AI World

AI-native strategy consulting

Experienced hires

We are hiring experienced top-tier strategy consultants

Quantified Strategy

Decreased uncertainty, improved decisions

Global Advisors is a leader in defining quantified strategies, decreasing uncertainty, improving decisions and achieving measurable results.

We specialise in providing highly analytical, data-driven recommendations in the face of significant uncertainty.

We utilise advanced predictive analytics to build robust strategies and enable our clients to make calculated decisions.

We support implementation of adaptive capability and capacity.

Our latest

Thoughts

Global Advisors’ Thoughts: Should you be restructuring (again)?


By Marc Wilson

Photo by John Chew

You don’t take a hospital visit for surgery lightly. In fact, neither do good surgeons. Most recommend conservative treatment first due to the risks and trauma involved in surgical procedures. Restructuring is the orthopaedic surgery of corporate change. Yet it is often the go-to option for leaders seeking to address a problem or spark an improvement.

Restructuring offers quick impact

It is easy to see why restructuring can be so alluring: it promises quick impact, and it will certainly deliver that. Yet in most scenarios it should be the last option you take.

Most active people have had a nagging injury at some point. Remember that debilitating foot or knee injury? Each movement brought pain, and just when things seemed better, a return to action brought the injury right back to the fore. When you visited your doctor, you were likely given two options: a programme of physiotherapy over an extended period with a good chance of success, or corrective surgery that might or might not fix the problem more quickly. Which did you choose? If you’re like me, the promise of quick pain followed by a quick solution merited serious consideration. At the same time, the prospect of undergoing surgery, with its attendant risks and no guarantee of relief, was hugely concerning.

No amount of physiotherapy will cure a crookedly-healed bone. A good orthopaedic surgeon might perform a procedure that addresses the issue, even if it is painful and carries long-term recovery consequences.

That’s restructuring. It is the only option for a “crooked bone” equivalent. It may well be the right procedure to address dysfunction, but it has risks. Orthopaedic surgery would not be prescribed to address a muscular dysfunction. Neither should restructuring be executed to deal with a problem person. Surgery would not be undertaken to address a suboptimal athletic action. Neither should restructuring be undertaken to address broken processes. And no amount of surgery will turn an unfit average athlete into a race winner. Neither will restructuring address problems with strategic positioning and corporate fitness. All of that said, a broken structure that results in lack of appropriate focus and political roadblocks can be akin to a compound fracture – no amount of physiotherapy will heal it and poor treatment might well threaten the life of the patient.

What are you dealing with: a poorly performing person, broken processes or a structure that results in poor market focus and impedes optimum function?

Perennial restructuring

Many organisations I have worked with adopt a restructuring exercise every few years. This often coincides with a change in leadership or a poor financial result, and typically occurs after a consulting intervention. When I consult with leadership teams, my warning is a rule of thumb: any major restructure will take one-and-a-half years to deliver results. This is equivalent to a full remuneration cycle plus some implementation time. The risk of failure is high: the surgery will be painful and the side-effects might be dramatic. Why?

Restructuring involves changes in reporting lines and the relationships between people. This is political change. New ways of working will be tried in an effort to build successful working relationships and please a new boss. Teams will be reformed and require time to form, storm, norm and perform. People will take time to agree, understand and embed their new roles and responsibilities. The effect of incentives will be felt somewhere down the line.

Restructuring is often attempted to avoid the medium-to-long-term work of delivering change through process change and mobilisation. Yet this under-appreciates that these and other facets of change are usually required anyway to deliver on the promise of a new structure.

Restructuring creates uncertainty in anticipation

Restructuring also impacts through anticipation. Think of the athlete waiting for surgery. Exercise might stop, mental excuses for current performance might start, dread of the impending pain and recovery might set in. Similarly, personnel waiting for a structural change typically fret over the change in their roles, their reporting relationships and begin to see excuses for poor performance in the status quo. The longer the uncertainty over potential restructuring lasts, the more debilitating the effect.

Leaders feel empowered through restructuring

The role of the leader should also be considered. Leaders often feel powerless or lack the capacity and time to implement fundamental change in processes and team performance. They can restructure definitively and feel empowered by doing so. This is equivalent to the athlete overruling the doctor’s advice and undergoing surgery, knowing that action is taking place, rather than relying on corrective therapeutic action. A great deal of introspection should be undertaken by the leader: “Am I calling for a restructure because I can, knowing that change will result?” Such action can be self-satisfying rather than remedial.

Is structure the source of the problem?

Restructuring and surgery are about people. While both may be necessary, the effects can be severe and may not fix the underlying problem. Leaders should consider the true source of underperformance and practise introspection: “Am I seeking the allure of a quick fix for a problem that requires more conservative, longer-term treatment?”



Strategy Tools

Strategy Tools: Profit from the Core


Extensive research conducted by Chris Zook and James Allen has shown that many companies have failed to deliver on their growth strategies because they have strayed too far from their core business. Successful companies operate in areas where they have established the “right to win”. The core business is that set of products, capabilities, customers, channels and geographies that maximises their ability to build a right to win. The pursuit of growth in new and exciting areas often leads companies into products, customers, geographies and channels that are distant from the core. Not only do the non-core areas of the business often suffer in their own right, they also distract management from the core business.

Profit from the Core is a back-to-basics strategy which says that developing a strong, well-defined core is the foundation of sustainable, profitable growth. Any new growth should leverage and strengthen the core.

Management following the core methodology should evaluate and prioritise growth along three cyclical steps:


Focus – reach full potential in the core

  • Define the core boundaries
  • Strengthen core differentiation at the customer
  • Drive for superior cost economics
  • Mine full potential operating profit from the core
  • Discourage competitive investment in the core

For some companies the definition of the core will be obvious, while for others much debate will be required. Executives can ask directive questions to guide the discussion:

  • What are the business’ natural economic boundaries defined by customer needs and basic economics?
  • What products, customers, channels and competitors do these boundaries encompass?
  • What are the core skills and assets needed to compete effectively within that competitive arena?
  • What is the core business as defined by those customers, products, technologies and channels through which the company can earn a return today and compete effectively with current resources?
  • What is the key differentiating factor that makes the company unique to its core customers?
  • What are the adjacent areas around the core?
  • Are the definitions of the business and industry likely to shift resulting in a change of the competitive and customer landscape?

Expand – grow through adjacencies

  • Protect and extend strengths
  • Expand into related adjacencies
  • Push the core boundaries out
  • Pursue a repeatable growth formula

Companies should expand on a measured basis, pursuing growth opportunities in immediate and sensible adjacencies to the core. A useful tool for evaluating opportunities is the adjacency map, which is constructed by identifying the key core descriptors and mapping opportunities based on their proximity to the core along each descriptor. An example adjacency map is presented below:
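The adjacency-map construction described above can be sketched as a simple scoring exercise. Everything below is a hypothetical illustration: the 0–3 proximity scale per descriptor (0 = in the core, 3 = far from it) and the example opportunities are assumptions, not from the source.

```python
# Hypothetical sketch of an adjacency map as a scoring table: each opportunity
# is rated per core descriptor on an assumed 0-3 scale (0 = in the core,
# 3 = far from it); the total is its distance from the core.
DESCRIPTORS = ("customers", "products", "channels", "geographies")

def adjacency_score(opportunity: dict) -> int:
    """Total distance from the core across all descriptors (lower = closer)."""
    return sum(opportunity[d] for d in DESCRIPTORS)

opportunities = {
    "premium line extension": {"customers": 0, "products": 1, "channels": 0, "geographies": 0},
    "new-country entry":      {"customers": 1, "products": 0, "channels": 1, "geographies": 2},
}

# Rank opportunities from closest to the core outward:
for name, scores in sorted(opportunities.items(), key=lambda kv: adjacency_score(kv[1])):
    print(f"{name}: distance {adjacency_score(scores)}")
```

Under the expand step, opportunities closest to the core would be prioritised first.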

Adjacency Map

Redefine – evaluate if the core definition should be changed

  • Pursue profit pools of the future
  • Redefine around new and robust differentiation
  • Strengthen the operating platform before redefining strategy
  • Fully value the power of leadership economics
  • Invest heavily in new capabilities

Executives should ask guiding questions to determine whether the core definition is still relevant.

  • Is the core business confronted with a radically improved business model for servicing its customers’ needs?
  • Are the original boundaries and structure of the core business changing in complicated ways?
  • Is there significant turbulence in the industry that may result in the current core definition becoming redundant?

The questions can help identify whether the company should redefine their core and if so, what type of redefinition is required:

Core redefinition

The core methodology should be followed and reviewed on an ongoing basis. Management must perform the difficult balancing act of constantly striving to grow and reach full potential within the core, looking for new adjacencies which strengthen and leverage the core, and staying alert and ready for the possibility of redefining the core.

Sources:
1. Zook, C. (2001). Profit From The Core. Cambridge, MA: Harvard Business School Press.
2. Van den Berg, G. & Pietersma, P. (2014). 25 Need-to-Know Strategy Tools. Harlow: FT Publishing.


Fast Facts

There is a positive relationship between long production run sizes and OEE


  • Evidence suggests that longer run sizes lead to increased overall equipment effectiveness (OEE).
  • OEE is a measure of how effectively manufacturing equipment is utilised and is defined as a product of machine availability, machine performance and product quality.
  • Increasing run sizes improves availability as a result of less changeover time, and performance as a result of less operator inefficiency.
  • North American facilities that previously ran at world-class OEE rates have experienced lower OEE rates due to a move towards reduced lot sizes and shifting large-volume production overseas.1
    • Shorter run sizes resulted in increased changeover frequency, which led to increased planned downtime and reduced asset utilisation.
    • As a result, OEE rates dropped from 85% to as low as 50%.1
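The OEE definition in the bullets above is a straightforward product of three fractions. A minimal sketch follows; the specific availability, performance and quality figures are illustrative assumptions chosen to reproduce the described 85%-to-50% drop, not data from the source.

```python
# A minimal sketch of the OEE definition above: the product of availability,
# performance and quality, each expressed as a fraction. The figures below
# are illustrative assumptions, not data from the source.
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall equipment effectiveness as availability x performance x quality."""
    return availability * performance * quality

# Long runs: few changeovers keep availability high, OEE near world-class.
print(f"long runs:  OEE = {oee(0.90, 0.95, 0.99):.0%}")
# Short runs: frequent changeovers cut availability and performance.
print(f"short runs: OEE = {oee(0.60, 0.88, 0.95):.0%}")
```

The sketch shows how a fall in availability alone, before any quality impact, is enough to drag OEE from roughly 85% to roughly 50%.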

Selected News

Quote: Jack Clark – Import AI


“Since 2020, we have seen a 600 000x increase in the computational scale of decentralized training projects, for an implied growth rate of about 20x/year.” – Jack Clark – Import AI

Jack Clark on Exponential Growth in Decentralized AI Training

The Quote and Its Context

Jack Clark’s statement about the 600,000x increase in computational scale for decentralized training projects over approximately five years (2020-2025) represents a striking observation about the democratization of frontier AI development.1,2,3,4 This 20x annual growth rate reflects one of the most significant shifts in the technological and political economy of artificial intelligence: the transition from centralized, proprietary training architectures controlled by a handful of well-capitalized labs toward distributed, federated approaches that enable loosely coordinated collectives to pool computational resources globally.

Jack Clark: Architect of AI Governance Thinking

Jack Clark is the Head of Policy at Anthropic and one of the most influential voices shaping how we think about AI development, governance, and the distribution of technological power.1 His trajectory uniquely positions him to observe this transformation. Clark co-authored the original GPT-2 paper at OpenAI in 2019, a moment he now reflects on as pivotal—not merely for the model’s capabilities, but for what it revealed about scaling laws: the discovery that larger models trained on more data would exhibit predictably superior performance across diverse tasks, even without task-specific optimization.1

This insight proved prophetic. Clark recognized that GPT-2 was “a sketch of the future”—a partial glimpse of what would emerge through scaling. The paper’s modest performance advances on seven of eight tested benchmarks, achieved without narrow task optimization, suggested something fundamental about how neural networks could be made more generally capable.1 What followed validated his foresight: GPT-3, instruction-tuned variants, ChatGPT, Claude, and the subsequent explosion of large language models all emerged from the scaling principles Clark and colleagues had identified.

However, Clark’s thinking has evolved substantially since those early days. Reflecting in 2024, five years after GPT-2’s release, he acknowledged that while his team had anticipated many malicious uses of advanced language models, they failed to predict the most disruptive actual impact: the generation of low-grade synthetic content driven by economic incentives rather than malicious intent.1 This humility about the limits of foresight informs his current policy positions.

The Political Economy of Decentralized Training

Clark’s observation about the 600,000x scaling in decentralized training projects is not merely a technical metric—it is a statement about power distribution. Currently, the frontier of AI capability depends on the ability to concentrate vast amounts of computational resources in physically centralized clusters. Companies like Anthropic, OpenAI, and hyperscalers like Google and Meta control this concentrated compute, which has enabled governments and policymakers to theoretically monitor and regulate AI development through chokepoints: controlling access to advanced semiconductors, tracking large training clusters, and licensing centralized development entities.3,4

Decentralized training disrupts this assumption entirely. If computational resources can be pooled across hundreds of loosely federated organizations and individuals globally—each contributing smaller clusters of GPUs or other accelerators—then the frontier of AI capability becomes distributed across many actors rather than concentrated in a few.3,4 This changes everything about AI policy, which has largely been built on the premise of controllable centralization.

Recent proof-of-concepts underscore this trajectory:

  • Prime Intellect’s INTELLECT-1 (10 billion parameters) demonstrated that decentralized training at scale was technically feasible, a threshold achievement because it showed loosely coordinated collectives could match capabilities that previously required single-company efforts.3,9

  • INTELLECT-2 (32 billion parameters) followed, designed to compete with modern reasoning models through distributed training, suggesting that decentralized approaches were not merely proof-of-concept but could produce competitive frontier-grade systems.4

  • DiLoCoX, an advancement on DeepMind’s DiLoCo technology, demonstrated a 357x speedup in distributed training while achieving model convergence across decentralized clusters with minimal network bandwidth (1 Gbps), a crucial breakthrough because communication overhead had previously been the limiting factor in distributed training.2

The implied growth rate of 20x annually suggests an acceleration curve where technical barriers to decentralized training are falling faster than regulatory frameworks or policy interventions can adapt.
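The “about 20x/year” figure can be sanity-checked directly from the 600,000x total; the result depends on the period length assumed. A sketch, with the candidate period lengths as assumptions (the quote spans roughly 2020 to 2025):

```python
# Sanity check of the implied annual growth rate: a total multiple `total`
# accumulated over `years` years implies a constant annual multiple of
# total ** (1 / years). The period lengths tried below are assumptions.
def annual_rate(total: float, years: float) -> float:
    """Constant per-year multiple implying the given total growth."""
    return total ** (1 / years)

for years in (4.5, 5.0):
    print(f"600,000x over {years} years -> about {annual_rate(600_000, years):.0f}x/year")
```

A rate of about 20x/year corresponds to a window of roughly four and a half years; a full five-year window implies closer to 14x/year.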

Leading Theorists and Intellectual Lineages

Scaling Laws and the Foundations

The intellectual foundation for understanding exponential growth in AI capabilities rests on the work of researchers who formalized scaling laws. While Clark and colleagues at OpenAI contributed to this work through GPT-2 and subsequent research, the broader field—including contributions from Jared Kaplan, Dario Amodei, and others at Anthropic—established that model performance scales predictably with increases in parameters, data, and compute.1 These scaling laws create the mathematical logic that enables decentralized systems to be competitive: a 32-billion-parameter model trained via distributed methods can approach the capabilities of centralized training at similar scales.
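As a hedged illustration of what “scales predictably” means here, the sketch below evaluates the power-law loss fit for parameter count reported by Kaplan et al. (2020); the constants are their published fit and should be treated as indicative, not exact.

```python
# An illustrative power-law scaling curve. Constants are the published
# loss-vs-parameters fit from Kaplan et al. (2020), "Scaling Laws for
# Neural Language Models"; treat them as indicative only.
N_C = 8.8e13     # characteristic non-embedding parameter count
ALPHA_N = 0.076  # scaling exponent for parameters

def predicted_loss(n_params: float) -> float:
    """Cross-entropy loss predicted purely from model size: (N_C / N) ** alpha."""
    return (N_C / n_params) ** ALPHA_N

# Larger models give predictably lower loss, regardless of task specifics:
for n in (1.5e9, 175e9):  # roughly GPT-2 and GPT-3 parameter counts
    print(f"{n:.1e} params -> predicted loss {predicted_loss(n):.2f}")
```

The smooth, monotonic form of the curve is what makes distributed training competitive in principle: reaching a given parameter scale matters more than where the compute physically sits.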

Political Economy and Technological Governance

Clark’s thinking is situated within broader intellectual traditions examining how technology distributes power. His emphasis on the “political economy” of AI reflects influence from scholars and policymakers concerned with how technological architectures embed power relationships. The notion that decentralized training redistributes who can develop frontier AI systems draws on longstanding traditions in technology policy examining how architectural choices (centralized vs. distributed systems) have political consequences.

His advocacy for polycentric governance—distributing decision-making about AI behavior across multiple scales from individuals to platforms to regulatory bodies—reflects engagement with governance theory emphasizing that monocentric control is often less resilient and responsive than systems with distributed decision-making authority.5

The “Regulatory Markets” Framework

Clark has articulated the need for governments to systematically monitor the societal impact and diffusion of AI technologies, a position he advanced through the concept of “Regulatory Markets”—market-driven mechanisms for monitoring AI systems. This framework acknowledges that traditional command-and-control regulation may be poorly suited to rapidly evolving technological domains and that measurement and transparency might be more foundational than licensing or restriction.1 This connects to broader work in regulatory innovation and adaptive governance.

The Implications of Exponential Decentralization

The 600,000x growth over five years, if sustained or accelerated, implies several transformative consequences:

On AI Policy: Traditional approaches to AI governance that assume centralized training clusters and a small number of frontier labs become obsolete. Export controls on advanced semiconductors, for instance, become less effective if 100 organizations in 50 countries can collectively train competitive models using previous-generation chips.3,4

On Open-Source Development: The growth depends crucially on the availability of open-weight models (like Meta’s LLaMA or DeepSeek) and accessible software stacks (like Prime.cpp) that enable distributed inference and fine-tuning.4 The democratization of capability is inseparable from the proliferation of open-source infrastructure.

On Sovereignty and Concentration: Clark frames this as essential for “sovereign AI”—the ability for nations, organizations, and individuals to develop and deploy capable AI systems without dependence on centralized providers. However, this same decentralization could enable the rapid proliferation of systems with limited safety testing or alignment work.4

On Clark’s Own Policy Evolution: Notably, Clark has found himself increasingly at odds with AI safety and policy positions he previously held or was associated with. He expresses skepticism toward licensing regimes for AI development, restrictions on open-source model deployment, and calls for worldwide development pauses—positions that, he argues, would create concentrated power in the present to prevent speculative future risks.1 Instead, he remains confident in the value of systematic societal impact monitoring and measurement, which he has championed through his work at Anthropic and in policy forums like the Bletchley and Seoul AI safety summits.1

The Unresolved Tension

The exponential growth in decentralized training capacity creates a central tension in AI governance: it democratizes access to frontier capabilities but potentially distributes both beneficial and harmful applications more widely. Clark’s quote and his broader work reflect an intellectual reckoning with this tension—recognizing that attempts to maintain centralized control through policy and export restrictions may be both technically infeasible and politically counterproductive, yet that some form of measurement and transparency remains essential for democratic societies to understand and respond to AI’s societal impacts.

References

1. https://jack-clark.net/2024/06/03/import-ai-375-gpt-2-five-years-later-decentralized-training-new-ways-of-thinking-about-consciousness-and-ai/

2. https://jack-clark.net/2025/06/30/import-ai-418-100b-distributed-training-run-decentralized-robots-ai-myths/

3. https://jack-clark.net/2024/10/14/import-ai-387-overfitting-vs-reasoning-distributed-training-runs-and-facebooks-new-video-models/

4. https://jack-clark.net/2025/04/21/import-ai-409-huawei-trains-a-model-on-8000-ascend-chips-32b-decentralized-training-run-and-the-era-of-experience-and-superintelligence/

5. https://importai.substack.com/p/import-ai-413-40b-distributed-training

6. https://www.youtube.com/watch?v=uRXrP_nfTSI

7. https://importai.substack.com/p/import-ai-375-gpt-2-five-years-later/comments

8. https://jack-clark.net

9. https://jack-clark.net/2024/12/03/import-ai-393-10b-distributed-training-run-china-vs-the-chip-embargo-and-moral-hazards-of-ai-development/

10. https://www.lesswrong.com/posts/iFrefmWAct3wYG7vQ/ai-labs-statements-on-governance




Services

Global Advisors is different

We help clients to measurably improve strategic decision-making and the results they achieve through defining clearly prioritised choices, reducing uncertainty, winning hearts and minds and partnering to deliver.

Our difference is embodied in our team. Our values define us.

Corporate portfolio strategy

Define optimal business portfolios aligned with investor expectations

Business unit strategy

Define how to win against competitors

Reach full potential

Understand your business’ core, reach full potential and grow into optimal adjacencies

Deal advisory

M&A, due diligence, deal structuring, balance sheet optimisation

Global Advisors Digital Data Analytics

14 years of quantitative and data science experience

An enabler to delivering quantified strategy and accelerated implementation

Digital enablement, acceleration and data science

Leading-edge data science and digital skills

Experts in large data processing, analytics and data visualisation

Developers of digital proof-of-concepts

An accelerator for Global Advisors and our clients

Join Global Advisors

We hire and grow amazing people

Consultants join our firm based on a fit with our values, culture and vision. They believe in and are excited by our differentiated approach. They realise that working on our clients’ most important projects is a privilege. While the problems we solve are strategic to clients, consultants recognise that solutions primarily require hard work – rigorous and thorough analysis, partnering with client team members to overcome political and emotional obstacles, and a large investment in knowledge development and self-growth.

Get In Touch

16th Floor, The Forum, 2 Maude Street, Sandton, Johannesburg, South Africa
+27 11 461 6371

Global Advisors | Quantified Strategy Consulting