  • Narayana Hrudayalaya (NH) is an innovative Indian healthcare provider
  • In just 15 years NH has become one of India’s leading hospital groups
  • Founded by Dr Devi Shetty, a heart surgeon, NH treats nearly 2m patients a year
  • NH has built an international reputation for affordable quality healthcare
  • A number of large institutional investors are betting that NH can grow
  • Is NH’s model of affordable quality healthcare replicable outside of India?

Will Devi Shetty have a major influence on global healthcare?
 
 
PART 1
 
Dr. Devi Shetty, founder and chairman of Narayana Hrudayalaya (NH), an innovative Indian healthcare provider, wants to transform the way healthcare is delivered across the world. Can he do it?
 
This Commentary is in three parts. Part 1 is a general introduction to NH and its 2015 initial public offering. It describes some of NH’s internal challenges and suggests that it is reasonable to assume that these will be overcome given its position within a buoyant Indian healthcare market. Part 2 describes some key aspects of NH’s model for affordable quality healthcare. In particular, it shows how Shetty has embraced information technology and some aspects of scientific management to create mega hospitals in India that deliver sustainable, high-volume, affordable quality care. Part 3 discusses some of the challenges associated with replicating the NH model outside of India. It briefly describes Shetty’s initiative to create a medical city in the Cayman Islands to capture share from the North and South American healthcare markets. It discusses some of the barriers to replicating the model in the UK and other developed markets and suggests that, besides India, Africa - despite its complexities and challenges - might offer NH growth opportunities. It also suggests that NH could play a leading role in training a new generation of healthcare professionals specifically attuned to the vast and escalating healthcare needs of developing economies, and that this could be commercially valuable.
 
London-based financial institution CDC and a number of others think Shetty can provide the world with a new model of affordable healthcare. In December 2015 the CDC Group, owned by the British government, with an investment portfolio valued at £2.8bn, backed NH’s initial public offering (IPO) with an investment of US$48m. The IPO valued NH at US$1bn. The issue was 8.6 times oversubscribed, with most of the demand coming from foreign institutional investors. Besides CDC, other anchor investors included the government of Singapore, Morgan Stanley, Nomura, BlackRock, and Prudential.
 
Dharmesh Mehta, former managing director and CEO of Axis Capital, one of the bankers to the issue, said: “We got one of the best anchor books, with several long-term investors supporting it. Investors are bullish about the Indian healthcare space, especially hospitals, and Narayana Hrudayalaya has a unique business model, and the backing of good quality management.”
 
In the video below Shetty argues that “Healthcare of the future will not be an extension of the past.” Shetty has a good understanding of how technology is revolutionizing the way healthcare is delivered and changing its structure and organization, to such an extent that the future of healthcare will be dramatically different from what it is today. Healthcare is moving beyond the hospital towards patient self-knowledge and empowerment. Home-healthcare services facilitate enhanced doctor-patient connectivity where it was not previously possible.

 
 
(click to play the video)
 
Narayana Hrudayalaya
 
Shetty, who has more than three decades of experience as a cardiac surgeon in both the UK and India, founded NH in 2000. Since then, it has become one of India’s leading healthcare service providers, with a network of 23 multi-specialty, primary and tertiary healthcare facilities, eight heart centers, and 25 primary care facilities across 32 cities, towns and villages in India. Currently, NH has 5,600 operational beds, which it intends to increase to 30,000 by 2020. NH employs some 12,500 people, including 818 doctors, 5,400 nurses and about 1,660 visiting consultants.

In fiscal year 2015, Narayana provided care to nearly two million patients and undertook 51,456 cardiology procedures, 14,000 cardiac surgeries - which accounted for 10% of the national figure - and 184,443 dialysis procedures. Narayana posted revenues of US$219m for fiscal year 2015 and profit after tax of US$2m. For the four fiscal years that ended March 31, 2015, the company’s revenues grew at a compounded annual rate of 30%.
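To see what that headline growth rate implies, the compound-growth arithmetic can be sketched as below. The FY2011 base figure here is back-derived from the reported FY2015 revenue and the 30% rate, for illustration only; it is not taken from NH’s accounts.

```python
# Illustrative compound-growth arithmetic. Only the FY2015 revenue
# (US$219m) and the 30% compounded annual rate come from the article;
# the FY2011 figure is back-derived, not a reported number.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1 / years) - 1

fy2015_revenue = 219.0        # US$m, reported
growth_factor = 1.30 ** 4     # 30% a year compounded over four fiscal years
implied_fy2011 = fy2015_revenue / growth_factor

print(f"Four-year growth factor: {growth_factor:.2f}x")      # ~2.86x
print(f"Implied FY2011 revenue: US${implied_fy2011:.0f}m")
print(f"Check CAGR: {cagr(implied_fy2011, fy2015_revenue, 4):.0%}")
```

In other words, a 30% compounded rate means revenues roughly tripled over the four-year period, which is the scale of growth the anchor investors were buying into.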
 
Access to healthcare for millions of poor people
 
NH has one of the world’s largest telemedicine networks, with 150 centers, including 50 in Africa, where Shetty sees further expansion opportunities for NH. The service is free of charge and enhances the connectivity between remote health facilities and consultants at Narayana. Shetty, a vocal advocate of affordable healthcare, helped design the Karnataka State government’s Yeshasvini scheme, which is one of the largest self-funded micro health insurance programs in India. It covers about 2 million people who previously did not have access to healthcare. Participants pay US$1.40 per year, which gives them free access to over 800 surgical procedures in 400 hospitals. In the past 10 years, 85,000 peasant farmers have used the insurance to have surgery.
 
Challenges

NH faces some challenges. Its profit margins are low and its revenues are mainly derived from three of its largest hospitals, which concentrate on cardiac care and cardiology. As of March 2015, the company’s recent acquisitions and expansion into the Cayman Islands, where it opened a 130-bed tertiary hospital, were making losses.

However, NH’s acquisitions and expansion are strategic and their pay-offs are expected to accrue over the next four years. Also, higher yields from value-added therapies such as oncology, neurology and gastroenterology are anticipated to improve Narayana’s average revenue per operating bed (ARPOB). The company’s strategy to focus on the mid-income segment of the market is predicted to increase its utilization, given that this is a large, rapidly growing and immediately addressable market. Narayana is also advantaged by its history of efficient use of capital: it has a debt-equity ratio of only about 0.3. 

 
Market drivers

In 2015 investors might have been influenced by the falling gold, oil and real estate markets and the relative attraction of the Indian healthcare sector, buoyed by changing demographics, rising incomes, a large and expanding middle class, greater health awareness, changes in disease profiles and a rising penetration of health insurance. By 2020 India is expected to be the world’s third largest middle-class consumer market behind China and the US. By 2030 India is projected to surpass both countries with an aggregated consumer spend of some US$13 trillion. A 2019 study by the McKinsey Global Institute (MGI) suggests that if India continues to grow at its current pace, average household incomes will triple over the next two decades, making the country the world’s fifth-largest consumer economy by 2025, up from its current 12th position.

While recognizing the challenges for India’s healthcare sector, investors must have thought that NH is well positioned to take advantage of the expected explosion in India’s middle-class consumer market. Narayana has a strong brand name and is one of India’s leading healthcare companies, with significant revenue growth over the past four years. Its services appear cheaper than those of its competitors, such as Chennai’s Apollo Hospitals Limited, which has about four times the revenues of NH, and Delhi’s Fortis Healthcare, which is about three times bigger in revenue terms. This suggests that NH has scope for substantial growth.


 
PART 2
 
International attention
 
Healthcare systems worldwide consume a large and escalating share of national incomes, and costs and quality of care are the two most hotly debated issues among healthcare professionals. Does Shetty have an answer?
 
For many years, Shetty has attracted international attention. For example, in 2010 a UK prime ministerial delegation visited NH’s Medical City in Bengaluru. Vince Cable, then the UK’s Business Secretary, said: “What we’re trying to do in the UK is to get more for less. Dr Shetty has shown us a model by which we do not need to accept inferior healthcare because there’s less money, but actually how to get more out of the system for less resource.” Cable described his visit as “inspirational” and went on to say, “I just found it overwhelming. NH combines what we always see in a good health system, which is humane, humanitarian behaviour, with sound economics.”
 
The Henry Ford of heart surgery
 
Worldwide, the demand for healthcare services is rising faster than its supply. By focusing on making doctors more effective, NH has demonstrated that it can deliver what healthcare systems need: enhanced patient outcomes for less money. “We have invested in infrastructure. Similar infrastructure in the UK and the US is used for about eight to nine hours a day. Ours is used for 14 to 15 hours a day, which allows us to perform the high volume of procedures,” says Shetty. In 2009 the Wall Street Journal referred to him as “the Henry Ford of Heart Surgery”.
 
Just as Henry Ford used large factories and mass-production techniques to manufacture large numbers of quality cars that many ordinary people could afford, so Shetty developed large hospitals and a significant skill base, which he used to improve the quality of surgical procedures and reduce costs. This enabled him to offer large numbers of people access to affordable high-quality healthcare.
 
NH doctors, who are on fixed salaries, work in teams. Each team comprises a specialist, a number of junior doctors, trainees, nurses and paramedics. A bypass surgery typically takes about five hours. The actual grafting, which is the critical part, takes only an hour and is performed by an experienced specialist surgeon, while harvesting of the veins/arteries, opening and closing of the chest, suturing and other procedures are carried out by junior doctors. Nurses and paramedics handle the preparation and the aftercare of the patient. This Henry Ford-type process leaves the specialist free to perform more surgeries. As the volume of surgeries increases, outcomes improve and costs fall. Heart surgery at NH costs less than US$2,000 per operation.
 
NH’s lower costs have not come at the expense of quality. Narayana’s mortality rate for coronary artery bypass procedures is 1.27% and its infection rate 1%, figures as good as those of US hospitals. The incidence of bedsores after cardiac surgery is anywhere between 8% and 40% globally, whereas at NH it has been almost zero over the last four years.
 
It can’t be done!
 
“When we started our journey, we were discouraged by people saying that ‘there is no such thing as low-cost high-quality healthcare’, that ‘healthcare is expensive and will always be expensive’, and that only when people become wealthy can they afford quality healthcare … When I grew up, I looked at some of the richest countries in the world struggling to offer healthcare to their citizens and quickly realized that even if India became a rich country, it still would not be able to guarantee healthcare to everyone. We had to change the way we were doing things and this is what we’ve done,” says Shetty.
 
Socializing the P&L
 
UK doctors and health providers often talk about reducing the costs of healthcare, but, says Shetty, “doctors usually have no idea how much they are spending”. In contrast, at noon every day all NH doctors receive a text with NH’s previous day’s revenue, expenses and EBITDA (earnings before interest, taxation, depreciation and amortization). According to Dr. Ashutosh Raghuvanshi, NH’s CEO, “When you look at financials at the end of the month, it’s a post-mortem. When you look at them daily, you can do something to change things”. The daily data doctors receive describe their operations and the various levels of reimbursement. “It’s not just a cheap process, it’s effective,” says Raghuvanshi.
 
In the video below Shetty suggests that a key factor for the future success of NHS England will be its ability to re-invent itself, increase its focus on costs and outcomes, benchmark key functions with successful international comparators and instil strict financial discipline in doctors, “because they represent the biggest spend in healthcare systems,” says Shetty.
 
      
 (click to play the video)   
 
Information technology
 
Healthcare systems require radical change at every level in order to curb the unsustainable upward trajectory of costs, improve patient experiences and outcomes, speed the translation of research into therapies and make healthcare accessible to everyone. Information technology helps on all these fronts. NH regularly mines data to raise the quality of care and patient outcomes. Its business intelligence activities manage real-time data on 30 different parameters that track and support efficiency improvements. Those related to clinical outcomes are reviewed at a weekly meeting, where all major clinical procedures are discussed among doctors and best practices shared. In this way NH maps the cost effectiveness of each doctor.
 
PART 3
 
Affordable quality healthcare outside India
 
An example of Shetty’s model of affordable quality healthcare working effectively outside of India is Narayana Health Cayman Islands. The Cayman government has given Shetty a 200-acre site and New York investors have backed him to develop and operate a Health City. In 2014 NH opened its first phase, a 130-bed tertiary hospital targeting the elective surgery markets of North and South America. “Narayana Health City Cayman will demonstrate how over-priced and inefficient US hospitals actually are and show that lower costs and better outcomes can be achieved outside of India just as well as in Bengaluru,” says Shetty.
 
The UK
 
There are numerous barriers to adopting the Shetty model in the UK and in other developed economies. NHS England has its innovators and there are efforts to roll out innovations nationally, but these have had limited success, mainly because innovations tend to be isolated and local, and not widely known across different NHS functions or beyond sector boundaries. The lack of centralised expertise in NHS England skews perspectives and limits resources. This presents a significant obstacle to the adoption of compelling healthcare innovations, such as those demonstrated by Narayana.
 
Further, there is doctor resistance to innovation in the UK. Doctors are trained to identify and implement proven and recommended treatment protocols for various disease states; to deviate from these is to run the risk of litigation. Moreover, health professionals in the UK are increasingly time-pressed, with the result that acquiring and adopting new and innovative pathways of care takes a back seat. See Meeting the challenges of affordable quality healthcare and The end of doctors.
 
Medical tourism
 
"Medical tourism" refers to traveling to another country for medical care. The world population is aging and becoming more affluent at rates that surpass the availability of quality healthcare resources. In addition, out-of-pocket medical costs of critical and elective procedures continue to rise, while nations offering universal care, such as the UK, are faced with ever-increasing resource burdens. These drivers are forcing patients to pursue cross-border healthcare options either to save money or to avoid long waits for treatment.

In 2015 it was estimated that the worldwide medical tourism market was worth between US$50bn and US$65bn and growing at an annual rate of between 15% and 25%. In 2015 some 1.5 million US residents travelled abroad for care, up from 0.5 million in 2007. Two of their top destinations were Costa Rica and India. Costa Rica can yield savings on standard surgical procedures of between 45% and 65%, and India between 65% and 90%.

Beyond the US, the OECD estimates that there are up to 50 million medical tourists worldwide annually. The most common procedures that people undergo on medical tourism trips include heart surgery, dentistry and cosmetic procedures. People are attracted to well-known, internationally accredited hospitals, which have a flow of medical tourists, internationally trained experienced health professionals, a sustained reputation for clinical excellence and a history of healthcare innovation and achievement.

NH already attracts medical tourists from over 50 countries; it has an international reputation for excellence; many of its top health professionals have trained and gained clinical experience in the US and Europe; and it has a significant track record in high-demand areas, particularly heart surgery. This suggests that NH is well positioned to take advantage of the future growth of medical tourism, something probably taken into account by NH’s anchor investors.

 
Africa
 
Because of entrenched obstacles to change in the healthcare systems of developed economies, Shetty has indicated an interest in Africa. In the past, private providers have neglected African healthcare; it has been underserved by governments and mostly reliant on irregular help from abroad. However, there is some evidence that healthcare reform in Africa is beginning. A 2016 African Healthcare Summit suggested that African healthcare spending was expected to grow to 6.4% of GDP in 2016, making it the second highest category of government investment. A report from the International Finance Corporation (IFC) of the World Bank suggests that over the next 10 years there will be “considerable African demand” for investment in hospitals, medicines and health professionals, and that meeting this demand “can deliver strong financial returns.”
 
Healthcare providers can also take heart that a number of African countries are trying to establish or widen social insurance programs to give medical cover to more of their citizens. Further, six African countries have projected compound annual growth rates (CAGR) for 2014 through 2017 of between 7.12% and 9.7%: Rwanda, Tanzania, Mozambique, Cote d’Ivoire, the Democratic Republic of the Congo, and Ethiopia.
 
Notwithstanding this, Africa faces the dual challenge of communicable and parasitic diseases such as malaria, TB and HIV/AIDS, and growing rates of chronic conditions such as diabetes, hypertension, obesity, cancer and respiratory diseases. Increased urbanisation in many African countries, along with growing incomes and changing lifestyles, has led to a rise in the rate of chronic conditions, which are projected to overtake communicable diseases as Africa’s principal health challenge by 2030. This suggests that, despite the fledgling signs of change, African healthcare will still be challenged over the next decade. However, over the past 15 years NH has demonstrated the capabilities to meet and overcome similar challenges in India, which positions it well to succeed in Africa, where it already has a non-trivial telemedicine presence.
 
Training health professionals
 
The healthcare and wellness sectors are positioned to be significant drivers of the world economy in the 21st century. Healthcare is a global market of about US$6 trillion, and growing. Advances in medical technology, public health and governance have improved healthcare for about 30% of the world’s population, but billions of people still have no access to healthcare.
 
The WHO estimates that there is a shortage of nearly 13 million healthcare workers globally, but Shetty believes these shortages could be significantly higher. According to the Royal College of General Practitioners the shortage of doctors in the UK is the worst it has been for 40 years. One hundred primary care practices, serving 700,000 patients across Britain, are facing closure and the number of GP-patient consultations is estimated to rise from 338 million in 2013 to 441 million by 2017. UK experts warn that primary care doctors with too many patients will fail to provide adequate healthcare through current delivery methods and they say that this is expected to further drive patients to search online for health-related issues. See: Curing the Problems of General Practice.
 
Such shortages concern Shetty, who believes that the situation will only be improved with a radical change in the way healthcare is delivered. “This”, says Shetty, “will only be achieved with a change in the way health professionals are trained.” Future health professionals need to be trained for a world of e-patients. Digital classrooms will create new connections between students and health professionals and allow for access to the most current information and resources. Shetty advocates the development of a virtual global medical university, with features that include a cross-country curriculum and a reduced training period. “This is the only way we will increase the much-needed pool of healthcare talent,” says Shetty.
 
Takeaways

While change in Western healthcare systems will be neither quick nor easy, NH’s near to medium term growth will most probably come from India, the Caymans, Africa and other developing countries, where the need for quality healthcare is high and growing fast and the barriers to entry are relatively low. In time, however, the US and the UK might benefit from some of Narayana’s best practices, so that an increasing percentage of Americans may have access to high quality affordable healthcare and NHS England may be reformed to ensure its survival.
  • Chronic disease and ageing populations are breaking the episodic, hospital-centric model of care
  • Intelligence, data integration, and continuous monitoring are becoming the system’s new organising logic
  • MedTech value is shifting from standalone devices to connected platform architectures
  • Policy and capital are moving upstream - rewarding prevention and longitudinal outcomes over throughput
  • The organisations that redesign early will define the next era of healthcare; incrementalists will be left behind

The Hospital Rewritten

Hospitals and MedTech companies do not pivot on command. Their installed bases are measured in decades, not quarters. Capital equipment cycles span generations of technology. Regulatory frameworks are necessarily exacting. Clinical cultures are deliberately conservative because the cost of error is measured in morbidity and mortality, not missed earnings. Stability has long been a virtue in healthcare. Caution has been rational. Incremental improvement has been rewarded.

Yet beneath that surface stability, the foundations of the system are shifting.

Change in healthcare rarely announces itself as disruption. It accumulates quietly - in epidemiology, in demographics, in reimbursement pressure, in data infrastructure - until the cumulative tension becomes impossible to ignore. Slowly, then abruptly, operating assumptions give way.

If hospitals and MedTech firms try to carry twentieth-century logic unchanged into the 2030s, they will struggle - not because clinicians lack dedication or executives lack intelligence, but because the world those systems were designed for has fundamentally changed. Modern healthcare institutions were built to fight short, acute infections in younger populations, delivering treatment in discrete episodes and then discharging the patient. Today, the dominant challenge is different. Chronic diseases such as diabetes, cardiovascular disease, and neurodegeneration require continuous management rather than one-off intervention. Populations are older, meaning patients use services more frequently and for longer periods. Meanwhile, medical data - from wearables, remote monitoring, genomics, and imaging - now flow continuously, yet care remains organised around occasional appointments and hospital visits. The mismatch is structural. Systems optimised for episodic, acute care cannot effectively manage long-term, data-rich, chronic disease.

The hospital is not disappearing; it is being redefined. The device is not obsolete; it is being absorbed into a broader architecture. What is emerging is a different organising principle - one in which intelligence, integration, and longitudinal accountability displace episodic intervention as the core design logic of the system.

 
In this Commentary

This Commentary describes how healthcare’s core architecture is shifting from episodic intervention to continuous intelligence. It argues that demographic ageing, chronic disease, AI maturation, and capital reallocation are converging to redefine hospitals as data-driven coordination hubs and MedTech as platform ecosystems. Those who adapt early will shape the next care paradigm; those who rely on incrementalism risk structural decline.
 
The Epidemiological Reality the Infrastructure Was Not Built For

The modern hospital was engineered for acute intervention: trauma, infection, childbirth, surgical correction, organ failure. Its workflows, reimbursement logic, workforce training, and physical infrastructure all reflect that design. Patients present with symptoms; clinicians diagnose; an intervention is delivered; billing follows. The encounter is bounded. The episode ends.

That model made sense when the dominant threats to health were sudden, identifiable, and often reversible. It makes less sense in a system now defined by conditions that unfold slowly and rarely resolve.

Cardiovascular disease, type 2 diabetes, neurodegeneration, chronic kidney disease, obesity, inflammatory disorders, and cancer survivorship do not behave like infections or fractures. They progress over decades. They are not events but trajectories. Their early phases are metabolically active yet clinically silent; by the time symptoms emerge, biological damage is established and costly to contain.

In most developed economies, the majority of healthcare expenditure is now directed toward managing the long-term consequences of chronic illness rather than preventing its onset. Demographic ageing intensifies this dynamic. By 2030, a large and growing share of the population will be over sixty, and multi-morbidity - multiple interacting chronic conditions in the same patient - is becoming the rule rather than the exception. This is not a temporary surge in demand but a structural shift in the composition of illness.

The core tension is biological versus institutional time. Chronic disease evolves continuously. Glucose regulation, vascular inflammation, renal function, tumour growth - these processes change daily. Yet care is organised around intermittent appointments and hospital admissions. Disease progresses between visits. Intervention arrives late.

The consequences are predictable. Costs escalate as complications accumulate. Clinical staff are stretched managing advanced disease states that might have been mitigated earlier. Patients cycle through fragmented encounters that address acute manifestations but rarely alter underlying trajectories. Hospitals remain financially incentivised to treat complications rather than prevent them. Expanding capacity - more beds, more operating rooms, more admissions - cannot resolve this mismatch. No system can build enough acute infrastructure to compensate for decades of unmanaged chronic progression.

 
Inertia Was Rational - Until It Wasn’t

It is tempting to frame the current tension as a failure of imagination, but that would be inaccurate.

Healthcare delivery systems and MedTech companies operate within environments that reward caution. Regulatory approval is rigorous for good reason. Clinical practice evolves through evidence and replication. Procurement cycles favour proven solutions. Installed bases represent sunk capital and operational familiarity. Risk aversion is rational when human lives are at stake.

Over decades, this logic produced structural inertia. Hospitals optimised locally - reducing length of stay, improving surgical throughput, digitising records. MedTech firms iterated devices - enhancing materials, improving reliability, expanding indications. These incremental gains were meaningful because they improved outcomes and extended survival.

But incrementalism becomes a liability when the paradigm shifts.

For much of the twentieth century, episodic, device-centric healthcare aligned with disease burden and demographic structure. Today it is increasingly misaligned. Extraordinary scientific progress now coexists with structural stagnation. Institutions sense the tension but often respond defensively - protecting legacy revenue streams, amortising infrastructure, extending product lines. This response is understandable, but it is also insufficient.

Industries under structural pressure do not usually transform gradually or willingly. Instead, they adapt at the margins - cutting costs, improving efficiency, and protecting familiar business models - even as deeper tensions accumulate beneath the surface. Change is postponed, not resolved. Over time, however, those pressures compound until incremental adjustment is no longer enough and a more abrupt shift becomes unavoidable. Healthcare is now approaching that point.

 
Intelligence as Architecture

Early signals of architectural change are already visible.

In China, several leading academic institutions have begun experimenting with what might be described as AI-native clinical environments - settings in which algorithmic triage, automated documentation, and integrated reasoning systems are embedded into the core architecture of care rather than layered onto legacy hospital workflows. The distinction is subtle but decisive. In these models, AI is not treated as a decision-support accessory; it is treated as infrastructure.

One widely discussed example is Agent Hospital, developed by Tsinghua University. Often described as the world’s first AI-powered hospital prototype, the system employs coordinated AI agents - effectively virtual clinicians - designed to simulate and manage end-to-end care pathways, from triage and diagnostic reasoning to follow-up planning within a unified computational environment. The project remains experimental. Yet its importance is conceptual rather than operational. It reframes clinical workflow as something that can be computationally orchestrated from first contact to discharge, rather than sequentially handed off across fragmented institutional silos.

A parallel shift is visible in India. In early 2026, the Government of India inaugurated an AI-enabled e-ICU command centre at MMG District Hospital in Ghaziabad, Uttar Pradesh, that integrates bedside monitoring devices, hospital information systems, and real-time AI analytics into a continuous supervisory layer. Rather than episodic review, patient status is persistently evaluated through algorithmic monitoring and escalation protocols.

Similarly, Apollo Hospitals Enterprise Ltd. - India’s largest private hospital network - has announced expanded investment in AI to automate documentation, augment clinical decision-making, and streamline operational coordination across its network of more than 10,000 beds. The significance lies not in isolated pilots but in system-level integration: digital command centres, imaging analytics, triage systems, and longitudinal patient data are increasingly treated as native elements of care delivery rather than experimental add-ons.

These initiatives are not attempts to replace clinicians. They are architectural experiments. They test a more fundamental question: are diagnostic delay, fragmented records, and manual triage intrinsic to medicine - or artefacts of twentieth-century institutional design?
 
This is where the examples matter. If AI-native models demonstrate measurable gains in throughput, diagnostic accuracy, or cost per episode, the global benchmark for healthcare performance will shift. Policymakers will not ask whether AI can assist clinicians; they will ask why comparable efficiencies remain structurally unavailable in established systems. Patients, conditioned by continuous digital feedback loops in other domains, will increasingly expect responsiveness shaped by persistent data flows rather than episodic encounters.

The implication is not imitation for its own sake. It is recognition that the architecture of care - how data moves, how decisions are sequenced, how accountability is encoded - has become a variable rather than a constant. In a world of AI-native infrastructure, institutional design itself becomes a site of competition.
 
From Device Markets to Platform Architectures

This architectural shift is equally visible within MedTech.

For decades, many categories advanced through disciplined hardware optimisation. Neurosurgical shunt systems, cardiac implants, orthopaedic implants, vascular devices - each evolved through iterative refinement. The strategy was rational: it mitigated regulatory risk, leveraged installed bases, and generated durable returns.

Yet demographic and biological realities are exposing the limits of this approach. Rising incidence of age-related neurological conditions, revision-prone implants, lifetime cost scrutiny from payers, and advancing biological insight are altering the problem space. When failure-prone infrastructure meets expanding patient populations, incremental refinement begins to resemble entrenchment.

Across specialties, the strategic question is shifting. It is no longer just “Who builds the most reliable device?” but “Who owns the sensing layer, the data feedback loop, and the system architecture?” Continuous monitoring, adaptive algorithms, minimally invasive delivery, and integrated analytics transform hardware into one component within a learning ecosystem.

Value accrues less to those who sell components and more to those who orchestrate systems. In some cases, the most disruptive competitor may not be a better device manufacturer but a pharmacological, biological, or data-driven paradigm that renders hardware secondary.

MedTech’s historical incrementalism was not an error. It was contextually rational. The question now is whether the sector recognises that the context has changed.

 
Prevention Becomes Infrastructure

For decades, prevention occupied a rhetorical position within healthcare strategy - universally endorsed, operationally marginal. That era is ending.

As ageing populations collide with chronic disease expenditure, prevention shifts from moral aspiration to fiscal necessity. Governments cannot sustain indefinite downstream intervention. Payers cannot reimburse complications without demanding upstream risk modification. Prevention must therefore become measurable, regulated, and reimbursable.

This requires infrastructure: continuous monitoring integrated into predictive engines; longitudinal metabolic tracking rather than episodic measurement; multi-modal oncology detection combining molecular and imaging signals; AI systems synthesising heterogeneous data into dynamic risk stratification.

Prevention becomes operational when it is quantified and tied to outcomes. Hospitals evolve from treatment centres to risk-orchestration hubs. MedTech devices become data generators within longitudinal models rather than isolated instruments. Clinical practice expands from reactive management to trajectory modification.

None of this negates acute expertise. It contextualises it within a broader, upstream mandate.

 
Continuous Monitoring and the Dissolution of Walls

Biology does not behave episodically between appointments. Monitoring technologies are dissolving the boundary between hospital and daily life. For example, continuous glucose monitoring transformed diabetes care by replacing intermittent sampling with real-time feedback. Similar dynamics are emerging in cardiac rhythm surveillance, blood pressure monitoring, and rehabilitation adherence.

As biochemical sensing matures, the distinction between “in hospital” and “at home” will matter less than the integrity of the data loop. Hospitals will function increasingly as coordination centres. Data will flow inward from communities and homes. Intervention thresholds will be triggered by predictive analytics rather than symptomatic deterioration.
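The "intervention thresholds triggered by predictive analytics" described above can be sketched in miniature. The following illustrative Python fragment - all names, window sizes, and thresholds are hypothetical, not drawn from any deployed system - flags readings that deviate sharply from a rolling baseline, the simplest possible stand-in for an escalation protocol inside a continuous data loop:

```python
from collections import deque
from statistics import mean, stdev

def escalation_alerts(readings, window=5, z_threshold=2.0):
    """Illustrative sketch only: flag readings that deviate sharply
    from a rolling baseline. The window size and z-score threshold
    are arbitrary placeholders, not clinical parameters."""
    baseline = deque(maxlen=window)   # rolling window of recent readings
    alerts = []
    for t, value in enumerate(readings):
        if len(baseline) == window:
            mu, sigma = mean(baseline), stdev(baseline)
            # Escalate when the new reading deviates beyond the threshold
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                alerts.append((t, value))
        baseline.append(value)
    return alerts

# A stable signal produces no alerts; a sudden excursion is escalated.
print(escalation_alerts([72, 70, 71, 73, 70, 71, 72, 110]))  # → [(7, 110)]
```

A production system would of course replace this z-score heuristic with validated predictive models and wrap the loop in governance, audit, and clinical-review layers; the point here is only the shape of the loop - continuous data in, algorithmic evaluation, threshold-triggered escalation out.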

This transformation demands cybersecurity, interoperability, AI governance, and workforce upskilling. It also challenges reimbursement models. Yet its direction is clear: intelligence and integration define capability.

Hospitals that remain structurally episodic risk being overwhelmed by preventable deterioration. MedTech firms that supply hardware without integrated analytics risk commoditisation.

 
Workforce Evolution

Technology alone cannot redesign healthcare. Capability must evolve in parallel.

The clinician of 2035 will operate at the intersection of biology, data, and behavioural science. Acute expertise will remain indispensable. But longitudinal risk assessment, genomic interpretation, probabilistic reasoning, and AI-assisted decision-making will become core competencies.

Professional autonomy will not diminish; it will transform. Clinicians will interpret algorithmic insight, manage uncertainty, and contextualise risk. Institutions that invest in workforce evolution will translate technological potential into clinical impact. Those that do not will generate data without transformation.

For MedTech executives, this is not a feature upgrade but a strategic reset. The era in which a device could be sold on technical performance alone is closing. Products must be designed for workflow integration, interpretability, and embedded training from the outset, because adoption now depends on cognitive fit as much as clinical accuracy. The winners will not be those who add AI to existing portfolios, but those who redesign their offerings around how clinicians think, decide, and operate. Yesterday's playbook optimised hardware: tomorrow's will optimise decision environments.
 
India’s Inflection - And Global Implications

Nowhere is architectural redesign more visible than in India.

Several of the country’s largest tertiary centres are confronting undercapacity - not because demand has weakened, but because centralised, capital-intensive hospital logic is misaligned with contemporary patient behaviour and digital capability. In response, Indian providers are building smaller, digitally enabled hubs embedded within regional networks, supported by telemedicine, AI-assisted triage, interoperable diagnostics, and shared data infrastructure.

These asset-light nodes reduce capital intensity, accelerate deployment, and embed digital workflows from inception. What is emerging is not incremental throughput optimisation but a structural redesign of care delivery.

This matters because global MedTech growth is compressing. The United States and Europe - together representing roughly 73% of the global market - face maturing procedure volumes, lengthening capital cycles, intensifying pricing pressure, and diminishing marginal gains from incremental innovation. Following a post-pandemic rebound, growth has cooled to low single digits. Shareholder returns have moderated, and scrutiny of R&D productivity has intensified.

India’s experimentation therefore carries strategic weight. As policy and capital realign around prevention and longitudinal outcomes, investment is flowing toward distributed platforms, AI-enabled diagnostics, and prevention infrastructure. Hospitals that reposition as intelligence hubs will attract partnerships. MedTech firms that articulate credible platform strategies - integrating hardware, software, connectivity, and data - will command valuation premiums.

India is not just expanding access. It is prototyping next-generation care architecture under fiscal constraint. Western companies that engage early will not only access growth; they will acquire structural insight into the future design of healthcare itself.

 
Takeaway: The Inflection Is Structural - Not Cyclical

Healthcare transformation does not arrive with the velocity of consumer technology. It moves through regulatory frameworks, professional norms, reimbursement models, and deeply embedded institutional habits. It is negotiated, not viral. But its pace should not be mistaken for fragility.

What is unfolding is not a temporary disruption. It is a structural inflection.

Demographic ageing is accelerating demand while shrinking the workforce. Chronic disease is compounding complexity and cost. Continuous data streams from wearables, imaging, genomics, and remote monitoring are expanding the observable surface of health. AI systems are crossing the threshold from experimentation to operational utility. Policy is increasingly aligned with prevention and value-based care. Capital is migrating toward platform models and data infrastructure.

Individually, each pressure can be managed with incremental reform. Together, they overwhelm incrementalism.

The hospital, as currently configured, cannot indefinitely absorb rising chronic load, expanding data flows, workforce scarcity, and reimbursement reform without redesign. Nor can MedTech remain defined by standalone devices competing on marginal hardware improvements. The centre of gravity is shifting - from physical throughput to intelligence orchestration.

This transition will not be smooth. There will be regulatory drag. There will be cultural resistance. There will be investments that age poorly and pilots that never scale. But strategic ambiguity about direction no longer exists.

The trajectory is clear:
  • From episodic treatment facilities to distributed health intelligence networks.
  • From device sales to integrated, continuously learning platforms.
  • From reactive intervention to proactive optimisation.
  • From procedural volume to outcome ownership.
 
The hospital is not disappearing. It is being rewritten as a health intelligence hub - coordinating data, analytics, and intervention across a distributed ecosystem. The device is not obsolete. It is becoming a sensor, actuator, and data node within that ecosystem.

The strategic question is not whether this shift will occur. It is whether your organisation will architect it - or be forced to adapt to architectures defined by others.

Healthcare professionals who engage early will shape new standards of care. MedTech executives who redesign their business models around intelligence, interoperability, and longitudinal value will define the next competitive frontier.

Those who rely on incremental optimisation of legacy models may continue to perform - until they do not. This is not a technology cycle. It is a structural reconfiguration of how health is delivered, measured, and monetised.

The window for deliberate positioning is open. It will not remain so indefinitely.
  • In high-income countries populations are aging
  • By 2050 the world population of people over 60 is projected to reach 2bn
  • Age-related low back pain is the highest contributor to disability in the world
  • Over 80% of people will experience back pain at some point in their life
  • Older people with back pain have a higher chance of dying prematurely
  • The causes of back pain are difficult to determine which presents challenges for the diagnosis and management of the condition
  • The $100bn-a-year American back pain industry is “ineffective”
  • Each year some 10,000 spinal fusion surgeries are carried out in the UK and 300,000 in the US
  • 20% of spinal fusion surgeries are undertaken without good evidence
  • In 10 to 39% of spine surgery patients pain continues or worsens after surgeries
 
Age of the aged and low back pain
 
A triumph of 20th century medicine is that it has created the “age of the aged”. By 2050 the world population of people aged 60 and older is projected to be 2bn, up from 900m in 2015. Today, there are 125m people aged 80 and older, and by 2050 there are expected to be 434m people in this age group worldwide. The average age of the UK population has reached 40. Some 22% will be over 65 by 2031, exceeding the percentage of the population under 25, and 33% of people born today in the UK can expect to live to 100. However, this medical success is the source of rapidly increasing age-related disorders, which present significant challenges for the UK and other high-income nations. Low back pain (LBP) is the most common age-related pain disorder, and is ranked as the highest contributor to disability in the world.
 
At some point back pain affects 84% of all adults in developed economies. Research published in 2017 in the journal Scoliosis and Spinal Disorders suggests that LBP is the most common health problem among older adults that results in pain and disability. The over-65s are the second most common age group to seek medical advice for LBP, which represents a significant and increasing workload for health providers. Each year back pain costs the UK and US economies some £5bn and more than US$635bn respectively in medical treatment and lost productivity. LBP accounts for 11% of the total disability of the respective populations. This Commentary discusses therapies for LBP and describes the changing management landscape for this vast and rapidly growing condition.

 

Your spine and LBP

 

Your spine, which supports your back, consists of 24 vertebrae - bones stacked on top of one another. At the bottom of your spine, below your vertebrae, are the bones of your sacrum and coccyx. Threading through the entire length of your vertebrae is your spinal cord, which transmits signals from your brain to the rest of your body. Your spinal cord ends in your lower back and continues as a series of nerves that resemble a horse’s tail, hence its medical name, ‘cauda equina’. Between each pair of vertebrae is a disc. In younger people discs have a high water content, which allows them to act like shock absorbers. During the normal aging process discs lose much of their water content and degenerate. Such degenerative spinal structures may result in a herniated disc, when the disc nucleus extrudes through the disc’s outer fibres, or in compression of nerve roots, which may lead to radiculopathy. This condition is more commonly known as sciatica: pain caused by compression of a spinal nerve root in the lower back, often associated with the degeneration of an intervertebral disc, which can manifest as pain, numbness, or weakness of the buttock and outer side of the leg.

 

Challenges in diagnosis
 
Because your back comprises so many connected tissues - bones, muscles, ligaments, nerves, tendons, and joints - it is often difficult for doctors to say with confidence what causes back pain, even with the help of X-rays and MRI scans. Usually, LBP does not have a serious cause. In the majority of cases LBP will reduce, and often disappear, within 4 to 6 weeks, and can therefore be self-managed by keeping mobile and taking over-the-counter painkillers. However, in a relatively small proportion of people with LBP the pain and disability can persist for many months or even years, and once LBP has been present for more than a year few people return to normal activities. There is insufficient evidence to suggest definitive management pathways for this group, which accounts for the majority of the health and social costs associated with LBP.
 
Assessing treatment options for back pain

Ranjeev Bhangoo, a consultant neurosurgeon at King’s College Hospital Trust, London, and the London Neurosurgery Partnership, describes the nature and role of intervertebral discs and how treatment options should be assessed.

“When a person presents with a problem in the lower back, which might manifest as leg or arm pain, you need to ask 3 questions: (i) is the history of the pain compatible with a particular disc causing the problem? (ii) does an examination suggest that a particular disc is causing a problem? and (iii) does a scan show that the disc you thought was the problem is the problem? If all 3 answers align, then there may be good reason to consider treatment options. If the 3 answers are not aligned, be wary of a surgeon suggesting intervention, because 90% of us will experience back pain at some point in our lives, and 90% of the population don’t need back surgery.”
 
 
Back pain requiring immediate medical attention
 
Although the majority of LBP tends to be benign and temporary, people should seek immediate medical advice if their back pain is associated with certain red flags such as loss of bladder control; loss of weight, fever, upper back or chest pain; or if there is no obvious cause for the pain; or if the pain is accompanied by weakness, loss of sensation or persistent pins and needles in the lower limbs. Also, people with chronic lifetime conditions such as cancer should pay particular attention to back pain.
 
Epidemiology of LBP

Back pain affects approximately 700m people worldwide. A 2011 report by the US Institute of Medicine estimates that 100m Americans are living with chronic pain - more than the total affected by heart disease, cancer, and diabetes combined. This represents a vast market for therapies that include surgery and the prescription of opioids. Estimates of the prevalence of LBP vary significantly between studies. There is no convincing evidence that age affects the prevalence of back pain, and published data do not distinguish between LBP that persists for more than, or less than, a year. Each year LBP affects some 33% of UK adults, and around 20% of these - about 2.8m - will consult their GP. One year after a first episode of back pain, 62% of people still experience pain, and 16% of those initially unable to work are not working after 1 year. Typically, in about 60% of cases pain and disability improve rapidly during the first month after onset.

 

Non-invasive therapies for LBP

The most common non-invasive treatment for LBP is non-steroidal anti-inflammatory drugs (NSAIDs). Other therapies include paracetamol, oral steroids, gabapentin/pregabalin, opioids, muscle relaxants, antidepressants, chiropractic manipulation, osteopathy, epidural injections, transcutaneous electrical nerve stimulation (TENS), therapeutic ultrasound - which uses vibration to deliver heat and energy to parts of the lower back - physiotherapy, massage, and acupuncture.

 
Prelude to surgery
 
Despite the range of non-invasive therapies for LBP, the incidence of lumbar spinal fusion surgery for ordinary LBP has increased significantly over the past 2 decades without definitive evidence of the efficacy of the procedure. Recent guidelines from UK and US regulatory bodies have instructed doctors to consider more conservative therapies for the management of back pain, and this has resulted in a reduction in the incidence of spinal fusion surgeries.
 
Notwithstanding the clear recognition of the paucity of evidence for reliable rates of improvement following fusion for back pain, it does not follow that fusions should never be done; indeed, there are many instances where fusions are strongly supported by evidence. The gold standard for diagnosing degenerative disc disease is MRI evidence, which has formed the principal basis for surgical decisions in older adults. However, studies suggest that although MRI evidence indicates that degenerative change in the lumbar spine is common among people over 60, the overwhelming majority do not have chronic LBP.
 
Increasing prevalence of spinal fusion surgery
 
Each year, NHS England undertakes some 10,000 spinal surgeries for LBP at a cost of some £200m. This is in addition to the large and growing number of patients receiving epidurals, which cost the NHS about £9bn a year and for which there is also little evidence of efficacy. In the US more than 300,000 back surgeries are performed each year. In 10 to 39% of these cases, pain may continue or even worsen after surgery - a condition known as ‘failed back surgery syndrome’. In the US, about 80,000 new cases of failed back surgery syndrome arise each year. Pain after back surgery is difficult to treat, and many patients are obliged to live with pain for the rest of their lives, which causes significant disability.
  
Back pain and premature death
 
A study by researchers from the University of Sydney published in 2017 in the European Journal of Pain found that older people with persistent chronic back pain have a higher chance of dying prematurely. The study examined the prevalence of back pain in nearly 4,400 Danish twins over 70. They then compared their findings with the death registry and concluded that, "Older people reporting spinal pain have a 13% increased risk of mortality per year lived, but the connection is not causal." According to lead author Matthew Fernandez, “This is a significant finding as many people think that back pain is not life-threatening.” Previous research has suggested that chronic pain can wear down peoples’ immune systems and make them more vulnerable to disease.
 
Spinal fusion
 
While a relatively small group of elite spine surgeons, mostly from premier medical institutions, regularly carry out essential complex surgeries required for dire and paralysis-threatening conditions such as traumatic injuries, spinal tumors, and congenital spinal abnormalities, the majority of procedures undertaken by a significant number of spine surgeons have been elective fusion procedures for people diagnosed with pain referred to as “axial”, “functional” and “non-specific”. People most likely to benefit from spine surgery are the young, fit and healthy, according to a study undertaken by the American Spine Research Association. Notwithstanding, the study also suggests that the typical American candidate for spinal fusion surgery is an overweight, over-55-year-old smoker on opioids.
 
Steady growth projected for the spinal fusion market

The spine surgery market is relatively mature and dominated by a few global corporations: Medtronic, DePuy, Stryker, and Zimmer-Biomet. According to a 2017 report from the consulting firm GlobalData, the market for spinal fusion - which includes spinal plating systems, interbody devices, vertebral body replacement devices, and pedicle screw systems - is set to rise from approximately US$7bn in 2016 to US$9bn by 2023, representing a compound annual growth rate of 3.4%. The increasing prevalence of age-related degenerative spinal disorders, continued technological advances in spinal fusion surgeries - such as expandable interbody cages and navigation systems - and the increased adoption of minimally invasive techniques have driven this relatively steady market growth.
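As a rough sanity check on these figures, the growth rate implied by the two endpoints can be recovered from the standard compound-annual-growth-rate formula over the seven years from 2016 to 2023 (the small gap to the reported 3.4% reflects rounding of the US$7bn and US$9bn endpoints):

```latex
\mathrm{CAGR} = \left(\frac{V_{2023}}{V_{2016}}\right)^{1/7} - 1
              = \left(\frac{9\ \mathrm{bn}}{7\ \mathrm{bn}}\right)^{1/7} - 1 \approx 3.7\%
```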
 
Spinal fusion surgery

Lumbar spinal fusion surgery has been performed for decades. It is a technique that unites - fuses - two or more vertebrae to eliminate the motion between them. The procedure involves placing a bone graft around the spine, which, over time, heals like a fracture and joins the vertebrae together. The surgery takes away some spinal flexibility, but since most spinal fusions involve only small segments of the spine, the surgery does not limit motion significantly.
 
Lumbar spinal fusion

Fusion using bone taken from the patient - autograft - has a long history of use, results in predictable healing, and is currently the “gold standard” source of bone for a fusion. One alternative is an allograft - cadaver bone typically acquired through a bone bank. In addition, several artificial bone graft materials have been developed, including: (i) demineralized bone matrices (DBMs), created by removing calcium from cadaver bone; without the mineral, the bone can be changed into a putty or gel-like consistency and used in combination with other grafts, and it may also contain proteins that help in bone healing; (ii) bone morphogenetic proteins (BMPs), powerful synthetic bone-forming proteins that promote fusion and have FDA approval for certain spine procedures; and (iii) ceramics, synthetic calcium/phosphate materials similar in shape and consistency to the patient’s own bone.
 
Different approaches to fusion surgery

Spinal fusion surgery can be either minimally invasive (MIS) or open. The former is easily marketable to patients because smaller incisions are often perceived as superior to traditional open spine surgery. Notwithstanding, open fusion surgery may also be performed using techniques considered “minimally invasive”, because they require relatively small surgical incisions and do minimal damage to muscle or other soft tissue. After the initial incision, the surgeon moves the muscles and structures to the side to see your spine. The joint or joints between the damaged or painful discs are then removed, and screws, cages, rods, or pieces of bone graft are used to connect the vertebrae and keep them from moving. Generally, MIS decreases the muscle retraction and disruption needed to perform the same operation compared with traditional open spinal fusion surgery, although this depends on the preferences of individual surgeons. The indications for MIS are identical to those for traditional large-incision surgery, and a smaller incision does not necessarily mean less risk.

There are three main approaches to fusion surgery: (i) an anterior approach, from the front, which requires an incision in the lower abdomen; (ii) a posterior approach, from your back; and (iii) a lateral approach, from your side.

 
Difficulty identifying source of back pain
 
A major obstacle to the successful treatment of spine pain by fusion is the difficulty in accurately identifying the source of a patient’s pain. The theory is that pain can originate from spinal motion, and fusing the vertebrae together to eliminate the motion will get rid of the pain. Current techniques to precisely identify which of the many structures in the spine could be the source of a patient’s back pain are not perfect. Because it can be challenging to locate the source of pain, treatment of back pain alone by spinal fusion is somewhat controversial. Fusion under these conditions is usually viewed as a last resort and should be considered only after other nonsurgical measures have failed.
 
Spinal fusion surgery is only appropriate for a very small group of back pain sufferers

Nick Thomas, also a consultant neurosurgeon at King’s College Hospital Trust, London, and the London Neurosurgery Partnership, suggests there is a scarcity of preoperative tests to indicate whether lumbar spinal fusion surgery is appropriate, and stresses that spinal fusion is appropriate only for a small group of patients who present with back pain.
 
“The overwhelming majority of patients who present with low back pain will be treated non-operatively. In a few very select cases, spinal fusion may be appropriate. A challenge in managing low back pain is that there are precious few pre-operative investigations that give a clear indication of whether a spinal fusion may or may not work. Even with MRI evidence it can be very difficult to determine whether changes in a disc are the result of the normal process of degeneration or whether they reflect a problem that might be generating the back pain. If patients fail to respond to non-operative treatments they may well consider spinal fusion. A very small group of patients, who present with a small crack in one of the vertebral bones - a pars defect - or slippage of the vertebrae - spondylolisthesis - may respond favorably to spinal fusion. In patients where the cause of the back pain is less clear, the success rate of spinal fusion is far lower.”
 
 
Back pain industry

In a book entitled Crooked, published in 2017, investigative journalist Cathryn Jakobson Ramin suggests that the US$100bn-a-year back pain industry is “often ineffective, and sometimes harmful”. Ramin challenges the assumptions of a range of therapies for back pain, including surgery, epidurals, chiropractic methods, physiotherapy, and analgesics. She is particularly damning about lumbar spinal fusion surgery. In the US 300,000 such procedures are carried out each year at a cost of about $80,000 per surgery; Ramin suggests these have a success rate of 35%.
 
Over a period of 6 years Ramin interviewed spine surgeons, pain specialists, physiotherapists, and chiropractors. She also met with patients whose pain and desperation led them to make life-changing decisions. This prompted her to investigate evidence-based rehabilitation options and suggest how these might help back pain sufferers to avoid the range of current therapies, save time and money, and reduce their anxiety. According to Ramin people in pain are poor decision makers, and the US back pain industry exemplifies the worst aspects of American healthcare. But this is changing.
 
New Guidelines for LBP
 
In February 2017, the American College of Physicians published updated guidelines, which recommend surgery only as a last resort. They also say that doctors should avoid prescribing opioid painkillers for relief of back pain, and suggest that before patients try anti-inflammatories or muscle relaxants, they should try alternative therapies such as exercise, acupuncture, massage therapy or yoga. Doctors should reassure their patients that they will get better no matter what treatment they try. The guidelines also state that steroid injections are not helpful, and neither is paracetamol, although other over-the-counter analgesics such as aspirin or ibuprofen could provide some relief. The UK’s National Institute for Health and Care Excellence (NICE) has also updated its guidelines (NG59) for back pain management. These make it clear that a significant proportion of back pain surgery is not efficacious, and instruct doctors to recommend various aerobic and biomechanical exercises. NHS England and private health insurers are changing their reimbursement policies, and as a consequence the incidence of back surgery has fallen significantly.
 
In perspective

Syed Aftab, a Consultant Spinal Orthopaedic Surgeon at the Royal London, Barts Health NHS Trust, welcomes the new guidelines, but warns that, “We should be careful that an excellent operation performed by some surgeons on some patients does not get ‘vilified’. If surgeons stop performing an operation because of the potential of being vilified, patients who could benefit from the procedure lose out”.
 
Surgical cycle

“There seems to be a 20-year cycle for surgical procedures such as lumbar fusion. The procedure starts, and some patients benefit and do well. This encourages more surgeons to carry out the procedure. Over time, indications become blurred, and the procedure is more widely used by an increasing number of surgeons. Not all patients do well. This leads to surgeons being scrutinized, some vilified; the procedure gets a bad name, surgeons stop performing the operation, and patients who could benefit from the procedure lose out,” says Aftab, who is also a member of Complex Spine London, a team of spinal surgeons and pain specialists who focus on an evidence-based multidisciplinary approach to spinal pathology.
 
Takeaway
 
LBP is a common, disabling and costly health challenge. Although therapies are expensive, not well founded on evidence and have a relatively poor success rate, their prevalence has increased over the past two decades, and an aging population does not entirely explain this. Although the prevalence of lumbar spinal fusion surgery has decreased in recent years, the spine has become a rewarding source of income for global spine companies, and there have been allegations of conflicts of interest in this area of medicine. With the new UK and US guidelines the tide has turned, but the ethical questions, albeit historical, should still be heeded.
  • The coronavirus CoVID-19 has created the greatest health-cum-economic-cum-societal crisis in history and put unprecedented pressure on overstretched and unprepared healthcare systems
  • Before the coronavirus outbreak, primary care in England already was in crisis, fuelled by an aging population, a large and increasing demand for its services and a shrinking supply of health professionals
  • In 2019, before the outbreak, 75% of primary care doctors (GPs) across 540 clinics in England were over the age of 55 and nearing retirement and a large percentage of newly trained GPs were seeking employment abroad
  • Patients who could not get GP appointments used A&E departments as convenient drop-in clinics for minor ailments, which significantly increased healthcare costs and burden
  • For decades successive UK governments have tried in vain to transform the nation’s primary care services predicated upon face-to-face patient-doctor consultations
  • Several well-funded long-term national plans advocated increased digitization of some routine primary care services
  • But before the coronavirus outbreak only 1% of all primary care consultations were online
  • What these national plans could not achieve in decades appears to have been achieved in days by the UK’s NHS’s response to the coronavirus outbreak
  • Today, millions of patients in England are having face-to-face appointments with their GPs replaced by telephone or video consultations
  • Could CoVID-19 transform the UK’s traditional primary care model?

 

Introduction
 
The UK’s National Health Service’s (NHS) response to the coronavirus CoVID-19 outbreak might improve the nation’s crisis-ridden primary care service. This became evident in March 2020, when the UK government ordered all citizens except key workers to stay at home. At the same time, NHS England announced its ‘battle plan’ for CoVID-19, which recommended that England’s 7,000 primary care clinics shift as many consultations as possible to remote channels, as soon as possible. In a matter of days, millions of patients had face-to-face appointments with their GP replaced by telephone or video consultations. If this shift to online consultations becomes permanent, the NHS’s response to the coronavirus will have achieved in days what well-funded national healthcare plans, such as the NHS Digital First Primary Care drive, could not achieve in decades.
 
Future healthcare is digital
 
For years, the benefits of online doctor-patient consultations have been advocated by Devi Shetty, a world-renowned heart surgeon and founder and chairman of Narayana Health, one of India’s largest hospital groups. According to Shetty, “The next biggest thing in healthcare is not going to be a ‘magic’ pill, a faster scanner or a new operation but information technology (IT). IT will dramatically change the way a health professional will interact with a patient. Every step of patient care will be informed by a protocol embedded in a smartphone. This will make healthcare safer for the patient, remove a lot of traditional face-to-face healthcare activities and shift healthcare away from the clinic and into the home. Doctors and patients don't need to be together; they could be in their respective homes and effective consultations could take place online.” (see video below)
 
The next ‘big thing’ in healthcare
 
The coronavirus CoVID-19
 
In December 2019, initial reports of a new coronavirus - CoVID-19 - emerged when patients from Wuhan, the sprawling capital city of China’s Hubei province, which has a population of some 11m, presented with pneumonia of unknown origin. Within weeks the virus had spread to other countries, and on 11th March 2020 the World Health Organization characterised the outbreak as a pandemic. CoVID-19 is an illness caused by a member of the coronavirus family that has never been encountered before but is believed to come from animals. There have been other coronaviruses. For example, severe acute respiratory syndrome (Sars) and Middle Eastern respiratory syndrome (Mers) are both caused by coronaviruses that came from animals. In 2002, Sars spread virtually unchecked to 37 countries, causing global panic, infecting more than 8,000 people and killing about 800, but it soon ran itself out. Mers first emerged in 2012, and cases have occurred sporadically since. Mers appears to be less easily passed from human to human, but has greater lethality, killing 35% of the roughly 2,500 people who were infected. CoVID-19 is different to Sars and Mers in that the spectrum of disease is broader, with around 80% of cases leading to a mild infection. There may also be many people carrying the disease and displaying no symptoms, making it even harder to control. CoVID-19 affects the lungs and airways and can cause pneumonia. So, people with an inflammatory lung disease that causes obstructed airflow from the lungs, such as asthma and chronic obstructive pulmonary disease (COPD), are particularly vulnerable; as are people with weak immune systems, which make them susceptible to infections that might be more severe or harder to treat. In January 2020, China’s national health commission confirmed human-to-human transmission of CoVID-19, and there have been such transmissions in countries throughout the world.
Those who have fallen ill are reported to suffer a general feeling of being unwell, fever, dry cough, tiredness, breathing difficulties and a loss of taste and smell. In roughly 14% of cases the virus causes severe disease, including pneumonia and shortness of breath. In about 5% of patients it is critical, leading to respiratory failure, septic shock and multiple organ failure. As this is viral pneumonia, antibiotics are of no use, and the antiviral drugs we have against flu will not work. Recovery depends on the strength of a patient’s immune system; many of those who have died were already in poor health. Initially, scientists were challenged to accurately assess how dangerous CoVID-19 was because there were inadequate data. Collecting data was difficult because of a shortage of tests, and because people who had contracted the coronavirus were emitting, or “shedding”, infectious virus early in the progression of the illness, sometimes before they developed symptoms.

The 1918 Spanish influenza remains the most devastating virus in modern history. The disease swept around the globe and is estimated to have caused between 50m and 100m deaths. A cousin of the same virus was also behind the 2009 swine flu outbreak, thought to have killed as many as 0.58m people. Other major viral outbreaks include the Asian flu in 1957, which led to roughly 2m deaths, and the Hong Kong flu, which killed 1m people 11 years later.

 
In this Commentary
 
This Commentary is produced by HealthPad, an online health solutions company (see below). We begin by briefly describing the underlying reasons for the UK’s primary care crisis, which include: (i) the changing and aging population and the consequent increased demand for healthcare; (ii) the shrinking supply of health professionals; and (iii) failing national initiatives to improve the provision of primary care. We then draw attention to some well-funded national plans, whose intentions have been to harness the power of information and digital strategies to reform and improve primary care services in England, and cite research which suggests that these plans have failed. The Commentary briefly describes a number of innovative online healthcare solution companies (HealthPad is one). The majority of these are private initiatives, which have taken advantage of the UK’s high smartphone penetration rates and advanced wireless networks to enter the UK’s healthcare market with an intention to transform the sector. Notwithstanding, to date the overall impact of these companies has been marginal, due in part to general resistance to private enterprises playing a significant role in England’s public NHS, which offers free healthcare to all citizens at the point of care. However, they represent a nascent UK online healthcare solutions market, which is well positioned to benefit from the nation’s response to the coronavirus outbreak, which has forced more primary care services to be delivered online. To increase their footprint these companies, which are largely driven by technology, will need to become more strategic and consolidate. And this will take time. We conclude the Commentary by looking to China and WeDoctor to understand the potential contribution that online services can make to the delivery of healthcare in England. WeDoctor is a Chinese mobile app launched in 2010 to help patients book doctor appointments. Over the past decade it has added more functions to help unclog China’s fragmented and bureaucratic healthcare system and has become a US$5.5bn healthcare company, which connects some 210m registered users with 360,000 doctors.
 
UK’s primary care crisis
 
There are three drivers of the UK’s primary care crisis: (i) the changing and aging population, which increases the demand for healthcare; (ii) the shrinking supply of healthcare professionals, to a point where GP workloads are becoming unsafe; and (iii) failing national initiatives to improve the provision of primary care. Let us briefly describe these.
 
Changing and aging population
 
The UK’s population is changing and aging, which is fuelled by improvements in life expectancy and a decrease in fertility. According to the UK’s Office of National Statistics, in 2016, there were 12m UK residents aged 65 years and over, representing 18% of the total population. 25 years before, in 1991, there were 9m, accounting for 16% of the population. By 2040, it is projected that there will be an additional 8m people aged 65 years and over in the UK: a population roughly the size of present-day London, which will account for 25% of the total population.
 
A report by Deloitte, a consultancy, suggests that as people age their propensity for illness increases, and more than a quarter of the UK’s population of some 66m have long-term chronic illnesses. This places a significant extra burden on the nation’s overstretched primary care services, accounting for about half of all GP appointments. Deloitte’s analysis is supported by the British Medical Association’s 2019 GP Patient Survey, which found that GP clinics are now caring for 0.72m more patients than they were in 2018. Findings of a 2016 report by the UK’s Royal College of General Practitioners (RCGP) suggest that GPs see 1.3m patients a day and do more than 370m consultations annually: 60m more than in 2010. A research study on GP productivity carried out by the King’s Fund, also published in 2016, suggested that between 2010 and 2015 the total number of telephone consultations increased by 15%, but still only accounted for 1% of all patient-doctor consultations.
 
Shrinking supply of GPs
 
As the UK’s population has grown and aged and the consequent demand for healthcare has increased, so there has been a sustained fall in the number of GPs. This dynamic is described in a Nuffield Trust report published in May 2019, which confirms the findings of a joint report from the Institute of Fiscal Studies and the Health Foundation for the NHS Confederation, which concluded that, “The fall in GPs per person reflects insufficient numbers previously being trained and going on to join NHS England, failure to recruit enough from abroad and more GPs leaving for early retirement”. As to the future, a 2019 report by three leading think tanks - the Nuffield Trust, the Health Foundation and the King's Fund - predicts that GP shortages in England will almost triple to 7,000 by 2024. According to NHS Statistics, Facts and Figures, currently there are just over 42,000 GPs working in England, down by nearly 1,500 since 2016.
 
Failure to stop or slow these trends means that, today, primary care services in England struggle with staff shortages and a rising demand for care. A 2019 Pulse Magazine survey found that GPs in England are seeing more patients than is safe. A probe undertaken by The Times in 2019 suggested that the national shortage of GPs has left some surgeries with one permanent doctor caring for as many as 11,000 patients, and that one in 10 GPs sees up to 60 patients a day, double the number considered safe.
 
GPs across the UK work an average 11-hour day. In that time, they typically see patients for 8 hours and spend the other 3 on administrative tasks such as checking test results and reading letters sent by hospitals. A 2019 British Medical Association survey found that more than 80% of GPs said the pressure to attend to multiple tasks at once meant they were unable to guarantee safe care, while 91% said excessive workload was the main reason the NHS was struggling to recruit enough staff. The situation has resulted in patients having to wait longer - up to three weeks - for a GP consultation. It seems reasonable to suggest that GPs with too many patients and using traditional face-to-face delivery methods will fail in their duty of care, which obliges them to inform patients about their health and reach shared clinical decisions about treatments. This requires that patients understand their condition/s and are well informed. In many cases, a 10-minute face-to-face GP consultation might not be the best way to achieve this.
 
Failing national initiatives to improve primary care
 
Successive UK governments have struggled to ease the primary care crisis with well-funded national plans. In 2019, the British Medical Journal published findings of a survey of UK GPs’ views and experience of national healthcare initiatives introduced in England to address the workforce crisis in general practice. The survey was conducted in the same region as a similar survey undertaken in 2014, which allows for a comparative analysis of how GPs’ views have changed over time. Findings confirm that primary care in England remains in crisis and suggest that numerous national initiatives to improve general practice are perceived by GPs as “reactive in approach”. To reduce the primary care crisis, respondents suggested “more GPs and better education of the public”.
 
The UK’s NHS
 
Healthcare in the UK is mainly provided by the National Health Service (NHS), a vast public institution funded largely from general taxation to the tune of some £134bn (US$161bn) a year. Created in 1948, the NHS provides free health services at the point of care for everyone living in the UK and has become the largest single-payer health system in the world and the biggest employer in the UK, with 1.2m full-time equivalent (FTE) workers: the fifth-largest workforce in the world. NHS England is a vast bureaucratic and fragmented organisation, which has proven difficult to change. Private provision of NHS services has always been controversial, even though some services, such as dentistry, optical care and pharmacy, have been provided by the private sector to the NHS for decades, and most GP practices are private partnerships. It is challenging to determine how much the NHS spends each year on the private sector because central bodies do not hold detailed information on individual contracts with service providers, especially where these contracts may cover relatively small amounts of activity and spending. Notwithstanding, estimates suggest the share of the NHS’s total revenue budget that is spent on private providers is about 7.3%.

National plans to improve the NHS
 
The planning and authorising of NHS services is the responsibility of regional Clinical Commissioning Groups (CCGs). Although CCGs are constantly changing because of mergers, as of 2019 there were 191 CCGs in England supporting about 7,000 primary care clinics, some 42,000 GPs, about 15,800 FTE nurses who work in GP clinics, and 1,257 hospitals, which include NHS Trust-managed hospitals and private hospitals that provide services to the NHS. In total, the NHS employs around 150,000 doctors and over 320,000 nurses and midwives.
 
Successive UK governments have been aware of the impact of technological advances, changing healthcare needs and societal developments on healthcare and have introduced a succession of well-funded national plans to change and improve the NHS. For example, in June 2018, the UK’s Prime Minister announced a new five-year funding settlement for the NHS that amounted to an extra £20.5bn (US$25.2bn) between 2019 and 2024, which represents a 3.4% real average annual increase.
 
NHS long term plan to transform primary care
 
To unlock the funding, national bodies were asked to develop a long-term plan to help the NHS cut costs and improve services. The suggested plan articulated the need to integrate care in order to meet the needs of a changing population and was in line with the Forward View, a planning document published in 2014, and the General Practice Forward View, which was first published in 2016 and updated in subsequent years. The long-term plan committed the government to an extra £2.4bn (US$3bn) a year to speed up the transformation of primary care and suggested GP clinics join together to form networks, typically covering 30,000 to 50,000 patients, and provide them with multidisciplinary integrated care. The plan also suggested ‘significant changes’ in the existing performance management and payment of NHS GPs [the Quality and Outcomes Framework (QOF)] in order to encourage more personalised care.
 
NHS long term plans and private online healthcare solution companies have delivered little change
 
Three of five principal objectives of the latest NHS long term plan are to: (i) “give people more control over their own health and the care they receive”;  (ii) “increase the contribution to tackling some of the most significant causes of ill health, including new action to help people stop smoking, overcome drinking problems and avoid Type 2 diabetes”, and (iii) “provide more convenient access to services and health information for patients”.

The plan emphasises the importance of developing digital services, and recommends that within five years all patients should be able to access GP consultations via telephone or online. This goal is supported by NHS Digital, the national information and technology partner to the UK’s health and social care system, whose mission is to harness the power of information and technology to improve healthcare. Over the past decade an increasing number of innovative private online healthcare solution companies have entered the market (see below). Notwithstanding, these and the NHS’s well-funded national plans have failed to dent the primary care crisis: the vast and escalating demand for healthcare has not slowed, and the shrinking supply of healthcare professionals has not been reversed. So, for the past two decades at least, the NHS has tended to operate on the cusp of a crisis.
 
The death of distance
 
According to Deloitte, the UK has more than 90% smartphone penetration. The main driver of high smartphone adoption rates is the take-up among older age groups: by 2023 smartphone ownership among 55-to-75-year-olds will reach 85% in the UK, and the difference in smartphone penetration by age will disappear. Further, the UK’s smartphone market has seen a greater variety of models and the introduction of faster and more reliable wireless networks. This has benefited the private online healthcare solution companies, which have entered the UK market to provide varying degrees of qualified online healthcare information, consultations, networking opportunities, triage and Q&A. According to Shetty, “A doctor only needs to touch a patient if s/he is going to operate on that patient. If a doctor doesn’t need to operate, a doctor-patient consultation can take place remotely. For a patient-doctor communication distance doesn’t matter.” (see video below)
 

 A doctor only needs to touch a patient if s/he is going to operate on that patient
 
Innovative online healthcare solution enterprises
 
The new online healthcare solution enterprises are a combination of private, public and charitable initiatives, which are well positioned to contribute to the transformation of the UK’s traditional primary care model. They include: Babylon Health, which provides remote consultations with doctors and healthcare professionals via text and video; BioBeats, a workplace wellbeing platform designed to empower and improve mental health; Docly, a digital messaging healthcare service, which is a spin-off of Min Doktor; Doctorlink, which partners with payers, healthcare professionals and pharmacists to provide a 24-7 platform for NHS patients to assess symptoms; DrDoctor, a patient engagement platform, which enables patients to book, change and cancel their appointments; EggPlant, a software testing and monitoring company, which helps to streamline patient activities; Dr Fox, an online primary care clinic and pharmacy service; Gogodoc, an online GP video consultation service with possible follow-up home visits; Healthcare Communications UK, which provides appointment management software and patient experience surveys; HealthPad, an online platform that manages and distributes healthcare video information between health providers and patients in order to improve outcomes and cut costs, and has accrued a proprietary content library of over 6,000 short videos contributed by leading clinicians that address people’s FAQs across some 30 therapeutic pathways (HealthPad is the publisher of this Commentary); 
HealthTalksOnline, an events and community portal for health; HealthUnlocked, a social networking service that offers peer support to help people manage their health; Healum, which provides healthcare professionals with software that enables them to support and motivate their patients to better manage their conditions; LIVI, which provides GP video consultations; Medshr, a platform for medical professionals to discover, discuss and share clinical cases and medical images; Microtest Health, a health informatics company that provides practice management systems for NHS GP surgeries; MSKnote Limited, which creates clinical applications for healthcare professionals and patients with a focus on musculoskeletal conditions; MyWay Digital Health, which provides advice and solutions to help patients better manage diabetes; NHS.uk/conditions, which provides online text-based information and advice about medical conditions; NHS 111, a free-to-call medical helpline; the Now Healthcare Group, a GP video consultation platform and tele-pharmacy; Patient Access, which started by enabling patients to book GP appointments online and order repeat prescriptions and has evolved to allow patients to connect with their GPs remotely and access their medical records online; Patientinfo, which provides patients and health professionals with online health information; 
Patient Access and Patientinfo are subsidiaries of EMIS Health, a leading supplier of software used by NHS England. The list continues with: Patients Know Best, a social enterprise, which provides patients with access to their medical records and information about treatments; PatientsLikeMe, an online service that helps patients find people with similar health conditions in order to take actions that are expected to improve outcomes; Push Doctor, an online video consultation service; SaySo Medical, a digital communications agency, which connects people in order to improve their health; SystmOne, a centrally hosted computer system that provides primary care professionals with electronic patient health records in real time at the point of care; uMotif, a platform that captures electronic patient-reported outcomes data across a range of conditions and works with pharmaceutical companies to measure patients’ health, outcomes and experience; Unmind, a workplace mental health platform designed to empower organisations and employees to improve their mental wellbeing; Visiba Care, a digital solutions company, which provides communication and administration software for healthcare practices; VisionHealth, which provides NHS primary care professionals with software solutions; VisualDX, which provides clinical decision support systems to enhance diagnoses and therapeutic decisions in order to improve patient safety; WebMD, an online publisher of healthcare news and information; and Zava, an online GP and pharmacy service.
 
Technologically heavy and strategically light
 
Despite a significant number of online healthcare solution enterprises entering the market and the fact that some provide services to millions of people in the UK, this market segment is in its infancy and fragmented. All the initiatives mentioned above have been advantaged by the NHS’s response to the coronavirus outbreak. Notwithstanding, to permanently increase their footprint and significantly influence primary care in England, barriers to private enterprises and to online services will need to be reduced; and private companies in this segment will need to act more strategically and consolidate.
 
Most of these online healthcare service providers are technologically heavy and strategically light. For private companies in this market to grow and increase their influence on the NHS they will need to increase their focus on profitability and scale, which will require them to become more strategic and develop merger-integration skills. To become a dominant player, a company will have to successfully consolidate. Speed and merger competence are paramount. Companies that capture critical ground early and move up the consolidation curve the fastest will be successful. Enterprises that are slow to consolidate will become acquisition targets and disappear. Companies that stay out of the consolidation contest altogether will not survive.

A Chinese example
 
History has shown that many short-term emergency measures have a tendency to become permanent fixtures. Thus, the UK’s response to the coronavirus CoVID-19 outbreak might permanently reduce the barriers to moving routine primary care tasks to innovative private online enterprises.
 
In an attempt to fully appreciate the potential of increasing online primary healthcare services in England, consider WeDoctor, a Chinese mobile app launched in 2010 by artificial intelligence expert Jerry Liao. Originally called Guahao (Mandarin for “booking”), WeDoctor started as a simple booking platform that made it easier for patients to make appointments with doctors. From these humble beginnings WeDoctor grew by adding extra functions such as reminders for regular medical checks, screening, prescriptions and online diagnoses and consultations. This helped to unclog China’s fragmented and bureaucratic healthcare system and made quality healthcare more accessible to the average person.
 
WeDoctor secured backing from Tencent Holdings, a Chinese multinational conglomerate, Sequoia Capital, the Goldman Sachs Group and the insurer AIA Group. In 2018, the company raised US$0.5bn in a private financing round at a valuation of US$5.5bn. Today, WeDoctor has more than 210m registered users mainly in China for its online appointment booking, prescription and diagnosis services and is linked to about 3,200 hospitals and 360,000 doctors. In March 2020, at the height of the CoVID-19 pandemic, it was reported that, in the latter half of 2020, WeDoctor intends to raise HK$1bn in an IPO on the Hong Kong Stock Exchange at a valuation of HK$10bn.
 
Although NHS England is much smaller than China’s healthcare system, it is similarly fragmented and bureaucratic. The UK online solutions enterprises described in this Commentary have significant potential simply by helping to reduce GPs’ large and increasing administrative burden while increasing the connectivity between patients and GPs. This will help GPs to concentrate on what they have been trained to do and improve healthcare for the people most in need.
 
Takeaways
 
Over the past two decades, legacy primary care systems and attitudes in the UK have slowed the uptake of online healthcare solutions. Notwithstanding, the NHS’s response to the coronavirus CoVID-19 outbreak might prove to have helped transform the UK’s traditional face-to-face primary care model by making GPs deliver some of their services online. In a recent interview with the New York Times, Dr Bruce Aylward, Assistant Director-General of the World Health Organization, stressed how the Chinese had responded to the coronavirus outbreak by significantly increasing the amount of medical care the nation provides online. In light of the discussion in this Commentary, it is worth remembering that in Mandarin the word “crisis” is denoted by two characters, 危机: one means ‘danger’ and the other ‘opportunity’.
 
 
#coronavirus #coVID-19 #NHSEngland #NHS #pandemic #primarycarecrisis #ChinaWeDoctor #WeDoctor #DigitalHealthcare 

  • Drug discovery is being commoditised; human truth is the new scarce resource
  • Phase-0’s leverage isn’t de-risking - it’s surfacing (and fixing) human delivery/exposure constraints early enough to change efficacy
  • The bottleneck in pharma is clinical learning speed, not idea generation - Phase-0 is the highest-ROI “human check” to collapse uncertainty fast
  • The investable opportunity is a platform: standardised, decentralised execution + instrumented analytics + a compounding PK/PD dataset flywheel
  • None of it matters without decision discipline: pre-committed thresholds and action paths that make “stop/prioritise/progress” non-negotiable

The Human Bottleneck

In October 2025, HealthPad published a Commentary titled, Phase-0 Goes Mainstream. The reaction was immediate - and strategically revealing. The debate was not whether Phase-0 matters. It was about two sharper questions.

First: what does a Phase-0 “microdose” strategy look like when it does more than de-risk - when it materially improves downstream outcomes by collapsing uncertainty in molecule selection early enough to change which candidate is taken forward?

Second: what must be true for Phase-0 to become a real investment category - not a niche service line, but a compounding, defensible capability?

These questions land because the ground has shifted. Targets and hypotheses are no longer scarce. We are industrialising discovery - and commoditising parts of it. The scarce resource is human truth: early, high-signal evidence that a candidate reaches the right tissue, achieves sufficient exposure, engages the target, and produces the intended biology at a dose people can tolerate.

In plain terms, the question is no longer “does it bind?” It is: “does it work in a body that matters - and why?”

That tension defines modern drug development. Timelines remain stubbornly long, and costs are dominated by failure - not because teams lack intelligence or effort, but because preclinical plausibility does not reliably translate into clinical benefit. We can be right in vitro, compelling in animals, and still wrong where it counts. As Teslo and Scannell and others have argued, the true bottleneck is not idea generation; it is clinical development - the only stage that produces evidence regulators, investors, and patients accept.

This is where Phase-0 changes status.

Properly conceived, Phase-0 is not “a smaller Phase I.” It is an early, information-dense human experiment - often using microdoses or tightly limited exposure in a small number of participants - designed to answer a narrow but decisive set of questions:
  • Does the drug reach the right place in the human body?
  • At what concentrations, and with what distribution?
  • Is there early evidence of target engagement or pharmacology?
  • Are the exposures required for biological activity feasible in practice?
The goal is not to treat disease at that moment. The goal is to compress learning about delivery, distribution, exposure, and early biology into the earliest possible window - when decisions can still change outcomes.

Done well, Phase-0 does not just reduce uncertainty. It can change the trajectory of efficacy by revealing the constraint early - and making that constraint actionable. Often the hidden failure mode is not the target or the molecule in theory; it is what happens after dosing: insufficient exposure, wrong tissue distribution, unexpected metabolism, or a delivery problem that no animal model reliably predicts. Phase-0 is the fastest way to surface those truths - and to iterate while the programme still has room to move.

That is where the investment thesis becomes coherent.

Phase-0 becomes investable when it is more than bespoke studies sold one-by-one. It becomes investable when it behaves like a repeatable learning system: standardised protocols, fast cycle times, robust instrumentation and analytics, and a growing proprietary dataset that improves decisions over time.

In that world, Phase-0 is not just a risk filter. It is a value-creation engine - converting early human studies into decision-grade evidence with compounding returns: better capital allocation, fewer late failures, and - most importantly - a higher probability that programmes are engineered to work in humans, not just in models.

 
In This Commentary

This Commentary has one purpose: to make the Phase-0 opportunity legible by answering a simple question raised by HealthPad’s earlier piece: What does a Phase-0 strategy look like when it is not just a de-risking step, but a commercially decisive way to collapse uncertainty in molecule selection and improve the odds of downstream clinical success?

It sets out what a credible Phase-0 “play” must include: the core capabilities, operating model, unit economics, and data flywheel required to build a repeatable human-signal engine - one that generates early, decision-grade evidence on exposure, delivery, and biological engagement, and converts it rapidly into clear action. Executed well, Phase-0 shortens iteration cycles, safeguards scarce clinical capacity, and compounds learning across a portfolio - turning “human truth” into an institutional capability rather than a downstream bottleneck, and into an investable advantage.

To make this concrete, the argument is built around a strategic roadmap:
1. Make Phase-0 clinically consequential (not performative): design it to answer the questions that determine whether efficacy is plausible in humans.
2. Make it operationally routine: remove fixed overhead so “small, fast, high signal” is achievable repeatedly, not occasionally.
3. Make it clinically productive: use early human data to identify and fix delivery/exposure constraints while the programme can still change form.
4. Make it commercially scalable: standardise workflows, build repeat customers, and convert each study into a compounding dataset and defensible operating advantage.
5. Make decisions non-negotiable: pre-commit to action paths so Phase-0 outcomes reliably shape portfolio behaviour.

The Paradox: Scientific Acceleration, Clinical Deceleration

Discovery is accelerating at a rate few R&D leaders imagined a decade ago. We can read biology more cheaply, generate candidates faster, and iterate designs with something close to an engineering cadence. Yet the moment a programme crosses into humans, progress slows to a crawl.

Clinical throughput - the rate at which we convert hypotheses into reliable human evidence - remains slow, administratively heavy, capacity-constrained, and brutally expensive.

That mismatch is not a footnote. It is the operating constraint of modern drug development, and a primary reason R&D productivity remains uneven, often captured by Eroom’s Law. Portfolio-level failure follows a predictable pattern: organisations get better at producing “promising” assets while the clinic remains rate-limiting - and uncertainty accrues interest until it becomes catastrophic in Phase II and Phase III.

For healthcare systems, the consequences are tangible: trials that arrive late, oversized, and under-instrumented for learning; operational burden that competes with care delivery; and finite clinical capacity consumed by programmes that should have stopped earlier.

For investors, the consequence is structural capital inefficiency: long cycles, binary readouts, and value inflection points pushed years downstream. The cost is not only failure. It is time spent being wrong, and the compounding opportunity cost of being wrong at scale.

Two realities dominate drug R&D economics:
  • Attrition is structural: most programmes fail in humans, regardless of how compelling preclinical results look.
  • Returns are heavy-tailed: a small number of winners drive most patient benefit and commercial value.
In a heavy-tailed world, you do not win by perfecting narratives. You win by taking more credible shots - and by building a system that produces earlier, cleaner signals about what deserves the next tranche of capital, time, and patient exposure.
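To make the heavy-tailed point concrete, here is a toy simulation - illustrative only; the Pareto distribution and all parameters are assumptions, not data from any real portfolio:

```python
import random

random.seed(0)

# 100 hypothetical successful programmes; values drawn from a heavy-tailed
# (Pareto) distribution, so a handful of winners dominate the total.
values = [random.paretovariate(1.2) for _ in range(100)]
values.sort(reverse=True)

# Share of total portfolio value captured by the top 10 programmes
share_top10 = sum(values[:10]) / sum(values)
print(f"Top 10 winners account for {share_top10:.0%} of total value")
```

Under these assumptions, the top decile of winners typically accounts for well over half of total value - which is why more credible shots, triaged earlier, beat fewer perfected narratives.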

And there is only one source of those signals: structured learning in humans.


The Seduction of the Map

Modern biopharma has a recurring risk: confusing the map for the world. A persuasive mechanism, a clean pathway diagram, or a compelling computational model can start to feel like proof - especially when those stories help raise capital and align teams.

But biology does not negotiate with narratives. Many valuable medicines were not born from mechanistic certainty; they were discovered, improved, and positioned through iterative contact with human data. Clinical research is not the “final exam” at the end of a linear pipeline. It is an evolutionary engine: candidates meet real-world human variation, and only those that produce meaningful effects at tolerable doses survive.

GLP-1 medicines (a class of drugs that help regulate appetite and blood sugar) illustrate this pattern. Early human studies produced clear, decision-worthy signals. What followed was not certainty, but optimisation: dose finding, delivery improvements, and side-effect mitigation so more people could stay on treatment. The scientific explanation expanded and sharpened as human exposure accumulated.

The lesson is both warning and strategy: do not confuse plausibility with proof. Build systems that pull human feedback earlier and more routinely.

 
Phase-0: The Highest-Leverage Human Check

When leaders hear “run more trials,” it often triggers the wrong reflex: cost panic, risk control, and a retreat into bigger preclinical packages - as if more assays can substitute for human evidence.

But the strategic case is not for larger, slower late-stage programmes. It is for earlier learning: small, fast, high-signal experiments in humans that collapse the uncertainties that drive failure before you place a nine-figure bet.

That is the leverage of Phase-0 when executed with discipline. It is the highest-ROI human check you can run because it tells you whether the programme is playing the right game.

At its best, Phase-0 is a focused decision instrument:
  • Microdosing where appropriate (to study distribution/exposure with little pharmacological risk),
  • measurement of human exposure through pharmacokinetics (PK),
  • and where feasible, evidence of target engagement or pharmacodynamic (PD) effect.
The goal is not to prove efficacy. It is to answer a handful of narrow, high-leverage questions that determine whether benefit is plausible:
  • Is human exposure aligned with expectations, or is translation already breaking?
  • Are required exposures feasible and tolerable, or does the margin vanish the moment you dose a person?
  • Can the drug reach relevant tissue and engage the intended biology in humans at practical doses?
These are not academic curiosities. They are the fault lines along which programmes fail expensively later.
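As a hypothetical illustration of the exposure question, the sketch below computes observed exposure (AUC) from microdose concentration samples using the linear trapezoidal rule and compares it with a predicted value - all numbers here are invented:

```python
# Invented microdose PK data: sample times (hours) and plasma
# concentrations (ng/mL) for a single participant.
times = [0.25, 0.5, 1, 2, 4, 8, 12, 24]
conc  = [12.0, 18.5, 15.2, 9.8, 5.1, 2.2, 1.0, 0.2]

def auc_trapezoid(t, c):
    """Area under the concentration-time curve (linear trapezoidal rule)."""
    return sum((t[i + 1] - t[i]) * (c[i] + c[i + 1]) / 2
               for i in range(len(t) - 1))

auc = auc_trapezoid(times, conc)
predicted_auc = 80.0  # assumed allometrically scaled prediction
ratio = auc / predicted_auc
print(f"Observed AUC(0-24h) = {auc:.1f} ng*h/mL; observed/predicted = {ratio:.2f}")
```

An observed/predicted ratio well below 1 is exactly the kind of early signal that translation is already breaking - surfaced in weeks rather than years.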

It is just as important to state what Phase-0 is not. It does not establish clinical efficacy. It does not, by itself, validate a target. It does not magically “de-risk Phase II biology.” What it does - strategically - is reduce the chance you spend years and tens of millions learning something you could have learned in weeks.

In a world where most drug candidates fail, the most valuable early trial is often the one that tells you to stop - quickly, clearly, and for the right reasons. That is not pessimism. It is portfolio hygiene.

So why is Phase-0 not routine? Because traditional clinical operations impose large, fixed overheads even on small studies. Site bottlenecks, start-up bureaucracy, contracting and monitoring, complex sampling logistics, and slow data reconciliation can turn a modest human check into a months-long project - costly and brittle - which defeats the point.

This is where decentralisation matters - not as a scientific shortcut, but as an operational unlock: remove friction, preserve rigour, and make early human learning fast enough and repeatable enough to become standard capability, not occasional luxury.

 
What Decentralised Phase-0 Buys

Separate two kinds of value that are often blurred:
  1. Operational value: speed, access, repeatability, lower fixed overhead
  2. Scientific value: decision-grade evidence - which must be earned by design
Decentralisation buys the operational side: remote pre-screening, eConsent, participant-centric scheduling, local or home-based procedures where appropriate, mobile visits where needed, and reserving specialist sites for what truly requires them.
But speed is not truth. A study can run quickly and still produce weak data if endpoints are ill-chosen, assays are not validated, chain-of-custody is sloppy, or sampling is mis-specified. The platform thesis is not that logistics magically create insight. It is that repeatable infrastructure removes friction so teams can run good studies more consistently - and can afford to be disciplined about what each study is meant to resolve.
For readers new to decentralised trials, the intuition is straightforward: Phase-0 studies are small by design. They do not need the same site footprint as large efficacy trials. Yet traditional trial infrastructure imposes “fixed costs” that dominate small studies. Decentralisation converts those fixed burdens into scalable workflows:
  • participants are screened and consented remotely,
  • sampling is scheduled around participants rather than site calendars,
  • routine procedures move closer to the participant,
  • data capture and reconciliation are digitised end-to-end,
  • site time is reserved for what must be done at specialised centres.
This is not about lowering standards. It is about making high standards routine.
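The fixed-overhead argument can be sketched with toy unit economics - all figures below are invented for illustration:

```python
def cost_per_study(fixed_overhead, variable_cost, studies_sharing_overhead=1):
    """Per-study cost when start-up overhead is amortised across a platform."""
    return fixed_overhead / studies_sharing_overhead + variable_cost

# Bespoke: every small study pays the full start-up tax itself.
bespoke = cost_per_study(fixed_overhead=2_000_000, variable_cost=500_000)

# Platform: the same overhead is shared across many standardised studies.
platform = cost_per_study(fixed_overhead=2_000_000, variable_cost=500_000,
                          studies_sharing_overhead=20)

print(f"bespoke: ${bespoke:,.0f} per study; platform: ${platform:,.0f} per study")
```

The point is structural, not the specific numbers: when studies are small by design, fixed overhead dominates, so converting it into a shared, amortised platform cost is what makes "small, fast, high signal" repeatable.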
 
The Clinical Opportunity: Phase-0 as an Efficacy Engine, Not Just a Filter

The most important misunderstanding about Phase-0 is that it is “just de-risking.” That framing is too narrow.

Many programmes fail not because the target is wrong, but because the medicine cannot reliably achieve the right exposure in the right tissue at a tolerable dose and feasible delivery route. Preclinical models often miss practical human constraints: absorption variability, tissue penetration, metabolism, formulation limits, drug-drug interactions, transporter effects, unexpected clearance.

In short: the molecule may be conceptually elegant, but human delivery physics breaks the story.

Phase-0 enables a different posture: learn the constraint early, then engineer around it while you still can.

Clinical value emerges when Phase-0 is used to do three things:
  1. Reveal the bottleneck. Is the limiting factor exposure, distribution, metabolism, or engagement? Even small studies can indicate whether human PK aligns with expectations and whether variability is manageable.
  2. Convert bottlenecks into design choices. Once visible, constraints become actionable: formulation changes, prodrugs, delivery route redesign, depot strategies, combinations, dose scheduling, or patient stratification. The goal is not to confirm the original plan. It is to make a better one.
  3. Protect the path to efficacy. Early human evidence improves the odds that Phase I/II programmes are properly dosed, properly instrumented, and not set up to fail.
In this sense, Phase-0 can be clinically creative. It can prevent the common tragedy where a medicine that could have worked is abandoned because early clinical execution was built on the wrong assumptions about human delivery.
 
What Makes Phase-0 an Investable Opportunity

If Phase-0 remains a one-off service - bespoke studies executed on demand - it remains a narrow market. The investable opportunity is the platform: repeatable unit economics with compounding advantage.

A decentralised Phase-0 platform creates commercial value in three ways.

1. It removes the “start-up tax.” Early studies are still treated as custom projects: assemble teams, pick sites, renegotiate contracts, bolt vendors together, unwind it all at the end. Every programme pays the same overhead before a single participant is dosed. Platforms standardise what should be standard: contracts, quality systems, audit-ready workflows, lab logistics, chain-of-custody, data integrity, and reporting. The molecule is bespoke. The operating system is not.

2. It turns execution into a reusable asset. Each study improves the system: SOPs, cycle time, monitoring, data pipelines, and decision playbooks. Over time, execution becomes not only faster, but more reliable. Reliability is commercial: sponsors return to the system that delivers decision-grade evidence without drama.

3. It builds a proprietary “human truth” dataset. The defensible moat is not “we can run a study.” It is “we can interpret and act on early human evidence better than others because we have seen more of it - cleanly, comparably, and at known quality.” A growing dataset of early human PK/PD patterns, operational benchmarks, assay performance, and design outcomes becomes a durable decision advantage.

This is the compounding loop investors should care about:
More studies → more proprietary, comparable human data → better design and triage → better sponsor outcomes → more repeat business → more studies.

 
Why AI Won’t Replace Human Trials - and Why That’s the Strategy

AI will improve drug development. It will not remove the need to test in humans. Therapeutic benefit is not a pure prediction problem. The path from “binds a target” to “helps a person” is shaped by adaptive biology, evolving disease, and human variability that cannot be fully modelled in advance.

This is not bad news for AI. It is strategic clarity. AI’s defensible role is not as an oracle, but as a force multiplier that makes human learning faster, cleaner, and cheaper.

In a Phase-0 platform, AI’s highest value is instrumental:
  • strengthening design by selecting informative timepoints and sampling schedules within practical constraints,
  • reducing overhead by automating reconciliation, monitoring, and reporting work that consumes coordinators and monitors,
  • protecting data integrity by flagging anomalies early - missing samples, timing errors, protocol drift - before datasets become unusable,
  • supporting decisions by surfacing patterns without false certainty: what the evidence suggests, what it does not, and what closes the loop next.
Used this way, AI increases reliability, reduces avoidable noise, and compresses cycle time - concentrating spend on programmes with credible human signal.
The prize is not AI that claims authority over biology. The prize is an AI-enabled decentralised Phase-0 capability that repeatedly converts uncertainty into decision-grade evidence earlier in the portfolio, at lower cost, with less burden on sites and participants - so patient benefit and capital efficiency improve together.
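The data-integrity point can be illustrated with a minimal sketch - the protocol windows, tolerance, and output format below are assumptions, not a real monitoring system's schema:

```python
PROTOCOL_TIMES_H = [0.5, 1, 2, 4, 8]  # nominal sample times (hours), assumed
TOLERANCE_H = 0.25                    # allowed deviation per protocol, assumed

def flag_deviations(actual_times):
    """Return (nominal_time, issue) pairs for samples that are missing or
    taken outside the allowed window - the kind of anomaly worth catching
    before the dataset becomes unusable."""
    flags = []
    for nominal, actual in zip(PROTOCOL_TIMES_H, actual_times):
        if actual is None:
            flags.append((nominal, "missing sample"))
        elif abs(actual - nominal) > TOLERANCE_H:
            flags.append((nominal, f"taken at {actual} h"))
    return flags

print(flag_deviations([0.5, 1.1, None, 4.6, 8.1]))
```

Flagging the missing 2 h sample and the late 4 h sample at the moment they occur, rather than at database lock, is the difference between a correctable deviation and a compromised readout.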
The Hidden Constraint: Decision Culture

Phase-0 only creates value if organisations are prepared to act on what it shows. Many companies do not fail because they lack data. They fail because decisions become sticky: sunk cost, narrative commitment, internal momentum, and the default choice of “not yet.”

In that environment, Phase-0 can degrade into a checkbox: a quick study followed by slow rationalisation. The fix is governance by design:
  • define the decision question up front: what uncertainty is this Phase-0 check meant to retire?
  • where feasible, pre-commit to thresholds and action paths: what would “stop”, “prioritise”, or “progress” look like?
  • align incentives so disciplined stopping is treated as progress, not failure
  • instrument the study to produce a decision, not a report
A platform can widen the aperture of human learning. Only decision discipline makes that learning consequential.
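A minimal sketch of what pre-committed thresholds and action paths could look like in practice - the metric names and cut-offs are invented for illustration, not a recommended rule set:

```python
# Thresholds agreed and recorded BEFORE the Phase-0 readout, so the
# stop/prioritise/progress call is mechanical rather than negotiable.
THRESHOLDS = {
    "min_exposure_ratio": 0.5,     # observed/predicted AUC (assumed cut-off)
    "min_target_engagement": 0.3,  # engagement at a feasible dose (assumed)
}

def phase0_decision(exposure_ratio, target_engagement):
    """Map Phase-0 evidence to a pre-committed action path."""
    if exposure_ratio < THRESHOLDS["min_exposure_ratio"]:
        return "stop"         # translation already breaking in humans
    if target_engagement >= THRESHOLDS["min_target_engagement"]:
        return "progress"     # human signal supports the next tranche
    return "prioritise"       # exposure is fine; fix delivery/engagement first

print(phase0_decision(0.85, 0.42))  # → progress
print(phase0_decision(0.30, 0.42))  # → stop
```

The value is not in the code; it is in the pre-commitment. When the rule exists before the data arrive, "not yet" stops being the default decision.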
 
Ethics and Regulation: Don’t Fight It - Instrument It

Any argument for more human trials must earn ethical legitimacy. “More” cannot mean more burden, more opacity, or lower standards. The goal is better experiments undertaken earlier - with clearer purpose, stronger protections, and more participant agency.

Done properly, decentralisation can strengthen ethics: less travel burden, broader access, participant-centric scheduling, real-time safety monitoring, and auditable consent. But trust must be designed in: privacy, secure bio-sample handling, chain-of-custody, endpoint integrity, and clear governance for secondary data use.

The strategic move is not to evade regulation. Medicines win on credible evidence. The play is to outperform within regulation by making strong evidence cheaper and earlier - instrumenting compliance so quality happens by default.

 
Takeaways: A Roadmap to Clinical and Commercial Success

Drug development is no longer constrained by imagination. It is constrained by human learning - how quickly and cleanly we can convert plausible mechanisms into decision-grade evidence in people. We made discovery cheap and scalable, then acted surprised when the clinic became the choke point. The predictable result is bloated portfolios, uncertainty carried too far downstream, and patient capacity, clinical bandwidth, and capital spent answering questions that should have been resolved earlier.

Phase-0 is the highest-leverage countermeasure - not because it proves efficacy, but because it resolves the translational uncertainties that decide a programme’s fate: exposure, feasibility, and early engagement in humans. It is underused for a reason: traditional operations impose large, fixed overheads even on small studies, stripping Phase-0 of its strategic advantage - speed. Phase-0 pays only when it stays small, fast, high-signal, and leadership has the discipline to act on the result, including the hardest call: stop.

That is why clinically serious, properly governed, AI-enabled decentralised Phase-0 platforms are not a “nice innovation.” They are a structural upgrade. They:
  • cut the start-up tax that makes early studies slow,
  • broaden access beyond narrow site bottlenecks,
  • protect measurement integrity in real time,
  • and make early human experimentation repeatable rather than bespoke.
In this model, AI is neither the product nor an oracle. It is the force multiplier that makes the learning engine reliable: tightening designs, enforcing quality, accelerating review, catching deviations early, and stripping operational waste so small studies can stay small - and decisions can stay timely.

The provocation is straightforward:
  • If you care about patients, you should want more early human learning, not less - because the most ethical trial is often the one that ends a weak programme quickly and redirects resources to something that can help.
  • If you care about ROI, you should want the same thing - because the edge comes from collapsing uncertainty sooner, taking more credible shots, and concentrating resources on real human signal rather than preclinical stories.
Done well, an AI-enabled decentralised Phase-0 platform creates rare alignment: patients get better-targeted medicines sooner, and investors back a system that wastes less time being wrong - while finding winners faster.
  • Neurosurgery is a discipline that diagnoses and treats a range of injuries and disorders of the brain and the central nervous system
  • For millennia the speciality was dominated by forms of craniotomies, which are procedures to remove portions of the skull to gain access to brain disorders
  • In the early and mid-20th century visual, guidance and radiation technologies disrupted the treatment of some brain disorders by introducing less- and non-invasive procedures to the discipline
  • At the beginning of the 21st century, a flurry of rapidly developing innovative technologies including, augmented reality, artificial intelligence (AI), robotics and genomic and cellular therapies, are accelerating the trajectory of neurosurgery towards a less- and non-invasive speciality
 
Brain disorders and the changing nature of neurosurgery
 
Populations throughout the world are growing and aging, the prevalence of age-related disabling neurological disorders is increasing, and healthcare systems are facing large and escalating demands for treatment, rehabilitation, and support services for such disorders. According to the most recent Global Burden of Disease (GBD) Study, neurological disorders are the leading cause of disability and the second leading cause of death in the world.
 
The total annual global burden of traumatic brain injury alone is ~US$400bn and in the US, ~16% of households are affected by brain impairment, with many individuals requiring 24-hour care. This suggests that often several family members are involved in the caregiving process, and some are juggling the responsibilities of caregiving, child rearing and employment simultaneously.
 
The scarcity of established modifiable risks for most of this vast and rapidly growing neurological burden suggests that innovations are required to develop efficacious prevention and treatment strategies. This Commentary describes some of these, especially those that have changed or have the potential to change neurosurgery, by making therapies less- and non-invasive, and hold out the prospect of improving patient outcomes and lowering healthcare costs.
 
Neurosurgery is a medical speciality concerned with diagnosing and treating a range of disorders and injuries of the brain and central nervous system (CNS) in patients of all ages. These include tumours of the brain and CNS, infections of the CNS, pituitary tumours and neuroendocrine disorders, traumatic brain injury, cerebral aneurysms and stroke, hydrocephalus and other conditions that affect the flow of cerebrospinal fluid, degenerative spine disorders, Parkinson’s disease, Alzheimer’s, epilepsy, spina bifida, and psychiatric disorders.

Treating brain conditions is complex and challenging. This is partly because the brain is one of the best protected organs of the human body. It is encased in the bones of the skull, covered by the meninges, which consist of three membranes, and cushioned by cerebrospinal fluid (CSF). It is also protected by the blood-brain barrier (BBB), a network of blood vessels and tissue comprised of closely spaced cells, which shields the brain from toxic substances in the blood, supplies brain tissue with nutrients, and filters harmful compounds from the brain back into the bloodstream. The BBB limits the ability of therapeutics to be effectively delivered to the brain and thereby complicates the treatment of CNS disorders. Further, the brain itself does not feel pain because there are no nociceptors (sensory receptors for painful stimuli) in its tissue, which means neurological disorders are often diagnosed late, when treatment is more challenging and costly, and survival less likely.

Such factors partly explain why neurology and neurosurgery have been slower than some other specialities to take advantage of new and evolving technologies. However, this is changing. Over the past five decades, progress in three-dimensional (3D) visualization, miniaturisation, digital technology, robotics, computer assisted manipulation, radiation therapy, early diagnosis of cancer, and precision medicine has contributed to improvements in the diagnosis, prognosis, and prevention of some neurological conditions and started to transform neurosurgery towards less- and non-invasive procedures that execute complex tasks effectively, reduce mechanical errors, shorten operating times, and improve patient outcomes.
 
Further, the growing significance of applying artificial intelligence (AI) and machine learning techniques to pre-, intra- and post-operative clinical data introduces the possibility of a new suite of medical services that have the potential to enhance patient outcomes and reduce costs by improving diagnosis, planning and the rehabilitation of patients. And more recently, there are growing synergies between neurosurgery and gene and cellular therapies, which promise to accelerate personalized, non-invasive treatments for a range of neuro disorders.
 
In this Commentary
 
This Commentary is divided into 9 sections. Section 1 provides a brief history of neurosurgery, which has its genesis in ancient times when a form of craniotomy (surgical removal of a portion of the skull) was practiced, and notes the difference between a craniotomy and a craniectomy. Section 2 describes how, in the mid-20th century, neurosurgery pivoted after ~4 decades when Lars Leksell, a Swedish surgeon, introduced a stereotactic guided device that permitted the accurate positioning of probes to treat small targets in the brain that were not amenable to conventional surgery. Shortly afterwards Leksell developed ‘stereotactic radiotherapy’, which formed the basis of the Gamma Knife®, a device that provides non-invasive surgeries for a range of brain disorders. Section 3 details how advances in magnification, illumination, and the development of fibreoptics contributed to less-invasive endoscopic neurosurgeries, which enabled a range of brain disorders to be treated through a small burr hole in the skull. Previously such procedures would have required a craniotomy. This section also notes the rapid development of endovascular neurosurgery, which uses tools that pass through blood vessels to diagnose and treat diseases and conditions of the brain rather than using open surgery. Today, neuro-endovascular surgery is the most practiced therapeutic approach for a range of vascular conditions affecting the brain and spinal cord and is positioned to grow further over the next decade. Section 4 suggests how neurosurgery has benefitted from a range of rapidly developing 21st century technologies including augmented reality, artificial intelligence (AI), robotics, and genomic and cellular therapies. All help to increase less- and non-invasive neurosurgical procedures and contribute to advancing personalized therapies that improve patient outcomes and lower costs.
Section 5 provides some insights into the life of a neurosurgeon through the lens of Henry Marsh, an English neurosurgeon who, between 2014 and 2022, published three candid memoirs, which chronicle his career, describe the daily challenges and frustrations of the speciality, and explain how neurosurgical units have changed the way they are organized and run. Section 6 briefly mentions the increasing prevalence of dementias. Although outside the direct realm of neurosurgery, the scale and speed of their growth are likely to have an indirect impact on it. Section 7 introduces traumatic brain injury (TBI), a condition caused by a blow to the head and suffered by millions. The section describes the gold standard management of severe TBI and flags a pressing need to develop a non-invasive modality for managing the condition. Section 8 notes the frustration of neurosurgeons with the late diagnosis of brain tumours and describes well-resourced global endeavours to detect a wide range of cancers from a single blood test in asymptomatic people. Takeaways follow in Section 9 and suggest that a significant proportion of neurological disorders, which previously were treated with craniotomies, are now treated with either less- or non-invasive procedures. With the speed at which technology and biomedical science are developing, the only direction of travel for neurosurgery is towards non-invasive procedures.
 
Section 1
History
 
Neurosurgery has a long history, with its genesis in Mayan civilization ~1500 BCE, which practiced cranial deformations that included flattening frontal skull bones. During the Egyptian era, when mummification started to be practiced ~2,500 BCE, embalmers did not use a form of craniotomy to gain access to the brain. Instead, they used hooked instruments to remove the brain through the nose: a prototype of modern transsphenoidal surgery, which is a common procedure today for removing tumours of the pituitary gland. Rather than opening the skull with a traditional craniotomy, the physician reaches the tumour through the nasal passages and the sphenoid sinus.
 
In ancient Peru Inca surgeons practiced an early form of craniotomy referred to as trepanation, which used a scraping technique to penetrate the skull. Such procedures were performed on adult men to treat injuries suffered during combat. A version of this procedure called a trephination was also practiced in Egyptian and Roman times and performed on individuals who had experienced head traumas. The approach entails making a hole in the skull to relieve the build-up of intracranial pressure (ICP) caused by brain oedema (swelling) and is described by Hippocrates in the Greek era. The first known neurosurgery in Greece took place ~1900 BCE in Delphi when skull trephinations were probably performed for religious reasons. Later, the technique was recommended by Galen during the Roman period for people who had suffered a traumatic brain injury (TBI) in battle. From ~500 to ~1500 AD, the rise of religion and war resulted in many craniocerebral traumas, which contributed to the early development of neurosurgery as a distinct specialty.
 
Similar trephination procedures were performed during the American Revolutionary War, which secured American independence from Great Britain, and culminated in the Declaration of Independence on July 4, 1776. During the war soldiers suffered TBIs after being hit on the head with the butt of a rifle. Although the treatment for severe TBI is similar today (see Section 7), the main difference is that the surgical instruments used in the 18th century were not powered. In 1909, Theodor Kocher, a Swiss physician and Nobel Laureate in Medicine, became the first person to systematically describe a decompressive craniectomy procedure for severe TBI patients. A craniectomy is different to a craniotomy. The latter is a surgical procedure in which a section of the skull is removed to expose the brain and is performed to treat various neurological conditions, or when an injury or infection has occurred in the brain. A craniectomy involves a different surgical technique and is used on people suffering severe TBI to relieve brain oedema. In such a procedure the bone fragment removed may not be replaced immediately; it is either replaced during a subsequent surgery or discarded in favour of a future reconstruction using an artificial bone.

 
Section 2
Stereotactic surgery
 
For millennia, a form of craniotomy dominated what we now know as neurosurgery. During the 20th century advances in medical science paved the way for the introduction of less- and non-invasive modalities to treat brain disorders (see below). A landmark event occurred at the beginning of the 20th century with the introduction of stereotactic surgery, which makes use of three-dimensional (3D) coordinates to locate and treat lesions in the brain. The method was first reported in the May 1908 edition of Brain by two British investigators, Victor Horsley and Robert Clarke. The device they described became known as the Horsley-Clarke apparatus and was used to study the cerebellum in animals by enabling accurate electrolytic lesions to be made in the brain of a monkey. It took ~40 years before the technique was applied to humans, following the publication of a seminal paper by Ernest Spiegel and Henry Wycis in the October 1947 edition of Science. Spiegel was a Vienna-trained neurologist who moved to Temple Medical School in Philadelphia, which in 2015 was renamed the Lewis Katz School of Medicine. Wycis was one of Spiegel’s students who became a neurosurgeon. By the time they published their 1947 paper, they had performed several neurosurgeries, and there had been sufficient advances in neurophysiology, pneumoencephalography, radiology, and electrophysiology for them to design a device like the Horsley-Clarke apparatus, which was fixed to a patient’s head by means of a plaster cast and was accurate enough to be used in human stereotactic surgery. Spiegel and Wycis’s surgical innovations attracted attention from physicians internationally, but there were no commercial stereotactic frames, and neurosurgeons were obliged to design and manufacture their own.
A pivotal moment occurred in 1947, when Lars Leksell, a Swedish physician and Professor of Neurosurgery at the Karolinska Institute, in Stockholm, visited Wycis in Philadelphia and afterwards designed a lightweight titanium head frame to provide the basis for stereotactic surgery, which he described in a 1949 paper entitled, ‘A stereotaxic apparatus for intracerebral surgery’.
 

The Gamma Knife®   
In the early 1950s, Leksell and Börje Larsson, a biophysicist from the University of Uppsala, Sweden, convinced that agents other than cannulas and electrodes could be used to eradicate pathologies in the brain, combined a source of radiation with a stereotactic guiding device. This led to the development of a non-invasive device, which Leksell used to perform the first radiosurgical procedure, discovering that a single dose of radiation could successfully destroy deep brain lesions. He called this technique “stereotactic radiosurgery”, which, in 1968, led to the first stereotactic Gamma Knife®, which used a focused array of intersecting beams of gamma radiation to treat lesions within the brain. Its success encouraged Leksell to use the device over the ensuing decade in functional brain surgeries to treat intractable pain and movement disorders. Leksell’s radiosurgical device used Cobalt-60 (a synthetic radioactive isotope) as a radiation source. The basic physics that drives stereotactic radiosurgery today is substantially the same: ~200 tiny beams of radiation are focused on a target in the brain with submillimetre accuracy. Although each beam has little effect on the brain tissue it passes through, a strong dose of radiation is delivered to the place where the beams meet.
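The cross-firing principle described above can be illustrated with a toy calculation. This sketch uses made-up dose units and ignores real dose-planning physics entirely; it only shows why tissue crossed by a single beam receives a small dose while the focal point, where all ~200 beams intersect, receives their sum.

```python
# Toy sketch of the Gamma Knife cross-firing principle
# (illustrative numbers only, not clinical dosimetry).
N_BEAMS = 200          # approximate number of convergent gamma beams
DOSE_PER_BEAM = 1.0    # hypothetical dose units delivered by one beam

# Healthy tissue along any single entry path is crossed by one beam only,
# whereas the target lesion sits at the intersection of all of them.
dose_along_one_path = DOSE_PER_BEAM
dose_at_focus = N_BEAMS * DOSE_PER_BEAM

print(dose_at_focus / dose_along_one_path)  # → 200.0
```

The resulting ~200:1 ratio is the essence of the technique: the lesion at the focus receives a destructive dose while tissue along each individual beam path is largely spared.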
 
Over time, the Gamma Knife® has been refined and enhanced, and its efficacy and safety are well established. Today, the Gamma Knife® provides a non-invasive operative system for a range of brain disorders, including small to medium size tumours, vascular malformations, epilepsy, and nerve conditions that cause chronic pain. Before its introduction such disorders were treated by surgeries involving craniotomies. In 1987, the Gamma Knife® was introduced into the US and installed at the Universities of Pittsburgh and Virginia. Although it took decades to achieve regulatory approval and become widely used throughout the world, the Gamma Knife® represents a significant technological advance in neurosurgery. Unlike craniotomies, the device provides painless procedures that do not require anaesthesia; treatments take just one session, and patients can return to normal activities almost immediately. The Gamma Knife® is ~90% successful in killing or shrinking brain tumours, and today there are ~300 Gamma Knife® sites worldwide, which each year treat >60,000 patients.
 
Neurosurgeon Ranjeev Bhangoo, Clinical Director for neurosurgery at King’s College Hospital, London, UK likens the Gamma Knife® to, “an umbrella, that sits above the patient’s head, rather like the old-fashioned hair dryers in women’s hair salons, but much bigger and more complex”, and stresses that the procedure, “is not painful. Forget any notion of surgery: there’s no knife, there’s no operating theatre. It’s done with the patient awake: you walk in, have your treatment, and walk out.” See videos.

 

What is Gamma Knife Radiosurgery?
 

Is Gamma Knife Radiosurgery painful?

 
Section 3
Endoscopic and endovascular neurosurgery
 
Neuroendoscopy
Neurosurgery pivoted again in the 1990s when disorders that would normally require opening the skull began to be treated less invasively through a small burr hole. Improved magnification, miniaturization, and illumination of lenses, and the development of fibre optics, facilitated an endoscopic surgical procedure to treat hydrocephalus, a condition in which cerebrospinal fluid (CSF) abnormally accumulates in the brain. There is currently no prevention or cure for the condition, but it can be managed with surgery. The procedure involves creating an opening in the floor of the third ventricle using an endoscope (a thin, flexible, tube-like imaging instrument with a small video camera on the end) placed within the ventricular system through a burr hole in the skull. In the late 1990s, neuro-endoscopy expanded to treat lesions outside the ventricular system, and the endoscopic endonasal approach was established as a technique that allowed surgeons to go through the nose to operate on areas at the front of the brain and the top of the spine.

Since the early use of endoscopic procedures for treating intrasellar pituitary adenomas, the approach has been expanded to treat a range of skull base lesions. Today, skull base surgery is undertaken to remove both noncancerous and cancerous growths and abnormalities on the underside of the brain or the top few vertebrae of the spinal column. Because this is such a difficult area to see and reach, skull base surgery has benefited from endoscopic procedures in which surgeons insert instruments through natural openings in the skull - the nose or mouth - or through a small hole just above the eyebrow. This type of surgery requires a team of specialists that may include ear, nose, and throat (ENT) surgeons, maxillofacial surgeons, neurosurgeons, and radiologists. Before endoscopic skull base surgery was developed, the only way to remove growths in this area of the body was by making an opening in the skull, and in some cases this type of open surgery may still be needed today.

Recent advances in endoscope design have produced equipment that is smaller and more efficient, with improved resolution and brighter illumination, than earlier models. Such developments, combined with surgeon enthusiasm, have contributed to the expansion of neuro-endoscopy to treat a range of neuro disorders including intracranial cysts, intraventricular tumours, skull base tumours, craniosynostosis (a birth defect in which the bones in a baby's skull join too early), degenerative spine disease, hydrocephalus and a rare benign tumour called hypothalamic hamartoma.
 
Neuro-endoscopic surgery causes minimal damage to normal structures, carries a lower rate of complications, shortens hospital stays, minimizes cosmetic concerns associated with many neurosurgical conditions and improves patient outcomes. It is positioned to take advantage of further miniaturization of cameras and optical technology, innovations in surgical instrumentation design, and further innovation in navigation and robotics systems.
 

Endovascular neurosurgery
Another innovation that has developed over the past five decades is endovascular surgery. The term ‘endovascular’ means ‘inside a blood vessel’. Endovascular neurosurgery uses tools that pass through blood vessels to diagnose and treat diseases and conditions of the brain, rather than using open surgery. The genesis of endovascular neurosurgery is credited to Professor Alfred Luessenhop, an American physician at Georgetown University Hospital in Washington DC, who, in 1964, carried out the first embolization of a cranial arteriovenous malformation and the first intracranial arterial catheterization to occlude an aneurysm. Over the past 60 years, endovascular neurosurgery has developed into a subspeciality, and today >50% of cerebral aneurysms are treated through this minimally invasive approach.
 
Neuro-endovascular surgery has become the most practiced therapeutic approach for the majority of vascular conditions affecting the brain and spinal cord. It is used more frequently than open surgery for the management of complex vascular conditions, with high rates of safety and efficacy. The expansion of endovascular techniques into the treatment of stroke, one of the leading causes of death in the US, has provided meaningful benefits to large numbers of patients worldwide. Further, with populations throughout the world aging, neuro-endovascular techniques are poised to become one of the most necessary and important treatment modalities within neurosurgery.
 
With age our brains shrink, which causes a space to develop between the surface of the brain and its outermost covering. This increases the possibility that a knock to the head of a person >60 will result in a brain blood vessel rupturing and bleeding: a subdural hematoma. Research suggests that, “significant numbers occur after no significant antecedent trauma”, and could be the result of “an inflammatory process occurring at the level of the dural border cell”. A chronic version of this disorder, in which blood slowly accumulates, can manifest within weeks of the first bleed. With aging populations, chronic subdural hematoma (cSDH) is predicted to become one of the most common neurosurgical conditions in the near-term future and is expected to be treated with neuro-endovascular techniques.
 
Further, minimally invasive neuro-endovascular procedures are now commonly used to repair cerebral aneurysms, which are weak or thin spots on arteries in the brain that balloon and fill with blood. A bulging aneurysm can put pressure on brain tissue and may also burst or rupture, spilling blood into the surrounding tissue (brain haemorrhage). Today most brain aneurysms are treated minimally invasively with neuro-endovascular techniques, which means an incision in the skull is not required. Instead, the surgeon guides a catheter or thin metal wires through a large blood vessel in the patient’s groin to reach the brain, using contrast dye to identify the problematic blood vessel. The aneurysm is then sealed off from the main artery, which prevents it growing and rupturing. In the US ~6.5m people are living with an unruptured brain aneurysm. The annual rate of rupture is ~10 per 100,000 people: ~30,000 Americans suffer a brain aneurysm rupture each year. Ruptured cerebral aneurysms are fatal in ~50% of cases, and of those who survive, ~66% suffer some permanent neurological deficit. Each year, there are ~0.5m deaths worldwide caused by brain aneurysms, and ~50% of the victims are <50 years of age.
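The US figures quoted above hang together arithmetically. A quick back-of-the-envelope check, assuming a US population of ~330m (an assumption; the population is not stated in the text):

```python
# Rough consistency check of the US aneurysm figures (rounded inputs;
# the ~330m US population is an assumption, not from the text).
us_population = 330_000_000
annual_rupture_rate = 10 / 100_000        # ruptures per person per year

ruptures_per_year = us_population * annual_rupture_rate   # ~33,000 (text: ~30,000)
fatal = ruptures_per_year * 0.50                          # ~50% of ruptures are fatal
deficit = (ruptures_per_year - fatal) * 0.66              # ~66% of survivors

print(round(ruptures_per_year), round(fatal), round(deficit))
```

On these rounded inputs: ~33,000 ruptures a year, ~16,500 deaths, and ~10,900 survivors left with a permanent neurological deficit, broadly matching the figures quoted above.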

 
Section 4
Evolving technologies affecting neurosurgery

At the beginning of the 21st century scientific and technological advances are again changing the face of neurosurgery. This section briefly describes four such changes.
 

Neurosurgery and augmented reality
Neurosurgery relies on visualization and navigational technologies and makes liberal use of computed tomography (CT) and magnetic resonance imaging (MRI) scans during preoperative planning and intraoperative surgical navigation. More recently, augmented reality (AR) applications have been used to complement conventional visualization and navigational technologies to enhance neurosurgery. AR brings digital information into the real environment and is beginning to play an increasing role in helping neurosurgeons train, as well as plan and perform complex surgical procedures. In June 2020, surgeons at Johns Hopkins University successfully carried out a spinal fusion surgery for the first time in the US using xvision™, an FDA-approved AR device for spine surgery developed by Augmedics Inc., a Chicago based company, which went public in 2020 through a reverse merger with Malo Holdings. Xvision™ allows surgeons to “see” the patient's anatomy through skin and tissue as if they have X-ray vision, and to accurately navigate instruments and implants during surgical spine procedures. Each year, there are ~1.62m instrumented spinal procedures performed in the US, the majority of which are undertaken using a freehand technique, which can lead to suboptimal results.
 

Neurosurgery and artificial intelligence
Such heavy use of advanced imaging and guidance technologies creates a vast amount of clinical data during a patient’s neurosurgical journey. It is not altogether clear how effectively pre-, intra-, and post-operative clinical patient data are collected and analyzed to enhance surgical procedures and patient outcomes. An article in the August 2021 edition of the journal Neuroscience entitled, ‘Neurosurgery and Artificial Intelligence’, suggests that the collection and analysis of such data are beginning to happen. Over the past decade, AI techniques applied to data collected during patients’ neurosurgical journeys have enhanced diagnoses and prognostic outcomes and contributed to postoperative care and the rehabilitation of patients. Predicting prognosis, identifying potential postoperative complications, and tracking rehabilitation are all enhanced by AI applications. The symbiotic relationship between AI and neurosurgery, which today is in its infancy, is positioned to grow. This will not only help AI to develop better and more robust algorithms but will provide opportunities for MedTechs to gain access to new revenue streams by providing enhanced patient services.
 
Robotics
Linked to medical imaging and navigation technologies is the increasing use of surgical robotics. However, neurosurgery has been slower than other specialties to incorporate robotics into routine practice owing to the anatomical complexity of the brain and the spatial limitations inherent in neurosurgical procedures. Notwithstanding this, the first documented use of a robot-assisted surgical procedure was in neurosurgery. In 1985 Yik San Kwoh and colleagues, at the Memorial Medical Center in Long Beach, California, used an Unimation Programmable Universal Machine for Assembly (PUMA) 200 (which was originally designed for General Motors’ factories) to perform a CT-guided stereotactic biopsy of a brain lesion. Although discontinued, the PUMA 200 is considered the predecessor of current surgical robots. There are now several robotic systems that have gained regulatory approval for cranial surgery. These include Zimmer Biomet’s ROSA ONE Brain, which obtained FDA approval in 2012 for intracranial applications, and Renishaw’s Neuromate robotic system, which was granted approval by the FDA in 2014. The former has been used extensively in the treatment of epilepsy, and the latter provides surgeons with five degrees of freedom for use in stereotactic applications. Robotics is a fast-moving discipline, which together with AI and machine learning, is positioned to impact neurosurgery in the near to medium term.
  

Neuro-pharmaceuticals and Trojan horses
There are growing synergies between neurosurgery and gene and cellular therapies. However, the blood-brain barrier (BBB), which plays a significant role in controlling the influx and efflux of biological substances essential for the brain to operate effectively, makes it extremely difficult to deliver drugs to the brain. Over the past three decades, many biologics (medications developed from blood, proteins, viruses, or living organisms) have entered brain and central nervous system (CNS) clinical studies. However, they have not gained FDA approval, mainly because they did not have effective mechanisms to deliver neuro-pharmaceuticals across the BBB. Instead, the clinical trials were predicated upon a variety of BBB avoidance strategies. CSF injections are the most widely practiced approach, attempting to deliver drugs to the brain by bypassing the BBB. However, this results in limited drug penetration because of the rapid export of CSF from the brain back into the bloodstream. Future drug or gene-based neuro-pharmaceuticals will need to be accompanied by advances in BBB delivery vehicles.
 
Currently, there are numerous scientific endeavours to devise innovative and effective ways to deliver gene therapies across the BBB to the brain. Success in this regard will mean that genomic and cellular therapies will increasingly have the potential to work synergistically with neurology and neurosurgery to provide non-invasive, personalized care for a range of brain disorders including Alzheimer’s, Parkinson’s, spinal muscular atrophy, spinocerebellar ataxia, epilepsy, Huntington’s disease, stroke, and spinal cord injury. Endeavours are underway to re-engineer biologic drugs as brain-penetrating neuro-pharmaceuticals using BBB molecular Trojan horse technologies. This approach employs genetically engineered proteins (molecular Trojan horses), which carry genes across the BBB to have a therapeutic impact on brain disorders. The future development of neuro-pharmaceuticals, linked to effective means of delivering them across the BBB, is positioned to reduce the need for interventional neuro therapies, but this may take some time.

 
Section 5
A perspective: life as a neurosurgeon
 
Three memoirs by Henry Marsh, an English neurosurgeon who treated a range of brain disorders over a 40-year career at a leading neurosurgical unit in London, provide insights into the human dramas that occur in a busy modern hospital. Marsh studied Politics, Philosophy and Economics (PPE) at Oxford University before starting medical school at the Royal Free Hospital in London. In 1984, he became a Fellow of the UK’s Royal College of Surgeons and in 1987, was appointed a consultant neurosurgeon at the Atkinson Morley Regional Neurosciences Centre at St George’s Hospital in London, where he spent his entire career.
 
Marsh’s first book is an unflinching memoir entitled, Do No Harm: Stories of Life, Death and Neurosurgery, which was published in 2014, and describes, with compassion and candour, challenging professional experiences filled with risk and imminent death. The book opens with the sentence, “I often have to cut into the brain and it's something I hate doing.” His first operation as a neurosurgeon was to treat a cerebral aneurysm. Forty years ago, this would have required opening the skull to access the brain. The procedure had a profound impact on Marsh, who commented, “What could be finer than to be a neurosurgeon. The operation involved the brain, the mysterious substrate of all thought and feeling, of all that was important in human life: a mystery, it seemed to me, as great as the stars at night and the universe around us.”
 
Marsh describes the difficult decisions that neurosurgeons and patients regularly must make, and that change lives forever. He recalls moments of celebration and gratification when complex operations go well, and candidly recounts some of the more undesirable outcomes and slips of the hand that result in devastating consequences. Marsh liked working with American neurosurgeons and came to “love their optimism, their faith that any problem can be solved if enough hard work and money is thrown at it, and the way in which success is admired and respected and not a cause for jealousy”. He found the attitudes of American surgeons, “a refreshing contrast to the weary and knowing scepticism of the English”. However, after visiting hospitals in the US he expressed some scepticism about “the extremes to which treatments can sometimes be pushed” and wondered whether American physicians and patients have yet to understand that the famous American dictum ‘death is optional’ was meant as a joke. Tellingly, Marsh notes that “sometimes doctors admit their mistakes and ‘complications’ to each other, but are reluctant to do so in public, especially in countries that have commercial, competitive healthcare systems.”
 
Marsh’s second memoir, Admissions: A Life in Brain Surgery, was published in 2017, two years after he retired from his full-time job in England to work pro bono in Ukraine and Nepal. A documentary of his work in Ukraine, The English Surgeon, won an Emmy award. Marsh uses ‘Admissions’ to take an inventory of his life, which makes the book an even more introspective memoir than his first. He compares the challenges of working in troubled, impoverished countries like Nepal with his experience as a neurosurgeon in wealthy nations like the UK and US. The excesses of American medicine intrigued Marsh, and he comments, “only in America have I seen so much treatment devoted to so many people with such little chance of making a useful recovery.” But he also expresses disillusionment with the administrative red tape in England’s National Health Service (NHS), which he maintains has eroded the authority and status of surgeons. Of his final years working as a surgeon at St George’s Hospital in London he bemoans, “The feeling that there was something special about being a doctor had disappeared.” Marsh’s true love was patients and neurosurgery, and at the end of his career he was spending less time with patients and more time in meetings justifying his judgements and familiarizing himself with the latest UK government targets and edicts, which led him to say, “doctors need regulating, but they need to be trusted as well. It is a delicate balance, and it is clear to me that in England the government has got it terribly wrong”.
 
Marsh suggests that patients’ fear encourages surgeons to exaggerate their competence and knowledge to “shield our patients from the frightening reality they often face”. Over time, Marsh suggests, surgeons tend to believe the exaggerated versions of themselves. But the best un-learn their self-deception, come to accept their shortcomings, and learn from their mistakes. “We always learn more from failure… Success teaches us nothing,” Marsh writes.
 
Marsh’s third memoir, And Finally: Matters of Life and Death, was published in August 2022 and is very British, full of self-deprecation and dominated by the news that he has been diagnosed with incurable prostate cancer. Marsh describes the sudden reversal of roles, from omniscient and omnipotent neurosurgeon to humble patient, and provides descriptions of the ebbs and flows of his therapeutic journey, which give valuable insights into how medicine in England works.
 
All three books bear witness to the fact that neurosurgery is a stressful and demanding profession, which requires extensive training, stamina, a high degree of manual dexterity, excellent hand-eye coordination, exquisite precision, extraordinary attention to detail, an ability to rapidly gather and process complex information to resolve challenging problems, compassion and empathy for patients, communication skills, and teamwork. Unlike in other surgical disciplines, a relatively small mistake can lead to “appalling disability”, coma, and death. According to research published in the October 2014 edition of Surgical Neurology International, ~25% of neurosurgical errors can be prevented or reduced with the increased use of evolving technologies, some of which are described in this Commentary.
 

Changes in the organization of neurosurgical units 
During Marsh’s 40-year career there were changes in the way neurosurgical units were organized and run; particularly the development of subspecialities among physicians and the use of multidisciplinary team approaches to clinical challenges. Much of Marsh’s career reflected a time when neurosurgeons worked in relative isolation and treated the wide range of neurosurgical conditions that presented in their clinics. Today, most neurosurgeons have a primary interest in a subspeciality, such as epilepsy, neurovascular surgery, spinal surgery, or the excision of tumours, and a secondary interest, which they share with colleagues. This tends to facilitate cross referral of patients among a team of physicians and improves patient care and the training of health professionals. In the operating room (OR) neurosurgeons work with other physicians, anaesthetists, trainee doctors, theatre nurses, and medical students. Outside the OR they collaborate with radiologists, who use a range of diagnostic tools, including CT and MRI scans and cerebral angiographies, to detect abnormalities in blood vessels such as aneurysms, blockages, and bleeding. These neuroimaging technologies and neurosurgery have become inseparable. Neurosurgeons also work with neurologists, oncologists, ophthalmologists, and paediatricians. In 2017, Bob Carter, head of neurosurgery at the Massachusetts General Hospital in the US, appreciating the interconnections between the several clinical disciplines that care for people with neurological disorders, merged his neurosurgery department with the departments of neurology, psychiatry, and neuroradiology. While subspecialisms and teamwork have changed the organization of neurosurgical units, new and emerging technologies have expanded the repertoire of neurosurgeons.
 

Awake brain procedures
Marsh specialised in operating on the brain while the patient is awake. This aspect of his work was the subject of a BBC documentary, Your Life in Their Hands. Awake brain procedures are usually performed when a lesion is located near the frontal lobes responsible for motor skills and speech. In the video below, Ranjeev Bhangoo describes the procedure, “It’s a technique where the patient is awake during the brain surgery. The patient is neither in pain nor suffering. When we make a cut in the skin and raise a trapdoor in the skull the patient is completely asleep. We wake them up after that point and the good news is the brain itself doesn’t feel pain. So, you can do this operation without the patient being in any distress or pain. It’s an unusual situation and the patient is prepared for it beforehand. The reason why you might want to do an awake craniotomy is because in some situations, tumours are close to critical structures of the brain that control speech or movement. While we have good maps of the brain and we have image guidance, they’re not precise enough. You want the patient to be talking to you and you want to be stimulating bits of the brain to see precisely where speech is so that you can avoid those areas and do the same with movement, you want to see the patient moving his or her arm or leg while you’re stimulating bits of their brain. So, we use an awake craniotomy when we’re operating near to what we call ‘eloquent’ areas of the brain that, if damaged, would produce a devastating deficit such as problems with speech or movement”. See video.
 

When and why is an awake craniotomy performed?

 
Section 6
The increasing burden of dementias on healthcare systems and economies
 
As populations age and live longer, so dementia conditions increase. Alzheimer's, which affects parts of the brain that control thought, memory, and language, is the most common dementia in Western societies. It is a progressive disorder that begins with mild memory loss and leads to a loss of the ability to carry on a conversation and respond to one's environment. In the three decades between 1990 and 2019, the global incidence of Alzheimer’s and other dementias increased by ~148%. In 2022, there were >6.5m Americans living with the condition: ~73% were >65 and ~66% were women, although the latter may simply be due to women living longer. By 2050, it is projected that ~13m Americans will suffer from dementia, which kills 1 in 3 seniors: more than breast and prostate cancers combined.
 
According to the World Health Organization (WHO), there are ~55m people with dementia globally, and >60% are living in low- and middle-income countries (LMICs). Age is the most significant risk factor: the likelihood of Alzheimer’s doubles every 5 years after you reach 65. Dementias also appear to be increased by conditions that damage the heart and blood vessels, including heart disease, diabetes, stroke, high blood pressure, and high levels of cholesterol. As the proportion of older people in populations increases in nearly every country, the number of people living with dementia is expected to rise to ~78m by 2030 and ~139m by 2050. There is no cure for Alzheimer's, and treatments tend to fall to neurologists. Drug therapies include galantamine, rivastigmine, and donepezil, which are cholinesterase inhibitors (also known as anti-cholinesterases): chemicals that prevent the breakdown of the neurotransmitter acetylcholine, prescribed for mild to moderate Alzheimer's symptoms, and which may help reduce or control some cognitive and behavioural symptoms. There are also non-drug options. Although outside the direct realm of neurosurgery, the scale and speed of the growth of Alzheimer’s and other dementias are likely to indirectly impact neurosurgery by increasing the burden on over-stretched healthcare systems. Under such circumstances, it seems reasonable to assume that there will be increased pressure on neurosurgery to become less resource intense, which means less invasive and less costly while improving patient outcomes.
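The "doubles every 5 years after 65" rule of thumb quoted above is an exponential relationship, sketched below as a relative-risk multiplier. The function name and formulation are illustrative only, not an epidemiological model.

```python
# Illustrative sketch of "Alzheimer's likelihood doubles every 5 years
# after 65", expressed relative to a 65-year-old (not a clinical model).
def relative_alzheimers_risk(age: float) -> float:
    """Relative likelihood versus age 65, doubling every 5 years."""
    if age < 65:
        raise ValueError("the rule of thumb applies from age 65 onwards")
    return 2 ** ((age - 65) / 5)

for age in (65, 70, 75, 80, 85):
    print(age, relative_alzheimers_risk(age))   # 1x, 2x, 4x, 8x, 16x
```

On this rule of thumb an 85-year-old is ~16 times as likely to develop Alzheimer's as a 65-year-old, which is why aging populations drive the projections in the paragraph above.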
 
Section 7
Traumatic brain injury

On Thursday 29th September 2022, Tua Tagovailoa, the Miami Dolphins’ quarterback, received a head injury during a game against the Cincinnati Bengals and was stretchered off. Four days earlier he had left the field after receiving a head injury while playing against the Buffalo Bills; he was checked for concussion, cleared, and returned to the field in the third quarter. Subsequently, the NFL Players Association exercised its right to remove the independent neurological expert who was involved in the decision to clear Tagovailoa to return to the Buffalo Bills game after being evaluated for a traumatic brain injury (TBI). The episode underscores the significance of injuries to the brain and the challenges of accurately assessing their severity and treating them adequately.
 
TBI is an alteration in brain function, or other evidence of brain pathology, caused by a sudden trauma that damages the brain. Each year, the condition affects ~69m individuals worldwide. Symptoms can be mild, moderate, or severe, depending on the extent of the damage: annually ~5.5m severe cases are recorded globally. The epidemiology of the disorder is challenging because, in low-resourced regions of the world, where the prevalence of TBI is believed to be high, data are poor. According to the WHO, ~90% of deaths due to head injuries occur in low- and middle-income countries, where ~85% of the global population live and where the standards of care are patchy. TBI not only causes health loss and disability for individuals and their families, but also represents a costly burden to healthcare systems and economies through lost productivity and high healthcare costs. The total annual global burden of TBI is ~US$400bn.
 
Since the beginning of the 20th century, our knowledge and understanding of the pathophysiology of brain oedema (swelling) in head trauma patients has increased, and today decompressive craniectomy is a recognised procedure for severe TBI to mitigate intracranial hypertension and its impact on clinical outcomes. One of the largest clinical studies, which sought to determine the efficacy of decompressive craniectomies for TBI patients, was the RESCUEicp trial, the findings of which were published in the September 2016 edition of the New England Journal of Medicine. The study was carried out over a 10-year period, between 2004 and 2014, on 408 randomly assigned patients, 10 to 65 years of age, and concluded that, “At 6 months, decompressive craniectomy in patients with traumatic brain injury and refractory intracranial hypertension resulted in lower mortality and higher rates of vegetative state, lower severe disability, and upper severe disability than medical care”.
 
In the US, TBI is a leading cause of death and disability. Each year, ~1.5m Americans sustain a TBI, ~50,000 die from the insult, ~230,000 are hospitalized and survive, and ~90,000 experience the onset of long-term disability. According to the US Centers for Disease Control and Prevention, ~5.3m Americans (~2% of the population) are living with disability as a result of a TBI. In 2010, the economic impact of TBI in the US was estimated to be ~US$77bn in direct and indirect costs. Each year in the UK, ~1.4m patients attend hospital following head injury, and TBI is the most common cause of death for people in the UK aged <40.
  
Gold standard monitoring of intracranial pressure
There is no cure for severe TBI, and the gold standard of management is to monitor intracranial pressure (ICP), which rises as the injured brain swells (oedema). Current clinical guidelines suggest thresholds for raised ICP, usually between 20 and 25 millimetres of mercury (mmHg), at which treatment is recommended to prevent or reduce further damage to the brain. The device used to monitor ICP is an intraventricular catheter system, which requires drilling a burr hole in the skull to insert a catheter and placing it in a cavity (ventricle) in the brain that is filled with cerebrospinal fluid (CSF). This is then connected to an extra-ventricular drain (EVD) that measures ICP. Such systems are accurate and reliable, but they are also resource-intensive and carry a risk of haemorrhage and infection.
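The guideline thresholds above can be made concrete with a short illustrative sketch. This is not clinical software or any real device API; the function name and category labels are hypothetical, and the 20/25 mmHg cut-offs are simply the range quoted above.

```python
# Illustrative only: bucket ICP readings (mmHg) against the guideline
# thresholds mentioned above (treatment usually recommended at 20-25 mmHg).
def classify_icp(reading_mmhg, lower=20.0, upper=25.0):
    """Return a coarse category for a single ICP reading."""
    if reading_mmhg < lower:
        return "within guideline range"
    elif reading_mmhg < upper:
        return "raised: treatment may be recommended"
    return "high: treatment recommended"

for reading in [12.0, 18.5, 22.0, 27.3]:
    print(reading, "->", classify_icp(reading))
```

In practice, of course, clinicians act on continuous trends and the wider clinical picture rather than single readings.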

Challenges with gold standard monitoring
According to research findings published in the January 2017 edition of the Journal of Neurosurgery, haemorrhage is a common complication of an EVD placement. Among the cases in which patients underwent imaging after a placement procedure, haemorrhage was found in 94 (21.6%). Another study, of 246 EVDs placed in 218 patients over a 30-month period and published in the November 2014 edition of Interdisciplinary Perspectives on Infectious Diseases, reported the cumulative incidence of EVD-related infections to be 8.3%. Further, because of the dearth of qualified neurosurgeons in under-resourced regions of the world, EVD systems are not widely available in LMICs, where the incidence of TBI is understood to be high and increasing.

Non-invasive ICP monitoring
Numerous alternatives to invasive gold standard ICP monitoring are in development, but none has yet established a place in routine clinical practice. A review paper published in the December 2020 edition of the journal Neurotrauma, entitled “Non-Invasive Techniques for Multimodal Monitoring in Traumatic Brain Injury: Systematic Review and Meta-Analysis”, stresses the significance of monitoring ICP and brain oxygenation continuously in severe TBI patients, and suggests that the “two most prominent and widely used technologies for non-invasive monitoring in TBI are near-infrared spectroscopy [a form of photoplethysmography (PPG)] and transcranial Doppler”. Researchers conclude that, “both techniques could be considered for the future development of a single non-invasive and continuous multimodal monitoring device for TBI”.

Transcranial Doppler (TCD) ultrasonography is a non-invasive, painless technique that uses high-frequency sound waves to measure cerebral blood flow velocity, which may correlate with ICP. Research suggests that in ~15% of cases the ultrasound waves are unable to penetrate the patient’s skull, and measurements are prone to intra- and inter-observer variability. As TCD-based non-invasive ICP measurement encounters these challenges, near-infrared spectroscopy is gaining significance. This is a form of PPG technology: an uncomplicated, inexpensive, non-invasive, and convenient optical measurement that has the potential to be used at the site of an injury to quickly assess the severity of head trauma. In the recent case of Tagovailoa, such a non-invasive ICP measurement device could have been applied on the playing field. Over the next decade, expect PPG technology to impact neurosurgery by potentially providing more accurate triaging and further disrupting the gold standard of care for severe TBI patients.
 
Section 8
Brain cancer and early diagnostics

We mentioned the Gamma Knife’s® ability to treat some brain tumours and suggested that patients have benefitted significantly from its use. The first successful modern brain tumour excision was performed in 1878 by William Macewen, a pioneering Scottish surgeon, at the Glasgow Royal Infirmary. At the beginning of the 20th century, contributions by Americans started with Harvey Cushing, who is generally recognised as the father of modern neurosurgery. Working at the Johns Hopkins Hospital in Baltimore, Cushing introduced meticulous documentation of the clinical and pathological details of cerebral tumours and devised several surgical techniques for operating on the brain that became the foundation of neurosurgery as an autonomous surgical discipline. In 1912, he described an endocrinological syndrome caused by a malfunction of the pituitary gland, which is named after him: Cushing’s disease.
 
The prognosis for a brain tumour depends on its type, location, size, growth rate, time of diagnosis, and how much of it can be surgically removed or otherwise treated. Factors including age and general wellbeing, as well as some recognised genetic factors, also influence prognosis. Poor prognosis for brain cancers is perpetuated by the lack of cost-effective, accurate tests that can be used in a primary care setting to diagnose the conditions. This means that a large proportion of brain cancers are diagnosed too late for current treatments to be effective. However, in recent years advances have been made in detecting brain cancers early, and this is expected to significantly improve prognosis.
 
Although there are >120 different types of brain tumours, lesions and cysts, your chances of developing brain cancer are <1%. Brain tumours account for ~90% of all primary central nervous system (CNS) tumours. In 2020, >0.3m people worldwide were diagnosed with a primary brain or spinal cord tumour. According to the Annual Report of the US Central Brain Tumor Registry, >84,000 Americans were diagnosed with a primary brain tumour in 2021. The US National Cancer Institute suggests ~0.6% of Americans will develop brain cancer in their lifetime, and the 5-year survival rate for those that do is only ~33%. This year, >4,000 Americans aged <15 are expected to be diagnosed with a brain or CNS tumour. In the UK, each year ~16,000 people are diagnosed with a brain tumour and ~60,000 people are living with one.
 
The causes of brain tumours are not fully understood. They arise from an abnormal growth of brain cells, or of cells in the brain’s supporting tissues, which can damage the brain, threaten its function and result in death. Some tumours may occur around the edge of the brain and press on certain parts of it, while others can be more diffuse and grow among healthy tissue. In the video below, neurosurgeon Christopher Chandler, who leads the Paediatric and Adolescent Neurosurgical Service at King’s College Hospital, London, explains that, “A brain tumour is an uncontrolled growth of a bunch of cells where the ‘off’ switch is missing. This means that there’s nothing telling these cells to stop growing, so they grow and divide. As this uncontrolled mass, or tumour, grows it displaces brain tissue and causes pressure on the surrounding brain. If you don’t remove the tumour or stop it from growing, it will grow so large that it causes critical pressure on the surrounding structures of the brain, which eventually, if untreated, can kill the patient.” See video.

 
What is a brain tumour?
 
The Holy Grail
Neurosurgeons are frustrated by the fact that brain cancers are often diagnosed late. This is because brain tumours often present with non-specific symptoms and are therefore challenging to diagnose. In the video below, neurosurgeon Ranjeev Bhangoo explains why a brain tumour may be diagnosed late. “Firstly, the symptoms are non-specific: tiredness, headache, poor concentration - maybe not finding your keys as well as you used to - the sort of thing that can happen to any of us when we’re tired. The classic things of having a fit or collapsing occur, but they’re unusual. Your GP is only likely to see just one or two brain tumour cases in his or her whole career. . . . Now, if you do get a scan, the chances of you having a brain tumour are incredibly rare. So, just because a neurologist has organized a scan, you mustn’t get worried because it’s very unlikely that you’ll have a brain tumour. But ultimately, through some path or other, you have a scan, usually a CT scan, which is a form of X-ray, which is quick and safe, and if there is a tumour it will show. At that point, what will normally happen is that your doctor will refer you to a neurosurgeon”.
 
How are brain tumors diagnosed?
 
Technologies positioned to reduce neurosurgeons’ frustration with late diagnosis of brain cancers are quick, easy-to-use, and inexpensive blood tests that can diagnose cancer early. Such tests fall into four general categories: (i) complete blood count used to evaluate your overall health and detect a wide range of disorders, (ii) biomarkers, which are molecules found in your blood and other body fluids that can indicate specific cancers, (iii) blood protein testing that measures the amount of protein in your blood to diagnose cancer, and (iv) circulating tumour cell tests, which look for tumour cells that are shed from a tumour and are now circulating through your bloodstream.
 

Detecting brain cancers early
Two recent examples of simple diagnostic blood tests are reported in the August 2022 edition of Clinical Cancer Research and the October 2019 edition of Nature Communications. In the former paper, scientists at Massachusetts General Hospital (MGH) report findings of a study, which detected the presence of brain cancers early by identifying pieces of tumour cells’ genetic material - mRNA - that circulate in your blood. The test, which has a sensitivity of 72.8% and a specificity of 97.7%, can characterize brain tumours and monitor their status after treatment. According to Leonora Balaj, a co-senior author and assistant professor of Neurosurgery at Harvard Medical School, “There is a real need to make brain tumor diagnosis less invasive than the current technique of tissue biopsy. This research demonstrates that it is now feasible to diagnose a brain tumor via a blood test for one of the most common mutations detected in brain tumors”. Findings of the latter paper suggest that certain brain cancers may be detected early from a simple blood test using PPG technology, which has been used in hospital settings since the 1980s to monitor heart rate and relative blood volume. Today, the technology is used in a wide range of commercially available medical devices, as well as smartwatches (the Apple version is an FDA approved medical device) and fitness trackers, for measuring oxygen saturation, blood pressure and cardiac output, assessing autonomic function and detecting peripheral vascular disease. Previously we described how PPG technology is positioned to provide a non-invasive means to monitor ICP in TBI patients.

The 2019 Nature Communications paper describes how PPG easily, cheaply, and accurately identified asymptomatic people with suspected brain cancer. In the first instance, the technology was used on a retrospective cohort of 724 people, which included those with primary and secondary cancers as well as control participants without cancer. PPG was employed to identify biomarkers from patients’ blood samples, and a machine learning algorithm was trained to associate specific biomarkers with the presence of cancer. The algorithm was then used on a sample of 104 random participants, and brain cancer was detected in 12. The PPG test revealed a sensitivity of 83.3% and a specificity of 87%. According to Matthew Baker, from the University of Strathclyde, Scotland, the paper’s lead author, “This is the first publication of data from our clinical feasibility study, and it’s the first demonstration that our blood test works in the clinic.”
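As a reminder of how such accuracy figures are derived, the sketch below computes sensitivity and specificity from a confusion matrix. The counts are illustrative numbers chosen to be consistent with the published figures (83.3% and 87%); they are not the study’s raw data.

```python
# Sensitivity = true positives / all with disease;
# specificity = true negatives / all without disease.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Illustrative counts for 104 participants, 12 of whom had brain cancer.
tp, fn = 10, 2    # 10 of the 12 cancers flagged by the test
tn, fp = 80, 12   # 80 of the 92 cancer-free participants correctly negative

print(f"sensitivity: {sensitivity(tp, fn):.1%}")  # 83.3%
print(f"specificity: {specificity(tn, fp):.1%}")  # 87.0%
```

A high specificity matters here because, as Bhangoo notes above, brain tumours are rare among people who are scanned: even a few percent of false positives can swamp the small number of true cases.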
 

A global endeavour
These two studies are part of a well-resourced global endeavour to develop an affordable, simple, point-of-care blood test that detects cancer before any symptoms occur. Today, biomedical advances move at a much faster pace than medical technology did in the 1950s and 60s, when Lars Leksell was developing minimally invasive stereotactic radiosurgery procedures to accurately locate and remove brain tumours. For example, in the ~7 years since its foundation in 2015, GRAIL, a US biomedical start-up backed by Jeff Bezos and Bill Gates, has become a global leader with a ground-breaking multi-cancer early-detection blood test, Galleri®, which has the potential to detect >50 types of cancers before they are symptomatic. This is achieved by looking for abnormal DNA shed from cancer cells in the blood, called cell-free DNA (cfDNA). The Galleri® test uses genetic sequencing technology and artificial intelligence (AI) to scan for patterns of chemical changes in the cfDNA that come from cancer cells but are not found in healthy cells. If validated, the GRAIL test will provide a simple, cheap, non-invasive means to identify a range of cancers in asymptomatic people, when they are more likely to respond positively to therapy.
 

Large UK clinical study
In May 2019, the GRAIL Galleri® blood test was granted US FDA Breakthrough Device designation. The test is only available commercially in the US but is rapidly gaining prominence in other regions of the world. For example, in September 2021, NHS England launched a massive clinical study for Galleri® and set up ~150 mobile clinics in convenient locations across the country to recruit ~140,000 participants. In July 2022, participants were invited to attend two further appointments spaced ~12 months apart. Findings from the study are expected to confirm the accuracy of the test in asymptomatic participants and lead to its regulatory approval. Although Galleri® is the first of its kind to be trialled on such a scale in the UK, it is not the only player, and cfDNA is not the only technology.
 

Guardant Health
Another US biotech developing capabilities to detect a range of cancers early from a simple blood test is Guardant Health. Founded in 2011, the company is now a ~US$6bn Nasdaq-traded global enterprise with annual revenues of ~US$110m. In April 2022, Guardant presented new data at the American Association for Cancer Research Annual Meeting. Findings suggested that the company’s investigational next-generation Guardant SHIELD™ assay has the capacity to analyse ~20,000 epigenomic biomarkers that help to detect a broad range of solid tumours using a single blood test. Guardant’s co-CEO, Amir Ali Talasaz said: “These positive results show that the next-generation Guardant SHIELD multi-cancer assay provides sensitive detection of early-stage cancers with the ability to identify the tumor tissue of origin with high accuracy”.
 
Section 9
Takeaways

For millennia, neurosurgery, which has its roots in ancient civilizations, was dominated by forms of craniotomy, in which the skull is opened to access cerebral disorders. In the 20th century the speciality pivoted and introduced less- and non-invasive procedures to deal with a range of brain and CNS conditions. However, their introduction was slowed by the fact that the brain is such a well-protected organ, and they took nearly half a century to gain regulatory approval and enter the clinic. At the beginning of the 21st century, biomedical research is advancing at such a pace that it is positioned to significantly transform neurosurgery into a less- and non-invasive modality. Further, in the next two decades expect gene and cell therapies to substantially increase their influence as treatments for neurodisorders. Over the past three decades, novel neuro-pharmaceuticals have constantly been in clinical trials but failed to receive regulatory approval because they lacked an efficacious mechanism to deliver the therapeutics across the BBB. Today, a myriad of novel vehicles are under development, which are expected to effectively smuggle 21st century pharmaceuticals across the BBB. These are being advanced in parallel with the drugs, and together they are positioned to significantly disrupt traditional neurosurgical procedures over the next two decades.
  • The core business of medical technology companies (MedTechs) has been manufacturing and marketing physical devices
  • Physical devices will continue to be a substantial part of their business, but on their own, are unlikely to deliver high growth rates, which are more likely to come from artificial intelligence (AI) data driven strategies that improve patient outcomes
 
The impact of big data, artificial intelligence, and machine learning on the medical technology industry
 
James Carville, an American strategist, who played a leading role in Bill Clinton winning the 1992 presidential race, insisted that the campaign focus on the economy and coined the phrase “It’s the economy, stupid”. If Carville was asked today for a winning long-term growth strategy for medical technology companies, might he say, “It’s big data, stupid”?
 
This Commentary suggests that while physical products have been the backbone of MedTech companies in the past, they are unlikely on their own to contribute significantly to future growth, which is more likely to come from artificial intelligence (AI) driven big data innovations that create new solutions to improve patient journeys and outcomes.
 
In this Commentary
 
This Commentary describes the meaning of ‘big data’ in a healthcare context, explains ‘the data universe’ and stresses not only its immense volume, but also its variety and the phenomenal speed at which it is growing. Today, most industries leverage big data and AI techniques to create innovative offerings that drive growth and enhance competitive advantage. However, with few exceptions, traditional MedTechs have been relatively slow to collect and analyse the wide range of health, medical and lifestyle data that could provide innovative software offerings to improve patients’ therapeutic journeys and complement physical products. This is partly because the industry must adhere to strict regulations and partly because many medical technology companies lack the necessary capabilities and mindsets to collect and leverage big data. Most have business models that tweak legacy physical products and accept growth rates of ~5% as the ‘new normal’.

We provide a brief history of big data and AI business strategies, mainly to underline that these are relatively new. It was only in the early 2000s that electronic health records (EHR) began to replace paper-based patient records, which were stored in numerous filing cabinets in healthcare silos. It was not until ~2015 that EHRs became standard practice and researchers started to apply algorithms to EHRs and other data to detect patterns and make predictions that could improve diagnoses and treatments, enhance patient outcomes, and reduce healthcare costs.

The increased use of big data and AI techniques in healthcare raises important cybersecurity concerns and trust issues, because health professionals and patients do not understand how algorithms arrive at their conclusions and actions. Cybersecurity concerns are addressed by a range of encryption techniques and security protocols. Trust in algorithms has been helped by the development of ‘explainable AI’: software that describes the essence of algorithms in easily understood terms. However, more work is still needed in both areas.

We introduce the cloud and cloud services, together with an explanation of why these have experienced such rapid growth across all industries in recent years. The cloud makes it easier to store and access big data via the internet from anywhere in the world. Cloud services provide security for big data as well as a range of management and analytical tools that help to transform data into revenue-generating software offerings. For MedTech companies, the cloud and cloud services provide the basis for more efficacious R&D.

The medical technology industry has become bifurcated between companies that leverage AI-driven big data strategies to enhance growth rates and those that predominantly focus on legacy physical product offerings and settle for lower growth rates. Over the past decade the nature of the industry has changed, partly because of AI big data strategies supported by cloud computing and a large and rapidly growing range of open-source, easy-to-use AI tools. This has given small companies a competitive advantage. The Commentary concludes by describing a few of these small MedTechs with disruptive digital products that target large, rapidly growing, underserved market segments.
 
Big data and healthcare

Big data comprise a wide range of information, collected from multiple sources, whose volume surpasses traditional storage, processing, and analytical capacity and which is unmanageable using conventional software tools. In healthcare settings, big data include hospital records, patients’ medical records, results of medical examinations, and data generated by traditional medical devices as well as various biomedical and healthcare tools such as genomics, wearable biometric sensors, and smartphone apps. Biomedical research also generates data relevant to the medical technology industry.
 
The data universe

The massive amount of data generated across the entirety of the internet is referred to as the ‘data universe’. It is not only its volume that makes it special, but also the variety of the data and the phenomenal speed at which the universe is growing. The International Data Corporation (IDC) estimated that the data universe grew from ~130 exabytes in 2005 to >40,000 exabytes in 2020. To put this in perspective: 1 gigabyte is 1bn bytes (10^9, or 2^30, bytes), and 1 exabyte is equal to 1bn gigabytes (10^18 bytes).
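A quick back-of-envelope check of the IDC figures quoted above: growth from ~130 exabytes in 2005 to ~40,000 exabytes in 2020 implies roughly 46% compound annual growth. The inputs below simply restate the figures in the text.

```python
EXABYTE = 10 ** 18  # 1 exabyte = 1bn gigabytes = 10^18 bytes

start_eb, end_eb = 130, 40_000          # IDC estimates for 2005 and 2020
years = 2020 - 2005
cagr = (end_eb / start_eb) ** (1 / years) - 1

print(f"2020 data universe: {end_eb * EXABYTE:.2e} bytes")  # 4.00e+22 bytes
print(f"implied annual growth: {cagr:.1%}")                 # ~46.5%
```

In other words, the data universe grew more than 300-fold in fifteen years, roughly one-and-a-half times over every two years.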
Data-generated healthcare innovations

In the past, collecting and interpreting vast quantities of data was not feasible, partly because computer systems were relatively small and did not generate much data, and partly because technologies to manage big data were underdeveloped. Fast forward to the present, and businesses across most industries now generate enormous amounts of data. Organizations apply AI and machine learning (ML) techniques to these data to create innovative product offerings to access new revenue streams with significant growth potential. Such technologies, combined with health-related big data, can positively impact the medical technology industry by generating novel diagnostics and treatments for patients, streamlining the process of medical record keeping and developing more personalized and responsive care plans that improve patient journeys and outcomes.


Despite the potential commercial advantages of AI data driven diagnostic and therapeutic solutions, many traditional MedTechs have been slow to collect health and lifestyle data from multiple sources to develop software offerings, which complement their legacy physical products. One notable exception is Philips Healthcare. In the early 2000s, the company was challenged by new entrants to the market who were successfully leveraging information from health wearables and other sources to create and market AI data driven offerings. At the 2016 annual conference of the American Healthcare Information and Management Systems Society (HIMSS) in Chicago, Jeroen Tas, a Philips executive, said, “We are in the midst of one of the most challenging times in healthcare history, facing growing and aging populations, the rise of chronic diseases, global resource constraints, and the transition to value-based care. These challenges demand connected health IT solutions that integrate, collect, combine, and deliver quality data for actionable insights to help improve patient outcomes, reduce costs, and improve access to quality care”.
 
Philips had the mindset and resources to respond positively to this rapidly changing ecosystem. In 2017 the company appointed Tas as its Chief Innovation & Strategy Officer, tasked with launching a suite of big data AI driven solutions, the IntelliVue® patient monitors, which support the growing demands of health professionals to provide quality care and improved outcomes for an expanding population of older, sicker patients with fewer resources. These monitoring solutions seamlessly connect big data, AI technology and patients to support health professionals to manage patients as they transition through their care journeys. In 2016, Philips and Masimo, a medical technology company specializing in non-invasive AI data driven patient monitoring devices, entered a multi-year business partnership involving both companies’ innovations in patient monitoring. Philips agreed to integrate Masimo's measurement technologies into its IntelliVue® monitors, to help clinicians assess patients’ cerebral oximetry and ventilation status. The outcome of the collaboration was the launch of a new suite of patient solutions, called Connected Care, which give healthcare providers the ability to monitor patients more effectively and reduce costs.
 
The bifurcation of the MedTech market

In addition to large MedTechs such as Philips and Masimo, there are hundreds of small companies developing AI-driven big data offerings aimed at improving patient outcomes. Many traditional companies have been slow to fully leverage big data and AI applications, partly because medical devices are required to comply with stringent regulatory guidelines and partly because of a lack of capabilities. These different responses have bifurcated the industry. On the one hand there are traditional MedTechs, which predominantly focus on existing customers and market legacy physical offerings in slow-growing markets. On the other hand, there are many small companies, and a few very large medical technology corporations, which have embraced AI-driven big data patient-centric solutions.
 
A brief history

Big data has its genesis in the 1950s and 1960s when scientists and mathematicians began exploring the possibility of using computers to process large amounts of data to make intelligent decisions. This led to the development of technologies such as the first neural networks, which laid the foundation for modern Deep Learning. In the 1980s, researchers at IBM popularized the concept of big data to describe the process of collecting and analyzing large amounts of data, which empowered organizations to gain insights from information that previously was too complex to process. The 1990s saw the development of AI and ML, which enabled computers to learn from data and make decisions without the need for explicit programming. By the early 2000s, AI-based algorithms empowered machines to learn from data and make predictions. Many organizations, across a range of industries, saw the commercial opportunities of this and acquired capabilities to collect, store and analyse large amounts of information to identify patterns and trends that were previously impossible to detect.  Without large amounts of data, AI and ML techniques are less effective, which is significant for healthcare and the medical technology industry.
 
Big data in healthcare

AI driven big data strategies are becoming increasingly important in healthcare. This is because AI techniques applied to masses of health-related information can improve patient care, enable more effective decision-making, reduce costs, identify new treatments, explore new markets, and create more efficient healthcare systems. Further, such applications can provide more accurate and timely diagnoses, as well as insights into how various treatments affect different people. As increasing amounts of health information become available, and data handling techniques improve, so traditional MedTech companies will have opportunities to boost their growth by complementing their physical devices and volume-based care with digital assets and personalised care.
 
Paper-based mindset

Until recently, health professionals were responsible for most of the different types of data associated with a patient’s treatment journey, including medical histories, known allergies, medical and clinical narratives, images, laboratory examinations, and other private and personal information. Until the early 2000s these data were recorded on paper and stored in filing cabinets across numerous healthcare departments. It was not until 2003 that the US Institute of Medicine used the term ‘electronic health records’ (EHR). By 2008, only ~10% of US hospitals were using EHRs, a figure that increased to ~80% by 2015. As EHRs became standard practice across multiple providers and data interoperability issues were resolved, the provision of healthcare improved, and medical errors and healthcare costs were reduced. Currently, the American National Institutes of Health (NIH) is inviting 1m people from diverse backgrounds across the US to help build a comprehensive big data set, which can be used to learn more about how biology, environment and lifestyles affect health, in the expectation of discovering new ways to treat and prevent disease.
 
Trust and medical algorithms
 
As AI-driven big data applications have increased, trust in algorithms has emerged as an issue. This has been a major concern in healthcare. To address the challenge, explainable AI has been developed: an AI technology that explains the decisions and actions made by algorithms in a way that is easily understood by health professionals and patients. Explainable AI has helped to create trust in algorithms by providing a level of transparency, understanding and accountability. Further, incorporating feedback from medical professionals, patients, and other stakeholders into the development of medical algorithms has also helped to build trust. However, this entails collecting a wider variety of data than many healthcare companies are used to.
Big data healthcare strategies and security
Big data healthcare strategies and security
 
With the increasing number of big data and AI healthcare solutions, cybersecurity has become a concern. Reducing this risk involves using technologies such as data encryption, secure cloud computing (see below), and authorization protocols to protect data stored in large databases. Additionally, organizations may use AI-driven applications to monitor their systems to find anomalies, detect malicious activity and flag unauthorized access to sensitive, personal information. To ensure the security of healthcare data, organizations also employ measures such as risk assessments, incident response plans, and regular security training of their staff.
Cloud storage and services

Since the mid-2000s, big data have benefitted from cloud storage, which makes it easier to store and access data over the internet and helps businesses to become more efficient and productive. It also offers organizations scalability, more control over their data and reduced costs. Organizations can: (i) easily increase their storage capacity as their data needs grow, (ii) access their data from anywhere in the world, and (iii) stop investing in expensive local storage devices. Further, cloud storage is becoming more secure, with encryption and other security measures making it safer to store data.
 
Moving data from local storage devices to the cloud is more than a simple transfer process and can be a complex, multi-year journey. Any organization that has accumulated several legacy databases and infrastructures will have to develop and manage a hybrid architecture to transfer the data. However, once in place and shared among stakeholders, cloud-based platforms can assist in unlocking clinical and operational insights at scale while speeding up innovation cycles for continuous value delivery. In combination with a secure and interoperable network of connections to hospital systems, cloud-based solutions represent an opportunity for healthcare leaders to unlock the value of data generated along the entire patient journey, from the hospital to the home. By turning data into insights at scale, it is possible to empower healthcare professionals by helping them to deliver personalized care, improved patient outcomes and lower costs.
 
The cloud also offers an increasing number of computing services. These are provided by companies such as Amazon Web Services, Google Cloud Platform, Microsoft Azure, IBM Cloud, Oracle Cloud, and Rackspace Cloud. The services include: (i) Infrastructure-as-a-Service (IaaS), which provides users with access to networks, storage, and computing resources, (ii) Platform-as-a-Service (PaaS), which helps users to develop, run, and control applications without the need to manage infrastructure, (iii) Software-as-a-Service (SaaS), which provides access to a variety of applications, (iv) Database-as-a-Service (DBaaS), which gives users access to several types of databases, and (v) serverless computing, which enables users to run code without needing to provision or manage servers. Such services are expected to continue growing and help to transform healthcare. The provision of cloud computing services in healthcare makes medical record-sharing easier and safer, automates backend operations and facilitates the creation and maintenance of telehealth apps. The increasing use of data and cloud services by MedTech companies helps to break down data silos and develop evidence-based personalized solutions for a connected patient journey. In 2020, the healthcare cloud computing market was valued at ~US$24bn, and it is expected to reach ~US$52bn by 2026, registering a CAGR of >14% during the forecast period. Major drivers of cloud services include the increasing significance of AI driven big data applications.
 
Changing the nature of R&D

Further, the cloud can change and speed up R&D. The starting point for MedTech R&D should be evolving patient needs and affordability. Healthcare-compliant cloud platforms offer a flexible foundation for the rapid development and testing of AI driven big data solutions created by cross-functional teams working across the entire life cycle of an application: from development and testing to deployment. This changes medical technology companies’ traditional approach to R&D by transforming it into short cycles undertaken by multiple stakeholders. This modus operandi is replacing traditional lengthy and expensive R&D, often carried out in an organisational silo and constrained by annual budgeting cycles, which can mean that a significant length of time passes before an innovation gets into the hands of health professionals and patients for testing. Digital health solutions, on the other hand, can be tested by physicians and patients early in their development and improved features quickly added.
 
Free and easy to use AI and ML software libraries

In the early 2000s, when AI and ML were in their infancy, companies needed data engineers with advanced mathematical capabilities to build complex AI systems. Today, this is unnecessary because of the development of simplified AI and ML libraries such as PyTorch and TensorFlow. These are free, easy to use, open-source, scalable AI and ML packages, which reduce the need for data engineers to have advanced mathematical skills to build effective software health solutions. PyTorch, released in 2016, was developed by Facebook (later Meta AI) and is now part of the Linux Foundation. The technology is known for its ease of use and flexibility, making it favoured by developers who want to rapidly prototype and experiment with new ideas. Its tools support graphics processing, which is popular with deep learning medical imaging strategies that involve training large, complex models on big data. TensorFlow was developed by the Google Brain team for internal use and released as open source in 2015. It is a highly scalable library for numerical computations and allows its users to build, train and deploy large-scale ML models. Both platforms have become significant open-source tools for AI and ML due to their ability to support the development and training of complex models on large datasets. They have been widely adopted by researchers and developers throughout the world and are regularly used in a variety of applications relevant to the medical technology industry. Significantly, they give smaller MedTechs a competitive advantage.
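To illustrate how much these libraries simplify, the sketch below trains a small model in PyTorch in a few lines, with gradients computed automatically by the library rather than derived by hand. The data and model are illustrative assumptions, not a real health application.

```python
# Minimal PyTorch sketch: define, train and evaluate a model in a few lines.
# The synthetic data (y = 3x + 0.5 plus noise) is purely illustrative.
import torch
from torch import nn

torch.manual_seed(0)

X = torch.randn(100, 1)
y = 3.0 * X + 0.5 + 0.1 * torch.randn(100, 1)

model = nn.Linear(1, 1)                                  # one-layer linear model
loss_fn = nn.MSELoss()                                   # mean-squared-error loss
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # plain gradient descent

initial_loss = loss_fn(model(X), y).item()

for _ in range(200):            # short training loop
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()             # autograd computes all gradients automatically
    optimizer.step()

final_loss = loss_fn(model(X), y).item()
```

Before libraries of this kind existed, the gradient computation and parameter updates in this loop would have had to be derived and coded by hand, which is the barrier the paragraph above describes.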
 
Disruptive effects of AI driven big data strategies

The development and availability of big data and predictive AI help small medical technology companies enter markets, grow, and strengthen their competitive positions, which has the potential to change market dynamics. Over the past decade, several large medical technology companies have seen their markets dented by small companies that have successfully used open-source AI applications to leverage big data. For example, Philips Healthcare’s market was affected by the emergence of innovative offerings developed by new entrants using cloud computing services and big data from medical wearables. Above, we described how Philips robustly responded to this and became a market leader in AI data-driven patient monitoring technology. Siemens Healthineers’ market share suffered from small MedTechs with innovative AI driven offerings. Further, the rise of digital imaging technology caused GE Healthcare’s market share to shrink. These vast companies have since developed AI driven big data strategies and bounced back. However, traditional MedTechs that fail to leverage big data and AI capabilities risk being left behind in an increasingly competitive digitalized industry.
 
Small MedTechs using big data and AI

Examples of small MedTechs that leverage big data, AI, and ML techniques to capture share of large, underserved, fast-growing market segments include Brainomix, which was spun out of Oxford University, UK, in 2010 and serves the stroke market; Intradys, a French start-up specialising in interventional neuroradiology; Elucid, a Boston, US-based MedTech founded in 2013, which has developed innovative technology that supports the clinical adoption of coronary computed tomography angiography; and Orpyx Medical Technologies, a Canadian company that provides sensory insoles for people living with diabetes. These are just a few examples of small, agile companies that collectively have helped to bifurcate and disrupt segments of the medical technology industry by developing offerings predicated upon big data, AI and ML that deliver faster, more accurate diagnoses to ensure that patients get the treatment they need, when they need it.

Brainomix’s lead product offering is a CE-marked e-Stroke platform, which has been developed using data from images sourced across 27 countries, including the UK, Germany, Spain, Italy, and the US, and provides fast, effective and accurate analysis of brain scans that expedites treatment decisions for stroke patients. The platform has been adopted across multiple healthcare systems throughout the world, and for the past two years, England’s National Health Service (NHS) has been using the technology on suspected stroke patients. Early-stage analysis of the technology, predicated on >110,000 patients, suggests that e-Stroke can reduce the time between presenting with a stroke and treatment by ~1 hour and is associated with a tripling in the number of stroke patients recovering with no or only slight disability - defined as achieving functional independence - from 16% to 49%. With this disease, time is of the essence because after a stroke, each minute that passes without treatment leads to the death of ~2m neurons (nerve cells in the brain), which causes permanent damage. It can be challenging for health professionals to determine whether stroke patients need an operation or drugs, because the interpretation of brain scans is complicated and specialist doctors are required. Sajid Alam, stroke consultant at Ipswich Hospital, a large regional hospital in the UK, reflected: “As a district general hospital, we don’t have ready access to dedicated neuroradiologists to interpret every stroke scan. Having Brainomix’s AI software gives us more confidence when interpreting each scan.”

Intradys is a French start-up, which develops algorithms that combine ML and mixed reality to empower interventional neuroradiologists and help them enhance the care of stroke patients. Orpyx Medical Technologies provides sensory insoles for people living with diabetes who have developed peripheral neuropathy to help prevent foot ulcers. The insoles collect data on pressure, temperature, and steps and give feedback to the wearer and healthcare professionals. Elucid is a Boston-based MedTech founded in 2013. The company’s offerings are predicated on big data, AI, and ML to provide fast and precise treatments that improve the outcomes of patients with cardiovascular disease and reduce healthcare costs. Heart attack and stroke are primarily caused by unstable, non-obstructive plaque (the buildup of fats, cholesterol, and other substances in and on the artery walls) that often goes undiagnosed and untreated. Current non-invasive testing cannot visualize the biology deep inside artery walls where heart disease develops. Elucid’s lead offering is FDA-cleared and CE-marked non-invasive software to quantify atherosclerotic plaque.
 
Takeaways
 
The potential benefits for medical technology companies that leverage AI driven big data strategies include: (i) improved diagnoses and treatments, (ii) enhanced patient journeys and outcomes, (iii) cost savings, (iv) a better understanding of stakeholders’ needs, (v) superior decision-making, (vi) more effective products and services, and (vii) increased competitive advantage. Big data strategies may also be used to uncover insights from large datasets to develop predictive models that can automate repetitive tasks, optimize care processes, free up resources for healthcare professionals to focus on providing care, and stay ahead of the competition by providing greater insights into customer trends and needs. Medical technology companies that do not leverage AI driven big data strategies to develop innovative products for growth and competitive advantage potentially risk: (i) falling behind the competition in terms of product innovation, (ii) missing out on key market opportunities, as data-driven insights can help identify new trends and customer needs, (iii) struggling to keep up with the rapid pace of technological change, as staying ahead of the competition requires a deep understanding of the latest developments in data-driven product development and (iv) losing the trust of customers, as they may be wary of MedTechs that do not use advanced technologies to develop their product offerings. Future significant growth for medical technology companies is more likely than not to come from AI driven big data strategies. Start collecting data.

  • Wearables are no longer lifestyle accessories. They are becoming core infrastructure for modern healthcare
  • Traditional MedTech was too slow to see that continuous data, not just devices, would create the next strategic battleground
  • The boundary between consumer health and clinical utility is dissolving fast, with major consequences for incumbents
  • Future advantage will come from platforms that support entire therapeutic journeys, not products built for isolated interventions
  • This Commentary explores why MedTech drifted, why wearables matter now, and what traditional players must do

The Wearable Reckoning: MedTech Slept Through a Revolution

Wearables were dismissed as gadgets. That was the strategic mistake. For too long, much of traditional MedTech treated wearables as if they were toys for the anxious well. Interesting, perhaps. Fashionable, certainly. But not serious. Not clinical. Not “real” medicine. That judgement is now colliding with reality.

What many incumbents failed to understand is that wearables were never just about counting steps, logging sleep or nudging consumers to stand up more often. They were the first mass-market infrastructure for continuous physiological observation. While traditional MedTech remained focused on devices designed for single interventions, single departments and single moments of contact, the wearable market evolved into something much more consequential: a persistent, data-generating interface between the human body and the healthcare system.

Wearables are no longer a side market orbiting the edge of medicine. They are becoming one of the foundational layers through which modern healthcare will monitor, interpret and manage patients over time. The global wearable medical devices market is growing rapidly, with multiple analysts placing it on a trajectory toward >US$100bn by the end of the decade, driven by ageing populations, chronic disease burdens, remote monitoring, and the wider digitisation of care. Estimates vary, but the direction is unmistakable: wearables are becoming one of the organising layers of healthcare delivery.


And yet, with a few exceptions such as Masimo, known for developing patient monitoring devices and software platforms used in hospital and home settings, traditional MedTechs were slow to act. Many incumbents continued to manufacture and market devices for narrow interventions, while underestimating the strategic significance of longitudinal data, patient-facing platforms, and continuous monitoring. They did not collapse. But they drifted and lost value. The significance of that drift is underlined by Danaher’s February 2026 agreement to acquire Masimo for $9.9 billion: one of the few established MedTech companies to invest meaningfully in platform infrastructure and continuous data has proved valuable not despite that strategic shift, but in part because of it.

The lesson is uncomfortable. The wearable market did not grow because incumbents were wiped out. It grew because incumbents largely kept behaving as if the old categories still held. They assumed the market for wearables was mainly personal, not medical. They assumed consumer technology was adjacent to healthcare rather than increasingly entangled with it. They assumed that because wearables did not match invasive gold standards, they were clinically peripheral. All three assumptions now look increasingly untenable.

The line between personal and medical utility is dissolving. That should alarm traditional MedTech, but it should also clarify what comes next.

 
In this Commentary

This Commentary argues that MedTech underestimated the significance of the wearable revolution, allowing consumer technology companies to reshape how health data are generated, interpreted, and used. It examines why incumbents were slow to respond, what this shift means for clinical practice and industry strategy, and why the consequences now extend far beyond the wrist.
 
The Category Error at the Heart of MedTech’s Delay

Traditional MedTech did not just underestimate a device trend. It misunderstood what the wearable market was producing. A significant share of the sector’s leadership was formed in an era where value resided primarily in the physical device: its engineering, reliability, regulatory approval, installed base and integration into specialist workflows. In that worldview, the medical device was the centre of gravity. Software was an accessory. Data was an output. The clinical encounter was the moment that mattered.

Wearables challenged all of that.

Their significance was never only that they could sit on the wrist, chest or finger. Their significance was that they could sit in time. Traditional devices often generate clinically important snapshots. Wearables generate streams. They capture physiology continuously, or at least repeatedly enough to reveal patterns that snapshots cannot. That difference is not cosmetic. It changes the nature of what can be known, when it can be known, and what can be done with that knowledge. Continuous and longitudinal monitoring enables earlier detection of deterioration, richer context for symptoms, better understanding of recovery, and a more realistic account of how physiology behaves in everyday life rather than only in controlled settings.

This is the strategic point many incumbents missed. The opportunity was not simply to sell a new class of sensor. It was to build a layer of persistent clinical visibility. Once viewed that way, the mistake becomes obvious. Traditional MedTech remained largely organised around interventions. Wearables were building toward journeys.


From Episodic Medicine to Continuous Medicine

The classic MedTech model is built around episodic contact. A patient appears at a site of care. A device is used. A measurement is taken. An intervention happens. Data are captured within a bounded event. Reimbursement, workflow and commercial logic all follow that structure.

But many of the most important health problems do not behave episodically. Heart failure worsens between visits. Arrhythmias appear intermittently. Respiratory decline may start subtly. Recovery after surgery unfolds unevenly. Cancer treatment produces changes in fatigue, activity, temperature and physiology that do not neatly coincide with appointments. And as populations in advanced, wealthy economies age, the disease burden itself is changing; chronic lifetime conditions and multi-morbidity are becoming more prevalent, while healthcare systems were largely built for a different disease profile and patient cohort. Chronic disease is lived continuously, even if healthcare has historically observed it intermittently.

Wearables matter because they are one of the first scalable infrastructures capable of narrowing that observational gap. They provide the possibility of following patients across time, across setting and, increasingly, across the full therapeutic journey. In practical terms, that means moving from isolated readings to contextualised trends; from reactive discovery to earlier warning; from hospital-only visibility to distributed monitoring.

That is why today’s wearables are increasingly expected to do far more than track heart rate. The market has moved toward continuous ECG, respiratory metrics, heart rate variability, temperature, oxygen saturation, sleep, posture and activity context, while continuous glucose monitoring has become especially important for many people living with diabetes. In some cases, devices also offer inferred measures such as blood pressure, stress, hydration status, or recovery. The important shift is not simply the growing number of metrics. It is the emergence of wearables as multi-physiologic platforms: sensing systems rather than single-purpose trackers.

For MedTech incumbents, this should be a strategic shock. The company that owns the most valuable part of the patient journey may no longer be the company with the strongest device at a single intervention point. It may be the company that can monitor, interpret and support the patient most effectively across time.

 
The Consumer-Health Boundary is Breaking Down

Perhaps the most damaging assumption inside traditional MedTech was the idea that the wearable market belonged to lifestyle rather than medicine. That distinction once appeared neat. Consumer devices were for fitness, wellness and self-optimisation. Medical devices were for diagnosis, treatment and clinical care. But that boundary has been eroding for years, and now it is dissolving fast.

Apple is the obvious case, even if earlier consumer wearables such as Fitbit helped familiarise users with the idea of continuous personal health tracking. The Apple Watch did not begin by trying to resemble a traditional medical device. It entered through habit, design, convenience and ecosystem integration. Yet over time it gained FDA-cleared ECG capabilities and established itself as a serious participant in arrhythmia screening and atrial fibrillation awareness. Its importance is not that it replaced cardiology. It is that it normalised the idea that clinically relevant health monitoring could exist in an everyday consumer device worn by millions.

That changes expectations everywhere else.

Patients begin to wonder why their smartwatch can surface trends their formal care pathway ignores. Clinicians begin to ask which parts of consumer-generated data may be useful for triage, follow-up or escalation. Health systems begin to explore whether lower-cost continuous monitoring can reduce unnecessary admissions or detect deterioration earlier. Payers begin to look for evidence that remote monitoring can lower downstream costs.
 
The key point is not that every consumer wearable is clinically robust. Many are not. The point is that the market has changed the cultural expectation of what monitoring can be. Once the public becomes accustomed to passive, continuous, always-on physiological insight, the old model of healthcare waiting for the patient to arrive before observing them starts to look increasingly archaic.

Traditional MedTech underestimated this because it focused too heavily on what wearables were not. They were not invasive. They were not always gold-standard. They were not confined to clinical settings. They were not sold through the familiar institutional channels. But that scepticism obscured what they were becoming: the everyday interface through which health data enters routine life.
 
Accuracy is Not the Whole Argument. Clinical Relevance Is

One reason incumbents were able to dismiss wearables for so long is that many wearable measurements did not match the precision of invasive or hospital-grade reference systems. This criticism was never entirely wrong. Signal quality matters. Motion artefact matters. Validation matters. Gold standards exist for good reasons.

But the criticism was strategically incomplete.

Wearables do not need to replace invasive devices to be transformative. They need to produce signals that are clinically relevant enough to change decisions, allocate attention more intelligently or flag deterioration early enough to matter. For many use cases, the comparator is not the best possible measurement under ideal conditions. It is the absence of continuous information.

That distinction matters. A wearable ECG does not have to replace a full cardiology work-up to be valuable. A respiratory trend monitor does not have to replace spirometry to signal that a patient is worsening. A multi-parameter patch does not need to achieve the perfection of ICU monitoring to reduce blind spots in recovery or step-down care. In many settings, an early directional signal with appropriate workflow integration can be more valuable than a pristine reading that arrives too late.

This is where the phrase “actionable trends” becomes more important than “raw accuracy”. The frontier for health wearables is not whether they produce elegant streams of data for their own sake. It is whether they can meaningfully signal risk before a crisis, inform escalation, support monitoring and improve allocation of clinical attention.
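The difference between raw accuracy and actionable trends can be made concrete with a small sketch. The function below is a hypothetical illustration, not a clinical algorithm: the baseline, threshold and window sizes are illustrative assumptions. It flags a sustained rise in a heart-rate stream relative to a personal baseline, rather than judging any single reading.

```python
# Hypothetical early-warning sketch: alert on a sustained rise in a
# wearable's heart-rate stream, not on any individual (possibly noisy)
# reading. All parameter values are illustrative, not clinical.
def sustained_elevation(readings, window=5, baseline=70.0,
                        delta=15.0, consecutive=3):
    """Return the index of the rolling window that completes a run of
    `consecutive` windows whose mean exceeds baseline + delta, or None
    if no sustained elevation is found."""
    means = [sum(readings[i:i + window]) / window
             for i in range(len(readings) - window + 1)]
    run = 0
    for i, mean in enumerate(means):
        run = run + 1 if mean > baseline + delta else 0
        if run >= consecutive:
            return i
    return None

# A flat trace raises no alert; a sustained jump does.
quiet = [70.0] * 30
rising = [70.0] * 20 + [95.0] * 10
quiet_alert = sustained_elevation(quiet)     # None: no sustained rise
rising_alert = sustained_elevation(rising)   # an index: alert raised
```

The point of the sketch is that averaging over windows and requiring persistence makes the signal robust to individual noisy readings, which is why a directional trend can be clinically useful even when no single measurement is diagnostic-grade.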

Traditional MedTech should understand this better than most. Yet too often it has remained trapped in an all-or-nothing mindset: either a device is diagnostic-grade, or it is strategically secondary. That is the wrong frame for a healthcare environment increasingly defined by prevention, surveillance, stratification and remote care.

MedTech Built Products. Wearables are Building Platforms.

This is the deeper challenge. Traditional MedTech companies are typically organised around products, categories and sales channels: a cardiac product line sits here, a respiratory line sits there, a monitoring business sells into one part of the hospital, and a surgical business into another. Success is measured through familiar commercial metrics such as unit sales, account penetration, consumables and service contracts - indicators that feed neatly into quarterly reporting, revenue visibility and earnings calls, and which, over time, have come to shape much of the executive mindset in the sector.

Wearables destabilise that logic because their value does not end at the sensor. It begins there.

The strategic asset is the platform that sits above the sensor: the data architecture, the analytics layer, the workflow integration, the alerting logic, the patient interface, the clinician dashboard, the interpretation models, the interoperability with broader health IT systems. In other words, the device is still important, but it is no longer sufficient.

This is where traditional MedTech’s legacy strengths can become constraints. Their commercial models are often transactional. Their organisational structures are often departmental. Their software capabilities may be fragmented. Their digital investments may still be treated as support functions rather than core strategy. They know how to sell a device. They are less practiced at managing an ongoing data relationship with patients across months or years.

The wearable era rewards different kinds of strength. It rewards firms that can accumulate longitudinal datasets, translate physiological streams into useful risk signals, integrate monitoring into care workflows, and maintain engagement outside the clinic. It rewards interoperability rather than siloed device logic. It rewards persistence rather than event-based contact.

The winners will look less like catalogue companies and more like platform companies. That does not mean every MedTech firm must become Apple. However, it does mean they must stop pretending that hardware alone will remain the centre of defensibility.

 
Why Consumer Technology Learned Faster

There is also a cultural lesson in all this. Consumer technology companies often moved faster not because they understood medicine better, but because they understood adoption better.

Healthcare has long excelled at seriousness, engineering and clinical validation. Consumer technology excels at usability, behaviour and habit formation. In a world of continuous monitoring, that difference matters. The best wearable in the world is useless if patients do not wear it, charge it, trust it or understand it. Longitudinal value depends not only on signal quality, but on sustained human use.

This is where many incumbents were weakest. They judged performance mainly in technical terms, not behavioural ones. Yet what matters in the real world is not simply whether a device performs well in principle, but whether patients will use it consistently. A device that is slightly less sophisticated but fits easily into everyday life can therefore be more valuable than a technically superior one that patients stop using.

This is another reason the “lifestyle” dismissal is strategically foolish. Consumer markets solved adherence, comfort, interface and routine interaction earlier than MedTech did. And those capabilities are not superficial. They are central to the success of remote and continuous monitoring.

The phrase “digital immigrants” may sound harsh, but it captures something real about leadership mindset. Many executives trained in a pre-platform era interpret digital as a wrapper around the product: an app, a dashboard, a software add-on. But in platform markets, digital is not the wrapper. It is the business logic. Wearables exposed that difference.

 
The Therapeutic Journey is now the Real Battleground

The most important strategic lesson for traditional MedTech is that healthcare value is shifting from isolated interventions toward the orchestration of whole patient journeys.

A heart failure patient does not care that one company owns a monitor, another owns a diagnostic device and a third owns a post-discharge patch. They experience a single journey: symptoms, observation, deterioration risk, hospital contact, discharge, recovery, relapse prevention. Likewise, an oncology patient, a respiratory patient or a post-operative patient does not live inside neat device categories. They live inside uncertain therapeutic trajectories.
 
The company that matters most in that environment is not necessarily the one with the single most impressive piece of hardware. It is the one that can generate meaningful visibility across the journey and turn that visibility into support, interpretation and action.

This is why journey-centric design should replace intervention-centric design as MedTech’s organising principle.

For each therapeutic area, incumbents should be asking harder questions. Where are the blind spots between visits? Which signals change before symptoms become severe? Which data can be collected passively rather than requiring effort from the patient? Which alerts are clinically actionable rather than just noisy? How should information move between patient, clinician, caregiver and system? Which parts of the pathway demand medical-grade certainty, and which are well served by reliable early-warning systems?

These are not just product questions. They are strategic questions about where value is created.

 
Traditional MedTech Still Has Advantages. But only if it Changes the Frame

This is not a story in which incumbent MedTech is doomed and consumer technology wins by default. Traditional MedTech still possesses formidable assets: regulatory experience, clinical credibility, provider relationships, knowledge of pathways and the ability to operate in high-stakes settings where trust matters.

But those strengths only matter if they are reassembled around the realities of continuous, connected care.

A useful example comes from enterprise and hospital monitoring, where firms such as Philips are beginning to frame wearables not as standalone gadgets but as interoperable elements within wider patient-monitoring architectures. Philips, for instance, describes an “end-to-end” monitoring solution built around live device data and, in its smartQare partnership, explicitly positions wearable sensing as part of continuous monitoring “in and out of the hospital,” linking observation across bedside, ward and home settings. That is much closer to the right strategic frame. The product is no longer the device in isolation, but the monitored patient journey it helps make visible.

This is where incumbents can still win. They can build clinically robust wearables for high-value pathways such as cardiac monitoring, respiratory deterioration, post-operative recovery, oncology support or chronic disease management. They can become workflow integrators, using third-party sensors where necessary but owning the orchestration layer. They can focus on analytics, translating streams of noisy physiology into useful risk models and escalation pathways. They can build trusted bridges between consumer-generated data and formal clinical systems.

But they will not win by bolting generic software onto legacy hardware and calling it transformation.

 
The Risks Are Real. Denial Is Worse

None of this means the wearable future is frictionless. Signal quality remains uneven. Many devices are over-marketed and under-validated. False positives can create anxiety. False negatives can create false reassurance. Remote monitoring can swamp clinicians with noise if not carefully designed. Interoperability remains poor. Reimbursement is still inconsistent across markets. Data privacy and governance are concerns. Health systems are not yet built to metabolise continuous data gracefully.

But these are not arguments for treating wearables as marginal. They are arguments for building better systems around them. Healthcare has always advanced through the combination of new capability and institutional adaptation. The strategic failure would be to wait for the market to become perfect before taking it seriously.

In fact, incumbents should recognise that these frictions are where their capabilities ought to matter most. Clinical governance, validation, regulatory navigation and pathway design are not side issues. They are how wearables move from promising consumer technologies to trusted components of care.

The mistake is not caution but mistaking caution for strategy.

 
The Real Danger Is Strategic Drift, Not Collapse

The most important warning for traditional MedTech is that disruption in healthcare rarely looks dramatic at first. Incumbents often do not fail overnight. They continue generating revenue, servicing installed bases and selling into established channels. The balance sheet looks stable. The products still work. The clinician relationships remain intact. Nothing appears to be collapsing.

But underneath, value migrates.

It migrates into data assets, patient interfaces, workflow platforms, predictive models and continuous relationships. It migrates toward firms that understand how to live with the patient beyond the clinical encounter. It migrates toward systems that make deterioration visible earlier, care more distributed and intervention more targeted.

That is the kind of strategic drift the wearable market has exposed. Traditional MedTech did not implode. It simply underestimated where the future centre of gravity was moving. That is often how industries lose their strategic position: not through spectacular failure, but through outdated categories.

 
Takeaways

The wearable market is not just another adjacent segment for MedTech to notice late and enter cautiously. It is a warning about the future structure of healthcare technology. The next generation of winners will not think of themselves only as device manufacturers. They will think of themselves as managers of physiological intelligence across the therapeutic journey. They will combine sensing, software, analytics, patient engagement, workflow integration and services. They will understand the difference between diagnostic perfection and decision-grade usefulness. They will know when clinical-grade precision is necessary and when timely directional insight is what changes outcomes. Most importantly, they will stop treating data as a by-product of the device and start treating it as the basis of the business.

That is the sharper strategic lesson for traditional MedTech. The future will not be won solely in the procedure room, the procurement contract or the single device category. It will also be won on the wrist, on the chest, in the home, across the patient pathway and within the data streams that reveal risk before crisis.

Wearables began at the margins of medicine because incumbents were too comfortable calling them lifestyle devices. They are moving toward the centre because healthcare increasingly needs what they provide: continuity, context, earlier warning and a more patient-centred model of observation.

Traditional MedTech can still respond. But it must do so by abandoning one of its most persistent illusions: that the serious business of medicine begins only when the patient reaches the clinic. Increasingly, it begins long before that. And the companies that understand this will not just build better devices. They will redefine what a medical technology company is.
  • Tissue technology has entered a new era - evolving from simple scaffolds to advanced platforms that integrate biologics, sensors, and AI 
  • MedTech leadership is shifting - from product-centric models to outcome-driven ecosystems
  • Convergence is the catalyst - biology, data, and digital infrastructure are redefining care delivery
  • Legacy firms must evolve - or risk being outpaced by agile, cross-disciplinary competitors
  • The future is platform-based - healing will be personalised, predictive, and performance-validated

Tissue Tech’s Breakneck Disruption

Over the past four decades, tissue technology has evolved from experimental promise to clinical cornerstone - transforming the treatment landscape for burns, chronic wounds, and reconstructive surgery. What began as rudimentary scaffolds and passive biomaterials has grown into an ecosystem that now includes bioengineered skin, cellular therapies, synthetic matrices, and intelligent wound interfaces. These innovations have expanded clinical possibilities and redefined standards of care across trauma, oncology, and limb salvage.

As the sector matures, the strategic imperative for MedTech leaders has shifted. The question is no longer whether tissue technologies will reshape care - but how to lead in a market where disruption is accelerating, convergence is inevitable, and value is measured in real-world outcomes.

 
In this Commentary

This Commentary explores the evolution of tissue technologies from passive biomaterials to biologics and data-driven healing platforms. It argues that future MedTech leadership will hinge not on product innovation alone, but on orchestrating interdisciplinary ecosystems that integrate cellular science, digital health, and real-world outcomes. As convergence accelerates, the winners will be those who evolve from device makers into platform providers shaping the next era of regenerative care.
 
The Market Then and Now

The roots of today’s tissue technology market can be traced back to the 1980s and 1990s, when early breakthroughs in biomaterials - such as acellular dermal matrices, artificial skin, and semi-synthetic grafts - were driven by a mechanistic understanding of tissue repair. These innovations, often developed through public-sector research, military collaborations, and burn trauma units, marked a shift from passive dressings to biologically interactive materials. Companies like Organogenesis and Genzyme were among the first to commercialise these therapies, helping to establish the regulatory and reimbursement frameworks that would define a new category of care.

By the early 2000s, tissue technology had begun moving beyond its initial niche in trauma centres, expanding into reconstructive surgery, limb salvage, and chronic wound care. This clinical broadening was accompanied by increased commercial interest. In addition to early pioneers like Integra LifeSciences, newer entrants such as LifeCell and Systagenix (then part of Kinetic Concepts Inc., under the Acelity group) began to shape a more competitive landscape. The 2019 acquisition of Acelity Inc. - including KCI and its subsidiaries - by 3M marked a significant consolidation in the advanced wound care sector, highlighting the market’s growing maturity.

Innovation during this phase was characterised by incremental rather than disruptive progress. Improvements in packaging, sterility, handling, and shelf stability supported operational efficiency and facilitated broader clinical integration. At the same time, increasing volumes of clinical data helped de-risk adoption for providers and payers, while regulatory pathways became more defined. The rise of bundled payments and value-based care further incentivised uptake by aligning economic and clinical outcomes.

However, despite commercial and operational advancements, the underlying technological paradigm remained unchanged. Most products continued to centre around the use of biologically derived or synthetic scaffolds to promote tissue repair, with limited integration of active or adaptive functionalities. The industry, while maturing, was still operating within a relatively static innovation framework.

Today, the sector is approaching an inflection point. Advances in regenerative biology, precision manufacturing, and digital health are converging, enabling a new generation of solutions that go beyond scaffolding to actively stimulate, monitor, and modulate healing in real time. This is not an incremental shift - it is a platform-level transformation. The next decade will not be defined by better versions of yesterday’s products, but by new modalities that blend cellular science, smart materials, and predictive data. In short: the tissue technology market is evolving from a materials-driven sector to a biologics-and-data-driven one. For MedTech leaders, the challenge is to recognise that the past 40 years have been prologue. The future will be defined by convergence, complexity - and competition from unexpected directions.

 
Where the Market Is Headed

The broader global tissue regenerative market is projected to surpass $22bn by 2035 - but the composition of that market will be unrecognisable compared to today. The dominant players will no longer be defined solely by proprietary biomaterials or single-product portfolios. Instead, leadership will hinge on an ability to integrate biologics, real-time data, and therapeutic intelligence into comprehensive healing platforms.

First, advanced wound care is no longer confined to materials science. Tissue regeneration is becoming a cross-disciplinary endeavour - where cellular therapies, engineered tissues, gene modulation, and biosensor-enabled feedback loops converge. This evolution demands capabilities that stretch beyond traditional device or biotech silos.

Second, healthcare systems are no longer purchasing promises - they are demanding performance. Cost-effectiveness, total patient outcomes, speed to closure, reduction in readmissions, and long-term functionality are now the metrics that matter. As value-based care models expand globally, reimbursement will follow demonstrated impact - not theoretical potential.

Crucially, the leading companies in this next era will not be those with a superior scaffold or cell line, but those that can operate as regenerative platforms - combining therapeutic modalities with diagnostics, data analytics, and delivery innovation. Think of a company that can provide not just the biologic or graft, but the protocol, the predictive algorithm, the patient monitoring layer, and the real-world data loop to refine care continuously.

We are already witnessing the first wave of a powerful biotech‑driven transformation in wound care. Companies like Vericel and Tissium are pioneering a new generation of targeted tissue therapies - bioengineered solutions designed to accelerate regenerative healing with greater precision and efficacy. At the same time, the emergence of smart dressings is transforming the way wounds are monitored and treated. Start-ups like iCares - whose “lab‑on‑skin” smart bandage was developed by Professor Wei Gao’s team at Caltech and USC - along with Portugal‑based adhesivAI, are integrating miniaturised biosensors into adhesive dressings. These sensors track critical wound metrics like moisture, pH, and temperature, streaming real-time data to cloud‑hosted AI platforms that generate tailored treatment recommendations. Technically, this requires breakthroughs in flexible electronics, biocompatible sensor materials, ultra‑low‑power wireless communication, and AI algorithms refined for biomedical signal processing.
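To make the idea concrete, the rule-based flagging layer that such sensor streams enable can be sketched in a few lines. This is a purely illustrative sketch - not any vendor's actual API or clinically validated thresholds; every name and cut-off below is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class WoundReading:
    """One timestamped sample from a hypothetical smart dressing."""
    moisture_pct: float   # relative moisture under the dressing
    ph: float             # wound-bed pH
    temp_delta_c: float   # temperature relative to peri-wound skin

def assess_reading(r: WoundReading) -> list[str]:
    """Return human-readable flags. Thresholds are illustrative only,
    not clinical guidance."""
    flags = []
    if r.ph > 7.6:              # an alkaline shift is associated with stalled healing
        flags.append("alkaline pH - possible infection or chronicity")
    if r.temp_delta_c > 2.0:    # local warming can precede clinical infection
        flags.append("elevated local temperature")
    if r.moisture_pct < 30.0:   # an overly dry wound bed slows epithelialisation
        flags.append("wound bed too dry")
    return flags

# Example: a dry, alkaline reading raises two flags for clinician review
reading = WoundReading(moisture_pct=25.0, ph=7.9, temp_delta_c=0.4)
print(assess_reading(reading))
```

In a real deployment this logic would sit behind the cloud platform, with thresholds fitted to validated outcome data rather than fixed constants - but the point stands: the commercial value lies in the interpretation layer, not the dressing itself.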

On the business front, this convergence of biotech, digital health, and AI is disrupting traditional wound‑care dynamics. Established MedTechs such as Smith&Nephew and 3M are shifting from supplying consumables to building comprehensive digital care ecosystems. Their platforms now aim to deliver value‐added services - remote monitoring, predictive analytics, and patient engagement tools - beyond the physical dressing. Meanwhile, companies from outside the traditional MedTech sphere - including digital‑health start-ups, data platform operators, and pharmaceutical firms - are positioning themselves to capture share of the once device‑centric market. This influx of cross‑sector players is driving new collaborations, M&A activity, and novel go‑to‑market models that blend devices, diagnostics, data, and therapeutics into integrated care pathways. As the boundaries continue to blur, stakeholders who master this convergence stand to gain competitive advantage in both clinical outcomes and sustainable business models.

To remain relevant, traditional MedTech firms will need to reimagine their role: not just as innovators of regenerative products, but as orchestrators of interdisciplinary care ecosystems. This requires new investment strategies, new talent, and a willingness to partner outside the usual supply chain. Ultimately, the winners in tissue regeneration will be those who understand that healing is no longer a material challenge - it is a systems challenge. And platforms - not products - will define the next generation of leadership.
Key Disruptive Technologies

The next wave of disruption in tissue technology is not driven by any single modality, but by a convergence of biological, digital, and manufacturing breakthroughs. Evolving technologies are positioned to redefine both the structure of the market and the standards of care. Each brings clinical potential and strategic implications for how value will be created, delivered, and measured. Here are the five disruptors that are already reshaping the tissue technology market.
 
1. Cellular Therapies and Stem Cell-Integrated Scaffolds
Once the domain of academic research and early-phase trials, stem cell-integrated scaffolds are now making their way into controlled clinical environments - bringing regenerative capabilities that replicate native tissue structure and biochemical signalling. These next-generation platforms go beyond passive support; they actively engage in tissue healing through integration with autologous or allogeneic stem cells.

Key innovators to watch:
  • Vericel, with its FDA-approved autologous cell therapy MACI, is redefining cartilage repair.
  • Organogenesis and MiMedx are advancing placental and amniotic tissue-derived biologics, showing promise in wound healing and inflammation modulation.
  • Mesoblast and Gamida Cell, among early-stage players, are building scalable platforms for cell manufacturing - critical for expanding clinical and commercial reach.
Strategic implication: The race is on to industrialise living therapies - those with inherent biological function - without degrading their regenerative potential. The companies that master this balance will shape the future of tissue engineering and define new therapeutic standards.

2. 3D Bioprinting and Customisable Tissue Constructs
3D bioprinting is redefining the frontier of tissue engineering by enabling the precision layering of vascularised, patient-specific constructs. While the field remains emergent, regulatory engagement is accelerating, and capital is converging on platforms that blend biomaterials, software, and microfabrication. This convergence is turning once-theoretical applications into tangible clinical possibilities.

Key innovators to watch:
  • CELLINK (BICO Group), a leader in modular bioprinters used across academia and industry for tissue research and prototyping.
  • TissUse and Prellis Biologics, pushing the envelope on microvascularised models critical for functional tissue viability.
  • United Therapeutics, in collaboration with 3D Systems, developing whole-organ scaffolds - a step toward transplantable bioprinted organs.
Strategic implication: The ability to personalise regenerative constructs at scale has the potential to redefine complex surgical interventions - and disrupt the traditional allograft and cadaveric tissue supply chains.

3. Smart Wound Devices and Biosensor-Enabled Dressings
The wound care landscape is shifting from passive materials to sensor-embedded platforms that deliver real-time data on healing dynamics - pH, exudate, bacterial burden, and tissue status. This evolution is especially impactful in chronic and outpatient care, where early detection enables timely intervention and prevents costly escalation.

Key innovators to watch:
  • Smith&Nephew and 3M, integrating biosensors into advanced dressing systems.
  • Emerging players like 11Health’s Ostom-I sensor and Redsense Medical, focused on wearable sensors and remote wound monitoring.
  • Research powerhouses such as the Fraunhofer Institute, developing multi-modal smart bandages with embedded diagnostics.
Strategic implication: As real-time wound monitoring becomes standard, MedTech companies will shift from product-based offerings to predictive, service-oriented models - aligning with value-based care frameworks.

4. Synthetic Biology and Engineered Biomaterials
Biomaterials are evolving from inert scaffolds to programmable agents capable of interacting intelligently with their biological environment. Whether it is tunable degradation (the ability to control the rate at which a material breaks down), antimicrobial release, or immunomodulation, these materials are designed to respond to the physiological context - ushering in a new class of "living" biomaterials.

Key innovators to watch:
  • Tissium, advancing programmable, bioresorbable surgical adhesives and barriers.
  • RevBio and Alafair Biosciences, pioneering calcium-based and polymeric materials for bone and soft tissue regeneration.
  • Leading academic spinouts from MIT, Stanford, and ETH Zürich, pushing the limits of functional bio-interfaces and responsive scaffolding.
Strategic implication: The emergence of smart biomaterials will reduce surgical variability, improve integration, and enable more predictable outcomes in complex reconstructions - redefining material science’s role in therapeutic design.

5. AI-Guided Wound Management and Predictive Healing Analytics
AI is transforming wound care from a reactive discipline into a proactive science. By integrating imaging, wearable data, and electronic health records (EHRs), predictive algorithms are now forecasting wound trajectories, infection risks, and optimal interventions. This data-driven intelligence reduces subjectivity and accelerates clinical decision-making.

Key innovators to watch:
Strategic implication: Those who successfully embed AI into the clinical workflow will not just sell devices - they will become partners in care delivery, influencing outcomes, workflows, and reimbursement models.
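As a minimal illustration of the predictive analytics described above - not any vendor's actual model - a wound-healing risk score can be built on the widely reported observation that early percentage area reduction predicts eventual closure. The logistic coefficients below are invented for illustration, not fitted to real outcome data:

```python
import math

def healing_risk_score(weekly_area_cm2: list[float]) -> float:
    """Illustrative (non-clinical) risk that a wound will fail to heal,
    derived from fractional area reduction since baseline.
    Returns a value between 0 (low risk) and 1 (high risk)."""
    if len(weekly_area_cm2) < 2 or weekly_area_cm2[0] <= 0:
        raise ValueError("need at least two positive area measurements")
    # Fractional area reduction from baseline to the latest measurement
    reduction = (weekly_area_cm2[0] - weekly_area_cm2[-1]) / weekly_area_cm2[0]
    # Map reduction to 0-1 risk via a logistic curve; the slope (8.0) and
    # midpoint (0.5) are hypothetical, chosen only to shape the example.
    return 1.0 / (1.0 + math.exp(8.0 * (reduction - 0.5)))

# A wound shrinking from 10 to 9 cm2 over four weeks (10% reduction)
# scores as high risk; one shrinking to 4 cm2 (60% reduction) as low risk.
print(round(healing_risk_score([10.0, 9.8, 9.5, 9.0]), 2))
print(round(healing_risk_score([10.0, 8.0, 6.0, 4.0]), 2))
```

Production systems would replace this single feature with imaging, sensor, and EHR inputs and a trained model - but the commercial logic is the same: the score, not the dressing, is what triggers earlier, cheaper intervention.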

Each of these disruptive domains is reshaping traditional value chains and redefining core capabilities. What is becoming increasingly evident is that future leaders in the field will not just create superior wound dressings or biomaterials - they will master the orchestration of complex, interdependent systems spanning biology, data science, and care delivery. The most successful organisations will function less like conventional product manufacturers and more like platform integrators, blending scientific innovation, digital infrastructure, and clinical intelligence to unlock outcomes that were once thought unattainable.
Strategic Pressures and Market Shifts

The competitive terrain in tissue technology is undergoing a structural transformation. What was once a race among proprietary biomaterials has become a multi-front battle across platforms, disciplines, and data ecosystems. Market incumbents - many of whom have built dominance on a single scaffold, matrix, or biologic - are now contending with a new breed of competitors that bring different capabilities and value propositions.

1. Cross-Platform Competition
Today’s competitive threat is not just product-to-product - it is platform-to-platform. Device firms are being challenged by biotech spinouts developing living therapies, software-native start-ups offering wound assessment and predictive analytics, and hybrid models that fuse biologics with digital diagnostics or drug delivery.
  • Tissium, for instance, is blending surgical devices with programmable biomaterials.
  • Swift Medical and Tissue Analytics are capturing provider share with imaging and AI - offering no physical product at all.
  • Vericel and Gamida Cell are making cell therapy products that bypass traditional material approaches.
  • Meanwhile, Amazon and Alphabet have signalled moves toward remote diagnostics and logistics infrastructure that could reshape post-acute and home-based wound care.
Strategic implication: Capability convergence is collapsing traditional market boundaries - and the firms with modular, data-integrated platforms will outperform those with siloed products.

2. Regulatory Evolution and Evidence Expectations
Regulatory frameworks are evolving - but also tightening. Both the FDA’s Regenerative Medicine Advanced Therapy (RMAT) designation and the EMA’s Advanced Therapy Medicinal Products (ATMP) pathway have accelerated review for cutting-edge treatments. However, regulators are demanding more robust, longitudinal data, particularly in the post-market phase.

Real-world evidence (RWE) is becoming obligatory. Companies that cannot generate, analyse, and report meaningful outcomes across diverse populations will struggle to maintain reimbursement and access.
  • Organogenesis has invested in post-market studies to retain Centers for Medicare & Medicaid Services (CMS) coverage for its wound products.
  • Smith&Nephew is building evidence platforms through partnerships with data providers and clinical networks.
  • Digital-first companies can natively integrate outcome tracking, creating a structural advantage in long-term data capture.
Strategic implication: Regulatory compliance is shifting from trial execution to full-lifecycle evidence generation. MedTech leaders must think like data companies, not just manufacturers.

3. Health System Demands for Total Value
Payers and health systems are no longer swayed by marginal improvements or marketing claims. They are demanding total value: therapies must prove efficacy, speed to healing, functional recovery, reduction in complications, and downstream cost savings. The burden of proof is rising - not just for initial performance but for durability of outcomes.
  • In diabetic foot ulcers, for example, payers are favouring products that reduce amputations and readmissions, not just close wounds faster.
  • 3M’s advanced wound care division is focused on bundling products and services to offer measurable episode-of-care value.
  • Start-ups like Kerecis (acquired by Coloplast) emphasise natural, cost-effective outcomes with fish-skin grafts - aligning with emerging payer preferences for bioeconomic value.
Strategic implication: The product-centric pitch is obsolete. Future competitiveness hinges on a solution-based narrative - “What total problem do you solve?” - not just “How well does your material work?”

These strategic pressures - cross-platform competition, regulatory scrutiny, and economic accountability - are not temporary headwinds. They represent a rewiring of the tissue tech market. Leadership will no longer be defined by innovation alone, but by strategic integration, data fluency, and health economic literacy. For MedTech companies, the imperative is clear: evolve from being product developers to ecosystem orchestrators, capable of delivering outcome-centric, data-validated solutions in a complex, converging landscape.
Strategic Imperatives for Legacy MedTech Leaders

As the tissue technology market shifts from materials to systems, from products to platforms, and from innovation to outcomes, legacy MedTech companies must undergo not just technical evolution, but strategic transformation. Survival - and leadership - will depend on acting across five key imperatives:

1. Reframe the Business from Product Maker to Solution Integrator
What must change:
Stop thinking in product categories - start thinking in patient journeys. Legacy firms must evolve from selling wound dressings, matrices, or scaffolds to delivering integrated care solutions that combine therapy, monitoring, and outcome management.
Action steps:
  • Develop end-to-end offerings that bundle biological products with diagnostics, patient education, and post-acute care pathways.
  • Build or acquire digital tools (e.g., AI wound imaging, remote monitoring apps) that plug into care pathways.
  • Shift go-to-market language from “features and claims” to “clinical and economic outcomes.”
2. Operationalise Real-World Evidence (RWE) as a Core Capability
What must change:
Clinical trials are no longer enough. Companies must generate continuous, credible real-world data to meet regulatory, payer, and provider demands.
Action steps:
  • Build in-house RWE teams that can generate, analyse, and publish data at scale.
  • Form post-market study consortia with providers to validate long-term outcomes.
  • Create digital infrastructure to collect real-time healing data across multiple settings, including the home.
Example: Organogenesis’ strategy of investing in RWE helped it navigate CMS reimbursement volatility in chronic wound care.

3. Forge Strategic Partnerships Beyond the MedTech Sector
What must change:
The most transformative innovations will not be built in-house. Future leaders will collaborate across biotechnology, software, AI, diagnostics, and even logistics.
Action steps:
  • Partner with biotech firms for cell or gene therapy adjacencies.
  • Collaborate with AI and imaging start-ups to enhance clinical decision-making.
  • Explore co-development agreements with digital health or wearable companies.
  • Consider joint ventures with payers or providers for bundled outcome models.
Example: Smith&Nephew’s partnerships with AI start-ups and EHR providers signal a pivot toward becoming a smart wound-care ecosystem, not just a product supplier.

4. Invest in Platform Thinking and Modularity
What must change:
Legacy pipelines built for single-use products must be redesigned for modularity and scale. The future is platform-driven - where the same biological or digital core can power multiple indications and settings.
Action steps:
  • Create modular platforms (e.g., scaffold + cells + sensor) that can be tailored for different use cases: burns, diabetic foot ulcers (DFUs), surgical wounds, reconstructions.
  • Standardise across product lines to enable plug-and-play innovation.
  • Design data architectures that integrate across therapies and care stages.
Example: Vericel’s platform approach allows expansion from cartilage repair to other autologous cell therapies with shared infrastructure.

5. Rewire the Culture: From Device-Centric to Data-Literate
What must change:
Culture must shift from engineering-first to evidence-first - from compliance-focused to outcomes-obsessed. This requires talent, mindset, and metrics evolution.
Action steps:
  • Hire data scientists, systems biologists, and AI strategists into leadership roles.
  • Align incentives around long-term outcomes, not short-term sales.
  • Train commercial teams to speak the language of health economics, not just technical specs.
Example: 3M’s integration of Health Information Systems into its MedTech division reflects this evolution in cultural DNA.

Legacy MedTech firms will not succeed over the next decade by making better versions of past products. They will win by thinking systemically, acting cross-functionally, and building ecosystems of care that outperform across clinical, economic, and human dimensions. To lead the future of tissue technology, companies must not just adapt to convergence - they must become engines of it.

 
The Future Shape of the Market

A decade from now, the tissue technology landscape will be defined not by incremental advances, but by full-scale convergence - of biology, data, and digital infrastructure. Four shifts will reshape the competitive and clinical terrain:
  1. Personalised Regenerative Therapies: Cell-, gene-, and scaffold-based treatments will be tailored to individual biology, tissue type, and comorbidity - moving from off-the-shelf to on-demand healing.
  2. Closed-Loop Wound Care Systems: Smart dressings embedded with biosensors, paired with AI-driven platforms, will deliver real-time diagnostics, automated intervention triggers, and predictive healing analytics - blurring the lines between treatment and monitoring.
  3. Hybrid Surgical-Biologic Interventions: Operating rooms will routinely deploy integrated biologic devices - engineered grafts, living adhesives, and smart implants - delivered alongside precision surgical protocols in trauma, oncology, and complex reconstructions.
  4. Globalisation of Access and Manufacturing: As production scales and costs decline, regenerative platforms will expand into emerging markets - bringing advanced wound care to millions currently underserved by conventional therapies.
This future will not belong to the largest players but to the most agile. MedTech firms that are digitally fluent, biologically sophisticated, and clinically aligned will succeed and lead. Those that cling to legacy portfolios or underestimate the speed of market convergence will not survive. The next decade is not just about innovating faster - it is about redefining what it means to innovate in medicine.
 
Takeaways

The regenerative revolution is no longer speculative - it is here, unfolding in clinics, operating rooms, outpatient centres, and home care settings. What was once visionary science is now viable business, driving clinical outcomes and attracting capital. Tissue technology has moved beyond the laboratory and into the healthcare mainstream - but the rules of success are changing.

The next decade will not be defined by who first developed a breakthrough scaffold or patented a novel material. It will be shaped by those who build platforms, integrate disciplines, and deliver outcomes at scale. In a market where biology meets data, and care is increasingly decentralised and value-driven, leadership requires orchestration - not just invention.

Standing still is no longer a neutral act. For MedTech companies, complacency is a strategic liability. Firms that continue to operate as product manufacturers will be outpaced by those that position themselves as solution providers, data stewards, and ecosystem enablers. This is a moment of both risk and opportunity. The companies that rise to it - by embracing convergence, investing in real-world evidence, and aligning with clinical and economic value - will not just survive the next wave of change; they will define it.