By Cambridge Design Partnership

Exploring the potential (and pitfalls) in on-body large-volume injectors

In a recent Q&A with Healthcare Packaging, CDP’s Clare Beddoes and Stephen Augustyn discuss the latest trends in on-body large-volume injectors (LVIs) and what they expect to see in 2024.


Clare Beddoes

Head of Drug Delivery

Steve Augustyn

Deputy Head of Drug Delivery

Here are their key takeaways:

There are a growing number of untapped disease states for on-body LVIs

“In terms of large-volume drugs that could be appropriate for delivery via an on-body LVI, conditions such as neurological disorders, oncology, and autoimmune diseases present promising opportunities.”

On-body LVIs present both pros and cons when it comes to sustainability 

“Compared to traditional autoinjectors, the pros lie in enabling more patients to receive treatment at home, reducing travel and clinic resource usage. However, on-body LVIs are notably more complex than autoinjectors, involving adhesive patches, sterility barriers, intricate fluid paths, and often electromechanical drive systems.”

The two main hurdles to commercializing on-body LVIs are cost and risk

“Companies are understandably cautious about risk, often preferring to launch conventional devices initially, while avoiding adding risk on top of risk with a new device for a new drug product. Technical complexities, novel primary containers, device costs, and manufacturing for lower production volumes create hurdles.” 

Regulatory challenges have seen a significant improvement since 2022

“ISO 11608 Part 6 offers a clearer framework than before, when these devices were verified as infusion pumps, focusing on ‘rate accuracy’ and not ‘dose accuracy’. The regulatory route is much better understood and there are now products on the market, which gives confidence to device manufacturers.”

Device manufacturers are pushing to enhance automation for large and small-batch production

“In terms of assembly and packaging, high-speed automation rarely makes sense below 2m units a year, so below that, semi-automated or manual processes remain, adding to the device cost. However, some committed device manufacturers are pushing to enhance automation using well-considered device design and process monitoring.” 

In summary, the industry still favours established options like autoinjectors, with some pharma companies preferring to use two autoinjectors rather than deal with the complexity of an on-body delivery system. However, as LVIs become more established, the anxiety around their use will diminish.


Missed our keynote at Pharmapack?

In their keynote session this year, our resident combination product experts, Head of Drug Delivery Clare Beddoes and Deputy Head of Drug Delivery Steve Augustyn, asked: Where are all the on-body delivery devices?

Explaining the need to deliver innovation fast, they covered:

  • Barriers to market that on-body devices face and how to overcome them.
  • How to navigate relevant drug product pipelines, regulations and standards.
  • Future drivers for successful LVI development.

“The huge demand for high-volume biologics that was expected to drive on-body systems hasn’t materialized, or the devices are proving extremely complex and difficult to assemble. [But] as LVIs become more established, the anxiety around their use will diminish.” 

– Steve Augustyn


Innovations in Oncology: Past, Present, Future

 

This article will explore five exciting developments in the field of oncology and how advances in diagnostics, data analytics, cell biology and delivery devices are enabling oncology companies to bring the hope of longer, healthier lives to a wider range of patients than ever before.

First, let’s look at the history of this field of medicine and the current state of play. Cancer has plagued us for millennia. The oldest known records date as far back as Ancient Egypt, when the great physician Imhotep described a “bulging in the breast” for which there was no therapy. Thankfully, we’ve come a long way since.

A history of oncology: a whistle-stop tour [1]


Modern surgery (the 1800s):

Surgeons faced severe limitations until the discovery of anaesthesia and antisepsis in the late 1800s. By the early 1900s, sophisticated surgeries had become commonplace, and it was possible to treat a wide range of early-stage cancers by surgically removing solid tumours.

Radiation and chemotherapy (late 1800s-1900s):

The discovery of X-rays in 1895 ushered in the era of radiation therapy. Chemotherapy followed in the mid-1900s, with the first effective cytotoxic drug found in nitrogen mustard, a derivative of the mustard gas used as a chemical weapon in WWI.

Progress in the 1950s-60s:

A flurry of discoveries in the 1950s and 60s yielded gains in cancer prevention, detection, and treatment. Notable achievements include proving the link between smoking and lung cancer, the introduction of cervical cancer screening, and advancements in chemotherapy.

The knowledge revolution (1980s-present):

The 1980s marked a turning point in our understanding of cancer biology. What was once thought of as a monolithic disease, and then a collection of altogether disparate conditions, is now understood to be a group of related diseases. Underneath the heterogeneity of cancer lies a consistent pattern: mutated genes that typically sit at key junctions in cellular signalling pathways, granting the cell distinctive pathological capabilities (e.g., the ability to evade growth-curbing mechanisms).

With the discovery of proto-oncogenes, tumour suppressor genes, and an understanding of the ways in which they disrupt specific cellular pathways, came the promise of (molecular) targeted therapies. Herceptin was one of the first: a monoclonal antibody specifically engineered to target and block receptors encoded by the HER2 oncogene.

New insights also paved the way for better prevention and diagnostics, such as the development of preventative cancer vaccines and the use of cancer biomarker assays to improve clinical decision-making.

Oncology today: a global challenge

The knowledge revolution continues, with over 700 proto-oncogenes and tumour suppressor genes identified by 2018, providing a cornucopia of potential treatment targets. However, the focus has shifted from “finding a cure” to prolonging healthy life through better prevention and care. And it’s working: in the US, the age-adjusted cancer death rate dropped by 22% between 2005 and 2020 [2]. But challenges persist, including:

  • Detecting cancer early when it can be most successfully treated
  • Identifying the best treatment for each patient
  • Targeting treatments to kill cancer cells effectively while minimising off-target toxicity
  • Increasing equitable access to prevention and care 

Oncology tomorrow: a multidisciplinary solution

Opportunities that were previously unimaginable are now within our grasp – we have unprecedented insight into cancer and access to powerful new technologies. Here are five developments we’re excited about:

1. Investments in decentralised testing

In the world of oncology diagnostics, decentralised testing can help more patients get an earlier and more accurate diagnosis, improving their chances of survival.

Elsewhere in diagnostics – notably respiratory illnesses and sexually transmitted diseases – there has been a successful decentralisation of testing. An abundance of point-of-care tests are available, along with the infrastructure to support them. By speeding up diagnosis, these tests streamline the process from the appearance of symptoms to the patient receiving the correct medication.

In cancer diagnostics, this decentralisation hasn’t been possible to date. The vast majority of testing still requires a biopsy and subsequent analysis in a pathology lab; throughput is therefore constrained.


Emerging cell-free DNA methods, which use liquid biopsies, can potentially reduce the reliance on solid tumour biopsies, but limited sensitivity restricts their applicability. Ultimately, science is only part of the answer. There is also a need to improve workflows across the entire diagnostic journey, from sample collection and preparation to data interpretation. Here, too, progress is being made. For example, the UK has seen investment in Community Diagnostic Centres – vastly increasing computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound scanning capacity to reduce the time to diagnosis [3].

A monoclonal antibody (yellow) blocks the interaction of PD-L1 with PD-1

2. Big data meets precision oncology

Once a diagnosis is made, clinicians must choose the most appropriate course of treatment. High-throughput sequencing technology and large-scale cancer genome studies have put a wealth of data at their disposal to aid in this decision. Precision oncology aims to harness that data to improve outcomes by using mutation analysis to guide treatment decisions for individual patients. However, making this a reality has turned out to be far more complicated than it sounds.

To be clinically useful, genetic or biomarker tests should be sufficiently predictive of treatment response from a targeted agent (e.g., HER2 positivity in breast cancer is predictive of susceptibility to Herceptin). Finding such test/treatment targets in the vast quantities of multi-omic data by brute force (and then developing or pairing with suitably matching drugs) requires immense computational power, which has so far been a limiting factor.


Because of its superior pattern recognition capabilities, we’re excited about the potential for artificial intelligence (AI) to change the game entirely. For example, AI could leverage the ever-growing stores of data to more efficiently:

  • Identify driver mutations that may also be actionable drug targets
  • Generate and down-select potential drug candidates with in-silico screening
  • Identify predictive biomarkers that could be used to match patients with the most appropriate therapy

Of course, it’s early days for the application of AI to this purpose. Making the best use of the data requires as much of it as possible to be publicly accessible, as well as the development of standard tools and conventions to improve cross-centre collaboration – so we’ll monitor developments with interest.

3. Foundation models of the cell

The promise of AI in oncology goes well beyond pattern recognition: just as foundation models have been developed for language (and now power chatbots and generative AI), they could be developed for biological systems. Imagine general-purpose models of human cells and cancers that could be adapted to represent specific patient cohorts or even individual patients (e.g., digital twins).


The models might be quite intricate, representing a hierarchy of structure – from organ systems, through tumours and their microenvironments, to individual genes and the proteins they encode. They could be adapted to assist in drug discovery, trial design and monitoring, and clinical decision-making. For example, to:

  • Simulate the effect of potential treatments on different patient populations
  • Evaluate treatment options in a specific cohort

Key to the successful implementation of such models is explainability: the ability to explain the model’s behaviour and its decisions in human terms. Additionally, to unleash their full potential, these models should not be static but rather learn continuously as new data is acquired – which will pose an interesting regulatory challenge.

4. The right device for the right patient at the right time

Once an appropriate therapeutic agent has been identified comes the challenge of delivering it safely and effectively. Here, the choice of administration route and delivery device are crucial.

In therapeutic areas such as rheumatoid arthritis, drugs that were previously delivered intravenously (IV) have long been available for subcutaneous (SubQ) self-administration – a transition made possible by improvements in formulation as well as device technology (e.g., autoinjectors).

A similar trend of IV to SubQ for in-home self-injection has been forecast for anticancer drugs for some time, with the promise of reducing the treatment burden for patients and healthcare systems [4]. It hasn’t quite materialised, and for good reason:

  • Most traditional cancer therapies are vesicant (i.e., known to damage subcutaneous tissue) and, therefore, fundamentally unsuitable for SubQ administration
  • Many are hazardous to handle: the risk of a leakage causing harm is high in a home or self-administration setting
  • Many patients need to be seen face-to-face by a clinician because their therapy requires variable dosing or close monitoring for side-effects


As evidenced by the small but growing number of regulatory approvals (mostly in the form of pre-filled syringes or syringes prepared by the hospital pharmacy), there is certainly room for subcutaneous delivery in oncology and even self-administration – if the risk/benefit profile for a specific patient and therapy warrants it.

The solution, however, is not a push for wider adoption of any specific device technology but rather device selection that is underpinned by a thorough understanding of all stakeholder needs. Not just the immediate needs of the patient, clinicians, and formulation but also the wider context in which the therapy will be administered (care workflow, reimbursement pathways, etc.).


5. Devices for targeted drug delivery

Cytotoxic chemotherapy has long been the mainstay of oncology treatment: it is versatile and effective. The problem is that off-target toxicity limits its tolerability. Targeted delivery aims to address this challenge by increasing drug concentration in cancerous tissues relative to healthy ones. This can improve efficacy while reducing side effects.

Drug-loaded nanoparticles, which are delivered systemically, have been investigated since the ‘90s as a way to achieve selective binding to tumour targets. However, despite extensive research, the number of such therapies available to patients is well below projections; promise in animal studies often fails to translate to success in humans [5].


We’re excited about the potential of targeted delivery, for example, direct injection of cancer drugs into tumours, as well as the use of implantable pumps and reservoirs to access pharmacological sanctuaries such as the blood-brain barrier.

There is a robust pipeline of intratumoral therapies, with drugs for melanoma leading the charge – partly because the lesions are often superficial and, therefore, easier to find and inject into. Delivery to deeper tumours is more challenging and is currently conducted by highly skilled clinicians with imaging support – requiring the development of delivery devices that provide a high degree of flexibility and control over the injection technique.

Early diagnosis and targeted treatment offer hope of better outcomes

Oncology has come a long way and will continue to evolve with our growing understanding of the disease and the emergence of new technologies to prevent, detect and treat it. We’re excited to see innovations along the entire care pathway:

  • Investments in decentralised testing, including the development of cell-free DNA technologies, to allow faster and more accurate diagnosis
  • The potential of AI to revolutionise drug discovery and development and to help clinicians match patients to the most appropriate treatment
  • Improvements in delivery device technology to allow for safer, more targeted, and effective treatment


The journey continues, and hope prevails.


References
  1. Mukherjee, S. The Emperor of All Maladies: A Biography of Cancer.
  2. National Center for Chronic Disease Prevention and Health Promotion (U.S.), Division of Cancer Prevention and Control. An Update on Cancer Deaths in the United States (2022).
  3. https://www.gov.uk/government/news/government-to-deliver-160-community-diagnostic-centres-a-year-early
  4. Levêque, D. Subcutaneous Administration of Anticancer Agents. Anticancer Research 34(4), 1579–1586 (2014).
  5. Mitchell, M.J., Billingsley, M.M., Haley, R.M. et al. Engineering precision nanoparticles for drug delivery. Nat Rev Drug Discov 20, 101–124 (2021).

Our capabilities

See our range of capabilities in developing drug delivery devices, diagnostics, and medical therapy solutions by visiting the below pages.

Bespoke device for targeted intranasal delivery

Taking a drug to first-in-human trials in a bespoke device for targeted intranasal delivery

Featured in ONdrugDelivery, Mark Allen, Andrew Fiorini, and Shai Assia discuss the need to develop delivery devices early when formulating nasally delivered drugs for systemic and local action, and a method by which the route to clinic can be made easier, faster and cheaper.

Systemic delivery has long been the mainstay of drug administration, whether via the oral, injectable, inhalable, nasal or another delivery route. There are, of course, many well-documented downsides of systemic delivery, including unintended side effects in locations beyond the drug target and reduced efficacy due to dose safety requirements to reduce those side effects. Targeted drug delivery can address many of those issues [1], with targeted intranasal delivery, in particular, having the potential to treat many debilitating conditions, from as-yet underserved conditions, such as cluster headaches, through to central nervous system (CNS) conditions such as Alzheimer’s disease. Indeed, there are currently many active studies on therapeutic delivery via this specialised route [2]. These targeted treatments have the potential to improve the lives of patients, their families and their carers immeasurably.

However, the key challenge lies in achieving the delivery of an accurate dose to a precise location within the nasal anatomy. A device that can enable that targeting is intrinsically linked to drug efficacy, meaning that it is necessary to consider device development earlier in the process than usual. In comparison, a drug intended for parenteral delivery has the well-trodden option of using a vial and syringe for administration by a healthcare practitioner during early development phases while proving basic safety and efficacy. A more complex drug delivery system can then be sourced or designed (if required) in parallel, ready for use in Phase III trials as part of a combination product development pathway.

“The key challenge lies in achieving the delivery of an accurate dose to a precise location within the nasal anatomy. A device that can enable that targeting is intrinsically linked to drug efficacy.”

This off-the-shelf-device approach, aimed at reducing the risk and cost associated with early-stage clinical studies, is not an option available to those developing highly targeted intranasal delivery – most of the currently available nasal devices are designed to coat as much of the nasal cavity as possible, making them unsuitable for delivery to a precise area. A nasal device with a broad spray pattern may even lead to the drug not reaching the intended target area at the required dose level.

So, how can a new, bespoke device be developed and made available for the initial Phase I and II trials? These are complex devices that need to be suitably well designed to ensure that patients or clinical professionals can use them during clinical trials to administer the drug accurately and repeatedly to the correct location, often deep in the nasal cavity.

To answer this, a minimum viable product (MVP) prototype device can be designed for the needs of the Phase I and II clinical trials. Designing for use within the controlled setting of a clinical trial and prioritising solely patient safety, spray geometry and usability (relating to holding and positioning the device) at this stage can considerably reduce the effort, cost and time required to reach the clinic. This MVP device will then allow the safety, efficacy and feasibility of the self-administered, targeted intranasal delivery method to be proven during these early clinical trials.

The device performance and usability are critical to correctly delivering the drug, so learnings from this MVP device can be used in the further development and refinement of the device for Phase III trials, as well as the future commercial-scale device. Carrying out risk assessments and timely iterative testing (via formative studies) on the usability of the device is crucial; misuse or an inability to use the device could stop the patient from administering the drug to the intended location within the nasal cavity, or even cause harm, ultimately preventing the drug from achieving its intended therapeutic effect. Therefore, usability and human factors engineering must be incorporated into the design and development process from the start.

Defining a usable design

The challenge for the device development team is to successfully incorporate design for usability throughout a “lean” MVP device development process, meaning that a safe, usable device must be produced with reduced cost compared with traditional development processes. This can be achieved by careful adaptations to the typical design for usability process. When applying user-centric design principles, as outlined in ISO 9241-210, four steps should be followed:

  • Understand the context of use
  • Define the requirements
  • Build the design
  • Evaluate the design against the requirements.

Although this is not the only relevant standard (others, such as IEC 62366, cover the application of usability engineering to medical devices), ISO 9241-210 provides a set of recommendations and requirements for applying user-centric design principles within design and development activities. These processes help to identify “real” user needs and usability challenges, which can then be used to establish a clearer framework for user interaction and interface design.

Understand the Context of Use

Consideration of the patient, including when and why they are receiving treatment, is essential. For example, if a new targeted nasal delivery device is to replace a healthcare practitioner-administered treatment, it is likely that the patient currently visits a clinic to receive their treatment, disrupting their schedule and placing an additional burden on the healthcare system. A self-administered device will naturally put the patient in control of their treatment and improve their quality of life – as has been witnessed through the advent of self-injection devices. However, targeted nasal delivery relies on the patient not only following the treatment regimen and using the device correctly, but also positioning the device accurately to ensure that the drug is delivered to the precise location intended.

“The best form of information gathering is to consult the patients themselves – they know their needs, and frustrations, better than anyone.”

Another key factor in the design process is predicting how a patient may interpret the device and, therefore, how they would go about using it. This is where the concept of mental models is useful, as it reflects the patient’s perception of how a device works and how to use it based on the patient’s experiences of similar devices. Perception is what a patient sees, hears, touches or smells, which, in turn, triggers mental recall and cognition, which then drives their actions.

The best form of information gathering is to consult the patients themselves – they know their needs, and frustrations, better than anyone. Clinicians and caregivers can provide additional information about patient behaviour and trends based on their experience across a wide range of patients, but their input should remain secondary to the patients’ own.

Speaking to patients is crucial to building an understanding of the context of use; however, care must be taken with the specific questions asked – they must be suitably phrased to avoid leading patients to give similar answers, but also to gather the information required to guide the device design via user needs. Working with experienced insight researchers and human factors experts can greatly increase the value gleaned from patient interaction throughout the design and development process.

Define the Requirements

Once the context of use is understood, the findings and needs of the patient must be converted from a range of opinions and perceptions into clearly defined requirements. It is essential to align patient needs with requirements in a format that can be validated. Similarly, technical requirements need to be verifiable, while also ensuring a cost-effective and usable device design.

User requirements should drive the technical requirements for the device. Requirements are living documents, so each set of patient interviews will typically lead to updates to the requirements throughout the design process. Equally, unknown parameters in the requirements documents can be used to drive patient interviews that can, in turn, be used to refine the requirements further or provide specific values for the device design team. These documents and patient interviews can then both be iteratively tested and updated as required.

Build the Design

The design stage is the point at which activities can be prioritised to reduce development time and costs by differentiating between a prototype device suitable for first-in-human testing and a fully developed and validated device. Here, the typical process of concept generation followed by down selection (via assessment against device requirements) is used to identify a suitable device design for further development.

Once initial prototype devices are available, engineering testing against the requirements can be performed to provide confidence in the design. Full design verification testing is not required at this stage, but sufficient evidence should be generated in the key areas, including safety and dose delivery performance. Development and evaluation of the important training materials, such as the instructions for use, should be started, but with a lowered risk assessment burden, in the knowledge that there will be clinicians available during initial trials.

“Once initial prototype devices are available, engineering testing against the requirements can be performed to provide confidence in the design.”

Focusing on the requirements of the MVP will accelerate time to clinic by concentrating on safety and usability. This MVP device is equivalent to a syringe and vial or prefilled syringe in injectable development for systemic treatments, so there will be future opportunities to refine the design for Phase III trials and commercial launch. This is an appropriate strategy, as the devices will only be used under supervision at this point. All learnings from the study can then be prioritised and incorporated into the final design as required, according to risks identified.

“Once a final prototype has been developed, it must be evaluated against the design requirements by design review, engineering testing and formative human factors studies.”

Evaluate Against Requirements

Once a final prototype has been developed, it must be evaluated against the design requirements by design review, engineering testing and formative human factors studies. This should incorporate a usability assessment for self-administration and simulate as many real functionalities as possible, including tactile, visual and auditory feedback from the device. This process should prioritise evaluating areas highlighted as high risk during previous activities, but also gather information on any additional learnings relevant to future design updates.

The Future of Targeted Intranasal Devices

The approach discussed here aligns with developing a bespoke prototype device suitable for first-in-human trials for targeted nasal delivery. The success or failure of this strategy depends on the nature of the collaboration between the pharmaceutical partner and the device design engineers, as well as on the experience of the insight researchers and usability engineers. Experience in the process required to develop a usable device is critical to the successful outcome of such a project and will pave the way for bringing a device to market in this new and exciting area of nasal drug delivery. It will be fascinating to see just how many new, life-changing improvements will be made possible by targeted nasal delivery.


References
  1. Hanson LR, Frey WH 2nd, “Intranasal delivery bypasses the blood-brain barrier to target therapeutic agents to the central nervous system and treat neurodegenerative disease”. BMC Neurosci, 2008, Vol 9(Suppl 3), S5.
  2. Hallschmid M, “Intranasal Insulin for Alzheimer’s Disease”. CNS Drugs, 2021, Vol 35(1), pp 21–37.

 


Five hurdles to digital health innovation in the UK (and how to overcome them)

CDP recently led an investigation into how to advance digital health innovation in the UK for CPI, UKRI/Innovate UK and ABHI. Our aim was to find out how best to make the UK the location of choice for high-risk digital health innovation, improving patient outcomes.

Our work with 50 leading healthcare professionals and entrepreneurs revealed that the UK has an enviable record in early-stage innovation, a highly regarded healthcare system and a potential treasure trove of high-quality data. 

However, we also found several hurdles that trip up many innovations before their potential can be truly realized. 

In this article, we describe our top five hurdles to success and signpost the resources available to help innovators overcome them. 

1. Getting the right product at the right price

Let’s start with perhaps the most obvious: you need to get your offering right. This was one of the more frequent topics to emerge in our discussions. True, it tended to come from industry and investors rather than entrepreneurs themselves. But perhaps this is the point: those closest to the concept are so captivated by the opportunity to solve a problem that they are rarely the best judge of commercial success.

“The biggest problem is developing stuff we don’t need, at the wrong price point.”

Life Sciences Lead, multinational consultancy

“People have struggled with finding the right balance between fixing a problem not just for the sake of it because it’s going to add value, but also there is a market attached to it.”

Medical Director, AI dermatology revenue-earning startup

Getting the right product at the right price is not easy. Regardless of how it is funded, healthcare everywhere is a complex system of separate entities with conflicting priorities. One of the biggest challenges for digital health offerings in particular is that the person paying the bills is rarely the direct beneficiary. This is as true across the NHS in the UK as it is in insurance-led services in the US. 

Digital interventions are regularly shown to make a significant positive impact on diagnosis, therapy, adherence and behavior change. To date, the FDA has approved, authorized or cleared 171 AI/ML-enabled medical devices. However, going digital means adding overheads (electronics, batteries, software or new digital services) to an already overstretched budget, for value that tends to arrive much further down the care pathway. 

To get the right product at the right price, you need to be crystal clear about the value you bring and who you bring it to so that you can ensure the price is right. 

Helpful Resources

For the UK market, we found the following resources helpful in crossing this important hurdle:

  • The NHS Innovation Service provides an innovation guide that explains how to build a value proposition
  • The NICE Advice Service provides personalized advice on value propositions, for a fee 
  • The NHS Clinical Entrepreneur Programme (CEP), launched in 2016, provides training for NHS staff on the skills required to build a healthcare startup, all without them needing to leave the NHS

Indeed, this is such an important area that we at CDP are looking at how recent advances in Generative AI might make it easier to get right from the outset – not just for services offered within the UK, but also for how UK-based innovation can provide the right offering in the larger US and EU markets.

2. Neglecting the needs of key stakeholders

Digital products and services are still a novelty in healthcare. Even the regulation is taken from a device mindset – consider the terms SaMD (Software as a Medical Device) and now even AIaMD (Artificial Intelligence as a Medical Device). The digital-first mindset is to move fast, learn and repeat to get the best user insight and optimum benefit to market as fast as possible. This is not an easy marriage for healthcare, where verification and validation are critical steps to approval.

“If you are manufacturing a digital health product, you have three sets of policies to navigate right now [in the UK].”

CEO, digital health SME (referring to NHS DTAC, NICE and MHRA)

“The regs are written to cover all medical devices. They’re not very specific; it’s very high level and quite hard to interpret what we should actually be doing as an individual company.”

Medical Director, AI dermatology revenue-earning startup

This is not simply about the regulator; it is also about who will receive, who will administer and who will pay for your digital offering. On top of proving safety and efficacy, payers and adopters want to see evidence that your technology works under real-world conditions and produces sufficient benefit relative to current clinical practice to justify its cost. Not only do you need to convince your investor you have the right product at the right price point; you also need to convince them you have access to reimbursement. 

This need has led us here at CDP to build a strategy and insight team that explicitly looks across the spectrum of stakeholders including the end-user, practitioner and payer.

Helpful Resources

The following resources are helpful when considering the regulatory and UK purchaser stakeholders:

  • The NHS’ AI and Digital Regulations service offers a developer’s guide that leads you through the various regulatory and NHS requirements for digital technologies 
  • The NICE Evidence Standards Framework is designed to help ensure NHS stakeholders are adopting robust technologies that are likely to provide the expected performance, and are good value for money. The framework can be used by developers to understand their customer needs. NICE also offers an assessment of current/planned evidence via their META tool
  • The NICE Early Value Assessment can also help indicate the value your product can bring, and allow you to get support to understand what further evidence needs to be generated.
  • Similarly, NHS’ Digital Technology Assessment Criteria (DTAC) are designed to assess suppliers at the point of procurement, or as part of a due diligence process, ensuring digital technologies meet minimum baseline standards. The criteria can also be used by developers to understand what is expected for entry into the NHS and social care
  • FDA’s list of approved, authorized or cleared AI/ML-enabled medical devices

3. Testing, verifying and validating

The regulatory pathway will force you to verify and validate. It will be rigorous. It will take more time than you or your investors want. So, you will need to test, test and test again as early as possible to build the evidence you need for investment. And, importantly, test both the medical efficacy of your offering and its likely commercial success.

“[It] can take longer than six months, ridiculously, to build a cohort of data. Getting people to step away from frontline service in the NHS is a fundamental challenge of getting access to that data. Even if you offered to pay, they’d say, ‘I don’t care; it’s not the money I’m short of, it’s people’.”

President, medical imaging multinational

“We really struggle to work with SMEs because we’re not able to move at the pace that they require for their cash flow.”

Director of Innovation, NHS trust

As these sentiments show, gathering data takes time and patience. The NHS is indeed a treasure trove of data, but unlocking it is a real hurdle. Existing NHS data typically needs preparation – cleaning and anonymizing – before you can access it. And there simply may not be the staff available to do this, meaning you may need to build additional paths to gather test data. 

The fidelity of the test can start low-fi, but will need to increase as you develop. CDP typically starts with insights research and human factors studies using UI sketches/descriptions of the product to explore the true user journey, before moving onto trials with real-life samples and wizard-of-Oz demonstrators. This builds a body of evidence that reinforces your expectation of efficacy with the all-important usability and the commercial viability of your offering, before embarking on the summative human factors, clinical and market trials.

Helpful Resources

The following resources can help you prepare your plans for testing, verification and validation:

  • NIHR study support service provides guidance and advice
  • The HDR UK Gateway portal helps researchers find existing data sets and connects them to relevant stakeholders
  • Trusted Research Environments (TREs) are a new initiative to facilitate access to NHS data for R&D. Only a few TREs currently exist and there is no guarantee they will have the data you’re looking for. However, the teams involved are well placed to advise you on next steps. Even though this requires approval from HRA and notification to MHRA, consider if it might be better to just do your own trial to collect fresh data instead. This is where an experienced external innovation partner can be very helpful

4. Navigating healthcare as a ‘system of systems’

The benefits of digital health typically require systems integration. Yet, healthcare everywhere is a complex system of systems, each element with its own approaches, tools and requirements.  

“[Different hospitals are] probably using different systems, different levels of maturity with different versions, with different level plugins. That probably means, even if I create it using the standard, it won’t automatically fit. It needs modification, adaptation and someone to do the translation.”

Digital Health Advisor, ex-NHSX

“You can often have very inflexible contracts with your electronic health records supplier; for example, if you want them to make one change or open up in an API or something like that, it can be prohibitively expensive.”

Director of Innovation, NHS trust

While there are only a few dominant providers of Electronic Health Record (EHR) systems, each installation is likely to be different. Moreover, the EHR providers will guard access jealously. Microsoft, Google, Amazon and others provide integration services to structure and translate data, but that is likely to be only a small part of the problem, and only useful if you are ingesting unstructured data from multiple sources. 

At CDP, we encourage our clients to focus on providing easy-to-use, yet secure APIs built around well-structured data that map well to the established data standards such as FHIR. Taking ownership of your own data in this way makes it easier to deploy, integrate and support. 
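To make this concrete, here is a minimal sketch of mapping a single device reading onto the FHIR Observation structure. It is our own illustration, not taken from any specific product: the function name, the patient ID and the reading are invented placeholders (8867-4 is the standard LOINC code for heart rate).

```python
# Minimal sketch: wrap one device reading as a FHIR R4 Observation.
# All identifiers and values below are illustrative placeholders.

def to_fhir_observation(patient_id: str, loinc_code: str,
                        display: str, value: float, unit: str) -> dict:
    """Map a single device reading onto the FHIR Observation structure."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": loinc_code,
                "display": display,
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "valueQuantity": {
            "value": value,
            "unit": unit,
            "system": "http://unitsofmeasure.org",
        },
    }

obs = to_fhir_observation("example-123", "8867-4", "Heart rate", 72.0, "beats/min")
```

Shaping your data around standard resources like this from the outset means each integration only has to translate at the edges, rather than re-interpret your whole data model.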

Helpful Resources

The following resources can help you prepare your digital health services for system integration:

5. Building a strong team

Innovation is rarely one guy in a garage. This is especially true in the digital health space. You will need to build a great team led by experienced professionals across the disciplines. Get this right and everything else will fall into place. Work out your strengths and weaknesses and actively seek resources to complement your team. 

“The most useful thing to an innovator is access to an actual practicing frontline clinician who understands the problem that they want solved. It’s a real challenge to get to these people. I might spend months trying to find someone who would talk to me.”

President, medical imaging multinational

“The UK does not have enough engineering capacity… does not have enough people with product skills… does not [have] enough people with this sort of legal regulatory skills.”

Digital Health Advisor, ex-NHSX

Helpful Resources

The following resources can help innovators looking to build a world class team:

  • Many Health Innovation Networks (HINs) and NHS trusts have innovation teams who may be able to help matchmake with clinical champions. The NHS Innovation Service is a good place to start, but it’s worth seeing what individual trusts are doing. 
  • HINs and NHS trusts often also support innovation and hackathon events which are a great way to find those with a similar innovative mindset – but a complementary skill set.   
  • And then there are organizations such as CDP, who bring end-to-end product development services with the hard-won experiences of how to navigate this exciting but often frustrating area of innovation.

In addition, many of your digital needs are engineering and operational ones. Recruiting experienced people from the finance and technology sectors where the UK is strong will bring you good skills and expertise in algorithm development, handling personal data and building scalable secure systems.

It’s tempting for you (and your investors) to under-resource your team and compromise in the early phases. But as the hurdles above clearly show, this rarely leads to success. Build the great team you need from the outset, to make sure you truly have the right product at the right price, that meets the expectations of the key stakeholders, is properly tested and ready to integrate into the healthcare system of systems.

At CDP, we continue to follow up our insights working with clients and partners to find practical solutions to complex problems. To find out more about what successful innovation in digital health looks like, please do get in touch. 

In the meantime, download the full action plan for digital health innovation in the UK here.


New frontiers in implantable neuromodulation therapies

Neuromodulation, where electrical signals in a patient’s nervous system are modified or stimulated to deliver a therapeutic effect, continues to be an exciting and evolving space within the healthcare sector.

There are many drivers contributing to its advancement. Ongoing clinical neuroscience research fueling new possibilities in neuromodulation therapies, the invention of new technologies, and the development of new product formats to meet unmet needs, are all notable factors.

Additionally, there has been increased acceptance and presence of established therapies for implantable devices – such as deep brain stimulation for Parkinson’s disease, spinal cord stimulation for chronic back and leg pain, and vagus nerve stimulation for epilepsy and depression – with Medtronic, Boston Scientific, Abbott, Nevro and others leading the industry.

In all, these factors have made electrical neuromodulation one of the fastest-growing medical device sectors, with market size expected to rise from $6.09 billion in 2021 to $14 billion by 2030 [1].

The diversity of solutions is evident, with Figure 1 illustrating the current landscape of established and emerging implantable neuromodulation therapies.

Fig 1. Selection of established and emerging electrical neuromodulation technologies and their indications.

In this first article of a two-part series, we look at a few notable emerging therapies to illustrate how the implantable neuromodulation space is rapidly developing.

Emerging therapies for incontinence

Addressing continence issues is a growing area in the healthcare sector, where neuromodulation is seeking to play a significant role in specific therapies.

Implant-based stimulation of the sacral nerve has recently established itself as a way of addressing incontinence through Medtronic’s InterStim and Axonics’ product range. Alongside the sacral nerve, other nerves are being considered for implantable stimulation to address similar conditions and respond to specific unmet clinical and patient needs.

One alternative is tibial nerve stimulation, which has a history of effectiveness for certain cases in its non-implantable form: percutaneous tibial nerve stimulation (PTNS). The implant-based approach seeks to address an inconvenience of PTNS for patients and clinicians, i.e., the need for repeated stimulation sessions and user steps [2].

One example is the BlueWind Revi, which is part implantable (the electrode is placed near the tibial nerve) and, to minimize the invasiveness of the procedure, part wearable (a through-body power source). The device stimulates the tibial nerve, which connects to the sacral nerve plexus containing the efferent and afferent nerve fibers responsible for bladder function. Here, the electrical impulses aim to modify the compromised activity of the detrusor muscle in patients with overactive bladder [3]. The company has recently reported clinical results from its pivotal trial evaluating safety and efficacy (still under review by the FDA) [4].

Similarly, Medtronic is seeking to develop an implantable tibial nerve stimulation system for incontinence, which is currently undergoing clinical trials [5].

Another nerve being targeted for incontinence is the pudendal nerve. Amber Therapeutics is currently developing an implantable closed-loop therapy, Amber-UI, for urge and mixed urinary incontinence. The therapy involves implanting electrodes that can sense, interpret, adapt and respond to individual patient signals, such as muscle contraction, in an attempt to restore normal bladder function. By accessing the pudendal nerve, it aims to treat both urge and stress incontinence episodes for the first time – something not possible with existing neuromodulation devices – thereby expanding the overall addressable market. First-in-human clinical studies are expected to conclude by the end of 2023.

 

Emerging Vagus Nerve Stimulation (VNS) therapies

Along with established therapies for epilepsy and depression, VNS is also being explored for conditions such as Rheumatoid Arthritis (RA) to displace injectable and oral medication.

SetPoint Medical is currently evaluating a novel VNS treatment that activates the ‘inflammatory reflex’ pathway (a neurophysiological mechanism by which the central nervous system regulates the immune system), which may decrease the type of excess inflammation that underlies RA. Its multivitamin-pill-sized MicroRegulator platform is currently an investigational device.

SetPoint Medical is progressing clinical trials not only for RA, but also for Crohn’s disease, and furthermore exploring the therapeutic effect, in animal models, to treat multiple sclerosis with VNS therapy.

Implantable VNS therapy is also being explored for other conditions such as sepsis, lung injury, stroke, traumatic brain injury (TBI), obesity, diabetes, pain management and cardiovascular conditions [7]. One example among cardio-based therapies is low-level stimulation of the vagus nerve to release the body’s own neurochemicals and improve heart function.

 

New pain indications

Neuromodulation has successfully established itself in addressing specific intractable pain of the trunk and/or limbs, as well as diabetic nerve damage – both conditions treated by implanting electrodes in the epidural space for spinal cord stimulation. In light of this success and the product types now available, pain specialists are continually seeking neuromodulation solutions for different causes of pain in different parts of the body.

This impetus was clearly illustrated in panel sessions and discussions with clinicians attending the American Society of Pain and Neuroscience 2023 conference in Miami. We heard testimonials of how specialists, using available stimulators, succeeded in treating a variety of new pain sources and anatomical locations in the wrist, joints, abdominal region and in one case, at the neck to relieve a patient’s sensation of being choked.

This dynamic led some clinicians to propose that the future of neuromodulation should also consider the treatment of pain associated with oncology treatments, given the extended survival now seen in cancer patients. Such exploration and success could pave the way for more established therapies – which would be welcome given the prevalence of chronic pain in the general population and the initiative to deliver non-opioid alternatives.

 

Novel developments for spinal cord injuries

Along with surgical, drug and stem cell therapies, neuromodulation has also entered the frame for addressing spinal cord injuries.

ONWARD has seen success with the partial and fully implantable versions of its ARC Therapy™ product range, where electrodes are implanted in the epidural space to stimulate the lower portion of the spinal cord that, because of the injury, fails to communicate properly with the brain. By stimulating these lower nerves, the system aims to help restore and optimize their functioning in connection with the brain. ONWARD indicated that, for its ARC-IM product, one study demonstrated the ability of long-paralyzed people to stand and walk again with little or no assistance.

ONWARD’s products have been granted Breakthrough Device Designation status for a range of indications such as improving upper and lower limb function; bladder control and blood pressure regulation; and alleviation of spasticity in patients with such injuries [8].

Also in ONWARD’s pipeline is a plan to integrate an implanted Brain Computer Interface (BCI) which senses the patient’s brain signals relating to the intent of leg/joint movement. In turn, these signals are wirelessly sent to its spinal cord stimulator which can activate nerves which are poorly connected to the brain due to injury. This aims to create a “digital bridge” between the brain and poorly connected nerves to enable and improve the patient’s walking ability. Much research and iteration is anticipated; however, this ambition is indicative of how neuromodulation can be innovative and transformational to people’s lives.

 

The road ahead for neuromodulation

The above examples only skim the surface of emerging therapies; neurostimulation, neuro-adaptive therapies and BCI technologies are attracting significant research and investment to create new therapies by leveraging the body’s physiological pathways.

We foresee continued progress in materials science, engineering, device design and biomedical research into neuro-physiological understanding of the human body to fuel the foundations for new, highly functional and patient-centered neuromodulation platforms.

We also foresee exciting developments in how targeting different nerves can potentially tackle similar medical conditions while the same nerve can be used to address various indications.

In our next article, we will explore the varied technology drivers and their considerations that are leading to the creation of new, innovative neuromodulation implants.


References 
  1. Strategic Market Research, neuromodulation devices market report: https://www.strategicmarketresearch.com/market-report/neuromodulation-devices-market (accessed 12/07/2023)
  2. DOI: 10.1186/1471-2490-13-61
  3. DOI: 10.2147/RRU.S231954
  4. Clinical study results of the BlueWind system for patients with overactive bladder, featured at the 2023 AUA Annual Meeting: https://www.prnewswire.com/news-releases/clinical-study-results-of-the-bluewind-system-for-patients-with-overactive-bladder-featured-at-the-2023-aua-annual-meeting-301811486.html
  5. Evaluation of Implantable Tibial Neuromodulation Pivotal Study: https://classic.clinicaltrials.gov/ct2/show/NCT05226286
  6. DOI: 10.1016/j.xjtc.2022.03.007
  7. DOI: 10.2147/JIR.S163248
  8. ONWARD website: https://www.onwd.com/ (accessed 12/07/2023)

How data and AI are changing bioprocessing – and why it’s needed

After numerous insightful talks and engaging conversations with industry leaders at this year’s BioProcess International, the key theme was clear: data, data and more data. 

Data has always been important, but now it is being collected to model current processes, understand how they work, and improve them. This trend is only likely to accelerate as AI becomes part of everyday life – both in and outside of work.

Using data-based modeling to optimize well-established industrial processes 

There are many traditional processes that are used in the manufacture of antibodies, mRNA vaccines and cellular therapies. Companies are now collecting extensive data from these processes and using modeling to create their ‘digital twin’. 

The processes modeled range from relatively simple tasks such as optimization of freezing/thawing product intermediates, freeze-drying and automated buffer preparation, to more complex procedures such as bioreactor scale-up. Although these used to be manual ‘craft’ processes run by a combination of experience and pre-existing data, there is now a trend for them to be tested and optimized using in silico methods.

Using modeling to improve purification methods

Bioprocessing is used to create many therapeutic products, from molecules such as protein, DNA and RNA to much larger entities such as viruses and eukaryotic cells. Their production has many different steps that often require extensive purification before the next step can proceed. Common purification methods include clarification, chromatography, ultrafiltration/diafiltration and sterile filtration. 

These methods have typically been applied empirically, based on experience with similar products. Now, however, modeling has led to a much more detailed understanding of how these separation and purification methods work. It allows prediction of when column or membrane capacity will be reached, and when “breakthrough” of contaminants is likely to occur. It has also led to the development of alternatives to standard resin-based column chromatography, such as the incorporation of new reactive chemical groups on membrane filters so that they can act like traditional resin-based columns.
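As a toy illustration of the kind of prediction such models enable, a breakthrough curve is often approximated with a simple sigmoid. The sketch below is our own simplification, not any specific vendor’s model, and the column parameters are invented for the example:

```python
import math

def breakthrough_fraction(v: float, v50: float, k: float) -> float:
    """Logistic approximation of a breakthrough curve: effluent
    concentration as a fraction of feed (C/C0) versus loaded volume v."""
    return 1.0 / (1.0 + math.exp(-k * (v - v50)))

def volume_at_breakthrough(v50: float, k: float, threshold: float = 0.1) -> float:
    """Invert the curve: loaded volume at which C/C0 reaches `threshold`."""
    return v50 + math.log(threshold / (1.0 - threshold)) / k

# e.g. a hypothetical column with 50% breakthrough at 120 L and steepness 0.2 per litre
v_bt = volume_at_breakthrough(120.0, 0.2, threshold=0.1)
```

In practice the curve parameters would be fitted from process data, which is exactly where the digital-twin approach described above earns its keep.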

Benefits of Process Analytical Technology (PAT)

PAT refers to on-line/at-line measurement of critical product quality and performance attributes so that real-time direct data collection can be used to control and optimize manufacturing processes.

PAT is being augmented by a much wider range of analytical techniques than before and now includes many different types of spectroscopy including variable path length, Fourier-transform infrared, Raman and Dynamic Light Scattering, as well as Nuclear Magnetic Resonance. The use of PAT for direct data collection that links to immediate process control is only likely to accelerate.

Inexorable rise of disposable closed cell processing systems

In addition to the data theme, it was clear to see that the number of automated closed cell handling and processing systems – from cell selection to expansion and harvesting – is rapidly increasing. Companies aim to offer end-to-end solutions to traditionally manual processes, either by offering modular components or a single complete system. 

The options for choosing automated disposable bioreactors/cell expansion systems are also increasing, with many players recently entering the market. It is clear why this option is advantageous; traditional stainless-steel bioreactors are complex, expensive, and laborious to clean and maintain.

Just how large these systems can grow is shown by Thermo Fisher’s 5,000 L disposable DynaDrive bioreactor, which is offered as a fast-to-install option compared to stainless-steel alternatives. However, the environmental impact of the disposable route is a long-term concern and is expected to be a point of contentious discussion over the coming years.

Bioprocessing technology is developing (but not fast enough for demand) 

The technological developments described above are certainly needed, as advances in eukaryotic culturing methods are allowing higher and higher cell densities to be realized, which makes purification more challenging. Furthermore, the pipeline of products that use these technologies is growing at a dizzying rate, with over 1,500 cell and gene therapy trials and 700 mRNA trials listed on ClinicalTrials.gov. New higher-throughput processing techniques will need to be developed to accommodate this demand. 

The industry clearly recognizes this and companies were very open in sharing their results at BioProcess International – both good and bad! They are also keen to work with the process equipment manufacturers to optimize performance. Overall, improvements have been made, but there is a long way to go. 

Performance can be improved by a virtuous circle of data generation, data modeling and innovative design and engineering – something we at CDP are already doing to help our clients succeed.

If you would like to see how innovative engineering and automation could help increase bioprocessing throughput, please get in touch.



The product owner’s perspective: Five practical insights to accelerating innovation with GenAI

Here at CDP, we’ve delivered a range of Generative AI (GenAI) projects that use Large Language Models (LLMs). Each has been a journey of discovery, and sometimes frustration. But ultimately each has reinforced the potential for GenAI to dramatically accelerate innovation.

In an attempt to provide a useful contribution that cuts through the noise, we’ve distilled our learnings into a four-part series on how businesses, data scientists and product owners can leverage GenAI for success, with a final perspective from a GenAI-powered chatbot.

In this second article, we draw on our experience of implementing GenAI from a product owner’s perspective. For a high-level view of LLMs, check out Part 1; for a deeper dive into the technology from a data scientist’s perspective, check out Part 3.

1. Start at the end and work backwards

As with all truly transformative innovation, start by understanding what you are offering your users and work back from them. Ignore the undoubted magic of the technology at this stage – you can rely on that coming later.

You will need to set your success criteria, and this is where to start. Delighting your user base and measuring how they will benefit will do more to drive adoption than any shiny AI tech that might be going on behind the scenes.


Choose your project carefully.

  • Choose an area that you already know well or for which you have a good way of measuring success. This will ensure you see beyond the magic of the black box and can truly judge the performance and value that LLMs bring.
  • Choose an area where LLMs play to their strengths by taking advantage of at least one of the core competencies they have been shown to do well: summarization, expansion, inference and analysis.

2. Don’t forget the basics

Make good use of Service Design techniques to define what success looks like. Map the User Journey and spend time defining the touchpoints and modelling the semantic information architecture.

And then strip it back. Cut away absolutely everything that isn’t vital to the successful outcome you plan for. Don’t let the designers loose until this is done. Any investigative work with the technology up to this point should be considered exploratory and almost certainly archived.

You’ll then have a clear set of priorities, requirements, information flows and use-cases that everyone understands, and everyone can support. The whole team will be clear about what they are aiming for. Keeping their eye focussed on the prize makes the Product Owner’s primary catch-phrases more effective: “No that is not in scope” and “This is lower priority”.

And if this is starting to sound like the start of any solid digital project – good, it should.

3. Experiment

Give your team as much time as possible to try things out. Build the time into the plan and break the experiments down into small and well-defined steps to learn and iterate.


Look to experiment with the following:

  • How the structure of prompts changes the output.
  • How the different LLMs compare when asked to respond to the same prompt.
  • How to extend the LLMs with your own data, whether through further training or by embedding your own information.

Aim to build the experimental steps around the core competencies of Generative AI. And later, bring these together to form an overall solution using your favourite AI automation tool chain.


There will be surprises. There will be frustrations. And there will be changes in the way that you approach the use of the LLMs. Don’t be afraid to pivot on how you use the technology; or indeed ‘if’ you use the technology. But remember the basics, keep your eye on what success looks like and don’t let the team get carried away with ‘shiny object syndrome’.

4. Get lots of feedback

While using AI, remember to share your work with real humans as early as possible: people outside your team who can give you useful feedback. Set up demos within the team to share learnings and put on regular show-and-tell sessions with your target audience. And, as soon as possible, let them try it out – on their own, without you there. They will learn to see beyond the magic, and you will quickly find out what works and what doesn’t.

Your priorities will change – but the fundamental definition of success won’t (hopefully). And don’t forget the importance of plain old testing. The outputs from an LLM can vary widely with even the smallest changes in training data and prompts. Fortunately, LLMs can come to the rescue here – they are great at evaluating the output of other models through peer review. Use that capability to help you test. It is also worth building into the architecture of your solution: where you have the resources, double up the LLMs to interact and increase the quality of output in a production system.
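As a sketch of the peer-review idea, the snippet below wires one model call to critique another’s output. `call_llm` is a hypothetical stand-in for whichever LLM client you actually use; here it is stubbed with a canned response so the control flow is runnable end to end:

```python
# Sketch of LLM peer review: one model scores another model's answer.
# `call_llm` is a placeholder, not a real library function.

def call_llm(prompt: str) -> str:
    return "score: 4 - answer is accurate and concise"  # stubbed response

def peer_review(question: str, candidate_answer: str) -> str:
    """Ask a reviewing model to rate another model's answer."""
    review_prompt = (
        "You are reviewing another model's answer.\n"
        f"Question: {question}\n"
        f"Answer: {candidate_answer}\n"
        "Rate the answer 1-5 for accuracy, formatted as 'score: N - feedback'."
    )
    return call_llm(review_prompt)

review = peer_review("What is an LLM?", "A large language model is ...")
# Parse the structured score out of the reviewer's reply
score = int(review.split("score:")[1].split("-")[0].strip())
```

Forcing the reviewer into a fixed output format, as the prompt does here, is what makes the evaluation cheap to automate and plug into your test suite.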

5. Don’t underestimate the time you need


Don’t underestimate the time it will take to gather, prepare and refine your data. When it comes to data, quality and variety are just as important as quantity. With demographic information, a good distribution of variety is vital to represent your users truly and ethically. And don’t forget to set aside at least 10% of the data, randomly selected, as a held-out test set so that you can properly evaluate the results.
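Holding out that test fraction can be as simple as a seeded random split. This short sketch is generic Python, not tied to any particular toolchain, and the fixed seed keeps the split reproducible between runs:

```python
import random

def train_test_split(records: list, test_fraction: float = 0.1, seed: int = 42):
    """Randomly set aside a held-out test fraction (10% by default)."""
    rng = random.Random(seed)  # fixed seed keeps the split reproducible
    shuffled = records[:]      # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    n_test = max(1, int(len(shuffled) * test_fraction))
    return shuffled[n_test:], shuffled[:n_test]

train_set, test_set = train_test_split(list(range(100)))
```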


To save time and increase the training and test data available, explore opportunities to synthesise data to add to your original data set. Also, don’t underestimate the time it will take to test and refine the prompts and LLM settings to achieve the repeatable outcomes you are looking for. Prompt engineering is an art as well as a skill, and it takes time to learn.


Finally, know when to stop. It will always be possible to make it a bit better. Be clear about what is good enough and recognise when you get there. The impulse for the team to keep tweaking will never end – it’s simply too absorbing.

Interested in exploring how GenAI can accelerate your innovation?

Come and join us in Cambridge, UK, and Raleigh, NC, where we’ll be running a series of in-person workshops to help clients identify the opportunities (and threats) of GenAI and plan a path to accelerate their innovation.


Trends in respiratory therapies: why pMDIs hang in the balance of new technology

In May 2023, RDD Europe returned to a real-world conference after years of pandemic-enforced online-only presence. The location was spectacular – Antibes on the Côte d’Azur – with the sparkling Mediterranean Sea providing welcome relief from a dismal British spring.

The industry was well represented by device technology companies, CMOs, academics and pharma companies, and the presentations and workshops provided an engaging blend of research and practical advice.

Even though much of my time over the past ten years has been focused on parenteral device development, my career in combination products started in respiratory devices, working on a variety of dry powder inhaler (DPI) and pressurised metered dose inhaler (pMDI) devices, including the GSK Ellipta inhaler. This year at RDD, as I returned to my roots in this industry, three main themes struck me: preparing for the pMDI cliff edge; moving beyond traditional respiratory diseases; and implementing particle engineering for targeted treatment.

There were also two notable omissions: users and connectivity. More on those later.

The shift in pMDIs from using HFC propellants towards gases with a lower global warming potential (GWP) has gained momentum, with California imposing a ban on the sale and distribution of R227ea from the end of 2030, and R134a from the end of 2032, including for medical use. This means the end of the line for the sale of all current pMDI products in California, with other jurisdictions likely to follow suit as the world tries to move to a more sustainable solution.

The transition needs formulators, device designers, scientists, and other disciplines to collaborate to solve the challenges presented by the different physical properties of the new gases. Different thermodynamic and fluid dynamic properties can dramatically alter the plume geometry, droplet size and particle velocity, requiring careful redesign of the fluid pathways to compensate for the differences. These challenges were outlined in evidence presented by Recipharm (1), Proveris and Koura (2), and Healthy Airways LLC (3).

At Cambridge Design Partnership, we are receiving far fewer enquiries for pMDI products than for DPIs and soft-mist inhalers. Obviously, an n=1 sample carries little statistical weight, but it reflects a general sentiment among clients to focus future developments away from pMDI platforms.

Asthma and COPD remain the biggest drivers in device and formulation development, in much the same way that diabetes treatment has driven pen injector development. Two drivers that our drug delivery team has seen pushing device design in the respiratory and inhalation market are the need for home treatment, rather than hospital-centered treatment, and platforms for biological drugs. The other significant driver is vaccines that are stable at higher temperatures and can be delivered without leaving behind copious volumes of blood-contaminated medical waste.

One challenge that comes with these new treatment regimens, beyond formulating drugs that will be stable in powder form, is getting the drug to the correct part of the body and making sure it remains present long enough to be effective. One paper from UCL and the University of Hong Kong (4) highlighted a promising approach to developing therapeutic antibodies against future SARS outbreaks. Some of these developments also require higher dose payloads, or API-only formulations; this presents a substantial challenge to device designers to make sure that the inhalation capabilities of different patient groups can achieve the required dose efficiency.

Aptar and Recipharm also shared their own device innovations, presenting novel spray and soft-mist technologies based on a syringe primary container. Targeting rapid treatment to the brain via the olfactory route is a much-neglected treatment option, in part due to the challenges of achieving consistent behavior across users. At Cambridge Design Partnership, we’ve been working with a pioneering device company looking to exploit this pathway, and my colleague, Clare Beddoes, will be presenting information on this device development at PODD in October.

In addition to the paper from UCL (4), particle engineering to target specific areas in the respiratory and nasal pathway was a topic that several posters and presentations addressed directly. Building on standard jet milling techniques, a paper from Aston University explained how isothermal dry particle coating (iDPC) can be used to create more potent formulations without increasing the volume of powder inhaled by the user (5). A third paper from Hovione and two Portuguese institutions focused on the characterization of different particle manufacturing techniques and how they affect deposition in nasal passages (6).

Closing the gap between early-stage in vitro and in silico models and later-stage in vivo performance continues to receive a lot of attention. As the cost of computing power continues to fall, going into preclinical or clinical trials with greater confidence will accelerate time to market and reduce the cost burden on pharma companies pursuing novel treatments.

Two areas of development that received relatively little focus at the conference were human factors engineering (HFE) and connectivity – two concerns that are the subject of a great deal of effort in the parenteral sector. Recipharm presented a poster on the HFE advantages of their novel unit dose nasal spray when compared to a reference device (which bore a striking resemblance to an Aptar Unidose Liquid Nasal Spray). Research institution Solvias presented a paper showing how training users can lead to worse outcomes due to a misperception of expertise with the device (7). This counterintuitive result demonstrated that patients given limited one-to-one training with a Handihaler showed more errors in use than patients who only had access to the device and IFU.

While these insights were welcome, our in-house team knows that patients continue to struggle to use inhalers reliably and consistently, leaving even the most effective drug products showing variable results.

These challenges for patient use are also being seen in the parenteral market, which is why we are working so closely with our clients to find better ways to train patients and leverage connectivity to improve adherence to medication regimens. These connectivity solutions are often in direct conflict with cost and sustainability targets and finding a route to square this circle is a challenge with which CDP’s designers and engineers are actively engaging.

See you in Tucson?

RDD 2023 was the first RDD conference I have attended. It was great to reconnect with former colleagues and make new connections across the industry. The conference was very well run, and the standard of papers and presentations ensured there was plenty of fascinating material for industry and academia to engage with. I’ve already blocked out my diary for RDD 2024 in Tucson and I look forward to seeing you there.


References
  1. Albuterol Sulfate Metered Dose Inhaler Feasibility Using an Environment Friendly Propellant HFA152a and Novel Valves (Lei Mao, Sheryl Johnson, Nischal Pant, James Murray, Donald Ellis, Benjamin Zechinati, Johnathan Carr and Victoria Cruttenden)
  2. Comparison of Spray Characteristics of P-134a and Low GWP P-152a pMDIs With and Without Ethanol (Lynn Jordan, Sheryl Johnson, Ramesh Chand, Grant Thurston, Deborah Jones, Vanessa Webster and Sally Stanford)
  3. Accelerated Development of MDIs with Low GWP Propellants in a QbD Era: Practical, Regulatory and Scientific Considerations (Healthy Airways LLC and First Flight Pharma LLC)
  4. Inhaled Antibody Therapies: Enabling Prophylactic Protection against SARS-CoV-2 Infection with a Dual Targeting Powder Formulation (Han Song Saw and Jenny Ka-Wing Lam)
  5. Use of Isothermal Dry Particle Coating (iDPC) for the Development of High Dose Dry Powder Inhalers (Jasdip S. Koner, David A. Wyatt, Amandip S. Gill, Shital Lungare, Rhys Jones and Afzal R. Mohammed)
  6. Benchmarking of Particle Engineering Strategies for Nasal Powder Delivery: Characterization of Nasal Deposition Using the Alberta Idealized Nasal Inlet (Patricia Henriques, Cláudia Costa, António Serôdio, Ana Fortuna, and Slavomíra Doktorovová)
  7. Effect of Capsule-Based Dry Powder Inhaler User Training on In Vitro Performance (Oleksandra Troshyna and Yannick Baschung)



Care tech: exploring the latest trends in dementia care

We are witnessing important advances in the treatment of the most common cause of dementia, Alzheimer’s disease, most notably the emergence of disease-modifying therapeutics. And this trend is only set to continue, with new innovations and technologies promising to help slow the progression of this devastating disease.

However, patients who do not yet have access to these treatments or are at a more advanced stage of the disease will continue to require significant care support. The caregiving sector is already under significant pressure due to the increasing demand for long-term care within aging populations [1]. As the disease progresses, family members, including elderly spouses, often become the main caregivers – but they may be left poorly equipped to do this without the right support.

With the cost of dementia care running to £32,250 per person per annum [2], technology innovators are finding new ways to make resources go further and give dementia patients independence for longer – providing reassurance to the caregiver and peace of mind to family members.

The challenge lies in making these solutions accessible to caregivers and usable for patients. In this article, we take a deep dive into the technologies available to support dementia care and explore emerging trends that are transforming the landscape by using the right technology at the right time.


Alzheimer’s disease is a progressive and irreversible neurodegenerative condition that primarily affects the cognitive functions of the brain, particularly memory, thinking and behavior. It is the most common cause of dementia, a broader term for a set of symptoms that impact a person’s ability to live independently.

In the UK, it is estimated that more than 900,000 people live with dementia, and this is projected to double by 2040 [3]. Of the people diagnosed, up to a third live alone [4]. With the aging population outpacing the rate of training and recruiting caregivers, the already significant caregiver shortage is set to increase [5].

Meanwhile, family members are taking on caregiver responsibilities, often with unsustainable and distressing consequences. This is in part because every patient journey is different and the rate of their disease progression can vary widely. Some patients may require discreet support at the early stages of the disease, while others may require constant care. Knowing when and how to intervene to provide the care support needed is crucial.

The care sector is increasingly looking to technology to maximize the impact of the professional and informal caregiver workforce. There is an increasing recognition that caregivers require ongoing support to make their role more manageable, especially following the pandemic.

Assistive technologies rarely exist in isolation. In fact, it is often the combination of these technologies that yields the best results. Here are some of the technologies available to support independent living and managing disease progression.

Personal alarms and safety tracking

Alarms and tracking technologies allow people to call for help if they need it – wherever they are – as well as providing peace of mind for caregivers and family members when they are not there. They are simple to use and can help patients stay independent for longer.


Location. GPS trackers such as Mindme, Ubeequee, and Angelsense consist of battery-powered or rechargeable wearables that connect to a 24/7 monitoring support center to alert family members and emergency services if a vulnerable adult is outside designated safe zones. Direct-to-consumer devices, such as Medpage, work similarly, but the information links directly to family members, and the devices may not have predefined safety zones or raise an alarm. Connectivity is based on broadband and subject to subscription charges.

Alarms and calls. Technologies such as Tunstall’s MyAmie, Oysta, and Saga’s SOS allow patients to raise an alarm for relatives, caregivers or emergency services at the press of a single button. These technologies often come in the form of a pendant worn around the house, connected to a hub via a radio signal. The patient can also use the hub itself to raise an alarm, but the pendant must be within range of the hub for it to work. Other technologies work similarly to GPS trackers and can rely on broadband for wider network reach; these often also incorporate fall detection and GPS.

Fall detection. Wearables such as Buddi, Telecare, and Careline are designed specifically for dementia care. These use inertial measurement units, gyroscopes, and pressure sensors to detect falls and automatically send messages to caregivers, family members, and first responders. These devices are often accompanied by an alarm button for the user and GPS tracking. Many of these technologies can also be connected to a 24/7 monitoring support team.

Reminders and medication adherence. There are a variety of technologies in this category which allow caregivers to set reminders for patients to take medication, drink water, eat, or remember appointments and social events. Memory aid kits include the MemRabel care alarm clock with a large screen, which connects to a Pivotell Vibratime rechargeable wristwatch that vibrates for reminders. Reminders can be delivered in photo, video or audio format.

The challenge many of these technologies face is that they depend on a caregiver to ensure the patient remembers to engage with and wear the device, charge it when necessary, and crucially, press the button if in distress. In the case of some technologies, they must also be within reach of a hub.

These technologies are good for the early stages of the disease, but as cognitive decline continues, patients will rely more on caregivers to support them, thus limiting their advantages.

In other words, the longevity of these technologies can become incompatible with the patient’s journey, and this is one of the key hurdles to consider when designing and adopting technology in dementia care.

Remote monitoring

This is a fast-growing area for dementia care. Remote monitoring technologies share information on the patient’s daily living patterns with caregivers and family members. The purpose is to provide peace of mind to family members and enable caregivers to make informed care decisions in the short and long term.

Common functions include:

  • Movement monitoring. Generally delivered by several passive infrared (PIR) sensors installed around the house, and pressure mats in beds and sofas, connected to a hub.
  • House occupancy. Sensors on external doors to monitor whether an individual has left the house.
  • Appliance usage. Monitored by connected sensors placed between the mains socket and the appliance plug.
  • Fall detection. Cameras or mmWave radar sensors to detect when an individual has had a fall, without the need for a wearable.

Many of these functions can be delivered by single systems, e.g. Taking Care Home Alert, with the more sophisticated fall detection systems generally targeted at professional care provider users, e.g. Hikvision and Vayyar Care.

It is also common for families to create their own solutions, especially when they feel no existing single solution works for them. This includes the use of consumer tech, such as smartphones, video doorbells, smart home speakers, and cameras around the house. Video doorbells, for example, can be valuable in preventing scams, while smart home speakers can set reminders, automate house functions, or call a relative. However, the use of cameras around the house does pose privacy concerns which need to be considered.

Although the overall objective is to monitor daily independent living, the information often requires interpretation by the caregiver. This can often be facilitated through a dashboard, although the information can be disjointed, and assessment of patterns may not be clear-cut.

Innovator Matt Ash from Supersense Technologies, however, believes we can do more to obtain valuable insights and monitor disease progression efficiently and noninvasively.

“There is a real need for technologies that support caregivers in their role and provide them with the confidence to take a break, knowing their loved one is safe. Though there are some credible assistive technologies out there, the unique needs of families living with dementia are not well served. Projects like the Longitude Prize on Dementia are investing in radical thinking to generate solutions with families living with dementia.”

Talking about some of the latest advancements being tested, Ash continues:

“Everyone’s journey with dementia is different. Right now, we are working on leveraging recent consumer developments in sensor technology, machine learning, and user experience to create personalized assistive systems that can evolve with the needs of an individual with dementia and their caregivers. It’s an incredible opportunity to provide the community with supporting technologies that serve their needs.”

If we want to empower those with dementia to live independently, maximize the impact of caregivers, and provide peace of mind to family members, we must enable the right type of intervention at the right time. Someone with early Alzheimer’s disease may feel overwhelmed or suspicious of new technology, while a person in later stages may be too vulnerable to learn how to use it.

The future of dementia care will center around collecting the right data and extracting the right insights from it to enable better care choices. By allowing technology to provide information on the rate of disease progression for a particular patient, we can start building a profile of care by recognizing changes in patterns against a baseline. Emerging technologies such as remote monitoring platforms can support this and guide the longevity of other technological interventions to ensure that they align with the individual patient’s journey. At the heart of these technologies, privacy must be a top priority, which may include the use of AI and other methods to allow patterns to be recognized quickly and with minimal need for human intervention.
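One simple form of that baseline comparison is to flag a day’s activity count when it drifts well outside the historical spread. The sketch below is illustrative only – the two-sigma threshold and the kettle-use example are assumptions for demonstration, not a clinical rule or a description of any particular product.

```python
from statistics import mean, stdev

def deviates_from_baseline(history, today, threshold=2.0):
    """Flag today's activity count if it sits more than `threshold`
    standard deviations away from the historical baseline."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:                     # perfectly flat history
        return today != mu
    return abs(today - mu) / sigma > threshold

# Illustrative example: two weeks of daily kettle-use counts,
# then a day with no activity at all.
baseline = [4, 5, 4, 6, 5, 4, 5, 5, 4, 6, 5, 4, 5, 5]
```

A flat day (zero kettle uses) would be flagged for a caregiver to check, while a normal day would pass silently – the kind of low-touch signal that respects privacy while still catching a change in routine.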

We are entering a new era of therapeutics for Alzheimer’s disease, but there is still much to do, particularly in care. Although the use of technology can ultimately support patients, caregivers and family members, it is often incompatible with the individual’s stage of the disease, or inaccessible to caregivers. But as new technologies emerge, data and AI can unlock new insights to support a personalized care plan tailored to each patient’s individual needs – allowing caregivers and families to provide the best care at the right time.


References
  1. Skills for Care, Workforce Intelligence. The size and structure of the adult social care sector and workforce in England. Technical report, 2023.
  2. Alzheimer’s Society, How much does dementia care cost? https://www.alzheimers.org.uk/blog/how-much-does-dementia-care-cost
  3. Raphael Wittenberg, Bo Hu, Luis Barraza-Araiza and Amritpal Rehill. Projections of older people with dementia and costs of dementia care in the United Kingdom, 2019–2040. Technical report, Care Policy and Evaluation Centre, London School of Economics and Political Science, 2019.
  4. Claudia Miranda-Castillo, Bob Woods and Martin Orrell. People with dementia living alone: what are their needs and what kind of support are they receiving? International Psychogeriatrics, 2010.
  5. Skills for Care, Workforce Intelligence. The size and structure of the adult social care sector and workforce in England. Technical report, 2023.



Key trends in respiratory drug delivery

It was great to be back in person for the Drug Delivery to the Lungs conference in Edinburgh recently. Here, we share insights on three major themes from the event and a trend we think will reshape the future of respiratory drug delivery in the next 10-20 years.

Sustainable pMDIs

The shift in pMDIs from using HFC propellants towards less polluting gases has gained momentum, with California imposing a ban on the sale and distribution of R227ea from the end of 2030 and R134a from the end of 2032, including for medical use. This marks the end of the line for the sale of all current pMDI products in California.

The transition needs formulators, device designers, scientists, and other disciplines to collaborate to solve the challenges presented by the different physical properties of the new gases. The assessment of all types of inhalers from a sustainability perspective has advanced, too, with life cycle analysis (LCA) and carbon credits schemes being discussed – our sustainability team provides reviews and recommendations for a range of medical devices to help our clients improve their devices and provide evidence to back up their green credentials.

Usability for adherence

Time and again, studies show that it’s challenging to measure asthma and COPD patients’ adherence to their medication. Adherence also appears much lower than for other diseases – estimates range from 22% to 78%, compared to around 70% for diabetes.

Low adherence needs to be addressed by making devices easier to use and tailoring them to the patient’s needs. Reducing user steps is key to making the device easier to use, but patient feedback and tailoring to specific needs are necessary, too – something connected inhalers could help solve through digital reminders appropriate to the patient’s needs. This is just one of the ways that CDP Mosaic, our digital ecosystem catalyst, can be used. Independently verifying that increased adherence is due to connected or smart inhalers remains difficult – something the industry is investigating.

Modelling of drug delivery

Several talks at this year’s event covered modelling, with in-silico methods advancing in capability and popularity over the last 10 years. Topics included constructing a full airway model to assess drug deposition under different breathing profiles, and applying mathematical analysis to physiological signals to detect disease and drug-induced changes. Posters demonstrated an even wider range of possible models, including our own.

Our modelling and simulation teams produce models for clients that highlight potential robustness issues with mechanical components and digital sensing techniques at early stages to determine suitable technologies for medical devices.

Learning from the past, looking to the future

Federico Lavorini, Professor and Consultant in Respiratory Medicine at the Department of Clinical and Experimental Medicine, Careggi University Hospital, Florence, Italy, gave an excellent summary of drug delivery over the last 100 years, including innovations where design has reduced user error.

Further talks considered what pharma could learn from other sectors, especially as we move from ‘sick care’ to ‘health care’ – where technology identifies and treats conditions before they become symptomatic. Our Drug Delivery and Insight & Strategy teams work closely together to understand upcoming trends and draw on insights into consumer expectations from the consumer and digital sectors for our clients.

Biologic treatments are coming to respiratory drug delivery and are likely to use soft-mist inhalers (SMIs) and dry powder inhalers (DPIs), with current trends leaning heavily on DPIs. This is likely to lead to the development of new, higher-performance DPIs to provide the best efficiency in delivering these high-cost treatments to the patient. We have dramatically increased the performance of DPI engines for our clients through our science-based approach to increasing fine particle fraction for their devices.

How we can help

Our team is experienced in all stages of drug delivery device development, across a wide range of scenarios and applications in the medical industry. Here at CDP, we have these specialists under one roof to partner with you to bring your device to market – and we can also draw on the learnings of our colleagues in consumer markets to guide you on relevant future consumer expectations.