Golden Age of Molecular Diagnostics

Recently, the New York Times ran an article on Page 1, above the fold, titled “Suddenly, It Looks Like We’re in a Golden Age for Medicine.” As the piece describes, the explosion of new biotech technologies, from CRISPR to mRNA, is producing a continuous stream of new molecules targeted at an ever-expanding range of human diseases.


While it is inarguable that the new biologics will revolutionize medicine, including the eventual arrival of long-heralded ‘personalized medicine’, it also invites the question: are we entering the Golden Age of molecular diagnostics? It certainly seems so.


It is no surprise, then, that the molecular diagnostics sector is projected to grow at more than an 11% CAGR, from $16B to more than $28B over the next seven years. The opportunity is driven not only by the rapidly expanding set of molecules for which to test, but also by the equally rapidly changing expectations and needs of diagnostics labs and healthcare-delivery environments.


The combination of changing technologies, such as biosensors, microfluidics, bioanalytical platforms and improved lab-on-a-chip systems, along with the desire of healthcare delivery centers to capture more revenue, is driving large-scale adoption of Point-of-Care diagnostics (POCD) systems, as well as continuous-monitoring and remote (home) testing variations.


In the diagnostic labs, wholesale adoption of digitization, robot-based automation and, now, AI-assisted diagnostics are forcing change on the clinical chemistry industry – and fast. One of the most important aspects of these large, centralized testing systems will be their flexibility regarding incorporation of new testing protocols. Genomic sequencing continues to evolve rapidly, improving in all directions – accuracy, speed and cost. Anticipating and designing for the continued evolution of products and platforms will likely be the difference between success and failure.


In the POCD market, the ability to adjust to evolving tests will also be a major driver, as will the ability to design highly accurate diagnostic devices at a much lower selling price. The move out of central labs will require improved ease of use for less-trained personnel, as well as new techniques for determining the operational condition of the equipment and its maintenance needs.


For both markets, full integration with sophisticated cloud-based backends will require new approaches to meet the needs of distributed medical architectures.


Overlaying all of these new technologies is a rapidly evolving regulatory environment covering software as a medical device (SaMD), cybersecurity in medical devices and AI in medical devices, all of which will be in effect by the end of 2023.


What does this really mean?


As the famous business consultant Michael Porter said, “Change brings opportunity.” OEMs of clinical and molecular diagnostics systems will need to change how their products are conceptualized, architected, developed and supported. In an interesting way, however, the FDA’s new guidance can be extremely helpful in bringing focus to some of the most impactful areas of product development. Each of the following areas has been part of a strong software development life cycle (SDLC) process for quite some time but has not been broadly adopted in the medical device industry. We are now entering a period in which failing to handle these areas properly will make fielding a successful diagnostic system nearly impossible.


Architecture, Architecture, Architecture

Anyone who has built a house knows that it requires a well-thought-out architecture: the overall plan that defines the major trade-offs. Tall or wide building; kitchen on the first or second floor; spread-out bedrooms or all at one end; attached or detached garage; conventional or renewable HVAC. Before a design process begins, it is critical to have answered the high-level questions about how the house will be used, whether it will be expanded and what the utility infrastructure will look like.


Complex software systems are the same… but more so. The new guidance on SaMD directly reinforces previous FDA recommendations to segment system architectures so as to separate high- and low-risk components of the software controls (incorporated in IEC 62304, “Medical Device Software — Software Life Cycle Processes”). Proper risk segmentation does not happen by accident; it requires forethought about the overall system architecture. Done properly, it will not only improve the safety and reliability of the resulting product but can also significantly reduce development and testing time. It can likewise lower the time, effort and costs associated with 510(k) submittals by reducing the amount of software that must be documented at the higher risk levels.
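That segmentation principle can be sketched in code. The example below is a hypothetical illustration, not an implementation of IEC 62304 itself: a safety-critical assay controller (the higher-risk software item) sits behind a deliberately narrow interface, and a lower-risk reporting component depends only on that interface, never on the control internals. All class names, fields and values are invented.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass(frozen=True)
class AssayResult:
    """Immutable data crossing the risk boundary."""
    sample_id: str
    analyte: str
    concentration: float  # e.g. ng/mL
    flagged: bool         # out-of-range indicator set on the control side


class AssayController(ABC):
    """The narrow boundary between high- and low-risk software items."""
    @abstractmethod
    def run_assay(self, sample_id: str) -> AssayResult: ...


class PcrController(AssayController):
    """Stand-in for the safety-critical control implementation."""
    def run_assay(self, sample_id: str) -> AssayResult:
        concentration = 42.0  # placeholder for a real instrument readout
        return AssayResult(sample_id, "target-RNA", concentration,
                           flagged=concentration > 40.0)


class ResultReporter:
    """Lower-risk item: formats results, contains no control logic."""
    def __init__(self, controller: AssayController):
        self.controller = controller

    def report(self, sample_id: str) -> str:
        r = self.controller.run_assay(sample_id)
        status = "REVIEW" if r.flagged else "OK"
        return f"{r.sample_id}: {r.analyte} {r.concentration:.1f} ng/mL [{status}]"


reporter = ResultReporter(PcrController())
print(reporter.report("S-001"))
```

Because the reporting side only ever sees `AssayController` and `AssayResult`, the documentation and verification burden of the higher risk class can be confined to the controller implementation.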


The new regulations around cybersecurity, among other equally important topics, directly address the requirement to have fully articulated architectural designs, with full tracing of system cyber risks into the architecture trade-offs. The FDA is clear that downstream liability for cyber breaches will reside with the OEM. Cyber threats must be identified through thorough cyber threat models, which distinguish “controlled” from “uncontrolled” risks, and must be properly traced into the system architecture.


Incorporating fast-evolving technology such as AI/ML into slower-changing, long-life products is a unique challenge in and of itself. Treating the isolation and replaceability of any AI engine as an architectural driver will eliminate the risk of being “locked in” in the likely event that early components are surpassed by newer technology. Techniques such as the Architecture Tradeoff Analysis Method (ATAM), developed by the Software Engineering Institute at Carnegie Mellon, provide formalized ways to capture, design for, and trace key system capabilities, functions and interactions across complex distributed systems, resulting in improved product design. Without such formalized methods, critical interplay, operational or security issues are easily missed until they surface in system integration and testing or, worse, after product release. Both are expensive in time and money, and the latter carries the potential for disaster.
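The “isolate and replace” idea for an AI engine amounts to depending on a small, stable interface rather than on any particular model. The sketch below is a hypothetical illustration with invented names and thresholds: two interchangeable scoring engines satisfy the same interface, so the device logic never changes when one is swapped for the other.

```python
from typing import Protocol, Sequence


class AnomalyScorer(Protocol):
    """Stable interface the rest of the device software depends on."""
    def score(self, signal: Sequence[float]) -> float: ...


class ThresholdScorer:
    """First-generation, rule-based stand-in for an AI engine."""
    def score(self, signal: Sequence[float]) -> float:
        return max(signal) / 100.0


class MeanShiftScorer:
    """Drop-in replacement: same interface, different method."""
    def score(self, signal: Sequence[float]) -> float:
        mean = sum(signal) / len(signal)
        return abs(mean - 50.0) / 50.0


def triage(signal: Sequence[float], scorer: AnomalyScorer) -> str:
    # Device logic depends only on the interface, not on the engine,
    # so a surpassed engine can be replaced without reworking this code.
    return "escalate" if scorer.score(signal) > 0.5 else "routine"


print(triage([60, 70, 80], ThresholdScorer()))
print(triage([60, 70, 80], MeanShiftScorer()))
```

In a real device the replacement would of course also trigger re-verification; the point of the pattern is that the blast radius of the change is confined to the engine behind the interface.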


Planning for the Life Cycle


A better understanding of the expected product life cycle drives a better system design. Understanding how the product is expected to evolve allows system architects and designers to develop pathways that support anticipated functionality or variations in operational modes. The FDA has helped here: both the new cybersecurity and the new AI regulations add OEM obligations to develop specific plans for post-release field monitoring and product upgrades. For medical devices incorporating AI, the new regulations require the development and submittal of a Predetermined Change Control Plan (PCCP), which defines the anticipated areas of modification of the AI algorithm and how those changes might be made and verified.
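In spirit, a PCCP turns “how might the algorithm change?” into predefined acceptance criteria that a candidate update must meet before deployment. The sketch below is a hypothetical gate of that kind; the metric names and bounds are invented, not taken from any FDA document.

```python
# Hypothetical PCCP-style gate: a retrained model update is accepted
# only if its measured performance falls inside bounds that were
# predefined in the change control plan.
PCCP_BOUNDS = {
    "sensitivity": (0.95, 1.0),  # must not drop below 95%
    "specificity": (0.90, 1.0),
    "auc":         (0.93, 1.0),
}


def within_pccp(metrics: dict[str, float]) -> bool:
    """True only if every predefined metric is inside its bound."""
    return all(lo <= metrics.get(name, -1.0) <= hi
               for name, (lo, hi) in PCCP_BOUNDS.items())


candidate = {"sensitivity": 0.97, "specificity": 0.92, "auc": 0.95}
print("deploy" if within_pccp(candidate) else "hold for full review")
```

The value of writing the bounds down in advance is exactly what the regulation is after: the scope of permissible change, and how it is verified, is fixed before the change happens.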


Once again, this is excellent practice. Roughly 70% of the cost of product software occurs post-release. Thinking through longer-term support allows for better design trade-offs and better field-monitoring capabilities. Coupled with the new requirement to prove that verification capabilities are incorporated in both the design and the post-release plans, this will improve the overall testability of delivered software, thereby reducing the time spent on debugging and rework.


Finally, better tools, better systems


Recognizing that these new diagnostics systems will utilize new technologies is a critical step. Moving from systems with large dependencies on embedded control and limited user interfaces to large, cloud-based, distributed systems utilizing a range of user interfaces represents a very real challenge for development organizations. The software tool environments in the application, cloud and mobile spaces have evolved at a dizzying pace over the past decade.


Many engineering organizations have not had the opportunity to develop experience with these tools, which can both drive up costs through lost productivity and increase the risk of poorly implemented systems. Exacerbating the situation, there is a wide variety of cloud platforms, AI/ML engines, DevOps toolkits and mobile platforms from which to choose.


It is important to avoid selecting tools and technologies based on a limited understanding of the available options, and instead to demonstrably align the selection of technologies and tools with specific system characteristics and goals.




The clinical and molecular diagnostics markets are changing very rapidly, putting increased pressure on OEMs to respond with completely new products based on very different expectations. Successful product deployment will require incorporating a wide variety of new and rapidly changing concepts, technologies, tools and regulations. That is no easy task in the best of times, and under pressure from post-COVID revenue drops, the need to be efficient and effective has never been greater. Taking a lead from the FDA guidance is a good place to start.