Source: genengnews.com
Artificial Intelligence is often defined as the ability of a machine to learn how to solve cognitive problems—a process akin to human intelligence. Within the realm of scientific methodology and laboratory interconnectivity, the most applicable way of thinking about AI is not that it seeks to replicate human reasoning precisely, but rather that it uses human reasoning as a model with the goal of supplementing and augmenting human observation and decision processes.
In this context, an AI algorithm must be trained to interpret the available data against specified criteria so that it can meet business objectives and drive operational decision making. AI refines this process as relevant observations accumulate over time and as the data and business objectives themselves are sharpened, allowing the machine's understanding to approach that of human cognition.
The inherent diversity of equipment and systems involved in science and discovery makes the laboratory particularly well suited to AI, whose constant, cumulative refinement can continuously optimize lab operations.
AI goes beyond human capacity because it never gets bored or tired; it observes constantly and improves relentlessly. Complex classification through pattern recognition is where the technology stands today, and it is rapidly growing toward a centralized lab intelligence system.
AI and laboratory complexity
As scientific discovery becomes more complex and demands on lab throughput increase, AI applied in the lab environment is helping to achieve both scientific and economic outcomes. Labs are already looking to understand system usage, automate quality processes, and optimize throughput and capital spend. Most enterprises are already collecting some form of data not only to drive quality and expand capacity, but also to improve process consistency and plan future resource allocation effectively.
Visionary companies are using this data for both scientific purposes and operational needs to predict output and assess how a process or team is performing. However, the diversity of equipment, coupled with the need for manual analysis of the data collected, means that high-quality data is not consistently available, and whatever data is available is often not helpful for decision-making.
AI, coupled with universal sensing capabilities, enables companies to realize these operational and financial benefits. For example, through high-quality and readily available insights, AI enables simultaneous monitoring of all equipment usage in a lab and holistic tracking of capacity.
Other similarly valuable data-producing capabilities include automatic real-time detection of process bottlenecks, alerting users to issues, and detection of equipment operational anomalies. The outcomes of having such data include a proactive, condition-based maintenance strategy that ensures production commitments are met in a timely fashion.
In this article we will focus on the use of AI for operational applications, specifically employing AI to augment the actionable operational insights available to a lab manager, operator, director, or executive.
Today’s lab comprises an extensive array of instrumentation and countless methods; managing that complexity requires an understanding of how each instrument behaves within a complex matrix of interactions.
The ability to monitor operations and continually provide increasingly sophisticated insights is the core reason for introducing AI into the operational lab environment. Business models for scientific advancement are demanding higher levels of efficiency. Adopting such powerful sources of information will become a necessary component of scientific productivity and is an inevitable next step toward lab management systems so efficient that only AI will be able to produce them.
The hidden language of power
To effectively manage the growing complexity of the lab and achieve a uniform understanding of operations, AI must be fed universal and high-quality data. Relatively simple sensors that collect basic information such as power, humidity, and vibration can provide surprisingly rich insight if correctly interpreted. Collectively, these various sensors can act as a type of nervous system, providing AI with the means to understand instrument status, behavior, and utilization.
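To make this concrete, the minimal sketch below shows one way such heterogeneous readings might be normalized into a single, uniform record stream before any modeling takes place. The class, field names, and example values are purely illustrative assumptions, not a specific vendor's data model.

# A minimal sketch of normalizing heterogeneous sensor streams (power,
# humidity, vibration) into one time-indexed record schema.
# All names here are illustrative, not a real product's API.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    instrument_id: str   # which instrument the sensor is attached to
    sensor_type: str     # "power_w", "humidity_pct", "vibration_g", ...
    timestamp: datetime  # acquisition time (UTC)
    value: float         # one scalar sample; waveforms arrive as many readings

def to_record(reading: SensorReading) -> dict:
    """Flatten a reading so downstream models see one uniform schema."""
    return {
        "instrument_id": reading.instrument_id,
        "sensor_type": reading.sensor_type,
        "t": reading.timestamp.isoformat(),
        "value": reading.value,
    }

# Example: a single power sample from a hypothetical sequencer.
r = SensorReading("seq-01", "power_w", datetime.now(timezone.utc), 312.4)
print(to_record(r))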
For example, as power is drawn by an instrument, the patterns of the current reflect the detailed operation of each component within the instrument. However, this pattern is complex and often incomprehensible using standard signal processing techniques. AI can learn to detect and decipher these subtle variations from complex signal profiles and provide vital insights into instrument activity and condition.
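As a rough illustration of what interpreting a power trace can look like in practice, the following sketch slices a sampled power signal into fixed-length windows and computes a handful of simple time- and frequency-domain features per window. The sampling rate, window length, and feature choices are assumptions made for the example, not the techniques of any particular system.

# A minimal sketch: turn a raw power trace into fixed-length feature windows
# that a model can learn from. Sampling rate, window size, and features are
# illustrative assumptions.
import numpy as np

def window_features(power: np.ndarray, fs: float = 100.0, win_s: float = 5.0):
    """Split a 1-D power trace (watts) into windows and compute simple
    time- and frequency-domain features for each window."""
    n = int(fs * win_s)
    windows = power[: len(power) // n * n].reshape(-1, n)
    feats = []
    for w in windows:
        spectrum = np.abs(np.fft.rfft(w - w.mean()))
        feats.append([
            w.mean(),               # average draw
            w.std(),                # ripple / variability
            w.max() - w.min(),      # peak-to-peak swing
            spectrum[1:].argmax(),  # dominant non-DC frequency bin
        ])
    return np.array(feats)

# Example with synthetic data: a 60 s trace sampled at 100 Hz.
t = np.arange(0, 60, 0.01)
trace = 300 + 20 * np.sin(2 * np.pi * 1.5 * t) + np.random.normal(0, 2, t.size)
X = window_features(trace)
print(X.shape)   # (12, 4): one feature row per 5 s window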
The beauty of using AI to interpret power is that it can provide sensitivity, accuracy, and universality—simultaneously looking deeply into the inner workings of any given instrument while conferring the ability to look broadly at all instrumentation in concert. Using this ubiquitous and seemingly mundane sensor, AI can actually learn to identify and interpret a hidden language within the signals to produce a surprisingly detailed understanding of behavior and health of any instrument.
Power is the “voice of the machine.” Every ripple and pattern in the power signature can be correlated with an operational state of the instrument, yielding insight into function, health, and usage.
During an observation and training period, AI learns to recognize the discrete operational modes of an instrument, such as a measurement, a calibration, a test run, and so on. By determining what each mode of operation looks like, AI learns to recognize it in the future. Over time, AI fine-tunes an ever-growing model of what is actually happening within the instrument and begins to understand the voice of the machine.
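The training step described above can be pictured as supervised classification: feature windows labeled with the mode that was running when they were recorded are used to fit a model that recognizes those modes later. The sketch below uses synthetic features and a random forest purely as one plausible choice; the article does not specify a model.

# A minimal sketch of mode recognition as supervised classification.
# Labels would come from the observation/training period; here they are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# Synthetic stand-in: 300 feature windows (4 features each) from three modes.
X = rng.normal(size=(300, 4)) + np.repeat(np.arange(3), 100)[:, None]
y = np.repeat(["idle", "calibration", "measurement"], 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
print("predicted mode for one new window:", clf.predict(X_te[:1])[0])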
Consider, for example, a next-generation sequencing prep system: its power draw reflects the superposition of multiple components and behaviors, which AI can decipher into discrete operations.
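The superposition idea can be illustrated with a deliberately simplified, non-AI decomposition: if per-component power signatures were known, the observed aggregate draw could be split into non-negative component activations using classical non-negative least squares. The component names, templates, and numbers below are hypothetical, not the learned disaggregation described here.

# A simplified sketch of disaggregating an aggregate power draw into
# per-component activations. Templates and numbers are invented.
import numpy as np
from scipy.optimize import nnls

# Each column is one hypothetical component's power signature over three
# time samples (watts when fully active): pump, heater, chiller.
templates = np.array([
    [120.0,   0.0, 40.0],   # time sample 1
    [120.0,  80.0, 40.0],   # time sample 2
    [  0.0,  80.0, 40.0],   # time sample 3
])
observed = np.array([160.0, 238.0, 121.0])  # measured aggregate per sample

# Solve for non-negative activation levels that best explain the aggregate.
activations, residual = nnls(templates, observed)
print("estimated component activations:", activations.round(2))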
Once AI understands the machine’s typical operation, the system can then become predictive in nature. Over time, AI improves the internal model of the equipment and learns to identify more complex workings and deviations, even to the level of correlating precursors with future behavior, such as predicting when an instrument will fail and why. AI can even learn to develop a prescriptive ability, suggesting when maintenance must be performed and which actions must be taken to prevent failure.
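One way to picture the deviation-detection step is as unsupervised anomaly scoring: fit a model of normal feature windows, then flag new windows that score as outliers. IsolationForest is used below only as an illustrative choice, with synthetic data; it is not claimed to be the method behind the predictive capability described here.

# A minimal sketch of flagging drifting behavior before it becomes a failure.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 4))   # healthy feature windows
drifting = rng.normal(loc=4.0, scale=1.0, size=(10, 4))  # precursor-like drift

detector = IsolationForest(contamination=0.01, random_state=1).fit(normal)
flags = detector.predict(drifting)   # -1 = anomalous, +1 = normal
print("windows flagged as anomalous:", int((flags == -1).sum()), "of", len(flags))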
Once AI has extracted operational states and system conditions, the information can be aggregated into visualization dashboards to provide insights for making data-driven decisions. For example, a utilization heat map can be used to identify and reallocate underutilized instruments.
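A utilization heat map of this kind can be assembled directly from the per-window state labels. The sketch below aggregates toy state records into a fraction-of-time-running matrix by instrument and hour of day; the instrument names and data are invented for illustration.

# A minimal sketch of rolling classified states up into a utilization heat map.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
states = pd.DataFrame({
    "instrument": rng.choice(["HPLC-1", "Seq-2", "Centrifuge-3"], size=1000),
    "hour": rng.integers(0, 24, size=1000),
    "running": (rng.random(size=1000) < 0.4).astype(int),  # 1 if window classified as active
})

# Rows: instruments; columns: hour of day; values: fraction of windows running.
heatmap = states.pivot_table(index="instrument", columns="hour",
                             values="running", aggfunc="mean")
print(heatmap.round(2))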
AI can focus on any “metric of merit.” Because it understands the activity of all processes, metrics can be built around each instrument. Valuable operational metrics include utilization, throughput, and capacity planning. AI’s ability to accurately measure instrument state increases the longer the system is connected to the instrument. Analysis of power-monitor data using non-AI (heuristic-based) methods struggles with even basic running/idle utilization identification, yielding an accuracy of about 80%; AI routinely achieves an accuracy of 98% or more.
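For contrast with the learned approach, the snippet below shows the kind of fixed-threshold heuristic referred to here: call a window "running" whenever its mean power exceeds a set level, then score that rule against ground-truth labels. The data, threshold, and resulting accuracy are synthetic and are not meant to reproduce the 80% or 98% figures quoted above.

# A compact sketch of a threshold heuristic for running/idle identification.
import numpy as np

rng = np.random.default_rng(3)
truth = rng.random(500) < 0.5                       # true running/idle labels
# Idle draw overlaps with low-intensity running, which is what trips up
# simple thresholds in practice.
mean_power = np.where(truth, rng.normal(220, 60, 500), rng.normal(150, 40, 500))

heuristic = mean_power > 180.0                      # fixed-threshold rule
accuracy = (heuristic == truth).mean()
print(f"threshold heuristic accuracy: {accuracy:.0%}")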
The lab-wide intelligence system
The applicability is nearly limitless, as power is ubiquitous across instrumentation. By connecting all lab equipment to AI through this universal source of data, AI algorithms can provide an unprecedented view of lab operations in their entirety. Because power is universal, a laboratory in any scientific discipline, whether biology, physics, or any other field, can greatly benefit from such a holistic operational view.