
Blog Series Part 4 – Contextualised Data and Connectivity

Posted on February 25, 2020 by David Staunton, Global Services Director & Ryan McInerney, Technical Consultant

Contextualised data and connectivity, part of our blog series ‘Life Sciences 4.0: Revolutionising Life Sciences Manufacturing Through Connected Systems & Data’

The foundation of any change to a manufacturing environment driven by Life Sciences 4.0 thinking will be contextualised data and connectivity. Every system and piece of equipment needs to record and distribute reliable event data, communicate with other systems and equipment, and access relevant, reliable data in return. Once this connectivity is in place, operational teams have the basis for making better choices, or the need to choose is removed altogether as self-learning systems interpret the data themselves. It is imperative that data is contextualised for it to be useful: the integrated technologies need to know what the data is and when it was created.
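As a minimal sketch of what ‘contextualised’ means in practice, a single event record might carry its own context alongside the raw value. The field names here are illustrative, not drawn from any particular standard:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ContextualisedEvent:
    """A single process event that carries its own context."""
    timestamp: datetime   # when the value was created (UTC)
    tag: str              # what the value is, e.g. a sensor identifier
    value: float          # the measurement itself
    units: str            # engineering units, e.g. "degC"
    batch_id: str         # which batch produced it
    source: str           # which system or device recorded it

# A hypothetical reading from a distributed control system.
event = ContextualisedEvent(
    timestamp=datetime.now(timezone.utc),
    tag="Reactor1.Temperature",
    value=72.4,
    units="degC",
    batch_id="B-2020-0217",
    source="DCS",
)
```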

A database with time-stamped data is essential, as consistent time data makes every subsequent interpretation and decision simpler and more reliable. To use the example of the temperature gauge again: when an irregular temperature reading appears, the automated system will look for data around a previous event that mirrors the current one, and it simply cannot do this reliably if that data is not accurately time-stamped.
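A sketch of that lookup, assuming the readings live in a pandas DataFrame indexed by timestamp (the column names and tolerance are our own illustration):

```python
import pandas as pd

def find_similar_events(history: pd.DataFrame, current_temp: float,
                        tolerance: float = 0.5) -> pd.DataFrame:
    """Return past readings within `tolerance` of the current irregular
    reading. Matching is only meaningful if the index holds accurate,
    timezone-consistent timestamps."""
    mask = (history["temperature"] - current_temp).abs() <= tolerance
    return history.loc[mask]

# Illustrative time-stamped history; in practice this comes from a historian.
history = pd.DataFrame(
    {"temperature": [70.1, 92.3, 71.0, 92.5]},
    index=pd.to_datetime([
        "2019-11-02 08:00", "2019-11-02 08:01",
        "2020-01-15 14:30", "2020-01-15 14:31",
    ]),
)
matches = find_similar_events(history, current_temp=92.4)
print(matches)  # the two past excursions around 92 degC
```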

Batch context – by adding context information such as product or recipe names, process phases or batch identification to time-series data in process historians, the value of that data for process engineers is greatly increased. However, historians are ‘write’ optimised rather than ‘read’ optimised: they efficiently store and compress incoming data, which makes extraction and interpretation arduous. Finding the relevant historical event and building the process context around it can therefore be laborious, requiring manual manipulation of data rather than an automated approach.
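One common way to attach batch context to time-series data, sketched here with illustrative column names, is an interval join: each time-stamped reading is matched to the batch record whose start/end window contains it.

```python
import pandas as pd

# Raw time-series readings, as they might be extracted from a historian.
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2020-02-01 09:05", "2020-02-01 11:20"]),
    "temperature": [71.2, 92.3],
})

# Batch records from a batch execution or MES system (hypothetical values).
batches = pd.DataFrame({
    "start": pd.to_datetime(["2020-02-01 08:00", "2020-02-01 10:30"]),
    "end": pd.to_datetime(["2020-02-01 10:00", "2020-02-01 13:00"]),
    "batch_id": ["B-0412", "B-0413"],
    "product": ["Product A", "Product B"],
    "phase": ["Heating", "Hold"],
})

# Match each reading to the most recent batch start, then keep only
# readings that fall inside that batch's start/end window.
contextualised = pd.merge_asof(
    readings.sort_values("timestamp"),
    batches.sort_values("start"),
    left_on="timestamp", right_on="start",
)
contextualised = contextualised[
    contextualised["timestamp"] <= contextualised["end"]
]
```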

With the right systems, software and approach in place, operators can find specific batches, filter by products or phases, and create and overlay profiles for good and bad batches. The ability to search data over a specific timeline and visualise all related events in that time frame quickly and efficiently will allow users (and eventually machines) to predict more precisely what is occurring, or what will occur, across industrial processes. Human-machine interfaces such as these are among the many lauded Life Sciences 4.0 disruptions that will revolutionise pharmaceutical manufacturing.
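Continuing the illustrative `contextualised` frame from the previous sketch, filtering and overlaying batch profiles might look like this; in a real deployment a historian client or analytics tool would do this work:

```python
import matplotlib.pyplot as plt

# Filter to a single product, then put each batch on a common time base
# by converting absolute timestamps to elapsed time within the batch.
product_a = contextualised[contextualised["product"] == "Product A"]
product_a = product_a.assign(
    elapsed=product_a["timestamp"] - product_a["start"]
)

# One column per batch: overlaying them side by side lets engineers
# compare "good" and "bad" batch profiles at a glance.
profiles = product_a.pivot_table(
    index="elapsed", columns="batch_id", values="temperature"
)
profiles.plot(title="Temperature profiles by batch")
plt.show()
```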

Communication standards, such as those from the Object Linking and Embedding for Process Control (OPC) Foundation and the FieldComm Group, and data exchange standards from the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) are crucial for making sure devices and systems are designed to communicate with each other. In addition, standards from the International Society of Automation (ISA) specify engineering design data (e.g. ISA S108 for configuring intelligent devices), and the Capital Facilities Information Handover Standard (CFIHOS) standardises product data to facilitate equipment and device acceptance testing and handover. These standards will be increasingly important as machines and systems communicate directly with each other and make decisions with less human involvement in the coming years.
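For a flavour of what standards-based connectivity looks like in code, here is a minimal read over OPC UA using the open-source python-opcua library. The endpoint URL and node identifier are placeholders, not a real server:

```python
from opcua import Client  # open-source python-opcua package

# Placeholder endpoint and node id -- substitute your server's values.
client = Client("opc.tcp://historian.example.com:4840")
client.connect()
try:
    node = client.get_node("ns=2;s=Reactor1.Temperature")
    temperature = node.get_value()
    print(f"Reactor1.Temperature = {temperature}")
finally:
    client.disconnect()
```

Because OPC UA standardises both the transport and the address space, the same few lines work against any compliant server, which is precisely why such standards matter as systems begin talking to each other without human mediation.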

Following on from contextualised data and connectivity, the next blog in our Life Sciences 4.0 series looks at the role big data has to play and questions its validity.

