At Analytica 2018 in Munich, many companies demonstrated new or upgraded instrumentation and improved software. For the future of analytics, however, another discussion was of equal importance: the digitized laboratory, adopting ideas from the current industry 4.0 hype. Based on these concepts, the idea of analytics 4.0 (or lab 4.0) was presented and discussed. To understand these ideas, one has to recall the developments that led to industry 4.0. After mechanization (industry 1.0), mass production (industry 2.0), and automation (industry 3.0), the concept of the Internet of Things (IoT) shaped the approach of industry 4.0. The IoT, a network of physical devices with embedded electronics, software, sensors, and actuators that are connected with each other and exchange data, will also influence the development and future of analytics. Networking laboratory elements such as processes, data analysis, instrument combinations, and big data collection is considered the basis for an efficient, smart laboratory with quality assurance and for digitizing the processes within the lab. Some of you might have seen central laboratories in hospitals where analytical equipment is arranged along a kind of conveyor belt on which barcode-labelled samples pass by to be examined. This is a first step in that direction and can be compared to industry 3.0.

Many years ago we learned about laboratory information management systems (LIMS) that combined large numbers of instruments (in earlier times through V.24 or IEC interfaces) and collected their data, forming something like a large database. Over the years, such LIMS have evolved from simply managing samples and collecting data to covering modern aspects of chemoinformatics. Nevertheless, workflow and data tracking remain limited, and crosstalk between instruments is poor. Recent approaches go beyond this type of data handling: they aim at digitizing the processes themselves and are based on the capability of instruments to communicate with each other. The old interfaces can no longer handle the large data volumes, real-time applications, or remote sensing, nor can they realize the key idea of process programming. In the new approach, all instruments have access to the network and communicate via standardized protocols, which can culminate in fully automated laboratory activity sequences with integrated quality assurance management. This approach also supports data protection, a currently very important issue; the focus is on reliability, data throughput, and error prevention. Fully automated individual devices can exchange data and samples with each other. Standardized interfaces allow various instruments to be combined, and even centralized and decentralized laboratories to be linked. Complex data evaluation can be carried out, and data can be stored centrally and used for trend analyses. A minimal sketch of what such standardized instrument communication can look like in practice is given below.
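To make the idea of networked instruments speaking a standardized protocol more concrete, the following minimal sketch (in Python, using the open-source python-opcua client library) shows how a reading might be queried from an instrument that exposes its values via OPC UA. The endpoint address and node identifier are hypothetical placeholders for illustration, not references to any particular product.

```python
# Minimal sketch: reading a measurement from a networked instrument via OPC UA.
# Requires the open-source python-opcua package; the endpoint URL and node ID
# below are hypothetical placeholders for a real instrument's address space.
from opcua import Client

ENDPOINT = "opc.tcp://lab-instrument.local:4840"   # hypothetical instrument endpoint
TEMPERATURE_NODE = "ns=2;s=Detector.Temperature"   # hypothetical node identifier

client = Client(ENDPOINT)
try:
    client.connect()
    node = client.get_node(TEMPERATURE_NODE)
    value = node.get_value()                       # read the current process value
    print(f"Detector temperature: {value} °C")
finally:
    client.disconnect()
```

The point of the sketch is not the few lines of code themselves, but that any OPC UA-capable client can read the same value in the same way, regardless of which vendor built the instrument.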

All these arguments for new developments in analytics could be considered irrelevant for research. Analytical scientists want to be able to intervene in measurements at any time. They want to carry out the evaluation themselves, use their experience to design their experiments, and implement their own feasibility controls. They have neither artificial intelligence nor deep learning (the new buzzwords) in mind. However, looking at new research areas that use analytics, especially in the life sciences, one sees that high-throughput screening measurements produce very large data volumes, and that the data formats are sometimes not compatible with the evaluation software in the research lab or with commercial software. Imaging techniques also produce large data volumes. Given the need in analytics to calibrate and to produce reliable and reproducible data, a solution in which sample handling for calibration is automated would be welcome. We have all experienced cases in which instruments such as pumps, injection valves, or samplers cannot communicate with the measuring device, and in which a good deal of PhD time has to be spent on enabling the various devices to talk to each other. The hope is that these problems will be solved by standardizing interfaces and data formats and by achieving the plug-and-play status we know from computers and their peripheral devices. With modern operating systems, we are used to the idea that it does not take much to connect, e.g., a printer or an external storage device to our computer. The intention is to achieve the same with the concepts mentioned in the title: SiLA (Standardization in Lab Automation) and OPC UA (Open Platform Communications Unified Architecture) are approaches for exchanging information between platforms from different providers, enabling simple integration of these platforms without costly and time-consuming software development and enabling data exchange between instruments from different vendors. The objective is thus to standardize the interfaces and transfer protocols, to integrate security, and to define a common data model. The Allotrope Foundation (an international consortium of pharmaceutical, biopharmaceutical, and other research-intensive industries) is working in this area and aims at developing an advanced data architecture to standardize the acquisition, exchange, and management of laboratory data. Because of the large data volumes, these approaches have to go beyond JCAMP. One such approach is AnIML (Analytical Information Markup Language), an XML-based standard for analytical chemistry data.
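As an illustration of what such vendor-neutral data formats aim at, the following sketch serializes a single measurement series into a strongly simplified, AnIML-like XML fragment using only the Python standard library. The element names follow the spirit of AnIML (samples, experiment steps, series), but this is an illustrative fragment under assumed names, not a schema-valid AnIML instance.

```python
# Minimal sketch: writing a measurement series as a simplified, AnIML-like XML
# document using only the Python standard library. The element names mirror the
# general AnIML structure (samples, experiment steps, series), but the fragment
# is illustrative only and not validated against the official schema.
import xml.etree.ElementTree as ET

def to_animl_like_xml(sample_id, wavelengths_nm, absorbances):
    root = ET.Element("AnIML")

    sample_set = ET.SubElement(root, "SampleSet")
    ET.SubElement(sample_set, "Sample", name=sample_id, sampleID=sample_id)

    step_set = ET.SubElement(root, "ExperimentStepSet")
    step = ET.SubElement(step_set, "ExperimentStep", name="UV-Vis scan")
    series_set = ET.SubElement(step, "SeriesSet", name="Spectrum")

    wl = ET.SubElement(series_set, "Series", name="Wavelength", unit="nm")
    ab = ET.SubElement(series_set, "Series", name="Absorbance", unit="AU")
    for x, y in zip(wavelengths_nm, absorbances):
        ET.SubElement(wl, "Value").text = str(x)
        ET.SubElement(ab, "Value").text = str(y)

    return ET.tostring(root, encoding="unicode")

# Example: a hypothetical three-point UV-Vis scan of a QC standard.
print(to_animl_like_xml("QC-standard-01", [250, 260, 270], [0.112, 0.204, 0.181]))
```

Because the output is plain, self-describing XML, any evaluation software that understands the agreed structure can read it, independently of which instrument produced the data.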

In summary, lab 4.0 goes beyond using a single instrument, feeding its data to software, and trying to connect a sampler to the instrumentation. Instruments can talk to each other, and an operating system, much as in a computer, allows programming of instrument interaction, feeding evaluated data back to an instrument to trigger additional measurements, or even programming of an overall process involving many instruments that ends in quality control management and documentation. In research, we may consider these aspects very futuristic and not really relevant to our daily work. However, new instrumentation, imaging techniques, high throughput, and modern concepts of big data handling also require new approaches in research laboratories. Multimodal spectroscopy might be a first example; another will be publishers' requirements for better and higher-quality presentation of calibration experiments, or even for standardized presentation of all raw data, as good scientific practice will expect in the future. Thus, lab 4.0 will not only be an approach shaping industrial processes, but will also influence procedures in research institutes.

These aspects should be discussed in the near future and considered in modern teaching of analytics. They will, hopefully, improve the publishing of calibration results and experiments and raise the quality standards of data handling as a prerequisite for the interpretation of experiments. This new world will certainly influence analytics in the future. ABC will take up these aspects by providing feature articles and spotlights dealing with these new concepts, describing their background, their necessity, and their consequences.