Industry 5.0, seriously?

It seems unbelievable, but five years have passed since CARTIF inaugurated this blog with a post on Industry 4.0 in which I analyzed some of the keys to the so-called “fourth industrial revolution” and how it could affect industry in our country. It has always seemed risky to me to try to define this revolution from within itself; I suppose that time and historical perspective will make it clearer whether it really was a revolution or simply a technological mantra. Fasten your seat belts, because if we have not yet assimilated that revolution, they now “threaten” us with the next one: Industry 5.0, they call it. Original, isn’t it?

Whereas the fourth promised to interconnect the productive means of the entire value chain in a transition to the intelligent industry or Smart Industry (everything has to be Smart, just as, many years ago, any self-respecting appliance needed to carry “fuzzy logic” on board), the fifth industrial revolution tries to humanize the concept beyond just producing goods and services for economic profit. The challenge of this revolution is to include social and environmental considerations in its purpose. The keywords of this revolution, as defined by the European Commission, are a human-centric approach, sustainability and resilience.

By developing innovative technologies with a human-centric approach, Industry 5.0 can support and empower workers rather than replace them. Likewise, other approaches complement this vision from the consumer’s point of view, giving consumers access to products that are as personalized as possible or adapted to their needs. Thus, concepts such as personalized food or tailor-made clothing could be applied to virtually any consumer product.

Sustainability in the development of industry requires reconciling economic progress with environmental objectives. To achieve these shared environmental objectives, it is necessary to incorporate new technologies and integrate existing ones, rethinking manufacturing processes by introducing environmental impacts into their design and operation. Industry must be a leading example in the green transition.

Industrial resilience means developing a greater degree of robustness in production, preparing it against disruptions and ensuring that it can respond in times of crisis such as the COVID-19 pandemic. The current approach to globalized production has shown great fragility during the pandemic that has devastated us. Supply chains must also be sufficiently resilient, with adaptable and flexible production capacity, especially in those aspects of products that satisfy basic human needs, such as healthcare or security.

Just as the fourth needed digital enablers, this new revolution needs technologies to make it happen. From a practical point of view, we can say that the enablers we reviewed a while ago remain fully up to date for Industry 5.0. We could add some additional ones, such as quantum computing or blockchain, which were incipient 4 or 5 years ago. If the enablers are similar, why are we talking about a new revolution? It is a matter of priorities. If the fourth spoke about the hyper-connection of processes to the digital world through cyber-physical systems or the IoT, the fifth seeks cooperation between humans and digital technology, whether in the form of collaborative industrial robots, social robots or artificial intelligence systems that complement or assist in any task related to production, from installing a door in a car to deciding how to organize the next work shift to meet the productivity goal of the manufacturing plant.

IoT technology to improve the efficiency of industrial companies

With the promise of 75 billion devices connected to the Internet around the world by 2025, the ‘Internet of Things’ (IoT) opens the door to a future of opportunities for companies to optimize their processes, whether in the manufacture of their products, the supervision of their quality or the monitoring of critical machines in their factories: ovens, manufacturing lines or refrigerated warehouses.
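As a toy illustration of what such monitoring can look like in practice, the sketch below periodically reads a temperature from a refrigerated warehouse and pushes it to a data-collection service. It is only a sketch under stated assumptions: the ingestion endpoint (https://iot.example.com/readings) and the read_temperature() helper are hypothetical placeholders for a real sensor driver and IoT platform.

```python
# Minimal plant-monitoring sketch (assumptions: a hypothetical HTTP ingestion endpoint
# and a read_temperature() placeholder standing in for the real cold-store sensor driver).
import json
import random
import time
import urllib.request


def read_temperature() -> float:
    """Placeholder for a real temperature probe in a refrigerated warehouse."""
    return round(random.uniform(-20.0, -16.0), 2)


def send_reading(value: float) -> None:
    """POST one reading as JSON to the (hypothetical) ingestion service."""
    payload = json.dumps({"sensor": "cold-store-01", "temp_c": value, "ts": time.time()})
    request = urllib.request.Request(
        "https://iot.example.com/readings",          # hypothetical endpoint
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)


while True:
    send_reading(read_temperature())
    time.sleep(60)  # one reading per minute
```

A deployment like this is the simplest building block on which quality supervision or alarms on critical machines can later be layered.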

In our daily experience as consumers, we can find a multitude of IoT devices on offer that we integrate into our lives in a fast and sometimes impulsive manner, whether because of fashion or real benefits. However, the incorporation of these technologies in companies is not done so impulsively, since it involves a careful study of feasibility and profitability, often complex to demonstrate, as usually happens with new technologies.

In addition, IoT offers significant flexibility to integrate into the IT infrastructures of factories. The ‘I’ of IoT stands for “Internet”, which seems to be automatically associated with a direct connection of the factory’s “things” to the Internet, and this raises fears of possible cybersecurity threats in almost any company. To overcome these barriers, information and training are key.

Within this framework, the IOTEC Spain-Portugal cross-border cooperation project is being developed. This initiative aims to create a collaborative network of different actors (researchers, public bodies, ICT solution providers and industrial companies) from both countries to facilitate IoT integration in companies. Participants in IOTEC have analyzed different industrial and ICT companies to look for gaps and strengths and to be able to match the supply of and demand for IoT. From CARTIF, we coordinate the activities around the industrial companies in order to identify their IoT needs through a detailed analysis of their organizational and productive processes, covering management, product design, manufacturing and logistics.

This analysis included a series of technological audits of different agroindustrial companies, examining the potential application of IoT in different parts of their production processes. Forty organizational parameters were evaluated according to the methodology defined within the IOTEC project. For example, in the section on manufacturing processes, four especially relevant aspects were analyzed in detail:

  • The type of process or productive transformation, which is fundamentally defined by aspects such as the raw materials used or the manufacturing steps.
  • The traceability requirements of raw materials, intermediate products and final products. This traceability has special relevance in agrifood companies (a minimal data-model sketch follows this list).
  • The control of the production process, which is triggered by different mechanisms depending on the company: production orders, on-demand manufacturing or the availability of raw materials (e.g. the grape harvest).
  • The need to capture data in the plant as the first phase of complete digitalization of a productive process.
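To make the traceability requirement more concrete, here is a minimal data-model sketch linking raw-material lots to a finished product batch. The class and field names (RawMaterialLot, ProductBatch) and the example identifiers are illustrative assumptions, not a prescribed schema or data from any audited company.

```python
# Minimal traceability sketch: which raw-material lots went into which product batch?
from dataclasses import dataclass, field
from typing import List


@dataclass
class RawMaterialLot:
    lot_id: str      # e.g. supplier lot number
    material: str    # e.g. "milk", "grapes"
    supplier: str


@dataclass
class ProductBatch:
    batch_id: str
    product: str
    consumed_lots: List[RawMaterialLot] = field(default_factory=list)

    def trace_back(self) -> List[str]:
        """Backward traceability: the raw-material lots behind this batch."""
        return [lot.lot_id for lot in self.consumed_lots]


batch = ProductBatch("B-2018-041", "yogurt",
                     [RawMaterialLot("L-778", "milk", "Dairy Co."),
                      RawMaterialLot("L-912", "sugar", "Sweet S.A.")])
print(batch.trace_back())  # ['L-778', 'L-912']
```

Forward traceability (from a raw-material lot to every batch that consumed it) follows the same idea with the relation indexed in the opposite direction.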

Once all the parameters had been analyzed, an exhaustive classification was carried out of the different IoT technologies that could be applied in industry with a direct impact on efficiency improvement.

All the identified technologies were prioritized by those attending the “Forum of business opportunities through IoT and Blockchain” that took place on November 14, 2018 in Valladolid. Attendees at the event had the opportunity to reflect and vote on this set of technologies, assessing their need for them and the importance of their dissemination by the IOTEC project. With these priorities established, the next step is to make them known so that IoT solution providers can adapt their offerings to real needs.

Likewise, dissemination and training activities are being carried out to bring IoT technologies, and concrete examples of their application, closer to the industrial companies of Castilla y León and the Centro region of Portugal participating in the IOTEC network. Any company supplying or demanding IoT technologies can participate in the project forum and benefit directly from collaboration and training opportunities around this exciting set of technological solutions.

New challenges in the smart manufacturing industry

Big Data, one of the so-called “digital enablers” of Industry 4.0, sits at the core of the promising technologies contributing to the revolution in factories, where vast amounts of data (whether big or small) hide an enormous amount of knowledge and potential improvements for manufacturing processes.

The Strategic Research and Innovation Agenda (SRIA) of the Big Data Value Association (BDVA) defines the overall goals, the main technical and non-technical priorities, and a research and innovation roadmap for the European Public-Private Partnership (PPP) on big data. Within the current expectations for the future data market in Europe (around 60 B€), manufacturing was in first place in 2016 (12.8 B€) and in the 2020 projections (17.3 B€), revealing the leading role played by this sector in the overall data economy.

With the aim of finding an agreed synthesis, the BDVA adopted the “Smart Manufacturing Industry” (SMI) concept definition, covering the whole value chain gravitating around goods production, and then identified three main Grand Scenarios aiming to represent all the different facets of an SMI in Europe: Smart Factory, Smart Supply Chain and Smart Product Lifecycle.

Given the relevance of both the data market and the manufacturing industry in Europe, and in line with the European initiative on the Digitisation of Industry, CARTIF, together with the rest of the experts from the BDVA, engaged in a collective effort to define a position paper proposing future research challenges for the manufacturing industry in the context of Big Data.

To contextualize these research challenges, the BDVA association has defined five technical areas for research and innovation within the BDVA community:

  • Data Management and Lifecycle, motivated by the data explosion, where traditional means of data storage and management are no longer able to cope with the size and speed of the data delivered.
  • Data Processing Architectures, driven by the fast development and adoption of the Internet of Things (IoT) and the need to process immense amounts of sensor data streams (a minimal stream-processing sketch follows this list).
  • Data Analytics, which aims to advance technologies and develop capabilities to turn Big Data into value, but also to make those approaches accessible to a wider public.
  • Data Protection, addressing the need to ensure the correct use of information while guaranteeing user privacy. It includes advanced data protection, privacy and anonymization technologies.
  • Data Visualisation and User Interaction, addressing the need for advanced means of visualization and user interaction capable of handling the continuously increasing complexity and size of data and of supporting the user in exploring and understanding Big Data effectively.
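As a small illustration of the kind of processing the Data Processing Architectures priority refers to, the sketch below computes a rolling average over a simulated sensor stream and raises a simple alarm. The sensor_stream() generator and the alarm threshold are assumptions standing in for a real ingestion layer (e.g. an MQTT or Kafka consumer) and a real process limit.

```python
# Minimal sketch of windowed processing of a sensor data stream; the stream itself is
# simulated, standing in for a real IoT ingestion layer.
import random
import time
from collections import deque
from typing import Iterator


def sensor_stream() -> Iterator[float]:
    """Simulated endless stream of vibration readings from a machine."""
    while True:
        yield random.gauss(0.5, 0.1)
        time.sleep(0.01)


def rolling_average(stream: Iterator[float], window: int = 100) -> Iterator[float]:
    """Emit the average of the last `window` readings each time a new one arrives."""
    buffer: deque = deque(maxlen=window)
    for reading in stream:
        buffer.append(reading)
        yield sum(buffer) / len(buffer)


for i, avg in enumerate(rolling_average(sensor_stream())):
    if avg > 0.7:  # illustrative alarm threshold
        print(f"High vibration level: {avg:.2f}")
    if i >= 1000:  # stop the demo after 1000 readings
        break
```

Real architectures distribute this same pattern (windowing, aggregation, alerting) across many nodes and far higher data rates, which is precisely where the research challenges appear.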

During a series of workshop activities, from the 2016 EBDVF Valencia Summit to the 2017 EBDVF Versailles Summit, BDVA experts distilled a set of research challenges for the three grand scenarios of smart manufacturing. These research challenges were mapped onto the five technical priority areas of the big data reference model introduced above.

To exemplify the outcome of this mapping, the headings of the challenges identified and discussed by the BDVA members for the Smart Factory scenario are gathered in the SMI white paper, and interested readers are encouraged to analyze the full set of challenges there.

The challenges set out in this first version of the SMI position paper set the tone for upcoming research needs in the different Big Data areas related to manufacturing. In the Smart Factory scenario, the focus is on the integration of multiple sources of data coming not only from the shop floor but also from the offices, traditionally kept separate in Industry 3.0. The interoperability of existing information systems and the challenge of integrating disruptive IoT technologies are the major trials in the area of data management. Closer to the needs of a Smart Factory, the analytics challenges focus on prescriptive analytics as a tool for optimal decision-making in manufacturing operations management, including optimization through the evolved concept of the digital twin.

Digital Transformation, to the moon and back

It is July 20, 1969, 20:18:04 UTC, and after 102 hours, 45 minutes and 39.9 seconds of travel “the Eagle has landed” and Neil is about to descend the ladder and touch an unknown surface for the first time: “That’s one small step for [a] man, one giant leap for mankind”. That year, 1969, Neil Armstrong, Michael Collins and “Buzz” Aldrin changed the world, riding the biggest rocket ever built to the moon.

Some people may have forgotten it, and others like me were not yet born, but the space race had its own digital transformation, similar to the one now foreseen for industry and the general public. The Apollo program was the culmination of that first digital revolution in space exploration.

The landing was achieved, to a great extent, thanks to the electronics on board both the Apollo Command Module (CM) and the Lunar Module (LM): the AGC, or Apollo Guidance Computer. It was one of the first computers based on integrated digital circuits. With “just” 32 kg of weight and a mere 55 W of power consumption, this technical wonder was able to coordinate and control many tasks of the space mission, from calculating the direction and navigation angles of the spacecraft to commanding the reaction control jets to orient it in the desired direction. Moreover, the computer included one of the first demonstrations of “fly-by-wire”, in which the pilot does not command the engines directly but through control algorithms programmed into the computer. In fact, this computer was the basis for the subsequent control of the Space Shuttle and of military and commercial fly-by-wire systems.
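As a purely conceptual illustration of the fly-by-wire idea (and in no way the AGC’s actual algorithm), the snippet below shows how a pilot’s stick input can be turned into a jet command by a control law instead of driving the engines directly; the gain and the stick-to-rate mapping are made-up numbers.

```python
# Conceptual fly-by-wire sketch: the stick is not wired to the thrusters; a control law
# computes the command from the difference between desired and measured rotation rates.
def control_law(commanded_rate: float, measured_rate: float, gain: float = 0.8) -> float:
    """Simple proportional law: jet command from the rate error (illustrative gain)."""
    return gain * (commanded_rate - measured_rate)


def fly_by_wire_step(stick_deflection: float, measured_rate: float) -> float:
    commanded_rate = 5.0 * stick_deflection            # map stick position to a desired rate (deg/s)
    return control_law(commanded_rate, measured_rate)  # command sent to the reaction control jets


# The pilot pulls the stick halfway while the craft already rotates at 1 deg/s:
print(f"{fly_by_wire_step(stick_deflection=0.5, measured_rate=1.0):.2f}")  # 1.20
```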

As usual with this kind of breakthrough, it did not happen overnight but through a series of earlier incremental innovations.

In the 1950s, the MIT Instrumentation Laboratory (IL) designed and developed the guidance system of the Polaris ballistic missiles. Initially built with analog computers, it soon went digital to achieve the accuracy required for the computation of missile trajectories and control.

Even before President Kennedy set the ambitious goal of “… going to the moon in this decade …”, seven years ahead of the first lunar landing, and after the launch of Sputnik in 1957, a Mars exploration study had started at MIT’s IL. The design of a Mars probe set the basic configuration of the future Apollo guidance system, including a set of gyroscopes to keep the probe oriented, a digital computer and an optical telescope for the craft to orient itself relative to the moon and stars.

The launch of Sputnik in 1957 fueled America’s ambition to put the first human in space, but it also fed the public debate on the role of the pilot in the space race, a discussion similar to current views on the role of the worker in the factory. Should the astronaut just be payload, or take full control of the spacecraft? Once aircraft pilots had earned the right to be at the controls, several tests showed it was nearly impossible for them to control every aspect of a mission, given the fast reactions needed and the number of different control commands. Hence, pilots would need some automatic and reliable help, and that was one of the main functions of the AGC.

Reliability was therefore one of the main concerns of the mission. The Polaris program had taken four years to design guidance control for a weapon that was in the air for just a couple of minutes. Kennedy’s bet of taking a man to the moon in less than seven years meant developing a guidance and control system for a spacecraft that had to work without failure on a trip of more than a week. The required levels of reliability were more than two orders of magnitude higher. If a Polaris missile failed, a new one would be launched. A failure in the spacecraft meant killing an astronaut.

Much of the reliability of the flight rested on the shoulders of the Apollo Guidance Computer, and at some point in the program there were too many planned tasks, such as complex guidance maneuvers, to be physically hardwired into electronics. Achieving them required software. Although software was barely taken into account at the beginning of the program, it meant the difference between achieving the goal and the program’s complete failure. The computer was the interface between the astronaut and the spacecraft, which in the end meant that computer software “controlled” the spacecraft, a revolution for that time. Today software is everywhere, but back in the ’60s software was seen as a set of instructions on punched cards. AGC programs (frozen three to four months before each launch) were “hard-wired” as magnetic cores and wires in a permanent (and reliable) memory, yet they saved a lot of time, effort and budget. In fact, it could be said that Apollo software was more like “firmware” in today’s vocabulary.

Today’s challenge of revolutionizing industry through digital transformation cannot happen without the help of digital enablers. Forty-eight years ago, digital electronics and the first software programs were the “digital enablers” that made possible that “one small step for [a] man, one giant leap for mankind”. Today’s “digital transformation is not an option” sounds like a cliché, a hype, a slogan from digital providers, but looking back at history, the digital transformation within the Apollo program made the difference between achieving the moon landing and not achieving it at all.

Cyber-physical systems. Are we closer to Terminator’s ‘judgment day’?

“It is April 21, 2011. SKYNET, the superintelligent artificial system that became self-aware two days earlier, has launched a nuclear attack on us humans. On April 19, the SKYNET system, formed by millions of computer servers all across the world, initiated a geometric self-learning process. The new artificial intelligence concluded that all of humanity would attempt to destroy it and impede its capability to continue operating.”

It seems the apocalyptic vision of artificial intelligence depicted in the Terminator science fiction movies is still far from being a reality. SKYNET, our nemesis in the films, was a collection of servers, drones, military satellites, war machines and Terminator robots performing a relevant task: safeguarding the world.

Today’s post is focused on a different but relevant task: manufacturing the products of the future. In our previous posts, we reviewed the Industry 4.0 key ingredients, the so-called digital enablers. The last key ingredient, Cyber Physical Systems, can be seen as the “SKYNET” of manufacturing, and we defined it as a mixture of different technologies. Now it is time to be more specific.

The term “cyber-physical” is itself a compound name designating a mixture of virtual and physical systems that perform a complex task. The rapid evolution of Information and Communication Technologies (ICT) is enabling the development of services no longer contained within the shells of the devices we buy. Take, for example, digital personal assistants such as Siri from Apple, Alexa from Amazon or Cortana from Microsoft. These systems help us with everyday tasks, but they are not mere programs inside our smartphones. They are a mixture of hardware devices (our phones and Internet servers) that capture signals (our voice) and communicate with software in the cloud, which does the appropriate processing and replies within milliseconds with an appropriate, in-context answer. The algorithms integrated into the servers process the speech using sophisticated machine learning techniques and create the appropriate answer. The combination of user phones and tablets and Internet servers (the physical side) with processing algorithms (the cyber side) makes up a CPS. It evolves and improves over time thanks to the millions of requests and interactions (10 billion a week, according to Apple) between users and the intelligent algorithms. Another example of a CPS can be found in the energy sector, where the electrical network formed by smart meters, transformers, transmission lines, power stations and control centers makes up the so-called “Smart Grid”.
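A toy sketch may help to visualize this physical/cyber split: a local function plays the role of the device capturing a signal, while process_in_cloud() stands in for the remote analytics service. Both function names and the simple anomaly rule are illustrative assumptions, not any vendor’s API.

```python
# Minimal CPS sketch: the "physical" side samples a signal, the "cyber" side (here a
# local function simulating a cloud service) processes it and returns an in-context answer.
from statistics import mean
from typing import List


def capture_signal() -> List[float]:
    """Physical side: a device samples a signal (e.g. a microphone or vibration sensor)."""
    return [0.48, 0.52, 0.91, 0.49, 0.50]


def process_in_cloud(samples: List[float]) -> str:
    """Cyber side: server-side analytics turn raw samples into a useful response."""
    if max(samples) > mean(samples) * 1.5:
        return "Anomaly detected, please check the equipment."
    return "Everything looks normal."


# The device sends its samples "to the cloud" and receives the processed response.
print(process_in_cloud(capture_signal()))  # Anomaly detected, please check the equipment.
```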

The same philosophy can be applied in industrial environments, where IT technologies are deployed at different levels of complexity. The fast deployment of IoT solutions, together with cloud computing connected through Big Data analytics, opens the door to so-called industrial analytics. Rather than theoretical explanations, some examples of CPS applications in the manufacturing environment will be more illustrative:

  • CPS for OEM manufacturers, where key components (e.g. industrial robots) are analyzed in real time by measuring different internal signals. The advantages are multiple: the OEM manufacturer can analyze the usage of each robot and compare it with other robots in the same or different factories, improve the next generation of robots, or give advice on maintenance and upgrades (both hardware and software).
  • CPS for operators: a company providing subcontracted services (e.g. maintenance) can gather information in the field through smart devices to optimize its operations, for example by controlling spare-parts stock in a centralized way instead of maintaining multiple local stocks across different sites.
  • CPS for factories: by gathering on-field information from manufacturing lines (e.g. cycle times), it is possible to build virtual models of the factories and create off-line simulations to aid decision support (e.g. process optimization) or to study the impact of changes in the production lines (e.g. building a new car model on the same line) before deciding on new investments (a minimal simulation sketch follows this list).
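To give a flavour of the last example, here is a minimal “what-if” sketch built on measured cycle times. The station names, the times and the bottleneck-based throughput formula are simplifying assumptions for illustration, not a model of any real line.

```python
# Minimal off-line "what-if" simulation from measured cycle times (illustrative values).
from typing import Dict

# Measured average cycle time per station, in seconds (assumed values).
cycle_times: Dict[str, float] = {"stamping": 55.0, "welding": 62.0,
                                 "painting": 58.0, "assembly": 70.0}


def line_throughput(times: Dict[str, float]) -> float:
    """Cars per hour for a serial line, limited by the slowest station (the bottleneck)."""
    return 3600.0 / max(times.values())


print(f"Current throughput: {line_throughput(cycle_times):.1f} cars/hour")          # 51.4

# What-if: a new car model adds 8 seconds to the assembly station.
what_if = dict(cycle_times, assembly=cycle_times["assembly"] + 8.0)
print(f"Throughput with the new model: {line_throughput(what_if):.1f} cars/hour")   # 46.2
```

A full factory digital twin refines this idea with buffers, variability and scheduling, but the decision-support logic is the same: simulate the change off-line before investing.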

The combination of physical and virtual solutions opens the door to limitless possibilities for factory optimization.