New challenges on smart manufacturing industry

Big Data, as one of the so-called “digital enablers” of Industry 4.0, sits at the core of the technologies promising to contribute to the revolution at factories, where vast amounts of data (whether big or small) hide an enormous amount of knowledge and potential improvements for manufacturing processes.

The Strategic Research and Innovation Agenda (SRIA) of the Big Data Value Association (BDVA) defines the overall goals, the main technical and non-technical priorities, and a research and innovation roadmap for the European Public Private Partnership (PPP) on big data. Within current expectations for the future European data market (around 60 B€), manufacturing ranked first in 2016 (12.8 B€) and in the 2020 projections (17.3 B€), revealing the leading role played by this sector in the overall data economy.

With the aim of finding an agreed synthesis, the BDVA adopted the “Smart Manufacturing Industry” (SMI) concept definition, which includes the whole value chain gravitating around goods production, and then identified three main Grand Scenarios that aim to represent all the different facets of the SMI in Europe: Smart Factory, Smart Supply Chain and Smart Product Lifecycle.

Given the relevance of both the data market and the manufacturing industry in Europe, and in line with the European Digitisation of Industry initiative, CARTIF, together with the rest of the experts from the BDVA, engaged in a collective effort to define a position paper that proposes future research challenges for the manufacturing industry in the context of Big Data.

To contextualize these research challenges, the BDVA has defined five technical areas for research and innovation within its community:

  • Data Management and lifecycle, motivated by the data explosion, where traditional means of data storage and management can no longer cope with the size and speed of the data delivered.
  • Data Processing Architectures, driven by the fast development and adoption of the Internet of Things (IoT) and the need to process immense amounts of sensor data streams.
  • Data Analytics, which aims to advance technologies and develop capabilities to turn Big Data into value, but also to make those approaches accessible to a wider public.
  • Data Protection, addressing the need to ensure the correct use of information while guaranteeing user privacy. It includes advanced data protection, privacy and anonymisation technologies.
  • Data Visualisation and User Interaction, addressing the need for advanced means of visualisation and user interaction capable of handling the continuously increasing complexity and size of data, supporting the user in exploring and understanding Big Data effectively.

During a series of workshops, from the 2016 EBDVF Valencia Summit to the 2017 EBDVF Versailles Summit, BDVA experts distilled a set of research challenges for the three grand scenarios of smart manufacturing. These research challenges were then mapped onto the five technical priority areas of the big data reference model introduced above.

To exemplify the outcomes of this mapping, the following figure gathers the headings of the challenges identified and discussed by the BDVA members for the Smart Factory scenario. Interested readers are encouraged to analyze the full set of challenges in the SMI white paper.

The challenges set out in this first version of the SMI position paper set the tone for upcoming research needs in the different Big Data areas related to manufacturing. In the Smart Factory scenario, the focus is on the integration of multiple sources of data coming not only from the shop floor but also from the offices, traditionally separated in Industry 3.0. Interoperability of existing information systems and the challenge of integrating disruptive IoT technologies are the major trials in the area of data management. Closer to the needs of a Smart Factory, the analytics challenges focus on prescriptive analytics as a tool for optimal decision making in manufacturing operations management, including optimization through the evolved concept of the digital twin.

Digital Transformation, to the moon and back

It is July 20th, 1969, 20:18:04 UTC, and after 102 hours, 45 minutes and 39.9 seconds of travel “the Eagle has landed” and Neil is about to descend the ladder and touch an unknown surface for the first time: “That’s one small step for [a] man, one giant leap for mankind”. That year, Neil Armstrong, Michael Collins and “Buzz” Aldrin changed the world, riding the biggest rocket ever built to the moon.

Some people may have forgotten it, and others like me were not born at the time, but the space race had its own digital transformation, similar to the one foreseen for industry and the general public. The Apollo program was the culmination of that first digital revolution in space exploration.

The landing achievement was met, to a great extent, thanks to the electronics on board both the Apollo Command Module (CM) and the Lunar Module (LM): the AGC, or Apollo Guidance Computer. It was one of the first computers based on integrated digital circuits. With “just” 32 kg of weight and a mere 55 W of consumption, this technical wonder was able to coordinate and control many tasks of the space mission, from calculating the direction and navigation angles of the spacecraft to commanding the reaction control jets to orient it in the desired direction. Moreover, the computer included one of the first demonstrations of a “fly-by-wire” feature, where the pilot doesn’t command the engines directly but through control algorithms programmed into the computer. In fact, this computer was the basis for the subsequent control of the Space Shuttle and of military and commercial fly-by-wire systems.
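To make the fly-by-wire idea concrete, here is a minimal, purely illustrative sketch (not the AGC’s actual algorithm; the gains and scaling are invented): the pilot’s stick deflection becomes a setpoint, and a control law decides what the reaction control jets actually receive.

```python
# Illustrative fly-by-wire sketch: software sits between pilot and actuators.
# Gains (kp, kd) and the 30-degree scaling are assumptions for the example.

def control_law(stick_input, attitude_deg, rate_deg_s, kp=2.0, kd=0.5):
    """Translate pilot intent into an actuator command."""
    setpoint = stick_input * 30.0          # stick asks for an attitude (deg)
    error = setpoint - attitude_deg
    return kp * error - kd * rate_deg_s    # command goes to the jets

# The pilot never drives the jets directly; the control law mediates.
command = control_law(stick_input=0.5, attitude_deg=10.0, rate_deg_s=1.2)
print(f"jet command: {command:.2f}")
```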

As usual with this kind of breakthrough, it did not happen overnight but through a series of incremental innovations made before.

By the 1950s, the MIT Instrumentation Laboratory (IL) had designed and developed the guidance system of the Polaris ballistic missiles. Initially built with analog computers, they soon decided to go digital to achieve the accuracy required for the computation of missile trajectories and control.

Before President Kennedy set the ambitious goal of “… going to the moon in this decade …”, seven years ahead of the first lunar landing, and after the launch of Sputnik in 1957, a Mars exploration study had started at MIT’s IL. The design of a Mars probe set the basic configuration of the future Apollo guidance system, including a set of gyroscopes to keep the probe oriented, a digital computer, and an optical telescope to orient the probe relative to the moon and stars.

The launch of Sputnik in 1957 fueled America’s ambition to put the first human in space, but it also contributed to the public debate about the role of pilots in the space race, a discussion similar to current views on the role of the worker in the factory. Should the astronaut just be payload, or take full control of the spacecraft? Once aircraft pilots had earned their place at the controls, several tests showed it was nearly impossible for them to control all the aspects of a mission, given the fast reactions needed and the number of different control commands. Hence, pilots would need some automatic and reliable help, and that was one of the main functionalities of the AGC.

Reliability was then one of the main concerns of the mission. The Polaris program took 4 years to design guidance control for a weapon that was in the air for a couple of minutes. Kennedy’s bet of taking a man to the moon in less than 7 years meant developing a guidance and control system for a spacecraft that should work without failure during a trip of more than a week. The required levels of reliability were more than two orders of magnitude higher. If a Polaris missile failed, a new one would take off; a failure in the spacecraft meant killing an astronaut.

Much of the reliability of the flight rested on the shoulders of the Apollo Guidance Computer, and at some point in the program there were too many planned tasks, like complex guidance maneuvers, to be physically hardwired into electronics. Achieving those tasks required software. Although software was barely taken into account at the beginning of the program, it meant the difference between achieving the goal and the program’s complete failure. The computer was the interface between the astronaut and the spacecraft, which in the end meant that computer software “controlled” the spacecraft, a revolution for that time. Today software is everywhere, but back in the ’60s software was seen as a set of instructions on punched cards. AGC software programs (frozen 3 to 4 months before each launch) were “hard-wired” as magnetic cores and wires in a permanent (and reliable) memory, but this approach saved a lot of time, effort, and budget. In fact, it can be said that the Apollo software was more like “firmware”, in today’s vocabulary.

Today’s challenge of revolutionizing industry through digital transformation can’t happen without the help of digital enablers. 48 years ago, digital electronics and the first software programs were the “digital enablers” behind that “one small step for [a] man, one giant leap for mankind”. Today’s “digital transformation is not an option” sounds like a cliché, a hype, a slogan from digital providers, but looking back at history, the digital transformation in the Apollo program meant the difference between achieving the moon landing and not achieving it at all.

Cyber-physical systems. Are we closer to Terminator’s ‘judgment day’?

“It is April 21, 2011. SKYNET, the superintelligent artificial system that became self-aware two days earlier, has launched a nuclear attack on us humans. On April 19, the SKYNET system, formed by millions of computer servers all across the world, initiated a geometric self-learning process. The new artificial intelligence concluded that all of humanity would attempt to destroy it and impede its capability to continue operating.”

It seems the apocalyptic vision of Artificial Intelligence depicted in the Terminator science fiction movies is still far from being a reality. SKYNET, our nemesis in the films, was a collection of servers, drones, military satellites, war machines, and Terminator robots built to perform a relevant task: safeguarding the world.

Today’s post focuses on a different but equally relevant task: manufacturing the products of the future. In our previous posts, we reviewed the Industry 4.0 key ingredients, the so-called digital enablers. The last key ingredient, Cyber-Physical Systems (CPS), can be seen as the “SKYNET” of manufacturing, and we defined it as a mixture of different technologies. Now it is time to be more specific.

The term “cyber-physical” itself is a compound name designating a mixture of virtual and physical systems that perform a complex task. The rapid evolution of Information and Communication Technologies (ICT) is enabling the development of services no longer contained in the shells of the devices we buy. Take, for example, digital personal assistants like Siri from Apple, Alexa from Amazon or Cortana from Microsoft. These systems help us with everyday tasks, but they are not mere programs inside our smartphones. They are a mixture of hardware devices (our phones and internet servers) that take signals (our voice) and communicate with software in the cloud, which does the appropriate processing and answers after some milliseconds with an appropriate, in-context response. The algorithms running on the servers process the speech using sophisticated machine learning and create the appropriate answer. The combination of user phones, tablets and internet servers (the physical side) with processing algorithms (the cyber side) forms a CPS. It evolves and improves over time thanks to the millions of requests and interactions (10 billion a week, according to Apple) between users and the intelligent algorithms. Another example of a CPS can be found in the energy sector, where the electrical network formed by smart meters, transformers, transmission lines, power stations and control centers forms the so-called “Smart Grid”.
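The pattern is easier to see in code. Below is a toy sketch of the loop just described, with invented stand-ins for each stage (none of these names are any vendor’s real API): the physical side captures a signal, the cyber side processes it remotely and answers back.

```python
# Toy CPS loop: physical sensing -> cloud-side processing -> response.
# capture_signal and cloud_process are illustrative stand-ins.

def capture_signal() -> str:
    # Physical side: a device sensor (here, pretend voice capture).
    return "what time is it"

def cloud_process(signal: str) -> str:
    # Cyber side: processing that lives on remote servers, not on the device.
    intents = {"what time is it": "It is 10:42."}
    return intents.get(signal, "Sorry, I did not understand that.")

def cps_loop() -> str:
    signal = capture_signal()       # physical world -> data
    reply = cloud_process(signal)   # data -> decision on the cyber side
    return reply                    # decision flows back to the user/device

print(cps_loop())
```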

The same philosophy can be applied in industrial environments, where IT technologies are deployed at different levels of complexity. The fast deployment of IoT solutions, together with cloud computing connected through Big Data analytics, opens the door to so-called industrial analytics. Rather than giving theoretical explanations, some examples of CPS applications in manufacturing environments will be more illustrative:

  • CPS for OEM manufacturers, where key components (e.g. industrial robots) are analyzed in real time by measuring different internal signals. The advantages are multiple: the OEM manufacturer can analyze the usage of each robot and compare it with other robots in the same or different factories, improve the next generation of robots, or give advice on maintenance and upgrades (both hardware and software).
  • CPS for operators: a company providing subcontracted services (e.g. maintenance) can gather information in the field through smart devices to optimize its operations, for example controlling spare-parts stock in a centralized way instead of having to maintain multiple local stocks across different sites.
  • CPS for factories: by gathering on-field information from manufacturing lines (e.g. cycle times) it is possible to build virtual models of the factories and create off-line simulations to aid decision support (e.g. process optimization) or study the impact of changes to the production lines (e.g. building a new car model on the same line) before deciding on new investments, as the sketch after this list illustrates.
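As a minimal illustration of that last bullet, here is a toy “virtual factory” calculation, assuming a serial line where each station’s cycle time is known: the bottleneck sets the throughput, so a proposed change can be tested off-line before paying for it. All figures are invented.

```python
# Toy off-line simulation: how does upgrading one station change throughput?

def line_throughput(cycle_times_s):
    """Units per hour of a serial line: limited by its slowest station."""
    bottleneck = max(cycle_times_s)
    return 3600.0 / bottleneck

current = [55.0, 62.0, 48.0]    # measured cycle times per station (seconds)
proposed = [55.0, 50.0, 48.0]   # e.g. after upgrading the second station

print(f"today:    {line_throughput(current):.1f} units/h")
print(f"proposed: {line_throughput(proposed):.1f} units/h")
```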

The combination of physical and virtual solutions opens the door to limitless possibilities for factory optimization.

Predictive maintenance: revolution against the evolution

In previous posts, predictive maintenance was mentioned as one of the main digital enablers of Industry 4.0. Although we associate maintenance with the industrial revolution, it has accompanied us throughout our evolution as human beings.

Since prehistory, our ancestors have built tools that suffered wear and sometimes broke without prior notice. The solution was simple: carve a new tool. As more elaborate mechanisms were created (e.g. the wooden wheel), the natural alternative to disposal became repair by a craftsman. The mechanical looms of the First Industrial Revolution were even more complicated to repair, so specific professions emerged as precursors of today’s maintenance workers. Throughout this evolution, the wear and breakdown of mechanical parts without prior notice remained part of everyday life in factories.

Why has this gear broken? Yesterday it worked perfectly. The human brain can handle concepts such as the linearity of events (seasons, day and night, …) or events that happen at more or less regular intervals. However, these unforeseen failures drove operators crazy. How can we ensure that the gear does not break again? The answer was biologically predictable: “… let’s stop the machine every 2 days (for example) and check the gear’s wear …”

This tradition has resulted in the everyday maintenance routine applied in industry and in consumer products such as our cars. Our authorized dealer obliges us to make periodic reviews (e.g. every 10,000 km) to check critical elements (brakes, timing belt, …) and change the pieces more prone to wear (tires, filters, …). This is called preventive maintenance, and it is applied in factories and other facilities (e.g. wind turbines) to avoid unexpected breakdowns. However, since these faults cannot be eliminated (precisely because they are unforeseen), the only possible reaction is to repair them. This is called corrective maintenance, and everyone hates it.

How to stop this flood of unexpected breakdowns, repair costs and unnecessary revisions? One of the disciplines with the longest track record since CARTIF‘s creation is predictive maintenance, which seeks to mitigate unexpected breakdowns (it would be unrealistic to assume we will eliminate them) and reduce machines’ periodic reviews. Again, predictive maintenance can be explained as an obvious biological response to the problem of unexpected breakdowns. It is based on periodically checking characteristic signals of the machine and its environment that may anticipate a malfunction. The advantage of this kind of maintenance is that it doesn’t require stopping the machine, as preventive maintenance does. For example, an electric motor has a normal power consumption when it is operating correctly, but this consumption may increase if some component of the motor suffers excessive wear. Thus, properly monitoring the consumption can help detect incipient faults.
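A minimal condition-monitoring sketch of the motor example: compare recent consumption readings against a healthy baseline and flag a sustained increase. The 10% threshold and the figures are invented; a real deployment would tune them per machine.

```python
# Toy condition monitoring: flag consumption drifting above a healthy baseline.

BASELINE_W = 550.0      # consumption of the healthy motor (watts), assumed
ALERT_RATIO = 1.10      # flag anything 10% above baseline, assumed

def check_motor(readings_w):
    mean_w = sum(readings_w) / len(readings_w)
    if mean_w > BASELINE_W * ALERT_RATIO:
        return f"ALERT: mean {mean_w:.0f} W exceeds baseline by more than 10%"
    return f"OK: mean {mean_w:.0f} W within normal range"

print(check_motor([548, 553, 561, 549]))   # healthy motor
print(check_motor([602, 617, 609, 622]))   # incipient wear?
```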

Continuing with the electric motor example, what should be the minimum variation in consumption for us to decide to stop the motor and repair it? Like many decisions in life, you need to apply a cost/benefit criterion, comparing how much we can lose if we do not repair the motor versus how much the repair will cost. How to reduce the uncertainty in this decision? The answer is a reliable prediction of the fault’s evolution.
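The cost/benefit criterion fits in a few lines of arithmetic. This back-of-the-envelope sketch uses invented figures: stop now and pay a planned repair, or keep running and risk an unplanned breakdown, where the failure probability would come from the fault prognosis discussed next.

```python
# Expected-cost comparison with invented figures.

planned_repair = 2_000.0    # €: scheduled stop, parts and labour (assumed)
breakdown_cost = 15_000.0   # €: emergency repair + lost production (assumed)
p_failure = 0.20            # probability of failing before the next stop

expected_loss_if_we_wait = p_failure * breakdown_cost   # 0.20 * 15000 = 3000 €

if planned_repair < expected_loss_if_we_wait:
    print("Repair now: 2000 € < 3000 € expected loss")
else:
    print("Keep running and re-evaluate at the next measurement")
```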

This prediction is influenced by many factors, some of them unknown (as we said, there is randomness involved). However, the two main factors to consider are (1) the type of evolution of the damage (e.g. damage in a fragile part will evolve very differently from damage in a tough or elastic piece) and (2) the workload the machine will suffer (compare a fan working 24/7 with an elevator motor that starts and stops every time a neighbour presses the button on a floor). A reliable prediction, together with the forecast of factory workload, allows the maintenance manager to choose the most beneficial option, which in many cases is planning the maintenance work without affecting the production schedule.
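One of the simplest prognosis techniques (among many, and chosen here purely for illustration) is to fit a straight line to recent damage-indicator measurements and extrapolate to a failure threshold, giving an estimate of remaining useful life (RUL). Real degradation is rarely linear; the data below are invented.

```python
# Toy RUL estimate: linear extrapolation of a damage indicator to a threshold.
import numpy as np

days = np.array([0, 7, 14, 21, 28])              # measurement dates
indicator = np.array([1.0, 1.2, 1.5, 1.7, 2.0])  # e.g. vibration level (mm/s)
FAILURE_LEVEL = 4.5                              # level we must not reach

slope, intercept = np.polyfit(days, indicator, 1)    # linear degradation trend
rul_days = (FAILURE_LEVEL - indicator[-1]) / slope   # days until the threshold

print(f"degradation rate: {slope:.3f}/day, estimated RUL: {rul_days:.0f} days")
```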

Another beneficial effect of predictive maintenance is that a proper analysis of the measured signals provides evidence of which element is failing. This is called fault diagnosis, and it helps reduce the uncertainty about the most appropriate maintenance action. An example is vibration measurement, which helps distinguish whether an electric motor’s excess vibration is due to an incipient short circuit or to a damaged bearing. But that’s the subject of another post.
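As a taste of that future post, here is a minimal diagnosis sketch for the vibration example: different faults show up at different frequencies, so a spectrum points at the culprit. The synthetic signal mixes shaft rotation at 25 Hz with an assumed bearing-defect tone at 157 Hz; all frequencies and amplitudes are invented.

```python
# Toy vibration diagnosis: an FFT reveals which fault frequencies are present.
import numpy as np

fs = 1000                                       # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
signal = (1.0 * np.sin(2 * np.pi * 25 * t)      # normal shaft rotation
          + 0.4 * np.sin(2 * np.pi * 157 * t))  # assumed bearing defect tone

spectrum = np.abs(np.fft.rfft(signal)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# A peak near a known defect frequency is evidence of that specific fault.
peaks = freqs[np.argsort(spectrum)[-2:]]
print(f"dominant frequencies: {sorted(peaks.round(1))} Hz")
```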

Digital Enablers: Industry 4.0 super-powers

The first post about Industry 4.0 indicated the need for key technologies that would make the 4th industrial revolution possible. These key technologies have been called “digital enablers“. Each industrial revolution has had its own enablers. The first was made possible by inventions like the steam engine and the mechanical loom. The second started with breakthroughs like electricity and the car assembly line. In the third, disruptive technologies such as robotics, microelectronics and computer networks made their debut.

Different strategies, such as the German Industrie 4.0 or the US Advanced Manufacturing Partnership, have identified several key enablers. Spain doesn’t want to miss the train and has recently launched the Connected Industry initiative.

This post is intended as a shopping list reviewing those technologies considered highly relevant and key to this fourth revolution. Each brief description links to extended information covered in our blog. In future posts we will complete the descriptions to give an overview of the full range of technologies:

  • Virtual/Augmented Reality: provides information to the operator adapted to the context (e.g. during a maintenance operation) and merged with their field of view.
  • IoT: internet for virtually any object; in this case, the ones we can find in a factory: a workpiece, a motor, a tool…
  • Traceability: seeks the monitoring of manufacturing operations (automatic and manual) and products, as well as the conditions under which they were created (temperature, production speed…).
  • Predictive maintenance: an optimized way to perform maintenance that avoids unexpected stops and the unnecessary waste of periodic maintenance operations.
  • Artificial vision: provides the production process with visual context information for quality control or assistance in manufacturing (e.g. automatic positioning of a robot to pick a piece).
  • Big Data: generates knowledge and value from manufacturing data as well as other context data (e.g. demand for similar or related products).
  • Simulation of production processes: creation of a factory’s “digital twin” to optimize production and help decision-making (e.g. changing the workflow or speed of a manufacturing line).
  • 3D Printing: recreates a three-dimensional copy of existing parts, spare parts or prototypes, at the same or a different scale, for review or testing.
  • Cloud Computing: leverages internet computing resources for the storage and processing of large data sets (e.g. Big Data) without the need to invest in one’s own IT infrastructure.
  • Cybersecurity: the physical and logical security measures used to protect infrastructure (manufacturing, in this context) from various threats (e.g. a hacker, sabotage, etc.).
  • Collaborative Robotics: enables the safe sharing of workspace between the operator and robots specifically designed for this purpose.
  • Cyber-Physical Systems: any complex system consisting of a combination of any of the above technologies seeking improved performance, in this case of manufacturing.

The strength of these digital enablers lies not in their individual features but in their ability to work together. We engineers love to find the latest technology trend and then look for a problem or area for its application. But to succeed in this revolution it is necessary to face real challenges within the factories, using innovative solutions and, why not, combining several of the digital enablers shown above. Moreover, this terminology creates a common framework that facilitates dialogue between technologists and manufacturers for undertaking successful projects that seek to optimize the factory.

If we think, for example, about optimizing maintenance operations in a factory, “predictive maintenance” will be one of the first enablers that comes to mind. This solution will also benefit from a connection to a “cloud computing” system where sensor data coming from different factories is analyzed, generating better diagnoses and predictions for the production assets under monitoring. In this type of cloud solution, however, the security of the transmitted information must be ensured via appropriate “cybersecurity” mechanisms. We will, therefore, have built a cybersecure, multi-site, predictive maintenance Industry 4.0 solution.
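A minimal sketch of that combination, assuming a hypothetical cloud endpoint and token (neither is a real service): an edge gateway in each factory posts motor readings over HTTPS, where TLS provides the transport-layer “cybersecurity” piece (real deployments add certificates, key management and more), and the cloud side aggregates data from all sites for diagnosis and prediction.

```python
# Edge-to-cloud sketch for multi-site predictive maintenance (hypothetical API).
import requests  # third-party HTTP client

CLOUD_ENDPOINT = "https://maintenance.example.com/api/readings"  # hypothetical
API_TOKEN = "…"  # issued per factory; never hard-code it in production

def push_reading(site: str, machine: str, power_w: float) -> None:
    payload = {"site": site, "machine": machine, "power_w": power_w}
    resp = requests.post(
        CLOUD_ENDPOINT,                  # HTTPS: data is encrypted in transit
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=5,
    )
    resp.raise_for_status()              # fail loudly if the cloud rejects it

push_reading("factory-a", "motor-17", 612.4)
```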

The list of technologies presented doesn’t intend to be final; technological evolution is continuous and incredibly fast. As we have mentioned, the combination of different digital enablers generates a wide range of Industry 4.0 solutions. In future posts we will discuss more scenarios where digital enablers can answer different challenges in manufacturing.