New challenges in the smart manufacturing industry

Big Data, one of the so-called “digital enablers” of Industry 4.0, sits at the core of the technologies promising to contribute to the revolution at factories, where vast amounts of data (whether big or small) hide an enormous amount of knowledge and potential improvements for manufacturing processes.

The Strategic Research and Innovation Agenda (SRIA) of the Big Data Value Association (BDVA) defines the overall goals, the main technical and non-technical priorities, and a research and innovation roadmap for the European Public Private Partnership (PPP) on big data. Within current expectations for the future data market in Europe (around €60 billion), manufacturing ranked first in 2016 (€12.8 billion) and in the 2020 projections (€17.3 billion), revealing the leading role this sector plays in the overall data economy.

Aiming at an agreed synthesis, the BDVA adopted the “Smart Manufacturing Industry” (SMI) concept definition, which covers the whole value chain gravitating around goods production, and then identified three Grand Scenarios intended to represent all the different facets of the SMI in Europe: Smart Factory, Smart Supply Chain and Smart Product Lifecycle.

Given the relevance of both the data market and the manufacturing industry in Europe, and in line with the European initiative on the Digitisation of Industry, CARTIF, together with the rest of the experts of the BDVA, engaged in a collective effort to define a position paper that proposes future research challenges for the manufacturing industry in the context of Big Data.

To contextualize these research challenges, the BDVA has defined five technical areas for research and innovation within its community:

  • Data Management and lifecycle, motivated by the data explosion, where traditional means of data storage and management can no longer cope with the size and speed of the data delivered.
  • Data Processing Architectures, driven by the rapid development and adoption of the Internet of Things (IoT) and the need to process immense amounts of sensor data streams.
  • Data Analytics, which aims to advance technologies and develop capabilities to turn Big Data into value, but also to make those approaches accessible to a wider public.
  • Data Protection, addressing the need to ensure the correct use of information while guaranteeing user privacy. It includes advanced data protection, privacy and anonymisation technologies.
  • Data Visualisation and User Interaction, addressing the need for advanced means of visualisation and user interaction capable of handling the continuously increasing complexity and size of data, and of supporting the user in exploring and understanding Big Data effectively.

During a series of workshop activities, from the 2016 EBDVF Valencia Summit to the 2017 EBDVF Versailles Summit, BDVA experts distilled a set of research challenges for the three grand scenarios of smart manufacturing. These research challenges were mapped onto the five technical priority areas of the big data reference model introduced above.

To exemplify the outcomes of this mapping, the following figure gathers the headings of the challenges identified and discussed by the BDVA members for the Smart Factory scenario. Interested readers are encouraged to explore the full set of challenges in the SMI white paper.

The challenges set out in this first version of the SMI position paper set the tone for upcoming research needs in the different Big Data areas related to manufacturing. In the Smart Factory scenario, the focus is on the integration of multiple sources of data coming not only from the shop floor but also from the offices, which were traditionally kept separate in Industry 3.0. Interoperability of existing information systems and the integration of disruptive IoT technologies are major challenges in the area of data management. Closer to the needs of a Smart Factory, the analytics challenges focus on prescriptive analytics as a tool for optimal decision making in manufacturing operations management, including optimization through the evolving concept of the digital twin.
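As a purely illustrative sketch of what prescriptive analytics can mean at the operations level (our own toy example, not taken from the position paper; the cost model and its parameters are hypothetical), a digital twin could expose a cost function that is then optimised to recommend a setpoint:

```python
# Minimal prescriptive-analytics sketch: pick the line speed that minimises
# a hypothetical cost model combining energy use and scrap rate.
from scipy.optimize import minimize_scalar

def production_cost(line_speed):
    """Toy cost model: energy grows with speed, scrap grows when running too fast."""
    energy_cost = 0.8 * line_speed                            # illustrative energy term
    scrap_cost = 50.0 / line_speed + 0.02 * line_speed ** 2   # illustrative quality penalty
    return energy_cost + scrap_cost

result = minimize_scalar(production_cost, bounds=(10.0, 120.0), method="bounded")
print(f"Recommended line speed: {result.x:.1f} units/h, expected cost: {result.fun:.2f}")
```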

‘Easy-to-use’ hardware and software

Many research and innovation projects aim at the design and development of an electronic device intended to satisfy the main requirements of the market. In general, we look for devices capable of acquiring information about the physical world around us and, in many cases, of interacting with it.

To validate the idea, it is necessary to build a preliminary prototype that provides a first approximation of the final solution. Generally, the most complex and interesting part is the electronic design of the device: designing and developing the electronic board, defining power consumption and communication requirements, and selecting the microcontroller, PCB, components, connectors, etc.

This task means paying for expensive electronic design software licences, integrating expert electronics staff into the work team and allocating a significant share of the project hours to its execution.

Times change, and more and more hardware development platforms are making these changes possible. These platforms offer the user a board that integrates the microcontroller with the basic circuits for communication, power, etc. Among them stand out Parallax, STMicroelectronics, LaunchPad, Microchip ChipKIT and mbed (ARM’s platform for Internet of Things solutions).

But if I had to choose one of these platforms right now, I would go for Arduino. I think it has cleverly combined hardware and software, resulting in a flexible, open-source and easy-to-use prototyping platform whose features are:

  • Hardware based on powerful boards that integrate simple microcontrollers, characterised by low cost, small size and low power consumption. The designs are published under a Creative Commons licence, and a wide variety of auxiliary equipment developed by other manufacturers to support the platform is available on the market.
  • Open-source software, based on a simple and clear development environment that nevertheless allows expert programmers to build complex solutions. This is due in part to the availability of a multitude of standardised libraries contributed by a large online community.

These characteristics facilitate the integration of the new trends and developments continuously emerging in the field of electronics, thus improving the platform’s features and capabilities.

Although it may seem, a priori, that this platform is designed for beginners experimenting with electronics, its features make it a flexible and powerful tool for expert users, facilitating the development of advanced prototypes.

Therefore, these tools reduce the cost and design time of any technological proposal, facilitating the creation of prototypes and reducing the errors generated in the development phase. They allow the researcher to forget about low-level implementation and focus on the design features.
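As a small example of this high-level focus (a minimal sketch under our own assumptions: it presumes a board running a sketch that prints one numeric sensor reading per line over USB serial at 9600 baud), a few lines of Python with the pyserial library are enough to read a prototype’s data:

```python
# Read sensor values streamed by an Arduino-style board over USB serial.
# Assumes the board prints one numeric reading per line (hypothetical setup).
import serial  # pip install pyserial

PORT = "/dev/ttyACM0"  # typical Arduino port on Linux; "COM3"-style names on Windows
BAUD = 9600

with serial.Serial(PORT, BAUD, timeout=2.0) as board:
    for _ in range(10):  # grab ten readings as a demo
        line = board.readline().decode("ascii", errors="ignore").strip()
        if line:  # skip empty reads caused by timeouts
            print(f"sensor reading: {float(line):.2f}")
```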

This technology has great potential for integration into several of the research and innovation lines on which the European Union is currently working under H2020, such as the Internet of Things and the Factories of the Future.

In CARTIF we are aware of their importance and have started to use these platforms to support our research work. Examples are the European project SANDS, where the Internet of Things, social networks and intelligent systems converge, and the Spanish project REPARA 2.0, which seeks new autonomous wireless sensors to be embedded in the asphalt layer of our roads.

How to improve your processing plant without large investments?

Any processing plant (continuous, batch or hybrid) can improve its economic, safety and environmental indicators in two ways: by improving its processing equipment or by improving the control of that equipment.

Improving the processing equipment is usually a task that requires large investments, since it almost always involves acquiring new processing equipment or, in the best case, expensive remodeling.

By contrast, these performance indicators can be substantially improved through control, in most cases without any investment in new instrumentation and control equipment. This is because in practically all cases there is a wide margin for improving the performance indicators of a processing plant through its regulation.

This margin for improvement has multiple origins. The most common causes are: the control system is not well designed or tuned; due to ignorance or haste, the available control system is not used to its full potential; the PLC programmer or the process engineer is not a control expert; the dynamics of the processes under automatic regulation are not known in the required depth; the plant was not designed under the integrated design approach.

The actions that can be implemented to improve the performance of control loops without any investment are also diverse and numerous, and we will review them in upcoming posts: improving the tuning of the regulator, redesigning the controller, implementing feedforward (anticipatory) compensation of disturbances, improving the tuning of cascade regulation loops, redesigning or re-tuning the level controllers of buffer tanks where necessary, using an advanced control algorithm available on or supported by the controller instrument, reducing couplings between loops, improving the mounting of the measuring probe, etc.

The vast majority of processing plants are automated using control structures (basic control, cascade control, split-range control, selective control, coupled loops, etc.) based on the universal PID controller in all its variants (P, PI, PD, PID, PI_D, etc.).
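For reference, this is roughly what the algorithm at the heart of all those loops looks like: a minimal discrete-time PID sketch in Python (ours, purely illustrative, not a plant-ready implementation), where the tuning parameters kp, ki and kd determine the loop performance discussed below:

```python
class PID:
    """Minimal discrete-time PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                  # integral term accumulates error
        derivative = (error - self.prev_error) / self.dt  # derivative reacts to error change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: regulate a toy first-order process towards a setpoint of 50.0.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
level = 0.0
for _ in range(200):
    u = pid.update(setpoint=50.0, measurement=level)
    level += (u - 0.5 * level) * 0.1  # toy process dynamics, illustrative only
print(f"final value: {level:.2f}")
```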

Despite its longevity and the development of multiple advanced control techniques, PID control maintains an overwhelming presence in the process industry.

Its use in industry is so extensive that all the surveys known to the author conclude unanimously that more than 95% of existing control loops are of the PID type. However, many surveys also conclude that a high percentage of the PID control loops in the world are operated in manual mode, while a similar percentage operates defectively. For example, as shown in the following figure, [1] reports that only 16% of PID regulation loops are optimally tuned and therefore perform excellently.

There is no doubt that, in most cases, incorrect or poor tuning of the controller can be the cause of a control loop’s poor performance or irregular operation.

However, it should not be forgotten that automatic regulation systems are holistic systems, and as such they must be analysed as a whole and not only through the parts that compose them. That is why it is necessary to review the other components of the loop before deciding what action to take on it.

Hence, the procedure must in all cases begin with a field review of all the components of the loop (controller, process, actuator, sensor and communication channels), as well as an analysis of possible coupling with other process loops.

The result of this first phase will determine the specific action to be taken to resolve the poor performance of the automatic regulation loop.

CARTIF offers this service to optimize the performance of the regulation systems of processing plants. The optimization reduces the oscillations and variability of the production plant, making the regulation system more accurate, faster, more stable and safer, thereby improving the plant’s efficiency, safety, environmental impact and profitability.

In upcoming posts, the execution procedure for each of the possible actions will be described, starting with the simplest one: re-tuning the controller.

New applications of Deep Learning

A little over a year ago, in another post on this blog, our colleague Sergio Saludes explained what deep learning is and detailed several of its applications (such as the victory of a machine based on these networks over the world champion of Go, considered the most complex game in the world).

Well, in these 16 months (an eternity in this field) there has been great progress in both the number of applications and the quality of the results obtained.

Considering, for example, the field of medicine, diagnostic tools based on deep learning are increasingly used, in some cases achieving higher success rates than human specialists. In specialties such as radiology these tools are proving to be a major revolution, and they have also been applied successfully in related industries such as pharmaceuticals.

In sectors as varied as industrial safety they have recently been used to detect cracks in nuclear reactors, and they have also begun to be used in finance, energy consumption prediction and other fields such as meteorology and the study of sea waves.

Autonomous driving projects, so in vogue these days, mainly use tools based on deep learning to compute many of the decisions to be made in each situation. On this issue there is some concern about how these systems will decide what actions to take, especially when human lives are at stake, and there is already an MIT webpage where the general public can collaborate in creating an “ethics” of the autonomous car. In reality, these systems can only decide what they have previously been programmed (or trained) to decide, and there is certainly a long way to go before machines can decide for themselves (in the conventional sense of “deciding”, although this would lead to a much more complex debate on other issues such as the singularity).

Regarding the Go program mentioned above (which beat the world champion 4 to 1), a new version (AlphaGo Zero) has been developed that beat that previous version 100 to 0, knowing only the rules of the game and training against itself.

Other areas such as language translation, speech comprehension and voice synthesis have also advanced very noticeably, and the use of personal assistants on mobile phones is beginning to become widespread (if we overcome the natural reluctance or embarrassment of “talking” to a machine).

CARTIF has also been working on deep learning systems for some time now, and different types of solutions have been developed, such as the classification of architectural heritage images within the European INCEPTION project.
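To illustrate what such an image classifier can look like (a minimal sketch under our own assumptions, not the INCEPTION implementation; the four heritage classes are hypothetical), a network pre-trained on generic images can be reused and only its final layer trained on the new domain:

```python
# Minimal transfer-learning sketch for image classification with Keras.
# Not the INCEPTION project code: model choice and class count are illustrative.
import tensorflow as tf

NUM_CLASSES = 4  # hypothetical heritage categories, e.g. facade, column, arch, vault

# Reuse a network pre-trained on ImageNet as a frozen feature extractor.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # only this layer is trained
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # train on labelled heritage images
```

Transfer learning of this kind is popular precisely because it sidesteps part of the training cost discussed next.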

All these developments come at a high computational cost, especially for the training of the neural networks used. In this respect, progress is being made on both fronts involved: much faster and more powerful hardware, and more evolved and optimised algorithms.

In view of the advances made in this field, it seems that deep learning is the holy grail of artificial intelligence. This may not be the case, and we may simply be looking at one more new tool, but there is no doubt that it is an extremely powerful and versatile tool that will give rise to new and promising developments in many applications of artificial intelligence.

And of course there are many voices warning of the potential dangers of this type of intelligent system. The truth is that it never hurts to guard against the potential risks of any technology, although, as Alan Winfield says, it is not so much artificial intelligence we should fear as artificial stupidity. As always in these cases, the danger of any technology lies in the misuse that can be made of it, not in the technology itself. Faced with this challenge, what we must do is promote mechanisms that regulate any unethical use of these new technologies.

We are really only at the beginning of another golden age of artificial intelligence; there have been several before, although this time it does seem to be the definitive one. We do not know where this stage will take us, but, trusting that we will be able to take advantage of the possibilities it offers, we should be optimistic.

Dynamic electricity prices and washing machines with initiative

The day we all enjoy dynamic electricity prices thanks to the smart grid, we will see the washing machine and other home appliances come to life. And they will do so to let us pay less for the energy they need to do their work. This will be one of the advantages of dynamic prices, which change throughout the day to encourage us to use energy when there is a surplus and to dissuade us from using it when there is a shortage.

To better understand how dynamic prices will impact our lives, a research project was conducted in Belgium involving 250 families equipped with smart home appliances, namely washing machines, tumble dryers, dishwashers, water heaters and electric car chargers. Smart home appliances are those that receive information about electricity rates and can make decisions about their own operation. For the purposes of the project, the day was divided into six time slots with different electricity prices according to the energy market, and the families involved in the experiment were divided into two groups.

Researchers of the Linear intelligent networks project in a demonstration

The first group got information about the next day’s electricity prices through an app installed on a mobile device. They then had to plan the use of their appliances for the next day, considering the prices and their needs.

The second group had appliances that reacted to prices in an automated fashion while preserving the owners’ comfort. To understand how it worked, imagine a family who want their dishes ready for dinner at 6 PM. At 8 AM, when they leave home for work, they switch on the dishwasher and indicate the time by which the dishes must be ready. If the dishwasher needs two hours to complete the job, it knows it can start at any moment between 8 AM and 4 PM, and it chooses a moment within the cheapest time slot. If energy were cheaper after 4 PM, the dishwasher would still start at 4 PM to ensure the dishes were clean and dry by the time the owners needed them. Other appliances, like the water heater, simply chose the time slots with cheaper energy to keep the water at the desired temperature.
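The scheduling rule described above can be sketched in a few lines of Python (our own reconstruction for illustration, not the project’s software; the slot boundaries and prices are hypothetical, and for simplicity the price at the start hour decides):

```python
# Pick the cheapest feasible start time for an appliance cycle.
# Slots are (start_hour, end_hour, price_eur_per_kwh); values are illustrative.
SLOTS = [(0, 8, 0.10), (8, 12, 0.25), (12, 16, 0.18), (16, 20, 0.30), (20, 24, 0.15)]

def cheapest_start(now, deadline, duration):
    """Return the cheapest start hour such that the cycle finishes by the deadline."""
    latest_start = deadline - duration
    candidates = [
        max(start, now)  # cannot start before "now"
        for start, end, _ in sorted(SLOTS, key=lambda s: s[2])  # cheapest slots first
        if max(start, now) <= latest_start and max(start, now) < end
    ]
    return candidates[0] if candidates else latest_start  # fall back to the latest start

# Dishwasher switched on at 8 AM, dishes needed by 6 PM (18h), cycle takes 2 h.
print(cheapest_start(now=8, deadline=18, duration=2))  # -> 12, the cheapest feasible slot
```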

The customers in the first group found the system annoying and left the experiment. Those in the second group, however, found that the method did not affect their comfort and that their appliances preferred to work at night. Besides this, there was a reduction in the electricity bill: 20% for dishwashers, 10% for washing machines and tumble dryers, and 5% for water heaters.

One of the findings of the project was that customers do not like having to keep an eye on the next day’s prices. This result is quite surprising considering the success of the Opower company, which claimed to be able to reduce bills, energy use and CO2 emissions using a customer information system quite similar to the one the Belgians used with the first group: getting information the day before in order to make decisions in advance. But today Opower belongs to Oracle, perhaps because this big company was more interested in the data and knowledge Opower had about how people use energy than in the possible benefits for the environment, the electric grid and customers’ wallets. Anyway, it seems the original Opower spirit remains alive.

Thanks to the smart grid, our washing machines will soon be connected to the power company through the Internet and will be in charge of deciding when to work in order to reduce our electricity bill. After that, if washing machine makers could design a machine capable of ironing the clothes as well, our happiness would be complete.

Best practices in energy efficiency in industry projects

Traditionally, the factors taken into account in manufacturing processes were economic, managerial, production-related, etc. This situation has changed in recent years: energy efficiency and sustainable management are now fundamental aspects that many companies have incorporated into their processes. Aware of this reality, CARTIF is helping companies to incorporate the “Factories of the Future” concept. An example of this work is the REEMAIN project.

REEMAIN moves towards zero-carbon manufacturing and Energy Efficiency 2.0 through the intelligent use of renewable energy technologies and resource-saving strategies that consider energy purchase, generation, conversion, distribution, utilisation, control, storage and re-use in a holistic and integrated way.

In addition, the REEMAIN project has given us the opportunity to expand our knowledge and experience in the world of resource- and energy-efficient manufacturing. During the demonstration actions at the factories, the team experimented with energy- and material-saving technologies and processes and, of course, tested their effectiveness.

As the project comes to an end, we have produced a Best Practices Book as a way of sharing our experience with other professionals in the material and energy efficiency manufacturing domain.

The REEMAIN Best Practice Book summarises the key findings from our experience of over four years working on the project, as recommendations to the wider community involved in this kind of project (designers, research institutions, factory owners, workers, contractors, public bodies, investors, etc.), in order to help any of them who decide to get involved in an efficiency improvement project within a factory.

18 best practices are featured, based on our experience in searching for and testing efficiency measures in our three demo factories: GULLON (biscuits), BOSSA (textiles) and SCM (iron and steel). Three main thematic areas were identified: best practices on “design”, on “operation and maintenance” and on “exploitation and dissemination”.

Each of them is presented in a short, visual format comprising: title, description (itself a recommendation), stakeholders, replicability, practical guidelines and things to avoid, impact rating and, finally, the REEMAIN practical experience.

The Best Practice Book is available online to download for free.