Traditionally, the factors taken into account in manufacturing processes were economic, managerial, production-related, and so on. This has changed in recent years: energy efficiency and sustainable management are now fundamental aspects that many companies have incorporated into their processes. Aware of this reality, CARTIF supports companies in adopting the “Factories of the Future” concept. One example of this work is the REEMAIN project.
REEMAIN moves toward zero carbon manufacturing and Energy Efficiency 2.0 through the intelligent use of renewable energy technologies and resource saving strategies that consider energy purchase, generation, conversion, distribution, utilization, control, storage and re-use in a holistic and integrated way.
In addition, the REEMAIN project has given us the opportunity to expand our knowledge and experience in the world of resource- and energy-efficient manufacturing. During the demonstration actions at the factories, the team experimented with energy- and material-saving technologies and processes and, of course, tested their effectiveness.
As the project comes to an end, we have produced a Best Practices Book as a way of sharing our experience with other professionals in the material- and energy-efficient manufacturing domain.
The REEMAIN Best Practice Book summarises the key findings from over four years of work on the project. It collects the recommendations we make to the wider community involved in this kind of project (designers, research institutions, factory owners, workers, contractors, public bodies, investors, etc.), as a help to anyone who decides to get involved in an efficiency improvement project within a factory.
Eighteen best practices are featured. They are based on our experience searching for and testing efficiency measures in our three demo factories: GULLON (biscuits), BOSSA (textiles) and SCM (iron & steel). Three main thematic areas were identified: best practices on “Design”, on “Operation and Maintenance” and on “Exploitation & Dissemination”.
Each practice is presented in a short, visual way and is composed of: title, description (itself a recommendation), stakeholders, replicability, practical guidelines and things to avoid, impact rating and, finally, the REEMAIN practical experience.
I have tried my best not to start this post with what must be the most-used sentence in this sort of text, the one stating that “buildings account for 40% of energy consumption and 36% of GHG emissions”, but the fact is that it is a good starting point when writing about buildings and energy. To tell the truth, in this field, with unsustainable energy consumption rates, CO2 and other pollutant emissions, and improvement trends that are still far too slow, everyone knows that 40% is much more than we can afford.
When searching for reasons, it is evident that at some point architecture became somehow decontextualised, losing its connection with the environment and nature: the so-called “International” style defended an architecture valid for every place, where machines solve all those aspects left unsolved during the design. But in 1973 a reality check came; an unprecedented crisis brought the first energy laws and the first awareness campaigns. Once the energy “free-for-all” had ended, it was time to think about how to reduce energy consumption without affecting comfort at any level.
At that moment, after the effects of the crisis, architecture had a great opportunity to reinvent itself and introduce energy efficiency into its principles (those of Vitruvius, Le Corbusier or whatever underpins each architect’s design process). Sigfried Giedion (Space, Time and Architecture, 1941) states that “architecture is intimately linked to the life of an age in all its aspects (…). When an age tries to hide, its actual nature will be transparent through its architecture”. Thus, in my humble opinion, the last quarter of the 20th century was characterised by a strange mix of three tendencies: a magazine architecture far from understanding that energy sources are limited; the housing bubble (which could be the subject of more than one post), equally far; and a third movement that looks back to the origins of architecture, seeking to adapt to the climate while taking advantage of the latest technical developments. The first two (and many other factors; let’s not put the blame only on construction) have made the 1973 crisis reappear, or perhaps it never left, in what we know today as “energy poverty”, which now affects sectors of society that did not seem so vulnerable during the golden years of the bubble.
And, realistically, a necessarily low rate of new construction, together with a building stock that suffers the consequences of all the above, makes energy retrofitting one of our best “weapons” in the fight against climate change and, at the same time, one of the main opportunities for a construction sector so hard hit in recent years. The problem is the “agnosticism” that has grown up around energy savings, which are still not understood as an economic, social and environmental benefit. It is therefore our responsibility (read: the technicians of the construction sector) to quantify and valorise these benefits so that financial institutions, public bodies, companies in the sector and, especially, users demand energy efficiency in buildings not as an extra but as a must.
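As a toy illustration of what “quantifying” these benefits can mean in practice, a retrofit measure can be summarised with a handful of indicators such as annual savings, simple payback and net present value. The sketch below is purely illustrative; the figures and the insulation example are hypothetical and are not data from any CARTIF project.

```python
# Toy quantification of a retrofit measure (all figures are hypothetical examples).

def annual_saving_eur(kwh_before, kwh_after, price_eur_per_kwh):
    """Yearly energy cost saving produced by the retrofit."""
    return (kwh_before - kwh_after) * price_eur_per_kwh

def simple_payback_years(investment_eur, yearly_saving_eur):
    """Years needed for the savings to repay the initial investment."""
    return investment_eur / yearly_saving_eur

def net_present_value(investment_eur, yearly_saving_eur, years, discount_rate):
    """NPV of the savings stream: a positive value means the measure pays for itself."""
    discounted = sum(yearly_saving_eur / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return discounted - investment_eur

# Hypothetical example: facade insulation cutting heating demand from 120 to 80 MWh/year.
saving = annual_saving_eur(120_000, 80_000, 0.09)              # ~3,600 EUR/year
print(simple_payback_years(45_000, saving))                     # ~12.5 years
print(net_present_value(45_000, saving, years=20, discount_rate=0.03))  # positive NPV
```

Indicators like these are what allow an owner or a bank to compare a retrofit measure with any other investment, which is precisely the point being made above.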
At CARTIF we have been working for years in the energy-efficient retrofitting sector and, especially, on quantifying and valorising energy savings to turn them into both an economic and a social guarantee. Projects like OptEEmAL, which we have already discussed in this blog, capture the knowledge we have generated over these years, developing methodologies to evaluate these issues and offering tools that support this change of paradigm: from establishing collaborative working and risk-sharing approaches during design and execution, to supporting informed decision-making for all stakeholders involved through modelling and simulation tools.
All in all, we simply aim to recover the relevance of energy efficiency as a design mechanism in architecture, something that might lead Vitruvius to reformulate his principles as firmitas, utilitas, venustas et navitas efficientum.
With this post I would like to show a very clear example of how the intelligent use of a suitable machine vision system can solve a major problem in a production line at a reasonable price.
The body of a vehicle consists of a multitude of metal parts, each with its own requirements. The automotive industry manufactures these parts through a sheet-metal forming process called stamping: a metal sheet is placed on a die and clamped, and a punch then pushes the sheet into the die, generating the desired shape.
Depending on the temperature of the steel blanks, two types of stamping are defined: cold stamping and hot stamping. Here we will focus on hot stamping, which is applied mainly to elements with high structural requirements, such as reinforcements, pillars, etc.
Image captured by the vision system at the exit of the oven
In this process the steel blank is heated above the austenitization temperature, giving it high ductility, and is then rapidly cooled to achieve martensitic hardening of the sheet. The resulting parts reach high strength, complex shapes can be obtained and springback effects are reduced. This makes it possible, among other things, to improve the passive safety of our cars and reduce their weight.
In this manufacturing process, the steel blanks leave the furnace at high speed and at a temperature of around 900-950 ºC, stop abruptly at a fixed position and are then picked up by a robot and placed in the press as quickly as possible, to avoid cooling before the press stroke.
The problem arises from the difficulty of guaranteeing a fixed position with mechanical stops. This is due, among other things, to the speed of the line, the great variety of part references, the high temperature of the blanks (which cool very quickly at any contact point) and the internal characteristics of the furnace (which can be up to 30 m long).
An incorrect position means that the robot fails to pick up the blank or, worse, picks it up incorrectly and places it incorrectly in the press, producing a faulty press stroke, stopping the line and damaging the tooling.
In this case, machine vision is the best choice to tell the robot whether the actual position of the blank is correct. The most important task of the vision system is to correctly segment the blanks in the image in order to determine their position accurately.
Position detection application developed by CARTIF
A priori, given the intense infrared radiation emitted by the blanks due to their high temperature, the easiest option for this task would seem to be industrial infrared cameras. This solution presents two problems: the high cost of the equipment and the low resolution of infrared sensors.
The working area in which the blanks are positioned is very large, because of the size of the parts and because they are often processed in batches of up to four units simultaneously. Given the low resolution of these sensors, several cameras would be needed to achieve the required positioning precision.
At CARTIF we have been developing more economical solutions, using industrial cameras in the visible spectrum with enhanced sensitivity in the infrared range. The resolution of these cameras is much higher than that of infrared cameras, which increases the accuracy of the measurements.
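Because the incandescent blank is by far the brightest object in the scene, a simple intensity threshold followed by contour extraction is often enough to estimate its position and orientation. The sketch below (Python with OpenCV 4, purely illustrative and not the actual CARTIF application; the calibration value and file name are made-up assumptions) shows the idea:

```python
import cv2
import numpy as np

MM_PER_PIXEL = 0.8  # hypothetical calibration: working-area width / image width

def locate_blank(gray):
    """Return the centre (mm) and orientation (deg) of the brightest region."""
    # The glowing blank saturates the IR-sensitive sensor, so a high fixed
    # threshold separates it from the colder background.
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None  # no blank in the field of view
    blank = max(contours, key=cv2.contourArea)        # largest bright region
    (cx, cy), _, angle = cv2.minAreaRect(blank)       # fitted oriented rectangle
    return cx * MM_PER_PIXEL, cy * MM_PER_PIXEL, angle

# Usage: compare the estimated pose with the nominal pick-up pose and its tolerance.
frame = cv2.imread("blank_at_furnace_exit.png", cv2.IMREAD_GRAYSCALE)
print("Blank pose:", locate_blank(frame))
```

The real system must of course handle multiple blanks per batch, many part references and robust calibration, but the underlying segmentation step is essentially this.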
This has allowed companies such as Renault to obtain a robust and configurable system that avoids undesired line stops and extends the useful life of their tooling, leading to a considerable improvement in the production line.
In recent years, characterising the human microbiome has been proposed as an essential tool for medicine, pharmacy, nutrition and other disciplines seeking to understand the role that the microorganisms present in the body play in health and immunity. In fact, the microbiome affects ageing, digestion, the immune system, mood and cognitive functions.
But, what is the microbiome?
There are different definitions of the term. Generally speaking, we can say that the human microbiome is the set of microorganisms in each person (the microbiota) and the genes these cells harbour.
Microbiome research is a field of science driven primarily by advances in DNA/RNA sequencing and computational biology. Thus, the microbiome can also be defined as the genomic content of all the microorganisms recovered from a habitat or ecosystem (e.g. saliva, faeces or skin).
The study of the microbiome started in the 17th century with the development of the first microscopes and the beginnings of microbiology as a science. However, it is only in recent years that rapid sequencing methods, the falling costs associated with these techniques and new data management approaches have made it possible to characterise the microbiome and its constituents.
And why is it important?
Considering that we harbour between 10 and 100 trillion microorganisms (around ten times our own number of cells), that we may host more than ten thousand different species and that the types of microorganism vary greatly from person to person, it is reasonable to think that the microbiome plays a special role in our health. In fact, knowledge of these microorganisms, the functions of their genes and their metabolic and regulatory pathways is already allowing strategies to be developed to prevent disease and improve general health.
However, each person’s microbiome is not static: nutritional imbalances, lifestyle, use (and abuse) of antibiotics and low exposure to pathogens (or excess hygiene) continually modify it.
And what is its relationship with diet?
There is a clear relationship between what we eat and the balance of our native flora, which has a direct impact on our health status. Indeed, it is interesting that changes in diet are always accompanied by changes in the microbiota and the enrichment of the corresponding genes.
Balanced diets can promote a healthy, well-structured microbiota; conversely, alterations in its composition, or the reduction of some of the microorganisms that make up its diversity, increase the risk of lifestyle-related diseases such as allergies, diabetes, obesity and/or irritable bowel syndrome. In addition, when these situations are prolonged they have been linked to metabolic alterations.
Recent studies have shown notable differences between the microbiota of people who follow meat-rich diets and that of people with more ancestral lifestyles and diets based mainly on vegetables. Some studies suggest that a diet rich in protein and animal fat is associated with one kind of flora, while carbohydrate-rich diets are associated with the prevalence of another. These differences have been linked to the risk of developing non-communicable diseases such as atherosclerosis.
Malnutrition, whether over- or undernutrition, has a direct impact on the microbiota, favouring alterations that ultimately lead to increased inflammation and metabolic problems. Nutrient-poor diets, especially those deficient in certain amino acids, have been observed to strongly promote intestinal inflammation. Likewise, the pathogenesis of various diseases is associated with dietary components that promote disorders in the microbiota.
Therefore, the better balanced the diet, the more diverse the microbiota. Thus, intervention through personalized diets improves the response in individuals with low microbiome richness.
And then, can it be improved?
Of course it can! Food, nutritional balance and lifestyle have a direct influence on the composition and activity of our microbiota and, therefore, directly on our health. From this relationship arises the interest in developing new strategies to personalize our diet.
Low-cost alternative innovations: the barometer and how to think outside the box
I finished my previous post by commenting on how an ILM approach to disaggregating energy consumption in a factory can be a financially insurmountable challenge for factories with highly distributed energy consumption.
The market offers several alternatives for industrial measurement systems, designed by the main equipment manufacturers such as SIEMENS, ABB, SCHNEIDER, etc., capable of providing exhaustive monitoring (several measurements per second) of the energy consumption of the different elements in a production chain. However, the cost of the necessary hardware, of the required computing and communications installation, and of the software licences makes such systems quite expensive. The consequence is that they remain a luxury available only to large multinationals, which also tend to have several similar factories in different locations and, therefore, better purchasing power and easy, extensive internal replicability. In addition, their production processes are highly automated and computerised through the latest generation of MES (Manufacturing Execution System) software, so they already have the necessary IT and communications infrastructure; they only need the investment in hardware and an “upgrade” of their software licences.
For other small and medium-sized factories, these solutions can mean “using a sledgehammer to crack a nut”: the investment in monitoring will never pay for itself through the savings it produces. Yet these factories are increasingly interested in optimising their energy costs, provided they can do so with a reasonable investment, more in line with their turnover.
Every science student has heard the supposed anecdote of Niels Bohr and the barometer in one of its many versions. Although the anecdote is invented rather than real, its moral of trying to think differently when solving a problem is more relevant than ever; the difference is that we now call it “thinking outside the box”. The question now is not how to measure the height of a building with the help of a barometer, but how to measure and monitor the energy consumption of a factory without spending its entire investment budget for the year.
The answer, as in the barometer problem, is not unique: it depends on each particular factory. Fortunately, the IoT revolution is producing economies of scale in some of the necessary components. Continuing with the ‘Star Wars’ tribute, low-cost energy consumption monitoring systems can be compared to an X-wing starfighter formed by the following four wings:
The falling cost of electronics, which is enabling new low-cost non-invasive sensors such as Hall-effect current sensors, ultrasonic flow sensors or infrared temperature sensors.
Open-source hardware and software platforms for capturing and processing signals on low-cost devices such as Arduino, Raspberry Pi and others.
The emergence of new wireless communication protocols oriented towards M2M (Machine-to-Machine) communication, with low bandwidth and energy consumption and high resistance to interference, such as Zigbee, BT LE or Wi-Fi HaLow.
Software systems for storing and processing all the recorded data: database systems, tools that automatically calculate multiple indicator reports and dashboards showing the current values of the most important parameters, hosted either on physical servers on the factory intranet or on rented virtual servers in the cloud. (A minimal sketch of how these pieces fit together is shown after this list.)
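As a toy illustration of how these four “wings” can combine, the sketch below assumes a hypothetical read_current_amps() helper standing in for a Hall-effect clamp sampled through an ADC on an Arduino or Raspberry Pi, a fixed line voltage and power factor, and a local SQLite file playing the role of the storage layer. It is a sketch under those assumptions, not a reference implementation.

```python
import math
import sqlite3
import time

VOLTAGE = 400.0       # assumed three-phase line-to-line voltage (V)
POWER_FACTOR = 0.9    # assumed average power factor of the monitored machine

def read_current_amps():
    """Placeholder for the real acquisition routine (e.g. a Hall-effect clamp
    read through an ADC on an Arduino or Raspberry Pi)."""
    raise NotImplementedError

# Local, low-cost storage layer; a cloud database could be used instead.
db = sqlite3.connect("energy_log.db")
db.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, amps REAL, kw REAL)")

while True:
    amps = read_current_amps()
    kw = math.sqrt(3) * VOLTAGE * amps * POWER_FACTOR / 1000.0  # three-phase active power
    db.execute("INSERT INTO readings VALUES (?, ?, ?)", (time.time(), amps, kw))
    db.commit()
    time.sleep(60)  # one sample per minute is already enough to spot inefficiencies
```

From a table like this, indicator reports and simple dashboards can be generated with ordinary queries, which is all that many SMEs need to detect and correct energy inefficiencies.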
These new technologies are not yet mature and, obviously, industry can be very reluctant to use them. If there is something that scares a production or maintenance manager, it is an experimental system that has not been field-tested for years. However, we should remember that we are not talking about modifying the control systems of processes and machines, but about deploying a parallel system throughout the factory that monitors and records the energy consumption of the main elements and production systems. We are talking about detecting possible energy inefficiencies, correcting them and obtaining the corresponding economic savings. And we are talking about doing so with a reasonable investment cost, one that an SME can afford.
It is July 20th, 1969, 20:18:04 UTC and, after 102 hours, 45 minutes and 39.9 seconds of travel, “the Eagle has landed” and Neil is about to descend the ladder and touch an unknown surface for the first time: “That’s one small step for [a] man, one giant leap for mankind”. In 1969, Neil Armstrong, Michael Collins and “Buzz” Aldrin changed the world, riding the biggest rocket ever built to the Moon.
Some people may have forgotten it, and others, like me, were not yet born, but the space race had its own digital transformation, similar to the one now foreseen for industry and the general public. The Apollo program was the culmination of that first digital revolution in space exploration.
The landing was achieved, to a great extent, thanks to the electronics on board both the Apollo Command Module (CM) and the Lunar Module (LM): the AGC, or Apollo Guidance Computer. It was one of the first computers based on integrated circuits. Weighing “just” 32 kg and consuming a mere 55 W, this technical wonder coordinated and controlled many tasks of the space mission, from calculating the direction and navigation angles of the spacecraft to commanding the reaction control jets to orient it in the desired direction. Moreover, the computer included one of the first demonstrations of “fly-by-wire” operation, in which the pilot does not command the engines directly but through control algorithms programmed into the computer. In fact, this computer was the basis for the later control systems of the Space Shuttle and of military and commercial fly-by-wire aircraft.
As usual with this kind of breakthrough, it did not happen overnight but through a series of earlier incremental innovations.
In the 1950s, the MIT Instrumentation Laboratory (IL) designed and developed the guidance system of the Polaris ballistic missiles. Initially built with analog computers, it soon went digital to achieve the accuracy required for computing missile trajectories and control.
Before President Kennedy set the ambitious goal of “… going to the moon in this decade …”, seven years before the first lunar landing, and after the launch of Sputnik in 1957, a Mars exploration study had started at MIT’s IL. The design of a Mars probe set the basic configuration of the future Apollo guidance system: a set of gyroscopes to keep the probe oriented, a digital computer and an optical telescope to orient the craft relative to the Moon and the stars.
The launch of Sputnik in 1957 fuelled America’s ambition to put the first human in space, but it also fed the public debate about the role of pilots in the space race, a discussion similar to current views on the role of the worker in the factory. Should the astronaut be mere payload or take full control of the spacecraft? Even once pilots had earned their place at the controls, several tests showed that it was nearly impossible for them to control every aspect of a mission, given the fast reactions needed and the number of different control commands. Pilots would therefore need some automatic, reliable help, and that was one of the main functions of the AGC.
Reliability was therefore one of the main concerns of the mission. The Polaris program had taken four years to design guidance and control for a weapon that was in the air for a couple of minutes. Kennedy’s bet of taking a man to the Moon in less than seven years meant developing a guidance and control system for a spacecraft that had to work without failure on a trip lasting more than a week: an increase in required reliability of more than two orders of magnitude. If a Polaris missile failed, another one would take off; a failure in the spacecraft meant killing an astronaut.
Much of the reliability of the flight rested on the shoulders of the Apollo Guidance Computer, and at some point in the program there were too many planned tasks, such as complex guidance manoeuvres, to be physically hardwired into electronics. Achieving them required software. Although software was barely taken into account at the beginning of the program, it came to mean the difference between achieving the goal and the program’s complete failure. The computer was the interface between the astronaut and the spacecraft, which in the end meant that software “controlled” the spacecraft, a revolution for that time. Today software is everywhere, but back in the 1960s software was seen as a set of instructions on punched cards. AGC programs (frozen three to four months before each launch) were “hard-wired” as magnetic cores and wires in a permanent (and reliable) memory, which saved a lot of time, effort and budget. In fact, in today’s vocabulary, the Apollo software was more like firmware.
Today’s challenge of revolutionize industry through digital transformation can’t happen without the help of digital enablers. 48 years ago, digital electronics and first software programs were the “digital enablers” to achieve that “one small step for [a] man, one giant leap for mankind“. Today’s “Digital transformation is not an option” sounds like a cliché, a hype, a slogan from digital providers, but looking back in the history, the digital transformation in the Apollo program meant the difference of not achieving moon landing.