Spatial Augmented Reality in Industry

Augmented Reality has recently become more and more common thanks to the hand-held devices we use in daily life, such as smartphones, tablets and, lately, smart glasses. Applications, in many cases for leisure, like “Pokemon GO” or the “Snapchat” image editor, have made this technology popular. But it is also used professionally in a multitude of application areas.

However, AR is neither a new technology nor is it tied to the use of smartphones or smart glasses. Spatial Augmented Reality (SAR) augments real-world objects and scenes without special displays such as monitors or hand-held devices. The key difference in SAR is that it uses fixed digital projectors to display graphical information directly onto the surfaces of physical objects, so the display is separated from the user of the system.

Perhaps the most popular application of SAR is what is also referred to as “projection mapping” or “video mapping”: a video projection that turns complex landscapes, such as buildings, into a display surface. This projection is commonly combined with audio to create an attractive audio-visual show. CARTIF has been involved in several projects that apply projection mapping in the cultural heritage field, through the virtual recovery of the original appearance of the paintings of a significant building.

Spatial Augmented Reality in industry

Because the displays are not associated with each user, SAR scales naturally to groups of users, allowing collocated collaboration between them. Furthermore, users avoid the eye strain caused by smart glasses and are not burdened with extra hand-held devices. For these reasons, aside from games and leisure applications, SAR has many potential applications in industry.

In the automotive industry it is frequently used during the design stage, projecting different finish options onto the car surface, or showing employees how to perform the tasks of a specific repair. However, one of the most widespread implementations in this field is assistance in manual assembly tasks.

One or more fixed optical devices (projectors) provide immediate step-by-step guidance, projecting indications (text, images, animations) onto the work surface and, in some cases, directly onto the parts on which a user is working. Spatial Augmented Reality can offer the following benefits:

•    Reduces or eliminates the need for computer monitors and screens, as the instructions appear directly in the task space.
•    Reduces users’ cognitive load when following work instructions, especially when training new workers.
•    Reduces the need to interrupt workflows to consult information elsewhere, because there is no need for “attention switching” between work instructions and the task at hand.

In addition to the benefits mentioned earlier:
•    Workers avoid the eye strain caused by smart glasses and are not burdened with extra hand-held devices.
•    A single SAR system serves groups of users and supports collaboration between them.

This technology, combined with a validation system such as tool localization or hand tracking through computer vision to confirm the correct execution of tasks, provides feedback for process improvement and traceability, and reduces errors. CARTIF is involved in several projects that exploit the benefits of Spatial Augmented Reality while mitigating as much as possible its most delicate aspects, such as ambient brightness, adapting the projection to the colour and shape of the pieces, or occlusions produced by the workers themselves.

Geolocation systems are reaching indoors

With global positioning systems, a phenomenon similar to that of mobile phones has occurred: in a few years we have gone from their non-existence to considering them essential. In fact, geolocation is one of those technologies that has enabled the development of many applications, and in many areas work is inconceivable without the use of what is commonly called GPS.

These positioning systems are based on receiving the signal from three or more satellites and using trilateration: the position is obtained in absolute coordinates (usually WGS84) by determining the distance to each satellite.
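The geometry behind trilateration can be sketched in a few lines. The following is a simplified 2D illustration (real GNSS receivers solve in 3D and also estimate the clock offset); all coordinates and distances are made-up example values, not real satellite data:

```python
import math

def trilaterate_2d(p1, d1, p2, d2, p3, d3):
    """Recover (x, y) from three known anchor positions and the
    measured distance to each, by linearizing the circle equations."""
    x1, y1 = p1
    x2, y2 = p2
    x3, y3 = p3
    # Subtracting the circle equations pairwise removes the quadratic
    # terms and leaves a 2x2 linear system A [x, y]^T = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x2), 2 * (y3 - y2)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

# Receiver actually at (3, 4); distances measured to three fixed anchors.
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = (3.0, 4.0)
dists = [math.dist(a, true_pos) for a in anchors]
print(trilaterate_2d(anchors[0], dists[0],
                     anchors[1], dists[1],
                     anchors[2], dists[2]))  # ≈ (3.0, 4.0)
```

With noise-free distances the solution is exact; in practice, measurement errors are what limit the few-metre accuracy mentioned below.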

Satellite-based global positioning systems have their origin in the US TRANSIT system of the 1960s. With this system a position fix could be obtained once an hour (at best) with an accuracy of about 400 metres. It was followed by the Timation system, and in 1973 the Navstar project began (both from the USA). The first satellite of this project was launched in February 1978, and full operational capability was declared in April 1995. This Navstar-GPS system is the origin of the generic name GPS that we usually apply to all global navigation systems. In 1982 the former Soviet Union launched the first satellite of a similar system, called GLONASS, which became operational in 1996. Meanwhile, in 2000 the People’s Republic of China launched the first satellite of the BeiDou navigation system, which is scheduled to be fully operational in 2020. Finally, in 2003 the development of the European Union’s positioning system, Galileo, began, with a first launch in 2011. Currently there are 12 active satellites (and 2 in testing), and the simultaneous launch of four more is scheduled for 17 November 2016. In this way, 18 satellites will be in orbit and the initial service of the Galileo positioning system could begin in late 2016; it is expected to be fully operational in 2020. It must be said that there are also other systems, complementary to those already mentioned, with local coverage in India and Japan.

As you can see, global positioning systems are fully established and widely used at both military and commercial level (transport of people and goods, precision agriculture, surveying, environmental studies, rescue operations, …) and at a personal level (almost everyone has a mobile phone with GPS available, although its battery always runs out at the worst moment).

Regarding the precision of current geolocation equipment, it is on the order of a few metres (even better with the Galileo system) and can reach centimetre accuracy using multi-frequency devices and applying differential corrections.

One of the problems of these systems is that they do not work properly indoors, since the satellite signal cannot be received well inside buildings (although there is highly sensitive equipment that reduces this problem, and other devices called pseudolites that simulate the GPS signal indoors). And of course it is no longer enough to know our exact position outdoors; there is now a need to be located inside large buildings and infrastructures as well (airports, office buildings, shopping centres, …).

So indoor positioning systems (IPS) have appeared, allowing location inside enclosed spaces. Unlike global positioning systems, in this case there are many different technologies that are usually not compatible with each other, making dissemination and adoption by the general public difficult. There are already very reliable and accurate solutions in enterprise environments, but these developments are specific and not easily transferable to the generic use of locating people indoors. In this type of professional context, CARTIF has carried out several indoor positioning projects for the autonomous movement of goods and for service robotics. There is no standard indoor positioning system, but there are many technologies competing for a prominent place.

The technologies used can be differentiated by whether or not they need a communications infrastructure. Those that need no existing infrastructure are often based on sensors commonly available in a smartphone: variations in the magnetic field inside the building detected by the magnetometer, movements measured by accelerometers, or certain feature elements (such as QR codes) identified with the camera. In all these cases the accuracy achieved is not very high, but it may be useful in certain applications, such as simple guidance in a large building.

Indoor positioning systems that use a communications infrastructure exploit almost all available technologies of this kind for location: WiFi, Bluetooth, RFID, infrared, NFC, ZigBee, Ultra Wideband, visible light, phone masts (2G/3G/4G), ultrasound, …

With these systems, the position is usually determined by triangulation, calculating the distance to fixed reference devices (using the intensity of the received signal, coded signals, or direct measurement of that distance). In this way greater precision can be reached than in the three cases above. There are also new developments that combine several of these technologies in order to improve the accuracy and availability of positioning.
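Estimating distance from received signal strength is usually done with a log-distance path-loss model. A minimal sketch follows; the reference power at 1 m (`tx_power_dbm`) and the path-loss exponent `n` are illustrative values that in a real deployment must be calibrated per beacon and per environment:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Estimate distance in metres from a measured RSSI value using the
    log-distance path-loss model:
        RSSI = tx_power - 10 * n * log10(d)
    solved for d. tx_power_dbm is the RSSI expected at 1 metre."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

print(rssi_to_distance(-59.0))  # 1.0 m (by definition of tx_power_dbm)
print(rssi_to_distance(-79.0))  # 10.0 m with n = 2 (free-space-like)
```

Once distances to three or more fixed beacons are estimated this way, the position can be computed by the same trilateration idea used outdoors, though RSSI noise makes the result much coarser.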

Although, as has been said, there is no standard, systems based on Bluetooth Low Energy (BLE nodes) are spreading. Examples of such systems are Eddystone (from Google) and iBeacons (from Apple).

Logically, as with outdoor positioning, a map of the environment is required to allow navigation. There are other systems, based on SLAM (Simultaneous Localization and Mapping), which generate maps of the environment (which may or may not be known in advance) as they move; these are widely used in robots and autonomous vehicles. A recent example is the Tango project (from Google, once again), which generates 3D models of the environment using only mobile devices (smartphones or tablets).

As we have seen, we are closer to being located anywhere, which can be very useful but can also make us overly dependent on these systems, while the usual privacy issues concerning positioning are amplified. So although, thanks to these advances, a sense of orientation is less necessary, we must always keep our common sense.

Your credit is about to expire: the Earth Overshoot Day

August 8, 2016, 07:00 a.m. The radio is on as I drive to work. The headlines begin. “Today is Earth Overshoot Day,” I hear. Oh. Bad news. Earth Overshoot Day 2016 has been brought forward again.

I’m sure you are wondering about a couple of things right now:
1) Whether I am able to understand the radio at that time in the morning,
2) Earth Overshoot Day? What does it mean?

The answer to the first question is yes. I can understand the radio if it is playing the summer hit or if the talk is about an environmental issue; in both cases my attention is activated immediately. The answer to the second question is broader and deeper, and I need to expand on the problem. Let’s see.

The Earth Overshoot concept was originally developed by the New Economics Foundation (NEF), and Earth Overshoot Day is defined as the mark that indicates when humanity has used up all the Earth’s resources for the calendar year. Although it is only an estimate, this day is considered the best scientific approach to measuring the gap between the natural resources generated and destroyed annually. Once it has passed, the Earth is operating in overshoot, and everything consumed until the end of the year is supported by resources the planet cannot produce and contaminants the Earth is not able to absorb (www.footprintnetwork.org).

The simplest example for understanding the concept is to think of the Earth’s resources as money in a bank. Overshoot occurs when we withdraw money from the bank faster than the interest this money generates.

Just as a bank statement tracks income against expenditures, the Global Footprint Network is the organization that analyzes thousands of data points every year and measures humanity’s demand against the supply of natural resources and ecological services. That is, it compares the Earth’s income (increased, for example, by greater use of renewable energy) against its expenses (produced, among others, by the massive use of private cars overconsuming fuel). The result of the equation is the date when humanity exhausts nature’s budget for that year; for the remaining months, it gets by drawing down local resource stocks and accumulating CO2 in the atmosphere, making climate change worse.
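The underlying arithmetic is simple: the day falls when the fraction of the year elapsed equals the ratio of the planet’s biocapacity to humanity’s ecological footprint. A minimal sketch follows; the biocapacity and footprint figures below are illustrative, chosen only to reproduce a mid-August date, not official Global Footprint Network data:

```python
from datetime import date, timedelta

def overshoot_day(biocapacity_gha, footprint_gha, year):
    """Date when the year's 'nature budget' runs out: the fraction
    biocapacity/footprint of the 365-day year, counted from Jan 1."""
    days_of_budget = int(365 * biocapacity_gha / footprint_gha)
    return date(year, 1, 1) + timedelta(days=days_of_budget)

# If humanity's footprint is roughly 1.6 times biocapacity, the budget
# runs out a bit over 60% of the way through the year:
print(overshoot_day(12.0, 19.4, 2016))  # 2016-08-13
```

The smaller the biocapacity/footprint ratio becomes, the earlier in the year the date lands, which is exactly the trend described in the next paragraph.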

Therefore, it is definitely not a holiday. Earth Overshoot Day has moved from the end of October in 1993 to August 13th last year, which means the deadline was shortened by almost a week in 2016. Each year the problem grows worse: if consumption patterns continue apace, it is not hard to imagine that the day will come when we will have spent all the “credit” that exists in this Earth account. If we continue to destroy its natural capital and its ability to renew its environmental services, it will be very difficult to avoid.

In CARTIF, our year already started with environmental purposes, and encouraging companies to “fund” the Earth seems vital to us, because we are currently in overshoot. One of the most interesting calls for doing this is the Spanish CLIMA projects call. The Spanish Ministry of Agriculture, Food and Environment (MAGRAMA) launches this call every year and encourages companies to reduce their greenhouse gas emissions by developing new low-carbon activities. It is one of the best funding schemes for companies that need one last push to transform their activities towards low-carbon technologies, since MAGRAMA “buys” the CO2-equivalent emissions avoided (an expense to the Earth eluded), raising the fixed price per tonne each year.

Thus, if you choose to carry out a more environmentally friendly process, it will be attractive from an economic point of view and, besides, you will contribute to the Earth’s “money box”. Activities like replacing your fossil fuel boilers with biomass-fueled ones, transforming your fleet to electric vehicles or using the residual heat of your process could make an important difference for the future.

Do you dare to bring out the environmental banker in you?

Fighting with triple A (AAA), the silent enemy

The Abdominal Aortic Aneurysm (AAA) has been recognized as a major health problem in the last decade. The statistics associated with this condition are of great concern and, as recorded in most studies in the scientific literature, its impact is expected to increase in the coming years, mainly due to the increase in the life expectancy of the population. The rupture of abdominal aortic aneurysms represents a major clinical event because of its high mortality rate.

According to Dr. Felix Nieto’s comments in his previous post, the indicators currently used to determine the treatment of patients with aneurysms, the maximum transverse diameter and the growth rate, can be considered insufficient: they do not have a physically grounded theoretical basis. Because of this limitation, research in recent years has basically aimed at improving the understanding of the phenomena associated with the emergence and evolution of this disease, in order to determine whether other variables could be predictive of rupture.

One of the major constraints on obtaining accurate results when modeling vascular diseases is the use of a realistic computational domain. This is closer to being possible thanks to technological advances in equipment for computed axial tomography (CT) and magnetic resonance imaging (MRI), and to the development of CAD techniques, which have advanced significantly in the detailed extraction, in vivo, of anatomical structures.

The CARTIF team is working on the automated conversion of sets of 2D images obtained by CT into a realistic 3D model that constitutes the geometric domain of integration of the AAA simulation by finite element techniques.

The activity related to AAA medical imaging has been key to one of the issues recently addressed at CARTIF: the study of the influence of geometric parameters on the rupture rate of AAA, with the work particularly focused on the iliac angle.

In the first phase, fluid dynamics and structural simulations were carried out to calculate the Rupture Potential Index (RPI) of several cases of patients affected by AAA.

The results show that the values of the iliac angle (α) are related to other geometric parameters, such as the eccentricity of the AAA, which together can characterize the RPI.

The next step would be to confirm this trend over a larger database of patients with AAA, for which good cooperation with the HCUV (University Clinical Hospital of Valladolid) remains essential, as it is now.

Given the simplicity with which the specialist can obtain these parameters from the CT scan, the results of this research could be a very effective tool for the surgeon when deciding whether or not to submit the patient to a surgical repair procedure.

Vegetal covers or mulches, right or wrong?

More than once, especially in the villages, we have heard a farmer say something like: “Wheat is growing a lot due to the rains this year. What a pity! Many weeds will grow and I will have to spread herbicides.” This comment could open a long discussion.

It is known that weeds are a problem, but there are ways to keep them from encroaching on crops without resorting to herbicides. We must not let them grow until they invade crops, but neither should we eradicate them, because they also have beneficial effects: they help to control pests and favor the presence of pollinators in the field.

So, what happens? Should we leave them or not? Well, there are other solutions, such as leaving margins between crops and introducing aromatic plants or fruit trees, which prevent the emergence of weeds in these margins, just as cereal fields were interspersed with orchards and even fruit trees years ago. This is what is called creating vegetal covers or mulches.

What are vegetal covers or mulches?

The implementation of mulches consists of sowing any cultivable species between the streets or lines of the crop, or letting the natural vegetation grow spontaneously. Mulches are used as a soil management strategy because they reduce the risk of erosion and increase the biodiversity of the natural enemies of the usual crop pests.

Mulches compete with weeds for space, light and nutrients and therefore help to reduce the costs of weed control, which is an advantage for the farmer. Some cover crops produce allelopathic substances (biochemical compounds that influence the growth, survival or reproduction of other organisms) which inhibit the growth of certain weeds. In general, mulches, like hedges, are a reservoir where beneficial organisms can live and from which they can pass to the crop in search of prey (pests).

Are mulches used a lot?

Mulches are widely used in organic farming. Organic farming, regulated by Regulation (EC) No 834/2007 of 28 June 2007, forbids the use of chemical pesticides and fertilizers of synthetic origin; therefore, in organic production, different management strategies are developed to comply with the regulations. One of these techniques is to increase the diversity in and around the crop with different plants that stimulate the diversity of beneficial organisms. The most important diversification technique is the use of mulches between crop lines.

The mulches issue is highly debated by both sides: the staunchest supporters and those who would never put it into practice. It is clear that by introducing a new crop between the lines of another major crop, for example a vineyard, there will be competition with the vines, which can mean a production loss; but it can also be very interesting for obtaining a higher quality product, thanks to improvements in the soil, or because a different microbial flora may appear that influences the sensory characteristics of the wine produced with those grapes.

Sometimes several decades pass and we evolve on many issues, but in other cases we return to doing things as in the past, as with introducing mulches between crop lines, although new techniques resulting from R&D can always be added. In CARTIF, we work in the fields of both viticulture and enology, where we have carried out experiences of sowing mulches of aromatic plants between vineyard lines, as well as in the development of low environmental impact and organic farming techniques.

Predictive maintenance: revolution against the evolution

In previous posts, predictive maintenance was mentioned as one of the main digital enablers of Industry 4.0. Maintenance is linked to the industrial revolution; however, it has accompanied us throughout our evolution as human beings.

Since prehistory, our ancestors have built tools that suffered wear and sometimes broke without warning. The solution was simple: carve a new tool. As more elaborate mechanisms were created (e.g. the wooden wheel), the natural alternative to disposal became repair by the craftsman. The mechanical looms of the First Industrial Revolution were even more complicated to repair, so specific professions emerged as precursors of today’s maintenance workers. Throughout this evolution, the wear and unannounced breakdown of mechanical parts remained part of everyday life in factories.

Why has this gear broken? It worked perfectly yesterday. The human brain can handle concepts such as the linearity of events (seasons, day and night, …) or events that happen at more or less regular intervals. However, these unforeseen failures drove operators crazy. How can we ensure that the gear does not break again? The answer was biologically predictable: “…let’s stop the machine every 2 days (for example) and check the gear wear…”

This tradition has resulted in the everyday maintenance routine applied in industry and in consumer products such as our cars. Our authorized dealer obliges us to make periodic reviews (e.g. every 10,000 km) to check critical elements (brakes, timing belt, …) and change the parts more prone to wear (tires, filters, …). This is called preventive maintenance, and it is applied in factories and other facilities (e.g. wind turbines) to avoid unexpected breakdowns. However, since these faults cannot be eliminated (precisely, they are unforeseen), the only possible reaction is to repair them. This is called corrective maintenance, and everyone hates it.

How can we stop this flood of unexpected breakdowns, repair costs and unnecessary revisions? One of the disciplines with the longest experience since CARTIF‘s creation is predictive maintenance, which seeks to mitigate unexpected breakdowns (it would be unrealistic to assume we will eliminate them) and reduce periodic machine reviews. Again, predictive maintenance can be explained as an obvious biological response to the problem of unexpected breakdowns. It is based on the periodic review, through characteristic signals from the machine’s environment, of symptoms that may anticipate a malfunction. The advantage of this maintenance is that it does not require stopping the machine, as preventive maintenance does. For example, an electric motor has a normal power consumption when operating correctly, but this consumption may increase if some component suffers excessive wear. Thus, properly monitoring the consumption can help detect incipient faults.
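The consumption-monitoring idea can be sketched very simply: compare each new reading against a rolling baseline of recent normal behaviour and raise an alarm when it deviates too much. The window size, threshold and power values below are illustrative assumptions; real condition monitoring tunes them per machine:

```python
from statistics import mean

def detect_anomaly(readings, window=5, threshold=1.15):
    """Flag indices where a reading exceeds the mean of the previous
    `window` readings by more than the relative `threshold`."""
    alarms = []
    for i in range(window, len(readings)):
        baseline = mean(readings[i - window:i])
        if readings[i] > baseline * threshold:
            alarms.append(i)
    return alarms

# Normal consumption around 10 kW; wear makes it creep up at the end.
power_kw = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 10.0, 12.5, 12.8]
print(detect_anomaly(power_kw))  # → [7, 8]
```

A fixed relative threshold is the crudest possible criterion; the next paragraph explains why choosing that threshold is really a cost/benefit decision.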

Continuing with the electric motor example, what should the minimum variation in consumption be to decide that we must stop the motor and repair it? Like many decisions in life, you need to apply a cost/benefit criterion, comparing how much we can lose if we do not repair the motor versus how much the repair will cost. How can we reduce the uncertainty in this decision? The answer is a reliable prediction of the fault’s evolution.
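The cost/benefit criterion can be written as a one-line expected-value comparison. The probabilities and costs below are made-up illustrative figures, not data from any real plant:

```python
def should_repair(p_failure, failure_cost, repair_cost):
    """Repair now if the expected cost of an unexpected failure
    (probability * cost) outweighs the cost of a planned repair."""
    return p_failure * failure_cost > repair_cost

# Say there is a 30% chance the motor fails before the next planned stop;
# an unexpected failure costs 50k (downtime + collateral damage) while a
# planned repair costs 10k. Expected loss 15k > 10k, so repair now.
print(should_repair(0.30, 50_000, 10_000))  # True
```

The whole value of the fault-evolution prediction discussed next is in narrowing down `p_failure`: the more reliable the prediction, the less this simple comparison is dominated by uncertainty.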

This prediction will be influenced by many factors, some of them unknown (as we said, there is a random component). However, the two main factors to consider are (1) the type of evolution of the damage (e.g. damage in a fragile part will evolve very differently from damage in a more or less tough or elastic one) and (2) the workload the machine will suffer (compare a fan working 24/7 with an elevator motor that starts and stops every time a neighbor presses the button on a floor). A reliable prediction, together with the forecast of factory workload, allows the maintenance manager to choose the most beneficial option, which in many cases is planning the maintenance work so it does not affect the production schedule.

Another beneficial effect of predictive maintenance is that a proper analysis of the measured signals provides evidence of which element is failing. This is called fault diagnosis, and it helps reduce the uncertainty about the most appropriate maintenance action. An example is vibration measurement, which helps distinguish whether an electric motor with excess vibration has an incipient short circuit or a damaged bearing. But that is the subject of another post.