It is undeniable that the coming decades will be crucial for both society and the Earth's environmental health: they will determine whether our planet is able to support the entire world population. Nowadays, the situation is more than complicated, and it is getting worse day by day.
Taking this situation into account, the creation of new policies focused on the reduction of greenhouse gas emissions is more than needed, fixing a set of clear objectives from now to 2050. In this sense, the main objective of the Estrategia de Descarbonización a Largo Plazo (Long-Term Decarbonization Strategy, ELP 2050) created by the Spanish Government calls for a 90% reduction in greenhouse gas emissions by 2050 with respect to 1990, with the remaining 10% to be absorbed by carbon sinks.
Sustainable mobility plays a very important role within the objectives defined in the aforementioned ELP 2050, so it will be essential to work together to change the way we move (especially travelling to and from work). Encouraging the use of electric vehicles and alternative means of transport will be key to achieving much more sustainable mobility, and it will also be necessary to give citizens (e.g. employees) proper information and reasons to do so.
The number of transit journeys on working days surpassed 123 million in 2007, according to the Movilia Mobility Survey of People Resident in Spain. Approximately 83% of the Spanish population carries out at least one journey each working day, and more than 16% of these journeys were to the workplace. According to Movilia (note that the survey was carried out before the latest economic crisis and COVID-19, so it does not reflect their effects), the number of in itinere transit journeys in 2006-07 was around 37 million out of a total of 123 million (roughly a third), and around 63% of these in itinere journeys were made by private vehicle, as indicated in the E-Cosmos project.
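As a quick sanity check, the shares implied by these Movilia and E-Cosmos figures can be reproduced with a few lines of arithmetic (the variable names are ours, and the figures are the rounded ones quoted above):

```python
# Rough arithmetic on the Movilia / E-Cosmos figures quoted above.
total_journeys = 123_000_000      # daily transit journeys on working days (2007)
in_itinere = 37_000_000           # journeys to/from the workplace (2006-07)
private_vehicle_share = 0.63      # share of in itinere journeys by private car

in_itinere_share = in_itinere / total_journeys
private_vehicle_journeys = in_itinere * private_vehicle_share

print(f"in itinere share: {in_itinere_share:.0%}")                     # ~30%, about a third
print(f"by private vehicle: {private_vehicle_journeys / 1e6:.1f} million/day")
```

That is roughly 23 million daily commuting journeys by private car, which is the figure that sustainable mobility plans aim to reduce.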
As detailed above, labor mobility in Spain has a very important influence on collective mobility, according to data from the Observatory of Transport and Logistics in Spain, and it has a big environmental, social and economic impact, especially when those journeys are made by private vehicle.
Additionally, using a private vehicle to commute is a significant health hazard. In Spain, traffic accidents have become the primary cause of death among accidents at work (around 11.6% of accidents at work were related to in itinere traffic accidents, according to Spain's Ministry of Labor, Migrations and Social Security). The sleep lost trying to avoid traffic jams, the stress of driving at peak hours, and the constant worry about being late all considerably increase the risk of a traffic accident.
To solve these issues, close collaboration between companies, public entities and mobility providers (among others) is sorely needed. Establishing frameworks of collaboration between these entities will make it possible to create real and effective sustainable mobility plans that take employees' needs into account. These plans will lead to real and fruitful interventions focused on reducing the number of in itinere transit journeys made by private car.
Given the pressing need to encourage sustainable mobility, at CARTIF we are collaborating with multiple entities with the main aim of developing real sustainable mobility plans. In this sense, we are working with several enterprises (and with all the stakeholders involved) to make the in itinere transit journeys of their employees more sustainable.
It is everyone's responsibility to take the leap and actively contribute to the decarbonization of the planet, so... let's all fight together to stop damaging our planet, so that future generations can develop in the same (or better) conditions as we have.
CARTIF has the know-how to accompany institutions on their path towards contributing to the decarbonization of our planet, not only through sustainable mobility plans, but also through many other actions that can be carried out in this regard. It's now or never.
Climate change is an increasingly visible reality on our planet, affecting millions of people around the world. These changes in climate are clearly recognizable by the increase in temperatures, the decrease in water resources, the rise in sea level and increasingly irregular and torrential precipitation events. The consequences, effects and impacts of these changes are becoming more frequent and relevant every day, causing damage of great magnitude and displacing populations by making the areas in which they lived uninhabitable, with examples such as extreme droughts, floods or desertification. In our day-to-day lives, we can see how these changes in the climate manifest themselves. A clear example is the winter that has just started, with milder than normal average temperatures and unusually high maximum temperatures for this time of year.
In this context of climate change, the thermometer continues to break records, and it is estimated that in Spain average temperatures are increasing by around 0.3ºC per decade, which gives us an idea of the high rate of warming to which our country, and the planet in general, is being subjected. In addition, it must be taken into account that even if we manage to reduce the emissions that generate climate change, thereby trying to avoid the consequences it produces, the trends reflected in the climate variables will continue in the coming decades due to the inertia of the climate system. Faced with such a negative outlook, it is necessary to ask ourselves the following question: how can we contribute to mitigating and reducing the impacts of climate change, or adapt to them by generating more resilient territories?
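To get a feeling for what that ~0.3ºC per decade means, a naive linear extrapolation is enough (illustrative only: real projections come from the climate models and scenarios discussed below, not from straight lines):

```python
# Naive linear extrapolation of the ~0.3 °C/decade warming rate quoted above.
# Illustrative only: actual projections require climate models, not straight lines.
RATE_PER_DECADE = 0.3  # °C per decade, estimated average for Spain

def extra_warming(years: float, rate: float = RATE_PER_DECADE) -> float:
    """Additional warming after `years` at a constant per-decade rate."""
    return rate * years / 10

# At this constant rate, roughly +0.9 °C would accumulate over the next 30 years.
print(f"+{extra_warming(30):.1f} °C after 30 years")
```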
To help us in this fight, mitigation and adaptation strategies are hugely relevant. Mitigation strategies seek to reduce greenhouse gas emissions into the atmosphere, which are ultimately the fuel of anthropogenic climate change. For their part, adaptation strategies seek to limit the risks derived from climate change by reducing our vulnerabilities. Both strategies are complementary: if we do not take mitigation into account, the capacity for adaptation can easily be overwhelmed, and developing adaptation that is not low in emissions is meaningless.
But what can we do as citizens? We can contribute with small measures such as recycling, using public transport or bikes, buying from local commerce that minimizes transport, and choosing ecological and sustainable products, all of which help to reduce greenhouse gas emissions. However, adaptation requires large-scale responses that generally must be promoted by the public administrations or organizations in charge of land management. Therefore, we must not overlook that the fight against climate change must be an effort of all (citizens, administrations, companies, etc.), integrating as many agents as possible and taking a multisectorial and systemic approach that does not lose sight of the social perspective of the problem.
Under this climate change perspective, and to promote adaptation, the European Union has launched the Climate Change Adaptation Mission, which aims to promote and support the transition towards resilience in Europe at the level of individuals, cities and regions, in both the private and public sectors (economy, energy, society, etc.). Its main objective is to support at least 150 European regions and communities on the way to climate resilience by 2030. To this end, the mission will help regions and communities to better understand, prepare for and manage their climate risks and seek opportunities, as well as facilitate the implementation of innovative and resilient solutions by providing information on the different additional sources of investment.
In a complementary way, and to respond to the adaptation needs generated by changes in the climate, it is necessary to provide entities with a common framework that guarantees homogeneous criteria in the conception of climate change. In this sense, public action against climate change in Spain is coordinated and organized through the National Plan for Adaptation to Climate Change (PNACC), which establishes the national framework of reference and coordination for initiatives and activities on impact assessment, vulnerability and adaptation. Its main objective is to avoid or reduce present and future damages derived from climate change and to build a more resilient economy and society.
This plan covers the needs at the national level, establishing the starting point for the development of more detailed strategies at the regional or municipal level and helping territories achieve their objectives through the implementation of priority lines of action against the impacts caused by climate change. As a starting point for any adaptation strategy, it is necessary to know in detail what the current and future climate variables (temperature, precipitation, wind, etc.) will be like, in order to assess the vulnerability of our territory and promote measures that make it more resilient to climate impacts. To that end, the AdapteCCa climate scenario viewer, developed by the Spanish Ministry of Agriculture, Fisheries and Food (MAPAMA) in coordination with the Spanish Office for Climate Change (OECC) and the Spanish Meteorological Agency (AEMET), together with the IPCC Interactive Atlas, provide us with relevant data to understand the future climate through different climate projections. All the information they collect allows us to gauge the magnitude of changes in the future climate and to establish the baseline for the evaluation of vulnerability and risk, as well as for the definition of measures for each priority sector identified in each territory. Finally, the implementation of the identified and selected measures must be associated with a monitoring and follow-up system that makes it possible to evaluate the achievement of the proposed adaptation objectives.
At CARTIF, we work to help the different public administrations in the development of climate change adaptation plans and strategies. We must highlight the projects in which we have recently worked together with GEOCYL Consultoría S.L. to develop climate change adaptation strategies for the municipality of Valladolid (EACC_Val project) and the region of Extremadura (EACC_Extremadura project), respectively.
In addition, the RethinkAction project, coordinated by CARTIF, will allow us to better understand the effects generated by adaptation and mitigation measures through the development of integrated assessment models that allow measures to be evaluated before implementation in relevant climatic regions of Europe.
Artificial Intelligence, Machine Learning, Deep Learning, Smart Devices... terms that we are constantly bombarded with in the media, making us believe that these technologies are capable of doing anything and solving any problem we face. Nothing could be further from the truth!
According to the European Commission, “Artificial intelligence (AI) systems are software (and possibly also hardware) systems designed by humans that, given a complex goal, act in the physical or digital dimension by perceiving their environment through data acquisition, interpreting the collected structured or unstructured data, reasoning on the knowledge, or processing the information, derived from this data and deciding the best action(s) to take to achieve the given goal.”1.
AI encompasses multiple approaches and techniques, among them machine learning, machine reasoning and robotics. Here we will focus our reflection on machine learning from data, and more specifically on Intelligent Data Analysis aimed at extracting information and knowledge to support decision-making. The data (historical or streaming) that companies store over time and often fail to put to use; data that reflect the reality of a specific activity and that will allow us to create statistical and mathematical models (in the form of rules and/or algorithms) that capture that reality. So, how do we “cook” the data to obtain relevant information? Who are the main actors involved? First, the data, which will be our “ingredients”; second, the algorithms capable of processing these data, which will be our “recipes”; third, computer scientists and mathematicians, who will be the “chefs” capable of correctly mixing data and algorithms; and fourth, the domain experts, who will be our private “tasters”, whose task is to validate the results obtained.
First, the data: the data from which we want to extract information in order to generate models or make predictions. Through a continuous learning process of trial and error, based on analysing how things were in the past, what trends there were, what patterns were repeated, etc., we can build models and make predictions that will only be as “good” as the data are. It is not a question of quantity, but of quality. What does that mean exactly? It means that if we teach an AI system to multiply (by giving it examples of correct multiplications), the system will know how to do that task (multiply) but it will never know how to subtract or divide. And if we give it wrong examples (3*2=9 instead of 3*2=6), the system will learn to multiply, but in the wrong way. Therefore, as the fundamental ingredient of our recipe, data must be well organized, relevant and of high quality.
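The multiplication example above can be made concrete with a toy “model” that learns the single coefficient a in y = a·x1·x2 from examples by least squares (the function and data are ours, purely for illustration):

```python
# Toy illustration of "garbage in, garbage out": a model y = a * x1 * x2
# learns the multiplier `a` from examples. With correct examples it recovers
# a = 1 (true multiplication); with a wrong example it learns a biased model.

def fit_multiplier(examples):
    """Least-squares fit of `a` in y = a * x1 * x2 over (x1, x2, y) triples."""
    num = sum(x1 * x2 * y for x1, x2, y in examples)
    den = sum((x1 * x2) ** 2 for x1, x2, _ in examples)
    return num / den

good = [(3, 2, 6), (4, 5, 20), (7, 3, 21)]
bad  = [(3, 2, 9), (4, 5, 20), (7, 3, 21)]   # one mislabelled example: "3*2=9"

a_good = fit_multiplier(good)
a_bad = fit_multiplier(bad)
print(f"a (clean data):   {a_good:.3f}")   # exactly 1.000 -> correct multiplication
print(f"a (corrupt data): {a_bad:.3f}")    # > 1 -> systematically wrong products
```

A single bad example is enough to bias every future prediction, which is exactly why data quality matters more than data quantity.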
On the other hand, the AI algorithms: our “recipes” that tell us how to mix the “ingredients” correctly, how to use the available data to try to solve our problem. Algorithms allow us to build computer systems that simulate human intelligence when automating tasks. However, not all algorithms can be used to solve any type of problem. “Inside” these algorithms there are mainly mathematical and statistical formulas proposed decades ago, whose principles have advanced little in recent years, but which are now more effective thanks to (1) the increase in the amount of available data and (2) the increase in computing power (which allows much more complex calculations in less time and at low cost). However, skills such as intuition, creativity or consciousness are human abilities that (for now) we have not been able to transfer to a machine effectively. Therefore, our “chefs” and our “tasters” will be in charge of contributing these human factors in our particular “kitchen”.
That is why not all problems can be solved using AI. Because neither data are capable of “speaking” by themselves (they are not “carriers” of the absolute truth) nor are algorithms “seers” capable of guessing the unpredictable. What data and algorithms really know how to do is answer the questions we ask them based on the past, as long as the questions asked are the right ones. After the failure of a machine, how is the data provided by the sensors that monitor the machine mathematically related to the failure produced? When an image is analysed, how similar is it to images that have been previously analysed? When a question is asked of a virtual assistant, what answer has been given (by humans) more frequently in the past to that same question? It is therefore about questioning the data in the correct way so that they reveal the information we want.
Over the last century, AI has survived several technological ‘winters’ of scarce funding and research, mainly caused by the uncontrolled enthusiasm placed in the technology in the years before2. It's time to “learn” from our historical data and not make the same mistakes again. Let's acknowledge AI for the capabilities it really has, and leave to wizards the ability to make the impossible come true. Only in this way will AI enter its perpetual spring.
The energy sector is undergoing a deep transformation in response to the need to combat climate change and thus contribute to the sustainability of life on our planet. This is being articulated through the so-called “Energy Transition”, which involves two big changes in the electricity grid. On the one hand, traditional centralised generation is being replaced by an increasing number of distributed renewable generation plants located closer to the final consumer. In addition, the number of “self-consumers”, i.e. consumers capable of producing renewable energy, mainly photovoltaic, for their own use, is increasing. On the other hand, we are witnessing a growth in the demand for electricity, with new needs such as electric vehicles and the air-conditioning of buildings.
All this results in greater complexity of the electricity grid, especially the distribution grid but also the transmission grid, because the flow of electricity is no longer unidirectional but bidirectional. A more flexible management system is essential to make the transmission and distribution of electricity more efficient. Grid operators also need new technologies and tools to ensure a reliable, high-quality service. These changes, which are already part of the present, are made possible by the evolution of traditional electricity grids towards smart grids.
The smart grid concept refers to a new feature of the electricity grid: in addition to transporting energy, it also transports data. To achieve this, digital technologies are needed to facilitate two-way communication between the user and the grid, IT and home automation tools to manage demand flexibility and distributed generation and storage resources, as well as the necessary technology and equipment capable of responding to volatile renewable generation.
One of the threats to guaranteeing an adequate, quality supply to the different players in the medium- and low-voltage network is faults. It is necessary to have the means to locate them quickly and to restore continuity of supply, after a reconfiguration of the network whenever that helps alleviate the effects of the fault, in the shortest possible time.
There are two indices for measuring the quality of supply in an electricity system: SAIDI (System Average Interruption Duration Index) and SAIFI (System Average Interruption Frequency Index). The SAIFI index takes into account the number of unavailabilities per user, while the SAIDI index takes into account the cumulative time of unavailability. These unavailabilities are generated by various types of faults, chiefly earth faults and phase faults, the former being the more frequent.
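In their usual formulation (e.g. in IEEE Std 1366), both indices are simple ratios over the interruptions recorded in a period; a minimal sketch, with made-up example figures:

```python
# SAIDI / SAIFI from a list of interruptions, following their usual definitions:
# each interruption affects some number of customers for some duration.

def saifi(interruptions, customers_served):
    """Average number of interruptions per customer served."""
    return sum(n for n, _ in interruptions) / customers_served

def saidi(interruptions, customers_served):
    """Average cumulative interruption duration (minutes) per customer served."""
    return sum(n * minutes for n, minutes in interruptions) / customers_served

# (customers affected, outage duration in minutes) -- invented example values
events = [(1_000, 30), (250, 120), (4_000, 5)]
print(f"SAIFI = {saifi(events, 50_000):.3f} interruptions/customer")  # 0.105
print(f"SAIDI = {saidi(events, 50_000):.2f} min/customer")            # 1.60
```

Note how a long outage affecting few customers and a short outage affecting many contribute very differently to each index, which is why both are tracked.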
When an earth fault occurs in a medium-voltage distribution network, the circuit breaker of one of the outlets of the high-voltage to medium-voltage transformer station trips by means of the earth fault protection.
Subsequently, and in order to rule out a transient fault, the reclosing function operates, closing the circuit breaker. If the fault persists, tripping is repeated until the number of reclosings provided has been exhausted. If the fault is permanent, the affected part of the network will be out of service, and it will be necessary to locate the fault and reconfigure the network in order to continue providing service to as many users as possible.
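The trip-and-reclose sequence just described can be sketched as a small state routine (a simplification of ours, not actual protection-relay firmware):

```python
# Sketch of the reclosing sequence described above: after a trip, the breaker
# recloses up to `max_reclosings` times; if the fault is still present after
# the last attempt, the breaker locks out and the fault must be located.

def reclosing_sequence(fault_is_present, max_reclosings=3):
    """`fault_is_present(attempt)` -> True if the fault persists at that attempt.
    Returns 'restored' for a transient fault, 'locked_out' for a permanent one."""
    for attempt in range(1, max_reclosings + 1):
        if not fault_is_present(attempt):
            return "restored"          # transient fault: supply comes back
    return "locked_out"                # permanent fault: locate and reconfigure

transient = lambda attempt: attempt < 2      # fault clears before the 2nd attempt
permanent = lambda attempt: True             # fault never clears

print(reclosing_sequence(transient))   # restored
print(reclosing_sequence(permanent))   # locked_out
```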
Traditionally, following the detection of a permanent fault by the telecontrol equipment, a remote reconfiguration operation can be carried out from the control centre. This operation is performed by an operator following a defined protocol, and can take several minutes at best.
A modern, automated network allows this protocol to be carried out without operator intervention, automatically between the telecontrol equipment. This network feature is known as self-healing, and it allows the network to reconfigure itself autonomously in the event of a permanent fault, without manual intervention from the control centre. This significantly speeds up the restoration of the power supply.
CARTIF has developed, within the framework of the INTERPRETER project (H2020, GA#864360), an assistance tool aimed at medium- and low-voltage grid operators. This tool, known as GCOSH-TOOL, helps to evaluate different scenarios by applying different action protocols in the event of the appearance of one or more faults in the network. Its operation is based on solving a sequence of optimisation problems with different constraints and objective functions, which makes it possible to calculate the power to be delivered to each customer while ensuring that demand is met. To do this, a reconfiguration of the grid will be necessary to ensure electricity supply to the largest possible number of users in the scenario chosen by the operator, based on technical and economic objectives.
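The actual GCOSH-TOOL solves full optimisation problems; as a loose, hypothetical illustration of the underlying idea only (maximising the number of users served under a capacity constraint), a greedy sketch might look like this, with all section names and figures invented:

```python
# Hedged sketch (NOT the actual GCOSH-TOOL algorithm): after a fault, choose
# which feeder sections to re-energise so as to serve as many customers as
# possible without exceeding the remaining capacity of the backup path.

def reconfigure(sections, capacity_kw):
    """Greedy selection of (name, customers, load_kw) sections by customers/kW."""
    served, load = [], 0.0
    for name, customers, load_kw in sorted(
            sections, key=lambda s: s[1] / s[2], reverse=True):
        if load + load_kw <= capacity_kw:
            served.append(name)
            load += load_kw
    return served

sections = [("A", 400, 300.0), ("B", 900, 500.0), ("C", 150, 250.0)]
print(reconfigure(sections, capacity_kw=800.0))  # ['B', 'A']: 1300 customers served
```

A real tool replaces this greedy rule with constrained optimisation (power-flow limits, voltage constraints, economic objectives), but the trade-off it navigates is the same.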
The smart grids of the future will be more flexible and reliable than traditional grids and will provide users with a higher quality of electricity supply. Users will be connected in real time, receiving and providing information that will allow them to optimise their own electricity consumption and improve the operation of the overall system (active demand management). On the other hand, the trend towards distributed generation from renewable sources leads to a structure of interconnected microgrids with the capacity to reconfigure themselves automatically in the event of a breakdown. The rapid evolution of technology is allowing these changes to take place very quickly, so the so-called energy transition is becoming a reality, and we already have the infrastructure in place to reduce CO2 emissions, thus helping to curb climate change.
Identity and user data theft, ransomware, phishing, pharming and denial-of-service attacks are terms that appear more and more in the media1,2,3,4. The hyper-connected world in which we live also affects companies which, as productive entities, are increasingly exposed to becoming the target of cybercrime5,6,7. Existing campaigns to raise awareness of cybersecurity are very diverse, but how can companies protect themselves against all these threats without compromising their business objectives?
Traditionally, cybersecurity orchestration in industrial environments has been delegated almost exclusively to the company's IT department, which has focused on protecting office networks by applying well-known standards and regulations such as ISO/IEC 27001, ISO/IEC 15408 or ISO/IEC 19790. For these cybersecurity expert teams, “the best defense is a good offense”. This maxim, attributed to the Chinese general Sun Tzu (author of “The Art of War”, considered a masterpiece on strategy), underlies what are known as penetration tests (or pentesting). Penetration tests are basically a set of simulated attacks against a computer system with the sole purpose of detecting exploitable weaknesses or vulnerabilities so they can be patched. Why are these tests so important? Several studies show that most attacks exploit known vulnerabilities, collected in databases such as CVE, OWASP or NIST, that for various reasons have not yet been addressed8,9.
In the IT sector, some of the most popular security audit methodologies and frameworks for pentesting are the Open Source Security Testing Methodology Manual (OSSTMM), the Information Systems Security Assessment Framework (ISSAF), the Open Web Application Security Project (OWASP) and the Penetration Testing Execution Standard (PTES). Each of these methodologies follows a different strategy to perform the penetration test according to the type of application to be audited (native mobile apps, web applications, infrastructure...), making them complementary approaches.
On a practical level, IT teams have a large number of tools to perform these tests, including free and/or open-source as well as paid applications. Some of the best known are Metasploit (Community Edition), NESSUS (Personal Edition), Saint, Nmap, Netcat, Burp Suite, John the Ripper and Wireshark. Most of these tools come pre-installed in specific pentesting distributions such as Kali Linux, BlackArch Linux or Parrot Security.
However, the office networks the IT department is in charge of are not the only networks in an industrial company. Today, a growing number of production-related devices (PLCs, SCADA systems, ...) are interconnected by fieldbus networks that support the Internet TCP/IP protocol, such as PROFINET or MODBUS TCP. Thanks to the routing function available in some brands of PLCs, it is now possible to access, through gateways, field buses that could not be reached from the outside in the past, such as PROFIBUS. The interconnection between IT (Information Technology) and OT (Operation Technology) networks, so necessary when talking about Industry 4.0, greatly increases the chances of the industry being a target of cyberattacks.
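As a minimal, hedged illustration of why this interconnection matters, the sketch below (standard-library Python, hypothetical addresses) checks whether a TCP port such as 502, the well-known MODBUS TCP port, is reachable; spotting such exposed ports is typically the very first reconnaissance step in an authorised audit:

```python
# Minimal sketch of the first step of a network audit: checking whether a TCP
# port (e.g. 502, the well-known MODBUS TCP port) is reachable on a host.
# Only ever run this kind of probe against systems you are authorised to audit.
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Example (hypothetical address): an exposed PLC would answer on port 502.
# is_port_open("192.168.0.10", 502)
```

Dedicated tools like Nmap do this at scale and far more carefully; the point is only that once OT devices speak TCP/IP, they become discoverable with the same trivial techniques used on office networks.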
In the next article, we will talk about how we can defend ourselves against such a threat …
As we mentioned in our previous post, companies' OT (Operation Technology) networks are not exempt from cyberattacks. There have been multiple cyberattacks on industrial companies since the first registered one, in 2010, that had a direct impact on the physical world1. These security incidents affect a wide range of entities, from large technology companies to final product suppliers2. All industrial infrastructures, not only critical ones, are in the crosshairs of cybercriminals or crackers, and the OT sector is in a certain way “negligent”: almost 90% of the vulnerabilities and attack vectors present in an industrial system are identifiable and exploitable using strategies widely known to attackers, with 71% posing extremely high or critical risk, as they can partially or totally halt the company's production activity3.
Given this panorama, a series of questions arise: are there appropriate tools adapted to these OT network environments? Can cybersecurity experts protect the industrial OT scenario? The detection and exposure of vulnerabilities affecting the resources associated with OT networks, key elements in the automation of industrial plants, is a compulsory step in any penetration test. Once these vulnerabilities have been identified, it will be possible to take the necessary preventive measures, adapting existing solutions and well-known good practices from the IT environment to the OT world, rather than implementing them directly.
Some attempts to adapt existing standards are IEC 62443, based on the ISA 99 standard, which sets up the international reference framework for cybersecurity in industrial systems, and ISO/IEC 27019:2013, which provides guiding principles for information security management applied to process control systems. Regarding specific tools, we find, among others, the ControlThings platform, a specific Linux distribution for exposing vulnerabilities in industrial control systems, as well as tools dedicated to obtaining a real-time asset inventory of the OT infrastructure, such as IND from Cisco and eyeSight from ForeScout (both paid applications), or the open-source GRASSMARLIN, which passively maps the network and visually shows the topology of the different ICS/SCADA systems present in it. The different objectives liable to be attacked in an OT environment can be found in databases such as MITRE ATT&CK.
Nevertheless, these attempts at standardization are not enough, and it is essential to keep working on different fronts, supporting initiatives such as the following:
Allowing experts from the OT environment to take the initiative and learn how to protect their systems, training them in the correct way to commission the devices on these types of networks, making that commissioning easier for non-IT experts and thus avoiding misconfigurations due to a lack of the associated technical information (simplifying the security side of the process).
Improving the adaptation of SIEM (Security Information and Event Management) solutions to OT networks, so that they are less intrusive than current ones and can identify patterns that are typical of industrial process networks, allowing an early identification of anomalous situations4.
Putting into practice new ways of cyberprotecting industrial systems that do not rely on continuous software updates and/or periodic investments in them5.
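The SIEM adaptation mentioned above exploits the fact that OT traffic is far more regular than office traffic, so deviations from the learned baseline stand out. A toy sketch of that kind of check (all names, rates and the 3-sigma threshold are invented for illustration):

```python
# Toy sketch of the kind of pattern-based check an OT-aware SIEM could apply:
# flag a traffic-rate sample as anomalous if it deviates too far from the
# baseline learned on normal industrial traffic (threshold chosen arbitrarily).
from statistics import mean, stdev

def is_anomalous(sample: float, baseline: list, n_sigmas: float = 3.0) -> bool:
    """True if `sample` lies more than `n_sigmas` std devs from the baseline mean."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(sample - mu) > n_sigmas * sigma

# Messages/second seen on a fieldbus segment under normal operation (made up).
normal_rates = [50.0, 52.0, 49.0, 51.0, 50.0, 48.0, 51.0, 49.0]
print(is_anomalous(50.5, normal_rates))   # False: within normal behaviour
print(is_anomalous(120.0, normal_rates))  # True: e.g. a scanning/flooding attempt
```

Because this check is entirely passive (it only observes rates), it fits the "less intrusive" requirement: nothing is injected into the process network.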
Until not long ago, OT network systems ran disconnected from the outside world, and therefore with a false sense of security6. However, the protection of these OT environments should be prioritized, as should the creation of new professional profiles in OT cybersecurity, capable of understanding the needs and particularities of these specific environments.