Innovation is a marathon, not a sprint

Innovation isn't just about having a good idea. It's about fighting for it when nobody else does. It's about shaping it when it's still intangible, about falling a hundred times until, suddenly, something starts to shine.

We tend to think, when we talk about innovation, that a brilliant idea is enough to solve a small goal or problem or, on other occasions, a huge problem that can transform the world. The truth is that no brilliant idea goes very far unless it is carried forward by people who, with great effort and perseverance, manage to "materialise" it.

Innovations can emerge from any corner of an organization. Often, management levels find it easier to turn an intuition into a line of work, but it's also common for a good idea to emerge from the technical team, production, quality, or even administration. The real challenge is ensuring that the idea survives all the obstacles involved in becoming a reality. And there are many obstacles: from close colleagues and managers who don't support the idea because it is more comfortable to remain in their comfort zone, to obtaining internal or external financing. In other words, the challenge of innovation often lies in surviving the effort of going from an idea to a tangible reality.


History is full of brilliant ideas that died for lack of perseverance. But it's also full of projects that were born, not because they were obvious or perfect, but because someone insisted beyond reason. Only rarely do these ideas and the people behind them change the world as we know it (disruptive innovations).

To illustrate with a real-life example how brilliant ideas change the world, let's talk about one of the most striking stories of innovation: the 2014 Nobel Prize in Physics, awarded to Isamu Akasaki, Hiroshi Amano and Shuji Nakamura for the invention of the blue LED. That small invention is what allows us to have thin LED screens today, thanks to which you can read this on your laptop, mobile phone or tablet (and it has enabled many other applications).

Source: La Vanguardia. https://www.lavanguardia.com/ciencia/20141007/54416831597/nobel-fisica-2014-akasaki-amano-nakamura.html

For decades, red, green and yellow LEDs had been achieved. But not blue. And without blue, efficient white light, LED displays and low-consumption projectors couldn't be created. The world's leading companies and research centres tried and failed. It was simply too difficult.

The challenge lay in the base material. Gallium nitride (GaN) was the best option, but it was very complex to synthesize and dope. The crystals were defective. The light emission was unstable. Many tried and all gave up.

All except three people: Isamu Akasaki, Hiroshi Amano and Shuji Nakamura.

Akasaki and Amano, from Nagoya University, began experimenting in the 1980s.

Nakamura, an engineer at a small Japanese company called Nichia Corporation, continued his research largely self-taught, against the advice of those around him.

For years, they worked with limited resources, without visibility, suffering constant failures and rejection from the scientific community. But they persisted.

In 1993, Nakamura finally succeeded in developing the first highly efficient, commercially available blue LED. That blue light not only changed lighting, but also paved the way for new sustainable and more efficient technologies. In 2014, the three received the Nobel Prize in Physics for an innovation that took 30 years to see the light of day… literally.


I would encourage you to reread the story of this invention in more detail, because it truly is a great example of how perseverance and effort can literally change the world for the better. Furthermore, we can learn valuable lessons from this story:

  • Most companies abandon an idea if there are no results within six months. The blue LED took more than a decade to work, and decades to be recognized.
  • Understand innovation as a marathon, not a sprint.
  • True innovation requires more effort than genius.
  • Innovation must be sustained, even when there are no results.

Because in the end, innovation isn’t just about being right, it’s about having the determination to prove it when no one else sees it.


When “green” doesn’t come from doing an LCA, but only from a Pantone® colour

If my grandma had heard about green marketing, she would have raised an eyebrow and said: "That sounds like they're selling you the same thing… just with a pine-scented label."

And if I told her about the recent situation with Ursula von der Leyen, having to confirm her support for the Green Claims Directive after days of confusion in her team, she would say: “Typical, Laura… they say one thing in the morning, the opposite in the afternoon, and in the end, you don’t know if they’re talking about sustainability or horoscopes.”

And honestly, she wouldn’t be wrong.

In recent years, environmental sustainability has become a powerful marketing tool, but one not always supported by real actions. To stop the misleading practices known as "greenwashing", the European Union has worked on two key directives: the Consumer Empowerment Directive (2024/825), already approved and waiting to be transposed into Spanish law, and the Green Claims Directive, which sets clear rules so that environmental claims are based on real data. It was planned to start applying from 27 September 2026. But we say was because, just before its final approval, the text was suspended after disagreements in the European Parliament, which has left the proposal at a critical point, now depending on clarification and a common position among EU Member States.

This directive aimed to bring order to the confusing jungle of green labels. The goal: make sure any environmental claim (like “100% recycled” or “carbon neutral”) is checked and supported by solid data, such as a Life Cycle Assessment (LCA). In its most ambitious version, it even required using official methods like Product Environmental Footprint (PEF) or Organisation Environmental Footprint (OEF). But political discussions have diluted the content, and now it risks being forgotten. A shame, because people need protection from greenwashing and honest companies should be acknowledged. This law wasn’t meant to annoy them. Quite the opposite.



Meanwhile, pressure from consumers and civil organisations is already working. Just look at the recent cases of Coca-Cola and Adidas, who had to step back from their “green” messages after investigations into misleading advertising.

Source: Adidas

In Coca-Cola’s case, a complaint from European consumer and environmental groups led the Commission to act. The company agreed to change phrases like “made with 100% recycled plastic”, because it only referred to the bottle’s body, not the cap or label. Adidas, on the other hand, had to stop advertising a shoe line as “more sustainable” without explaining how or why. These cases show one thing clearly: it’s not enough to use a green leaf or the recycling symbol. It’s not about looking green, you have to prove it.

So, while some still confuse sustainability with decoration, we at CARTIF provide solid technical tools to help companies move towards models that are truly sustainable and transparent. Our Sustainability and Climate Neutrality team has worked for years with companies that want to improve and base their decisions on real, measurable data.

And how do we do that, without crystal balls or green leaves? With tools like these:

  • Life Cycle Assessment (LCA): because understanding a product’s environmental impact requires robust calculations based on ISO standards, not just guessing.
  • Environmental footprinting: starting with carbon (the celebrity of the group), but also including others like acidification or land use… to support decisions that are grounded in real impact, not in excuses (a short calculation sketch follows this list).
  • Eco-labelling and green communication support: because telling the truth also needs practice.
  • Eco-design strategies: because if something is poorly designed from the beginning, no label can save it. This is where sustainability starts, in the plans, the materials, the packaging… and yes, even in the stylish decisions (with less waste and more purpose).
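
To make the footprinting idea above a little more concrete, here is a deliberately minimal sketch in Python: activity data multiplied by emission factors. The product, amounts and factors are illustrative placeholders, and a real LCA or PEF study under ISO 14040/14044 involves a full life-cycle inventory, allocation rules and several impact categories, not just carbon.

```python
# A deliberately simple carbon-footprint calculation: activity data times
# emission factors. All names and numbers are illustrative placeholders,
# not real factors from an LCA database.
activity_data = {              # amounts per unit of a hypothetical product
    "electricity_kWh": 12.0,
    "PET_granulate_kg": 0.4,
    "road_transport_tkm": 0.8,
}
emission_factors = {           # kg CO2e per unit of activity (illustrative)
    "electricity_kWh": 0.25,
    "PET_granulate_kg": 2.2,
    "road_transport_tkm": 0.11,
}

footprint = sum(amount * emission_factors[item]
                for item, amount in activity_data.items())

print(f"Cradle-to-gate carbon footprint: {footprint:.2f} kg CO2e per product unit")
# A real LCA/PEF study adds a full inventory, allocation rules and several
# impact categories (acidification, land use...), not only carbon.
```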

By combining all these tools, our mission is to help companies move towards models that are not only more environmentally sustainable, but also more honest and consistent. We guide them to measure, improve and communicate (in that order). We want them to share their sustainability story with confidence, and make sure their storytelling matches their storydoing.

And to Ursula, we ask just one simple (but urgent) thing: don’t leave out the companies doing things right. The ones that choose to measure, improve and communicate with transparency while competing with those selling green smoke.

Because yes, it is possible to talk about sustainability without green make-up. All it takes is rigour, commitment… and a bit of common sense. Just like my grandma had.

How Did We Recover from the Blackout?

By now, we’re probably all tired of hearing every kind of theory—some quite colorful—about the causes behind the April 28th blackout. But what has received far less media attention is the set of technical solutions that made it possible to restore power to a peninsula with over 50 million people. That’s precisely the focus of the following paragraphs.

Although an official report already outlines the causes of this blackout in our electrical system, one word echoes across the entire chain of missteps: frequency. In electrical terms, frequency refers to the rate at which alternating current switches polarity (from positive to negative and back), and it must always remain constant—50 Hz in the Iberian Peninsula—since the entire grid infrastructure is designed to operate under that non-negotiable condition.

That frequency, however, has become a point of media debate, sometimes used to criticize renewable energies, and other times to advocate for the unchecked use of fossil fuels. Yet there’s one renewable source—less flashy, quieter, but vital—that plays a key role in frequency control: hydropower.

Just like other technologies such as nuclear plants or gas turbines, hydropower generates electricity through synchronized rotation of mechanical components, which allows it to contribute directly to maintaining the system’s 50 Hz frequency. On the other hand, technologies like solar photovoltaics and wind—while essential to the energy transition—lack this direct regulatory capability and are also highly sensitive to frequency deviations due to their power electronics. It’s a vicious cycle.

But what truly made hydropower a star after the blackout was its black start capability—the ability to start a power plant without relying on the external grid. Only a few plants in the system have this feature, and in Spain, most of them are hydroelectric stations with reservoirs. Thanks to their design, they can start their turbines using only auxiliary batteries or diesel generators, harnessing the pressure from stored water.

That’s exactly what happened after the “electrical zero” of April 28. Plants such as Aldeadávila, Ricobayo, or Riba-roja d’Ebre started operating autonomously, injecting the first kilowatts into a completely dark grid. Spain’s transmission system operator, Red Eléctrica de España (REE), coordinated these plants to create small “electrical islands,” where both frequency and voltage were stabilized before rebuilding the interconnected grid from there.

In this scenario, the challenge wasn’t just to generate electricity again, but to ensure power quality—which primarily means keeping frequency and voltage within very tight margins. To achieve this, power systems rely on balancing mechanisms such as primary, secondary, and tertiary regulation, each reacting at a different time scale to generation-demand imbalances.


The first step was to activate primary regulation, which responds immediately to frequency deviations. Here, the islanded hydropower plants were able to autonomously maintain stable frequency within their sub-networks. Once stabilized, secondary regulation (AGC) was activated from REE’s control center to fine-tune the frequency to its nominal 50 Hz, supporting the primary regulation. This phase was enabled by remote communication and the fast response capability of hydropower turbines.
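
To make the interplay between primary and secondary regulation a little more tangible, here is a minimal, textbook-style single-area frequency model in Python. It is only a sketch under simplifying assumptions (the aggregate inertia, load damping, governor droop, AGC gain and load step are all illustrative values), not REE's actual control scheme.

```python
# Minimal single-area frequency-response sketch (per-unit, textbook model).
# Not REE's actual restoration scheme; all parameters are illustrative.
F_NOM = 50.0        # nominal frequency [Hz]
H = 5.0             # aggregate inertia constant [s]
D = 1.0             # load damping [pu power / pu frequency]
R = 0.05            # governor droop of the islanded hydro units (5 %)
KI = 0.3            # integral gain of secondary regulation (AGC)

dt, t_end = 0.1, 300.0
delta_f = 0.0       # frequency deviation [pu]
agc = 0.0           # secondary-control correction [pu]
load_step = 0.10    # block of demand reconnected at t = 0 (+10 % load) [pu]

for _ in range(int(t_end / dt)):
    # Primary regulation: immediate droop response of the turbine governors.
    p_primary = -delta_f / R
    # Secondary regulation (AGC): slow integral action that restores 50 Hz.
    agc += -KI * delta_f * dt
    # Aggregate swing equation: 2H d(delta_f)/dt = P_mech - P_load - D*delta_f
    delta_f += dt * (p_primary + agc - load_step - D * delta_f) / (2.0 * H)

print(f"Frequency after primary + secondary action: {F_NOM * (1 + delta_f):.2f} Hz")
# With droop alone the island would settle slightly below 50 Hz;
# the AGC term is what brings it back to the nominal value.
```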

As more zones regained voltage, hydropower plants increased output or transferred load to other technologies, such as combined-cycle gas plants. This process released reserves through tertiary regulation, which also activated pumped-storage plants—like Estany Gento in the Pyrenees—that acted as giant batteries, providing extra support over the following hours and days.
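
To get a rough sense of the "giant battery" comparison, the sketch below estimates the electrical energy available from an upper reservoir using the basic potential-energy formula. The volume, head and efficiency are hypothetical round numbers chosen for illustration, not the actual figures for Estany Gento or any other plant.

```python
# Back-of-the-envelope energy content of a pumped-storage upper reservoir.
# Volume, head and efficiency below are illustrative, not real plant data.
RHO = 1000.0        # water density [kg/m3]
G = 9.81            # gravitational acceleration [m/s2]

volume_m3 = 1.0e6   # 1 hm3 of stored water (hypothetical)
head_m = 400.0      # gross head between reservoirs (hypothetical)
efficiency = 0.85   # generation efficiency (hypothetical)

energy_joules = RHO * G * volume_m3 * head_m * efficiency
energy_mwh = energy_joules / 3.6e9   # 1 MWh = 3.6e9 J

print(f"Roughly {energy_mwh:.0f} MWh of electricity from this reservoir")
# ~930 MWh: several hours of output for a mid-sized plant, which is why
# pumped storage behaves like a very large battery for the system.
```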

In short, the April 28 blackout not only tested the resilience of Spain’s electrical system—it also highlighted the strategic value of hydropower. In today’s context of electrification and energy transition, it’s becoming increasingly clear that we need flexible technologies capable of modulating output, storing energy, or responding to demand.

At CARTIF, we are actively working in this direction through European projects like D-HYDROFLEX and iAMP-Hydro, which aim to modernize existing hydropower stations through hybrid systems and intelligent control. The goal: to provide these facilities with greater flexibility, efficiency, and stabilization capacity, contributing to the development of a more robust, sustainable, and future-ready electric system.

Preserving what we are: a new technical perspective on the maintenance of built heritage

Beneath the vaults of a Gothic church, within the thick walls of a Cistercian monastery, in the stucco of a Renaissance palace or the rammed earth and timber frames of a traditional house, a single truth emerges: built heritage is an essential part of our history and collective identity. It is a physical legacy made of stone, wood, lime, brick or raw earth, conceived with construction wisdom adapted to its time.

Today, however, many of these buildings are deteriorating, standing empty and, far too often, disappearing without ever having been given a second chance. Their loss is accelerated by the lack of contemporary use, societal passivity, the absence of maintenance plans, the associated costs and, above all, something rarely discussed or deliberately overlooked: a technical misunderstanding of how they were built.

Lifecycle of the Monastery of Nuestra Señora del Prado (Valladolid), pilot building of the INHERIT project. Source: own elaboration

How can we preserve what we don't understand? How can we maintain with sound judgement if we don't know how something was built, why specific materials were used, or what structural logic underlies it? Preventive conservation is not a trend; it is an urgent necessity if we want to safeguard our cultural heritage with rigour and responsibility.

At CARTIF, we believe it is essential to research and develop technical, innovative, yet realistic and implementable solutions that address this challenge through knowledge and respect for what has already been built. We aim to contribute to a smarter, more useful conservation approach, one that avoids improvisation and standard formulas, and instead promotes a deep understanding of how things were constructed, in order to care for them better. We are convinced that heritage conservation is a collective process: a way of valuing what connects us, engaging citizens, and reinforcing our bond with the built environment.

Projects we have been involved in, such as INHERIT and iPhotoCult, support this vision and underscore the need for a new technological perspective on heritage conservation. We already explored this line of thought in our blog post “A proper approach to inspecting historic buildings”; if you’re interested in digging deeper, we recommend giving it a read.

Historic buildings do not follow the rules of modern construction. Their materials (lime, brick, stone, wood, earth) are porous, natural, and adapted to local climates and contexts. Their construction systems (load-bearing walls, vaults, timber roof frames) obey a different logic. Assessing them using the same technical criteria as reinforced concrete or steel buildings is not only incorrect, it's unjust.

We need tools that speak the language of built heritage. A specific approach that values their unique technical nature, because constructive diversity is not a problem, it’s a valuable asset.

Today, many diagnostic inspections still rely almost exclusively on the expertise of the technician conducting them. While that professional judgement is valuable, even essential, it becomes insufficient if the data gathered is not structured in a consistent, traceable and useful way for follow-up actions such as maintenance planning, rehabilitation, or risk assessment.

Workflow towards preventive maintenance based on HBIM: from data collection to knowledge. Source: own elaboration

That’s why we believe it is crucial to open the debate and move towards the development of a methodological proposal that addresses the specific needs of this field, through clear technical criteria and a systematic approach that enables us to:

  • Identify and evaluate historical construction systems according to their own internal logic.
  • Detect and structure deterioration symptoms by technical domain (foundations, structure, façades, roofs, interior partitions and finishes, metalwork and joinery, accessibility, installations and smart systems).
  • Assess associated risks, whether physical, functional or environmental.
  • Generate structured, reusable data that can be connected to digital tools such as H-BIM models or maintenance platforms.
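
As a minimal illustration of the last point, here is one possible way (in Python) to structure an inspection finding so that it can later be linked to an H-BIM element or a maintenance platform. The class, field names and values are hypothetical, not an established schema from the projects mentioned above.

```python
# One possible way to structure an inspection finding so it can feed an
# H-BIM model or a maintenance platform. Field names are hypothetical,
# not an established schema or standard.
from dataclasses import dataclass, field, asdict

@dataclass
class DeteriorationRecord:
    element_id: str                 # H-BIM element the finding refers to
    technical_domain: str           # foundations, structure, facades, roofs...
    symptom: str                    # observed deterioration symptom
    severity: int                   # 1 (minor) to 5 (critical)
    risk_type: str                  # physical, functional or environmental
    evidence: list[str] = field(default_factory=list)  # photos, sensor files

record = DeteriorationRecord(
    element_id="nave_vault_03",
    technical_domain="structure",
    symptom="cracking along the vault crown",
    severity=3,
    risk_type="physical",
    evidence=["IMG_2041.jpg"],
)

print(asdict(record))   # structured, traceable data ready to export or reuse
```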

This approach does not aim to simplify through standardisation, but to intelligently unify technical criteria through consensus among professionals, adapting to different contexts and typologies while respecting the architectural and cultural diversity of the built heritage. It remains fully aligned with current regulatory frameworks, such as the UNE 41805 standard for building diagnostics, and takes as a reference the National Preventive Conservation Plan of Spain’s Institute of Cultural Heritage (IPCE).

Adopting a technical methodology adapted to heritage buildings offers tangible benefits for technicians, companies and public administrations alike:

  • Reduced medium- and long-term costs by avoiding emergency interventions.
  • Greater transparency and traceability through structured, comparable data across buildings.
  • Enhanced appreciation of traditional technical knowledge, acknowledging the logic and effectiveness of historic systems and materials, while also addressing professional niches that currently lack recognition.
  • Real support for decision-making without replacing professional judgement.
  • Seamless integration with digital models and H-BIM platforms to plan maintenance, evaluate deterioration risks, monitor material ageing or assess energy performance (when appropriate).

These tools are key to achieving a more useful and proactive form of management, enabling better planning, fewer interventions, and more effective conservation, helping us move towards sustainable, resilient, resource-efficient and ultimately cost-effective heritage.

The potential of this approach does not end with inspection or diagnostics. It opens the door to digital tools capable of integrating 3D models, geolocated imagery, environmental or structural sensors, and damage monitoring systems, or even AI-based tools capable of predicting deterioration patterns.

Workflow applied to the former collegiate church of Nuestra Señora de la Asunción in Roa (iPhotoCult project), with data acquisition using a ground-based robotic platform (UGV). Source: own elaboration

But none of this will be useful without a solid foundation: reliable, technically sound and well-structured data. Because technology alone doesn’t preserve buildings. It’s people, with sound judgement, supported by tools that respect and understand what has been built.

Built heritage is not merely a collection of old stones. It is a living expression of our identity, our way of inhabiting space, our craftsmanship, our decisions and our memory. And today, more than ever, preserving it is a way of taking care of ourselves as a society.

Chemical recycling of textile waste: a new life for fiber blends

Clothing and textile consumption has increased with the expansion of so-called "fast fashion", giving rise to enormous amounts of waste. In Europe, the European Environment Agency (EEA)1 reports that each EU citizen purchased an average of 19 kg of clothing, footwear and household textiles in 2022, up from 17 kg in 2019. Furthermore, around 6.94 million tons of textile waste were generated in the EU. However, collection infrastructure has not kept pace with this growth, and most of this waste is not recovered. Only around 15% of textile waste in Europe is collected separately or recycled, meaning the remaining 85% ends up in mixed waste, incinerated or landfilled without any second life.

In Spain, the situation is also worrying. Our country exceeds the European average in fashion consumption, with an estimated generation of nearly 900,000 tons of textile waste per year. According to the Spanish Federation for Recovery and Recycling (FER)2, only 11% of used clothing in Spain is collected in dedicated containers. This enormous waste of materials reflects the fact that the vast majority of our used clothes never find a second life.

Why is clothing recycled so little? One of the main obstacles is the composition of the garments themselves. They are often made from blends of fibers, for example, a T-shirt with 50% cotton and 50% polyester, or fabrics that combine polyester with viscose. In fact, most post-consumer textile waste contains combinations of synthetic and natural fibers. These blends, along with the dyes and additives applied to the fabrics, make traditional mechanical recycling, which involves shredding used garments to obtain reusable fibers, extremely difficult.

This process requires fairly pure and uniform waste streams to be successful. If we introduce a blend of cotton and polyester into the shredder, we will obtain a mass of mixed fibers of different natures that cannot be easily spun into new, high-quality yarn. Furthermore, with each recycling cycle, the fibers become shorter and weaker. Therefore, mechanical recycling typically repurposes recovered fibers into lower-value products—a process known as “downcycling”—such as insulation, cushion stuffing, or construction materials, rather than being converted back into clothing.

When a garment contains multiple types of fibers glued together or includes complex chemical treatments, it often cannot be recycled mechanically at all, and that mixed garment ends up directly in the landfill. In short, our current garments are full of mixtures and finishes that traditional recycling can’t separate, and they end up wasted.


Faced with this problem, chemical recycling is positioned as a promising solution. Through processes such as selective dissolution or depolymerization, it allows fabrics to be broken down at the molecular level and their basic components recovered: cellulose, plastic monomers, or new regenerated fibers. Instead of shredding or melting, the raw material is “rewound” to “start over.”

Some recent examples show that this technology is already taking steps towards industrial reality. The German startup Eeden, for example, is building a pilot plant to recycle cotton and polyester blends. Its process allows for the recovery of high-purity cellulose from cotton and polyester monomers (such as terephthalic acid), which can be reused in the manufacture of new fibers.

For their part, BASF and Inditex have developed Loopamid®, the first nylon 6 recycled entirely from textile waste. Thanks to a chemical depolymerization and repolymerization process, it is possible to obtain a new polymer with comparable quality to the original, which has already been used to manufacture garment prototypes.

Although still an emerging technology, chemical recycling is demonstrating its ability to close the textile loop even in the most complex cases, and will be key to moving toward truly circular fashion.


In short, chemical recycling of textile waste is emerging as a necessary and complementary solution to mechanical recycling to address the fashion waste crisis. Faced with an ever-increasing volume of discarded clothing—and, especially, the challenge posed by mixed-fiber garments, omnipresent in our wardrobes—chemical technologies offer the possibility of recovering materials with original quality, overcoming the limitations of traditional methods. Although they still need to be scaled industrially and reduced in cost, it has already been demonstrated that it is technically feasible to convert a used garment into a new one, separating polymers and removing impurities in the process.

In the future, combining better designs (longer-lasting and more recyclable garments), responsible consumption, efficient collection and sorting systems, and all available forms of recycling will be key to achieving a truly circular economy in the textile sector. In this scenario, chemical recycling will become an essential ally, so that the used T-shirt we consider waste today can be "reborn" as high-quality raw material, reducing the environmental burden of fashion and closing the textile cycle.


1 European Environment Agency (2025). https://www.eea.europa.eu/en/newsroom/news/consumption-of-clothing-footwear-other-textiles-in-the-eu-reaches-new-record-high#:~:text=The%20average%20EU%20citizen%20bought,the%20EU%E2%80%99s%20textile%20value%20chain

2 Federación Española de la Recuperación y el Reciclaje (FER). Info Textil (2024). Textile waste statistics in Spain: https://www.recuperacion.org/info-textiles/

Meat by-product valorization: a scientific recipe for reducing waste

I’ll start by adapting a saying: “One man’s by-product is another man’s treasure.” That is, we can use the waste generated during the production stages of the industry—in this case, the agri-food industry—in a wide variety of ways and with a multitude of applications in different areas.

And how is this done? Well, in our case, we extract (or at least try to) various components of meat byproducts, such as proteins, by applying a series of “tricks” in the laboratory.

To put things in context, let’s first give a brief introduction to proteins. They have a series of properties, such as their structure, that we can use to our advantage to extract them from the matrix in which they are found. As you know, the basic organization of proteins is a “skeleton” of amino acids, known as the primary structure, which, depending on its combination, results in one protein or another. However, apart from this basic organization, we will also have other, slightly more complex aspects: the folding and three-dimensional structure of that chain of amino acids, known as secondary, tertiary, and quaternary structures. This spatial organization is what allows proteins to perform their multiple functions, because it gives rise to the physicochemical interactions between them and other components, applicable from the cellular level to the component level within a food.

Source: Instagram @ifas_publication

Once the theoretical framework is introduced, we can delve into the practical part, which is more entertaining, or so they say. If, in the laboratory, we change some condition around our protein of interest, such as temperature or pH, we can disturb it enough for it to denature. When a protein denatures, it loses its three-dimensional structure, sometimes in a more drastic and therefore irreversible way. This allows us to extract proteins and uncouple them from the rest of the components, because we have altered the chemical bonds that held everything in place.

One way to denature proteins is to adjust the pH at will. By changing the pH of the protein-containing sample, we change the interactions between the proteins and the medium, altering their structure and behavior, for example their solubility. First, we shift the pH so that the proteins leave the matrix and dissolve in water. Once they are separated from the rest of the sample, we shift the pH again, towards their isoelectric point, so that they no longer have net charges available to interact with water and they precipitate. Finally, by separating the precipitate from the liquid, we isolate them from the rest of the components of our raw material and obtain a protein concentrate.

Source: https://labster-image-manager.s3.amazonaws.com/v2/PRD/8c2fc0e1-7746-4ae8-a261-37206bc736de/PRD_Denaturation_definition.es_ES.png
Credit: David Baker
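
For readers curious about the numbers behind this, the sketch below estimates the net charge of a hypothetical protein as a function of pH using the Henderson-Hasselbalch relation for its ionisable groups. The group counts and pKa values are generic textbook approximations, not data for any real by-product, but they show why the net charge crosses zero at the isoelectric point, which is precisely where the pH-shift method makes proteins precipitate.

```python
# Net charge of a hypothetical protein vs. pH (Henderson-Hasselbalch).
# Group counts and pKa values are generic textbook approximations.
basic_groups = {"N-terminus": (1, 9.0), "Lys": (10, 10.5), "Arg": (6, 12.5), "His": (3, 6.0)}
acidic_groups = {"C-terminus": (1, 3.1), "Asp": (8, 3.9), "Glu": (12, 4.1), "Cys": (2, 8.3), "Tyr": (4, 10.1)}

def net_charge(ph):
    """Sum of the average charges of all ionisable groups at a given pH."""
    positive = sum(n / (1 + 10 ** (ph - pka)) for n, pka in basic_groups.values())
    negative = sum(-n / (1 + 10 ** (pka - ph)) for n, pka in acidic_groups.values())
    return positive + negative

for tenth_ph in range(30, 120, 5):       # pH 3.0 to 11.5 in steps of 0.5
    ph = tenth_ph / 10
    print(f"pH {ph:4.1f} -> net charge {net_charge(ph):+6.1f}")

# Around the pH where the net charge crosses zero (the isoelectric point),
# electrostatic repulsion and hydration are minimal, so the protein
# aggregates and precipitates: that is what pH-shift extraction exploits.
```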

And now it’s time to get creative, because after the intricate laboratory process, we move on to the kitchen! These proteins we’ve obtained can be used, for example, as a dietary supplement or as an ingredient in food. This opens the door to endless possibilities, but without forgetting the most important thing: we reduce industrial waste, eliminating byproducts and enabling product development and improvement, because, as they say, nothing goes to waste here!

And that, among many other things, is what we do at CARTIF: we try to use the byproducts of the agri-food industry as widely as possible to reduce the waste it creates, while always supporting a healthy diet.