Consciousness is not a computation, it is quantum

Much of the recent hype around Artificial Intelligence (AI) is directly related to the potential of computers to imitate or surpass the capabilities of the human brain, in terms of the volume of data handled and the speed of processing. In 2009, the neuroscientist Henry Markram announced a project intended to simulate the human brain on a supercomputer, with objectives such as “understanding perception or reality and maybe even understanding physical reality as well”.

The so-called “technological singularity” marks the moment when AI and robotics will surpass us humans. There are different predictions about when this “apocalypse” will occur: Elon Musk has placed the singularity in 2025, the Russian millionaire Dmitri Itskov in 2045, to cite two examples. The continuous advance of microprocessor capabilities also feeds, wrongly, the AI hype. Anyone who compares only the number of neurons in the brain (around 86 billion) with the number of transistors in Apple’s latest M1 chip (16 billion) might be tempted to claim that the “computing capacity” of the human being is easily surpassed. I know, comparisons are odious, and in this case, very daring.

Until very recently I was among those awaiting such predictions, albeit with a reasonable degree of scepticism. All that changed for me during the harshest days of the 2020 lockdown. I was wandering around YouTube in search of interesting videos related to AI and came across a very curious one that gives this post its title and piqued my curiosity: consciousness is not a computation1. In this video, a more than lucid Sir Roger Penrose, physicist, mathematician and philosopher, is interviewed by the vlogger Lex Fridman, an expert in AI and autonomous driving.

I have to say that, even though the scientific level of what is discussed in the video is very high, the lucidity, detail and kindness shown by Penrose caught me and kept me attentive throughout the whole interview. In particular, there is a part that pinned me to my chair and made me rewind several times to try to grasp as many details as possible. The interview starts directly with this devastating thesis: “I’m trying to say that whatever consciousness is, it’s not a computation… it’s not a physical process which can be described by computation”.

During the interview, Penrose explains how his curiosity about neurophysiology led him to explore the basic principles of physics, cosmology, mathematics and philosophy in his 1989 book “The Emperor’s New Mind”, where he proposed that human thought could never be emulated by a machine, against the “mainstream” thesis of the time that computers using artificial intelligence would soon be able to do everything a human can do.

What leads him to assert so bluntly that human consciousness cannot be emulated by a computer? Isn’t it supposed that, by joining enough of our computers’ chips, one could surpass the number of neurons in our brain and its computing capacity (if you allow me this crude comparison)? Just as life is not merely a set of cells grouped into organs, “emulating” the brain’s capacities is not a question of grouping a large number of transistors and their electrical impulses. We all remember the explanations of how neurons transport information through electrical impulses. In his analysis of brain physiology, Penrose, even at the end of his book, could not fully explain how nervous signals could be transmitted consistently across the brain by electrical impulses. Something did not fit, or was missing, in his theory. It seems that one reader of his book, the anaesthesiologist Stuart Hameroff, was the only one who figured it out. “I think you have missed something. Don’t you know what microtubules are?” he said to Penrose. “They are what you need to make your theory work.” Microtubules could be the answer to Penrose’s search for a non-computable source of human consciousness, from a physiological point of view.

But what on earth are microtubules? May molecular biologists forgive me, but it seems they are molecular structures of tubular shape that can be found in different cells of our body, from red blood cells to neurons. These structures, which “inhabit” the interconnections of our grey cells, have the property of conserving their state in a very effective way (a quantum-type state, but we will leave that for another post) and allow us to somehow return to being the same as we were after a loss of consciousness, for example after anaesthesia. We could say that microtubules are the basic (quantum) storage unit of our brain. Some scientists call them “the brain of the neuron”.

Another argument for aspiring to emulate the brain has been the possibility of replicating the number of connections that exist between our neurons. It’s a pretty big number, actually. It is estimated that each neuron has an average of 1,000 connections; with 86 billion neurons, that gives us about 86 trillion connections. Even though these numbers are dizzying, to some experts they seem achievable with the current computing capacity of processors, measured in floating-point operations per second (FLOPS). Going back to Apple’s M1, this processor claims to deliver 2.6 TFLOPS, that is, 2.6 trillion operations per second (10 to the 12th). Again, a number apparently “close” to our number of connections if we join many chips working at the same time. But it seems that consciousness is something more than connections, right?
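The arithmetic behind this comparison fits in a few lines. Here is a back-of-the-envelope sketch in Python using the rough estimates cited above (86 billion neurons, about 1,000 connections each, 2.6 TFLOPS for the M1); bear in mind that comparing a static count of connections with operations per second is, of course, a crude apples-to-oranges exercise:

```python
# Back-of-the-envelope comparison: brain connections vs. chip throughput.
# All figures are the rough estimates cited in the text, not measurements.
neurons = 86e9                # ~86 billion neurons in the human brain
connections_per_neuron = 1e3  # ~1,000 synaptic connections per neuron (average)
total_connections = neurons * connections_per_neuron  # ~8.6e13 (~86 trillion)

m1_flops = 2.6e12             # Apple M1: ~2.6 TFLOPS = 2.6 * 10^12 ops/second

print(f"Estimated connections: {total_connections:.1e}")  # prints 8.6e+13
# How many M1 chips would "match" the raw connection count, if the
# comparison made sense at all (it conflates storage with speed):
print(f"M1 chips needed: {total_connections / m1_flops:.0f}")  # prints 33
```

Thirty-odd chips sounds trivially reachable, which is precisely why the purely quantitative argument is so seductive, and, as the interview argues, so misleading.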

If we focus only on the quantitative question and return to the microtubules that inhabit our neurons, how many of them do we have? Neurobiology tells us that there are more than 1,000 microtubules in each of our 86 billion neurons, that is, some 86,000,000,000,000 microtubules (86 trillion, similar to the number of neural connections) “storing quantum information” in which, some scientists claim, our consciousness lives. We could say that our brain is actually a quantum computer, don’t you think? Wow, sorry to fall back into a computational analogy. Let’s go back to technology to conclude this post. IBM promises a quantum computer with 1,000 qubits for 2023, far below the 86 trillion microtubules in our heads. In my humble opinion, and comparing only the quantitative aspects of current and future computing capacities, the so-called “technological singularity”, as a promise of surpassing human capacities with current computer technology and artificial intelligence alone, is still very far away, or seems almost unattainable. I don’t know about you, but I still see the technological singularity a little far off, don’t you think?
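Purely as an illustration of the orders of magnitude involved (and assuming, very generously, that a microtubule could be likened to a qubit at all, which is far from established), the gap can be computed directly:

```python
# Orders-of-magnitude comparison from the text: microtubules vs. qubits.
# The microtubule-as-qubit analogy is illustrative only, not settled science.
neurons = 86e9                  # ~86 billion neurons
microtubules_per_neuron = 1e3   # >1,000 microtubules per neuron (estimate cited)
total_microtubules = neurons * microtubules_per_neuron  # ~8.6e13 (~86 trillion)

ibm_qubits_2023 = 1_000         # IBM's announced 1,000-qubit machine for 2023

ratio = total_microtubules / ibm_qubits_2023
print(f"Microtubules per promised qubit: {ratio:.1e}")  # prints 8.6e+10
```

A factor of tens of billions: even granting the analogy, the quantitative distance between announced quantum hardware and the brain’s supposed “quantum storage” remains enormous.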


1 Human beings’ ability to recognize and relate to the surrounding reality

Industry 5.0, seriously?

It seems unbelievable, but five years have passed since CARTIF inaugurated this blog with a post on Industry 4.0, in which I analysed some of the keys to the so-called “fourth industrial revolution” and how it could affect industry in our country. It has always seemed risky to me to try to define a revolution from within itself; I suppose time and historical perspective will make it clearer whether it really was a revolution or simply a technological mantra. Fasten your seat belts, because if we have not yet assimilated this revolution, they are already “threatening” us with the next one: Industry 5.0, they call it. Original, isn’t it?

If the fourth promised to interconnect the productive means of the entire value chain to make the transition to the intelligent industry or Smart Industry (everything has to be Smart, just as many years ago any self-respecting appliance needed to carry “fuzzy logic” on board), the fifth industrial revolution tries to humanize the concept beyond merely producing goods and services for economic profit. The challenge of this revolution is to include social and environmental considerations in its purpose. Its keywords, as defined by the European Commission, are a human-centric approach, sustainability and resilience.

By developing innovative technologies with a human-centric approach, Industry 5.0 can support and empower workers, rather than replace them. Likewise, other approaches complement this vision from the consumer’s point of view, so that consumers can have access to products that are as personalized as possible, or adapted to their possibilities. Thus, concepts such as personalized food or tailor-made clothing could be applied to virtually any consumer product.

Sustainability in the development of industry requires reconciling economic progress with environmental objectives. To achieve these common environmental objectives, it is necessary to incorporate new technologies and integrate existing ones, rethinking manufacturing processes by bringing environmental impact into their design and operation. Industry must be a leading example of the green transition.

Industrial resilience means developing a greater degree of robustness in production, preparing it for disruptions and ensuring that it can respond in times of crisis such as the COVID-19 pandemic. The current approach to globalized production showed great fragility during the pandemic that devastated us. Supply chains must also be sufficiently resilient, with adaptable and flexible production capacity, especially for those products that satisfy basic human needs, such as healthcare or security.

Just as the fourth needed digital enablers, this new revolution needs technologies to make it happen. From a practical point of view, we can say that the enablers we reviewed a while ago remain fully up-to-date for Industry 5.0. We could add some newer ones, such as quantum computing or blockchain, which were only incipient 4 or 5 years ago. If the enablers are similar, why are we talking about a new revolution? It is a matter of priorities. Where the fourth spoke about the hyper-connection of processes to the digital world through cyber-physical systems or the IoT, the fifth seeks cooperation between humans and digital technology, whether in the form of collaborative industrial robots, social robots or artificial intelligence systems that complement or assist in any task related to production, from installing a door in a car to deciding how to organize the next work shift to meet the productivity goal of the manufacturing plant.

IoT technology to improve the efficiency of industrial companies

With the promise of 75 billion devices connected to the Internet around the world by 2025, the “Internet of Things” (IoT) opens the door to a future of opportunities for companies to optimize their processes, whether in the manufacturing of their products, the supervision of their quality or the monitoring of critical machines in their factories: ovens, manufacturing lines or refrigerated warehouses.

In our daily experience as consumers, we can find a multitude of IoT devices on offer, which we integrate into our lives in a fast and sometimes impulsive manner, whether out of fashion or for real benefits. However, the incorporation of these technologies into companies is not done in such an impulsive way, since it involves a careful study of feasibility and profitability, often complex to demonstrate, as usually happens with new technologies.

In addition, IoT offers significant flexibility to integrate into the IT infrastructures of factories. The “I” in IoT stands for “Internet”, which seems to be automatically associated with a direct connection of the factory’s “things” to the Internet, and this generates panic about possible cybersecurity threats in almost any company. To fight these barriers, information and training are key.

Within this framework, the IOTEC Spain-Portugal cross-border cooperation project is being developed. This initiative aims to create a collaborative network of different actors (researchers, public bodies, ICT solution providers and industrial companies) from both countries to facilitate the integration of IoT in companies. Participants in IOTEC have analyzed different industrial and ICT companies to look for gaps and strengths and to be able to match the supply and demand of IoT. From CARTIF, we coordinate the activities around the industrial companies in order to understand their IoT needs through a detailed analysis of their organizational and productive processes, covering management, product design, manufacturing and logistics.

This analysis included a series of technological audits of different agro-industrial companies, assessing the potential application of IoT in different parts of their production processes. Forty organizational parameters were evaluated according to the methodology defined within the IOTEC project. For example, in the section on manufacturing processes, four highly relevant aspects were analyzed meticulously:

  • The type of process or productive transformation, which is fundamentally defined by aspects such as the raw materials used or the manufacturing steps.
  • The traceability requirements of raw materials, intermediate products and final products. This traceability is especially relevant in agrifood companies.
  • The control of the production process, which is triggered by different mechanisms depending on the company: production orders, on-demand production, or the availability of raw materials (e.g. the grape harvest).
  • The need to capture data on the plant floor as the first phase of the complete digitalization of a production process.

Once all the parameters had been analyzed, an exhaustive classification was made of the different IoT technologies that could be applied in industry and have a direct impact on improving efficiency. These technologies are shown below:

All the identified technologies were prioritized by the attendees of the “Forum of business opportunities through IoT and Blockchain” that took place on November 14, 2018 in Valladolid. The attendees had the opportunity to reflect and vote on this set of technologies to assess their necessity and the importance of their dissemination by the IOTEC project. With these priorities established, it is now necessary to make them known so that IoT solution providers can adapt their offers to real needs.

Likewise, dissemination and training activities are being carried out to bring IoT technologies, and concrete examples of their application, closer to the industrial companies of Castilla y León and the Centre of Portugal participating in the IOTEC network. Any company supplying or demanding IoT technologies can participate in the project forum and benefit directly from collaboration and training opportunities around this exciting set of technological solutions that is the IoT.

New challenges in the smart manufacturing industry

Big Data, one of the so-called “digital enablers” of Industry 4.0, sits at the core of the promising technologies contributing to the revolution in factories, where vast amounts of data (whether big or small) hide an enormous amount of knowledge and potential improvements for manufacturing processes.

The Strategic Research and Innovation Agenda (SRIA) of the Big Data Value Association (BDVA) defines the overall goals, the main technical and non-technical priorities, and a research and innovation roadmap for the European Public-Private Partnership (PPP) on big data. Within the current expectations for the future Data Market in Europe (around €60B), Manufacturing ranked first both in 2016 (€12.8B) and in the 2020 projections (€17.3B), revealing the leading role played by this sector in the overall Data Economy.

With the aim of finding an agreed synthesis, the BDVA adopted the “Smart Manufacturing Industry” (SMI) concept definition, covering the whole value chain gravitating around goods production, and then identified three main Grand Scenarios aiming to represent all the different facets of SMI in Europe: Smart Factory, Smart Supply Chain and Smart Product Lifecycle.

Given the relevance of both the Data Market and the Manufacturing industry in Europe, and in line with the European initiative on the Digitisation of Industry, CARTIF, together with the rest of the experts from the BDVA association, engaged in a collective effort to define a position paper proposing future research challenges for the manufacturing industry in the context of Big Data.

To contextualize these research challenges, the BDVA association has defined five technical areas for research and innovation within the BDVA community:

  • Data Management and Lifecycle, motivated by the data explosion, where traditional means of data storage and management are no longer able to cope with the size and speed of the data delivered.
  • Data Processing Architectures, originated by the fast development and adoption of the Internet of Things (IoT) and the need to process immense amounts of sensor data streams.
  • Data Analytics, which aims to advance technologies and develop capabilities to turn Big Data into value, but also to make those approaches accessible to a wider public.
  • Data Protection, addressing the need to ensure the correct use of information while guaranteeing user privacy. It includes advanced data protection, privacy and anonymization technologies.
  • Data Visualisation and User Interaction, addressing the need for advanced means of visualization and user interaction capable of handling the continuously increasing complexity and size of data, and of supporting the user in exploring and understanding Big Data effectively.

During a series of workshop activities, from the 2016 EBDVF Valencia Summit to the 2017 EBDVF Versailles Summit, BDVA experts distilled a set of research challenges for the three grand scenarios of smart manufacturing. These research challenges were mapped onto the five technical priority areas of the big data reference model introduced above.

To exemplify the outcome of this mapping, the following figure gathers the headings of the set of challenges identified and discussed by the BDVA members for the Smart Factory scenario. Interested readers are encouraged to analyze the full set of challenges in the SMI white paper.

The challenges set out in this first version of the SMI position paper set the tone for the upcoming research needs in the different Big Data areas related to manufacturing. In the Smart Factory scenario, the focus is on the integration of multiple sources of data coming not only from the shop floor but also from the offices, traditionally separated in Industry 3.0. The interoperability of existing information systems and the challenge of integrating disruptive IoT technologies are the major trials in the area of data management. Closer to the needs of a Smart Factory, the analytics challenges focus on prescriptive analytics as tools for optimal decision-making in manufacturing operations management, including optimization through the evolved concept of the digital twin.

Digital Transformation, to the moon and back

It is July 20th, 1969, 20:18:04 UTC, and after 102 hours, 45 minutes and 39.9 seconds of travel, “the Eagle has landed” and Neil is about to descend the ladder and touch an unknown surface for the first time: “That’s one small step for [a] man, one giant leap for mankind”. That year, Neil Armstrong, Michael Collins and “Buzz” Aldrin changed the world, riding the biggest rocket ever built to the moon.

Some people may have forgotten it, and others, like me, were not yet born, but the space race had its own digital transformation, similar to the one foreseen for industry and the general public. The Apollo program was the culmination of that first digital revolution in space exploration.

The landing achievement was, to a great extent, made possible by the electronics on board both the Apollo Command Module (CM) and the Lunar Module (LM): the AGC, or Apollo Guidance Computer. It was one of the first computers based on integrated digital circuits. With “just” 32 kg of weight and a mere 55 W of consumption, this technical wonder was able to coordinate and control many tasks of the space mission, from calculating the direction and navigation angles of the spacecraft to commanding the reaction control jets to orient it in the desired direction. Moreover, the computer included one of the first demonstrations of “fly-by-wire” control, in which the pilot does not command the engines directly but through control algorithms programmed into the computer. In fact, this computer was the basis for the subsequent control systems of the Space Shuttle and of military and commercial fly-by-wire aircraft.

As usual with this kind of breakthrough, it did not happen overnight but through a series of earlier, incremental innovations.

By the 1950s, the MIT Instrumentation Laboratory (IL) had designed and developed the guidance system of the Polaris ballistic missiles. Initially built with analog computers, the team soon decided to go digital to achieve the accuracy required for computing missile trajectories and control.

Seven years before the first lunar landing, even before President Kennedy set the ambitious goal of “… going to the moon in this decade …”, and after the launch of Sputnik in 1957, a Mars exploration study had started at MIT’s IL. The design of a Mars probe set the basic configuration of the future Apollo guidance system: a set of gyroscopes to keep the probe oriented, a digital computer, and an optical telescope for orientation relative to the moon and stars.

The launch of Sputnik in 1957 fueled America’s ambition to put the first human in space, but it also contributed to the public debate about the role of pilots in the space race, a discussion similar to current views on the role of the worker in the factory. Should the astronaut just be payload, or take full control of the spacecraft? Once aircraft pilots had earned the task of being at the controls, several tests showed that it was nearly impossible for them to control all aspects of a mission, given the fast reactions needed and the number of different control commands. Hence, pilots would need some automatic and reliable help, and that was one of the main functionalities of the AGC.

Reliability was therefore one of the main concerns of the mission. The Polaris program took four years to design guidance control for a weapon that would be in the air for a couple of minutes. Kennedy’s bet of taking a man to the moon in less than seven years meant developing a guidance and control system for a spacecraft that had to work without failure on a trip lasting more than a week. The required levels of reliability were more than two orders of magnitude higher. If a Polaris missile failed, a new one would take off; a failure in the spacecraft meant killing an astronaut.

Much of the reliability of the flight rested on the shoulders of the Apollo Guidance Computer, and at some point in the program there were too many planned tasks, like complex guidance maneuvers, to be physically hardwired into the electronics. Achieving these tasks required software. Although software was barely taken into account at the beginning of the program, it made the difference between achieving the goal and the program’s complete failure. The computer was the interface between the astronaut and the spacecraft, which in the end meant that computer software “controlled” the spacecraft, a revolution for that time. Today software is everywhere, but back in the 1960s software was seen as a set of instructions on punched cards. AGC software programs (frozen three to four months before each launch) were “hard-wired” as magnetic cores and wires in a permanent (and reliable) memory, yet they saved a lot of time, effort and budget. In fact, it could be said that Apollo software was more like “firmware”, in today’s vocabulary.

Today’s challenge of revolutionizing industry through digital transformation cannot happen without the help of digital enablers. 48 years ago, digital electronics and the first software programs were the “digital enablers” that achieved that “one small step for [a] man, one giant leap for mankind”. Today, “digital transformation is not an option” sounds like a cliché, a hype, a slogan from digital providers, but looking back at history, digital transformation in the Apollo program made the difference between achieving the moon landing and not.