It was last Sunday. In the White House press room, US President Donald Trump announced that four US tech giants - IBM, Microsoft, Amazon and Alphabet, the owner of Google - had "unleashed the potential of America's supercomputing resources" in the fight against the coronavirus.

Trump was not exaggerating. The coronavirus is up against the largest computing capacity in human history, assembled to help scientists develop models to treat and prevent the disease. It is an alliance formed by the world's largest technology companies, five US national laboratories, the Massachusetts Institute of Technology (MIT), the Rensselaer Polytechnic Institute (RPI) and the University of California San Diego, together with various public bodies, including NASA and the National Science Foundation. This week, Hewlett Packard Enterprise joined the initiative.

It is the COVID-19 High Performance Computing Consortium (https://covid19-hpc.mybluemix.net/), an initiative promoted and launched in less than a week by Darío Gil, a 44-year-old from Murcia who holds a doctorate from MIT and is the Director of IBM Research. Gil is in charge of 3,000 researchers at the New York technology giant, serves as a White House technology advisor, and it was his idea to connect the world's most powerful computers - used, for example, to simulate atomic explosions so that nuclear tests do not have to be carried out - to the fight against the coronavirus.

Among the more than 16 connected supercomputers is Summit, designed and built by IBM, which this week analyzed 8,000 molecules and identified the 77 most promising for therapies and vaccines - a job that would have taken months on a conventional computer. Summit did it in two days.
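Much of that speed-up comes from evaluating enormous numbers of candidate compounds in parallel. Purely as an illustrative sketch - not the actual Summit workflow, which relied on detailed molecular simulations - the following Python snippet scores a hypothetical library of 8,000 compounds across all available CPU cores and keeps the 77 highest-scoring ones; the score function and the compound identifiers are invented stand-ins.

```python
from multiprocessing import Pool
import random

def score(compound_id):
    """Stand-in for an expensive binding-affinity calculation.

    A real screening run would dock each compound against a model of a
    viral protein; here we just return a deterministic pseudo-random score.
    """
    rng = random.Random(compound_id)
    return compound_id, rng.random()

if __name__ == "__main__":
    library = range(8000)                 # hypothetical compound library
    with Pool() as pool:                  # one worker per available CPU core
        results = pool.map(score, library)

    # Keep the most promising candidates, analogous to Summit's shortlist of 77.
    shortlist = sorted(results, key=lambda r: r[1], reverse=True)[:77]
    print(shortlist[:5])
```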

How can technology help in the fight against the coronavirus?

I see supercomputers and advanced computing as a knowledge accelerator. Traditional science has always been based on the principle of trial and error, that is, on experiments. But experiments are expensive and time consuming. When computers appeared, they began to be used to analyze the data that experiments gave us, until a new field was born: simulation. And that is where we are now.

What is simulation?

Simulation is like doing a virtual experiment and imagining what could happen. When simulations began to be applied to the physical world - physics, chemistry, biology - we realized that the interaction of atoms with one another is enormously complex, and that forced us to build very special computers. That is how the idea of supercomputers was born: to simulate a nuclear reaction process, for example.

How does that apply to a pandemic?

A virus is a biological entity, and things like how it reacts with other cells, or the search for molecules that can inhibit the protein the virus uses to 'inject' itself, are processes of the physical world and therefore involve a very high level of interaction. A supercomputer is a computer with a lot of computing power - many CPUs [the part of a computer that processes all the instructions from its software and hardware] - and, what is more, those CPUs are connected to each other very efficiently. In the cloud there are also many connected machines, yes, but in general the cloud runs many independent processes. The machines do not need to be 'talking' to each other. But when you are doing a very specific calculation - for example, how molecules interact with the virus - you need to look at all the interactions and connections. That means having thousands and thousands of processors 'talking' to each other, that is, passing the information from their calculations to one another very efficiently. And a supercomputer is designed to do that very well.

What is the capacity of the Consortium?

We have already assembled the largest supercomputing capacity in history. Not only do we have the most powerful supercomputers; there are 16 systems in total. There has never been anything like it, either in capacity or, of course, in terms of the organizations that are collaborating. Those 16 systems - and new participants are still arriving - have more than 330 petaflops of computing power, and behind them are five US national laboratories, the National Science Foundation, NASA, MIT, RPI and many technology companies.

Those five laboratories already collaborated on supercomputing before, although not on the coronavirus.

In fact, we designed the Summit and Lassen computers, which are managed and administered by the Oak Ridge National Laboratory and the Lawrence Livermore National Laboratory respectively. So we have had a close relationship for a long time. That is why this has progressed so quickly: we trust and know each other well.
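To make the distinction Gil draws above more concrete - independent cloud jobs versus processors that constantly 'talk' to each other - here is a minimal sketch using the mpi4py library (our choice of tool, not the consortium's; its real codes are far more sophisticated). Each MPI rank holds a slice of toy particle positions, all ranks exchange their slices, and the partial interaction energies are then summed across the whole machine, which is exactly the kind of communication pattern a supercomputer's interconnect is built to handle.

```python
# Minimal illustration of tightly coupled parallel computation with MPI.
# Run with, for example: mpiexec -n 4 python pairwise_energy.py
from mpi4py import MPI
import random

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each rank generates its own slice of (toy) particle positions.
rng = random.Random(rank)
local_positions = [rng.uniform(0.0, 10.0) for _ in range(100)]

# Every rank needs everyone else's positions: the processes must 'talk'.
all_positions = [p for chunk in comm.allgather(local_positions) for p in chunk]

# Each rank computes the interaction energy of its particles with all others.
local_energy = 0.0
for p in local_positions:
    for q in all_positions:
        if p != q:
            local_energy += 1.0 / abs(p - q)

# Sum the partial energies across the whole machine.
total_energy = comm.allreduce(local_energy, op=MPI.SUM)
if rank == 0:
    print(f"total interaction energy: {total_energy:.2f}")
```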
How has the negotiation with the other private companies been? You are competitors, but it seems that everything went very well.

It was on the 19th. We already had all the national laboratories, MIT and RPI on board, and the effort already had a lot of momentum, and that is when the White House asked me: "What do you think about others participating?" I replied: "The idea of the consortium is for everyone to participate." So the request was made and everyone said yes. There are many other companies - also from outside the US - that have shown interest and will be incorporated.

How is the work organized in practice?

We are very focused on the scientific side, because proposals must be submitted through a National Science Foundation portal. We have an evaluation committee there, and in the first 24 hours more than 30 team proposals had already been registered.

30 proposals in 24 hours?

Yes. That is why we are trying to expand the capacity as much as possible and attract the best scientists.

How did the Consortium come about?

At IBM I was responsible for the technological response, and for the company's professionals, in the whole context of the pandemic, and as part of that effort we structured a series of work areas. One of them was what we could do to speed up the process of discovering new treatments and vaccines, and there we realized that one of the fields in which we have the most experience and capacity is supercomputing. And since we knew there was a heavy reliance on modeling, the idea was that instead of just doing it ourselves we would join forces, so I called the White House Office of Science and Technology Policy and spoke with Michael Kratsios, who is the Chief Technology Officer. I told him about IBM's position and explained that this was an opportunity to make the effort quantitatively bigger. In fact, we had already spoken with the directors of some of the Department of Energy's national laboratories and this had generated quite a lot of enthusiasm. So we joined forces, and Kratsios, Paul Dabbar [Under Secretary of Energy for Science] and I launched it. This began on Tuesday the 17th, by Friday the 20th we had it well advanced, and on Sunday the 22nd the president announced it.

This crisis comes just when the role of experts has been questioned for years, on everything from climate change to vaccines, and in some countries the pandemic has sharpened the debate between public and private. What lessons for science policy can we draw from this crisis?

I am an advocate of the importance of science to society. But the institutional component is essential. And from that point of view we need a great variety of institutions, public and private. The role of universities, industry and state laboratories is to reinvent how we work with each other. If we do not join efforts, we will not develop the potential we have. My entire professional career has been focused on creating new collaborative models. For example, the laboratory that we created a few years ago with MIT, which is the largest AI laboratory run jointly by a university and a company, is a collaborative model: it is not MIT, it is not IBM Research, but the two together. This consortium is another example. What it is really about is that this should not be just for emergencies, but a natural way to advance science. When the research funding system was launched in the US after the Second World War, 80% or 90% of spending came from the federal government. In the most recent statistics, if you look at total R&D spending in the US, which is 600 billion dollars [538 billion euros], 70% comes from private companies.

But in reality, in quantum computing, which is one of the fields in which you work most, there is only talk of competition, between companies and also between countries, with a relentless fight between the US, the EU and China to see who develops the technology first. Doesn't that contradict what you are saying about collaboration?

Competition is essential. But even in quantum computing, where we are fighting to maintain leadership, the first thing we did at IBM, back in 2016, was to create a platform that opened the use of quantum computing to the world. In the IBM Quantum Experience we have more than 220,000 users, and more than 240 scientific publications have come out of it. You can create whatever you want, but you have to open up the systems to create a community. If more people open theirs as well, an ecosystem begins to form. That does not mean that whoever built the quantum computer has to give it away as charity. Competition will continue, because it is what provides the margins that sustain the research laboratories. But it can be done in a way that benefits everyone.

Is society making the most of technology? For example, one of the biggest customers for quantum computers is banks. That is all well and good, but banks are likely to want quantum computers to buy and sell the same bond thousands of times per second, which is not necessarily a visible social benefit.

I believe there are a number of sectors that have thoroughly internalized the role of information technology because they understand that it is essential to their business. That is independent of the resources that each institution or company has. You see it in the financial system or in the energy sector, where they tell you: "If we cannot do these calculations, we cannot assess the risks or build the models." So they want to be aware of every technological advance, because they have realized that it is critical to their competitiveness.

With all this, are you using quantum computers in the Consortium?

No, because it is still too early given the development of that technology. At IBM we have 16 quantum computers at work, but they do not yet have the capacity to contribute to an effort like this. In the long term, over this decade we are going to see the convergence of this type of supercomputer, of systems designed specifically for Artificial Intelligence, and of quantum computers. It is what I call bits + neurons + qubits [the qubit is the basic unit of information in a quantum computer], and that is going to transform the process of discovery. When we talk about this pandemic, or about other problems like global warming, we see that everything has its basis in the physical world, and therefore we need to know the physical world better.
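For readers curious about the open platform Gil mentions, the IBM Quantum Experience is typically programmed through Qiskit, IBM's open-source quantum SDK. The minimal sketch below only builds and prints a two-qubit entangling circuit; running it on real IBM Quantum hardware would also require an account and a backend selection, which are not shown here.

```python
# A minimal two-qubit 'Bell state' circuit built with Qiskit.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)   # two qubits, two classical bits
qc.h(0)                     # put qubit 0 into superposition
qc.cx(0, 1)                 # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])  # measure both qubits

print(qc.draw())            # text drawing of the circuit
```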
