Earlier this year, a South Korean lab announced a breakthrough that was pitched as the “holy grail” of electric efficiency and one that could be “a possible real solution to the energy crisis”. Lumps of a grey-black, polycrystalline compound known as LK-99 were presented in July by scientists at the Seoul-based Korea University, with accompanying claims that the inputs could be fused to fabricate a superconductor that worked at room temperature and at normal pressure: claims that subsequently did not survive scrutiny.
But given all the excitement triggered by the prospect of just one such breakthrough material, a new announcement by researchers at Google DeepMind in late November could have Brobdingnagian implications.
Artificial intelligence (AI) was used by these researchers to predict the structures of more than 2 million new materials, a breakthrough that could have wide-reaching applications in sectors such as renewable energy, battery research, semiconductor design and computing efficiency.
With this DeepMind AI tool, known as Graph Networks for Materials Exploration or GNoME, the design and generation of potential recipes for new materials now look far easier.
Why is this significant?
In one shot, this AI-linked breakthrough increases the number of ‘stable materials’ known to mankind ten-fold. These materials include inorganic crystals that modern tech applications from computer chips to batteries rely on.
To enable new technologies, crystals must be stable — otherwise they can simply decompose. While these materials will still need to undergo the process of synthesis and testing, DeepMind has published a list of 381,000 of the 2.2 million crystal structures that it predicts to be most stable.
To put this in context, consider the ongoing search for solid electrolytes that could replace the liquid electrolytes currently found in Li-ion batteries. As a prerequisite, these electrolytes have to be stable and have specific conduction properties, while being non-toxic and non-radioactive. Or consider the ongoing work on new layered compounds similar to graphene (a form of carbon) that could potentially revolutionise electronics and superconductors.
What DeepMind’s AI-led discovery does is scale this process up, using filters to narrow down a list of materials that can be synthesised and could potentially meet the listed requirements. And it can potentially delve down to the atomic-bond level while making these predictions.
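As a purely illustrative sketch of that kind of filtering (the candidate entries, property names and thresholds below are invented for this example and are not GNoME's actual criteria or output), a shortlist of electrolyte candidates might be narrowed down in Python along these lines:

# Hedged illustration only: the candidates, properties and thresholds are made up.
candidates = [
    {"formula": "Li7La3Zr2O12", "predicted_stable": True,  "ionic_conductivity": 1e-3,
     "toxic": False, "radioactive": False},
    {"formula": "Li3PS4",        "predicted_stable": True,  "ionic_conductivity": 1e-4,
     "toxic": False, "radioactive": False},
    {"formula": "XyZ2",          "predicted_stable": False, "ionic_conductivity": 1e-2,
     "toxic": True,  "radioactive": False},
]

def meets_requirements(c, min_conductivity=1e-4):
    """Keep only candidates that are predicted stable, conduct ions well enough,
    and are neither toxic nor radioactive."""
    return (c["predicted_stable"]
            and c["ionic_conductivity"] >= min_conductivity
            and not c["toxic"]
            and not c["radioactive"])

shortlist = [c["formula"] for c in candidates if meets_requirements(c)]
print(shortlist)   # ['Li7La3Zr2O12', 'Li3PS4']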
For elements to link up and form stable solids, the bonds between the constituent atoms must be strong enough to ensure that they do not spontaneously decompose. New stable materials are generally discovered by practitioners of solid-state chemistry through a process of trial and error that involves making small tweaks to known materials or fusing elements together, an expensive and time-consuming process.
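The "does it spontaneously decompose?" test can be thought of as an energy comparison: a candidate counts as stable only if no mixture of competing compounds has a lower energy per atom. The short Python sketch below uses made-up energies and a single hypothetical decomposition reaction purely to illustrate the idea:

def decomposes(candidate_energy_per_atom, competing_phases):
    """competing_phases: list of (energy_per_atom, fraction_of_atoms) pairs that
    together account for all of the candidate's atoms. If that mixture has a
    lower energy, the candidate is expected to fall apart into it."""
    mixture_energy = sum(energy * fraction for energy, fraction in competing_phases)
    return candidate_energy_per_atom > mixture_energy

# Made-up numbers: a hypothetical "ABO3" candidate at -2.9 eV/atom versus a mixture
# of "AO" (40% of the atoms, -3.1 eV/atom) and "BO2" (60% of the atoms, -2.8 eV/atom).
print(decomposes(-2.9, [(-3.1, 0.4), (-2.8, 0.6)]))   # True: the mixture is lower in energy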
Over the past decades, experimentation has resulted in the discovery of the structures of some 28,000 stable materials, which are listed in the Inorganic Crystal Structure Database, the largest database of identified materials.
So how does GNoME actually work?
In a blog post announcing the project, DeepMind researchers fleshed out the design of the computational model: GNoME is a state-of-the-art graph neural network (GNN) model, where the input data takes the form of a graph that can be likened to connections between atoms.
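To make the idea of a graph input concrete, here is a minimal, hypothetical sketch (not DeepMind's code) of how a crystal could be encoded as a graph, with atoms as nodes and near-neighbour pairs as edges. The positions and the 3 Å cutoff are illustrative choices, not values from GNoME:

from dataclasses import dataclass, field
from itertools import combinations
import math

@dataclass
class CrystalGraph:
    """Toy graph representation of a crystal: nodes are atoms, edges connect
    atoms closer than a cutoff distance (a stand-in for chemical bonds)."""
    nodes: list = field(default_factory=list)   # element symbols, e.g. "Na"
    edges: list = field(default_factory=list)   # (atom_index_a, atom_index_b, distance)

def build_graph(atoms, cutoff=3.0):
    """atoms: list of (element, (x, y, z)) tuples with positions in angstroms.
    Returns a CrystalGraph linking every pair of atoms within `cutoff`."""
    graph = CrystalGraph(nodes=[element for element, _ in atoms])
    for (i, (_, pos_a)), (j, (_, pos_b)) in combinations(enumerate(atoms), 2):
        distance = math.dist(pos_a, pos_b)
        if distance <= cutoff:
            graph.edges.append((i, j, round(distance, 2)))
    return graph

# Simplified (not exact) fragment of rock-salt NaCl, for illustration only.
example = build_graph([("Na", (0.0, 0.0, 0.0)), ("Cl", (2.8, 0.0, 0.0)),
                       ("Na", (2.8, 2.8, 0.0)), ("Cl", (0.0, 2.8, 0.0))])
print(example.nodes)   # ['Na', 'Cl', 'Na', 'Cl']
print(example.edges)   # neighbouring Na-Cl pairs within 3.0 angstroms

A GNN then passes information along these edges so that each atom's representation reflects its chemical neighbourhood, which is what lets the model reason about the structure as a whole.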
GNoME was trained using “active learning”, a technique for scaling up a model first trained on a small, specialised dataset. Developers can then introduce new targets, allowing the machine-learning system to label new data with human assistance. This makes the algorithm “well suited” to the science of discovering new materials, which requires searching for patterns not found in the original dataset.
GNoME uses two pipelines to discover low-energy (stable) materials. The structural pipeline creates candidates with structures similar to known crystals, while the compositional pipeline follows a more randomised approach based on chemical formulas. The outputs of both pipelines are evaluated using established Density Functional Theory calculations, and those results are added to the GNoME database, informing the next round of active learning.
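One hedged way to picture that loop is the toy Python sketch below. Every name in it, from the two candidate pipelines to fake_dft(), is a stand-in invented for illustration; real DFT is an expensive quantum-mechanical calculation, and GNoME's actual model is a trained graph neural network, not a lookup table:

import random

KNOWN_CRYSTALS = ["NaCl", "MgO", "Si", "GaAs"]               # illustrative seeds
ELEMENTS = ["Li", "Na", "K", "Mg", "Ca", "O", "S", "Cl"]

def structural_pipeline(n):
    """Candidates made by tweaking known crystals (here, just a token substitution)."""
    return [f"{random.choice(KNOWN_CRYSTALS)}:{random.choice(ELEMENTS)}-sub"
            for _ in range(n)]

def compositional_pipeline(n):
    """Candidates proposed more randomly from chemical formulas."""
    return ["".join(random.sample(ELEMENTS, 2)) + "2" for _ in range(n)]

def fake_dft(candidate):
    """Stand-in for a Density Functional Theory calculation (energy per atom, eV)."""
    return random.Random(candidate).uniform(-4.0, 0.0)

def one_round(model_labels, n_candidates=200, keep_top=20):
    candidates = structural_pipeline(n_candidates) + compositional_pipeline(n_candidates)

    def predicted_energy(c):
        # 'Model' prediction: the stored label if we have one, otherwise a rough guess.
        return model_labels.get(c, -2.0 + random.uniform(-1.0, 1.0))

    # Keep the candidates the model thinks are lowest in energy (most stable),
    # verify them with (fake) DFT, and feed the labels back in for the next round.
    shortlist = sorted(candidates, key=predicted_energy)[:keep_top]
    model_labels.update({c: fake_dft(c) for c in shortlist})
    return model_labels

labels = {}
for round_no in range(1, 4):
    labels = one_round(labels)
    print(f"after round {round_no}: {len(labels)} DFT-labelled candidates")

The design point the sketch tries to capture is that the cheap model does the broad screening while the expensive calculation is reserved for the most promising candidates, and each round of verified labels makes the next round of predictions better.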
As a result, the model has boosted the precision rate for predicting materials stability from around 50 per cent to around 80 per cent. And given that scientists have so far managed to unearth only about 28,000 stable materials through resource-intensive and time-consuming methods, DeepMind claims its current research is equivalent to nearly 800 years’ worth of knowledge, with 380,000 of its stable predictions now publicly available to help researchers make further breakthroughs in materials discovery.
The crystal structure data that GNoME was originally trained on was the database put out by The Materials Project, a multi-institution, multi-national endeavour to compute the properties of all inorganic materials and provide the data for every materials researcher free of charge.
“We used GNoME to generate novel candidate crystals, and also to predict their stability. To assess our model’s predictive power during progressive training cycles, we repeatedly checked its performance using established computational techniques known as Density Functional Theory (DFT), used in physics, chemistry and materials science to understand structures of atoms, which is important to assess the stability of crystals,” DeepMind’s Amil Merchant and Ekin Dogus Cubuk said in the blog post on November 29.