Novel functional materials enable fundamental breakthroughs across technological applications from clean energy to information processing 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11. From microchips to batteries and photovoltaics, discovery of inorganic crystals has been bottlenecked by expensive trial-and-error approaches. Concurrently, deep-learning models for language, vision and biology have showcased emergent predictive capabilities with increasing data and computation 12, 13, 14. Here we show that graph networks trained at scale can reach unprecedented levels of generalization, improving the efficiency of materials discovery by an order of magnitude. Building on 48,000 stable crystals identified in continuing studies 15, 16, 17, improved efficiency enables the discovery of 2.2 million structures below the current convex hull, many of which escaped previous human chemical intuition. Our work represents an order-of-magnitude expansion in stable materials known to humanity. Stable discoveries that are on the final convex hull will be made available to screen for technological applications, as we demonstrate for layered materials and solid-electrolyte candidates. Of the stable structures, 736 have already been independently experimentally realized. The scale and diversity of hundreds of millions of first-principles calculations also unlock modelling capabilities for downstream applications, leading in particular to highly accurate and robust learned interatomic potentials that can be used in condensed-phase molecular-dynamics simulations and high-fidelity zero-shot prediction of ionic conductivity.

The discovery of energetically favourable inorganic crystals is of fundamental scientific and technological interest in solid-state chemistry. Experimental approaches over the decades have catalogued 20,000 computationally stable structures (out of a total of 200,000 entries) in the Inorganic Crystal Structure Database (ICSD) 15, 18. However, this strategy is impractical to scale owing to costs, throughput and synthesis complications 19. Instead, computational approaches championed by the Materials Project (MP) 16, the Open Quantum Materials Database (OQMD) 17, AFLOWLIB 20 and NOMAD 21 have used first-principles calculations based on density functional theory (DFT) as approximations of physical energies. Combining ab initio calculations with simple substitutions has allowed researchers to improve to 48,000 computationally stable materials according to our own recalculations 22, 23, 24 (see Methods). Although data-driven methods that aid in further materials discovery have been pursued, thus far, machine-learning techniques have been ineffective in estimating stability (decomposition energy) with respect to the convex hull of energies from competing phases 25.

In this paper, we scale up machine learning for materials exploration through large-scale active learning, yielding the first models that accurately predict stability and, therefore, can guide materials discovery. Our approach relies on two pillars: first, we establish methods for generating diverse candidate structures, including new symmetry-aware partial substitutions (SAPS) and random structure search 26. Second, we use state-of-the-art graph neural networks (GNNs) that improve modelling of material properties given structure or composition. In a series of rounds, these graph networks for materials exploration (GNoME) are trained on available data and used to filter candidate structures. The energy of the filtered candidates is computed using DFT, both verifying model predictions and serving as a data flywheel to train more robust models on larger datasets in the next round of active learning. Through this iterative procedure, GNoME models have discovered more than 2.2 million structures stable with respect to previous work, in particular agglomerated datasets encompassing computational and experimental structures 15, 16, 17, 27. Given that discovered materials compete for stability, the updated convex hull consists of 381,000 new entries for a total of 421,000 stable crystals, representing an order-of-magnitude expansion from all previous discoveries. Consistent with observations in other domains of machine learning 28, we observe that our neural network predictions improve as a power law with the amount of data. Final GNoME models accurately predict energies to 11 meV atom−1 and improve the precision of stable predictions (hit rate) to above 80% with structure and 33% per 100 trials with composition only, compared with 1% in previous work 17.
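The stability criterion used throughout, decomposition energy with respect to the convex hull of competing phases, can be illustrated for a binary A–B system. The following is a minimal sketch in plain Python, not the paper's pipeline: each phase is a point (fraction of B, formation energy per atom), the hull is the lower convex envelope including the elemental endpoints at zero energy, and a candidate is stable when it lies on or below that envelope. All phases and energies here are made up for illustration.

```python
from bisect import bisect_right

def lower_hull(points):
    """Lower convex hull of (x, y) points via Andrew's monotone chain."""
    pts = sorted(points)
    hull = []
    for p in pts:
        # Pop the last hull point while it lies on or above the segment
        # from hull[-2] to p (non-convex for a lower hull).
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - y1) - (p[0] - x1) * (y2 - y1) <= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    return hull

def hull_energy(hull, x):
    """Energy of the convex hull at composition x (linear interpolation)."""
    xs = [p[0] for p in hull]
    i = bisect_right(xs, x)
    if i == 0:
        return hull[0][1]
    if i == len(hull):
        return hull[-1][1]
    (x1, y1), (x2, y2) = hull[i - 1], hull[i]
    t = (x - x1) / (x2 - x1)
    return y1 + t * (y2 - y1)

def e_above_hull(entries, candidate):
    """Decomposition energy of a candidate (x, E) against competing phases.

    `entries` must include the elemental endpoints (0.0, 0.0) and (1.0, 0.0).
    Positive result: unstable (decomposes); <= 0: on or below the hull.
    """
    hull = lower_hull(entries)
    x, e = candidate
    return e - hull_energy(hull, x)

# Toy system: elements A and B, plus a known stable phase AB at x = 0.5.
entries = [(0.0, 0.0), (1.0, 0.0), (0.5, -0.4)]
print(e_above_hull(entries, (0.25, -0.10)))  # 0.10: decomposes into A + AB
print(e_above_hull(entries, (0.25, -0.25)))  # -0.05: below the current hull
```

A candidate below the hull (the second case) would itself become a new hull entry, which is why, as the abstract notes, discovered materials "compete for stability" and the hull must be updated as discoveries accumulate.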
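The active-learning loop described in the introduction (generate candidates, filter with a learned model, verify the top picks with DFT, retrain on the enlarged dataset) can be sketched in a few lines. Everything below is a toy stand-in under loud assumptions: a 1-D "candidate" replaces a crystal structure, a quadratic replaces DFT, and a nearest-to-best heuristic replaces a trained GNN; only the shape of the loop mirrors the procedure in the text.

```python
import random

random.seed(0)

def generate_candidates(n):
    """Stand-in for SAPS / random structure search: random 1-D candidates."""
    return [random.uniform(0.0, 1.0) for _ in range(n)]

def dft_energy(x):
    """Stand-in for a DFT calculation (the expensive ground-truth label)."""
    return (x - 0.3) ** 2  # most "stable" candidate sits near x = 0.3

def train_model(data):
    """Stand-in for GNN training: score candidates by distance to the
    best labelled point so far (a crude learned-energy proxy)."""
    best_x, _ = min(data, key=lambda d: d[1])
    return lambda x: abs(x - best_x)

def active_learning(rounds=3, pool=200, budget=20):
    # Seed round: label an initial batch with the expensive oracle.
    data = [(x, dft_energy(x)) for x in generate_candidates(budget)]
    for _ in range(rounds):
        model = train_model(data)            # train on all labels so far
        pool_xs = generate_candidates(pool)  # diverse candidate generation
        pool_xs.sort(key=model)              # filter with the cheap surrogate
        for x in pool_xs[:budget]:           # verify top picks with "DFT"
            data.append((x, dft_energy(x)))  # data flywheel: dataset grows
    return min(data, key=lambda d: d[1])

best_x, best_e = active_learning()
```

The point of the sketch is the flywheel: each round spends the labelling budget only where the current model is optimistic, so the dataset, and hence the next model, concentrates around the stable region without exhaustively labelling the pool.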