DEVELOPMENT OF NEW METHODS FOR MATERIALS SCIENCE: ACCELERATING SIMULATIONS AND AUTOMATING ANALYSIS OF EXPERIMENTAL DATA
Johns Hopkins University
High-throughput simulations and characterization are essential for accelerating materials development and reducing lab-to-market time. Considerable effort has been devoted to developing new algorithms and techniques that reduce the computational cost of materials simulations and streamline the quantitative analysis of experimental data, leveraging recent advances in computational materials theory and machine learning. This thesis exemplifies such efforts in three projects.

The first project develops a set of algorithms for the dynamic generation of optimized generalized Monkhorst-Pack k-point grids, which greatly reduce the computational cost of electronic structure calculations of crystalline materials. The new algorithms reduce the overhead of searching for efficient grids and eliminate the need for a pre-generated database by transforming the problem from an enumeration of three-dimensional superlattices into an enumeration of two-dimensional superlattices combined with a small set of symmetry-permitted shifts. A lightweight C++ library with a Python interface and a stand-alone Java application are developed for integration with existing DFT software.

In the second project, a program is developed to accelerate the structure search of stable nanoclusters using machine-learned interatomic potentials and active learning. The interatomic potentials are updated on the fly during exploration of the configuration space: at each retraining stage, DFT calculations are performed on structures flagged as extrapolative and on low-energy clusters in the genetic pool, ensuring the accuracy of the final reported structures. Using this approach, new lowest-energy isomers of elemental aluminum clusters were found for 25 of the 35 studied sizes.

The third project develops a program to automate the characterization of X-ray Phase Contrast Imaging (XPCI) data of particles emitted during metal composite combustion. The program replaces a laborious manual characterization process and provides statistical analysis of the particles. It consists of three stages: particle detection, trajectory reconstruction, and property analysis. A Gaussian mixture model and convolutional neural networks are evaluated for particle detection, while a Kalman filter is used for trajectory reconstruction. A shape detection procedure classifies particles by shape using classical computer vision algorithms. Finally, a graph data structure is employed to identify complex events such as microexplosions and collisions.
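To make the trajectory-reconstruction stage concrete, the sketch below shows a standard constant-velocity Kalman filter tracking one particle from frame-by-frame centroid detections. This is a minimal illustration of the general technique, not the thesis implementation: the state layout, noise covariances, and the noiseless synthetic detections are all assumptions chosen for clarity.

```python
import numpy as np

# Constant-velocity Kalman filter for 2-D particle tracking.
# State: [x, y, vx, vy]; measurement: detected centroid [x, y].
# All matrices below are illustrative defaults.
dt = 1.0                                 # frame interval (arbitrary units)
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)  # state transition
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)  # measurement model
Q = 0.01 * np.eye(4)                       # process noise covariance
R = 1.0 * np.eye(2)                        # measurement noise covariance

def kalman_step(x, P, z):
    """One predict/update cycle for state x, covariance P, detection z."""
    # Predict the next state from the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detected centroid.
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Track a synthetic particle moving at velocity (2, 1) per frame.
x = np.zeros(4)
P = np.eye(4)
for t in range(1, 20):
    z = np.array([2.0 * t, 1.0 * t])       # noiseless detections for illustration
    x, P = kalman_step(x, P, z)

print(np.round(x, 2))  # estimated [x, y, vx, vy] converges toward the true track
```

In a multi-particle setting, one such filter would be maintained per track, with each frame's detections assigned to the predicted positions before the update step.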
Computational Materials Science, Density Functional Theory, Algorithm, Crystalline Materials, K-Points, Nanoclusters, Image Processing