Steven K. Kauwe

Ph.D. Candidate, University of Utah
Email address: [email protected]

Presented: March 11 and 12, 2021

“Accuracy, uncertainty, inspectability: learning with compositionally-restricted attention-based networks”

The materials science community has established that data-driven modeling based on chemical composition generates accurate and actionable results. The tools for this task have evolved from simple statistical methods to complex neural networks with millions of parameters. With each successive work, we have seen marked improvements in property prediction, along with a trend toward prioritizing raw model accuracy. However, model performance encompasses more than accuracy alone; it also includes considerations such as uncertainty quantification and interpretability.

In this talk, I will present our new model architecture, the Compositionally-Restricted Attention-Based Network (CrabNet). CrabNet generates high-fidelity predictions using the self-attention mechanism, a fundamental component of the transformer architecture that revolutionized natural language processing. The transformer encoder uses self-attention to encode the context-dependent behavior of the components within a system. In physical systems, elements contribute differently to a material’s properties depending on the materials system itself. For example, boron can act as an electrical dopant in one system while serving as a bond-strengthening mechanical modifier in another. CrabNet’s ability to capture this type of context-dependent behavior enables highly accurate model predictions.
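
To make the mechanism concrete, the following is a minimal PyTorch sketch of scaled dot-product self-attention over a small set of per-element feature vectors. The tensor shapes, layer names, and dimensions are illustrative assumptions and are not taken from CrabNet’s implementation.

# Toy sketch of scaled dot-product self-attention over element embeddings.
# Shapes and layer names are illustrative assumptions, not CrabNet's code.
import torch
import torch.nn.functional as F

n_elements, d_model = 4, 64           # e.g., a four-element composition
x = torch.randn(n_elements, d_model)  # hypothetical per-element feature vectors

# Learned projections for queries, keys, and values.
W_q = torch.nn.Linear(d_model, d_model)
W_k = torch.nn.Linear(d_model, d_model)
W_v = torch.nn.Linear(d_model, d_model)

q, k, v = W_q(x), W_k(x), W_v(x)

# Attention weights: how strongly each element attends to every other element.
scores = q @ k.T / d_model ** 0.5   # (n_elements, n_elements)
attn = F.softmax(scores, dim=-1)    # rows sum to 1; this is the "attention map"
context = attn @ v                  # context-aware element representations

The rows of the attention matrix are exactly the kind of element-to-element weights that can later be inspected as attention maps.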

CrabNet does not stop there. Our modeling approach uses a robust loss function and model ensembling to provide uncertainty estimates alongside the property predictions. Importantly, we achieve this with a model architecture that produces simple, inspectable self-attention maps. These attention maps reveal how the learned material property is governed by element importance and element-element interactions, and they can be visualized and analyzed during both training and inference.
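
As a sketch of how a robust loss can yield per-prediction uncertainty, the snippet below implements a heteroscedastic L1 (Laplace-style) loss in PyTorch, where the network predicts a log-scale term alongside the property value. This is a common formulation assumed here for illustration; the exact loss used by CrabNet may differ.

# Sketch of a robust, uncertainty-aware L1 loss (an assumed formulation).
import torch

def robust_l1_loss(pred, log_scale, target):
    # pred:      predicted property values
    # log_scale: predicted log of the uncertainty (scale of a Laplace distribution)
    # target:    measured property values
    # Large errors are down-weighted when the model predicts high uncertainty,
    # while the +log_scale term penalizes claiming high uncertainty everywhere.
    return torch.mean(torch.abs(pred - target) * torch.exp(-log_scale) + log_scale)

# Hypothetical usage: the network outputs both a prediction and a log-scale.
pred = torch.tensor([1.2, 0.8])
log_scale = torch.tensor([-0.5, 0.1])
target = torch.tensor([1.0, 1.0])
loss = robust_l1_loss(pred, log_scale, target)

Ensembling several independently trained models and combining their predictions then provides an additional estimate of model (epistemic) uncertainty on top of the per-prediction term above.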

As a result, CrabNet supplies a property prediction, an uncertainty estimate, and the governing element interactions for every query. We expect that the ideas and capabilities of CrabNet will encourage further research and discussion around the uses and benefits of our community’s maturing data-driven approaches.


Steven K. Kauwe, Ph.D. Candidate, University of Utah

Steven “Ka’ai” Kauwe is a graduating Ph.D. student in the Materials Science and Engineering Department at the University of Utah, working under the guidance of Dr. Taylor Sparks. Mr. Kauwe focuses on the intersection of machine learning and materials science by understanding, applying, and improving tools for materials informatics. He is the first author of five peer-reviewed publications and has made significant contributions to over 15 peer-reviewed works, some of which grew out of his mentorship of undergraduate students.

Mr. Kauwe enjoys teaching the principles of materials informatics. He developed the curriculum for, and taught, the “Programming for Engineers” course for his department’s undergraduate students. He has presented at materials informatics workshops hosted by the University of Utah, the University of Houston, and the University of California, Santa Barbara. Finally, Mr. Kauwe has presented his work at several conferences, most notably an invited talk at TMS 2020 and a presentation at MRS 2020.