COMPLAS 2023

High-dimensional symbolic regression via neural feature polynomials for interpretable machine learning plasticity

  • Bahmani, Bahador (Columbia University)
  • Suh, Hyoung Suk (Columbia University)
  • Sun, WaiChing (Columbia University)


This paper introduces a new approach that combines the expressivity of deep neural networks with the interpretability and portability of the mathematical expressions produced by symbolic regression to formulate plasticity models that precisely capture the plastic behaviors of solids. By introducing neural network architectures that generate a feature space and aggregate those features in polynomial form, we enable the yield function to be determined analytically. Compared with state-of-the-art benchmark algorithms, the proposed method delivers more robust and accurate predictions, while the divide-and-conquer approach significantly improves computational efficiency, especially for high-dimensional models intended to capture material behaviors that lack material symmetry, exhibit size-dependent effects, or involve complex hardening/softening mechanisms. By leveraging the portability of symbolic regression, the resultant models can be easily deployed to third-party software, for example as a UMAT subroutine in ABAQUS. Extensions of the proposed approach to inverse problems and materials design in feature space will also be discussed.
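To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of a "neural feature polynomial" yield function: a small network maps stress inputs to a low-dimensional feature space, and a fixed polynomial over those features yields a closed-form expression once its coefficients are identified. All weights, coefficients, and function names here are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer network mapping three stress invariants
# to a two-dimensional learned feature space (untrained placeholder weights).
W1, b1 = rng.normal(size=(3, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 2)), rng.normal(size=2)

def features(stress_invariants):
    """Map stress invariants to learned features z = (z1, z2)."""
    h = np.tanh(stress_invariants @ W1 + b1)
    return h @ W2 + b2

def yield_function(stress_invariants, coeffs):
    """Aggregate features in a quadratic polynomial:
    f = c0 + c1*z1 + c2*z2 + c3*z1**2 + c4*z1*z2 + c5*z2**2 - 1.
    Because the polynomial structure is fixed, f is an analytic
    (symbolically differentiable) expression in the features."""
    z1, z2 = features(stress_invariants)
    monomials = np.array([1.0, z1, z2, z1**2, z1 * z2, z2**2])
    return float(monomials @ coeffs) - 1.0

# Placeholder polynomial coefficients; in practice these would be fitted.
coeffs = rng.normal(size=6)
f = yield_function(np.array([1.0, 0.5, 0.2]), coeffs)
```

A yield surface is then the level set f = 0, and the analytic polynomial form is what allows the fitted model to be exported as a plain mathematical expression for third-party codes.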