Princeton University researchers are leveraging large language models (LLMs) to help manufacturers more easily predict the behavior of crystalline structures. The AI is aimed at streamlining processes in battery and semiconductor production.
According to a recent paper, a team led by Andre Niyongabo Rubungo adapted Google Research’s T5 LLM to predict material properties from text descriptions of crystal structures. Using a benchmark library of 140,000 descriptions, the AI accurately forecast the properties of crystalline structures ranging from table salt to the silicon used in microchip foundries.
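In broad strokes, the approach amounts to fine-tuning a T5-style encoder as a regressor over crystal text descriptions. The sketch below illustrates that general idea only; the Hugging Face "t5-small" checkpoint, the mean-pooled regression head, and the toy NaCl example are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch: fine-tune a T5 encoder to regress a scalar material
# property (e.g., a band gap in eV) from a crystal's text description.
# Model name, pooling strategy, and data are assumptions for illustration.
import torch
from torch import nn
from transformers import AutoTokenizer, T5EncoderModel

class PropertyRegressor(nn.Module):
    def __init__(self, backbone: str = "t5-small"):
        super().__init__()
        self.encoder = T5EncoderModel.from_pretrained(backbone)
        self.head = nn.Linear(self.encoder.config.d_model, 1)  # scalar output

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Mean-pool token embeddings (ignoring padding), then regress.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        return self.head(pooled).squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = PropertyRegressor()

# Toy training step: one description paired with a hypothetical label.
batch = tokenizer(["NaCl crystallizes in the cubic rock-salt structure ..."],
                  return_tensors="pt", padding=True, truncation=True)
target = torch.tensor([5.0])  # hypothetical band gap in eV

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss = nn.functional.mse_loss(model(**batch), target)
loss.backward()
optimizer.step()
```

In practice such a model would be trained over the full benchmark of descriptions and evaluated on held-out crystals; the point of the sketch is simply that plain-text descriptions can stand in for explicit structural features.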
While challenges remain around data access and compute, the AI promises to expedite the design and testing of new technologies. A January 2024 report from Tech Xplore has more.
https://arxiv.org/abs/2310.14029
https://www.researchgate.net/profile/Andre-Niyongabo-Rubungo
https://blog.research.google/2020/02/exploring-transfer-learning-with-t5.html
https://techxplore.com/news/2024-01-harness-large-language-materials-discovery.html