Scaling in-situ process monitoring to qualify AM parts
Alumnus Luke Scime, who was the first student to apply machine learning to additive manufacturing at Carnegie Mellon, is now a staff scientist at Oak Ridge National Laboratory, where he is developing process monitoring software to qualify additively manufactured parts at scale.
When Luke Scime was a mechanical engineering Ph.D. student at Carnegie Mellon in 2018, he took on a small side project to apply computer vision techniques he was learning in a computer science course to the additive manufacturing (AM) research he was conducting in Jack Beuth’s new lab.
Those early experiments in applying machine learning to some of CMU's first metal 3D printers have grown in scope and scale and now encompass a nationwide effort to develop software that is being used at 15 U.S. government labs and more than a dozen companies and universities.
Scime is now a staff scientist at Oak Ridge National Laboratory (ORNL). He and a team from the Manufacturing Demonstration Facility at ORNL have been developing Peregrine, a scalable in-situ process monitoring software stack that uses artificial intelligence to generate data supporting additively manufactured part qualification. Like the peregrine falcon it is named for, the software is fast, agile, and adaptable.
“Luke was the first student of mine to apply machine learning to AM process monitoring applications. He made a big impact on future research in our group,” said Beuth, a professor of mechanical engineering.
Additive manufacturing offers numerous advantages, including the ability to create complex and intricate geometries that can't be produced using traditional manufacturing methods. It can reduce material waste and supports a wide range of materials, including various metals and alloys. AM also shortens lead times by eliminating the need for tooling and molds, and it can produce parts on demand, reducing the need for large inventories. But the ability to fully realize the many significant advantages of the growing range of AM applications depends upon the ability to qualify the finished parts.
Scime came to campus in November to meet with students and faculty and deliver a presentation about his work at the Manufacturing Demonstration Facility, a Department of Energy user facility focused on early-stage research and development to improve the energy and material efficiency, productivity, and competitiveness of American manufacturers.
He outlined a progression of how in-situ data can support AM part qualification. At the most basic level, the data and metadata are recorded but generally exhibit low quality and inconsistent formatting. The second level provides better-quality data that is recorded more consistently and is available for use in artificial intelligence applications that can detect anomalies. At the next level, in-situ data are spatially registered with ex-situ characterization data and enable physics-based modeling that can predict flaws and material properties.
The final aspirational application of in-situ data will generate reliable property predictions that allow AI algorithms to automatically iterate both the part design and the manufacturing process steps.
The challenge is developing a standard solution that can be applied to the many different types of printers, printer manufacturers, users, use cases, and materials, as well as a wide range of methods to monitor, store, and analyze the data that researchers need in order to build effective computer models.
Scime told his Carnegie Mellon audience that, “Many national labs are doing in-situ monitoring work, but it would not be efficient for every one of them to come up with their own data management structure. If we can provide a flexible framework that they can all use, then their time and money can expand into some of the other important work that needs to be done.”
Like many students who came out of Beuth’s program, Scime had the option to take a job where he would have been the primary expert working in a small AM operation or join a large team like the one at ORNL, where he would be surrounded by a large number of experts who were pursuing both the scientific and engineering challenges.
“It’s really cool to be working with hundreds of engineers and scientists who are striving to solve these challenges so that solutions we develop can be applied at scale and used every day over and over across a lot of different systems and by many types of users,” said Scime.
Currently, the Peregrine team is gathering data from nine laser powder bed fusion systems, four electron beam powder bed fusion systems, and six binder jet systems, as well as more than 10 different systems operated by their government and industry partners. They have loaded more than 2,500 builds into the software and are currently adding approximately 50 new builds each month.
The original scope of the project was smaller, and its goal, similar to the project he worked on at Carnegie Mellon, was simply to use images to find anomalies. But the ORNL team knew they couldn't create a solution for every individual printer, so the key research objectives are now to develop high-resolution, multi-modal, sensor-agnostic deep learning algorithms; to identify in-situ and ex-situ data registration techniques; and to create common data formats and clean software interfaces that can be applied to a broad range of printers, users, and use cases.
The technology autonomously collects and analyzes layer-wise imaging data; provides remote monitoring and process intervention capabilities; tracks metadata and part information; produces advanced visualizations of both the underlying in-situ data and machine learning results; and enables identification of correlations between in-situ data and process parameters or ex-situ characterization data.
Relying on image data, or on sensing modalities that can be converted into image data, the code supports several pre-processing steps, such as registration, distortion correction, and annotation of ground-truth training data; it can process data online or offline, create a 3D rendering of the detected anomalies, and implement corrective actions.
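The layer-wise anomaly detection described above can be sketched in miniature. The following Python example is purely illustrative and does not reflect Peregrine's actual API or algorithms; the function names, the simple shift-based registration, and the fixed threshold are all assumptions made for the sketch.

```python
import numpy as np

def register(layer: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Crude registration: shift the layer image by a known pixel offset
    (a stand-in for real distortion correction and alignment)."""
    return np.roll(layer, shift=(dy, dx), axis=(0, 1))

def detect_anomalies(layer: np.ndarray, reference: np.ndarray,
                     threshold: float = 0.25) -> np.ndarray:
    """Flag pixels whose intensity deviates strongly from a nominal
    reference layer. Returns a boolean anomaly mask."""
    diff = np.abs(layer.astype(float) - reference.astype(float))
    return diff > threshold * reference.max()

# Illustrative data: a uniform nominal layer and one with a bright
# defect region (e.g., simulated spatter or over-melt).
reference = np.full((64, 64), 100.0)
layer = reference.copy()
layer[30:34, 30:34] = 180.0  # 4x4-pixel simulated defect

mask = detect_anomalies(register(layer, 0, 0), reference)
print(mask.sum())  # number of flagged pixels -> 16
```

In a production system each flagged mask would be stacked layer by layer into the kind of 3D anomaly rendering the article describes, and could trigger remote alerts or process interventions.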
The goal for 2024 is to begin incorporating software features developed at other DOE laboratories back into the Peregrine code base.
“We are now working with more than 20 government, academic, and industry partners, who are contributing software features, providing unique datasets, and performing tests in production environments that will allow us to continue to grow Peregrine’s capabilities and generalizability,” explained Scime.
Beuth can’t help but feel a sense of pride in the work his former student is doing.
“I see the Peregrine software having significant impact on the AM community as a whole,” said Beuth, who reports that Carnegie Mellon’s Next Manufacturing Center is now working to acquire Peregrine.
“At that point Luke’s impact on CMU will have come full circle.”