DOE to Build Supercomputing Pipeline for Data-streaming, Analysis

The Stanford Linear Accelerator Center is collaborating with other Department of Energy laboratories to build a new data-streaming pipeline that would allow researchers to analyze data in real time.

The project aims to integrate artificial intelligence and machine learning-powered software into computing systems to achieve faster and more accurate results for scientific experiments, the Oak Ridge National Laboratory said Monday.

Called Intelligent Learning for Light Source and Neutron Source User Measurements Including Navigation and Experiment Steering, or ILLUMINE, the five-year project is part of efforts to expand connections between DOE computing centers and research facilities under the U.S. national laboratory system.

According to Jana Thayer, technical research manager at SLAC National Accelerator Laboratory, ILLUMINE aims to transmit experimental data to a remote computing facility without saving any information to a disk.

The project attempts to realize the ability to analyze the data of an ongoing experiment for optimal research results, which means “faster times to solutions and more accurate science,” Thayer said.
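The in-memory hand-off Thayer describes can be sketched, very loosely, as a producer/consumer pipeline in which measurement frames flow straight from the instrument to the analysis code without ever being written to disk. All names below are illustrative stand-ins, not ILLUMINE code:

```python
# A minimal sketch of disk-free data streaming: a producer thread hands
# detector "frames" to an analysis consumer through an in-memory queue.
# detector_frames, stream, and analyze are hypothetical names.
import queue
import threading

def detector_frames(n):
    # Stand-in for a detector emitting measurement frames during a run.
    for i in range(n):
        yield {"frame": i, "counts": i * 10}

def stream(frames, out_queue):
    for f in frames:
        out_queue.put(f)    # hand off in memory; nothing touches disk
    out_queue.put(None)     # sentinel marking the end of the run

def analyze(in_queue):
    total = 0
    while (f := in_queue.get()) is not None:
        # Real-time analysis could steer the experiment at this point.
        total += f["counts"]
    return total

q = queue.Queue()
producer = threading.Thread(target=stream, args=(detector_frames(5), q))
producer.start()
result = analyze(q)
producer.join()
print(result)  # 0 + 10 + 20 + 30 + 40 = 100
```

In a real facility the queue would be replaced by a network transport to the remote computing center, but the design point is the same: analysis keeps pace with acquisition instead of waiting for files to land on storage.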

Additionally, the project will get a boost from high-performance computing centers, which would “expedite the data analysis process and alleviate in-house data storage issues,” Valerio Mariani, head of the LCLS Data Analytics Department at SLAC, explained.

Currently, the ILLUMINE project is using the Summit supercomputer at the Oak Ridge Leadership Computing Facility and will soon shift to Frontier, which will take over when Summit is decommissioned by the end of 2024.