Learning more about particle collisions with machine learning
Date:
July 8, 2020
Source:
DOE/Argonne National Laboratory
Summary:
A team of scientists has devised a machine learning algorithm that
quickly approximates how the ATLAS detector at the Large Hadron
Collider would respond to the roughly ten times more data expected
from a planned upgrade in 2027.
FULL STORY ==========================================================================
The Large Hadron Collider (LHC) near Geneva, Switzerland, became famous
around the world in 2012 with the detection of the Higgs boson. The
observation marked a crucial confirmation of the Standard Model of
particle physics, which organizes the subatomic particles into groups
similar to the way the periodic table organizes the chemical elements.
==========================================================================
The U.S. Department of Energy's (DOE) Argonne National Laboratory
has made many pivotal contributions to the construction and operation
of the ATLAS experimental detector at the LHC and to the analysis of
signals recorded by the detector that uncover the underlying physics
of particle collisions. Argonne is now playing a lead role in the high-luminosity upgrade of the ATLAS detector for operations that are
planned to begin in 2027. To that end, a team of Argonne physicists and
computational scientists has devised a machine learning-based algorithm
that approximates how the present detector would respond to the greatly increased data expected with the upgrade.
As the largest physics machine ever built, the LHC shoots two beams of
protons in opposite directions around a 17-mile ring, accelerating them
to nearly the speed of light, smashes them together and analyzes the
collision products with gigantic detectors such as ATLAS. The ATLAS instrument
is about the height of a six-story building and weighs approximately
7,000 tons. Today, the LHC continues to study the Higgs boson, as well
as address fundamental questions on how and why matter in the universe
is the way it is.
"Most of the research questions at ATLAS involve finding a needle in
a giant haystack, where scientists are only interested in finding one
event occurring among a billion others," said Walter Hopkins, assistant physicist in Argonne's High Energy Physics (HEP) division.
As part of the LHC upgrade, efforts are now progressing to boost the LHC's luminosity -- the number of proton-to-proton interactions per collision
of the two proton beams -- by a factor of five. This will produce about
10 times more data per year than what is presently acquired by the LHC experiments. How well the detectors respond to this increased event rate
still needs to be understood. This requires running high-performance
computer simulations of the detectors to accurately assess known processes resulting from LHC collisions.
These large-scale simulations are costly and demand large chunks of
computing time on the world's best and most powerful supercomputers.
The Argonne team has created a machine learning algorithm that will be
run as a preliminary simulation before any full-scale simulations. This
algorithm approximates, much faster and at far lower cost, how the
present detector would respond to the greatly increased data expected
with the upgrade. It involves simulating the detector's response to a
particle-collision experiment and reconstructing objects from the
physical processes. These reconstructed objects include jets, or sprays
of particles, as well as individual particles like electrons and muons.
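
As an illustration only (this is not the team's published code), such a
fast simulation can be sketched in a few lines of Python: truth-level
jets are mapped to "reconstructed" jets by a cheap response model
standing in for the full detector simulation. The spectrum, scale, and
resolution numbers below are invented for the sketch.

    import numpy as np

    rng = np.random.default_rng(seed=42)

    def truth_jets(n_events):
        """Toy truth-level jets: transverse momentum (GeV) and pseudorapidity."""
        pt = rng.exponential(scale=50.0, size=n_events) + 20.0  # falling pT spectrum
        eta = rng.uniform(-2.5, 2.5, size=n_events)             # central detector region
        return np.column_stack([pt, eta])

    def fast_response(jets):
        """Cheap stand-in for the detector: scale and smear the truth-level pT.

        A real fast simulation would use a response model derived from (or,
        as in the Argonne work, trained on) detailed full simulation; the
        Gaussian smearing here is only a placeholder.
        """
        pt, eta = jets[:, 0], jets[:, 1]
        scale = 0.97                            # toy calibration offset
        sigma = 0.9 * np.sqrt(pt) + 0.03 * pt   # toy resolution vs. pT
        reco_pt = scale * pt + rng.normal(0.0, 1.0, size=pt.size) * sigma
        return np.column_stack([reco_pt, eta])

    truth = truth_jets(100_000)
    # Runs in milliseconds, versus minutes per event for full,
    # Geant4-style detector simulation.
    reco = fast_response(truth)
    print("mean truth pT %.1f GeV -> mean reco pT %.1f GeV"
          % (truth[:, 0].mean(), reco[:, 0].mean()))
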
"The discovery of new physics at the LHC and elsewhere demands ever
more complex methods for big data analyses," said Doug Benjamin, a computational scientist in HEP. "These days that usually means use
of machine learning and other artificial intelligence techniques."
The previously used analysis methods for initial simulations have not
employed machine learning algorithms and are time-consuming because
they involve manually updating experimental parameters when conditions
at the LHC change.
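
The sketch below illustrates that older style of parametrization (all
values are hypothetical, not ATLAS numbers): resolution constants are
derived offline for each running condition and entered by hand, so
every change in LHC conditions means deriving and maintaining a new
set.

    import numpy as np

    # Hypothetical hand-derived jet-energy-resolution parameters, one row
    # per running condition. None of these numbers come from ATLAS.
    JET_PARAMS = {
        "low_pileup":  {"noise": 3.0, "stochastic": 0.8},
        "high_pileup": {"noise": 5.5, "stochastic": 0.9},
        # A new condition, say the high-luminosity upgrade, requires
        # deriving and adding a whole new row by hand.
    }

    def jet_pt_resolution(pt, condition):
        """Classic noise/stochastic parametrization: sigma = sqrt(N^2 + S^2 * pT)."""
        p = JET_PARAMS[condition]
        return np.sqrt(p["noise"] ** 2 + p["stochastic"] ** 2 * pt)

    print("sigma(pT=100 GeV) = %.1f GeV" % jet_pt_resolution(100.0, "high_pileup"))
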
Some may also miss important data correlations for a given set of input
variables to an experiment. The Argonne-developed algorithm learns, in
real time while a training procedure is applied, the various features
that would otherwise need to be introduced through detailed full
simulations, thereby avoiding the need to handcraft experimental
parameters. The method can also capture complex interdependencies of
variables in a way that has not been possible before.
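
A minimal sketch of that idea, assuming a small feed-forward network
and toy training data (this is not the published implementation): the
truth-to-reconstructed mapping, including the correlation between
output variables, is learned from full-simulation examples rather than
written down by hand.

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(seed=7)

    # Toy "full simulation" training pairs: truth-level (pT, eta) in,
    # reconstructed (pT, energy) out. A shared fluctuation makes the two
    # outputs correlated, the kind of interdependency a handcrafted
    # parametrization tends to miss.
    n = 50_000
    truth_pt = rng.exponential(50.0, n) + 20.0
    truth_eta = rng.uniform(-2.5, 2.5, n)
    shared = rng.normal(0.0, 1.0, n)
    reco_pt = 0.97 * truth_pt + (2.0 + 0.05 * truth_pt) * shared
    reco_e = reco_pt * np.cosh(truth_eta) + 3.0 * shared

    X = np.column_stack([truth_pt, truth_eta])
    y = np.column_stack([reco_pt, reco_e])

    # The network learns the truth -> reconstructed mapping from examples;
    # nothing about scales, resolutions, or correlations is coded by hand.
    model = make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0),
    )
    model.fit(X, y)

    print(np.round(model.predict(X[:3]), 1))

One caveat in the sketch: a plain regressor learns only the average
response; reproducing the full spread of detector outcomes, as a usable
parametrization must, calls for density-estimation or generative
techniques that are omitted here.
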
"With our stripped-down simulation, you can learn the basics at
comparatively little computational cost and time, then you can much
more efficiently proceed with full simulations at a later date,"
said Hopkins. "Our machine learning algorithm also provides users with
better discriminating power on where to look for new or rare events in
an experiment," he added.
The team's algorithm could prove invaluable not only for ATLAS but also
for the other experimental detectors at the LHC, as well as for other
particle physics experiments now being conducted around the world.
==========================================================================
Story Source: Materials provided by DOE/Argonne National Laboratory.
Original written by Joseph E. Harmon. Note: Content may be edited for
style and length.
==========================================================================
Journal Reference:
1. D. Benjamin, S. Chekanov, W. Hopkins, Y. Li, J.R. Love. Automated
   detector simulation and reconstruction parametrization using machine
   learning. Journal of Instrumentation, 2020; 15 (05): P05025. DOI:
   10.1088/1748-0221/15/05/P05025
==========================================================================
Link to news story:
https://www.sciencedaily.com/releases/2020/07/200708125352.htm