writing the history part
jpivarski committed Aug 21, 2024 (1 parent 1403346, commit a29fd1f)
Showing 4 changed files with 892 additions and 1 deletion.
16 changes: 15 additions & 1 deletion deep-learning-intro-for-hep/history.md
@@ -27,8 +27,22 @@ Big data processing has always been an essential part of HEP, since the beginning
2. These collisions produced new particles to discover: any particle whose mass is less than the center-of-mass energy of the collision (subject also to constraints from quantum numbers) could be produced. Although in the first few decades accelerated particles were collided with stationary targets, the goal of the experiments, then as now, was to produce particles that can't ordinarily be found in nature and study their properties.
3. Computers quantified particle trajectories, reconstructed invisible (neutral) particles, and rejected backgrounds, for as many collision events as possible.

The last point is key: high energy physicists started using computers as soon as they became available. For example, Luis Alvarez's group at Berkeley's \$9,000,000 Bevatron built a \$1,250,000 detector experiment and bought a \$200,000 IBM 650 to analyze the data in 1955 ([ref](https://www2.lbl.gov/Science-Articles/Research-Review/Magazine/1981/81fchp6.html)). Computers were a significant fraction of the already large costs, and yet analysis productivity was still limited by processing speed.

![](img/overall-view-of-bevatron-magnet-photograph-taken-september-6-1955-bevatron-088cb0-1600.jpg){. width="31%"} ![](img/alvarez-group-bubble-chamber.jpg){. width="28%"} ![](img/ibm-650.jpg){. width="39%"}

The limitation was algorithmic: physicists wanted to compute the kinematics of the observed particles, which is easy (just a formula), but the particles were observed as images, which is hard. To convert images into trajectories, something had to find the line segments and express them as vectors and endpoints. For the first 10‒20 years, that "something" was people: humans, mostly women, identified the vertices and trajectories of tracks in bubble chamber photos on specially designed human-computer interfaces (see [ref](https://www.physics.ucla.edu/marty/HighEnergyPhysics.pdf) for a first-hand account).

![](img/franckenstein-3.jpg){. width="50%"}

This is a pattern-recognition task: if ML had been available (in a usable form) in the 1950's, it would have been better than using humans for this task. Moreover, humans couldn't keep up with the rate. Then as now, the quality of the results (discovery potential and statistical precision) scales with the number of analyzed events: the more, the better. The following plot (from <a href="https://books.google.de/books?id=imidr-iFYCwC&lpg=PA129&dq=jack%20franck%20franckenstein&pg=PA130#v=onepage&q&f=false">ref</a>) was made to quantify the event interpretation rate using different human-computer interfaces.

![](img/scaleup.png){. width="60%"}

Below, I extended the plot to the present day: the number of events per second has continued to increase exponentially.

![](img/event-rates.svg){. width="100%"}

These event rates have been too fast for humans since 1970, when human scanners were replaced by heuristic track-finding routines, usually based on iterating through all combinations of measurements within plausible windows (an approach that becomes a limitation at high track densities). A toy sketch of this combinatorial idea follows below.
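
To make the combinatorial character of those routines concrete, here is a minimal, purely illustrative sketch in Python (not the algorithm of any particular experiment): it tries combinations of 2D hits, discards combinations that fall outside a crude plausibility window, and keeps those consistent with a straight line. The function name, thresholds, and straight-line model are assumptions made only for this example.

```python
import itertools
import numpy as np

def find_track_candidates(hits, window=0.2, max_residual=0.05, n_hits=4):
    """Toy combinatorial track finder (illustrative only).

    `hits` is an (N, 2) array of (z, x) points. Every combination of `n_hits`
    points is tried; combinations outside a crude plausibility window are
    skipped, and the rest are kept if they fit a straight line x = a*z + b.
    """
    candidates = []
    for combo in itertools.combinations(range(len(hits)), n_hits):
        pts = hits[list(combo)]
        # plausibility window: skip combinations whose spread in x is too wide
        if np.ptp(pts[:, 1]) > window:
            continue
        # fit a straight line and require all points to lie close to it
        a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)
        residuals = pts[:, 1] - (a * pts[:, 0] + b)
        if np.max(np.abs(residuals)) < max_residual:
            candidates.append((combo, a, b))
    return candidates

# six hits lying roughly on the line x = 0.1*z, with small measurement noise
rng = np.random.default_rng(42)
z = np.linspace(0.0, 1.0, 6)
hits = np.column_stack([z, 0.1 * z + rng.normal(0.0, 0.01, size=6)])
print(find_track_candidates(hits)[:2])
```

Because the number of combinations grows rapidly with the number of hits, this style of search becomes expensive exactly where modern detectors operate: at high track densities.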

Although many computing tasks in particle physics are suitable for hand-written algorithms, the field has always had tasks that are a natural fit for artificial intelligence, to the extent that human intelligence was enlisted to solve them. While ML would have been beneficial to HEP from the very beginning of the field, algorithms and computational resources have only recently made it possible.
877 changes: 877 additions & 0 deletions deep-learning-intro-for-hep/img/event-rates.svg
Binary file added deep-learning-intro-for-hep/img/scaleup.png
