Commit a29fd1f

writing the history part
1 parent 1403346 commit a29fd1f

File tree

4 files changed: +892 -1 lines changed


deep-learning-intro-for-hep/history.md

Lines changed: 15 additions & 1 deletion
@@ -27,8 +27,22 @@ Big data processing has always been an essential part of HEP, since the beginnin
2. These collisions produced new particles to discover, since any particle whose mass is less than the center-of-mass energy of the collision (and allowed by quantum numbers) could be created. Although in the first few decades accelerated particles were collided with stationary targets, the goal of the experiments, then as now, was to produce particles that can't ordinarily be found in nature and study their properties.
3. Computers quantified particle trajectories, reconstructed invisible (neutral) particles, and rejected backgrounds, for as many collision events as possible.

-The last point is key: high energy physicists started using computers as soon as they became available. For example, Luis Alvarez's group at Berkeley's \$9,000,000 Bevatron built a \$1,250,000 detector experiment and bought a \$200,000 IBM 650 to analyze the data ([ref](https://www2.lbl.gov/Science-Articles/Research-Review/Magazine/1981/81fchp6.html)). Computers were a significant fraction of the already large costs, and yet analysis productivity was still limited by processing speed.
+The last point is key: high energy physicists started using computers as soon as they became available. For example, Luis Alvarez's group at Berkeley's \$9,000,000 Bevatron built a \$1,250,000 detector experiment and bought a \$200,000 IBM 650 to analyze the data in 1955 ([ref](https://www2.lbl.gov/Science-Articles/Research-Review/Magazine/1981/81fchp6.html)). Computers were a significant fraction of the already large costs, and yet analysis productivity was still limited by processing speed.

![](img/overall-view-of-bevatron-magnet-photograph-taken-september-6-1955-bevatron-088cb0-1600.jpg){. width="31%"} ![](img/alvarez-group-bubble-chamber.jpg){. width="28%"} ![](img/ibm-650.jpg){. width="39%"}

+The limitation was algorithmic: physicists wanted to compute the kinematics of the observed particles, which is easy (just a formula), but the particles were observed as images, which is hard. To convert images into trajectories, something has to find the line segments and express them as vectors and endpoints. For the first 10‒20 years, that "something" was people: human scanners, mostly women, identified the vertices and trajectories of tracks in bubble chamber photos on specially designed human-computer interfaces (see [ref](https://www.physics.ucla.edu/marty/HighEnergyPhysics.pdf) for a first-hand account).

+![](img/franckenstein-3.jpg){. width="50%"}
+
+This is a pattern-recognition task: if ML had been available (in a usable form) in the 1950s, it would have been better than using humans for this task. Moreover, humans couldn't keep up with the rate. Then as now, the quality of the results (discovery potential and statistical precision) scales with the number of analyzed events: the more, the better. The following plot (from [ref](https://books.google.de/books?id=imidr-iFYCwC&lpg=PA129&dq=jack%20franck%20franckenstein&pg=PA130#v=onepage&q&f=false)) was made to quantify the event interpretation rate using different human-computer interfaces.
+
+![](img/scaleup.png){. width="60%"}
+
+Below, I have extended the plot to the present day: the number of events per second has continued to increase exponentially.
+
+![](img/event-rates.svg){. width="100%"}
+
+These event rates have been too fast for humans since 1970, when human scanners were replaced by heuristic track-finding routines, which typically iterate through all combinations of track segments within plausible windows (an approach that is now limiting at high track densities).
+
+Although many computing tasks in particle physics are well suited to hand-written algorithms, the field has always had tasks that are a natural fit for artificial intelligence, so much so that human intelligence was enlisted to solve them. While ML would have benefited HEP from the very beginning of the field, algorithms and computational resources have only recently made it possible.

deep-learning-intro-for-hep/img/event-rates.svg

Lines changed: 877 additions & 0 deletions
443 KB

0 commit comments
