
If you know the detector is measuring all the transverse momentum, and the momentum perpendicular to the beam doesn’t appear to be conserved after a collision, then something must have disappeared undetected and carried away momentum. Detectors, as we have seen, measure momentum in the perpendicular directions very carefully. The calorimeters in the forward and backward regions ensure hermeticity by guaranteeing that very little energy or momentum perpendicular to the beam can escape unnoticed.
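To make the bookkeeping concrete, here is a minimal sketch in Python of how missing transverse momentum can be inferred from what the detector does see. The particle list and its values are invented for illustration; real reconstruction software is far more elaborate.

    import math

    # Transverse momentum components (px, py), in GeV/c, of every
    # particle the detector recorded in one hypothetical event.
    visible_particles = [
        (25.3, -10.1),   # say, an electron
        (-40.2, 5.7),    # say, a jet
        (5.5, 3.0),      # say, another jet
    ]

    # Momentum perpendicular to the beam should sum to essentially zero,
    # so whatever balances the visible sum must have escaped unseen.
    px_sum = sum(px for px, _ in visible_particles)
    py_sum = sum(py for _, py in visible_particles)

    missing_px, missing_py = -px_sum, -py_sum
    missing_pt = math.hypot(missing_px, missing_py)

    print(f"Missing transverse momentum: {missing_pt:.1f} GeV/c")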

The CMS apparatus has steel absorbers and quartz fibers in the end regions, whose denser material better separates the closely spaced particle tracks there. The brass in the endcaps is recycled material: it was originally used in Russian artillery shells. The ATLAS apparatus uses liquid-argon calorimeters in the forward region to detect not only electrons and photons but also hadrons.

MAGNETS

The pieces of both detectors that remain to be described in more detail are the magnets that give both experiments their names. A magnet is not a detector element, in that it doesn’t record particle properties. But magnets are essential to particle detection because they help determine momentum and charge, properties that are critical to identifying and characterizing particle tracks. Particles bend in magnetic fields, so their tracks curve rather than run straight. How much and in which direction they bend depends on their energies and charges.
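The relation behind this is standard textbook physics: for a particle of unit charge moving perpendicular to a magnetic field, the transverse momentum in GeV/c is roughly 0.3 times the field in tesla times the radius of curvature in meters. A small sketch, with an invented track radius:

    # pT ≈ 0.3 * B * r for a particle of charge |q| = 1
    # (pT in GeV/c, B in tesla, r in meters).
    def transverse_momentum(field_tesla, radius_m):
        return 0.3 * field_tesla * radius_m

    # A track curving with a 10-meter radius (an invented example)
    # in a 3.8-tesla field corresponds to roughly:
    print(f"{transverse_momentum(3.8, 10.0):.1f} GeV/c")  # ~11.4 GeV/c

The stiffer and straighter a track, the higher its momentum, which is why precise curvature measurements matter so much.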

CMS’s enormous solenoidal magnet, made of refrigerated superconducting niobium-titanium coils, is 12.5 meters long and six meters in diameter. This magnet is the defining feature of the detector and is the largest magnet of its type ever made. The solenoid has coils of wire surrounding a metal core, generating a magnetic field when current flows. The energy stored in this magnet is the same as that released by half a metric ton of TNT. Needless to say, precautions have been taken in case the magnet quenches and suddenly loses superconductivity. The solenoid’s successful 4-tesla test was completed in September 2006, but it will be run at a slightly lower field of 3.8 tesla to ensure greater longevity.
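That TNT comparison survives a back-of-envelope check: the energy density of a magnetic field is B²/2μ₀, and multiplying by the rough volume of the solenoid bore gives about two gigajoules, close to what half a metric ton of TNT releases. A sketch, assuming the approximate cylinder dimensions quoted above rather than engineering values:

    import math

    MU_0 = 4 * math.pi * 1e-7       # vacuum permeability, in T·m/A
    TNT_JOULES_PER_TON = 4.184e9    # ~4.2 GJ per metric ton of TNT

    B = 4.0                         # tesla (the tested field)
    radius, length = 3.0, 12.5      # meters (approximate bore dimensions)

    energy_density = B**2 / (2 * MU_0)       # joules per cubic meter
    volume = math.pi * radius**2 * length    # cubic meters
    stored_energy = energy_density * volume  # ~2.2e9 joules

    print(f"~{stored_energy / TNT_JOULES_PER_TON:.2f} metric tons of TNT")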

The solenoid is sufficiently big to enclose the tracking and calorimeter layers. The muon detectors, on the other hand, sit on the outer perimeter of the detector, outside the solenoid. However, the four layers of muon detector are interlaced with a huge iron structure surrounding the magnetic coils that contains and guides the field, ensuring uniformity and stability. This magnetic return yoke, 21 meters long and 14 meters in diameter, reaches to the full seven-meter radius of the detector. In effect, it also forms part of the muon system, since muons should be the only known charged particles that penetrate the 10,000 metric tons of iron and cross the muon chambers (though in reality energetic hadrons will sometimes also get in, creating some headaches for the experimenters). The magnetic field in the yoke bends the muons in the outer detector. Since how much muons bend in the field depends on their momenta, the yoke is vital to measuring the muons’ momenta and energies. The enormous, structurally stable magnet plays another role as well: it supports the experiment and protects it from the giant forces exerted by its own magnetic field.

The ATLAS magnet configuration is entirely different. In ATLAS, two different systems of magnets are used: a 2-tesla solenoid enclosing the tracking systems, and huge toroidal magnets in the outer regions interleaved with the muon chambers. When you look at pictures of ATLAS (or the experiment itself), the most notable elements are these eight huge toroidal structures (seen in Figure 34) and the two additional toroids that cap the ends. The magnetic field they create stretches 26 meters along the beam axis and extends radially from the start of the muon spectrometer out to 11 meters.

Among the many interesting stories I heard when visiting the ATLAS experiment was how, when the construction crews originally lowered the magnets into place, the magnets started off in a more oval configuration (when viewed from the side). The engineers had factored in gravity before installing them, so they correctly anticipated that after some time, due to their own weight, the magnets would settle into a rounder shape.

Another story that impressed me was about how ATLAS engineers factored in a slight rise of the cavern floor, about one millimeter per year, caused by the hydrostatic pressure from the cavern excavation. They designed the experiment so that the small motion would put the machine in optimal position in 2010, when the initial plan was to have the first run at full capacity. With the LHC delays, that hasn’t been the case. But by now, the ground under the experiment has settled to the point that the experiment has stopped moving, so it will remain in the correct position throughout operation. Despite Yogi Berra’s admonition that it’s “tough to make predictions, especially about the future,”52 the ATLAS engineers got it right.

COMPUTATION

No description of the LHC is complete without describing its enormous computational power. In addition to the remarkable hardware that goes into the trackers, calorimeters, muon systems, and magnets we just considered, coordinated computation around the world is essential to dealing with the overwhelming amount of data the many collisions will generate.

Not only is the LHC seven times higher in energy than the Tevatron, previously the highest-energy collider, but it also generates events at a rate about 50 times higher. The LHC needs to handle what are essentially extremely high resolution pictures of events happening at a rate of up to about a billion collisions per second. The “picture” of each event contains about a megabyte of information.
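A quick sketch of the raw arithmetic, using the figures just quoted, shows the scale of the problem:

    events_per_second = 1e9   # up to ~a billion collisions per second
    bytes_per_event = 1e6     # ~a megabyte per event "picture"

    raw_rate = events_per_second * bytes_per_event  # bytes per second
    print(f"~{raw_rate / 1e15:.0f} petabyte per second, unfiltered")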

This would be way too much data for any computing system to deal with. So trigger systems make decisions on the fly about which data to keep and which to throw away. By far the most frequent collisions are just ordinary proton interactions that occur via the strong force. No one cares about most of these collisions, which represent known physical processes but nothing new.

The collisions of protons are analogous in some respects to two beanbags colliding. Because beanbags are soft, most of the time they wilt and hang and don’t do anything interesting during the collision. But occasionally when beanbags bang together, individual beans hit each other with great force—maybe even so much so that individual beans collide and the bags themselves break. In that case, individual colliding beans will fly off dramatically since they are hard and collide with more localized energy, while the rest of the beans will fly along in the direction in which they started.

Similarly, when protons in the beam hit each other, the individual subunits collide and create the interesting event, whereas the rest of the ingredients of the proton just continue in the same direction down the beampipe.

However, unlike bean collisions, in which the beans simply collide and change direction, when protons bang into each other, the ingredients inside (quarks, antiquarks, and gluons) collide, and when they do, the original particles can convert into energy or other types of matter. And whereas at lower energies collisions involve primarily the three quarks that carry the proton’s charge, at higher energies virtual effects due to quantum mechanics create significant gluon and antiquark content, as we saw earlier in Chapter 6. The interesting collisions are those in which any of these subcomponents of the protons hit each other.

When the protons have high energy, so do the quarks, antiquarks, and gluons inside them. Nonetheless, that energy is never the entire energy of the proton. In general, it is a mere fraction of the total. So more often than not, quarks and gluons collide with too small a fraction of the proton’s energy to make heavy particles. Due possibly to a smaller interaction strength or to the heavier mass expected for new particles, interesting collisions involving as-yet-unseen particles or forces occur at a much lower rate than “boring” Standard Model collisions.

As with the beanbags, most of the collisions are therefore uninteresting. They involve either protons just glancing off each other or protons colliding to produce Standard Model events that we already know should be there and that won’t teach us much. On the other hand, predictions tell us that roughly one-billionth as often, the LHC might produce an exciting new particle such as the Higgs boson.

The upshot is that only in a small but lucky fraction of the time does the good stuff get made. That’s why we need so many collisions in the first place. Most of the events are nothing new. But a few rare events could be very special and informative.

It’s up to the triggers—the hardware and software designed to identify potentially interesting events—to ferret these out. One way to understand the enormity of this task (once you account for different possible channels) is to imagine a 150-megapixel camera (150 megapixels being roughly the amount of data from each bunch crossing) that snaps pictures at a rate of 40 million per second (the bunch crossing rate). This amounts to about a billion physics events per second, when you account for the 20 to 25 events expected to occur during each bunch crossing. The trigger would be the analog of the device responsible for keeping only the few interesting pictures. You might also think of the triggers as spam filters. Their job is to make sure that only interesting data make it to the experimenters’ computers.
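The numbers in the analogy multiply out as follows (both figures are from this paragraph):

    bunch_crossings_per_second = 40e6   # the "shutter rate" of the camera
    events_per_crossing = 25            # 20 to 25 events per crossing

    events_per_second = bunch_crossings_per_second * events_per_crossing
    print(f"~{events_per_second:.0e} physics events per second")  # ~1e+09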

The triggers need to identify the potentially interesting collisions and discard the ones that won’t contain anything new. The events themselves, meaning what leaves the interaction point and gets recorded in the detectors, must be sufficiently distinguishable from the usual Standard Model processes. Knowing when events look special tells us which ones to keep. This requirement makes the rate of readily recognizable new events smaller still. The triggers have a formidable task: they are responsible for winnowing down the billion events per second to the few hundred that have a chance of being interesting.

A combination of hardware and software “gates” accomplishes this mission. Each successive trigger level rejects most of the events it receives as uninteresting, leaving a far more manageable amount of data. These data in turn get analyzed by the computer systems at 160 academic institutions around the globe.

The first-level trigger is hardware based, built into the detectors, and does a gross pass at identifying distinctive features, such as events containing energetic muons or large transverse energy deposits in the calorimeters. While waiting a few microseconds for the result of the level-one trigger, the data from each bunch crossing are held in a buffer. The higher-level triggers are software based, with selection algorithms running on a large computer cluster near the detector. The first-level trigger reduces the billion-events-per-second rate to about 100,000 events per second, which the software triggers further reduce, by a factor of about a thousand, to a few hundred events per second.
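In outline, the cascade’s bookkeeping looks like the sketch below. The reduction factors are the ones just quoted; a real trigger menu, with its many channels and thresholds, is vastly more complex.

    rate = 1e9        # events per second entering the trigger system

    # Level-1 (hardware): keep roughly 1 event in 10,000.
    rate /= 10_000    # -> about 100,000 events per second

    # Higher-level (software): keep roughly 1 event in 1,000.
    rate /= 1_000     # -> about 100 events per second

    print(f"Events written out: ~{rate:.0f} per second")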

Each event that passes the trigger carries a huge amount of information, the readouts of the detector elements we just discussed, amounting to more than a megabyte. With a few hundred events per second, the experiments fill well over 100 megabytes of disk space per second. Over a year, that amounts to more than a petabyte, which is 10^15 bytes, or one quadrillion bytes (how often do you get to use that word?), the equivalent of hundreds of thousands of DVDs’ worth of information.
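The yearly figure also checks out. Assuming a few hundred megabytes per second and roughly 10^7 seconds of data taking per year (a common rule of thumb for accelerator uptime, not a figure from the text):

    bytes_per_second = 300e6   # a few hundred ~1 MB events per second
    seconds_per_year = 1e7     # rough annual running time (an assumption)

    yearly_bytes = bytes_per_second * seconds_per_year
    print(f"~{yearly_bytes / 1e15:.0f} petabytes per year")  # ~3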

Tim Berners-Lee first developed the World Wide Web to deal with CERN data and to let experimenters around the world share information in real time. The LHC Computing Grid is CERN’s next major computational advance. The Grid was launched late in 2008, after extensive software development, to help handle the enormous amounts of data that the experimenters intend to process. The CERN Grid uses both private fiber-optic cables and high-speed portions of the public Internet. It is so named because data aren’t associated with any single location but are instead distributed among computers around the world, much as the electricity in an urban area isn’t associated with one particular power plant.

Once the events that made it through the triggers are stored, they are distributed via the Grid all over the globe, where computer networks have ready access to the redundantly stored data. Whereas the web shares information, the Grid shares computational power and data storage among the many participating computers.

With the Grid, tiered computing centers process the data. Tier 0 is CERN’s central facility where the data get recorded and reprocessed from their raw form to one more suitable for physics analyses. High-bandwidth connections send the data to the dozen large national computing centers constituting Tier 1. Analysis groups can access these data if they choose to do so. Fiber-optic cables connect Tier 1 to the roughly 50 Tier 2 analysis centers located at universities, which have enough computing power to simulate physics processes and do some specific analyses. Finally, any university group can do Tier 3 analyses, where most of the real physics will ultimately be extracted.
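As a schematic only, the tier structure described above can be summarized like this (the site counts are the approximate ones given in the text):

    # The tiered structure of the LHC Computing Grid, as described above.
    tiers = {
        "Tier 0": "CERN: records raw data and reprocesses it for analysis",
        "Tier 1": "about a dozen national computing centers holding the data",
        "Tier 2": "roughly 50 university centers: simulation and analyses",
        "Tier 3": "any university group: most of the final physics extraction",
    }

    for tier, role in tiers.items():
        print(f"{tier}: {role}")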

At this point, experimenters anywhere can go through their data to sleuth out what the high-energy proton collisions might reveal. This can be something new and exciting. But in order to establish whether or not this is the case, the first task for the experiments—which we’ll explore further in the following chapter—is deducing what was there.

CHAPTER FOURTEEN

IDENTIFYING PARTICLES

The Standard Model of particle physics compactly categorizes our current understanding of elementary particles and their interactions (summarized in Figure 40).53 It includes particles like the up and down quarks and the electrons that sit at the core of familiar matter, but it also accommodates a number of other, heavier particles that interact through the same forces but are not commonly found in nature: particles that we can study carefully only at high-energy collider experiments. Most of the Standard Model’s ingredients, such as the particles the LHC is currently studying, remained thoroughly hidden until the clever experimental and theoretical insights that revealed them in the latter half of the twentieth century.
