Dancing with ISABELLE: Brookhaven's Last Accelerator

Discussion in 'Alternate History Discussion: After 1900' started by Workable Goblin, Jan 11, 2017.

  1. Threadmarks: Origins

    Workable Goblin Chronicler of the Pony Wars

    Joined:
    Aug 3, 2009
    Location:
    Canterlot
    The creation of the Atomic Energy Commission and the establishment of the national lab system beginning in the late 1940s marked the beginning of a new epoch in what would later be known as high-energy physics research. Henceforth, rather than being the province of itinerant dabblers mostly depending on private funding to construct devices in university laboratories, it would be shaped by massive government funding and massive government facilities, backdropped by massive government expectations. Previously, high-energy physicists had given the world the atomic bomb and the nuclear reactor, both of which were quickly becoming indispensable to the military; who knew what they might come up with next?

    For particle physicists, in particular, two of the AEC’s many national labs stood out. The first was E.O. Lawrence’s famous Radiation Laboratory at Berkeley, California, founded even before the war and the home of the cyclotron, the most powerful type of particle accelerator in the world in the 1930s and 1940s. Despite the foundation of Lawrence Livermore, a classified nuclear design shop similar to Los Alamos, the original Radiation Laboratory remained part of the AEC system, though with a focus on unclassified physics research instead of the nuclear-related work it performed during the war. The second was an entirely new laboratory named Brookhaven located near the town of Brookhaven on Long Island, on the site of a former Army base. Besides doubling the number of federally supported scientists and engineers, thereby creating additional jobs and expanding the pool of physics talent, having a second site would hopefully mean that the failures or foibles of one would not impact the ever-onwards advancement of science. Flawed or failed proposals could be critiqued by a competitor, while mutual competitive pressure would drive both Berkeley and Brookhaven to higher and presumably more productive heights of discovery. Despite the additional expense, then, building Brookhaven appealed to the AEC. It also appealed to many within the particle physics community, for whom a site in the northeast, relatively proximate to the many major research universities located in the region, would be far more convenient than distant California.

    For more than a decade this arrangement worked perfectly, with both labs alternately competing and cooperating to advance the state of the art ever farther. When the AEC was founded, in 1946, the highest energy accelerators in the world, at Berkeley, could reach about 700 MeV, or millions of electronvolts, a unit of energy commonly used in particle physics and equivalent to the energy gained by an electron moving through a one-volt potential. That is, Berkeley's machines could, in theory, raise an electron to the energy it would gain by passing through a 700,000,000 volt potential. Over the next fifteen years, however, a series of new machines and new machine designs with tongue-twisting names like the synchrocyclotron pushed the state of the art far beyond its 1946 level, culminating in 1960 with Brookhaven's Alternating Gradient Synchrotron, able to accelerate particles to 33 billion electronvolts, almost fifty times the energy of Lawrence's old cyclotron. Besides being a technological tour de force, the rapid advancement in machine capability had enabled a string of discoveries, unveiling new particles at a rate too fast for the theorists to keep up.

    By then, however, Berkeley was starting to run into significant trouble, for two reasons. First, they were beginning to literally run out of room. Already in the 1940s Lawrence's laboratory had relocated from the actual campus to the hills above in order to build ever-larger cyclotrons, and by the early 1960s the increasing size of state-of-the-art accelerators was causing them to run out of hilltop. In and of itself this problem was surmountable, for the wide-open Central Valley of California beckoned beyond the Coast Ranges where Berkeley lived, but the second problem was, ultimately, more significant. Unlike Brookhaven, Berkeley was disliked among the so-called "users," the physicists who actually designed and built experiments and as a result turned the machine from a mere money-sucking toy into a major scientific instrument. Rightly or wrongly, they viewed Berkeley as giving more credence to the needs of its own staff than those of visiting physicists, a phenomenon exacerbated by the failure of an insurgent group of Midwestern physicists to obtain AEC support for the construction of a 12 GeV machine at the University of Wisconsin and the resulting perception of the system as being dominated by the coasts at the expense of the growing middle.

    Perhaps these issues might have been worked out had the management of the AEC's laboratory system and particle accelerators remained as informal as it had been in the 1950s, let alone the 1940s, but with the rapid growth in machine energy came an equally rapid growth in machine costs. The agreement to split development between Brookhaven and Berkeley had only ever been a gentleman's agreement, and with the stakes so much higher than they had been before, gentlemen were in short supply. At first it seemed that Berkeley might win through in any case as Presidential commissions and expert panels supported their bid for the next big accelerator, but in 1965 the AEC for the first time ever asked the National Academy of Sciences for its opinion on the machine's location, and the next year opted to form a new laboratory near Chicago, the National Accelerator Laboratory, in order to host it. Berkeley never built another major accelerator.

    Brookhaven, meanwhile, was promised the next accelerator after NAL's, and in any case was not directly touched by the debate. After all, what did it matter if it was competing with California or Illinois? Some of the staff, however, saw that the elevation of the National Accelerator Laboratory, later and better known as Fermilab, set a dangerous precedent; if Berkeley could be dethroned, why not Brookhaven? And the director of NAL, Robert R. Wilson, was certainly the sort of person who might do just that, with a reputation for an autocratic and hard-charging leadership style. They organized, and pushed the laboratory's management board to form a committee, the so-called Fitch panel, to study the laboratory's accelerator program. At first, this committee merely elaborated on the details of the machine which would naturally succeed the one under construction at NAL under the old succession of designs, but soon enough other concerns--both technical and political--began to drive more blue-sky thinking. Not only were existing machine designs apparently approaching a limit in terms of their physics results, but Brookhaven was being torn apart by increasing disputes between its users and its machine builders, and an ambitious new machine seemed to be just the thing to draw them back together.

    Such a decision was facilitated by a series of major developments over the course of the 1960s that seemed to promise--and require--a revolution in accelerator design as significant as the development of the synchrotron or, before it, the cyclotron. The first was the trend, well-known by that time, of accelerators to increase in peak energy exponentially, similar to the Moore's Law of greater and more recent fame. Every few years a new accelerator would come on-line and increase achievable energy by a factor of ten. However, again much like Moore's Law, maintaining this rapid growth in energy required constant innovation in accelerator designs. Electrostatic generators and linear accelerators had been supplanted by cyclotrons; cyclotrons, by synchrotrons; synchrotrons, by...well, what? One option, under investigation at the time, was the particle collider, a device that would smash two beams of particles into each other, unlike the conventional particle accelerator that simply propelled a particle beam into a fixed target of some sort. Due to the peculiar rules of special relativity, a collision between two fast-moving particles, like the ones in a collider, would be many times more energetic than one between one fast-moving and one virtually stationary particle, allowing another big jump in peak energy. Thanks to this effect, a collider using two beams of relatively modest energy, say 200 GeV each, can produce far more energetic collisions than a conventional accelerator using a single, much more powerful beam.
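    For the numerically inclined, the arithmetic behind this advantage follows from the standard relativistic invariant for a two-particle system, and can be sketched in a few lines of Python (the 200 GeV figure is from the example above; everything else is textbook kinematics):

```python
import math

M_P = 0.938272  # proton rest mass-energy in GeV (modern value)

def ecm_collider(e_beam):
    """Center-of-mass energy for two identical beams colliding head-on:
    the beam energies simply add."""
    return 2.0 * e_beam

def ecm_fixed_target(e_beam):
    """Center-of-mass energy for a beam striking a stationary proton,
    from the invariant s = 2*m^2 + 2*E_beam*m (natural units)."""
    return math.sqrt(2.0 * M_P**2 + 2.0 * e_beam * M_P)

print(ecm_collider(200.0))      # two 200 GeV beams: 400 GeV available
print(ecm_fixed_target(200.0))  # 200 GeV beam on a fixed target: ~19 GeV

# Beam energy a fixed-target machine would need to match 400 GeV in the
# center of mass -- tens of thousands of GeV:
e_needed = (400.0**2 - 2.0 * M_P**2) / (2.0 * M_P)
print(e_needed)
```

    As the last number shows, matching a 400 GeV collider with a fixed-target machine would take a beam in the tens of TeV, which is why the fixed-target route was a dead end for this kind of physics.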

    However, while most people know that a higher energy accelerator is a better accelerator, prompted by news articles that trumpet particle energy, the truth is that energy is only half of the equation for physicists interested in an accelerator's performance. The other half is luminosity, the rate at which particles pass through a given area, which determines how many collisions--and therefore how much physics--actually occur. To a certain extent, energy and luminosity can trade off for each other, since higher energy generally makes it more likely for a given interaction to produce a particular physics outcome, while higher luminosity makes it more likely for an outcome rare at a given energy to occur at least once. Both energy and luminosity require expensive equipment to realize their potential, such as the sophisticated magnets and beam control systems needed to achieve high energies, or the complex detector designs needed to disentangle the flood of particles produced by frequent collisions. In the end, however, there is a minimum luminosity needed for any accelerator to be useful, no matter how high its energy, and a minimum energy needed for an accelerator to probe new physics questions, no matter its luminosity. At some point, either there would just not be enough energy in the particles to do anything new, or the rate of interactions would be too small to collect useful data in any reasonable amount of time.
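    The tradeoff is easiest to see through the basic bookkeeping relation of accelerator physics: event rate equals luminosity times cross-section. A minimal sketch in Python, using the luminosity figure that appears later in this post and a purely hypothetical cross-section:

```python
# Event rate = luminosity x cross-section. The luminosity is the figure
# quoted for ISABELLE; the cross-section is hypothetical, for illustration.
LUMINOSITY = 5e33        # cm^-2 s^-1
NB_TO_CM2 = 1e-33        # one nanobarn expressed in cm^2

sigma = 1.0 * NB_TO_CM2  # hypothetical process with a 1 nanobarn cross-section
rate = LUMINOSITY * sigma          # events per second
events_per_year = rate * 1e7       # ~1e7 seconds of beam in a "live" year

print(rate)              # a handful of events every second
print(events_per_year)   # tens of millions of events per year
```

    A process a million times rarer, at one femtobarn, would yield less than one event per day at the same luminosity, which is the sense in which luminosity sets a floor on what an accelerator can usefully study.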

    And this was the problem with colliders. While colliders offered a big step forwards in energy, their impact on luminosity--at least for protons--was far less beneficial. Right off the bat, the lower density of a particle beam compared to a solid target meant that luminosity would necessarily be lower in a collider than in a conventional accelerator, all else being equal. And all else was decidedly not equal, as colliders also had problems related to so-called injection, when particles were added to their beams, that could exacerbate beam instability, where beams would spread out and essentially evaporate over time. These issues might further lower luminosity so that a collider would produce practically no actual physics, instead being a mere exercise in engineering. Moreover, colliders had the additional problem of only allowing a few experiments to use the beam at a time, while also making it more difficult and costly to operate a program of experiments involving multiple kinds of particle. For all these reasons, accelerator physicists tended to view colliders as exciting challenges, while particle physicists tended to view them as expensive and potentially dangerous experiments. By the time Brookhaven was studying its next big accelerator, however, CERN had actually built a proton collider, the Intersecting Storage Rings, under the influence of its former Director General, Victor Weisskopf. Contrary to the fears of many physicists, the ISR showed that the problems of injection and beam stability could be solved, and was able to easily reach a satisfactory luminosity, although it was compromised in physics results due to what proved to be poor detector design. 
Hence, when Brookhaven began to consider its future options in the late 1960s, the idea of building a collider rather than a high-energy conventional accelerator seemed far more practical than it had earlier in the decade, and certainly a far better option to continue Blewett's Law of exponential accelerator energy growth.

    Augmenting this reasoning was the developing technology of superconducting magnets, which promised to further increase the potential energies of particle accelerators. While superconductivity had been discovered a half-century earlier, early superconducting materials--so-called Type I superconductors--could not tolerate significant magnetic fields or large currents before a small part of the material would lose its superconductivity, triggering a quench, the abrupt reversion of a superconductor to its normal state. The rapid increase in temperature resulting from the virtually instantaneous increase in electrical resistance can then trigger various undesirable events, including explosions from the boil-off of the cryogenic coolant needed to maintain superconducting temperatures. Unfortunately, the magnets of a particle accelerator need to be able to tolerate both significant magnetic fields and powerful currents, so superconductors were not, at the time, capable of use in this role. In the 1950s, however, a new material, a compound of niobium and tin (Nb3Sn), was discovered to exhibit superconductivity, perhaps unsurprisingly as both niobium and tin are superconductors on their own. Far more surprising was the discovery in 1961 that niobium-tin could support large magnetic fields and high current densities, unlike every other known superconductor, and the development by RCA in 1962 of methods for fabricating conducting ribbons of niobium-tin in commercial quantities.

    Together, these led to a rapid increase in research into superconducting magnets at the national labs, now that such magnets seemed technically and even commercially viable. Although such magnets had many possible applications, including fusion reactors, particle accelerators made particularly strenuous demands on them in order to bend and focus their beams, and superconducting magnets offered a particularly attractive route to increased performance. A very considerable amount of work had been carried out at Brookhaven during the 1960s in developing superconducting magnets, with an increasing focus later in the decade on the particular demands of accelerator magnets, which must be compact and able to rapidly and precisely raise and lower their magnetic field in time with the beam while at the same time resisting major stresses caused by those fields themselves. By 1968 Brookhaven had made a number of breakthroughs in this area, in particular coming up with a method of distributing the conductors around the central bore of an accelerator to generate the proper field inside with as little as possible "wasted" on fields outside the accelerator itself, later called the "cosine" method (since the current in the conductors would be proportional to the cosine of their angle from the horizontal plane). Still, their magnets required a great deal of further research before they could possibly be used in an accelerator, although the laboratory's Accelerator Department was confident the remaining problems could be worked out in time for the next machine.
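    The "cosine" idea is easy to demonstrate numerically: place line currents around a circular bore, each carrying a current proportional to the cosine of its angle, and the field inside comes out uniform. A small Python sketch (the bore radius, filament count, and peak current are invented for illustration, not Brookhaven's actual figures):

```python
import math

MU0 = 4e-7 * math.pi           # permeability of free space
R, N, I0 = 0.05, 360, 1000.0   # 5 cm bore, 360 filaments, 1 kA peak (illustrative)

def field(x, y):
    """Sum the 2-D fields of N line currents spaced evenly on a circle of
    radius R, each carrying I0 * cos(theta) -- the 'cosine' distribution."""
    bx = by = 0.0
    for k in range(N):
        th = 2.0 * math.pi * k / N
        cur = I0 * math.cos(th)
        dx, dy = x - R * math.cos(th), y - R * math.sin(th)
        r2 = dx * dx + dy * dy
        pre = MU0 * cur / (2.0 * math.pi * r2)
        bx += -pre * dy   # the field of a line current circles around it
        by += pre * dx
    return bx, by

# The field is (very nearly) the same vector everywhere inside the bore:
print(field(0.0, 0.0))
print(field(0.02, 0.01))
```

    The off-center sample agrees with the center to many decimal places: essentially all of the current's effort goes into a single uniform dipole field inside the bore, which is exactly the efficiency the Brookhaven group was after.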

    That year, however, Peter Smith of the Rutherford Laboratory in Britain dropped a bombshell when he announced that his group had discovered a method of completely eliminating two of the most persistent problems in the superconducting magnet business, flux jumping and AC losses, both caused by currents induced in the superconductor by changing magnetic fields, and both involving the discharge of heat into the superconducting material. If the superconductor was unable to shed this heat quickly enough, it could quench, losing its superconductive properties. By using extremely fine conducting filaments, so that the eddy currents involved in flux jumping simply could not deposit much energy, and by twisting the wires to minimize the effective area exposed to the induced fields involved in AC losses, both problems could be virtually eliminated. The only problem was how to use such fine filaments to form a magnet, and on this matter the two sides of the Atlantic quickly developed different approaches. The Rutherford group proposed weaving a kind of superconducting cloth from the filaments, similar in structure to a hair braid. In theory, this approach had several advantages; the woven structure, for example, would firmly interlock the filaments without external intervention, lending the conductor a certain degree of mechanical strength, while it would allow simpler construction of certain kinds of magnet. In particular, the cosine magnet designed by Brookhaven would only need a single layer of conductor with a Rutherford weave, thereby making it simpler and potentially easier to mass-produce while protecting against quenches, always a serious threat to superconducting magnets. Rutherford began experimenting with a series of prototype magnets intended for CERN or even American accelerators, which seemed to bear out the advantages of the woven layout in practice.

    Brookhaven, however, was not fully convinced. In the words of its then-director of superconducting magnet development, William Sampson, "When I saw the Rutherford weave I thought, 'That's a very complex design. What happens if one of the filaments quenches and shorts? What happens if a filament breaks in the weaving process?' In the Accelerator Department [the group responsible for designing and building the laboratory's accelerators], we liked to stick to Thoreau's words: 'Simplify, simplify.' And the Rutherford weave didn't seem very simple." Brookhaven's designers instead plumped for a cable, twisting a small number of filaments together to form larger strands, which were then squeezed together to form a rectangular cable that could be stacked for magnet construction. This seemed mechanically simpler and easier to manufacture than Rutherford's weave, and made more efficient use of space while maintaining a near-optimum ratio of superconductor to supporting copper used to further suppress flux jumping. However, the cable required careful production to ensure that the strands would hold together, and would require a more complex multi-layer design for cosine-type magnets, potentially harming their manufacturability and hence the ability of the Department to deliver the number of magnets needed for a new accelerator. The Department, however, thought that it could develop manufacturing techniques that would allow efficient production, and that the greater simplicity of the conductor could compensate to some extent for the greater complexity of the magnet.[1]

    Thus, by 1971 the technology needed for the next generation of superconducting, high-energy particle colliders was in place, or at least under development. By itself, however, this would not have been enough to encourage Brookhaven's management to pursue this new technology had physicists not developed a hunger for precisely the type of high-energy, high-luminosity accelerator that these technological advances promised. Besides the perennial interest in what might be uncovered with more events at higher energies, there was a specific target that was drawing the attention of particle physicists at that time: the W boson, believed to be the "carrier" of the weak interaction in the same way that the photon is the carrier of the electromagnetic interaction or the gluon the carrier of the strong interaction. The W boson had originally been proposed in the early 1960s to resolve inconsistencies in the theory of weak interactions at higher energies, but a number of ingenious searches had puzzlingly failed to turn up any trace of the W at all, a problem explained in 1966 by a quartet of Japanese and American physicists as being due to previously unanticipated corrective factors that greatly reduced the probability of producing the W compared to previous assessments. However, their work, and the work of physicists investigating the collision of high-energy electrons with protons, showed by 1971 that an accelerator able to produce a center-of-mass energy of 400 GeV and a luminosity of 5*10^33 cm^-2 s^-1 or greater, about ten times that of the Intersecting Storage Rings at CERN, would be all but certain to either find the W or rule out its existence, in either case making a great discovery.

    Such an accelerator could not be practically built as a fixed-target machine due to the energy requirement[2], nor could it be built using conventional magnets due to the luminosity target. Only a superconducting collider could meet these design goals, and by late 1971 members of the Fitch panel were already openly discussing what a collider intended to meet these parameters would look like. In their final report, delivered in early 1972, they explicitly called for the construction of a W machine--called ISABELLE, for Intersecting Storage Accelerator + "Belle," beauty--as the next major goal for Brookhaven, a goal which was quickly embraced by the laboratory's management and made the centerpiece of the laboratory's plan for the next decade.

    [1]: This is the PoD. In reality, the approaches were precisely the opposite of what I have indicated; Rutherford favored a cable, whereas Brookhaven favored a braid. This would have serious ramifications later...

    The Sampson quote is fictional, as you might have guessed from the above, but he was a real person who really was a director of superconducting magnet development at Brookhaven in 1971 (the structure of the laboratory was complex), and who really did express a preference for simplicity when it came to superconducting magnets through the words of Thoreau.

    [2]: Achieving the same center-of-mass energy with a fixed target would require a beam of nearly 100 TeV, far beyond what anyone at the time--or today, for that matter!--could possibly have generated.

    ---​

    Wow, this has been a long time coming! I first had the idea for this timeline way back in 2012, at the same time I first learned about ISABELLE, but for obvious reasons (aka Eyes Turned Skywards) never got around to actually writing it until last year, during my long break from the forums. Maybe it's just because I'm a particle physicist myself, but the ISABELLE project struck a chord with me when I heard about it, and it seemed so very ripe for some AH love...

    Many thanks to e of pi and Asnys for reading and commenting on earlier drafts; any errors remaining (or questionable decisions made) are, of course, my own. I hope you all enjoy this little timeline very much.
     
  2. wizz33 Well-Known Member

    Joined:
    Apr 30, 2009
    nice a science TL
     
  3. Threadmarks: Development

    Workable Goblin Chronicler of the Pony Wars

    Joined:
    Aug 3, 2009
    Location:
    Canterlot
    Building ISABELLE, however, would take more than the support of Brookhaven management. While in the 1950s or even the early 1960s this might have been enough for the Atomic Energy Commission, or AEC, to allocate the necessary funds, by the early 1970s the AEC, and American basic research in general, was facing an increasing budget crunch as the Vietnam War and Great Society drained off tax dollars and the first signs of the economic malaise of the latter half of the decade began to appear. At the same time ISABELLE itself promised to be far costlier than previous accelerators, with an estimated price tag of $100 million, or over half a billion in today's dollars. As always when costs soar and budgets shrink, the informal management methods of the past were giving way to a codified bureaucracy, in this case symbolized and spearheaded by the formation of the High Energy Physics Advisory Panel, or HEPAP, in 1967, just as the National Accelerator Laboratory was breaking ground. Staffed by physicists instead of managers, HEPAP had effectively taken control of particle physics construction projects by 1972, meaning that ISABELLE would need to justify itself before HEPAP in order to receive approval. Therefore, the Fitch panel's recommendation in 1972 was merely one step in the gestation of ISABELLE, not the beginning of an actual program. The technical details that had been sketched by the Fitch panel needed to be fleshed out through concentrated study and dedicated engineering analysis into an actual proposal, while the user community, which would be responsible for turning the machine into a productive source of new physics, needed to be brought on board with the idea.

    Here, the laboratory's management had two major advantages. First, Brookhaven, unlike Berkeley, had always had a friendly relationship with its user community, so that they were predisposed to look favorably on new Brookhaven proposals. Indeed, the laboratory was so friendly to users that its Accelerator Department, responsible for actually building and maintaining the machines, had been feeling put-upon, driven to achieve too much too quickly with too little so that the laboratory could satisfy physicists both on and off Long Island. Secondly, and perhaps more importantly, the entire concept of the machine had been driven by the users themselves. ISABELLE had not been conceived by the Accelerator Department as an interesting exercise in machine development, or by the laboratory as a means of preserving itself, but in order to tackle a specific set of scientific questions, and as a result there was little need to persuade those who were asking those questions that it might be useful to them.

    Nevertheless, refinement was necessary. Almost immediately, it became clear that the initial ISABELLE plan, using a "racetrack"-style accelerator tunnel that would leave two areas clear for the construction of detectors, was badly flawed in the new era of colliding beams. Whereas previous machines had directed beams of protons into fixed targets, which could easily enough be swapped out for different experiments, the new colliders would require their detectors to surround the beam, located where the accelerator's magnets would send packets of protons smashing into each other at high speed, making them far more complex to install and operate. In fact, it would be necessary to integrate the design of detectors, or at least locations where they could be installed, directly into the design of the accelerator itself, something unnecessary with the fixed-target machines.

    At the same time, physicists were beginning to appreciate that a whole range of phenomena was taking place in a previously neglected regime, that of the so-called "high-pT" particles. These were particles that were generated with a relatively large speed perpendicular to the direction of whatever particle beams were being used, so that they flew away from the collision at relatively large angles. In the jargon of high-energy physics, they had a large "transverse momentum" or, when simplified into mathematical symbols, "pT," which meant that they had to have originated from a high-energy collision between two of the sub-particles making up the protons being collided, or "partons" in the cautious and inclusive language used to avoid mentioning Murray Gell-Mann's quarks. Other types of collisions and interactions would simply generate a spray of particles aimed mostly along the beam rather than throwing them off nearly perpendicularly. This, in turn, meant that these perpendicularly-moving particles carried the signature of the most energetic collisions, the ones where the particles approached most closely to each other and interacted most strongly. If new physics were to be found, the high-pT regime was certainly the place to look.
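    Transverse momentum itself is nothing exotic: it is simply the component of a particle's momentum perpendicular to the beam axis. In a few lines of Python (the momenta and angles are illustrative numbers, not data from any experiment):

```python
import math

def p_t(px, py, pz):
    """Transverse momentum: the momentum component perpendicular to the
    beam, which is conventionally taken along the z axis."""
    return math.hypot(px, py)

# A 10 GeV/c particle emerging at 60 degrees to the beam carries most of
# its momentum transversely...
p, theta = 10.0, math.radians(60.0)
print(p_t(p * math.sin(theta), 0.0, p * math.cos(theta)))  # ~8.66 GeV/c

# ...while one at 5 degrees, typical of the forward spray, carries little.
theta_fwd = math.radians(5.0)
print(p_t(p * math.sin(theta_fwd), 0.0, p * math.cos(theta_fwd)))  # ~0.87 GeV/c
```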

    Together, this meant that providing space for two detectors, originally a bold and daring step, now seemed laughably inadequate. A detector for the W--one for low-pT physics--another for high-pT physics--perhaps one for slamming electrons into the proton beam--and more and more and more now graced the desks of users, who almost as quickly as they could think of them inquired whether Brookhaven could make them work. To have even a chance of doing so, the racetrack design had to be abandoned in favor, it turned out, of a simple polygon, a circle distorted to possess short straight segments suitable for the installation of detectors. Designers experimented with providing four and eight experiment areas before finally settling on six, four for firmly planned experiments and two others for possible future experiments, determining what turned out to be the collider's ultimate footprint (barring one, albeit significant, later modification).

    More important than determining the physical layout of the machine, though, was the program to develop its magnets. From the very beginning, it was obvious that the magnets would make or break ISABELLE. If they could generate adequate fields sufficiently reliably at an acceptably low cost in time and money, then the machine would be successful. If not...

    Here, the internal divisions of the laboratory began to rear their ugly heads. Magnet development was, after all, the province of the Accelerator Department, which the users--the Physics Department, and those outside of Brookhaven altogether--perceived as an unresponsive, slow-moving bureaucratic behemoth, while the Accelerator physicists felt that they were underappreciated and overworked, having to not only maintain the machines and keep them running from day to day, but also carry out long-range research and development to build new, better machines, to discover methods of squeezing just a bit more luminosity or energy out of the existing devices, and to satisfy the constant demands of the operator-physicists, who frequently seemed to see accelerator physicists as little more than animate piñatas to attack until physics came out.

    As ISABELLE needed to begin turning from paper research into physical development, this conflict was further exacerbated by the interest the leaders of the Accelerator Department had in building the National Synchrotron Light Source, or NSLS, which the particle physicists believed would suck attention and funding away from their project. The particle physicists brought their influence over the lab's administration to bear, triggering a roiling conflict between the Accelerator Department and the administration. The head of the Department through most of the 1960s, Kenneth Green, had already been forced to step down in 1970, after problems emerged with upgrades to the Cosmotron; now his replacement, Frederick Mills, was forced to resign, while his right-hand man in advanced R&D, John Blewett (of Blewett's Law), was reassigned to a nominally more senior position outside of day-to-day research and development duties. Mills' replacement, Harald Hahn, began receiving threatening phone calls, and the entire department was, understandably, caught in a sense of malaise and uncertainty mirroring that of the country at large. Many of the accelerator physicists, connecting the political difficulties of the department to ISABELLE, wanted nothing to do with that program, instead focusing on other projects and experiments.

    Fortunately for ISABELLE, William Sampson's magnet group was relatively unaffected by this turmoil, and was able to move from the experimental magnet development it had been carrying out in the 1960s to prototyping magnets for ISABELLE itself. Though well short of what would be needed to actually build the accelerator, these new "ISA" magnets--for Intersecting Storage Accelerator--still represented a jump from older, shoebox-sized experiments to meter-sized prototypes, and began to map new and uncharted territory in magnet assembly. The small number of magnets previously built had allowed a manufacturing process that was not much more sophisticated than handcrafting, with each and every magnet practically a unique piece of art. Although the prototypes would be constructed in equally small numbers, ISABELLE itself would require hundreds of magnets, far too many for anything other than a dedicated industrial plant to produce in any reasonable amount of time, and questions of manufacturability were beginning to bubble up towards the surface.

    In particular, there was the question of magnet motion to contend with. The strength of the magnetic fields involved in a large accelerator could cause the magnets themselves to begin flexing and moving, which would be disastrous while the machine was active. The subtle changes in the field caused by such movements could cause the delicately controlled particle beams to smash against the walls of the vacuum tube, potentially damaging it and the magnets beyond, to say nothing of the degradation of the beam itself and, by extension, of the collision rate, or the fact that the heat dissipated by mechanical flexing could cause a quench. The coil inside the magnet needed to be supported by a tight and firm grip, which at the same time had to transmit the stresses on the magnet evenly to the case enclosing it, to prevent concentration of forces and excessive mechanical stress. Whatever method was used to provide this firm, tight grip also needed to be easily and precisely repeatable, so that it could be incorporated into assembly-line manufacturing procedures with as little difficulty and fuss as possible.

    The intuitive method of binding the two while meeting these goals was heat-shrinking, or more precisely in this case heat-expanding. The coil--that is, the superconducting portion of the magnet--was cooled to cryogenic temperatures in a liquid nitrogen bath, shrinking it, then rapidly inserted into the casing, which was kept at room temperature. Normally the coil would have been slightly too large to fit in the casing, but as a result of its cryogenic cooling it just barely fit. Once it was allowed to warm back up, it would attempt to expand back to full size, which, in theory, would guarantee a tight, heavily stressed, and uniform fit. When the magnets were later cooled back down to cryogenic temperatures to operate, the casing would be cooled along with the coil, maintaining the tight fit.
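    To give a sense of the scales involved, the arithmetic behind such a shrink fit can be sketched in a few lines. All of the numbers below--the coil diameter, the roughly 0.3% integrated contraction (a typical figure for copper-dominated composites cooled to liquid nitrogen temperature), and the interference fit--are illustrative assumptions, not ISABELLE design values.

```python
# Illustrative shrink-fit arithmetic for the heat-expanding assembly step.
# All dimensions and material figures are assumptions for illustration only.

def diametral_shrinkage(diameter_mm: float, contraction_fraction: float) -> float:
    """Reduction in diameter when the coil is cooled (linear contraction)."""
    return diameter_mm * contraction_fraction

coil_od_mm = 120.0    # assumed coil outer diameter
contraction = 0.003   # ~0.3% integrated contraction, room temperature -> 77 K

shrink = diametral_shrinkage(coil_od_mm, contraction)
print(f"Coil shrinks by about {shrink:.2f} mm in diameter")  # ~0.36 mm

# The room-temperature interference must be smaller than this shrinkage
# for the cold coil to slip into the warm casing at all:
interference_mm = 0.2  # assumed design interference fit
clearance_when_cold = shrink - interference_mm
print(f"Clearance during insertion: {clearance_when_cold:.2f} mm")  # ~0.16 mm
```

    A fraction of a millimeter of working clearance, vanishing as the coil warms, is exactly why the casing had to be bolted shut so quickly--and so uncontrollably.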

    The first several ISA magnets were assembled in this fashion, and seemed to work well, but knowing that this was an engineering job, not a physics one, the magnet group began about this time to bring in engineers as well as physicists, and set them to evaluate the assembly methods currently in use. They pointed out that the process involved a rapid and uncontrolled bolting process to fix the casing in place around the coil after it was pulled out of the cryogenic bath, introducing a host of new variables that completely undermined the goal of developing a uniform and precise bonding method. Additionally, the magnet group had simply assumed that the heat-shrinking method was necessary to produce a uniform stress on the coil, without actually testing whether this was correct or whether a simpler method might work just as well. To rectify this oversight, the engineers pushed the magnet group to build several copies of the next generation of ISA magnet, the ISA IV, testing different methods to bind the casings together.[1]

    Unfortunately, this goal collided with further budget limits at the Energy Research and Development Administration, or ERDA, which had succeeded the AEC as the agency responsible for managing the national laboratory system. These budgetary problems were transmitted, as usual, to Brookhaven, which in turn put serious pressure on Sampson's magnet group to trim the cost of its prototyping program, in particular by canceling the construction of redundant magnets. Many of the physicists in the group were not at all displeased by the prospect of the ISA IV copies being canceled. Intellectually, they might have known that it was important to think about how to build hundreds of powerful magnets quickly and efficiently from the beginning; emotionally, however, they couldn't shake the feeling that their job was to figure out how to meet the field strength and switching rates needed for the accelerator to work, and that performing an engineering experiment was just a costly, time-consuming distraction from this work.

    Fortunately, both for the engineers and, as it proved, ISABELLE itself, Sampson did not share these feelings. Instead, in his words, "If we were going down a blind path, I wanted to know it now, when we were working on only one or two magnets, not when we were working with Westinghouse or General Electric or whoever to build five hundred or a thousand magnets. So I pushed for the engineering test program."[2] With Sampson's support, an assembly test was retained, but trimmed to involve only one alternate magnet assembly method, rather than the range of methods that the engineers had originally proposed. Additionally, a second magnet program at the laboratory, focused more on short-term needs than the long-range R&D needed for ISABELLE, was essentially gutted, with the development programs of this “Danby group” reassigned in their entirety to Sampson. This reflected a larger struggle for budgets among the national labs, now fighting to wrest whatever scraps they could from ERDA; Danby's magnets were approximately the same size as those needed for ISABELLE, and could plausibly be presented as showing that the laboratory was entirely capable of constructing that machine, though they differed in many important details.

    This boost was, perhaps, not necessary, as support for ISABELLE remained strong in the physics community. With its relatively high energy and high luminosity, there would be nothing like it in the world if it were built, always a powerful argument in favor of a new machine, and one sufficient to keep ISABELLE near the top of HEPAP's priority lists. This high-priority status was only further magnified by the virtually simultaneous discovery of the J/ψ at Brookhaven and the Stanford Linear Accelerator Center, SLAC, in 1974, a sign interpreted as showing that exciting new physics awaited any new machine. This priority was also reflected at Brookhaven, where ISABELLE had been reconstituted as a separate division of the Accelerator Department, a further bureaucratic sign of the accelerator's progress.

    In parallel with these positive developments was another of considerably more import to the success of the project, as the ISA IV/A and IV/B magnets were completed in June and July of 1974, respectively, and underwent the usual battery of magnet tests. The /A magnet was constructed using the conventional heat-shrinking method; the /B was instead built at room temperature, with the casing split into several sections and attached together very tightly by bolts. Testing revealed that, contrary to the fears of some within the magnet group, the /B magnet had internal stresses that were just as uniformly distributed as in the /A magnet, while completely eliminating any need to cool and shrink the coil during assembly.[3] Buoyed by this discovery, the magnet group moved forwards, completing the ISA V in 1975. Besides the various modifications and tweaks made during the ISA program as a whole, and the new IV/B assembly method, the ISA V would include one further modification that later proved to be of considerable importance. Earlier ISA magnets had used multiple layers of cable to produce their magnetic fields, in an effort to cancel undesirable multipoles, higher-order terms in the mathematical description of the magnet field. One of the researchers in the magnet group had realized that wedges, gaps in the superconducting wiring, could serve the same purpose, and had developed a magnet design that required one fewer layer of conductor, making the magnet cheaper and simpler to construct. The magnet group anticipated great things from this magnet as they began to think about building full-scale prototypes.

    What they got was beyond even those lofty expectations. Normally, superconducting magnets are limited by what is called the short sample limit, essentially the current-carrying capacity of a very small piece of superconducting material. Naturally, a real magnet, with much more material and many more flaws and imperfections, will not be able to carry quite so much current, and will as a result quench at a lower field strength. Usually, in fact, magnets will quench at considerably lower field strengths and have to train, or undergo a series of quenches, to come even close to the short sample limit. More training, of course, interferes with magnet manufacturability, as it means more time must be spent on each magnet to make it reach its performance goals. ISA V had a short sample limit of 5.2 tesla, comfortably above the 4 tesla needed for ISABELLE. Previous ISA magnets with similar short sample limits had quenched for the first time at 3-4 tesla, and required training to boost them safely above the 4 tesla requirement. Not the V. When it was tested, it kept going, and going, and going, and going, far enough that the testers were afraid that there was a short in their machine, before suddenly quenching--at 4.5 tesla. With a field more than half a tesla greater than needed at first quench, the magnet group knew they had found something special, all the more so when it reached 5 tesla after just a few rounds of training. As Jim Sanford, the recently appointed head of the ISABELLE division, explained later, "We knew we had it in the bag."[4]
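    A quick way to see why the testers were so excited is to express each first quench as a fraction of the short sample limit, using the figures quoted above (the comparison itself is just illustrative arithmetic, not a quoted analysis):

```python
# How close each first quench came to the short sample limit,
# using the field values quoted in the text above.

short_sample_t = 5.2  # ISA V's short sample limit, in tesla

def fraction_of_limit(first_quench_t: float) -> float:
    """Fraction of the short sample limit reached on the very first quench."""
    return first_quench_t / short_sample_t

for label, field_t in [("typical earlier ISA (low end)", 3.0),
                       ("typical earlier ISA (high end)", 4.0),
                       ("ISA V", 4.5)]:
    print(f"{label}: {fraction_of_limit(field_t):.0%} of short sample limit")
# Earlier magnets: roughly 58-77% on first quench; ISA V: about 87%.
```

    Reaching nearly ninety percent of the short sample limit on a first quench, with essentially no training, was the kind of margin that turned a research program into a buildable machine.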

    Almost simultaneously with the dramatic success of ISA V's first quench, HEPAP--the High Energy Physics Advisory Panel, created to provide scientific oversight of the nation's particle physics program--was meeting in Woods Hole, Massachusetts, to consider the effects of the recent discovery of the so-called J/ψ particle at SLAC and Brookhaven on the research program they had just developed the previous year. ISABELLE was, as it had been at the last several meetings, an important topic on the agenda, along with SLAC's PEP electron-positron collider, the teraelectronvolt-scale Energy Doubler superconducting proton storage ring proposed by Robert Wilson at the National Accelerator Laboratory, recently renamed Fermilab after the Italian-American physicist Enrico Fermi, and a range of other, more minor projects.

    With ISA V's successful first quench test coming just a few weeks before the beginning of the Woods Hole meeting, Brookhaven's management were optimistic that HEPAP would recommend an immediate start to the program. After all, the magnets were performing beyond expectation, the accelerator design was finalized, and the laboratory had, at last, been unified behind the project with promises that both ISABELLE and NSLS would proceed forward; what else needed to be done? Plenty, it turned out, at least in the eyes of the outside physicists on HEPAP. To begin with, the ISA series had been small-scale prototypes, not full-scale pre-production models; perhaps ISA V's success could not be replicated at production scale, and the project would be doomed to failure. If it could, on the other hand, ISABELLE's physical parameters--a mere 400 gigaelectronvolts, or GeV, of energy in the collision, and a luminosity of the order of 10^33 cm^-2 s^-1--could be significantly improved, producing a machine far more capable of long-term development. A bevy of other minor questions also bothered the panel. On the whole, the conference judged, it was not quite time to start building ISABELLE, though it was the second-highest priority project behind SLAC's PEP electron-positron collider. Instead, they recommended, a so-called half-cell--half of one cell, that is, one of the identical strings of magnets that fit together to form the full machine--should be built and tested. If this full-scale prototype worked as well as ISA V had, then the accelerator should be redesigned to take full advantage of the stronger field available, and would then certainly be approved by the next HEPAP Woods Hole meeting, scheduled for 1977.

    Although disappointed, Brookhaven recognized that HEPAP's recommendations consisted of steps that the laboratory would need to take in any case, even if it had been given the green light. Besides, not being able to start construction of the accelerator tunnel and associated facilities, the most significant effect of lacking HEPAP's approval, could even be a blessing in disguise if the stronger than expected magnets permitted a more powerful facility. After all, such a facility would likely have a different physical configuration--a larger accelerator ring, for one--than the current ISABELLE design, and therefore require different construction work. And if they did demonstrate the half-cell successfully, they would have a strong basis for appealing to Congress for a new project start even before the next Woods Hole meeting, since in that case they would, in a sense, have been "pre-approved" by HEPAP.

    Nevertheless, having to engage in research and development for another two years carried its own risks, most prominently those posed by their rival laboratory, Fermilab. Despite being in early development, Fermilab's Energy Doubler had emerged as an aggressive competitor for whatever research dollars were available from ERDA. More importantly, there were a number of voices suggesting that it could be used as a collider considerably more powerful and capable than ISABELLE, whether in conjunction with Fermilab's Main Ring accelerator or with other new, dedicated facilities. Any delay might give these voices time to organize and grow in strength, perhaps to dominate HEPAP or even the House Science and Technology Committee, where New York's Congressional delegation had long provided Brookhaven with a bastion of support.

    For the moment, though, all Brookhaven could do was to forge ahead with ISABELLE and strive to overcome Fermilab with success. To begin with, there was clearly the need to begin testing full-size magnets, which the magnet group had already begun work on after ISA V's successful first quench test. Just like the ISA series, this "Mark" series would represent a major leap in size and capabilities; whereas the ISA magnets had been only a meter or so long, the Mark series magnets were the size of a large car, and much denser, tipping the scales at nearly ten tons each. Mark I was delivered in early 1976, after several months of construction work, and was a disappointment. At first quench it was still a hair short of the 4 tesla goal, and it only reached ISA V's initial 4.5 tesla after more than two dozen quenches, an unacceptable number when nearly a thousand magnets needed to be manufactured, tested, and installed in the span of two or three years. Fortunately, investigation revealed a number of easily corrected design flaws, and subsequent Mark magnets were more successful; Mark II first quenched at 4 tesla and reached 4.5 after just over a dozen quenches, Mark III at 3.8, though it reached 4.5 after ten quenches, and Mark IV at 4.2, reaching 5 tesla after just eight quenches. Although the Mark series were still not production magnets, they were the closest to that status yet, and they seemed to have fully resolved the "magnet question" by the beginning of 1977.[5]

    With the success of the Mark IV magnet in October 1976 had come another change of direction for the magnet group, this time in two complementary but different directions. Although part of the group continued working on the Mark series, hoping to uncover further possible improvements to the magnet design, the bulk of the group began working on what they called the "P" series, pre-production magnets that, unlike the Mark series, would be suitable for accelerator use and could be used to construct the half-cell. In parallel, for the first time Brookhaven contracted outside firms such as Westinghouse to deliver their own pre-production magnets, looking ahead to the major production contract that would come once the project itself was approved. These outside magnets, grouped together as the "O" series, would be compared with Brookhaven's own P series magnets as a performance baseline.

    By the time the 1977 HEPAP Woods Hole meeting got underway in June, the three P magnets needed for the half-cell--two dipole magnets used for bending and containing the beam, and a smaller but more complex quadrupole used for focusing it--had been completed, assembled, and tested at 4, 4.5, and 5 tesla, in all cases successfully containing and steering a beam of protons. At the same time, the first O magnets had been delivered, and, in quench testing, seemed to stack up well against the P magnets. Further testing was underway to determine which contractor had done the best job, but the consensus at Brookhaven was that all of the major questions around ISABELLE had been addressed and it was time to start building.

    That proved to be the consensus at Woods Hole as well, although it took some time for this to become clear. Fermilab, for its part, was wedded to its Energy Doubler and aggressively pushing for a funding guarantee despite being ranked third in priority at the 1975 Woods Hole meeting, behind ISABELLE. Robert Wilson, Fermilab's director, spoke passionately and persuasively about the danger of choosing the safe, technically sound path in building new accelerators, pointing out that doing so meant each machine would be too expensive, take too long to build, and be underpowered for whatever new physics awaited at the energy frontier. Of course, he had an ulterior motive; Fermilab was pushing not only for the Energy Doubler but for a complementary project intended to add a new accelerator tunnel and new accelerator hardware to enable proton-proton and proton-electron collisions. Fermilab scientists and engineers claimed that this new accelerator could be built more quickly and cheaply than ISABELLE, while achieving far higher center-of-mass energies than Brookhaven's accelerator. As this scheme was little more than paper it was naturally far riskier than ISABELLE, with its operating hardware, and it was therefore in Wilson's interest to make the riskier path seem paradoxically safer.

    Whatever the rationale, Fermilab's campaign proved to be more of a last-ditch effort to secure as much ERDA funding as possible in a time of highly constrained budgets than a powerful and overwhelming offensive. Brookhaven already commanded broad community support, stretching from the universities of the northeast all the way to California, from which the legendary James Bjorken, an important theorist and co-author of an influential textbook, made a pilgrimage to argue in favor of ISABELLE in front of the Woods Hole panel, and had strong Congressional backing on the House Science and Technology Committee. Moreover, they had a trump card against Fermilab's new collider, one that cut to the very heart of their "quicker, cheaper, better" argument. The maximum energy of a circular proton collider is set by its ability to bend the beams of protons into a near-circular path. Mathematically, this capability depends on just two factors: the size of the collider, and the strength of its dipole magnets; increase either, and the maximum energy rises. With the field strength demonstrated by the latest P series magnets, the energy in the beams could be increased from 200 to 400 GeV with a slight increase in the collider's size, increasing the cost by $60 million, equivalent to about $250 million today and roughly a one-third increase in total cost, but with no impact on the planned five year construction time. Just like the baseline design, this ISABELLE would also be done by late 1982.

    Fermilab's collider, by contrast, would require using the Energy Doubler as an injector, that is, as a source of high-energy protons, and so could not be operational any earlier than that facility, and likely later as civil construction, magnet assembly, and other steps were deferred for the more immediate project. Moreover, it would require magnets even stronger than the P series, with central fields of 6 tesla or more, all while Fermilab was struggling to successfully develop the 4.2 tesla magnets needed for the Energy Doubler. Finally, many key steps that Brookhaven had completed--from major project milestones like testing full-scale machine prototypes to minor obstacles like completing environmental impact statements--were still awaiting Fermilab. Thus, the revised ISABELLE would not only nearly match Fermilab's proposal in terms of energy, it would do so at a much lower cost and on a much faster schedule. Although mindful of Wilson's admonition to remember that they were operating on the forefront of science, not building bridges, HEPAP considered that the effort Brookhaven had made over the past five years to push the state of the art in superconducting magnets was proof enough that ISABELLE was a forefront machine, and, thanks to that effort, one with relatively contained and well-understood technical risks. They recommended immediate approval of ISABELLE, and Congress obligingly accommodated them by passing the necessary funding bills by the end of the year. ISABELLE was finally on its way.

    [1]: The method described was in fact used to build early ISA magnets, but was later replaced with the moon shot method, where the coil would be yanked into the casing by a hydraulic jack instead. This had similar problems--the coils had a habit of warming up too quickly and getting stuck in the casings--but the lack of engineers in the project meant that no one pointed this out until they were already trying to build production magnets which, needless to say, was far too late to try something new. Also true is the fact that no one checked whether either method was actually necessary until the magnets were showing an inability to meet other performance goals, again far too late to adjust the production process and still meet deadlines.

    [2]: This quote is entirely fictional.

    [3]: In reality, this was not discovered until much later, long after the project was underway. By that point, however, it was, once again, far too late to make any difference.

    [4]: This is based on the first tests of the so-called "Palmer magnet" (which was the one that demonstrated all of those things mentioned in the previous footnotes). In fact, the Palmer quench test was considerably more dramatic, since it was the last chance to salvage ISABELLE and actually breached the short sample limit (due, it turned out, to a mistake in measuring the latter). Thanks to the PoD, and due to paying slightly more concern towards manufacturability, they have now essentially discovered the Palmer magnet.

    Additionally, the quote from Sanford is a slightly edited version of a real quote, "We thought we had it in the bag."

    [5]: This is where the fruits of stumbling on the Palmer magnet design early begin to come in. IOTL, the Mark series did slightly worse overall, often first quenching below 4 tesla and only reaching that level with training, whereas here it's routine to reach mid-high 4 tesla fields after relatively little training. More importantly, whereas the Mark V IOTL misled Brookhaven into believing that they had successfully designed a 5 tesla magnet (it first quenched at 4.1 tesla and eventually reached 4.9, though no one then or later ever figured out how), here that conclusion is perfectly correct.
     
  4. Unknown Member

    Joined:
    Jan 31, 2004
    Location:
    Corpus Christi, TX
    This is interesting.

    Will follow...
     
  5. Threadmarks: Construction

    Workable Goblin Chronicler of the Pony Wars

    Joined:
    Aug 3, 2009
    Location:
    Canterlot
    Thank you for the kind comments. Enjoy the next post!

    ---​

    With formal project approval, ISABELLE could at last move away from the long series of low-key, low-cost experiments that had been taking place since it had first been recommended to Brookhaven management, and towards the industrial-scale construction and manufacturing jobs that would be needed to bring it into being. In quick succession, formal approval brought about a wholesale reorganization of the project into a new operating division at Brookhaven; its inclusion as a separate line-item in President Carter's FY 1979 budget; the selection of the industrial giant Westinghouse as the main magnet contractor; and the beginning of preparations for civil construction. It also brought about a seismic shift in the Accelerator Department, and not just in those parts of it involved with ISABELLE, as the National Synchrotron Light Source, or NSLS, had been approved for construction at the same time. For some time the Accelerator Department had been divided between those interested in ISABELLE and those interested in NSLS, in what was nearly a low-grade civil war, but now, with both of those projects underway, the two factions could be reconciled in mutual focus on big new capital projects.

    And ISABELLE certainly would be a big new capital project, from any perspective. Even leaving aside the necessary conventional construction, which would need to build a tunnel more than two miles long to house the collider, ISABELLE would require more than one thousand high-field superconducting magnets, on par with the prototype and pre-production magnets that the Accelerator Department had been building so far, and it would need them quickly. To meet the accelerator's planned completion date, Westinghouse would need to build more than one car-sized dipole magnet per day on average, a rate that would clearly require massive technical support from the Accelerator Department to achieve when prototype magnets had been taking weeks or months to assemble and test.

    In the meantime, the specter of competition was rising again, this time across the Atlantic. Although Brookhaven had managed to overcome spirited competition from Fermilab for scarce Department of Energy dollars, ensuring that it would build the first proton-proton collider in the United States, it had no such political leverage against the European particle physics collaboration, CERN. Traditionally, this had not mattered much; CERN, though capable enough, simply didn't have the resources or technical capability to best the United States when it came to pushing back the frontiers of physics. Instead, CERN had played more of a consolidation role, following up American experiments with others intended to tease out precise interaction details or the exact properties of a newly discovered particle, worthy enough but not very exciting in comparison. In the 1970s, however, that had begun to change, as the damage of the World Wars faded into memory and a new generation of physicists pushed more boldly against the edges of their knowledge.

    The first hints of this new CERN had come in 1973, when the giant Gargamelle bubble chamber made the first detection of the recently predicted neutral current interaction, a major breakthrough of the sort that would previously have come from the United States; now another bold challenge to American hegemony had come from Geneva with the recent completion of the Super Proton Synchrotron, or SPS, a 400 gigaelectronvolt, or GeV, proton accelerator. Although less powerful than Fermilab's 500 GeV Main Ring, and thus outwardly just another follow-up machine, the Europeans had prepared a brilliant maneuver that would turn it from an average proton accelerator into a world-class collider. Using a beam cooling technique devised by the Dutch physicist Simon van der Meer, they could create a beam of antiprotons to circulate in the SPS ring, with exactly the same energy as the protons it was nominally designed to accelerate. Since they would have the opposite charge to the protons, they would circulate in the opposite direction, and through suitable modifications to the synchrotron's equipment could be made to collide with a simultaneously circulating proton beam at selected points. In a single stroke, the SPS would be transformed from a 400 GeV proton accelerator into an 800 GeV proton-antiproton collider--exactly the same energy as ISABELLE, and for a fraction of the time, money, and effort needed for Brookhaven's machine. Although work on the SppS collider had not yet begun, it was receiving serious consideration among CERN's leadership and looked all but certain to be constructed soon.

    The one saving grace for ISABELLE was that the modified SPS would have a low luminosity--that is, a low rate of particle collisions. Even optimistic predictions estimated that it would produce 10,000 times fewer collisions in a given time period than ISABELLE. To put it differently, the data that ISABELLE would collect in one second would require the better part of three hours at the SPS. Although in some ways this would be beneficial for CERN, as their physicists would have to sort through many fewer irrelevant side reactions while searching for the main prize, on the whole this meant that much more data-taking would be needed for any new discovery. Therefore, even though the CERN collider might be operational a year or more before ISABELLE, the latter might still be able to claim the prize of the W and Z bosons. The trick would be ensuring that ISABELLE was operational soon enough after the CERN collider that it would be able to make up the difference, rather than having already lost the race.
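    The "better part of three hours" figure follows directly from the luminosity ratio: at a collider, the event rate is the luminosity times the interaction cross-section, so at a fixed cross-section the data collected scales directly with luminosity. A quick check, using only the 10,000-fold ratio quoted above:

```python
# Event rate at a collider: rate = L * sigma (luminosity times cross-section).
# At fixed cross-section, integrated data therefore scales with luminosity.
# The 1e4 luminosity ratio is taken from the text; the rest is arithmetic.

luminosity_ratio = 1.0e4  # ISABELLE luminosity / modified-SPS luminosity

seconds_at_sps = 1.0 * luminosity_ratio  # SPS time to match 1 s of ISABELLE data
hours = seconds_at_sps / 3600.0
print(f"{hours:.2f} hours")  # ~2.78 hours -- "the better part of three hours"
```
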

    The first step in doing so was to begin civil construction, that is construction of the physical buildings and tunnels that would house the collider and its ancillary equipment. Even before formal project approval had been obtained, some preliminary work had begun, such as completing the environmental impact statements necessary since the National Environmental Policy Act had been passed in 1970, but now the pace accelerated. By the end of 1978 work had begun on land clearance around the accelerator, and in early 1979 the first construction contract, for one-third of the accelerator structure, was let, to the local firm A.D. Herman Construction Company. Spurred onwards by the promise of receiving contracts for the other two-thirds if it performed well, A.D. Herman managed to nearly finish the first third by the end of the year, and the rest of the machine in another, completing all work in early 1981, four months ahead of schedule.[1]

    Unfortunately for the scientists eager to use the machine, things were not going so well with the magnets. Brookhaven's own prototype and pre-production magnets had performed exceptionally well, leading lab management to increase the machine's specifications to fit what they thought was an easily achievable level of performance. The experimental pre-production magnets from Westinghouse had also performed well in Brookhaven's trials, and Westinghouse had offered an overall attractive bid, so they had been selected as the magnet contractor, responsible for constructing the 1,100 magnets needed to guide and contain the powerful beams of protons ISABELLE was intended to collide. By late 1978 they were gearing up for full-scale industrial production, and in January of 1979, only a few months before the civil construction contract was let, they delivered their first production magnet.

    It was a disaster. Whereas the most recent prototype and pre-production magnets had first quenched in magnetic fields stronger than 4 tesla, and easily trained to reach the design goal of 5, Westinghouse 0001 first quenched at just 3.8 tesla, and even after dozens of training quenches was barely able to sustain 4. Magnet 0002 was little better, as were magnets 0003 and 0004. None were able to reach the 5 tesla design goal even with training, let alone with reasonable amounts of it, and magnets 0002 and 0003 suffered serious damage from shorts between superconducting wires during the training process, effectively destroying them. Since Brookhaven's continuing series of experimental Mark and P magnets were performing perfectly well, there was clearly something wrong with Westinghouse's production process.

    At first, laboratory management assumed that the magnet problems were merely teething troubles, the sort of problem that inevitably crops up when a prototype, however well-behaved, is reworked for mass production. This viewpoint was reinforced in May when the ISABELLE Magnet Group disassembled magnet 0004 after it failed to reach 5 tesla after fifty quenches, and discovered numerous manufacturing flaws in the magnet, primarily in the assembly of the network of superconducting cables at its heart. Additionally, the Westinghouse-built magnets had several design flaws which had been identified and eliminated in Brookhaven's own prototypes. On the one hand, it seemed obvious that Westinghouse merely needed to be informed about these issues to correct them, and indeed magnets 0005 and 0006, built after the Magnet Group provided a lengthy list of the flaws in 0004, exhibited better performance than their predecessors. On the other, these "improved" magnets still required unacceptable amounts of training to reach the design field goals, and still had a tendency to suffer magnet-destroying shorts during testing, indicating that Westinghouse was still not doing a good enough job of assembling its magnets. Compounding matters was the fact that two critical review meetings--one by the Department of Energy and another by Associated Universities, Inc., the group operating the lab--were upcoming that fall, and that delivery of accelerator-quality magnets would have to be ongoing at full production rates by then in order to show that the collider would be completed on schedule.

    The only consolation for the ISABELLE group was that Fermilab was suffering even worse; whereas Brookhaven had adapted a fairly conventional magnet design to superconducting operation for ISABELLE, Fermilab's resident mad genius (and lab director), Robert Wilson, had come up with a quite original but unfortunately temperamental design for his Energy Doubler. Compounding matters was his use of the Rutherford weave within the accelerator's magnets, which was proving difficult to work with and prone to internal shorts and other performance-limiting problems. Despite constructing an on-site, Fermilab-operated magnet factory, and thereby gaining the ability to systematically test dozens of minor design variants and optimize overall magnet performance, Fermilab was having serious difficulty obtaining magnets for the Energy Doubler.[2] With the United States in the throes of stagflation, the Department of Energy was struggling to maintain funding for both ISABELLE and the Energy Doubler, and whispers began to circulate that Department and perhaps even national leadership were considering axing one of the accelerator programs to relieve budget pressures.

    As the summer of 1979 began to draw to a close, the problem with Westinghouse's magnets was gradually becoming a crisis. Magnet 0007, delivered in August, still fell short of acceptable performance, even after another round of criticism from the Magnet Group of the flaws in 0005 and 0006, and even as Brookhaven's own internal designs continued to perform well. By now, the laboratory's top management had also taken notice of the problem, and were beginning to seriously worry about ISABELLE's magnets. Although the basic design seemed sound, something had clearly gone badly wrong in the mass production process, and it was by this point impossible for full-scale production to begin anytime soon. In the time-honored tradition of those desperate for time everywhere, management appointed a committee made up of engineers and physicists mostly from within Brookhaven to review the production process and recommend methods for "perfecting the large-scale manufacture of ISABELLE magnets".

    At first, the committee members were as baffled by the discrepancy between Brookhaven's Mark/P series magnets and Westinghouse's N series as the members of the Magnet Group. The basic design of the Mark/P series was, by this point, well established, as were the techniques necessary to construct them. It was reasonable that Brookhaven's magnets might perform slightly better than Westinghouse's--after all, they were hand-built prototypes incorporating whatever innovations the Magnet Group's scientists could come up with--but the size of the discrepancy was far too large to be attributable to an older design frozen for mass production. Besides, there was the matter of the documented manufacturing errors in Westinghouse's magnets, which could hardly be chalked up to design differences.

    So, they took a step, prompted by their mandate to investigate the magnets' mass production, that no member of the Magnet Group had previously taken: they visited Westinghouse's magnet manufacturing plant. What they found there appalled them. Much of the equipment and tooling needed to construct the magnets was broken or improperly set up; many of the workers were poorly trained, with little interest in the work, bad to atrocious morale, and endemic suspicion of corporate management. Indeed, a strike broke out as the committee was wrapping up its investigation, delaying the delivery of Westinghouse 0009.[3] Westinghouse itself seemed to have relegated ISABELLE to a second- or third-tier priority, which, in turn, seemed to be the root source of these problems. Except when prompted by Brookhaven, the committee discovered, Westinghouse had made little effort to correct the obvious problems with its manufacturing plant. Its strong performance in the early magnet trials had been due to the use of equipment and personnel more appropriate to prototype or pre-production efforts, experience which had not carried over into its mass-production attempts. All in all, the committee was forced to conclude, Westinghouse was treating the contract mostly as an easy means of obtaining income, with little real interest in seeing the collider completed on time.

    Ultimately, there were three possible methods of compensating for Westinghouse's failures, the committee concluded in its final report. First, the Magnet Group, which was already producing acceptable magnets, could be transformed into a manufacturing operation, along the lines of Fermilab's efforts for the Energy Doubler. This would require significant physical plant construction and a substantial expansion of the Magnet Group, would likely delay collider completion by a year or more while Brookhaven's production capacity ramped up, and would, as a result, be quite costly, but it would almost undoubtedly complete the collider to specification. Second, Westinghouse's contract could be voided and one of Westinghouse's competitors from the original magnet selection could be tapped to replace it. This would also result in a long delay, and ran the risk of the new firm making the same mistakes, but would probably be cheaper than establishing a Brookhaven manufacturing plant, if it worked. Third, Brookhaven could increase supervision of Westinghouse, establishing a permanent oversight group at the Westinghouse plant and effectively replacing Westinghouse's management with Brookhaven's. This would still require expansion of the Magnet Group, but it would be cheaper and faster than either of the other options, and so received the committee's recommendation.

    Laboratory management was almost relieved by the final report; a failure of their corporate oversight was embarrassing, but most of the blame clearly fell on Westinghouse, not Brookhaven. Besides, the report clearly exonerated management of the more serious charges that had begun to circulate in rumors, claims that management had seriously erred in their choice of magnet design.[4] Westinghouse, by contrast, was highly embarrassed by the report's conclusions, and took some pains to try to keep it from being widely circulated. Although ISABELLE was hardly a national priority on par with the Apollo Program or a major military contract, it was still a large government project, and abdicating their basic responsibility to ensure in-specification production was neither a good look nor a strong foundation to base bids for future projects on.

    With the threat of contract cancellation looming over their heads, Westinghouse accepted the imposition of direct Brookhaven oversight, though with perhaps less grace than the situation warranted. The leadership of the Magnet Group, now the Magnet Division, was formed into a Production Oversight team based on-site at Westinghouse's production facility with a mandate to correct all of the flaws uncovered by the Yellow Report (so-called because of the color of the paper used for the cover sheet). By the end of January 1980, they had brought the magnet production rate up to one per week; by the end of May, to one per day; and by the end of August, to more than two per day, with the factory operating around the clock. At the same time, they cut the rate of magnets arriving at Brookhaven unable to reach acceptable fields from one in every two in December 1979 to one in every ten by the end of 1980, thanks to the formation of a magnet testing division at the plant itself, able to quickly test every magnet and reject those needing rebuilding before they were shipped to Brookhaven.

    Brookhaven's ability to ramp up magnet production once the problems at Westinghouse became apparent was fortunate, for CERN had been equally successful in its program to upgrade the SPS into a proton-antiproton collider. By the beginning of 1981, much of the construction work needed for the CERN project was nearing completion, and it seemed as if proton-antiproton collisions could begin by the middle of the year. Even with Brookhaven back on track and promising to finish its machine early, it would be extremely difficult for the Americans to begin collecting data before late 1982, opening a significant window for the Europeans to reach the prize before the Americans even began the race. Previously, American scientists had been confident that they would discover the W boson, as they had discovered practically every subatomic particle since the 1940s; now, they were beginning to realize that it would be a competition, and one that they were not certain to win.

    The only method of further accelerating the collider's construction was to begin installing the magnets and their ancillary equipment before the structure itself was finished, a task made more palatable by the fact that by mid-1980 a third of the structure had already been delivered according to contract. Almost as quickly as magnets were delivered from Westinghouse, they were assembled into full cells, tested as an accelerator unit, then disassembled, transported to the worksite, and lowered into the tunnel, where they were reassembled in their cryogenic dewars while power supplies, structural supports, monitoring equipment, and other necessary details were being attended to. Even as the accelerator itself was beginning to take form, on the other side of the worksite construction workers were still sweating over the remaining portions of the accelerator ring, pouring foundation concrete and embedding the massive steel structural rings that would form the continuous tunnel to eventually house ISABELLE.

    The high degree of investment, both managerial and financial, needed to accelerate collider construction so much had to come from somewhere, however, and the most available and tempting source was the detectors that the collider would use to search for new physics once it came online. Although it might sound absurd to delay the detectors while building the collider, it is in fact fairly normal for particle accelerators to have detectors installed after construction, as technology advances or new questions arise that require new methods of exploration. Indeed, despite the collider having six sites for detectors, only four had been planned for its start even when construction began, never mind once every efficiency needed to be squeezed out of the construction process. Two of these--one a general-purpose detector capable of exploring a wide range of physics and one optimized for the W boson search--were being built by Brookhaven-led collaborations, and it was almost inevitable that they would suffer from the focus on building the collider itself. The W detector, HREPPS, was protected by its importance to ISABELLE's new goal, but the other was not, while the two detectors being built by outside collaborations were limited by their ability to obtain extra funding from the Department of Energy and, in some cases, national science ministries. By early 1981 it was apparent that only HREPPS would actually be ready by the time the collider turned on, while the other three experiments could not possibly be installed before 1983.

    Through 1981, installation work at the accelerator ring continued, bolstered when construction was formally declared complete in April, though some subsidiary work, like the tunnel connecting ISABELLE's proton source to the collider itself, continued for several more months. Now engineers and technicians had access to the whole collider tunnel, and they took advantage by ramping up assembly once again, working almost around the clock to bring ISABELLE online before CERN could knock them out of the race. Nearly a third of the collider had already been assembled, with another sixth underway; now work accelerated on the remaining half, so that by November the entire accelerator ring was finally in place, and it could, at long last, nearly ten years after it was first proposed, be lowered to cryogenic temperatures. Through the remainder of the year the magnets underwent a final series of checks and tests to uncover any last-minute problems before, on the afternoon of December 31st, the first beam of protons was injected into the collider. Circulating at an energy of just over 20 GeV, they were hardly breaking new ground, yet they fueled plenty of drinking later that day. Nevertheless, Brookhaven's leaders were not yet ready to declare victory, as CERN had recorded its first proton-antiproton collisions months earlier, and was even then accumulating collision data.

    With the clock ticking, the final step was to undertake commissioning--the process of gradually bringing the machine from a mere oversized beam circulator to an accelerator, and then a collider. Rushing this process ran the risk of pushing the machine too hard, too fast, and causing expensive damage, possibly even putting it out of service for years. Even so, with a glance back across the Atlantic, Brookhaven pushed its new machine as hard as it dared, ramping up the energy and quickly circulating beams in both beam pipes, allowing it to claim the records for the highest-energy proton-proton collisions in late April and the highest-luminosity collider in early June. By mid-June the lab's physicists had persuaded themselves that the collider would function properly and set themselves to begin Run I, four months of full-energy, full-luminosity run time before a year-long shutdown for inspection, maintenance, and the installation of the other three detectors.

    Despite the competition with CERN, high-energy physics had never been a closed field, at least not in the West, and plenty of rumors and scuttlebutt made their way back and forth across the Atlantic all throughout the construction process. By late October, as ISABELLE was wrapping up its first run, physicists attending to the machine were hearing that CERN had found one or two events that almost exactly matched the expected profile of the W boson. However, CERN hadn't analyzed enough data yet to be certain that it hadn't simply made some kind of mistake, and was waiting for more events to show up before announcing a discovery. With a tiny window open to make the breakthrough themselves, the collaboration behind HREPPS threw itself into a feverish overdrive, working almost 24/7 to analyze its data and determine as quickly as possible whether or not it was seeing the same thing as CERN. With the data collected up to the machine's first shutdown in October, it rapidly became clear that, yes, they really had seen the W, and through a nearly superhuman writing and review effort they were just able to complete the necessary paper by the middle of January. On Monday, January 17th, 1983, George Vineyard, the Director of Brookhaven[5], ascended a podium in the lab's briefing room and announced that ISABELLE had discovered, at last, the W boson.

    The race was over.[6]

    [1]: This is OTL. Fortunately so, because in reality the ex-ISABELLE tunnel now houses RHIC. Whatever the other problems of the accelerator, the civil construction was an enduring product that turned out quite well.

    [2]: Another side effect of the PoD. IOTL, Fermilab used the cable (which came from Rutherford) rather than the weave/braid (from Brookhaven in reality), owing partially to inter-lab rivalry and partially because only Brookhaven, which obviously had its own projects to support, was producing the weave/braid. Both of these factors are, naturally, reversed ITTL. Aside from this, however, the paragraph is per OTL. In fact, the braid turned out to be so difficult to work with that it ultimately had to be completely abandoned, directly leading to the failure of ISABELLE IOTL.

    [3]: This actually happened. As far as I can tell, though, Westinghouse didn’t have any systematic issues IOTL; they were just stuck with a bad magnet design that neither they nor anyone else could get to work.

    [4]: This is more or less parallel to OTL, except that IOTL, as I have mentioned, the magnet design chosen was fatally flawed, and it was true that management had provided poor leadership. They're still, honestly, not very good leaders ITTL, but they've lucked out by the magnets needing little care and feeding to work properly.

    [5]: In reality, he had been effectively fired in September 1981 due to the near-total failure of ISABELLE at that time, and then replaced by Nicholas Samios. Samios did a stellar job resuscitating ISABELLE, but his achievement in that regard was undercut by essentially political concerns that ended up killing ISABELLE altogether. Not wholly unfairly, it turned out, since ISABELLE was just not going to be able to make any discoveries by the time he took it over, but that wasn't entirely apparent at the time. In any case, Vineyard obviously looks better without the big collider crashing and burning under his watch, and so is still lab director. Unfortunately he rather abruptly died of cancer in 1987, so he won't have very long to enjoy his success.

    [6]: In reality, CERN formally announced the discovery of the W boson January 25th, 1983, in what was less a race and more a casual saunter due to the lack of any effective competitor for CERN (ISABELLE having spectacularly self-immolated between 1979 and 1982, and what wasn't quite yet the Tevatron not able to get online sufficiently quickly to even think of competing).
     
  6. Dathi THorfinnsson Daði Þorfinnsson

    Joined:
    Apr 13, 2007
    Location:
    Syracuse, Haudenosaunee, Vinland
    From what little I can tell, ISABELLE had a much smaller ring than the Tevatron, which means that the maximum energy was a lot less.
    What is the point of ISABELLE, then? You point out that iOTL, it was rendered obsolete before completion. Here, it's still going to have a very short window before obsolescence. No?
     
  7. Workable Goblin Chronicler of the Pony Wars

    Well, there are several points. First of all, as I said in the first post, energy is not the only important quality of a particle collider. Luminosity--roughly, the rate at which collisions are produced--is also important, and ISABELLE has a way higher luminosity than Tevatron ever could have (because Tevatron was a proton-antiproton collider, which kills maximum luminosity). It has, on opening day, ten times the luminosity that Tevatron had when it shut down IOTL in 2011...and that was after multiple major upgrades that had increased its luminosity by several orders of magnitude. So while it might not be able to make some of the discoveries that Tevatron made, what it can reach it can do at much greater precision. This is still useful! There are plenty of low-energy colliders around the world (e.g. the Beijing Electron-Positron Collider) that are nowhere close to the energy frontier but can still do good physics. Even if ISABELLE were totally outmatched in energy terms by Tevatron from 1987 onwards, it would still be, by far, the highest luminosity high-energy particle collider in the world until the SSC or LHC were built.
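    Since "luminosity" can be opaque jargon: the expected number of events for a given process is just the integrated luminosity times that process's production cross-section, which is why luminosity matters so much. A minimal back-of-the-envelope sketch, using round illustrative numbers rather than either machine's real parameters (the function and all figures here are mine, purely for illustration):

```python
# Sketch of why luminosity matters: expected events N = integrated
# luminosity x production cross-section. All numbers are round
# illustrative values, NOT actual ISABELLE or Tevatron parameters.

def expected_events(inst_luminosity_cm2_s, seconds, cross_section_pb):
    """Expected event count for a process with the given cross-section."""
    PB_TO_CM2 = 1e-36                                # 1 picobarn = 1e-36 cm^2
    integrated = inst_luminosity_cm2_s * seconds     # integrated luminosity, cm^-2
    return integrated * cross_section_pb * PB_TO_CM2

# A collider at 1e32 cm^-2 s^-1 running for a "Snowmass year" (1e7 s)
# integrates 1e39 cm^-2, i.e. 1000 pb^-1; a hypothetical 100 pb process
# then yields on the order of 100,000 events, versus roughly 10,000
# at a tenth the luminosity.
print(expected_events(1e32, 1e7, cross_section_pb=100.0))
print(expected_events(1e31, 1e7, cross_section_pb=100.0))
```

    The point being that a factor of ten in luminosity is a factor of ten in events for every process in reach, which for rare processes is the difference between a signal and a statistical fluke.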

    Second, ISABELLE can be finished before Tevatron (this was supposed to be the case IOTL, but the severe magnet problems that ISABELLE had made that impossible; here, of course, they're not having any such issues). Quite a bit earlier, actually, so it can provide several years of frontier data before Tevatron could possibly come online.

    Third, ISABELLE is not a pure proton-proton collider. I don't want to spoil the final post too much--if nothing else, I explain everything there--but while they were planning and building it they were thinking ahead to things it could do that did not simply involve smashing protons together.

    Fourth, and again I don't want to spoil things too much, but you might want to consider the second footnote in the post above and what it implies about how Tevatron (aka Energy Doubler) is doing at the moment ITTL.

    Finally, and most importantly, they don't know that ISABELLE is incapable of pushing beyond the W/Z bosons when they're building it. At the time, they thought that the top quark and Higgs boson might be in the 30-40 GeV range, which would put them smack-dab in ISABELLE's energy reach. Of course, they're actually not in that mass range, but there was no way to know that at the time. If the collider hadn't been so troubled, they certainly would not have abandoned it! It managed to stagger on until 1983 even when it was going nowhere IOTL, after all, and its cancellation was a near-run thing, as they say.
     
  8. Dathi THorfinnsson Daði Þorfinnsson

    Thanks for the reply. That makes a lot of sense.
     
  9. Threadmarks: Operations

    Workable Goblin Chronicler of the Pony Wars

    Even as Brookhaven basked in its moment of triumph, ISABELLE itself was slumbering in the throes of its first major shutdown. The machine and its human operators had been pushed to the limit to beat CERN, and it showed in the lengthy list of niggles, problems, and issues that had accumulated over Run I, not to mention the absence of the three detectors that had not been completed by the time the collider was initially brought online in June 1982. Over the year-long shutdown, the three remaining detectors were installed, hundreds of superconducting magnets were inspected to ensure that no unseen problems had built up over months of operation, and vast amounts of support equipment, from cryogenic supplies to electrical cabling, were checked and fixed where problems had been noted.

    When ISABELLE's Run II started in October 1983, hopes were high among physicists that it would quickly lead to more discoveries like the W boson, which had after all fallen so quickly into their grasp once ISABELLE had been brought online. With ISABELLE colliding particles at 800 gigaelectronvolts, or GeV, and with a luminosity higher than any other collider in the world, there was every reason to think that it might soon find the top quark, then thought to mass about 30 or 40 GeV. Some physicists even thought that the Higgs boson might be produced by Brookhaven's new machine, a factor in the decision of 1983's HEPAP facilities meeting to recommend deferring work on the giant new supercollider some other physicists had become enamored with. Unfortunately, these hopes were dashed by reality; when Run II ended with ISABELLE's second scheduled shutdown in September 1985, no evidence of either the top quark or the Higgs boson had been found. In combination with data from CERN's SPS, this meant that the top quark had to mass at least 80 GeV, heavy enough that ISABELLE was very unlikely to see it no matter how long it ran. Evaluating the negative results of the Higgs boson search was trickier, as a very light Higgs boson might decay in such a way that it was difficult to disentangle from other collisions that had not produced a Higgs boson. Nevertheless, the luminosity of ISABELLE was high enough that physicists felt fairly confident that they had ruled out the lightest possible Higgs bosons, leaving, again, only an allowed energy range too high for ISABELLE to probe.

    Despite this, Run II had by no means been a failure. Operating for two years instead of four months allowed the number of detected W and Z bosons to mushroom tremendously, in turn permitting more precise measurements of their properties. At the same time, the sheer number of proton collisions and their high energy opened new windows into their internal structure, the complex mixture of quarks and gluons making up much of the matter we encounter every day. This, in turn, helped refine models of particle collisions that would underpin the interpretation of results from ISABELLE and future colliders. Finally, many physical processes that take place during proton-proton collisions could be measured with greater precision than ever before, again thanks to the sheer number of collisions available to observe. Although not the most exciting aspect of science, the resulting extension of the number of decimal places on the various parameters used to model the collisions was still useful, and certainly an accomplishment.

    Run II also fed into a larger debate about the future of particle physics in the United States, one that had been raging for a number of years. On one side were those who favored the incremental approach that had led to ISABELLE and the Energy Doubler project. They looked at the troubles that had plagued the ISABELLE and especially the Energy Doubler projects and saw portents of things to come if physicists tried to push the available technology too hard. Instead, they suggested that the next collider be a relatively modest step forwards, aiming at energies about five to ten times higher than ISABELLE's. Not coincidentally, a collider of this scale could easily be accommodated at Fermilab or Brookhaven without extensive land acquisition, taking full advantage of existing facilities and lab structures.

    On the other side were those who felt that this approach was timid and would take too long to make any significant discoveries. They wanted to know, for certain, whether the exciting new physics being bandied about to complete or replace the now-established Standard Model--not just the Higgs boson and top quark, but exotic theories going by the names of supersymmetry and technicolor--was real or a figment of the imagination. Doing so would require a much bigger machine than the conservatives' collider, able to undertake collisions twenty to forty times more energetic than anything ISABELLE could create, at least the size of CERN's proposed Large Hadron Collider, and likely a new laboratory or at least the significant expansion of an existing one.[1] They looked at ISABELLE and the Energy Doubler and saw not a cautionary tale, as the conservatives did, but a mixed message, in which greater technical aggression did not necessarily lead to failure. Their takeaway was the importance of adequate support for finding and solving technological problems with any new machine, not the need to avoid creating such problems in the first place.

    The debate had started almost as soon as ISABELLE and the Energy Doubler began construction, though at that time most physicists favored the conservative approach of building a new collider that would, essentially, use the existing labs to their maximum capacity--a LabMax, as if the national laboratories were some kind of strait being navigated by particle-colliding ships. Over time, however, more imaginative physicists had realized the possibilities inherent in new, more expansive labs, located perhaps in the deserts of Arizona or New Mexico, and in the powerful magnets that Brookhaven had perfected for ISABELLE, and had begun to consider what one could do with them. Robert Wilson, as always one of the most creative if not necessarily the most grounded of particle physicists, was one of the first to propose a giant new supercollider, which he believed could be assembled for about the same price as a much smaller conventional design through several novel design techniques. Even more conservative estimates tended to make the proposition financially plausible, and by 1983 matters had come to a head, with the HEPAP facilities meeting torn between those who favored radical proposals like Wilson's Desertron or SLAC's giant electron-positron collider design, and those who favored more conservative options like completing the Energy Doubler and building some kind of follow-on Dedicated Collider to complement it.

    The resulting stalemate, not to mention ISABELLE's upcoming Run II and the possibility that it would render any major decisions made in 1983 obsolete, created a strange, divided atmosphere in which no one was able to make any firm recommendations. In the executive summary of the panel's final report, no mention was made of any of the major capital projects that had been proposed, whether Desertron or Dedicated Collider. Instead, it largely just listed existing particle physics-related facilities projects, essentially stating that the Department of Energy should keep doing what it was already doing or planning to do. The largest exception, however, was highly significant; instead of recommending completion of Fermilab's Energy Doubler, a project favored by the conservatives, it suggested that the proposed new colliders could provide equally or more significant physics results at not much more cost and almost as quickly, and were therefore better candidates for investment.[2] Equally significantly, in the body of the report all of the major capital projects were listed, grouped into two categories: "Preferred" and "Acceptable". The Dedicated Collider was the only entrant in the "Acceptable" category, while the Desertron, now officially named the SSC, was in the "Preferred" category. SLAC's TLC was also in "Preferred," but with the caveat that a large amount of research was needed to make it practical, and that it could not be considered a reasonable element of the near-future program.[3]

    The division only became more complex during Run II, as both major particle physics laboratories, Fermilab and Brookhaven, underwent their own internal shifts on the project. Fermilab had always been a bastion of support for the radicals, with Robert Wilson one of the first and Leon Lederman an early leader, but the cancellation of the Energy Doubler had led to internal division. After all, the laboratory's Main Ring had been finished in the 1970s, while the SSC, even if built immediately as the radicals wanted, could not be done until the late 1990s or early 2000s. That would leave a gap of nearly twenty-five years between the two projects, and it was difficult to tell what the laboratory could do in the interim to maintain the skills and experience needed to build and operate such large machines, a worry that naturally pushed many at Fermilab toward the conservative position.

    Meanwhile, Brookhaven had always been more conservative, partially because it had historically benefited from such an approach, and partially because it might be able to host a post-ISABELLE collider of several teraelectronvolts, or TeV, but not one of several tens of TeV, as the radicals desired. Now, though, Brookhaven's accelerator physicists considered that the SSC would pose exciting new challenges in accelerator construction and design, ones which Brookhaven's experienced magnet men and women would almost certainly be tapped to address. At the same time, users of the new National Synchrotron Light Source, or NSLS, were opposed to any new colliders at Brookhaven since the construction would inevitably disrupt their own delicate work, and therefore tended to prefer the radicals when asked. Both labs were also influenced by the Reagan administration, which had come out strongly in favor of big science projects like NASA's space station designed to demonstrate and maintain American superiority in all fields, and which therefore favored the huge SSC over the more modest Dedicated Collider.

    When HEPAP's facilities panel met again in 1985, therefore, it was again confused, with the battle lines and factions no longer as clear as they had been two years earlier. The only thing all sides could agree on was that ISABELLE could not save them now; some kind of new collider was definitely needed, especially with the Europeans moving ahead with their two-step program of LEP followed by LHC. The question was whether to push for a giant that could, with practical certainty, tackle all of the most pressing questions in particle physics but which would be complex, expensive, and difficult to develop, or a smaller machine that would be relatively cheap and simple but risk being simply unable to discover anything new. The debate was long and difficult, but ultimately the radicals had been strengthened by ISABELLE's results, or lack thereof, and the conservatives weakened, so the decision was made: the SSC was going forwards, for better or for worse.

    Meanwhile, during the long shutdown between the end of ISABELLE's Run II in September 1985 and the beginning of Run III in October 1986, a number of upgrades had been made to the collider and the older accelerators that fed it with protons, as had been planned for these scheduled shutdowns from the beginning. Thanks to the care and attention of the lab's Accelerator Department, when ISABELLE returned to service it set new records in both luminosity and energy, more than doubling its capabilities in the former area and increasing collision energy by nearly 5 GeV in the latter. Unlike Run II, Run III would last three years, until January 1990, before the machine once again shut down, this time for a longer and more significant overhaul.

    With the beginning of Run III, the machine had moved on to a new phase in its lifecycle. Like all particle accelerators, its first years, when it had been breaking new ground, had been the most productive, leading to the discovery of the W and later Z bosons. Now, though, the most significant results had been found, and the machine stood little chance of making any further breakthroughs without further effort. One method of countering this trend, and one that had been half of the reason behind the run-shutdown-run sequence to begin with, was to continually upgrade the accelerator and its associated equipment. While huge leaps in performance were unlikely, continually pushing up the luminosity and energy of the machine would still open new ground and increase the probability of new discoveries. New detectors, too, using the latest technology or optimized for the most pressing physics questions, could help extend the machine's period of maximum productivity.[4]
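    As an aside for readers less familiar with the jargon: the value of a luminosity upgrade follows from the standard relation between event yield, cross section, and integrated luminosity, N = σ·∫L dt. A minimal sketch of the arithmetic (all numbers here are invented for illustration, not taken from ISABELLE's run plans):

```python
def expected_events(cross_section_pb, integrated_lumi_inv_pb):
    """Expected event count: cross section [picobarns] times
    integrated luminosity [inverse picobarns]."""
    return cross_section_pb * integrated_lumi_inv_pb

# A hypothetical rare process with a 2 pb cross section:
events_before = expected_events(2.0, 1000.0)  # pre-upgrade dataset
events_after = expected_events(2.0, 2000.0)   # luminosity doubled

print(events_before, events_after)  # 2000.0 4000.0
```

    Doubling the luminosity doubles the expected yield of every process, which is why even incremental upgrades can keep an older machine statistically competitive.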

    Another method of extending the machine's physics reach was to look to different fields, where the capabilities of the machine might be entirely novel. For particle accelerators, in particular, there are frequently great opportunities in nuclear physics, the science of nuclear structure, closely related to but distinct from particle physics, which is more interested in the structure of the individual particles that combine to form the atomic nucleus. Any accelerator that can accelerate protons--that is, hydrogen nuclei--can, in principle, accelerate heavier nuclei as well, and experiments had been carried out since the late 1960s to smash heavy nuclei into fixed targets in order to study exotic nuclear conditions. In the late 1970s and early 1980s a powerful incentive for moving from fixed targets to colliding beams, just as particle physics was doing, became apparent, as swiftly improving computers combined with rapidly improving theories of the strong nuclear force showed that in sufficiently high-energy collisions a quark-gluon plasma, an exotic and still-theoretical state of matter, might form. Much like an ordinary ion-electron plasma, in such a state of matter the quarks that make up most matter and the gluons which carry the strong nuclear force between them would separate, mutually screening the "color charges" that drive the strong force and allowing new and interesting behaviors. Only the collision of heavy ions like gold or lead at high energies could produce such a state of matter, however, as many quarks and gluons interacting at extreme temperatures were needed for the complex interactions that would take place in the plasma.
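    The kinematic advantage of colliding beams over fixed targets is easy to quantify: two equal beams of energy E colliding head-on give a center-of-mass energy √s = 2E, while a beam striking a stationary nucleon gives only √s = √(2mE + 2m²), which grows merely as the square root of the beam energy. A quick sketch (the 100 GeV beam energy is just an illustrative round number):

```python
import math

M_P = 0.938  # proton rest mass in GeV (natural units, c = 1)

def sqrt_s_collider(beam_energy_gev):
    """Center-of-mass energy for two identical beams colliding head-on:
    sqrt(s) = 2E -- all the beam energy is available for the collision."""
    return 2.0 * beam_energy_gev

def sqrt_s_fixed_target(beam_energy_gev, m=M_P):
    """Center-of-mass energy for a beam hitting a stationary proton:
    s = 2*m*E + 2*m^2, so sqrt(s) grows only as sqrt(E)."""
    return math.sqrt(2.0 * m * beam_energy_gev + 2.0 * m * m)

print(sqrt_s_collider(100.0))                 # 200.0 GeV
print(round(sqrt_s_fixed_target(100.0), 1))   # 13.8 GeV -- most energy lost to recoil
```

    The gap only widens as beam energy increases, which is why both particle and nuclear physics moved to colliders once the accelerator technology allowed it.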

    Even before it had been completed, it was apparent that ISABELLE would be ideally suited for carrying out such collisions, as it was already designed to smash two positively-charged particle beams into each other, and had sufficient beam energy to be nearly certain of producing the quark-gluon plasma. Plans were laid to undertake such work, and during its second shutdown, between Runs II and III, an ion source was installed to permit collisions between lighter ions, up to about the mass of silicon, along with a specialized detector optimized for observing ion-ion collisions. Although some tentative signs hinting at the quark-gluon plasma were seen, scientists were cautious about claiming its discovery, as it was difficult to use models to make testable predictions, and difficult to match observations to model features. To resolve these questions, nuclear physicists designed "the Booster," a small synchrotron that would sit between the ion source and the Alternating Gradient Synchrotron, the accelerator that directly fed ISABELLE, allowing both the acceleration of heavier ions and the production of even more intense proton beams.[5]

    Ironically, the beginning of construction of the Booster that would lend new life to ISABELLE was directly induced by the beginning, at last, of its putative successor, the Superconducting Super Collider. After years of study and magnet research efforts mirroring those of ISABELLE's early days, the project had at last settled on a design, then sought out and found a home at Fermilab[6], before almost immediately being put on hold following the collapse of the Soviet Union and the election of President Clinton in 1992. Given the projected expense of the SSC, at least six billion dollars, the Department of Energy had sought to fund a slate of smaller projects, including Brookhaven's Booster, across the national lab system, to assure the other laboratories that the giant accelerator would not simply absorb all available research funding. As usual, though, these projects immediately took on a life of their own and persisted even as the future of the SSC itself was in doubt.[7]

    On Long Island, the cadence of run-shutdown-run-shutdown continued its steady pace, even as Brookhaven was becoming more and more deeply involved in the SSC program. With Run IV, starting in November 1991, ISABELLE's new ion-collision physics program was first and foremost, dominating overall collider time. Despite increasing evidence that they were seeing the formation of quark-gluon plasmas in silicon-silicon collisions, Brookhaven's scientists were still hesitant to claim a discovery by the time Run IV ended in early 1995, given the theoretical difficulties involved and some curious discrepancies between their experimental observations and theoretical expectations. Instead, they eagerly awaited Run V, which would be almost totally focused on nuclear physics. Besides the beginning of gold-gold collisions with the completion of the Booster, Run V would also allow, for the first time, proton-ion collisions, a valuable test to see whether the same "fireball" effects thought to indicate the formation of a plasma also appeared under less favorable conditions. Additionally, the collider was undergoing a major upgrade with the installation of so-called "Siberian snakes" that would allow the acceleration of polarized protons, that is, protons with their spin axis pointing in a particular direction. By doing so, the effect of spin on particle collisions could be explored, another sign of how Brookhaven was attempting exotic measures to extend the utility of its aging machine.

    While nuclear physicists still hesitated to definitively claim a discovery of the quark-gluon plasma[8], results from Run V strongly buttressed the claim that ISABELLE could produce such an exotic state of matter. In some collisions, measurements showed strong signs of elliptic flow, as if an exotic fluid following the laws of hydrodynamics had existed for an instant where the particles collided, then condensed into a shower of exotic particles as it cooled. Simultaneously, jets, which would normally be generated by fast-moving quarks and gluons as they moved away from the collision, were not being seen at the rates expected from a simple extrapolation of proton-proton collisions, another sign that something new was going on in the collision's core. Such jet quenching could be explained if some quarks and gluons were passing through a dense medium, like the quark-gluon plasma, that absorbed them before they could travel far enough to start forming a jet.
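    Jet suppression of this kind is conventionally quantified by the nuclear modification factor R_AA: the particle yield in ion-ion collisions divided by the proton-proton yield scaled by the expected number of binary nucleon-nucleon collisions. A value near 1 means the ion collision behaves like a mere superposition of independent p+p collisions; a value well below 1 signals that something, such as a quark-gluon plasma, is absorbing the would-be jets. A toy sketch of the ratio (the numbers are invented, not ISABELLE data):

```python
def r_aa(yield_aa, n_binary_collisions, yield_pp):
    """Nuclear modification factor R_AA: ion-ion yield over the
    binary-collision-scaled proton-proton yield."""
    return yield_aa / (n_binary_collisions * yield_pp)

# Invented numbers for illustration only:
quenched = r_aa(yield_aa=200.0, n_binary_collisions=1000.0, yield_pp=1.0)
baseline = r_aa(yield_aa=950.0, n_binary_collisions=1000.0, yield_pp=1.0)

print(quenched, baseline)  # 0.2 0.95 -- strong suppression vs. near-baseline
```

    The same machinery applies to the control measurements described below: if proton-gold or grazing gold-gold collisions give R_AA near 1 while head-on gold-gold collisions do not, the suppression is a property of the dense medium rather than of the colliding nuclei themselves.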

    Moreover, although both of these signals were seen quite clearly in collisions where gold atoms seemed to have collided nearly head-on, in cases where they merely grazed each other neither elliptic flow nor jet quenching was seen, as would be expected if those were signs of the quark-gluon plasma. Nor was either signal seen when protons or deuterons collided with gold atoms, again just as expected if they were plasma signatures. Both of those types of collisions would deposit too little energy into too small a region of space containing too few partons--now the term used to refer to quarks and gluons together--to form a plasma. Together with the simultaneous measurements of polarized proton-proton collisions that were providing new information on proton structure, complementing parallel results from Germany's HERA electron-proton collider, Run V was one of the most scientifically productive in the collider's history.

    While ISABELLE was undergoing a renaissance, the SSC was finally getting off the ground--or into it, as the case might have been. After a series of redesigns, complex international politics (and a financial scandal at CERN) that led to a greatly increased international presence at the machine, and a decade of construction, the SSC finally took away ISABELLE's crown as the most energetic particle collider in the world in 2008 after an unprecedented twenty-six years on top. The discovery of the top quark and the Higgs boson followed shortly afterward, as did a much-belated spate of Nobel Prizes for theorists who had waited decades for their predictions to be borne out. Unfortunately for more recent thinkers, subsequent years of operation have not proven nearly as fruitful; despite high hopes, as with ISABELLE, that the discovery of these milestones meant that more revelations were around the corner, SSC operations have turned up no evidence for supersymmetry or other exotic theories of the universe.[9] Many particle physicists are beginning to fear that the next milestone of note will be the GUT threshold, requiring a collision energy so high it's unclear whether it's even possible to build a collider capable of reaching it. Instead, attention is turning towards astrophysics and the so-called "intensity frontier" of neutrinos and rare processes to try to continue exploring the fundamental components of matter.[10]

    As for ISABELLE, it remains the world's center for high-energy nuclear physics, and for the specialized physics of polarized protons. Even in this era of hyper-energetic proton collisions, ISABELLE remains a useful tool, able to undertake investigations its giant brother is incapable of. Indeed, its small size can sometimes be a benefit, as in the recently concluded Low Energy Scan program, in which ISABELLE attempted to identify the exact point where collisions begin producing the quark-gluon plasma. The small size also means a less bureaucratic management structure, leaving researchers freer to innovate and investigate more radically than they would be at the SSC--though, as it remains a billion-dollar machine, not too much freer.

    To ensure it remains a competitive part of the collider landscape, ISABELLE was further upgraded during the SSC's construction to produce ten times more collisions than it did when first built, ensuring that it remains the most luminous collider on Earth. Moreover, a new electron accelerator was recently completed to allow electron-ion and electron-proton collisions, the first time such collisions have ever been undertaken for the collider's signature heavy ions and polarized protons. Although the idea was proposed even before the completion of ISABELLE itself[11], it is an entirely new generation of physicists that has now undertaken the project in an effort to once again extend ISABELLE's physics reach, spurred on by the recent discovery of elliptic flow in proton-proton collisions at the SSC, and by reanalysis of deuteron-gold and proton-gold collisions revealing similar signatures at ISABELLE itself. Given that proton-gold collisions have long been considered too small and short-lived to create the quark-gluon plasma, the discovery of plasma-like signatures in the even smaller and shorter-lived proton-proton system has been an electric shock to the field, and there are many theorists and experimentalists scrambling to explain or disprove the results.[12] Despite having recently celebrated its thirtieth birthday, ISABELLE retains its power to shock and surprise, and physicists can only hope it continues to do so long into the future.

    [1]: Large Hadron Collider planning was already underway in 1983; in fact, so far as I can tell, it probably began soon after LEP planning in the 1970s.

    [2]: The official estimates of the time tended towards optimism. The Dedicated Collider, in particular, was supposedly going to be ready by 1989, with a center-of-mass energy of 5-6 TeV. Compared to an Energy Doubler (Tevatron) that might accelerate its first particles by 1985 and achieve its first collisions by 1988 (based on how long it took to achieve both goals in reality), and which would only be able to achieve center-of-mass energies of about 2 TeV, this obviously looks quite attractive. Given what we now know, the Dedicated Collider would have been able to readily detect the top quark and Higgs boson, but that, of course, is pure hindsight.

    [3]: In reality, this panel, known historically as the Wojcicki panel, recommended cancellation of ISABELLE, continuation of Energy Doubler, and construction of the SSC, which was the beginning of the SSC program. SLAC did propose a large electron-positron linear collider, but the technology was far too immature at the time--we're only just now about to start building one, after nearly thirty years of R&D--so I don't think it was seriously considered. The presence of ISABELLE leads to a more conservative overall report because, as I said, there is something of an expectation that great new discoveries are on the threshold, which was not as present in reality (the Energy Doubler--that is, the Tevatron--having been a bit more distant in terms of physics results in OTL’s 1983 than ISABELLE in TTL’s 1983).

    [4]: Incidentally, this is currently where the LHC is. It was a bit unusual in that thanks to the magnet quench incident it effectively had two "ground-breaking" periods, but by this point it's not very likely that it will find anything radically new. Hence, CERN is presently, in reality, looking to upgrade the LHC's luminosity and otherwise improve its possible physics performance, while starting to consider its successor.

    [5]: This is literally just the Booster from the actual RHIC acceleration complex. The rationale for constructing it in the real world was precisely the one I give for it here, and it is quite small and relatively inexpensive, so I didn't see any strong reason for it not to be built (especially considering the next paragraph).

    [6]: In reality Fermilab was essentially the very close runner-up to the well-known Texas location in the SSC site competition (Brookhaven, so far as I can tell, was never seriously considered, I believe due to the site geography and geology). Fermilab suffered because there was substantial local opposition to the construction of the SSC, whereas Waxahachie was quite enthusiastic about it. Additionally, and probably more importantly, the Department of Energy's ranking criteria ignored the value of the laboratory's "social plant" when assigning credit for the fact that Fermilab already existed; that is, they ignored that Fermilab already had hundreds of physicists and engineers working together, with management that had experience overseeing substantial capital projects requiring close physics-engineering collaboration. Considering that one major reason for the SSC's failure was the inability of SSC management and leadership to establish just such productive working relationships between physicists and engineers, this was a huge flaw in their evaluation criteria, and just having Fermilab manage the project instead of a new laboratory would likely have resulted in a significantly better outcome for the project.

    In any case, in reality Fermilab tried quite hard to win the SSC, for the obvious reason that it was going to be the next major particle physics facility. Here, though, they have an extra incentive to make a big effort, because Fermilab doesn't even have the Tevatron (that is, the Energy Doubler) to fall back on if it doesn't win, and end up pushing just a bit harder across all fronts. The result is that the very close ranking of OTL goes the opposite way, with Fermilab coming out slightly ahead of Texas, and so the collider is built in Illinois instead.

    [7]: This is also as per OTL in broad outline (the projected SSC budget is higher thanks to inflation and slightly different design developments). In reality the Booster was built before this largesse started, but it's been moderately delayed here due to the demands of running ISABELLE. Functionally, this is similar to the construction of Fermilab's Main Injector, which was also initially funded as part of a program to reassure other labs that they would not be eaten by the SSC. Ironically, these other projects ended up being highly scientifically productive, while the SSC, of course, was canceled.

    [8]: True IOTL; for example, I was just looking at a paper from RHIC that began, "Numerous experimental results have suggested that a Quark-Gluon Plasma...is created in [gold-gold] collisions..." (Italics mine) And this is in 2016, after it's been colliding heavy ions for 16 years!

    [9]: Obviously it's impossible nowadays to know whether evidence of superpartners or other such exotica lurks in the unexplored region between 14 and 40 TeV, but personally I tend towards a sort of pessimism in my physics, that is, an assumption that the world has a perverse tendency to be boring (after a fashion). Certainly (certain forms of) supersymmetry is (are) not looking so good these days, which is really a pity.

    [10]: This is as OTL. To push forwards nowadays, colliders have to be so huge and expensive that it takes decades to build them, so neutrino and astrophysics experiments, which are not as difficult, are looking more fruitful. Even those are starting to get expensive and difficult (consider LBNF-DUNE, which is projected to cost somewhere from $1-2 billion, and be done sometime in the 2020s), but not to the same degree just yet.

    [11]: Also true IOTL! Not only that, but the very same idea is just now getting to the construction phase as eRHIC; what a wait!

    [12]: This is actual recent science, so I can't tell you what the results are. How exciting!

    ---

    And so we reach the end of our story. Thank you for reading, and I hope you enjoyed it. If you have any questions, please ask; I'm happy to answer.
     
  10. Workable Goblin Chronicler of the Pony Wars

    The timeline's done, but I just came across a set of videos from Fermilab that I thought might interest any readers. They explain a lot of the basic physics involved in particle accelerators and colliders in, I think, a rather clear and concise manner.

    As a taste, check out this video, which discusses an issue that was very important in Dancing with ISABELLE: Energy versus Luminosity:


    You can find a lot more at the Fermilab YouTube channel, if you found that interesting.

    (Disclaimer: As an American particle physicist, I obviously have a lot of connections with Fermilab--among other things, my advisor is also a member of one of their collaborations--but I don't directly work for them or on any project directly involving them)
     