5 myths about the next Large Hadron Collider

source link: https://medium.com/starts-with-a-bang/5-myths-about-the-next-large-hadron-collider-957d29e23c0c

The inside of the LHC, where protons pass each other at 299,792,455 m/s, just 3 m/s shy of the speed of light. Particle accelerators like the LHC consist of sections of accelerating cavities, where electric fields are applied to speed up the particles inside, as well as ring-bending portions, where magnetic fields are applied to direct the fast-moving particles towards either the next accelerating cavity or a collision point. (Credit: Maximilien Brice and Julien Marius Ordan, CERN)

The way to understand the earliest moments of creation is to recreate those conditions and study them. Why would we stop now?

Here in early July of 2022, the world’s most powerful, most successful particle physics collider of all time is smashing its own records. Having just completed a series of upgrades, it’s now colliding protons with protons:

  • at higher energies (13.6 TeV),
  • at faster speeds (299,792,455.15 m/s, or just 2.85 m/s shy of the speed of light),
  • with more tightly-packed beams, which greatly increase the rate of collisions,
  • and with all four detectors working faster and more precisely

than ever before. This new run successfully started on July 5th, kicking off with an announcement of the discovery of three new types of composite particles: one pentaquark species and two new tetraquark species.
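
To put those numbers in perspective, here’s a minimal back-of-the-envelope sketch (in Python; the only inputs are the beam speed quoted above and the proton’s rest energy) showing how a proton moving just 2.85 m/s slower than light carries roughly 6.8 TeV of energy, so that two such protons colliding head-on deliver the quoted 13.6 TeV:

```python
import math

C = 299_792_458.0    # speed of light in vacuum, m/s (exact)
V = 299_792_455.15   # proton speed in the upgraded LHC, m/s (quoted above)
PROTON_REST_ENERGY_GEV = 0.938272  # proton rest energy, GeV

# Special relativity: Lorentz factor gamma = 1 / sqrt(1 - v^2/c^2)
gamma = 1.0 / math.sqrt(1.0 - (V / C) ** 2)

# Total energy per proton: E = gamma * (rest energy)
energy_tev = gamma * PROTON_REST_ENERGY_GEV / 1000.0

print(f"Lorentz factor:    {gamma:,.0f}")              # ~7,250
print(f"Energy per proton: {energy_tev:.2f} TeV")      # ~6.80 TeV
print(f"Head-on collision: {2 * energy_tev:.1f} TeV")  # ~13.6 TeV
```

The agreement is no accident: the quoted speed and the quoted collision energy are just two expressions of the same machine parameters.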

And yet, you’d never know it from reading the news. Instead, the topics that people are talking about are dominated by two baseless assertions:

  1. that these experiments somehow are a threat to Earth and/or the Universe (no, they’re not),
  2. and that, due to its failure to find any new fundamental particles not predicted by the Standard Model, the Large Hadron Collider (i.e., the LHC) should be humanity’s last cutting-edge particle accelerator.

While it’s very easy for physicists to demonstrate that the first scenario has no basis in reality, the truth is struggling to penetrate the public consciousness in the second case, likely due to five widespread myths that have been elevated and repeated at length. Here are the refutations — and the truthful reality — underlying each of them.

The Standard Model particles and their (hypothetical) supersymmetric counterparts. This spectrum of particles is an inevitable consequence of unifying the four fundamental forces in the context of String Theory, but if String Theory and supersymmetry are not relevant for our Universe, this picture is a mathematical curiosity only. Despite the copious theoretical work on supersymmetry, there remains no evidence for any of the supersymmetric partner particles. (Credit: Claire David)

Myth #1: The motivation for particle colliders is rooted in ideas like string theory, supersymmetry, extra dimensions, grand unified theories, and particle dark matter.

This is a tale that almost everyone has heard by now: that theoretical physicists have led themselves down a blind alley, have overcommitted to it despite a lack of supporting evidence, and are simply moving the goalposts towards higher and higher energies to keep these unsupported ideas from being ruled out. There’s a kernel of truth to this, in the sense that, as a theorist, you never say, “We didn’t find evidence where we were expecting it, and therefore the idea is wrong.”

Instead, you say, “We didn’t find evidence where we were hoping it would be, and therefore, we can place constraints on the relevance of this idea for our physical Universe. The best constraints we can infer apply to this particular regime, but not these other ones.” Yes, it’s true that you can find overly optimistic theorists advocating for their favorite ideas — string theory, supersymmetry, grand unification, and extra dimensions among them — but these are not the leading motivations for building better and better particle colliders.

The particles and antiparticles of the Standard Model obey all sorts of conservation laws, but also display fundamental differences between fermionic particles and antiparticles and bosonic ones. While there’s only one “copy” of the bosonic contents of the Standard Model, there are three generations of Standard Model fermions. Nobody knows why. (Credit: E. Siegel/Beyond the Galaxy)

In fact, it’s easy to argue the exact opposite: that we have a very successful theory, the Standard Model, that works extraordinarily well for describing the Universe up to LHC-era energies. Simultaneously, we know new fundamental physics must exist to explain the Universe we observe, as without some sort of new physics, we can’t explain:

  • what dark matter is,
  • what dark energy is and why it exists at all,
  • why there’s more matter than antimatter in the Universe,
  • and how particles got to have the fundamental properties that they actually possess.

There are only three ways to proceed. The first is to derive potential extensions to the Standard Model, calculate the consequences that would arise, and look for them in experiment. That’s precisely what theorists and phenomenologists do: a hitherto remarkably unsuccessful approach. (If we could successfully guess what physics was out there beyond the Standard Model, the story would end very differently!) The second is to build giant, isolated detectors and hope that new physics literally falls from the skies; cosmic ray observatories and underground neutrino experiments do precisely this. The third is to directly recreate the highest-energy collisions you can, and to detect the results of those collisions to the greatest precision possible. This third way, irrespective of the expectations or hopes of theorists, remains arguably the path of greatest scientific value in the endeavor of exploring the unknown.

The Collider Detector at Fermilab (CDF) just released the best-ever measurement for the mass of the W-boson. For the first time, from an experimental physics perspective, we may have discovered evidence that takes us beyond the Standard Model. When the LHC turned on, the main accelerator at Fermilab quickly shut down, as it was no longer competitive at the energy frontier. (Credit: CDF collaboration/Fermilab/Department of Energy)

Myth #2: It would be better to invest in thousands of small experiments rather than one large, brute-force one akin to the LHC.

If you’ve ever made this statement, congratulations! You yourself are now meme-worthy, as you’ve managed to announce to the world that you don’t understand experimental particle physics without saying “I don’t understand experimental particle physics.” In any experimental or observational science, you typically have two major routes you can pursue.

  1. You can design a tailor-made experiment or observatory to look explicitly for one particular signal under one specific condition or set of conditions.
  2. Or you can build a large, general, all-purpose (or multi-purpose) apparatus, pooling and investing a large set of resources together that serves a broad swath of your community, achieving a wide range of science goals.

In almost every scientific discipline there is, administrators advocate for a balanced portfolio between these two types. The large, brute-force-style, flagship-class missions are the ones that drive science forward in great, ambitious leaps, while the small, finesse-style, lower-budget missions are the ones that fill in the gaps that cannot be sufficiently probed by the prevailing large-scale experiments and observatories of the time.

The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created, and also describe how they will curve spacetime in General Relativity. The properties of particles, fields, and spacetime are all required to describe the Universe we inhabit, but the actual values of these masses are not determined by the Standard Model itself; they must be measured to be revealed. (Credit: Universe-review)

In the realm of experimental nuclear and particle physics, the “things” we need if we want to better understand the Universe we inhabit are greater numbers and rates of events that probe the process we’re looking to investigate.

  • Want to investigate neutrino oscillations? Collect data on neutrinos, under a variety of conditions, that show them oscillating from one species into another.
  • Want to investigate heavy, unstable particles? Create them in large numbers, and measure their decay pathways to the greatest possible precision.
  • Want to investigate the properties of antimatter? Create and cool it, and then probe it the same way you’d probe normal matter, to the limits of your capabilities.

While these low-cost, finesse-style experiments are well-positioned to investigate certain aspects of these (and other) questions in fundamental particle physics, no pathway comes even close to being as successful at probing the properties of fundamental particles as a dedicated collider with dedicated detectors surrounding its collision points. The reason the experimental particle physics community has consolidated around a “put all your eggs in one basket” collider for so long is that the science that results from pooling our resources into one gargantuan effort far outstrips any series of efforts that arises from divided resources.

The collider physics community is of one voice when it comes to calls to build one superior machine to advance past our current generation’s technological limits. The alternative is to give up.

Using a variety of methods, scientists can now extrapolate back the atmospheric concentration of CO2 for hundreds of thousands of years. The current levels are unprecedented in Earth’s recent history. Although this is a very real problem that humanity must reckon with, decreasing funding to fundamental science simply hampers our species in a different manner, without necessarily addressing the problem at hand at all. (Credit: NASA/NOAA)

Myth #3: Instead of investing in a next-generation particle collider, we would be better off investing in (solving X problem), as that’s a better use of our resources.

This line of thought rears its ugly head anytime an opponent of a particular project wants to see that project killed, simply asserting, “project A is more important than this particular project under consideration, so therefore we should kill this project and instead devote more resources into project A.” Time and time again, history has shown us that this is a terrible way to fund project A, but it’s a great way to torpedo public support for whatever particular project is being considered.

This argument was raised many times over during the late 1960s, when terrestrial concerns such as civil rights, war, world hunger, and poverty sought to gain funding at the expense of the Apollo project.

It was raised again in the early 1990s, when concerns over the domestic economy led to the cancellation of the Superconducting Super Collider: a collider that would have been many times as powerful as the current LHC.

It was raised again in the early 2000s, when then-NASA administrator Sean O’Keefe wanted to cancel the Hubble Space Telescope and numerous other science-based NASA missions to focus on a crewed return to the Moon: a disastrous plan that almost ended our current golden era of discovery in space.

And it’s being raised now, again, by voices that argue that our climate crisis, the energy crisis, the water crisis, and many other environmental and humanitarian concerns all outstrip the need for a new, expensive particle collider.

The first view with human eyes of the Earth rising over the limb of the Moon. The discovery of the Earth from space, with human eyes, remains one of the most iconic achievements in our species’ history. Apollo 8, which occurred during December of 1968, was one of the essential precursor missions to a successful Moon landing, and one of the most unifying achievements for all of humanity. Despite being one of the most expensive government-funded endeavors in 20th century history, the long-term economic benefits reaped from the Apollo program far surpass its initial cost. (Credit: NASA/Apollo 8)

Here’s the thing that I wish people would understand so that they wouldn’t be led astray by these baseless arguments: the amount of funding we have to give out is not fixed. If you “save money” by not spending it on A, that doesn’t free up that money to go towards B; it simply doesn’t get spent. You might think, “hey, great, I’m a taxpayer, and that means more money in my pocket,” but that’s not true. All that happens — particularly in a world where governments don’t balance their budgets anyway — is that your project doesn’t get funded, resources don’t get invested into it, and society doesn’t reap the rewards.

And yes, all of these projects come about with great rewards, both scientifically and technologically. The list of NASA spinoff technologies shows that the Apollo Program actually created nearly $2 in new economic growth/activity for every $1 we invested in it. From the LHC at CERN, the advances in computing and data handling, alone, have already paid substantial dividends in industries around the world.

In general, taking on a hard problem and investing the resources necessary to do it justice has always required national and international investment, while the beneficiaries are the whole of humanity. Yes, there are plenty of other problems to be solved; abandoning fundamental physics won’t help anyone solve them.

A candidate Higgs event in the ATLAS detector at the Large Hadron Collider at CERN. Note how, even with the clear signatures and transverse tracks, there is a shower of other particles; this is because protons are composite particles, and because dozens of proton-proton collisions occur with every bunch crossing. Examining how the Higgs decays to very high precision is one of the key goals of the HL-LHC. (Credit: CERN/ATLAS Collaboration)

Myth #4: The next-generation successor to the LHC will cost in the ballpark of $100 billion, which is way too expensive to justify.

Yes, this is really a claim that people are making. But it’s not true, and that should matter.

First off, the only way to get a figure in the ballpark of $100 billion is to assume that the next machine will:

  • occupy an enormous, circular underground tunnel somewhere around ~100 kilometers in circumference,
  • require an all-new set of high-field electromagnets,
  • collide protons with protons at a variety of collision points,
  • and be funded (including construction, maintenance, the energy costs of operation, and the employment of its personnel) for a total timespan of somewhere between 40 and 60 years.

There are two problems with this argument. The first is that, when we talk about “how much a machine costs,” we typically mean the cost of construction, not the cost over its total lifetime. There are many endeavors, scientific and non-scientific alike, in which we’re happy to invest a billion or two dollars a year because of the cumulative benefit to humanity that results. Experimental particle physics is simply one of them; ~$2 billion a year is a tiny price to pay to continue to probe the frontiers of nature as we never have before.
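
To see how much the framing changes between lifetime and annual figures, here’s a minimal sketch (in Python, using only the article’s own rough numbers as assumptions) that spreads the claimed, and inflated, $100 billion lifetime figure over the 40-to-60-year span described above:

```python
# Hypothetical, article-level figures: the inflated $100B lifetime claim
# spread over the quoted 40-to-60-year operating span.
LIFETIME_COST_USD = 100e9

for years in (40, 60):
    annual_billions = LIFETIME_COST_USD / years / 1e9
    print(f"Over {years} years: ~${annual_billions:.1f} billion per year")
# Over 40 years: ~$2.5 billion per year
# Over 60 years: ~$1.7 billion per year
```

Even taking the inflated figure at face value, the annualized cost lands right around the “~$2 billion a year” that the text above calls a tiny price for probing the frontiers of nature.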

The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but only if the LHC makes a beyond-the-Standard-Model discovery. Direct confirmation of what new particles could be causing CDF’s observed discrepancy in the W-boson’s mass might be a task best suited to a future circular collider, which can reach higher energies than a linear collider ever could. (Credit: Rey Hori/KEK)

But the second problem is arguably even bigger: this machine, which might cost $100 billion at most, is not the next machine that collider physicists are asking for to further their community’s science. The machine being considered — known in the community as the Future Circular Collider: proton-proton program — is a proposal for the particle accelerator after the next one.

The reason is simple: you build proton-proton colliders to probe nature at higher energies than you ever have before. It’s the best way to peer into the unknown. But if you want to study the nature you’ve already uncovered to high precision, you want to build an electron-positron collider; that way, you can produce enormous numbers of the particles you want to study under a more pristine set of conditions.

The next collider that the particle physics community is seeking to build is either:

  • a large, circular electron-positron collider, one whose tunnel could be the basis for a next-generation proton-proton collider,
  • or a long, linear electron-positron collider, whose sole use would be for this one experiment.

Either way, the next step is to produce large numbers of W and Z bosons, Higgs bosons, and top quarks in an attempt to study their properties more precisely. The “$100 billion” number is about five times too high, meant to scare budget hawks away from a scientifically viable, valid, and even sensible path.

The final results from many different particle accelerator experiments have definitively shown that the Z-boson decays to charged leptons about 10% of the time, to neutral leptons (neutrinos) about 20% of the time, and to hadrons (quark-containing particles) about 70% of the time. This is consistent with 3 generations of particles and no other number. (Credit: CERN/LEP collaboration)

Myth #5: If the LHC doesn’t turn up anything new and/or unexpected, experimental particle physics using colliders becomes pointless, anyway.

This is perhaps the most ubiquitous and untrue myth out there: that if you can’t find a new particle, force, or interaction that isn’t part of the Standard Model, there’s nothing worth learning from studying elementary particles further. Only the most incurious minds would draw such a conclusion, especially given facts like the following.

There is a plethora of puzzles in particle physics that we can’t explain, and colliding as many particles as possible at the highest energies possible is a primary avenue for revealing properties of nature that may be immune to our other lines of inquiry. In fact, even as the LHC continues its “mundane” operations, we’re finding properties of nature we didn’t expect: challenges to lepton flavor universality and bound states of quarks and/or gluons that we couldn’t have predicted.

Based on the masses of the top quark and the Higgs boson, we could either live in a region where the quantum vacuum is stable (true vacuum), metastable (false vacuum), or unstable (where it cannot stably remain). The evidence suggests, but does not prove, that we are in a region of false vacuum. (Credit: T. Markkanen, A. Rajantie and S. Stopyra, Front. Astron. Space. Sci, 2018)

The simple truth of the matter is this: as the LHC begins its third major run of data-taking here in mid-2022, the various collaborations are searching for a variety of potential signals. As we create more and more Higgs bosons, we’ll see if they decay precisely as the Standard Model predicts, or if cracks arise. As we create large numbers of composite particles containing charm and/or bottom quarks, we can see how their properties inform our conception of the Universe. As new pentaquarks and tetraquarks are discovered, we can test all sorts of predictions, and we can also look for predicted states that have never been observed before, like glueballs and other exotic forms of (purely Standard Model-based) matter. The LHC has only taken 2% of all the data it will ever take; the other 98% is now finally on its way.
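
As a concrete example of the kind of precision test at stake here, consider the Z-boson decay fractions in the figure caption above. A minimal sketch (in Python, using rounded LEP-era values from the Particle Data Group) shows how the Z’s measured “invisible” width counts the number of light neutrino species:

```python
# Rounded LEP-era values (Particle Data Group):
GAMMA_INVISIBLE_MEV = 499.0     # measured invisible width of the Z boson
GAMMA_PER_NEUTRINO_MEV = 167.2  # Standard Model width per neutrino species

n_species = GAMMA_INVISIBLE_MEV / GAMMA_PER_NEUTRINO_MEV
print(f"Inferred number of light neutrino species: {n_species:.2f}")
# ~2.98: consistent with exactly 3 generations, and with no other number.
```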

But the biggest motivator is this: we can look at the Universe as we’ve never looked at it before, and that’s almost always where new and unexpected discoveries come from. We have the technology to push our scientific frontiers, and although the trickle-down benefits to society (what business-types call the return on investment) are always substantial, that’s not why we do it. We do it because there’s a whole Universe out there to discover, and no one ever found anything by giving up before even sufficiently trying. Don’t let the fear of not finding what we’ve hoped for prevent us from looking at all.

