What Next for the Higgs Boson? – Jim Baggott

by the author of Higgs
On 4 July 2012, scientists at CERN announced the discovery of a new elementary particle that they judged to be consistent with the long-sought Higgs boson. The next step is therefore reasonably obvious. Physicists involved in the ATLAS and CMS detector collaborations at CERN’s Large Hadron Collider (LHC) facility will be keen to push ahead and fully characterize the new particle. They will want to know if this is indeed the Higgs boson, the one ingredient missing from the so-called standard model of particle physics.
How will they tell?
Physicists at Fermilab’s Tevatron collider and CERN’s LHC have been searching for the Higgs boson by looking for the tell-tale products of its different predicted decay pathways. The current standard model is used to predict both the rates of production of the Higgs boson in high-energy particle collisions and the rates of its various decay modes. After subtracting the ‘background’ that arises from all the other ways in which the same decay products can be produced, the physicists are left with an excess of events that can be ascribed to Higgs boson decays.
Now that we know the new particle has a mass of between 125 and 126 billion electron volts (equivalent to the mass of about 134 protons), both the calculations and the experiments can be focused tightly on this specific mass value.
So far, excess events have been observed for three important decay pathways. These involve the decay of the Higgs boson to two photons (written H → γγ), decay to two Z bosons (H → ZZ → l+l-l+l-, where l signifies leptons, such as electrons and muons, and their anti-particles) and decay to two W particles (H → W+W- → l+ν l-ν, where ν signifies neutrinos). All these decay pathways involve the production of bosons. This should come as no real surprise, as the Higgs field was originally invented to break the symmetry between the weak and electromagnetic forces, thereby giving mass to the W and Z particles and leaving the photon massless. There is therefore an intimate connection between the Higgs, photons and W and Z particles.
The decay rates for these three pathways are broadly as predicted by the standard model. There is an observed enhancement in the rate of decay to two photons compared to predictions, but this may be the result of statistical fluctuations. Further data on this pathway will determine whether there's a problem (or maybe a clue to some new physics) in this channel.
But the Higgs field is also involved in giving mass to fermions – matter particles, such as electrons and quarks. The Higgs boson is therefore also predicted to decay into fermions, specifically into heavy fermions such as bottom and anti-bottom quarks and tau and anti-tau leptons. Bottom quarks and tau leptons (heavy versions of the electron) are third-generation matter particles with masses of about 4.2 billion electron volts (about four and a half proton masses) and 1.8 billion electron volts (about 1.9 proton masses) respectively.
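For readers who like to check the figures, here is a minimal Python sketch (my own illustration, not anything from the collaborations) that converts the masses quoted above from billions of electron volts into proton-mass equivalents. The only input beyond the quoted numbers is the standard proton rest mass of roughly 0.938 billion electron volts.

```python
# A back-of-envelope check of the proton-mass equivalents quoted above.
# Assumes the standard proton rest mass of ~0.938 GeV; the particle
# masses are taken directly from the article.

PROTON_MASS_GEV = 0.938  # proton rest mass, in billions of electron volts (GeV)

def in_proton_masses(mass_gev: float) -> float:
    """Express a particle mass (in GeV) as a multiple of the proton mass."""
    return mass_gev / PROTON_MASS_GEV

quoted = {
    "Higgs boson": 125.5,   # midpoint of the quoted 125-126 GeV range
    "bottom quark": 4.2,
    "tau lepton": 1.8,
}

for particle, mass_gev in quoted.items():
    print(f"{particle}: {mass_gev} GeV ≈ {in_proton_masses(mass_gev):.1f} proton masses")

# Output (approximately):
#   Higgs boson: 125.5 GeV ≈ 133.8 proton masses   (about 134 protons)
#   bottom quark: 4.2 GeV ≈ 4.5 proton masses      (about four and a half)
#   tau lepton: 1.8 GeV ≈ 1.9 proton masses        (about 1.9)
```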
But these decay pathways are a little more problematic. The backgrounds from other processes are more significant, and so considerably more data are required to discriminate genuine Higgs decay events from the background. The decay to bottom and anti-bottom quarks was studied at the Tevatron before it was shut down in late 2011. But the collider had insufficient collision energy and luminosity (a measure of the number of collisions that the particle beams can produce) to enable independent discovery of the Higgs boson.
ATLAS physicist Jon Butterworth, who writes a blog for the British newspaper The Guardian, recently gave this assessment:
If and when we see the Higgs decaying in these two [fermion] channels at roughly the predicted rates, I will probably start calling this new boson the Higgs rather than a Higgs. It won’t prove it is exactly the Standard Model Higgs boson of course, and looking for subtle differences will be very interesting. But it will be close enough to justify [calling it] the definite article.
When will this happen? This is hard to judge, but perhaps we will have an answer by the end of this year.
