Hunting for leptoquarks at CERN's Large Hadron Collider

Hi everyone, and happy new year 2022! While many are still enjoying their winter break, school has already restarted in France. Accordingly, I am restarting my blogging about particle physics and cosmology on STEMsocial and Hive.

For this first blog of the year (on the day of my birthday, by the way), I discuss one of my own research topics. This work gave rise to a first publication in 2020, a second one currently under peer review, and a contribution to conference proceedings.

The subject of the day lies in the context of theoretical computations relevant to searches for new phenomena at CERN's Large Hadron Collider. With my collaborators, I performed new calculations yielding the best predictions to date for the production of hypothetical particles called leptoquarks. This last sentence naturally sets the scene for this blog, as I (purposely) packed it with several concepts that need to be explained and clarified.

Accordingly, this post will first discuss leptoquarks: what they are, where they come from and how they are searched for. Next, it will focus on how predictions for the Large Hadron Collider are made and how precision can be achieved. Finally, I will describe some of the findings of my publications. In particular, I will explain why generic predictions cannot be made (in contrast to what was believed until now), and why even the best predictions can carry large uncertainties.

I will try to keep the content of this blog as simple as possible. I hence hope every reader will be able to take something home from it. Please do not hesitate to let me know whether it worked.


[Credits: CERN]


Leptoquarks in a nutshell


A few weeks ago, I shared a description of the microscopic world from the point of view of a particle physicist (see here for details). In this description, I introduced the two classes of elementary particles that form the matter sector of the Standard Model, i.e. the quarks and leptons. More precisely, there are six quarks (up, down, strange, charm, bottom, top), three charged leptons (the electron, muon and tau) and three neutral leptons (the electron neutrino, the muon neutrino and the tau neutrino) in the Standard Model.

The main difference between these two types of particles is that quarks are sensitive to the strong interaction (one of the three fundamental forces included in the Standard Model), whereas leptons are not. Equivalently, we can say that in the Standard Model quarks interact among themselves through electromagnetic, weak and strong interactions. On the other hand, leptons interact among themselves via electromagnetic and weak interactions only. There is however no direct interaction simultaneously involving one quark and one lepton. The two classes of particles are somehow disconnected.

It is now time to move on with the topic of the day: leptoquarks. First of all, it is important to emphasise that leptoquarks are hypothetical particles. This means that they are not part of the Standard Model and that they have not been observed in data (at least up to now). There are however many good motivations behind them, which I will detail below.

Before doing so, let's have a look at the word leptoquark itself. We can find both lepton and quark inside it, and there is a very good reason for this: a leptoquark is a hypothetical particle that interacts simultaneously with one lepton and one quark. In addition, leptoquarks interact among themselves through the electromagnetic, weak and strong interactions (as quarks do).


[Credits: mohamed_hassan (Pixabay)]

But why should we care about leptoquarks in the first place? The main reasons have been detailed in this blog. The Standard Model of particle physics works very well, but there are good motivations to consider it as the tip of an iceberg whose hidden part needs to be probed. Among the plethora of possible options for that hidden part of the iceberg, many predict the existence of leptoquarks.

For instance, when we try to unify all fundamental interactions and building blocks of matter (this is called Grand Unification), we automatically encounter leptoquarks along the way. Similarly, they naturally arise in certain technicolour or composite models. In those models, we add an extra strong force and new building blocks of matter, and the Higgs boson has a composite nature. The list does not end there, and we can cite many other classes of models in which leptoquarks arise. These include specific supersymmetric models or even low-energy realisations of string theory models.

As can be guessed from the previous paragraph, considering leptoquarks as serious candidates for new phenomena is well motivated. Accordingly, they are currently actively searched for at the Large Hadron Collider at CERN (i.e. the LHC).


Leptoquark signals at the Large Hadron Collider


In high-energy proton-proton collisions such as those ongoing at the LHC, the most efficient mechanism to produce leptoquarks is pair production. It is indeed usually much easier to produce a pair of leptoquarks than a single leptoquark. The reason is that in the former case the strong force is involved, which naturally leads to copious production. In the latter case, we instead need to rely on the much weaker leptoquark-quark-lepton interaction. Single leptoquark production at colliders is thus associated with a rarer rate.
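
Very schematically (this is my own back-of-the-envelope counting, ignoring proton structure and phase space), writing $\alpha_s$ for the strength of the strong force and $\lambda$ for the strength of the leptoquark-quark-lepton interaction, the two rates scale as:

```latex
\sigma_{\text{pair}} \;\propto\; \alpha_s^2 \,,
\qquad\qquad
\sigma_{\text{single}} \;\propto\; \alpha_s\, \lambda^2 \,.
```

For a small coupling $\lambda$, pair production therefore dominates.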

Leptoquarks are unstable and decay almost instantaneously once they are produced. If this were not the case, cosmology would be in big trouble. But what would a leptoquark decay into? The answer was already given earlier in this blog: a leptoquark by definition couples to a quark and a lepton. Consequently, it decays into a lepton-quark pair. Two leptoquarks freshly produced at the LHC would then decay into two leptons and two quarks (one lepton and one quark per leptoquark). This dictates the typical LHC signatures to consider experimentally.


[Credits: ATLAS @ CERN]

As we have six quarks and six leptons in the Standard Model, there are many possible final states in which to look for pair-produced leptoquarks. Correspondingly, the ATLAS and CMS experiments look for leptoquark pair production and decays in a variety of channels. For instance, we have searches for excesses of events (relative to the Standard Model expectation) featuring two top quarks and two tau leptons, two lighter quarks and two electrons, and so on.

Unfortunately, there is no sign of a leptoquark in data so far… Constraints are thus put on leptoquark models. One exception to those null results could (I insist on the conditional tense) be the so-called flavour anomalies, which I already mentioned last month but have not blogged about yet (this will come soon; I promised it to @agmoore).

Those anomalies are connected to long-standing tensions in data that have grown more and more solid with time, and that could one day become proof of physics beyond the Standard Model. We are however not there yet, so for now these anomalies are only a potential hint of physics beyond the Standard Model. Please do not hesitate to have a look at this blog to learn what it takes to claim a discovery in particle physics.

If we take those anomalies as something solid, leptoquarks can be used to explain them. For that reason, leptoquarks have become increasingly attractive candidates for physics beyond the Standard Model over the last couple of years.

I mentioned above that the most efficient process to produce a pair of leptoquarks relies on the strong interaction. This naturally leads us to the next part of this blog, in which I will describe how the associated theory calculations are performed. I will try to leave out overly technical details, although I need to be a bit technical to convey precisely what is new in my research work. Please do not hesitate to come back to me for clarifications if needed.


Predictions for the Large Hadron Collider


In order to understand the achievements of my research work, it is important to know how a production rate is calculated in particle physics. Equivalently, I will try to explain how we can estimate the occurrence of collisions at the Large Hadron Collider in which two leptoquarks are produced and subsequently decay.


[Credits: CERN]

The first important concept to introduce is that of parton distribution functions. At the Large Hadron Collider, protons are collided. However, in any specific high-energy collision, the objects that annihilate and give rise to the final state considered (a pair of leptoquarks here) are not the protons themselves, but their constituents (generically called partons).

For instance, in order to produce a pair of leptoquarks, we may consider the annihilation of one quark (a constituent of the first colliding proton) with one antiquark (a constituent of the second colliding proton). Parton distribution functions then provide the means of connecting the colliding protons to their colliding constituents.

The second ingredient in our computation is what we call the hard-scattering rate. This is the rate at which a given partonic initial state (a quark-antiquark pair, for instance) transforms into a given final state (a pair of leptoquarks in our example). By 'partonic', we mean that we work at the level of the constituents of the colliding protons, and not of the protons themselves. As already said, the connection between the two is made by the parton distribution functions.
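
Schematically, the two ingredients are glued together by the standard factorisation formula of collider physics (this is textbook material, not something specific to leptoquarks). Writing $f_a$ and $f_b$ for the parton distribution functions and $\hat\sigma$ for the hard-scattering rate, the leptoquark pair-production rate reads:

```latex
\sigma\big(pp \to \mathrm{LQ}\,\overline{\mathrm{LQ}}\big)
  \;=\; \sum_{a,b}\, \int_0^1 \mathrm{d}x_1\, \mathrm{d}x_2\;
  f_a(x_1,\mu_F)\; f_b(x_2,\mu_F)\;
  \hat\sigma_{ab\to \mathrm{LQ}\,\overline{\mathrm{LQ}}}\big(x_1 x_2 s;\, \mu_F,\mu_R\big)\,.
```

The sum runs over all pairs of partons $a$ and $b$, the variables $x_1$ and $x_2$ are the fractions of the proton momenta carried by the two partons, $s$ is the squared collision energy, and $\mu_F$ and $\mu_R$ are auxiliary scales inherent to the calculation.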

The hard-scattering rate itself can be computed by making use of the master equation of the theory. In the Standard Model, we thus use the master equation of the Standard Model. In the case of physics beyond the Standard Model, another master equation has to be used (for instance, one including leptoquarks and their properties).
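
To make the above formula concrete, here is a deliberately oversimplified numerical sketch in Python. Every ingredient in it (the parton density, the hard-scattering rate, all masses and numbers) is invented purely for illustration; real predictions rely on fitted parton distribution function sets and on hard-scattering rates derived from the master equation.

```python
# Toy sketch (NOT a real physics code): assembling a hadronic production
# rate from parton distribution functions and a hard-scattering rate.
# All functional forms and numbers below are made up for illustration.
import numpy as np

def toy_pdf(x):
    """Invented parton density, large at small momentum fraction x."""
    return (1.0 - x) ** 3 / x

def toy_hard_rate(s_hat, m_lq=1500.0):
    """Invented partonic rate: non-zero only above the energy threshold
    needed to produce two leptoquarks of mass m_lq (in GeV)."""
    return 1e-3 / s_hat if s_hat > (2.0 * m_lq) ** 2 else 0.0

def hadronic_rate(sqrt_s=13000.0, n=400):
    """Convolve two toy parton densities with the toy hard-scattering rate
    by brute-force integration over the momentum fractions x1 and x2."""
    s = sqrt_s ** 2
    xs = np.linspace(1e-4, 1.0, n, endpoint=False)
    dx = xs[1] - xs[0]
    total = 0.0
    for x1 in xs:
        for x2 in xs:
            total += toy_pdf(x1) * toy_pdf(x2) * toy_hard_rate(x1 * x2 * s) * dx * dx
    return total

print(f"Toy leading-order rate: {hadronic_rate():.3e} (arbitrary units)")
```

The structure is the whole point here: a sum over the two momentum fractions, with each partonic configuration weighted by how likely it is to occur inside the protons.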

With these two ingredients, we are in principle capable of calculating the desired quantity. However, there is a catch. The microscopic world is quantum, and this quantum nature implies that any prediction should include quantum corrections. This was already discussed in this blog in the context of the hierarchy problem of the Standard Model.


[Credits: IRFU]

When dealing with predictions of a production rate at the Large Hadron Collider, we can decide to compute the rate from the above two ingredients without considering quantum corrections at all. We may get some numbers for the predictions, but the missing bits lead to a computation possibly plagued with large uncertainties. In other words, we may get the right order of magnitude for the result, but not necessarily anything more precise than this.

In order to reduce those uncertainties, we need to include quantum corrections in the predictions. These corrections can be organised into a next-to-leading-order piece, a next-to-next-to-leading-order piece, and so on (the leading-order contribution being the one obtained without any quantum correction). Let's skip the details of how this organisation works (it is driven by the perturbative nature of the strong force). Instead, let's keep in mind that each time we add the next order to the calculation, the results become more precise, and therefore more reliable.
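
Schematically, this organisation amounts to expanding the hard-scattering rate in powers of the strong coupling $\alpha_s$ (a number of roughly 0.1 at LHC energies):

```latex
\hat\sigma \;=\; \hat\sigma_{\mathrm{LO}}\,
  \left( 1 \;+\; \alpha_s\, c_1 \;+\; \alpha_s^2\, c_2 \;+\; \dots \right),
```

where the coefficient $c_1$ encodes the next-to-leading-order corrections, $c_2$ the next-to-next-to-leading-order ones, and so on. Each extra order is suppressed by one more power of $\alpha_s$, although the coefficients themselves can be sizeable (corrections of tens of percent are not uncommon).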

The next-to-leading-order component of the calculation is in principle not too difficult to evaluate. When I was a PhD student, it took me a whole year to compute it for a given process. However, with the advent of more efficient computing techniques, this can now be done in five minutes on any laptop. I may explain how we managed to do this in a future post if there is an interest in it.

For any contribution beyond that order, the story is nevertheless very different. Such contributions are in fact known only in specific cases, and solely for a handful of Standard Model processes. As far as leptoquark pair production is concerned, only the next-to-leading-order contributions are known, and this has been the case since the end of the 1990s.


My research work: precision predictions for leptoquark pair production


In my recent scientific publications (here and there), we computed the most precise predictions for leptoquark pair production to date. There were several improvements relative to what was done in the 1990s.

First, we improved the computation of the leading term of the production rate (i.e. predictions without any quantum corrections) by including not only contributions arising from the strong interaction, but also those from the specific leptoquark-quark-lepton interaction mentioned at the beginning of this blog (which is dictated neither by the strong nor by the weak force). The latter had always been assumed to be negligible for leptoquark pair production. However, in light of the recent flavour anomalies, this is not necessarily the case anymore.

Second, we computed the next-to-leading-order corrections to the full leading-order production rate. In this way, we not only reproduced the computations from the 1990s, but additionally included corrections to the contributions involving the leptoquark-quark-lepton interaction.

Our achievements do not end there, however. Third, we managed to consistently include pieces of every single higher-order contribution to the rate, making the calculation as precise as today's knowledge allows.

An example of our results is shown in the figure below, for a given leptoquark scenario that can explain the flavour anomalies (which I still have not discussed; I know).


[Credits: arXiv]

In blue, we have the predictions obtained from the calculations of the 1990s. They consistently include leading-order and next-to-leading-order contributions, but ignore any effect associated with the leptoquark-quark-lepton coupling. The figure exhibits three sets of predictions, the quantity shown on the y-axis being the leptoquark pair-production rate at the LHC in appropriate units.

Each of the three 'blue' predictions employs a different set of parton distribution functions. All three sets are very modern and widely used, and none is better than another. The differences between the predictions can then be taken as an extra uncertainty on the results.

In red, we have our new results. We can see that the predicted rates are almost 50% higher than what could be expected from the older calculations. This shows that the previously neglected pieces of the calculation were not so negligible after all.

Moreover, the level of uncertainty (i.e. the size of the error bars) is strongly affected by the choice of parton distribution functions. The gain in precision due to the newly added quantum corrections (compare the solid blue and solid red error bars) is here tamed by a loss in precision originating from the parton distribution functions (compare the pale blue and pale red error bars). This was unexpected.

For other leptoquark scenarios, we even found that predictions relying on different parton distribution functions sometimes did not agree with each other at all. The good news is that more LHC data is expected to cure this issue, as is already visible in the rightmost prediction. That set of parton distribution functions uses a larger amount of the available LHC data than the two others, which still need to be updated. In that case, the improvement cannot be missed.


Summary, take-home message and TL;DR version of this blog


It is now time to wrap up this blog. Today's topic concerned one of my current research projects, dedicated to the computation of the best possible predictions for leptoquark pair production at the Large Hadron Collider.

Leptoquarks are hypothetical particles that are sensitive to all fundamental forces. Moreover, they feature a special interaction simultaneously involving one quark and one lepton of the Standard Model. Leptoquarks have received increasing attention in recent years, as they could provide an explanation for several anomalies seen in data.

In my work, we improved the predictions relative to the older calculations from the 1990s, and showed that the older results could be misleading for scenarios relevant in light of current data. We demonstrated that the production rates can differ substantially (by several tens of percent) from what was expected, and that the associated uncertainties can be larger as well.

Care must thus be taken when drawing conclusions from data. Today, both the ATLAS and CMS experiments already use our results in the context of their searches for leptoquarks. Precision is always welcome in a search, as it affects the associated limits (when no signal is found) or a discovery (when a signal is found). In all cases, it offers more robust conclusions.

I hope you all enjoyed this post, in which I have tried to use vocabulary that is as simple as possible. I hope the job has been done well. Please let me know, and ask for clarifications if needed.

See you next Monday!



Comments

I have some knowledge of Physics from college courses but my degree is not in Physics. I found your article to be understandable and interesting. When searching for something new, it helps to know where to look and how to know that you have actually found it. We build on the work of past explorers who collected data informed by mathematical models as we comb the search space for signs of what we seek to find.

I wish you further success in your hunt for leptoquarks!

Thanks a lot for your feedback. When writing such posts, I am always unsure of how much readers will get out of them. Every single week it is the same! I usually like doing this kind of thing with a live audience, i.e. in a situation where it is easy to get immediate feedback and adapt the speech accordingly. Writing on Hive is thus a different story, but one that I enjoy too.

Let's now go back to the topic. What is really exciting about new phenomena in high-energy physics is precisely that we do not know where to look. It is really a matter of exploring all options (for that reason, my research is actually quite diverse, allowing me to learn many new things every day). Somehow, the status of the field is not very different from space exploration. Our gut feelings tell us something must be there, but we have no idea where. Therefore, it is important to be pragmatic and consider all the options we can think of.

PS: Thanks for sharing my post on Twitter. I knew this was possible thanks to @gentleshaid. I however belong to the rare class of human beings without any Twitter account :)

Twitter is really a nice place to get off-chain exposure. I will make it a point of duty to share your future posts on my Twitter feed henceforth.

Yeah, I know. But it is really a matter of my time being so limited... that I don't want to add another layer to the agenda ;)

I understand perfectly. What you share here is a gem of work that deserves more eyes.

I am always looking forward to getting more engagement (even just on-chain). For now it looks quite random (sometimes it works, sometimes it does not).

Happy Birthday!!! Happy New Year's. One thing I come away with is that in theoretical physics, anomalies point to the possibility, the suggestion of something beyond the standard model.

Several points stay with me: leptoquarks are hypothetical, and yet important to understanding what may exist beyond the standard model. They must decay almost instantly, as soon as they are created, "or cosmology would be in big trouble". And, it is easier to create pairs of leptoquarks than single leptoquarks, because the strong force is used to create pairs and that naturally creates more than the weak force would.

Finally, everything is dependent on data. The more accurate the data, the more reliable the predictions. Data today is much more predictable than it was in the 90s.

I think, for a single class in physics I have learned a lot, more than I mention here. Slowly I am filling in critical gaps. A great deal may be beyond me, but overall I think I'm getting 'it'--the thrust of your work, the motivation behind and direction of theoretical physics.

It's really amazing that we have you here on Hive. You open the universe to us.

I hope you have had a wonderful birthday, @lemouth.

Thanks a lot for your wishes! I had a very nice (offline) evening yesterday with the family, to celebrate a new multiple of ten ;)

What you summarised in your message is correct. Anomalies, as well as conceptual issues and limitations of the Standard Model, indicate that something else should be around, not too far from the current frontier of knowledge. Concerning leptoquarks, let's say that other options for physics beyond the Standard Model are quite appealing too. For now, it is impossible to say that one is better than another (preferences are only a matter of taste). My approach here is to be as open as possible and study all possibilities (which also guarantees that I learn as many new things as possible).

Regarding precision, one thing inherent to high-energy physics is that we have error bars both on measurements (as for any measurement, in fact) and on theory calculations. Therefore, it is important to make progress on both sides, so that both error bars can be reduced as much as possible. Having a super-precise measurement without an equally precise theoretical prediction is useless, as we would not be able to tell whether there is agreement or disagreement. And vice versa: having precise predictions but only approximate measurements won't help.

Cheers, and have a great day!

From my understanding, it seems what you guys do is to first hypothesise before looking for the particles. For how long do you continue looking and what if the particle is never realized? Do you go back to your predictions and modify them to start all over again?

BTW, a belated happy birthday to you. Wishing you more fruitful years ahead.

Making hypotheses is how we build a dictionary of possible signatures of new phenomena. Next, for each signature, we check whether data is consistent with the Standard Model expectation (i.e. the background). The game is of course to make sure our catalogue of signatures is complete. In other words, are we looking at all possible options?

I will probably soon write a post on leptoquarks and dark matter. The main finding of that paper is that some experimental searches were not performed, even though they were the best ones to constrain the model considered. We have a unique machine, and it is important to exploit it to the fullest. This is where theory works like that one are important: they make sure there is no loophole in the search programme.

Now in terms of the hypotheses that are studied, there is always an unknown about the masses of the new particles and how strongly they are coupled. This will impact the rates of anomalies that could be seen in data. From a null result (no observation of anything anomalous), we constrain those rates and thus restrict the possibilities for the properties of the new states.

I have however only talked about restrictions. It is actually very hard (probably impossible) to fully exclude a hypothesis. For instance, we can constrain the maximal strength with which a hypothetical particle could couple. Anything below that bound remains viable, as the particle would be so feebly coupled that it would not produce enough signal to be visible. There, there is nothing we can really do, except try to design more clever analyses dedicated to rarer signals.

I hope this clarifies things. Otherwise, feel free to come back to me ;)

PS: thanks a lot! :)
