| Composition | Elementary particle |
| --- | --- |
| Statistics | Suspected bosonic |
| Status | Refuted; absent in August 2016 data |
| Symbol | Ϝ, Ϝ(750), Φ, X, ηzy |
| Discovered | Possible resonance of mass ≈ 750 GeV/c² decaying into two photons, reported by CERN experiments in 2015 (sufficient statistical significance was never reached) |
| Mass | ≈ 750 GeV/c² (CMS + ATLAS) |
| Decay width | < 50 GeV/c² |
| Decays into | Two photons (γγ) |
The 750 GeV diphoton excess in particle physics was an anomaly in data collected at the Large Hadron Collider (LHC) in 2015, which could have been an indication of a new particle or resonance. The anomaly was absent in data collected in 2016, suggesting that the diphoton excess had been a statistical fluctuation. In the interval between the December 2015 and August 2016 results, the anomaly generated considerable interest in the scientific community, prompting about 500 theoretical studies. The hypothetical particle was denoted by the archaic Greek letter Ϝ (pronounced digamma) in the scientific literature, owing to the two-photon decay channel in which the anomaly occurred. The data, however, were always less than five standard deviations (sigma) away from the expectation with no new particle, so the anomaly never reached the level of statistical significance conventionally required to announce a discovery in particle physics. After the August 2016 results, interest in the anomaly waned, as it was considered a statistical fluctuation. Indeed, a Bayesian analysis of the anomaly found that while the data collected in 2015 constituted "substantial" evidence for the digamma on the Jeffreys scale, the 2016 data combined with the 2015 data were evidence against it.
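The Bayesian comparison mentioned above can be illustrated for a single Poisson counting bin: the Bayes factor is the ratio of the likelihood of the observed event count under "background plus signal" to its likelihood under "background only". The sketch below is purely illustrative; the counts are hypothetical, and a real analysis would marginalize over a prior on the signal strength rather than fix it:

```python
import math

def poisson_logpmf(k, mu):
    # Log of the Poisson probability mass function P(k | mu).
    return k * math.log(mu) - mu - math.lgamma(k + 1)

def bayes_factor(observed, expected_bkg, expected_sig):
    """Bayes factor for 'background + signal' vs 'background only'
    in a single Poisson counting bin, with a fixed signal expectation."""
    log_b = (poisson_logpmf(observed, expected_bkg + expected_sig)
             - poisson_logpmf(observed, expected_bkg))
    return math.exp(log_b)

# Hypothetical numbers: 10 events observed where 4 background events are
# expected and a putative signal would contribute 6 more. The resulting
# Bayes factor exceeds 10, i.e. "strong" evidence on the Jeffreys scale --
# illustrating how a modest excess can look persuasive in isolation.
bf = bayes_factor(observed=10, expected_bkg=4.0, expected_sig=6.0)
```

Adding more data with no excess drives the same ratio below 1, which is the sense in which the combined 2015+2016 dataset became evidence against the digamma.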
On December 15, 2015, the ATLAS and CMS collaborations at CERN presented results from the second operational run of the Large Hadron Collider (LHC) at a centre-of-momentum energy of 13 TeV, the highest ever achieved in proton-proton collisions. Among the results, the invariant mass distribution of pairs of high-energy photons produced in the collisions showed an excess of events over the Standard Model prediction at around 750 GeV/c². The statistical significance of the deviation was reported as 3.9 and 3.4 standard deviations (local significance) for the two experiments, respectively.
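Two quantities in the paragraph above lend themselves to a short numerical sketch: the invariant mass reconstructed from a photon pair, and the local p-value implied by a significance quoted in standard deviations. A minimal illustration in Python (function names are my own, not from any experiment's analysis code):

```python
import math

def diphoton_invariant_mass(e1, e2, opening_angle):
    """Invariant mass (GeV/c^2) of a pair of massless photons with
    energies e1, e2 (GeV) separated by opening_angle (radians):
    m^2 = 2 * E1 * E2 * (1 - cos(theta))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(opening_angle)))

def local_p_value(z_sigma):
    """One-sided tail probability of a standard normal beyond z_sigma,
    i.e. the local p-value corresponding to a z_sigma-sigma excess."""
    return 0.5 * math.erfc(z_sigma / math.sqrt(2.0))

# Two back-to-back 375 GeV photons reconstruct to 750 GeV/c^2.
mass = diphoton_invariant_mass(375.0, 375.0, math.pi)

# A 3.9-sigma local excess versus the 5-sigma discovery threshold.
p_local = local_p_value(3.9)
p_discovery = local_p_value(5.0)
```

For reference, the one-sided 5-sigma discovery threshold corresponds to a p-value of about 2.9 × 10⁻⁷, orders of magnitude smaller than the p-value of a 3.9-sigma excess.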
The excess could have been explained by the production of a new particle (the digamma) with a mass of about 750 GeV/c² that decayed into two photons. The production cross-section at 13 TeV centre-of-momentum energy required to explain the excess, multiplied by the branching fraction into two photons, σ(pp → Ϝ) × B(Ϝ → γγ), was estimated to be of the order of a few femtobarns (fb).
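For orientation, a cross-section times branching fraction of a few femtobarns translates into only a handful of events in the 2015 dataset (ATLAS recorded roughly 3.2 fb⁻¹ of 13 TeV data in 2015). The sketch below uses illustrative numbers and ignores detector efficiency and acceptance:

```python
def expected_signal_events(xsec_times_br_fb, integrated_luminosity_fb_inv):
    """Expected signal yield before efficiency corrections:
    N = (sigma x BR) [fb] x integrated luminosity [fb^-1]."""
    return xsec_times_br_fb * integrated_luminosity_fb_inv

# Hypothetical: a 6 fb (sigma x BR) signal in ~3.2 fb^-1 of data would
# produce roughly 19 events on top of the smooth diphoton background.
n_events = expected_signal_events(6.0, 3.2)
```

Yields this small are exactly the regime in which Poisson fluctuations of the background can mimic a resonance, which is why more data were decisive.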
This result, while unexpected, was compatible with previous experiments, and in particular with the LHC measurements at a lower centre-of-momentum energy of 8 TeV.
Analysis of a larger sample of data, collected by ATLAS and CMS in the first half of 2016, did not confirm the existence of the Ϝ particle, indicating that the excess seen in 2015 was a statistical fluctuation.
The non-observation of the 750 GeV bump in follow-up searches by the ATLAS and CMS experiments had a significant impact on the particle physics community. The episode highlighted both the community's hope that the LHC would discover a fundamentally new particle and the difficulty of searching for a signal that is unknown a priori.