Measuring known Standard Model processes is by no means easy!
Watch today's CERN-LHC Seminar at 11h CEST:
Observation of the H->b bbar decay at ATLAS and CMS
https://webcast.web.cern.ch/event/i750541
"This seminar presents the observation of the Higgs boson decay to a bottom quark-antiquark pair by the ATLAS and CMS experiments. The results presented use all available datasets from LHC Run 1 and Run 2, including the most recent 13 TeV dataset, which corresponds to an integrated luminosity of ~80 fb^-1. The analysis strategy and the background estimation techniques are discussed, and a comprehensive set of measurements is presented."
The speakers are Luca Perrozzi (ETH Zurich (CH)) on behalf of the Compact Muon Solenoid - CMS Collaboration, and Nicolas Morange (LAL, CNRS (FR)) on behalf of the ATLAS Collaboration.
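To get a rough sense of scale for the dataset quoted in the abstract, the integrated luminosity can be turned into an expected Higgs yield via N = sigma x L x BR. This is only a back-of-the-envelope sketch: the cross section and branching ratio below are approximate public values, not numbers from the talk, and no detector acceptance or efficiency is included.

```python
# Back-of-the-envelope Higgs yield: N = sigma * L * BR.
# All numbers are approximate and for illustration only.

sigma_higgs_pb = 55.0      # approx. total Higgs production cross section at 13 TeV, in pb
lumi_fb = 80.0             # integrated luminosity, in fb^-1 (from the abstract)
br_bb = 0.58               # approx. branching ratio for H -> b bbar

lumi_pb = lumi_fb * 1000.0             # 1 fb^-1 = 1000 pb^-1
n_produced = sigma_higgs_pb * lumi_pb  # Higgs bosons produced
n_to_bb = n_produced * br_bb           # of which decay to b bbar

print(f"Higgs bosons produced: ~{n_produced:.2e}")  # millions of events
print(f"Decaying to b bbar:    ~{n_to_bb:.2e}")
```

Even with millions of H -> b bbar decays produced, the observation is hard because the QCD b bbar background is overwhelmingly larger, which is why the analysis strategy and background estimation dominate the seminar.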
+Stam Nicolis Yes, assumptions about the background are relevant for the parameter space. And yes, there are many proposals suggesting mechanisms within the Standard Model; this one should, however, stand out. · 2h
Stam Nicolis (Moderator): It won't stand out at all, until it's written up as a technical paper and sent to the arXiv. · 2h
+Stam Nicolis Or as a philosophical paper with a better-explained generalisation, so particle physicists can more easily include it in their models, with outlined bounds and falsifiable statements.
Or let's call it Theory, you know, before the mathematicians and then the experimentalists take over. The mathematics of a hundred years ago wasn't that extensive, so what happened, and how many people are working in theory? And I don't mean mathematicians setting functional limits.
Specialisation and all that; choose at most two. · 1h
Stam Nicolis (Moderator): Philosophy isn't relevant. The Standard Model and all proposals for its extensions are falsifiable: just compute any process and its backgrounds, and check against the data from the LHC. That's what's going on. For the moment, all measured processes are consistent with the Standard Model; there are hints of effects beyond it, but none of the proposals that have tried to become specific has, so far, found a process where a discrepancy appears in a way that allows identifying the new particles involved. Note also that not all Standard Model processes have been measured to discovery precision and, as stressed, the data analysis is hard.
It would be more useful to discuss concretely with experimentalists what their needs are, rather than wasting time on vacuous statements that don't have any tangible effect.
It would be much more useful to learn how the experimentalists carry out their data analyses and try to come up with methods that accelerate them. There's a lot of room for improvement. On the theoretical side, there's considerable room for improving how backgrounds are calculated. · 35m
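To make "compute any process, its backgrounds and check with the data" concrete: for a simple counting experiment, the median discovery significance of an expected signal s over a background b is commonly estimated with the asymptotic formula Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s)). A minimal sketch, with invented numbers purely for illustration (not related to the H -> b bbar result):

```python
import math

def asymptotic_significance(s, b):
    """Median discovery significance (in sigma) for expected signal s
    over expected background b, using the standard asymptotic formula
    Z = sqrt(2 * ((s + b) * ln(1 + s/b) - s))."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Toy example: a modest signal on a larger, well-understood background.
z = asymptotic_significance(100.0, 400.0)
print(f"Expected significance: {z:.2f} sigma")
```

The formula shows why background knowledge dominates: for fixed s, the significance falls as b grows, so a mis-estimated background directly biases any claimed discovery.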
+Stam Nicolis I'm aware of that general attitude of statistics being better than science. Your first sentence threw me off the rest so much that I'm just going to let it pass. · 36m
Stam Nicolis (Moderator): +Balder Oddson The statement about statistics and science is, simply, meaningless. · 34m