Yeah, but you could also fund a lot of other research with this budget. The point is, physicists just don't know whether more particles exist. There is no theory predicting particles at a certain mass with certain decay channels, so they wouldn't know what to look for. That's actually already a problem for the LHC. They have this huge amount of data, but when you don't know what kind of exotic particles you're looking for and how they behave, you can't post-process the data accordingly. They are hidden under a massive amount of particles that are already known.
Yes, with finite resources, we have to make choices. As long as there are some resources for people to just poke around, I’m good with whatever. If we’re actually looking for some place to drop a few billion, I actually don’t think another collider should be on the list, let alone at the top.
The problem as I see it is that “but what good is it” is used to limit pretty much all fundamental research.
So why don't they just use post-processing to remove all the known particles, start looking at the particles that remain, discover a new one, remove it, and continue until there's none left?
There are multiple reasons for that. We don't know the decay channels of already discovered particles precisely, so there might be very rare processes that contribute to already known particles. It is all a statistical process: while you can make statements about a large number of events, it is nearly impossible to do so for a single event. Most of the particles are very short-lived and won't be visible themselves in a detector (especially neutral particles), and some won't interact with anything at all (neutrinos). Then your detectors are not 100% efficient, so you can't detect all the energy that was released in the interaction or the decay of a particle. The calorimeters that are designed to completely stop any hadrons (particles consisting of quarks) have a layer of very dense material to force interactions, followed by a detector material. All the energy lost in the dense material is lost for the analysis. In the end you still know how much energy was not detected, because you know the initial energy, but everything else is calculated by models that are based on known physics. A neutral, weakly interacting particle would simply be attributed to a neutrino.
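To make that last point concrete, here's a rough sketch in Python of the missing-energy bookkeeping: you know the initial energy, you add up what the detector actually saw, and whatever is left over has to be interpreted through models of known physics. The numbers and the `visible_deposits` list are made up for illustration, not from any real event.

```python
# Rough sketch of the missing-energy bookkeeping described above.
# All values here are hypothetical; real reconstructions use far more
# detailed detector simulation and physics models.

initial_energy = 250.0  # GeV, known from the beam settings (made-up value)

# Energy actually measured by the (imperfect) detector, in GeV
visible_deposits = [112.3, 95.7, 20.1, 8.4]

measured = sum(visible_deposits)
missing = initial_energy - measured

print(f"Measured energy: {measured:.1f} GeV")
print(f"Missing energy:  {missing:.1f} GeV")

# The missing ~13.5 GeV could be a neutrino, detector inefficiency,
# or something genuinely new -- the event alone can't tell you which.
# Standard analyses attribute it according to already-known physics.
```

The point of the sketch is that the subtraction only tells you *how much* is missing, not *what* it was, which is why an unknown neutral particle would just look like a neutrino.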