Post has attachment
First off, there's surely a typo in one of the quotes: it should read "10^120 times" bigger. Second, I think the presentation of dark energy is wrong. It is the canonical presentation --- namely, that no models besides vacuum energy can explain dark energy even theoretically --- but it turns out you can actually build a pretty cute toy model of inflation that gives almost exactly the right dark energy value, and other models exist that produce a small cosmological constant (basically the same thing as dark energy). I suspect this presentation persists because very few people actually work on dark energy. But before you can make experimental progress probing the nature of these things, you probably need a better theoretical understanding of them. That takes time, and it's definitely getting there, so I am not too worried about it. Then again, I don't work on dark energy either, so buyer beware.
Can we talk about naturalness? That's the first scary number mentioned: the problem with the Higgs boson field. A natural theory would be one that predicts the Higgs mass to be close to its measured value through some physical mechanism, rather than requiring the subtraction of two insanely large numbers to produce a very small one; that cancellation is what we call fine-tuning. A strictly natural theory, where the number is fixed to zero and its observed value is then predicted, is going to be tricky to construct given the data at the LHC, but it's still probably possible. And there's already, technically, a bit of fine-tuning in producing the right constants at this energy scale for life to exist. This isn't purely an anthropic argument; the point is that some level of fine-tuning is already accepted in a wide range of physical theories. The problem with the Higgs fine-tuning is that it comes from quantum field theory. But if you solve that deep problem with a well-known mechanism and are still left with some technical fine-tuning to get exactly the right numbers out, then how big a problem this poses for particle physics depends critically on how much fine-tuning remains. I'd say that if a theory cuts the fine-tuning by more than half the orders of magnitude present in the Standard Model, you've technically solved the problem: if the Standard Model requires 30 digits of large numbers to cancel, your experimentally verified theory only requires 14. If that were really how much fine-tuning was left, I wouldn't be all that happy, but if we're within a few parts per billion (9 digits of cancellation) then I'm not all that concerned philosophically. So let's say you find SUSY at 100 TeV.
You technically have a solution to naturalness: at the deep quantum-field-theory level you've removed the thorniest aspects of the issue. The questions then become what level of fine-tuning remains in the theory explaining this new physics, and whether we can understand or alleviate that fine-tuning with other new-physics mechanisms. Those are interesting technical questions, but at the end of the day you'd still have solved the naturalness problem, because you reduced the fine-tuning by a factor of billions. And, I mean, with SUSY around for so long, have many physicists really been looking for alternative solutions to naturalness? I don't think so. Yet with the strictest forms of SUSY naturalness ruled out, there is already a new proposal for partially fixing the problem (e.g. the relaxion mechanism). The only way to be sure is to keep working on these issues theoretically, so it's far from certain that the end is nigh. Particle physics funding, on the other hand....
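To make "digits of cancellation" concrete, here's a toy sketch in Python. The function name and the numbers are mine, purely for illustration; real Higgs fine-tuning involves quantum corrections to the squared mass, not literal decimal subtraction.

```python
import math

def cancellation_digits(a, b):
    """Rough count of decimal digits that cancel when b is subtracted from a.

    Toy illustration only: it just compares the size of the inputs to the
    size of the leftover difference.
    """
    return math.log10(abs(a) / abs(a - b))

# A Standard-Model-like situation: two ~30-digit numbers cancel
# to leave a remainder 30 orders of magnitude smaller.
bare = 10**30 + 1
correction = 10**30
print(round(cancellation_digits(bare, correction)))  # 30
```

The "parts per billion" case in the text corresponds to roughly 9 digits cancelling by the same measure.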
Let me shift gears, since the above is a bit clunky: the LHC found the Higgs. That's strictly new information. We know its mass, we know it actually exists, we know it looks mostly like the Standard Model prediction. We don't know how close it is to the Standard Model prediction beyond the few-percent level, so there is still justification for the ILC. Let's say we find nothing else. Would you call the LHC a failure? It did significantly change our understanding of particle physics. The ILC would then refine that understanding, and maybe significantly change it again. So what next? Say we convince people to build a 100 TeV machine, and that machine finds nothing. Well, that would be new. The LHC and ILC would each have either found something or been built to refine our understanding, but a brand new machine finding nothing? That would be the first failed particle physics experiment. If we are declaring the end of particle physics, let alone all of physics, maybe it would be nice to have a major experiment fail to give us any information, and thus fail in its scientific mission, before we wax poetic about the end of physics.
The latest ChromeOS update has really messed up the usability of my Chromebook (which is sad, because I love my Chromebook). There are lots of bugs in the wifi and VPN interface. If the VPN server loses its connection, or if you close the Chromebook's lid, you'll hit one of two problems. Either the wifi disconnects and takes a while to reconnect, sometimes fixable only with a sudo ifconfig wlan0 down/up in the crosh shell. Or the VPN says it is disconnected while the wifi interface stays frozen, thinking it's still connected to the VPN and doing nothing when you click 'disconnect'. In that case ifconfig does nothing (as soon as the VPN disconnects, it's removed from the ifconfig list), and you have to sign out and back in to fix it. Given that VPN disconnects happen several times a day for various reasons, this is obviously a time-consuming issue. I shouldn't have to log out and in again every time I close the lid or the VPN drops.
#chrome #chromeOS #chromebook
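The ifconfig workaround above can be wrapped in a small helper for the crosh shell. This is only a sketch: the interface name wlan0, the function name, and the DRY_RUN guard are my own additions, not anything ChromeOS provides.

```shell
# Sketch of the wifi-reset workaround described above, for the crosh shell.
# Assumes wlan0 is the wifi interface; set DRY_RUN=1 to print instead of run.
reset_wifi() {
  iface="${1:-wlan0}"
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "would run: ifconfig $iface down/up"
  else
    sudo ifconfig "$iface" down && sleep 1 && sudo ifconfig "$iface" up
  fi
}
```

Note this only helps with the slow-reconnect case; the frozen-VPN state described above still seems to need a full sign-out.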
Post has shared content
I've seen this go around. I haven't dug into the actual paper, but a few immediate problems spring to mind. Impact here is measured by citations, which appears to show that when a superstar (as measured by citation counts) dies, their collaborators publish less and people who disagreed with them publish more, and these alternative stars are subsequently cited more.
I could have predicted this exact effect without making any assumptions about the quality of the work being produced. It turns out that when the person getting all the citations dies, other people get cited; meanwhile, the collaborators may have been publishing more than they could have individually because of the superstar. Profound.
It deeply concerns me that scientists who would otherwise criticize the push for citation-count metrics as a stand-in for quality are so quick to accept those same metrics, uncritically, as a literal measure of scientific innovation.
"When a prominent researcher suddenly dies in an academic subfield, a period of new ideas and innovation follow." http://www.vox.com/science-and-health/2015/12/15/10219330/elite-scientists-hold-back-progress
Post has attachment
Nobel Neutrinos
My latest Tumblr post on physics. It took me a while to get to it, but I discuss a bit of the underlying physics of the recent Nobel Prize in Physics. At the very least, I made cute cartoons of particles that are worth checking out.
Post has attachment
Skimmed it. It has some interesting points. I haven't dug into it in much detail (and I worry whenever physicists try philosophy), but here is an excerpt that I think I agree with as my 'poor-man's philosophy of math':
"However, mathematical objects are more abstract than those that appear in the physical world, and they include entities that seem to have no physical referent, such as hierarchies of infinities of ever increasing size. To deal with this issue, I maintain that mathematical objects do not refer directly to things that exist in the physical universe. As the formalists suggest, mathematical theories are just abstract formal systems, but not all formal systems are mathematics. Instead, mathematical theories are those formal systems that maintain a tether to empirical reality through a process of abstraction and generalization from more empirically grounded theories, aimed at achieving a pragmatically useful representation of regularities that exist in nature."
This 750 GeV diphoton excess: I haven't heard anyone defend it as a real signal, and almost everyone is hedging in their public commentary about it. Yet everyone is putting out a paper trying to explain it. It definitely feels like peer pressure.
"You want citations right? Everyone else is writing a 750 GeV paper and getting citations."
Post has attachment
Link to Resonaances and some added commentary
So what's notable about the LHC data from the latest run? Both experiments are seeing a bump at the same energy when looking at events with two photons. From the two photons' momenta, including the components in the transverse direction (that is, perpendicular to the direction the proton beams travel in the LHC), you can reconstruct the total invariant mass of the photon pair. A bump at a particular mass suggests that a single particle was produced in the collision and subsequently decayed into two photons; the mass of the photon pair then gives the mass of the new particle (in this case 750 GeV). The significance of these bumps isn't large enough on its own to call this a discovery. But both experiments see the bump in the same place, which makes a statistical fluctuation a little less likely, and the signal could come from a bunch of different models. That combination is why people are excited about it.
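The mass reconstruction described above can be sketched in a few lines of Python, using the standard collider formula for the invariant mass of two massless photons given their transverse momenta, pseudorapidities, and azimuthal angles. The function name and the numbers are mine, for illustration only.

```python
import math

def diphoton_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of two massless photons in collider coordinates.

    pt  : transverse momentum in GeV
    eta : pseudorapidity (angle relative to the beam axis)
    phi : azimuthal angle around the beam axis
    """
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(m2)

# Two back-to-back 375 GeV photons reconstruct to a 750 GeV resonance.
print(diphoton_mass(375.0, 0.0, 0.0, 375.0, 0.0, math.pi))  # 750.0
```

In the real analyses the experiments histogram this mass over many events; the "bump" is an excess of events piling up near one value of it.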
I'd still say that building a model to describe this signal alone is a bad way to go about this kind of research. The signal isn't at the discovery level yet, so there's no need to try a bunch of ad hoc hypotheses on inconclusive data. That said, if you can build a model that makes other theoretically interesting predictions, or you are working on a model that could easily accommodate this signal, or you already had a prediction for a heavier scalar decaying to diphotons, then this is the sort of signal that might offer some insight.
But it's also important to note that we've already seen a few excesses disappear at the LHC, so don't get too excited yet.
Post has attachment
Hey, check out my Tumblr. It's mostly photos from some of my time in Seoul, but every once in a while I talk some physics.
http://physicsfromtheseoul.tumblr.com/
Post has attachment
Moving my physics posts to Tumblr; it helps me stay concise, plus I can more easily add Feynman diagrams.
Post has shared content
Someone from my cohort and some other particle physicists made this pretty epic particle physics cover song.
Music and SCIENCE! combine!