I’m not sure that’s a good comparison. The kill mechanism from a neutron bomb is the deposition of ionizing radiation in the body, but the microwave radiation is non-ionizing.
You’ve gotten some good answers explaining that heat changes the density, and therefore the index of refraction of air.
Fun fact: Schlieren Imaging allows one to photograph shockwaves by relying on the same effect. As a shockwave travels through air, it creates a region of high density, which can be imaged with this technique.
in the photon’s frame of reference
There are no valid inertial frames for an object moving at the speed of light. The idea that “a photon doesn’t experience time” is a common but misleading statement, since we can’t define a reference frame for a photon in the first place. Sometimes this misconception is useful for conveying qualitative ideas (photons don’t decay), but often it leads to contradictions like your question about Hawking radiation for black holes.
Yes, the wavelength of photons will be preserved if they travel through non-expanding space. If the photon is emitted by a source that’s in motion with respect to a detector, there could still be redshift or blueshift from the relativistic Doppler effect. This would only depend on the relative velocity between the emitter and observer, and not on the distance the photon traveled between them.
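If a concrete number helps, here’s a minimal sketch of the longitudinal relativistic Doppler formula, λ_obs = λ_emit·√((1+β)/(1−β)) with β = v/c (the 500 nm and 10% values below are just made-up illustrations). Notice that distance never appears anywhere:

```python
import math

def doppler_shift(wavelength_nm, beta):
    """Relativistic Doppler shift for a source receding at speed beta = v/c.
    Positive beta -> redshift (longer wavelength), negative -> blueshift."""
    factor = math.sqrt((1 + beta) / (1 - beta))
    return wavelength_nm * factor

# Example: a source receding at 10% of the speed of light
print(doppler_shift(500.0, 0.10))   # ~552.8 nm, redshifted
# The same source approaching instead
print(doppler_shift(500.0, -0.10))  # ~452.3 nm, blueshifted
```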
Unfortunately for me, there is no community at Lemmy dedicated to the history of science
I agree! The history of science is often even more interesting since you get both the science and the personalities of all the people involved, plus the occasional world war in the mix. It’s a shame there isn’t an “askhistorians” type community here.
how people very knowledgeable on the current paradigm cannot see (most of times historicaly) that a paradigm shift is about to happen ?
I’m not sure I’d agree with that assessment. Generally a new model or understanding of physics arises because of known shortcomings in the current model. Quantum physics is the classic example that resolved a number of open problems at the time: the ultraviolet catastrophe in black body radiation, the photoelectric effect, and the interference pattern of the double slit experiment, among others. In the years leading up to the development of quantum theory, it was clear to everyone active in physics that something was missing from the current understanding of Newtonian/classical physics. Obviously it wasn’t clear what the solution was until it came about, but it was obvious that a shift was coming.
The same thing happened again with electroweak unification and the standard model of particle physics. There were known problems with the previous standard model Lagrangian, but it took a unique mathematical approach to resolve many of them.
Generally research focuses on things that are unknown or can’t be explained by our current understanding of physics. The review article you linked, for example, details open questions and contradictory observations/predictions in the state of the art.
Haha it’s in the title: “Cosmological Particle Production: A Review”. Also the journal it was published in is for review articles: Reports on Progress in Physics. Mostly though the abstract promises to give a review of the subject.
Another indication is that it’s lengthy (28 pages) with tons of citations throughout. If someone is doing new work, citations will mostly be in the introduction and discussion sections.
So unfortunately the article they reference by Parker is paywalled. I have access but can’t share it easily. The article is essentially the foundation of quantum field theory in curved space time - in other words the genesis of the standard cosmological model. Cosmological particle production in an expanding universe isn’t an alternative to the Big Bang, it’s an essential part of it.
Leonard Parker’s work is summarized on his Wikipedia page. You can also read an interview with him on the arXiv.
There isn’t a link in your post, but it looks like you’re referring to this preprint. The article has been published in a peer-reviewed journal (paywall warning).
This is a review article, so it isn’t proposing anything new and is instead giving a summary of the current state of the field. These sorts of articles are typically written by someone who is deeply familiar with the subject. They’re also super useful if you’re learning about a new area - think of them as a short, relatively up-to-date textbook.
I’m not sure how you’re interpreting this review as an alternative to the standard model of cosmology and the Big Bang. Everything is pretty standard quantum field theory. The only mention of the CMB is in regards to the possibility that gravitons in the early universe would leave detectable signatures (anisotropies and polarization). They aren’t proposing an alternative production mechanism for the CMB.
This falls a bit outside my wheelhouse but I believe the answer is no. The established symmetries in particle physics are all associated with the quantum mechanical state of a particle (charge, parity, etc) and to my knowledge there isn’t an “information” quantum number.
The closest you might get to this is quantum information theory, where information is encoded in other physical characteristics (spin, parity, energy, etc). In this sense information is more of an emergent phenomenon than a fundamental property.
Sorry, physics can be cruel sometimes :(
Hah tell me about it. The 2017 neutron star merger happened while I was writing a proposal for an experiment where the physics was sort of related. So of course I completely reframed the proposal around that event, and it got funded! And that was just a few years ago, right?
Man I really need to publish the results of that project…
Certainly! You can see discrete emission lines from the ionized air molecules, which only occurs because of quantum physics. I realize that’s not what you’re asking though.
I did a quick calculation: for a plasma torch (~27000 Kelvin) and assuming air molecules, the average velocity of the plasma ions would only be like 6000 m/s. That’s about 0.002% of the speed of light, so you aren’t going to see any relativistic effects.
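If you want to reproduce that estimate, here’s a rough sketch (assuming atomic nitrogen ions at 14 u and the rms thermal speed formula v = √(3kT/m); using a different species or the mean speed instead will shift the number a bit):

```python
import math

# Rough estimate of the rms thermal speed of ions in a plasma torch,
# assuming atomic nitrogen ions (14 u) at T ~ 27000 K: v_rms = sqrt(3*k*T/m)
k_B = 1.380649e-23      # Boltzmann constant, J/K
amu = 1.66054e-27       # atomic mass unit, kg
m_ion = 14 * amu        # atomic nitrogen (assumption: air is mostly N), kg
T = 27000.0             # plasma temperature, K

v_rms = math.sqrt(3 * k_B * T / m_ion)
c = 2.998e8             # speed of light, m/s

print(f"v_rms ~ {v_rms:.0f} m/s, or {100 * v_rms / c:.4f}% of c")
# -> roughly 7000 m/s, a few thousandths of a percent of c
```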
First a caveat: An object with mass can’t move at the speed of light, but it could move at speeds arbitrarily close to that.
The most successful model of gravity is General Relativity, which treats gravity as a curvature of 4-dimensional space time. Gravity’s influence travels at the speed of light. There’s a classic thought experiment that sort of answers your question: what would happen if the sun were teleported away? The answer is that the earth would continue to orbit the spot where the sun was for 8 minutes, and we would continue to see sunlight for that same amount of time, since that’s how long it takes light to travel that distance. Then after 8 minutes the sun would disappear from the sky, the first “lack of gravity” would reach us, and things would be bad for earth :(
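For reference, the “8 minutes” figure is just the mean earth-sun distance divided by the speed of light:

```python
# Quick check of the "8 minutes" figure: light travel time from the sun to earth
AU = 1.496e11   # mean earth-sun distance, m
c = 2.998e8     # speed of light, m/s

t = AU / c
print(f"{t:.0f} s ~ {t / 60:.1f} minutes")  # ~499 s, about 8.3 minutes
```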
The fact that gravity travels at the speed of light actually leads to an interesting phenomenon: Gravitational waves. If a massive object rapidly accelerates (or decelerates), for example a star-sized mass moving quickly and then coming to an abrupt stop, it will emit a ripple in space time called a gravitational wave that travels outward at the speed of light.
It was big news about a decade ago when gravitational waves were first detected by LIGO, a pair of large interferometers that look for tiny expansions/contractions in spacetime. Their first detection was the merger of 2 black holes; as the black holes spiral around each other and eventually merge, they emit oscillating waves of increasing frequency. They made a cool video showing how the frequency increases by converting it to sound.
Since then LIGO and VIRGO (similar European collaboration) have detected multiple gravitational waves from the collision of black holes and neutron stars. So not only are gravitational waves a neat validation of general relativity, they’re actually being used to do astronomy.
Thanks! I forgot to put the exclamation mark at the front of the link. Hopefully it works now
lmk if there’s a better community to ask this in
Shameless plug: you could try !askscience@lemmy.world
The Milwaukee Protocol is a treatment plan that is essentially a more advanced version of what you’re asking. The patient is put in a medically induced coma and then given antivirals and IV fluids, which avoids the issue of hydrophobia.
It got a lot of press because one person survived on it (a big deal given that rabies is a death sentence once symptoms appear) but this success hasn’t been reproduced with other patients. A paper on the protocol has a remarkably blunt title: Critical Appraisal of the Milwaukee Protocol for Rabies: This Failed Approach Should Be Abandoned.
The x-axis range spans the same region of “photon energy” space in both plots. The data starts at about 280 nm in the first plot, which is roughly 1000 THz (the maximum value in the second plot).
The stretching effect you see when working in different x-axis units comes from the fact that the units don’t map onto each other linearly; they’re inversely proportional (ν = c/λ). A 1 nm wide histogram bin at 1000 nm will contain the histogram counts corresponding to a 0.3 THz wide region at 300 THz in the frequency plot. Another 1 nm wide bin at 200 nm will correspond to a 7.5 THz wide region located at 1500 THz in the frequency plot.
You can get a sense of how this works just by looking at how much space the colorful visible light portion of the spectrum takes up on each plot. In the wavelength plot, by eye I’d say visible light corresponds to about 1/6 the horizontal axis scale. In the frequency plot, it’s more like 1/4.
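To make the conversion concrete, here’s a minimal sketch using ν = c/λ that reproduces the bin widths quoted above:

```python
# Sketch of why equal-width wavelength bins map to unequal-width frequency bins.
# With nu = c / lambda, a bin of width d_lambda covers d_nu = (c / lambda**2) * d_lambda.
c = 2.998e8           # speed of light, m/s
d_lambda_nm = 1.0     # a 1 nm wide wavelength bin

for lambda_nm in (1000.0, 200.0):
    lam = lambda_nm * 1e-9                              # nm -> m
    nu_THz = c / lam / 1e12                             # center frequency in THz
    d_nu_THz = c / lam**2 * d_lambda_nm * 1e-9 / 1e12   # bin width in THz
    print(f"1 nm bin at {lambda_nm:.0f} nm -> {d_nu_THz:.2f} THz wide at {nu_THz:.0f} THz")
# -> about 0.3 THz wide at 300 THz, and about 7.5 THz wide at 1500 THz
```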
That normalization is necessary because otherwise exactly how you bin the data would change the vertical scale, even if you used the same units. For example, consider the first plot. Let’s assume the histogram bins are uniformly 1 nm wide. Now imagine rebinning the data into 2 nm wide bins. You would effectively take the contents of 2 bins and combine them into one, so the vertical scale would roughly double. The 2 plots would contain the same data but look vastly different in magnitude. But if in both cases you divide by the bin width (1 nm or 2 nm, depending), the histogram magnitudes would be equal again. So that’s why the units have to be given in “per nm” or “per THz”.
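And here’s a toy numerical version of that rebinning argument (the counts are made up, purely to show the per-bin-width normalization at work):

```python
# Merging 1 nm bins into 2 nm bins roughly doubles the raw counts per bin,
# but counts divided by bin width stays the same.
counts_1nm = [100, 102, 98, 101, 99, 100]           # hypothetical counts in 1 nm bins

counts_2nm = [counts_1nm[i] + counts_1nm[i + 1]     # merge pairs -> 2 nm bins
              for i in range(0, len(counts_1nm), 2)]

density_1nm = [n / 1.0 for n in counts_1nm]         # counts per nm
density_2nm = [n / 2.0 for n in counts_2nm]         # counts per nm

print(counts_1nm, counts_2nm)      # raw counts: ~100 vs ~200 per bin
print(density_1nm, density_2nm)    # per-nm densities agree (~100 per nm)
```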
A quasiparticle is more of a useful concept for describing the behavior of systems than it is a distinct object. In the example you cite, phonons are a convenient way of describing how vibrations are transmitted in matter. The fact that phonons are “quantized” is more accurately just emergent behavior from the system of atoms or molecules, a consequence of the fact that the atoms have quantized vibrational states.
As an analogy, consider a ripple in a pond. The ripple appears to be a real, distinct thing. You can describe it with math (as a wave) and predict its behavior. But it cannot exist separately from the water in the pond. The ripple is an emergent phenomenon in water, a quasi-object. It only exists as a collective behavior of the water molecules.
By definition quasiparticles cannot exist in a vacuum.
For physics specifically, a bachelor’s degree probably won’t be enough to get a job in the field.
You might be able to get a job as a technician in a lab, but they typically look for people with a master’s degree for those roles. With just a bachelor’s, you’d need to get your foot in the door by already having some relevant experience, which is a possibility if you get some research experience in college and pivot that into an internship or something. But it would definitely require effort and luck.