Fizix: Fundamental Fifth Force Finally Found?
#34
Scientists evade the Heisenberg uncertainty principle
March 22, 2017

[Image: 1-scientistsev.jpg]
The evolution of a spin and its uncertainty as they orbit due to a magnetic field. The uncertainty, initially equal in all directions, is squeezed into only the out-of-plane component, leaving the two in-plane components highly certain. Credit: ICFO
ICFO researchers report the discovery of a new technique that could drastically improve the sensitivity of instruments such as magnetic resonance imagers (MRIs) and atomic clocks. The study, published in Nature, reports a technique to bypass the Heisenberg uncertainty principle. This technique hides quantum uncertainty in atomic features not seen by the instrument, allowing the scientists to make very high precision measurements.



State-of-the-art sensors, such as MRIs and atomic clocks, are capable of making measurements with exquisite precision. MRI is used to image tissues deep within the human body and tells us whether we might suffer from an illness, while atomic clocks are extremely precise timekeepers used for GPS, internet synchronization, and long baseline interferometry in radio-astronomy. One might think these two instruments have nothing in common, but they do: both technologies are based on precise measurement of the spin of the atom, the gyroscope-like motion of the electrons and the nucleus. In MRI, for example, the pointing angle of the spin gives information about where in the body the atom is located, while the amount of spin (the amplitude) is used to distinguish different kinds of tissue. Combining these two pieces of information, the MRI can make a 3D map of the tissues in the body.
The sensitivity of this kind of measurement was long thought to be limited by Heisenberg's uncertainty principle, which states that accurately measuring one property of an atom puts a limit on the precision with which you can measure another property. For example, if we measure an electron's position with high precision, Heisenberg's principle limits the accuracy in the measurement of its momentum. Since most atomic instruments measure two properties (spin amplitude and angle), the principle seems to say that the readings will always contain some quantum uncertainty. This long-standing expectation has now been disproven, however, by ICFO researchers Dr. Giorgio Colangelo, Ferran Martin Ciurana, Lorena C. Bianchet and Dr. Robert J. Sewell, led by ICREA Prof. at ICFO Morgan W. Mitchell. In their article "Simultaneous tracking of spin angle and amplitude beyond classical limits", published this week in Nature, they describe how a properly designed instrument can almost completely avoid quantum uncertainty.
The trick is to realize that the spin has not one but two pointing angles, one for the north-east-south-west direction, and the other for the elevation above the horizon. The ICFO team showed how to put nearly all of the uncertainty into the angle that is not measured by the instrument. In this way they still obeyed Heisenberg's requirement for uncertainty, but hid the uncertainty where it can do no harm. As a result, they were able to obtain an angle-amplitude measurement of unprecedented precision, unbothered by quantum uncertainty.
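A toy calculation makes the trick concrete. The numbers below are invented (a million spin-1/2 atoms, a 100-fold squeeze) rather than the ICFO parameters, but the arithmetic shows how the uncertainty product respects Heisenberg while the measured component gets arbitrarily quiet:

```python
import math

# Hypothetical ensemble: N spin-1/2 atoms fully polarized along x.
# For the transverse components, Heisenberg demands dJy * dJz >= |<Jx>| / 2.
N = 1_000_000
Jx = N / 2                      # mean collective spin of the polarized state
bound = 0.5 * Jx                # lower bound on the uncertainty product

dJy = dJz = math.sqrt(N) / 2    # coherent state: uncertainty shared equally
print(dJy * dJz >= bound)       # True -- the bound is exactly saturated

s = 100.0                       # squeeze the measured component 100-fold...
dJy, dJz = dJy / s, dJz * s     # ...and dump the excess into the unmeasured one
print(dJy * dJz >= bound)       # still True: the product never changed
```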


Prof. Mitchell offers an analogy: "To scientists, the uncertainty principle is very frustrating - we'd like to know everything, but Heisenberg says we can't. In this case, though, we found a way to know everything that matters to us. It's like the Rolling Stones song: you can't always get what you want / but if you try sometimes you just might find / you get what you need."
[Image: scientistsev.jpg]
Ferran Martin Ciurana and Dr. Giorgio Colangelo working on the experimental setup. Credit: ICFO
In their study, the ICFO team cooled down a cloud of atoms to a few microkelvin, applied a magnetic field to produce spin motion as in MRI, and illuminated the cloud with a laser to measure the orientation of the atomic spins. They observed that both the spin angle and amplitude can be continuously monitored with a sensitivity beyond the previously expected limits, although still obeying the Heisenberg principle.
As for the challenges faced during the experiment, Colangelo comments that "in the first place, we had to develop a theoretical model to see if what we wanted to do was really possible. Then, not all the technologies we used for the experiment existed when we started: among them, we had to design and develop a particular detector that was fast enough and with very low noise. We also had to improve a lot the way we were "preparing" the atoms and find a way to efficiently use all the dynamic range we had in the detector. It was a battle against the Dark Side of Quantum, but we won it!"
The results of the study are of paramount importance since this new technique shows that it is possible to obtain even more accurate measurements of atomic spins, opening a new path to the development of far more sensitive instruments and enabling the detection of signals, such as gravitational waves or brain activity, with unprecedented accuracy.
More information: Simultaneous tracking of spin angle and amplitude beyond classical limits, Nature, nature.com/articles/doi:10.1038/nature21434
Journal reference: Nature
Provided by: ICFO



Read more at: https://phys.org/news/2017-03-scientists...e.html#jCp


Does the universe have a rest frame?
by Staff Writers
Washington DC (SPX) Mar 23, 2017


[Image: silicon-based-quantum-optics-lab-on-a-chip-lg.jpg]
File image.

Physics is sometimes closer to philosophy when it comes to understanding the universe. Donald Chang from Hong Kong University of Science and Technology, China, attempts to elucidate whether the universe has a rest frame. The results have recently been published in EPJ Plus.
To answer this tricky question, he has designed an experiment to evaluate particle mass with high precision. It is intended to test the special theory of relativity, which assumes the absence of a rest frame; otherwise it would be possible to determine which inertial frame is stationary and which is moving.
This assumption, however, appears to diverge from the standard model of cosmology, which assumes that what we see as a vacuum is not an empty space. The assumption is that the energy of our universe comes from quantum fluctuations in the vacuum.
In a famous experiment conducted by Michelson and Morley in the late 19th century, the propagation of light was proved to be independent of the movement of the laboratory system.
Einstein, in his Special Theory of Relativity, inferred that the physical laws governing the propagation of light are equivalent in all inertial frames; this was later extended to all physical laws, not just optics.
In this study, the author set out to precisely measure the masses of two charged particles moving in opposite directions.
The conventional thinking assumes that the inertial frame applies equally to both particles. If that's the case, no detectable mass difference between these two particles is likely to arise.
However, if the contrary is true and there is a rest frame in the universe, the author expects to see a mass difference that depends on the orientation of the laboratory frame.
This proposed experiment, partially inspired by the Michelson and Morley experiments, can be conducted using existing experimental techniques. For simplicity, an electron can be used as the charged particle in the experiment.
D. C. Chang (2017), 
Is there a rest frame in the universe? A proposed experimental test based on a precise measurement of particle mass, Eur. Phys. J. Plus 132:140, DOI 10.1140/epjp/i2017-11402-4
http://www.spacedaily.com/reports/Does_t...e_999.html


Quote: Recall from the first article in this post above:

In their study, the ICFO team cooled down a cloud of atoms to a few microkelvin, applied a magnetic field to produce spin motion as in MRI, and illuminated the cloud with a laser to measure the orientation of the atomic spins. They observed that both the spin angle and amplitude can be continuously monitored with a sensitivity beyond the previously expected limits, although still obeying the Heisenberg principle.


Read more at: https://phys.org/news/2017-03-scientists-evade-heisenberg-uncertainty-principle.html#jCp
Physicists prove that it's impossible to cool an object to absolute zero

March 23, 2017 by Lisa Zyga feature



[Image: ice.jpg]
Credit: photos-public-domain.com
(Phys.org)—In 1912, chemist Walther Nernst proposed that cooling an object to absolute zero is impossible with a finite amount of time and resources. Today this idea, called the unattainability principle, is the most widely accepted version of the third law of thermodynamics—yet so far it has not been proved from first principles.





Now for the first time, physicists Lluís Masanes and Jonathan Oppenheim at University College London have derived the third law of thermodynamics from first principles. After more than 100 years, the result finally puts the third law on the same footing as the first and second laws of thermodynamics, both of which have already been proved.
"The goal of fundamental physics is to derive all the laws of nature and to describe all phenomena by only assuming a small set of principles (like quantum mechanics, the Standard Model of particle physics, etc.)," Masanes told Phys.org. "And that's what we do. In addition, this derivation unveils the strong connections among the limitations of cooling, the positivity of the heat capacity, the reversibility of microscopic dynamics, etc. Personally, I love that the whole of thermodynamics (including the third law) has been derived from more fundamental principles."
To prove the third law, the physicists used ideas from computer science and quantum information theory. There, a common problem is to determine the amount of resources required to perform a certain task. When applied to cooling, the question becomes how much work must be done and how large must the cooling reservoir be in order to cool an object to absolute zero (0 Kelvin, -273.15°C, or -459.67°F)?
The physicists showed that cooling a system to absolute zero requires either an infinite amount of work or an infinite reservoir. This finding is in agreement with the widely accepted physical explanation of the unattainability of absolute zero: As the temperature approaches zero, the system's entropy (disorder) approaches zero, and it is not possible to prepare a system in a state of zero entropy in a finite number of steps.
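To see where the "infinite work" comes from without any of the Masanes-Oppenheim machinery, the textbook Carnot refrigerator bound already shows the divergence; the temperatures below are arbitrary:

```python
# Least work needed to pump Qc joules of heat out of an object at Tc into
# a room at Th, per the Carnot limit W >= Qc * (Th - Tc) / Tc. The bill
# diverges as Tc approaches absolute zero.
Th, Qc = 300.0, 1.0     # room temperature (K) and heat to extract (J)
for Tc in (77.0, 4.2, 1e-3, 1e-6, 1e-9):
    W = Qc * (Th - Tc) / Tc
    print(f"Tc = {Tc:9.1e} K  ->  minimum work {W:10.3e} J")
```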
The new result led the physicists to a second question: If we can't reach absolute zero, then how close can we get (with finite time and resources)? It turns out that the answer is closer than might be expected. The scientists showed that lower temperatures can be obtained with only a modest increase of resources. Yet they also showed that there are limits here, as well. For example, a system cannot be cooled exponentially quickly, since this would result in a negative heat capacity, which is a physical impossibility.
One of the nice features of the new proof is that it applies not only to large, classical systems (which traditional thermodynamics usually deals with), but also to quantum systems and to any conceivable type of cooling process.
For this reason, the results have widespread theoretical implications. Cooling to very low temperatures is a key component in many technologies, such as quantum computers, quantum simulations, and high-precision measurements. Understanding what it takes to get close to absolute zero could help guide the development and optimization of future cooling protocols for these applications.
"Now that we have a better understanding of the limitations of cooling, I would like to optimize the existing cooling methods or come up with new ones," Masanes said.
More information: Lluís Masanes and Jonathan Oppenheim. "A general derivation and quantification of the third law of thermodynamics." Nature Communications. DOI: 10.1038/ncomms14538 
Journal reference: Nature Communications


Read more at: https://phys.org/news/2017-03-physicists...e.html#jCp


Wow!

They can 'evade' uncertainty.
A reference of a 'rest' frame.
They have 'proved' all 3 laws of thermodynamics.  

Cool.
More Wine?

Physicist develops drip-free wine bottle

March 23, 2017 by Lawrence Goodman



[Image: physicistdev.jpg]
Credit: Brandeis University
Drips are the bane of every wine drinker's existence. He or she uncorks a bottle of wine, tips it toward the glass, and a drop, or even a stream, runs down the side of the bottle. Sure, you could do what sommeliers in restaurants do, wrapping a napkin around the neck of the bottle to catch the liquid, but who has time for that? Much more likely, you'll ruin the tablecloth.



Daniel Perlman—wine-lover, inventor and Brandeis University biophysicist—has figured out a solution to this age-old oenophile's problem. Over the course of three years, he has been studying the flow of liquid across the wine bottle's lip. By cutting a groove just below the lip, he's created a drip-free wine bottle.
Perlman is a renowned inventor with over 100 patents to his name for everything from specialized lab equipment to the first miniaturized home radon detector. Along with Professor Emeritus of Biology K.C. Hayes, he developed the "healthy fats" in Smart Balance margarine. Most recently, he devised coffee flour, a food ingredient and nutritional supplement derived from par-baked coffee beans.
There are already products on the market designed to prevent wine spillage, but they require inserting a device into the bottle neck. Perlman didn't want consumers to have to take an additional step after they made their purchase. "I wanted to change the wine bottle itself," he says. "I didn't want there to be the additional cost or inconvenience of buying an accessory." Figure out the physics, he thought, and you might be able to build a drip-free wine bottle.



Perlman studied slow-motion videos of wine being poured. He observed first that drippage was most extreme when a bottle was full or close to it. He also saw that a stream of wine tends to curl backward over the lip and run down the side of the glass bottle because glass is hydrophilic, meaning it attracts water.
Using a diamond-studded tool, Perlman, assisted by engineer Greg Widberg, created a circular groove around the neck of the bottle just beneath the top. A droplet of wine that would otherwise run down the side of the bottle encounters the groove, but can't traverse it. Instead, it immediately falls off the bottle into the glass along with the rest of the wine.
Remember that when you pour a full or nearly-full bottle of wine, you hold it at a slightly upward angle in relation to the glass. For a drop of wine to make it across Perlman's groove, it would have to travel up inside the groove against the force of gravity or have enough momentum to jump from one side of the groove to the other. After many tests, Perlman found the perfect width, roughly 2 millimeters, and depth, roughly 1 millimeter, for the groove so that the wine stream can't get past it.
Current wine bottle designs date to the early 1800s and haven't changed much since. About 200 years of drips, drabs, stains and spots may be coming to an end. Perlman is currently speaking with bottle manufacturers about adopting his design.
Provided by: Brandeis University


Read more at: https://phys.org/news/2017-03-physicist-drip-free-wine-bottle.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#35
Quote:“We have yet to confirm it is a new particle,” admits Feng, “but it would be revolutionary if true—the biggest discovery in particle physics in at least 40 years.” His theoretical work predicts that the putative new particle is just 33 times heavier than the electron. If so, it shouldn’t be hard to make in particle collisions—but it would be hard to see. “It is very weakly interacting, and we’ve shown that it would have eluded all previous experiments,” says Feng. Perhaps, he adds, it could be sought at colliders such as the Large Hadron Collider at the particle-physics center CERN in Geneva.
[Image: precision-needle-thread.jpg]
The Fifth Force of Physics Is Hanging by a Thread
As scientists chase tantalizing hints of a new force, modern physics hangs in the balance.
BY PHILIP BALL, MARCH 16, 2017


“How about that! Mr. Galileo was correct in his findings.” That conclusion wasn’t based on the most careful experiment you’ll ever see, but it was one of the most spectacular in its way—because it was performed on the moon.
In 1971, Apollo 15 astronaut David Scott dropped a feather and a hammer from the same height and found that they hit the lunar surface at the same time. The acceleration due to gravity doesn’t depend on a body’s mass or composition, just as Galileo asserted from his (probably apocryphal) experiment on the Leaning Tower of Pisa.
IN GALILEO’S DREAMS: A moon-bound reprise of Galileo’s famous experiment from the leaning tower of Pisa. Nikolas Zane
Or does it? Jump forward to the front-page headline of The New York Times in January 1986: “Hints of 5th Force in the Universe Challenge Galileo’s Findings.” The newspaper was reporting on a paper in the premier physics journal Physical Review Letters by physicist Ephraim Fischbach and his colleagues, describing evidence that the acceleration due to gravity does vary depending on the chemical composition of the object in question. Gravity, it seemed, was not quite what we thought it was: its effects are modified by what The New York Times reporter John Noble Wilford christened a “fifth force,” adding to the four fundamental forces we already know.
More than 30 years later, many experiments have sought to verify this putative fifth force. Yet despite their extraordinary accuracy, none has ever found convincing evidence for it. That search shows no sign of abating, however. Even in the past year a new tantalizing hint that such a force exists has emerged from experiments in nuclear physics, provoking fresh speculation and excitement.
What hangs in the balance are some of the foundational principles of modern physics. Some physicists believe that a fifth force is permitted, even demanded, by efforts to extend and unify the current fundamental theories. Others hope such a force might shed light on the mysterious dark matter that seems to outweigh all the ordinary matter in the universe. If it exists, says physicist Jonathan Feng of the University of California, Irvine, “it would imply that our attempts to unify the known forces have been premature, as now there will be a fifth one to unify, too.”

Why speculate about another fundamental force of nature, when there’s no good evidence for it? The original motivation was appreciated even in Galileo’s time: There are two ways of thinking about mass. One comes from inertia: An object’s mass is its “resistance” to being moved, this being greater the more massive it is. The other comes from gravity: According to Isaac Newton’s law of universal gravitation, the force of gravity experienced between two masses, such as an apple and the Earth, is proportional to the product of their masses divided by the square of the distance between them. This force causes a falling apple to accelerate. If, and only if, the two definitions of mass are the same, the gravitational acceleration doesn’t depend on the amount of mass being accelerated.
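That argument fits in a few lines of code. This is just the two Newton formulas above with Earth's values plugged in and a deliberately hypothetical offset in m_g/m_i:

```python
# Gravity pulls on gravitational mass (F = G*M*m_g/r**2); inertia resists
# with inertial mass (F = m_i*a). Dividing: a = (m_g/m_i) * G*M/r**2.
# Free fall is universal exactly when m_g/m_i is the same for every body.
G, M, r = 6.674e-11, 5.972e24, 6.371e6       # SI: G, Earth's mass and radius
for ratio in (1.0, 1.0 + 1e-9):              # hypothetical m_g/m_i values
    print(f"m_g/m_i = {ratio:.10f} -> a = {ratio * G * M / r**2:.9f} m/s^2")
```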
Are they the same, though? If they aren’t, then different masses would fall under gravity at different rates. The intuitive notion that a greater mass should “fall faster” had motivated tests before Galileo. The Dutch natural philosopher Simon Stevin is thought to have dropped lead balls from the clock tower in Delft around 1586, finding no detectable difference in how long they took to reach the ground. Newton himself tested the idea around 1680 by measuring whether pendulums of different mass but identical length have the same period of swing—as they should if gravitational acceleration is mass-independent. His studies were repeated with more accuracy by the German scientist Friedrich Wilhelm Bessel in 1832. Neither of them found any detectable difference.
Quote:Gravity might be fine as it stands—but there might be a new, fifth force that makes it look different.
The idea that inertial and gravitational mass are the same is known as the weak equivalence principle. It became a crucial issue when Einstein formulated his theory of general relativity around 1912-16, which rested on the central idea that the acceleration caused by gravity is the same as the acceleration of an object subject to the same force in free space. If that’s not true, general relativity won’t work.
“The equivalence principle is one of the basic assumptions of general relativity,” says Stephan Schlamminger, who works at the Mecca of high-precision measurement, the National Institute of Standards and Technology in Gaithersburg, Maryland. “As such, it should be thoroughly tested. Tests of the equivalence principle are relatively cheap and simple, but could have a huge impact if a violation was found. It would be careless not to perform these experiments.”
If the weak equivalence principle fails, then there are two possibilities. Either Newton’s expression for the force of gravity between two masses (which is also what general relativity predicts if gravity is not extreme) is slightly inaccurate and needs tweaking. Or gravity might be fine as it stands—but there might be a new, fifth force that makes it look different. That fifth force would add to the four we already know to exist: gravity, electromagnetism, and the strong and weak nuclear forces that govern the interactions of subatomic particles inside atomic nuclei. Whether we think about “modified gravity” or a fifth force is, says Fischbach, in the end just a semantic distinction.
Either way, says Feng, there is “no reason at all that there can’t be a fifth force that we have not noticed until now.”

By the time Einstein pinned his new gravitational theory to it, the weak equivalence principle had already undergone some very exacting tests. At the end of the 19th century a Hungarian nobleman named Baron Loránd Eőtvős, working at the University of Budapest, realized it could be tested by placing two masses in delicate balance.
Eőtvős used an instrument known as a torsion balance. He attached two objects to the ends of a horizontal rod suspended by a thread. If the objects have the same weight—the same gravitational mass—then the rod is balanced horizontally. But the masses also experience a centrifugal force due to the rotation of the Earth, which depends on the objects’ inertial masses. If inertial mass is the same as the gravitational mass, all the forces are in balance and the rod stays still. But if they differ, then the masses will tend to swing away from the horizontal because of the Earth’s rotation.
And if the two masses experience a different “swing”—one possibility would be because the deviation from the weak equivalence principle is dependent on composition—then the rod will experience a net twisting force (torque), and it will rotate. Even if this rotation is very slight, it might be detected by, say, measuring the deflection of a light beam from a mirror attached to the rod.


Now, the fact is that the force of gravity does vary slightly from place to place on the Earth anyway. That’s because the planet is not a smooth uniform sphere. Rocks have different density, and so exert a very slightly different gravitational tug. And at the precision of Eőtvős’s experiments, even the presence of the nearby university buildings could disturb the results. One way of eliminating these local variations is to carry out the measurements for two different orientations of the dangling rod—say, east-west and north-south. Both should experience the same local effects of gravity, but the centrifugal forces will differ—and thus any deviation from weak equivalence would show up as a difference in torque between the two measurements. This approach fits with the general strategy of setting up the balance experiment to be sensitive to differences in gravitational acceleration between two test masses or configurations: That way, you don’t need to worry about local effects or about how accurately you can measure absolute forces.
Local perturbations might, however, also vary in time: Even a passing truck could induce a tiny gravitational disturbance. So the researchers had to take care to rule out such things. In fact, even the presence of the observing experimenter might matter. So the Hungarian scientists would stand well off as the balance came to rest, then dash into the lab to make a measurement before it had time to adjust to their presence (its twisting period was a slow 40 minutes).
Eőtvős built a revised torsion balance that was a masterpiece of precision engineering. On one end of the hanging rod was a standard platinum mass, while the samples of other materials were suspended from the other end. The rod was mounted on a tripod that could pivot to alter its orientation. A telescope and mirror attached to the moving parts could show if any rotation of the rod had occurred. Tiny imbalances in temperature of the environment could induce warping of the apparatus, leading to spurious rotation, and so the whole assembly was encased in a sealed, insulated chamber. To make the experiments even more exquisitely accurate, the researchers later took to conducting them in a darkened, closed room, so that no light could produce temperature variations. What’s more, they put the device inside a double tent insulated with seaweed.
[Image: 11891_6ef1173b096aa200158bfbc8af3ae8e3.png] DISTURBANCE IN THE FORCE: The Eőtvős torsion balance was designed to be extremely sensitive to torque that could be evidence of a fifth force of nature. Credit: Fischbach, E. The fifth force: A personal history. The European Physical Journal H 40, 385-467 (2015).
The Hungarian researchers began their torsion balance experiments in 1889, when they found no detectable rotation due to deviations from inertial-gravitational mass equivalence for masses of several different materials, with an accuracy of one part in 20 million.
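To put a number like "one part in 20 million" in concrete terms, the standard Eőtvős figure of merit compares the free-fall accelerations of the two test bodies; the accelerations below are invented, chosen to land on that sensitivity:

```python
# Eotvos parameter: eta = 2|g_A - g_B| / (g_A + g_B), the fractional
# difference in free-fall acceleration between two test bodies.
g_A = 9.80665000                 # body A (m/s^2), made-up value
g_B = 9.80665049                 # body B, differing in the 8th decimal place
eta = 2 * abs(g_A - g_B) / (g_A + g_B)
print(f"eta = {eta:.1e}")        # ~5.0e-08, i.e. one part in 20 million
```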
So by the end of the 19th century, there seemed to be no reason to doubt the weak equivalence principle. But at that very time, new reasons began appearing. For one thing, the discovery of radioactivity suggested the presence of an unknown source of energy locked inside atoms. What’s more, Einstein’s theory of special relativity offered a new perspective on matter and mass. Mass, it seemed, could be converted to energy—and it was sensitive to velocity, increasing as the speed of an object approached the speed of light. Mindful of all this, in 1906 the Royal Scientific Society of Göttingen in Germany offered a 4,500-mark prize for more sensitive tests of the equivalence of “inertia and gravitation,” citing Eőtvős’ experiments as inspiration.
Quote:It began to seem as though Fischbach was the discoverer of something non-existent.
Eőtvős himself couldn’t resist returning to the fray. “He was the world expert in this kind of experiment,” says Fischbach. He and his students Dezső Pekár and Jenő Fekete in Budapest dusted off their torsion-balance experiments, devoting thousands of hours to testing different materials: copper, water, asbestos, dense wood, and more. They submitted their findings in 1909, claiming an improved accuracy of one part in 200 million. But the full report of the work wasn’t published until 1922, three years after Eőtvős’ death. Another of his students, János Renner, continued the work and published it in Hungarian in 1935, claiming to verify the weak equivalence principle to one part in 2-5 billion.
Was such sensitivity really possible back then? Physicist Robert Dicke, a specialist in general relativity, expressed doubts when he came to tackle the same question in the 1960s. Regardless of whether Dicke’s criticisms are valid, he and his coworkers used a more sophisticated torsion balance that achieved an accuracy of one in 100 billion. They did it by measuring the acceleration of their test masses caused not by the Earth’s gravity but by that of the sun. This meant there was no need to disturb the balance by rotating it: The direction of the gravitational attraction was itself being rotated as the Earth moved around the sun. Any deviation from weak equivalence should have showed up as a signal varying every 24 hours in step with the Earth’s rotation, giving a precise way to discriminate between this and false signals due to local gravitational variations or other disturbances. Dicke and his colleagues saw no sign of such deviations: No indication that Newton’s law of gravity needed amending with a fifth force.
Were physicists satisfied now? Are they ever?

Fischbach became interested in the fifth force after hearing about an experiment performed by his Purdue colleague Roberto Colella and coworkers in 1975, which looked at the effects of Newtonian gravity on subatomic particles. Fischbach wondered whether it would be possible to conduct similar experiments with subatomic particles in a situation where the gravity is strong enough to make general relativity, rather than Newton’s theory, the proper description of gravity—that might then offer a completely new way of testing Einstein’s theory.
He began to think about doing so using exotic particles called kaons and their antimatter siblings anti-kaons, which are produced in particle accelerators. Analyzing studies of kaons at the Fermilab accelerator facility near Chicago led Fischbach to suspect that some kind of new force might be affecting the particles’ behavior, which was sensitive to a quantity called the baryon number, denoted B.
This is a property of fundamental particles that, unlike mass or energy, doesn’t have any everyday meaning. It is equal to a simple arithmetic sum of the number of even more fundamental constituents called quarks and antiquarks that make up the protons and neutrons of atomic nuclei. Here’s the thing, though: If this new force depended on baryon number, it should depend on the chemical composition of materials, since different chemical elements have different numbers of protons and neutrons. More precisely, it would depend on the ratio of B to the masses of the component atoms. Naively it might seem that this ratio should be constant for everything, since atomic mass comes from the sum of protons and neutrons. But actually a small part of the total mass of all those constituents is converted into the energy that binds them together, which varies from atom to atom. So each element has a unique B/mass ratio.
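A quick check with rounded isotope masses from standard tables shows how big those differences actually are:

```python
# Baryon number B (protons + neutrons) divided by atomic mass in unified
# atomic mass units. Binding energy makes the ratio element-dependent --
# precisely the handle a B-coupled fifth force would grab.
isotopes = {            # name: (B = mass number, atomic mass in u)
    "H-1":    (1,     1.007825),
    "Fe-56":  (56,   55.934937),
    "Cu-63":  (63,   62.929598),
    "Pt-195": (195, 194.964792),
}
for name, (B, m) in isotopes.items():
    print(f"{name:>6}: B/mass = {B / m:.6f}")
# H-1 lands near 0.99224, Fe-56 and Cu-63 near 1.0011, Pt-195 near 1.0002:
# small but systematic differences from one substance to the next.
```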
A force that depends on composition … well, wasn’t that what Eőtvős had been looking for? Fischbach decided to go back and look closely at the Hungarian baron’s results. In the fall of 1985, he and his student Carrick Talmadge calculated the B/mass ratio for the substances in the samples of Eőtvős and his students. What they found astonished them.
The Hungarian team had found very small deviations for the measured gravitational acceleration of different substances, but apparently lacking any pattern, suggesting that these were just random errors. But when Fischbach and Talmadge plotted these deviations against the B/mass ratio, they saw a straight-line relationship, suggesting a force that induced a very small repulsion between masses, weakening their gravitational attraction.
[Image: 11892_2d16ad1968844a4300e9a490588ff9f8.png] A SECOND LOOK: Eőtvős and his coworkers measured very slight differences (Δκ) in gravitational acceleration between two equal masses of different composition. But it was only half a century later, when Fischbach and his colleagues plotted these against the difference in the baryon number B divided by the mass (μ) of the two samples, that they ceased to look like random measurement errors and revealed what seemed to be a systematic relationship. Credit: Fischbach, E. The fifth force: A personal history. The European Physical Journal H 40, 385-467 (2015).
The chemical composition of Eőtvős’ samples wasn’t always easy to deduce—for snakewood and “suet,” who could be sure?—but as far as they could see, the relationship stood up. In one of the most striking cases, platinum and copper sulfate crystals turned out to have the same deviation. Everything about these two substances (density and so forth) is different—except for their near-identical B/mass ratio.
Fischbach and Talmadge presented these findings in their headline-grabbing 1986 paper, helped by postdoc Peter Buck whose command of German enabled him to translate the original 1922 report by Eőtvős’ team. The Purdue group’s paper was reviewed by Dicke, who voiced some doubts but felt eventually that it should be published. Dicke later followed up with a paper claiming that the anomalies in the Eőtvős measurements could be explained by temperature gradients in the apparatus. It was hard, though, to see how such everyday environmental effects would end up producing such a convincing-looking correlation with a quantity as exotic as baryon number.
Once the word was out, the world came calling—not only The New York Times but also the legendary Richard Feynman, whose call to Fischbach’s home four days after the paper was published he initially assumed to be a prank. Feynman was unimpressed, and said as much both to Fischbach and in the Los Angeles Times. But for him to show interest at all showed how the Purdue team’s provocative result had got folks talking.

“Considering that our paper was suggesting the presence of a new force in nature,” wrote Fischbach, “it may seem surprising that the refereeing process went as smoothly as it did.” But maybe the path was smoothed by the fact that there were already both theoretical and experimental reasons to suspect a fifth force might exist.
Back in 1955, the Chinese-American physicists T.D. Lee and C.N. Yang, who shared a Nobel prize two years later for their work on fundamental particle interactions, explored the idea of a new force that depended on baryon number, and had even used Eőtvős’ work to set limits on how strong it could be. Lee met Fischbach just over a week after his paper was published, and congratulated him on it.
What’s more, in the late 1970s two geophysicists in Australia, Frank Stacey and Gary Tuck, had made an accurate measurement in a deep mine of the gravitational constant that relates force to masses in Newton’s equation of gravitational attraction. They reported a value significantly different from that measured previously in laboratories. One way of explaining those results was to invoke a new force that acted over distances of a few kilometers. Stacey and Tuck’s measurements were themselves partly inspired by work in the early 1970s by Japanese physicist Yasunori Fujii on the possibility of “non-Newtonian gravity.”
Quote:That’s simply the way physics has always worked: When all else fails, you place a new piece on the board and see how it moves.
After 1986 the hunt was on. If a fifth force indeed acted over distances of tens to thousands of meters, it should be possible to detect deviations from what Newtonian gravity predicts about free fall high above the Earth’s surface. In the late 1980s a team at the United States Air Force laboratory at Hanscom in Bedford, Massachusetts, measured the acceleration due to gravity up a 600-meter television tower in North Carolina and reported evidence for what seemed to be in fact a “sixth force,” for in contrast to Fischbach’s repulsive fifth force it seemed to enhance gravity. After subsequent analysis, however, these claims evaporated.
The most extensive studies were conducted at the University of Washington in Seattle by a team of physicists who, playing on the proper Hungarian pronunciation of “Eőtvős” (close to “Ert-wash”), called themselves the Eot-Wash group. They were co-led by nuclear physicist Eric Adelberger, who “has by now become the world’s leading experimentalist in searching for deviations from the predictions of Newtonian Gravity,” according to Fischbach. The Eot-Wash team used state-of-the-art torsion balances, taking all manner of precautions to eliminate artifacts from their measurements. Result: nothing.
One of the most evocative and suggestive experiments was begun right after the 1986 announcement, by Peter Thieberger of Brookhaven National Laboratory in Upton, New York. He floated a hollow copper sphere in a tank of water and placed it near the edge of a cliff. In 1987 Thieberger reported that the sphere consistently moved in the direction of the edge, where the gravitational attraction by the surrounding rock was smaller—just what you’d expect if there was indeed some repulsive force that counteracted gravity. This was the only corroborating evidence for a fifth force published in a prominent physics journal. Why did it alone see such a thing? That’s still a mystery. “It is not clear what—if anything—was wrong with Thieberger’s experiment,” wrote Fischbach.
By 1988 Fischbach counted no fewer than 45 experiments searching for a fifth force.  Yet five years later only Thieberger’s had produced any sign of it. In a talk to mark the tenth anniversary of the 1986 paper, Fischbach admitted that “There is at present no compelling experimental evidence for any deviation from the predictions of Newtonian gravity … the preponderance of the existing experimental data is incompatible with the presence of any new intermediate-range or long-range forces.”
It began to seem as though, as Fischbach ruefully puts it, he was the discoverer of something non-existent. The mood was captured by physicist Lawrence Krauss, then at Yale University, who responded to the 1986 paper by formally submitting to Physical Review Letters a spoof paper claiming to have re-analyzed Galileo’s experiments on the acceleration of balls rolling downhill under gravity, reported in his 1638 book Discourses on Two New Sciences, and to have found evidence for a “third force” (in addition to gravity and electromagnetism). The paper was rejected by the journal in the same spirit as it was submitted: on the basis of six spoof referees’ reports clearly written in house.

After a few decades of almost universal non-detection of a fifth force, you might think the game is over. But if anything, reasons to believe in a fifth force have become ever more attractive and diverse as physicists seek to extend the foundations of their science. “There are now thousands of papers suggesting new fundamental interactions that could be a source of a fifth force,” says Fischbach. “The theoretical motivation is quite overwhelming.”
For example, the latest theories that attempt to extend physics beyond the “standard model,” which accounts for all the known particles and their interactions, throw up several possibilities for new interactions as they attempt to uncover the next layer of reality. Some of those theories predict new particles that could act as the “carriers” of previously unknown forces, just as the electromagnetic, strong, and weak forces are known to be associated with “force particles” such as the photon.
A family of models predicting deviations from Newtonian gravity, called Modified Newtonian Dynamics (MOND), has also been put forward to account for some aspects of the movements of stars in galaxies that are otherwise conventionally explained by invoking a hypothetical “dark matter” that interacts with ordinary matter only (or perhaps almost only) via gravitational attraction. No clear evidence has been discovered to support MOND theories, but some physicists have found them increasingly promising as extensive searches for dark-matter particles have yielded no sign.
Quote:Were physicists satisfied now? Are they ever?
Alternatively, says Feng, a fifth force might help us find out about dark matter itself. As far as we know, dark matter only interacts with other matter through gravity. But if it turned out to feel a fifth force too, then, Feng says, “it could provide a ‘portal’ through which we can finally interact with dark matter in a way that is not purely gravitational, so we can understand what dark matter is.”
What’s more, some theories that invoke extra dimensions of space beyond our familiar three—such as the currently most favored versions of string theory—predict that there could be forces similar to but considerably stronger than gravity acting over short distances of millimeters or less.
That’s the scale at which some researchers are now looking. It means measuring the forces, with extraordinary precision, between small masses separated by very small gaps. Three years ago Fischbach and colleagues set out to do this for tiny particles just 40 to 8,000 millionths of a millimeter apart. The difficulty with such measurements is that there is already a force of attraction between objects this close, called the Casimir force. This has the same origin as the so-called van der Waals forces that operate at even closer approach, and which stick molecules together weakly. These forces come from the synchronized sloshing of clouds of electrons in the objects, which give rise to electrostatic attraction because of the electrons’ charge. Casimir forces are basically what van der Waals forces become when the objects are far enough apart—more than a few nanometers—for the time delay between the electron fluctuations across the gap to matter.
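For a sense of the background such experiments fight against, here is the ideal parallel-plate Casimir pressure; this is the textbook formula, not the sphere-plate geometry the Purdue group actually used:

```python
import math

hbar, c = 1.054571817e-34, 2.99792458e8      # SI units

def casimir_pressure(d):
    """Ideal Casimir pressure (Pa) between perfectly conducting parallel
    plates separated by d meters: P = pi^2 * hbar * c / (240 * d^4)."""
    return math.pi**2 * hbar * c / (240 * d**4)

for d_nm in (40, 100, 1000):                 # spans the 40-8,000 nm range above
    print(f"d = {d_nm:5d} nm -> P = {casimir_pressure(d_nm * 1e-9):.3g} Pa")
# The 1/d^4 growth is the wall any fifth-force signal must be dug out from under.
```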
Fischbach and his coworkers found a way to suppress the Casimir force, making it about a million times weaker by coating their test masses with a layer of gold. They attached a gold-coated sapphire bead about 150 thousandths of a millimeter in radius to a solid plate, whose motions could be detected electronically. Then they rotated a microscopic disk patterned with patches of gold and silicon just below the bead. If there were any differences in the force exerted by the gold and silicon, that should produce a vibration of the bead. They saw no such effects, which meant they could place even more stringent limits on the possible strength of a material-dependent fifth force at these microscopic scales.
Torsion-balance measurements can be used in this region, too. Researchers at the Institute for Cosmic Ray Research at the University of Tokyo have used the device to look for deviations from the standard Casimir force caused by a fifth force. All they found were yet stricter lower limits on how strong such a force can be.
As well as detecting a fifth force directly, it might still be possible to spot it the way Fischbach originally thought to look: through the high-energy collisions of fundamental particles. In 2015 a team at the Institute for Nuclear Research in Debrecen, Hungary, led by Attila Krasznahorkay, reported something unexpected when an unstable form of beryllium atoms, formed by firing protons at a lithium foil, decays by emitting pairs of electrons and their antimatter counterparts positrons. There was a rise in the number of electron-positron pairs ejected from the sample at an angle of about 140 degrees, which standard theories of nuclear physics couldn’t explain.
The results were all but ignored until Feng and his coworkers suggested last year that they could be accounted for by the ephemeral formation of a new “force particle” which then quickly decays into an electron and a positron. In other words, this hypothetical particle would carry a fifth force, with a very short range of just a few trillionths of a millimeter.
Although they haven’t yet been replicated by other researchers, the Hungarian findings look pretty solid. The chance that they are just a random statistical fluctuation is tiny, says Feng: about 1 in 100 billion. “More than that, the data fit beautifully the hypothesis that they’re caused by a new particle,” he says. “If such a new particle exists, this is exactly how it would come to light.” Schlamminger agrees that Feng’s interpretation of the Hungarian observations was “one of the exciting things that happened in 2016.”
“We have yet to confirm it is a new particle,” admits Feng, “but it would be revolutionary if true—the biggest discovery in particle physics in at least 40 years.” His theoretical work predicts that the putative new particle is just 33 times heavier than the electron. If so, it shouldn’t be hard to make in particle collisions—but it would be hard to see. “It is very weakly interacting, and we’ve shown that it would have eluded all previous experiments,” says Feng. Perhaps, he adds, it could be sought at colliders such as the Large Hadron Collider at the particle-physics center CERN in Geneva.
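For scale, converting Feng's "33 times heavier than the electron" into conventional units takes one line (the electron rest energy, 0.511 MeV, is the only input):

```python
m_e = 0.511                        # electron rest energy, MeV/c^2
print(f"{33 * m_e:.1f} MeV/c^2")   # ~16.9 MeV/c^2 -- feather-light by collider
# standards: easy to produce, as the article says, but hard to see because
# the particle would interact so weakly.
```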
The hypothesis of a fifth force is, then, anything but exhausted. In fact it’s fair to say that any observations in fundamental physics or cosmology that can’t be explained by our current theories—by the Standard Model of particle physics or by general relativity—are apt to get physicists talking about new forces or new types of matter, such as dark matter and dark energy. That’s simply the way physics has always worked: When all else fails, you place a new piece on the board and see how it moves. Sure, we haven’t yet seen any convincing evidence for a fifth force, but neither have we seen a direct sign of dark matter or supersymmetry or extra dimensions, and not for want of looking. We have ruled out a great deal of the territory that a fifth force might inhabit, but there is still plenty of terrain left in shadow.
[Image: 11893_0f95f7ad389a372d9876a2ddb2551a43.png] PAINTED INTO A CORNER: Limits on the possible strength of a fifth force α at large (left) and small (right) scales. The yellow regions show the excluded zones, with boundary labels referring to individual experiments. The dashed lines for small scales show some possible magnitudes of a fifth force predicted by some theories. Credit: Fischbach, E. The fifth force: A personal history. The European Physical Journal H 40, 385-467 (2015).
At any rate, the search continues. In April 2016, the European Space Agency launched a French satellite called Microscope that aims to test the weak equivalence principle in space with unprecedented accuracy. It will place two nested pairs of metal cylinders in free fall: One pair is made of the same heavy platinum-rhodium alloy, the other has an outer cylinder of lighter titanium-vanadium-aluminum. If the cylinders fall at a rate that depends ever so slightly on the material—so that deviations from the weak equivalence principle occur at a level of one part in a thousand trillion, about 100 times smaller than is detectable in current Earth-based experiments—it should be possible to measure the differences with electrical sensors on the satellite.
“String-theory models predict WEP violations below one part in 10 trillion,” says Joel Bergé, a scientist at the French Centre for Aerospace Research (ONERA) that manages the Microscope project. He says that the scientific operations of the mission began last November and the first results should be published this summer.
Despite such high-tech studies, it’s the Eőtvős torsion-balance experiments that Fischbach keeps returning to. Back then, the Hungarians had no theoretical motivation to expect a composition-dependent fifth force—nothing that could have subconsciously swayed them in their incredibly delicate work. “Whatever we need to explain their data simply didn’t and couldn’t conceptually exist then,” says Fischbach. And yet they did seem to see something—not a random scatter of results, but a systematic deviation. “I keep thinking, maybe I’m missing something about what they did,” says Fischbach. “It’s still a puzzle.”

Philip Ball is a writer based in London. His latest book is The Water Kingdom: A Secret History of China.

http://54.197.248.184/issue/46/balance/t...y-a-thread



Quote:just 33 times heavier = all ma'at @ that.

[Image: maat-scales-600x272.jpg]


Ancient Egyptian scales are topped with the head of Ma’at, the goddess of truth, justice and balance. A dead person’s heart is weighed against a feather to see if the owner is worthy to enter paradise. Ma’at’s symbolism is still apparent in the western personification of Lady Justice.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#36
Quote: That’s simply the way physics has always worked: When all else fails, you place a new piece on the board and see how it moves.
Hi 
Eye do this everyday no hocus pocus@33

New twist on sofa problem that stumped mathematicians and furniture movers
March 20, 2017 by Becky Oskin

[Image: newtwistonso.gif]
The Moving Sofa problem asks, what is the largest shape that can move around a right-angled turn? UC Davis mathematician Dan Romik has extended this problem to a hallway with two turns, and shows that a 'bikini top' shaped sofa is the largest so far found that can move down such a hallway. Credit: Dan Romik, UC Davis
Most of us have struggled with the mathematical puzzle known as the "moving sofa problem." It poses a deceptively simple question: What is the largest sofa that can pivot around an L-shaped hallway corner?



A mover will tell you to just stand the sofa on end. But imagine the sofa is impossible to lift, squish or tilt. Although it still seems easy to solve, the moving sofa problem has stymied math sleuths for more than 50 years. That's because the challenge for mathematicians is both finding the largest sofa and proving it to be the largest. Without a proof, it's always possible someone will come along with a better solution.
"It's a surprisingly tough problem," said math professor Dan Romik, chair of the Department of Mathematics at UC Davis. "It's so simple you can explain it to a child in five minutes, but no one has found a proof yet.
The largest area that will fit around a corner is called the "sofa constant" (yes, really). It is measured in units where one unit corresponds to the width of the hallway.
Inspired by his passion for 3-D printing, Romik recently tackled a twist on the sofa problem called the ambidextrous moving sofa. In this scenario, the sofa must maneuver around both left and right 90-degree turns. His findings are published online and will appear in the journal Experimental Mathematics.
[Image: 1-newtwistonso.gif]
The Gerver sofa is the largest found that will fit round a single turn. It has a “sofa constant” of 2.22 units, where one unit represents the width of the hallway. Credit: Dan Romik/UC Davis
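For reference, the areas of the classic one-turn candidates, in units where the hallway is 1 wide; the closed forms are standard, and Gerver's constant is known only numerically:

```python
import math

sofas = {
    "unit square":     1.0,                        # trivially turns the corner
    "semicircle":      math.pi / 2,                # ~1.5708
    "Hammersley sofa": math.pi / 2 + 2 / math.pi,  # ~2.2074, telephone-shaped
    "Gerver sofa":     2.2195,                     # best known, unproven maximum
}
for name, area in sofas.items():
    print(f"{name:>15}: {area:.4f}")
```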
Eureka Moment
Romik, who specializes in combinatorics, enjoys pondering tough questions about shapes and structures. But it was a hobby that sparked Romik's interest in the moving sofa problem—he wanted to 3-D print a sofa and hallway. "I'm excited by how 3-D technology can be used in math," said Romik, who has a 3-D printer at home. "Having something you can move around with your hands can really help your intuition."
The Gerver sofa—which resembles an old telephone handset—is the biggest sofa found to date for a one-turn hallway. As Romik tinkered with translating Gerver's equations into something a 3-D printer can understand, he became engrossed in the mathematics underlying Gerver's solution. Romik ended up devoting several months to developing new equations and writing computer code that refined and extended Gerver's ideas. "All this time I did not think I was doing research. I was just playing around," he said. "Then, in January 2016, I had to put this aside for a few months. When I went back to the program in April, I had a lightbulb flash. Maybe the methods I used for the Gerver sofa could be used for something else."


Romik decided to tackle the problem of a hallway with two turns. When tasked with fitting a sofa through the hallway corners, Romik's software spit out a shape resembling a bikini top, with symmetrical curves joined by a narrow center. "I remember sitting in a café when I saw this new shape for the first time," Romik said. "It was such a beautiful moment."
Finding Symmetry
Like the Gerver sofa, Romik's ambidextrous sofa is still only a best guess. But Romik's findings show the question can still lead to new mathematical insights. "Although the moving sofa problem may appear abstract, the solution involves new mathematical techniques that can pave the way to more complex ideas," Romik said. "There's still lots to discover in math."
More information: Dan Romik, Differential Equations and Exact Solutions in the Moving Sofa Problem, Experimental Mathematics (2017). DOI: 10.1080/10586458.2016.1270858 
Provided by: UC Davis


Read more at: https://phys.org/news/2017-03-sofa-problem-stumped-mathematicians-furniture.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#37
Chasing each one of our individual passions leads humanity to its best.

Should buy Stock in Tesla or one of the other neuron-net companies ... I'm signed up for POST death scientific experiments..should I volunteer while alive?

Hmm2


Bob... Ninja Bong7bp Meds Assimilated
"The Light" - Jefferson Starship-Windows of Heaven Album
I'm an Earthling with a Martian Soul wanting to go Home.   
You have to turn your own lightbulb on. ©stevo25 & rhw007
Reply
#38
Solving one of nature's great puzzles: What drives the accelerating expansion of the universe?
May 15, 2017

[Image: solvingoneof.jpg]
Credit: NASA
UBC physicists may have solved one of nature's great puzzles: what causes the accelerating expansion of our universe?



PhD student Qingdi Wang has tackled this question in a new study that tries to resolve a major incompatibility issue between two of the most successful theories that explain how our universe works: quantum mechanics and Einstein's theory of general relativity.
The study suggests that if we zoomed in, way in, on the universe, we would realize it's made up of constantly fluctuating space and time.
"Space-time is not as static as it appears, it's constantly moving," said Wang.
"This is a new idea in a field where there hasn't been a lot of new ideas that try to address this issue," said Bill Unruh, a physics and astronomy professor who supervised Wang's work.
In 1998, astronomers found that our universe is expanding at an ever-increasing rate, implying that space is not empty and is instead filled with dark energy that pushes matter away.
The most natural candidate for dark energy is vacuum energy. When physicists apply the theory of quantum mechanics to vacuum energy, it predicts that there would be an incredibly large density of vacuum energy, far more than the total energy of all the particles in the universe. If this is true, Einstein's theory of general relativity suggests that the energy would have a strong gravitational effect and most physicists think this would cause the universe to explode.
Fortunately, this doesn't happen and the universe expands very slowly. But it is a problem that must be resolved for fundamental physics to progress.
Unlike other scientists who have tried to modify the theories of quantum mechanics or general relativity to resolve the issue, Wang and his colleagues Unruh and Zhen Zhu, also a UBC PhD student, suggest a different approach. They take the large density of vacuum energy predicted by quantum mechanics seriously and find that there is important information about vacuum energy that was missing in previous calculations.
Their calculations provide a completely different physical picture of the universe. In this new picture, the space we live in is fluctuating wildly. At each point, it oscillates between expansion and contraction. As it swings back and forth, the two almost cancel each other but a very small net effect drives the universe to expand slowly at an accelerating rate.
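A toy average, mine rather than the paper's calculation, shows how violent swings with a minuscule bias leave only the bias behind:

```python
import numpy as np

# Let the local expansion rate swing between roughly -1 and +1 with a tiny
# positive offset. Averaged over many whole cycles, the swings cancel.
n = 1_000_000
rate = np.cos(2 * np.pi * np.arange(n) / 100) + 1e-5   # big swings, tiny bias
print(rate.min(), rate.max())    # about -1 and +1: wild oscillation
print(rate.mean())               # ~1e-5: the small net expansion that survives
```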
But if space and time are fluctuating, why can't we feel it?
"This happens at very tiny scales, billions and billions times smaller even than an electron," said Wang.
"It's similar to the waves we see on the ocean," said Unruh. "They are not affected by the intense dance of the individual atoms that make up the water on which those waves ride."
Their paper was published last week in Physical Review D: https://journals.aps.org/prd/abstract/10.1103/PhysRevD.95.103504.
Explore further: Quest to settle riddle over Einstein's theory may soon be over
More information: Qingdi Wang et al, How the huge energy of quantum vacuum gravitates to drive the slow accelerating expansion of the Universe, Physical Review D (2017). DOI: 10.1103/PhysRevD.95.103504 
Journal reference: Physical Review D
Provided by: University of British Columbia



Read more at: https://phys.org/news/2017-05-nature-great-puzzles-expansion-universe.html#jCp


ABSTRACT

We investigate the gravitational property of the quantum vacuum by treating its large energy density predicted by quantum field theory seriously and assuming that it does gravitate to obey the equivalence principle of general relativity. We find that the quantum vacuum would gravitate differently from what people previously thought. The consequence of this difference is an accelerating universe with a small Hubble expansion rate H ∝ Λ e^(−βGΛ²) → 0 instead of the previous prediction H = √(8πGρ_vac/3) ∝ √G Λ² → ∞, which was unbounded as the high energy cutoff Λ is taken to infinity. In this sense, at least the "old" cosmological constant problem would be resolved. Moreover, it gives the observed slow rate of the accelerating expansion as Λ is taken to be some large value of the order of Planck energy or higher. This result suggests that there is no necessity to introduce the cosmological constant, which is required to be fine tuned to an accuracy of 10^−120, or other forms of dark energy, which are required to have peculiar negative pressure, to explain the observed accelerating expansion of the Universe.
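
As a rough illustration of the scale of the problem (a back-of-the-envelope sketch with standard constants, not the paper's calculation), plugging a Planck-scale vacuum energy density into the conventional formula H = √(8πGρ_vac/3) gives an absurdly large expansion rate:

Code:
# Naive vacuum-energy Hubble rate with a Planck-scale cutoff. Illustrative
# values only; the paper's point is that this H is the wrong prediction.
import math

G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s

rho_vac = c**5 / (hbar * G**2)                      # Planck density, ~5e96 kg/m^3
H_naive = math.sqrt(8 * math.pi * G * rho_vac / 3)  # ~1e44 s^-1
H_obs   = 2.2e-18                                   # observed Hubble rate, s^-1
print(f"naive H ~ {H_naive:.1e} s^-1, ~{H_naive / H_obs:.0e} times the observed rate")

The exponentially suppressed rate H ∝ Λ e^(−βGΛ²) in the abstract is the authors' proposed replacement for this runaway prediction.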
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#39
[Image: hydrogen-bond-may_1024.jpg]
University of Basel, Department of Physics

Hydrogen Bonds Have Been Directly Detected For The First Time

We've hit next-level physics.

BEC CREW
15 MAY 2017


For the first time ever, physicists have managed to directly detect a hydrogen bond within a single molecule - meaning we can now observe the smallest and most abundant element in the Universe in ways that scientists could only ever theorise about.
The experiment also reveals just how sensitive our imaging devices have become - hydrogen bonds are far weaker than chemical bonds, and until now, it's been impossible to see them. Now, scientists can visualise them so clearly with an atomic force microscope that they can measure their exact force.

Of all the elements in the Universe that scientists are striving to get a better grasp on, hydrogen is arguably at the top of the pile.
Hydrogen makes up 75 percent of all the visible mass in the Universe, and more than 90 percent of all the atoms.
It easily forms compounds with nearly all non-metallic elements on the periodic table, and its bonds with oxygen and carbon are why any of us exist at all.
You can also thank hydrogen bonds for your very stable DNA double helix structure - millions of hydrogen bonds are the reason your DNA base pairs stay intact, which means it really is one of the fundamental building blocks of life as we know it.
But there have been two major challenges when it comes to studying hydrogen bonds in their purest form: hydrogen is as small as atoms get; and its weak bonds are very easily broken, particularly when it comes to studying single molecules.
"The hydrogen atom - the smallest and most abundant atom - is of utmost importance in physics and chemistry," say researchers from the University of Basel's Swiss Nanoscience Institute.

"Although many analysis methods have been applied to its study, direct observation of hydrogen atoms in a single molecule remains largely unexplored."
Using hydrocarbon compounds called propellanes, with configurations that resemble a propeller, the Swiss team has successfully measured the force and distance between an oxygen atom and two hydrogen atoms.
"Our ... calculations confirm the signature of directional bonding, characteristic of very weak hydrogen bonding," the researchers report.
"The direct measurement of the interaction with a hydrogen atom paves the way for the identification of three-dimensional molecules such as DNAs and polymers."
So how did they do it?
They selected hydrocarbon compounds that always arrange themselves to have two hydrogen atoms pointing upwards.

You can see the side-on view of the propeller shape here, with the hydrogen atoms in white (the second hydrogen bond pointing upwards is obstructed behind the first one):
[Image: prop-may-new.jpg]

Shigeki Kawai et al./Science Advances
The team then subjected this molecule to an atomic force microscope (AFM), which is a very high-resolution type of scanning probe microscopy that's able to visualise and measure minuscule forces.
They augmented the tip of the microscope with carbon monoxide, which made it extremely sensitive to hydrogen. When the tip was brought close enough to these hydrogen atoms, hydrogen bonds formed and could be directly examined.
In this image, you can see both hydrogen atoms pointing upwards:
[Image: hydrogenbond-new.jpg]

A hydrogen bond forms between a propellane (lower molecule) and the carbon monoxide microscope tip (upper molecule) Credit: University of Basel, Department of Physics
You can see this in the above illustration, with the carbon monoxide tip above forming a bond with the hydrocarbon 'propellane' compound below.
When the researchers compared their results to established calculations of hydrogen bonds in this kind of molecule, they matched exactly.
As the researchers point out, "[H]ydrocarbons are one of the most varied and functionalised products at the heart of engineering, chemistry, and life, and hydrogen is often critical in their function."
Now that we can directly measure hydrogen bonds, we're about to see one of the most fundamental building blocks of the Universe in a whole new light, and we can't wait to see where this next-level physics takes us next.
The research has been published in Science Advances.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#40
Test of general relativity could potentially generate new gravitational models
May 29, 2017

[Image: testofgenera.png]
The orbits of two stars, S0-2 and S0-38 located near the Milky Way’s supermassive black hole will be used to test Einstein’s theory of General Relativity and potentially generate new gravitational models. Credit: S. Sakai/A.Ghez/W. M. Keck Observatory/ UCLA Galactic Center Group
A UCLA-led team has discovered a new way of probing the hypothetical fifth force of nature using two decades of observations at W. M. Keck Observatory, the world's most scientifically productive ground-based telescope.



There are four known forces in the universe: electromagnetic force, strong nuclear force, weak nuclear force, and gravitational force. Physicists know how to make the first three work together, but gravity is the odd one out. For decades, there have been theories that a fifth force ties gravity to the others, but no one has been able to prove it thus far.
"This is really exciting. It's taken us 20 years to get here, but now our work on studying stars at the center of our galaxy is opening up a new method of looking at how gravity works," said Andrea Ghez, Director of the UCLA Galactic Center Group and co-author of the study.
The research is published in the current issue of Physical Review Letters.
Ghez and her co-workers analyzed extremely sharp images of the center of our galaxy taken with Keck Observatory's adaptive optics (AO). Ghez used this cutting-edge system to track the orbits of stars near the supermassive black hole located at the center of the Milky Way. Their stellar paths, driven by gravity created by the supermassive black hole, could give clues to the fifth force.
"By watching the stars move over 20 years using very precise measurements taken from Keck Observatory data, you can see and put constraints on how gravity works. If gravitation is driven by something other than Einstein's theory of General Relativity, you'll see small variations in the orbital paths of the stars," said Ghez.
This is the first time the fifth force theory has been tested in a strong gravitational field such as the one created by the supermassive black hole at the center of the Milky Way. Historically, measurements of our solar system's gravity created by our sun have been used to try and detect the fifth force, but that has proven difficult because its gravitational field is relatively weak.
"It's exciting that we can do this because we can ask a very fundamental question – how does gravity work?" said Ghez. "Einstein's theory describes it beautifully well, but there's lots of evidence showing the theory has holes. The mere existence of supermassive black holes tells us that our current theories of how the universe works are inadequate to explain what a black hole is."
Ghez and her team, including lead author Aurelien Hees and co-author Tuan Do, both of UCLA, are looking forward to the summer of 2018. That is when the star S0-2 will be at its closest distance to our galaxy's supermassive black hole. This will allow the team to witness the star being pulled at maximum gravitational strength – a point where any deviation from Einstein's theory is expected to be greatest.
Explore further: Astronomers solve puzzle about bizarre object at the center of our galaxy
More information: A. Hees et al. Testing General Relativity with Stellar Orbits around the Supermassive Black Hole in Our Galactic Center, Physical Review Letters (2017). DOI: 10.1103/PhysRevLett.118.211101 
Journal reference: Physical Review Letters
Provided by: W. M. Keck Observatory



Read more at: https://phys.org/news/2017-05-relativity-potentially-gravitational.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#41
Physicists review three experiments that hint at a phenomenon beyond the Standard Model of particle physics
June 8, 2017

[Image: 59398308ceff0.png]
Event display recorded by the BaBaR detector showing the decays of two B mesons into various subatomic particles, including a muon and a neutrino. Credit: SLAC National Accelerator Laboratory
To anyone but a physicist, it sounds like something out of "Star Trek." But lepton universality is a real thing.



It has to do with the Standard Model of particle physics, which describes and predicts the behavior of all known particles and forces, except gravity. Among them are charged leptons: electrons, muons and taus.
A fundamental assumption of the Standard Model is that the interactions of these elementary particles are the same despite their different masses and lifetimes. That's lepton universality. Precision tests comparing processes involving electrons and muons have not revealed any definite violation of this assumption, but recent studies of the higher-mass tau lepton have produced observations that challenge the theory.
A new review of results from three experiments points to the strong possibility that lepton universality—and perhaps ultimately the Standard Model itself—may have to be revised. The findings by a team of international physicists, including UC Santa Barbara postdoctoral scholar Manuel Franco Sevilla, appear in the journal Nature.
"As part of my doctoral thesis at Stanford, which was based on earlier work carried out at UCSB by professors Jeff Richman and Michael Mazur, we saw the first significant observation of something beyond the Standard Model at the BaBaR experiment conducted at the SLAC National Accelerator Laboratory," Franco Sevilla said. This was significant but not definitive, he added, noting that similar results were seen in more recent experiments conducted in Japan (Belle) and in Switzerland (LHCb). According to Franco Sevilla, the three experiments, taken together, demonstrate a stronger result that challenges lepton universality at the level of four standard deviations, which indicates a 99.95 percent certainty.
BaBaR, which stands for B-Bbar (anti-B) detector, and Belle were carried out in B factories. These particle colliders are designed to produce and detect B mesons—unstable particles that result when powerful particle beams collide—so their properties and behavior can be measured with high precision in a clean environment. The LHCb (Large Hadron Collider b) provided a higher-energy environment that more readily produced B mesons and hundreds of other particles, making identification more difficult.
Nonetheless, the three experiments, which measured the relative ratios of B meson decays, posted remarkably similar results. The rates for some decays involving the heavy lepton tau, relative to those involving the light leptons—electrons or muons—were higher than the Standard Model predictions.
"The tau lepton is key because the electron and the muon have been well measured," Franco Sevilla explained. "Taus are much harder because they decay very quickly. Now that physicists are able to better study taus, we're seeing that perhaps lepton universality is not satisfied as the Standard Model claims."
While intriguing, the results are not considered sufficient to establish a violation of lepton universality. To overturn this long-held physics precept would require a significance of at least five standard deviations. However, Franco Sevilla noted, the fact that all three experiments observed a higher-than-expected tau decay rate while operating in different environments is noteworthy.
A confirmation of these results would point to new particles or interactions and could have profound implications for the understanding of particle physics. "We're not sure what confirmation of these results will mean in the long term," Franco Sevilla said. "First, we need to make sure that they're true and then we'll need ancillary experiments to determine the meaning."
Explore further: SLAC particle physicist discusses the search for new physics
More information: Gregory Ciezarek et al. A challenge to lepton universality in B-meson decays, Nature (2017). DOI: 10.1038/nature22346
F. Archilli et al. Flavour-changing neutral currents making and breaking the standard model, Nature (2017). DOI: 10.1038/nature21721 
Journal reference: Nature
Provided by: University of California - Santa Barbara



Read more at: https://phys.org/news/2017-06-physicists-hint-phenomenon-standard-particle.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#42
LHCb experiment announces observation of a new particle with two heavy quarks
July 6, 2017

[Image: thelhcbexper.png]
Credit: CERN
Today at the EPS Conference on High Energy Physics in Venice, the LHCb experiment at CERN's Large Hadron Collider reported the observation of Ξcc++ (Xicc++), a new particle containing two charm quarks and one up quark. The existence of this particle, a member of the baryon family, was expected by current theories, but physicists had been searching for such baryons with two heavy quarks for many years. The mass of the newly identified particle is about 3621 MeV, almost four times heavier than the most familiar baryon, the proton, a property that arises from its doubly charmed quark content. It is the first time that such a particle has been unambiguously detected.
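
A quick check of the mass comparison (the proton mass below is a standard reference value, not a number from the LHCb paper):

Code:
m_xicc   = 3621.0   # reported Xicc++ mass, MeV/c^2
m_proton = 938.27   # proton mass, MeV/c^2
print(f"mass ratio: {m_xicc / m_proton:.2f}")   # ~3.86, i.e. almost four times heavier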


Nearly all the matter that we see around us is made of baryons, which are common particles composed of three quarks, the best-known being protons and neutrons. But there are six types of existing quarks, and theoretically many different potential combinations could form other kinds of baryons. Baryons so far observed are all made of, at most, one heavy quark.
"Finding a doubly heavy-quark baryon is of great interest as it will provide a unique tool to further probe quantum chromodynamics, the theory that describes the strong interaction, one of the four fundamental forces," said Giovanni Passaleva, new Spokesperson of the LHCb collaboration. "Such particles will thus help us improve the predictive power of our theories."
"In contrast to other baryons, in which the three quarks perform an elaborate dance around each other, a doubly heavy baryon is expected to act like a planetary system, where the two heavy quarks play the role of heavy stars orbiting one around the other, with the lighter quark orbiting around this binary system," added Guy Wilkinson, former Spokesperson of the collaboration.
Measuring the properties of the Ξcc++ will help to establish how a system of two heavy quarks and a light quark behaves. Important insights can be obtained by precisely measuring production and decay mechanisms, and the lifetime of this new particle.
The observation of this new baryon proved to be challenging and has been made possible owing to the high production rate of heavy quarks at the LHC and to the unique capabilities of the LHCb experiment, which can identify the decay products with excellent efficiency. The Ξcc++ baryon was identified via its decay into a Λc+ baryon and three lighter mesons K-, π+ and π+.
The observation of the Ξcc++ at LHCb raises expectations of detecting other members of the family of doubly heavy baryons. They will now be searched for at the LHC.
This result is based on 13 TeV data recorded during run 2 at the Large Hadron Collider, and confirmed using 8 TeV data from run 1. The collaboration has submitted a paper reporting these findings to the journal Physical Review Letters.
Explore further: LHCb observes an exceptionally large group of particles
More information: Paper: press.cern/sites/press.web.cer … paper_2017.07.06.pdf 
Journal reference: Physical Review Letters
Provided by: CERN



Read more at: https://phys.org/news/2017-07-lhcb-particle-heavy-quarks.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#43
...


Quote:"In contrast to other baryons, 
in which the three quarks,
perform an elaborate dance around each other, 

a doubly heavy baryon Cheers
is expected to act like a planetary system Applause 
where the two heavy quarks  Luv
play the role of heavy stars orbiting one around the other Stars 
with the lighter quark  Angel
orbiting around this binary system," 

added Guy Wilkinson, 
former Spokesperson of the collaboration.




they are going to find so many more subatomic particles in the next 25 years ...
ie
they really know very little still,
though they make it sound like they are almost there ...
on the verge
on the verge
on the verge
the harmonic convergence  
is
inevitable,
and infinitely woven and entangled in the chaos.

they are going to find so many more subatomic particles in the next 25 years ...
...
Reply
#44
New supernova analysis reframes dark energy debate
September 13, 2017

[Image: newsupernova.png]
The difference in the magnitudes of supernovae in the ΛCDM and Timescape cosmologies and the magnitudes the supernovae would appear to have in an empty universe (horizontal dashed line). Both models show recent apparent acceleration following earlier deceleration. In the Timescape model this is not a real effect, however, and the curve is flatter than the ΛCDM case. Credit: Lawrence Dam, Asta Heinesen and David Wiltshire
The accelerating expansion of the Universe may not be real, but could just be an apparent effect, according to new research published in the journal Monthly Notices of the Royal Astronomical Society. The new study—by a group at the University of Canterbury in Christchurch, New Zealand—finds the fit of Type Ia supernovae to a model universe with no dark energy to be very slightly better than the fit to the standard dark energy model.



Dark energy is usually assumed to form roughly 70% of the present material content of the Universe. However, this mysterious quantity is essentially a place-holder for unknown physics.
Current models of the Universe require this dark energy term to explain the observed acceleration in the rate at which the Universe is expanding. Scientists base this conclusion on measurements of the distances to supernova explosions in distant galaxies, which appear to be farther away than they should be if the Universe's expansion were not accelerating.
However, just how statistically significant this signature of cosmic acceleration is has been hotly debated in the past year. The previous debate pitted the standard Lambda Cold Dark Matter (ΛCDM) cosmology against an empty universe whose expansion neither accelerates nor decelerates. Both of these models, though, assume a simplified, 100-year-old cosmic expansion law: Friedmann's equation.
Friedmann's equation assumes an expansion identical to that of a featureless soup, with no complicating structure. However, the present Universe actually contains a complex cosmic web of galaxy clusters in sheets and filaments that surround and thread vast empty voids.
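For reference, Friedmann's equation for a flat universe reads H² = (8πG/3)ρ. A minimal sketch (textbook values, not the study's numbers) shows the uniform "soup" density it ties to today's expansion rate:

Code:
# Invert the flat-universe Friedmann equation to get the critical density.
import math

G  = 6.674e-11             # gravitational constant, m^3 kg^-1 s^-2
H0 = 70 * 1e3 / 3.086e22   # 70 km/s/Mpc converted to s^-1

rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"critical density ~ {rho_crit:.1e} kg/m^3")   # ~9e-27, a few protons per cubic meter

The timescape approach questions whether a single averaged density, and hence a single expansion law of this form, remains adequate once the voids and filaments of the cosmic web are taken into account.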
[Image: newsupernova.jpg]
This is a computer-simulated image depicting one possible scenario of how light sources are distributed in the cosmic web. Credit: Andrew Pontzen and Fabio Governato / Wikimedia Commons (CC BY 2.0)
Prof David Wiltshire, who led the study from the University of Canterbury in Christchurch, said, "The past debate missed an essential point; if dark energy does not exist then a likely alternative is that the average expansion law does not follow Friedmann's equation."
Rather than comparing the standard ΛCDM cosmological model with an empty universe, the new study compares the fit of supernova data in ΛCDM to a different model, called the 'timescape cosmology'. This has no dark energy. Instead, clocks carried by observers in galaxies differ from the clock that best describes average expansion once the lumpiness of structure in the Universe becomes significant. Whether or not one infers accelerating expansion then depends crucially on the clock used.
The timescape cosmology was found to give a slightly better fit to the largest supernova data catalogue than the ΛCDM cosmology. Unfortunately the statistical evidence is not yet strong enough to rule definitively in favour of one model or the other, but future missions such as the European Space Agency's Euclid satellite will have the power to distinguish between the standard cosmology and other models, and help scientists to decide whether dark energy is real or not.
Deciding that not only requires more data, but also a better understanding of the properties of supernovae, which currently limit the precision with which they can be used to measure distances. On that score, the new study shows significant unexpected effects which are missed if only one expansion law is applied. Consequently, even as a toy model the timescape cosmology provides a powerful tool to test our current understanding, and casts new light on our most profound cosmic questions.
Explore further: Can we ditch dark energy by better understanding general relativity?
More information: Lawrence H. Dam et al, Apparent cosmic acceleration from type Ia supernovae, Monthly Notices of the Royal Astronomical Society (2017). DOI: 10.1093/mnras/stx1858 
Journal reference: Monthly Notices of the Royal Astronomical Society
Provided by: Royal Astronomical Society



Read more at: https://phys.org/news/2017-09-supernova-analysis-reframes-dark-energy.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#45
Major gravitational waves announcement expected – but what will it be?


[Image: 59e239c1fc7e9334688b4567.jpg]


https://www.rt.com/news/406700-gravitati...ouncement/


Can't WAIT until Monday?

http://ligo.org/science/Publication-GW170814/index.php

[Image: virgo.png]

Or here is pdf of the science:

https://dcc.ligo.org/public/0145/P170814...170814.pdf


A bit of "leak" on Monday's announcement.

Bob... Ninja Alien2
"The Light" - Jefferson Starship-Windows of Heaven Album
I'm an Earthling with a Martian Soul wanting to go Home.   
You have to turn your own lightbulb on. ©stevo25 & rhw007
Reply
#46
Thanx for the pre-announcement Bob.  LilD

  Holycowsmile Arrow
Neutron star smashup seen for first time, 'transforms' understanding of Universe
October 16, 2017 by Marlowe Hood

[Image: esotelescope.jpg]
This artist's impression shows two tiny but very dense neutron stars at the point at which they merge and explode as a kilonova. Such a very rare event is expected to produce both gravitational waves and a short gamma-ray burst, both of which were observed on 17 August 2017 by LIGO-Virgo and Fermi/INTEGRAL respectively. Subsequent detailed observations with many ESO telescopes confirmed that this object, seen in the galaxy NGC 4993 about 130 million light-years from the Earth, is indeed a kilonova. Such objects are the main source of very heavy chemical elements, such as gold and platinum, in the Universe. Credit: ESO/L. Calçada/M. Kornmesser
For the first time, scientists have witnessed the cataclysmic crash of two ultra-dense neutron stars in a galaxy far away, and concluded that such impacts forged at least half the gold in the Universe.


Shockwaves and light flashes from the collision travelled some 130 million light-years to be captured by Earthly detectors on August 17, excited teams revealed at press conferences held around the globe on Monday as a dozen related science papers were published in top academic journals.
"We witnessed history unfolding in front of our eyes: two neutron stars drawing closer, closer... turning faster and faster around each other, then colliding and scattering debris all over the place," co-discoverer Benoit Mours of France's CNRS research institute told AFP.
The groundbreaking observation solved a number of physics riddles and sent ripples of excitement through the scientific community.
Most jaw-dropping for many, the data finally revealed where much of the gold, platinum, uranium, mercury and other heavy elements in the Universe came from.
Telescopes saw evidence of newly-forged material in the fallout, the teams said—a source long suspected, now confirmed.
"It makes it quite clear that a significant fraction, maybe half, maybe more, of the heavy elements in the Universe are actually produced by this kind of collision," said physicist Patrick Sutton, a member of the US-based Laser Interferometer Gravitational-Wave Observatory (LIGO) which contributed to the find.
Neutron stars are the condensed, burnt-out cores that remain when massive stars run out of fuel, blow up, and die.
Typically about 20 kilometres (12 miles) in diameter, but with more mass than the Sun, they are highly radioactive and ultra-dense—a handful of material from one weighs as much as Mount Everest.
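A back-of-the-envelope check of that claim, assuming a typical 1.4-solar-mass star of 20 km diameter (my assumed values, not AFP's):

Code:
# Mean density of a canonical neutron star and the mass of a "handful".
import math

M   = 1.4 * 1.989e30                 # typical neutron star mass, kg
R   = 10e3                           # 10 km radius, m
rho = M / (4 / 3 * math.pi * R**3)   # mean density, kg/m^3
print(f"density ~ {rho:.1e} kg/m^3, a 100 cm^3 handful ~ {rho * 1e-4:.1e} kg")
# ~7e17 kg/m^3; the handful is ~7e13 kg, the same order as common
# estimates of Mount Everest's mass.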
[Image: 1-discoveredne.jpg]
An image of Swope Supernova Survey 2017a (or SSS17a) from the night of discovery. On August 17, a team of four Carnegie astronomers provided the first-ever glimpse of two neutron stars colliding, opening the door to a new era of astronomy. Credit: Tony Piro.
'Too beautiful'
It had been theorised that mergers of two such exotic bodies would create ripples in the fabric of space-time known as gravitational waves, as well as bright flashes of high-energy radiation called gamma ray bursts.
On August 17, detectors witnessed both phenomena, 1.7 seconds apart, coming from the same spot in the constellation of Hydra.
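That 1.7-second gap after a roughly 130-million-year journey implies gravity and light travel at the same speed to extraordinary precision. A rough version of the arithmetic (the published bound is of this order):

Code:
# Fractional speed difference implied by the arrival-time gap.
t_travel = 130e6 * 3.156e7   # ~130 million years in seconds
dt       = 1.7               # gamma-ray vs. gravitational-wave delay, seconds
print(f"fractional speed difference < ~{dt / t_travel:.0e}")   # ~4e-16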
"It was clear to us within minutes that we had a binary neutron star detection," said David Shoemaker, another member of LIGO, which has detectors in Livingston, Louisiana and Hanford, Washington.


"The signals were much too beautiful to be anything but that," he told AFP.
The observation was the fruit of years of labour by thousands of scientists at more than 70 ground- and space-based observatories on all continents.
Along with LIGO, they include teams from Europe's Virgo gravitational wave detector in Italy, and a number of ground- and space-based telescopes including NASA's Hubble.
"This event marks a turning point in observational astronomy and will lead to a treasure trove of scientific results," said Bangalore Sathyaprakash from Cardiff University's School of Physics and Astronomy, recalling "the most exciting of my scientific life."
"It is tremendously exciting to experience a rare event that transforms our understanding of the workings of the Universe," added France Cordova, director of the National Science Foundation which funds LIGO.
The detection is another feather in the cap for German physicist Albert Einstein, who first predicted gravitational waves more than 100 years ago.
[Image: 2-firstobserva.jpg]
The UC Santa Cruz team found SSS17a by comparing a new image of the galaxy N4993 (right) with images taken four months earlier by the Hubble Space Telescope (left). The arrows indicate where SSS17a was absent from the Hubble image and visible in the new image from the Swope Telescope. Credit: Image credits: Left, Hubble/STScI; Right, 1M2H Team/UC Santa Cruz & Carnegie Observatories/Ryan Foley
Something 'fundamental'
Three LIGO pioneers, Barry Barish, Kip Thorne and Rainer Weiss, were awarded the Nobel Physics Prize this month for the observation of gravitational waves, without which the latest discovery would not have been possible.
The ripples have been observed four times before now—the first time by LIGO in September 2015. All four were from mergers of black holes, which are even more violent than neutron star crashes, but emit no light.
The fifth and latest detection was accompanied by a gamma ray burst which scientists said came from nearer in the Universe and was less bright than expected.
"What this event is telling us is that there may be many more of these short gamma ray bursts going off nearby in the Universe than we expected," Sutton said—an exciting prospect for scientists hoping to uncover further secrets of the Universe.
Among other things, it is hoped that data from neutron star collisions will allow the definitive calculation of the rate at which the cosmos is expanding, which in turn will tell us how old it is and how much matter it contains.
"With these observations we are not just learning what happens when neutron stars collide, we're also learning something fundamental about the nature of the Universe," said Julie McEnery of the Fermi gamma ray space telescope project.
Neutron star smash-up the 'discovery of a lifetime'
"Truly a eureka moment", "Everything I ever hoped for", "A dream come true"—Normally restrained scientists reached for the stars Monday to describe the feelings that accompany a "once-in-a-lifetime" event.
The trigger for this meteor shower of superlatives was the smash-up of two unimaginably dense neutron stars 130 million years ago, when dinosaurs still lorded over our planet.
Evidence of this cosmic clash hurtled through space and reached Earth on August 17 at exactly 12:41 GMT, setting in motion a secret, sleepless, weeks-long blitzkrieg of star-gazing and number-crunching involving hundreds of telescopes and thousands of astronomers and astrophysicists around the world.
It was as if a dormant network of super-spies simultaneously sprung into action.
The stellar smash-up made itself known in two ways: it created ripples called gravitational waves in Einstein's time-space continuum, and lit up the entire electromagnetic spectrum of light, from gamma rays to radio waves.
Scientists had detected gravitational waves four times before, a feat acknowledged with a Nobel Physics Prize earlier this month.
But each of those events, generated by the collision of black holes, lasted just a few seconds, and remained invisible to Earth- and space-based telescopes.
The neutron star collision was different.
It generated gravitational waves—picked up by two US-based observatories known as LIGO, and another one in Italy called Virgo—that lasted an astounding 100 seconds. Less than two seconds later, a NASA satellite recorded a burst of gamma rays.
[Image: discoveredne.jpg]
Artist's concept of the explosive collision of two neutron stars. Credit: Robin Dienel courtesy of the Carnegie Institution for Science.
A true 'eureka' moment
This set off a mad dash to locate what was almost certainly the single source for both.
"It is the first time that we've observed a cataclysmic astrophysical event in both gravitational and electromagnetic waves," said LIGO executive director David Reitze, a professor at the California Institute of Technology (Caltech) in Pasadena
Initial calculations had narrowed the zone to a patch of sky in the southern hemisphere spanning five or six galaxies, but frustrated astronomers had to wait for nightfall to continue the search.
Finally, at around 2200 GMT, a telescope array in the northern desert of Chile nailed it: the stellar merger had taken place in a galaxy known as NGC 4993.
Stephen Smartt, who led observations for the European Southern Observatory's New Technology Telescope, was gobsmacked when the spectrum lit up his screens. "I had never seen anything like it," he recalled.
Scientists everywhere were stunned.
"This event was truly a eureka moment," said Bangalore Sathyaprakash, head of the Gravitational Physics Group at Cardiff University. "The 12 hours that followed are inarguably the most exciting of my scientific life."
"There are rare occasions when a scientist has the chance to witness a new era at its beginning—this is one such time," said Elena Pian, an astronomer at the National Institute for Astrophysics in Rome.
LIGO-affiliated astronomers at Caltech had spent decades preparing for the off chance—calculated at 80,000-to-one odds—of witnessing a neutron star merger.
Don't tell your friends
"On that morning, all of our dreams came true," said Alan Weinstein, head of astrophysical data analysis for LIGO at Caltech.
"This discovery was everything I always hoped for, packed into a single event," added Francesco Pannarale, an astrophysicist at Cardiff University in Wales.
For these and thousands of other scientists, GW170817—the neutron star burst's tag—will become a "do you remember where you were?" kind of moment.
"I was sitting in my dentist's chair when I got the text message," said Benoit Mours, an astrophysicist at France's National Centre for Research and the French coordinator for Virgo. "I jumped up and rushed to my lab."
Patrick Sutton, head of the gravitational physics group at Cardiff and a member of the LIGO team, was stuck on a long-haul bus, struggling to download hundreds of emails crowding his inbox.
[Image: 2-discoveredne.jpg]
A comparison of images of Swope Supernova Survey 2017a (or SSS17a) from the night of discovery, August 17, and four nights later, August 21. Credit: Tony Piro.
Rumours swirled within and beyond the astronomy community as scientists hastened to prepare initial findings for publication Monday in a dozen articles spread across several of the world's leading journals.
"There have been quite a few pints and glasses of wine or bubbly—privately, of course, because we haven't been allowed to tell anyone," Sutton told AFP.
But he couldn't resist telling his 12-year-old son, an aspiring physicist.
"He's sworn to secrecy though. He's not allowed to tell his friends."
LIGO and Virgo: The machines that unlock the universe's mysteries
The three machines that gave scientists their first-ever glimpse of gravitational waves resulting from a collision of neutron stars are the most advanced detectors ever built for sensing tiny vibrations in the universe.
The LIGO and Virgo detectors have previously picked up the "chirp" of black holes merging in the distant universe, sending out ripples in the fabric of space and time.
The detection of these gravitational waves for the first time in 2015 confirmed Albert Einstein's century-old theory of general relativity.
The two US-based underground detectors are known as the Laser Interferometer Gravitational-wave Observatory, or LIGO for short.
One is located in Hanford, Washington; the other 1,800 miles (3,000 kilometers) away in Livingston, Louisiana.
Construction began in 1999, and observations were taken from 2001 to 2007.
Then they underwent a major upgrade to make them 10 times more powerful.
The advanced LIGO detectors became fully operational for the first time in September 2015.
On September 14, 2015, the detector in Louisiana first picked up the signal of a gravitational wave, originating 1.3 billion years ago in the southern sky.
Virgo
The third underground detector is near Pisa, Italy, and is known as Virgo.
Built a quarter century ago by a French-Italian partnership, the Virgo detector ended its initial round of observations in 2011 and then underwent an upgrade.
Advanced Virgo came online in April of this year, and made its first observation of gravitational waves on August 14, marking the fourth such event that scientists have observed since 2015.
Virgo is less sensitive than LIGO, but having three detectors helps scientists zero in on the area of the universe where a cosmic event is happening, and measure the distance with greater accuracy.
"A smaller search area enables follow-up observations with telescopes and satellites for cosmic events that produce gravitational waves and emissions of light, such as the collision of neutron stars," said Georgia Tech professor Laura Cadonati.
How they work
These huge laser interferometers—each about 2.5 miles (four kilometers) long—are buried beneath the ground to allow the most precise measurements.
The L-shaped instruments track gravitational waves using the physics of laser light and space.
They do not rely on light in the skies like a telescope does.
Rather, they sense the vibrations in space, an advantage which allows them to uncover the properties of black holes and neutron stars.
"As a gravitational wave propagates through space it stretches space-time," explained David Shoemaker, leader of the Advanced LIGO project at the Massachusetts Institute of Technology (MIT).
The detector, in short, "is just a big device for changing strain in space into an electrical signal."
One way to imagine the curvature of space and time is to imagine a ball falling on a trampoline.
The trampoline bows downward first, stretching the fabric vertically and shortening the sides.
Then as the ball bounces upward again, the horizontal movement of the fabric expands again.
The instrument acts like a transducer, changing that strain into changes in light—and then into an electronic signal so scientists can digitize it and analyze it.
"The light from the laser has to travel in a vacuum so that it is not disturbed by all the air fluctuations," said Shoemaker, noting that LIGO contains the "biggest high vacuum system in the world,"—measuring 1.2 meters (yards) by 2.5 miles (four kilometers) long.
The detectors contain two very long arms that contain optical instruments for bending light, and are positioned like the letter L.
If one arm shortens, and the other lengthens, scientists know they are seeing a gravitational wave.
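The scale of that length change is staggering. With a representative strain of h ~ 1e-21 (an assumed typical value, not a quoted measurement) over a 4 km arm:

Code:
# Arm length change delta_L = h * L for a passing gravitational wave.
h = 1e-21   # representative strain at Earth
L = 4e3     # interferometer arm length, m
print(f"arm length change ~ {h * L:.0e} m")
# ~4e-18 m, a few thousandths of a proton's diameter.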
Read more: What are neutron stars?
Read more: Gravitational waves: Why the fuss?
Explore further: LIGO and Virgo observatories detect gravitational wave signals from black hole collision


Read more at: https://phys.org/news/2017-10-neutron-star-smash-up-discovery-lifetime.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#47
Filling the early universe with knots can explain why the world is three-dimensional
October 16, 2017 by David Salisbury

[Image: fillingtheea.jpg]
Credit: Keith Wood / Vanderbilt
The next time you come across a knotted jumble of rope or wire or yarn, ponder this: The natural tendency for things to tangle may help explain the three-dimensional nature of the universe and how it formed.



An international team of physicists has developed an out-of-the-box theory that, shortly after it popped into existence 13.8 billion years ago, the universe was filled with knots formed from flexible strands of energy called flux tubes that link elementary particles together. The idea provides a neat explanation for why we inhabit a three-dimensional world and is described in a paper titled "Knotty inflation and the dimensionality of space time" accepted for publication in the European Physical Journal C and available on the arXiv preprint server.
"Although the question of why our universe has exactly three (large) spatial dimensions is one of the most profound puzzles in cosmology … it is actually only occasionally addressed in the [scientific] literature," the article begins.
For a new solution to this puzzle, the five co-authors – physics professors Arjun Berera at the University of Edinburgh, Roman Buniy at Chapman University, Heinrich Päs (author of "The Perfect Wave: With Neutrinos at the Boundary of Space and Time") at the University of Dortmund, João Rosa at the University of Aveiro and Thomas Kephart at Vanderbilt University – took a common element from the standard model of particle physics and mixed it with a little basic knot theory to produce a novel scenario that not only can explain the predominance of three dimensions but also provides a natural power source for the inflationary growth spurt that most cosmologists believe the universe went through microseconds after it burst into existence.
The common element that the physicists borrowed is the "flux tube," composed of quarks, the elementary particles that make up protons and neutrons, held together by another type of elementary particle called a gluon that "glues" quarks together. Gluons link positive quarks to matching negative antiquarks with flexible strands of energy called flux tubes. As the linked particles are pulled apart, the flux tube gets longer until it reaches a point where it breaks. When it does, it releases enough energy to form a second quark-antiquark pair that splits up and binds with the original particles, producing two pairs of bound particles. (The process is similar to cutting a bar magnet in half to get two smaller magnets, both with north and south poles.)

"We've taken the well-known phenomenon of the flux tube and kicked it up to a higher energy level," said Kephart, professor of physics at Vanderbilt.
The physicists have been working out the details of their new theory since 2012, when they attended a workshop that Kephart organized at the Isaac Newton Institute in Cambridge, England. Berera, Buniy and Päs all knew Kephart because they were employed as post-doctoral fellows at Vanderbilt before getting faculty appointments. In discussions at the workshop, the group became intrigued by the possibility that flux tubes could have played a key role in the initial formation of the universe.
According to current theories, when the universe was created it was initially filled with a superheated and electrically charged liquid called quark-gluon plasma. This consisted of a mixture of quarks and gluons. (In 2015 the quark-gluon plasma was successfully recreated in a particle accelerator, the Relativistic Heavy Ion Collider at Brookhaven National Laboratory, by an international group of physicists, including five from Vanderbilt: Stevenson Chair in Physics Victoria Greene, and Professors of Physics Will Johns, Charles Maguire, Paul Sheldon and Julia Velkovska.)
Kephart and his collaborators realized that a higher energy version of the quark-gluon plasma would have been an ideal environment for flux tube formation in the very early universe. The large numbers of pairs of quarks and antiquarks being spontaneously created and annihilated would create myriads of flux tubes.
Normally, the flux tube that links a quark and antiquark disappears when the two particles come into contact and self annihilate, but there are exceptions.
[Image: 1-fillingtheea.jpg]
Computer graphic showing the kind of tight network of flux tubes that the physicists propose may have filled the early universe. Credit: Thomas Kephart / Vanderbilt
If a tube takes the form of a knot, for example, then it becomes stable and can outlive the particles that created it. If one of the particles traces the path of an overhand knot, for instance, then its flux tube will form a trefoil knot. As a result, the knotted tube will continue to exist, even after the particles that it links annihilate each other. Stable flux tubes are also created when two or more flux tubes become interlinked. The simplest example is the Hopf link, which consists of two interlinked circles.
In this fashion, the entire universe could have filled up with a tight network of flux tubes, the authors envisioned. Then, when they calculated how much energy such a network might contain, they were pleasantly surprised to discover that it was enough to power an early period of cosmic inflation.
Since the idea of cosmic inflation was introduced in the early 1980s, cosmologists have generally accepted the proposition that the early universe went through a period when it expanded from the size of a proton to the size of a grapefruit in less than a trillionth of a second.
This period of hyper-expansion solves two important problems in cosmology. It can explain observations that space is both flatter and smoother than astrophysicists think it should be. Despite these advantages, acceptance of the theory has been hindered because an appropriate energy source has not been identified.
"Not only does our flux tube network provide the energy needed to drive inflation, it also explains why it stopped so abruptly," said Kephart. "As the universe began expanding, the flux-tube network began decaying and eventually broke apart, eliminating the energy source that was powering the expansion."
When the network broke down, it filled the universe with a gas of subatomic particles and radiation, allowing the evolution of the universe to continue along the lines that have previously been determined.
The most distinctive characteristic of their theory is that it provides a natural explanation for a three-dimensional world. There are a number of higher dimensional theories, such as string theory, that visualize the universe as having nine or ten spatial dimensions. Generally, their proponents explain that these higher dimensions are hidden from view in one fashion or another.
The flux-tube theory's explanation comes from basic knot theory. "It was Heinrich Päs who knew that knots only form in three dimensions and wanted to use this fact to explain why we live in three dimensions," said Kephart.
A two-dimensional example helps explain. Say you put a dot in the center of a circle on a sheet of paper. There is no way to free the circle from the dot while staying on the sheet. But if you add a third dimension, you can lift the circle above the dot and move it to one side until the dot is no longer inside the circle before lowering it back down. Something similar happens to three-dimensional knots if you add a fourth dimension – mathematicians have shown that they unravel. "For this reason knotted or linked tubes can't form in higher-dimension spaces," said Kephart.
The net result is that inflation would have been limited to three dimensions. Additional dimensions, if they exist, would remain infinitesimal in size, far too small for us to perceive.
The next step for the physicists is to develop their theory until it makes some predictions about the nature of the universe that can be tested.
Explore further: 'Littlest' quark-gluon plasma revealed by physicists using Large Hadron Collider
More information: Knotty inflation and the dimensionality of spacetime. arXiv. arxiv.org/abs/1508.01458 
Provided by: Vanderbilt University


Read more at: https://phys.org/news/2017-10-early-universe-world-three-dimensional.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#48


2 x~33.333...km/sec = ~ 67 kilometers per second per megaparsec (3.3 million light-years)

Quote:Planck's result predicted that the Hubble constant value should now be 67 kilometers per second per megaparsec (3.3 million light-years), and could be no higher than 69 kilometers per second per megaparsec. This means that for every 3.3 million light-years farther away a galaxy is from us, it is moving 67 kilometers per second faster. But Riess's team measured a value of 73 kilometers per second per megaparsec, indicating galaxies are moving at a faster rate than implied by observations of the early universe.

The Hubble data are so precise that astronomers cannot dismiss the gap between the two results as errors in any single measurement or method. "Both results have been tested multiple ways, so barring a series of unrelated mistakes," Riess explained, "it is increasingly likely that this is not a bug but a feature of the universe."

Improved Hubble yardstick gives fresh evidence for new physics in the universe
February 22, 2018 by Donna Weaver, NASA's Goddard Space Flight Center

[Image: improvedhubb.jpg]
This illustration shows 3 steps astronomers used to measure the universe's expansion rate (Hubble constant) to an unprecedented accuracy, reducing the total uncertainty to 2.3 percent. The measurements streamline and strengthen the construction of the cosmic distance ladder, which is used to measure accurate distances to galaxies near to and far from Earth. The latest Hubble study extends the number of Cepheid variable stars analyzed to distances of up to 10 times farther across our galaxy than previous Hubble results. Credit: NASA, ESA, A. Feild (STScI), and A. Riess (STScI/JHU)

Astronomers have used NASA's Hubble Space Telescope to make the most precise measurements of the expansion rate of the universe since it was first calculated nearly a century ago. Intriguingly, the results are forcing astronomers to consider that they may be seeing evidence of something unexpected at work in the universe.



That's because the latest Hubble finding confirms a nagging discrepancy showing the universe to be expanding faster now than was expected from its trajectory seen shortly after the big bang. Researchers suggest that there may be new physics to explain the inconsistency.
"The community is really grappling with understanding the meaning of this discrepancy," said lead researcher and Nobel Laureate Adam Riess of the Space Telescope Science Institute (STScI) and Johns Hopkins University, both in Baltimore, Maryland.
Riess's team, which includes Stefano Casertano, also of STScI and Johns Hopkins, has been using Hubble over the past six years to refine the measurements of the distances to galaxies, using their stars as milepost markers. Those measurements are used to calculate how fast the universe expands with time, a value known as the Hubble constant. The team's new study extends the number of stars analyzed to distances up to 10 times farther into space than previous Hubble results.
But Riess's value reinforces the disparity with the expected value derived from observations of the early universe's expansion, 378,000 years after the big bang - the violent event that created the universe roughly 13.8 billion years ago. Those measurements were made by the European Space Agency's Planck satellite, which maps the cosmic microwave background, a relic of the big bang. The difference between the two values is about 9 percent. The new Hubble measurements help reduce the chance that the discrepancy in the values is a coincidence to 1 in 5,000.
Planck's result predicted that the Hubble constant value should now be 67 kilometers per second per megaparsec (3.3 million light-years), and could be no higher than 69 kilometers per second per megaparsec. This means that for every 3.3 million light-years farther away a galaxy is from us, it is moving 67 kilometers per second faster. But Riess's team measured a value of 73 kilometers per second per megaparsec, indicating galaxies are moving at a faster rate than implied by observations of the early universe.
The Hubble data are so precise that astronomers cannot dismiss the gap between the two results as errors in any single measurement or method. "Both results have been tested multiple ways, so barring a series of unrelated mistakes," Riess explained, "it is increasingly likely that this is not a bug but a feature of the universe."
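The quoted numbers follow the Hubble law v = H0 × d. A quick sketch of the size of the disagreement, using the article's two values:

Code:
# Recession speed under each Hubble constant, and the fractional gap.
H0_planck, H0_sh0es = 67.0, 73.0   # km/s per megaparsec, as quoted above
d = 10.0                           # an example galaxy distance, Mpc
print(f"recession speed: {H0_planck * d:.0f} vs {H0_sh0es * d:.0f} km/s")
print(f"discrepancy: {100 * (H0_sh0es - H0_planck) / H0_planck:.1f}%")   # ~9%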

Explaining a Vexing Discrepancy
Riess outlined a few possible explanations for the mismatch, all related to the 95 percent of the universe that is shrouded in darkness. One possibility is that dark energy, already known to be accelerating the cosmos, may be shoving galaxies away from each other with even greater - or growing - strength. This means that the acceleration itself might not have a constant value but instead changes over time in the universe. Riess shared a Nobel Prize for the 1998 discovery of the accelerating universe.
Another idea is that the universe contains a new subatomic particle that travels close to the speed of light. Such speedy particles are collectively called "dark radiation" and include previously known particles like neutrinos, which are created in nuclear reactions and radioactive decays. Unlike a normal neutrino, which interacts via a subatomic force, this new particle would be affected only by gravity and is dubbed a "sterile neutrino."
Yet another attractive possibility is that dark matter (an invisible form of matter not made up of protons, neutrons, and electrons) interacts more strongly with normal matter or radiation than previously assumed.
Any of these scenarios would change the contents of the early universe, leading to inconsistencies in theoretical models. These inconsistencies would result in an incorrect value for the Hubble constant, inferred from observations of the young cosmos. This value would then be at odds with the number derived from the Hubble observations.
Riess and his colleagues don't have any answers yet to this vexing problem, but his team will continue to work on fine-tuning the universe's expansion rate. So far, Riess's team, called the Supernova H0 for the Equation of State (SH0ES), has decreased the uncertainty to 2.3 percent. Before Hubble was launched in 1990, estimates of the Hubble constant varied by a factor of two. One of Hubble's key goals was to help astronomers reduce the value of this uncertainty to within an error of only 10 percent. Since 2005, the group has been on a quest to refine the accuracy of the Hubble constant to a precision that allows for a better understanding of the universe's behavior.
[Image: 1-improvedhubb.jpg]
These Hubble Space Telescope images showcase 2 of the 19 galaxies analyzed in a project to improve the precision of the universe's expansion rate, a value known as the Hubble constant. The color-composite images show NGC 3972 (left) and NGC 1015 (right), located 65 million light-years and 118 million light-years, respectively, from Earth. The yellow circles in each galaxy represent the locations of pulsating stars called Cepheid variables. Credit: NASA, ESA, A. Riess (STScI/JHU)
Building a Strong Distance Ladder
The team has been successful in refining the Hubble constant value by streamlining and strengthening the construction of the cosmic distance ladder, which the astronomers use to measure accurate distances to galaxies near to and far from Earth. The researchers have compared those distances with the expansion of space as measured by the stretching of light from receding galaxies. They then have used the apparent outward velocity of galaxies at each distance to calculate the Hubble constant.
But the Hubble constant's value is only as precise as the accuracy of the measurements. Astronomers cannot use a tape measure to gauge the distances between galaxies. Instead, they have selected special classes of stars and supernovae as cosmic yardsticks or milepost markers to precisely measure galactic distances.
Among the most reliable for shorter distances are Cepheid variables, pulsating stars that brighten and dim at rates that correspond to their intrinsic brightness. Their distances, therefore, can be inferred by comparing their intrinsic brightness with their apparent brightness as seen from Earth.
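As a sketch of that inference, one can convert a Cepheid's period to an absolute magnitude through a period-luminosity (Leavitt) law and then apply the distance modulus; the calibration coefficients below are approximate published values, not this team's:

    import math

    def cepheid_distance_pc(period_days, apparent_mag):
        # Illustrative Leavitt-law calibration: absolute magnitude from period
        M = -2.43 * (math.log10(period_days) - 1.0) - 4.05
        # Distance modulus: m - M = 5*log10(d_pc) - 5
        return 10 ** ((apparent_mag - M + 5.0) / 5.0)

    # A 10-day Cepheid seen at apparent magnitude 12 comes out near 16 kpc:
    print(f"{cepheid_distance_pc(10.0, 12.0):,.0f} pc")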
Astronomer Henrietta Leavitt was the first to recognize the utility of Cepheid variables to gauge distances in 1913. But the first step is to measure the distances to Cepheids independent of their brightness, using a basic tool of geometry called parallax. Parallax is the apparent shift of an object's position due to a change in an observer's point of view. This technique was invented by the ancient Greeks who used it to measure the distance from Earth to the Moon.
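The geometry reduces to one line: a star whose position shifts by one arcsecond across the Earth-Sun baseline is, by definition, one parsec away. A quick sketch (8,000 light-years is just a representative distance from this study's range):

    LY_PER_PC = 3.2616   # light-years per parsec

    def parallax_distance_pc(parallax_arcsec):
        # d (parsecs) = 1 / parallax (arcseconds)
        return 1.0 / parallax_arcsec

    # A Cepheid ~8,000 light-years away shows a parallax of roughly:
    d_pc = 8000.0 / LY_PER_PC
    print(f"{1000.0 / d_pc:.2f} milliarcseconds")   # ~0.41 mas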
The latest Hubble result is based on measurements of the parallax of eight newly analyzed Cepheids in our Milky Way galaxy. These stars are about 10 times farther away than any studied previously, residing between 6,000 light-years and 12,000 light-years from Earth, making them more challenging to measure. They pulsate at longer intervals, just like the Cepheids observed by Hubble in distant galaxies containing another reliable yardstick, exploding stars called Type Ia supernovae. This type of supernova flares with uniform brightness and is brilliant enough to be seen across relatively large distances. Previous Hubble observations studied 10 faster-blinking Cepheids located 300 light-years to 1,600 light-years from Earth.
Scanning the Stars
To measure parallax with Hubble, the team had to gauge the apparent tiny wobble of the Cepheids due to Earth's motion around the Sun. These wobbles are the size of just 1/100 of a single pixel on the telescope's camera, which is roughly the apparent size of a grain of sand seen 100 miles away.
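That comparison survives a small-angle sanity check, assuming a ~0.04-arcsecond WFC3 pixel scale and a ~0.3 mm sand grain (both round figures, not from the study):

    import math

    ARCSEC_TO_RAD = math.pi / (180 * 3600)
    wobble_rad = (0.04 / 100) * ARCSEC_TO_RAD    # 1/100 of a ~0.04" pixel
    grain_rad = 0.3e-3 / (100 * 1609.34)         # 0.3 mm grain at 100 miles
    print(f"wobble {wobble_rad:.1e} rad, sand grain {grain_rad:.1e} rad")
    # both land near 2e-9 radians, so the analogy holds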
Therefore, to ensure the accuracy of the measurements, the astronomers developed a clever method that was not envisioned when Hubble was launched. The researchers invented a scanning technique in which the telescope measured a star's position a thousand times a minute every six months for four years.
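The payoff of all that repetition is the usual 1/sqrt(N) shrinkage of the standard error of the mean position, assuming roughly independent samples (the per-sample error below is invented for illustration):

    import math

    sigma_single = 0.01    # single-sample centroid error, in pixels (illustrative)
    n_samples = 1000 * 60  # a thousand position samples a minute for an hour, say
    print(f"{sigma_single / math.sqrt(n_samples):.1e} pixels")   # ~4e-5 pixels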
The team calibrated the true brightness of the eight slowly pulsating stars and cross-correlated them with their more distant blinking cousins to reduce the inaccuracies in the distance ladder. The researchers could then compare the brightness of the Cepheids and supernovae in those galaxies with greater confidence, measure the stars' true brightness more accurately, and therefore calculate distances to hundreds of supernovae in far-flung galaxies with more precision.
Another advantage to this study is that the team used the same instrument, Hubble's Wide Field Camera 3, to calibrate the luminosities of both the nearby Cepheids and those in other galaxies, eliminating the systematic errors that are almost unavoidably introduced by comparing those measurements from different telescopes.
"Ordinarily, if every six months you try to measure the change in position of one star relative to another at these distances, you are limited by your ability to figure out exactly where the star is," Casertano explained. Using the new technique, Hubble slowly slews across a stellar target, and captures the image as a streak of light. "This method allows for repeated opportunities to measure the extremely tiny displacements due to parallax," Riess added. "You're measuring the separation between two stars, not just in one place on the camera, but over and over thousands of times, reducing the errors in measurement."
The team's goal is to further reduce the uncertainty by using data from Hubble and the European Space Agency's Gaia space observatory, which will measure the positions and distances of stars with unprecedented precision. "This precision is what it will take to diagnose the cause of this discrepancy," Casertano said.
Explore further: Hubble finds universe may be expanding faster than expected
Provided by: NASA's Goddard Space Flight Center


Read more at: https://phys.org/news/2018-02-hubble-yar...s.html#jCp

Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#49
Long-sought decay of Higgs boson observed  Ninja
August 28, 2018, CERN


[Image: longsoughtde.jpg]
A candidate event display for the production of a Higgs boson decaying to two b-quarks (blue cones), in association with a W boson decaying to a muon (red) and a neutrino. The neutrino leaves the detector unseen, and is reconstructed through the missing transverse energy (dashed line). Credit: ATLAS Collaboration/CERN
Six years after its discovery, the Higgs boson has at last been observed decaying to fundamental particles known as bottom quarks. The finding, presented today at CERN by the ATLAS and CMS collaborations at the Large Hadron Collider (LHC), is consistent with the hypothesis that the all-pervading quantum field behind the Higgs boson also gives mass to the bottom quark. Both teams have submitted their results for publication today.



The Standard Model of particle physics predicts that about 60% of the time a Higgs boson will decay to a pair of bottom quarks, the second-heaviest of the six flavours of quarks. Testing this prediction is crucial because the result would either lend support to the Standard Model – which is built upon the idea that the Higgs field endows quarks and other fundamental particles with mass – or rock its foundations and point to new physics.

Spotting this common Higgs-boson decay channel is anything but easy, as the six-year period since the discovery of the boson has shown. The reason for the difficulty is that there are many other ways of producing bottom quarks in proton–proton collisions. This makes it hard to isolate the Higgs-boson decay signal from the background "noise" associated with such processes. By contrast, the less-common Higgs-boson decay channels that were observed at the time of discovery of the particle, such as the decay to a pair of photons, are much easier to extract from the background.

To extract the signal, the ATLAS and CMS collaborations each combined data from the first and second runs of the LHC, which involved collisions at energies of 7, 8 and 13 TeV. They then applied complex analysis methods to the data. The upshot, for both ATLAS and CMS, was the detection of the decay of the Higgs boson to a pair of bottom quarks with a significance that exceeds 5 standard deviations. Furthermore, both teams measured a rate for the decay that is consistent with the Standard Model prediction, within the current precision of the measurement.
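For reference, the 5-standard-deviation threshold corresponds to a one-sided false-alarm probability of about 3 in 10 million, which can be confirmed with SciPy's normal survival function:

    from scipy.stats import norm

    # One-sided tail probability of a 5-sigma background fluctuation
    print(norm.sf(5.0))   # ~2.9e-7, the particle-physics bar for an "observation"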

[Image: 1-longsoughtde.jpg]
Candidate event display for the production of a Higgs boson decaying to two b-quarks. A 2 b-tag, 2-jet, 2-electron event within the signal-like portion of the high pTV and high BDTVH output distribution is shown (Run 337215, Event 1906922941). Electrons are shown as blue tracks with a large energy deposit in the electromagnetic calorimeter, corresponding to the light green bars. Two of them form an invariant mass of 93.6 GeV, compatible with a Z boson. The two central high-pT b-tagged jets are represented by light blue cones. They contain the green and yellow bars corresponding to the energy deposited in the electromagnetic and hadronic calorimeters respectively, and they have an invariant mass of 128.1 GeV. The value of pTV is 246.7 GeV, and the BDTVH output value is 0.47. Credit: ATLAS Collaboration/CERN
"This observation is a milestone in the exploration of the Higgs boson. It shows that the ATLAS and CMS experiments have achieved deep understanding of their data and a control of backgrounds that surpasses expectations. ATLAS has now observed all couplings of the Higgs boson to the heavy quarks and leptons of the third generation as well as all major production modes," said Karl Jakobs, spokesperson of the ATLAS collaboration.

 

"Since the first single-experiment observation of the Higgs boson decay to tau-leptons one year ago, CMS, along with our colleagues in ATLAS, has observed the coupling of the Higgs boson to the heaviest fermions: the tau, the top quark, and now the bottom quark. The superb LHC performance and modern machine-learning techniques allowed us to achieve this result earlier than expected," said Joel Butler, spokesperson of the CMS collaboration.

With more data, the collaborations will improve the precision of these and other measurements and probe the decay of the Higgs boson into a pair of much-less-massive fermions called muons, always watching for deviations in the data that could point to physics beyond the Standard Model.

[Image: 2-longsoughtde.jpg]
Candidate event display for the production of a Higgs boson decaying to two b-quarks. A 2-tag, 2-jet, 0-lepton event within the signal-like portion of the high pTV and high BDTVH output (Run 339500, Event 694513952) is shown. The ETMiss, shown as a white dashed line, has a magnitude of 479.1 GeV. The two central high-pT b-tagged jets are represented by light blue cones. They contain the green and yellow bars corresponding to the energy deposited in the electromagnetic and hadronic calorimeters respectively. The dijet invariant mass is 128.1 GeV. The BDTVH output value is 0.74. Credit: ATLAS Collaboration/CERN
"The experiments continue to home in on the Higgs particle, which is often considered a portal to new physics. These beautiful and early achievements also underscore our plans for upgrading the LHC to substantially increase the statistics. The analysis methods have now been shown to reach the precision required for exploration of the full physics landscape, including hopefully new physics that so far hides so subtly," said CERN Director for Research and Computing Eckhard Elsen.
Explore further: New level of precision achieved in combined measurements of Higgs boson couplings

More information: Observation of Higgs boson decay to bottom quarks. arXiv:1808.08242 [hep-ex] arxiv.org/abs/1808.08242


Read more at: https://phys.org/news/2018-08-long-sough...n.html#jCp




Light from ancient quasars helps confirm quantum entanglement
August 20, 2018 by Jennifer Chu, Massachusetts Institute of Technology


[Image: lightfromanc.jpg]
The quasar dates back to less than one billion years after the big bang. Credit: NASA/ESA/G.Bacon, STScI
Last year, physicists at MIT, the University of Vienna, and elsewhere provided strong support for quantum entanglement, the seemingly far-out idea that two particles, no matter how distant from each other in space and time, can be inextricably linked, in a way that defies the rules of classical physics.



Take, for instance, two particles sitting on opposite edges of the universe. If they are truly entangled, then according to the theory of quantum mechanics their physical properties should be related in such a way that any measurement made on one particle should instantly convey information about any future measurement outcome of the other particle—correlations that Einstein skeptically saw as "spooky action at a distance."

In the 1960s, the physicist John Bell calculated a theoretical limit beyond which such correlations must have a quantum, rather than a classical, explanation.
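The workhorse form of that limit is the CHSH inequality: a combination S of correlations at two measurement settings per side satisfies |S| <= 2 for any local classical model, while quantum mechanics for polarization-entangled photons reaches 2*sqrt(2). A minimal sketch using the textbook singlet-state correlation function (standard quantum theory, not this experiment's data):

    import math

    def E(a, b):
        # Quantum prediction for polarization correlations at analyzer angles a, b
        return -math.cos(2.0 * (a - b))

    a1, a2 = 0.0, math.pi / 4              # Alice's two settings
    b1, b2 = math.pi / 8, 3 * math.pi / 8  # Bob's two settings
    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(f"S = {S:.3f}")   # ~ -2.828, past the classical bound of 2 in magnitude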

But what if such correlations were the result not of quantum entanglement, but of some other hidden, classical explanation? Such "what-ifs" are known to physicists as loopholes to tests of Bell's inequality, the most stubborn of which is the "freedom-of-choice" loophole: the possibility that some hidden, classical variable may influence the measurement that an experimenter chooses to perform on an entangled particle, making the outcome look quantumly correlated when in fact it isn't.

Last February, the MIT team and their colleagues significantly constrained the freedom-of-choice loophole, by using 600-year-old starlight to decide what properties of two entangled photons to measure. Their experiment proved that, if a classical mechanism caused the correlations they observed, it would have to have been set in motion more than 600 years ago, before the stars' light was first emitted and long before the actual experiment was even conceived.

Now, in a paper published today in Physical Review Letters, the same team has vastly extended the case for quantum entanglement and further restricted the options for the freedom-of-choice loophole. The researchers used distant quasars, one of which emitted its light 7.8 billion years ago and the other 12.2 billion years ago, to determine the measurements to be made on pairs of entangled photons. They found correlations among more than 30,000 pairs of photons, to a degree that far exceeded the limit that Bell originally calculated for a classically based mechanism.

 

"If some conspiracy is happening to simulate quantum mechanics by a mechanism that is actually classical, that mechanism would have had to begin its operations—somehow knowing exactly when, where, and how this experiment was going to be done—at least 7.8 billion years ago. That seems incredibly implausible, so we have very strong evidence that quantum mechanics is the right explanation," says co-author Alan Guth, the Victor F. Weisskopf Professor of Physics at MIT.

"The Earth is about 4.5 billion years old, so any alternative mechanism—different from quantum mechanics—that might have produced our results by exploiting this loophole would've had to be in place long before even there was a planet Earth, let alone an MIT," adds David Kaiser, the Germeshausen Professor of the History of Science and professor of physics at MIT. "So we've pushed any alternative explanations back to very early in cosmic history."

Guth and Kaiser's co-authors include Anton Zeilinger and members of his group at the Austrian Academy of Sciences and the University of Vienna, as well as physicists at Harvey Mudd College and the University of California at San Diego.

A decision, made billions of years ago

In 2014, Kaiser and two members of the current team, Jason Gallicchio and Andrew Friedman, proposed an experiment to produce entangled photons on Earth—a process that is fairly standard in studies of quantum mechanics. They planned to shoot each member of the entangled pair in opposite directions, toward light detectors that would also make a measurement of each photon using a polarizer. Researchers would measure the polarization, or orientation, of each incoming photon's electric field, by setting the polarizer at various angles and observing whether the photons passed through—an outcome for each photon that researchers could compare to determine whether the particles showed the hallmark correlations predicted by quantum mechanics.

The team added a unique step to the proposed experiment, which was to use light from ancient, distant astronomical sources, such as stars and quasars, to determine the angle at which to set each respective polarizer. As each entangled photon was in flight, heading toward its detector at the speed of light, researchers would use a telescope located at each detector site to measure the wavelength of a quasar's incoming light. If that light was redder than some reference wavelength, the polarizer would tilt at a certain angle to make a specific measurement of the incoming entangled photon—a measurement choice that was determined by the quasar. If the quasar's light was bluer than the reference wavelength, the polarizer would tilt at a different angle, performing a different measurement of the entangled photon.
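A minimal sketch of that setting choice, with a made-up reference wavelength and analyzer angles (the experiment's actual thresholds and measurement bases are not stated here):

    REFERENCE_NM = 700.0   # hypothetical red/blue dividing wavelength

    def polarizer_angle_deg(quasar_photon_nm):
        # Redder-than-reference photons select one basis, bluer the other
        return 22.5 if quasar_photon_nm > REFERENCE_NM else 67.5

    print(polarizer_angle_deg(650.0), polarizer_angle_deg(750.0))   # 67.5 22.5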

[Image: 1-lightfromanc.jpg]
Credit: Massachusetts Institute of Technology
In their previous experiment, the team used small backyard telescopes to measure the light from stars as close as 600 light years away. In their new study, the researchers used much larger, more powerful telescopes to catch the incoming light from even more ancient, distant astrophysical sources: quasars whose light has been traveling toward the Earth for at least 7.8 billion years—objects that are incredibly far away and yet are so luminous that their light can be observed from Earth.

Tricky timing

On Jan. 11, 2018, "the clock had just ticked past midnight local time," as Kaiser recalls, when about a dozen members of the team gathered on a mountaintop in the Canary Islands and began collecting data from two large, 4-meter-wide telescopes: the William Herschel Telescope and the Telescopio Nazionale Galileo, both situated on the same mountain and separated by about a kilometer.

One telescope focused on a particular quasar, while the other telescope looked at another quasar in a different patch of the night sky. Meanwhile, researchers at a station located between the two telescopes created pairs of entangled photons and beamed particles from each pair in opposite directions toward each telescope.

In the fraction of a second before each entangled photon reached its detector, the instrumentation determined whether a single photon arriving from the quasar was more red or blue, a measurement that then automatically adjusted the angle of a polarizer that ultimately received and detected the incoming entangled photon.

"The timing is very tricky," Kaiser says. "Everything has to happen within very tight windows, updating every microsecond or so."

Demystifying a mirage

The researchers ran their experiment twice, each for around 15 minutes and with two different pairs of quasars. For each run, they measured 17,663 and 12,420 pairs of entangled photons, respectively. Within hours of closing the telescope domes and looking through preliminary data, the team could tell there were strong correlations among the photon pairs, beyond the limit that Bell calculated, indicating that the photons were correlated in a quantum-mechanical manner.

Guth led a more detailed analysis to calculate the chance, however slight, that a classical mechanism might have produced the correlations the team observed.

He calculated that, for the best of the two runs, the probability that a mechanism based on classical physics could have achieved the observed correlation was about 10 to the minus 20—that is, about one part in one hundred billion billion, "outrageously small," Guth says. For comparison, researchers have estimated the probability that the discovery of the Higgs boson was just a chance fluke to be about one in a billion.

"We certainly made it unbelievably implausible that a local realistic theory could be underlying the physics of the universe," Guth says.

And yet, there is still a small opening for the freedom-of-choice loophole. To limit it even further, the team is entertaining ideas of looking even further back in time, to use sources such as cosmic microwave background photons that were emitted as leftover radiation immediately following the Big Bang, though such experiments would present a host of new technical challenges.

"It is fun to think about new types of experiments we can design in the future, but for now, we are very pleased that we were able to address this particular loophole so dramatically. Our experiment with quasars puts extremely tight constraints on various alternatives to quantum mechanics. As strange as quantum mechanics may seem, it continues to match every experimental test we can devise," Kaiser says.

Explore further: Tracking down the mystery of entangled particles of light

More information: Dominik Rauch et al, Cosmic Bell Test Using Random Measurement Settings from High-Redshift Quasars, Physical Review Letters (2018). DOI: 10.1103/PhysRevLett.121.080403 , dx.doi.org/10.1103/PhysRevLett.121.080403


Journal reference: Physical Review Letters
Provided by: Massachusetts Institute of Technology


Read more at: https://phys.org/news/2018-08-ancient-qu...t.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#50
New evidence suggests particles detected in Antarctica don't fit Standard Model
October 1, 2018 by Bob Yirka, Phys.org report


[Image: 6-newevidences.jpg]
The ANITA-IV experiment in Antarctica, prior to being launched on a balloon. Credit: Drummermean/CC BY-SA 4.0
A team of researchers at Penn State University has found new evidence that suggests some particles detected in Antarctica do not fit the Standard Model. They have written a paper outlining their arguments and have posted it on the arXiv preprint server.



Prior research has shown that when low-energy cosmic neutrinos encounter the Earth, they are likely to pass right on through—high-energy particles, on the other hand, are almost certain to run into something, triggering an avalanche of collisions that produces a shower of secondary particles rather than letting the original particle through in one piece. But what if a high-energy particle were to make it all the way through without creating a particle shower? That would mean there likely exists a particle that is not described by the Standard Model—and that is exactly what researchers studying particles detected over Antarctica are reporting.

To date, two odd particle events have been detected by a sensor attached to a high-altitude balloon hovering over Antarctica as part of a project called the Antarctic Impulsive Transient Antenna (ANITA)—the first detection was back in 2006, the second in 2014. Both indicated that a high-energy particle had somehow made its way through the planet without encountering anything. The first detection was attributed to equipment problems or some other unknown factor. The second caused more concern—but not enough for anyone to seriously consider challenging the Standard Model. In this new effort, the researchers report that they have found other evidence of the same type of particle, suggesting the two anomalies might truly represent unknown particles.

The new evidence came in the form of sensor data from the IceCube experiment, in which sensors buried in the Antarctic ice continually detect particle events. Data from the sensors showed that three events with similarly unexplained properties had occurred. The researchers suggest that these two independent sources of data make it time to start asking whether the anomalies point to particles beyond the Standard Model.

Explore further: Hunting for dark quarks

More information: The ANITA Anomalous Events as Signatures of a Beyond Standard Model Particle, and Supporting Observations from IceCube, arXiv:1809.09615 [astro-ph.HE] arxiv.org/abs/1809.09615


Journal reference: arXiv


Read more at: https://phys.org/news/2018-10-evidence-p...d.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#51
A good read that goes into this 200-400 microgram LSD DEEP thoughts


[Image: 51BaJYMxk8L._SX331_BO1,204,203,200_.jpg]

Bob... Ninja Assimilated
"The Light" - Jefferson Starship-Windows of Heaven Album
I'm an Earthling with a Martian Soul wanting to go Home.   
You have to turn your own lightbulb on. ©stevo25 & rhw007
Reply
#52
Infinite-dimensional symmetry opens up possibility of a new physics—and new particles
November 16, 2018, University of Warsaw

[Image: particleacce.jpg]
Credit: CC0 Public Domain
The symmetries that govern the world of elementary particles at the most elementary level could be radically different from what has so far been thought. This surprising conclusion emerges from new work published by theoreticians from Warsaw and Potsdam. The scheme they posit unifies all the forces of nature in a way that is consistent with existing observations and anticipates the existence of new particles with unusual properties that may even be present in our immediate surroundings.




For a half-century, physicists have been trying to construct a theory that unites all four fundamental forces of nature, describes the known elementary particles and predicts the existence of new ones. So far, these attempts have not found experimental confirmation, and the Standard Model—an incomplete, but surprisingly effective theoretical construct—is still the best description of the quantum world. In a recent paper in Physical Review Letters, Prof. Krzysztof Meissner from the Institute of Theoretical Physics, Faculty of Physics, University of Warsaw, and Prof. Hermann Nicolai from the Max-Planck-Institut für Gravitationsphysik in Potsdam have presented a new scheme generalizing the Standard Model that incorporates gravitation into the description. The new model applies a kind of symmetry not previously used in the description of elementary particles.

In physics, symmetries are understood somewhat differently than in the colloquial sense of the word. For instance, whether a ball is dropped now or one minute from now, it will still fall in the same way. That is a manifestation of a certain symmetry: the laws of physics remain unchanged with respect to shifts in time. Similarly, dropping the ball from the same height in one location has the same result as dropping it in another. This means that the laws of physics are also symmetrical with respect to spatial operations.

"Symmetries play a huge role in physics because they are related to principles of conservation. For instance, the principle of the conservation of energy involves symmetry with respect to shifts in time, the principle of the conservation of momentum relates to symmetry of spatial displacement, and the principle of the conservation of angular momentum relates to rotational symmetry," says Prof. Meissner.

Developing a supersymmetric theory to describe the symmetries between fermions and bosons began back in the 1970s. Fermions are elementary particles whose spin, a quantum property related to rotation, is expressed in odd multiples of the fraction 1/2, and they include both quarks and leptons. Among the latter are electrons, muons, tauons, and their associated neutrinos (as well as their antiparticles). Protons and neutrons, common non-elementary particles, are also fermions. Bosons, in turn, are particles with integer spin values. They include the particles responsible for forces (photons, carriers of the electromagnetic force; gluons, carrying the strong nuclear force; W and Z bosons, carrying the weak nuclear force), as well as the Higgs boson.



"The first supersymmetric theories tried to combine the forces typical of elementary particles, in other words the electromagnetic force with a symmetry known as U(1), the weak force with symmetry SU(2) and the strong force with symmetry SU(3). Gravity was still missing," Prof. Meissner says. "The symmetry between the bosons and fermions was still global, which means the same at every point in space. Soon thereafter, theories were posited where symmetry was local, meaning it could manifest differently at each point in space. Ensuring such symmetry in the theory required for gravitation to be included, and such theories became known as supergravities."

Physicists noticed that in supergravity theories in four spacetime dimensions, there cannot be more than eight different supersymmetric rotations. Each such theory has a strictly defined set of fields (degrees of freedom) with different spins (0, 1/2, 1, 3/2 and 2), known respectively as scalar, fermion, boson, gravitino and graviton fields. For N=8 supergravity, which has the maximal number of rotations, there are 48 fermions with spin 1/2, which is precisely the number of degrees of freedom needed to account for the six types of quarks and six types of leptons observed in nature. There was therefore every indication that N=8 supergravity is exceptional in many respects. However, it was not ideal.
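The count works out if each generation is tallied in two-component (spin-1/2) fermions, including a right-handed neutrino (an assumption of this tally, though it is what the total of 48 requires):

    quarks_per_gen = 2 * 3 * 2   # up/down type x 3 colours x 2 chiralities = 12
    leptons_per_gen = 2 * 2      # charged lepton + neutrino, 2 chiralities each = 4
    print(3 * (quarks_per_gen + leptons_per_gen))   # 3 generations -> 48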

One of the problems in incorporating the Standard Model into N=8 supergravity was posed by the electrical charges of quarks and leptons. All the charges turned out to be shifted by 1/6 with respect to those observed in nature: the electron had a charge of -5/6 instead of -1, the neutrino had 1/6 instead of 0, and so on. This problem, first observed by Murray Gell-Mann more than 30 years ago, was not resolved until 2015, when Professors Meissner and Nicolai presented a mechanism for modifying the U(1) symmetry.
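Spelled out, the mismatch is a uniform 1/6 offset on every charge relative to observation (a schematic restatement of the article's examples, not the paper's notation):

    observed = {"electron": -1.0, "neutrino": 0.0, "up quark": 2 / 3, "down quark": -1 / 3}
    for name, charge in observed.items():
        # raw N=8 supergravity assignment before the 2015 U(1) modification
        print(f"{name:>10}: observed {charge:+.3f}, unadjusted {charge + 1 / 6:+.3f}")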

"After making this adjustment we obtained a structure with the symmetries U(1) and SU(3) known from the Standard Model. The approach proved to be very different from all other attempts at generalizing the symmetries of the Standard Model. The motivation was strengthened by the fact that the LHC accelerator failed to produce anything beyond the Standard Model and N=8 supergravity fermion content is compatible with this observation. What was missing was to add the SU(2) group, responsible for the weak nuclear force. In our recent paper, we show how this can be done. That would explain why all previous attempts at detecting new particles, motivated by theories that treated the SU(2) symmetry as spontaneously violated for low energies, but as holding in the range of high energies, had to be unsuccessful. In our view, SU(2) is just an approximation for both low and high energies," Prof. Meissner explains.

Both the mechanism reconciling the electric charges of the particles and the extension incorporating the weak force proved to belong to a symmetry group known as E10. Unlike the symmetry groups previously used in unification theories, E10 is an infinite-dimensional group, very poorly studied even in the purely mathematical sense. Prof. Nicolai, together with Thibault Damour and Marc Henneaux, had worked on this group before, because it appears as a symmetry of N=8 supergravity under conditions similar to those in the first moments after the Big Bang, when only one dimension was significant: time.

"For the first time, we have a scheme that precisely anticipates the composition of the fermions in the Standard Model—quarks and leptons—and does so with the proper electric charges. At the same time it includes gravity into the description. It is a huge surprise that the proper symmetry is the staggeringly huge symmetry group E10, virtually unknown mathematically. If further work confirms the role of this group, that will mean a radical change in our knowledge of the symmetries of nature," Prof. Meissner says.

Although the dynamics is not yet understood, the scheme proposed by Professors Meissner and Nicolai makes specific predictions. It keeps the number of spin 1/2 fermions as in the Standard Model but on the other hand suggests the existence of new particles with very unusual properties. Importantly, at least some of them could be present in our immediate surroundings, and their detection should be within the possibilities of modern detection equipment. But that is a topic for a separate story.

Explore further: Breaking supersymmetry

More information: Krzysztof A. Meissner et al, Standard Model Fermions and Infinite-Dimensional R Symmetries, Physical Review Letters (2018). DOI: 10.1103/PhysRevLett.121.091601 

Journal reference: Physical Review Letters
Provided by: University of Warsaw



Read more at: https://phys.org/news/2018-11-infinite-dimensional-symmetry-possibility-physicsand-particles.html#jCp
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#53
Horsepoop
COSINE-100 experiment investigates dark matter mystery

December 5, 2018, Institute for Basic Science



[Image: cosine100exp.jpg]
The observed (filled circles with black solid line) 90 percent exclusion limits on the WIMP-nucleon interaction are shown with bands for the expected limit assuming the background-only hypothesis. The limits exclude a WIMP interpretation of …
Astrophysical evidence suggests that the universe contains a large amount of non-luminous dark matter, yet no definitive signal has been observed despite concerted efforts by many experimental groups. One exception is the long-debated claim by the DAMA group of an annual modulation in the events observed in their sodium-iodide detector, of the kind expected from weakly interacting massive particle (WIMP) dark matter interactions. The new COSINE-100 experiment, an underground dark matter detector at the Yangyang Underground Laboratory (Y2L) in Korea, is starting to test this claim using the same target medium, and its first results significantly challenge the interpretations made by DAMA that have stood for nearly two decades. Y2L is operated by the Center for Underground Physics (CUP) of the Institute for Basic Science (IBS) in Korea.
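The signature DAMA claims is a small annual oscillation in event rate, peaking around early June when Earth's orbital velocity adds to the Sun's motion through the galactic halo. A schematic rate model (mean rate and modulation amplitude invented for illustration):

    import math

    R0, A, T_PEAK = 1.00, 0.02, 152.5   # mean rate, amplitude, peak day (~June 2)

    def event_rate(day_of_year):
        return R0 + A * math.cos(2 * math.pi * (day_of_year - T_PEAK) / 365.25)

    print(event_rate(T_PEAK), event_rate(T_PEAK + 365.25 / 2))   # peak vs trough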

The puzzle of DAMA's signal, and its inconsistencies with results from other experiments, have resulted in hundreds of publications. Many new models to explain dark matter have been proposed as a result and the controversy remains of great scientific and public interest.



A critical point is that COSINE-100 is investigating DAMA's claimed dark matter detection using the same target material and is the first experiment to release significant results by this means. In a paper published in issue 7734 of the journal Nature, the collaboration describes results from the first phase of work: a search for the dark matter signal by looking for an excess of events over the expected background. This study indicates that no such excess is present in the data, confirming that DAMA's annual modulation signal is in severe tension with results from other experiments under the assumption of the most traditional so-called Standard Halo Model for dark matter in our galaxy, as shown in Fig. 1.



"The result of this search is significant because, for the first time, we have sizeable sodium-iodide crystal detectors with enough sensitivity to look at the DAMA signal region. It has been for 20 years that the potentially significant claim has not been reproduced using the same crystals independently," said COSINE-100 co-spokesperson and the associate director at CUP, Hyun Su Lee. "The initial results even carve out a fair portion of the possible dark matter search region drawn by the DAMA signal. In other words, there is little room left for this claim to be from the dark matter interaction unless the dark matter model is significantly modified."



[Image: 1-cosine100exp.jpg]

The detector is contained within a nested arrangement of shielding components shown in schematic a). The main purpose of the shield is to provide full coverage against external radiation from various background sources. The shielding …

The COSINE-100 collaboration is composed of 50 scientists from Korea, the United States (with co-spokesperson Professor Reina Maruyama at Yale University), the United Kingdom, Brazil, and Indonesia. COSINE-100 began taking data in 2016. The experiment utilizes eight low-background thallium-doped sodium iodide crystals arranged in a 4 by 2 array, giving a total target mass of 106 kg. Each crystal is coupled to two photosensors that measure the amount of energy deposited in the crystal. The crystal assemblies are immersed in 2,200 L of light-emitting liquid scintillator, which allows for the identification and subsequent reduction of radioactive backgrounds observed by the crystals. The scintillator is in turn surrounded by copper, lead, and plastic scintillator to reduce the background contribution from external radiation as well as from cosmic-ray muons. The detector schematic is shown in Fig. 2.

"So far, we have not yet discovered the dark matter particles in this search but we have come closer to testing the origin of the DAMA signal on whether this is from dark matter interactions or some unknown systematic effect," said Chang Hyon Ha, research fellow at CUP.



Despite the strong evidence for its existence, the identity of dark matter remains a mystery. Several years of data will be necessary to fully confirm or refute DAMA's annual modulation results. Improved theoretical understanding and more data from the upgraded COSINE detector (COSINE-200) will help unravel the mystery of the signal. To help achieve this goal, CUP is currently constructing a new experimental site in a deeper and more spacious location, the Yemi Laboratory in Jeongseon County. Meanwhile, COSINE-100 continues to collect data while steadily improving its understanding of the detector. Yeongduk Kim, Hyun Su Lee, Reina Maruyama, and Neil Spooner conceived the COSINE-100 experiment.



 Explore further: New Limits on the Origin of Dark Matter



More information: An experiment to search for dark-matter interactions using sodium iodide detectors, Nature (2018). DOI: 10.1038/s41586-018-0739-1

Journal reference: Nature
Provided by: Institute for Basic Science




Read more at: https://phys.org/news/2018-12-cosine-dark-mystery.html#jCp Naughty


LilD ~all Ma'at @ that!>>> Angel

Bringing balance to the universe: Teetertotter  New theory could explain missing 95 percent of the cosmos
December 5, 2018, University of Oxford

[Image: darkmatter.jpg]
Dark matter map of KiDS survey region (region G12). Credit: KiDS survey
Scientists at the University of Oxford may have solved one of the biggest questions in modern physics, with a new paper unifying dark matter and dark energy into a single phenomenon: a fluid which possesses 'negative mass'. If you were to push a negative mass, it would accelerate towards you. This astonishing new theory may also prove right a prediction that Einstein made 100 years ago.




Our current, widely recognised model of the Universe, called LambdaCDM, tells us nothing about what dark matter and dark energy are like physically. We only know about them because of the gravitational effects they have on other, observable matter.

This new model, published today in Astronomy and Astrophysics by Dr. Jamie Farnes from the Oxford e-Research Centre, Department of Engineering Science, offers a new explanation. Dr. Farnes says: "We now think that both dark matter and dark energy can be unified into a fluid which possesses a type of 'negative gravity', repelling all other material around it. Although this matter is peculiar to us, it suggests that our cosmos is symmetrical in both positive and negative qualities."

The existence of negative matter had previously been ruled out, as it was thought this material would become less dense as the Universe expands, which runs contrary to observations showing that dark energy does not thin out over time. However, Dr. Farnes' research applies a 'creation tensor', which allows for negative masses to be continuously created. It demonstrates that when more and more negative masses are continually bursting into existence, this negative mass fluid does not dilute during the expansion of the cosmos. In fact, the fluid appears to be identical to dark energy.
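The contrast can be caricatured with simple scaling laws: ordinary matter dilutes as the inverse cube of the cosmic scale factor, while a fluid whose particles are continuously created can hold a constant density, which is exactly how dark energy behaves. A toy illustration, not the paper's field equations:

    for a in (1.0, 2.0, 4.0):            # cosmic scale factor
        matter = a ** -3                 # ordinary matter thins out as space expands
        created_fluid = -1.0             # continuously created negative mass stays constant
        print(f"a={a}: matter {matter:.3f}, negative-mass fluid {created_fluid}")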

Dr. Farnes's theory also provides the first correct predictions of the behaviour of dark matter halos. Most galaxies are rotating so rapidly they should be tearing themselves apart, which suggests that an invisible 'halo' of dark matter must be holding them together. The new research published today features a computer simulation of the properties of negative mass, which predicts the formation of dark matter halos just like the ones inferred by observations using modern radio telescopes.

Albert Einstein provided the first hint of the dark universe exactly 100 years ago, when he discovered a parameter in his equations known as the 'cosmological constant', which we now know to be synonymous with dark energy. Einstein famously called the cosmological constant his 'biggest blunder', although modern astrophysical observations prove that it is a real phenomenon. In notes dating back to 1918, Einstein described his cosmological constant, writing that 'a modification of the theory is required such that "empty space" takes the role of gravitating negative masses which are distributed all over the interstellar space'. It is therefore possible that Einstein himself predicted a negative-mass-filled universe.

Dr. Farnes says: "Previous approaches to combining dark energy and dark matter have attempted to modify Einstein's theory of general relativity, which has turned out to be incredibly challenging. This new approach takes two old ideas that are known to be compatible with Einstein's theory—negative masses and matter creation—and combines them together.

"The outcome seems rather beautiful: dark energy and dark matter can be unified into a single substance, with both effects being simply explainable as positive mass matter surfing on a sea of negative masses."

Proof of Dr. Farnes's theory will come from tests performed with a cutting-edge radio telescope known as the Square Kilometre Array (SKA), an international endeavour to build the world's largest radio telescope, in which the University of Oxford is a collaborator.

Dr. Farnes adds: "There are still many theoretical issues and computational simulations to work through, and LambdaCDM has a nearly 30-year head start, but I'm looking forward to seeing whether this new extended version of LambdaCDM can accurately match other observational evidence of our cosmology. If real, it would suggest that the missing 95% of the cosmos had an aesthetic solution: we had forgotten to include a simple minus sign."

Explore further: Dark matter clusters could reveal nature of dark energy

More information: J. S. Farnes. A unifying theory of dark energy and dark matter: Negative masses and matter creation within a modified LambdaCDM framework, Astronomy & Astrophysics (2018). DOI: 10.1051/0004-6361/201832898 , https://arxiv.org/abs/1712.07962


Journal reference: Astronomy & Astrophysics
Provided by: University of Oxford


Read more at: https://phys.org/news/2018-12-universe-theory-percent-cosmos.html#jCp
Reply


Forum Jump:


Users browsing this thread: 1 Guest(s)