On the matter of MATTER Vs. ANTI-MATTER : A nihilist annihilates duality reality.
#1
What if matter and anti-matter don't always immediately self-destruct but instead immediately mediate?

To differentiate the two.

Mind your matters.

Find shorn tatters...

The fabricated fabric of duality as causality of reality.

Triality?

Can matter and anti-matter recombine as trine?

A hybrid state of physicality.

Instantly ancient.

This idea anew.

The big-bang/yin-yang dark opposed to light.

Never knew this clever clue was a third insight.



I just thought this subject up after a beer and a toke on a lark of a joke.
Eye Wander/I Wonder what the forum thinks would happen if matter and antimatter could co-exist?


A thought experiment.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#2
MARCH 10, 2020
Solved: The mystery of the expansion of the universe
[Image: 5e677f5397ada.jpg]M106. Credit: NASA
The Earth, solar system, the entire Milky Way and the few thousand galaxies closest to us move in a vast "bubble" that is 250 million light years in diameter, where the average density of matter is half as high as for the rest of the universe. This is the hypothesis advanced by a theoretical physicist from the University of Geneva (UNIGE) to solve a conundrum that has been splitting the scientific community for a decade: At what speed is the universe expanding? Until now, at least two independent calculation methods have arrived at two values that are different by about 10% with a deviation that is statistically irreconcilable. This new approach, which is set out in the journal Physics Letters B, erases this divergence without making use of any "new physics."

The universe has been expanding since the Big Bang occurred 13.8 billion years ago—a proposition first made by the Belgian canon and physicist Georges Lemaître (1894-1966), and first demonstrated by Edwin Hubble (1889-1953). The American astronomer discovered in 1929 that every galaxy is pulling away from us, and that the most distant galaxies are moving the most quickly. This suggests that there was a time in the past when all the galaxies were located at the same spot, a time that can only correspond to the Big Bang. This research gave rise to the Hubble-Lemaître law, including the Hubble constant (H0), which denotes the universe's rate of expansion. The best H0 estimates currently lie around 70 (km/s)/Mpc (meaning that the universe is expanding 70 kilometers a second more quickly every 3.26 million light years). The problem is that there are two conflicting methods of calculation.
[b]Sporadic supernovae[/b]
The first is based on the cosmic microwave background: This is the microwave radiation that comes at us from everywhere, emitted at the time the universe became cold enough for light to be able to circulate freely (about 370,000 years after the Big Bang). Using the precise data supplied by the Planck space mission, and given the fact that the universe is homogeneous and isotropic, a value of 67.4 is obtained for H0 using Einstein's theory of general relativity to run through the scenario. The second calculation method is based on the supernovae that appear sporadically in distant galaxies. These very bright events provide the observer with highly precise distances, an approach that has made it possible to determine a value for H0 of 74.
Lucas Lombriser, a professor in the Theoretical Physics Department in UNIGE's Faculty of Sciences, explains: "These two values carried on becoming more precise for many years while remaining different from each other. It didn't take much to spark a scientific controversy and even to arouse the exciting hope that we were perhaps dealing with a 'new physics.'" To narrow the gap, professor Lombriser entertained the idea that the universe is not as homogeneous as claimed, a hypothesis that may seem obvious on relatively modest scales. There is no doubt that matter is distributed differently inside a galaxy than outside one. It is more difficult, however, to imagine fluctuations in the average density of matter calculated on volumes thousands of times larger than a galaxy.
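
A quick back-of-the-envelope check of that roughly 10% tension, using only the two H0 values quoted above (a minimal Python sketch; the 100 Mpc test distance is just an illustration, not anything from the paper):

[code]
# Back-of-the-envelope check of the Hubble-constant tension described above.
# The two H0 values (in (km/s)/Mpc) are the figures quoted in the article;
# the 100 Mpc distance is only an illustration.

H0_cmb = 67.4         # from the cosmic microwave background (Planck)
H0_supernovae = 74.0  # from supernova distance measurements

# Fractional difference between the two determinations (~10%, as stated)
tension = (H0_supernovae - H0_cmb) / H0_cmb
print(f"fractional difference: {tension:.1%}")        # ~9.8%

# Hubble-Lemaitre law, v = H0 * d: recession speed of a galaxy 100 Mpc away
d_mpc = 100.0
for label, H0 in [("CMB", H0_cmb), ("supernovae", H0_supernovae)]:
    print(f"{label:10s} v = {H0 * d_mpc:.0f} km/s at {d_mpc:.0f} Mpc")
[/code]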
[b]The "Hubble Bubble"[/b]
"If we were in a kind of gigantic 'bubble,'" continues professor Lombriser, "where the density of matter was significantly lower than the known density for the entire universe, it would have consequences on the distances of supernovae and, ultimately, on determining H0."
All that would be needed would be for this "Hubble bubble" to be large enough to include the galaxy that serves as a reference for measuring distances. By establishing a diameter of 250 million light years for this bubble, the physicist calculated that if the density of matter inside was 50% lower than for the rest of the universe, a new value would be obtained for the Hubble constant, which would then agree with the one obtained using the cosmic microwave background. "The probability that there is such a fluctuation on this scale is one in 20 to one in 5, which means that it is not a theoretician's fantasy. There are a lot of regions like ours in the vast universe," says professor Lombriser.







[b]More information:[/b] Lucas Lombriser. Consistency of the local Hubble constant with the cosmic microwave background, Physics Letters B (2020). DOI: 10.1016/j.physletb.2020.135303
[b]Journal information:[/b] Physics Letters B

Provided by [url=https://phys.org/partners/university-of-geneva/]University of Geneva[/url]

https://phys.org/news/2020-03-mystery-ex...verse.html
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#3
MARCH 10, 2020
Paper sheds light on infant universe and origin of matter
[Image: 1-papershedsli.jpg]The rotation of the QCD axion (black ball) produces an excess of matter (colored balls) over antimatter, allowing galaxies and human beings to exist. Credit: Graphic: Harigaya and Co; Photo: NASA
A new study, conducted to better understand the origin of the universe, has provided insight into some of the most enduring questions in fundamental physics: How can the Standard Model of particle physics be extended to explain the cosmological excess of matter over antimatter? What is dark matter? And what is the theoretical origin of an unexpected but observed symmetry in the force that binds protons and neutrons together?

In the paper "Axiogenesis," scheduled to be published in Physical Review Letters on March 17, 2020, researchers Keisuke Harigaya, Member in the School of Natural Sciences at the Institute for Advanced Study, and Raymond T. Co of the University of Michigan, have presented a compelling case in which the quantum chromodynamics (QCD) axion, first theorized in 1977, provides several important answers to these questions.
"We revealed that the rotation of the QCD axion can account for the excess of matter found in the universe," stated Harigaya. "We named this mechanism axiogenesis."
Infinitesimally light, the QCD axion—at least one billion times lighter than a proton—is nearly ghost-like. Millions of these particles pass through ordinary matter every second without notice. However, the subatomic level interaction of the QCD axion can still leave detectable signals in experiments with unprecedented sensitivities. While the QCD axion has never been directly detected, this study provides added fuel for experimentalists to hunt down the elusive particle.
"The versatility of the QCD axion in solving the mysteries of fundamental physics is truly amazing," stated Co. "We are thrilled about the unexplored theoretical possibilities that this new aspect of the QCD axion can bring. More importantly, experiments may soon tell us whether the mysteries of nature truly hint towards the QCD axion."
Harigaya and Co have reasoned that the QCD axion is capable of filling three missing pieces of the physics jigsaw puzzle simultaneously. First, the QCD axion was originally proposed to explain the so-called strong CP problem—why the strong force, which binds protons and neutrons together, unexpectedly preserves a symmetry called the Charge Parity (CP) symmetry. The CP symmetry is inferred from the observation that a neutron does not react with an electric field despite its charged constituents. Second, the QCD axion was found to be a good candidate for dark matter, offering what could be a major breakthrough in understanding the composition of approximately 80 percent of the universe's mass that has never been directly observed. In their work on the early universe, Harigaya and Co have determined that the QCD axion can also explain the matter-antimatter asymmetry problem.
As matter and antimatter particles interact, they are mutually annihilated. In the first fraction of a second following the Big Bang, matter and antimatter existed in equal amounts. This symmetry prevented the predominance of one type of matter over the other. Today, the universe is filled with matter, indicating that this symmetry must have been broken. Harigaya and Co cite the QCD axion as the culprit. Kinetic energy, resulting from the motion of the QCD axion, produced additional baryons or ordinary matter. This slight tipping of the scale in favor of matter would have had a pronounced cascade effect, paving the way for the universe as it is known today.
Greater understanding of the newly discovered dynamics of the QCD axion could potentially change the expansion history of the universe and thus inform the study of gravitational waves. Future work on this topic could also provide further insight into other enduring questions of fundamental physics, such as the origin of the tiny neutrino mass.
"Since theoretical and experimental particle physicists, astrophysicists, and cosmologists began studying the QCD axion, great progress has been made. We hope that our work further advances these interdisciplinary research efforts," added Harigaya.







[b]More information:[/b] Axiogenesis, arXiv:1910.02080 [hep-ph] arxiv.org/abs/1910.02080
[b]Journal information:[/b] Physical Review Letters

Provided by [url=https://phys.org/partners/institute-for-advanced-study/]Institute for Advanced Study[/url]
 

https://phys.org/news/2020-03-paper-infa...verse.html
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#4
Quote:This kind of work had been done in two-dimensional systems, but this is the first time a 3-D system had been studied in this way. The research showed that the dominant topological structures in the system were loop structures that emerge spontaneously, expand and then self-annihilate.

The loops are related to the kinds of defects that emerge in better-studied 2-D systems, but they differ in a key way, the researchers say. In 2-D, defects arise in pairs of points that have opposing characteristics or "charges," a bit like particles and antiparticles. Once they form, they exist until they eventually run into a defect with the opposite charge, which causes them to annihilate.
The loops that form in 3-D, in contrast, have no charge. As a result, they form and annihilate all on their own. They're still related to the 2-D defects structures, however. In fact, the 3-D loops can be thought of as extensions of 2-D point defects. Imagine two point defects sitting on a 2-D surface. Now connect those two points with an arc that rises up out of the 2-D surface, and a second arc on the underside of the surface. The result is a loop that has both charges of the points, but is itself charge neutral. That enables nucleation and annihilation all on their own.

MARCH 9, 2020
Study reveals collective dynamics of active matter systems
by Kevin Stacey, Brown University
[Image: 6-researchreve.jpg]A new study characterizes the defect patterns in an active matter system. The defects tend to form loops that form and annihilate spontaneously. Credit: Duclos et al.
Flocks of starlings that produce dazzling patterns across the sky are natural examples of active matter—groups of individual agents coming together to create collective dynamics. In a study featured on the cover of the March 6 issue of the journal Science, a team of researchers that includes Brown University physicists reveals new insights into what happens inside active matter systems.

The research describes experiments using a three-dimensional active nematic. Nematic describes a state of matter that emerges in the kind of liquid crystals widely used in smartphone and television displays. The cigar-shaped molecules in liquid crystals are able to move as in a liquid, but tend to stay ordered more or less in the same direction, a little like a crystal.
In a normal liquid crystal, the molecules are passive, meaning they don't have the ability to self-propel. But the system involved in this new study replaces those passive molecules with tiny bundles of microtubules, each with the ability to consume fuel and propel themselves. The goal of the research was to study how those active elements affect the order of the system.
"These microtubules tend to align, but also continually destroy their own aligning order with their movement," said study co-author Daniel Beller, an assistant professor of physics at University of California, Merced, who began work on the research while he was a postdoctoral researcher at Brown. "So there are collective motions that create defects in the alignment, and that's what we study here."
As the system evolves, the defects appear to come to life in some sense, creating lines, loops and other structures that meander through the system. The researchers studied the structures using topology, a branch of math concerned with how things deform without breaking.
"If your goal is to understand the dynamics of these systems, then one way to do that is to focus on these emerging topological structures as a way to characterize the dynamics," said Robert Pelcovits, a professor of physics at Brown and a study coauthor. "If we can get guiding principles from this simple system, that might help guide us in understanding more complicated ones."
Beller, Pelcovits and Thomas Powers, a professor of engineering and physics at Brown, led the theoretical work for the study. The experimental work was performed by researchers from Brandeis University and the University of California, Santa Barbara. Researchers from the Max Planck Institute for Dynamics and Self-Organization, the University of Chicago, Brandeis and Eindhoven University of Technology contributed computer modeling expertise.
This kind of work had been done in two-dimensional systems, but this is the first time a 3-D system had been studied in this way. The research showed that the dominant topological structures in the system were loop structures that emerge spontaneously, expand and then self-annihilate.
The loops are related to the kinds of defects that emerge in better-studied 2-D systems, but they differ in a key way, the researchers say. In 2-D, defects arise in pairs of points that have opposing characteristics or "charges," a bit like particles and antiparticles. Once they form, they exist until they eventually run into a defect with the opposite charge, which causes them to annihilate.
The loops that form in 3-D, in contrast, have no charge. As a result, they form and annihilate all on their own. They're still related to the 2-D defects structures, however. In fact, the 3-D loops can be thought of as extensions of 2-D point defects. Imagine two point defects sitting on a 2-D surface. Now connect those two points with an arc that rises up out of the 2-D surface, and a second arc on the underside of the surface. The result is a loop that has both charges of the points, but is itself charge neutral. That enables nucleation and annihilation all on their own.
The researchers are hopeful that this new understanding of this system's dynamics will be applicable in real-world systems like bacterial colonies, structures and systems in the human body, or other systems.
"What we found here is a quite general set of behaviors that we think will be fully present in similar systems that have this tendency to align, but that are also turning stored energy into motion," Beller said.







[b]More information:[/b] Guillaume Duclos et al, Topological structure and dynamics of three-dimensional active nematics, Science (2020). DOI: 10.1126/science.aaz4547
[b]Journal information:[/b] Science

Provided by Brown University



https://phys.org/news/2020-03-reveals-dynamics.html


Jeeze! This thread's improv was instantly prescient from the get-go! What a beer and a toke will do for ya! lol...


Quote:The second measurement was a search for a difference between the mass of the hypertriton and its antimatter counterpart, the antihypertriton (the first nucleus containing an antistrange quark, discovered at RHIC in 2010). Physicists have never found a mass difference between matter-antimatter partners so seeing one would be a big discovery. It would be evidence of "CPT" violation—a simultaneous violation of three fundamental symmetries in nature pertaining to the reversal of charge, parity (mirror symmetry), and time.

"Physicists have seen parity violation, and violation of CP together (each earning a Nobel Prize for Brookhaven Lab[), but never CPT," said Brookhaven physicist Zhangbu Xu, co-spokesperson of RHIC's STAR experiment, where the hypertriton research was done.
But no one has looked for CPT violation in the hypertriton and antihypertriton, he said, "because no one else could yet."
MARCH 9, 2020
'Strange' glimpse into neutron stars and symmetry violation
[Image: strangeglimp.jpg]Inner vertex components of the STAR detector at the Relativistic Heavy Ion Collider (righthand view) allow scientists to trace tracks from triplets of decay particles picked up in the detector's outer regions (left) to their origin in a rare "antihypertriton" particle that decays just outside the collision zone. Measurements of the momentum and known mass of the decay products (a pi+ meson, antiproton, and antideuteron) can then be used to calculate the mass and binding energy of the parent particle. Doing the same for the hypertriton (which decays into different "daughter" particles) allows precision comparisons of these matter and antimatter varieties. Credit: Brookhaven National Laboratory
New results from precision particle detectors at the Relativistic Heavy Ion Collider (RHIC) offer a fresh glimpse of the particle interactions that take place in the cores of neutron stars and give nuclear physicists a new way to search for violations of fundamental symmetries in the universe. The results, just published in Nature Physics, could only be obtained at a powerful ion collider such as RHIC, a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at DOE's Brookhaven National Laboratory.

The precision measurements reveal that the binding energy holding together the components of the simplest "strange-matter" nucleus, known as a "hypertriton," is greater than obtained by previous, less-precise experiments. The new value could have important astrophysical implications for understanding the properties of neutron stars, where the presence of particles containing so-called "strange" quarks is predicted to be common.
The second measurement was a search for a difference between the mass of the hypertriton and its antimatter counterpart, the antihypertriton (the first nucleus containing an antistrange quark, discovered at RHIC in 2010). Physicists have never found a mass difference between matter-antimatter partners so seeing one would be a big discovery. It would be evidence of "CPT" violation—a simultaneous violation of three fundamental symmetries in nature pertaining to the reversal of charge, parity (mirror symmetry), and time.
"Physicists have seen parity violation, and violation of CP together (each earning a Nobel Prize for Brookhaven Lab[), but never CPT," said Brookhaven physicist Zhangbu Xu, co-spokesperson of RHIC's STAR experiment, where the hypertriton research was done.
But no one has looked for CPT violation in the hypertriton and antihypertriton, he said, "because no one else could yet."
The previous CPT test of the heaviest nucleus was performed by the ALICE collaboration at Europe's Large Hadron Collider (LHC), with a measurement of the mass difference between ordinary helium-3 and antihelium-3. The result, showing no significant difference, was published in Nature Physics in 2015.
Spoiler alert: The STAR results also reveal no significant mass difference between the matter-antimatter partners explored at RHIC, so there's still no evidence of CPT violation. But the fact that STAR physicists could even make the measurements is a testament to the remarkable capabilities of their detector.

[b]Strange matter[/b]
The simplest normal-matter nuclei contain just protons and neutrons, with each of those particles made of ordinary "up" and "down" quarks. In hypertritons, one neutron is replaced by a particle called a lambda, which contains one strange quark along with the ordinary up and down varieties.
Such strange matter replacements are common in the ultra-dense conditions created in RHIC's collisions—and are also likely in the cores of neutron stars where a single teaspoon of matter would weigh more than 1 billion tons. That's because the high density makes it less costly energy-wise to make strange quarks than the ordinary up and down varieties.
For that reason, RHIC collisions give nuclear physicists a way to peer into the subatomic interactions within distant stellar objects without ever leaving Earth. And because RHIC collisions create hypertritons and antihypertritons in nearly equal amounts, they offer a way to search for CPT violation as well.
But finding those rare particles among the thousands that stream from each RHIC particle smashup—with collisions happening thousands of times each second—is a daunting task. Add to the challenge the fact that these unstable particles decay almost as soon as they form—within centimeters of the center of the four-meter-wide STAR detector.

[Image: 1-strangeglimp.jpg]
The Heavy Flavor Tracker at the center of RHIC's STAR detector. Credit: Brookhaven National Laboratory
[b]Precision detection[/b]
Fortunately, detector components added to STAR for tracking different kinds of particles made the search a relative cinch. These components, called the "Heavy-Flavor Tracker," are located very close to the STAR detector's center. They were developed and built by a team of STAR collaborators led by scientists and engineers at DOE's Lawrence Berkeley National Laboratory (Berkeley Lab). These inner components allow scientists to match up tracks created by decay products of each hypertriton and antihypertriton with their point of origin just outside the collision zone.
"What we look for are the 'daughter' particles—the decay products that strike detector components at the outer edges of STAR," said Berkeley Lab physicist Xin Dong. Identifying tracks of pairs or triplets of daughter particles that originate from a single point just outside the primary collision zone allows the scientists to pick these signals out from the sea of other particles streaming from each RHIC collision.
"Then we calculate the momentum of each daughter particle from one decay (based on how much they bend in STAR's magnetic field), and from that we can reconstruct their masses and the mass of the parent hypertriton or antihypertriton particle before it decayed," explained Declan Keane of Kent State University (KSU). Telling the hypertriton and antihypertriton apart is easy because they decay into different daughters, he added.
"Keane's team, including Irakli Chakeberia, has specialized in tracking these particles through the detectors to 'connect the dots,'" Xu said. "They also provided much needed visualization of the events."
As noted, compiling data from many collisions revealed no mass difference between the matter and antimatter hypernuclei, so there's no evidence of CPT violation in these results.
But when STAR physicists looked at their results for the binding energy of the hypertriton, it turned out to be larger than previous measurements from the 1970s had found.
The STAR physicists derived the binding energy by subtracting their value for the hypertriton mass from the combined known masses of its building-block particles: a deuteron (a bound state of a proton and a neutron) and one lambda.
"The hypertriton weighs less than the sum of its parts because some of that mass is converted into the energy that is binding the three nucleons together," said Fudan University STAR collaborator Jinhui Chen, whose Ph.D. student, Peng Liu, analyzed the large datasets to arrive at these results. "This binding energy is really a measure of the strength of these interactions, so our new measurement could have important implications for understanding the 'equation of state' of neutron stars," he added.
For example, in model calculations, the mass and structure of a neutron star depends on the strength of these interactions. "There's great interest in understanding how these interactions—a form of the strong force—are different between ordinary nucleons and strange nucleons containing up, down, and strange quarks," Chen said. "Because these hypernuclei contain a single lambda, this is one of the best ways to make comparisons with theoretical predictions. It reduces the problem to its simplest form."







[b]More information:[/b] Measurement of the mass difference and the binding energy of the hypertriton and antihypertriton, Nature Physics (2020). DOI: 10.1038/s41567-020-0799-7 , https://nature.com/articles/s41567-020-0799-7
[b]Journal information:[/b] Nature Physics 

Provided by [url=https://phys.org/partners/brookhaven-national-laboratory/]Brookhaven National Laboratory[/url]


https://phys.org/news/2020-03-strange-gl...metry.html
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#5
...


Quote:cores of neutron stars,
where a single teaspoon of matter would weigh more than 1 billion tons




Single neutron star merger supplied half the Solar System’s plutonium

https://arstechnica.com/science/2019/05/...plutonium/

We are all, as Carl Sagan said, star-dust.
You might think that since most stars are pretty much the same,
all star-dust is equal.
But we have evidence that some star-dust is more equal than others.
Yes, some elements seem to have a very special origin:
neutron star mergers.

Most stars are pretty much all hydrogen.
Near their center, fusion busily turns hydrogen into helium.
Eventually, that hydrogen will run out and,
like a pub that runs out of beer,
the real destruction begins.
The star starts turning helium into heavier elements,
at an increasingly feverish rate.
The end,
no matter how hot and heavy the star, comes when the star’s core is made of iron.

Up to iron,
the process of fusion releases more energy than it consumes.
But after iron,
fusion consumes more energy than it releases,
which essentially shuts the star down.
Once this was understood,
scientists were left wondering
where the remaining 80 odd elements that are heavier than iron came from.

Bring on the neutron stars:
Heavier stars end their life in a supernova—a violent explosion.
These explosions can create many of the elements heavier than iron.
However, a supernova will still only get us as far along the periodic table as molybdenum,
leaving about 40 elements unexplained.

Then, a neutron star merger was observed,
first via gravitational waves and later with various other hardware.
It seemed that the merger produced the right conditions,
to create the remaining elements via a process called rapid neutron capture.

Imagine an iron atom sitting around minding its own business.
Iron has 26 protons—the number of protons determines the element—and 30 neutrons,
which act to glue the protons into the nucleus.
Suddenly, thanks to a heavy neutron bombardment,
the iron nucleus starts accumulating neutrons at a rapid rate.
When the iron nucleus hits 32 neutrons,
one of the neutrons emits an electron to turn into a proton.
That turns the iron nucleus into a cobalt nucleus.
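
A toy version of that capture-then-decay bookkeeping, following the simplified picture in the text above; this only illustrates the counting, not a real r-process reaction network:

[code]
# Toy bookkeeping for the capture-then-decay step described above. This mirrors
# the simplified picture in the text (pile on neutrons, then one neutron turns
# into a proton via beta decay); it is NOT a real r-process reaction network.

def capture_then_decay(protons, neutrons, captured, beta_decays):
    """Add `captured` neutrons, then convert `beta_decays` neutrons into protons."""
    neutrons += captured
    protons += beta_decays     # each beta decay raises the element by one
    neutrons -= beta_decays
    return protons, neutrons

# Start from the iron nucleus in the example: 26 protons, 30 neutrons
Z, N = 26, 30
Z, N = capture_then_decay(Z, N, captured=3, beta_decays=1)
print(f"Z = {Z}, N = {N}, A = {Z + N}")   # Z = 27 is cobalt, one step past iron
[/code]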

The capture and decay process can continue to encompass all the naturally occurring elements.
But it only happens if there's a large source of neutrons to bombard the atom,
which a neutron star merger provides.
We've only observed one neutron star merger at this point though,
which leaves things a bit uncertain.

How special are neutron star mergers?

[Image: Neutron_Star_Merger_Still_2_new_1080-800x450.jpg]


In the new study,
researchers have examined the ratio of elements found in asteroids.
Asteroids are a bit like time capsules from the past.
These rocks have floated around the Solar System doing basically nothing
at least until some of them had the luck to land on Earth.

Over that time, the radioactive elements will decay,
leaving behind stable isotopes of different elements.
For some elements with very long half-lives,
some of the original radioactive material is still around in asteroids.

A team of researchers was able to estimate the abundance of actinides—elements
with atomic numbers from 89 upwards—in asteroids,
and thus what it must have been in the primeval Solar System.
That analysis showed that supernovae are almost certainly not the source of the actinides.

This conclusion is based on a reasonably long chain of logic.
First, if supernovae are a major contributor to actinide formation,
then there should be an average amount of actinide production per explosion.
Stars follow a predictable life,
so the researchers can estimate how many stars went kaboom
in time to contribute material to the formation of our Solar System.
But the numbers simply don’t work out:
if actinides were produced by supernovae,
it would lead to a higher abundance of these elements than we actually observe.

On the other hand,
the researchers are also able to estimate the number of neutron star mergers,
that could contribute material to the formation of the Solar System.
Neutron stars are (from a computational point of view) nearly ideal stars,
so we can model their behavior pretty well.
Combine those models with our observations of a single neutron star merger,
and researchers have a pretty good idea of actinide production.

Here the numbers seem to work out:
the number of mergers that could have contributed to our early Solar System
(a number based on how often these things seem to occur)
produces an actinide abundance that brackets the one estimated from asteroids.

It gets even better.
It seems that nearly half the plutonium in the Solar System,
came from a single neutron star merger.
That is fascinating:
with such low numbers of neutron star mergers contributing to actinide abundance,
the variation from solar system to solar system must be huge.
Imagine, we could have ended up in a solar system with almost no uranium or plutonium.

Now, a note of caution in this research:
the scientists compared standard supernovae with neutron star mergers.
But there is a special class of supernova,
called collapsars,
that are a different story.
Collapsars may also be able to supply actinides,
but we still don't know a lot about the physics there.
And the researchers behind this paper suggest that they are too infrequent,
to have supplied the observed amount of actinides.
This leaves neutron star mergers as the most likely option.

...
Reply
#6
RE: On the matter of: "systems that are driven away from equilibrium" MATTER Vs. ANTI-MATTER


This observation violates the standard model of physics that explains the basic fundamental forces of the universe and classifies all known elementary particles.

According to their calculations, there could be two possibilities for new particles. In one scenario, they suggest that the Kaon might decay into a pion—a subatomic particle with a mass about 270 times that of an electron—and some sort of invisible particle. Or, the researchers in the KOTO experiment could have witnessed the production and decay of something completely unknown to physicists.

MARCH 5, 2020
Researchers propose new physics to explain decay of subatomic particle
by Kathleen Haughney, Florida State University
[Image: 6-researchersp.jpg]FSU physicists proposed a new particle (yellow) to explain recently reported rare kaon (blue) decays to neutral pions (orange). Credit: Florida State University
Florida State University physicists believe they have an answer to unusual incidents of rare decay of a subatomic particle called a Kaon that were reported last year by scientists in the KOTO experiment at the Japan Proton Accelerator Research Complex.

FSU Associate Professor of Physics Takemichi Okui and Assistant Professor of Physics Kohsaku Tobioka published a new paper in the journal Physical Review Letters that proposes that this decay is actually a new, short-lived particle that has avoided detection in similar experiments.
"This is such a rare disintegration," Okui said. "It's so rare, that they should not have seen any. But if this is correct, how do we explain it? We think this is one possibility."
Kaons are particles made of one quark and one antiquark. Researchers study how they function—which includes their decay—as a way to better understand how the world works. But last year, researchers in the KOTO experiment reported four instances of a particular rare decay that should have been too rare to be detected yet.
This observation violates the standard model of physics that explains the basic fundamental forces of the universe and classifies all known elementary particles.
According to their calculations, there could be two possibilities for new particles. In one scenario, they suggest that the Kaon might decay into a pion—a subatomic particle with a mass about 270 times that of an electron—and some sort of invisible particle. Or, the researchers in the KOTO experiment could have witnessed the production and decay of something completely unknown to physicists.
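
As a side check on the "about 270 times" figure, the ratio follows directly from standard particle masses (textbook values, not numbers from the paper):

[code]
# Sanity check on the "about 270 times the mass of an electron" figure above,
# using standard particle masses (textbook values, not numbers from the paper).

m_electron = 0.511       # MeV/c^2
m_pion_neutral = 134.98  # MeV/c^2
m_pion_charged = 139.57  # MeV/c^2

print(f"neutral pion / electron: {m_pion_neutral / m_electron:.0f}")   # ~264
print(f"charged pion / electron: {m_pion_charged / m_electron:.0f}")   # ~273
[/code]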
Researchers in Japan are conducting a special data run to confirm whether the previous observations were true detections of new particles or simply noise.
"If it's confirmed, it's very exciting because it's completely unexpected," Tobioka said. "It might be noise, but it might not be. In this case, expectation of noise is very low, so even one event or observation is very striking. And in this case there were four."
Okui and Tobioka's co-authors on this study were Teppei Kitahara and Yotam Soreg from the Israel Institute of Technology and Gilad Perez from the Weizmann Institute of Science in Israel.






[b]More information:[/b] Teppei Kitahara et al. New Physics Implications of Recent Search for KL→π0νν¯ at KOTO, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.071801
[b]Journal information:[/b] Physical Review Letters

Provided by Florida State University

https://phys.org/news/2020-03-physics-su...ticle.html




"However, this work not only sheds light on how swimming microorganisms interact with passive particles, like nutrients or degraded plastic, but reveals more generally how randomness arises in an active non-equilibrium environment. This finding could help us to understand the behaviour of other systems that are driven away from equilibrium, which occur not only in physics and biology, but also in financial markets for example."
English botanist Robert Brown first described Brownian motion in 1827, when he observed the random movements displayed by pollen grains when added to water.
Decades later the famous physicist Albert Einstein developed the mathematical model to explain this behaviour, and in doing so proved the existence of atoms, laying the foundations for widespread applications in science and beyond.

MARCH 18, 2020
Mathematicians develop new theory to explain real-world randomness
[Image: randomness.jpg]Credit: CC0 Public Domain
Brownian motion describes the random movement of particles in fluids, however, this revolutionary model only works when a fluid is static, or at equilibrium.

In real-life environments, fluids often contain particles that move by themselves, such as tiny swimming microorganisms. These self-propelled swimmers can cause movement or stirring in the fluid, which drives it away from equilibrium.
Experiments have shown that non-moving 'passive' particles can exhibit strange, loopy motions when interacting with 'active' fluids containing swimmers. Such movements do not fit with the conventional particle behaviours described by Brownian motion and so far, scientists have struggled to explain how such large-scale chaotic movements result from microscopic interactions between individual particles.
Now researchers from Queen Mary University of London, Tsukuba University, École Polytechnique Fédérale de Lausanne and Imperial College London, have presented a novel theory to explain observed particle movements in these dynamic environments.
They suggest the new model could also help make predictions about real-life behaviours in biological systems, such as the foraging patterns of swimming algae or bacteria.
Dr. Adrian Baule, Senior Lecturer in Applied Mathematics at Queen Mary University of London, who managed the project, said: "Brownian motion is widely used to describe diffusion throughout physical, chemical and biological sciences; however it can't be used to describe the diffusion of particles in more active systems that we often observe in real life."
By explicitly solving the scattering dynamics between the passive particle and active swimmers in the fluid, the researchers were able to derive an effective model for particle motion in 'active' fluids, which accounts for all experimental observations.
Their extensive calculation reveals that the effective particle dynamics follow a so-called 'Lévy flight', which is widely used to describe 'extreme' movements in complex systems that are very far from typical behaviour, such as in ecological systems or earthquake dynamics.
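
To see the qualitative difference between ordinary Brownian steps and the heavy-tailed steps of a Lévy flight, here is a generic numerical sketch; it is not the hydrodynamic scattering model derived in the paper, just an illustration of why rare long "flights" dominate:

[code]
# Generic contrast between Brownian (Gaussian-step) motion and a Levy-flight-like
# walk with heavy-tailed step lengths. This is only an illustration of why rare,
# very long steps dominate a Levy flight; it is not the paper's scattering model.

import numpy as np

rng = np.random.default_rng(0)
n_steps = 10_000
angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)   # random step directions

# Brownian walk: step lengths drawn from a Gaussian
brownian_steps = np.abs(rng.normal(0.0, 1.0, n_steps))

# Levy-like walk: step lengths drawn from a heavy-tailed Pareto distribution
levy_steps = rng.pareto(1.5, n_steps) + 1.0

for name, steps in [("Brownian", brownian_steps), ("Levy-like", levy_steps)]:
    x = np.cumsum(steps * np.cos(angles))
    y = np.cumsum(steps * np.sin(angles))
    print(f"{name:10s} net displacement: {np.hypot(x[-1], y[-1]):.1f}, "
          f"longest single step: {steps.max():.1f}")
[/code]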
Dr. Kiyoshi Kanazawa from the University of Tsukuba, and first author of the study, said: "So far there has been no explanation how Lévy flights can actually occur based on microscopic interactions that obey physical laws. Our results show that Lévy flights can arise as a consequence of the hydrodynamic interactions between the active swimmers and the passive particle, which is very surprising."
The team found that the density of active swimmers also affected the duration of the Lévy flight regime, suggesting that swimming microorganisms could exploit the Lévy flights of nutrients to determine the best foraging strategies for different environments.
Dr. Baule added: "Our results suggest optimal foraging strategies could depend on the density of particles within their environment. For example, at higher densities active searches by the forager could be a more successful approach, whereas at lower densities it might be advantageous for the forager to simply wait for a nutrient to come close as it is dragged by the other swimmers and explores larger regions of space.
"However, this work not only sheds light on how swimming microorganisms interact with passive particles, like nutrients or degraded plastic, but reveals more generally how randomness arises in an active non-equilibrium environment. This finding could help us to understand the behaviour of other systems that are driven away from equilibrium, which occur not only in physics and biology, but also in financial markets for example."
English botanist Robert Brown first described Brownian motion in 1827, when he observed the random movements displayed by pollen grains when added to water.
Decades later the famous physicist Albert Einstein developed the mathematical model to explain this behaviour, and in doing so proved the existence of atoms, laying the foundations for widespread applications in science and beyond.







[b]More information:[/b] 'Loopy Lévy flights enhance tracer diffusion in active suspensions.' K Kanazawa, T Sano, A Cairoli, and A Baule. Nature (2020). DOI: 10.1038/s41586-020-2086-2, http://www.nature.com/articles/s41586-020-2086-2
[b]Journal information:[/b] Nature 

Provided by [url=https://phys.org/partners/queen-mary--university-of-london/]Queen Mary, University of London[/url]










Quote:[b]Vianova

Single neutron star merger supplied half the Solar System’s plutonium[/b]


https://arstechnica.com/science/2019/05/...plutonium/

MARCH 10, 2020
Team obtains the best measurement of neutron star size to date
[Image: neutronstarw.jpg]A typical neutron star with a radius of eleven kilometres is about as large as a medium-sized German city. Credit: NASA's Goddard Space Flight Center
An international research team led by members of the Max Planck Institute for Gravitational Physics (Albert Einstein Institute; AEI) has obtained new measurements of how big neutron stars are. To do so, they combined a general first-principles description of the unknown behavior of neutron star matter with multi-messenger observations of the binary neutron star merger GW170817. Their results, which appeared in Nature Astronomy today, are more stringent by a factor of two than previous limits and show that a typical neutron star has a radius close to 11 kilometers. They also find that neutron stars merging with black holes are in most cases likely to be swallowed whole, unless the black hole is small and/or rapidly rotating. This means that while such mergers might be observable as gravitational-wave sources, they would be invisible in the electromagnetic spectrum.



https://phys.org/news/2020-03-neutron-star-kilometers-radius.html

Magnetars are neutron stars endowed with the strongest magnetic fields observed in the universe, but their origin remains controversial.
MARCH 16, 2020

A new theory of magnetar formation

[Image: 10-.jpg]Figure 1: 3D snapshots of the magnetic field lines in the convective zone inside a newborn neutron star. Inward (outward) flows are represented by the blue (red) surfaces. Left: strong field dynamo discovered for fast rotation periods of a few milliseconds, where the dipole component reaches 10^15 G. Right: for slower rotation, the magnetic field is up to ten times weaker. Credit: CEA Saclay
Magnetars are neutron stars endowed with the strongest magnetic fields observed in the universe, but their origin remains controversial. In a study published in Science Advances, a team of scientists from CEA, Saclay, the Max Planck Institute for Astrophysics (MPA), and the Institut de Physique du Globe de Paris developed a new and unprecedentedly detailed computer model that can explain the genesis of these gigantic fields through the amplification of pre-existing weak fields when rapidly rotating neutron stars are born in collapsing massive stars. The work opens new avenues to understand the most powerful and most luminous explosions of such stars.
https://phys.org/news/2020-03-theory-mag...ation.html



[Image: neutron.jpg]

Study identifies a transition in the strong nuclear force that illuminates the structure of a neutron star's core
Most ordinary matter is held together by an invisible subatomic glue known as the strong nuclear force—one of the four fundamental forces in nature, along with gravity, electromagnetism, and the weak force. The strong nuclear ...
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#7
PLUTONIUM EH?
WHAT IF THERE IS A 'MAGIC ANGLE' THAT WOULD STABILIZE MATTER VS ANTIMATTER 


"Physics has seven magic numbers: 2, 8, 20, 28, 50, 82 and 126. Atomic nuclei with these numbers of neutrons or protons are exceptionally stable. This stability makes them ideal for research purposes in general."

Scientists at ATLAS will be generating N = 126 nuclei to test a reigning theory of astrophysics—that the rapid capture of neutrons during the explosion and collapse of massive stars and the collision of neutron stars is responsible for the formation of about half the heavy elements from iron through uranium.


Quote:
Vianova

It gets even better.
It seems that nearly half the plutonium in the Solar System,
came from a single neutron star merger.
That is fascinating:
with such low numbers of neutron star mergers contributing to actinide abundance,
the variation from solar system to solar system must be huge.
Imagine, we could have ended up in a solar system with almost no uranium or plutonium.


"Physics has seven magic numbers: 2, 8, 20, 28, 50, 82 and 126. Atomic nuclei with these numbers of neutrons or protons are exceptionally stable. This stability makes them ideal for research purposes in general."

Scientists at ATLAS will be generating N = 126 nuclei to test a reigning theory of astrophysics—that the rapid capture of neutrons during the explosion and collapse of massive stars and the collision of neutron stars is responsible for the formation of about half the heavy elements from iron through uranium.

MARCH 6, 2020
Argonne's pioneering user facility to add magic number factory
by Joseph E. Harmon, Argonne National Laboratory
[Image: argonnespion.jpg]Credit: NASA images
One of the big questions in physics and chemistry is, how were the heavy elements from iron to uranium created? The Argonne Tandem Linac Accelerator System (ATLAS) at the U.S. Department of Energy's (DOE) Argonne National Laboratory is being upgraded with new capabilities to help find the answer to that question and many others.

Of five DOE Office of Science user facilities at Argonne, ATLAS is the longest lived. "Inaugurated in 1978, ATLAS is ever changing and developing new technological advances and responding to emerging research opportunities," says ATLAS director Guy Savard. It is now being outfitted with an "N = 126 factory," scheduled to go online later this year. This new capability will soon be producing beams of heavy atomic nuclei consisting of 126 neutrons. This is made possible, in part, by the addition of a cooler-buncher that cools the beam and converts it from continuous to bunched.
For many decades, ATLAS has been a leading U.S. facility for nuclear structure research and is the world-leading facility in the provision of stable beams for nuclear structure and astrophysics research. ATLAS can accelerate beams ranging across the elements, from hydrogen to uranium, to high energies, then it smashes them into targets for studies of various nuclear structures.
Since its inception, ATLAS has brought together the world's leading scientists and engineers to solve some of the most complex scientific problems in nuclear physics and astrophysics. In particular, it has been instrumental in determining properties of atomic nuclei, the core of matter and the fuel of stars.
The forthcoming N = 126 factory will be generating beams of atomic nuclei with a "magic number" of neutrons, 126. As Savard explains, "Physics has seven magic numbers: 2, 8, 20, 28, 50, 82 and 126. Atomic nuclei with these numbers of neutrons or protons are exceptionally stable. This stability makes them ideal for research purposes in general."
Scientists at ATLAS will be generating N = 126 nuclei to test a reigning theory of astrophysics—that the rapid capture of neutrons during the explosion and collapse of massive stars and the collision of neutron stars is responsible for the formation of about half the heavy elements from iron through uranium.
The N = 126 factory will be accelerating a beam composed of a xenon isotope with 82 neutrons into a target composed of a platinum isotope with 120 neutrons. The resulting collisions will transfer neutrons from the xenon beam into a platinum target, yielding isotopes with 126 neutrons and close to that amount. The very heavy neutron-rich isotopes are directed to experimental stations for study.
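
The neutron bookkeeping behind that beam-and-target choice is simple to check. The magic numbers are the seven quoted above; the specific isotope labels (Xe-136, Pt-198) and the N = 126 product row are inferred from the stated neutron counts and are only illustrative:

[code]
# Neutron bookkeeping for the beam/target combination described above.
# The magic numbers are the seven quoted in the article; the isotope labels
# (Xe-136, Pt-198) and the N = 126 product row are inferred from the stated
# neutron counts and are illustrative only.

MAGIC_NUMBERS = {2, 8, 20, 28, 50, 82, 126}

def neutron_count(mass_number, proton_number):
    """N = A - Z for a nucleus with mass number A and atomic number Z."""
    return mass_number - proton_number

nuclei = {
    "Xe-136 beam (Z=54)": (136, 54),     # xenon beam with 82 neutrons
    "Pt-198 target (Z=78)": (198, 78),   # platinum target with 120 neutrons
    "product after +6 n": (204, 78),     # illustrative neutron-rich product
}

for label, (A, Z) in nuclei.items():
    N = neutron_count(A, Z)
    tag = "magic" if N in MAGIC_NUMBERS else "not magic"
    print(f"{label}: N = {N} ({tag})")
[/code]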
"The planned studies at ATLAS will provide the first data on neutron-rich isotopes with around 126 neutrons and should play a critical role in understanding the formation of heavy elements, the last stage in the evolution of stars," said Savard. "These and other studies will keep ATLAS at the frontier of science."
The architects of the "N = 126 factory" include Savard, as well as Maxime Brodeur (University of Notre Dame), Adrian Valverde (joint appointment with University of Manitoba), Jason Clark (joint appointment with University of Manitoba), Daniel Lascar (Northwestern University) and Russell Knaack (Argonne's Physics division).
The authors recently published two papers on the subject in Nuclear Instruments and Methods in Physics Research B, "The N = 126 Factory: A New Facility to Produce Very Heavy Neutron-Rich Isotopes" and "A Cooler-Buncher for the N = 126 Factory at Argonne National Laboratory."







[b]More information:[/b] G. Savard et al, The N = 126 factory: A new facility to produce very heavy neutron-rich isotopes, Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms (2019). DOI: 10.1016/j.nimb.2019.05.024

A.A. Valverde et al. A cooler-buncher for the N=126 factory at Argonne National Laboratory, Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms (2019). DOI: 10.1016/j.nimb.2019.04.070

Provided by Argonne National Laboratory

https://phys.org/news/2020-03-argonne-us...ctory.html





MARCH 4, 2020
25 years on: A single top quark partners with the Z boson
[Image: 25yearsonasi.jpg]Figure 1: The neural network output (ONN) distribution for one of the signal regions. Data is shown in black. The simulated signal is shown in magenta. Backgrounds are shown in other colours. The high part of the ONN spectrum is dominated by signal events. Credit: ATLAS Collaboration/CERN
A quarter-century after its discovery, physicists at the ATLAS Experiment at CERN are gaining new insight into the heaviest-known particle, the top quark. The huge amount of data collected during Run 2 of the LHC (2015-2018) has allowed physicists to study rare production processes of the top quark in great detail, including its production in association with other heavy elementary particles.

In a new paper, the ATLAS Collaboration reports the observation of a single top quark produced in association with a Z boson (tZq) using the full Run-2 dataset, thereby confirming earlier results by ATLAS and CMS using smaller datasets. To achieve this new result, physicists studied over 20 billion collision events recorded by the ATLAS detector, looking for events with three isolated leptons (electrons or muons), a momentum imbalance in the plane perpendicular (transverse) to the proton beam, and two or three jets of hadrons originating from the fragmentation of quarks (with one jet originating from a b-quark). Only about 600 candidate events with such a signature were identified (i.e. the signal region) and, despite strict selection criteria, only about 120 of those are expected to come from the tZq production process.
To best separate their signal from background processes, ATLAS physicists trained an artificial neural network to identify tZq events using precisely simulated data. The neural network provided each event with a score (ONN) that represented how much it looked like the signal process. To check that the simulation fed to the neural network gave a good description of the real data, physicists looked at events with similar signatures (control regions) that are dominated by background processes. Various kinematic distributions of the 600 selected signal-region events were also checked.
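
As a generic illustration of that kind of score-based selection (a stand-in logistic-regression classifier on made-up features, not the ATLAS network, its inputs, or its working point):

[code]
# Generic sketch of score-based event selection: train a classifier on labelled
# "simulated" events, score every event, and keep those above a threshold.
# The features, labels, classifier and cut value are stand-ins, not the ATLAS
# neural network, its inputs, or its actual working point.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000

# Toy kinematic features: signal events cluster away from background
signal = rng.normal(loc=[1.0, 1.0], scale=0.8, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

clf = LogisticRegression().fit(X, y)

# Score every event and apply a threshold, analogous to the O_NN > 0.4 cut
scores = clf.predict_proba(X)[:, 1]
selected = scores > 0.4
purity = y[selected].mean()
print(f"selected {selected.sum()} events, signal purity = {purity:.2f}")
[/code]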

[Image: 1-25yearsonasi.jpg]
Figure 2: Distribution of the reconstructed Z boson transverse momentum for events with a neural network output (ONN) > 0.4. Data is shown in black. The simulated signal is shown in magenta. Backgrounds are shown in other colours. Credit: ATLAS Collaboration/CERN
Researchers evaluated the neural network score in both signal (Figure 1) and control regions so that the background levels could be constrained using real data. The tZq signal was extracted and the rate of such events being produced in the given data sample (i.e. the cross-section) was computed. The uncertainty on the extracted cross-section is 14%. This is over a factor of two more precise than the previous ATLAS result, which was based on almost four times less data (from 2015 and 2016). The cross-section was found to be in agreement with the prediction from the Standard Model, confirming that even the heaviest particles in the Standard Model still behave as point-like elementary particles.
Further, by selecting for events identified by the neural network as very likely to be tZq events (ONN > 0.4), ATLAS physicists could examine whether the kinematic distributions are well described by the Standard Model calculations. Figure 2 shows that this is indeed the case.
With the observation of the tZq production process now confirmed, ATLAS researchers can anticipate its study in even greater detail. Measurements of the cross-section as a function of kinematic variables will allow physicists to carefully probe the top quark's interactions with other particles. Will more data unveil some unexpected features? Look forward to seeing what Nature is hiding in the top world.







[b]More information:[/b] Observation of the associated production of a top quark and a Z boson in proton-proton collisions at 13 TeV with the ATLAS detector (arXiv: 2002.07546): arxiv.org/abs/2002.07546
Provided by ATLAS Experiment

https://phys.org/news/2020-03-years-quar...boson.html

Magic twist angles???



[Image: neutronstar.jpg]

Tracking down the mystery of matter
Researchers at the Paul Scherrer Institute PSI have measured a property of the neutron more precisely than ever before. In the process they found out that the elementary particle has a significantly smaller electric dipole ...




Nanophysics
MAR 11, 2020

Graphene is 200 times stronger than steel and can be as much as six times lighter. These characteristics alone make it a popular material in manufacturing. Researchers at the University of Illinois at Urbana-Champaign recently ..






[Image: twisted2dmat.jpg]

Twisted 2-D material gives new insights into strongly correlated 1-D physics
Researchers from the Max Planck Institute for the Structure and Dynamics of Matter (MPSD) in Hamburg, the RWTH Aachen University (both in Germany) and the Flatiron institute in the U.S. have revealed that the possibilities ...



[Image: quantumresea.jpg]

Quantum researchers able to split one photon into three
Researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo report the first occurrence of directly splitting one photon into three.




[Image: darkmatter.jpg]

Scientists shed light on mystery of dark matter
Scientists have identified a sub-atomic particle that could have formed the "dark matter" in the Universe during the Big Bang.



[Image: gravity.jpg]

Witnessing the birth of baby universes 46 times: The link between gravity and soliton
Scientists have been attempting to come up with an equation to unify the micro and macro laws of the Universe; quantum mechanics and gravity. We are one step closer with a paper that demonstrates that this unification is ...
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#8
Magic angles: 

Quote:of course, with a small offset.

The matching masses of hypertritons and antihypertritons reaffirms the solid footing of a pillar of physics known as charge-parity-time, or CPT, symmetry. To visualize such symmetry, imagine taking the universe and swapping out all the particles with their antimatter opposites, flipping it in a mirror and running time backward. If you could do that, the universe would behave identically to its nonflipped version, physicists believe. If CPT symmetry were discovered not to hold, physicists would need to reconsider their theories of the universe.
Quote:WHAT ABOUT A SPECIAL EXAMPLE OF A MIRROR??? There is still symmetry in 3-D
[Image: corner_reflectors.png]
To visualize such symmetry, imagine taking the universe and swapping out all the particles with their antimatter opposites, flipping it in a mirror and running time backward.
A retroreflector, unlike a mirror, has the property that it always reflects a ray back in the direction from which it came. The specular property of a combination of mirrors can be used to make a corner-reflector retroreflector, as shown in the figure. In cases where the incoming ray is constrained to a narrow range of angles, the corner reflector can be truncated into a triangular corner reflector, also shown in the figure. I once used such a triangular corner reflector in a radar rangefinder, since they can be made to work for any electromagnetic radiation, not just light.



A corner reflector (left) and a triangular corner reflector (right). The specular reflection property of mirrors guarantees that a beam will reflect back to the source; of course, with a small offset. (Modified versions of a Wikimedia Commons image.)




Even a weird hypernucleus confirms a fundamental symmetry of nature
A new study could also support the idea that exotic particles called hyperons lie at the centers of neutron stars


[Image: 030620_EC_hypertriton_Feat-1028x579.jpg]

Exotic atomic nuclei called hypernuclei, spotted with the STAR detector (central piece shown), have confirmed a symmetry between matter and antimatter. The result could also hint at the inner workings of neutron stars.

BROOKHAVEN NATIONAL LABORATORY/FLICKR (CC BY-NC-ND 2.0)

By Emily Conover
MARCH 9, 2020 AT 12:00 PM

An exotic version of an atomic nucleus is doing double duty. A study of the hypertriton simultaneously confirms a basic symmetry of nature and potentially reveals new insights into what lurks inside ultradense neutron stars. 
The hypertriton is a twin of the antihypertriton — the antimatter version of the nucleus. Both hypernuclei have the same mass, researchers with the STAR Collaboration report March 9 in [i]Nature Physics[/i]. 
A hypernucleus is an atomic nucleus in which a proton or neutron has been swapped out with a particle called a hyperon. Like protons and neutrons, hyperons are each made of three smaller particles called quarks. Whereas protons and neutrons contain common varieties known as up quarks and down quarks, hyperons are more unusual. They contain at least one quark of a type called a strange quark. 
The matching masses of hypertritons and antihypertritons reaffirm the solid footing of a pillar of physics known as charge-parity-time, or CPT, symmetry. To visualize such symmetry, imagine taking the universe and swapping out all the particles with their antimatter opposites, flipping it in a mirror and running time backward. If you could do that, the universe would behave identically to its nonflipped version, physicists believe. If CPT symmetry were discovered not to hold, physicists would need to reconsider their theories of the universe.

So far, scientists have not found any hints of CPT symmetry violation ([i]SN: 2/19/20[/i]), but they’ve never before tested it in nuclei that contain strange quarks, so they couldn’t be sure it held there. “It is conceivable that a violation of this symmetry would have been hiding in this little corner of the universe and it would never have been discovered up to now,” says physicist Declan Keane of Kent State University in Ohio. But the equal masses of hypertritons and antihypertritons — found in experiments at the Relativistic Heavy Ion Collider, RHIC, at Brookhaven National Laboratory in Upton, N.Y. — mean that CPT symmetry was upheld.
In collisions of gold nuclei at RHIC, Keane and colleagues identified the hypernuclei by looking for the particles produced when the hypernuclei decayed inside the 1,200-metric-ton STAR detector. In addition to confirming that CPT symmetry prevailed, the researchers determined how much energy would be needed to liberate the hyperon from the hypernucleus: about 0.4 million electron volts. Previous measurements — which are now decades old — suggested that amount, called binding energy, was significantly lower, with measurements mostly scattered below 0.2 million electron volts. (For comparison, the binding energy of a nucleus consisting of a proton and neutron is about 2.2 million electron volts.)
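As rough bookkeeping of the numbers quoted here (a sketch using standard particle masses, not figures taken from the STAR paper itself), the hypertriton mass follows from the deuteron and Λ masses minus the hyperon binding energy, and CPT then requires the antihypertriton to match it:

[code]
% Illustrative arithmetic with standard masses (m_d ~ 1875.6 MeV/c^2, m_Lambda ~ 1115.7 MeV/c^2);
% B_Lambda ~ 0.4 MeV is the binding energy quoted in the article.
\begin{align}
  m\bigl({}^{3}_{\Lambda}\mathrm{H}\bigr) &\simeq m_d + m_\Lambda - B_\Lambda
    \approx 1875.6 + 1115.7 - 0.4 \approx 2990.9\ \mathrm{MeV}/c^2, \\
  \text{CPT:}\quad m\bigl({}^{3}_{\Lambda}\mathrm{H}\bigr) &= m\bigl({}^{3}_{\bar\Lambda}\overline{\mathrm{H}}\bigr),
  \qquad B_\Lambda \approx 0.4\ \mathrm{MeV}\ \ \text{vs.}\ \ B_d \approx 2.2\ \mathrm{MeV}.
\end{align}
[/code]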
The new number could alter scientists’ understanding of neutron stars, remnants of exploded stars that cram a mass greater than the sun’s into a ball about as wide as the length of Manhattan. Neutron stars’ hearts are so dense that it’s impossible to re-create that matter in laboratory experiments, says Morgane Fortin of the Nicolaus Copernicus Astronomical Center of the Polish Academy of Sciences in Warsaw. So, “there is a big question mark what’s at the very center of neutron stars.”
Some scientists think the cores of neutron stars might contain hyperons ([i]SN: 12/1/17[/i]). But the presence of hyperons would soften the matter inside neutron stars. Softer neutron stars would more easily collapse into black holes, so neutron stars couldn’t become as massive. That feature makes hyperons’ potential presence difficult to reconcile with the largest neutron stars seen in the cosmos — which range up to about two solar masses.
But the newly measured, larger binding energy of the hyperon helps keep alive the idea of a hyperon-filled center to neutron stars. The result suggests that hyperons’ interactions with neutrons and protons are stronger than previously thought. That enhanced interaction means neutron stars with hyperons are stiffer and could reach higher masses, Fortin says. So neutron stars may still have strange hearts.


https://www.sciencenews.org/article/hype...-of-nature





MARCH 19, 2020
Chandra data tests 'theory of everything'
[Image: chandradatat.jpg]Credit: NASA/CXC/Univ. of Cambridge/C. Reynolds et al.
One of the biggest ideas in physics is the possibility that all known forces, particles, and interactions can be connected in one framework. String theory is arguably the best-known proposal for a "theory of everything" that would tie together our understanding of the physical universe.

Despite having many different versions of string theory circulating throughout the physics community for decades, there have been very few experimental tests. Astronomers using NASA's Chandra X-ray Observatory, however, have now made a significant step forward in this area.
By searching through galaxy clusters, the largest structures in the universe held together by gravity, researchers were able to hunt for a specific particle that many models of string theory predict should exist. While the resulting non-detection does not rule out string theory altogether, it does deliver a blow to certain models within that family of ideas.
"Until recently I had no idea just how much X-ray astronomers bring to the table when it comes to string theory, but we could play a major role," said Christopher Reynolds of the University of Cambridge in the United Kingdom, who led the study. "If these particles are eventually detected it would change physics forever."
The particle that Reynolds and his colleagues were searching for is called an "axion." These as-yet-undetected particles should have extraordinarily low masses. Scientists do not know the precise mass range, but many theories feature axion masses ranging from about a millionth of the mass of an electron down to zero mass. Some scientists think that axions could explain the mystery of dark matter, which accounts for the vast majority of matter in the universe.
One unusual property of these ultra-low-mass particles would be that they might sometimes convert into photons (that is, packets of light) as they pass through magnetic fields. The opposite may also hold true: photons may also be converted into axions under certain conditions. How often this switch occurs depends on how easily they make this conversion, in other words, on their "convertibility."
Some scientists have proposed the existence of a broader class of ultra-low-mass particles with similar properties to axions. Axions would have a single convertibility value at each mass, but "axion-like particles" would have a range of convertibility at the same mass.
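For orientation, the textbook single-domain estimate of this convertibility (a generic formula, not the detailed cluster magnetic-field modelling used in the Chandra analysis) reads, in natural units:

[code]
% Photon of energy E crossing a transverse field B over a coherence length L;
% g_{a gamma} is the axion-photon coupling ("convertibility"), m_a the axion mass.
\begin{equation}
  P_{\gamma\to a} \;\simeq\; \left(\frac{g_{a\gamma}\,B\,L}{2}\right)^{2}
  \left[\frac{\sin(qL/2)}{qL/2}\right]^{2},
  \qquad q \simeq \frac{m_a^{2}}{2E},
\end{equation}
% so both the coupling and the mass control the size and energy dependence of the
% spectral distortions searched for in the X-ray spectrum.
[/code]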

"While it may sound like a long shot to look for tiny particles like axions in gigantic structures like galaxy clusters, they are actually great places to look," said co-author David Marsh of Stockholm University in Sweden. "Galaxy clusters contain magnetic fields over giant distances, and they also often contain bright X-ray sources. Together these properties enhance the chances that conversion of axion-like particles would be detectable."

To look for signs of conversion by axion-like particles, the team of astronomers examined over five days of Chandra observations of X-rays from material falling towards the supermassive black hole in the center of the Perseus galaxy cluster. They studied the Chandra spectrum, or the amount of X-ray emission observed at different energies, of this source. The long observation and the bright X-ray source gave a spectrum with enough sensitivity to have shown distortions that scientists expected if axion-like particles were present.
The lack of detection of such distortions allowed the researchers to rule out the presence of most types of axion-like particles in the mass range their observations were sensitive to, below about a millionth of a billionth of an electron's mass.
"Our research doesn't rule out the existence of these particles, but it definitely doesn't help their case," said co-author Helen Russell of the University of Nottingham in the UK. "These constraints dig into the range of properties suggested by string theory, and may help string theorists weed their theories."
The latest result was about three to four times more sensitive than the previous best search for axion-like particles, which came from Chandra observations of the supermassive black hole in M87. This Perseus study is also about a hundred times more powerful than current measurements that can be performed in laboratories here on Earth for the range of masses that they have considered.
Clearly, one possible interpretation of this work is that axion-like particles do not exist. Another explanation is that the particles have even lower convertibility values than this observation's detection limit, and lower than some particle physicists have expected. They also could have higher masses than probed with the Chandra data.
A paper describing these results appeared in the February 10th, 2020 issue of The Astrophysical Journal.





Explore further
Is dark matter made of axions? Black holes may reveal the answer




[b]More information:[/b] Christopher S. Reynolds et al. Astrophysical Limits on Very Light Axion-like Particles from Chandra Grating Spectroscopy of NGC 1275, The Astrophysical Journal (2020). DOI: 10.3847/1538-4357/ab6a0c , on arXiv: https://arxiv.org/abs/1907.05475
[b]Journal information:[/b] Astrophysical Journal 

https://phys.org/news/2020-03-chandra-theory.html
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#9
Does Triality (if itz possible to exist) cause Duality?


Quote:Instantly ancient.

This idea anew.


If three or more objects move around each other, history cannot be reversed.


MARCH 23, 2020
Time symmetry and the laws of physics
[Image: physicslawsc.jpg]Two computer simulations of three black holes that influence each other. The red line is the simulation in which the computer goes back in time. The white line is the simulation where the computer moves forward in time. After 35 million years (situation on the left), there is still no deviation. The red line completely covers the white line. After 37 million years (middle), the orbits deviate slightly and the white line becomes visible. The time symmetry is broken because disturbances the size of the Planck length have an exponential effect. After 40 million years (right), the deviation is obvious. Credit: Astronomie.nl/Tjarda Boekholt
If three or more objects move around each other, history cannot be reversed. That is the conclusion of an international team of researchers based on computer simulations of three black holes orbiting each other. The researchers, led by the Dutch astronomer Tjarda Boekholt, publish their findings in the April issue of the journal Monthly Notices of the Royal Astronomical Society.

Most basic laws in physics have no problem with the direction in which they run. They are, as scientists call it, symmetric with respect to time, or time symmetric. In practice, however, everyone knows that time cannot simply be turned back. For example, a cup that falls into a hundred pieces really does not fly back into your hand spontaneously and undamaged. Until now, scientists explained the lack of time symmetry by the statistical interaction between large numbers of particles. Three astronomers now show that only three particles are enough to break the time symmetry.
Tjarda Boekholt (University of Coimbra, Portugal), Simon Portegies Zwart (Leiden University) and Mauri Valtonen (University of Turku, Finland) calculated the orbits of three black holes that influence each other. This is done in two simulations. In the first simulation, the black holes start from rest. Then they move towards each other and past each other in complicated orbits. Finally one black hole leaves the company of the two others. The second simulation starts with the end situation of two black holes and the escaped third black hole and tries to turn back the time to the initial situation.
It turns out that time cannot be reversed in 5% of the calculations, even when the computer uses more than a hundred decimal places. The last 5% is therefore not a question of better computers or smarter calculation methods, as was previously thought.

[b]Planck length[/b]
The researchers explain the irreversibility using the concept of the Planck length. This is a length scale known in physics that applies to phenomena at the atomic level and smaller. Lead researcher Boekholt: "The movement of the three black holes can be so enormously chaotic that something as small as the Planck length will influence the movements. The disturbances the size of the Planck length have an exponential effect and break the time symmetry."
Co-author Portegies Zwart adds: "So not being able to turn back time is no longer just a statistical argument. It is already hidden in the basic laws of nature. Not a single system of three moving objects, big or small, planets or black holes, can escape the direction of time."
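A toy version of the forward/backward experiment can be run in a few lines (a sketch with arbitrary masses, positions and a fixed-step integrator, nothing like the arbitrary-precision code of Boekholt and colleagues): integrate three bodies forward, flip the velocities, integrate again, and see how far the system lands from where it started.

[code]
# Toy reversibility test for a gravitational three-body system. All parameters are
# invented for illustration; the divergence here comes from finite-precision,
# fixed-step integration of a chaotic system, echoing the article's point.
import numpy as np

G = 1.0
m = np.array([1.0, 1.0, 1.0])                          # three equal masses
pos0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.3, 0.9]])
vel0 = np.zeros_like(pos0)                             # start from rest, as in the paper

def acceleration(pos):
    acc = np.zeros_like(pos)
    for i in range(3):
        for j in range(3):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * m[j] * r / np.linalg.norm(r) ** 3
    return acc

def leapfrog(pos, vel, dt, steps):
    acc = acceleration(pos)
    for _ in range(steps):
        vel = vel + 0.5 * dt * acc                     # kick
        pos = pos + dt * vel                           # drift
        acc = acceleration(pos)
        vel = vel + 0.5 * dt * acc                     # kick
    return pos, vel

p1, v1 = leapfrog(pos0, vel0, dt=1e-3, steps=20_000)   # forward in time
p2, v2 = leapfrog(p1, -v1, dt=1e-3, steps=20_000)      # velocities flipped = backward

print("position error after time reversal:", np.max(np.abs(p2 - pos0)))
[/code]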




Explore further
New way to form close double black holes



[b]More information:[/b] T C N Boekholt et al. Gargantuan chaotic gravitational three-body systems and their irreversibility to the Planck length, Monthly Notices of the Royal Astronomical Society (2020). DOI: 10.1093/mnras/staa452
[b]Journal information:[/b] Monthly Notices of the Royal Astronomical Society

Provided by 
Netherlands Research School for Astronomy



Quote:Triality?

Can matter and anti-matter recombine as trine?

A hybrid state of physicality.

unconventional superconductors

MARCH 23, 2020 FEATURE
Evidence for broken time-reversal symmetry in a topological superconductor

by Ingrid Fadelli , Phys.org
[Image: evidenceforb.jpg]A phase diagram of UPt3 indicating the three vortex phases (A, B, and C) for H ∥ c. Credit: Avers et al.
Chiral superconductors are unconventional superconducting materials with distinctive topological properties, in which time-reversal symmetry is broken. Two of the first materials to be identified as chiral superconductors are UPt3 and Sr2RuO4. So far, experimental evidence for broken time-reversal symmetry in both these materials was based primarily on surface measurements collected at a magnetic field equal to zero.

Researchers at the University of Notre Dame and Northwestern University, however, recently set out to gather new evidence for the chiral superconductivity of the material UPt3, moving beyond surface measurements at conditions with a zero magnetic field. Their paper, published in Nature Physics, contains the results of truly bulk measurements of UPt3 with an applied magnetic field, which provide direct evidence of broken time-reversal symmetry in the material.
"The measurements we collected are the conclusion of a decade long-term collaboration between William Halperin at Northwestern University and myself, driven by previous (William Gannon) and current (Keenan Avers) graduate students," Morten Eskildsen, one of the researchers who carried out the study, told Nature Physics. "They are especially timely given that recent thermal conductivity and 17O Knight shift measurements call into question the earlier determination of odd parity pairing in Sr2RuO4."
Compared to Sr2RuO4, odd-parity f-wave pairing is well established in UPt3. While the B phase of UPt3 is predicted to be a chiral ground state, evidence for broken time-reversal symmetry (BTRS) has come, as mentioned above, from surface-probe measurements with zero applied magnetic field.

[Image: 1-evidenceforb.jpg]
Vortex-lattice (VL) diffraction patterns. Credit: Avers et al
In their experiments, Eskildsen and his colleagues collected bulk measurements of UPt3 using small-angle neutron scattering (SANS), a technique that enables the characterization of material structures at a mesoscopic scale. The specific measurement protocol they used, which entails a comparison between field reduction and field reversal measurements, was devised by James Sauls at Northwestern University, who co-authored the paper.

Vortices introduced in superconducting materials by applying a magnetic field can serve as sensitive probes of the superconducting state in the host material. In their study, Eskildsen and his colleagues used vortices to probe the superconducting state in ultraclean UPt3 crystals, specifically through SANS studies of the material's vortex lattice.
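The geometry behind this is simple flux quantization (standard textbook relations, not numbers from the Avers et al. analysis): each vortex carries one flux quantum, so the applied field sets the spacing of a triangular vortex lattice and hence the scattering vector probed by SANS.

[code]
% Each vortex carries Phi_0 = h/(2e) ~ 2.07e-15 Wb. For a triangular lattice in field B:
\begin{equation}
  a \;=\; \left(\frac{2\,\Phi_0}{\sqrt{3}\,B}\right)^{1/2},
  \qquad
  q_{10} \;=\; \frac{4\pi}{\sqrt{3}\,a},
\end{equation}
% so measuring the diffraction spots at q_10 (and how they split or rotate with field
% history) is what reveals the internal degree of freedom discussed in this article.
[/code]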
"Vortices allow measurements as a function of magnetic field strength and probe bulk superconducting properties, as opposed to surface properties," Eskildsen said. "Our measurements were collected at two of the leading neutron scattering facilities: Oak Ridge National Laboratory in Tennessee (US) and Institut Laue Langevin in Grenoble (France). "The measurements were made possible by a long-term effort at Northwestern University to produce single crystals of UPt3 with an unprecedented high quality."

[Image: 2-evidenceforb.jpg]
Graph showing the field dependence of the vortex-lattice (VL) configuration. Credit: Avers et al.
The recent study by Eskildsen and his colleagues offers the first direct evidence of BTRS in the material UPt3 based on bulk measurements, ultimately demonstrating an internal degree of freedom in its superconductivity (i.e. the ability to obtain different vortex lattice splitting depending on the field history). In addition to confirming the BTRS of UPt3, these findings could encourage other research teams to use similar measurement techniques to study other unconventional superconductors.
"We do not currently have further plans for this material, but the kind of measurement protocols could be used in small-angle neutron scattering studies of other superconductors that might break time reversal symmetry," Eskildsen said.




Explore further
Observation of non-trivial superconductivity on surface of type II Weyl semimetal



[b]More information:[/b] K. E. Avers et al. Broken time-reversal symmetry in the topological superconductor UPt3, Nature Physics (2020). DOI: 10.1038/s41567-020-0822-z

E. Hassinger et al. Vertical Line Nodes in the Superconducting Gap Structure of Sr2RuO4, Physical Review X (2017). DOI: 10.1103/PhysRevX.7.011032
A. Pustogow et al. Constraints on the superconducting order parameter in Sr2RuO4 from oxygen-17 nuclear magnetic resonance, Nature (2019). DOI: 10.1038/s41586-019-1596-2
[b]Journal information:[/b] Nature Physics  Physical Review X  Nature




Hmmm another article. 

Posted by EA - 3 minutes ago
Does Triality (if itz possible to exist) cause Duality?

Quote: Wrote:Instantly ancient.
This idea anew.
If three or more objects move around each other, history cannot be reversed.
MARCH 23, 2020
Time symmetry and the laws of physics
by Netherlands Research School for Astronomy
If three or more objects move around each other, history cannot be reversed. That is the conclusion of an international team of researchers based on computer simulations of three black holes orbiting each other. The researchers, led by the Dutch astronomer Tjarda Boekholt, publish their findings in the April issue of the journal Monthly Notices of the Royal Astronomical Society.  https://phys.org/news/2020-03-symmetry-l...ysics.html



If three or more objects move around each other, history cannot be reversed.
 
Eye Wander how irreversible history now is? Arrow

Note that the interaction always ends with one object ejected as a separate entity, leaving behind a pair of twinned objects.

Now read the next article in view of a tri-particle maelstrom...lol.
"The biggest stars live a short time and very quickly evolve into stellar black holes, as large as several scores of solar masses; they are small, but many form in these galaxies." The dense gas that surrounds them, explain Boco and Lapi, has a very powerful definitive effect of dynamic friction and causes them to migrate very quickly to the centre of the galaxy. The majority of the numerous black holes that reach the central regions merge, creating the supermassive black hole seed.
MARCH 23, 2020
Supermassive black holes shortly after the Big Bang: How to seed them
[Image: supermassive.jpg]According to classical theories, these space giants would not have had the time to develop in the young Universe. Yet, observations say they were already present. A new study by SISSA proposes a response to the fascinating question Credit: NASA/JPL-Caltech
They are billions of times more massive than our Sun: how is it possible that, as recently observed, supermassive black holes were already present when the Universe, now 14 billion years old, was "just" 800 million years old? For astrophysicists, the formation of these cosmic monsters in such a short time is a real scientific headache, which raises important questions about our current knowledge of the development of these celestial bodies.

A recent article published in The Astrophysical Journal, by the SISSA Ph.D. student Lumen Boco and his supervisor Andrea Lapi, offers a possible explanation for the thorny issue. Thanks to an original model theorized by the scientists from Trieste, the study proposes a very fast formation process for the initial phases of the development of supermassive black holes, the phases up to now considered the slowest. By proving, mathematically, that their existence was possible in the young Universe, the results of the research reconcile the timing required for their growth with the limits imposed by the age of the Cosmos. The theory can be fully validated by future gravitational-wave detectors, namely the Einstein Telescope and LISA, and tested in several basic aspects with the current Advanced LIGO/Virgo system.
[b]The cosmic monster that grows at the centre of galaxies[/b]
The scientists started their study with a piece of well-known observational evidence: the growth of supermassive black holes occurs in the central regions of galaxies, progenitors of the current elliptical galaxies, which had a very high gas content and in which star formation was extremely intense. "The biggest stars live a short time and very quickly evolve into stellar black holes, as large as several scores of solar masses; they are small, but many form in these galaxies." The dense gas that surrounds them, explain Boco and Lapi, has a very powerful dynamical-friction effect and causes them to migrate very quickly to the centre of the galaxy. The majority of the numerous black holes that reach the central regions merge, creating the supermassive black hole seed.
Boco and Lapi continue: "According to classical theories, a supermassive black hole grows at the centre of a galaxy by capturing the surrounding matter, principally gas, accreting it onto itself and finally devouring it at a rhythm proportional to its mass. For this reason, during the initial phases of its development, when the mass of the black hole is small, the growth is very slow. So much so that, according to the calculations, reaching the observed masses, billions of times that of the Sun, would require a very long time, even longer than the age of the young Universe." Their study, however, showed that things could go much faster than that.

[b]The crazy dash of black holes: What the scientists have discovered[/b]
"Our numerical calculations show that the process of dynamic migration and fusion of stellar black holes can make the supermassive black hole seed reach a mass of between 10,000 and 100,000 times that of the Sun in just 50-100 million years." At this point, the researchers say, "the growth of the central black hole according to the aforementioned direct accretion of gas, envisaged by the standard theory, will become very fast, because the quantity of gas it will succeed in attracting and absorbing will become immense, and predominant on the process we propose. Nevertheless, precisely the fact of starting from such a big seed as envisaged by our mechanism speeds up the global growth of the supermassive black hole and allows its formation, also in the Young Universe. In short, in light of this theory, we can state that 800 million years after the Big Bang, supermassive black holes could already populate the Cosmos."
[b]"Looking" at the supermassive black hole seeds grow[/b]
The article, besides illustrating the model and demonstrating its efficacy, also proposes a method for testing it: "The fusion of numerous stellar black holes with the seed of the supermassive black hole at the centre will produce gravitational waves which we expect to see and study with current and future detectors," explain the researchers. In particular, the gravitational waves emitted in the initial phases, when the central black hole seed is still small, will be identifiable by the current detectors like Advanced LIGO/Virgo and fully characterisable by the future Einstein Telescope. The subsequent development phases of the supermassive black hole could be investigated thanks to the future detector LISA, which will be launched in space around 2034. In this way, explain Boco and Lapi, "the process we propose can be validated in its different phases, in a complementary way, by future gravitational wave detectors."
"This research" concludes Andrea Lapi, coordinator of the Astrophysics and Cosmology group of SISSA, "shows how the students and researchers of our group are fully approaching the new frontier of gravitational waves and multi-messenger astronomy. In particular, our main goal will be to develop theoretical models, like that devised in this case, which serve to capitalise on the information originating from the experiments of current and future gravitational waves, thereby hopefully providing solutions for unresolved issues connected with astrophysics, cosmology and fundamental physics."




Explore further
Supermassive black hole at the center of our galaxy may have a friend



[b]More information:[/b] L. Boco et al, Growth of Supermassive Black Hole Seeds in ETG Star-forming Progenitors: Multiple Merging of Stellar Compact Remnants via Gaseous Dynamical Friction and Gravitational-wave Emission, The Astrophysical Journal (2020). DOI: 10.3847/1538-4357/ab7446
[b]Journal information:[/b] Astrophysical Journal 

Provided by International School of Advanced Studies (SISSA)





Quote:I just thought this subject up after a beer and a toke on a lark of a joke.
fun-da-mental physics LilD
For astrophysicists, the formation of these cosmic monsters in such a short time is a real scientific headache, which raises important questions on the current knowledge of the development of these celestial bodies.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#10
Quote: Wrote:I just thought this subject up after a beer and a toke on a lark of a joke.
fun-da-mental physics [Image: lilD.gif]


Quote:Moving forward, the NUS team is endeavouring to develop efficient circuits that mimic functions of the human brain.
eye endeavour to never end ever the quest for gnosis.




the singularity and the planck length.
they still must vector to an ideological x, y, z coordinate system regardless of nano-scale.

      so when the universe was possibly a single nano-point my point is that even in the macrocosm things in the relative microcosm of the nano now knows:
      that scale might be a sign superimposed like on a snake a sang song or just plain ol' all ma'at @ that.

If there is X, Y, Z, why is there only duality?
[Image: corner_reflectors.png]
Symmetry in mirror form is chiral, yet when a retro-reflector with three facets instead of one is assumed, can "TIME" be reflected??? refracted???


On the scale of things and all things being equal a nano-surface today is no bigger than our initial size before the big bang.

and symmetry breaks again...   so whatz up with that?





Arrow

MARCH 24, 2020
Scientists invent symmetry-breaking for the first time in a nanoscale device that can mimic human brain
[Image: 1-nusscientist.jpg]Professor Venkatesan (left) discussing the charge disproportionation mechanism with Dr Sreetosh Goswami (right). Credit: National University of Singapore
Over the last decade, artificial intelligence (AI) and its applications such as machine learning have gained pace to revolutionize many industries. As the world gathers more data, the computing power of hardware systems needs to grow in tandem. Unfortunately, we are facing a future where we will not be able to generate enough energy to power our computational needs.

"We hear a lot of predictions about AI ushering in the fourth industrial revolution. It is important for us to understand that the computing platforms of today will not be able to sustain at-scale implementations of AI doink-head on massive datasets. It is clear that we will have to rethink our approaches to computation on all levels: materials, devices and architecture. We are proud to present an update on two fronts in this work: materials and devices. Fundamentally, the devices we are demonstrating are a million times more power efficient than what exists today," shared Professor Thirumalai Venky Venkatesan, the lead Principal Investigator of this project who is from the National University of Singapore (NUS).
In a paper published in Nature Nanotechnology on 23 March 2020, the researchers from the NUS Nanoscience and Nanotechnology Initiative (NUSNNI) reported the invention of a nanoscale device based on a unique material platform that can achieve optimal digital in-memory computing while being extremely energy efficient. The invention is also highly reproducible and durable, unlike conventional organic electronic devices.
The molecular system that is key to this invention is a brainchild of Professor Sreebrata Goswami of the Indian Association for Cultivation of Science in Kolkata, India. "We have been working on this family of molecules of redox active ligands over the last 40 years. Based on the success with one of our molecular systems in making a memory device that was reported in the journal Nature Materials in 2017, we decided to redesign our molecule with a new pincer ligand. This is a rational de novo design strategy to engineer a molecule that can act as an electron sponge," said Professor Goswami.
Dr. Sreetosh Goswami, the key architect of this paper who used to be a graduate student of Professor Venkatesan and now a research fellow at NUSNNI, said, "The main finding of this paper is charge disproportionation or electronic symmetry breaking. Traditionally, this has been one of those phenomena in physics which holds great promise but fails to translate to the real world as it only occurs at specific conditions, such as high or low temperature, or high pressure."

"We are able to achieve this elusive charge disproportionation in our devices, and modulate it using electric fields at room temperature. Physicists have been trying to do the same for 50 years. Our ability to realise this phenomenon in nano-scale results in a multifunctional device that can operate both as a memristor or a memcapacitor or even both concomitantly," Dr. Sreetosh explained.
"The complex intermolecular and ionic interactions in these molecular systems offer this unique charge disproportionation mechanism. We are thankful to Professor Damien Thompson at the University of Limerick who modelled the interactions between the molecules and generated insights that allow us to tweak these molecular systems in many ways to further engineer new functionalities," said Prof Goswami.
"We believe we are only scratching the surface of what is possible with this class of materials," added Professor Venkatesan. "Recently, Dr. Sreetosh has discovered that he can drive these devices to self-oscillate or even exhibit purely unstable, chaotic regime. This is very close to replicating how our human brain functions."
"Computer scientists now recognise that our brain is the most energy efficient, intelligent and fault-tolerant computing system in existence. Being able to emulate the brain's best properties while running millions of times faster will change the face of computing as we know it. In discussions with my longtime friend and collaborator Professor Stan Williams from Texas A&M University (who is a co-author in this paper), I realise that our organic molecular system might eventually be able to outperform all the oxide and 'ovonic' materials demonstrated to date," he concluded.
Moving forward, the NUS team is endeavouring to develop efficient circuits that mimic functions of the human brain.




Explore further
Researchers discover unusual 'quasiparticle' in common 2-D material



[b]More information:[/b] Sreetosh Goswami et al. Charge disproportionate molecular redox for discrete memristive and memcapacitive switching, Nature Nanotechnology (2020). DOI: 10.1038/s41565-020-0653-1
[b]Journal information:[/b] Nature Nanotechnology  Nature Materials

https://phys.org/news/2020-03-scientists...mimic.html



Beer Reefer
speaking of mimicking the human brain eye hope this thread mimics a beer and a toke on a lark of a joke to the algo-rythmic A.I. that has to read this itza.
recall:
Quote: Wrote:Triality?

Can matter and anti-matter recombine as trine?

A hybrid state of physicality.

unconventional superconductors
 
MARCH 23, 2020 FEATURE
Evidence for broken time-reversal symmetry in a topological superconductor

 unconventional superconductors manifest.Arrow https://phys.org/news/2020-03-scientists...rites.html
MARCH 24, 2020
Scientists observe superconductivity in meteorites
[Image: 5-scientistsob.jpg]Artistic rendition of a piece of the Mundrabilla meteorite over a protoplanetary nebula; Mundrabilla over Galaxy 4. Credit: James Wampler, UC San Diego (Lens flare from: https://shutr.bz/3bpa4LV; Galactic disc from L. Calcada/ESO: https://bit.ly/2Uv6vNt https://bit.ly/2QGjzyC; chunk of Mundrabilla, image by James Wampler)
Scientists at UC San Diego and Brookhaven Laboratory in New York went searching for superconducting materials where researchers have had little luck before. Setting their sights on a diverse population of meteorites, they investigated 15 pieces of comets and asteroids to find "Mundrabilla" and "GRA 95205"—two meteorites with superconductive grains.

While meteorites—due to their extreme origins in space—present researchers with a wide variety of material phases from the oldest states of the solar system, they also present detection challenges because of the potentially minute measurability of the phases. The research team overcame this challenge using an ultrasensitive measurement technique called magnetic field modulated microwave spectroscopy (MFMMS). Details of their work are published in Proceedings of the National Academy of Sciences (PNAS).
In their paper, UC San Diego researchers Mark Thiemens, Ivan Schuller and James Wampler, along with Brookhaven Lab's Shaobo Cheng and Yimei Zhu, characterize the meteorites' phases as alloys of lead, tin and indium (the softest non-alkali metal). They say their findings could impact the understanding of several astronomical environments, noting that superconducting particles in cold environments could affect planet formation, shape and origin of magnetic fields, dynamo effects, motion of charged particles and more.
"Naturally occurring superconductive materials are unusual, but they are particularly significant because these materials could be superconducting in extraterrestrial environments," said Wampler, a postdoctoral researcher in the Schuller Nanoscience Group and the paper's first author.

[Image: 6-scientistsob.jpg]
Superconductive grains were found in this piece of the Mundrabilla meteorite, the first identification of extraterrestrial superconductive grains. Credit: James Wampler
Schuller, a distinguished professor in the Department of Physics with expertise in superconductivity and neuromorphic computing, guided the methodological techniques of the study. After mitigating the detection challenge with MFMMS, the researchers subdivided and measured individual samples, enabling them to isolate the grains containing the largest superconductivity fraction. Next, the team characterized the grains with a series of scientific techniques including vibrating sample magnetometry (VSM), energy dispersive X-ray spectroscopy (EDX) and numerical methods.
"These measurements and analysis identified the likely phases as alloys of lead, indium and tin," said Wampler.
According to Thiemens, a distinguished professor of chemistry and biochemistry, meteorites with extreme formation conditions are ideal for observing exotic chemical species, such as superconductors—materials that conduct electricity or transport electrons without resistance. He noted, however, the uniqueness of superconductive materials occurring in these extraterrestrial [minor] planets.

"My part of the project was to determine which of the tens of thousands of meteorites of many classes was a good candidate and to discuss the relevance for planetary processes; one from the iron nickel core of a planet, the other from the more surficial part that has been heavily bombarded and was among the first meteorites where diamonds were observed," said Thiemens.

[Image: 7-scientistsob.jpg]
MFMMS data shows superconductivity in Mundrabilla meteorite grains at 5K. Credit: James Wampler
According to the cosmological chemist, who has a meteorite named after him—Asteroid 7004Markthiemens—Mundrabilla is an iron-sulfide-rich meteorite from a class formed after melting in asteroidal cores and cooling very slowly. GRA 95205, on the other hand, is a ureilite meteorite—a rare stony-like piece with unique mineral makeup—that underwent heavy shocks during its formation.
According to Schuller, superconductivity in natural samples is extremely unusual.
"Naturally collected materials are not phase-pure materials. Even the simplest superconducting mineral, lead, is only rarely found in its native form," Schuller explained.
The researchers agreed that they knew of only one prior report of natural superconductivity, in the mineral covellite; however, because the superconducting phases they report in the PNAS article exist in two such dissimilar meteorites, they likely exist in other meteorites as well.
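As a quick sanity reference (standard elemental critical temperatures from the literature, arranged as an assumed illustrative snippet rather than data from the PNAS paper), the alloy constituents named above all superconduct within a few kelvin of the roughly 5 K onset quoted for the Mundrabilla grains:

[code]
# Standard elemental transition temperatures (literature values); the 5 K onset is
# the figure quoted for the Mundrabilla grain measurements in the caption above.
elemental_tc_kelvin = {"Pb": 7.19, "Sn": 3.72, "In": 3.41}
observed_onset = 5.0  # K

for element, tc in elemental_tc_kelvin.items():
    print(f"{element}: Tc = {tc} K  (measured alloy onset ~ {observed_onset} K)")
[/code]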




Explore further
Meteorites lend clues to solar system's origin



[b]More information:[/b] James Wampler et al. Superconductivity found in meteorites, Proceedings of the National Academy of Sciences (2020). DOI: 10.1073/pnas.1918056117
[b]Journal information:[/b] Proceedings of the National Academy of Sciences


unconventional superconductors
https://phys.org/news/2020-03-scientists...rites.html

recall:

Quote:The tiny, wormlike creature, named Ikaria wariootia, is the earliest bilaterian, or organism with a front and back, two symmetrical sides, and openings at either end connected by a gut. The paper is published today in Proceedings of the National Academy of Sciences.

The earliest multicellular organisms, such as sponges and algal mats, had variable shapes. Collectively known as the Ediacaran Biota, this group contains the oldest fossils of complex, multicellular organisms.


Just like the body of evidence RE: On the matter of MATTER Vs. ANTI-MATTER may have or may still be variable...
The advent of chiral Symmetry as a body of evidence on triality. Arrow

The development of bilateral symmetry was a critical step in the evolution of animal life, giving organisms the ability to move purposefully and a common, yet successful way to organize their bodies. A multitude of animals, from worms to insects to dinosaurs to humans, are organized around this same basic bilaterian body plan.

Evolutionary biologists studying the genetics of modern animals predicted the oldest ancestor of all bilaterians would have been simple and small, with rudimentary sensory organs. Preserving and identifying the fossilized remains of such an animal was thought to be difficult, if not impossible.


MARCH 23, 2020
Ancestor of all animals identified in Australian fossils
[Image: ancestorofal.jpg]Artist's rendering of Ikaria wariootia. Credit: Sohail Wasif/UCR
A team led by UC Riverside geologists has discovered the first ancestor on the family tree that contains most familiar animals today, including humans.

The tiny, wormlike creature, named Ikaria wariootia, is the earliest bilaterian, or organism with a front and back, two symmetrical sides, and openings at either end connected by a gut. The paper is published today in Proceedings of the National Academy of Sciences.
The earliest multicellular organisms, such as sponges and algal mats, had variable shapes. Collectively known as the Ediacaran Biota, this group contains the oldest fossils of complex, multicellular organisms. However, most of these are not directly related to animals around today, including lily pad-shaped creatures known as Dickinsonia that lack basic features of most animals, such as a mouth or gut.
The development of bilateral symmetry was a critical step in the evolution of animal life, giving organisms the ability to move purposefully and a common, yet successful way to organize their bodies. A multitude of animals, from worms to insects to dinosaurs to humans, are organized around this same basic bilaterian body plan.
Evolutionary biologists studying the genetics of modern animals predicted the oldest ancestor of all bilaterians would have been simple and small, with rudimentary sensory organs. Preserving and identifying the fossilized remains of such an animal was thought to be difficult, if not impossible.

[Image: 2-ancestorofal.jpg]
A 3-D laser scan showing the regular, consistent shape of a cylindrical body with a distinct head and tail and faintly grooved musculature. Credit: Droser Lab/UCR
For 15 years, scientists agreed that fossilized burrows found in 555 million-year-old Ediacaran Period deposits in Nilpena, South Australia, were made by bilaterians. But there was no sign of the creature that made the burrows, leaving scientists with nothing but speculation.
Scott Evans, a recent doctoral graduate from UC Riverside, and Mary Droser, a professor of geology, noticed minuscule, oval impressions near some of these burrows. With funding from a NASA exobiology grant, they used a three-dimensional laser scanner that revealed the regular, consistent shape of a cylindrical body with a distinct head and tail and faintly grooved musculature. The animal ranged between 2-7 millimeters long and about 1-2.5 millimeters wide, with the largest the size and shape of a grain of rice—just the right size to have made the burrows.

"We thought these animals should have existed during this interval, but always understood they would be difficult to recognize," Evans said. "Once we had the 3-D scans, we knew that we had made an important discovery."
The researchers, who include Ian Hughes of UC San Diego and James Gehling of the South Australia Museum, describe Ikaria wariootia, named to acknowledge the original custodians of the land. The genus name comes from Ikara, which means "meeting place" in the Adnyamathanha language. It's the Adnyamathanha name for a grouping of mountains known in English as Wilpena Pound. The species name comes from Warioota Creek, which runs from the Flinders Ranges to Nilpena Station.

[Image: 1-ancestorofal.jpg]
Ikaria wariootia impressions in stone. Credit: Droser Lab/UCR
"Burrows of Ikaria occur lower than anything else. It's the oldest fossil we get with this type of complexity," Droser said. "Dickinsonia and other big things were probably evolutionary dead ends. We knew that we also had lots of little things and thought these might have been the early bilaterians that we were looking for."
In spite of its relatively simple shape, Ikaria was complex compared to other fossils from this period. It burrowed in thin layers of well-oxygenated sand on the ocean floor in search of organic matter, indicating rudimentary sensory abilities. The depth and curvature of Ikaria represent clearly distinct front and rear ends, supporting the directed movement found in the burrows.
The burrows also preserve crosswise, "V"-shaped ridges, suggesting Ikaria moved by contracting muscles across its body like a worm, known as peristaltic locomotion. Evidence of sediment displacement in the burrows and signs the organism fed on buried organic matter reveal Ikaria probably had a mouth, anus, and gut.
"This is what evolutionary biologists predicted," Droser said. "It's really exciting that what we have found lines up so neatly with their prediction."




Explore further
Study sheds light on Earth's first animals



[b]More information:[/b] Scott D. Evans et al., "Discovery of the oldest bilaterian from the Ediacaran of South Australia," PNAS (2020). www.pnas.org/cgi/doi/10.1073/pnas.2001045117
[b]Journal information:[/b] Proceedings of the National Academy of Sciences

Provided by University of California - Riverside

https://phys.org/news/2020-03-ancestor-a...ssils.html


Variable is as improv was.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#11
non-local yokels?


Quote:The common feature of all these effects is the contact of particles at one point in space, which follows the simple intuition of interaction (for example, in particle theory, this comes down to interaction vertices). Hence the belief that the consequences of symmetrization can only be observed in this way. However, interaction by its very nature causes entanglement. Therefore, it is unclear what causes the observed effects and non-classical correlations: Is it an interaction in itself, or is it the inherent indistinguishability of particles?





MARCH 25, 2020
Is nonlocality inherent in all identical particles in the universe?
[Image: entanglement.jpg]Identity of particles entails their entanglement, which can also be observed in pure form without interaction. Credit: Shutter2U/Vecteezy
What is interaction, and when does it occur? Intuition suggests that the necessary condition for the interaction of independently created particles is their direct touch or contact through physical force carriers. In quantum mechanics, the result of the interaction is entanglement—the appearance of non-classical correlations in the system. It seems that quantum theory allows entanglement of independent particles without any contact. The fundamental identity of particles of the same kind is responsible for this phenomenon.

Quantum mechanics is currently the best and most accurate theory used by physicists to describe the world around us. Its characteristic feature, however, is its abstract mathematical language, which notoriously leads to serious interpretational problems. The view of reality proposed by this theory is still a subject of scientific dispute that, over time, is only becoming hotter and more interesting. New research motivation and intriguing questions are brought forth by a fresh perspective resulting from the standpoint of quantum information and the enormous progress of experimental techniques. These allow verification of the conclusions drawn from subtle thought experiments directly related to the problem of interpretation. Moreover, researchers are now making enormous progress in the field of quantum communication and quantum computer technology, which significantly draws on non-classical resources offered by quantum mechanics.
Pawel Blasiak from the Institute of Nuclear Physics of the Polish Academy of Sciences in Krakow and Marcin Markiewicz from the University of Gdansk focus on analyzing widely accepted paradigms and theoretical concepts regarding the basics and interpretation of quantum mechanics. The researchers are trying to determine to what extent the intuitions used to describe quantum mechanical processes are justified in a realistic view of the world. For this purpose, they try to clarify specific theoretical ideas, often functioning in the form of vague intuitions, using the language of mathematics. This approach often results in the appearance of inspiring paradoxes. Of course, the more basic the concept to which a given paradox relates, the better, because it opens up new doors to deeper understanding a given problem.
In this spirit, both scientists considered the fundamental question: What is interaction, and when does it occur? In quantum mechanics, the result of interaction is entanglement, which is the appearance of non-classical correlations in the system. Imagine two particles created independently in distant galaxies. It would seem that a necessary condition for the emergence of entanglement is the requirement that at some point in their evolution, the particles touch one another, or at least that indirect contact should take place through another particle or physical field to convey the interaction. How else can they establish the mysterious bond of quantum entanglement? Paradoxically, however, it turns out that this is possible. Quantum mechanics allows entanglement to occur without the need for any contact, even indirect.

To justify such a surprising conclusion requires a scheme in which the particles show non-local correlations at a distance (in a Bell-type experiment). The subtlety of this approach is to exclude the possibility of an interaction understood as some form of contact along the way. Such a scheme should also be economical, so it must exclude the presence of force carriers that could mediate this interaction, including a physical field or intermediate particles. Blasiak and Markiewicz showed how this can be done by starting from the original considerations of Yurke and Stoler, which they reinterpreted as a permutation of paths traversed by the particles from different sources. This new perspective allows the generation of any entangled states of two and three particles, avoiding any contact. The proposed approach can easily be extended to more particles.
How is it possible to entangle independent particles at a distance without their interaction? One hint is suggested by quantum mechanics itself, in which the identity—the fundamental indistinguishability of all particles of the same kind—is postulated. This means, for example, that all photons (as well as other families of elementary particles) in the entire universe are the same, regardless of their distance. From a formal perspective, this boils down to symmetrization of the wave function for bosons or its antisymmetrization for fermions.
Effects of particle identity are usually associated with their statistics having consequences for a description of interacting multi-particle systems (such as Bose-Einstein condensates or solid-state band theory). In the case of simpler systems, the direct result of particle identity is the Pauli exclusion principle for fermions or bunching in quantum optics for bosons. The common feature of all these effects is the contact of particles at one point in space, which follows the simple intuition of interaction (for example, in particle theory, this comes down to interaction vertices). Hence the belief that the consequences of symmetrization can only be observed in this way. However, interaction by its very nature causes entanglement. Therefore, it is unclear what causes the observed effects and non-classical correlations: Is it an interaction in itself, or is it the inherent indistinguishability of particles? The scheme proposed by the scientists bypasses this difficulty, eliminating interaction that could occur through any contact. Hence, the conclusion that non-classical correlations are a direct consequence of the postulate of particle identity. It follows that a way exists for purely activating entanglement from their fundamental indistinguishability.
This type of view, starting from questions about the basics of quantum mechanics, can be practically applied to generate entangled states for quantum technologies. The article shows how to create any entangled state of two and three qubits, and these ideas are already implemented experimentally. It seems that the considered schemes can be successfully extended to create any entangled many-particle states. As part of further research, the scientists intend to analyze in detail the postulate of identical particles, both from the standpoint of theoretical interpretation and practical applications.
Surprisingly, the postulate of the indistinguishability of particles is not only a formal mathematical procedure, but in its pure form, leads to the consequences observed in laboratories. Is nonlocality inherent in all identical particles in the universe? The photon emitted by the monitor screen and the photon from the distant galaxy at the depths of the universe seem to be entangled only by their identical nature. This is a great mystery that science will soon confront.




Explore further
How to use entanglement for long-distance or free-space quantum communication



[b]More information:[/b] Pawel Blasiak et al, Entangling three qubits without ever touching, Scientific Reports (2019). DOI: 10.1038/s41598-019-55137-3
[b]Journal information:[/b] Scientific Reports
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#12
MARCH 31, 2020
Physicists weigh in on the origin of heavy elements
by Savannah Mitchem, Argonne National Laboratory
[Image: argonneandce.jpg]A look inside the ISOLDE Solenoid Spectrometer at CERN. Credit: Argonne National Laboratory
A long-held mystery in the field of nuclear physics is why the universe is composed of the specific materials we see around us. In other words, why is it made of "this" stuff and not other stuff?

Specifically of interest are the physical processes responsible for producing heavy elements—like gold, platinum and uranium—that are thought to happen during neutron star mergers and explosive stellar events.
Scientists from the U.S. Department of Energy's (DOE) Argonne National Laboratory led an international nuclear physics experiment conducted at CERN, the European Organization for Nuclear Research, that utilizes novel techniques developed at Argonne to study the nature and origin of heavy elements in the universe. The study may provide critical insights into the processes that work together to create the exotic nuclei, and it will inform models of stellar events and the early universe.
The nuclear physicists in the collaboration are the first to observe the neutron-shell structure of a nucleus with fewer protons than lead and more than 126 neutrons — "magic numbers" in the field of nuclear physics.
At these magic numbers, of which 8, 20, 28, 50 and 126 are canonical values, nuclei have enhanced stability, much as the noble gases do with closed electron shells. Nuclei with neutrons above the magic number of 126 are largely unexplored because they are difficult to produce. Knowledge of their behavior is crucial for understanding the rapid neutron-capture process, or r-process, that produces many of the heavy elements in the universe.
The r-process is thought to occur in extreme stellar conditions such as neutron-star mergers or supernovae. These neutron-rich environments are where nuclei can rapidly grow, capturing neutrons to produce new and heavier elements before they have a chance to decay.
This experiment focused on the mercury isotope 207Hg. The study of 207Hg could shed light on the properties of its close neighbors, nuclei directly involved in key aspects of the r-process.
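A quick bit of bookkeeping (illustrative only; the helper function and labels are mine, the isotope numbers are from the article) shows why 207Hg sits in such interesting territory: mercury has two protons fewer than lead, and 207Hg carries one neutron beyond the magic number 126.
[code]
# Illustrative bookkeeping only (helper names are mine, not from the paper):
# locate 206Hg, 207Hg and 208Pb relative to the Z = 82 and N = 126 shell closures.
NUCLEAR_MAGIC = {2, 8, 20, 28, 50, 82, 126}

def neutron_number(mass_number, proton_number):
    """N = A - Z."""
    return mass_number - proton_number

for name, A, Z in [("206Hg", 206, 80), ("207Hg", 207, 80), ("208Pb", 208, 82)]:
    N = neutron_number(A, Z)
    z_note = "magic" if Z in NUCLEAR_MAGIC else "two below lead's 82"
    n_note = "magic" if N in NUCLEAR_MAGIC else ("one beyond 126" if N > 126 else "below 126")
    print(f"{name}: Z = {Z} ({z_note}), N = {N} ({n_note})")
# 207Hg has Z = 80 (two protons below lead) and N = 127 (one neutron beyond the
# magic number 126): exactly the barely explored corner the experiment opens up.
[/code]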
"One of the biggest questions of this century has been how the elements formed at the beginning of the universe," said Argonne physicist Ben Kay, the lead scientist on the study. "It's difficult to research because we can't just go dig up a supernova out of the earth, so we have to create these extreme environments and study the reactions that occur in them."

To study the structure of 207Hg, the researchers first used the HIE-ISOLDE facility at CERN in Geneva, Switzerland. A high-energy beam of protons was fired at a molten lead target, with the resulting collisions producing hundreds of exotic and radioactive isotopes.
They then separated 206Hg nuclei from the other fragments and used CERN's HIE-ISOLDE accelerator to create a beam of these nuclei with the highest energy ever achieved at that facility, before focusing the beam onto a deuterium target inside the new ISOLDE Solenoidal Spectrometer (ISS).
"No other facility can make mercury beams of this mass and accelerate them to these energies," said Kay. "This, coupled with the outstanding resolving power of the ISS, allowed us to observe the spectrum of excited states in 207Hg for the first time."
The ISS is a newly-developed magnetic spectrometer that the nuclear physicists used to detect instances of 206Hg nuclei capturing a neutron and becoming 207Hg. The spectrometer's solenoidal magnet is a recycled 4-Tesla superconducting MRI magnet from a hospital in Australia. It was moved to CERN and installed at ISOLDE, thanks to a UK-led collaboration between University of Liverpool, University of Manchester, Daresbury Laboratory and collaborators from KU Leuven in Belgium.
Deuterium, a rare heavy isotope of hydrogen, consists of a proton and neutron. When 206Hg captures a neutron from the deuterium target, the proton recoils. The protons emitted during these reactions travel to the detector in the ISS, and their energy and position yield key information on the structure of the nucleus and how it is bound together. These properties have a significant impact on the r-process, and the results can inform important calculations in models of nuclear astrophysics.
The ISS uses a pioneering concept suggested by Argonne distinguished fellow John Schiffer and first realized as the lab's helical orbital spectrometer, HELIOS—the instrument that inspired the development of the ISS. HELIOS has enabled exploration of nuclear properties that were once impossible to study; such measurements have been carried out at Argonne since 2008. CERN's ISOLDE facility can produce beams of nuclei that complement those that can be made at Argonne.
For the past century, nuclear physicists have been able to gather information about nuclei from the study of collisions where light ion beams hit heavy targets. However, when heavy beams hit light targets, the physics of the collision becomes distorted and more difficult to parse. Argonne's HELIOS concept was the solution to removing this distortion.
"When you've got a cannonball of a beam hitting a fragile target, the kinematics change, and the resulting spectra are compressed," said Kay. "But John Schiffer realized that when the collision occurs inside a magnet, the emitted protons travel in a spiral pattern towards the detector, and by a mathematical 'trick', this unfolds the kinematic compression, resulting in an uncompressed spectrum that reveals the underlying nuclear structure."
The first analyses of the data from the CERN experiment confirm the theoretical predictions of current nuclear models, and the team plans to study other nuclei in the region of 207Hg using these new capabilities, giving deeper insights into the unknown regions of nuclear physics and the r-process.
The results of this study were published in an article titled "First exploration of neutron shell structure below lead and beyond N = 126" on February 13 in the Physical Review Letters.







[b]More information:[/b] T. L. Tang et al, First Exploration of Neutron Shell Structure below Lead and beyond N=126, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.062502
[b]Journal information:[/b] Physical Review Letters

Provided by 
Argonne National Laboratory

https://phys.org/news/2020-03-physicists...ments.html






MARCH 31, 2020
Electron-eating neon causes star to collapse
[Image: electroneati.jpg]Figure 1: An artist’s impression shows how an imaginary neon footballfish eats away at the electrons inside a star core. Credit: Kavli IPMU
An international team of researchers has found that neon inside a certain massive star can consume the electrons in the core, a process called electron capture, which causes the star to collapse into a neutron star and produce a supernova.

The researchers were interested in studying the final fate of stars within a mass range of eight to 10 solar masses, or eight to 10 times the mass of the sun. This mass range is important because it straddles the boundary between stars massive enough to undergo a supernova explosion and form a neutron star, and less massive stars that form a white dwarf without ever becoming a supernova.
An eight- to 10-solar-mass star commonly forms a core composed of oxygen, magnesium and neon (figure 1). The core is rich in degenerate electrons, meaning there is an abundance of electrons in a dense space with high enough energy to sustain the core against gravity. Once the core density is high enough, the electrons are consumed by magnesium and then neon, which are also found inside the core. Past studies have confirmed that magnesium and neon can start eating away at the electrons once the mass of the core has grown close to Chandrasekhar's limiting mass, a process called electron capture, but there has been debate about whether electron capture can cause neutron star formation. A multi-institutional team of researchers studied the evolution of an 8.4-solar-mass star and ran computer simulations on it to find an answer.
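A back-of-the-envelope sketch of why electron capture triggers the collapse (not the paper's simulation; the approximate formula below ignores Coulomb and relativistic corrections): the Chandrasekhar mass a degenerate-electron core can support falls with the square of the electron fraction Ye, so eating electrons lowers the ceiling while the core mass stays put.
[code]
# Back-of-the-envelope sketch, not the paper's simulation: the limiting mass of a
# degenerate-electron core scales roughly as M_Ch ~ 5.83 * Ye**2 solar masses,
# where Ye is the number of electrons per nucleon (corrections ignored).
def chandrasekhar_mass(ye):
    """Approximate Chandrasekhar mass in solar masses for electron fraction ye."""
    return 5.83 * ye**2

for ye in (0.50, 0.49, 0.48):
    print(f"Ye = {ye:.2f}  ->  M_Ch ~ {chandrasekhar_mass(ye):.2f} solar masses")
# Ye = 0.50 gives ~1.46, Ye = 0.48 gives ~1.34: electron capture on Mg and Ne
# lowers Ye, the supportable mass drops below the actual core mass, and collapse begins.
[/code]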

[Image: 1-electroneati.jpg]
Figure 2: (a) A star core contains oxygen, neon, and magnesium. Once the core density becomes high enough, (b) magnesium and neon begin eating electrons and inducing a collapse. (c) Then oxygen burning is ignited and produces iron-group nuclei and free protons, which eat more and more electrons to promote further collapse of the core. (d) Finally, the collapsing core becomes a neutron star in the center, and the outer layer explodes to produce a supernova. Credit: Zha et al
Using newly updated data from Suzuki for the density- and temperature-dependent electron capture rates, they simulated the evolution of the star's core, which is supported against the star's own gravity by the pressure of degenerate electrons. As magnesium and, mainly, neon ate the electrons, the number of electrons decreased and the core rapidly shrank (Figure 2).
The electron capture also released heat. When the central density of the core exceeded 10^10 g/cm^3, oxygen in the core started to burn material in the central region, turning it into iron-group nuclei such as iron and nickel. The temperature became so hot that protons became free and escaped. The electrons then became easier for free protons and iron-group nuclei to capture, and the density was so high that the core collapsed without producing a thermonuclear explosion.
With the new electron capture rates, oxygen burning was found to take place slightly off-center. Nevertheless, the collapse formed a neutron star and caused a supernova explosion, showing that an electron-capture supernova can occur.

Figure 3: The Crab Nebula, a remnant of the supernova of 1054 (SN 1054; observed by ancient astronomers in China, Japan and the Arab world). Nomoto et al. (1982) suggested that SN 1054 could have been caused by the electron-capture supernova of a star with an initial mass of about nine times that of the sun. Credit: NASA, ESA, J. DePasquale (STScI), and R. Hurt (Caltech/IPAC)
Stars in a certain part of the eight- to 10-solar-mass range would instead form white dwarfs composed of oxygen, magnesium and neon if their envelopes are stripped by stellar-wind mass loss. If the wind mass loss is small, on the other hand, the star undergoes an electron-capture supernova, as found in their simulation.
The team suggests that the electron capture supernova could explain the properties of the supernova recorded in 1054 that formed the Crab Nebula, as proposed by Nomoto et al. in 1982 (Figure 3).
These results were published in The Astrophysical Journal on November 15, 2019.







[b]More information:[/b] Shuai Zha et al. Evolution of ONeMg Core in Super-AGB Stars toward Electron-capture Supernovae: Effects of Updated Electron-capture Rate, The Astrophysical Journal (2019). DOI: 10.3847/1538-4357/ab4b4b
[b]Journal information:[/b] Astrophysical Journal 

Provided by University of Tokyo

https://phys.org/news/2020-03-electron-e...lapse.html






MARCH 31, 2020 
A new search for axion dark matter rules out past numerical predictions
by Ingrid Fadelli , Phys.org
[Image: anewsearchfo.jpg]Credit: Braine et al.
The ADMX collaboration, a group of researchers working at universities across the U.S. and Europe, has recently performed a new search for invisible axion dark matter using a cavity haloscope and a low-noise Josephson parametric amplifier. Cavity haloscopes are sensitive instruments designed to convert axions from the galaxy's dark matter halo into detectable microwave photons. Josephson parametric amplifiers, in turn, are near-quantum-limited amplifiers used to read out extremely weak microwave signals.

In their recent paper, published in Physical Review Letters, the researchers searched for dark matter axions in the galactic halo with masses in the range 2.81–3.31 μeV. The results of this search help to rule out previous theoretical predictions, informing the future search for invisible axion dark matter.
"Our recent search was motivated by two different mysteries in physics, both of which would be solved with the detection of axion dark matter," Nick Du, researcher at the University of Washington and co-author of the recent paper, told Phys.org. "The first of these is the dark matter mystery."
Past physics studies have found evidence that what we ordinarily think of as matter only makes up approximately 15% of the total mass of the universe. The remaining 85% is thought to be composed of particles that do not absorb, emit or reflect light, and thus cannot be detected using traditional techniques for studying matter.
This non-luminous material, known as dark matter, remains one of the greatest mysteries in contemporary physics, as researchers are still unsure about whether it exists and what it is made of. While there have now been countless searches for dark matter using a variety of instruments, this mysterious material has so far never been observed or detected.

[Image: 1-anewsearchfo.jpg]
Researchers Chelsea Bartram (left) and Nicole Crisosto (right) conducting the search for axion dark matter. Credit: Braine et al.
"One possible solution to what dark matter could be comes from another field of physics, namely nuclear physics, in the form of another mystery known as the Strong CP problem," Du explained. "A popular solution to the Strong CP problem predicts the existence of a new particle known as the axion, and the properties of the axion make it a compelling candidate for dark matter. In looking axion dark matter, the ADMX collaboration hopes to solve both the Strong CP problem and the mystery as to the nature of dark matter. "
The cavity haloscope used by Du and his colleagues consists of a resonant cavity placed inside of a large magnetic field. According to theoretical predictions, dark matter axions in the galactic halo should couple to the magnetic field in the cavity and produce photons.

The number of photons produced is likely to be very small, making the resulting signal very difficult to detect. By tuning the resonant cavity inside the axion haloscope to the same frequency as the photons, however, the number of photons produced by the galactic halo can be enhanced.

[Image: 2-anewsearchfo.jpg]
Credit: Braine et al.
"In a way, our search for axion dark matter is quite similar to how a radio operates," Du said. "As the frequency of a radio is tuned, one can pick up on different radio stations. In that respect, our experiment is similar, except that we don't know the frequency of our radio station and the signal is much weaker."
The search for dark matter axions has been ongoing for several decades now, and Du and his colleagues have already carried out a number of such searches using their cavity haloscope. While they were so far unable to detect invisible axions, the results of their recent experiment rule out a range of axion masses that were previously predicted by benchmark theoretical models of axion dark matter.
"This is actually the second time our experiment has achieved this sensitivity to axion dark matter, but this time, we have tripled the range we covered in our previous study," Du said. "By achieving and expanding on this sensitivity, we've shown that in the continued search for axion dark matter, ADMX represents one of the best hopes for finding it."

[Image: 3-anewsearchfo.jpg]
Credit: Braine et al.
The observations gathered by Du and his colleagues could inform future searches for dark matter axions, while also paving the way for novel theoretical predictions. The researchers are now conducting a new search for axion dark matter at higher frequencies. If this search is also unsuccessful, they plan to continue searching for invisible axions at even larger frequencies.
"At higher frequencies, axion dark matter becomes more difficult to detect, because the conventional cavities we would use are no longer as sensitive to axions," Du said. "However, we already have some interesting prototypes in place to get around this, such as experiments that use a multi-cavity array to search to axion dark matter."







[b]More information:[/b] T. Braine et al. Extended Search for the Invisible Axion with the Axion Dark Matter Experiment, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.101303

N. Du et al. Search for Invisible Axion Dark Matter with the Axion Dark Matter Experiment, Physical Review Letters (2018). DOI: 10.1103/PhysRevLett.120.151301
[b]Journal information:[/b] Physical Review Letters

https://phys.org/news/2020-03-axiom-dark-numerical.html







Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#13
what if matter vs antimatter is  not the end all be all of annihilation?

I mean, can they, instead of self-destruct...
Re-construct?

Are they the 'recording' media and the 're-wind' option as we play this faster-forward in real time?

Is Dark Matter and also Dark Energy a dual-set of factors unknown or a third set of options?

Triality.


Duel  Duel  Dual 

Matter Sheep Anti-Matter 

Now possibly of a third kind.

A Quasi-state of the possible.

Be three to see. Arrow improv manifest:
Quote: Wrote: Wrote:I just thought this subject up after a beer and a toke on a lark of a joke.
fun-da-mental physics [Image: lilD.gif]

Quote:The breakthrough by the researchers revealed that a symmetry that exists within the core of the atom is not as fundamental as scientists have believed. Doh


"Strontium-73 and bromine-73 should appear identical in structure, but surprisingly do not, we found. Probing symmetries that exist in nature is a very powerful tool for physicists. When symmetries break down, that tells us something's wrong in our understanding, and we need to take a closer look," Rogers said.
What the scientists saw will challenge nuclear theory, according to Daniel Hoff, a UMass Lowell research associate who was the lead author of the article published in Nature.

"Comparing strontium-73 and bromine-73 nuclei was like looking in a mirror and not recognizing yourself. 

[Image: corner_reflectors.png]
Once we convinced ourselves that what we were seeing was real, we were very excited," Hoff said.


Hmmmn.
After the article below i don't sound like a beer and toke lark of joke quack  youareaduck

APRIL 1, 2020
Researchers test the way we understand forces in the universe

[Image: discoverybyu.jpg]UMass Lowell Physics Assistant Prof. Andrew Rogers. Credit: UMass Lowell
A discovery by a team of researchers led by UMass Lowell nuclear physicists could change how atoms are understood by scientists and help explain extreme phenomena in outer space.

The breakthrough by the researchers revealed that a symmetry that exists within the core of the atom is not as fundamental as scientists have believed. The discovery sheds light on the forces at work within the atoms' nucleus, opening the door to a greater understanding of the universe. The findings were published today in Nature, one of the world's premier scientific journals.
The discovery was made when the UMass Lowell-led team was working to determine how atomic nuclei are created in X-ray bursts—explosions that happen on the surface of neutron stars, which are the remnants of massive stars at the end of their life.
"We are studying what happens inside the nuclei of these atoms to better understand these cosmic phenomena and, ultimately, to answer one of the biggest questions in science—how the chemical elements are created in the universe," said Andrew Rogers, UMass Lowell assistant professor of physics, who heads the research team.
The research is supported by a grant from the U.S. Department of Energy to UMass Lowell and was conducted at the National Superconducting Cyclotron Laboratory (NSCL) at Michigan State University. At the lab, scientists create exotic atomic nuclei to measure their properties in order to understand their role as the building blocks of matter, the cosmos and of life itself.
Atoms are some of the smallest units of matter. Each atom includes electrons orbiting around a tiny nucleus deep within its core, which contains almost all its mass and energy. Atomic nuclei are composed of two nearly identical particles: charged protons and uncharged neutrons. The number of protons in a nucleus determines which element the atom belongs to on the periodic table and thus its chemistry. Isotopes of an element have the same number of protons but a different number of neutrons.
At the NSCL, nuclei were accelerated to near the speed of light and smashed apart into fragments, creating strontium-73—a rare isotope that is not found naturally on Earth but can exist for short periods of time during violent thermonuclear X-ray bursts on the surface of neutron stars. This isotope of strontium contains 38 protons and 35 neutrons and lives for only a fraction of a second.

Working around the clock over eight days, the team created more than 400 strontium-73 nuclei and compared them to the known properties of bromine-73, an isotope that contains 35 protons and 38 neutrons. With their numbers of protons and neutrons interchanged, bromine-73 nuclei are considered "mirror partners" to strontium-73 nuclei. Mirror symmetry in nuclei exists because of the similarities between protons and neutrons and underlies scientists' understanding of nuclear physics.
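A tiny bookkeeping sketch of what "mirror partners" means (illustrative only; the helper function is mine): swap the proton and neutron numbers and the mass number stays at 73.
[code]
# Tiny illustration (helper is mine): "mirror partners" have Z and N swapped
# while the mass number A = Z + N stays the same.
def mirror_nucleus(protons, neutrons):
    return neutrons, protons

sr73 = (38, 35)                  # strontium-73: Z = 38, N = 35
br73 = mirror_nucleus(*sr73)     # (35, 38): bromine-73
assert sum(sr73) == sum(br73) == 73
print(f"73Sr (Z, N) = {sr73}  <->  73Br (Z, N) = {br73}")
# If the strong force treated protons and neutrons identically (and Coulomb
# effects were negligible), the two should show essentially the same structure;
# the experiment found that they do not.
[/code]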
Roughly every half-hour, the researchers created one strontium-73 nucleus, transported it through the NSCL's isotope separator and then brought the nucleus to a stop at the center of a complex detector array where they could observe its behavior. By studying the radioactive decay of these nuclei, the scientists found that strontium-73 behaved entirely differently from bromine-73. The discovery raises new questions about nuclear forces, according to Rogers.
"Strontium-73 and bromine-73 should appear identical in structure, but surprisingly do not, we found. Probing symmetries that exist in nature is a very powerful tool for physicists. When symmetries break down, that tells us something's wrong in our understanding, and we need to take a closer look," Rogers said.
What the scientists saw will challenge nuclear theory, according to Daniel Hoff, a UMass Lowell research associate who was the lead author of the article published in Nature.
"Comparing strontium-73 and bromine-73 nuclei was like looking in a mirror and not recognizing yourself. Once we convinced ourselves that what we were seeing was real, we were very excited," Hoff said.
Along with Rogers, a Somerville resident, and Hoff of Medford, the UMass Lowell team included Physics Department faculty members Assistant Prof. Peter Bender, Emeritus Prof. C.J. Lister and former UMass Lowell research associate Chris Morse. Physics graduate students Emery Doucet of Mason, N.H., and Sanjanee Waniganeththi of Lowell also contributed to the project.
As part of the team's study, state-of-the-art theoretical calculations were carried out by Simin Wang, a research associate at Michigan State, and directed by Witold Nazarewicz, MSU's John A. Hannah Distinguished Professor of Physics and chief scientist at the Facility for Rare Isotope Beams (FRIB), which will open next year.
The researchers' work "offers unique insights into the structure of rare isotopes," Nazarewicz said. "But much still remains to be done. New facilities coming online, such as FRIB at MSU, will provide missing clues into a deeper understanding of the mirror symmetry puzzle. I am glad that the exotic beams delivered by our facility, unique instrumentation and theoretical calculations could contribute to this magnificent work."
Plans for more experiments are already underway, as the researchers seek to refine and confirm their observations and study these isotopes further.







[b]More information:[/b] D. E. M. Hoff et al, Mirror-symmetry violation in bound nuclear ground states, Nature (2020). DOI: 10.1038/s41586-020-2123-1
[b]Journal information:[/b] Nature

Provided by [url=https://phys.org/partners/university-of-massachusetts-lowell/]University of Massachusetts Lowell[/url]




Arrow B32C

Triality.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#14
Since the arrival of quantum mechanics and the theory of relativity, physicists have lost sleep over the incompatibility of these three concepts (three, since there are two theories of relativity: special and general).

Be three to see. [Image: arrow.png] improv manifest:
Quote: Wrote:I just thought this subject up after a beer and a toke on a lark of a joke.
fun-da-mental physics [Image: lilD.gif]

"We noticed, incidentally, the possibility of an interesting interpretation of the role of individual dimensions. In the system that looks superluminal to the observer some space-time dimensions seem to change their physical roles. Only one dimension of superluminal light has a spatial character—the one along which the particle moves. The other three dimensions appear to be time dimensions," says Dr. Dragan.
APRIL 2, 2020
Does relativity lie at the source of quantum exoticism?

[Image: doesrelativi.jpg]The evolution of probabilities and the "impossible" phenomena of quantum mechanics may have their origins in the special theory of relativity, as suggested by physicists from universities in Warsaw and Oxford. Credit: FUW
Since its beginnings, quantum mechanics hasn't ceased to amaze us with its peculiarity, so difficult to understand. Why does one particle seem to pass through two slits simultaneously? Why, instead of specific predictions, can we only talk about evolution of probabilities? According to theorists from universities in Warsaw and Oxford, the most important features of the quantum world may result from the special theory of relativity, which until now seemed to have little to do with quantum mechanics.

Since the arrival of quantum mechanics and the theory of relativity, physicists have lost sleep over the incompatibility of these three concepts (three, since there are two theories of relativity: special and general). It has commonly been accepted that it is the description of quantum mechanics that is the more fundamental and that the theory of relativity that will have to be adjusted to it. Dr. Andrzej Dragan from the Faculty of Physics, University of Warsaw (FUW) and Prof. Artur Ekert from the University of Oxford (UO) have just presented their reasoning leading to a different conclusion. In the article "The Quantum Principle of Relativity," published in the New Journal of Physics, they prove that the features of quantum mechanics determining its uniqueness and its non-intuitive exoticism—accepted, what's more, on faith (as axioms)—can be explained within the framework of the special theory of relativity. One only has to decide on a certain rather unorthodox step.
Albert Einstein based the special theory of relativity on two postulates. The first is known as the Galilean principle of relativity (which, please note, is a special case of the Copernican principle). This states that physics is the same in every inertial system (i.e., one that is either at rest or in a steady straight line motion). The second postulate, formulated on the result of the famous Michelson-Morley experiment, imposed the requirement of a constant velocity of light in every reference system.
"Einstein considered the second postulate to be crucial. In reality, what is crucial is the principle of relativity. Already in 1910 Vladimir Ignatowski showed that based only on this principle it is possible to reconstruct all relativistic phenomena of the special theory of relativity. A strikingly simple reasoning, leading directly from the principle of relativity to relativism, was also presented in 1992 by Professor Andrzej Szymacha from our faculty," says Dr. Dragan.
The special theory of relativity is a coherent structure that allows for three mathematically correct types of solutions: a world of particles moving at subluminal velocities, a world of particles moving at the velocity of light and a world of particles moving at superluminal velocities. This third option has always been rejected as having nothing to do with reality.
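One well-known place where those three families already show up is the Lorentz factor gamma = 1/sqrt(1 - v^2/c^2), which is real below c, divergent at c, and imaginary above c. A minimal sketch, purely illustrative and not taken from the paper:
[code]
# Purely illustrative (not from the paper): the three families of solutions
# already show up in the Lorentz factor gamma = 1/sqrt(1 - v^2/c^2).
import cmath

def lorentz_gamma(v, c=1.0):
    return 1.0 / cmath.sqrt(1.0 - (v / c) ** 2)

for v in (0.5, 0.99, 2.0):
    g = lorentz_gamma(v)
    kind = "real (subluminal)" if abs(g.imag) < 1e-12 else "imaginary (superluminal)"
    print(f"v = {v:4.2f} c   gamma = {g:.3f}   {kind}")
# v = 0.50 c gives gamma ~ 1.155, v = 0.99 c gives ~ 7.089, while v = 2.00 c gives
# a purely imaginary gamma (magnitude ~0.577): the branch usually discarded as unphysical.
[/code]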

"We posed the question: what happens if—for the time being without entering into the physicality or non-physicality of the solutions—we take seriously not part of the special theory of relativity, but all of it, together with the superluminal system? We expected cause-effect paradoxes. Meanwhile, we saw exactly those effects that form the deepest core of quantum mechanics," say Dr. Dragan and Prof. Ekert.
Initially, both theorists considered a simplified case: space-time with all three families of solutions, but consisting of only one spatial and one time dimension (1+1). A particle at rest in one system of solutions seems to move superluminally in the other, which means that superluminosity itself is relative.
In a space-time continuum constructed this way, non-deterministic events occur naturally. If in one system a superluminal particle, even a completely predictable one, is generated at point A and emitted towards point B, where there is simply no information about the reasons for the emission, then from the point of view of an observer in the second system events run from point B to point A: they start from a completely unpredictable event. It turns out that analogous effects also appear in the case of subluminal particle emissions.
Both theorists have also shown that after taking into account superluminal solutions, the motion of a particle on multiple trajectories simultaneously appears naturally, and a description of the course of events requires the introduction of a sum of combined amplitudes of probability that indicate the existence of superposition of states, a phenomenon thus far associated only with quantum mechanics.
In the case of space-time with three spatial dimensions and one time dimension (3+1), that is, corresponding to our physical reality, the situation is more complicated. The principle of relativity in its original form is not preserved—the subluminal and superluminal systems are distinguishable. However, the researchers noticed that when the principle of relativity is modified to the form: "The ability to describe an event in a local and deterministic way should not depend on the choice of an inertial reference system," it limits the solutions to those in which all the conclusions from the consideration in (1+1) space-time remain valid.
"We noticed, incidentally, the possibility of an interesting interpretation of the role of individual dimensions. In the system that looks superluminal to the observer some space-time dimensions seem to change their physical roles. 


Only one dimension of superluminal light has a spatial character—the one along which the particle moves.  Arrow

The other three dimensions appear to be time dimensions," says Dr. Dragan.

Quote:[b]If three or more objects move around each other, history cannot be reversed.[/b][b] Naughty https://phys.org/news/2020-03-symmetry-l...ysics.html[/b]

A characteristic feature of spatial dimensions is that a particle can move in any direction or remain at rest, while in a time dimension it always propagates in one direction (what we call aging in everyday language). So, three time dimensions of the superluminal system with one spatial dimension (1+3) would thus mean that particles inevitably age in three times simultaneously. The ageing process of a particle in a superluminal system (1+3), observed from a subluminal system (3+1), would look as if the particle was moving like a spherical wave, leading to the famous Huygens principle (every point on a wavefront can be treated itself as a source of a new spherical wave) and corpuscular-wave dualism.
"All the strangeness that appears when considering solutions relating to a system that looks superluminal turns out to be no stranger than what commonly accepted and experimentally verified quantum theory has long been saying. On the contrary, taking into account a superluminal system, it is possible—at least theoretically—to derive some of the postulates of quantum mechanics from the special theory of relativity, which were usually accepted as not resulting from other, more fundamental reasons," Dr. Dragan concludes.



For almost a hundred years quantum mechanics has been awaiting a deeper theory to explain the nature of its mysterious phenomena. If the reasoning presented by the physicists from FUW and UO stands the test of time, history would cruelly mock all physicists.



Quote: Wrote:Instantly ancient.

This idea anew.

If three or more objects move around each other, history cannot be reversed.

 The "unknown" theory sought for decades, explaining the uniqueness of quantum mechanics, would be something already known from the very first work on quantum theory.



[b]More information:[/b] Andrzej Dragan et al, Quantum principle of relativity, New Journal of Physics (2020). DOI: 10.1088/1367-2630/ab76f7
[b]Journal information:[/b] New Journal of Physics

Provided by [url=https://phys.org/partners/university-of-warsaw/]University of Warsaw[/url]

https://phys.org/news/2020-04-relativity...icism.html

[Image: arrow.png] B32C


Triality.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#15
Rethinking cosmology: Universe expansion may not be uniform

08/04/2020




Astronomers have assumed for decades that the Universe is expanding at the same rate in all directions. A new study based on data from ESA’s XMM-Newton, NASA’s Chandra and the German-led ROSAT X-ray observatories suggests this key premise of cosmology might be wrong.

Konstantinos Migkas, a PhD researcher in astronomy and astrophysics at the University of Bonn, Germany, and his supervisor Thomas Reiprich originally set out to verify a new method that would enable astronomers to test the so-called isotropy hypothesis. According to this assumption, the Universe has, despite some local differences, the same properties in each direction on the large scale.

Widely accepted as a consequence of well-established fundamental physics, the hypothesis has been supported by observations of the cosmic microwave background (CMB). A direct remnant of the Big Bang, the CMB reflects the state of the Universe as it was in its infancy, at only 380 000 years of age. The CMB’s uniform distribution in the sky suggests that in those early days the Universe must have been expanding rapidly and at the same rate in all directions.

In today’s Universe, however, this may no longer be true.


[Image: Cosmic_expansion_measured_across_the_sky_article.jpg]

A map showing the rate of the expansion of the Universe in different directions across the sky as measured by the current study




“Together with colleagues from the University of Bonn and Harvard University, we looked at the behaviour of over 800 galaxy clusters in the present Universe,” says Konstantinos. “If the isotropy hypothesis was correct, the properties of the clusters would be uniform across the sky. But we actually saw significant differences.”

The astronomers used X-ray temperature measurements of the extremely hot gas that pervades the clusters and compared the data with how bright the clusters appear in the sky. Clusters of the same temperature and located at a similar distance should appear similarly bright. But that is not what the astronomers observed.

“We saw that clusters with the same properties, with similar temperatures, appeared to be less bright than what we would expect in one direction of the sky, and brighter than expected in another direction,” says Thomas. “The difference was quite significant, around 30 per cent. These differences are not random but have a clear pattern depending on the direction in which we observed in the sky.”
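As an order-of-magnitude sketch of why a 30 per cent brightness asymmetry matters (the study's actual fit uses the full LX–T scaling relation cited below, so the numbers here are only indicative and the helper is mine): at low redshift the flux of a cluster of known luminosity scales as the square of H0, so a flux offset maps directly onto an apparent offset in the expansion rate.
[code]
# Order-of-magnitude sketch only (the paper's fit uses the full LX-T relation):
# at low redshift a cluster's flux scales as f = L / (4*pi*d_L**2) with d_L ~ c*z/H0,
# so at fixed luminosity and redshift the flux is proportional to H0 squared.
import math

def implied_h0_ratio(flux_ratio):
    """H0 ratio implied by a flux ratio at fixed luminosity and redshift."""
    return math.sqrt(flux_ratio)

flux_asymmetry = 1.30   # ~30 per cent brightness difference between sky directions
print(f"flux ratio {flux_asymmetry:.2f} -> implied H0 ratio ~ {implied_h0_ratio(flux_asymmetry):.2f}")
# A ~30% flux asymmetry at fixed cluster properties corresponds to an apparent
# expansion-rate asymmetry of order ten per cent, large enough to matter if it
# survives with more data.
[/code]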

Before challenging the widely accepted cosmology model, which provides the basis for estimating the cluster distances, Konstantinos and colleagues first looked at other possible explanations. Perhaps, there could be undetected gas or dust clouds obscuring the view and making clusters in a certain area appear dimmer. The data, however, do not support this scenario.

In some regions of space the distribution of clusters could be affected by bulk flows, large-scale motions of matter caused by the gravitational pull of extremely massive structures such as large cluster groups. This hypothesis, however, also seems unlikely. Konstantinos adds that the findings took the team by surprise.





[Video: Rethinking cosmic expansion]
ESA’s upcoming telescope Euclid, designed to image billions of galaxies and scrutinise the expansion of the cosmos, its acceleration and the nature of dark energy, might help solve this mystery in the future.

“The findings are really interesting but the sample included in the study is still relatively small to draw such profound conclusions,” says René Laureijs, Euclid project scientist at ESA. “This is the best one could do with the available data, but if we were to really re-think the widely accepted cosmological model, we would need more data.”

And Euclid might do exactly that. The spacecraft, to be launched in 2022, might not only find evidence that dark energy is really stretching the Universe unevenly in different directions, it will also enable the scientists to gather more data on the properties of a large amount of galaxy clusters, which might support or disprove the current findings.

Further data will also come soon from the X-ray eROSITA instrument, built by the Max Planck Institute for Extraterrestrial Physics. The instrument, aboard the recently launched German-Russian satellite Spektr-RG, will conduct the first all-sky survey in medium-energy X-rays, focusing on the discovery of tens of thousands of previously unknown galaxy clusters and active galactic centres.

More information

‘Probing cosmic isotropy with a new X-ray galaxy cluster sample through the LX–T scaling relation’ by K. Migkas et al. (2020) is published in Astronomy & Astrophysics (DOI: 10.1051/0004-6361/201936602).

For further information, please contact:

Konstantinos Migkas
Argelander Institute for Astronomy
University of Bonn, Germany
Email: kmigkas@astro.uni-bonn.de

Thomas Reiprich
Argelander Institute for Astronomy
University of Bonn, Germany
Email: reiprich@astro.uni-bonn.de

Norbert Schartel
XMM-Newton project scientist
European Space Agency
Email: Norbert.Schartel@esa.int

ESA Media Relations
Email: media@esa.int



Source: http://www.esa.int/About_Us/ESAC/Rethink...be_uniform

Bob... Ninja Assimilated
"The Morning Light, No sensation to compare to this, suspended animation, state of bliss, I keep my eyes on the circling sky, tongue tied and twisted just and Earth Bound Martian I" Learning to Fly Pink Floyd [Video: https://vimeo.com/144891474]
Reply
#16
The AMS (Alpha Magnetic Spectrometer) on the ISS is scoping out 
possibilities for the existence of anti-matter.
The detection of anti-helium is said to be proof,
and so far there have been some signs of its existence.

https://ams.nasa.gov/index.html
Reply
#17
The theory of Triality on trial(all ma'at @ that)

Thanx 007 and KR!

(i.e., it could be a kind of mediator between the two forms of matter)

Improv Manifests itself.


Quote:Posted by EA - Sunday, March 8th, 2020, 03:13 am
What if matter and anti-matter don't always  immediately self-destruct but instead they immediately mediate?





LilD


[b]Link between dark and normal matter[/b]
The Z′ boson may play an interesting role in the interaction between dark and visible matter, (i.e., it could be a kind of mediator between the two forms of matter). The Z′ boson can—at least theoretically—result from the collision of electrons (matter) and positrons (anti-matter) in the SuperKEKB and then decay into invisible dark matter particles.
The Z′ boson can thus help scientists to understand the behavior of dark matter. What's more, the discovery of the Z′ boson could also explain other observations that are not consistent with the standard model, the fundamental theory of particle physics.



APRIL 7, 2020
Belle II yields the first results: In search of the Z′ boson
[Image: apstipsheetf.jpg]A computer graphic of a simulated event in which a Z' boson is produced in e+e- collisions, in association with two muons (green line and hits), and decays into invisible particles. In this figure, the Z' boson decays into an invisible neutrino and anti-neutrino, but it may also decay into a dark matter particle and its anti-particle. Credit: KEK / Belle II Collaboration
The Belle II experiment has been collecting data from physical measurements for about one year. After several years of rebuilding work, both the SuperKEKB electron–positron accelerator and the Belle II detector have been improved compared with their predecessors in order to achieve a 40-fold higher data rate.

Scientists at 12 institutes in Germany are involved in constructing and operating the detector, developing evaluation algorithms and analyzing the data. The Max Planck Institute for Physics made a substantial contribution to the development of the highly sensitive innermost detector, the Pixel Vertex Detector.
With the help of Belle II, scientists are looking for traces of new physics that could explain the unequal occurrence of matter and antimatter and the mysterious dark matter. One of the so-far undiscovered particles that the Belle II detector is looking for is the Z′ boson—a variant of the Z boson, which acts as an exchange particle for the weak interaction.
As far as we know, about 25% of the universe consists of dark matter, whereas visible matter accounts for just under 5% of the energy budget. Both forms of matter attract each other through gravity. Dark matter thus forms a kind of template for the distribution of visible matter. This can be seen, for example, in the arrangement of galaxies in the universe.
[b]Link between dark and normal matter[/b]
The Z′ boson may play an interesting role in the interaction between dark and visible matter, (i.e., it could be a kind of mediator between the two forms of matter). The Z′ boson can—at least theoretically—result from the collision of electrons (matter) and positrons (anti-matter) in the SuperKEKB and then decay into invisible dark matter particles.
The Z′ boson can thus help scientists to understand the behavior of dark matter. What's more, the discovery of the Z′ boson could also explain other observations that are not consistent with the standard model, the fundamental theory of particle physics.

[Image: belleiiyield.jpg]
Electrons and positrons collide within the Belle II detector. Credit: ill./©: Belle II
[b]Important clue: Detection of muon pairs[/b]
But how can the Z′ boson be detected in the Belle II detector? Not directly—that much is sure. Theoretical models and simulations predict that the Z′ boson could reveal itself through interactions with muons, the heavier relatives of electrons. If scientists discover an unusually high number of muon pairs of opposite charge after the electron/positron collisions as well as unexpected deviations in energy and momentum conservation, this would be an important indication of the Z′ boson.
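The kinematic quantity behind that "missing energy" can be sketched as follows (a hedged illustration, not Belle II software; the example numbers, including SuperKEKB's nominal collision energy of roughly 10.58 GeV, are assumptions for the sake of the example): in the centre-of-mass frame the system recoiling against the muon pair has recoil mass squared s - 2*sqrt(s)*E_mumu + M_mumu^2, and an invisibly decaying Z' would appear as a bump in that recoil mass.
[code]
# Hedged sketch, not Belle II software (the collision energy and the example
# numbers are assumptions for illustration): in the centre-of-mass frame the
# system recoiling against a dimuon pair has
#   M_recoil^2 = s - 2*sqrt(s)*E_mumu + M_mumu^2,
# so an invisibly decaying Z' would show up as a bump in this recoil mass.
import math

def recoil_mass(sqrt_s, e_mumu, m_mumu):
    """Recoil mass (GeV) against a dimuon system of energy e_mumu and mass m_mumu."""
    m2 = sqrt_s**2 - 2.0 * sqrt_s * e_mumu + m_mumu**2
    return math.sqrt(max(m2, 0.0))

SQRT_S = 10.58   # GeV, roughly SuperKEKB's nominal e+e- collision energy
print(f"{recoil_mass(SQRT_S, e_mumu=5.0, m_mumu=3.0):.2f} GeV")   # ~3.89 GeV recoiling invisibly
[/code]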

However, the new Belle II data has not yet provided any indication of the Z′ boson. But with the new data, the scientists can limit the mass and coupling strengths of the Z′ boson with previously unattainable accuracy.
[b]More data, more precise analyses[/b]
"Despite the still small amount of data, we can now make measurements that have never been done before," says the spokesperson of the German groups, Dr. Thomas Kuhr from the Ludwig Maximilian University of Munich. "This underlines the important role of the Belle II experiment in the study of elementary particles."
These initial results come from the analysis of a small amount of data collected during the start-up phase of SuperKEKB in 2018. Belle II went into full operation on March 25, 2019. Since then, the experiment has been collecting data while continuously improving the collision rate of electrons and positrons.
If the experiment is perfectly tuned, it will provide considerably more data than in the recently published analyses. The physicists thus hope to gain new insights into the nature of dark matter and other unanswered questions.







[b]More information:[/b] I. Adachi et al. Search for an Invisibly Decaying Z′ Boson at Belle II in e+e−→μ+μ−(e±μ∓) Plus Missing Energy Final States, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.141801
[b]Journal information:[/b] Physical Review Letters

Provided by [url=https://phys.org/partners/max-planck-society/]Max Planck Society[/url]
 







[Image: doubtsaboutb.jpg]

Rethinking cosmology: Universe expansion may not be uniform (Update)
Astronomers have assumed for decades that the Universe is expanding at the same rate in all directions. A new study based on data from ESA's XMM-Newton, NASA's Chandra and the German-led ROSAT X-ray observatories suggests ...


Thanx bob!

This thread is exactly about Rethinking cosmology.

Recall:
WE as a forum know now.
How to use 'the observer effect' for a good cause and...
a beer and a toke on a lark of a joke.

Fun-da-mental fizix for the forum 

and oh yeah.

don't gamble with improv.

Triality is as TRIPL3 THR33 Infinity was and triunity is soon two be.

pontius pilate  Sheepconscious pilot.

Letz Jefferson Starship this 

[Image: randomness.jpg]

As counter-intuitive as it seems perhaps the solution may partially be the monty hall effect.
This is one of many reasons why I say as your eye assays:

don't gamble with improv   when you can Drink a Bud Light and Smoke SUM BUD-LIGHT"

And change your mind at the last picosecond.

rather funny how an improv artist from Whose Line Is It Anyway is the new monty hall effect, eh?


Monty Hall Problem
The Monty Hall problem is named for its similarity to the Let's Make a Deal television game show hosted by Monty Hall. The problem is stated as follows. Assume that a room is equipped with three doors. Behind two are goats, and behind the third is a shiny new car. You are asked to pick a door, and will win whatever is behind it. Let's say you pick door 1. Before the door is opened, however, someone who knows what's behind the doors (Monty Hall) opens one of the other two doors, revealing a goat, and asks you if you wish to change your selection to the third door (i.e., the door which neither you picked nor he opened). The Monty Hall problem is deciding whether you do.
The correct answer is that you do want to switch. If you do not switch, you have the expected 1/3 chance of winning the car, since no matter whether you initially picked the correct door, Monty will show you a door with a goat. But after Monty has eliminated one of the doors for you, you obviously do not improve your chances of winning to better than 1/3 by sticking with your original choice. If you now switch doors, however, there is a 2/3 chance you will win the car (counterintuitive though it seems).

The above results are characteristic of the best strategy for the n-stage Monty Hall problem: stick until the last choice, then switch.
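A quick Monte Carlo check of the stick-versus-switch probabilities quoted above (a minimal sketch, written for this thread):
[code]
# Quick Monte Carlo check of the stick-vs-switch probabilities (illustrative sketch).
import random

def monty_hall(switch, doors=3, trials=100_000):
    wins = 0
    for _ in range(trials):
        car = random.randrange(doors)
        pick = random.randrange(doors)
        # Monty opens a door that is neither the contestant's pick nor the car.
        opened = random.choice([d for d in range(doors) if d not in (pick, car)])
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(doors) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

print(f"stick : {monty_hall(switch=False):.3f}")   # ~0.333
print(f"switch: {monty_hall(switch=True):.3f}")    # ~0.667
[/code]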


Posted by EA - Friday, April 3rd, 2020, 12:13 am
Since the arrival of quantum mechanics and the theory of relativity, physicists have lost sleep over the incompatibility of these three concepts (three, since there are two theories of relativity: special and general).

Be three to see.    improv manifest:

Quote: Wrote: Wrote:I just thought this subject up after a beer and a toke on a lark of a joke.
fun-da-mental physics 

"We noticed, incidentally, the possibility of an interesting interpretation of the role of individual dimensions. In the system that looks superluminal to the observer some space-time dimensions seem to change their physical roles. Only one dimension of superluminal light has a spatial character—the one along which the particle moves. The other three dimensions appear to be time dimensions," says Dr. Dragan.
APRIL 2, 2020
Does relativity lie at the source of quantum exoticism?

by University of Warsaw
[Image: doesrelativi.jpg]The evolution of probabilities and the "impossible" phenomena of quantum mechanics may have their origins in the special theory of relativity, as suggested by physicists from universities in Warsaw and Oxford. Credit: FUW
Since its beginnings, quantum mechanics hasn't ceased to amaze us with its peculiarity, so difficult to understand. Why does one particle seem to pass through two slits simultaneously? Why, instead of specific predictions, can we only talk about evolution of probabilities? According to theorists from universities in Warsaw and Oxford, the most important features of the quantum world may result from the special theory of relativity, which until now seemed to have little to do with quantum mechanics.






[Image: quantumresea.jpg]

Quantum researchers able to split one photon into three
Researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo report the first occurrence of directly splitting one photon into three.




TRIPL3 THR33 INFINITY.

Dual  Sheep   Duel
Reply
#18
The next Question is:

Is space-Time  Conic, Cubic or Tetrahedral...or all three simultaneously?



Physicists at MIT and elsewhere have observed evidence of Majorana fermions—particles that are theorized to also be their own antiparticle—on the surface of a common metal: gold


Quote:~19.471 x 99.9999999999999999999999999999... Arrow

SCIENTISTS MELT GOLD AT ROOM TEMPERATURE

BY [b]ARISTOS GEORGIOU [/b]ON 11/21/18 AT 11:07 AM EST

TECH & SCIENCEPHYSICSGOLD
Most metals have very high melting points, not least gold—which turns into a liquid at temperatures above 1,947 degrees Fahrenheit (1,064 degrees Celsius).
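(As a sanity check on the quoted figures, nothing more than the standard Celsius-to-Fahrenheit conversion:)
[code]
# Simple sanity check of the quoted melting point (unit conversion only).
def celsius_to_fahrenheit(c):
    return c * 9.0 / 5.0 + 32.0

print(f"{celsius_to_fahrenheit(1064):.0f} F")   # ~1947 F, matching the figure above
[/code]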
But now, researchers from Chalmers University of Technology in Sweden have found a way to melt gold at room temperature. The surprise finding came about while researchers were investigating gold samples using an electron microscope (EM).
Unlike optical microscopes that use visible light and a system of lenses to magnify small objects, EMs use electrons to produce images of extremely small objects. In fact, with this technique it is possible to study individual atoms.
In an experiment, Ludvig de Knoop, from Chalmers' Department of Physics, placed a small piece of gold in an electron microscope to see how an electric field influenced the gold atoms. He increased the electric field step-by-step while using the highest magnification.
"We wanted to see what happens to gold when it is under the influence of an extremely high electric field," de Knoop told Newsweek. "A known effect when applying such high electric fields on metals is that they evaporate, that is, they boil off from the solid metal."
When he studied the atoms in recordings taken from the microscope, he noticed something totally unexpected—the surface layers of gold had melted, despite being at room temperature.
"It wasn't until later, when we analyzed the data and the recorded movies, that we understood that we had witnessed something new and spectacular," he said. "The big surprise with our work that the outermost few atomic surface layers of gold melted before they evaporate. Further on, we realised that we could controllably switch the structure from surface melted back to being ordered by switching the electric field."
"This is an extraordinary phenomenon, and it gives us new, foundational knowledge of gold," he said in a statement.


https://www.newsweek.com/scientists-melt-gold-room-temperature-1226339

In particle physics, fermions are a class of elementary particles that includes electrons, protons, neutrons, and quarks, all of which make up the building blocks of matter. For the most part, these particles are considered Dirac fermions, after the English physicist Paul Dirac, who first predicted that all fermionic fundamental particles should have a counterpart, somewhere in the universe, in the form of an antiparticle—essentially, an identical twin of opposite charge.

In 1937, the Italian theoretical physicist Ettore Majorana extended Dirac's theory, predicting that among fermions, there should be some particles, since named Majorana fermions, that are indistinguishable from their antiparticles. Mysteriously, the physicist disappeared during a ferry trip off the Italian coast just a year after making his prediction. Scientists have been looking for Majorana's enigmatic particle ever since. It has been suggested, but not proven, that the neutrino may be a Majorana particle. On the other hand, theorists have predicted that Majorana fermions may also exist in solids under special conditions.


APRIL 10, 2020
First sighting of mysterious Majorana fermion on a common metal
[Image: quantum.jpg]Credit: CC0 Public Domain
Physicists at MIT and elsewhere have observed evidence of Majorana fermions—particles that are theorized to also be their own antiparticle—on the surface of a common metal: gold. This is the first sighting of Majorana fermions on a platform that can potentially be scaled up. The results, published in the Proceedings of the National Academy of Sciences, are a major step toward isolating the particles as stable, error-proof qubits for quantum computing.

In particle physics, fermions are a class of elementary particles that includes electrons, protons, neutrons, and quarks, all of which make up the building blocks of matter. For the most part, these particles are considered Dirac fermions, after the English physicist Paul Dirac, who first predicted that all fermionic fundamental particles should have a counterpart, somewhere in the universe, in the form of an antiparticle—essentially, an identical twin of opposite charge.
In 1937, the Italian theoretical physicist Ettore Majorana extended Dirac's theory, predicting that among fermions, there should be some particles, since named Majorana fermions, that are indistinguishable from their antiparticles. Mysteriously, the physicist disappeared during a ferry trip off the Italian coast just a year after making his prediction. Scientists have been looking for Majorana's enigmatic particle ever since. It has been suggested, but not proven, that the neutrino may be a Majorana particle. On the other hand, theorists have predicted that Majorana fermions may also exist in solids under special conditions.
Now the MIT-led team has observed evidence of Majorana fermions in a material system they designed and fabricated, which consists of nanowires of gold grown atop a superconducting material, vanadium, and dotted with small, ferromagnetic "islands" of europium sulfide. When the researchers scanned the surface near the islands, they saw signature signal spikes near zero energy on the very top surface of gold that, according to theory, should only be generated by pairs of Majorana fermions.
"Majorana ferminons are these exotic things, that have long been a dream to see, and we now see them in a very simple material—gold," says Jagadeesh Moodera, a senior research scientist in MIT's Department of Physics. "We've shown they are there, and stable, and easily scalable."
"The next push will be to take these objects and make them into qubits, which would be huge progress toward practical quantum computing," adds co-author Patrick Lee, the William and Emma Rogers Professor of Physics at MIT.
Lee and Moodera's coauthors include former MIT postdoc and first author Sujit Manna (currently on the faculty at the Indian Institute of Technology at Delhi), and former MIT postdoc Peng Wei of University of California at Riverside, along with Yingming Xie and Kam Tuen Law of the Hong Kong University of Science and Technology.

[b]High risk[/b]
If they could be harnessed, Majorana fermions would be ideal as qubits, or individual computational units for quantum computers. The idea is that a qubit would be made of combinations of pairs of Majorana fermions, each of which would be separated from its partner. If noise errors affect one member of the pair, the other should remain unaffected, thereby preserving the integrity of the qubit and enabling it to correctly carry out a computation.
Scientists have looked for Majorana fermions in semiconductors, the materials used in conventional, transistor-based computing. In their experiments, researchers have combined semiconductors with superconductors—materials through which electrons can travel without resistance. This combination imparts superconductive properties to conventional semiconductors, which physicists believe should induce particles in the semiconductor to split, forming the pair of Majorana fermions.
"There are several material platforms where people believe they've seen Majorana particles," Lee says. "The evidence is stronger and stronger, but it's still not 100 percent proven."
What's more, the semiconductor-based setups to date have been difficult to scale up to produce the thousands or millions of qubits needed for a practical quantum computer, because they require growing very precise crystals of semiconducting material and it is very challenging to turn these into high-quality superconductors.
About a decade ago, Lee, working with his graduate student Andrew Potter, had an idea: Perhaps physicists might be able to observe Majorana fermions in metal, a material that readily becomes superconductive in proximity with a superconductor. Scientists routinely make metals, including gold, into superconductors. Lee's idea was to see if gold's surface state—its very top layer of atoms—could be made to be superconductive. If this could be achieved, then gold could serve as a clean, atomically precise system in which researchers could observe Majorana fermions.
Lee proposed, based on Moodera's prior work with ferromagnetic insulators, that if a ferromagnetic insulator were placed atop a superconductive surface state of gold, then researchers should have a good chance of clearly seeing signatures of Majorana fermions.
"When we first proposed this, I couldn't convince a lot of experimentalists to try it, because the technology was daunting," says Lee who eventually partnered with Moodera's experimental group to to secure crucial funding from the Templeton Foundation to realize the design. "Jagadeesh and Peng really had to reinvent the wheel. It was extremely courageous to jump into this, because it's really a high-risk, but we think a high-payoff, thing."
[b]"Finding Majorana"[/b]
Over the last few years, the researchers have characterized gold's surface state and proved that it could work as a platform for observing Majorana fermions, after which the group began fabricating the setup that Lee envisioned years ago.
They first grew a sheet of superconducting vanadium, on top of which they overlaid a layer of gold nanowires measuring about 4 nanometers thick. They tested the conductivity of gold's very top layer, and found that it did, in fact, become superconductive in proximity with the vanadium. They then deposited over the gold nanowires "islands" of europium sulfide, a ferromagnetic material that is able to provide the needed internal magnetic fields to create the Majorana fermions.
The team then applied a tiny voltage and used scanning tunneling microscopy, a specialized technique that enabled the researchers to scan the energy spectrum around each island on gold's surface.
Moodera and his colleagues then looked for a very specific energy signature that only Majorana fermions should produce, if they exist. In any superconducting material, electrons travel through at certain energy ranges. There is, however, a desert, or "energy gap," where there should be no electrons. If there is a spike inside this gap, it is very likely a signature of Majorana fermions.
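To make that detection logic concrete, here is a minimal sketch (our own illustration, not the MIT group's analysis code) of how one might flag a zero-energy spike sitting inside an otherwise empty superconducting gap in a measured conductance spectrum. Every number and threshold in it is an arbitrary placeholder.
[code]
# Toy sketch (not the MIT group's analysis code): flag a zero-energy peak
# sitting inside an otherwise empty superconducting gap, the kind of
# signature the article attributes to a Majorana pair. All numbers are
# illustrative placeholders.
import numpy as np

def has_in_gap_peak(bias_mV, dIdV, gap_mV=0.5, background_fraction=0.2):
    """True if the spectrum shows a clear spike near zero bias while the
    rest of the gap stays close to zero conductance."""
    in_gap = np.abs(bias_mV) < gap_mV
    near_zero = np.abs(bias_mV) < 0.1 * gap_mV
    gap_floor = dIdV[in_gap & ~near_zero].mean()   # conductance inside the gap, away from zero
    peak_height = dIdV[near_zero].max()            # conductance right at zero bias
    outside = dIdV[~in_gap].mean()                 # normal-state background
    return peak_height > 3 * gap_floor and peak_height > background_fraction * outside

# Synthetic spectrum: flat background, empty gap, small injected zero-bias peak
bias = np.linspace(-2, 2, 801)                        # bias voltage in mV
spectrum = np.where(np.abs(bias) > 0.5, 1.0, 0.02)    # gap of width 1 mV
spectrum += 0.6 * np.exp(-(bias / 0.03) ** 2)         # the "spike"
print(has_in_gap_peak(bias, spectrum))                # True for this example
[/code]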
Looking through their data, the researchers observed spikes inside this energy gap on opposite ends of several islands along the direction of the magnetic field, which were clear signatures of pairs of Majorana fermions.
"We only see this spike on opposite sides of the island, as theory predicted," Moodera says. "Anywhere else, you don't see it."
"In my talks, I like to say that we are finding Majorana, on an island in a sea of gold," Lee adds.
Moodera says the team's setup, requiring just three layers—gold sandwiched between a ferromagnet and a superconductor—is an "easily achievable, stable system" that should also be economically scalable compared to conventional, semiconductor-based approaches to generate qubits.
"Seeing a pair of Majorana fermions is an important step toward making a qubit," Wei says. "The next step is to make a qubit from these particles, and we now have some ideas for how to go about doing this."







[b]More information:[/b] Sujit Manna et al. Signature of a pair of Majorana zero modes in superconducting gold surface states, Proceedings of the National Academy of Sciences (2020). DOI: 10.1073/pnas.1919753117
[b]Journal information:[/b] Proceedings of the National Academy of Sciences 

Provided by Massachusetts Institute of Technology 
This story is republished courtesy of MIT News ([url=http://web.mit.edu/newsoffice/]web.mit.edu/newsoffice/[/url]), a popular site that covers news about MIT research, innovation and teaching.

https://phys.org/news/2020-04-sighting-m...ommon.html




B32C  Arrow

As Feldman notes, the standard model of particle physics theorizes that there are two kinds of elementary particles—bosons and fermions. But as he also notes, the standard model describes physics in three dimensions with particles at their highest energy levels. That leaves some wiggle room for the existence of other types of quasiparticles that exist only in two dimensions. One such proposed 2-D quasiparticle is the anyon—it is not a fermion or a boson. And theory has suggested that its charge can be less than that of an electron, which makes it the smallest proposed charged quasiparticle. And they behave differently than either fermions or bosons in one particular way. Fermions avoid each other and bosons can form groups—anyons, in contrast, have been predicted to interact somewhere in between attracting and repelling. And it was this feature that lay at the heart of the work done by the team in France.

APRIL 10, 2020 REPORT
Anyon evidence observed using tiny anyon collider
by Bob Yirka, Phys.org
[Image: 5e90624ed3d01.jpg]Sample and principle of the experiment. (A) Exclusion quasiprobability p: The probability K to have two anyons exiting in the same output edge channel is modified by the factor (1 – p). (B) Principle of the experiment: The voltage V generates the currents I0 toward QPC1 and QPC2. These two QPCs, tuned in the weak-backscattering regime T1, T2 ≪ 1, act as random Poissonian sources of anyons that collide on cQPC. (C) False-colored scanning electron microscope (SEM) image of the sample. The electron gas is shown in blue and the gates in gold. Edge currents are shown as red lines (red dashed lines after partitioning). Credit: Science (2020). DOI: 10.1126/science.aaz5601
A team of researchers from Sorbonne Université and Université de Paris has reported observational evidence of a quasiparticle called an anyon. In their paper published in the journal Science, the team describes the tiny anyon collider they built in the lab and the results they obtained with it. Dmitri Feldman, of Brown University, has published a Perspective piece on the work in the same journal issue.

As Feldman notes, the standard model of particle physics theorizes that there are two kinds of elementary particles—bosons and fermions. But as he also notes, the standard model describes physics in three dimensions with particles at their highest energy levels. That leaves some wiggle room for the existence of other types of quasiparticles that exist only in two dimensions. One such proposed 2-D quasiparticle is the anyon—it is not a fermion or a boson. And theory has suggested that its charge can be less than that of an electron, which makes it the smallest proposed charged quasiparticle. And they behave differently than either fermions or bosons in one particular way. Fermions avoid each other and bosons can form groups—anyons, in contrast, have been predicted to interact somewhere in between attracting and repelling. And it was this feature that lay at the heart of the work done by the team in France.
The work involved creating a very tiny 2-D anyon collider—so small they had to use an electron microscope to observe the action inside of it. The collider consisted of a 2-D plane sandwiched between layers of another material. More specifically, the collider held a quantum Hall liquid that was kept inside a strong magnetic field. Electric charges were directed along source tunnels to quantum point contacts. Anyon streams were directed in a manner that forced them to collide in the middle of the collider and then exit along one of two designated paths. In such a device, fermions would leave the collider via separate paths, while bosons would leave as clumps. The researchers observed evidence of minor clumping—less than would be seen with bosons, but consistent with what theory has suggested would happen with anyons.
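A rough way to picture that "partial clumping" is the toy Monte Carlo below, which borrows the language of the paper's figure caption: the chance of two quasiparticles exiting the same output is rescaled by a factor (1 - p), where p is an exclusion quasiprobability. The probabilities and the anyon value p = 0.6 are placeholders, not the measured numbers.
[code]
# Toy Monte Carlo (illustrative only, not the Paris group's model): two
# quasiparticles arrive at a 50/50 splitter. For distinguishable particles
# the chance of both leaving through the same output is K0; following the
# figure caption, an "exclusion quasiprobability" p rescales it to K0*(1-p).
# p = 1 mimics fermions (never the same output), p = 0 the classical case,
# negative p mimics boson-like bunching. The anyon value p = 0.6 below is a
# made-up placeholder, not the measured number.
import random

def same_output_fraction(p, shots=200_000, K0=0.5):
    same = sum(random.random() < K0 * (1 - p) for _ in range(shots))
    return same / shots

for label, p in [("fermion-like", 1.0), ("classical", 0.0),
                 ("boson-like", -1.0), ("anyon-like (placeholder)", 0.6)]:
    print(f"{label:25s} p={p:+.1f}  same-output fraction = {same_output_fraction(p):.3f}")
[/code]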

[Image: finallyanyon.jpg]
One of the samples used in the experiment. Credit: Dr Manohar Kumar






[b]More information:[/b] H. Bartolomei et al. Fractional statistics in anyon collisions, Science (2020). DOI: 10.1126/science.aaz5601

https://phys.org/news/2020-04-anyon-evid...lider.html






 It's one reason there are still so many mysteries in nature, including how the universe's basic building blocks coalesce and form stars and galaxies. The same is true in high-energy experiments, in which particles like protons smash together at incredible speeds to create extreme conditions similar to those just after the Big Bang.


APRIL 9, 2020
Charting a course toward quantum simulations of nuclear physics
by Bailey Bedford, Joint Quantum Institute
[Image: chartingacou.jpg]Trapped ion quantum simulators may soon offer new means to explore the properties of matter emerging from complex interactions among quarks, gluons and the other fundamental building blocks of nature. Credit: A. Shaw and Z. Davoudi/University of Maryland
In nuclear physics, like much of science, detailed theories alone aren't always enough to unlock solid predictions. There are often too many pieces, interacting in complex ways, for researchers to follow the logic of a theory through to its end. It's one reason there are still so many mysteries in nature, including how the universe's basic building blocks coalesce and form stars and galaxies. The same is true in high-energy experiments, in which particles like protons smash together at incredible speeds to create extreme conditions similar to those just after the Big Bang.

Fortunately, scientists can often wield simulations to cut through the intricacies. A simulation represents the important aspects of one system—such as a plane, a town's traffic flow or an atom—as part of another, more accessible system (like a computer program or a scale model). Researchers have used their creativity to make simulations cheaper, quicker or easier to work with than the formidable subjects they investigate—like proton collisions or black holes.
Simulations go beyond a matter of convenience; they are essential for tackling cases that are both too difficult to directly observe in experiments and too complex for scientists to tease out every logical conclusion from basic principles. Diverse research breakthroughs—from modeling the complex interactions of the molecules behind life to predicting the experimental signatures that ultimately allowed the identification of the Higgs boson—have resulted from the ingenious use of simulations.
But conventional simulations only get you so far. In many cases, a simulation requires so many computations that the best computers ever built can't make meaningful progress—not even if you are willing to wait your entire life.
Now, quantum simulators (which exploit quantum effects like superposition and entanglement) promise to bring their power to bear on many problems that have refused to yield to simulations built atop classical computers—including problems in nuclear physics. But to run any simulation, quantum or otherwise, scientists must first determine how to faithfully represent their system of interest in their simulator. They must create a map between the two.
Computational nuclear physicist Zohreh Davoudi, an assistant professor of physics at the University of Maryland (UMD), is collaborating with researchers at JQI to explore how quantum simulations might aid nuclear physicists. They are working to create some of the first maps between the theories that describe the underpinnings of nuclear physics and the early quantum simulators and quantum computers being put together in labs.

"It seems like we are at the verge of going into the next phase of computing that takes advantage of quantum mechanics," says Davoudi. "And if nuclear scientists don't get into this field now—if we don't start to move our problems into such quantum hardware, we might not be able to catch up later because quantum computing is evolving very fast."
Davoudi and several colleagues, including JQI Fellows Chris Monroe and Mohammad Hafezi, designed their approach to making maps with an eye toward compatibility with the quantum technologies on the horizon. In a new paper published April 8, 2020 in the journal Physical Review Research, they describe their new method and how it creates new simulation opportunities for researchers to explore.
"It is not yet clear exactly where quantum computers will be usefully applied," says Monroe, who is also a professor of physics at UMD and co-founder of the quantum computing startup IonQ. "One strategy is to deploy them on problems that are based in quantum physics. There are many approaches in electronic structure and nuclear physics that are so taxing to normal computers that quantum computers may be a way forward."
[b]Patterns and Control[/b]
As a first target, the team set their sights on lattice gauge theories. Gauge theories describe a wide variety of physics, including the intricate dance of quarks and gluons—the fundamental particles in nuclear physics. Lattice versions of gauge theories simplify calculations by restricting all the particles and their interactions to an orderly grid, like pieces on a chessboard.
Even with this simplification, modern computers can still choke when simulating dense clumps of matter or when tracking how matter changes over time. The team believes that quantum computers might overcome these limitations and eventually simulate more challenging types of gauge theories—such as quantum chromodynamics, which describes the strong interactions that bind quarks and gluons into protons and neutrons and hold them together as atomic nuclei.
Davoudi and her colleagues chose trapped atomic ions—the specialty of Monroe—as the physical system for performing their simulation. In these systems, ions, which are electrically charged atoms, hover, each trapped by a surrounding electric or magnetic field. Scientists can design these fields to arrange the ions in various patterns that can be used to store and transfer information. For this proposal, the team focused on ions organized into a straight line.
Researchers use lasers to control each ion and its interactions with neighbors—an essential ability when creating a useful simulation. The ions are much more accessible than the smaller particles that intrigue Davoudi. Nuclear physicists can only dream of achieving the same level of control over the interactions at the hearts of atoms.
"Take a problem at the femtometer scale and expand it to micron scale—that dramatically increases our level of control," says Hafezi, who is also an associate professor in the Department of Electrical and Computer Engineering and the Department of Physics at UMD. "Imagine you were supposed to dissect an ant. Now the ant is stretched to the distance between Boston and Los Angeles."
While designing their map-making method, the team looked at what can be done with off-the-shelf lasers. They realized that current technology allows ion trappers to set up lasers in a new, efficient way that allows for simultaneous control of three different spin interactions for each ion.
"Trapped-ion systems come with a toolbox to simulate these problems," says Hafezi. "Their amazing feature is that sometimes you can go back and design more tools and add it to the box."
With this opportunity in mind, the researchers developed a procedure for producing maps with two desirable features. First, the maps maximize how faithfully the ion-trap simulation matches a desired lattice gauge theory. Second, they minimize the errors that occur during the simulation.
In the paper, the researchers describe how this approach might allow a one-dimensional string of ions to simulate a few simple lattice gauge theories, not only in one dimension but also higher dimensions. With this approach, the behavior of ion spins can be tailored and mapped to a variety of phenomena that can be described by lattice gauge theories, such as the generation of matter and antimatter out of a vacuum.
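To make "ion spins in a line" a little more concrete, the sketch below assembles and diagonalizes the kind of spin-chain Hamiltonian a small ion crystal natively realizes: power-law Ising couplings plus a transverse field. This is a generic illustration with made-up parameters, not the mapping constructed in the paper, which involves several simultaneous spin interactions per ion.
[code]
# Minimal sketch (a generic illustration, not the paper's mapping): the kind
# of spin model a linear ion chain natively implements, namely Ising-type
# couplings J_ij that fall off as a power law with distance, plus a
# transverse field B. Built with Kronecker products and diagonalized exactly
# for a 4-ion chain.
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
I2 = np.eye(2)

def embed(op, site, n):
    """Place a single-spin operator on `site` of an n-spin chain."""
    out = np.array([[1.0]])
    for k in range(n):
        out = np.kron(out, op if k == site else I2)
    return out

def ion_chain_hamiltonian(n=4, J0=1.0, alpha=1.0, B=0.5):
    """H = sum_{i<j} J0/|i-j|^alpha * sx_i sx_j + B * sum_i sz_i."""
    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n):
        H += B * embed(sz, i, n)
        for j in range(i + 1, n):
            J = J0 / abs(i - j) ** alpha          # power-law coupling, typical of ion chains
            H += J * embed(sx, i, n) @ embed(sx, j, n)
    return H

energies = np.linalg.eigvalsh(ion_chain_hamiltonian())
print("ground-state energy of the 4-ion toy chain:", energies[0])
[/code]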
"As a nuclear theorist, I am excited to work further with theorists and experimentalists with expertise in atomic, molecular, and optical physics and in ion-trap technology to solve more complex problems," says Davoudi. "I explained the uniqueness of my problem and my system, and they explained the features and capabilities of their system, then we brainstormed ideas on how we can do this mapping."
Monroe points out that "this is exactly what is needed for the future of quantum computing. This 'co-design' of devices tailored for specific applications is what makes the field fresh and exciting."
[b]Analog vs. Digital[/b]
The simulations proposed by Davoudi and her colleagues are examples of analog simulations, since they directly represent elements and interactions in one system with those of another system. Generally, analog simulators must be designed for a particular problem or set of problems. This makes them less versatile than digital simulators, which have an established set of discrete building blocks that can be put together to simulate nearly anything given enough time and resources.
The versatility of digital simulations has been world-altering, but a well-designed analog system is often less complex than its digital counterpart. Carefully designed quantum analog simulations might deliver results for certain problems before quantum computers can reliably perform digital simulations. This is similar to just using a wind tunnel instead of programming a computer to model the way the wind buffets everything from a goose to an experimental fighter plane.
Monroe's team, in collaboration with coauthor Guido Pagano, a former JQI postdoctoral researcher who is now an assistant professor at Rice University, is working to implement the new analog approach within the next couple of years. The completed system should be able to simulate a variety of lattice gauge theories.
The authors say that this research is only the beginning of a longer road. Since lattice gauge theories are described in mathematically similar ways to other quantum systems, the researchers are optimistic that their proposal will find uses beyond nuclear physics, such as in condensed matter physics and materials science. Davoudi is also working to develop digital quantum simulation proposals with Monroe and Norbert Linke, another JQI Fellow. She hopes that the two projects will reveal the advantages and disadvantages of each approach and provide insight into how researchers can tackle nuclear physics problems with the full might of quantum computing.
"We want to eventually simulate theories of a more complex nature and in particular quantum chromodynamics that is responsible for the strong force in nature," says Davoudi. "But that might require thinking even more outside the box."







[b]More information:[/b] Towards analog quantum simulations of lattice gauge theories with trapped ions. Phys. Rev. Research. DOI: 10.1103/PhysRevResearch.2.023015


https://phys.org/news/2020-04-quantum-si...ysics.html

Triality...The new reality?
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#19
Posted by EA - Sunday, March 8th, 2020, 03:13 am
What if matter and anti-matter don't always  immediately self-destruct but instead they immediately mediate?


"The dark photon, a hypothetical invisible particle, is an attractive dark matter candidate, which could also be a new interaction mediator between dark matter and normal matter,"

APRIL 13, 2020 FEATURE
Research identifies detection constraints for dark photons
by Ingrid Fadelli, Phys.org
[Image: 6-researchiden.jpg]Schematic drawing of future CDEX-100 experiment. Credit: She et al.
Past cosmological and astrophysical observations suggest that over one quarter of the universe's energy density is made up of a non-conventional type of matter known as dark matter. This type of matter is believed to be composed of particles that do not absorb, emit or reflect light, and thus cannot be observed directly using conventional detection methods.

Researchers worldwide have carried out studies aimed at detecting dark matter in the universe, yet so far, none of them has been successful. Even the preferred candidates for dark matter, weakly interacting massive particles (WIMPs), have not yet been observed experimentally.
The China Dark Matter Experiment (CDEX) collaboration, a large team of researchers at Tsinghua University and other universities in China, has recently conducted a search for a different possible dark matter candidate known as the dark photon. While the search was unsuccessful, their paper, published in Physical Review Letters, identifies new constraints on a dark photon parameter that could inform future studies.
"The dark photon, a hypothetical invisible particle, is an attractive dark matter candidate, which could also be a new interaction mediator between dark matter and normal matter," Qian Yue, one of the researchers who carried out the study, told Phys.org. "The study and detection of dark matter may contribute to the extension of the standard model (SM) of particle physics and expand our knowledge of the universe."
The CDEX collaboration has been conducting searches for light dark matter for some time now, using a 10 kg p-type point contact germanium detector installed at the China Jinping underground laboratory (CJPL). CJPL is the deepest underground research facility in the world, with a rock overburden of 2400 meters.

[Image: 7-researchiden.jpg]
Schematic drawing of CDEX-10 experimental setup with detector string. Credit: She et al.
The detector used by the researchers consists of three triple-element germanium detector strings, surrounded by 20-cm-thick, high-purity, oxygen-free copper, which acts as a passive shield against ambient radioactivity. This instrument is directly immersed in liquid nitrogen to maintain relatively cool temperatures.
"Dark photons can be experimentally detected through their absorption and conversion to electrons in the germanium detectors in a process analogous to the photoelectric effect of SM photons," Yue explained. "Intense photon sources, e.g., the sun, provide an excellent platform to look for dark photons. At a range of 100 eV, the low energy threshold of point-contact germanium detectors is particularly suitable for the studies of dark photons."

In their recent paper, Yue and his colleagues analyzed data collected using the detector at CJPL between February 2017 and August 2018, searching for solar dark photons and dark photon dark matter. While the researchers were unable to observe signals pointing to either of these candidates, they managed to set constraints on the effective kinetic mixing parameter between dark photons and SM photons.
"As an attractive candidate for dark matter and a new possible interaction mediator between dark matter and normal matter, the dark photon is attractive for further theoretical and experimental efforts," Yue said. "Our work has probed a new parameter space and set the most stringent limits on solar dark photons among the direct detection experiments."
The recent study carried out by Yue and his colleagues provides some valuable new feedback that could inform future searches for dark matter, particularly for dark photons. Moreover, their work reinforces the current worldwide interest in exploring other dark matter candidates, going beyond WIMPs and their detection channel of elastic scattering with the nucleus.
"To further advance the search for light dark matter, we will re-install the CDEX-10 detector array in a new, larger liquid nitrogen cryo-tank with a volume of about 1700 m3 at Hall-C of the new CJPL-II laboratory in next two years, where shielding from ambient radioactivity is provided by the 6-meter-thick liquid nitrogen," Yue said. "Additional germanium detectors, up to about 100 kg, are planned for deployment in the cryo-tank with reduced background and higher detection efficiency."







[b]More information:[/b] Z. She et al. Direct Detection Constraints on Dark Photons with the CDEX-10 Experiment at the China Jinping Underground Laboratory, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.111301
[b]Journal information:[/b] Physical Review Letters
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#20
' ...on a lark of a joke' 
Fun-da-metal fizix goes live! LilD

Arrow beer and a toke...
They discussed delaying the release, but then decided that with much of the world locked down, there might be interest in a deeply cerebral project by isolated people who are growing bored. He notes that Isaac Newton did some of his best work while self-isolating during the plague.

[Image: stephenwolfr.jpg]

APRIL 15, 2020 REPORT
The Wolfram Physics Project hopes to find fundamental theory of physics
by Bob Yirka, Phys.org
Credit: wolframphysics.org
Physicist and entrepreneur Stephen Wolfram has unveiled "The Wolfram Physics Project," which he subtitles "A Project to Find the Fundamental Theory of Physics." The aim of the project is to enlist the assistance of people around the globe to find the fundamental theory of physics—the theory that ties together all of physics, from the general theory of relativity to quantum mechanics. Wolfram has also published several documents on his website that outline the history behind the development of the project. Early in his career, he was a distinguished physicist, but he later left to found a computer company. More recently, he has found a renewed interest in pursuing his ideas about fundamental physics that he believes will lead to the discovery of a fundamental theory.
Wolfram suggests that the universe can be modeled using points in space and rules that, when applied, generate more points. As more points are added, a network is built. He further suggests that model universes can be built using hypergraphs that describe such networks—and the rules that are applied eventually determine the characteristics that make up a given universe. And this, he believes, suggests that it should be possible to start with a few points in space and develop a model that depicts the real universe—at least as we know it. All that is needed, he suggests, is for somebody to come up with the right rules. And that is the whole point of his project. Those who are interested need only visit the project website and begin downloading documents that further explain Wolfram's theories and how citizen scientists can get involved—and if they desire, create some rules and add them to the project.
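To give a flavour of what "applying rules to points to grow a network" can mean, here is a deliberately tiny rewriting toy. It is our own illustration, not one of Wolfram's actual models or his notation; the rule and the labels are invented for the example.
[code]
# Deliberately tiny rewriting toy (our illustration, not one of Wolfram's
# actual models): start from a few points connected by relations, and
# repeatedly apply a rule that replaces each relation with new relations
# through a freshly created point, so the network grows step by step.
from itertools import count

fresh = count(3)                      # next unused point label
graph = [(0, 1), (1, 2)]              # initial relations between points

def apply_rule(edges):
    """Toy rule: each edge (a, b) spawns a new point c and is replaced
    by the two edges (a, c) and (c, b)."""
    new_edges = []
    for a, b in edges:
        c = next(fresh)
        new_edges += [(a, c), (c, b)]
    return new_edges

for step in range(4):
    print(f"step {step}: {len(graph)} relations, e.g. {graph[:4]}")
    graph = apply_rule(graph)
[/code]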
In one of his papers outlining the path the project has taken, Wolfram notes that he and his team were well on their way to unveiling the project just as the COVID-19 pandemic began to unfold. They discussed delaying the release, but then decided that with much of the world locked down, there might be interest in a deeply cerebral project by isolated people who are growing bored. He notes that Isaac Newton did some of his best work while self-isolating during the plague.








[b]More information:[/b] https://www.wolframphysics.org/

https://www.wolframphysics.org … hnical-introduction/
https://writings.stephenwolfra … ram-physics-project/
https://writings.stephenwolfra … s-and-its-beautiful/

https://phys.org/news/2020-04-wolfram-ph...heory.html




Using this data, T2K evaluates confidence intervals for the parameter δcp. The disfavored region at the 3σ (99.7%) confidence level is 2º to 165º. This result represents the strongest constraint on δcp to date. The values of 0º and 180º are disfavored at the 95% confidence level, as was the case in T2K's previous release in 2017, indicating that CP symmetry may be violated in neutrino oscillations.

APRIL 15, 2020
Closing in on matter-antimatter asymmetry: T2K results restrict possible values of neutrino CP phase
[Image: t2kresultsre.jpg]The arrow indicates the value most compatible with the data. The gray region is disfavored at 99.7% confidence level. Nearly half of the possible values are excluded. Credit: The T2K Collaboration
The T2K Collaboration has published new results showing the strongest constraint yet on the parameter that governs the breaking of the symmetry between matter and antimatter in neutrino oscillations. Using beams of muon neutrinos and muon antineutrinos, T2K has studied how these particles and antiparticles transition into electron neutrinos and electron antineutrinos, respectively. The parameter governing the matter/antimatter symmetry breaking in neutrino oscillation, called the δcp phase, can take a value from -180º to 180º. For the first time, T2K has disfavored almost half of the possible values at the 99.7% (3σ) confidence level, and is starting to reveal a basic property of neutrinos that has not been measured until now. This is an important step on the way to knowing whether or not neutrinos and antineutrinos behave differently. These results, using data collected through 2018, have been published in the multidisciplinary scientific journal Nature on April 16.

For most phenomena, the laws of physics provide a symmetric description of the behavior of matter and antimatter. However, this symmetry does not hold universally. The effect of the asymmetry between matter and antimatter is most apparent in the observation of the universe, which is composed of matter with little antimatter. It is considered that equal amounts of matter and antimatter were created at the beginning of the universe. Then, for the universe to evolve to a state where matter dominates over antimatter, a necessary condition is the violation of the so-called Charge-Parity (CP) symmetry. Until now, CP symmetry violation has only been observed in the physics of subatomic particles called quarks, but the magnitude of the CP symmetry violation is not large enough to explain the observed dominance of matter over antimatter in the universe. T2K is now searching for a new source of CP symmetry violation in neutrino oscillations that would manifest as a difference in the measured oscillation probability for neutrinos and antineutrinos.
The T2K experiment uses a beam consisting primarily of muon neutrinos or muon antineutrinos created using the proton beam from the Japan Proton Accelerator Research Complex (J-PARC) located in Tokai village on the east coast of Japan. A small fraction of the neutrinos (or antineutrinos) are detected 295 km away at the Super-Kamiokande detector, located under a mountain in Kamioka, near the west coast of Japan. As the muon neutrinos and muon antineutrinos traverse the distance from Tokai to Kamioka (hence the name T2K), a fraction will oscillate or change flavor into electron neutrinos or electron antineutrinos respectively. The electron neutrinos and electron antineutrinos are identified in the Super-Kamiokande detector by the rings of Cherenkov light that they produce (shown below). While Super-Kamiokande cannot identify each event as a neutrino or antineutrino interaction, T2K is able to study the neutrino and antineutrino oscillations separately by operating the beam in neutrino mode or antineutrino mode.

[Image: 1-t2kresultsre.jpg]
Event displays of candidate electron neutrino (left) and electron antineutrino (right) events observed in Super-K from the T2K neutrino beam. Credit: The T2K Collaboration
T2K released a result analysing data with 1.49×10²¹ and 1.64×10²¹ protons from the accelerator for neutrino beam mode and antineutrino beam mode respectively. If the parameter δcp equals 0º or 180º, the neutrinos and antineutrinos will change their types (from muon to electron) in the same way during oscillation. The δcp parameter may have a value that enhances the oscillations of neutrinos or antineutrinos, breaking CP symmetry. However, the observation of neutrinos is already enhanced in the T2K experiment by the fact that the detectors and beam line components are made out of matter and not antimatter. To separate the effect of δcp from known beam line and interaction effects, the T2K analysis includes corrections based on data from magnetized near detectors (ND280) placed 280 m from the target.

T2K observed 90 electron neutrino candidates and 15 electron antineutrino candidates. T2K expects to observe 82 electron neutrino events compared to 17 electron antineutrino events for maximal neutrino enhancement (δcp = -90º), and 56 electron neutrino events compared to 22 electron antineutrino events for maximal antineutrino enhancement (δcp = +90º). The observed number of events as a function of the reconstructed neutrino energy is shown below. The T2K data is most compatible with a value close to δcp = -90º, which significantly enhances the oscillation probability of neutrinos in the T2K experiment.

Using this data, T2K evaluates confidence intervals for the parameter δcp. The disfavored region at the 3σ (99.7%) confidence level is 2º to 165º. This result represents the strongest constraint on δcp to date. The values of 0º and 180º are disfavored at the 95% confidence level, as was the case in T2K's previous release in 2017, indicating that CP symmetry may be violated in neutrino oscillations.
  • [Image: strongestevi.jpg] Event display for a candidate electron neutrino. Credit: T2K
  • [Image: 2-t2kresultsre.jpg] The observed electron neutrino (left) and electron antineutrino (right) candidate events with predictions for maximal neutrino enhancement (red, long dash) and maximum antineutrino enhancement (blue, short dash). Credit: The T2K Collaboration
  • [Image: t2kinsightin.jpg] Kamioka Observatory, ICRR (Institute for Cosmic Ray Research), The University of Tokyo. Credit: Kamioka Observatory, ICRR (Institute for Cosmic Ray Research), The University of Tokyo
While this result shows a strong preference for enhancement of the neutrino rate in T2K, it is not yet clear if CP symmetry is violated or not. To further improve the experimental sensitivity to a potential CP symmetry violating effect, the T2K Collaboration will upgrade the near detector suite to reduce systematic uncertainties and accumulate more data, and J-PARC will increase the beam intensity by upgrading the accelerator and beamline.
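As a back-of-the-envelope illustration of why the event counts quoted above favour δcp near -90º, the sketch below compares Poisson likelihoods for the two extreme hypotheses using only the numbers in the press release (90 and 15 observed, versus 82/17 and 56/22 expected). It is emphatically not the T2K analysis, which fits full energy spectra with systematic uncertainties; it only shows the counting logic.
[code]
# Back-of-the-envelope check (not the T2K analysis, which fits full energy
# spectra with systematic uncertainties): compare Poisson likelihoods of the
# two extreme delta_cp hypotheses against the event counts quoted above.
from math import lgamma, log

def poisson_loglike(observed, expected):
    """Log of the Poisson probability mass function."""
    return observed * log(expected) - expected - lgamma(observed + 1)

observed = {"nu_e": 90, "nubar_e": 15}
expectations = {
    "delta_cp = -90 deg (max neutrino enhancement)":     {"nu_e": 82, "nubar_e": 17},
    "delta_cp = +90 deg (max antineutrino enhancement)": {"nu_e": 56, "nubar_e": 22},
}

for hypothesis, expected in expectations.items():
    total = sum(poisson_loglike(observed[k], expected[k]) for k in observed)
    print(f"{hypothesis}: total log-likelihood = {total:.2f}")
# The -90 degree hypothesis comes out clearly higher, matching the article's
# statement that the data are most compatible with delta_cp close to -90.
[/code]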
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#21
~MEH.

[Image: -meh-336708.jpg]

(Mixed Exactly Half.)

[Image: rRMXlPQ.png]
How We Got Here: The Backstory of the Wolfram Physics Project
April 14, 2020
[url=https://www.wolframphysics.org/][Image: wolfram-physics-project-website.jpg][/url]


Quote:“Someday…”
I’ve been saying it for decades: “Someday I’m going to mount a serious effort to find the fundamental theory of physics.” Well, I’m thrilled that today “someday” has come, and we’re launching the Wolfram Physics Project. And getting ready to launch this project over the past few months might be the single most intellectually exciting time I’ve ever had. So many things I’d wondered about for so long getting solved. So many exciting moments of “Surely it can’t be that simple?” And the dawning realization, “Oh my gosh, it’s actually going to work!”

https://www.wolframphysics.org/


Universe and infinity are not the same? Cry

If dualism(Mixed Exactly Half) is relative and the mirrored concept related to itz symmetry.,  then is triality a bunch of non-local yokels notz vocal or focal strangers that self-social distance and don't even belong to the family or strain on the timespace frame of reference 
-yet occupy all the places  Sheep dualists don't?

Ninja Covert ~19.5 'triality' may half escaped as a gyrus from an initial concept from a wu tang clan of non-local yokels.

2 bee or notz to be...that ain't the only question. Naughty
[Image: image.jpg]
Ain't isn't an itza.

B32C



~Magic Angle.
Wiggle space.
Waggle dance.
[Image: magictwistan.jpg]
Ask one to be three

A.E. said 'god don't play dice with the universe' and E.A. offered non vice advice.  Arrow 'don't gamble with improv.'

Is there a tertiary layer and reality is non-duality?

disordered hyperuniformity 

Gnosis is as all of your beeswax and can't be fully realized with a mere mirror matter.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#22
Think about thinking.
How difficult it is.

Sartre says existence precedes essence,
which thought endows with meaning,
thus preserving its primacy over nature.

I think I said that correctly...

Hmm2

Read this part of his novel, Nausea

https://www.google.com/books/edition/Nau...frontcover

Damned
Reply
#23
.


Quote:Posted by Kalter Rauch - 8 hours ago
Think about thinking.
How difficult it is.


Well if eye did have thoughts on thoughts regarding your statement, since you do think independent of me, I would say this Paul.
Think about thinking.
Okay. I considered what you ask as a task.
My personal thoughts were then cross referenced with the Improviverse.
To truly 'think' there therefore must be sum archive that contains a base unit of thought and that would be a memory.
If a memory then please do literally... >>>


Recall: [Image: 56292_1_x.jpg?auto=webp&format=pjpg&version=1&width=512]  memory

333: FOLK ART DUCK BY POPEYE REED. Carved san : Lot 333
https://www.pinterest.ca/pin/49187820904962557/

if  youareaduck then I Am a drake wake.................................


Quote:I think I said that correctly...
Quote:I think I said that correctly...
Sarte` is smart eh? no?

But popeye is Sheep olive oil was [Image: IMG_4660_1000.JPG]

Think memory mirrored? this ain't no wimpy burger [Image: aaf2db38150749d2865c6043fc0019aa--food-c...rtoons.jpg]  served @ Room Temperature. 
Weakly interacting massive particles https://en.wikipedia.org/wiki/Weakly_interacting_massive_particles

The Interacting Photons research group at AMOLF studies nonlinearity and noise in photonic systems. One such system is a cavity, formed by two mirrors facing each other at a close distance. Within the cavity, light bounces back and forth as it is reflected by the mirrors. Putting something inside such an optical cavity changes the properties of the system. "We created a system with memory by placing a drop of olive oil inside the cavity,"


Again... recall:

The equations that describe how light behaves in our oil-filled cavity are similar to those describing collections of atoms, superconductors and even high energy physics. Therefore, the universal behavior we discovered is likely to be observed in such systems as well."

APRIL 17, 2020
Olive oil leads to discovery of new universal law of phase transitions
by AMOLF
[Image: oliveoilshed.jpg]Caption: The experiments were performed with an optical cavity formed by two mirrors. Light sent through the cavity bounces between the mirrors before getting out where the transmission is measured. The researchers have filled this cavity with olive oil and changed the relative position of the mirrors at different speeds. Credit: Henk-Jan Boluijt @AMOLF
A simple drop of olive oil in a system of photons bouncing between two mirrors has revealed universal aspects of phase transitions in physics. Researchers at AMOLF used an oil-filled optical cavity in which light undergoes phase transitions similar to those in boiling water. The system they studied has memory because the oil causes photons to interact with themselves. By varying the distance between the two mirrors and measuring the transmission of light through the cavity, they discovered a universal law describing phase transitions in systems with memory. These results are published on April 15th in Physical Review Letters.

The Interacting Photons research group at AMOLF studies nonlinearity and noise in photonic systems. One such system is a cavity, formed by two mirrors facing each other at a close distance. Within the cavity, light bounces back and forth as it is reflected by the mirrors. Putting something inside such an optical cavity changes the properties of the system. "We created a system with memory by placing a drop of olive oil inside the cavity," says group leader Said Rodriguez. "The oil mediates effective photon-photon interactions, which we can see by measuring the transmission of laser light through this cavity."
[b]Scanning speed[/b]
Rodriguez and his Ph.D. students Zou Geng and Kevin Peters analyzed the transmission while increasing and decreasing the distance between the two mirrors at different speeds. They found that the amount of light transmitted through the cavity depends on the direction of movement of the mirrors. "The transmission of light through the cavity is non-linear. At a certain distance between the mirrors, the amount of transmitted light depends on whether we are opening the cavity or closing it," Rodriguez explains. "This behavior is called hysteresis. It is also observed in certain phase transitions, like in boiling water or magnetic materials."
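A generic way to reproduce this direction-dependent behaviour on a laptop is a driven cavity with a Kerr-type nonlinearity. The toy model below is our own assumption, not AMOLF's model of the oil-filled cavity: the detuning is swept up and then back down at a finite speed, and the intracavity intensity traces different curves in the two directions, which is the hysteresis Rodriguez describes.
[code]
# Generic toy model of optical bistability (an assumed Kerr-type
# nonlinearity, not AMOLF's model of the oil-filled cavity):
#   d(alpha)/dt = (i*delta - kappa/2 - i*U*|alpha|^2) * alpha + F
# integrated with a crude Euler scheme while the detuning delta is swept up
# and then back down. The two sweep directions trace different intensity
# curves, i.e. hysteresis.
import numpy as np

def sweep(deltas, kappa=1.0, U=0.3, F=1.5, dt=0.01, settle=200):
    alpha = 0.0 + 0.0j                     # intracavity field amplitude
    out = []
    for delta in deltas:
        for _ in range(settle):            # let the field evolve at each detuning step
            alpha += dt * ((1j * delta - kappa / 2 - 1j * U * abs(alpha) ** 2) * alpha + F)
        out.append(abs(alpha) ** 2)
    return np.array(out)

deltas = np.linspace(-2, 5, 200)
up = sweep(deltas)                         # "closing the cavity" direction, say
down = sweep(deltas[::-1])[::-1]           # "opening the cavity" direction
print("largest up/down intensity difference:", np.abs(up - down).max())
[/code]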
[b]Universal[/b]
However, the researchers observed that hysteresis is not always present in the cavity with olive oil: it disappeared when they increased the speed at which the cavity opens and closes. Rodriguez: "In faster scans, we saw the hysteresis vanish as a function of the scanning speed. This happens at a universal rate, independent of parameters like light intensity or the strength of the non-linearity. The equations that describe how light behaves in our oil-filled cavity are similar to those describing collections of atoms, superconductors and even high energy physics. Therefore, the universal behavior we discovered is likely to be observed in such systems as well."
[b]Coupling cavities[/b]
While it would be interesting to investigate the universal scaling behavior in other systems with memory, Rodriguez will keep his focus on oil-filled cavities. "Our system has a strong optical non-linearity at room temperature, which offers opportunities for potential applications," he says. "We are currently investigating what happens when we couple two or more cavities. Because each system has memory, an array of cavities might eventually be useful as a computational tool, or maybe even in sensing applications."







[b]More information:[/b] Z. Geng et al. Universal Scaling in the Dynamic Hysteresis, and Non-Markovian Dynamics, of a Tunable Optical Cavity, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.153603
[b]Journal information:[/b] Physical Review Letters



https://phys.org/news/2020-04-olive-oil-...l-law.html


 "In faster scans, we saw the hysteresis vanish as a function of the scanning speed. This happens at a universal rate, independent of parameters like light intensity or the strength of the non-linearity. The equations that describe how light behaves in our oil-filled cavity are similar to those describing collections of atoms, superconductors and even high energy physics. Therefore, the universal behavior we discovered is likely to be observed in such systems as well."

LilD

[Image: corner_reflectors.png]
The Interacting Photons research group at AMOLF studies nonlinearity and noise in photonic systems. One such system is a cavity, formed by two mirrors facing each other at a close distance. Within the cavity, light bounces back and forth as it is reflected by the mirrors. Putting something inside such an optical cavity changes the properties of the system. "We created a system with memory by placing a drop of olive oil inside the cavity,"
https://phys.org/news/2020-04-olive-oil-...l-law.html

[Image: maxresdefault.jpg]
RE: On the matter of MATTER Vs. ANTI-MATTER : A nihilist annihilates duality reality.
[url=https://www.wolframphysics.org/]https://www.wolframphysics.org/[/url]
Thanx for enlightening me KR. Hi   a la carte' skipped the dish served by sarte' in the chiral viral galaxy spiral.
Reply
#24
Sunday, March 8th, 2020, 03:13 am (This post was last modified: Sunday, March 8th, 2020, 03:31 am by EA.)

What if matter and anti-matter don't always  immediately self-destruct but instead they immediately mediate?

To differentiate the two.

Mind your matters.


.
Quote:Now, an international team of researchers that make up the T2K Collaboration, including Imperial College London scientists, have found the strongest evidence yet that neutrinos and antineutrinos behave differently, and therefore may not wipe each other out.


Professor Yoshi Uchida said: "When we started, we knew that seeing signs of differences between neutrinos and antineutrinos in this way was something that could take decades, if they could ever be seen at all, so it is almost like a dream to have our result be celebrated on the cover of Nature this week."

.

Strongest evidence yet that neutrinos explain how the universe exists
Date:
April 15, 2020
Source:
Imperial College London
Summary:
New data throws more support behind the theory that neutrinos are the reason the universe is dominated by matter.

[Image: 200415133657_1_540x360.jpg]
Neutrino word cloud illustration (stock image).
[i]Credit: © CrazyCloud / Adobe Stock[/i]


New data throws more support behind the theory that neutrinos are the reason the universe is dominated by matter.
The current laws of physics do not explain why matter persists over antimatter -- why the universe is made of 'stuff'. Scientists believe equal amounts of matter and antimatter were created at the beginning of the universe, but this would mean they should have wiped each other out, annihilating the universe as it began.
Instead, physicists suggest there must be differences in the way matter and antimatter behave that explain why matter persisted and now dominates the universe. Each particle of matter has an antimatter equivalent, and neutrinos are no different, with an antimatter equivalent called antineutrinos.
They should be exact opposites in their properties and behaviour, which is what makes them annihilate each other on contact.
Now, an international team of researchers that make up the T2K Collaboration, including Imperial College London scientists, have found the strongest evidence yet that neutrinos and antineutrinos behave differently, and therefore may not wipe each other out.
Dr Patrick Dunne, from the Department of Physics at Imperial, said: "This result brings us closer than ever before to answering the fundamental question of why the matter in our universe exists. If confirmed -- at the moment we're over 95 per cent sure -- it will have profound implications for physics and should point the way to a better understanding of how our universe evolved."
Previously, scientists have found some differences in behaviour between matter and antimatter versions of subatomic particles called quarks, but the differences observed so far do not seem to be large enough to account for the dominance of matter in the universe.
However, T2K's new result indicates that the differences in the behaviour of neutrinos and antineutrinos appear to be quite large. Neutrinos are fundamental particles but do not interact with normal matter very strongly, such that around 50 trillion neutrinos from the Sun pass through your body every second.
Neutrinos and antineutrinos can come in three 'flavours', known as muon, electron and tau. As they travel, they can 'oscillate' -- changing into a different flavour. The fact that muon neutrinos oscillate into electron neutrinos was first discovered by the T2K experiment in 2013.
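For readers who want to see what "oscillate" means quantitatively, the sketch below evaluates the textbook two-flavour vacuum oscillation probability at the T2K baseline of 295 km. The mixing and mass-splitting values are round illustrative numbers, not T2K's measured parameters, and the real analysis is a full three-flavour treatment with matter effects.
[code]
# Textbook two-flavour vacuum oscillation probability (a simplification;
# the real T2K analysis is a full three-flavour treatment including
# delta_cp and matter effects). Parameter values are round illustrative
# numbers, not the collaboration's fit results.
import math

def oscillation_probability(L_km, E_GeV, sin2_2theta, dm2_eV2):
    """P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E)."""
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

L = 295.0   # km, Tokai to Kamioka baseline (from the article)
E = 0.6     # GeV, roughly the T2K beam energy (approximate)
print(oscillation_probability(L, E, sin2_2theta=0.085, dm2_eV2=2.5e-3))
[/code]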
To get the new result, the team fired beams of muon neutrinos and antineutrinos from the J-PARC facility at Tokai, Japan, and detected how many electron neutrinos and antineutrinos arrived at the Super-Kamiokande detector 295km away.
They looked for differences in how the neutrinos or antineutrinos changed flavour, finding neutrinos appear to be much more likely to change than antineutrinos.
The available data also strongly discount the possibility that neutrinos and antineutrinos are just as likely as each other to change flavour. Dr Dunne said: "What our result shows is that we're more than 95 per cent sure that matter neutrinos and antineutrinos behave differently. This is big news in itself; however we do already know of other particles that have matter-antimatter differences that are too small to explain our matter-dominated universe.
"Therefore, measuring the size of the difference is what matters for determining whether neutrinos can answer this fundamental question. Our result today finds that unlike for other particles, the result in neutrinos is compatible with many of the theories explaining the origin of the universe's matter dominance."
While the result is the strongest evidence yet that neutrinos and antineutrinos behave differently, the T2K Collaboration is working to reduce any uncertainties and gather more data by upgrading the detectors and beamlines, including the new Hyper-Kamiokande detector to replace the Super-Kamiokande. A new experiment, called DUNE, is also under construction in the US. Imperial is involved in both.
Imperial researchers have been involved in the T2K Collaboration since 2004, starting with conceptual designs on whiteboards and research and development on novel particle detector components that were key to building this experiment, which was finally completed and turned on in 2010.
For the latest result, the team contributed to the statistical analysis of the results and to ensuring that the signal they observe is real, as well as to including the effects of how neutrinos interact with matter, one of the largest uncertainties that go into the analysis.
Professor Yoshi Uchida said: "When we started, we knew that seeing signs of differences between neutrinos and antineutrinos in this way was something that could take decades, if they could ever be seen at all, so it is almost like a dream to have our result be celebrated on the cover of Nature this week."




[b]Story Source:[/b]
Materials provided by [b]Imperial College London[/b]. Original written by Hayley Dunning. Note: Content may be edited for style and length.


[b]Journal Reference[/b]:
  1. Abe, K., Akutsu, R., Ali, A. et al. [b]Constraint on the matter–antimatter symmetry-violating phase in neutrino oscillations[/b]. Nature, 2020 DOI: 10.1038/s41586-020-2177-0





Imperial College London. "Strongest evidence yet that neutrinos explain how the universe exists." ScienceDaily. ScienceDaily, 15 April 2020. <www.sciencedaily.com/releases/2020/04/200415133657.htm>.


https://www.sciencedaily.com/releases/20...133657.htm
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#25
"To truly 'think' there therefore must be sum archive that contains a base unit of thought and that would be a memory."

Assuming a time before a memory exists 
then mind/tabula rasa, as it exists,
still has a structure capable of understanding enough to form a memory.
The formation of the first memory means that thought exists in the first place.
Reply
#26
...


Quote:... have found the strongest evidence yet
that neutrinos and antineutrinos behave differently  Teetertotter

and therefore,
may not wipe each other out.



Ever hear the saying, 
"it all adds up to a whole lotta nuthin' " ... ?
said,
the matter ...  to the anti matter.

What is a finished balanced equation, in a universe of mathematically possible computations?
Moving on from an end game of  ... 
"move on, 
nothing left to see here" ?


Higgs Boson Half Lives Matter ... 
or
do
Higgs Boson Half Lives ...  Anti Matter?   

Puns like that suck at the Spallation Neutron Source ....



Quote:The half-life of the Higgs boson is not known,
but it is predicted to be 100 yoctoseconds
(septillionths of a second),
which is a rather long time for a particle of its mass.
A measurement of the Higgs boson half-life would tell us a lot.


https://www.livescience.com/strange-higg...verse.html
A Strange New Higgs Particle May Have Stolen the Antimatter from Our Universe
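For the yoctosecond figure in the quote above, the unit arithmetic is quick: one yoctosecond is 10^−24 s, so 100 of them is 10^−22 s. A one-line sanity check of the conversion only:

[code]
YOCTOSECOND = 1e-24                       # seconds
predicted_half_life = 100 * YOCTOSECOND   # the figure quoted above
print(f"100 yoctoseconds = {predicted_half_life:.0e} seconds")   # prints 1e-22
[/code]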


"move on, 
nothing left to see here"

because,
"it all adds up to a whole lotta nuthin' "


Quote:... but it is predicted to be 100 yoctoseconds Whip
(septillionths of a second), 




An eternity in timelessness ... Nonono Nonono Nonono Nonono Nonono 

The yocto-future-shockto-septillionth of a second.



Quote:neutrinos and antineutrinos behave differently Herethere
and may not wipe each other out

 

Mutual annihilation gets put on freeze frame?
and the antineutrino hops skips and jumps over or under the neutrino,
then the neutrino does the same,
on an endless loop of eternal poop and scoop ...



Quote:The tau can decay into a muon, 
plus a tau-neutrino and a muon-antineutrino; 
or it can decay directly into an electron, 
plus a tau-neutrino and an electron-antineutrino. 
Because the tau is heavy, 
it can also decay into particles containing quarks.


suddenly somehow something disturbing is seen in the smoke of his pot pipe



Quote:around 50 trillion neutrinos from the Sun pass through your body every second.


That must leave a shit load of quarks up my ass.

Neutrino's and anti neutrino's can't decide who is going to die,
and I have a neutron star spallation source up my a-hole.

https://neutrons.ornl.gov/sns

It's another day in a holographic universe.


Quote:The formation of the first memory means that thought exists in the first place




unless pure observation,

void of thought,
is suddenly and abruptly forced to react by external stimulus Whip

...
...
Reply
#27
You guys think thoughts that made me have a hoot with a bud-beer shot.
recall:

Quote:Ever hear the saying, 

"it all adds up to a whole lotta nuthin' " ... ?

All theory has their:
Pro's  Sheep Con's

The science of duality may not be actuality.

There may be matter in sum-state of Disordered Hyper-uniformity.


Triality


A pro-con-fusion snake-oil y'all kinda sorta like a counter-intuitive gut-feeling third eye blind optic topic unheard of until science speaking of it NOW.


Quote:"To truly 'think' there therefore must be sum archive that contains a base unit of thought and that would be a memory."

Assuming a time before a memory exists 
then mind/tabula rasa, as it exists,
still has a structure capable of understanding enough to form a memory.
The formation of the first memory means that thought exists in the first place.


But paul...un-see the light.

Saul there was a time when it was not on recall.

The formation of the first memory means that thought exists in the first place.

thatza forked tongue-twistor memristor just for your gnosis.

Take the road back from damascus as all roads reverse to rome.

Is there a third law of matter...and if so...is it innovative?


A 'magic'  medium?
Angel Angels  Sheep Angles  LilD



APRIL 27, 2020
'Elegant' solution reveals how the universe got its structure
[Image: elegantsolut.jpg]The Magellan telescopes at Carnegie's Las Campanas Observatory in Chile, which were crucial to the ability to conduct this survey. Credit: Yuri Beletsky, the Carnegie Institution for Science.
The universe is full of billions of galaxies—but their distribution across space is far from uniform. Why do we see so much structure in the universe today and how did it all form and grow?

A 10-year survey of tens of thousands of galaxies made using the Magellan Baade Telescope at Carnegie's Las Campanas Observatory in Chile provided a new approach to answering this fundamental mystery. The results, led by Carnegie's Daniel Kelson, are published in Monthly Notices of the Royal Astronomical Society.
"How do you describe the indescribable?" asks Kelson. "By taking an entirely new approach to the problem."
"Our tactic provides new—and intuitive—insights into how gravity drove the growth of structure from the universe's earliest times," said co-author Andrew Benson. "This is a direct, observation-based test of one of the pillars of cosmology."
The Carnegie-Spitzer-IMACS Redshift Survey was designed to study the relationship between galaxy growth and the surrounding environment over the last 9 billion years, when modern galaxies' appearances were defined.
The first galaxies were formed a few hundred million years after the Big Bang, which started the universe as a hot, murky soup of extremely energetic particles. As this material expanded outward from the initial explosion, it cooled, and the particles coalesced into neutral hydrogen gas. Some patches were denser than others and, eventually, their gravity overcame the universe's outward trajectory and the material collapsed inward, forming the first clumps of structure in the cosmos.
The density differences that allowed for structures both large and small to form in some places and not in others have been a longstanding topic of fascination. But until now, astronomers' abilities to model how structure grew in the universe over the last 13 billion years faced mathematical limitations.
"The gravitational interactions occurring between all the particles in the universe are too complex to explain with simple mathematics," Benson said.
So, astronomers either used mathematical approximations—which compromised the accuracy of their models—or large computer simulations that numerically model all the interactions between galaxies, but not all the interactions occurring between all of the particles, which was considered too complicated.
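As a hedged illustration of the middle ground between simple approximations and full simulations, the linear growth of a small density contrast in a matter-only (Einstein-de Sitter) universe can be integrated directly. The toy sketch below is my own, not the survey team's method; it only shows that the contrast grows in proportion to the scale factor in that regime.

[code]
import numpy as np

# Linear growth of a density contrast "delta" in an Einstein-de Sitter
# (matter-only) universe: delta'' + 2*H*delta' = 1.5*H^2*delta, with
# H = 2/(3t) and scale factor a ~ t^(2/3). Toy integration, arbitrary units.
t = np.linspace(1.0, 100.0, 200_000)
dt = t[1] - t[0]
delta = 1e-5
ddelta = (2.0 / 3.0) * delta / t[0]        # start on the growing mode
for ti in t[:-1]:
    H = 2.0 / (3.0 * ti)
    acc = 1.5 * H**2 * delta - 2.0 * H * ddelta
    ddelta += acc * dt
    delta += ddelta * dt

print(f"numeric growth factor   : {delta / 1e-5:.2f}")
print(f"expected, a_end/a_start : {(t[-1] / t[0]) ** (2.0 / 3.0):.2f}")
[/code]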

[Image: 1-elegantsolut.jpg]
The universe's first structure originated when some of the material flung outward by the Big Bang overcame its trajectory and collapsed on itself, forming clumps. A team of Carnegie researchers showed that denser clumps of matter grew faster, and less-dense clumps grew more slowly. The group's data revealed the distribution of density in the universe over the last 9 billion years. (On the illustration, violet represents low-density regions and red represents high-density regions.) Working backward in time, their findings reveal the density fluctuations (far right, in purple and blue) that created the universe's earliest structure. This aligns with what we know about the ancient universe from the afterglow of the Big Bang, called the Cosmic Microwave Background (far right in yellow and green). The researchers achieved their results by surveying the distances and masses of nearly 100,000 galaxies, going back to a time when the universe was only 4.5 billion years old. About 35,000 of the galaxies studied by the Carnegie-Spitzer-IMACS Redshift Survey are represented here as small spheres. Credit: Daniel Kelson. CMB data is based on observations obtained with Planck, an ESA science mission with instruments and contributions directly funded by ESA Member States, NASA, and Canada.
"A key goal of our survey was to count up the mass present in stars found in an enormous selection of distant galaxies and then use this information to formulate a new approach to understanding how structure formed in the universe," Kelson explained.

The research team—which also included Carnegie's Louis Abramson, Shannon Patel, Stephen Shectman, Alan Dressler, Patrick McCarthy, and John S. Mulchaey, as well as Rik Williams, now of Uber Technologies—demonstrated for the first time that the growth of individual proto-structures can be calculated and then averaged over all of space.
Doing this revealed that denser clumps grew faster, and less-dense clumps grew more slowly.
They were then able to work backward and determine the original distributions and growth rates of the fluctuations in density, which would eventually become the large-scale structures that determined the distributions of galaxies we see today.
In essence, their work provided a simple, yet accurate, description of why and how density fluctuations grow the way they do in the real universe, as well as in the computational-based work that underpins our understanding of the universe's infancy.
"And it's just so simple, with a real elegance to it," added Kelson.
The findings would not have been possible without the allocation of an extraordinary number of observing nights at Las Campanas.
"Many institutions wouldn't have had the capacity to take on a project of this scope on their own," said Observatories Director John Mulchaey. "But thanks to our Magellan Telescopes, we were able to execute this survey and create this novel approach to answering a classic question."
"While there's no doubt that this project required the resources of an institution like Carnegie, our work also could not have happened without the tremendous number of additional infrared images that we were able to obtain at Kit Peak and Cerro Tololo, which are both part of the NSF's National Optical-Infrared Astronomy Research Laboratory," Kelson added.







[b]More information:[/b] Daniel D Kelson et al, Gravity and the non-linear growth of structure in the Carnegie-Spitzer-IMACS Redshift Survey, Monthly Notices of the Royal Astronomical Society (2020). DOI: 10.1093/mnras/staa100
[b]Journal information:[/b] Monthly Notices of the Royal Astronomical Society

Provided by 
Carnegie Institution for Science


Quote:The formation of the first memory means that thought exists in the first place.


the road to damn mask us!

[Image: 49828289757_3d4b6f054a_b.jpg]
Quote:
a beer and a toke on a lark of a joke.
fun-da-mental physics [Image: lilD.gif]


APRIL 27, 2020
New findings suggest laws of nature 'downright weird,' not as constant as previously thought
by Lachlan Gilbert, University of New South Wales
[Image: 1-newfindingss.jpg]Scientists examining the light from one of the furthermost quasars in the universe were astonished to find fluctuations in the electromagnetic force. Credit: Shutterstock
Not only does a universal constant seem annoyingly inconstant at the outer fringes of the cosmos, it occurs in only one direction, which is downright weird.

Those looking forward to a day when science's Grand Unifying Theory of Everything could be worn on a t-shirt may have to wait a little longer as astrophysicists continue to find hints that one of the cosmological constants is not so constant after all.
In a paper published in Science Advances, scientists from UNSW Sydney reported that four new measurements of light emitted from a quasar 13 billion light years away reaffirm past studies that found tiny variations in the fine structure constant.
UNSW Science's Professor John Webb says the fine structure constant is a measure of electromagnetism—one of the four fundamental forces in nature (the others are gravity, weak nuclear force and strong nuclear force).
"The fine structure constant is the quantity that physicists use as a measure of the strength of the electromagnetic force," Professor Webb says.
"It's a dimensionless number and it involves the speed of light, something called Planck's constant and the electron charge, and it's a ratio of those things. And it's the number that physicists use to measure the strength of the electromagnetic force."
The electromagnetic force keeps electrons whizzing around a nucleus in every atom of the universe—without it, all matter would fly apart. Up until recently, it was believed to be an unchanging force throughout time and space. But over the last two decades, Professor Webb has noticed anomalies in the fine structure constant whereby electromagnetic force measured in one particular direction of the universe seems ever so slightly different.
"We found a hint that that number of the fine structure constant was different in certain regions of the universe. Not just as a function of time, but actually also in direction in the universe, which is really quite odd if it's correct ... but that's what we found."
[b]Looking for clues[/b]
Ever the sceptic, when Professor Webb first came across these early signs of slightly weaker and stronger measurements of the electromagnetic force, he thought it could be a fault of the equipment, or of his calculations or some other error that had led to the unusual readings. It was while looking at some of the most distant quasars—massive celestial bodies emitting exceptionally high energy—at the edges of the universe that these anomalies were first observed using the world's most powerful telescopes.

"The most distant quasars that we know of are about 12 to 13 billion light years from us," Professor Webb says.
"So if you can study the light in detail from distant quasars, you're studying the properties of the universe as it was when it was in its infancy, only a billion years old. The universe then was very, very different. No galaxies existed, the early stars had formed but there was certainly not the same population of stars that we see today. And there were no planets."
He says that in the current study, the team looked at one such quasar that enabled them to probe back to when the universe was only a billion years old, which had never been done before. The team made four measurements of the fine structure constant along the one line of sight to this quasar. Individually, the four measurements didn't provide any conclusive answer as to whether or not there were perceptible changes in the electromagnetic force. However, when combined with lots of other measurements between us and distant quasars made by other scientists and unrelated to this study, the differences in the fine structure constant became evident.
[b]A weird universe[/b]
"And it seems to be supporting this idea that there could be a directionality in the universe, which is very weird indeed," Professor Webb says.
"So the universe may not be isotropic in its laws of physics—one that is the same, statistically, in all directions. But in fact, there could be some direction or preferred direction in the universe where the laws of physics change, but not in the perpendicular direction. In other words, the universe in some sense, has a dipole structure to it.
"In one particular direction, we can look back 12 billion light years and measure electromagnetism when the universe was very young. Putting all the data together, electromagnetism seems to gradually increase the further we look, while towards the opposite direction, it gradually decreases. In other directions in the cosmos, the fine structure constant remains just that—constant. These new very distant measurements have pushed our observations further than has ever been reached before."
In other words, in what was thought to be an arbitrarily random spread of galaxies, quasars, black holes, stars, gas clouds and planets—with life flourishing in at least one tiny niche of it—the universe suddenly appears to have the equivalent of a north and a south. Professor Webb is still open to the idea that somehow these measurements made at different stages using different technologies and from different locations on Earth are actually a massive coincidence.
"This is something that is taken very seriously and is regarded, quite correctly with scepticism, even by me, even though I did the first work on it with my students. But it's something you've got to test because it's possible we do live in a weird universe."
But adding to the side of the argument that says these findings are more than just coincidence, a team in the US working completely independently and unknown to Professor Webb's, made observations about X-rays that seemed to align with the idea that the universe has some sort of directionality.
"I didn't know anything about this paper until it appeared in the literature," he says.
"And they're not testing the laws of physics, they're testing the properties, the X-ray properties of galaxies and clusters of galaxies and cosmological distances from Earth. They also found that the properties of the universe in this sense are not isotropic and there's a preferred direction. And lo and behold, their direction coincides with ours."
[b]Life, the universe and everything[/b]
While still wanting to see more rigorous testing of ideas that electromagnetism may fluctuate in certain areas of the universe to give it a form of directionality, Professor Webb says if these findings continue to be confirmed, they may help explain why our universe is the way it is, and why there is life in it at all.
"For a long time, it has been thought that the laws of nature appear perfectly tuned to set the conditions for life to flourish. The strength of the electromagnetic force is one of those quantities. If it were only a few percent different to the value we measure on Earth, the chemical evolution of the universe would be completely different and life may never have got going. It raises a tantalising question: does this "Goldilocks' situation, where fundamental physical quantities like the fine structure constant are 'just right' to favour our existence, apply throughout the entire universe?"
If there is a directionality in the universe, Professor Webb argues, and if electromagnetism is shown to be very slightly different in certain regions of the cosmos, the most fundamental concepts underpinning much of modern physics will need revision.
"Our standard model of cosmology is based on an isotropic universe, one that is the same, statistically, in all directions," he says.
"That standard model itself is built upon Einstein's theory of gravity, which itself explicitly assumes constancy of the laws of Nature. If such fundamental principles turn out to be only good approximations, the doors are open to some very exciting, new ideas in physics."
Professor Webb's team believe this is the first step towards a far larger study exploring many directions in the universe, using data coming from new instruments on the world's largest telescopes. New technologies are now emerging to provide higher quality data, and new artificial intelligence analysis methods will help to automate measurements and carry them out more rapidly and with greater precision.







[b]More information:[/b] Michael R. Wilczynska et al. Four direct measurements of the fine-structure constant 13 billion years ago, Science Advances (2020). DOI: 10.1126/sciadv.aay9672

K. Migkas et al. Probing cosmic isotropy with a new X-ray galaxy cluster sample through the LX–T scaling relation, Astronomy & Astrophysics (2020). DOI: 10.1051/0004-6361/201936602
[b]Journal information:[/b] Science Advances  Astronomy & Astrophysics


.

Quote:"To truly 'think' there therefore must be sum archive that contains a base unit of thought and that would be a memory."

Assuming a time before a memory exists 
then mind/tabula rasa, as it exists,
still has a structure capable of understanding enough to form a memory.
The formation of the first memory means that thought exists in the first place.


don't gamble with improv.

Quote:[url=https://biblehub.com/nlt/revelation/2.htm]New Living Translation

They will rule the nations with an iron rod and smash them like clay pots. Reefer
 Now think about it.



AUGUST 24, 2015 FEATURE
Researchers show that an iron bar is capable of decision-making
by Lisa Zyga , Phys.org
[Image: physicalobje.jpg]In tug-of-war dynamics, an iron bar can decide which slot machine has the higher winning probability by moving to the left for each rewarded play and to the right for each non-rewarded play of Machine A. The bar’s movements are caused by physical fluctuations. Credit: Kim, et al.
(Phys.org)—Decision-making—the ability to choose one path out of several options—is generally considered a cognitive ability possessed by biological systems, but not by physical objects. Now in a new study, researchers have shown that any rigid physical (i.e., non-living) object, such as an iron bar, is capable of decision-making by gaining information from its surroundings accompanied by physical fluctuations.

The researchers, Song-Ju Kim, Masashi Aono, and Etsushi Nameda, from institutions in Japan, have published their paper on decision-making by physical objects in a recent issue of the New Journal of Physics.
"The most important implication that we wish to claim is that the proposed scheme will provide a new perspective for understanding the information-processing principles of certain lower forms of life," Kim, from the International Center for Materials Nanoarchitectonics' National Institute for Materials Science in Tsukuba, Ibaraki, Japan, told Phys.org. "These lower lifeforms exploit their underlying physics without needing any sophisticated neural systems."
As the researchers explain in their study, the only requirement for a physical object to exhibit an efficient decision-making ability is that the object must be "volume-conserving." Any rigid object, such as an iron bar, meets this requirement and therefore is subject to a volume conservation law. This means that, when exposed to fluctuations, the object may move slightly to the right or left, but its total volume is always conserved. Because this displacement resembles a tug-of-war game with a rigid object, the researchers call the method "tug-of-war (TOW) dynamics."
Here's an example of how the idea works: Say there are two slot machines A and B with different winning probabilities, and the goal is to decide which machine offers the better winning probability, and to do so as quickly as possible based on past experiences.
The researchers explain that an ordinary iron bar can make this decision. Every time the outcome of a play of machine A ends in a reward, the bar moves to the left a specific distance, and every time the outcome ends in no reward, the bar moves to the right a specific distance. The same goes for a play of machine B, but the directions of the bar movements are reversed. After enough trials, the bar's total displacement reveals which slot machine offers the better winning probability.
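The slot-machine example maps onto a two-armed bandit, so here is a minimal Python sketch of the tug-of-war bookkeeping as described in the prose (a toy reimplementation, not the authors' code): the "bar position" is a single number nudged left or right by rewards and small fluctuations, and its sign after many plays points at the better machine.

[code]
import random

def tug_of_war(p_a=0.6, p_b=0.4, plays=2000, step=1.0, noise=0.1, seed=1):
    """Toy tug-of-war (TOW) decision maker for two 'slot machines'.

    A reward on machine A pushes the bar left (negative); no reward pushes it
    right. Machine B is the mirror image. A small random term stands in for
    the physical fluctuations described in the article.
    """
    rng = random.Random(seed)
    x = 0.0                                   # bar displacement
    for _ in range(plays):
        machine = "A" if x <= 0 else "B"      # play the side the bar leans toward
        rewarded = rng.random() < (p_a if machine == "A" else p_b)
        if machine == "A":
            x += -step if rewarded else step
        else:                                 # machine B: directions reversed
            x += step if rewarded else -step
        x += rng.gauss(0.0, noise)            # stand-in for physical fluctuation
    return ("A" if x < 0 else "B"), x

choice, displacement = tug_of_war()
print(f"bar favours machine {choice} (final displacement {displacement:.1f})")
[/code]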
The researchers explain that the bar's movements occur due to physical fluctuations.
"The behavior of the physical object caused by operations in the TOW can be interpreted as a fluctuation," Kim said. "Other than this fluctuation, we added another fluctuation to our model. The important point is that fluctuations, which always exist in real physical systems, can be used to solve decision-making problems."

The researchers also showed that the TOW method implemented by physical objects can solve problems faster than other decision-making algorithms that solve similar problems. The scientists attribute the superior performance to the fact that the new method can update the probabilities on both slot machines even though it plays just one of them. This feature stems from the fact that the system knows the sum of the two reward probabilities in advance, unlike the other decision-making algorithms.
The researchers have already experimentally realized simple versions of a physical object that can make decisions using the TOW method in related work.
"The TOW is suited for physical implementations," Kim said. "In fact, we have already implemented the TOW in quantum dotssingle photons, and atomic switches."
By showing that decision-making is not limited to biological systems, the new method has potential applications in artificial intelligence.
"The proposed method will introduce a new physics-based analog computing paradigm, which will include such things as 'intelligent nanodevices' and 'intelligent information networks' based on self-detection and self-judgment," Kim said. "One example is a device that can make a directional change so as to maximize its light-absorption." This ability is similar to how a young sunflower turns in the direction of the sun.
Another possibility that the researchers recently explored is an analogue computer that harnesses natural fluctuations in order to maximize the total rewards "without paying the conventionally required computational cost."







[b]More information:[/b] Song-Ju Kim, et al. "Efficient decision-making by volume-conserving physical object." New Journal of Physics. DOI: 10.1088/1367-2630/17/8/083023
[b]Journal information:[/b] New Journal of Physics




KR eye'll leave you to dual  Sheep dual with the recall.

Magic angle(s).
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#28
see the light 
Sarte` Sheep  itz art eh?

more wine Moire
APRIL 27, 2020 FEATURE
Photonic metasurfaces provide a new playground for twistronics
by Anna Demming , Phys.org
[Image: photonicmeta.jpg]Moire lattices of photonic metasurfaces made of graphene nanoribbons also have a twist in the tale. Credit: ACS Nano Letters
Quantum optics, spintronics and diffraction-free imaging with low loss are among the technologies that may benefit from recently predicted effects in twisted bilayer photonic structures. The work takes inspiration from a burgeoning field of condensed matter research—"twistronics," in which electronic behaviour can be dramatically altered by controlling the twist between layers of 2-D materials.

When Pablo Jarillo-Herrero and his group announced observations of electronic properties tuned between superconducting and Mott insulating states, there was excitement not just among those researchers working closely with graphene and 2-D materials but in many other fields as well. Naturally, not all research communities expected to find associated phenomena in the systems they studied.
"There was no reason to think this would happen in photonics—the effects stem from correlated electrons and we instead work with photons," explains Andrea Alù, Einstein Professor at City University of New York (CUNY). Nonetheless in a recent Nano Letters paper, he and colleagues at CUNY, the National University of Singapore, and the University of Texas at Austin have reported theoretical predictions of photonic behaviour changes with twisting that is in many ways analogous to the changes in electronic behaviour first observed in bilayer graphene.
[b]Flatbands[/b]
As you twist one periodic grid with respect to another on top, new "Moiré" patterns emerge that can make your eyes feel dizzy. Similarly, twisting one layer of honeycomb-shaped graphene atomic lattice with respect to another produces a Moiré superlattice with twist-dependent properties. The periodic potential fields change with dramatic effects on how electrons move, which affects how the energy levels or bands available change with the electron's momentum. At a "magic angle" of 1.1°—excruciatingly awkward to achieve in experiments—the slope flattens entirely, a stark contrast to the steep change in energy with momentum found in single layer graphene. It was on hearing about these "flat bands" that Alù's ears pricked up because they had noticed photonic flat bands in the metasurface systems they were studying.
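For a sense of scale in the "magic angle" discussion: two identical lattices twisted by a small angle θ produce a Moiré pattern with period roughly a / (2 sin(θ/2)). With graphene's lattice constant of about 0.246 nm, a 1.1° twist gives a superlattice of roughly 13 nm, dozens of times the atomic spacing. A quick check (the 45° figure later in the article refers to the much larger nanoribbon period, so it is not directly comparable):

[code]
import math

def moire_period_nm(a_nm, twist_deg):
    """Period of the Moire superlattice for two identical lattices
    rotated by twist_deg degrees: a / (2 * sin(theta / 2))."""
    theta = math.radians(twist_deg)
    return a_nm / (2.0 * math.sin(theta / 2.0))

a_graphene = 0.246   # graphene lattice constant, nm
for angle in (0.5, 1.1, 5.0):
    print(f"twist {angle:4.1f} deg -> Moire period {moire_period_nm(a_graphene, angle):6.2f} nm")
[/code]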
In metamaterials, the material's composition and structure can give it optical properties that would not be found in nature, such as negative refractive indices or an extremely asymmetric "hyperbolic" optical response. In general, light emanating from a point source ripples outward in rings like waves from a pebble dropped in a pond. But in a metamaterial engineered so that the optical response in one direction is different to the perpendicular direction, the rings become elliptical.

Take that asymmetry to an extreme, and the waves no longer form closed rings at all, but take off along a hyperbola like a rocket at escape velocity. The effect can be tantalising in metamaterials, which tend to be very lossy, so little light gets very far anyway. Metasurfaces, however, give the same effect, but at the surface, where you can really start to exploit the enhanced light-matter interactions from these hyperbolic optical responses.
Cutting graphene into long strips also affects how it behaves, and in 2015, Alù and his group showed that graphene nanoribbons could behave as a kind of metasurface. Light shone on a graphene nanoribbon sends large numbers of electrons oscillating in unison in response to the incident electromagnetic field—"a plasmon." More interesting still, in a periodic grill of graphene nanoribbons these plasmons are hyperbolic.
"The reason why the flatband in twisted bilayer graphene resonated with us is if you take a graphene nanoribbon surface, there is a broad range of frequencies that give a hyperbolic propagation but at a point it becomes elliptical—there is a flat band for light," says Alù.
The photonic flatband means that the light travels without diffraction and light matter interactions are maximized. The catch is that the material is also at resonance at this point, meaning its loss is at a maximum. Hearing about the flatband in twisted bilayer graphene Alù and colleagues wondered whether stacking two graphene nanoribbon metasurfaces might provide some twist control over these photonic flatbands.
[b]Twisted photonics[/b]
Alù and his colleagues studied the Green's function of the bilayer graphene nanoribbon grills to evaluate the optical behaviour. They found that the two layers couple, giving one plasmon mode with two energies for the whole bilayer system. In addition, the frequency of the flatband shifts so that maximum light matter interactions are possible when the material is not at resonance. Finally, the transitions for their systems occur around 45° - much larger and more experimentally accessible than the magic angle in graphene bilayer systems, reflecting the larger periodicity of the nanoribbon grill. Since the angle is frequency dependent it is possible to sweep through frequencies to find the exact sweet point of the system.
In fact "canalization"—the diffraction-free propagation of light that occurs at the flatband point—has already been observed in a beam sent through two optical lattices of light at specific twist angles. The metasurfaces described by Alù and colleagues provide a further photonics system for exploring twist effects that may be easier to produce than magic angle bilayer graphene, as well as highlighting some new physics. "To me, the most exciting part is the beauty of how you can predict this from purely geometric formulae," says Alù.
In addition, the photonic flatband effects may prove useful for applications—quantum optics and imaging in particular. "People are often asking—how do we enhance the interaction of confined light emitters with matter, and how do we route the enhanced emission without diffraction?" says Alù. "This is an ideal platform—it's broadband and you can tune the frequency."







[b]More information:[/b] Guangwei Hu et al. Moiré Hyperbolic Metasurfaces. Nano Letters (2020) Accepted manuscript https://pubs.acs.org/doi/10.10 … acs.nanolett.9b05319
[b]Journal information:[/b] Nano Letters
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#29
...
Couple of posts back:


Quote:Not only does a universal constant seem annoyingly inconstant,
at the outer Uhoh fringes of the cosmos, 
it occurs in only ---> one direction ---> Whip
which is downright weird.



They never say how inconsistent the fine structure constant {FSC} is,
in a way that I can understand from how they write it.

What are the fluctuation -- percentages?
Is it 5%  ... 1% ... or is it 0.0000001 % ... ?
How much does it change over the:
"one direction",
to the:
"outer fringes of the universe" ?
Nonetheless they mention large areas where the FSC is consistently constant.

This is what they came up with ...


Quote:The weighted mean electromagnetic force,
in this location in the universe,
deviates from the terrestrial value by
Δα/α = (α_z − α_0)/α_0 = (−2.18 ± 7.27) × 10^−5,
consistent with no temporal change.

https://advances.sciencemag.org/content/6/17/eaay9672



I am not certain,
but they might be talking about the last three decimal places of the FSC fluctuating.
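To put the quoted figure into the percentage terms asked about: Δα/α ≈ (−2.18 ± 7.27) × 10^−5 is roughly a 0.002 % shift with an uncertainty of about 0.007 %, far below the 1 % or 5 % scale. A quick conversion, using only the numbers as quoted:

[code]
# Convert the quoted deviation of the fine structure constant into percentages.
delta_alpha_over_alpha = -2.18e-5   # central value, as quoted
uncertainty = 7.27e-5               # quoted uncertainty

print(f"central deviation   : {delta_alpha_over_alpha * 100:+.5f} %")
print(f"uncertainty         : {uncertainty * 100:.5f} %")
print(f"consistent with zero: {abs(delta_alpha_over_alpha) < uncertainty}")
[/code]

In absolute terms that is a change around the seventh decimal place of α ≈ 0.0072974, so reading it as the last few decimal places is about right.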


They use AI -- algo - rithms:  
{if you write the word algo - rithms correctly,
Keith has a program that reinserts the word as  "doink head" -- in reference to Don Kru pp,
an old member who was a doink head}


Quote:Observations and artificial intelligence algo - rithm
Voigt profile models,
for each of the four absorption systems were automatically constructed,
using a genetic algorithm, GVPFIT,
which requires no human decision-making beyond initial setup parameters



Then they put out the classic statement,
we see about every six months or so from the physicists:


Quote:... the most fundamental concepts Hmm2
underpinning much of modern physics,
will need revision.



And when I see this, I don't get a lot of confidence,
other than that they confirmed a previous study on the "directional" aspect ...


Quote:The final column combines the thermal and turbulent values,
using the method of moments,
and also includes, 
an estimated systematic error component Whip
 associated with possible  Naughty long-range wavelength distortions


Very interesting to read about though!
Always interested in FSC numbers and dynamics.
I produced a series of equations with the FSC ... in harmonic convergence number dynamics.

...
Reply
#30
(Correct spelling is Sartre')

Mind is structured to accept information.
Whether it supports life or not then it is remembered as such.
Thinking about quantum reality is difficult because it's counterintuitive.

That stuff about the Z' particle mediating between matter and antimatter......  Damned
Reply
#31
Quote:(Correct spelling is Sartre')
see the light (reversed)

It is Sartre` [Image: sheep.gif]  itz art tre?


Quote:more wine Moire


.
English translation of 'tre'
tre
[tre]
INVARIABLE ADJECTIVE
three
tre volte three times




Quote:Then they put out the classic statement,
we see about every six months or so from the physicists:


Quote: Wrote:... the most fundamental concepts [Image: hmm2.gif]
underpinning much of modern physics,
will need revision.



And when I see this, I don't get a lot of confidence,
other than that they confirmed a previous study on the "directional" aspect ...

[b]Debate in expert circles[/b]

Scientists are currently debating whether the discrepancy between the data sets is actually an indication that the Standard Model of Cosmology is wrong or not.
Hi
Triality is on trial and error.

Is a new medium the standard bearer?

APRIL 28, 2020
The mass of the universe
[Image: 5ea835dcb912e.jpg] It is Sartre` ... High-mass objects in the universe are not perfect lenses. As they deflect light, they create distortions. The resulting images appear like looking through the foot of a wine glass. more wine Moire    Credit: Roberto Schirdewahn
Bochum cosmologists headed by Professor Hendrik Hildebrandt have gained new insights into the density and structure of matter in the universe. Several years ago, Hildebrandt had already been involved in a research consortium that had pointed out discrepancies in the data between different groups. The values determined for matter density and structure differed depending on the measurement method. A new analysis, which included additional infrared data, made the differences stand out even more. They could indicate a flaw in the Standard Model of Cosmology.

Rubin, the science magazine of Ruhr-Universität Bochum, has published a report on Hendrik Hildebrandt's research. The latest analysis of the research consortium, called Kilo-Degree Survey, was published in the journal Astronomy and Astrophysics in January 2020.
[b]Two methods for determining the structure of matter[/b]
Research teams can calculate the density and structure of matter based on the cosmic microwave background, a radiation that was emitted shortly after the Big Bang and can still be measured today. This is the method used by the Planck Research Consortium.
The Kilo-Degree Survey team, as well as several other groups, determined the density and structure of matter using the gravitational lensing effect: as high-mass objects deflect light from galaxies, these galaxies appear in a distorted form in a different location than they actually are when viewed from Earth. Based on these distortions, cosmologists can deduce the mass of the deflecting objects and thus the total mass of the universe. In order to do so, however, they need to know the distances between the light source, the deflecting object and the observer, among other things. The researchers determine these distances with the help of redshift, which means that the light of distant galaxies arrives on Earth shifted into the red range.
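The redshift step mentioned above reduces, for nearby sources, to z = (λ_observed − λ_emitted) / λ_emitted together with the Hubble-Lemaître law v ≈ cz ≈ H0·d. The sketch below is only that low-redshift shortcut with a hypothetical observed wavelength; surveys like the one described infer photometric redshifts from brightness in several bands rather than from a single line.

[code]
# Low-redshift shortcut: redshift from a shifted spectral line, then the
# Hubble-Lemaitre law v = H0 * d. The observed wavelength below is hypothetical.
C_KM_S = 299792.458        # speed of light, km/s
H0 = 70.0                  # Hubble constant, (km/s)/Mpc, a representative value

lambda_emitted = 656.28    # H-alpha rest wavelength, nm
lambda_observed = 670.0    # hypothetical observed wavelength, nm

z = (lambda_observed - lambda_emitted) / lambda_emitted
velocity_km_s = C_KM_S * z            # valid only for small z
distance_mpc = velocity_km_s / H0

print(f"redshift z         = {z:.4f}")
print(f"recession velocity ~ {velocity_km_s:.0f} km/s")
print(f"distance           ~ {distance_mpc:.0f} Mpc")
[/code]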
[b]New calibration using infrared data[/b]
To determine distances, cosmologists therefore take images of galaxies at different wavelengths, for example one in the blue, one in the green and one in the red range; they then determine the brightness of the galaxies in the individual images. Hendrik Hildebrandt and his team also include several images from the infrared range in order to determine the distance more precisely.
Previous analyses had already shown that the microwave background data from the Planck Consortium systematically deviate from the gravitational lensing effect data. Depending on the data set, the deviation was more or less pronounced; it was most pronounced in the Kilo-Degree Survey. "Our data set is the only one based on the gravitational lensing effect and calibrated with additional infrared data," says Hendrik Hildebrandt, Heisenberg professor and head of the RUB research group Observational Cosmology in Bochum. "This could be the reason for the greater deviation from the Planck data."
To verify this discrepancy, the group evaluated the data set of another research consortium, the Dark Energy Survey, using a similar calibration. As a result, these values also deviated even more strongly from the Planck values.
[b]Debate in expert circles[/b]
Scientists are currently debating whether the discrepancy between the data sets is actually an indication that the Standard Model of Cosmology is wrong or not. The Kilo-Degree Survey team is already working on a new analysis of a more comprehensive data set that could provide further insights. It is expected to provide even more precise data on matter density and structure in spring 2020.







[b]More information:[/b] H. Hildebrandt et al. KiDS+VIKING-450: Cosmic shear tomography with optical and infrared data, Astronomy & Astrophysics (2019). DOI: 10.1051/0004-6361/201834878
[b]Journal information:[/b] Astronomy & Astrophysics

Provided by Ruhr-Universitaet-Bochum



https://phys.org/news/2020-04-weight-universe.html


Arrow Improv Manifests.

more wine Moire

(04-19-2020, 12:14 PM)Kalter Rauch Wrote: Think about thinking.
How difficult it is.

Arrow Sarte` says existence precedes essence,
which thought endows with meaning,
thus preserving its primacy over nature.

I think EYE SPELT said that correctly...
[quote pid='245841' dateline='1587309270']

 ...Sarte'  Hmm2  Sartre'...

[/quote]
[quote pid='245841' dateline='1587309270']

Read this part of his novel, Nausea

https://www.google.com/books/edition/Nau...frontcover

Damned
[/quote]

It is as it is as itza was as it wasn't...your words not mine.

I spelled Quote:

more wine Moire

[Image: 5ea835dcb912e.jpg]
Full circle.

KR... what if a full circle got twisted by another spelling?

would that twisted full circle present as this?
[Image: valknut_1.jpg?dateline=1436854500]#22

Sunday, April 19th, 2020, 03:14 pm

Happy 70th orbit around the sun KR.
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply
#32
Danse-du-ventre (belly dance)
Reply
#33
Quote:Quote:
"To truly 'think' there therefore must be sum archive that contains a base unit of thought and that would be a memory."

Assuming a time before a memory exists 
then mind/tabula rasa, as it exists,
still has a structure capable of understanding enough to form a memory.
The formation of the first memory means that thought exists in the first place.


Recall:

FEBRUARY 4, 2020
Astronomers search for gravitational-wave  Hi memory
[Image: leavingthefa.jpg] Itz Art Tre' Artist’s depiction of a black hole. Credit: James Josephides, Swinburne University of Technology
Astronomers regularly observe gravitational waves (GW)—ripples in space and time—that are caused by pairs of black holes merging into one. Einstein's theory of gravity predicts that GW, which squeeze and stretch space as they pass, will permanently distort space, leaving a "memory" of the wave behind. However, this memory effect has not yet been detected, as it would be extremely small, leaving only the faintest traces.

https://phys.org/news/2020-02-astronomer...emory.html


instead of self-annihilating in an energetic flash, what if matter and antimatter conjoined and 'winked out' of ordinary existence?
And became Extra-Ordinary literally and in actuality of a Triality???

Extra-ordinary claims require extra-disordered-hyper-uniform evidence.


MAY 1, 2020
Looking for dark matter with the Universe's coldest material
by ICFO
[Image: darkmatter.jpg]Credit: CC0 Public Domain
Scientists have been able to observe the universe and determine that about 80% of its mass appears to be "dark matter," which exerts a gravitational pull but does not interact with light, and thus can't be seen with telescopes. Our current understanding of cosmology and nuclear physics suggests that dark matter could be made of axions, hypothetical particles with unusual symmetry properties.

In a new article published in Physical Review Letters and highlighted as an Editor's suggestion, ICFO researchers Pau Gomez, Ferran Martin, Chiara Mazzinghi, Daniel Benedicto Orenes, and Silvana Palacios, led by ICREA Prof. at ICFO Morgan W. Mitchell, report on how to search for axions using the unique properties of Bose-Einstein condensates (BECs).
The axion, if it exists, would imply "exotic spin-dependent forces." Magnetism, the best-known spin-dependent force, causes electrons to point their spins along the magnetic field, like a compass needle that points north. Magnetism is carried by virtual photons, whereas "exotic" spin-dependent forces would be carried by virtual axions (or axion-like particles). These forces would act on both electrons and nuclei, and would be produced not just by magnets, but also by ordinary matter. To know if axions do exist, a good way is to look and see if nuclei prefer to point toward other matter.
Several experiments are already searching for these forces, using "comagnetometers," which are paired magnetic sensors in the same place. By comparing the two sensors' signals, the effect of the ordinary magnetic field can be cancelled out, leaving just the effect of the new force. So far, comagnetometers have only been able to look for spin-dependent forces that reach about a meter or more. To look for short-range spin-dependent forces, a smaller comagnetometer is needed.
Bose Einstein Condensates (BECs) are gases cooled nearly to absolute zero. Because BECs are superfluid, their constituent atoms are free to rotate for several seconds without any friction, making them exceptionally sensitive to both magnetic fields and new exotic forces. A BEC is also very small, about 10 micrometers in size. To make a BEC comagnetometer, however, requires solving a tricky problem: how to put two BEC magnetometers in the same small volume.
In their study, Gomez and his colleagues report that they were able to solve this problem by using two different internal states of the same 87Rb BEC, each one acting as a separate but co-located magnetometer. The results of the experiment confirm the predicted high immunity to noise from the ordinary magnetic field and the ability to look for exotic forces with much shorter ranges than in previous experiments. Besides looking for axions, the technique may also improve precision measurements of ultracold collision physics and studies of quantum correlations in BECs.
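The comagnetometer idea comes down to simple signal algebra: two co-located sensors see the same ordinary magnetic field but respond differently to an exotic spin-dependent force, so a suitable difference of their readings cancels the field and keeps the exotic term. Below is a toy illustration with made-up couplings and noise, purely to show the cancellation and not ICFO's analysis:

[code]
import random

# Two co-located "sensors" with identical response to the ordinary magnetic
# field B but different (made-up) couplings g1, g2 to a hypothetical exotic
# force F. Differencing the signals cancels B and leaves an estimate of F.
random.seed(0)
g1, g2 = 1.0, 0.3
F_true = 2.0e-3            # hypothetical exotic signal to recover
noise = 1.0e-4             # per-shot sensor noise

estimates = []
for _ in range(1000):
    B = random.gauss(0.0, 5.0)                    # large, fluctuating common field
    s1 = B + g1 * F_true + random.gauss(0.0, noise)
    s2 = B + g2 * F_true + random.gauss(0.0, noise)
    estimates.append((s1 - s2) / (g1 - g2))       # B drops out of the difference

f_est = sum(estimates) / len(estimates)
print(f"recovered exotic signal: {f_est:.6f} (true value {F_true})")
[/code]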







[b]More information:[/b] Pau Gomez et al. Bose-Einstein Condensate Comagnetometer, Physical Review Letters (2020). DOI: 10.1103/PhysRevLett.124.170401
[b]Journal information:[/b] Physical Review Letters

Provided by 
ICFO


(03-08-2020, 12:13 AM)EA Wrote: What if matter and anti-matter don't always  immediately self-destruct but instead they immediately mediate?
To differentiate the two.
Mind your matters.
Find shorn tatters...
The fabricated fabric of duality as causality of reality.

Triality?

Can matter and anti-matter recombine as trine?
A hybrid state of physicality.
Instantly ancient.
This idea anew.
The big-bang/yin-yan dark opposed to light.
Never knew this clever clue was a third insight.



I just thought this subject up after a beer and a toke on a lark of a joke.
Eye Wander/I Wonder what the forum thinks would happen if matter and antimatter could co-exist?  


A thought experiment.



MAY 1, 2020
Eye Wander/Seeing is conceiving/I Wonder
by Kim Krieger, University of Connecticut
[Image: seeingisconc.jpg]During one part of the experiment, participants looked at a group of complex nonsense shapes (second image down), and then were asked whether a specific shape (bottom) had been part of the nonsense group—all while they listened to words. Credit: University of Connecticut


Conceiving of a vision may be akin to seeing it, UConn researchers report in a new paper published in Psychological Science. Their findings add support to a major theory of how our brains remember and consider ideas.

Philosophers, psychologists, linguists and other neuroscientists have long wondered how exactly we conceive of things we have experienced before. For sensed experiences—seeing a sunset, hearing a violin, tasting a brownie—brain researchers suspect that thinking about something is a little bit like experiencing it. To think about a sunset, for example, part of the visual processing area of the brain lights up. But other researchers say no, the act of considering something depends on other parts of the brain. The visual area lighting up is just a side effect, according to this hypothesis.
Distinguishing between cause and effect in brain regions can be difficult. Across a series of studies, UConn cognitive scientist Eiling Yee and graduate student Charles Davis, along with other colleagues, decided to see if they could show that thinking about a visually-experienced thing requires, or at least shares resources with, the visual system in the brain. If conceiving of a visually experienced thing requires help from the brain's visual system, they thought, then busying the visual system with another task should make it harder to remember visual things.
"Thinking about a sunset with your eyes closed is different than thinking about a sunset when you're scanning a crowded refrigerator," looking for something to eat, Yee says. But is it because you need to use the visual parts of your brain both to consider a sunset and find that food item? There's less distraction with your eyes closed. Is it easier to think about the sunset because of that?
Davis, Yee, and colleagues from UConn and the Basque Center on Cognition, Brain and Language designed an experiment to find out. They had a group of undergraduates consider and remember a set of nonsense, randomly shaped blobs. While the shapes were still in front of the students, the researchers played a word; to make sure the students paid attention, they had to judge whether the word was an animal. Some of the words on the list were primarily experienced visually, for example "sunset." Other words were not as visual, such as "volume."
After the word test, the original shapes were removed and a single shape was shown. The student was then asked whether this shape was in the original set (see the picture for an example set of the shapes). The word judgement/shape memory task was then repeated for 240 words. Another group of undergrads heard these same words and made the same judgment on them, but instead of a visual task, they performed a manual task—a series of hand motions on a table.
The control group for the experiment was read the exact same group of words, but did not have to consider and remember shapes, nor do the manual knob-twisting.

[Image: 1-seeingisconc.jpg]
Some participants in the experiment were asked to move their hands (as shown) while they listened to words. Credit: University of Connecticut
Not surprisingly, both tasks interfered with the students' word judgments. After all, doing two things at once is hard. But what was so interesting to the researchers was that the visual shape task interfered with the visual words (e.g., "sunset") more than the non-visual words ("volume"). And the manual task interfered more with the non-visual words.
"What was so cool is that the two different interference tasks showed that whether a word was more affected by the visual or manual task depended on whether the thing that the word referred to was usually experienced more visually or more manually," Yee says.
Specifically, under visual interference, people had more difficulty with visually experienced words than with manually experienced words (e.g., volume), but the opposite was true under manual interference.
But wait—isn't volume a visual word? It has to do with the expanse of a space, right?
Not to psychology undergraduates, according to Davis and Yee, who had the words rated for visual experience by another group of UConn psychology students who did not participate in the experiment. Psychology undergraduates seemed to associate volume more with manual action (e.g., turning a volume knob, or pressing a button to silence their phone.) But that's the fascinating thing about this experiment, Yee says.
"It's all about your experience," and how your brain is used to considering concepts. "For physics majors, volume might very well equal space," she says.
Yee and Davis hope eventually to be able to explain exactly how our shared and distinct experiences come to form our knowledge about the world. They want to pursue this research further in the future by looking more explicitly at the different parts of the brain's perceptual system. Eventually, they hope to be able to explain which parts of the perceptual system are necessary for conceiving of things. For now, they're pointing out that their findings have a surprising implication: when you're looking for something, having to scan through unrelated things could actually interfere with your ability to think about the very thing you are searching for.







[b]More information:[/b] Charles P. Davis et al. Making It Harder to "See" Meaning: The More You See Something, the More Its Conceptual Representation Is Susceptible to Visual Interference, Psychological Science (2020). DOI: 10.1177/0956797620910748
[b]Journal information:[/b] Psychological Science 

Provided by University of Connecticut


Quote:the road to damn mask us!  Arrow Ninja

itz Saul rite right write here...

This mechanism explains, at least in part, why your pupils avoid constricting until bright light intensifies."
APRIL 30, 2020

Eyes send an unexpected signal to the brain

[Image: eyessendanun.jpg]Retinal section from a mouse where cell nuclei are labeled in blue, inhibitory cells are labeled with magenta, and ipRGCs are labeled in green. Credit: Northwestern University
The eyes have a surprise.

For decades, biology textbooks have stated that eyes communicate with the brain exclusively through one type of signaling pathway. But a new discovery shows that some retinal neurons take a road less traveled.
New research, led by Northwestern University, has found that a subset of retinal neurons sends inhibitory signals to the brain. Before, researchers believed the eye only sends excitatory signals. (Simply put: Excitatory signaling makes neurons fire more; inhibitory signaling makes neurons fire less.)
The Northwestern researchers also found that this subset of retinal neurons is involved in subconscious behaviors, such as synchronization of circadian rhythms to light/dark cycles and pupil constriction to intense bright lights. By better understanding how these neurons function, researchers can explore new pathways by which light influences our behavior.
"These inhibitory signals prevent our circadian clock from resetting to dim light and prevent pupil constriction in low light, both of which are adaptive for proper vision and daily function," said Northwestern's Tiffany Schmidt, who led the research. "We think that our results provide a mechanism for understanding why our eye is so exquisitely sensitive to light, but our subconscious behaviors are comparatively insensitive to light."

[Image: 1-eyessendanun.jpg]
Image from a mouse retinal section where cell nuclei are labeled in blue, RNA for the GABA synthesis enzyme Gad2 is labeled in magenta, and RNA for melanopsin is labeled in green. Credit: Northwestern University
The research will be published in the May 1 issue of the journal Science.
Schmidt is an assistant professor of neurobiology at Northwestern's Weinberg College of Arts and Sciences. Takuma Sonoda, a former Ph.D. student in the Northwestern University Interdepartmental Neuroscience program, is the paper's first author.
To conduct the study, Schmidt and her team blocked the retinal neurons responsible for inhibitory signaling in a mouse model. When this signal was blocked, dim light was more effective at shifting the mice's circadian rhythms.
"This suggests that there is a signal from the eye that actively inhibits circadian rhythms realignment when environmental light changes, which was unexpected," Schmidt said. "This makes some sense, however, because you do not want to adjust your body's entire clock for minor perturbations in the environmental light/dark cycle, you only want this massive adjustment to take place if the change in lighting is robust."
Schmidt's team also found that, when the inhibitory signals from the eye were blocked, mice's pupils were much more sensitive to light.
"Our working hypothesis is that this mechanism keeps pupils from constricting in very low light," Sonoda said. "This increases the amount of light hitting your retina, and makes it easier to see in low light conditions. This mechanism explains, in least part, why your pupils avoid constricting until bright light intensifies."
[Image: Saul-on-the-road-to-Damascus.jpg]






[b]More information:[/b] T. Sonoda el al., "A noncanonical inhibitory circuit dampens behavioral sensitivity to light," Science (2020). science.sciencemag.org/cgi/doi … 1126/science.abb7529
[b]Journal information:[/b] Science 

Provided by Northwestern University
Along the vines of the Vineyard.
With a forked tongue the snake singsss...
Reply

