And a Happy New Year
(via LettersOfNote)
The science behind Santa
How does Santa visit billions of homes all around the globe in just one night? Is this just a load of hogwash that your parents tell you so you'll eat your overcooked vegetables and go to bed early without making a fuss? You can find the answers in the following video via Grrlscientist (thanks to mabelmoments on tumblr):
A Merry Christmas to all readers
Higgs day
I've just written on my Italian blog that:
There will be no dramatic announcement, but only new and more stringent limits on the Higgs mass(1) and the conclusion of today's Higgs event confirms that impression. Indeed Fabiola Gianotti and Guido Tonelli, respectively spokespersons of ATLAS and CMS, presented the new limits on the Higgs mass during their CERN seminars, and in the combination of the data presented in the official press release (the combination made by the two experiments will arrive only after the publication of the papers) we can read the new limits: from 124 to 126 GeV.
After the two seminars I discussed via e-mail with Salvatore Fazio, who sent me the following two plots about the preliminary results from ATLAS and CMS, and the superposition of the two previous plots: He also comments:
In search of violations
This week high-energy physics published some interesting results for our fundamental knowledge of the universe. On 14 November, LHCb published the preliminary analysis of a possible CP violation in charm decays (see also the CERN Bulletin and Mat Charles' presentation).
Research on CP violation is very important because in this way we can probe the differences between matter and antimatter. If a physical law is CP-invariant, the behaviour of matter and antimatter is the same. But our universe is made of matter, and we don't know why that is so. The answer could lie in CP-violation studies, like the preliminary data analyzed by the LHCb team. The main goal of the experiment is the study of the properties of the b quark, but it can also measure the properties of the c quark. And studying the preliminary data on c decays, the team found a hint of CP violation in an unexpected channel. According to Tommaso Dorigo and Marco Delmastro (English translation by Google), if the result is confirmed by further analysis, this could be the first sign of physics beyond the Standard Model. The other possible violation concerns the wall of the speed of light: indeed, the OPERA experiment has confirmed its previous data. Yesterday, in the updated version of their famous preprint, OPERA's researchers described a new series of measurements made with CNGS using a short-bunch, wide-spacing beam.
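To make the idea of such a measurement concrete, here is the standard textbook definition of a CP asymmetry in a charm decay to a final state $f$ (my addition, not the exact observable used by LHCb, which if I recall correctly was a difference of such asymmetries between the $K^+K^-$ and $\pi^+\pi^-$ channels):
\[ A_{CP}(f) = \frac{\Gamma(D^0 \to f) - \Gamma(\bar{D}^0 \to f)}{\Gamma(D^0 \to f) + \Gamma(\bar{D}^0 \to f)} \]
A non-zero value means that matter and antimatter decays behave differently in that channel.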
The birth of the web
In Italy we are celebrating the 20th anniversary of the web, which was born at CERN at the end of the 1980s. On 13 March twenty years ago, Tim Berners-Lee (list of publications), with Robert Cailliau and others, proposed a project to construct a scientific world network: the world-wide web.
Berners-Lee is a computer scientist, in particular an expert in text editors, real-time software and communications. He is considered one of the most influential scientists alive today; he started to work on hypertext systems in 1980, developing Enquire, and in 1989 he started the World-Wide Web initiative. Robert Cailliau joined it in 1990, and the following year Jean-François Groff and Bernd Pollermann arrived too. The team laid the basis of the modern web with three papers: World-Wide Web: The information universe(6), World-Wide Web: An information infrastructure for high-energy physics(7), World-Wide Web(8).
In the first paper, published in a refereed journal, the researchers begin the section The Dream with the following words:
Pick up your pen, mouse, or favorite pointing device and press it on a reference in this document, perhaps to the author's name, or organization, or some related work. Suppose you are then directly presented with the background material: other papers, the author's coordinates, the organization's address, and its entire telephone directory. Suppose each of these documents has the same property of being linked to other original documents all over the world. You would have at your fingertips all you need to know about electronic publishing, high-energy physics, or, for that matter, Asian culture. If you are reading this article on paper, you can only dream, but read on.
The construction of the dream of Berners-Lee's group started in 1945 with the historical paper of Vannevar Bush, As we may think(1), where the American inventor laid the basis for a scientists' network built on hypertext. The way towards this system had another founding father, Douglas Engelbart(2). He too is an American inventor, of Scandinavian origin, a pioneer in the development of graphical user interfaces. And after Bush and Engelbart we arrive at Berners-Lee, Cailliau and colleagues.
The structure developed by the Berners-Lee team is outlined in the following diagram: From the first definition of the W3 model I would extract the following two points:
- Indexes are documents, and so may themselves be found by searches and/or following links. An index is represented to the user by a "cover page" that describes the data indexed and the properties of the search engine.
- The documents in the web do not have to exist as files; they can be "virtual" documents generated by a server in response to a query or document name. They can therefore represent views of databases, or snapshots of changing data (such as the weather forecasts, financial information, etc.).
- a common naming scheme for documents
- common network access protocols
- common data formats for hypertext
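A tiny illustration of the three ingredients listed above (my own sketch, not code from the 1992 papers, and assuming the restored first CERN page is still served over plain HTTP): the URL is the common name, HTTP the common access protocol, and HTML the common hypertext format.

```python
# Fetch a document by its name (URL) over a common protocol (HTTP)
# and look at its common hypertext format (HTML).
from urllib.request import urlopen

url = "http://info.cern.ch/hypertext/WWW/TheProject.html"  # naming scheme
with urlopen(url) as response:                              # access protocol
    print(response.headers.get("Content-Type"))             # data format
    page = response.read().decode("utf-8", errors="replace")

print(page[:300])  # the hypertext itself, full of <A HREF="..."> links
```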
0.0909090909: Schroedinger's cat
According to Uri Geller (via Peter Woit), today a portal to another universe could open!
Geller supports his original idea with a mix of string theory and numerology. On the other hand, this mixture of science and superstition is perfect for fiction, for example The Invisibles, the cult comic written by Grant Morrison at the end of the last century. Behind its entertainment purposes, the comic book series hides a political one: to tell how the world, and mass media in particular, are used to control people. But I won't write about that subject here; I'll write about the scientific background instead, starting from the many-worlds hypothesis(1).
In The Invisibles there is, indeed, a war against aliens from another universe. The scientific basis of the comic is string theory: for example, the symbol of the early Christians is interpreted as part of the infinity symbol, which also represents the connection between the two universes, like two universal strings or two membranes (branes). But the story of the scientific multiverse begins with the famous Schroedinger's cat: the thought experiment was proposed by Schroedinger in 1935 in order to probe the limits of quantum mechanics and of the Copenhagen interpretation, but Erwin could not know that his paradox would generate a lot of intriguing theoretical science. One consequence is the many-worlds interpretation, but another research line is the construction of a real Schroedinger's cat!
A first successful attempt was made by Jonathan Friedman and colleagues(2) (read physicsworld). The team built a superconducting quantum interference device (SQUID):
The simplest SQUID is a superconducting loop of inductance $L$ broken by a Josephson tunnel junction with capacitance $C$ and critical current $I_c$. In equilibrium, a dissipationless supercurrent can flow around this loop, driven by the difference between the flux that threads the loop and the external flux $\phi_x$ applied to the loop. They used two junctions in their experimental setup, and so they realized a superposition between two different states:
Such a superposition would manifest itself in an anticrossing, where the energy-level diagram of two levels of different fluxoid states (labelled $| 0 >$ and $| 1 >$) is shown in the neighbourhood in which they would become degenerate without coherent interaction (dashed lines). Coherent tunnelling lifts the degeneracy (solid lines) so that at the degeneracy point the energy eigenstates are \[\frac{1}{\sqrt{2}} \left ( | 0 > + | 1 > \right )\] and \[\frac{1}{\sqrt{2}} \left ( | 0 > - | 1 > \right ) \, ,\] the symmetric and anti-symmetric superpositions. The energy difference $E$ between the two states is given approximately by $E = \sqrt{\epsilon^2 + \Delta^2}$, where $\Delta$ is known as the tunnel splitting. In order to prove the existence of the splitting, a necessary condition is that:
(...) the experimental linewidth of the states be smaller than $\Delta$(3). The SQUID is extremely sensitive to external noise and dissipation (including that due to the measurement), both of which broaden the linewidth. Thus, the experimental challenges to observing coherent tunnelling are severe. The measurement apparatus must be weakly coupled to the system to preserve coherence, while the signal strength must be sufficiently large to resolve the closely spaced levels. In addition, the system must be well shielded from external noise. These challenges have frustrated previous attempts(5, 6) to observe coherence in SQUIDs. But the observation presents some difficulties, like the SQUID's sensitivity to noise, which must be shielded, and to dissipation; the device must also preserve coherence,
(...) while the signal strength must be sufficiently large to resolve the closely spaced levels. All of these problems affected previous attempts(4, 5), but an answer came from Friedman's team: the plot shows the probability of a transition as a function of the flux $\phi_x$. The curves, plotted for different potentials, are shifted upwards in order to clarify their shapes. The quantum behaviour of the macroscopic system is revealed by the existence of two peaks which, as the potential decreases, separate from each other and reach the same amplitude. The two peaks correspond to two distinct macroscopic fluxes, and we can conclude that they realize a macroscopic Schroedinger's cat. A more recent attempt was made this year (via tumblr) by a team of Chinese researchers led by Xing-Can Yao(6), using the experimental setup described in their paper.
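As a side note, here is a minimal numerical sketch (my own, not taken from the Friedman et al. paper) of the anticrossing quoted above: diagonalizing the standard two-level Hamiltonian reproduces the splitting $\sqrt{\epsilon^2 + \Delta^2}$ between the symmetric and antisymmetric states.

```python
# Two-level (flux qubit) toy model: H = [[ eps/2, -Delta/2], [-Delta/2, -eps/2]].
# The gap between its eigenvalues should equal sqrt(eps**2 + Delta**2),
# reducing to Delta at the degeneracy point eps = 0.
import numpy as np

def splitting(eps, delta):
    """Energy gap of the two-level Hamiltonian."""
    H = 0.5 * np.array([[eps, -delta],
                        [-delta, -eps]])
    energies = np.linalg.eigvalsh(H)
    return energies[1] - energies[0]

print(splitting(0.0, 0.1))                                   # 0.1 = Delta
print(np.isclose(splitting(0.3, 0.1), np.hypot(0.3, 0.1)))   # True
```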
Laika: a comic about a hero
Laika was the first living being to go into space. Its adventure was told by Nick Abadzis in a beautiful graphic novel, which I read two years ago.
The story begins with Sputnik 1, the first Russian space mission, launched on October 4, 1957 in order to send a signal to Earth. The next mission, Sputnik 2, launched on November 3, 1957, carried the first living being on board: a dog, Laika. This second mission, however, arose under pressure from the government, which wanted to celebrate the fortieth anniversary of the October Revolution with a new, great success of the Soviet space programme. So the mission was prepared in haste, designing a satellite less sophisticated than Sputnik 1, one that was never expected to return to the planet: a death sentence, beyond any reasonable doubt, for its passenger. Laika's death, however, came very early: the mission was designed to last a week or less, but the dog died long before that, within 5-7 hours:
The fact, that pressure in the cabin was not reduced, proved its reliable tightness. It was very important, as the satellite passed through areas of meteoric flows. Normalization of parameters of breath and blood circulation of Layka during orbital flight has allowed to make a conclusion, that the long weightlessness does not cause essential changes in a status of animal organisms. During flight the gradual increase of temperature and humidity in the cabin was registered via telemetric channels. Approximately in 5 - 7 hours of flight there was a failure of telemetry system. It was not possible to detect a status of the dog since the fourth circuit. During the ground simulation of this flight's conditions, the conclusion was made, that Layka should be lost because of overheating on 3d or 4-th circuit of flight.(1)
So the hypothesis told by Abadzis, of a silence imposed from above about Laika's health conditions, or rather about the impossibility of monitoring them, is not so far-fetched: we must consider, in fact, that (in the government's vision) the animal would have had to stay alive long enough to celebrate the forty years of the October Revolution!
And the British cartoonist tells not only the historical background of Laika's story, but also the simple story of a dog that went from the street to space, and of the people who shared their lives and love with the little dog.
A graphic novel, drawn in the classic bande dessinée style, that you cannot miss in your collection.
(1) Dmitrij Malashenkov, Some Unknown Pages of the Living Organisms' First Orbital Flight
Lovecraft's mathematical horrors
Sometimes we find something interesting on the web, like the following review of the film 4D Man.
We can read on Wikipedia:
Brilliant but irresponsible scientist Tony Nelson (James Congdon) develops an amplifier that allows any object to achieve a 4th dimensional (4D) state. While in this state that object can pass freely through any other object. Reading these words I immediately think of Howard Phillips Lovecraft and his Cthulhu Mythos, in particular of The Dreams in the Witch House. In this short story Walter Gilman, a student of mathematics, lives in the house of Keziah Mason, one of the Salem witches. In the story there are some mathematically interesting quotes:
She had told Judge Hathorne of lines and curves that could be made to point out directions leading through the walls of space to other spaces beyond (...) We can see Lovecraft's use of non-Euclidean geometry for his own narrative purposes, in particular in the following quotation:
[Gilman] wanted to be in the building where some circumstance had more or less suddenly given a mediocre old woman of the Seventeenth Century an insight into mathematical depths perhaps beyond the utmost modern delvings of Planck, Heisenberg, Einstein, and de Sitter. Or in the following passage, in which HPL seems to refer to Riemannian geometry:
He was getting an intuitive knack for solving Riemannian equations, and astonished Professor Upham by his comprehension of fourth-dimensional and other problems (...) Indeed Gilman was studying
non-Euclidean calculus and quantum physics. And Walter, dreaming, experienced the higher-dimensional space of the limitless abysses:
abysses whose material and gravitational properties, and whose relation to his own entity, he could not even begin to explain. He did not walk or climb, fly or swim, crawl or wriggle; yet always experienced a mode of motion partly voluntary and partly involuntary. Of his own condition he could not well judge, for sight of his arms, legs, and torso seemed always cut off by some odd disarrangement of perspective; (...) During his travel in the fourth dimension, Gilman saw
prisms, labyrinths, clusters of cubes and planes, and Cyclopean buildings, images that are characteristic of Lovecraftian literature.
Another non-euclidean reference is in The Call of Cthulhu(1):
He said that the geometry of the dream-place he saw was abnormal, non-Euclidean, and loathsomely redolent of spheres and dimensions apart from ours. And Cthulhu itself is a fourth-dimensional creature. Cthulhu was one of the Great Old Ones: these creatures
(...) were not composed altogether of flesh and blood. They had shape (...) but that shape was not made of matter. We can imagine Cthulhu in our world as the projection of a dodecaplex (the 120-cell) into three-dimensional space, for example:
News from the OPERA
Today at 15:30, at the Physics Department in Milano, Italy, Luca Stanco, one of the 15 members of the OPERA collaboration who didn't sign the preprint, discussed the OPERA results in a brief presentation (about half an hour, without questions). In the end we got a lot of interesting information. He described the experiment, starting from the production of the neutrino beams and arriving at the detection in Italy, under the Gran Sasso mountain. He briefly described the measurement of the distance and the GPS system.
The most interesting part of the presentation is the production of the neutrino beams(1). First of all, researchers need to produce a proton bunch in the PS (a small synchrotron), then they send the bunch to the SPS, a much larger synchrotron than the PS. Filling the SPS would require 11 PS bunches, but the researchers decided to inject 5 bunches (each with a time length of about 10.4 μs) and, after an interval of about 50 ms, another 5 bunches. So, if we look carefully at the neutrino signal, we see 5 peaks, a reminder of the proton bunches that originated the signal. Here lies one of the criticisms: it is necessary to be sure that the proton probability density function and the neutrino probability density function are equal. Other important points to clarify are the time of flight(2) and the presence of possible effects due to day/night or seasonal variations.
But the real news arrived at the end of the presentation:
The OPERA collaboration decided this morning to postpone the submission of the paper by about one month. I missed the first of the two motivations, but the second is simple: CNGS is preparing new bunches spaced 500 ns apart. So OPERA will have a real first opportunity to test its data.
Probably not
A group velocity faster than $c$ does not mean that photons or neutrinos are moving faster than the speed of light. This is the conclusion of Fast light, fast neutrinos? by Kevin Cahill(12). He starts his brief analysis from some experimental observations of superluminal group velocity. In these experiments researchers measured light propagating both faster and slower than $c$ in vacuum. The first observation occurred in 1982(1), but an interesting collection of work on this subject is in Bigelow(7) and Gehring(11). Experimentally, when pulses travel through a highly dispersive medium, some exotic effects occur. One of these is the observation of a negative group velocity, which corresponds to a superluminal speed.
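To make the mechanism explicit, here is the standard relation (my addition, not a formula quoted from Cahill's paper) between group velocity and dispersion:
\[ v_g = \frac{d\omega}{dk} = \frac{c}{n(\omega) + \omega \frac{dn}{d\omega}} \, . \]
In a region of strong anomalous dispersion, where $dn/d\omega$ is large and negative, the denominator can drop below 1 or even become negative, giving $v_g > c$ or $v_g < 0$ for the peak of a smooth pulse, while no signal, photon or neutrino actually outruns $c$.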
In Bigelow's and Gehring's works there wasn't a real theoretical explanation. For example, Bigelow proposed the following interpretation:
(...) as the combination of different absorption cross sections and lifetimes for Cr3+ ions at either mirror or inversion sites within the BeAl2O4 crystal lattice. The superluminal wave propagation is produced by a narrow “antihole” [612 Hz half width at half maximum (HWHM)] in the absorption spectrum of Cr3+ ions at the mirror sites of the alexandrite crystal lattice, and the slow light originates from an even narrower hole (8.4 Hz) in the absorption spectrum of Cr3+ ions at the inversion sites.They also considered
(...) the influence of ions both at the inversion sites and at the mirror sites. In addition, the absorption cross sections are assumed to be different at different wavelengths.
Tempest: another great timelapse
Some days ago Nasa published the following incredible photo:
It is an infrared mosaic produced from Hubble shots and it represents the Universe observed by the space telescope. But from Earth, too, we can observe some spectacular images, like the stars in the sky or the Milky Way. On Thursday we saw a great timelapse by Jared Brandon, and today I propose another great timelapse, made by Randy Halverson (via Universe Today):
I think it could represent the perfect fusion between Earth and sky, like in these shots from the slideshow of the video:
Thanks also to Annarita Ruberto, who shared the video in Italy.
Milky Way in timelapse
While writing a card about the Milky Way for the Italian Astronomy Olympiad syllabus, I searched for some timelapse videos about our galaxy, and I found some interesting videos and a great artist, or photographer, as you like. The photographer was featured on the Daily Mail and on the Photo Blog at msnbc.com (and on other sites around the web), and this is his great photo (source):
Tommy Eliassen (facebook, 1X) is the photographer, and he is very talented.
But this is simply the introduction to today's timelapse video, Mt Ruapehu Timelapse by Jared Brandon: The name of our galaxy comes from Greek mythology. Zeus put his son Heracles on Hera's chest, and the hero started to suck the divine milk in order to become immortal. But Hera woke up (she was sleeping) and pushed Heracles away. In that moment a splash of milk from the breast of the goddess became the Milky Way:
The mathematics in the 2011 Nobel Prize in Chemistry
The Nobel Prize in Chemistry 2011 was awarded to Daniel Shechtman
for the discovery of quasicrystals. The paper of the discovery, written with Blech, Gratias and Cahn, starts with the following words:
We report herein the existence of a metallic solid which diffracts electrons like a single crystal but has point group symmetry $m \bar{3} \bar{5}$ (icosahedral) which is inconsistent with lattice translations.(2) Lattice translations are, indeed, among the most important tools used to classify crystals. Indeed, until 1992 the definition of crystal given by the International Union of Crystallography was:
A crystal is a substance in which the constituent atoms, molecules, or ions are packed in a regularly ordered, repeating three-dimensional pattern. So the discovery of Shechtman and colleagues was very important: they introduced a new class of crystals, named quasicrystals by Levine and Steinhardt a few weeks later(3), and a new way to look at crystals.
In particular, Shechtman and colleagues, studying Al with 10–14% Mn, observed that
The symmetries of the crystals dictate that several icosahedra in a unit cell have different orientations and allow them to be distorted (...)(2) And when the crystal is described using lattice translations:
crystals cannot and do not exhibit the icosahedral point group symmetry.(2) They also observed that the formation of the icosahedral phase is a first-order phase transition, because the two phases (the other one is translational) coexist for a while during the transition(2).
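As a reminder of why icosahedral symmetry is "inconsistent with lattice translations" (my addition, the standard crystallographic restriction argument, not part of the paper): a rotation $R$ by an angle $\theta$ that maps a lattice onto itself can be written as an integer matrix in a lattice basis, so its trace must be an integer,
\[ \operatorname{Tr} R = 1 + 2 \cos \theta \in \mathbb{Z} \, , \]
which only allows $\theta = 60^\circ, 90^\circ, 120^\circ, 180^\circ, 360^\circ$. The five-fold axes of an icosahedron are therefore forbidden for ordinary, periodically repeating crystals, and Shechtman's diffraction pattern required a new class of solids.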
Italian Wikipedia and the fight for neutrality
We are Italians. We are Wikipedians. And we are neutral. And now our neutrality is endangered by an Italian law. You can read the communiqué here.
And this is an e-mail from Wikimedia Foundation:
The Wikimedia Foundation first heard about this a few hours ago: we don't have a lot of details yet. Jay is gathering information and working on a statement now.
It seems obvious though that the proposed law would hurt freedom of expression in Italy, and therefore it's entirely reasonable for the Italian Wikipedians to oppose it. The Wikimedia Foundation will support their position.
The question of whether blocking access to Wikipedia is the best possible way to draw people's attention to this issue is of course open for debate and reasonable people can disagree. My understanding is that the decision was taken via a good community process. Regardless, what's done is done, for the moment.
Super-Nobel in Physics 2011
The first observation of a supernova is dated 1572, by Tycho Brahe, but the historically most important supernova observation is Galileo's observation in 1604:
The supernova of 1604 caused even more excitement than Tycho's because its appearance happened to coincide with a so-called Great Conjunction or close approach of Jupiter, Mars and Saturn.(1) Galileo's discovery was revolutionary for one important reason:
Galileo's observations and those made elsewhere in Italy and in Northern Europe indicated that it was beyond the Moon, in the region where the new star of 1572 had appeared. The appearance of a new body outside the Earth-Moon system had challenged the traditional belief, embodied in Aristotle's Cosmology, that the material of planets was unalterable and that nothing new could occur in the heavens.(1) About the new star
Galileo states that [it] was initially small but grew rapidly in size such as to appear bigger than all the stars, and all planets with the exception of Venus.(1) We can compare the observation with modern definitions:
Novae are the result of explosions on the surface of faint white dwarfs, caused by matter falling on their surfaces from the atmosphere of larger binary companions. A supernova is also a star that suddenly increases dramatically in brightness, then slowly dims again, eventually fading from view, but it is much brighter, about ten thousand times more than a nova.(1) These dramatic events soon became good tools for observing the expansion of the universe:
Type Ia supernovae are empirical tools whose precision and intrinsic brightness make them sensitive probes of the cosmological expansion.(5) And observing a series of supernovae, the team of Brian Schmidt (b. 1967) and Adam Riess (b. 1969) in 1998(3) and the team of Saul Perlmutter (b. 1959) in 1999(4) made an important cosmological observation: the expansion of the Universe is accelerating!
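A sketch of how this "empirical tool" works (my summary, not part of the Nobel citation): Type Ia supernovae have a nearly standard peak absolute magnitude $M$, so the measured apparent magnitude $m$ gives the luminosity distance through the distance modulus
\[ m - M = 5 \log_{10} \frac{d_L}{10 \, \mathrm{pc}} \, . \]
Comparing $d_L$ with the redshift $z$ of the host galaxy for many supernovae tests how the expansion rate has changed with time; the distant supernovae of 1998-1999 turned out fainter (farther) than expected, hence the accelerated expansion.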
Video abstract: The hyperring of adèle classes
We encountered Alain Connes when I wrote the brief post about the Riemann hypothesis. Now, in the following video abstract, he explains his paper The hyperring of adèle classes (arXiv), written with Caterina Consani:
The Top of the Tevatron
Today is the last day of the Tevatron. The Tevatron is a particle accelerator, and it started taking physics data in 1985, on the night of 13 October. The story of the Tevatron is rich in great events, and Tommaso Dorigo wrote a great summary of Tevatron physics, in particular of its first great result: the discovery of the top quark!
In the Standard Model we have 6 quarks, the elementary particles that constitute baryonic matter. They were introduced into physics with the quark model independently developed by Murray Gell-Mann(1) and George Zweig(2, 3) in 1964. The original theory contained three quarks (up, down, strange), but in the following years the other three quarks were also proposed. In particular, in 1972 Makoto Kobayashi and Toshihide Maskawa(6) proposed the existence of a new quark, the now well-known top quark: they introduced CP violation into the theory of weak interactions developed by Weinberg in 1967(4) and 1971(5). In particular, they wrote the hadronic part of the lagrangian as four terms: kinetic, mass, strong and $L'$. Following the Higgs mechanism(9), they supposed that CP violation could live in the mass term, because of the spontaneous breaking of the gauge symmetry.
Their calculations are group-theory calculations: we can imagine the group that Kobayashi and Maskawa used as a space generated by two 4-dimensional spaces (the space of the $SU(4)$ group). They pictured three possible partitions for every vector space:
- two 2-dimensional subspaces;
- one 2-dimensional subspace and two one-dimensional subspaces;
- four one-dimensional subspaces.
(from D0 paper)
(from CDF paper)
I conclude with the following video by Maria Scileppi with Rob Snihur, a Tevatron researcher:
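As a side note (my addition, the standard parameter counting rather than the original group-theoretic language of the paper): an $n \times n$ unitary quark-mixing matrix has $n^2$ real parameters, of which $2n - 1$ can be absorbed into quark phase redefinitions, leaving
\[ \frac{n(n-1)}{2} \ \text{rotation angles and} \ \frac{(n-1)(n-2)}{2} \ \text{physical phases} \, . \]
For $n = 2$ generations there is no phase, while for $n = 3$ exactly one complex phase survives; that single irreducible phase is the source of CP violation that led Kobayashi and Maskawa to predict a third generation of quarks, hence the top (and bottom).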
The universe and the flowers
Peeking at the infinite increases the space, the breath, the brain of whoever is watching it. The universe is a perilous but also a beautiful place. But the beauty of the universe is not only in galactic shots, it is also in mathematics. For example, the maps of the E8 group look like flowers, and if we follow Garrett Lisi and his preprint An exceptionally simple theory of everything(1), these maps are also a sort of universe's flowers!
Erri De Luca, Italian writer
The E8 maps in this post are extracted from Lisi's preprint and they represent the structure of the group (the first image is F4). E8 is a Lie group, a type of group most important in physics because the symmetry groups of physical systems are Lie groups. A Lie group is an analytic group: the group operations are continuous (indeed smooth) functions. A physical example is the Galilei group, and by studying it we can extract information about free particles, described by the Schroedinger equation.
Waiting superluminal neutrinos: from Maxwell to Einstein
(CNGS device)
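The transformations referred to below are not displayed in the post (they were presumably in an image); for reference, here is my addition of the standard Lorentz boost along $x$ with velocity $v$:
\[ x' = \gamma \left( x - v t \right), \qquad t' = \gamma \left( t - \frac{v x}{c^2} \right), \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \, , \]
where $c$, the speed of light, enters as the invariant speed of the electromagnetic field.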
From these transformations, first written down in 1887(1, 2), we can extract the physics of special relativity, thanks to Albert Einstein. So, if we want to substitute $c$ with the alleged speed of OPERA's neutrinos, we must conclude that the new boson of the electromagnetic field is the neutrino! In this sense I say that special relativity is right: our universe and our observations are based on the electromagnetic field, so if the OPERA results are verified, we will probably have to think about:
- changing the weak interaction(3);
- studying a possible quantum interaction between neutrinos and spacetime(4);
- imagining a new field exclusively for neutrinos(5);
- other ways that at this moment I cannot imagine(6)!
Waiting the superluminal neutrinos (if they exist!)
It seems that the OPERA experiment has observed some superluminal neutrinos.
First of all we must see the scientific data: at 4 pm on 23 September (Geneva time) we can connect to the CERN seminar (also on webcast), and probably a preprint will be published on arXiv in the next few hours. In any case, I think it's very important to say a few words about the news.
There are a lot of comments on the question, and some people say that special relativity and also the Standard Model will be falsified if the results are confirmed. Instead, I think we are simply talking about an extension of the Standard Model, with no real consequences for special relativity.
First of all we must remember that special relativity and the Standard Model are, above all, electromagnetic theories, where the boson is the photon and the speed of light matters for the photon and for the electromagnetic interaction. Neutrinos don't interact with the electromagnetic field, and the result would simply confirm this situation!
On the other hand, the results, if confirmed, would simply tell us that neutrinos are the most elusive particles in the universe: in this case they escape the control of special relativity, which would not be the correct theory to describe them at the highest energies. In the same way, we would have to modify the Standard Model in order to include these new superneutrinos. In this last case the changes would be at the higher orders of the theory: we must remember that, if the effect were really important at the energies already probed, the Standard Model could never have been tested with such a high degree of accuracy.
Another hypothesis is that the introduction of superluminal neutrinos into the Standard Model could resolve some mathematical problems of the model, or explain some physical questions (like the matter-antimatter asymmetry, for example). But we can keep playing with assumptions: the superneutrinos could be the trace of a new, fifth interaction between neutrinos and dark matter. This hypothesis is included in some dark matter theories: in that case the Standard Model and special relativity could remain unmodified.
In any case, if the results are confirmed, the first step for Standard Model theorists will be to propose changes at the higher orders of the theory, and then, in the future, to search for an extension of the theory.
Thanks to Marco Delmastro, Tommaso Dorigo, Peppe Liberti, Annarita Ruberto for sharing the news.
Riemann hypothesis
The Riemann hypothesis was stated in Riemann's 1859 paper On the Number of Primes Less Than a Given Magnitude. This is the beginning of the paper(1):
I believe that I can best convey my thanks for the honour which the Academy has to some degree conferred on me, through my admission as one of its correspondents, if I speedily make use of the permission thereby received to communicate an investigation into the accumulation of the prime numbers; a topic which perhaps seems not wholly unworthy of such a communication, given the interest which Gauss and Dirichlet have themselves shown in it over a lengthy period.
The Riemann zeta function is connected to the distribution of prime numbers; in particular, Riemann argued that all of its non-trivial zeros(2) have the form $z = \frac{1}{2} + bi$, where $z$ is complex, $b$ is real and $i = \sqrt{-1}$. There's also a general form of the zeros: $z = \sigma + bi$, where $\sigma$ belongs to the critical strip (see below).
For this investigation my point of departure is provided by the observation of Euler that the product \[\prod \frac{1}{1-\frac{1}{p^s}} = \sum \frac{1}{n^s}\] if one substitutes for $p$ all prime numbers, and for $n$ all whole numbers. The function of the complex variable $s$ which is represented by these two expressions, wherever they converge, I denote by $\zeta (s)$. Both expressions converge only when the real part of $s$ is greater than 1; at the same time an expression for the function can easily be found which always remains valid.
In the story of the search for the zeta zeros, Hugh Montgomery plays an important part(4): in 1972 he investigated the distance between two zeta zeros, finding a function of this difference. After this paper, in 1979, with Norman Levinson(5) he established some other properties of zeta, investigating in particular the zeros of the derivatives of zeta. First of all he proved an equivalence relation between the zeros of the Riemann zeta function and the zeros of the derivatives: in particular, these zeros also lie in the region $0 < \sigma < \frac{1}{2}$.
The analytical research around the zeta zeros is not the only way: Lehmer (1956 and 1957) was the first to perform a computational attempt to test the hypothesis. An example of this kind of research is given by Richard Brent(6): in his work he evaluated the Riemann zeta function using the Gram points, which are points related to the sign changes of the zeta function on the critical line(3). Brent focused his research on the first 70000000 Gram blocks, verifying the hypothesis there.
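A minimal sketch of this computational strategy (my own, using mpmath, not Brent's actual code): count the sign changes of the Riemann-Siegel $Z$ function between consecutive Gram points; each sign change flags a zero of $\zeta$ on the critical line.

```python
# Count sign changes of the Riemann-Siegel Z function on Gram intervals.
from mpmath import mp, grampoint, siegelz

mp.dps = 20  # working precision

def zeros_on_gram_intervals(n_intervals=30):
    """Number of sign changes of Z(t) over the first n_intervals Gram intervals."""
    count = 0
    for n in range(n_intervals):
        a, b = grampoint(n), grampoint(n + 1)
        if siegelz(a) * siegelz(b) < 0:  # sign change => a zero in (a, b)
            count += 1
    return count

print(zeros_on_gram_intervals())  # usually one zero per interval (Gram's law)
```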
But there's another approach to the problem: physics. At the end of the 1990s Alain Connes(7) investigated the link between the Riemann hypothesis and quantum chaos.
Quantum chaos studies chaotic classical dynamical systems using quantum laws. In particular, Connes found a particular chaotic system in which the quantum numbers are the prime numbers and the energy levels of the system correspond to the zeta zeros on the critical line $\sigma = \frac{1}{2}$. Within physics this could be the best (but not the only) suspect for resolving the hypothesis.
Other connections with physics are in the review Physics of the Riemann Hypothesis by Daniel Schumayer and David A. W. Hutchinson, but we can talk about the stories in that paper another time.
One planet, two stars
When we approach the binary system Kepler-16 (image source) we see two different stars: the larger of the pair, star A, is a K dwarf with a mass of about 0.69 solar masses and a radius of about 0.65 solar radii, while the smaller, star B, is a little red dwarf with a mass of about 0.2 solar masses and a radius of about 0.23 solar radii. But when we arrive at about 0.5 au from the centre of mass of the binary system we find a great surprise: a little planet with a mass of 0.333 ± 0.016 and a radius of 0.7538 ± 0.0025 times those of Jupiter(1).
At the moment it is very difficult to guess the properties(2) (like composition and surface temperature) of Kepler-16b, but the most important thing is that Kepler can discover, using the transit technique, even a planet around a binary system! I think this is the principal reason why the paper was accepted by Science. And it's clear that every nerd and sci-fi fan thinks of the Skywalker family's planet, Tatooine:
Earlth and Moonch
This beautiful shot (via Universe Today) is the Cygnus constellation. The photographer is Marco T., Italian, and you can see other great photos on his flickr account.
But this beautiful shot is the best introduction to Earlth and Moonch, an animated short by Dei Gaztelumendi. I saw the short in Italy, during the Milano Film Festival, and in the words of Gaztelumendi it is a reflection about Earth, ecology and our future.
Nuclear accidents: abstract about Fukushima and video from France
In the last few hours (around 12) there was a nuclear accident in France, at the Marcoule nuclear site. While waiting for news from France (at the end of the post I embed a YouTube video), I propose some recent abstracts about Fukushima's radiation around the world:
First of all I share Elevated radioxenon detected remotely following the Fukushima nuclear accident by T.W. Bowyer et al. (via Science Daily):
We report on the first measurements of short-lived gaseous fission products detected outside of Japan following the Fukushima nuclear releases, which occurred after a 9.0 magnitude earthquake and tsunami on March 11, 2011. The measurements were conducted at the Pacific Northwest National Laboratory (PNNL), (46°16′47″N, 119°16′53″W) located more than 7000 km from the emission point in Fukushima Japan (37°25′17″N, 141°1′57″E). First detections of 133Xe were made starting early March 16, only four days following the earthquake. Maximum concentrations of 133Xe were in excess of 40 Bq/m3, which is more than ×40,000 the average concentration of this isotope in this part of the United States.
I tried to request a copy of the article via ResearchGate; I don't know if I'll receive it. But after this work a lot of papers about Fukushima have been published in the Journal of Environmental Radioactivity:
Evidence of the radioactive fallout in the center of Asia (Russia) following the Fukushima Nuclear Accident by A. Bolsunovsky and D. Dementyev
It was recently reported that radioactive fallout due to the Fukushima Nuclear Accident was detected in environmental samples collected in the USA and Greece, which are very far away from Japan. In April–May 2011, fallout radionuclides (134Cs, 137Cs, 131I) released in the Fukushima Nuclear Accident were detected in environmental samples at the city of Krasnoyarsk (Russia), situated in the center of Asia. Similar maximum levels of 131I and 137Cs/134Cs and 131I/137Cs ratios in water samples collected in Russia and Greece suggest the high-velocity movement of the radioactive contamination from the Fukushima Nuclear Accident and the global effects of this accident, similar to those caused by the Chernobyl accident.
Short and long term dispersion patterns of radionuclides in the atmosphere around the Fukushima Nuclear Power Plant by Ádám Leelőssy, Róbert Mészáros, István Lagzi
The Chernobyl accident and unfortunately the recent accident at the Fukushima 1 Nuclear Power Plant are the most serious accidents in the history of the nuclear technology and industry. Both of them have a huge and prolonged impact on environment as well as human health. Therefore, any technological developments and strategies that could diminish the consequences of such unfortunate events are undisputedly the most important issues of research. Numerical simulations of dispersion of radionuclides in the atmosphere after an accidental release can provide with a reliable prediction of the path of the plume. In this study we present a short (one month) and a long (11 years) term statistical study for the Fukushima 1 Nuclear Power Plant to estimate the most probable dispersion directions and plume structures of radionuclides on local scale using a Gaussian dispersion model. We analyzed the differences in plume directions and structures in case of typical weather/circulation pattern and provided a statistical-climatological method for a “first-guess” approximation of the dispersion of toxic substances. The results and the described method can support and used by decision makers in such important cases like the Fukushima accident.
Arrival time and magnitude of airborne fission products from the Fukushima, Japan, reactor incident as measured in Seattle, WA, USA by J. Diaz Leon et al. (pdf from arXiv)
We report results of air monitoring started due to the recent natural catastrophe on 11 March 2011 in Japan and the severe ensuing damage to the Fukushima Dai-ichi nuclear reactor complex. On 17–18 March 2011, we registered the first arrival of the airborne fission products 131I, 132I, 132Te, 134Cs, and 137Cs in Seattle, WA, USA, by identifying their characteristic gamma rays using a germanium detector. We measured the evolution of the activities over a period of 23 days at the end of which the activities had mostly fallen below our detection limit. The highest detected activity from radionuclides attached to particulate matter amounted to 4.4 ± 1.3 mBq m−3 of 131I on 19–20 March.
From arXiv: Aerial Measurement of Radioxenon Concentration off the West Coast of Vancouver Island following the Fukushima Reactor Accident by L. E. Sinclair et al.
In response to the Fukushima nuclear reactor accident, on March 20th, 2011, Natural Resources Canada conducted aerial radiation surveys over water just off of the west coast of Vancouver Island. Dose-rate levels were found to be consistent with background radiation, however a clear signal due to Xe-133 was observed. Methods to extract Xe-133 count rates from the measured spectra, and to determine the corresponding Xe-133 volumetric concentration, were developed. The measurements indicate that Xe-133 concentrations on average lie in the range of 30 to 70 Bq/m3.
Finally, The time variation of dose rate artificially increased by the Fukushima nuclear crisis by Masahiro Hosoda et al., from Scientific Reports:
A car-borne survey for dose rate in air was carried out in March and April 2011 along an expressway passing northwest of the Fukushima Dai-ichi Nuclear Power Station which released radionuclides starting after the Great East Japan Earthquake on March 11, 2011, and in an area closer to the Fukushima NPS which is known to have been strongly affected. Dose rates along the expressway, i.e. relatively far from the power station were higher after than before March 11, in some places by several orders of magnitude, implying that there were some additional releases from Fukushima NPS. The maximum dose rate in air within the high level contamination area was 36 μGy h−1, and the estimated maximum cumulative external dose for evacuees who came from Namie Town to evacuation sites (e.g. Fukushima, Koriyama and Nihonmatsu Cities) was 68 mSv. The evacuation is justified from the viewpoint of radiation protection.
And now the video news about the France accident:
Ray representations and Galilei group
For my PhD thesis I did some work in group theory, more precisely in the theory of representations applied to quantum mechanics. So, in order to describe my work, recently published in the Journal of Mathematical Physics, I need to introduce some concepts.
Group theory was founded independently by Niels Abel and Evariste Galois, and it focuses on the group: a set $G$ of elements with a multiplication operation $\cdot$ such that the following properties hold(9) (a small numerical check follows the list below):
- $\forall a, b \in G, a \cdot b \in G$
- $\forall a, b, c \in G, a(bc) = (ab) c$
- $\forall a \in G \, \exists e \in G \, \text{:} \, ae = ea = a$
- $\forall a \in G \, \exists b \in G \, \text{:} \, ab = ba = e$ and $b = a^{-1}$
- $\forall a, b \in G, ab = ba$ (this last property is required only for abelian groups)
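A toy check of the axioms listed above (my own sketch, not part of the thesis), using the abelian group of integers modulo $n$ under addition:

```python
# Verify closure, associativity, identity and inverses for a finite set,
# then check commutativity (the extra, abelian-only property).
from itertools import product

def is_group(elements, op, identity):
    closure = all(op(a, b) in elements for a, b in product(elements, repeat=2))
    assoc = all(op(a, op(b, c)) == op(op(a, b), c)
                for a, b, c in product(elements, repeat=3))
    ident = all(op(a, identity) == a == op(identity, a) for a in elements)
    inverses = all(any(op(a, b) == identity == op(b, a) for b in elements)
                   for a in elements)
    return closure and assoc and ident and inverses

n = 12
Zn = set(range(n))
add_mod_n = lambda a, b: (a + b) % n

print(is_group(Zn, add_mod_n, 0))                            # True
print(all(add_mod_n(a, b) == add_mod_n(b, a)                 # abelian too
          for a, b in product(Zn, repeat=2)))
```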
If our world is a given physical system (for example a free particle), we have a symmetry group, that is the set of all symmetry transformations(2) of our physical system, and its representation acts in a so-called Hilbert space. In this space, following Wigner's theorem(4), the most general representation is a ray (unitary) representation. In order to understand ray (or projective) representations, we must state the theorem:
For every symmetry transformation $T: \mathcal R \rightarrow \mathcal R$ between the rays of a Hilbert space $\mathcal H$ which conserves the transition probabilities, we can define an operator $U$ on the Hilbert space $\mathcal H$ such that, if $|\psi> \in {\mathcal R}_\psi$, then $U |\psi> \in {\mathcal R}'_\psi$, where ${\mathcal R}_\psi$ is the ray of the state $|\psi>$, ${\mathcal R}'_\psi = T {\mathcal R}_\psi$, and $U$ is unitary and linear \[< U \psi | U \varphi> = <\psi | \varphi>, \qquad U |\alpha \psi + \beta \varphi> = \alpha U |\psi> + \beta U |\varphi>\] or $U$ is antiunitary and antilinear: \[< U \psi | U \varphi> = <\varphi | \psi>, \qquad U |\alpha \psi + \beta \varphi> = \alpha^* U |\psi> + \beta^* U |\varphi>\] Further, $U$ is uniquely determined except for a phase factor.
So a ray representation is the association of an element of the symmetry group $G$ with a set of unitary (or antiunitary) operators which differ only by a phase: in other words, a ray of operators(3).
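To make the last statement concrete (my formulation of the standard definition, not a quotation from the paper): in a ray representation the operators associated with two symmetries compose only up to a phase,
\[ U(g_1) \, U(g_2) = e^{i \omega(g_1, g_2)} \, U(g_1 g_2) \, , \]
where the real function $\omega$ (the multiplier, or local exponent) must satisfy the cocycle condition imposed by associativity. For the Galilei group these phases cannot all be removed by redefining the operators, which is precisely why its physically relevant representations for a massive particle are projective rather than ordinary ones.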
Review: Death from the skies
#SciDoom
The universe is trying to kill you.
Philip Plait, Death from the skies
The Universe is the most dangerous place that you can imagine. There are a lot of perils: asteroids and comets, supernovae, gamma-ray bursts and, finally, our star, the Sun. Every source of danger is examined in nine chapters, each introduced by a fictional short story that is scientifically correct. In these introductory stories, Plait describes a possible scenario in which the Earth is hit by, for example, a comet or a great asteroid, like the one that left Meteor Crater in Arizona (Commons) or the ones that probably caused the dinosaur extinction.
In order to prevent a bad encounter with a great object, physicists describe a lot of possible solutions: the list includes a nuclear bomb, an impact with an artificial object (like Deep Impact with Tempel 1), or trying to change the bullet's trajectory using the gravitational pull of another nearby object. This last solution is proposed by the B612 Foundation and, like the others, presents a lot of difficulties, but it is technically realizable now!
Some possible problems are, for example with the impact solutions, that the object could break into pieces that are still too big, while for the gravitational solution we must demonstrate a very precise control of gravity. These are the only enemies we can prevent; against the others, like the radiation from the Sun or from supernova explosions, we could only build some particular protection. In the supernova case, the probability of being hit is 1/10,000,000, against 1/700,000 for an asteroid strike, the latter being a probability actually higher than that of terrorism!
Another danger from outer space is... an alien attack! Don't worry! I don't mean the little green men from Mars, but simply the microscopic life possibly present in asteroids or comets. An attack like this has a very low probability, first of all because the aliens must survive the Earth's atmosphere, and then because they have a low probability of interacting with DNA that developed in our environment: as far as we know, our planet is unique in the Universe!
I don't know if we are alone or not, but at the moment it seems so...
In the last part of the post I spend a few words on the Sun:
A circle around Higgs boson
After the post about the D0 abstracts, I return to write about the Higgs boson following the latest Fermilab press release about the Higgs boson mass limits. Combining data from D0 and CDF, the Tevatron's limits are 114-137 GeV/c2. The results were presented last week in Grenoble at the EPS High-Energy Physics conference, which will finish on the 27th of July.
During the same conference the LHC experiments also presented their first results, analyzed in about one month! And the conclusion seems unhappy for the Tevatron: Fermilab's particle accelerator has only one chance left to find the Higgs boson before the LHC. Why? We can simply look at the following plots presented by ATLAS and CMS (via Résonaances, Tommaso Dorigo):
The two European experiments leave open only a little region, from 115 GeV/c2 (the Tevatron's region) up to 140 GeV/c2. The data from this region will probably be analyzed and published before the end of the year, so we must wait only a few months to know whether the Tevatron could find the Higgs or not(1).
Tommaso examined in detail some CMS preprints in which a lot of Higgs production channels are studied, and Philip Gibbs also wrote a great summary of the LHC presentations, with a great conclusion plot:
Observation of a new neutral baryon
In the beginning (the late 1960s), the particle zoo(1) was the colloquial term used to describe the extensive list of known particles. Indeed, before the Standard Model became the most accepted theory in particle physics, physicists discovered a lot of particles in their accelerators, but we know today that they are simply combinations of a small number of particles classified into three fundamental families: leptons and quarks (which together constitute the fermions, particles with half-integer spin) and bosons (particles with integer spin).
(Commons) We can also classify particles into a lot of sub-families, like the baryons, the heavy particles made of three quarks: for example the proton and the neutron are baryons, with compositions uud and udd respectively, where u is the up quark and d the down quark.
We know six types of quarks: up (u) and down (d), which explain protons and neutrons, and charm (c), strange (s), top (t) and bottom (b), which explain a lot of other heavy particles. The Standard Model predicts a series of combinations of these quarks, which are summarized in a picture like this:
(source) In the upper part there are the baryons with angular momentum $J = 1/2$ and in the lower part those with $J = 3/2$. Today we examine the $J = 1/2$ group, in particular the latest discovery by CDF, one of the collaborations at the Tevatron at Fermilab. Indeed, not all the particles predicted by the SM have been found, and the hunt for them is open. On the 20th of July, Pat Lukens announced the first observation of $\Xi_b^0$, a baryon with the quark structure usb:
In order to detect the new baryon, the researchers at the Tevatron had to reconstruct the following decay chain:
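For the record, and as a reconstruction from memory rather than a quotation of the CDF note, the chain should read:
\[\Xi_b^0 \rightarrow \Xi_c^+ \, \pi^-, \qquad \Xi_c^+ \rightarrow \Xi^- \, \pi^+ \, \pi^+, \qquad \Xi^- \rightarrow \Lambda \, \pi^-, \qquad \Lambda \rightarrow p \, \pi^-\]
so what the detector actually sees are the tracks of the charged pions and of the proton, and the intermediate baryons are reconstructed backwards from their decay vertices.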
Brian May, astrophysicist
Brian May is the famous guitarist of Queen, Freddie Mercury's rock band (and one of my favourite bands!), but he is also an astrophysicist!
He was born on 19 July 1947 in Twickenham, London. He studied mathematics and physics at Imperial College, where he also started the PhD programme, but he abandoned it when Queen became a successful band worldwide. He completed his PhD in 2007(5), but he did not forget his research activity: indeed he wrote, with Patrick Moore and Chris Lintott, Bang! – The Complete History of the Universe (2006)... but... just a moment... research activity? Yeah!
In 1972 and 1973 two papers signed by Mr. May were published: MgI Emission in the Night-Sky Spectrum and An Investigation of the Motion of Zodiacal Dust Particles (Part I), written with Mr. Hicks and Mr. Reay.
May and colleagues were interested in the zodiacal light, in particular in the MgI spectrum, near the 5183.62 Å wavelength.
The importance of this kind of study is that the formation of MgI and MgII is one feature of the interaction between the atmosphere and stellar radiation(2, 3).
But let's go to the papers: in order to determine the absorption lines in the zodiacal light, Brian and friends used a Fabry-Perot interferometer:
The method was to sample, for 48 s, each of up to 18 points across the spectral interval. Pulse counting electronics and a line printer recorded the signal level at each sample point. A second channel of pulse counting monitored the overall sky background over a wide waveband, thus allowing correction for fluctuation in sky transparency. The resolving power of the interferometer was 3500, corresponding to an instrumental profile width of 1.5 Å.
The observations were made in September and October 1971 and in April 1972 from the observatory at Izaña on Tenerife, Canary Islands.
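As a quick consistency check (my own back-of-the-envelope arithmetic, not taken from the papers), the quoted resolving power matches the instrumental profile width at the MgI wavelength:
\[R = \frac{\lambda}{\Delta \lambda} \approx \frac{5183.62 \; \text{Å}}{1.5 \; \text{Å}} \approx 3500.\]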
D0 abstracts: Higgs limits and dimuon asymmetry
I usually publish abstract digests on Posterous, but in this case I think an exception is necessary. The D0 collaboration at the Tevatron, indeed, released two papers on arXiv, and I think it is important to share their work with the largest possible number of readers. I start with Search for neutral Higgs bosons decaying to $\tau$ pairs produced in association with $b$ quarks in $p \bar{p}$ collisions at $\sqrt s = 1.96$ TeV, shared by Tommaso:
We report results from a search for neutral Higgs bosons produced in association with b quarks using data recorded by the D0 experiment at the Fermilab Tevatron Collider and corresponding to an integrated luminosity of 7.3 $fb^{-1}$. This production mode can be enhanced in several extensions of the standard model (SM) such as in its minimal supersymmetric extension (MSSM) at high tanBeta. We search for Higgs bosons decaying to tau pairs with one tau decaying to a muon and neutrinos and the other to hadrons. The data are found to be consistent with SM expectations, and we set upper limits on the cross section times branching ratio in the Higgs boson mass range from 90 to 320 $GeV/c^2$. We interpret our result in the MSSM parameter space, excluding tanBeta values down to 25 for Higgs boson masses below 170 $GeV/c^2$.
The other two papers are covered in Antimatter Tevatron mystery gains ground, a great BBC article. In particular, the BBC writes about Measurement of the anomalous like-sign dimuon charge asymmetry with 9 $fb^{-1}$ of $p \bar{p}$ collisions:
A solution to a maximal independent set problem
A distributed system is a set of autonomous computers that communicate over a network in order to reach a common goal. The maximal independent set (MIS) is thus a distributed-systems subject. But what do we mean by MIS?
In graph theory, a maximal independent set or maximal stable set is an independent set that is not a subset of any other independent set.
Some examples of MIS are shown on the graph of the cube:
You can see that every maximal independent set is made of points that aren't adjacent.
The goal of the maximum independent set problem is to find the maximum-size maximal independent set in a given graph or network. In other words, the problem is the search for the leaders in a local network of connected processors, where by leader we mean an active node all of whose neighbours are inactive. This problem is NP-hard. A greedy construction of a maximal (though not necessarily maximum) independent set is sketched below.
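As a toy illustration in Python (the graph encoding and the function name are mine, not taken from the paper), one greedy pass over the vertices of the cube graph produces a maximal independent set:

# Cube graph: 8 vertices labelled by 3-bit strings, edges between labels differing in one bit.
vertices = range(8)
adj = {v: {v ^ (1 << k) for k in range(3)} for v in vertices}

def greedy_mis(vertices, adj):
    """Return a maximal (not necessarily maximum) independent set, greedily."""
    mis = set()
    for v in vertices:
        if adj[v].isdisjoint(mis):   # v has no neighbour already chosen
            mis.add(v)
    return mis

print(greedy_mis(vertices, adj))     # e.g. {0, 3, 5, 6}: no two chosen vertices are adjacent

For the cube this set happens to have size 4, which is also the maximum possible; in general a greedy pass only guarantees maximality, while finding the maximum size is the hard part.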
Following Afek, Alon, Barad, Hornstein, Barkai and Bar-Joseph,
no methods have been able to efficiently reduce message complexity without assuming knowledge of the number of neighbours.
But a similar network occurs in the precursors of the fly's sensory bristles, so the researchers' idea is to use data from this biological network to solve the starting computational problem!
Such a system is called sensory organ precursors, SOP.
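To fix ideas about the mechanism (this is my own simplified, centralized simulation of the round structure described by Afek et al., with the firing probability chosen arbitrarily), every still-active node "fires" at random in each round, and a node that fires while none of its neighbours does becomes a leader and silences its neighbours:

import random

def sop_like_mis(adj, p=0.3, rounds=100):
    """Randomized leader election inspired by the SOP selection mechanism."""
    active = set(adj)            # nodes still competing
    leaders = set()
    for _ in range(rounds):
        if not active:
            break                # every node is a leader or a neighbour of one
        fired = {v for v in active if random.random() < p}
        for v in fired:
            if not (fired & adj[v]):   # fired, and no neighbour fired: v wins
                leaders.add(v)
                active.discard(v)
                active -= adj[v]       # neighbours of a leader drop out
    return leaders

# Example on the same cube graph used above:
adj = {v: {v ^ (1 << k) for k in range(3)} for v in range(8)}
print(sop_like_mis(adj))

If the rounds run out before every node has been decided, the returned set is independent but possibly not yet maximal; the point of the biologically inspired schedule is that the competition ends quickly while the nodes exchange very little information.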