Saturday 27th of February 2021

bell's theorem about an incomplete story...


From time to time, we don’t shy away from bringing some scientific stuff to this site, especially on quantum mechanics. Without this theoretical assessment of matter and energy, we would still be huddled by candlelight in the dark ages — or slaving/soldiering in the devious Roman empire battles for supremacy of the known flat world. So here is a bit of a resolution to the battle about reality, between Albert Einstein and Niels Bohr, which we have already tackled. Enjoy, go blank or cringe...



Bell's theorem proves that quantum physics is incompatible with local hidden-variable theories. It was introduced by physicist John Stewart Bell in a 1964 paper titled "On the Einstein Podolsky Rosen Paradox", referring to a 1935 thought experiment that Albert Einstein, Boris Podolsky and Nathan Rosen used to argue that quantum physics is an "incomplete" theory.[1][2] 

By 1935, it was already recognized that the predictions of quantum physics are probabilistic. Einstein, Podolsky and Rosen presented a scenario that, in their view, indicated that quantum particles, like electrons and photons, must carry physical properties or attributes not included in quantum theory, and the uncertainties in quantum theory's predictions were due to ignorance of these properties, later termed "hidden variables". Their scenario involves a pair of widely separated physical objects, prepared in such a way that the quantum state of the pair is entangled.

Bell carried the analysis of quantum entanglement much further. He deduced that if measurements are performed independently on the two separated halves of a pair, then the assumption that the outcomes depend upon hidden variables within each half implies a constraint on how the outcomes on the two halves are correlated. This constraint would later be named the Bell inequality. Bell then showed that quantum physics predicts correlations that violate this inequality. Consequently, the only way that hidden variables could explain the predictions of quantum physics is if they are "nonlocal", somehow associated with both halves of the pair and able to carry influences instantly between them no matter how widely the two halves are separated.[3][4] As Bell wrote later, "If [a hidden-variable theory] is local it will not agree with quantum mechanics, and if it agrees with quantum mechanics it will not be local."[5]

Multiple variations on Bell's theorem were proved in the following years, introducing other closely related conditions generally known as Bell (or "Bell-type") inequalities. These have been tested experimentally in physics laboratories many times since 1972. Often, these experiments have had the goal of ameliorating problems of experimental design or set-up that could in principle affect the validity of the findings of earlier Bell tests. This is known as "closing loopholes in Bell test experiments". To date, Bell tests have found that the hypothesis of local hidden variables is inconsistent with the way that physical systems do, in fact, behave.[6][7]
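The flavour of a Bell test can be sketched numerically. Below is a minimal illustration (my own, not from the article) of the CHSH form of a Bell inequality: any local hidden-variable theory bounds the correlation sum S by 2, while quantum mechanics, for a pair of spins prepared in the entangled singlet state, predicts correlations of -cos(a - b) between measurement directions a and b, which pushes S up to 2√2. The measurement angles below are the standard choice that maximises the violation.

```python
# Sketch of the CHSH inequality: local hidden variables require S <= 2,
# but the quantum singlet-state prediction reaches 2*sqrt(2) ~ 2.828.
import math

def quantum_correlation(angle_a, angle_b):
    # Quantum prediction for the spin singlet state: E(a, b) = -cos(a - b).
    return -math.cos(angle_a - angle_b)

# Standard CHSH measurement angles (radians) that maximise the violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(quantum_correlation(a, b) - quantum_correlation(a, b2)
        + quantum_correlation(a2, b) + quantum_correlation(a2, b2))

print(f"S = {S:.4f}  (local hidden variables require S <= 2)")
```

Running this prints S = 2.8284, the Tsirelson bound, which is what the loophole-free experiments mentioned above observe, up to experimental error.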

The exact nature of the assumptions required to prove a Bell-type constraint on correlations has been debated by physicists and by philosophers. While the significance of Bell's theorem is not in doubt, its full implications for the interpretation of quantum mechanics remain unresolved.




This is a much stronger statement than saying that the particle may have such values but that we do not or cannot know them. Physicists often call this property a lack of realism, although philosophers may define the same word in a more general manner.

Furthermore, measurement has a very special role in quantum theory: if we measure, say, spin in the x-direction, s_x, then we must obtain a precise value for this quantity, even if a precise value did not exist beforehand. There are two possible results or “eigenvalues” for s_x: +ħ/2, which is associated with a quantum “eigenstate” α+, and −ħ/2, which is associated with the eigenstate α−. We can predict the result of a measurement if the initial state vector describing the system, ψ0, is either α+ or α−: if ψ0 = α+ the result will be +ħ/2, and if ψ0 = α− the result will be −ħ/2.

In general, however, the state vector will be a linear combination of both eigenstates: ψ = c+α+ + c−α−, where c+ and c− are complex constants, and |c+|² + |c−|² = 1. In this case, we are still bound to get one or other of the eigenvalues, but it is not certain which one. Born’s postulate tells us that the probabilities of obtaining +ħ/2 and −ħ/2 are |c+|² and |c−|², respectively. Therefore the long-cherished principle of determinism, according to which identical initial conditions (e.g. identical state vectors) must always evolve with time in exactly the same way, is no longer valid.
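Born's postulate is easy to see in a toy simulation. The sketch below (my own, not from the article; the coefficients 3/5 and 4/5 are an arbitrary choice that satisfies the normalisation condition) draws repeated measurements of s_x on identically prepared states and checks that the +ħ/2 outcome turns up with relative frequency |c+|²:

```python
# Toy simulation of Born's postulate: identical initial states give
# random outcomes, with probabilities fixed by |c+|^2 and |c-|^2.
import random

c_plus, c_minus = 3 / 5, 4 / 5       # real coefficients; 0.36 + 0.64 = 1
assert abs(c_plus**2 + c_minus**2 - 1) < 1e-12

def measure_sx():
    # Each measurement "collapses" the state onto one eigenvalue at random.
    # Results are in units of hbar: +1/2 with probability |c+|^2.
    return +0.5 if random.random() < c_plus**2 else -0.5

trials = 100_000
plus_fraction = sum(1 for _ in range(trials) if measure_sx() > 0) / trials
print(f"fraction of +hbar/2 results: {plus_fraction:.3f}  (Born rule: 0.36)")
```

The point of the paragraph above is exactly this: every run starts from the same ψ, yet the individual outcomes differ, and only the statistics are predictable.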




Moreover, to Einstein the collapse postulate was an even more pernicious retreat from realism than that described above: it implied that physical quantities usually have no values until they are observed, and therefore that the observer must be intrinsically involved in the physics being observed. This suggests that there might be no real world in the absence of an observer!



All of this can be confusing to mere mortals like us, especially when turning burnt sausages on the barbecue, as it is unresolved for most quantum physicists. But the idea, WHICH CAN BE VERIFIED EXPERIMENTALLY, “proves” that there is no determinism in the processes of the universe, despite the predictable entanglement of particles. 


The ultimate possibility is that, whether or not the real world exists without an “observer”, only the presence of an observer can measure it. And this is fair enough. This would also confirm Gus’s third theorem: that we can see the universe but the universe cannot see us. I’ve forgotten which were the first and second Gusorems. They are buried somewhere on this site. That the universe cannot see itself and that god does not exist are part of this parcel...


Yes, quantum mechanics is incomplete and we like it this way, otherwise we would be know-alls. There is no morality attached to it either, making these mechanisms far superior to religious delusions. Thus we have a zillion chances of meeting little green men in different universes and zero chance in billiobs (a billiob = infinity) billion of meeting god. The future lies in democratic secularity, in which nature and the monkeys should have a say. 



Gus Leonisky


Rabid atheist, taking care of earth.


See also:



flying high with the scientific method...

a worrying violence...

APS Presidential Letter Condemning Violence Against Physicists

December 11, 2020

Dear APS Member,

The recent assassination of an Iranian nuclear physicist recalls a string of similar killings of Iranian physicists nearly a decade ago. Then as now, there is broad speculation that the purpose is global political advantage, not merely by removing a leading physicist working for the Iranian Revolutionary Guard, nor even by stoking fears in other physicists within the Iranian science and engineering community; but also by changing the climate for international treaty negotiations.

No matter what the strategic motive or the geopolitical effect of this killing, the American Physical Society is deeply troubled by tactics of violence and terror, including assassination, against members of our physics community. The APS Statement of the International Nature of Physics and International Cooperation affirms that science transcends national boundaries. We call on all governments to condemn the use of violence against scientists.


Philip H. Bucksbaum
2020 President
American Physical Society


Read more:



Illustration at top from MAD magazine: planet of the Apes...


a new star in 1572

In 1573 Digges published Alae seu scalae mathematicae, a work on the position of the new star which is often called Tycho Brahe's supernova of 1572, since Brahe also observed the star and determined its position accurately. Digges's work includes observations of the position of the 'new star' and trigonometric theorems which could be used to determine the parallax of the star. The observations are particularly impressive, making Digges one of the ablest observers of his time. Digges's friend Dee published a similar work on the supernova, and the two works were often put in a single binding by booksellers and sold as a single volume. 

The appearance of the 'new star' contradicted the standard view of the universe at that time. The observations to determine the parallax of the star made by Digges and others confirmed that the star could not be between the sphere of the Moon and the Earth, and this was the only place where, according to the views of the time, change could take place. Digges was quick to point out that this new star, which slowly began to fade from view in 1573, provided the ideal observational evidence to allow alternative theories of the universe to be considered. This was a bold statement, since the Catholic Church had begun to view any suggestion of alternative cosmologies as heretical. 

Digges became the leader of the English Copernicans and used his observations of the supernova to justify the heliocentric system. One of his ideas was that the movement of the Earth round the Sun meant that the Earth moved towards and then away from the star, causing it to brighten and fade. However, when the brightening failed to occur periodically, this idea was seen to be wrong. 

He translated part of Copernicus's De revolutionibus and added his own ideas of an infinite universe with the stars at varying distances in an infinite space. It was his belief that the distances to the stars varied that at first seemed to be consistent with the new star being a faint one close to the Earth. Digges wrote to William Cecil, the leading statesman of the Elizabethan era, requesting support for the astronomical work he was carrying out. He makes his Copernican views very clear in this letter quoted in [3]:-


Read more:


Here following on the footsteps of Copernicus, we see the emergence of cosmology...


See also:

from socrates to quantum physics...

... or the eye of the beholder...

Different sorts of people have been revisiting the trial of Socrates for almost 2,500 years: philosophers, political theorists, politicians, jurists, historians, journalists and artists. Each has his or her own agenda for reconstructing the motives of the prosecution, the thinking of the jurors, and the apparently perverse behaviour of Socrates himself. This paper examines some of the more influential approaches to Socrates (including those of his contemporaries, Plato, Xenophon and Aristophanes), and attempts to set them in appropriate contexts. By way of a conclusion, there is an exploration of the background and context to the trial itself, taking into account recent work on Athenian law, religion and punishment, closing with an attempt to explain the significance of the likely location of the law-court, within the Agora or city square of Athens.


First published Fri Sep 11, 2015; substantive revision Tue Sep 15, 2020

Relativism, roughly put, is the view that truth and falsity, right and wrong, standards of reasoning, and procedures of justification are products of differing conventions and frameworks of assessment and that their authority is confined to the context giving rise to them. 

More precisely, “relativism” covers views which maintain that—at a high level of abstraction—at least some class of things have the properties they have (e.g., beautiful, morally good, epistemically justified) not simpliciter, but only relative to a given framework of assessment (e.g., local cultural norms, individual standards), and correspondingly, that the truth of claims attributing these properties holds only once the relevant framework of assessment is specified or supplied. Relativists characteristically insist, furthermore, that if something is only relatively so, then there can be no framework-independent vantage point from which the matter of whether the thing in question is so can be established.

Relativism has been, in its various guises, both one of the most popular and most reviled philosophical doctrines of our time. Defenders see it as a harbinger of tolerance and the only ethical and epistemic stance worthy of the open-minded and tolerant. Detractors dismiss it for its alleged incoherence and uncritical intellectual permissiveness. Debates about relativism permeate the whole spectrum of philosophical sub-disciplines. 

From ethics to epistemology, science to religion, political theory to ontology, theories of meaning and even logic, philosophy has felt the need to respond to this heady and seemingly subversive idea. Discussions of relativism often also invoke considerations relevant to the very nature and methodology of philosophy and to the division between the so-called “analytic and continental” camps in philosophy. And yet, despite a long history of debate going back to Plato and an increasingly large body of writing, it is still difficult to come to an agreed definition of what, at its core, relativism is, and what philosophical import it has. This entry attempts to provide a broad account of the many ways in which “relativism” has been defined, explained, defended and criticized.


A standard way of defining and distinguishing between different types of relativism is to begin with the claim that a phenomenon x (e.g., values, epistemic, aesthetic and ethical norms, experiences, judgments, and even the world) is somehow dependent on and co-varies with some underlying, independent variable y (e.g., paradigms, cultures, conceptual schemes, belief systems, language). The type of dependency relativists propose has a bearing on the question of definitions. Let us take some examples.

Justice is relative to local norms.
Truth is relative to a language-game.
The measurement of temperature is relative to the scale we use.

Each of (a)–(c) exhibits a relation of dependence where a change in the independent variable y will result in variations in the dependent variable x. However, of the three examples cited above, normally only (a) and (b) are deemed relevant to philosophical discussions of relativism, for one main attraction of relativism is that it offers a way of settling (or explaining away) what appear to be profound disagreements on questions of value, knowledge and ontology and the relativizing parameter often involves people, their beliefs, cultures or languages.

The co-variance definition proceeds by asking the dual questions: (i) what is relativized? and (ii) what is it relativized to? The first question enables us to distinguish forms of relativism in terms of their objects, for example, relativism about truth, goodness, beauty, and their subject matters, e.g., science, law, religion. The answer to the second question individuates forms of relativism in terms of their domains or frames of reference—e.g., conceptual frameworks, cultures, historical periods, etc. Such classifications have been proposed by Haack (1996), O’Grady (2002), Baghramian (2004), Swoyer (2010), and Baghramian & Coliva (2019). [A]… table classifies different relativistic positions according to what is being relativized, or its objects, and what is being relativized to, or its domains.

Read more:


Here come a few problems, especially with the sciences. For example, on “The measurement of temperature is relative to the scale we use...”, there is of course only ONE absolute in the sciences: absolute zero. Everything else in the sciences is relative. In religious beliefs everything is absolute — except our miserable life, which we try to morally manage in order to achieve the “absolute”. Reaching the Absolute infinite is silly — it only exists as a convenient corner store in mathematics, and does not hold water in the sciences. Sciences and religions are not compatible...

But being “relativistic” does not mean that the sciences are wrong. To the contrary. Being absolute means complete seizure and the impossibility of change. At this level, and at every level, religions are totally loony. Life and the universe are about change. Without change (the lack of an absolute) there would be neither life nor a universe... 

We have to thank Protagoras of Abdera, who was one of several fifth-century BC Greek thinkers. The group also included Gorgias, Hippias, and Prodicus, and was collectively known as the Older Sophists (before sophism became a circular argument). They were travelling teachers/intellectuals of rhetoric (the science of oratory) and related subjects. 
Protagoras is known primarily for three ideas: first, that man (human) is the measure of all things — a statement which is a form of radical relativism; second, that man (human) could make the “worse (or weaker) argument appear the better (or stronger)”, that is to say, be the best at bullshitting; and third, that one could not tell whether the gods existed or not. 

Some ancient sources claim that these views led to his being tried for impiety in Athens, but this may have been a legend. On the other hand, this was certainly the fate of Socrates...

Protagoras’ notion that judgments and knowledge are in some way relative to the person judging or knowing has been very influential, and is still widely discussed in contemporary philosophy (and the law). Protagoras’ influence on the history of philosophy has been significant. Historically, it was in response to Protagoras and his fellow sophists that Plato began the search for transcendent forms or knowledge which could somehow anchor moral judgment. Along with the other Older Sophists and Socrates, Protagoras was part of a shift in philosophical focus from the earlier tradition of natural philosophy to an interest in human philosophy. He emphasised how human subjectivity determines the way we understand, or even construct, our world, a position which is still an essential part of the modern philosophic tradition. We ask questions.

Asking questions is also at the core of Quantum Physics, in which the observer becomes part of the observations. In contrast, relativity (Einstein's) leaves the observer outside the physics being observed, yet it explains relationships between the states of the universe.

Einstein and Quantum physics exponents such as Bohr were at war. So far, both theories are “correct”, despite Einstein trying to disprove Quantum theory. At this stage, it’s not a matter of theoretical discourse, but of the scale of the subjects being observed (the theories are not scalable) — i.e. Relativity explores the big spaces of the universe, while Quantum physics deals with the “infinitesimally small”, where the behaviour of particles follows neither classical mechanics nor relativistic theories, though there are some overlaps in the systems of forces that create change. We, the mere living, know of two main ones: A) electromagnetism, which manifests in our measurements of light, magnetism, electricity, electronics et al.; B) gravity, which prevents us from flying off from good terra firma, unless we use a plane or a rocket to beat it…

The other two forces, about which we have no clue (the Quantum physicists know), are the weak force and the strong force. Strangely enough, the reach of the weak force is far smaller than that of the strong force, yet both are “miniature” forces. Without these two, the whole universe would be made of ordinary loose bits, if that. Quarks could not make a quorum. I mean a proton. So what are these forces? Who knows, but they can be measured precisely by the biggest machines. The name gluon came to describe the attachments between bits, etc. It’s still a bit vague, but smashing particles is fun… and you have to remember that as an observer, you will interfere with the observations.

So in order to make sense of this, we have to realise that WITHOUT THIS KNOWLEDGE, we would not be able to make atom bombs blow up. Same with “nuclear energy”, which is a kitchen-top-appliance adaptation of this knowledge to boil water, with radioactivity — itself a mix of electromagnetism and of a loosening of the weak and strong forces in the atoms, releasing particles from their bonds in the nuclei of heavy metals…

So far so good. We’ve gone goonish, far from Socrates and Protagoras, but they did not have the big machines to measure things that are very small.

In our modern world, we are beholden to the big machines. We need their eyes, otherwise we'll get confused. The eyes of the big machines will be our beholders...

Gus Leonisky
Quantum mechanician...
Read from top.

solving the schrodinger equation...

The goal of quantum chemistry is to predict chemical and physical molecular properties based on the positions of their atoms in space, avoiding laboratory experiments that require significant resources and time. Generally, this can be achieved by solving the Schrödinger equation; however, in practice this is hard to achieve.

Recently, artificial intelligence (AI) has been used to solve the Schrödinger equation in quantum chemistry. A team of scientists at Freie Universität Berlin has developed a means of calculating the ground state of the equation with AI, according to recent study results published in the journal Nature Chemistry. The deep learning method developed by the German researchers is capable of achieving an unprecedented combination of computational efficiency and accuracy, according to the report.  

AI has transformed many technological and scientific areas, from computer graphics to materials science. “We believe that our approach may significantly impact the future of quantum chemistry,” says Professor Frank Noé, who led the team effort. This deep neural network was designed by the team as a new way of representing the wave functions of electrons.

"Instead of the standard approach of composing the wave function from relatively simple mathematical components, we designed an artificial neural network capable of learning the complex patterns of how electrons are located around the nuclei," the professor explains.

Dr. Jan Hermann of Freie Universität Berlin, who designed the key features of the method used by the study, added that a special feature of electronic wave functions is their antisymmetry, meaning that they had to build this property into the neural network for the approach to work.

This feature, known as 'Pauli's exclusion principle,' resulted in the authors titling their method 'PauliNet.' Alongside the Pauli exclusion principle, electronic wave functions have other fundamental physical properties and much of the innovative success of PauliNet is that it integrates these properties into the deep neural network, rather than letting deep learning figure them out by simply observing the data.
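The antisymmetry the researchers describe can be illustrated with a toy example (my own sketch, not PauliNet itself; the three orbital functions and the electron positions are arbitrary choices). The classic way to build antisymmetry into a many-electron wave function is a Slater determinant: because swapping two columns of a matrix flips the sign of its determinant, exchanging two electrons flips the sign of the wave function, exactly as the Pauli principle demands.

```python
# Toy Slater determinant: exchanging two electron positions flips the
# sign of the wave function (the antisymmetry the article mentions).
import numpy as np

def orbitals(x):
    # Three arbitrary single-particle orbitals evaluated at position x.
    return np.array([np.exp(-x**2), x * np.exp(-x**2), np.sin(x)])

def slater_wavefunction(positions):
    # Columns: electrons; rows: orbitals. The determinant antisymmetrises.
    matrix = np.column_stack([orbitals(x) for x in positions])
    return np.linalg.det(matrix)

psi = slater_wavefunction([0.3, 1.1, -0.5])
psi_swapped = slater_wavefunction([1.1, 0.3, -0.5])  # electrons 1 and 2 swapped
print(psi, psi_swapped)  # same magnitude, opposite sign
```

The news piece suggests PauliNet builds this sign-flip property (and other physical constraints) directly into the network architecture, rather than hoping the network learns it from data.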

"Building the fundamental physics into the AI is essential for its ability to make meaningful predictions in the field," says Noe. "This is really where scientists can make a substantial contribution to AI, and exactly what my group is focused on."

Read more:


Read from top.


See also: