
**We survived another round of elimination**

We're keen to keep an eye on work that rules out a mechanism or causal relation. It's what science is all about. Anyone can create a reasonable hypothesis, but it's more difficult to create one that survives attempts at disproof.

We're happy to have survived this round, in light of the recent constraints on the relative propagation speeds of electromagnetic and gravitational radiation.

Our mechanism treats them as fundamentally the same, propagated as scalar radial waves that collectively behave as a flux of vacuum energy. The effect, whether electromagnetic or gravitational, depends on the organisation (structure) of the matter that absorbs and emits the vacuum flux.

This means that detectors may measure gravitational flux at the same time as charge-based interactions; indeed, the signals may be spread behind the first arrival, depending on the intervening matter. Gravitational waves are less likely to have been diverted along circuitous paths, which leads to characteristic spreads of signals. We hope to provide some predictive analysis of this soon.

As a bonus, this flux varies the rate of quantum fluctuations as it interacts with matter, or with itself, in the intervening space. We are working on calculating whether this is a candidate mechanism for dark matter, but it will take a lot of simulation or integration to work out. Chaotic systems make this much more difficult.

**New gravitational wave detection with optical counterpart rules out some dark matter alternatives**

The recently reported gravitational-wave detection, GW170817, was accompanied by electromagnetic radiation. Both signals arrived at Earth almost simultaneously, within a time window of a few seconds. This is a big problem for some alternatives to dark matter...
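As a rough check of the scale involved, a short script can convert an arrival-time gap into a bound on the fractional speed difference between the two signals. The distance and delay figures below are illustrative assumptions of the GW170817 kind, not values taken from this post.

```python
def fractional_speed_difference(delay_s: float, distance_mpc: float) -> float:
    """Bound on |v_gw - v_em| / c implied by an arrival-time gap
    over a given propagation distance."""
    mpc_in_m = 3.0857e22       # metres per megaparsec
    c = 2.998e8                # speed of light, m/s
    travel_time_s = distance_mpc * mpc_in_m / c
    return delay_s / travel_time_s

# Illustrative figures: ~40 Mpc distance, a ~2 s arrival window.
bound = fractional_speed_difference(2.0, 40.0)   # of order 1e-16
```

Even a window of a few seconds over tens of megaparsecs pins the two propagation speeds together to better than a part in 10^15, which is why this event eliminates so many alternatives.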

**Unsolved problems: A. Extra Dimensions**

This is a draft article; we'll refine it for the Wiki.

> 1. Does nature have more than four spacetime dimensions?

> 2. If so, what is their size?

> 3. Are dimensions a fundamental property of the universe or an emergent result of other physical laws?

> 4. Can we experimentally observe evidence of higher spatial dimensions?

Answers, one at a time:

**> 1. Does nature have more than four spacetime dimensions?**

Answering this in terms of our mechanism, we have **just one dimension** for propagation, "phase" (which is the answer to the above question), along with an implied dimension for value. The other properties are emergent, either from the relationship between the phases of two waves entangled within a boson (mass-energy), or from the implied behaviour when these simple entities interact (see Q3). As an aside, the value changes as the entities propagate, and the two are related as v = -b·e^(-phase), where b is a basis axis for the value. This basis is common to all propagating entities.

If we have one dimension for propagation, then we don't really need to think about which direction entities propagate in; they just propagate 'away' from their origination event, and directionality emerges from the unique solutions to wave collapse. All entities simply have a phase p, which describes their distance and time from the event (x,t), where p = x = t.
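The value/phase relation above can be sketched numerically. This is a minimal illustration of v = -b·e^(-phase) with an arbitrary scalar standing in for the basis b; it is only our reading of the relation as stated, not a full implementation of the mechanism.

```python
import math

def wave_value(phase: float, basis: float = 1.0) -> float:
    """Value of a propagating entity at a given phase: v = -b * e^(-phase).
    `basis` stands in for the common basis axis b."""
    return -basis * math.exp(-phase)

# At the origination event (phase 0) the value is -b; it decays in
# magnitude as the entity propagates 'away', with p = x = t.
```

A quick call such as `wave_value(0.0)` returning `-1.0` shows the value at origination, with the magnitude shrinking monotonically as phase grows.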

**> 2. If so, what is their size?**

This question is aimed more at super-symmetry, string theory, and M-theory variants, where some dimensions are compactified or somehow different in nature from the regular manifold described by spacetime (or space, AdS, anti-space, and so on). These hypotheses typically speculate that an extra value or potential is stored within a field or entity, at some abstract distance from the entity, so that it appears weaker than a corresponding field in our immediate space. They do this to explain why the gravitational force is so weak.

In our terms, we don't think this is a valid question.

**> 3. Are dimensions a fundamental property of the universe or an emergent result of other physical laws?**

Dimensions are an artefact of geometric algebra, which is a way of organising instances of information, of identical types and of differing types. We studied this in some detail, following the generation of Clifford (geometric) algebras from combinations of simple properties, which gave us the number types used in mathematical physics. We think there is a very simple algebra and dimensional structure at the foundations of the most fundamental physical processes (and the meta-processes that form a more complete picture), which we mentioned above. Physics uses some more complicated constructions that are derived from those physical processes. However, we think this question is directed towards specific dimensions that are used by practitioners of physical sciences, rather than being about the notion of what dimensions are at the foundations of algebra.

**Spacetime is emergent**, as is the 3D space manifold that we relate to. Time is emergent, but very closely related to phase propagation, and generates the sequencing for the collapse of quanta. Time's Arrow and consciousness, we'll save for another article, because they are not the focus of our studies.

To do physics in the Standard Model, we create abstract quantities that make sense when relating our observations to an underlying mechanism. We expect these are emergent from the structure and interactions of simple fermions, and of the bosons that are the propagating constituents of those fermions. So, for example, the weak interaction is the first collapse of a boson shell; the strong interaction is the confined collapse of quark constituents in baryons; electromagnetic interactions are a system's non-confined (charged) exchanges of lighter bosons with the vacuum; and gravitation is the re-emission of vacuum flux by large bodies. They are all the same interaction, using the same simple mechanism, but they gain different emergent expressions depending on their large-scale organisation. The groups and algebras of the Standard Model (SU(3), SU(2), U(1), etc.) map well to those interactions and behaviours.

If we make some assumptions and try to interpret this from a spacetime perspective, or even from separated space and time, then propagation looks like an expanding sphere. We do not yet have a good explanation for how an R3 space emerges from phase values, or for how an entity maintains a distance in a manifold, but the dimensionality of solutions does play an important part in particle flavour, in the positional constraints this places upon their conserved collapse, and in how they maintain or lose their constitution.

**> 4. Can we experimentally observe evidence of higher spatial dimensions?**

Very unlikely, because the algebras used to construct spacetime naturally limit themselves to the (3,1) metric; trying to add a new dimension just creates a copy of one of the existing spatial dimensions, rather than a unique new dimensional basis. But at least the question is framed reasonably well. In experiment, we can only observe spatial changes, so any 'other quantities', such as activity in higher dimensions, are derived evidence rather than direct evidence.

For solutions to our wave-collapse mechanism, we are continuing work to derive fermion flavour more rigorously from the dimensionality of the interactions of boson shells. This too seems to be self-limiting to three, and it is possible to extrapolate some properties and constraint counts of these interactions to show one further entry along the line of flavour, which corresponds to the propagating boson. However, this state is not localised, and is not a fermion.

_ _ _

The wiki is at http://johnvalentine.co.uk/po8

_ _ _

**Challenge a 'Theory' - suggest a list of problems**

I've been developing a simple-as-possible mechanism for **fundamental physics** [1]. In my papers, I've cherry-picked aspects of physics that I've found interesting, and described them in my own terms: things like black holes, vacuum interactions, matter/anti-matter prevalence, and the constitution and processes of fundamental particles and composites.

In the next few articles here, I want to **take a very challenging list, and work through it, offering an interpretation in terms of my work**. Each question would result in an article that might take about 10-15 minutes to read.

This might seem pretentious, and I'm not going to write anything that would win prizes, but I thought it would be fun, and I'd like to think that it would expose problems in my ideas that cherry-picking would not uncover. I hope it would also make interesting reading, and trigger debate. That is the way science works: try very hard to disprove an idea.

To be clear, I'm not using this exercise as a way of getting academics to look at my work via the back door (professionals hate that!). I've used the correct channels in the past (conferences, publications, etc.), so I know how they work.

**I'm looking for suggestions for a good list to work through**, in, say, high-energy physics, cosmology, or mathematical physics. One possible list is the **List of Unsolved Problems in Physics** on Wikipedia. Can you suggest something better?

[1] http://johnvalentine.co.uk/po8

_ _ _

**Multiverse with sufficient constraints = deterministic**

We're particularly interested in pictures and views of reality, especially when aspects of QM and QFT can be explained without poor analogies.

The Multiverse is doing the blog rounds this week, with opposing opinions [1,2] about its validity as our perspective on reality, and about how other ideas might make a good case for the Multiverse.

We wrote about this a few years ago [3], but we're linking to it now, as we've not given it an airing here.

**Spoiler: we don't need a Multiverse if there are sufficient constraints, and we define our maths carefully.**

If you read our Synchronicity article (which I really should rename, because it sounds like a crank title), you might have noticed that the mechanism imposes sufficient constraints (on what looks like a random choice from an infinite number of futures), and makes systems deterministic and knowable -- without a Multiverse. We get to keep the quantum stuff.

#multiverse #noMultiverse #deterministic #dice #havingCakeAndEatingIt

[1] Ethan Siegel - https://www.forbes.com/sites/startswithabang/2017/10/12/the-multiverse-is-inevitable-and-were-living-in-it

[2] Sabine Hossenfelder - https://twitter.com/skdh/status/918696685538222081

[3] http://johnvalentine.co.uk/po8.php?art=many_worlds

_ _ _

**Humanity’s Output, Humanity's Legacy**

The race is on! Will we understand the Universe before we destroy ourselves?

http://johnvalentine.co.uk/index.php?art=humanity_output

We explore how the far-future frontiers of human impact will determine whether or not we will survive as a species.

We’ll also look at how our current activities are contributing to our genetic-imperative goal of long-term survival.

_ _ _

**Synchronicity: How our universe might be 'ticking'**

We devised a mechanism for matter and vacuum, which describes most things well. In this speculative article, we explore one of the unusual emergent properties of the mechanism: a possible **phase synchronicity**.

* The interacting matter has a 'synchronised tick', repeating every unit of Planck time.

* The maths shows that other matter and energy may exist 'out of phase', crossing the same space, without interacting with our own matter.

* This energy may modulate the bosons of a system in any phase, increasing the probability of collapse into fermions.

* The vacuum might be more complicated and extensive than we expected.

* Speculatively, this might be a candidate description of dark energy and (separately) dark matter.
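For scale, the 'tick' interval mentioned above, one unit of Planck time, can be computed from standard constants. This is a plain numerical aside using CODATA values, not part of the mechanism itself.

```python
import math

# Planck time: t_P = sqrt(hbar * G / c^5), using CODATA constant values.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

t_planck = math.sqrt(hbar * G / c**5)   # roughly 5.39e-44 seconds
```

At around 5.4 × 10^-44 s per tick, any synchronised repetition at this scale would be far below the resolution of direct measurement.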

http://johnvalentine.co.uk/po8.php?art=synchronicity

(with optional links to recommended prior reading)

_ _ _

**Blind Spot**

Just a small silly thing this time. In my 2014 work, I ran integration computations to obtain the radius at which a wave may collapse in a vacuum flux. I found the multiple to be 0.693147... I published it to 6dp.

I failed to recognise this as (ln 2).

There's probably a very good reason for this, and it's a very simple reduction of an integration of an equation having many terms. In the image, *V(r)* is the volume of a sphere at radius *r*.
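The near-miss is easy to verify: to six decimal places, ln 2 matches the published multiple exactly.

```python
import math

published_multiple = 0.693147   # the value published to 6 d.p.

# ln 2 = 0.6931471805..., which rounds to the published figure.
difference = abs(math.log(2) - published_multiple)
```

The difference is below 5 × 10^-7, i.e. within the published precision.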

_ _ _

**Black Holes: Information Paradox, Accretion, Distillation, Late Universe**

An extract from http://johnvalentine.co.uk/po8.php?art=black+hole

**Can this help to solve the Black Hole Information Paradox?**

The Black Hole Information Paradox is founded on the idea that a black hole seems to destroy information. Using the above, and treating the information as individual waves or bosons, rather than as fermions or composite particles, we find that although the matter may change constitution within the body, and separate to generate radiation, the overall mass-energy will be conserved, and nothing is destroyed.

**Entering the black hole**

Any regular matter entering a black hole is likely to be absorbed into the main body. On its way in, the matter will be unable to maintain its constitution, as the dense flux contributes a greater probability of collapse to the matter, intercepting the usual re-constitution process. Lighter parts will be stripped away first, like electrons, and any bosons that were tenuously confined. As the environmental flux increases, the matter will be increasingly challenged, and eventually overcome, its bosons contributing to the plasma of the environment and eventually to the body of the black hole itself.

**Inside the black hole**

When inside the black hole, the original matter will be unrecognisable. The largest conserved units will be the bosons, which will have been separated from their 'siblings', with improbable chance of ever re-constituting in the form they entered. Any higher-level structures will have been destroyed, their bosonic parts scattered. The fermions that those bosons do make will have new identities, using parts that will likely have never met before. The environment will be an intense plasma, with bosons interacting at a rate many orders of magnitude higher than in the human environment, or even within a star.

Inside the black hole is a plasma of bosons, forming temporary fermions whose identities are unlikely ever to be repeated (having unique constituents every time). Lighter bosons will conduct through structures of massive bosons, with the massive bosons migrating towards the centre of the structure, because that is where the directional probability of collapse takes them, as per our definition of gravitation.

**Distillation**

Given that the heavier bosons are drawn inwards, and the lighter bosons less so, there is a tendency for the matter to be distributed radially by mass-energy. Further, the lighter bosons have a higher probability of evaporation, which occurs when the environmental flux outside the black hole is capable of successively collapsing a boson outwards, against the probabilities implied by the flux gradient.

This is analogous to distillation, where lighter fractions more readily evaporate than heavier fractions.

**The Late Universe**

After a black hole is firmly established, it will accumulate matter until the environment can no longer provide it. After that, there will be a point in time where it radiates more mass as lighter bosons than it absorbs as ordinary matter, while confining the massive bosons. This evaporation will occur in notably longer stages with each tier of mass, each stage being several orders of magnitude longer than the previous, simply because the overall probability of escape is a very high power of the quantum probability of escape.
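A toy way to see why the stages stretch so dramatically: if escaping a tier requires roughly N successive quantum escapes, each with probability p, the expected stage length scales like p^-N. The numbers below are arbitrary placeholders for illustration, not values derived from the model.

```python
def stage_length(p_escape: float, n_required: int, base_time: float = 1.0) -> float:
    """Expected stage duration when escape needs n_required consecutive
    quantum escapes, each with independent probability p_escape."""
    return base_time / (p_escape ** n_required)

# Each additional required escape multiplies the stage length by
# 1/p_escape, giving the 'orders of magnitude longer' behaviour per tier.
```

With p = 0.1, for example, each extra required escape makes the stage ten times longer, so successive tiers separate by orders of magnitude.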

We predict that the late universe will contain only light bosonic radiation, and black holes having a high-mass confined flux, like a giant exotic hadron. Predicting the flux density of environmental low-mass radiation, such as neutrino constituents, depends on cosmological expansion being included or omitted.

**Summary**

Throughout our description of a black hole, we have used no special mechanisms that are unique to a black hole; only a mechanism that is common to all matter.

The most notable feature is that our description of gravity, as an emergent statistic of the collapse of fermions, serves the model well here, providing the internal stratification of the black hole, its flux gradients, and its evaporation characteristics. Further, we have described Hawking radiation, along with a deterministic basis for quantum fluctuations, again without specific mechanisms.

**Bridging incompatible representations**

We are hopeful that this mechanism might help with understanding a possible framing for the quantization of gravity, while not explicitly combining General Relativity and Quantum Field Theory. Where leading theories treat gravitation as an addition to the quantum properties (or vice versa), we count them once, as the same interaction. You can read more about this in *Quantization of Gravitation*.

#blackHoles #gravitation #entropy #eventHorizon #evaporation #hawkingRadiation

_ _ _

**Quantization of Gravitation**(or "Do we need to?")

This problem concerns the compatibility of General Relativity with quantum formulations. Quantization (the discrepancy in how values are packaged, and in what those values represent) is seen as the problem that must be solved in order to bridge these representations, and to provide a continuously valid representation at large and small scales. There currently exists no direct mapping or transition between the two.

Although we do not have a solution for this problem, we can instead show how gravitation manifests in our mechanism, having origins in continuous bosons, how it is quantized as fermion instances, and how statistics may be derived from fundamental mechanics.

**Lossy statistics**

When we build statistics, like a gravitational field representation, we inevitably lose some information about the bosons, or miss some details of the actual emergent behaviour of the target system. For example, our attempt to derive the Newtonian-like approximation (above) provided correlation at large distances and, further, improved upon the Newtonian formulation at short distances. However, it failed to account for actual phase values, which makes the approximation good only for decoherent vacuum currents, and it makes some assumptions about the homogeneity of the vacuum.

**How Classical physics lost its determinism**

Only in a truly deterministic model, with all information present, can we obtain an accurate outcome. Unfortunately, this presents some difficulties when we package our outcomes as generalisations, using preconditions that are statistical and emergent properties, or shorthand for more complicated states.

Our conventional statistics will not adequately describe the conditions; being statistics, they lose the information that is needed to show the 'quirky' behaviour at the scales where quantum mechanics proves more useful (but still falls short). Newtonian mechanics is shown to be unrepresentative of reality, and quantum mechanics can fall short too.

Quantum mechanics has its statistical losses: the vacuum is grossly oversimplified because it integrates detail into statistics, losing the instances of vacuum bosons, and replacing them with summarising parameters that were created to explain observations, in terms of deviations from a base model, e.g. coupling constants, expectation values, permittivity, permeability, various fields and corrections, derived potentials, and also free parameters that have no fundamental basis.

**Do we need to quantize? If not, what do we do?**

The physics community attempts to 'quantize gravitation' in order to reconcile the leading but incompatible representations. Both General Relativity and Quantum Mechanics are statistical in nature, operating from different representations and principles. They will forever be incompatible without a more fundamental intermediary (general) formulation.

If a conventional bridge between the two is possible, it will be formulated using a mechanism like our own, but there will be no direct mapping from one representation to the other; vital information will be lost at each stage. Further, we would not be able to provide the missing information to a statistical representation, without treating this information as corrections to the statistics. That is untidy, and by focusing on the corrections as quantities in their own right, might mislead us about the foundations that are critical to emergent behaviour. Both LQG and string theory are too aligned to their respective foundations to be truly 'bridging' in this respect. Our future work will assume our own foundations, from which the other formulations may be derived as statistics (it is not something we have the expertise to do with rigour, but we will attempt at least an outline).

Finally, we'd like to say that the statistical and correction problems apply not only to reconciling GR (and in particular, gravity) to QM, but also to most other aspects of high-energy physics and cosmology that are troubling the physics community.

--- extract from 'gravitation' article, http://johnvalentine.co.uk/po8.php?art=gravitation

#gravitation #stringTheory #QM #quantumMechanics #statistics

_ _ _

**Flux Density, Gravitational Field, and Hubble's Law**

tl;dr: (1) the gravitational field is related to the *gradient* of flux density; (2) is there a better fit to red-shift data?

In our previous post, we mentioned that flux density corresponds with Compton radius. It might then seem reasonable to think that you could test for blue-shifting of nearby massive objects, because they have a higher flux density. However, this is not necessarily the case.

We propose that the flux density in our neighbourhood (on the scale of galactic clusters or filaments) is very large when compared with the *extra* flux density that massive objects (like our Sun, or Jupiter) conduct. Incidentally, this makes gravitation quite weak when compared to charge-based effects, which can harness more of the vacuum flux into a coherent current.

This means that, in our neighbourhood, there is a lot of flux to reduce the Compton radius, but a relatively small flux *gradient* (from source to receiver) that would contribute to blue/red-shifting. Local bodies are therefore not a good test for this hypothesis; instead, we must look further afield.

In a cosmological picture that assumes condensation rather than a Big Bang, the red shift will be approximately proportional to distance (Hubble's Law), because the light from distant sources was emitted when the matter was in a rarefied flux.

It might therefore be interesting to examine measured red-shift values of sources against the expected vacuum flux conditions at the time of their emission, to see if it has a better fit than the general trend identified by Hubble's Law.
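Any such comparison starts from the linear Hubble trend. Below is a minimal low-red-shift baseline against which a flux-based prediction could be fitted; the Hubble constant value is an assumption for illustration, not a fitted result.

```python
def hubble_redshift(distance_mpc: float, h0: float = 70.0) -> float:
    """Low-z approximation of Hubble's Law: z ~= H0 * d / c.
    h0 is in km/s/Mpc (the ~70 default is an illustrative value)."""
    c_km_s = 299792.458   # speed of light, km/s
    return h0 * distance_mpc / c_km_s

# A source at 100 Mpc sits at z of roughly 0.023 under the linear trend.
z_100 = hubble_redshift(100.0)
```

Residuals of measured red shifts against this linear baseline, binned by inferred emission epoch, would be one way to test whether a flux-history model fits better than the plain trend.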

Perhaps more relevant to science, there are opportunities for disproof.

http://johnvalentine.co.uk/po8.php?art=gravitation

#vacuum #redshift #hubble #gravitation #flux
