I figured out how to get a "reversal" of a field line. To encrypt, one chooses the origin mass corresponding to a plaintext byte P, then iterates a field line out to an endpoint A. The region around A is then searched for a point B whose inverse field line flows back to the same origin. So, plaintext byte P moves to the starting mass point corresponding to its value, iterates out along a field line, and lands on an "appropriate" nearby point B. This B is the encrypted point, and is sent across the wire. Decryption takes B, reverses the field, and plots a line that ends up in the correct mass, mapping B back to P.
The secret key is the full set of parameters for the vector field. This key can be quite large indeed!
I attached a rendering of an experimental encoding.
I am going to post some proof of concept code here in a day or two.
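In the meantime, here is a rough toy sketch of the scheme as described above. Everything concrete in it is made up for illustration: the vector field, its coefficients, the step size, the byte-to-origin grid, and the search radius are all hypothetical stand-ins, and a real key would be a much larger parameter set.

```python
import itertools

# Hypothetical parameters; in the real scheme the key is the full
# (potentially very large) parameter set of the vector field.
KEY = (0.31 + 0.17j, -0.12 + 0.08j)   # made-up field coefficients
STEPS, H = 32, 0.01                    # field-line iteration count / step size

def field(z, key=KEY):
    """A made-up smooth vector field parameterized by the secret key."""
    a, b = key
    return a * z + b * (z * z).conjugate()

def origin(p):
    """Map a plaintext byte 0..255 to its origin mass point on a 16x16 grid."""
    return complex((p % 16) / 16.0, (p // 16) / 16.0)

def forward(z):
    """Iterate a field line outward from z (plain Euler steps)."""
    for _ in range(STEPS):
        z += H * field(z)
    return z

def backward(z):
    """Reverse the field and iterate back (approximate inverse of forward)."""
    for _ in range(STEPS):
        z -= H * field(z)
    return z

def decrypt(b):
    """Reverse a field line from ciphertext point b; snap to nearest origin."""
    z = backward(b)
    return min(range(256), key=lambda p: abs(z - origin(p)))

def encrypt(p):
    """Iterate a field line out from origin(p) to an endpoint A, then search
    the region around A for a point B whose reversed line leads to origin(p)."""
    a = forward(origin(p))
    offsets = sorted(itertools.product(range(-3, 4), repeat=2),
                     key=lambda o: o[0] ** 2 + o[1] ** 2)
    for dx, dy in offsets:
        b = a + complex(dx, dy) * 1e-3
        if decrypt(b) == p:
            return b   # B is the encrypted point sent across the wire
    raise ValueError("no reversible point found near the endpoint")
```

The search around the endpoint matters because reversing Euler steps is only approximately exact; the chosen B is verified to round-trip, so `decrypt(encrypt(p)) == p` for every byte p.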
___
#Fractal #Field #Vector #Math #Space

There is a specialized subject area in electrical engineering that you can find in old books. It is called "coding".
If you see one of these old books, you might guess what coding means from the modern context, say maybe writing software, but no. The word was in use before software was what we know now. It comes from the early days when information theory was a brand-new idea, and has to do with the origins of information theory.
"Coding" in the old books refers to methods of converting analog signals into alphabetic or numerical values. A book about coding might give some theoretical basis, then pages and pages of specific coding methods, with diagrams and descriptions of the advantages of each method. The diagrams often include a Voronoi diagram with the two axes labeled, for example, frequency and amplitude, and the regions of the plane colored and labeled with codes.
Some methods have 2 codes: a 300 baud modem has just two points in phase space, representing 0 and 1. Others have more: a 1200 baud modem uses 8 points in the same phase space to encode 3x the information rate.
Feb 24, 2016
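The phase-space picture those old books draw can be sketched as nearest-point decoding against a constellation: a received sample is assigned to whichever code point's Voronoi region it falls in. This is a toy phase-only code, not any particular modem standard; with M points you get log2(M) bits per symbol, which is where the 3x comes from (log2(8) = 3 vs. log2(2) = 1).

```python
import cmath
import math

def constellation(m):
    """m code points evenly spaced on the unit circle (an m-ary phase code)."""
    return [cmath.exp(2j * math.pi * k / m) for k in range(m)]

def decode(received, points):
    """Nearest-point decoding: pick the code point whose Voronoi region
    the received sample falls in, i.e. simply the closest point."""
    return min(range(len(points)), key=lambda k: abs(received - points[k]))

two = constellation(2)      # 2 points -> log2(2) = 1 bit per symbol
eight = constellation(8)    # 8 points -> log2(8) = 3 bits per symbol
noisy = eight[5] + (0.05 + 0.02j)   # a received symbol, slightly perturbed
assert decode(noisy, eight) == 5    # still lands in the right region
```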
This is actually the origin of the fundamental ideas of information theory. People saw all signals as analog, and were exploring ways to have machines control other machines by electrical signals on wires. They were discovering ways to multiplex signals on a wire so they could control more machines, and the question came up: how many machines could you control over a wire? As many as you want? Was it practically infinite? How could we know? If you can multiplex a few continuous signals together on one wire, why not thousands upon thousands?
No one knew how to describe or understand the limits with analog signals!
Along came Claude Shannon, who had a way to answer these questions. He was able to generalize the question of how much control could be done with analog signals by considering the limits of encoding analog signals with discrete codes, where the number of codes approaches infinity and the discrete approximation approaches the analog signal.
By doing that he suddenly understood the deep implications of this theory, and noticed that there was a quantity that behaved like entropy in physics.
This was an amazing leap of insight: information had entropy.
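That quantity is what we now call Shannon entropy. For a discrete source it is H = -sum(p * log2(p)) over the symbol probabilities p, measured in bits per symbol:

```python
import math
from collections import Counter

def entropy_bits(message):
    """Shannon entropy of the observed symbol distribution, in bits per
    symbol: H = -sum(p * log2(p)) over the symbol frequencies p."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

assert entropy_bits("aaaa") == 0.0   # one symbol, no uncertainty
assert entropy_bits("abab") == 1.0   # one fair bit per symbol
```

A source with 8 equally likely symbols comes out at 3 bits per symbol, matching the constellation counting above.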
+Chris Thomasson
There are endless parallels between information theory and processes in fractals.
Together, fractals and information theory give insight into compression, and fractally distributed random processes as data sources show a breakdown of the old view of the entropy and information content of a signal. The fractal dimension of the source process's distribution is related to the information content, as well as to the entropy.
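As a concrete toy illustration of dimension-as-information-rate (my example, not anything from the post): a box count on the middle-thirds Cantor set asks how many boxes N(eps) you need to cover the set at resolution eps; the growth rate of log N against log(1/eps) is the fractal dimension, and it is exactly the rate at which the set demands more "codes" at finer resolution.

```python
import math

def cantor_points(depth):
    """Left endpoints of the middle-thirds Cantor set's level-`depth`
    intervals, as integers in units of 3**-depth (exact arithmetic)."""
    pts = [0]
    for _ in range(depth):
        pts = [q for x in pts for q in (3 * x, 3 * x + 2)]
    return pts

def box_count(points, box):
    """Number of boxes of integer size `box` containing at least one point."""
    return len({p // box for p in points})

pts = cantor_points(10)            # 2**10 points in units of 3**-10
n1 = box_count(pts, 3 ** 6)        # boxes of real size 3**-4
n2 = box_count(pts, 3 ** 2)        # boxes of real size 3**-8
dim = math.log(n2 / n1) / math.log(3 ** 4)   # slope of log N vs log(1/eps)
# dim comes out as log(2)/log(3) ~ 0.63, the Cantor set's dimension
```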
Nice disk,
not hyperbolic, LOL...
Feb 25, 2016