Frondity

“I can see clearly now the rain is gone

I can see all obstacles in my way

Gone are the dark clouds that had me blind

It’s gonna be a bright (bright)

Bright (bright) sun-shining day”

Johnny Nash

It’s a strange feeling to find that you have been lied to straight to your face by people that you have been taught to trust and rely on. Perhaps this is a bit melodramatic, but in some sense it is the way of the modern education system. In the sciences we are continually peddled simplified or dumbed-down versions of reality, only to find out, as we progress through the various levels of understanding the subject matter, that the models we thought fully described the processes and phenomena were actually too crude, insufficient or only approximations of the real deal.

There is no better example of this than the atom. Protons, neutrons and electrons. Yes, well, not quite. So protons, neutrons and electrons spinning around in orbitals. Yes, well, not quite. Well, protons, neutrons and an electron cloud. Yes, well, not quite. Quarks and leptons. Yes, well, not quite. And so it goes.

But let’s take a look at the electron cloud proposal. Essentially we are told that you can’t touch the electron or measure its speed and position accurately at the same time because it takes the form of a cloud, a kind of probability of its position at any one time. The density of the cloud represents the likelihood that the electron is at that point at that time. This has parallels in the way chance rears its head in the world of quantum mechanics. We are inside the atom and so the rules have changed, apparently.

Probabilities when viewed in 2D can often be represented well by the binomial distribution, which in the limit approaches the more familiar bell-shaped curve. What you see is that, for any given functional output, the measured outcomes over time can be expected to create such a curve around the mean value.
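You can watch tallied binomial outcomes pile up into a bell shape with a few lines of Python. This is a toy simulation of my own, not drawn from any dataset in the text: repeated runs of 100 coin flips, with the head-counts tallied into a frequency chart.

```python
import random

# Sum many coin flips (a binomial outcome) repeatedly and tally the results:
# the frequencies approximate the familiar bell-shaped curve around the mean.
random.seed(1)
trials, flips = 50_000, 100

tallies = {}
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    tallies[heads] = tallies.get(heads, 0) + 1

mean = sum(k * v for k, v in tallies.items()) / trials
mode = max(tallies, key=tallies.get)
print(round(mean), mode)  # both cluster near flips/2 = 50
```

Plot `tallies` as a bar chart and the hump around 50 heads is unmistakable: each individual run is random, but the ensemble is anything but.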

But there is something incredibly familiar about the Gaussian or bell-shaped curve. If you look at how Barnsley’s fern is plotted, typically there is a point (x, y) of (0, 0) where the plot begins afresh with each reset of the algorithm. Let’s consider (0, 0) as our point of reference and let us pretend that we are sitting on the paper or screen where the Barnsley fern is being plotted via its usual means, i.e. set initial conditions, random input, rules (if/else statements or guidelines), and repetition.

Now let us plot a curve such that we measure the concentration or density of the dots we see across our field of view, that is, our plane of view across the sheet or screen as they are generated. If we round or approximate these positions in any way (e.g. round to the closest degree) and plot the output as a frequency chart, we see the development of a perhaps marginally skewed version of the bell-shaped curve. More like a Tracy-Widom distribution. This is essentially a lopsided bell-shaped curve, as if someone has given it a little nudge. Believe me, also, when I say that understanding these types of distributions is central to understanding lots of current challenges in fundamental physics.
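The experiment above is easy to sketch. Below is a minimal Python version of the Barnsley fern, using the standard published coefficients, followed by a binning of the x-coordinates — the "side-on" density profile you would see sitting on the sheet. The bin count and point count are arbitrary choices of mine.

```python
import random

def barnsley_fern(n=200_000, seed=42):
    """Generate n points of the Barnsley fern via its iterated function system."""
    random.seed(seed)
    x, y = 0.0, 0.0
    points = []
    for _ in range(n):
        r = random.random()
        if r < 0.01:    # stem
            x, y = 0.0, 0.16 * y
        elif r < 0.86:  # successively smaller leaflets
            x, y = 0.85 * x + 0.04 * y, -0.04 * x + 0.85 * y + 1.6
        elif r < 0.93:  # largest left leaflet
            x, y = 0.2 * x - 0.26 * y, 0.23 * x + 0.22 * y + 1.6
        else:           # largest right leaflet
            x, y = -0.15 * x + 0.28 * y, 0.26 * x + 0.24 * y + 0.44
        points.append((x, y))
    return points

def x_histogram(points, bins=40):
    """Bin the x coordinates: the dot density seen looking across the sheet."""
    xs = [p[0] for p in points]
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    return counts

counts = x_histogram(barnsley_fern())
print(counts.index(max(counts)), len(counts))
```

The peak sits near the middle bins, with the mass falling away asymmetrically on either side: a lopsided, bell-like profile rather than a textbook Gaussian.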

Without the capacity to measure the intensity of the dots’ appearance in order to judge their displacement from our viewpoint, we lack the data needed to plot this out as the fern or frond. We are trapped by the limits of our dimensional existence in this example. But if some of the propositions of quantum mechanics, like multiple further dimensions, are to be taken seriously, then we must at least consider that in many ways we too are simply looking across the sheet with no overhead perspective. No drone to survey our worldview more appropriately. Like the characters of Flatland, we see what we see and this gives us a worldview that suits our basic needs.

It is always questionable to take correlation and assume causation but equally this is often the only way to define a hypothesis or conjecture. It should also help to define tests to prove or disprove the notions. In this case, I am willing to make that leap. The near bell-shaped curves we see in frequency distributions are the result of fractals generated in a higher level dimension that we can’t readily access. I contend that everything is fractal. All processes. Everything is emergent. Everything is incomplete and yet moving ever closer to a perhaps elusive perfection. The clouds of electrons have cleared. Now I see. Well I propose I see anyway. But how to test, I wonder? Follow @frondity for this and more.

Where to begin?

“We shall not cease from exploration

And the end of all our exploring

Will be to arrive where we started

And know the place for the first time”

– T.S. Eliot, Four Quartets

When I was young my father always made a very convincing case for his logic. When the need arose from time to time, he would simply say, “I’ve been around the world and back again and this is how it is”. It’s a compelling argument with a sound basis. It can be an erroneous thing to assert or even surmise things without experience. In that sense, I begin the process of publicising my thoughts having already done a lot of soul searching and still with some trepidation.

In time, I’ve also learned that plenty of people are happy to speak without experience and what is considered real or of value sits on a sliding scale that is pushed around by the masses on social media. So in the interests of peer review and establishing fact from dubious opinion, I submit my meandering thoughts to the world for demolition. Do your worst folks!

To honour some sense of respect for the progression from data through information to knowledge and wisdom, I reserve the right to place none of what I propose in any of these pots. Rather, these are meandering trains of thought based on my own experience, if such a pot exists. There, disclaimer done.

Now onto the nitty-gritty. With the disclaimer close by, this is perhaps the time to make the biggest leap. I think that the following ideas are worth strong consideration:

  1. Fundamentally, there was no such thing as the big bang
  2. Fundamentally, there is no such thing as one and zero
  3. Fundamentally, everything we know can be defined in terms of an iterative function system

But why would I come to think this? Where have these crazy thoughts come from? And what is an iterative function system when it’s at home? All I can say is what the stamp said to the letter. Stick with me and we’ll go places.

Rest assured, I didn’t arrive at this juncture without having “been around the world and back again” in some sense. I trained as a mechanical engineer. The Newtonian world view of forces and masses was my bread and butter. It’s an excellent profession for the rational mind. Generally, things stack up. But you also don’t have to go too far in life before they don’t. I remember working as an undergraduate on a project dealing with friction in thermoforming where solid plug materials are used to force heat-softened plastic sheets into moulds. Immediately, my Newtonian worldview was shattered. The complexity was staggering and I was hooked. Thermally variable viscoelastic properties of polymer materials set in the context of a “no-one is entirely sure of the mechanisms” view of friction. Suddenly, the role of previously obscure things like interfaces became an important part of my worldview.

But the thing that sat most uneasily with me was the effect that small changes in the conditions of experiments could make to results. All experimentalists will understand this. Equations are fine, but the world is complex. The way in which we deal with this is peculiar. Almost without exception, something along the lines of the butterfly effect gets touted and we round and average data to fit our standardised worldview. We sleep easier that way. That’s all very well, but the data in these experiments shows there is something that is almost always neglected when its ubiquity indicates the opposite should be true.

These thoughts never left me through other experiences working in organic and inorganic chemistry with nanomaterials to make nanocomposites whilst doing my PhD. Mixing of materials, and materials science in general, encompasses a smorgasbord of different effects, many of which are subservient to more commonly known physical laws. My supervisor at the time would speak of the importance of assessing materials with a plethora of experimental techniques and “through the length scales”. So I got a feel for many material properties and the effect of observing samples at different magnifications, if you like. Often, so close, I was almost looking at atoms. In other roles, and when working with repetitive processes like injection moulding, I could see that experimental variations pervaded this arena too.

Moving on to work in medical device development and production, I found the same issues. Pushing boundaries in microtechnology and getting a feel for the importance of lean manufacturing, I could see variation and effects arising in things like laser cutting, sputtering and general automated processes. Later, particularly working in plasma physics, I found anything remotely like a Newtonian model defunct. But like I say, any experimental scientist, particularly those dealing with large datasets will agree. Processes are complex and outright control is something of an enigma. I was lucky in my earlier days developing medical devices to pick up a book called “Chaos” by James Gleick. It was proverbial music to my ears. The things that had bugged me were being dealt with definitively. Amongst the enlightening ideas were things like Mandelbrot sets, fractal forms and Lorenz attractors.

Enthralled, I looked into the area a little bit further, discovering Barnsley’s Fern. It struck a chord instantly. Here was an idea that spoke to my pain. It was pretty much a physio with a knuckle in deep, kneading out years of trauma. It certainly freed up my thinking. It’s like the perfect analogy tool. I’m a strong believer that mathematics and analogy are essentially the same thing. I get a real kick from reading Max Tegmark’s viewpoints in “Our Mathematical Universe”. Equally, the lyrical waxings of Brian Greene, Emily Levine and Jim Holt are a real treat.

At some level, I think some aspects of what they have to say should be taken further. Why? Well, in short, because of the function that drives the development of fractal form… The Iterative Function System. But we need some context here.

The premise for each of the three wacky ideas listed above is that there was no bang and that everything is in effect emergent. That’s fine for theorists if you place it in the context of a standardised mathematical construct. The rules are there and then the matter (and anti-matter, for that matter) follows those rules. But I would propose that the mathematical framework itself is emergent. The reasons are many. Amongst other things, it’s about efficiency, longevity, consistency and a zero touch model for management. Of course laziness had to be in there somewhere. Critically, it’s about taking a first principles approach to the argument.

How do we consider or compare something and nothing? Difference. Difference must exist. Mathematically that’s the definitive break-point in a something vs nothing argument. To pickle your brain some more, for difference to be in any sense measurable sameness must exist. Different things need a frame of reference within which they can be compared in order to be considered “to exist”. So in that sense no two things are entirely different because they exist in the same set of things that can be measured. As an engineer, I understand many of these issues from the perspective of requiring datums for measurement and geometric tolerancing. Things can get overly complicated very quickly without such information management systems and tools.

This takes us to the next step. These properties must be ubiquitous. The properties of sameness and difference must be omnipresent. This is the qualification for and quality of space-time. For difference and sameness to exist a measurement must be made to qualify condition. Offsets in properties (differences) cause and are space-time. At the crux of this also is that a difference in space is a difference in time. Pair this last thought with the omnipresent argument and you have a qualification for the construct we know as time. If the basic premise is that things have to be different in order to exist they must be different in time as well as space and therefore we have constant change and an arrow of time.

So what is being put forward here? Well, essentially, it is that difference and sameness are everywhere and always. You could say that measurable difference must always exist or that difference and sameness must always exist. I would propose that the premise for a construct would be that difference and sameness must always exist in balance. As difference must always exist, there is no expectation that a static, stagnant balance could occur. I would also propose that gravity and entropy and every other symbol found in the equations of physics, Newtonian or otherwise, are simple out-workings of this process cycling forward through time with layer on layer of interacting entities. Stack that on top of the proposition that the mathematical foundations, including numbers, on which they are based are equally an outworking of a system creating perennial change. In some sense one and two aren’t what they were yesterday, and zero, in its purest form, never really existed. These concepts are difficult to comprehend without analogy, but as luck would have it, we have the perfect tool. Barnsley’s Fern. Indeed any fractal form would suffice, but I like Barnsley’s Fern because, well, it’s just a really beautiful thing.

As children, we didn’t have a huge amount of options in what we could watch on TV. Growing up in the mid-eighties was special because, along with the TV being largely poor, we had the freedom of the country. It’s a little bit clichéd, but we would roam the countryside looking for entertainment, and we knew the land around us because it was our playground. I do remember the odd thing on TV now and again. One was a show on RTE called Bosco. There was a feature in the show where the young viewer would get to discover all sorts of new things. I feel compelled to offer the same invitation that preceded this adventure each time, with the following rhyme… “Knock, knock, open wide, see what’s on the other side. Knock knock anymore, come with me through the magic door.”

Where do you find the properties of seamless difference, sameness, change, ubiquity, infinite complexity and simplicity bound in a seemingly finite single form? Fractals. That’s where.

The infinite property of the never-ending circle as you go round and round has symbolism for many. I remember it being alluded to in reference to the rings on the day my wife and I were married. The thing you find is that the closer you zoom in on the circle, the more it appears to take the form of a straight line. Not so, the fractals of Barnsley and Mandelbrot. Ordered but infinitely complex in form through the length scales. The world in a nutshell.

And so we come full circle by setting aside the circle and embracing the fractal. Everything that divides also unites.

Follow @frondity for some of the more practical insights of the fractal emergent universe

Yeah, but what has that got to do with the price of bread?

“The questions philosophers ask are not so much meaningless as irrelevant.”

Marty Rubin

Hot air. Useless rambling. A waste of your time. These are fair claims often proven true. To philosophise may be to over emphasise the value of your own thought but God forbid you should ever undervalue it.

The challenge with a proposition like a fractal universe based on iterative function is relevance. But to my mind that shouldn’t be a problem. After all fractality has relevance at all scales. So as much as it is relevant to superclusters and electron clouds it must have a relevance at our fingertips. So let’s explore how.

I think the key here is to assess the building blocks and outputs of the iterative function system for day-to-day relevance. We have explored this in general terms previously. Again with the example of Barnsley’s Fern, we have inputs of random values, processes that apply a range-based rule set to the input, and an output of a fractal form with self-similar structure through the length scales. We have also come to understand this fractal form as a representation of the data produced from the repetitive process. A map of functional output, per se. And the operation at play can be anything that takes this form. What has potentially random input, a set of rules applied based on conditions at hand and repetition in action? Well life, that’s what. Everything you recognise as life is subject to this style of process.

Life is if/else.

From heart beats through brainwaves to day to day mundane activities. Here’s a really simple question for you. What time did you eat your dinner this evening? I can guarantee you that if you plotted your dinner times over the last 5 years it would produce a near bell shaped curve or an overlay of multiple near bell shaped curves. Why? Because the time that you eat your dinner is an output of a process with random inputs and a set of rules governing how you deal with those random inputs. Lots of things affect the input to your decision making around meal times. But you place those things in the context of your need to eat and make decisions to make it happen. It’s an iterative function system that plays out every day of your life if you’re lucky enough. Better than that. Every action you take is likely to fall within this paradigm. Many rules you have no control over. Many so called rules are ignored.
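The dinner-time claim can be sketched as a toy model. Everything below is my own illustrative construction, not real data: each day's delay is the sum of several independent uniform disturbances (a late meeting, traffic, cooking time), and a couple of hard rules cap the outcome. The outputs pile up into a near bell-shaped curve even though no individual input is bell-shaped.

```python
import random

random.seed(7)

def dinner_time():
    """One day's dinner time: random inputs filtered through fixed rules."""
    target = 18.5                                        # rule: aim for 18:30
    noise = sum(random.uniform(-0.5, 0.5) for _ in range(3))  # random inputs
    t = target + noise
    return min(max(t, 17.0), 21.0)                       # rule: 17:00-21:00 only

times = [dinner_time() for _ in range(5 * 365)]          # five years of dinners
mean = sum(times) / len(times)

# Tally into half-hour slots to see the shape of the distribution.
bins = {}
for t in times:
    slot = round(t * 2) / 2
    bins[slot] = bins.get(slot, 0) + 1
mode = max(bins, key=bins.get)
print(round(mean, 1), mode)  # both sit near the 18:30 target
```

Swap the uniform disturbances for anything else with a finite spread and the hump survives; it is the summing and the repetition that make the bell, not the particular inputs.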

“A consistent man believes in destiny, a capricious man in chance.”

Benjamin Disraeli

But everyone has rules. Rules they create for themselves. These rules drive decision making and ultimately shape our lives. The key to being able to benefit from the local effects of a fractal universe is to harness the power of rule setting and decision making in our own lives. We have belief systems shaped by our previous experiences of pleasure and pain and the opinions of those we trust. We each have a set of values that plays a large part in defining our identity. Herein lies the rules.

“Habit is either the best of servants or the worst of masters”

Nathaniel Emmons

One of the great beauties of the Barnsley Fern is its effectiveness in metaphor. When the rules are unchanged and the process iterated, a series of what appear at first to be random dots slowly morphs into a recognisable image. This is the action of consistent thought and decision making. The work of the master who hones his art over long periods of time through strength of will. By contrast, this is also the pathway to addiction, where neurological associations are strongly reinforced to negative effect. Each its own fern. Each its own near normal distribution of behaviour.

We can play with the concept. Change the rules from time to time as the iteration continues. This represents important factors in our lives: the negative effects of a scatterbrained approach to decision making, but also the positive effect of exploring new experiences, doing things differently, and the capacity that each of us has for change. How old habits can disappear in the background of the adherence to new rules. The importance of a forthright attitude, perseverance, taking action and patience. These are not just ideas. They are manifested in iterative function systems. Visual mathematical proofs of philosophical thoughts. Now here is where I believe we have crossed a new frontier.
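Switching rules mid-iteration is easy to demonstrate. The sketch below uses two simple contractive maps of my own invention (not Barnsley's coefficients), standing in for an old habit and a new one; the iteration swaps rule sets halfway through and the trajectory settles toward the new rule's fixed point despite the random daily input.

```python
import random

random.seed(3)

OLD_HABIT = lambda x: 0.5 * x + 0.5 * 2.0   # contracts toward 2.0
NEW_HABIT = lambda x: 0.5 * x + 0.5 * 8.0   # contracts toward 8.0

x = 0.0
trace = []
for i in range(40):
    rule = OLD_HABIT if i < 20 else NEW_HABIT   # the rule change at step 20
    x = rule(x) + random.uniform(-0.1, 0.1)     # random daily input
    trace.append(x)

print(round(trace[19], 1), round(trace[-1], 1))  # near 2.0, then near 8.0
```

The old attractor is forgotten within a handful of iterations of the rule change: the "memory" of the old habit decays geometrically under the new contraction.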

Many will argue that a theory of everything must encompass gravity and quantum mechanics. I would argue that there is something lacking and perhaps even lazy in that sentiment, simply because it lacks relevance. Einstein had a theory of relativity. But it didn’t relate to the most important aspects of human experience. If anything it was a little abstract. Planck, Heisenberg and Schrödinger had theories that placed much on chance and were in many ways more abstract. Relevance was anything but intuitive. The unification of these two diverse paradigms is only possible from a first principles approach to the issue of existence. I would contend that what are touted as primary governing rules for our universe are more likely third- or fourth-order outworkings of underlying rules of a fractal universe. A “Frondity” if you like. If you don’t believe me, have a look around. It’s been staring us in the face since the dawn of time. The trees, the bracken, the brain, the electron cloud, the supercluster. Right under our noses and hidden in plain sight.

Follow @frondity for further insights

Implications of an iterative function universe

“God does not play dice with the universe”

Albert Einstein

There’s something intuitive about fractals and the iterative function systems that produce them. When you hear profound utterings, the kind that cut through swathes of chatter and feel like distilled truth, aspects of iterative function systems appear to provide great analogies.

To explore this further we need to get a feel for how the iterative function systems work. So let’s take Barnsley’s Fern as the primary example.

https://en.m.wikipedia.org/wiki/Barnsley_fern

The link on Wikipedia provides perfect background for this. What we see is a matrix-based function iterated using parameters that vary based on the value generated by a random number generator.

So we have random input, a rule set, application of a function based on the rules and repetition.
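Those four ingredients can be written down directly in the matrix form the Wikipedia article uses: each rule is an affine map f(x, y) = A·(x, y) + b, selected with a fixed probability. The coefficients below are the standard published Barnsley values; the table-driven structure is my own framing.

```python
import random

# Each rule: (a, b, c, d, e, f, probability), applying
#   x' = a*x + b*y + e,   y' = c*x + d*y + f
RULES = [
    ( 0.00,  0.00,  0.00, 0.16, 0.00, 0.00, 0.01),  # stem
    ( 0.85,  0.04, -0.04, 0.85, 0.00, 1.60, 0.85),  # successive leaflets
    ( 0.20, -0.26,  0.23, 0.22, 0.00, 1.60, 0.07),  # largest left leaflet
    (-0.15,  0.28,  0.26, 0.24, 0.00, 0.44, 0.07),  # largest right leaflet
]

def step(x, y, r):
    """Pick a rule by cumulative probability and apply its affine map."""
    acc = 0.0
    for a, b, c, d, e, f, p in RULES:
        acc += p
        if r <= acc:
            return a * x + b * y + e, c * x + d * y + f
    return x, y  # unreachable: probabilities sum to 1

random.seed(0)
x, y = 0.0, 0.0
for _ in range(10_000):
    x, y = step(x, y, random.random())  # random input + rule set + repetition
print(x, y)
```

However long you iterate, the point never escapes the fern's bounding box (roughly x in (-3, 3), y in [0, 10]): the randomness drives the process, but the rule set confines and shapes it.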

Let’s address the first aspect, random input. It was Einstein who said he believed that God does not play dice, in reference to natural laws. This was attached to his feeling that there was something perhaps unnatural in the way quantum mechanics is expressed in terms of probability. With that very proposition, the scientist, particularly the theorist, begins in some sense to lose their grip on a predictable and finitely testable universe. Two alternative viewpoints on reality with no apparent bridge.

These theories are often categorised on the basis of scale. I would here propose that we should instead categorise them based on how effectively they describe reality in environments with differing degrees of completeness in the informational description of the system.

The emergent properties of the fractal as formed through iterative function systems provide the bridge we need to understand the interaction between the quantum and relativity-based descriptions of the world we experience. I believe it can be shown that the relativity we experience is a function of rules that have emerged as well-described ferns or fronds, and that the quantum mechanical mechanism describes the processes by which these fronds/ferns transition from a few dots to a bigger new world of sorts in the recognisable image.

In many ways, particularly from an efficiency perspective, the energy expended in describing a universe at every scale is optimised by such a methodology.

There are interesting implications of this reality. Nothing is ever entirely complete, and in some sense nothing was ever completely nothing.

We can also go completely anti-Einstein and look at financial markets. Do equations describe every last aspect of these dynamic systems? Guy R. Fleury posits:

“When you use equations to describe what you see, it becomes very restrictive. Not because you cannot put equations on the table, but because you use an equal sign which makes it quite a categorical and unequivocal statement.”

Guy R. Fleury

A succinct and interesting take on an emergent phenomenon. I would concur with the proposition that equations may tend to approximate rather than outrightly define relationships at any level. Yet the simplicity with which they do so can be overwhelmingly beautiful and convince the user of an explicitness that can be elusive under the finest of scrutiny.

In many ways the fractal produced by iterative function runs in the other direction. A level of fuzziness is built into the system that is never 100% balanced and changes perennially. But with time the fuzziness is reduced, the form comes into focus and an illusion of constancy pervades. And so a level of fuzziness or variability in processes is almost a proof of their existence and persistence. Indeed, numeracy is just a human-based code developed to assist in understanding the world around us.

There were numbers but no fractals in mathematics when Wallace D. Wattles proffered that:

“There is a thinking stuff from which all things are made, and which, in its original state, permeates, penetrates, and fills the interspaces of the universe.”

“All is right with the world. It is perfect and advancing to completion”

Wallace D. Wattles

If he trod the world in this age, I imagine Wattles would have some time for a theory of the fractal emergent universe. Give it some rules and room to grow and away it goes.

This takes us to the next aspect, the rules. Here lies the great question. Was there an initial set of rules imposed upon the universe, as per the Barnsley Fern, or were they too emergent? My contention is that as gravity is an emergent function in the universe, so too are the rules of quantum mechanics, and so too are the rules of numeracy that help us express them.

We arrive back at the consideration of sameness and difference, and the contention that the fractal is the only format that delivers these two critical components of a reality on all scales. The functions of similarity, difference and an ongoing process to deliver an ever more complete picture of the universe are described effectively in an iterative function-based fractal universe.

I imagine that models can be defined that play out the emergence principles to this end. The trouble is that they can be expected to take on the properties of this emergent universe because they will play out in it. And so we come to a roadblock of sorts. We can’t get out to observe. But then, in “real” terms, would that be of any real benefit?

So we find ourselves in a universe where conditions take fractal form and have emerged through the assessment and cross reference of other fractal forms and are in constant dynamism in search for a more perfect description of the universe.

Lots of information is generated and stored in this process. In some manner this would explain the expansion of the universe. Information takes up lots of space. From an energetic perspective and from a computing perspective, it makes sense to keep associated data close by. You see this with your PC. Keeping data on the same drive allows for faster, more efficient processing. I would argue that this may be the basis for what we know and see as gravity. An efficiency driver for information processing for the most interactive elements (fractal forms, e.g. bodies) as they derive emergent fractal forms (motion) in any particular format. Consider the large-scale structure and movement of superclusters. Very fractal-like.

http://www.cpt.univ-mrs.fr/~cosmo/CosFlo16/index.php?page=scope

We see it on the large scale and we see it on the small scale. But what about that mid level where you and I go about our daily chores? Where is the impact there? Follow @frondity for more on this and much else.