posted 2 Mar 2014, 03:48 by John Brown

Computational fabric needs access to the moving average of multiple high-rate streams of highly variable data. I have spent a little time developing an algorithm to meet this need, and outline it here in the hope that it may be useful to someone.

I needed the algorithm to be very fast (efficient), thread-safe, and to give sensible results in the face of spurious outliers. The term medyn was coined to resemble dynamic-mean and/or dynamic-median. The algorithm more closely represents a dynamic-median than a dynamic-mean, and this is important to limit the impact of very large spurious values. The distinction between mean and median is marginal for my data, and the dynamics render it irrelevant. For my needs, a fast algorithm was more important than technical precision. Also in the interests of speed, I have chosen to work with 32-bit integer values, and to avoid floating-point arithmetic.

The medyn algorithm makes an incremental adjustment to the medyn value on receipt of every data point. If the prior adjustment was positive, and the new data point is greater than the medyn, then the adjustment is double the prior adjustment. Conversely, if the prior adjustment was positive, and the new data point is less than the medyn, then the adjustment is half the prior adjustment. In this way, the medyn accelerates increasingly rapidly towards a run of data points which are consistently above the medyn value, and decelerates when the run ends. To avoid overshoot, the adjustment is limited to some fraction of the gap between the new data point and the medyn value.

The rate of acceleration/deceleration, and the limit on velocity, may be tuned to suit the requirements at hand. Increasing the acceleration from doubling to 3x, 4x, 5x etc makes the medyn's response to new data more twitchy. Increasing the limit on velocity, to a large fraction of the remaining gap, allows the medyn to respond well to a step-change in the data points.
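As a sketch, the update rule described above might look like the following. This is my own reading of the description, not the attached implementation: the class name, the handling of a step that halves to zero, and the use of a power-of-two fraction for the velocity limit are all assumptions.

```java
// A sketch of the medyn update rule, using 32-bit integers only.
// The zero-step flip and the power-of-two velocity limit are assumptions;
// the attached Java implementation may differ.
final class Medyn {
    private int value;    // current medyn estimate
    private int step = 1; // signed adjustment carried between data points

    Medyn(int initial) { value = initial; }

    synchronized int get() { return value; }

    // limitShift caps |step| at gap/2^limitShift (the "fraction of the gap").
    synchronized int update(int x, int limitShift) {
        int gap = x - value;
        if (gap == 0) return value;
        boolean sameDirection = (gap > 0) == (step > 0);
        step = sameDirection ? step * 2 : step / 2;  // accelerate / decelerate
        if (step == 0) step = (gap > 0) ? 1 : -1;    // flip towards the data
        int cap = Math.abs(gap) >> limitShift;       // velocity limit
        if (cap == 0) cap = 1;
        if (step > cap) step = cap;
        else if (step < -cap) step = -cap;
        value += step;
        return value;
    }
}
```

Note how the outlier protection falls out of the rule: a single spurious point can move the medyn by at most double the previous step, however large the spike, because the doubling starts from the current (small) step.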

medyn noisy square wave

The charts show the medyn of high-variance data containing large spurious values, and of a noisy square wave function, with different limits on acceleration and velocity.

A simple rolling average specifies a fixed number of data points to include in the calculation. An exponential moving average considers all data points, but weights the more recent points progressively more heavily. The medyn calculation considers only the prior medyn value, the prior adjustment and a new data point. The medyn relies on the limits on acceleration and velocity to maintain an echo of prior data points.
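To make the contrast concrete, here is a minimal integer-only exponential moving average. The class name and shift parameter are mine, for illustration; the point is that the EMA always moves a fixed fraction of the remaining gap, with no acceleration and no velocity limit.

```java
// Minimal integer EMA for comparison with the medyn: one value of state,
// a fixed fractional response, no notion of acceleration.
final class IntEma {
    private int avg;
    private final int shift; // smoothing: larger shift = slower response

    IntEma(int initial, int shift) { this.avg = initial; this.shift = shift; }

    int update(int x) {
        // Move 1/2^shift of the remaining gap towards the new data point.
        // (>> rounds towards negative infinity for negative gaps.)
        avg += (x - avg) >> shift;
        return avg;
    }
}
```

Unlike the medyn, a single huge spurious value shifts this average by a fixed fraction of the (huge) gap, which is exactly the sensitivity the medyn is designed to avoid.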

A Java implementation is attached.

first results

posted 27 Feb 2014, 23:00 by John Brown   [ updated 27 Feb 2014, 23:17 ]

Late in 2008 I began working to test a clutch of ideas arising from my studies around intelligence. By early 2009, the ideas had evolved into a core design which has survived an arduous implementation. Now, 5½ years later, I am beginning to see some evidence that computational fabric is viable.

My over-riding aim has been to prove to myself that fabric can work. Done.

I do not understand fabric's limitations, or its potential - but it is at the very least novel and interesting. (what? why?)

Fabric is a tool that can be placed in a real-time environment and given an objective. It is not necessary to know anything about the environment, or how the objective might be achieved. The fabric needs access to real-time data from the environment, and the ability to act on/in the environment, but starts out naive of the nature of the available data or actions. No big deal, newborn animals do it all the time!

I have written software to create a working prototype of computational fabric, and run a few very simple tests. There have been challenges. Creating any novel thing is difficult; punctuated by the devil in the detail, by poor decisions, by lack of experience, by steep learning curves, by adapting tools, by creating new tools, and by exhaustion. The adage 5% inspiration, 95% perspiration holds fast.

I have constructed a very simple virtual environment to use as a test. Essentially, the environment is a ball on a hill. I can connect fabric to this hill environment at three points. The first point simply measures the distance from the top of the hill to the ball (-128 through +128). The second and third points nudge the ball to the left and to the right respectively.
hill test
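The hill environment itself is simple enough to sketch. The dynamics below are invented for illustration - the post does not describe how fast the ball rolls, the size of a nudge, or the tick rate - so every name and number here is an assumption:

```java
// A hypothetical sketch of the "ball on a hill" test environment.
// The roll rate, nudge size, and tick model are invented for illustration.
final class Hill {
    private int pos = 0; // signed distance of the ball from the hilltop

    int measure() { return pos; }               // sense point: -128..+128
    void nudgeLeft()  { pos = clamp(pos - 4); } // act point 1 (size assumed)
    void nudgeRight() { pos = clamp(pos + 4); } // act point 2 (size assumed)

    void tick() {
        // Unstable equilibrium: once off-centre, the ball rolls away
        // from the top until it reaches the bottom on either side.
        if (pos > 0) pos = clamp(pos + 1);
        else if (pos < 0) pos = clamp(pos - 1);
    }

    private static int clamp(int p) {
        return Math.max(-128, Math.min(128, p));
    }
}
```

With dynamics like these, "blue position = 0" is a genuinely hard objective: doing nothing sends the ball to an extreme, so the fabric must learn to nudge against whichever way the ball is rolling.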

My simple test involves creating two identical hill environments (blue & red) and connecting both to the fabric at the 3+3=6 points. Next, I generate a completely random swatch of fabric and ask it to keep the blue ball on the top of the hill (i.e. blue position = 0). The red hill is there simply as a control for comparison, with no set objective. Finally, I set the test in motion and monitor the position of the 2 virtual balls on the 2 virtual hills.

The chart shows the position of the two balls as time progresses (down the page) in a single 2-minute test. First you will notice that there is a lot more blue than red. Both balls spent a lot of time at the bottom of the hill on the left or the right (-128 or +128), but I have omitted these extreme values to save a little space. The red ball spent most of the time at +128, and the blue ball spent a lot more time away from the extremes. If the objective on the blue ball were met perfectly, then the chart would show a neat blue line down the centre of the chart. The result is far from perfect, but it is clear (to me) that the fabric is influencing the blue ball towards the central position, and largely ignoring the red ball.

These results are very satisfying; the basic mechanism of the computational fabric appears to work, but there are a number of areas which can be tweaked or tuned. After that, the fabric must be tested in increasingly complex and challenging environments. Much to do!

why fabric?

posted 4 Feb 2014, 20:37 by John Brown   [ updated 4 Feb 2014, 23:16 ]

Computers are able to follow a detailed set of instructions with stupendous speed, precision and reliability; far better than any human. A tiny spider, with a minuscule brain, is able to thrive in a complex, hazardous natural environment, from the moment it hatches; far better than any computer. Computational fabric sits somewhere in between.

Fabric is a new approach to computation, seeking less rigid and more autonomous capabilities. The idea behind fabric is to have a simple tool able to pursue multiple real-world objectives concurrently. Fabric needs:
  1. access to real-time data from its environment,
  2. a way of acting on its environment, and
  3. one or more objectives to pursue.
No human programming, or knowledge about the data, the actions, or the environment, is required.

Developing software for a computer requires a human engineer to develop a very detailed set of instructions. Every piece of data must be clearly specified, each calculation precisely defined, and each decision written as an instruction in logic (If X then do Y otherwise do Z). A computer can follow these instructions very quickly, and very reliably, but it cannot write its own instructions, or change the instructions on the fly. A computer will follow the instructions blindly, even if the data provided is rubbish. A human engineer can add checks and balances in logic to make the software more robust, but often more rigid as well. A computer is completely reliant on a human to provide useful instructions, provide sensible data, and interpret the results.

Animals operate very differently. An animal grows with a nervous system and a brain which continue to change for its entire lifetime. Brains have some broad similarities across the animal kingdom, but there are drastic differences between species, and significant differences between members of the same species. Different regions of a brain are more involved in one activity or another, but some species have unique functions, and other species develop quite different structures for a similar function. No two brains are even close to identical, even in the same species. Brains have at least some capacity to recover from damage.

An animal has the capacity to identify the important information in an environment, act accordingly, and ignore the noise. An animal can learn from its experiences, change its behaviour, and adapt to an entirely new situation. These things are very difficult to achieve with computers.

The structures found within a brain are starkly different to the structures found in human artefacts, including computers. Man-made things are usually easy to distinguish from natural things; they have a different structure because of the way they are engineered. Traditional engineering does not result in structures similar to brains, but if the structures found in brains are critical to their capabilities, then traditional engineering may struggle to build those capabilities.

The structure of computational fabric is grown, rather than engineered. Certainly, the basic growth mechanism is carefully engineered. However, the structure of the fabric grows and flexes continuously in pursuit of its objective. The structure is a result of the available data, the available actions, and the response of the environment.

If computational fabric can achieve its objective in an increasingly efficient manner, then I would say that the fabric is intelligent.

what is fabric?

posted 4 Feb 2014, 20:27 by John Brown   [ updated 5 Feb 2014, 14:27 ]

Computational fabric is quite different to traditional computer hardware and software. Fabric is grown into an "organic" structure, rather than built into an engineered structure; it is more like ivy than a building. why?

Software is a computational structure with a very clear distinction between the static program, and the dynamic data. The software must not change at runtime, lest it be unpredictable; only the data may change. Fabric is also a computational structure which accepts, manipulates and produces data. However, the entire structure of the fabric is continuously changing for the entire time that the fabric is "running". There is no real distinction between program and data - they are one and the same thing. If a software program is like a network of pipes under a city to direct the flow of water, then fabric is more like a river gouging its own path in a flood.

A peek under the hood of most software will reveal a very regular hierarchical structure. This is necessary for the software developers to manage the whole complex arrangement. Growing computational fabric most certainly results in a structure; but not like the regular warp and weft of a cotton sheet. The structure forms over time, depending on the flow of data, into an irregular tangle of pathways, each of which proved valuable at some point in time. Think of a map of Venice including every alley, bridge, canal, building, floor, stairwell, tide and tour group. Not like Chicago.

Computational fabric will never perform the same computation twice, not only due to the dynamic structure, but mostly because the computation is randomized at every minor juncture. Despite (and because of) the underlying chaos, fabric can pursue an objective. In fact, computational fabric can pursue multiple goals within the same structure, at the same time. Some regions of the fabric will be more involved in one goal or another, but the regions will be inextricably intertwined. Furthermore, inflicting damage on fabric (e.g. by removing a small swatch) will merely reduce overall performance, while the dynamic structure flexes and reforms to overcome the impediment.

Traditional computers can also undertake multiple concurrent tasks, but by a different mechanism. Typically, this is achieved by switching between multiple tasks extremely rapidly, or by combining multiple independent processors to work concurrently on distinct tasks. Alternatively, some complex tasks can be split up, progressed concurrently, and the results recombined at some later juncture. Traditional computers rarely behave well when arbitrary, small chunks of software or data are removed.

Traditional computers are also pretty particular about how they accept data. If tax data is required for a computation, then supplying temperature data will rarely have a happy outcome. The "meaning" of the data is critical to the correct operation.

Computational fabric will accept a real-time stream of raw data from the environment, without needing to address the "meaning" of the data. To achieve an objective, the fabric will also need to act on the environment, but without any "knowledge" of the action. Fabric will work with the available data and actions to achieve an objective.

Computational fabric is able to be powered by traditional, readily available, computer processors; the more the merrier. Each processor will be fully loaded up with a continuous stream of small computations to perform, as part of a much larger whole. There is no need to coordinate or synchronize multiple processors.

Computational fabric is a dynamic, irregular, non-deterministic, computational structure;
able to pursue multiple goals concurrently;
on a stream of raw, unstructured environment data;
stimulated by one or more, asynchronous, serial computational resources.

Computational fabric is in the early stages of development; much remains to be done to confirm, refine and extend these concepts.
