Back in the early 80s I came across the writings of an English cybernetician called Stafford Beer. Cybernetics is the study of systems, both natural and artificial, that interact with their environment via feedback which, in turn, modifies the system's behaviour. A car's accelerator is an example of this. You press the accelerator pedal and the car goes faster. You're not directly making the car go faster, however; the pedal is connected to a series of devices that eventually put more fuel into the engine via a number of feedback mechanisms which regulate the flow of petrol through the carburettor.
Stafford Beer was commissioned in 1971 by the left-leaning Chilean government led by Salvador Allende, which had come to power in 1970, to devise a cybernetic system for monitoring and regulating the Chilean economy, based on ideas contained in two books he wrote, "Designing Freedom" and "Brain of the Firm".
Beer's ideas were revolutionary, not only in concept but also in the way they exploited the then-new developments in computer science and information technology for economic modelling. The heart of Beer's approach was to use information networks to collect economic data in real time from enterprises nationwide. Beer used a process he called 'recursion' to filter the information gathered about, for example, a factory's performance, so that only the information needed at a higher level, say the national one, was passed on (more on this crucial aspect below).
The example Beer used in "Brain of the Firm" was a cement plant. Assume we have a good idea of how much cement is needed nationally; we know how many cement plants we have, where they are, and their theoretical production capacity. What we don't know is how much cement they're actually producing from day to day, nor how much demand fluctuates or where that demand is coming from.
He based his analysis on the fact that while we know the potential output of the plant, i.e. so many tonnes of cement per day, in reality machines break down, workers don't pitch up for work, and shortages of raw materials affect the actual output. A model could therefore be developed comparing the ideal output with the actual output. This information is then integrated with information gathered first from the localities where the demand originates and then from the country as a whole.
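Beer formalised this comparison of ideal and actual output as a set of simple ratios: 'actuality' (what the plant did), 'capability' (what it could do with existing resources) and 'potentiality' (what it ought to do if resources were developed). Here's a minimal sketch of those indices; the cement-plant figures are invented for illustration:

```python
# Beer-style performance indices; the figures below are invented examples.
# actuality:    what the plant actually produced today (tonnes)
# capability:   what it could produce with its existing resources
# potentiality: what it ought to produce if resources were developed

def indices(actuality: float, capability: float, potentiality: float) -> dict:
    productivity = actuality / capability    # how well existing resources are used
    latency = capability / potentiality      # how far resources fall short of potential
    performance = productivity * latency     # overall: actuality / potentiality
    return {"productivity": productivity,
            "latency": latency,
            "performance": performance}

# A plant rated at 1000 tonnes/day, able to manage 800 with its current
# machines and crew, actually producing 600 today:
print(indices(600, 800, 1000))
# productivity 0.75, latency 0.8, performance ~0.6
```

The point of the ratios is that a single dimensionless number per plant is all that needs to travel upward, rather than the raw detail of breakdowns and absenteeism.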
Beer took the notion of what I call an 'information map' and combined it with the ability to track and record the process in real time, and finally to integrate all the information through the telecommunications network (and this was before the Internet took off, and under a US-led embargo on the export of high technology to Chile, since the US had a problem with Allende's policies).
Of course this idea can be applied to almost anything, provided we have a goal, a knowledge of what resources we have, where they are, and what the demands are. But by now you're probably wondering what this has to do with the local elections.
Inherent in the government's long-term plans is the idea of 'devolution': virtually all services are delivered locally but planned nationally and implemented regionally, so the most efficient approach is for local communities to both create and deliver these services. After all, they're on the spot, so to speak, and are the first to know whether the 'plan' meets the requirements or not.
This is where Beer's recursive method of gathering and processing information kicks in. For local communities to be integrated organically into some kind of national plan of resource and service delivery, they need to be able to inform regional and national government of their needs, and in a timely fashion. Importantly, regional and national governments don't need all the detail, just the 'big picture': hence the recursive model, which passes on to a higher level only the information necessary to plan resource allocation in gross terms, such as the total amount of cement needed nationally.
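The filtering idea can be sketched in a few lines: each level passes upward only the aggregate its parent needs, while the detail stays where it is actionable. The place names, notes and tonnages below are all invented:

```python
# Sketch of Beer-style recursive filtering (all names and figures invented).
# Local reports carry operational detail; only the total crosses the boundary.

local_reports = {
    "north": {"cement_t": 120, "notes": "kiln 2 down for repairs"},
    "coast": {"cement_t": 95,  "notes": "demand spike near the port"},
    "south": {"cement_t": 210, "notes": "new housing scheme started"},
}

def roll_up(reports: dict) -> dict:
    # The notes never leave the local level; national planners see one number.
    return {"cement_t": sum(r["cement_t"] for r in reports.values())}

print(roll_up(local_reports))  # {'cement_t': 425}
```

The same `roll_up` step can be applied again from region to nation, which is what makes the model 'recursive': every level looks the same, just at a coarser grain.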
Because the system works in real time, it's based on current reality, not on numbers produced over several years of analysis which are inevitably out of date by the time the 'plan' kicks in. So if we take housing as an example, it's only at the local level that we really know what the needs are: first by totalling the number of homeless or poorly housed people in the community, at which point we can determine locally the amounts of cement, glass, wood, plumbing, cable and all the other materials needed to build the houses.
In turn, this information is passed on directly to national government (provincial government is probably redundant in any case) through a real-time Internet-based network, so that the government can tally the total numbers. In reality, of course, all kinds of factors will affect the situation from day to day and thus affect the information being produced, based on the actual output of houses.
Today, such ideas form the basis of 'New Economy' models built on integrating the 'supply chain': large corporations don't actually manufacture everything (or even anything!) directly but use a range of suppliers globally for the parts and sub-assemblies that make up the complete product, e.g. a computer. By linking or 'chaining together' all the suppliers in a real-time network, products can be manufactured literally on demand. This is how Dell Computer operates. The benefits of this approach are obvious: there's no inventory sitting in a warehouse wasting space; because the entire process is interlinked, turnaround time is reduced; the best prices can be obtained; and so on.
So too, a 'supply chain' can be developed between local and national government: the Internet carries real-time information collected from communities, which national government integrates, also in real time, to create a national 'picture' of needs and of how effective resource allocation actually is.
Is this a realistic objective? I think it is. Much of what has to be done over the next decade is being done essentially from scratch, hence we will be spending a fortune on computers anyway. We have a national information network. Computers and software are now pretty much turnkey, off-the-shelf products. What we lack are the skills, but a Marshall-type plan could make a big difference in a fairly short period of time, say five years. There are thousands of government workers with the skills and experience we can use to develop the 'expert systems' needed to model the public service.
Stafford Beer, by the way, is a person with a lot of innovative ideas and was well ahead of his time, although I have some problems with the hierarchical structure he proposed in "Designing Freedom" (the "buck stops in the President's office" approach). I've only skated over his ideas here; it would take far too much space to go into them in more detail. But for all budding public service workers and IT fundis, go to amazon.com and check out Beer's books. Both were still in print the last time I looked (published by Routledge & Kegan Paul) and are even more relevant today than when they were written 30-odd years ago.
All content on this site is copyright © 1987-2003 William Bowles unless otherwise stated. All rights reserved. You have the right to reproduce content from this site for not-for-profit, non-commercial or fair use. For commercial reproduction, please contact the copyright owner.