Archive for February 22nd, 2010

22/02/2010

The Use of Knowledge in Society – Hayek’s Philosophy

For Friday’s seminar I recommend the following texts:

1) The Use of Knowledge in Society

2) Economics and Knowledge

and the most important text on Hayek’s philosophy of mind as contained in the book “Sensory Order” – unfortunately it is not available on the Internet, so I attach it here:

G. R. Steele – Hayek’s Sensory Order

By Wednesday I will try to prepare a short summary of the most important theses, which I will post on this site.

MG

22/02/2010

Is the Universe a Universal Computer? (2002)

M. Mitchell,

Science, vol. 298, 4 October 2002, pp. 65–68

Abstract

The text is a critical review of Stephen Wolfram’s book A New Kind of Science, published in 2002 by Wolfram Media. The book itself is available for free (in a digital version) on Stephen Wolfram’s website, and it constitutes a kind of author’s credo on the nature of the world and, in consequence, on how science should be practiced. In the first part of her article Melanie Mitchell briefly acquaints the reader with the main thesis of Wolfram’s book, which is no easy task, as the book runs to 1200 pages. Wolfram’s main idea can, however, be described quite simply: the structure of the physical world is based on the theory of cellular automata, originally proposed by two mathematicians, Stanislaw Ulam and John von Neumann. Other mathematical structures discovered by scientists are accidental and very rare in nature. The main feature of some cellular automata is that, from the very simple instructions (programs) originally encoded in the automaton, they are able to produce very complex structures in which it is hard to decipher any regularity. Wolfram claims that science should be practiced so that we look for those “simple programs” in nature rather than fruitlessly try to describe the observed regularities in terms of standard mathematical equations. The key phrase is “computational equivalence”, which according to Wolfram is a new law of nature, and this new principle can illuminate many aspects of natural phenomena as well as fundamental philosophical questions.
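
To make the idea concrete, here is a minimal sketch of an elementary cellular automaton in Python – standard textbook material, not anything taken from the book. Rule 110 updates each cell from just its own state and its two neighbours, yet the pattern it unfolds is famously intricate (rule 110 is even known to be computationally universal):

```python
# A minimal elementary cellular automaton. Rule 110 is a standard example:
# each cell's next state depends only on itself and its two neighbours,
# encoded in an 8-bit lookup table.

RULE = 110  # the lookup table, written as an integer

def step(cells):
    """One synchronous update of a row of 0/1 cells (wrap-around edges)."""
    n = len(cells)
    return [
        (RULE >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Seed a single live cell and print twenty generations.
row = [0] * 31
row[15] = 1
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Even from this single seeded cell, the printed generations already show the kind of texture that resists any obvious closed-form description.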

Melanie Mitchell is not totally critical of Wolfram’s proposals. She thinks that in general he is on the right track: it seems that simple computer models can sometimes explain complex structures better than the traditional approach. This does not, however, constitute any “new kind of science”. Wolfram simply continues the very significant work of the pioneers of the computer age and of computability – von Neumann, Turing, Wiener – whose works are often disregarded in Wolfram’s book.

Many of his claims are rather speculations or suspicions which are not supported by any evidence. In particular, although we can observe in nature structures which seem to be the product of simple programs, we cannot say, as Wolfram would like to, that such structures are common, or that the ability to support universal computation is common. On the contrary, analytical approaches to illuminating complexity in nature have so far been much more successful than cellular automata.

Commentary

Melanie Mitchell is doubtless especially competent to write a critical review of Stephen Wolfram’s work. She has dedicated most of her research to complexity and how to cope with it. As a specialist in complex systems she is aware that, in order to illuminate such systems and decipher the rules that govern their behavior in nature, we probably have to look for methods entirely different from traditional linear mathematical equations. The direction of Wolfram’s research seems to be right. But it was not Wolfram who put mathematicians, physicists, biologists and even economists on that track. Even if the direction is correct, we have to be critical of ourselves and of our courageous ideas, especially as cellular automata are surely not the only way to cope with complexity, and not even the most effective one. It is far too early to bury the old methods. In many areas they still work much better than cellular automata.

22/02/2010

Undecidability and Intractability in Theoretical Physics (1985)

S. Wolfram,

Physical Review Letters 54 (1985), pp. 735–738

Abstract

The paper was published in 1985, long before Stephen Wolfram’s most important work, A New Kind of Science, was completed. The latter is not only a book but also a very wide, comprehensive and multidisciplinary program which, according to the author’s intentions, should essentially change the way science is practiced. The book covers most of the ideas previously expressed in separate papers, and the author himself has confessed that it is better to read the book rather than the articles. Nevertheless, there are at least two reasons why this text is still interesting: it touches on the crucial problems of computation in modern physics, and it is much shorter than the corresponding chapter of the book (“Fundamental Physics” – about 100 pages).

According to Wolfram, there is a close correspondence between physical processes and computations. At the very least we can say that physical processes seem to be computable, and theoretical physics aims to discover and properly formulate the algorithms that represent those processes. Algorithms, however, would be useless if we did not discover “shortcuts”, i.e. the equation or set of equations which describes a particular physical process and lets us plug in the initial data to obtain the results. It is almost universally accepted among physicists that such shortcuts are common in nature and that we should continue looking for them. Wolfram’s thesis is different.

Comparing physical processes to computers, Wolfram argues that there are computations, normally executed on a computer, which are “irreducible”. Irreducibility means that although we know the “program”, i.e. the instructions which are loaded into the computer in order to conduct the computation, we cannot predict the result by any kind of shortcut; we have to execute the whole computation step by step. Computational irreducibility is common in mathematics and in computation theory, but Wolfram claims that it is also common in theoretical physics, although it has not been noticed so far.
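
A schematic contrast may help; the notation here is assumed for illustration, not taken from the paper. A reducible process admits a closed-form shortcut, an irreducible one does not:

$$x_{n+1} = x_n + d \;\Longrightarrow\; x_n = x_0 + n\,d \qquad \text{(reducible: the $n$-th state in $O(1)$ steps)}$$

$$s_{n+1} = F(s_n), \ \text{no closed form known} \qquad \text{(irreducible: the $n$-th state only by $n$ applications of $F$)}$$

In the second case the fastest way to learn the system’s future is to run the system itself.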

Wolfram distinguishes three kinds of problems which can be submitted to a “universal computer” in order to get a solution. The first, denoted P, are those that can be solved in time polynomial in their size and are therefore tractable in practice. The second, PSPACE, are those that can be solved with polynomial storage capacity but may require exponential time, and so are in practice effectively intractable. The third, NP, consist in identifying, among an exponentially large collection of objects, those with some particular, easily testable property. A computer that could follow arbitrarily many computational paths could solve such problems in polynomial time; for actual computers they appear intractable. According to Wolfram, computational irreducibility may be widespread, even (or perhaps especially) among systems with a very simple structure. Cellular automata constitute the best examples: very simple instructions coded into the automaton may lead to irreducibly complex structures.
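
As a small illustration of the NP case – standard textbook material, not an example from Wolfram’s paper – consider subset sum: testing a proposed answer is cheap, while the naive search inspects an exponentially large collection of candidates:

```python
# Subset sum, a classic NP problem: *checking* a proposed solution
# (a "certificate") is cheap, while the naive *search* for one inspects
# an exponentially large collection of candidate subsets.
from itertools import combinations

def verify(numbers, subset, target):
    """Polynomial-time check of a candidate solution."""
    return sum(subset) == target and all(x in numbers for x in subset)

def brute_force(numbers, target):
    """Exhaustive search over all 2^n subsets -- exponential time."""
    for k in range(len(numbers) + 1):
        for subset in combinations(numbers, k):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 9, 8, 4, 5, 7]
print(brute_force(nums, 15))     # -> (8, 7), found by exhaustive search
print(verify(nums, (8, 7), 15))  # -> True, checked in linear time
```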

Do we have examples of such structures in physical systems? Wolfram argues that we do, and points to the following examples: electric circuits, hard-sphere gases with obstructions, networks of chemical reactions, and chaotic dynamical systems. Finding the minimum-energy conformation of a polymer is in general NP-complete with respect to its length; finding a configuration below a specified energy in a spin glass with particular couplings is similarly NP-complete. The examples can be multiplied. Wolfram in fact expresses a most important suspicion: it may turn out that computationally reducible systems, and the so-called “shortcuts” discovered so far, are very rare examples of structures in the world.

Commentary

The observation that in computation theory and in mathematics we do meet problems which are undecidable or intractable is not new. We have known about it at least since Gödel formulated his famous theorems, or since Turing presented his conception of the universal computer. However, the thesis that those unsolvable problems represent systems actually existing in the physical world is something new. New, too, is the suspicion that the theory of cellular automata may be the closest representation of such systems. Both hypotheses are also quite courageous. Why courageous? Because they seem very hard to prove, or even to render probable. When scientists find a shortcut which solves a significant set of problems, the only thing that can be said is that there exist solvable, computationally reducible problems in nature. The thesis that the whole of nature has the special feature of being describable by computationally reducible processes is an extrapolation. The adverse experience – concluding that there are processes for which a fitting shortcut has not been found yet, despite many attempts – does not even let us extrapolate to computationally irreducible problems in nature. This seems to be a purely philosophical speculation, and I can hardly imagine what kind of facts could falsify the hypothesis. Nevertheless it is still very interesting and worth considering, especially as it goes against the current. Another problem is whether, on the basis of such suspicions, we can really construct a new kind of science.

22/02/2010

Evolution of the Neokeynesian Phillips Curve

Jacek Wallusch

Ekonomista 5/2008, pp. 577–592

Abstract

The text analyzes the so-called neokeynesian (New Keynesian) Phillips curve. The Phillips curve was worked out by Alban William Phillips in 1958 and described in the article “The Relation between Unemployment and the Rate of Change of Money Wage Rates in the United Kingdom, 1861–1957”. The methodology applied by Phillips was very remarkable for its time: he collected a set of empirical data on the economic situation in the UK, set the series against each other, and concluded that there is an inverse correlation between the unemployment rate and the growth of wages (inflation). The relation suggested that when inflation was high, the unemployment rate remained low. Phillips formalized the observed relation into an equation. The curve was very interesting and initially seemed to capture one of the important features of the economy; however, it has never quite matched the relations actually observed in the market. In order to adjust it to reality and strengthen its predictive power, it has been corrected several times; one of the proposals, due to Phelps, was even rewarded with a Nobel Prize. The neokeynesian version is the next step in the curve’s evolution. The most important change is that the variables refer to expected values rather than actual ones. The author analyzes several versions of the curve, from the Calvo model (in which inflation depends on the forward-looking expectation of the inflation rate and a demand variable) up to the hybrid version (which augments the basic model with an autoregressive mechanism). The most interesting part of the text, however, is the attempt to apply the models to the Polish economy and to test them accordingly. The results are dubious, or at least ambiguous; according to the author’s cautious interpretation, the models may not work well under conditions of imperfect information and disinflation.
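
In schematic form – with notation assumed here, since the article’s own symbols may differ – the two variants discussed read:

$$\pi_t = \beta\, E_t[\pi_{t+1}] + \kappa\, x_t \qquad \text{(basic, Calvo-style curve)}$$

$$\pi_t = \gamma_f\, E_t[\pi_{t+1}] + \gamma_b\, \pi_{t-1} + \kappa\, x_t \qquad \text{(hybrid version)}$$

where $\pi_t$ is inflation, $E_t[\pi_{t+1}]$ the expectation of next period’s inflation formed at time $t$, $x_t$ a demand (or marginal-cost) variable, and the backward-looking term $\gamma_b\,\pi_{t-1}$ is the autoregressive mechanism of the hybrid version.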

Commentary

Neokeynesianism, mainstream macroeconomics and the rational expectations hypothesis seem to be doing well. Regardless of critiques and attacks from all directions, their apologists do much to maintain the common impression that economics can really be practiced in this manner and, moreover, that this paradigm is the only one that can lead us to accurate conclusions. When successive versions of the equation still fail the test of correspondence with the living market – woe to the market. It looks as if Lakatos’s model of falsification was correct: it is not easy to falsify a wrong theory with empirical evidence. Methodologically it is even dubious whether economic (especially neokeynesian) theories are falsifiable at all (I refer directly to Milton Friedman’s postulate that economic theory should be predictive and therefore falsifiable). The core of the Phillips curve theory remains stable and seemingly untouched, although neither of the variables named by Phillips appears in the equation any longer. What remains is only the general idea of binding the variables, or their subsequent alterations, into a function, and the strong belief that sooner or later we will be in a position to work out the function which will generate accurate market predictions. Maybe it is time to seriously question that general idea?

22/02/2010

Contemporary Versions of Materialism: Psychophysical Identity, Supervenience, Eliminativism

Stanisław Judycki

Zeszyty Naukowe KUL 38 (1995), no. 3–4, pp. 46–61

Abstract

The paper describes several versions of contemporary materialism. The materialism referred to in the text belongs to the philosophy of mind; what is discussed are mainly the problems of mind, its physical character and its mental states. The author starts from a division (or rather a typology) of the motives leading to the materialist position: 1. those which come from general physical assumptions; 2. motives which come from certain analyses of how our conceptual apparatus functions in fields other than psychology and neurophysiology, and from transferring the results of such analyses to the philosophy of mind; 3. motives based on successful inter-theoretical reductions, which assume that such reductions can also be applied to the relations between psychological and physical phenomena; 4. empirical reasons, which refer to the results of neurophysiological research. The typology is not very clear at first, but it becomes clearer as it is commented on in detail.

Physicalism, understood philosophically as the claim that all objects that exist are physical in character or are composed of other physical objects, is briefly described in the paper. More attention is devoted to conceptual analysis. It leads to the theory of mental identity, which is known in two versions: type–type identity and token–token identity. With reference to successful inter-theoretical reductions, the most often cited example is the reduction of so-called phenomenological thermodynamics to statistical mechanics. The author divides reductions into two categories: “soft” and “hard”. “Soft” reductions, unlike “hard” ones, do not eliminate the old, reduced theories, but explain the observables better. The empirical reasons also give rise to the identity theory; the most important facts are the discovered correlations between neuronal activity and mental events.

The identity theory is criticized on the basis of the notion of “identity” as explained by Leibniz. Its apologists answer that mental states are not “beings” whose identity can be asserted, but rather are of a logical (abstract) character, like the functional states of the whole organism. A thing like a “mousetrap” is identified by a description of its function, and the same applies to psychological notions. The adversaries of functionalism raise two arguments. First, a robot could possess all the functions required to say that it, too, has “consciousness”, which sounds strange. The other is the so-called “knowledge argument”: we cannot describe, by functions alone, the phenomenon of sight to a person who was born blind.

A different materialist position is supervenience. The author recalls D. Davidson and his theory. Supervenience does not eliminate mental events, but claims only that they are built upon a physical foundation: each change in the foundation causes a change in the mental events.

At the end of the text the character of so-called “folk psychology” is discussed, together with the problem of mind–body and mind–mind relations.

Commentary

On the basis of the text itself it is hard to identify the position of the author. The text is purely descriptive and tries to present, as comprehensibly as possible, the various ideas and solutions to the contemporary problems of mind and body, emphasizing the materialist approach. No conclusions are drawn at the end.

22/02/2010

On Certain Implications of Non-linearity in Keynesianism

Aleksander Jakimowicz

Ekonomista 1/2009, pp. 15–48

Abstract

The paper consists of two parts. In the first, the author presents some general information on a very modern and very distinctive approach to economics, or rather to macroeconomic modelling. This is set against the classical approach, which resembles classical physics, closed within the borders of thermodynamics and the law of energy conservation. The classical approach in the natural sciences looks for simple linear interdependencies among a limited number of variables. Economists who try to cope with macro-relations in the economy have so far used the same approach, unfortunately to no effect – meaning that there have been many significant discrepancies between the mathematical models and economic reality. This gave rise to criticism of Keynesianism. The modern approach in the natural sciences, however, recognizes the enormous complexity of the phenomena to be explained and their chaotic character. These conclusions led to the development of a set of mathematical tools useful for describing complexity and chaos. In the first part, readers become familiar with the beginnings of research on complexity and the first attempts to use its results in very different fields of science; interdisciplinary studies seem to be becoming the prevailing model of practicing science, contrary to what was suggested by Kuhn. We also learn that there is research in economics based on the presumption that the economy itself constitutes a complex adaptive system (Gell-Mann) and that its main feature (apart from its complexity) is that the set of principles describing the behavior of the system is not stable but evolves. The conclusion of the first part of the paper is that we may use some of the tools worked out to describe complex systems in order to formalize and correct some of Keynes’s theorems, e.g. the part of the general theory of employment, interest and money which claims the existence of a function binding domestic product with aggregate consumption expenditures, investments and governmental expenditures. The application of those tools is theoretically permissible because of certain logical homologies, which are specified in the paper.

The second part of the text consists of complex mathematical equations in which one of the main variables of the model – the marginal propensity to consume – is not constant but is expressed by a nonlinear function. Depending on the parameters of that function we may observe how the model behaves. The conclusion is that it behaves unstably, as is typical of chaotic models: relatively unremarkable changes in the shape of the consumption function change the macroeconomic indicators in an entirely unpredictable way.
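
A deliberately stylized stand-in for this mechanism – not the model from the article – can be sketched with the logistic map, the textbook chaotic recursion, read here as a multiplier that weakens as income approaches capacity. Two almost identical starting incomes diverge until the trajectories become unrelated:

```python
# A toy illustration, not the article's model: normalized income Y in (0, 1)
# follows the logistic map Y[t+1] = r * Y[t] * (1 - Y[t]).
# With r = 4 the dynamics are chaotic.

r = 4.0  # hypothetical parameter, chosen to lie in the chaotic regime

def trajectory(y0, steps):
    """Iterate the map from initial income y0 for the given number of steps."""
    ys = [y0]
    for _ in range(steps):
        ys.append(r * ys[-1] * (1.0 - ys[-1]))
    return ys

a = trajectory(0.400000, 50)
b = trajectory(0.400001, 50)  # the same economy, perturbed by one millionth

for t in range(0, 51, 10):
    print(f"t={t:2d}  Y={a[t]:.6f}  Y'={b[t]:.6f}  gap={abs(a[t] - b[t]):.2e}")
```

The printed gap grows roughly exponentially until it saturates, which is exactly the unpredictability described above.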

Commentary

The article is very interesting and convincing, especially its first part, where the most important features of modern science and the most important problems of economics are accurately captured. Research on complexity and chaotic systems – intuitively pointed to by some of the greatest minds of the twentieth century (Popper) as a field that science should explore – seems to be the right direction, especially in economics. The conclusion, however, is not very encouraging. The so-called “Lyapunov time” determines the maximum period over which, based on the model, any prediction of the variables’ values is possible; any extension beyond the Lyapunov time leads us into terra incognita. The main thesis of Keynes – that the market may not spontaneously approach stability and that governmental intervention is therefore required – is thereby undermined: such intervention looks tricky and may lead the economy astray. His main adversary, Hayek, may have been right.
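
For reference, a rough statement of the notion under the usual definitions (not taken from the article): a small measurement error $\delta(0)$ in a chaotic system grows exponentially,

$$\delta(t) \approx \delta(0)\, e^{\lambda t},$$

where $\lambda$ is the largest Lyapunov exponent, so a forecast stays within a tolerated error $\Delta$ only for roughly

$$t_L \sim \frac{1}{\lambda} \ln \frac{\Delta}{\delta(0)}.$$

Beyond $t_L$ the model’s prediction is no better than a guess, however exact the equations.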
