Archive for February, 2010

26/02/2010

The Neuroscientific Challenge to Criminal Responsibility

A lecture delivered by Prof. Stephen J. Morse, devoted to the problem of criminal responsibility in the light of recent advances in neuroscience.

Enjoy!

26/02/2010

Short review: Anticipated Emotions as Guides to Choice, Barbara Mellers and Peter McGraw, Department of Psychology, The Ohio State University, Columbus, Ohio

Text:

Anticipated Emotions as Guides to Choice, Barbara Mellers and Peter McGraw, Department of Psychology, The Ohio State University, Columbus, Ohio

(You can find it here.)

Abstract:

The paper concerns the impact of emotions on the decision-making process. The role of emotions in this process seems unquestioned, and consequently also in ethics, morality and law, where it appears to be a central issue. Moreover, recent neuroscientific research on moral judgments shows how important emotions are for the decision-making process.

In the paper the authors present a theory of anticipated pleasure called decision affect theory and show how it relates to decision making. It is claimed that when making a decision, people anticipate the pleasure or pain of future outcomes, weigh those feelings by the chances that they will occur, and select the option with the greater average pleasure. Emotions are thus compared to the utility term which determines a choice from a set of possible choices. On the other hand, the authors point out differences between utility and emotions. Utility is a rather stable description of outcomes, whereas emotions are influenced by many variables. For example, when the outcome of the unchosen gamble was more appealing, anticipated pleasure decreased, because people anticipate regret when they imagine having made the wrong choice. The authors also show that the magnitude of emotions changes when the outcome is more surprising. They conclude that utilities do differ from anticipated pleasure. In most theories of choice, utilities depend only on the status quo and no other reference points, whereas anticipated pleasure depends on multiple reference points. Furthermore, in most theories of choice utilities are assumed to be independent of beliefs. In contrast, the anticipated pleasure of outcomes varies systematically with beliefs about their occurrence; anticipated feelings associated with surprising outcomes are amplified relative to anticipated feelings associated with expected outcomes.
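
To make the contrast with a fixed utility term concrete, here is a minimal sketch in Python of a "subjective expected pleasure" rule: each outcome's anticipated feeling is weighted by its probability, amplified when the outcome is surprising, and reduced by regret toward the unchosen option. The gambles, payoffs and weighting parameters are purely illustrative assumptions, not the authors' own model.

```python
# A minimal, illustrative sketch (not the authors' model): choose between two
# gambles by probability-weighted anticipated feelings, where rare outcomes
# feel stronger and a better foregone payoff adds regret.

def anticipated_pleasure(payoff, probability, foregone_payoff,
                         surprise_weight=0.5, regret_weight=0.3):
    """Anticipated feeling for one outcome: pleasure from the payoff, amplified
    when the outcome is unlikely, reduced when the unchosen option pays more."""
    surprise = 1.0 + surprise_weight * (1.0 - probability)   # rarer -> stronger feeling
    regret = regret_weight * max(foregone_payoff - payoff, 0)
    return surprise * payoff - regret

def subjective_expected_pleasure(gamble, foregone_payoff):
    """Probability-weighted average of anticipated feelings over all outcomes."""
    return sum(p * anticipated_pleasure(x, p, foregone_payoff) for x, p in gamble)

# Two hypothetical gambles, given as (payoff, probability) pairs.
safe = [(10, 1.0)]
risky = [(30, 0.4), (0, 0.6)]

sep_safe = subjective_expected_pleasure(safe, foregone_payoff=30)   # best payoff of the alternative
sep_risky = subjective_expected_pleasure(risky, foregone_payoff=10)
print(sep_safe, sep_risky, "choose:", "safe" if sep_safe >= sep_risky else "risky")
```

Unlike a fixed utility function, the number attached to a payoff here changes with its probability and with what the rejected option would have delivered, which is exactly the dependence on beliefs and multiple reference points described above.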

Comments:

The paper is short and rather general. However, it shows one important thing that I briefly outlined in the paper “Extremes meet each other”: maybe the general rule determining a particular choice in the decision-making process is the simple one, {max (U)}, and maybe that is enough to replicate some kind of intelligent decision maker; however, replicating a real human decision-making process requires a good definition of utility. And when transforming emotions into utility, or utility into emotions, the hard stuff begins. If we want to build a human-level, or rather a human-like, program, we should program the utility according to all possible knowledge of the emotional results of particular states and actions. This particular thing seems to be impossible.

24/02/2010

Hayek's Theory of Mind and Its Anti-Naturalistic Implications

Dear All,

I have taken the liberty, without the author's knowledge or consent (!!!), of making some modifications to this post. Namely, the text by Marcin Gorazda, which constitutes a polemical response to the earlier works of Łukasz Łazarz, can be found here. It seems that this way of posting our papers is more reader-friendly.

Finally, nothing remains for me but to recommend reading Marcin's text!

24/02/2010

Texts Found Online

Hello,

below I post links to texts available online on topics close to our interests. Of course, there is much more; further lists of texts will follow soon!

1. B. Leiter, M. Weisberg – Why Evolutionary Biology is (so Far) Irrelevant to Law?

2. Morris B. Hoffman – Law and Biology

3. Nita A. Farahany – Law and Behavioral Morality

4. Dean Mobbs, Hakwan C. Lau, Owen D. Jones, Christopher D. Frith – Law, Responsibility, and the Brain

5. Jim Chen – Biolaw: Cracking the Code

6. Owen D. Jones, Erin A. O’Hara, Jeffrey Evans Stake – Economics, Behavioral Biology, and Law


24/02/2010

The Brain and the Law – Dr David Eagleman

Below I post a talk by Dr David Eagleman in which he touches on a large number of problems related to neurolaw. This talk should be treated as a brief introduction to the topic of law and the neurosciences rather than as a systematic lecture.

24/02/2010

The Gifford Lectures: M. Gazzaniga

Professor Michael Gazzaniga, a well-known psychologist and neuroscientist, delivered six lectures as part of the prestigious Gifford Lectures. Professor Gazzaniga is one of the leading figures involved in The Law and Neuroscience Project.

Lecture 1: What we are

Lecture 2: The Distributed Networks of Mind

Lecture 3: The Interpreter

Lecture 4: Free yet Determined and Constrained

Lecture 5: The Social Brain

Lecture 6: We are the Law

22/02/2010

The Use of Knowledge in Society – Hayek’s philosophy

For Friday's seminar, I recommend the following texts:

1) The Use of Knowledge in Society

2) Economics and Knowledge

and the most important text on Hayek's philosophy of mind, contained in the book “Sensory Order” – unfortunately it is not available on the Internet, so I attach it here:

G. R. Steele – Hayek’s Sensory Order

By Wednesday I will try to prepare a short summary of the most important theses, which I will post on the site.

MG

22/02/2010

Is the Universe a Universal Computer? (2002)

M. Mitchell,

Science, vol. 298, 4 October 2002, pp. 65–68

Abstract

The text is a critical review of a book by Stephen Wolfram, A New Kind of Science, published in 2002 by Wolfram Media. The book itself is available for free (in a digital version) on Stephen Wolfram's website, and it constitutes a kind of author's credo on what the nature of the world is and, in consequence, how science should be practiced. In the first part of her article, Melanie Mitchell briefly familiarizes the reader with the main thesis of Wolfram's book, which is not easy, as the book runs to 1200 pages. Wolfram's main idea can, however, be described quite simply: the structure of the physical world is based on the theory of cellular automata, originally proposed by two mathematicians, Stanislaw Ulam and John von Neumann. Any other mathematical structures discovered by scientists are accidental and very rare in nature. The main feature of some cellular automata is that, on the basis of very simple instructions (programs) originally encoded in the automaton, they are able to produce very complex structures in which it is hard to decipher or discover any regularity. Wolfram claims that science should be practiced so that we look for those “simple programs” in nature rather than laboriously trying to describe the observed regularities in terms of standard mathematical equations. The key phrase is “computational equivalence”, which according to Wolfram is a new law of nature, and this new principle can illuminate many aspects of natural phenomena as well as fundamental philosophical questions.
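
To illustrate what "very complex structures from very simple instructions" means in the cellular-automaton setting, here is a minimal sketch of an elementary cellular automaton; the choice of rule 30 and of the grid size is mine, not taken from the review.

```python
# Elementary cellular automaton: one row of 0/1 cells, each new cell determined
# only by its three neighbours via an 8-entry lookup table derived from the
# Wolfram rule number. Rule 30 is a standard example whose pattern looks
# effectively random despite the trivial update rule.

def step(cells, rule=30):
    """One update of an elementary CA with the given Wolfram rule number."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]        # rule number -> lookup table
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]

width, steps = 61, 30
row = [0] * width
row[width // 2] = 1                                    # a single black cell in the middle
for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Running this loop prints a triangular pattern whose central column is irregular enough to be used as a pseudorandom sequence, which is the kind of complexity-from-simplicity the review is concerned with.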

Melanie Mitchell is not totally critical of Wolfram's proposals. She thinks that in general he is on the right track. It seems that simple computer models can sometimes explain complex structures better than the traditional approach. This does not, however, constitute any “new kind of science”. Wolfram simply continues to develop the very significant work of the pioneers of the computer age and of computability: von Neumann, Turing and Wiener. The work of those scientists is often disregarded in Wolfram's book.

Many of his claims are rather speculations or suspicions which are not supported by any evidence. In particular, although we can observe in nature structures which seem to be the product of simple programs, we cannot say, as Wolfram would like to, that such structures are common, or that the ability to support universal computation is common. On the contrary, the analytical approaches to illuminating complexity in nature have so far been much more successful than cellular automata.

Commentary

Melanie Mitchell is doubtless especially competent to write a critical review of Stephen Wolfram's work. She has dedicated most of her scientific research to complexity and how to cope with it. As a specialist in complex systems, she is aware that in order to illuminate them and to decipher the rules that govern the behavior of complex systems in nature, we probably have to look for methods entirely different from the traditional linear mathematical equations. The direction of Wolfram's research seems to be right. But it is not Wolfram who put mathematicians, physicists, biologists and even economists on that track. Even if the direction is correct, we have to be critical toward ourselves and our courageous ideas, especially as cellular automata are surely not the only way to cope with complexity, and not even the most effective one. It is far too early to bury the old methods. They still work in many areas much better than cellular automata.

22/02/2010

Undecidability and Intractability in Theoretical Physics (1985)

S. Wolfram,

Physical Review Letters 54 (1985) 735-738.

Abstract

The paper was published in 1985, long before Stephen Wolfram's most important work, A New Kind of Science, was completed. That work is not only a book but also a very wide, comprehensive and multidisciplinary program which, according to the author's intentions, should essentially change the way in which science is practiced. The book covers most of the ideas previously expressed in separate papers, and the author himself has confessed that it is better to read the book rather than the articles. Nevertheless, there are at least two reasons why this text is still interesting: it touches on the crucial problems of computation in modern physics, and it is much shorter than the corresponding chapter of the book (“Fundamental Physics”, about 100 pages).

According to Wolfram, there is a close correspondence between physical processes and computations. At the very least we can say that physical processes seem to be computable, and theoretical physics aims to discover and properly formulate the algorithms that represent those processes. Algorithms, however, would be useless if we did not discover “shortcuts”, i.e. the equation or set of equations which describes a particular physical process and lets us plug in the initial data to obtain the result. It is almost universally accepted among physicists that such shortcuts are common in nature and that we should continue looking for them. Wolfram's thesis is different.

Comparing a physical process to a computer, Wolfram argues that there are computations, normally executed on a computer, which are “irreducible”. Irreducibility means that although we know the “program”, i.e. the instructions which are loaded into the computer in order to carry out the computation, we cannot predict the result by any kind of shortcut; we have to execute the whole computation step by step. Computational irreducibility is common in mathematics and in computation theory, but Wolfram claims that it is also common in theoretical physics, although it has not been noticed so far.
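
A small sketch of the contrast may help; the two toy processes below are my own illustration, not examples from the paper. For a simple doubling process there is a closed-form "shortcut", while for a rule-30 cellular automaton no shortcut is known, so the only way to learn the state after t steps is to execute all t steps.

```python
# Two toy processes. The first has a shortcut: its state after t steps can be
# written down directly. For the second (a rule-30 cellular automaton) no closed
# form is known, so the whole evolution has to be run -- Wolfram's irreducibility.

def doubling_shortcut(x0, t):
    """State after t steps of x -> 2x, obtained without simulating anything."""
    return x0 * 2 ** t

def rule30_center_column(t, width=201):
    """Centre-cell values of a rule-30 CA over t steps, computed the only known
    way: by sweeping the whole lattice step by step."""
    cells = [0] * width
    cells[width // 2] = 1
    table = [(30 >> i) & 1 for i in range(8)]
    history = []
    for _ in range(t):
        history.append(cells[width // 2])
        cells = [table[(cells[(i - 1) % width] << 2) | (cells[i] << 1) | cells[(i + 1) % width]]
                 for i in range(width)]
    return history

print(doubling_shortcut(3, 20))        # one arithmetic expression
print(rule30_center_column(20))        # twenty full sweeps of the lattice
```

The point is not that the second computation is long, but that, as far as anyone knows, there is nothing analogous to the formula x0 * 2**t that jumps straight to step t.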

Wolfram distinguishes three kinds of problems which can be submitted to a “universal computer” in order to obtain a solution. The first are denoted P; they represent a level of complexity (in terms of space and time) which makes them tractable in practice. The second, PSPACE, are those that can be solved with polynomial storage capacity but may require exponential time, and so are in practice effectively intractable. The third, NP, consist in identifying, among an exponentially large collection of objects, those with some particular, easily testable property. A computer that could follow arbitrarily many computational paths could solve such problems in polynomial time; for actual computers no polynomial-time solution is known. According to Wolfram, computational irreducibility may be widespread, even (or perhaps especially) among systems with a very simple structure. Cellular automata constitute the best examples: very simple instructions coded in an automaton may lead to an irreducibly complex structure.
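
To make the NP class in this list concrete, here is a small illustration of my own (the subset-sum instance is arbitrary, not an example from the paper): checking a proposed answer is trivial, but the obvious way to find one is to search an exponentially large collection of candidates.

```python
# Subset sum as a typical NP-style problem: among the 2**n subsets of a list,
# find one whose elements add up to a target. Verifying a candidate is easy;
# the brute-force search below examines exponentially many subsets.
from itertools import combinations

def verify(subset, target):
    """The 'easily testable property': does this subset sum to the target?"""
    return sum(subset) == target

def brute_force_subset_sum(numbers, target):
    """Exhaustive search over all subsets -- exponential in len(numbers)."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if verify(subset, target):
                return subset
    return None

numbers = [13, 4, 27, 9, 18, 6, 21]    # an illustrative instance
print(brute_force_subset_sum(numbers, 37))
```

A machine that could try all subsets at once would find the answer in polynomial time, which is exactly the "arbitrarily many computational paths" idea mentioned above.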

Do we have examples of such structures in physical systems? Wolfram argues that we do, and points to the following examples: electrical circuits, hard-sphere gases with obstructions, networks of chemical reactions, and chaotic dynamical systems. Finding the minimum-energy conformation of a polymer is in general NP-complete with respect to its length. Finding a configuration below a specified energy in a spin glass with particular couplings is similarly NP-complete. The examples can be multiplied. In fact, Wolfram expresses a most important suspicion: it may turn out that computationally reducible systems, and the so-called “shortcuts” discovered so far, are very rare examples of structures in the world.

Commentary

The observation that in computer theory and in mathematics we do meet problems which are undecidable or intractable is not new. We have known about it at least since Gödel formulated his famous theorems, or since Turing presented his conception of the universal computer. However, the thesis that those unsolvable problems represent systems actually existing in the physical world is something new. New, too, seems to be the suspicion that the theory of cellular automata may be the closest representation of such systems. Both hypotheses are also quite courageous. Why courageous? Because they seem very hard to prove, or even to make probable. When scientists find a shortcut which solves a significant set of problems, the only thing that can be said is that there exist solvable, computationally reducible problems in nature. The thesis that the whole of nature has that special feature, that it can be described by computationally reducible processes, is an extrapolation. The adverse experience, namely our concluding that there are processes for which a fitting shortcut has not yet been found despite many attempts, does not even let us extrapolate to computationally irreducible problems in nature. This seems to be a purely philosophical speculation, and I can hardly imagine what kind of facts could falsify the hypothesis. Nevertheless, it is still very interesting and worth considering, especially as it goes against the current. Another problem is whether, on the basis of such suspicions, we can really construct a new kind of science.

22/02/2010

Evolution of the Neokeynesian Phillips Curve

Jacek Wallusch

Ekonomista 5/2008, pp. 577–592

Abstract

The text analyzes the so-called neokeynesian Phillips curve. The Phillips curve was worked out by Alban William Phillips in 1958 and described in the article “The Relation between Unemployment and the Rate of Change of Money Wage Rates in the United Kingdom, 1861–1957”. The methodology applied by Phillips was very remarkable for its time. He collected a set of empirical data on the economic situation in the UK, set the series against each other, and concluded that there is an inverse correlation between the unemployment rate and the increase in wages (inflation). The relation suggested that when inflation was high, the unemployment rate remained low. Phillips formalized the observed relation into an equation. The curve was very interesting and initially seemed to capture one of the important features of the economy; however, it has never quite matched the relations actually observed in the market. In order to adjust it somehow to reality and strengthen its predictive power, it has been corrected several times. One of the proposals, by Phelps, was even rewarded with a Nobel Prize. The neokeynesian version is the subsequent step in the curve's evolution. The most important improvement is that the variables refer to their expected values rather than the actual ones. The author analyzes a few versions of the curve, beginning with the Calvo model (this version assumes inflation to depend on forward-looking expectations of the inflation rate and a demand variable) up to the hybrid version (which augments the basic model with an autoregressive mechanism). The most interesting part of the text, however, is the attempt to apply the models to the Polish economy and to test them accordingly. The results are dubious, or at least ambiguous. According to the author's cautious interpretation, the model may not work well under conditions of imperfect information and disinflation.
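
For reference, the two variants mentioned above are usually written roughly as follows; this is a standard textbook formulation rather than a quotation from Wallusch's article. Here $\pi_t$ is inflation, $E_t[\pi_{t+1}]$ expected future inflation, $x_t$ the demand (output-gap) variable, and $\beta$, $\kappa$, $\gamma_f$, $\gamma_b$ are parameters:

$$\pi_t = \beta\, E_t[\pi_{t+1}] + \kappa\, x_t \qquad \text{(Calvo, purely forward-looking)}$$

$$\pi_t = \gamma_f\, E_t[\pi_{t+1}] + \gamma_b\, \pi_{t-1} + \kappa\, x_t \qquad \text{(hybrid, with the autoregressive term)}$$

The hybrid version collapses to the purely forward-looking one when $\gamma_b = 0$, which is why it is described as augmenting the basic model with an autoregressive mechanism.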

Commentary

Neokeynesianism, mainstream macroeconomics and the rational expectations hypothesis seem to be doing well. Regardless of critiques and attacks from all directions, their apologists do much to maintain the common impression that economics can really be practiced in this manner and, moreover, that this paradigm is the only one that can lead us to accurate conclusions. When subsequent versions of the equation still do not pass the test of correspondence with the living market, so much the worse for the market. It looks as if Lakatos's model of falsification was correct: it is not so easy to falsify a wrong theory with empirical evidence. Methodologically, it is even dubious whether economic (especially neokeynesian) theories are falsifiable at all (I refer directly to Milton Friedman's postulate that economic theory should be predictive and therefore falsifiable). The core of the Phillips curve theory remains stable and seemingly untouched, although neither of the variables named by Phillips appears in the equation any longer. What remains is only the general idea of binding the variables, or their subsequent alterations, into a function, and the strong belief that sooner or later we will be in a position to work out the function which will generate accurate market predictions. Maybe it is time to seriously question that general idea?
