SEMINAR FISHER INFORMATION


Transcript
FISHER INFORMATION

SUBMITTED BY

CHETHAN M

INTRODUCTION

Fisher information is named for its inventor, R. A. Fisher (1890-1962), a British biostatistician who was among the first to develop and employ, when and as the need arose during his work in genetics and eugenics, such methods as maximum likelihood estimation, the analysis of variance, and the design of experiments.

He also pointed out that Gregor Mendel had probably falsified the data in his famous pea-plant experiments, which seem too clean to be the result of any natural process.

FISHER'S IDEA

Fisher's idea was that attempts to measure physical quantities, such as the time required for the winner of a 100-yard dash to reach the finish line, are invariably frustrated by noise.

That's why multiple stopwatches are ordinarily employed. Moreover, the quantity of information concerning the actual elapsed time contained in such a sample varies directly with the degree to which the sample measurements cluster about a common value.
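To make the clustering point concrete, here is a minimal sketch with made-up numbers, assuming Gaussian stopwatch noise: n independent readings with noise standard deviation sigma carry Fisher information n / sigma^2 about the true time, so the more tightly the readings cluster, the more information the sample holds.

```python
import numpy as np

# Hypothetical numbers for illustration; I = n / sigma^2 is the standard
# Fisher information of n independent N(theta, sigma^2) readings about theta.
rng = np.random.default_rng(0)
true_time = 9.8        # assumed true elapsed time in seconds
n_watches = 5          # number of stopwatches

for sigma in (0.30, 0.05):                     # sloppy vs. tightly clustered readings
    readings = rng.normal(true_time, sigma, size=n_watches)
    info = n_watches / sigma**2                # Fisher information about true_time
    print(f"sigma={sigma:.2f}  sample spread={readings.std(ddof=1):.3f}  "
          f"Fisher information={info:.1f}")
```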

DEFINITION

The Fisher information is the amount of information that an observable random variable X carries about an unknown parameter θ upon which the likelihood function of θ, L(θ) = f(X; θ), depends. The likelihood function is the joint probability of the data, the X's, conditional on the value of θ, as a function of θ.

The score is the derivative of the log of the likelihood function with respect to θ. Since the expectation of the score is zero, the variance of the score is simply its second moment.

This implies that the Fisher information is the expectation of the squared score:

I(θ) = E[(∂/∂θ ln f(X; θ))²].

A random variable carrying high Fisher information implies that the absolute value of the score is often high.

The Fisher information is not a function of a particular observation, as the random variable X has been averaged out. The concept of information is useful when comparing two methods of observing a given random process.
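As a quick numerical check, here is a minimal sketch assuming a normal model X ~ N(θ, σ²) with known σ (the parameter values are made up): the score is (x - θ)/σ², its expectation is zero, and the mean of its square recovers I(θ) = 1/σ².

```python
import numpy as np

# Minimal sketch with assumed numbers: estimate the Fisher information of a
# normal mean as the expectation of the squared score, and compare it with
# the closed form I(theta) = 1 / sigma^2.
rng = np.random.default_rng(1)
theta, sigma = 2.0, 0.5
x = rng.normal(theta, sigma, size=200_000)

# Score: derivative of log f(x; theta) with respect to theta for N(theta, sigma^2).
score = (x - theta) / sigma**2

print("E[score]        ~", score.mean())        # close to zero
print("E[score^2] (MC) ~", np.mean(score**2))   # Monte Carlo estimate of I(theta)
print("1/sigma^2       =", 1 / sigma**2)
```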

INFORMATION VIEW

Over the past 15 or so years, it has become increasingly clear that the fundamental laws of science are expressions of the concept of information.

This includes laws governing the small (at subatomic scales), the large (astronomical scales), and everything in between.

INFORMATION VIEW

In between are chemistry, biology and even higher-level effects involving willful human activity, such as economics and sociocultural organization.

These laws all derive from a principle of information optimization called extreme physical information, or EPI.

EPI - Extreme Physical Information

EPI is an expression of the imperfection of observation:

Owing to random interaction of a subject with its observer and other possible disturbances, its measurement contains less Fisher information than does the subject per se. Moreover, the information loss is an extreme value.

Contd.

An EPI output may alternatively be viewed as the payoff of a zero-sum game of information acquisition between the observer and a demon in subject space.

EPI derives, Escher-like, the very probability law that gave rise to the measurement.

In applications, EPI is used to derive both existing and new analytical relations governing probability laws of physics, genetics, cancer growth, ecology and economics.

EPI Principle

The two types of Fisher information: I and J.

There is a decisive difference between the experience of passively observing a lamp voltage of 120.0 V on a meter and becoming an active part of the electrical phenomenon by sticking your finger in the lamp socket.

EPI Principle

The difference may be expressed on the level of information by the incredibly simple form

I - J = extremum    (1)

This is called the EPI principle. The numbers I and J are the outputs of integrals that define values of the Fisher information.

Why do two Fisher informations arise in eq. (1), and why is their simple difference extremized?

The information that is acquired in a message does not generally arise out of nothing.

Any acquired information is usually about something. That something is generally called an information source, and it has information level J. That is, the information level J is required to completely describe it.

The source is an effect like the lamp voltage in the above.

Physics from Fisher Information

Physics is fundamentally tied into measurement.

One may regard "physics", by which we mean the equations of physics, as simply a manmade code that represents all past (and future) measurement as briefly, concisely and correctly as is possible.

Thus physics is a body of equations that describes measurements.

Physics from Fisher Information

In fact we could equally well replace the word "physics" in the foregoing with "chemistry", "biology," or any other quantitative science. All describe measurements and potential measurements by manmade codes called "equations."

But measurements are generally made for the purpose of knowing, in particular knowing the state of a system. Thus, physics presumes definite states to exist.

Contd.

We characterize these states by definite values of a parameter such as a above (for example, a position or time value). A definite parameter value is presumed to characterize a definite system or "object".

Ideally accurate versions of the laws of science should follow from maximally accurate estimates, and therefore, by the Cramer-Rao inequality e² ≥ 1/I (where e² is the mean-squared error of the estimate), from maximum (note: not minimum) Fisher information.
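A minimal numerical sketch of the Cramer-Rao bound, assuming n independent normal readings with known σ: the sample mean has mean-squared error σ²/n, which equals 1/I, so the bound e² ≥ 1/I is attained.

```python
import numpy as np

# Assumed setup: n i.i.d. N(theta, sigma^2) readings; the Fisher information
# is I = n / sigma^2 and the sample mean attains the Cramer-Rao bound 1/I.
rng = np.random.default_rng(2)
theta, sigma, n = 1.0, 0.4, 25
n_trials = 50_000

estimates = rng.normal(theta, sigma, size=(n_trials, n)).mean(axis=1)
mse = np.mean((estimates - theta) ** 2)   # e^2 for the sample-mean estimator

fisher_info = n / sigma**2
print("mean-squared error e^2 ~", mse)
print("Cramer-Rao bound 1/I   =", 1 / fisher_info)
```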

Contd.

Fisher information measures how much information is present, whereas entropy measures how much is missing.
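A small sketch, assuming a normal location family N(θ, σ²), makes the contrast concrete: as the noise σ grows, the Fisher information about θ (1/σ²) falls while the differential entropy (0.5 * ln(2*pi*e*σ²)) rises.

```python
import numpy as np

# Assumed model: N(theta, sigma^2). Fisher information about theta is
# 1/sigma^2 ("how much is present"); differential entropy is
# 0.5*ln(2*pi*e*sigma^2) ("how much is missing").
for sigma in (0.5, 1.0, 2.0, 4.0):
    fisher_info = 1 / sigma**2
    entropy = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    print(f"sigma={sigma:4.1f}  Fisher info={fisher_info:7.3f}  entropy={entropy:7.3f}")
```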

B. R. Frieden uses a single procedure (called extreme physical information) with the aim of deriving 'most known physics, from statistical mechanics and thermodynamics to quantum mechanics, the Einstein field equations and quantum gravity'.

Contd.

A notable exception to this, based on Shannon's work on information theory (Shannon and Weaver 1964), was initiated by Brillouin (1956) and developed by Jaynes (1957).

This approach (usually referred to as the maximum entropy method) has now been developed to encompass other aspects of statistics and probability theory (Jaynes 1983).

Differences between Jaynes and Frieden

"All things physical are information-theoretic in origin and this is a participatory universe. Observer participancy gives rise to information; and information gives rise to physics." (J. A. Wheeler)

Contd.

Wheeler's emphasis is, therefore, on getting rather than simply having information; that is to say, on measurement, though whether he really believes that the physics would not be there without the measurement is difficult to say.

While Jaynes, within the area of the foundations of physics, confined himself to statistical mechanics, Frieden claims to be able to derive the fundamental equations of almost all of physics.

CONCLUSIONS

The application of the ideas of information theory to physics is interesting, and the use of Fisher information to provide the gradient terms in the Lagrangian for a variational procedure is of some importance.

The crucial step, however, is to provide in some rational and widely applicable manner the remaining terms of the Lagrangian. Frieden believes he is able to do this by using the idea of bound information.

Final Fisher

The contributions Fisher made included the development of methods suitable for small samples, the discovery of the precise distributions of many sample statistics and the invention of analysis of variance.

Ronald Aylmer Fisher, 1929, as sketched by B. Roy Frieden, author of the book under review. The sketch, which appears in Physics from Fisher Information: A Unification, was done from a photograph taken at the time of Fisher's election as a Fellow of the Royal Society.

    http://en.wikipedia.org/wiki/Image:RonaldFisher.jpg
"perhaps the most original mathematical scientist of the [twentieth] century" (Bradley Efron, Annals of Statistics, 1976)

"Fisher was a genius who almost single-handedly created the foundations for modern statistical science." (Anders Hald, A History of Mathematical Statistics, 1998)

"Sir Ronald Fisher could be regarded as Darwin's greatest twentieth-century successor." (Richard Dawkins, River out of Eden, 1995)

"I occasionally meet geneticists who ask me whether it is true that the great geneticist R. A. Fisher was also an important statistician." (Leonard J. Savage, Annals of Statistics, 1976)

