Linguistic Probability Theory
Joe Halliwell

Doctor of Philosophy
School of Informatics

Abstract


In recent years, probabilistic knowledge-based systems such as Bayesian networks and influence diagrams have come to the fore as a means of representing and reasoning about complex real-world situations. Although some of the probabilities used in these models may be obtained statistically, where this is impossible or simply inconvenient modellers rely on expert knowledge. Experts, however, typically find it difficult to specify exact probabilities, and conventional representations cannot reflect any uncertainty they may have. In this way the use of conventional point probabilities can damage the accuracy, robustness and interpretability of acquired models. With these concerns in mind, psychometric researchers have demonstrated that fuzzy numbers are good candidates for representing the inherent vagueness of probability estimates, and the fuzzy community has responded with two distinct theories of fuzzy probabilities.

This thesis, however, identifies formal and presentational problems with these theories which render them unable to represent even very simple scenarios. This analysis leads to the development of a novel and intuitively appealing alternative: a theory of linguistic probabilities patterned after the standard Kolmogorov axioms of probability theory. Since fuzzy numbers lack algebraic inverses, the resulting theory is weaker than, but generalizes, its classical counterpart. Nevertheless, it is demonstrated that analogues for classical probabilistic concepts such as conditional probability and random variables can be constructed. In the classical theory, representation theorems mean that most of the time the distinction between mass/density distributions and probability measures can be ignored. Similar results are proven for linguistic probabilities.

From these results it is shown that directed acyclic graphs annotated with linguistic probabilities represent, under certain identified conditions, systems of linguistic random variables. It is then demonstrated that these linguistic Bayesian networks can utilize adapted best-of-breed Bayesian network algorithms: junction-tree-based inference and Bayes-ball irrelevancy calculation. These algorithms are implemented in Arbor, an interactive design, editing and querying tool for linguistic Bayesian networks.

To explore the applications of these techniques, a realistic example drawn from the domain of forensic statistics is developed. In this domain the knowledge-engineering problems cited above are especially pronounced and expert estimates are commonplace. Moreover, robust conclusions are of unusually critical importance. An analysis of the resulting linguistic Bayesian network for assessing evidential support in glass transfer scenarios highlights the potential utility of the approach.

Acknowledgements

The greatest debt is owed to my endlessly enthusiastic, insightful and patient supervisor, Qiang Shen. Speaking plainly, this thesis would not have been possible without his careful advice and stalwart support. Thank you, Qiang.

My second supervisor Alan Smaill and my colleague Jeroen Keppens have both made invaluable contributions to the present work. Thank you, Alan and Jeroen.

I am also indebted to the former members of Qiang's research group at the University of Edinburgh for their interest in, and useful feedback on, my ideas. So I'd like to thank Alexios Chouchoulas, Ronan Daly, Michelle Galea, Zhiheng Huang and Richard Jensen.

My time at the University of Edinburgh has been very happy, but would not have been so without the dear friends and intellectual comrades I have been lucky enough to find in Paul Crook, Colin Fraser, Annabel Harrison, Sebastian Mhatre, Fiona McNeill, Alison Pease and Dan Winterstein. Thanks, guys.

Outwith this circle, I have the great privilege to be friends with Hamish Allan, Andrew Back, Stephen Blythe, Sam Collier, Aaron Crane, Harry Day, Lucas Dixon, Stephan van Erp, Helen Foot, Terry Grayshon, Alex Heneveld,
Chris Hinds, Veronika Holtzmann, Jethro Green, Max MacAndrews, Jarred and Sarah McGinnis, Ewen Maclean, Laura Meikle, Jelena Meznaric, Miranda Millward, Hannu Rajaniemi, Nils Roeder, Rebecca Smith, Viktor Tron, Chris Scott, Graham Steel, Matthew Williams, Grey, Ika Willis and Susie Wilson. There have been many late nights, long days, pots of tea, pints of beer, lunches, dinners, games, books, crosswords, dreams, ideas, projects, plans and conversations. For these, thank you all.

Finally, I'd like to thank my Mum and Dad for starting me on this path and sticking by me ever since. This work, such as it is, is for you.

Declaration

I declare that this thesis was composed by myself, that the work contained herein is my own except where explicitly stated otherwise in the text, and that this work has not been submitted for any other degree or professional qualification except as specified.

Joe Halliwell

Table of Contents

1 Introduction
  1.1 Motivation
  1.2 The case for fuzzy probabilities
  1.3 Problem statement
  1.4 Overview
2 Background
  2.1 Fuzzy logic
    2.1.1 Fuzzy logic in AI
    2.1.2 Fuzzy sets
    2.1.3 Triangular norms
    2.1.4 The Extension Principle
  2.2 Fuzzy numbers
    2.2.1 Arithmetic operators
    2.2.2 Partial orderings
    2.2.3 Functions
    2.2.4 Algebraic properties
    2.2.5 Solving fuzzy equations
    2.2.6 Topology
    2.2.7 Convergence results
    2.2.8 Computational issues
  2.3 Probability theory
    2.3.1 Probability Theory
    2.3.2 Conditional probability
    2.3.3 Random variables
    2.3.4 Discrete random variables
    2.3.5 Continuous random variables
  2.4 Bayesian networks
    2.4.1 Basic graph concepts
    2.4.2 Representation
    2.4.3 Inference in Bayesian networks
  2.5 Summary
3 Vague probabilities
  3.1 Fuzzy random variables
  3.2 Fuzzy information systems
  3.3 Zadeh's fuzzy probabilities
    3.3.1 ZFP Type 1
    3.3.2 ZFP Type 2
    3.3.3 Critical evaluation
  3.4 Bayesian Fuzzy probabilities
  3.5 Summary
4 Linguistic Probability Theory
  4.1 Linguistic probability measures
    4.1.1 Properties of linguistic probability measures
    4.1.2 Relation with classical probabilities
  4.2 Conditional probability
  4.3 Linguistic random variables
    4.3.1 Linguistic discrete random variables
    4.3.2 Real-valued discrete linguistic random variables
    4.3.3 Continuous linguistic random variables
  4.4 Summary
5 Linguistic Bayesian Networks
  5.1 Representation
  5.2 Completeness
  5.3 Inference in linguistic Bayesian networks
  5.4 Summary
6 Case study: forensic statistics
  6.1 Bayesian networks in forensic statistics
    6.1.1 Bayesian networks and likelihood ratio
  6.2 Linguistic Bayesian networks for forensic statistics
    6.2.1 Arbor: a linguistic Bayesian network tool
    6.2.2 Extended example: Glass transfer
  6.3 Summary
7 Conclusion
  7.1 Summary
  7.2 Discussion
  7.3 Future work
    7.3.1 Interpretation
    7.3.2 Evaluation as a method for knowledge engineering
Bibliography

List of Figures

2.1 Examples of fuzzy numbers
2.2 Some examples of different types of graph
2.3 A sufficient graph for a joint distribution with four random variables
3.1 A schematic of conventional probability theory. The oval shapes represent classical sets; in hybrid fuzzy-probabilistic theories one or more of these is replaced by an appropriate fuzzy generalisation.
3.2 The fuzzy probability ZFP(B)
6.1 Bayesian network of a one-way transfer case
6.2 A screenshot of the Arbor extensible Bayesian network editor, showing the network used in the forensics case study
6.3 The linguistic probability labelled "Impossible"
6.4 The linguistic probability labelled "Nearly impossible"
6.5 The linguistic probability labelled "Very unlikely"
6.6 The linguistic probability labelled "Quite unlikely"
6.7 The linguistic probability labelled "Even chance"
6.8 The linguistic probability labelled "Quite likely"
6.9 The linguistic probability labelled "Very likely"
6.10 The linguistic probability labelled "Nearly certain"
6.11 The linguistic probability labelled "Certain"
6.12 The computed linguistic probability lp(ql = many | tc = some)
6.13 The computed linguistic probability lp(ql = many | tc = none)
6.14 The computed fuzzy likelihood ratio plotted on a logarithmic scale
7.1 A graph of the density distribution of pA
7.2 Diagram of the joint distribution f(pA, pB), showing the various regions used in the case analysis of its density function and confidence intervals
7.3 The computed confidence intervals for the second-order probability pC and the corresponding linguistic probability

List of Tables

2.1 Examples of t-norms and their associated s-norms
3.1 A bag of balls and their blackness
4.1 The coins example that proves conditionalisations of linguistic probability measures do not always exist
4.2 A fully factorable linguistic probability measure
6.1 Interpretation of the likelihood ratio
6.2 Variables in the one-way transfer case
6.3 Classical prior probabilities p(ps) and p(pl)
6.4 Classical conditional probabilities p(qt | tc)
6.5 Classical conditional probabilities P(q | p, qt, ps)
6.6 Classical conditional probabilities p(ql | q, p, pl)
6.7 Linguistic prior probabilities lp(ps) and lp(pl)
6.8 Linguistic conditional probabilities lp(qt | tc)
6.9 Linguistic conditional probabilities lp(q | p, qt, ps)
6.10 Linguistic conditional probabilities lp(ql | q, p, pl)

1 Introduction

The research documented in this thesis represents an attempt to answer a set of
linked questions about the mathematical and computational aspects of modelling probabilities as fuzzy quantities. The motivations for this study are sketched in the following section. This feeds into the set of specific research questions that follows. The chapter closes with an outline of the remainder of the thesis.

1.1 Motivation

Many different types of uncertainty can be found in the sorts of information intelligent systems must process, and it is not uncommon to find several of these represented in a single item. So, for example, in constructing a knowledge-based system for industrial fault diagnosis, an expert might supply that if the water pressure is well above tolerance levels then it is extremely likely that the output valve will fail. Here the meaning of the term "extremely likely" combines elements of both fuzzy and probabilistic uncertainty.

The use of fuzzy sets to model everyday descriptions such as "John is tall" is no doubt familiar. The idea behind the set of theories sharing the name "fuzzy probability" is that imprecise linguistic characterisations of probabilistic uncertainty can be treated in an analogous way. The goal, then, put simply, is to develop a principled approach to statements such as:

    It is quite likely to rain tomorrow. (1.1)

A possible objection at this stage is that (1.1) is hopelessly uninformative. If probabilistic information about the next day's weather is crucial to a system's successful operation, there are surely better ways to obtain it. In short, why bother attempting to utilize such woefully low-grade information? The answer, of course, is the standard argument for computing with words (Zadeh, 1996): whilst gold-standard numerical information may be available about tomorrow's weather, there are probabilistic assessments which are too difficult, expensive or simply impossible to obtain with such precision.

For example, consider the questions: Will there be artificial intelligence in 10 years? 100 years? 1000 years? Consultation with an expert is unlikely to yield much beyond vague probabilistic statements like "It is extremely unlikely that we will have true artificial intelligence in ten years' time". But if such information is to be used within the framework of classical probability theory, numerical estimates of the probabilities of interest are required.

In such cases, and indeed many that are less speculative, the difficulty of obtaining point estimates of probability has been widely reported (Kahneman
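Two technical claims from the front matter can be made concrete with a short sketch: that a vague probability phrase such as "quite likely" can be modelled as a fuzzy quantity, and (as the abstract notes) that fuzzy numbers lack algebraic inverses. The Python below is purely illustrative and is not drawn from the thesis or its Arbor tool; the triangular representation, the label-to-number mapping, and all numeric values are assumptions made for this sketch.

```python
# Illustrative sketch only: triangular fuzzy numbers as (left, peak, right)
# triples, with arithmetic induced by Zadeh's extension principle.
from dataclasses import dataclass


@dataclass(frozen=True)
class TriangularFuzzyNumber:
    left: float   # lower end of the support
    peak: float   # the single point with membership grade 1
    right: float  # upper end of the support

    def membership(self, x: float) -> float:
        """Piecewise-linear membership grade of x in this fuzzy number."""
        if x == self.peak:
            return 1.0
        if self.left < x < self.peak:
            return (x - self.left) / (self.peak - self.left)
        if self.peak < x < self.right:
            return (self.right - x) / (self.right - self.peak)
        return 0.0

    def __add__(self, other: "TriangularFuzzyNumber") -> "TriangularFuzzyNumber":
        # Extension-principle addition of triangular numbers reduces to
        # endpoint-wise sums of the defining triples.
        return TriangularFuzzyNumber(
            self.left + other.left, self.peak + other.peak, self.right + other.right
        )

    def __neg__(self) -> "TriangularFuzzyNumber":
        return TriangularFuzzyNumber(-self.right, -self.peak, -self.left)


# A vague probability phrase modelled as a fuzzy number centred on 0.7
# (this particular mapping is an assumption, not the thesis's).
quite_likely = TriangularFuzzyNumber(0.6, 0.7, 0.8)

# Fuzzy numbers lack additive inverses: A + (-A) is not the crisp zero.
# Its support widens rather than collapsing to a point.
residual = quite_likely + (-quite_likely)
print(residual.left, residual.peak, residual.right)
```

Here `residual` peaks at 0 but has a support of width 0.4, which is precisely why a probability theory built on fuzzy numbers must be weaker than its classical counterpart: an equation like P(A) + x = 1 cannot in general be solved by subtraction.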
