WorldCat Linked Data Explorer

http://worldcat.org/entity/work/id/20676215

The nature of statistical learning theory

The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include:

  • the setting of learning problems based on the model of minimizing the risk functional from empirical data
  • a comprehensive analysis of the empirical risk minimization principle, including necessary and sufficient conditions for its consistency
  • non-asymptotic bounds for the risk achieved using the empirical risk minimization principle
  • principles for controlling the generalization ability of learning machines with small sample sizes, based on these bounds
  • the Support Vector methods that control the generalization ability when estimating functions from small sample sizes.

The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include:

  • the theory of direct methods of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation
  • a new inductive principle of learning.

Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader at AT&T Labs-Research and Professor of London University. He is one of the founders of.
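
As a rough illustration of the risk-minimization setting the description refers to (the notation below is a conventional sketch, not text taken from the book or from this record), learning is cast as choosing a function f(x, α) from a given set so as to minimize the expected risk, even though only the empirical risk computed from the ℓ observed examples is available:

\[
  R(\alpha) = \int L\bigl(y, f(x, \alpha)\bigr)\, dP(x, y)
  \qquad \text{(expected risk; the distribution } P(x, y) \text{ is unknown)}
\]
\[
  R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell} \sum_{i=1}^{\ell} L\bigl(y_i, f(x_i, \alpha)\bigr)
  \qquad \text{(empirical risk over the sample)}
\]

The empirical risk minimization principle selects the function that minimizes the empirical risk; the consistency conditions and non-asymptotic bounds mentioned above concern how far the expected risk of that function can deviate from its empirical risk when ℓ is small.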

http://schema.org/about

http://schema.org/description

  • "The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. This second edition contains three new chapters devoted to further development of the learning theory and SVM techniques."
  • ""The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics." --From the publisher."
  • "The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include: * the setting of learning problems based on the model of minimizing the risk functional from empirical data * a comprehensive analysis of the empirical risk minimization principle including necessary and sufficient conditions for its consistency * non-asymptotic bounds for the risk achieved using the empirical risk minimization principle * principles for controlling the generalization ability of learning machines using small sample sizes based on these bounds * the Support Vector methods that control the generalization ability when estimating function using small sample size. The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include: * the theory of direct method of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation * a new inductive principle of learning. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader AT&T Labs-Research and Professor of London University. He is one of the founders of."@en
  • "The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning as a general problem of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include: * the setting of learning problems based on the model of minimizing the risk functional from empirical data * a comprehensive analysis of the empirical risk minimization principle including necessary and sufficient conditions for its consistency * non-asymptotic bounds for the risk achieved using the empirical risk minimization principle * principles for controlling the generalization ability of learning machines using small sample sizes based on these bounds * the Support Vector methods that control the generalization ability when estimating function using small sample size. The second edition of the book contains three new chapters devoted to further development of the learning theory and SVM techniques. These include: * the theory of direct method of learning based on solving multidimensional integral equations for density, conditional probability, and conditional density estimation * a new inductive principle of learning. Written in a readable and concise style, the book is intended for statisticians, mathematicians, physicists, and computer scientists. Vladimir N. Vapnik is Technology Leader AT & T Labs-Research and Professor of London University. He is one of the founders of."
  • "The aim of this book is to discuss the fundamental ideas which lie behind the statistical theory of learning and generalization. It considers learning from the general point of view of function estimation based on empirical data. Omitting proofs and technical details, the author concentrates on discussing the main results of learning theory and their connections to fundamental problems in statistics. These include: - the general setting of learning problems and the general model of minimizing the risk functional from empirical data - a comprehensive analysis of the empirical risk minimization principle and shows how this allows for the construction of necessary and sufficient conditions for consistency - non-asymptotic bounds for the risk achieved using the empirical risk minimization principle - principles for controlling the generalization ability of learning machines using small sample sizes - introducing a new type of universal learning machine that controls the generalization ability."
  • "Four periods in the research of the learning problem. Setting of the learning problem. Informal reasoning and comments. Consistency of learning processes. Bounds on the rate of convergence of learning processes. Controrlling the generalization ability of learning processes. Methods of pattern recognition. Methods of function estimation. Direct methods in statistical learning theory. The vicinal risk minimization principle and the SVMs. Conclusion: what is important in learning theory."
  • "Ben shu jie shao le tong ji xue xi li lun he zhi chi xiang liang ji de guan jian si xiang, jie lun he fang fa, yi ji gai ling yu de zui xin jin zhan. tong ji xue xi li lun shi zhen dui xiao yang ben qing kuang yan jiu tong ji xue xi gui lü de li lun, shi chuan tong tong ji xue de zhong yao fa zhan he bu chong. qi he xin si xiang shi tong guo kong zhi xue xi ji qi de rong liang shi xian dui tui guang neng li de kong zhi."

http://schema.org/genre

  • "e-book [online only]"@en
  • "Electronic books"

http://schema.org/name

  • "統計學習理論的本質"
  • "The nature of Statistical Learning Theory"
  • "The Nature of statistical learning theory"
  • "The nature of statistical learning theory"@en
  • "The nature of statistical learning theory"
  • "The nature of statitical learning theory"
  • "The Nature of Statistical Learning Theory"@en
  • "The Nature of Statistical Learning Theory"
  • "Tong ji xue xi li lun de ben zhi"
  • "统计学习理论的本质"

http://schema.org/workExample