TY - JOUR
AU - Baxter, Laurence A.
AB - BOOK REVIEWS. ... Bienaymé–Chebychev inequalities and the (weak) law of large numbers are addressed. But it is strong in what are probably considered the core topics for a one-semester, calculus-based, undergraduate-level probability course. Robert P. DOBROW, Truman State University. The Nature of Statistical Learning Theory, by V. N. VAPNIK, New York: Springer-Verlag, 1995, xv + 188 pp., $39.95. ... process are established, which leads to the structural risk-minimization principle. These bounds differ from those normally considered in that they are easily calculated and are valid for finite samples (not asymptotic). Structural risk minimization plays a key role in the construction of learning machines and represents an attempt to control both the quality of the approximation (achieves low risk) and the complexity of the approximating function (prevents overfitting to the training-sample data). Examples are given, as well as a discussion of the development of this principle throughout statistics and computational mathematics. Vapnik claims in the Preface that "nothing is more practical than a good theory" and, in Chapter 5, puts this claim to the test with the description of a new learning machine based on the structural risk-minimization principle. Previous learning machines, such as neural networks, assume a complexity ...
TI - Random Fields on a Network: Modeling, Statistics, and Applications
JF - Technometrics
DO - 10.1080/00401706.1996.10484566
DA - 1996-11-01
UR - https://www.deepdyve.com/lp/taylor-francis/random-fields-on-a-network-modeling-statistics-and-applications-TJ5TV6GQ90
SP - 409
EP - 410
VL - 38
IS - 4
DP - DeepDyve
ER -
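The abstract's description of structural risk minimization, balancing empirical risk against the complexity of the approximating function, can be illustrated with a small sketch. This is not Vapnik's actual procedure: the models here are nested polynomial classes, and the penalty `(degree + 1) / n` is a hypothetical stand-in for a VC-dimension-based bound, chosen only to show how a complexity term prevents overfitting to the training sample.

```python
def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations,
    solved by Gaussian elimination with partial pivoting."""
    m = degree + 1
    # A[i][j] = sum of x^(i+j), b[i] = sum of y * x^i
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m
    for i in range(m - 1, -1, -1):
        coef[i] = (b[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, m))) / A[i][i]
    return coef

def empirical_risk(coef, xs, ys):
    """Mean squared error of the fitted polynomial on the sample."""
    pred = lambda x: sum(c * x ** i for i, c in enumerate(coef))
    return sum((pred(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def srm_select(xs, ys, max_degree=3, penalty_weight=1.0):
    """Pick the degree minimizing empirical risk plus a complexity
    penalty (degree + 1) / n -- a toy surrogate for a VC-type bound."""
    n = len(xs)
    best = None
    for d in range(max_degree + 1):
        coef = fit_poly(xs, ys, d)
        score = empirical_risk(coef, xs, ys) + penalty_weight * (d + 1) / n
        if best is None or score < best[0]:
            best = (score, d, coef)
    return best[1], best[2]

# On exactly linear data, degrees 1-3 all achieve (near-)zero empirical
# risk, so the penalty breaks the tie in favor of the simplest class.
xs = list(range(10))
ys = [2 * x + 1 for x in xs]
degree, coef = srm_select(xs, ys)
```

The key point mirrors the review: minimizing empirical risk alone would accept any degree that interpolates well, while the penalized criterion selects the least complex class that still fits, which is the tradeoff structural risk minimization formalizes.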