TY - JOUR
AU - Bartlett, Peter L.
AU - Mendelson, Shahar
TI - Rademacher and Gaussian Complexities: Risk Bounds and Structural Results
T2 - Computational Learning Theory
AB - We investigate the use of certain data-dependent estimates of the complexity of a function class, called Rademacher and Gaussian complexities. In a decision-theoretic setting, we prove general risk bounds in terms of these complexities. We consider function classes that can be expressed as combinations of functions from basis classes and show how the Rademacher and Gaussian complexities of such a function class can be bounded in terms of the complexity of the basis classes. We give examples of the application of these techniques in finding data-dependent risk bounds for decision trees, neural networks and support vector machines.
KW - Error Bounds
KW - Data-Dependent Complexity
KW - Rademacher Averages
KW - Maximum Discrepancy
N1 - 1. Introduction: In learning problems like pattern classification and regression, a considerable amount of effort has been spent on obtaining good error bounds. These are useful, for example, for the problem of model selection: choosing a model of suitable complexity. Typically, such bounds take the form of a sum of two terms: some sample-based estimate of performance and a penalty term that is large for more complex models. For example, in pattern classification, the following theorem is an improvement of a classical result of Vapnik and Chervonenkis (Vapnik and Chervonenkis, 1971).
JF - Neural Information Processing
DO - 10.1007/3-540-44581-1_15
DA - 2001-01-01
UR - https://www.deepdyve.com/lp/unpaywall/computational-learning-theory-rademacher-and-gaussian-complexities-NUUugwVO5K
DP - DeepDyve
ER -