2001

Incorporating Invariances in Non-Linear Support Vector Machines

Chapelle, O., Schölkopf, B.

Max Planck Institute for Biological Cybernetics / Biowulf Technologies, 2001 (techreport)

Abstract
We consider the problem of how to incorporate into the Support Vector Machine (SVM) framework invariances given by some a priori known transformations under which the data should be invariant. This extends previous work that was applicable only to linear SVMs, and we show on a digit recognition task that the proposed approach is superior to the traditional Virtual Support Vector method.

PostScript [BibTex]

Unsupervised Segmentation and Classification of Mixtures of Markovian Sources

Seldin, Y., Bejerano, G., Tishby, N.

In The 33rd Symposium on the Interface of Computing Science and Statistics (Interface 2001 - Frontiers in Data Mining and Bioinformatics), pages: 1-15, 33rd Symposium on the Interface of Computing Science and Statistics (Interface - Frontiers in Data Mining and Bioinformatics), 2001 (inproceedings)

Abstract
We describe a novel algorithm for unsupervised segmentation of sequences into alternating Variable Memory Markov sources, first presented in [SBT01]. The algorithm is based on competitive learning between Markov models, implemented as Prediction Suffix Trees [RST96] using the MDL principle. By applying a model clustering procedure, based on rate distortion theory combined with deterministic annealing, we obtain a hierarchical segmentation of sequences between alternating Markov sources. The method is applied successfully to unsupervised segmentation of multilingual texts into languages, where it is able to infer correctly both the number of languages and the language switching points. When applied to protein sequence families (results of the [BSMT01] work), we demonstrate the method's ability to identify biologically meaningful sub-sequences within the proteins, which correspond to signatures of important functional sub-units called domains. Our approach to protein classification (through the obtained signatures) is shown to have both conceptual and practical advantages over the currently used methods.

PDF Web [BibTex]

Tracking a Small Set of Experts by Mixing Past Posteriors

Bousquet, O., Warmuth, M.

In Proceedings of the 14th Annual Conference on Computational Learning Theory, Lecture Notes in Computer Science 2111, pages: 31-47, 2001 (inproceedings)

Abstract
In this paper, we examine on-line learning problems in which the target concept is allowed to change over time. In each trial a master algorithm receives predictions from a large set of $n$ experts. Its goal is to predict almost as well as the best sequence of such experts chosen off-line by partitioning the training sequence into $k+1$ sections and then choosing the best expert for each section. We build on methods developed by Herbster and Warmuth and consider an open problem posed by Freund where the experts in the best partition are from a small pool of size $m$. Since $k \gg m$ the best expert shifts back and forth between the experts of the small pool. We propose algorithms that solve this open problem by mixing the past posteriors maintained by the master algorithm. We relate the number of bits needed for encoding the best partition to the loss bounds of the algorithms. Instead of paying $\log n$ for choosing the best expert in each section we first pay $\log {n\choose m}$ bits in the bounds for identifying the pool of $m$ experts and then $\log m$ bits per new section. In the bounds we also pay twice for encoding the boundaries of the sections.
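
The mixing step described above can be sketched concretely. Below is a hedged toy version of a "mixing past posteriors" master algorithm: an exponential-weights loss update followed by mixing back a fraction of the average of past posteriors, which lets a previously good expert be revived cheaply. The paper analyses several mixing schemes and loss bounds; the uniform-average scheme, the parameter values, and the function name here are illustrative assumptions.

```python
import numpy as np

def mix_past_posteriors(losses, eta=1.0, alpha=0.05):
    """Toy sketch of a mixing-past-posteriors master algorithm.

    losses: (T, n) array, losses[t, i] = loss of expert i at trial t.
    Returns the sequence of weight vectors, shape (T+1, n).
    """
    T, n = losses.shape
    v = np.full(n, 1.0 / n)          # uniform prior over experts
    past = [v.copy()]                # pool of past posteriors
    history = [v.copy()]
    for t in range(T):
        # exponential-weights loss update
        v = v * np.exp(-eta * losses[t])
        v /= v.sum()
        past.append(v.copy())
        # mix back a fraction alpha of the average past posterior,
        # so experts that were good earlier keep a non-negligible weight
        v = (1 - alpha) * v + alpha * np.mean(past, axis=0)
        history.append(v.copy())
    return np.array(history)
```

With two experts whose roles swap mid-sequence, the second expert recovers quickly after the switch because its old posterior mass is mixed back in.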

PDF PostScript [BibTex]

Learning and Prediction of the Nonlinear Dynamics of Biological Neurons with Support Vector Machines

Frontzek, T., Lal, TN., Eckmiller, R.

In Proceedings of the International Conference on Artificial Neural Networks (ICANN'2001), pages: 390-398, Proceedings of the International Conference on Artificial Neural Networks (ICANN'2001), 2001 (inproceedings)

Abstract
Based on biological data, we examine the ability of Support Vector Machines (SVMs) with Gaussian kernels to learn and predict the nonlinear dynamics of single biological neurons. We show that SVMs for regression learn the dynamics of the pyloric dilator neuron of the Australian crayfish, and we determine the optimal SVM parameters with regard to the test error. Compared to conventional RBF networks, SVMs learned faster and performed better iterated one-step-ahead prediction with regard to training and test error. From a biological point of view, SVMs are especially better at predicting the most important part of the dynamics, where the membrane potential is driven by superimposed synaptic inputs to the threshold for the oscillatory peak.
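
The iterated one-step-ahead prediction used above can be sketched independently of the trained model: the predictor's output is fed back as the newest input for the next step. The function name and interface below are assumptions for illustration.

```python
def iterated_prediction(predict, history, steps, order=3):
    """Iterated one-step-ahead prediction: each new prediction is fed
    back as input for the next step. `predict` maps a length-`order`
    window of past values to the next value."""
    window = list(history[-order:])
    out = []
    for _ in range(steps):
        nxt = predict(window)
        out.append(nxt)
        window = window[1:] + [nxt]   # slide the window forward
    return out
```

For example, with a predictor that adds 1 to the last value, `iterated_prediction(lambda w: w[-1] + 1, [1, 2, 3], steps=3)` continues the sequence.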

PDF [BibTex]

Estimating a Kernel Fisher Discriminant in the Presence of Label Noise

Lawrence, N., Schölkopf, B.

In 18th International Conference on Machine Learning, pages: 306-313, (Editors: CE Brodley and A Pohoreckyj Danyluk), Morgan Kaufmann, San Francisco, CA, USA, 18th International Conference on Machine Learning (ICML), 2001 (inproceedings)

Web [BibTex]

A Generalized Representer Theorem

Schölkopf, B., Herbrich, R., Smola, A.

In Lecture Notes in Computer Science, Vol. 2111, pages: 416-426, (Editors: D Helmbold and R Williamson), Springer, Berlin, Germany, Annual Conference on Computational Learning Theory (COLT/EuroCOLT), 2001 (inproceedings)

[BibTex]

Extracting egomotion from optic flow: limits of accuracy and neural matched filters

Dahmen, H-J., Franz, MO., Krapp, HG.

In pages: 143-168, Springer, Berlin, 2001 (inbook)

[BibTex]

Bound on the Leave-One-Out Error for Density Support Estimation using nu-SVMs

Gretton, A., Herbrich, R., Schölkopf, B., Smola, A., Rayner, P.

University of Cambridge, 2001 (techreport)

[BibTex]

The pedestal effect with a pulse train and its constituent sinusoids

Henning, G., Wichmann, F., Bird, C.

Twenty-Sixth Annual Interdisciplinary Conference, 2001 (poster)

Abstract
Curves showing "threshold" contrast for detecting a signal grating as a function of the contrast of a masking grating of the same orientation, spatial frequency, and phase show a characteristic improvement in performance at masker contrasts near the contrast threshold of the unmasked signal. Depending on the percentage of correct responses used to define the threshold, the best performance can be as much as a factor of three better than the unmasked threshold obtained in the absence of any masking grating. The result is called the pedestal effect (sometimes, the dipper function). We used a 2AFC procedure to measure the effect with harmonically related sinusoids ranging from 2 to 16 c/deg - all with maskers of the same orientation, spatial frequency and phase - and with masker contrasts ranging from 0 to 50%. The curves for different spatial frequencies are identical if both the vertical axis (showing the threshold signal contrast) and the horizontal axis (showing the masker contrast) are scaled by the threshold contrast of the signal obtained with no masker. Further, a pulse train with a fundamental frequency of 2 c/deg produces a curve that is indistinguishable from that of a 2-c/deg sinusoid despite the fact that at higher masker contrasts, the pulse train contains at least 8 components all of them equally detectable. The effect of adding 1-D spatial noise is also discussed.

[BibTex]

Markovian domain fingerprinting: statistical segmentation of protein sequences

Bejerano, G., Seldin, Y., Margalit, H., Tishby, N.

Bioinformatics, 17(10):927-934, 2001 (article)

PDF Web [BibTex]

Unsupervised Sequence Segmentation by a Mixture of Switching Variable Memory Markov Sources

Seldin, Y., Bejerano, G., Tishby, N.

In Proceedings of the 18th International Conference on Machine Learning (ICML 2001), pages: 513-520, 18th International Conference on Machine Learning (ICML), 2001 (inproceedings)

Abstract
We present a novel information theoretic algorithm for unsupervised segmentation of sequences into alternating Variable Memory Markov sources. The algorithm is based on competitive learning between Markov models, implemented as Prediction Suffix Trees (Ron et al., 1996) using the MDL principle. By applying a model clustering procedure, based on rate distortion theory combined with deterministic annealing, we obtain a hierarchical segmentation of sequences between alternating Markov sources. The algorithm appears to be self-regulating and automatically avoids over-segmentation. The method is applied successfully to unsupervised segmentation of multilingual texts into languages, where it is able to infer correctly both the number of languages and the language switching points. When applied to protein sequence families, we demonstrate the method's ability to identify biologically meaningful sub-sequences within the proteins, which correspond to important functional sub-units called domains.
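
As a toy illustration of competitive segmentation between Markov sources: the paper trains Prediction Suffix Trees under MDL and clusters them with annealing, but the core competition can be shown with given first-order transition matrices, assigning each fixed-length window to the source under which it is most likely. All names and the windowing scheme are illustrative simplifications.

```python
import numpy as np

def segment_by_markov_competition(seq, models, win=20):
    """Label each window of `seq` by the competing Markov source
    (first-order transition matrix in `models`) with the highest
    log-likelihood. A drastic simplification of the paper's method."""
    def loglik(chunk, P):
        # sum log transition probabilities over consecutive symbol pairs
        return sum(np.log(P[a, b]) for a, b in zip(chunk[:-1], chunk[1:]))
    labels = []
    for start in range(0, len(seq) - win + 1, win):
        chunk = seq[start:start + win]
        labels.append(int(np.argmax([loglik(chunk, P) for P in models])))
    return labels
```

On a sequence that switches from one source to another, the recovered labels switch at the corresponding window, mimicking (very crudely) the language-switching experiment.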

PDF [BibTex]

The control structure of artificial creatures

Zhou, D., Dai, R.

Artificial Life and Robotics, 5(3), 2001, invited article (article)

Web [BibTex]

Support Vector Regression for Black-Box System Identification

Gretton, A., Doucet, A., Herbrich, R., Rayner, P., Schölkopf, B.

In 11th IEEE Workshop on Statistical Signal Processing, pages: 341-344, IEEE Signal Processing Society, Piscataway, NJ, USA, 11th IEEE Workshop on Statistical Signal Processing, 2001 (inproceedings)

Abstract
In this paper, we demonstrate the use of support vector regression (SVR) techniques for black-box system identification. These methods derive from statistical learning theory, and are of great theoretical and practical interest. We briefly describe the theory underpinning SVR, and compare support vector methods with other approaches using radial basis networks. Finally, we apply SVR to modeling the behaviour of a hydraulic robot arm, and show that SVR improves on previously published results.
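
A minimal sketch of the SVR ingredient, assuming an RBF kernel and training by subgradient descent on the regularized epsilon-insensitive loss. The paper solves the standard SVR quadratic program; the solver, hyperparameter values, and function names here are illustrative stand-ins.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def train_svr(X, y, C=10.0, eps=0.1, gamma=5.0, lr=0.01, iters=2000):
    """Crude kernel SVR: minimize 0.5 b'Kb + C * sum of the
    epsilon-insensitive losses by subgradient descent on the
    expansion coefficients b (representer form f = K b)."""
    K = rbf_kernel(X, X, gamma)
    beta = np.zeros(len(X))
    for _ in range(iters):
        r = K @ beta - y                          # residuals
        s = np.where(np.abs(r) > eps, np.sign(r), 0.0)
        grad = K @ beta + C * (K @ s)             # subgradient
        beta -= lr * grad / len(X)
    return lambda Xq: rbf_kernel(Xq, X, gamma) @ beta
```

The residuals are only penalized outside the epsilon tube, which is what produces the sparse support-vector solution in the exact QP formulation.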

PostScript [BibTex]

Bound on the Leave-One-Out Error for 2-Class Classification using nu-SVMs

Gretton, A., Herbrich, R., Schölkopf, B., Rayner, P.

University of Cambridge, 2001, Updated May 2003 (literature review expanded) (techreport)

Abstract
Three estimates of the leave-one-out error for $\nu$-support vector (SV) machine binary classifiers are presented. Two of the estimates are based on the geometrical concept of the {\em span}, which was introduced in the context of bounding the leave-one-out error for $C$-SV machine binary classifiers, while the third is based on optimisation over the criterion used to train the $\nu$-support vector classifier. It is shown that the estimates presented herein provide informative and efficient approximations of the generalisation behaviour, in both a toy example and benchmark data sets. The proof strategies in the $\nu$-SV context are also compared with those used to derive leave-one-out error estimates in the $C$-SV case.

PostScript [BibTex]

Inference Principles and Model Selection

Buhmann, J., Schölkopf, B.

(01301), Dagstuhl Seminar, 2001 (techreport)

Web [BibTex]

Kernel Machine Based Learning for Multi-View Face Detection and Pose Estimation

Cheng, Y., Fu, Q., Gu, L., Li, S., Schölkopf, B., Zhang, H.

In Proceedings Computer Vision, 2001, Vol. 2, pages: 674-679, IEEE Computer Society, 8th International Conference on Computer Vision (ICCV), 2001 (inproceedings)

DOI [BibTex]

Some kernels for structured data

Bartlett, P., Schölkopf, B.

Biowulf Technologies, 2001 (techreport)

[BibTex]

Modeling the Dynamics of Individual Neurons of the Stomatogastric Networks with Support Vector Machines

Frontzek, T., Gutzen, C., Lal, TN., Heinzel, H-G., Eckmiller, R., Böhm, H.

In Abstract Proceedings of the 6th International Congress of Neuroethology (ICN'2001), Bonn, abstract 404, 2001 (poster)

Abstract
In small rhythmically active networks, the timing of individual neurons is crucial for generating different spatio-temporal motor patterns. Switching of one neuron between different rhythms can cause a transition between behavioral modes. In order to understand the dynamics of rhythmically active neurons, we analyzed the oscillatory membrane potential of a pacemaker neuron and used different neural network models to predict the dynamics of its time series. In a first step we trained conventional RBF networks and Support Vector Machines (SVMs) with Gaussian kernels on intracellular recordings of the pyloric dilator neuron in the Australian crayfish, Cherax destructor albidus. As a rule, SVMs were able to learn the nonlinear dynamics of pyloric neurons faster (e.g. 15 s) than RBF networks (e.g. 309 s) under the same hardware conditions. After training, SVMs performed better iterated one-step-ahead prediction of the time series of the pyloric dilator neuron with regard to test error and error sum. The test error decreased with an increasing number of support vectors. The best SVM used 196 support vectors and produced a test error of 0.04622, as opposed to 0.07295 for the best RBF network using 26 RBF neurons. In the pacemaker neuron PD, the time point at which the membrane potential crosses the threshold for generation of its oscillatory peak is most important for determination of the test error. Interestingly, SVMs are especially better at predicting this important part of the membrane potential, which is superimposed by various synaptic inputs that drive the membrane potential to its threshold.

[BibTex]

Support Vector Machines: Theorie und Anwendung auf Prädiktion epileptischer Anfälle auf der Basis von EEG-Daten [Support Vector Machines: theory and application to the prediction of epileptic seizures from EEG data]

Lal, TN.

Biologische Kybernetik, Institut für Angewandte Mathematik, Universität Bonn, 2001, Advised by Prof. Dr. S. Albeverio (diplomathesis)

ZIP [BibTex]


1997

Comparing support vector machines with Gaussian kernels to radial basis function classifiers

Schölkopf, B., Sung, K., Burges, C., Girosi, F., Niyogi, P., Poggio, T., Vapnik, V.

IEEE Transactions on Signal Processing, 45(11):2758-2765, November 1997 (article)

Abstract
The support vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights, and threshold that minimize an upper bound on the expected test error. The present study is devoted to an experimental comparison of these machines with a classical approach, where the centers are determined by k-means clustering, and the weights are computed using error backpropagation. We consider three machines, namely, a classical RBF machine, an SV machine with Gaussian kernel, and a hybrid system with the centers determined by the SV method and the weights trained by error backpropagation. Our results show that on the United States Postal Service database of handwritten digits, the SV machine achieves the highest recognition accuracy, followed by the hybrid system. The SV approach is thus not only theoretically well-founded but also superior in a practical application.
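
The classical baseline described here, RBF centers chosen by k-means clustering, can be sketched as follows. Output weights are fit by least squares as a simple stand-in for the error-backpropagation training used in the study; all function names and parameter values are illustrative.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means: pick k data points as initial centers, then
    alternate assignment and mean updates."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None] - centers[None]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(0)
    return centers

def rbf_classifier(X, y, k=4, gamma=5.0):
    """Classical RBF machine: k-means centers, Gaussian activations,
    output weights by least squares. y in {-1, +1}."""
    centers = kmeans(X, k)
    Phi = np.exp(-gamma * ((X[:, None] - centers[None]) ** 2).sum(-1))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return lambda Xq: np.sign(
        np.exp(-gamma * ((Xq[:, None] - centers[None]) ** 2).sum(-1)) @ w)
```

In the paper's hybrid variant, the k-means step is replaced by the support vectors themselves, which is exactly one line of change in this sketch.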

Web DOI [BibTex]

The view-graph approach to visual navigation and spatial memory

Mallot, H., Franz, M., Schölkopf, B., Bülthoff, H.

In Artificial Neural Networks: ICANN ’97, pages: 751-756, (Editors: W Gerstner and A Germond and M Hasler and J-D Nicoud), Springer, Berlin, Germany, 7th International Conference on Artificial Neural Networks, October 1997 (inproceedings)

Abstract
This paper describes a purely visual navigation scheme based on two elementary mechanisms (piloting and guidance) and a graph structure combining individual navigation steps controlled by these mechanisms. In robot experiments in real environments, both mechanisms have been tested, piloting in an open environment and guidance in a maze with restricted movement opportunities. The results indicate that navigation and path planning can be brought about with these simple mechanisms. We argue that the graph of local views (snapshots) is a general and biologically plausible means of representing space and integrating the various mechanisms of map behaviour.

PDF PDF DOI [BibTex]

Predicting time series with support vector machines

Müller, K., Smola, A., Rätsch, G., Schölkopf, B., Kohlmorgen, J., Vapnik, V.

In Artificial Neural Networks: ICANN’97, pages: 999-1004, (Editors: W Gerstner and A Germond and M Hasler and J-D Nicoud), Springer, Berlin, Germany, 7th International Conference on Artificial Neural Networks, October 1997 (inproceedings)

Abstract
Support Vector Machines are used for time series prediction and compared to radial basis function networks. We make use of two different cost functions for Support Vectors: training with (i) an epsilon-insensitive loss and (ii) Huber's robust loss function, and discuss how to choose the regularization parameters in these models. Two applications are considered: data from (a) a noisy (normal and uniform noise) Mackey-Glass equation and (b) the Santa Fe competition (set D). In both cases Support Vector Machines show an excellent performance. In case (b) the Support Vector approach improves the best known result on the benchmark by 29%.
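
The standard setup behind SV time-series prediction is a delay embedding: windows of past values become regression inputs for the next value. The sketch below uses a linear least-squares predictor as a stand-in for the support vector regressor of the paper; the function names and embedding order are illustrative assumptions.

```python
import numpy as np

def delay_embed(series, order=3):
    """Turn a scalar series into (x_t, y_t) regression pairs with
    x_t = (s_t, ..., s_{t+order-1}) and target y_t = s_{t+order}."""
    X = np.array([series[t:t + order] for t in range(len(series) - order)])
    y = np.array(series[order:])
    return X, y

def fit_linear(X, y):
    """Linear least-squares predictor with bias, a simple stand-in
    for the SV regressor."""
    w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
    return lambda Xq: np.c_[Xq, np.ones(len(Xq))] @ w
```

Any regressor trained on these pairs can then be iterated for multi-step-ahead forecasts by feeding predictions back into the input window.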

PDF DOI [BibTex]

Kernel principal component analysis

Schölkopf, B., Smola, A., Müller, K.

In Artificial neural networks: ICANN ’97, LNCS, vol. 1327, pages: 583-588, (Editors: W Gerstner and A Germond and M Hasler and J-D Nicoud), Springer, Berlin, Germany, 7th International Conference on Artificial Neural Networks, October 1997 (inproceedings)

Abstract
A new method for performing a nonlinear form of Principal Component Analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance, the space of all possible d-pixel products in images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.
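
The procedure admits a compact sketch: build the kernel matrix, center it in feature space, and project onto its leading eigenvectors. The RBF kernel and the small numerical guard below are illustrative choices; the paper's formulation covers general integral-operator kernels such as polynomial kernels.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Minimal kernel PCA sketch with an RBF kernel. Returns the
    projections of the training points onto the leading
    feature-space principal components."""
    n = len(X)
    sq_dists = ((X[:, None] - X[None]) ** 2).sum(-1)
    K = np.exp(-gamma * sq_dists)
    # centering matrix J: centers the data implicitly in feature space
    J = np.eye(n) - np.full((n, n), 1.0 / n)
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    # scale eigenvectors so feature-space components have unit norm
    alphas = vecs[:, order] / np.sqrt(np.maximum(vals[order], 1e-12))
    return Kc @ alphas
```

The projections have zero mean by construction (centering), and their variances are ordered by the corresponding kernel-matrix eigenvalues, mirroring ordinary PCA.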

PDF DOI [BibTex]

Homing by parameterized scene matching

Franz, M., Schölkopf, B., Bülthoff, H.

In Proceedings of the 4th European Conference on Artificial Life, pages: 236-245, (Editors: P Husbands and I Harvey), MIT Press, Cambridge, MA, USA, 4th European Conference on Artificial Life (ECAL97), July 1997 (inproceedings)

Abstract
In visual homing tasks, animals as well as robots can compute their movements from the current view and a snapshot taken at a home position. Solving this problem exactly would require knowledge about the distances to visible landmarks, information that is not directly available to passive vision systems. We propose a homing scheme that dispenses with accurate distance information by using parameterized disparity fields. These are obtained from an approximation that incorporates prior knowledge about perspective distortions of the visual environment. A mathematical analysis proves that the approximation does not prevent the scheme from approaching the goal with arbitrary accuracy. Mobile robot experiments are used to demonstrate the practical feasibility of the approach.

PDF [BibTex]

Das Spiel mit dem künstlichen Leben [Playing with artificial life]

Schölkopf, B.

Frankfurter Allgemeine Zeitung, Wissenschaftsbeilage, June 1997 (misc)

[BibTex]

Improving the accuracy and speed of support vector learning machines

Burges, C., Schölkopf, B.

In Advances in Neural Information Processing Systems 9, pages: 375-381, (Editors: M Mozer and MJ Jordan and T Petsche), MIT Press, Cambridge, MA, USA, Tenth Annual Conference on Neural Information Processing Systems (NIPS), May 1997 (inproceedings)

Abstract
Support Vector Learning Machines (SVM) are finding application in pattern recognition, regression estimation, and operator inversion for ill-posed problems. Against this very general backdrop, any methods for improving the generalization performance, or for improving the speed in the test phase, of SVMs are of increasing interest. In this paper we combine two such techniques on a pattern recognition problem. The method for improving generalization performance (the "virtual support vector" method) does so by incorporating known invariances of the problem. This method achieves a drop in the error rate on 10,000 NIST test digit images from 1.4% to 1%. The method for improving the speed (the "reduced set" method) does so by approximating the support vector decision surface. We apply this method to achieve a factor-of-fifty speedup in the test phase over the virtual support vector machine. The combined approach yields a machine which is both 22 times faster than the original machine and has better generalization performance, achieving 1.1% error. The virtual support vector method is applicable to any SVM problem with known invariances; the reduced set method is applicable to any support vector machine.
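
The "virtual support vector" step admits a very small sketch: transform each support vector under the known invariances while keeping its label, producing an enlarged set on which the machine is retrained. The transformations shown (one-pixel image shifts) and all names are illustrative assumptions.

```python
import numpy as np

def virtual_support_vectors(svs, labels, transforms):
    """Apply each known invariance transformation to every support
    vector, keeping its label, to build the enlarged training set
    used in the second SVM training run."""
    X_new, y_new = list(svs), list(labels)
    for t in transforms:
        for x, y in zip(svs, labels):
            X_new.append(t(x))
            y_new.append(y)
    return X_new, y_new

# example invariances for digit images: one-pixel vertical shifts
shift_up = lambda img: np.roll(img, -1, axis=0)
shift_down = lambda img: np.roll(img, 1, axis=0)
```

Because only the support vectors (not the full training set) are transformed, the enlarged problem stays small, which is the point of the method.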

PDF Web [BibTex]

Homing by parameterized scene matching

Franz, M., Schölkopf, B., Bülthoff, H.

(46), Max Planck Institute for Biological Cybernetics, Tübingen, Germany, February 1997 (techreport)

Abstract
In visual homing tasks, animals as well as robots can compute their movements from the current view and a snapshot taken at a home position. Solving this problem exactly would require knowledge about the distances to visible landmarks, information that is not directly available to passive vision systems. We propose a homing scheme that dispenses with accurate distance information by using parameterized disparity fields. These are obtained from an approximation that incorporates prior knowledge about perspective distortions of the visual environment. A mathematical analysis proves that the approximation does not prevent the scheme from approaching the goal with arbitrary accuracy. Mobile robot experiments are used to demonstrate the practical feasibility of the approach.

[BibTex]

Learning view graphs for robot navigation

Franz, M., Schölkopf, B., Georg, P., Mallot, H., Bülthoff, H.

In Proceedings of the 1st International Conference on Autonomous Agents, pages: 138-147, (Editors: Johnson, W.L.), ACM Press, New York, NY, USA, First International Conference on Autonomous Agents (AGENTS '97), February 1997 (inproceedings)

Abstract
We present a purely vision-based scheme for learning a parsimonious representation of an open environment. Using simple exploration behaviours, our system constructs a graph of appropriately chosen views. To navigate between views connected in the graph, we employ a homing strategy inspired by findings of insect ethology. Simulations and robot experiments demonstrate the feasibility of the proposed approach.

PDF Web DOI [BibTex]

Masking by plaid patterns is not explained by adaptation, simple contrast gain-control or distortion products

Wichmann, F., Tollin, D.

Investigative Ophthalmology and Visual Science, 38(4), pages: S631, 1997 (poster)

[BibTex]

Masking by plaid patterns: spatial frequency tuning and contrast dependency

Wichmann, F., Tollin, D.

OSA Conference Program, pages: 97, 1997 (poster)

Abstract
The detectability of horizontally orientated sinusoidal signals at different spatial-frequencies was measured in standard 2AFC - tasks in the presence of two-component plaid patterns of different orientation and contrast. The shape of the resulting masking surface provides insight into, and constrains models of, the underlying masking mechanisms.

[BibTex]

ATM-dependent telomere loss in aging human diploid fibroblasts and DNA damage lead to the post-translational activation of p53 protein involving poly(ADP-ribose) polymerase.

Vaziri, H., West, MD., Allsopp, RC., Davison, TS., Wu, YS., Arrowsmith, CH., Poirier, GG., Benchimol, S.

The European Molecular Biology Organization Journal, 16(19):6018-6033, 1997 (article)

Web [BibTex]

Support vector learning

Schölkopf, B.

pages: 173, Oldenbourg, München, Germany, 1997, also published as doctoral dissertation, Technische Universität Berlin, 1997 (book)

PDF GZIP [BibTex]
