Subspace, Latent Structure and Feature Selection [electronic resource] : Statistical and Optimization Perspectives Workshop, SLSFS 2005, Bohinj, Slovenia, February 23-25, 2005, Revised Selected Papers / edited by Craig Saunders, Marko Grobelnik, Steve Gunn, John Shawe-Taylor.

Contributor(s): Saunders, Craig [editor] | Grobelnik, Marko [editor] | Gunn, Steve [editor] | Shawe-Taylor, John [editor] | SpringerLink (Online service)
Material type: Text
Series: Lecture Notes in Computer Science ; 3940
Publisher: Berlin, Heidelberg : Springer Berlin Heidelberg, 2006
Description: X, 209 p.; also available online (online resource)
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9783540341383
Subject(s): Computer science | Computer software | Artificial intelligence | Computer vision | Optical pattern recognition | Computer Science | Algorithm Analysis and Problem Complexity | Probability and Statistics in Computer Science | Computation by Abstract Devices | Artificial Intelligence (incl. Robotics) | Image Processing and Computer Vision | Pattern Recognition
Additional physical formats: Printed edition
DDC classification: 005.1
LOC classification: QA76.9.A43
Online resources: Click here to access online
Contents:
Invited Contributions -- Discrete Component Analysis -- Overview and Recent Advances in Partial Least Squares -- Random Projection, Margins, Kernels, and Feature-Selection -- Some Aspects of Latent Structure Analysis -- Feature Selection for Dimensionality Reduction -- Contributed Papers -- Auxiliary Variational Information Maximization for Dimensionality Reduction -- Constructing Visual Models with a Latent Space Approach -- Is Feature Selection Still Necessary? -- Class-Specific Subspace Discriminant Analysis for High-Dimensional Data -- Incorporating Constraints and Prior Knowledge into Factorization Algorithms – An Application to 3D Recovery -- A Simple Feature Extraction for High Dimensional Image Representations -- Identifying Feature Relevance Using a Random Forest -- Generalization Bounds for Subspace Selection and Hyperbolic PCA -- Less Biased Measurement of Feature Selection Benefits.
In: Springer eBooks
Item type: E-BOOKS
Holdings:
Current library: IMSc Library
Home library: IMSc Library
URL: Link to resource
Status: Available
Barcode: EBK4086
