This section features a number of tutorials illustrating some of the main algorithms implemented in VLFeat. The tutorials fall roughly into two categories: algorithms that detect and describe image regions (features), and algorithms that cluster data. Several of the entries below are accompanied by a short, illustrative MATLAB snippet.
Covariant detectors. An introduction to computing covariant features such as Harris-Affine.
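A minimal sketch of the kind of usage the tutorial covers, assuming the VLFeat MATLAB toolbox has been set up with vl_setup (the same holds for all the snippets below); the image file name, detector method, and parameters are illustrative placeholders, not the tutorial's exact code:

    % Detect Harris-Laplace frames and estimate their affine shape.
    im = im2single(rgb2gray(imread('image.jpg'))) ;   % 'image.jpg' is a placeholder
    frames = vl_covdet(im, 'Method', 'HarrisLaplace', ...
                       'EstimateAffineShape', true) ;
    imshow(im) ; hold on ;
    vl_plotframe(frames) ;                            % draw the affine-adapted frames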
Histogram of Oriented Gradients (HOG). Getting started with this ubiquitous representation for object recognition.
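For example, a HOG descriptor can be computed and visualized roughly as follows (the cell size and image name are illustrative choices):

    % Compute HOG with 8x8-pixel cells and render it as a glyph image.
    im = im2single(imread('image.jpg')) ;   % 'image.jpg' is a placeholder
    cellSize = 8 ;
    hog = vl_hog(im, cellSize) ;
    imhog = vl_hog('render', hog) ;
    imagesc(imhog) ; colormap gray ; axis image ;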
Scale Invariant Feature Transform (SIFT). Getting started with this popular feature detector and descriptor.
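In its simplest form, SIFT extraction looks roughly like the sketch below; vl_sift expects a single-precision grayscale image, and the file name is a placeholder:

    % Detect SIFT keypoints and compute their 128-dimensional descriptors.
    I = single(rgb2gray(imread('image.jpg'))) ;
    [f, d] = vl_sift(I) ;        % f: 4xN frames (x, y, scale, orientation); d: 128xN descriptors
    imshow(uint8(I)) ; hold on ;
    vl_plotframe(f) ;            % overlay the detected frames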
Dense SIFT (DSIFT) and PHOW. Densely sampled SIFT descriptors (and the multi-scale PHOW variant), widely used for image categorization.
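A rough sketch of dense extraction; the step, bin size, and image name are illustrative values:

    % Dense SIFT on a regular grid, and PHOW (dense SIFT at several scales).
    I = imread('image.jpg') ;    % 'image.jpg' is a placeholder
    [f, d]   = vl_dsift(im2single(rgb2gray(I)), 'Step', 4, 'Size', 8) ;
    [fp, dp] = vl_phow(im2single(I)) ;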
Local Intensity Order Pattern (LIOP). Getting started with the LIOP descriptor.
Maximally Stable Extremal Regions (MSER). Extracting MSERs from an image.
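A minimal sketch; the detector parameters are illustrative values, not recommendations:

    % Extract MSERs from a grayscale uint8 image and plot their elliptical fits.
    I = rgb2gray(imread('image.jpg')) ;   % 'image.jpg' is a placeholder
    [r, f] = vl_mser(I, 'MinDiversity', 0.7, 'MaxVariation', 0.2, 'Delta', 10) ;
    f = vl_ertr(f) ;                      % transpose frames back to image coordinates
    imshow(I) ; hold on ;
    vl_plotframe(f) ;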
Image distance transform. Compute the image distance transform for fast part models and edge matching.
GMM. Estimating a Gaussian mixture model with the Expectation Maximization (EM) algorithm.
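For instance, a mixture can be fitted to toy data roughly as follows; the data and the number of components are made up for illustration:

    % Fit a 3-component Gaussian mixture with EM to random 2-D points.
    X = rand(2, 5000) ;                   % one column per data point
    numClusters = 3 ;
    [means, covariances, priors] = vl_gmm(X, numClusters) ;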
k-means. An introduction to the VLFeat k-means implementation.
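A minimal sketch on toy data; the data, the number of clusters, and the initialization are illustrative:

    % Cluster random 2-D points into 10 groups using k-means++ seeding.
    X = rand(2, 10000) ;
    [centers, assignments] = vl_kmeans(X, 10, 'Initialization', 'plusplus') ;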
Integer optimized k-means (IKM). A quick overview of VLFeat's fast k-means implementation for integer data.
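Roughly, integer k-means is used as follows; the random uint8 data here are only for illustration:

    % Integer k-means, plus quantization of new data against the learned centers.
    X = uint8(255 * rand(2, 10000)) ;
    [C, A] = vl_ikmeans(X, 64) ;                          % C: integer centers, A: assignments
    Anew = vl_ikmeanspush(uint8(255 * rand(2, 100)), C) ; % assign new data to the centers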
Hierarchical k-means (HIKM). Create a fast k-means tree for integer data.
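A sketch of the same idea in hierarchical form; the branching factor and leaf count are illustrative:

    % Build a hierarchical k-means tree over uint8 data and quantize new points.
    X = uint8(255 * rand(2, 10000)) ;
    [tree, A] = vl_hikmeans(X, 3, 100) ;                  % branching factor 3, about 100 leaves
    Anew = vl_hikmeanspush(tree, uint8(255 * rand(2, 100))) ;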
Agglomerative Information Bottleneck (AIB). Cluster discrete data based on the mutual information between the data and class labels.
Quick shift. An introduction showing how to create superpixels with this fast mode-seeking method.
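A rough sketch; the ratio, kernel size, and maximum distance are illustrative values, and the image name is a placeholder:

    % Superpixels by quick shift; vl_quickseg wraps vl_quickshift for color images.
    I = im2double(imread('image.jpg')) ;
    [Iseg, labels] = vl_quickseg(I, 0.5, 2, 10) ;   % ratio, kernel size, max distance
    imagesc(Iseg) ; axis image ;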
SLIC. An introduction to SLIC superpixels.
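For example (the region size and regularizer are illustrative values):

    % SLIC superpixels with a nominal region size of 30 pixels.
    I = im2single(imread('image.jpg')) ;   % 'image.jpg' is a placeholder
    segments = vl_slic(I, 30, 0.1) ;       % one integer superpixel label per pixel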
Support Vector Machine (SVM). Learn a binary classifier and check its convergence by plotting the objective values.
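A minimal sketch on synthetic data; the toy data and the regularization parameter are made up for illustration:

    % Train a linear SVM on two Gaussian blobs and score the training points.
    X = [randn(2, 100) - 1, randn(2, 100) + 1] ;
    y = [-ones(1, 100), ones(1, 100)] ;
    [w, b, info] = vl_svmtrain(X, y, 0.01) ;   % info carries convergence diagnostics
    scores = w' * X + b ;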
Forests of kd-trees. Approximate nearest neighbor queries in high dimensions using an optimized forest of kd-trees.
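Roughly (the data, the number of trees, and the comparison budget are illustrative):

    % Build a forest of 4 kd-trees and run an approximate nearest-neighbor query.
    X = rand(128, 10000) ;                 % database: one column per vector
    Q = rand(128, 1) ;                     % a query vector
    forest = vl_kdtreebuild(X, 'NumTrees', 4) ;
    [index, dist] = vl_kdtreequery(forest, X, Q, ...
                                   'NumNeighbors', 10, 'MaxComparisons', 50) ;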
Plotting functions for rank evaluation. Learn how to plot ROC, DET, and precision-recall curves.
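For example, with synthetic labels and scores:

    % Plot ROC, precision-recall, and DET curves for +/-1 ground-truth labels.
    labels = [ones(1, 50), -ones(1, 50)] ;
    scores = randn(1, 100) + labels ;      % noisy scores correlated with the labels
    figure(1) ; vl_roc(labels, scores) ;
    figure(2) ; vl_pr(labels, scores) ;
    figure(3) ; vl_det(labels, scores) ;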
MATLAB Utilities. A list of useful MATLAB functions bundled with VLFeat.