Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/26335
Title: | Inhibition-augmented trainable COSFIRE filters for keypoint detection and object recognition |
Authors: | Guo, Jiapan; Shi, Chenyu; Azzopardi, George; Petkov, Nicolai |
Keywords: | Computer vision; Pattern recognition systems; Retina -- Imaging |
Issue Date: | 2016 |
Publisher: | Springer |
Citation: | Guo, J., Shi, C., Azzopardi, G., & Petkov, N. (2016). Inhibition-augmented trainable COSFIRE filters for keypoint detection and object recognition. Machine Vision and Applications, 27(8), 1197-1211. |
Abstract: | The shape and meaning of an object can radically change with the addition of one or more contour parts. For instance, a T-junction can become a crossover. We extend the COSFIRE trainable filter approach, which uses a positive prototype pattern for configuration, by adding a set of negative prototype patterns. The configured filter responds to patterns that are similar to the positive prototype but not to any of the negative prototypes. The configuration of such a filter comprises selecting given channels of a bank of Gabor filters that provide excitatory or inhibitory input, and determining certain blur and shift parameters. We compute the response of such a filter as the excitatory input minus a fraction of the maximum of the inhibitory inputs. We use three applications to demonstrate the effectiveness of inhibition: the exclusive detection of vascular bifurcations (i.e., without crossovers) in retinal fundus images (DRIVE data set), the recognition of architectural and electrical symbols (GREC'11 data set), and the recognition of handwritten digits (MNIST data set). |
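The response rule stated in the abstract (excitatory input minus a fraction of the maximum of the inhibitory inputs) lends itself to a compact sketch. Below is a minimal illustration in Python/NumPy; the function name `inhibition_augmented_response`, the fraction parameter `eta`, and the rectification at zero are illustrative assumptions, not the paper's exact notation or implementation.

```python
import numpy as np

def inhibition_augmented_response(excitatory, inhibitory, eta=0.5):
    # Sketch of the rule from the abstract: subtract a fraction (eta)
    # of the pixel-wise maximum over the inhibitory response maps from
    # the excitatory response map. eta and the clipping at zero are
    # assumptions made for this illustration.
    excitatory = np.asarray(excitatory, dtype=float)
    inhibitory = np.asarray(inhibitory, dtype=float)  # shape: (n_inhibitory, H, W)
    response = excitatory - eta * inhibitory.max(axis=0)
    return np.maximum(response, 0.0)  # assumed: negative responses suppressed

# Toy usage: one 2x2 excitatory map and two inhibitory maps.
exc = np.array([[0.9, 0.2],
                [0.4, 0.8]])
inh = np.array([[[0.1, 0.5], [0.0, 0.9]],
                [[0.3, 0.1], [0.2, 0.0]]])
print(inhibition_augmented_response(exc, inh, eta=0.5))
```

In the full COSFIRE approach the excitatory input would itself be a combination of selected Gabor channel responses after blurring and shifting; the sketch above covers only the inhibition step.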
URI: | https://www.um.edu.mt/library/oar/handle/123456789/26335 |
Appears in Collections: | Scholarly Works - FacICTAI |
Files in This Item:
File | Description | Size | Format
---|---|---|---
Inhibition-augmented_trainable_COSFIRE_filters_for.pdf (Restricted Access) | | 1.78 MB | Adobe PDF
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.