Learning and Selecting Features Jointly with Point-wise Gated Boltzmann Machines
Kihyuk Sohn, Guanyu Zhou, Chansoo Lee, and Honglak Lee
Overview
Unsupervised feature learning has emerged as a promising tool for learning representations from unlabeled data. However, it remains challenging to learn useful high-level features when the data contains a significant amount of irrelevant patterns. Although feature selection can handle such complex data, it may fail when the learning system must be built from scratch, i.e., starting without useful raw features to select from.
Model: Point-wise Gated Boltzmann Machine
Supervised point-wise gated Boltzmann machine.
Convolutional point-wise gated Boltzmann machine.
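As an illustrative sketch of the point-wise gating idea behind the model: each visible unit carries a switch that routes it to one of two groups of hidden units (task-relevant vs. task-irrelevant), and switches and hidden units are inferred jointly. The dimensions, variable names, and the simple mean-field-style updates below are assumptions for illustration only, not the paper's exact training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions (illustrative assumptions, not from the paper)
D, H = 6, 4  # visible units; hidden units per group
K = 2        # two groups: task-relevant vs. task-irrelevant

W = rng.normal(scale=0.1, size=(K, D, H))  # per-group weight matrices
b = np.zeros((K, H))                       # per-group hidden biases
c = np.zeros((K, D))                       # per-group visible biases

v = rng.integers(0, 2, size=D).astype(float)  # a binary input vector

# Start with a uniform soft switch assignment, then alternate:
# infer hidden activations given the routed input, then update
# the point-wise switches given the hidden explanations.
z = np.full((K, D), 1.0 / K)
for _ in range(10):
    # Each group's hidden units see only the visible units routed to it.
    h = np.array([sigmoid((z[k] * v) @ W[k] + b[k]) for k in range(K)])
    # Each switch softly picks the group that best explains its
    # visible unit (softmax over per-group compatibility scores).
    score = np.array([v * (W[k] @ h[k]) + c[k] for k in range(K)])
    z = np.exp(score - score.max(axis=0))
    z /= z.sum(axis=0)

# Switch activations form a probability distribution per visible unit.
assert np.allclose(z.sum(axis=0), 1.0)
```

In the supervised variant, label information additionally biases the switches so that the "task-relevant" group captures the discriminative patterns; the convolutional variant shares weights and switches across image locations.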
Evaluation 1: Handwritten digit recognition in the presence of noise
Visualization of learned filters and switch unit activations on MNIST-back-image.
Error rates on MNIST variation datasets.
Evaluation 2: Object localization on Caltech-101
Visualization of (top) learned filters from the Caltech-101 Face dataset, with task-relevant filters on the left and task-irrelevant filters on the right; (middle) switch unit activation maps; and (bottom) corresponding examples from the Caltech-101 Face dataset overlaid with predicted (red) and ground truth (green) bounding boxes.
Visualization of (top) learned filters from the Caltech-101 Car dataset, with task-relevant filters on the left and task-irrelevant filters on the right; (middle) switch unit activation maps; and (bottom) corresponding examples from the Caltech-101 Car dataset overlaid with predicted (red) and ground truth (green) bounding boxes.
Publication
Learning and Selecting Features Jointly with Point-wise Gated Boltzmann Machines.
Kihyuk Sohn, Guanyu Zhou, Chansoo Lee, and Honglak Lee.
In Proceedings of the 30th International Conference on Machine Learning (ICML), 2013.
Download
Feedback
Please email me if you have any questions.
Acknowledgments |
This work was supported in part by NSF IIS 1247414 and a Google Faculty Research Award. |