HSR: L1/2-regularized sparse representation for fast face recognition using hierarchical feature selection
  • Authors: Bo Han; Bo He; Tingting Sun; Tianhong Yan; Mengmeng Ma; Yue Shen; Amaury Lendasse
  • Keywords: Fast face recognition; Hierarchical feature selection; Gabor wavelets; ELM; AE; Sparse representation; L1/2 regularization; HSR
  • Journal: Neural Computing & Applications
  • Publication date: February 2016
  • Year: 2016
  • Volume: 27
  • Issue: 2
  • Pages: 305–320
  • Full-text size: 2,211 KB
  • Author affiliations: Bo Han (1)
    Bo He (1)
    Tingting Sun (1)
    Tianhong Yan (2)
    Mengmeng Ma (1)
    Yue Shen (1)
    Amaury Lendasse (3) (4)

    1. School of Information Science and Engineering, Ocean University of China, Qingdao, 266000, Shandong, China
    2. School of Mechanical and Electrical Engineering, China Jiliang University, Hangzhou, 310018, Zhejiang, China
    3. Department of Mechanical and Industrial Engineering and the Iowa Informatics Initiative, The University of Iowa, Iowa City, IA, 52242-1527, USA
    4. Arcada University of Applied Sciences, 00550, Helsinki, Finland
  • Journal category: Computer Science
  • Journal subject: Simulation and Modeling
  • Publisher: Springer London
  • ISSN: 1433-3058
Abstract
In this paper, we propose a novel method for fast face recognition: L1/2-regularized sparse representation using hierarchical feature selection. Hierarchical feature selection compresses both the scale and the dimension of the global dictionary, which directly reduces the computational cost of the sparse representation at the core of our approach. The hierarchy consists of Gabor wavelets followed by an extreme learning machine auto-encoder (ELM-AE). In the Gabor stage, local features are extracted at multiple scales and orientations to form a Gabor-feature-based image, which in turn improves the recognition rate; moreover, for occluded face images the scale of the Gabor-feature-based global dictionary can be compressed, because the Gabor-feature-based occlusion dictionary contains redundancies. In the ELM-AE stage, the dimension of the Gabor-feature-based global dictionary can be compressed, because high-dimensional face images can be rapidly represented by low-dimensional features. By introducing L1/2 regularization, our approach produces a sparser and more robust representation than L1-regularized sparse representation-based classification (SRC), which further reduces the computational cost of the sparse representation. In comparison with related work such as SRC and Gabor-feature-based SRC, experimental results on a variety of face databases demonstrate the clear computational advantage of our method, while achieving comparable or even better recognition rates.
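The ELM-AE stage described above can be sketched as follows: random, untrained input weights map the data to a hidden layer, the output weights are solved in closed form by least squares, and projecting the data onto those output weights yields a low-dimensional embedding. This is a minimal NumPy illustration under stated assumptions (tanh activation, plain Gaussian random weights, function and parameter names invented here), not the authors' implementation.

```python
import numpy as np

def elm_ae_features(X, n_hidden=64, seed=0):
    """Minimal ELM auto-encoder sketch for dimensionality reduction.

    X        : (n_samples, n_features) data matrix
    n_hidden : size of the low-dimensional embedding (assumed < n_features)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Random input weights and biases -- fixed, never trained (the ELM idea)
    W = rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                        # hidden activations, (n, n_hidden)
    # Output weights beta solve H @ beta ~= X in the least-squares sense
    beta, *_ = np.linalg.lstsq(H, X, rcond=None)  # (n_hidden, d)
    # Low-dimensional embedding: project data onto the learned output weights
    return X @ beta.T                             # (n, n_hidden)

# Toy usage: compress 100-dimensional vectors down to 10 dimensions
X = np.random.default_rng(1).standard_normal((20, 100))
Z = elm_ae_features(X, n_hidden=10)
print(Z.shape)  # (20, 10)
```

Because the only training step is a single least-squares solve, the compression is fast, which is what lets the method shrink the Gabor-feature-based global dictionary before sparse coding.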