FACTS & FIGURES

JCR Impact Factor: 0.650
JCR 5-Year IF: 0.639
Issues per year: 4
Current issue: Aug 2019
Next issue: Nov 2019
Avg review time: 72 days


PUBLISHER

Stefan cel Mare
University of Suceava
Faculty of Electrical Engineering and
Computer Science
13, Universitatii Street
Suceava - 720229
ROMANIA

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
doi: 10.4316/AECE


TRAFFIC STATS

2,358,168 unique visits
611,321 downloads
Since November 1, 2009





MOST RECENT ISSUES

Volume 19 (2019)
    » Issue 3 / 2019
    » Issue 2 / 2019
    » Issue 1 / 2019

Volume 18 (2018)
    » Issue 4 / 2018
    » Issue 3 / 2018
    » Issue 2 / 2018
    » Issue 1 / 2018

Volume 17 (2017)
    » Issue 4 / 2017
    » Issue 3 / 2017
    » Issue 2 / 2017
    » Issue 1 / 2017

Volume 16 (2016)
    » Issue 4 / 2016
    » Issue 3 / 2016
    » Issue 2 / 2016
    » Issue 1 / 2016

View all issues








LATEST NEWS

2019-Jun-20
Clarivate Analytics published the InCites Journal Citation Reports for 2018. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.650, and the JCR 5-Year Impact Factor is 0.639.

2018-May-31
Starting today, the minimum number of pages for a paper is 8, so all submitted papers should have 8, 10, or 12 pages. No exceptions will be accepted.

2018-Jun-27
Clarivate Analytics published the InCites Journal Citation Reports for 2017. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.699, and the JCR 5-Year Impact Factor is 0.674.

2017-Jun-14
Thomson Reuters published the Journal Citation Reports for 2016. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.595, and the JCR 5-Year Impact Factor is 0.661.

Read More »


    
 

  1/2019 - 7

Circular Derivative Local Binary Pattern Feature Description for Facial Expression Recognition

TCHANGOU TOUDJEU, I., TAPAMO, J.-R.

Download PDF (1,929 KB) | Citation | Downloads: 250 | Views: 362

Author keywords
affective computing, classification, face recognition, feature extraction, image texture analysis

References keywords
facial(19), recognition(18), local(13), binary(11), patterns(8), pattern(8), image(6), classification(6), icme(4), comput(4)

About this article
Date of Publication: 2019-02-28
Volume 19, Issue 1, Year 2019, On page(s): 51 - 56
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2019.01007
Web of Science Accession Number: 000459986900007
SCOPUS ID: 85064192591

Abstract
This paper presents a novel feature extraction technique called circular derivative local binary pattern (CD-LBP) for Facial Expression Recognition (FER). Motivated by uniform local binary patterns (uLBPs), which exhibit high discriminative potential at a reduced data dimension of the original LBP feature vector, we extract CD-LBP feature descriptors as binary derivatives of the circular binary patterns formed by LBPs. Seven datasets consisting of CD-LBP feature vectors are derived from the Japanese Female Facial Expression (JAFFE) database, fed individually into a K-nearest neighbor classifier, and evaluated with respect to their respective recognition rate and feature vector size. The experimental results demonstrate the relevance of the proposed feature description, especially when performance metrics such as recognition accuracy and running time are considered.
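
For readers who want a concrete starting point, the sketch below illustrates the general pipeline the abstract describes: compute 8-neighbor LBP codes, take a first-order circular binary derivative of each pattern, pool the resulting codes into a per-image histogram, and classify the histograms with a K-nearest neighbor classifier. This is a minimal illustration under stated assumptions (the neighborhood layout, the derivative definition, and the histogramming are not taken from the paper), not the authors' implementation.

# Illustrative sketch only: plain 8-neighbor LBP plus a simple circular binary
# derivative, followed by K-NN classification. The exact CD-LBP construction in
# the paper may differ; the details below are assumptions for illustration.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Offsets of the 8 neighbors, enumerated circularly around the center pixel.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]

def lbp_bits(img, r, c):
    """Return the 8 binary comparisons (neighbor >= center) as a list of 0/1."""
    center = img[r, c]
    return [1 if img[r + dr, c + dc] >= center else 0 for dr, dc in OFFSETS]

def circular_derivative(bits):
    """First-order circular binary derivative: XOR of adjacent bits on the ring."""
    n = len(bits)
    return [bits[i] ^ bits[(i + 1) % n] for i in range(n)]

def cdlbp_histogram(img):
    """Normalized histogram of circular-derivative codes over the whole image."""
    hist = np.zeros(256, dtype=np.float64)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            d = circular_derivative(lbp_bits(img, r, c))
            code = sum(b << i for i, b in enumerate(d))
            hist[code] += 1
    return hist / hist.sum()

# Hypothetical usage: train_images, train_labels, and test_images would come
# from a facial-expression dataset such as JAFFE (loading code not shown here).
def classify(train_images, train_labels, test_images, k=1):
    X_train = np.array([cdlbp_histogram(im) for im in train_images])
    X_test = np.array([cdlbp_histogram(im) for im in test_images])
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, train_labels)
    return knn.predict(X_test)

In this sketch the derivative of the bit ring is its XOR with a one-position rotation, which is one natural reading of "binary derivatives of the circular binary patterns"; the paper's exact CD-LBP descriptors and its seven derived datasets are not reproduced here.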


References | Cited By

[1] N. N. Khatri, Z. H. Shah, S. A. Patel, "Facial expression recognition: A survey," International Journal of Computer Science and Information Technologies (IJCSIT), vol. 5, pp. 149-152, 2014.

[2] X. Feng, M. Pietikäinen, A. Hadid, "Facial Expression Recognition with Local Binary Patterns and Linear Programming," Pattern Recognition and Image Analysis, vol. 15, no. 2, pp. 546-548, 2005.
[CrossRef] [SCOPUS Times Cited 33]


[3] A. Majumder, L. Behera, V. K. Subramanian, "Local binary pattern based facial expression recognition using Self-organizing Map," in International Joint Conference on Neural Networks (IJCNN), pp. 2375-2382, 2014.
[CrossRef] [SCOPUS Times Cited 4]


[4] D. Huang, C. Shan, M. Ardabilian, Y. Wang, L. Chen, "Local binary patterns and its application to facial image analysis: A survey," IEEE Trans. Syst. Man. Cybern. C Appl. Rev., vol. 41, no. 6, pp. 765-781, Nov. 2011.
[CrossRef] [Web of Science Times Cited 332] [SCOPUS Times Cited 465]


[5] C. Silva, T. Bouwmans, C. Frélicot, "An eXtended center-symmetric local binary pattern for background modeling and subtraction in videos," Proc. Int. Conf. Comput. Vis. Theory Appli., pp. 395-402, 2015.
[CrossRef]


[6] G. Xue, L. Song, J. Sun, and M. Wu, "Hybrid center-symmetric local pattern for dynamic background subtraction," in Proc. of IEEE International Conference on Multimedia and Expo (ICME), 2011.
[CrossRef] [SCOPUS Times Cited 27]


[7] O. Lahdenoja, J. Poikonen, M. Laiho, "Towards understanding the formation of uniform local binary patterns", ISRN Mach Vis., vol. 2013, pp. 1, Jun. 2013.
[CrossRef]


[8] I. Cohen, N. Sebe, A. Garg, M. S. Lew, T. S. Huang, "Facial expression recognition from video sequences," in Proc. of IEEE International Conference on Multimedia and Expo (ICME), pp. 121-124, 2002.
[CrossRef] [SCOPUS Times Cited 51]


[9] S. Moore, R. Bowden, "Local binary patterns for multi-view facial expression recognition," Comput. Vis. Image Understanding, vol. 115, no. 4, pp. 541-558, 2011.
[CrossRef] [Web of Science Times Cited 176] [SCOPUS Times Cited 215]


[10] X. M. Zhao, S.Q. Zhang, "A review on facial expression recognition: feature extraction and classification," IETE Technical Review, vol. 33, no. 5, pp. 505-517, 2016.
[CrossRef] [Web of Science Times Cited 8] [SCOPUS Times Cited 14]


[11] X. Feng, A. Hadid, M. Pietikäinen, "A coarse-to-fine classification scheme for facial expression recognition," Proc. Int. Conf. Image Anal. Recog., pp. 668-675, 2004.
[CrossRef]


[12] C. Shan, S. Gong, P. W. McOwan, "Facial expression recognition based on local binary patterns: A comprehensive study," Image Vis. Comput., vol. 27, no. 6, pp. 803-816, 2009.
[CrossRef] [Web of Science Times Cited 885] [SCOPUS Times Cited 1168]


[13] A. Sohail, P. Bhattacharya, "Classification of facial expressions using k-nearest neighbor classifier," Proc. Vision Computer Graphics Collaboration Techniques, pp. 555-566, 2007.
[CrossRef]


[14] R. Suresh, S. Audithan, G. Kannan and K. Raja, "Facial Expression Recognition System Using Local Texture Features of Contourlet Transformation," Australian Journal of Basic and Applied Sciences, vol. 10, no. 2, 2016.

[15] S. Kasim, R. Hassan, N. H. Zaini, A. Syifaa’Ahmad, A. A. Ramli, R. R. Saedudin, "A Study on Facial Expression Recognition Using Local Binary Pattern," International Journal on Advanced Science, Engineering and Information Technology, vol. 7, no. 5, pp. 1621-1626, Oct. 2017.
[CrossRef] [SCOPUS Times Cited 4]


[16] Y. Chang, C. Hu, R. Feris, M. Turk, "Manifold Based Analysis of Facial Expression," J. Image and Vision Computing, vol. 24, no. 6, pp. 605-614, 2006.
[CrossRef] [Web of Science Times Cited 88] [SCOPUS Times Cited 121]


[17] S. Berretti, A. Del Bimbo, P. Pala, B. Ben Amor, M. Daoudi, "A set of selected SIFT features for 3D facial expression recognition," Proc. 20th International Conference on Pattern Recognition, pp. 4125-4128, 2010.
[CrossRef] [SCOPUS Times Cited 87]


[18] T. Ojala, M. Pietikäinen, D. Harwood, "A Comparative Study of Texture Measures with Classification Based on Feature Distributions," Pattern Recognition, vol. 29, pp. 51-59, 1996.
[CrossRef] [Web of Science Times Cited 3126] [SCOPUS Times Cited 4198]


[19] S. L. Happy, A. George, A. Routray, "A real time facial expression classification system using local binary patterns," Proc. 4th Int. Conf. Intell. Human Comput. Interaction, pp. 1-5, 2012.
[CrossRef] [SCOPUS Times Cited 30]


[20] X. Zhao, S. Zhang, "Facial expression recognition using local binary patterns and discriminant kernel locally linear embedding," EURASIP Journal of Advances in Signal Processing, vol. 2012, no. 20, pp. 1-9, 2012.
[CrossRef] [Web of Science Times Cited 51] [SCOPUS Times Cited 46]


[21] M. J. Lyons, J. Budynek, S. Akamatsu, "Automatic Classification of Single Facial Images," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 21, no. 12, pp. 1357-1362, 1999.
[CrossRef] [Web of Science Times Cited 542] [SCOPUS Times Cited 682]


[22] P. Viola, M. Jones, "Robust Real-Time Face Detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137-154, May 2004.
[CrossRef] [Web of Science Times Cited 6072] [SCOPUS Times Cited 8312]


[23] X. Feng, B. Lv, Z. Li, J. Zhang, "A novel feature extraction method for facial expression recognition," Proc. Joint Conf. Inform. Sci. Issue Adv. Intell. Syst. Res., pp. 371-375, 2006.
[CrossRef] [SCOPUS Times Cited 11]


[24] K. Meena and A. Suruliandi, "Local binary patterns and its variants for face recognition," in Recent Trends in Information Technology (ICRTIT), 2011 International Conference on, 2011, pp. 782-786.
[CrossRef] [SCOPUS Times Cited 16]


[25] Y. Wu and Q. Weigen, "Facial expression recognition based on improved deep belief networks," AIP Conference Proceedings, vol. 1864, no. 1, 2017.
[CrossRef] [Web of Science Times Cited 1] [SCOPUS Times Cited 3]




References Weight

Web of Science® Citations for all references: 11,281 TCR
SCOPUS® Citations for all references: 15,487 TCR

Web of Science® Average Citations per reference: 434 ACR
SCOPUS® Average Citations per reference: 596 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
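
As a quick worked illustration of how these two figures relate, here is a minimal sketch, assuming ACR is simply the total divided by the number of references considered and rounded to the nearest integer; the journal's exact counting and rounding conventions are not documented here.

# Minimal sketch (assumption: ACR = TCR / number of references, rounded).
def references_weight(citation_counts):
    tcr = sum(citation_counts)                # Total Citations for References
    acr = round(tcr / len(citation_counts))   # Average Citations per Reference
    return tcr, acr

# Hypothetical usage with a few per-reference citation counts:
# references_weight([332, 176, 885, 3126]) returns (4519, 1130)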

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ... Read more

Citations for references updated on 2019-10-13 03:37 in 162 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2019
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



