


LATEST NEWS

2017-Jun-14
Thomson Reuters published the Journal Citations Report for 2016. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.595, and the JCR 5-Year Impact Factor is 0.661.

2017-Apr-04
We have the confirmation Advances in Electrical and Computer Engineering will be included in the EBSCO database.

2017-Jan-30
We have the confirmation Advances in Electrical and Computer Engineering will be included in the Gale database.

Issue 4/2013, Article 10

A Hybrid Method for Fast Finding the Reduct with the Best Classification Accuracy

HACIBEYOGLU, M., ARSLAN, A., KAHRAMANLI, S.
 

Download PDF (653 KB) | Citation | Downloads: 351 | Views: 2,229

Author keywords
artificial intelligence, classification algorithms, decision trees, discernibility function, feature selection

References keywords
rough(13), data(13), systems(11), knowledge(11), information(11), rule(10), learning(10), induction(10), approach(8), classification(7)
No common words between the references section and the paper title.
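The keyword counts above and the title-overlap note are straightforward to reproduce. A minimal sketch, assuming simple lowercase tokenization and a toy stopword filter (the function name, stopword list, and sample reference titles are illustrative, not the journal's actual implementation):

```python
from collections import Counter
import re

# Toy stopword list; the journal's actual filter is unknown.
STOPWORDS = {"the", "a", "an", "of", "for", "and", "in", "on", "via", "with", "to"}

def keyword_stats(title, references, top_n=10):
    """Count word frequencies across reference titles and report which of
    the top keywords also appear in the paper title."""
    words = re.findall(r"[a-z]+", " ".join(references).lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 3)
    top = counts.most_common(top_n)
    title_words = set(re.findall(r"[a-z]+", title.lower()))
    common = [w for w, _ in top if w in title_words]
    return top, common

refs = [
    "Induction of decision trees",
    "Rough set based feature selection: a review",
    "Rough sets: theoretical aspects of reasoning about data",
    "Classification methods in rule induction",
]
top, common = keyword_stats(
    "A Hybrid Method for Fast Finding the Reduct with the Best Classification Accuracy",
    refs, top_n=20)
# "rough" occurs twice across these toy references;
# "classification" is the only top keyword shared with the title.
```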

About this article
Date of Publication: 2013-11-30
Volume 13, Issue 4, Year 2013, On page(s): 57 - 64
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2013.04010
Web of Science Accession Number: 000331461300010
SCOPUS ID: 84890203115

Abstract
A dataset usually has many reducts, and finding all of them is known to be an NP-hard problem. Moreover, different reducts of a dataset may provide different classification accuracies; usually, for every dataset, there is only one reduct with the best classification accuracy. To obtain this best one, we first obtain the group of attributes that are dominant for the given dataset by using a decision tree algorithm. Second, we complete this group up to reducts by using discernibility function techniques. Finally, we select the single reduct with the best classification accuracy by using data mining classification algorithms. The experimental results on the tested datasets indicate that classification accuracy is improved by removing the irrelevant features and using the simplified attribute set derived from the proposed method.
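The pipeline the abstract describes can be sketched roughly as follows. This is a toy, stdlib-only illustration: information-gain ranking stands in for the paper's decision-tree stage, an exhaustive pairwise check stands in for the discernibility function machinery, and the final accuracy-based selection among competing reducts is omitted; all names are hypothetical, not from the paper's code.

```python
from collections import Counter
from itertools import combinations
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting on one attribute (decision-tree criterion)."""
    base = entropy(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    return base - sum(len(g) / len(labels) * entropy(g) for g in groups.values())

def discerns_all(rows, labels, attrs):
    """Discernibility condition: the attribute subset must distinguish
    every pair of rows that carry different labels."""
    for (r1, y1), (r2, y2) in combinations(zip(rows, labels), 2):
        if y1 != y2 and all(r1[a] == r2[a] for a in attrs):
            return False
    return True

def hybrid_reduct(rows, labels):
    """Stage 1: rank attributes by information gain (dominant attributes).
    Stage 2: grow the top-ranked group until it satisfies discernibility."""
    n_attrs = len(rows[0])
    ranked = sorted(range(n_attrs),
                    key=lambda a: info_gain(rows, labels, a), reverse=True)
    chosen = []
    for a in ranked:
        chosen.append(a)
        if discerns_all(rows, labels, chosen):
            break
    return sorted(chosen)

# Toy dataset: the class equals attribute 1, so {1} alone is a reduct.
rows = [(0, 0, 1), (0, 1, 0), (1, 0, 1), (1, 1, 0)]
labels = [0, 1, 0, 1]
```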


References | Cited By

[1] J. R. Quinlan, "Induction of decision trees," Machine Learning, vol. 1, no. 1, pp. 81-106, 1986.
[CrossRef] [SCOPUS Times Cited 8202]


[2] J. R. Quinlan, R. M. Cameron-Jones, "Induction of logic programs: Foil and related systems," New Generation Computing, vol. 13, no. 3-4, pp. 287-312, 1995.
[CrossRef] [Web of Science Times Cited 80] [SCOPUS Times Cited 109]


[3] Y. Kusunoki, M. Inuiguchi, J. Stefanowski, "Rule induction via clustering decision classes," International Journal of Innovative Computing Information and Control, vol. 4, no. 10, pp.2663-2677, 2008.

[4] J. A. Jakubczyc, "The ant colony algorithms for rule induction," Proceedings of AIML 05 Conference, CICC, Cairo, Egypt, pp. 112-117, 19-21 December 2005.

[5] J. W. Grzymala-Busse, Chien Pei B., "Classification methods in rule induction," Proceedings of the Fifth Intelligent Information Systems Workshop, Deblin, Poland, pp. 120-126, June 2-5 1996.

[6] R. S. Michalski, "On the Quasi-Minimal solution of the covering problem," Proceedings of the Fifth International Symposium on Information Processing, Bled, Yugoslavia, A3 (Switching Circuits), pp.125-128, 8-11 October 1969.

[7] R. S. Michalski, I. Mozetic, J. Hong, N. Lavrac, "The Multi-Purpose Incremental Learning System AQ15 and its Testing Application to Three Medical Domains," Proc. AAAI, pp. 1041-1047, 1986.

[8] W. Cohen, "Fast effective rule induction," Proceedings of the 12th International Conference on Machine Learning, p.115-123, 1995.

[9] S. J. Hong, "R-MINI: An iterative approach for generating minimal rules from examples," IEEE Transactions on Knowledge and Data Engineering, vol. 9, no. 5, pp. 709-717, 1997.
[CrossRef] [SCOPUS Times Cited 23]


[10] M. Muselli, D. Liberati, "Binary rule generation via hamming clustering," IEEE Transactions on Knowledge and Data Engineering, vol. 14, no. 6, pp. 1258-1268, 2002.
[CrossRef] [Web of Science Times Cited 28] [SCOPUS Times Cited 31]


[11] N. Cercone, A. An, C. Chan, "Rule-induction and case-based reasoning: Hybrid architectures appear advantageous," IEEE Transactions on Knowledge and Data Engineering, vol. 11, no. 1, pp. 166-174, 1999.
[CrossRef] [Web of Science Times Cited 30] [SCOPUS Times Cited 53]


[12] P. Smyth, R. M. Goodman, "An Information Theoretic Approach to Rule Induction from Databases," IEEE Transactions on Knowledge and Data Engineering, vol. 4, no. 4, pp. 301-316, 1992.
[CrossRef] [Web of Science Times Cited 151] [SCOPUS Times Cited 211]


[13] D. S. Zhang, L. Zhou, "Discovering Golden Nuggets: Data Mining in Financial Application," IEEE Transactions on Systems, Man and Cybernetics, Part C: Applications and Reviews, vol. 34, no. 4, pp. 513-522, 2004.
[CrossRef] [Web of Science Times Cited 67] [SCOPUS Times Cited 102]


[14] R. Kohavi, J. R. Quinlan, "Decision-tree discovery," Handbook of Data Mining and Knowledge Discovery, pp. 267-276, 2002.

[15] L. Breiman, J. H. Friedman, R. A. Olshen, C.J. Stone, "Classification and Regression Trees," CA, Wadsworth, 1984.

[16] D. D. Patil, V. M. Wadhai, J. A. Gokhale, "Evaluation of Decision Tree Pruning Algorithms for Complexity and Classification Accuracy," International Journal of Computer Applications, vol. 11, no.2, pp. 23-30, 2010.

[17] J. Komorowski, L. Polkowski, A. Skowron, "Rough Set: A Tutorial," [Online] Available: Temporary on-line reference link removed - see the PDF document

[18] A. E. Hassanien, J. M. H. Ali, "Rough set approach for generation of classification rules of breast cancer data," Informatica, vol. 15, no. 1, pp. 23-38, 2004.

[19] J. Y. Guo, V. Chankong, "Rough set-based approach to rule generation and rule induction," International Journal of General Systems, vol. 31, no. 6, pp. 601-617, 2002.
[CrossRef] [Web of Science Times Cited 9] [SCOPUS Times Cited 9]


[20] J. F. Liu, Q. H. Hu, D. R. Yu, "A weighted rough set based method developed, for class imbalance learning," Information Sciences, vol. 178, no.4, pp. 1235-1256, 2008.
[CrossRef] [Web of Science Times Cited 41] [SCOPUS Times Cited 54]


[21] E. Xu, S.C. Tong, L.S. Shao, Y. Li, D. Jiao, "Rough Set Research on Rule Extraction in Information Table," Proceedings of Fourth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), pp. 208-212, 2007.
[CrossRef] [SCOPUS Times Cited 1]


[22] M. A. Hall, G. Holmes, "Benchmarking attribute selection techniques for discrete class data mining," IEEE Transactions on Knowledge and Data Engineering, vol. 15, no. 6, pp. 1437-1447, 2003.
[CrossRef] [Web of Science Times Cited 449] [SCOPUS Times Cited 664]


[23] H. Liu, L. Yu, "Toward integrating feature selection algorithms for classification and clustering," IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 4, pp. 491-502, 2005.
[CrossRef] [Web of Science Times Cited 1003] [SCOPUS Times Cited 1419]


[24] A. Hassanien, "Fuzzy rough sets hybrid scheme for breast cancer detection," Image and Vision Computing, vol. 25, no. 2, pp. 172-183, 2007.
[CrossRef] [Web of Science Times Cited 67] [SCOPUS Times Cited 92]


[25] J. Y. Wang, J. Zhou, "Research of reduct features in the variable precision rough set model," Neurocomputing, vol. 72, no. 10-12, pp. 2643-2648, 2009.
[CrossRef] [Web of Science Times Cited 33] [SCOPUS Times Cited 67]


[26] J. Wroblewski, "Ensembles of Classifiers Based on Approximate Reducts," Fundamenta Informaticae, vol. 47, no. 3-4, pp. 351-360, 2001.

[27] R. Jensen, Q. Shen, "Semantics-preserving dimensionality reduction: Rough and fuzzy-rough based approaches," IEEE Transactions on Knowledge and Data Engineering vol.16, no.12, pp.1457-1471, 2004.
[CrossRef] [Web of Science Times Cited 304] [SCOPUS Times Cited 425]


[28] Z. Pawlak, "Rough Sets: Theoretical aspects of reasoning about data," Kluwer Academic Publishers, Boston, pp. 1-252, 1991.
[CrossRef]


[29] J. R. Quinlan, "Learning efficient classification procedures and their application to chess end games," Machine Learning: An Artificial Intelligence Approach, vol. 1, pp.463-482, 1983.

[30] J. R. Quinlan, "C4.5: Programs for Machine Learning," Morgan Kaufmann Publishers, 1993.

[31] P. Clark, T. Niblett, "The CN2 induction algorithm," Machine Learning, vol. 3, no.4, pp. 261-283, 1989.
[CrossRef] [SCOPUS Times Cited 1320]


[32] L. H. Wang, G. F. Wu, "Attribute Reduction and Information Granularity," 6th World Multiconference on Systemics, Cybernetics and Informatics, Vol. I, Proceedings - Information Systems Development I, pp. 32-37, 2003.

[33] R. Jensen, Q. Shen, "Rough set based feature selection: A review," 52 pages, 2007. [Online] Available: Temporary on-line reference link removed - see the PDF document

[34] R. W. Swiniarski, A.Skowron, "Rough set methods in feature selection and recognition," Pattern Recognition Letters, vol. 24, no. 6, pp. 833-849, 2003.
[CrossRef] [Web of Science Times Cited 421] [SCOPUS Times Cited 563]


[35] J. Wei, S. Wang, M. Wang, J. You, D. Liu, "Rough set based approach for inducing decision trees," Knowledge-Based Systems, vol. 20, no. 8, pp. 695-702, 2007.
[CrossRef] [Web of Science Times Cited 18] [SCOPUS Times Cited 24]


[36] A. Skowron, C. Rauszer, "The discernibility matrices and functions in information systems," Fundamenta Informaticae, vol. 15, no. 2, pp. 331-362, 1992.

[37] S. Kahramanli, M. Hacibeyoglu, A. Arslan, "A Boolean Function Approach to Feature Selection in Consistent Decision Information Systems," Expert Systems with Application, vol. 38, no. 7, pp. 8229-8239, 2011.
[CrossRef] [Web of Science Times Cited 6] [SCOPUS Times Cited 8]


[38] S. Kahramanli, M. Hacibeyoglu, A. Arslan, "Attribute Reduction by Partitioning the Minimized Discernibility Function," International Journal of Innovative Computing Information and Control, vol. 7, no. 5A, pp. 2167-2186, 2011.

[39] [Online] Available: Temporary on-line reference link removed - see the PDF document

[40] [Online] Available: Temporary on-line reference link removed - see the PDF document

[41] G. Shakhnarovich, T. Darrell, P. Indyk, "Nearest-Neighbor Methods in Learning and Vision," MIT Press, 2005.

[42] T. Mitchell, "Machine Learning," McGraw-Hill, 1997.

[43] P. Minvielle, A. Doucet, A. Marrs, S. Maskell, "A Bayesian approach to joint tracking and identification of geometric shapes in video sequences", Image and Vision Computing, vol.28, no.1, pp.111-123, 2010.
[CrossRef] [Web of Science Times Cited 8] [SCOPUS Times Cited 12]


[44] C. Pozna, R.-E. Precup, J. K. Tar, I. Skrjanc, S. Preitl, "New results in modelling derived from Bayesian filtering," Knowledge-Based Systems, vol. 23, no. 2, pp. 182-194, 2010.
[CrossRef] [Web of Science Times Cited 11] [SCOPUS Times Cited 11]


[45] F. Zhang, W.-F. Xue, X. Liu, "Overview of nonlinear Bayesian filtering algorithm," Procedia Engineering, vol.15, pp. 489-495, 2011.
[CrossRef] [Web of Science Record] [SCOPUS Times Cited 7]


[46] G. E. D'Errico, "A la Kalman filtering for metrology tool with application to coordinate measuring machines," IEEE Transactions on Industrial Electronics, vol. 59 , no. 11, pp. 4377-4382, 2012.
[CrossRef] [Web of Science Times Cited 2] [SCOPUS Times Cited 7]


[47] N. Mastrogiannis, B. Boutsinas, I. Giannikos, "A method for improving the accuracy of data mining classification algorithms," Computers & Operations Research, vol. 36, no. 10, pp. 2829-2839, 2009.
[CrossRef] [Web of Science Times Cited 10] [SCOPUS Times Cited 26]




References Weight

Web of Science® Citations for all references: 2,738 TCR
SCOPUS® Citations for all references: 13,440 TCR

Web of Science® Average Citations per reference: 57 ACR
SCOPUS® Average Citations per reference: 280 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
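Assuming TCR is simply the sum of per-reference citation counts and ACR is TCR divided by the number of references counted (the page's exact reference count for the division is not stated), the arithmetic is:

```python
def references_weight(citations):
    """TCR = total citations across references; ACR = TCR / reference count."""
    tcr = sum(citations)
    acr = tcr / len(citations)
    return tcr, acr

# Hypothetical counts for four references:
tcr, acr = references_weight([8202, 109, 23, 31])
# tcr == 8365, acr == 2091.25
```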

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ...

Citations for references updated on 2018-06-16 11:41 in 179 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2018
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



