

FACTS & FIGURES

JCR Impact Factor: 1.102
JCR 5-Year IF: 0.734
Issues per year: 4
Current issue: Nov 2020
Next issue: Feb 2021
Avg review time: 58 days


PUBLISHER

Stefan cel Mare
University of Suceava
Faculty of Electrical Engineering and
Computer Science
13, Universitatii Street
Suceava - 720229
ROMANIA

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
doi: 10.4316/AECE


TRAFFIC STATS

2,744,506 unique visits
696,356 downloads
Since November 1, 2009









MOST RECENT ISSUES

Volume 20 (2020)
    » Issue 4 / 2020
    » Issue 3 / 2020
    » Issue 2 / 2020
    » Issue 1 / 2020

Volume 19 (2019)
    » Issue 4 / 2019
    » Issue 3 / 2019
    » Issue 2 / 2019
    » Issue 1 / 2019

Volume 18 (2018)
    » Issue 4 / 2018
    » Issue 3 / 2018
    » Issue 2 / 2018
    » Issue 1 / 2018

Volume 17 (2017)
    » Issue 4 / 2017
    » Issue 3 / 2017
    » Issue 2 / 2017
    » Issue 1 / 2017

Volume 16 (2016)
    » Issue 4 / 2016
    » Issue 3 / 2016
    » Issue 2 / 2016
    » Issue 1 / 2016

View all issues








LATEST NEWS

2020-Jun-29
Clarivate Analytics published the InCites Journal Citation Reports for 2019. The InCites JCR Impact Factor of Advances in Electrical and Computer Engineering is 1.102 (1.023 without journal self-cites), and the InCites JCR 5-Year Impact Factor is 0.734.

2020-Jun-11
Starting on the 15th of June 2020 we will introduce a new policy for reviewers. Reviewers who provide timely and substantial comments will receive a discount voucher entitling them to an APC reduction. Vouchers (worth 25 EUR or 50 EUR, depending on the review quality) will be assigned to reviewers after the final decision on the reviewed paper is given. Vouchers issued to specific individuals are not transferable.

2019-Dec-16
Starting on the 15th of December 2019 all paper authors are required to enter their SCOPUS IDs. You may use the free SCOPUS ID lookup form to find yours in case you don't remember it.

2019-Jun-20
Clarivate Analytics published the InCites Journal Citation Reports for 2018. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.650, and the JCR 5-Year Impact Factor is 0.639.

2018-May-31
Starting today, the minimum number of pages for a paper is 8, so all submitted papers should have 8, 10, or 12 pages. No exceptions will be accepted.

Read More »


    
 

  4/2013 - 10

 HIGHLY CITED PAPER 

A Hybrid Method for Fast Finding the Reduct with the Best Classification Accuracy

HACIBEYOGLU, M., ARSLAN, A., KAHRAMANLI, S.
 

Download PDF (653 KB) | Citation | Downloads: 408 | Views: 2,701

Author keywords
artificial intelligence, classification algorithms, decision trees, discernibility function, feature selection

References keywords
rough(13), data(13), systems(11), knowledge(11), information(11), rule(10), learning(10), induction(10), approach(8), classification(7)
No common words between the references section and the paper title.
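
The keyword counts and the title-overlap check above are produced by the site's automated reference analysis. As a hedged sketch only (an assumption about how such an analysis might be done, not the journal's actual implementation), the snippet below counts word frequencies across reference titles and intersects them with the paper title; the two listed reference titles are taken from the reference list below, the rest are elided.

```python
# Hedged sketch: count word frequencies in reference titles and compare with the
# paper title. Not the journal's implementation; stopword list is an assumption.
import re
from collections import Counter

paper_title = ("A Hybrid Method for Fast Finding the Reduct "
               "with the Best Classification Accuracy")
reference_titles = [
    "Induction of decision trees",
    "Rough set based feature selection: A review",
    # ... the remaining reference titles would be added here
]
stopwords = {"a", "an", "and", "the", "of", "for", "with", "in", "on", "to", "via"}

def words(text):
    """Lowercase word tokens with common stopwords removed."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stopwords]

freq = Counter(w for title in reference_titles for w in words(title))
print(freq.most_common(10))                 # page shows e.g. rough(13), data(13), ...
print(set(words(paper_title)) & set(freq))  # empty set -> "no common words"
```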

About this article
Date of Publication: 2013-11-30
Volume 13, Issue 4, Year 2013, On page(s): 57 - 64
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2013.04010
Web of Science Accession Number: 000331461300010
SCOPUS ID: 84890203115

Abstract
Usually a dataset has many reducts, and finding all of them is known to be an NP-hard problem. On the other hand, different reducts of a dataset may provide different classification accuracies. Usually, for every dataset there is only one reduct with the best classification accuracy. To obtain this best one, we first obtain the group of attributes that are dominant for the given dataset by using the decision tree algorithm. Secondly, we complete this group up to reducts by using discernibility function techniques. Finally, we select the one reduct with the best classification accuracy by using data mining classification algorithms. The experimental results for the datasets indicate that classification accuracy is improved by removing the irrelevant features and using the simplified attribute set derived from the proposed method.
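
The abstract describes a three-stage pipeline. Below is a minimal, hedged sketch of that idea, not the authors' code: the dataset and classifier are illustrative assumptions (scikit-learn's breast-cancer data and a decision tree), and stage two replaces the paper's discernibility-function completion with a much simpler one-attribute extension, purely to keep the example self-contained.

```python
# Hedged sketch of the three-stage idea from the abstract (not the authors' method):
# (1) take the attributes a fitted decision tree actually uses ("dominant" attributes),
# (2) grow that set into candidate attribute subsets (a simplified stand-in for the
#     discernibility-function completion used in the paper),
# (3) keep the candidate subset with the best cross-validated classification accuracy.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Stage 1: dominant attributes = those with nonzero importance in a fitted tree.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
dominant = [i for i, imp in enumerate(tree.feature_importances_) if imp > 0]

# Stage 2 (simplified): candidates are the dominant set plus each single remaining attribute.
remaining = [i for i in range(X.shape[1]) if i not in dominant]
candidates = [dominant] + [dominant + [j] for j in remaining]

# Stage 3: select the candidate with the best cross-validated accuracy.
def accuracy(cols):
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, cols], y, cv=5).mean()

best = max(candidates, key=accuracy)
print("selected attributes:", best)
print("cross-validated accuracy: %.3f" % accuracy(best))
```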


References | Cited By

[1] J. R. Quinlan, "Induction of decision trees," Machine Learning, vol. 1, no. 1, pp. 81-106, 1986.
[CrossRef] [SCOPUS Times Cited 10631]


[2] J. R. Quinlan, R. M. Cameron-Jones, "Induction of logic programs: FOIL and related systems," New Generation Computing, vol. 13, no. 3-4, pp. 287-312, 1995.
[CrossRef] [Web of Science Times Cited 82] [SCOPUS Times Cited 118]


[3] Y. Kusunoki, M. Inuiguchi, J. Stefanowski, "Rule induction via clustering decision classes," International Journal of Innovative Computing Information and Control, vol. 4, no. 10, pp. 2663-2677, 2008.

[4] J. A. Jakubczyc, "The ant colony algorithms for rule induction," Proceedings of AIML 05 Conference, CICC, Cairo, Egypt, pp. 112-117, 19-21 December 2005.

[5] J. W. Grzymala-Busse, Chien Pei B., "Classification methods in rule induction," Proceedings of the Fifth Intelligent Information Systems Workshop, Deblin, Poland, pp. 120-126, June 2-5 1996.

[6] R. S. Michalski, "On the Quasi-Minimal solution of the covering problem," Proceedings of the Fifth International Symposium on Information Processing, Bled, Yugoslavia, A3 (Switching Circuits), pp. 125-128, 8-11 October 1969.

[7] R. S. Michalski, I. Mozetic, J. Hong, N. Lavrac, "The Multi-Purpose Incremental Learning System AQ15 and its Testing Application to Three Medical Domains," Proc. AAAI, pp. 1041-1047, 1986.

[8] W. Cohen, "Fast effective rule induction," Proceedings of the 12th International Conference on Machine Learning, p.115-123, 1995.

[9] S. J. Hong, "R-MINI: An iterative approach for generating minimal rules from examples," IEEE Transactions on Knowledge and Data Engineering, vol. 9, no. 5, pp. 709-717, 1997.
[CrossRef] [SCOPUS Times Cited 25]


[10] M. Muselli, D. Liberati, "Binary rule generation via hamming clustering," IEEE Transactions on Knowledge and Data Engineering, vol. 14, no. 6, pp. 1258-1268, 2002.
[CrossRef] [Web of Science Times Cited 33] [SCOPUS Times Cited 37]


[11] N. Cercone, A. An, C. Chan, "Rule-induction and case-based reasoning: Hybrid architectures appear advantageous," IEEE Transactions on Knowledge and Data Engineering, vol. 11, no. 1, pp. 166-174, 1999.
[CrossRef] [Web of Science Times Cited 39] [SCOPUS Times Cited 55]


[12] P. Smyth, R. M. Goodman, "An Information Theoretic Approach to Rule Induction from Databases," IEEE Transactions on Knowledge and Data Engineering, vol. 4, no. 4, pp. 301-316, 1992.
[CrossRef] [Web of Science Times Cited 171] [SCOPUS Times Cited 224]


[13] D. S. Zhang, L. Zhou, "Discovering Golden Nuggets: Data Mining in Financial Application," IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 34, no. 4, pp. 513-522, 2004.
[CrossRef] [Web of Science Times Cited 91] [SCOPUS Times Cited 140]


[14] R. Kohavi, J. R. Quinlan, "Decision-tree discovery," Handbook of Data Mining and Knowledge Discovery, pp. 267-276, 2002.

[15] L. Breiman, J. H. Friedman, R. A. Olshen, C. J. Stone, "Classification and Regression Trees," Wadsworth, CA, 1984.

[16] D. D. Patil, V. M. Wadhai, J. A. Gokhale, "Evaluation of Decision Tree Pruning Algorithms for Complexity and Classification Accuracy," International Journal of Computer Applications, vol. 11, no.2, pp. 23-30, 2010.

[17] J. Komorowski, L. Polkowski, A. Skowron, "Rough Sets: A Tutorial," [Online] Available: Temporary on-line reference link removed - see the PDF document

[18] A. E. Hassanien, J. M. H. Ali, "Rough set approach for generation of classification rules of breast cancer data," Informatica, vol. 15, no. 1, pp. 23-38, 2004.

[19] J. Y. Guo, V. Chankong, "Rough set-based approach to rule generation and rule induction," International Journal of General Systems, vol. 31, no. 6, pp. 601-617, 2002.
[CrossRef] [Web of Science Times Cited 11] [SCOPUS Times Cited 11]


[20] J. F. Liu, Q. H. Hu, D. R. Yu, "A weighted rough set based method developed for class imbalance learning," Information Sciences, vol. 178, no. 4, pp. 1235-1256, 2008.
[CrossRef] [Web of Science Times Cited 52] [SCOPUS Times Cited 64]


[21] E. Xu, S.C. Tong, L.S. Shao, Y. Li, D. Jiao, "Rough Set Research on Rule Extraction in Information Table," Proceedings of Fourth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD), pp. 208-212, 2007.
[CrossRef] [SCOPUS Times Cited 1]


[22] M. A. Hall, G. Holmes, "Benchmarking attribute selection techniques for discrete class data mining," IEEE Transactions on Knowledge and Data Engineering, vol. 15, no. 6, pp. 1437-1447, 2003.
[CrossRef] [Web of Science Times Cited 633] [SCOPUS Times Cited 833]


[23] H. Liu, L. Yu, "Toward integrating feature selection algorithms for classification and clustering," IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 4, pp. 491-502, 2005.
[CrossRef] [Web of Science Times Cited 1445] [SCOPUS Times Cited 1862]


[24] A. Hassanien, "Fuzzy rough sets hybrid scheme for breast cancer detection," Image and Vision Computing, vol. 25, no. 2, pp. 172-183, 2007.
[CrossRef] [Web of Science Times Cited 79] [SCOPUS Times Cited 109]


[25] J. Y. Wang, J. Zhou, "Research of reduct features in the variable precision rough set model," Neurocomputing, vol. 72, no. 10-12, pp. 2643-2648, 2009.
[CrossRef] [Web of Science Times Cited 38] [SCOPUS Times Cited 71]


[26] J. Wroblewski, "Ensembles of Classifiers Based on Approximate Reducts," Fundamenta Informaticae, vol. 47, no. 3-4, pp. 351-360, 2001.

[27] R. Jensen, Q. Shen, "Semantics-preserving dimensionality reduction: Rough and fuzzy-rough based approaches," IEEE Transactions on Knowledge and Data Engineering, vol. 16, no. 12, pp. 1457-1471, 2004.
[CrossRef] [Web of Science Times Cited 385] [SCOPUS Times Cited 516]


[28] Z. Pawlak, "Rough Sets: Theoretical aspects of reasoning about data," Kluwer Academic Publishers, Boston, pp. 1-252, 1991.
[CrossRef]


[29] J. R. Quinlan, "Learning efficient classification procedures and their application to chess end games," Machine Learning: An Artificial Intelligence Approach, vol. 1, pp.463-482, 1983.

[30] J. R. Quinlan, "C4.5: Programs for Machine Learning," Morgan Kaufmann Publishers, 1993.

[31] P. Clark, T. Niblett, "The CN2 induction algorithm," Machine Learning, vol. 3, no.4, pp. 261-283, 1989.
[CrossRef] [SCOPUS Times Cited 1463]


[32] L. H. Wang, G. F. Wu, "Attribute Reduction and Information Granularity," 6th World Multiconference on Systemics, Cybernetics and Informatics, Vol. I, Proceedings - Information Systems Development I, pp. 32-37, 2003.

[33] R. Jensen, Q. Shen, "Rough set based feature selection: A review," 52 pages, 2007. [Online] Available: Temporary on-line reference link removed - see the PDF document

[34] R. W. Swiniarski, A.Skowron, "Rough set methods in feature selection and recognition," Pattern Recognition Letters, vol. 24, no. 6, pp. 833-849, 2003.
[CrossRef] [Web of Science Times Cited 541] [SCOPUS Times Cited 672]


[35] J. Wei, S. Wang, M. Wang, J. You, D. Liu, "Rough set based approach for inducing decision trees," Knowledge-Based Systems, vol. 20, no. 8, pp. 695-702, 2007.
[CrossRef] [Web of Science Times Cited 19] [SCOPUS Times Cited 27]


[36] A. Skowron, C. Rauszer, "The discernibility matrices and functions in information systems," Fundamenta Informaticae, vol. 15, no. 2, pp. 331-362, 1992.

[37] S. Kahramanli, M. Hacibeyoglu, A. Arslan, "A Boolean Function Approach to Feature Selection in Consistent Decision Information Systems," Expert Systems with Application, vol. 38, no. 7, pp. 8229-8239, 2011.
[CrossRef] [Web of Science Times Cited 8] [SCOPUS Times Cited 10]


[38] S. Kahramanli, M. Hacibeyoglu, A. Arslan, "Attribute Reduction by Partitioning the Minimized Discernibility Function," International Journal of Innovative Computing Information and Control, vol. 7, no. 5A, pp. 2167-2186, 2011.

[39] [Online] Available: Temporary on-line reference link removed - see the PDF document

[40] [Online] Available: Temporary on-line reference link removed - see the PDF document

[41] G. Shakhnarovich, T. Darrell, P. Indyk, "Nearest-Neighbor Methods in Learning and Vision," MIT Press, 2005.

[42] T. Mitchell, "Machine Learning," McGraw-Hill, 1997.

[43] P. Minvielle, A. Doucet, A. Marrs, S. Maskell, "A Bayesian approach to joint tracking and identification of geometric shapes in video sequences", Image and Vision Computing, vol.28, no.1, pp.111-123, 2010.
[CrossRef] [Web of Science Times Cited 12] [SCOPUS Times Cited 17]


[44] C. Pozna, R.-E. Precup, J. K. Tar, I. Skrjanc, S. Preitl, "New results in modelling derived from Bayesian filtering," Knowledge-Based Systems, vol. 23, no. 2, pp. 182-194, 2010.
[CrossRef] [Web of Science Times Cited 48] [SCOPUS Times Cited 54]


[45] F. Zhang, W.-F. Xue, X. Liu, "Overview of nonlinear Bayesian filtering algorithm," Procedia Engineering, vol.15, pp. 489-495, 2011.
[CrossRef] [Web of Science Record] [SCOPUS Times Cited 14]


[46] G. E. D'Errico, "A la Kalman filtering for metrology tool with application to coordinate measuring machines," IEEE Transactions on Industrial Electronics, vol. 59 , no. 11, pp. 4377-4382, 2012.
[CrossRef] [Web of Science Times Cited 5] [SCOPUS Times Cited 11]


[47] N. Mastrogiannis, B. Boutsinas, I. Giannikos, "A method for improving the accuracy of data mining classification algorithms," Computers & Operations Research, vol. 36, no. 10, pp. 2829-2839, 2009.
[CrossRef] [Web of Science Times Cited 19] [SCOPUS Times Cited 33]




References Weight

Web of Science® Citations for all references: 3,711 TCR
SCOPUS® Citations for all references: 16,998 TCR

Web of Science® Average Citations per reference: 77 ACR
SCOPUS® Average Citations per reference: 354 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
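
As a hedged illustration of the relationship defined above (ACR = TCR divided by the number of references counted), the snippet below reproduces the displayed averages. The denominator of 48 is an assumption inferred from the figures shown on this page; the paper itself lists 47 reference entries, and the exact count the site uses is not stated.

```python
# Hedged sketch: ACR = TCR / number of references counted.
# The denominator (48) is an assumption inferred from the displayed totals.
def acr(tcr: int, n_refs: int) -> int:
    """Average Citations per Reference, rounded as displayed on the page."""
    return round(tcr / n_refs)

print(acr(3711, 48))    # -> 77  (Web of Science)
print(acr(16998, 48))   # -> 354 (SCOPUS)
```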

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ... Read more

Citations for references updated on 2021-01-22 12:34 in 161 seconds.




Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2021
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania


All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.



