


JCR Impact Factor: 0.699
JCR 5-Year IF: 0.674
Issues per year: 4
Current issue: Nov 2018
Next issue: Feb 2019
Avg review time: 83 days


Stefan cel Mare
University of Suceava
Faculty of Electrical Engineering and
Computer Science
13, Universitatii Street
Suceava - 720229

Print ISSN: 1582-7445
Online ISSN: 1844-7600
WorldCat: 643243560
doi: 10.4316/AECE





Clarivate Analytics published the InCites Journal Citations Report for 2017. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.699, and the JCR 5-Year Impact Factor is 0.674.

Thomson Reuters published the Journal Citations Report for 2016. The JCR Impact Factor of Advances in Electrical and Computer Engineering is 0.595, and the JCR 5-Year Impact Factor is 0.661.

With new technologies such as mobile communications, the Internet of Things, and widely used social media, organizations generate huge volumes of data, much faster than they did only a few years ago. Big data, characterized by high volume, diversity, and velocity, increasingly drives decision making and is changing the landscape of business intelligence, from governments to private organizations, from communities to individuals. Big data analytics, which discovers insights from evidence, places high demands on computing efficiency, knowledge discovery, problem solving, and event prediction. We dedicate a special section of Issue 4/2017 to Big Data. Prospective authors are asked to submit to this section no later than the 31st of May 2017, placing "BigData - " before the paper title in OpenConf.



  2/2008 - 12

Training Neural Networks Using Input Data Characteristics

CERNAZANU, C.
Author profiles: SCOPUS, IEEE Xplore, Web of Science

Download PDF (710 KB) | Citation | Downloads: 769 | Views: 3,473

Author keywords
neural networks, data mining, correlation-based feature subset selection method, data features extraction, training algorithm

References keywords
neural(8), networks(7), data(7), selection(6), learning(6), mining(5), machine(5), ijcnn(4), feature(4)

About this article
Date of Publication: 2008-06-02
Volume 8, Issue 2, Year 2008, On page(s): 65 - 70
ISSN: 1582-7445, e-ISSN: 1844-7600
Digital Object Identifier: 10.4316/AECE.2008.02012
Web of Science Accession Number: 000264815000012
SCOPUS ID: 77955635511

Quick view
Full text preview
Feature selection is often an essential data preprocessing step prior to applying a learning algorithm. The aim of this paper is to discover whether removing irrelevant and redundant information improves the results of neural network training. The present study describes a new method of training neural networks, namely, training neural networks using input data features. To select the features, we used a filtering technique (borrowed from data mining) that selects the best features from a training set. The technique consists of two components: a feature evaluation technique and a search algorithm for selecting the best features. When applied as a data preprocessing step for a common neural network training algorithm, the results obtained from this network compare favorably with those of a classical neural network training algorithm, while requiring less computation.
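The filtering step described above can be illustrated with a minimal sketch. Note this is a simplified stand-in, not the paper's method: the paper uses Hall's correlation-based feature subset selection (CFS) [14], which also penalizes redundancy between the selected features, whereas this toy filter only ranks features by their correlation with the target. The data and function names here are hypothetical.

```python
import numpy as np

def select_features_by_correlation(X, y, k):
    """Keep the k features most correlated (in absolute value) with the target.

    A simplified filter in the spirit of the paper's preprocessing step;
    Hall's CFS additionally penalizes correlation among the selected
    features themselves, which this sketch omits.
    """
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Toy data: columns 0 and 1 determine the label, columns 2-4 are noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

keep = select_features_by_correlation(X, y, k=2)
X_reduced = X[:, keep]  # the reduced matrix would then be fed to the network
```

On this toy data the filter recovers the two informative columns; the reduced input is what would be passed to the neural network trainer in place of the full feature set.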

References | Cited By

[1] Negnevitsky, M., "Artificial Intelligence: A Guide to Intelligent Systems", 2nd Edition, Addison Wesley, England, 2005.

[2] Luger, G., "Artificial Intelligence: Structures and Strategies for Complex Problem Solving", 5th Edition, Addison Wesley, 2005.

[3] Stergiou, C., Siganos, D., "Neural Networks", [Online] Available: Temporary on-line reference link removed - see the PDF document, 1996.

[4] Babii, S., Cretu, V., Petriu, E. M., "Performance Evaluation of Two Distributed BackPropagation Implementations", IJCNN 2007, pp. 1578-1583.

[5] Zhongwen, L., Hongzhi, L., Xincai, W., "Artificial neural network computation on graphic process unit", IJCNN 2005, pp. 622-626.

[6] Siddique, M. N. H., Tokhi, M. O., "Training neural networks: backpropagation vs. genetic algorithms", IJCNN 2001, pp. 2673-2678.

[7] Nguyen, D., Widrow, B., "Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights", IJCNN 1990, Volume 3, pp. 21-26.

[8] Gorea, D., "Dynamically Integrating Knowledge in Applications. An Online Scoring Engine Architecture", Advances in Electrical and Computer Engineering, Volume 8, 2008, pp. 44-49.
[CrossRef] [Full Text] [Web of Science Times Cited 1] [SCOPUS Times Cited 3]

[9] Langley, P., "Selection of relevant features in machine learning", Proceedings of the AAAI Fall Symposium on Relevance, AAAI Press, 1994.

[10] Jain, A., Zongker, D., "Feature selection: evaluation, application, and small sample performance", IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 19, 1997, pp. 153-158.

[11] Pudil, P., Novovicova, J., Kittler, J., "Floating search methods in feature selection", Pattern Recognition Letters, Volume 15, November 1994, pp. 1119-1125.
[CrossRef] [Web of Science Times Cited 1558] [SCOPUS Times Cited 1865]

[12] Kim, Y., Street, W. N., Menczer, F., Roussell, G. J., "Feature selection in data mining", in J. Wang (Ed.), Data Mining: Opportunities and Challenges, Idea Group Publishing, 2003, pp. 80-105.

[13] Gigli, G., Bosse, I., Lampropoulos, G. A., "An optimized architecture for classification combining data fusion and data mining", Information Fusion, Volume 8, 2007, pp. 366-378.
[CrossRef] [Web of Science Times Cited 6] [SCOPUS Times Cited 9]

[14] Hall, M., "Correlation-based Feature Selection for Machine Learning", Ph.D. dissertation, Department of Computer Science, Waikato University, Hamilton, NZ, 1998.

[15] Boyan, J., Moore, A., "Learning evaluation functions to improve optimization by local search", Journal of Machine Learning Research, Volume 1, 2000, pp. 77-112.

[16] Weka 3, "Data Mining Software in Java", The University of Waikato, [Online] Available: Temporary on-line reference link removed - see the PDF document, 2008.

[17] Witten, I. H., Frank, E., "Data Mining: Practical Machine Learning Tools and Techniques", 2nd Edition, Morgan Kaufmann, 2005.

[18] NIST Handprinted Forms and Characters Database, [Online] Available: Temporary on-line reference link removed - see the PDF document.

[19] Performing attribute selection, 2008.

[20] Image Segmentation Data, Vision Group, University of Massachusetts, November 1990.

References Weight

Web of Science® Citations for all references: 1,565 TCR
SCOPUS® Citations for all references: 1,877 TCR

Web of Science® Average Citations per reference: 78 ACR
SCOPUS® Average Citations per reference: 94 ACR

TCR = Total Citations for References / ACR = Average Citations per Reference
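The ACR figures above follow directly from dividing each TCR by the number of references; a quick arithmetic check, assuming the 20 entries listed in the References section:

```python
# TCR (Total Citations for References) and ACR (Average Citations per
# Reference) as defined above; this paper's reference list has 20 entries.
wos_tcr, scopus_tcr, n_refs = 1565, 1877, 20

wos_acr = round(wos_tcr / n_refs)        # 78.25 rounds to 78
scopus_acr = round(scopus_tcr / n_refs)  # 93.85 rounds to 94
print(wos_acr, scopus_acr)
```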

In 2010 we introduced, for the first time in scientific publishing, the term "References Weight" as a quantitative indication of the quality ...

Citations for references updated on 2019-02-15 15:57 in 25 seconds.

Note1: Web of Science® is a registered trademark of Clarivate Analytics.
Note2: SCOPUS® is a registered trademark of Elsevier B.V.
Disclaimer: All queries to the respective databases were made by using the DOI record of every reference (where available). Due to technical problems beyond our control, the information is not always accurate. Please use the CrossRef link to visit the respective publisher site.

Copyright ©2001-2019
Faculty of Electrical Engineering and Computer Science
Stefan cel Mare University of Suceava, Romania

All rights reserved: Advances in Electrical and Computer Engineering is a registered trademark of the Stefan cel Mare University of Suceava. No part of this publication may be reproduced, stored in a retrieval system, photocopied, recorded or archived, without the written permission from the Editor. When authors submit their papers for publication, they agree that the copyright for their article be transferred to the Faculty of Electrical Engineering and Computer Science, Stefan cel Mare University of Suceava, Romania, if and only if the articles are accepted for publication. The copyright covers the exclusive rights to reproduce and distribute the article, including reprints and translations.

Permission for other use: The copyright owner's consent does not extend to copying for general distribution, for promotion, for creating new works, or for resale. Specific written permission must be obtained from the Editor for such copying. Direct linking to files hosted on this website is strictly prohibited.

Disclaimer: Whilst every effort is made by the publishers and editorial board to see that no inaccurate or misleading data, opinions or statements appear in this journal, they wish to make it clear that all information and opinions formulated in the articles, as well as linguistic accuracy, are the sole responsibility of the author.
