An Effective Turkey Marble Classification System: Convolutional Neural Network with Genetic Algorithm -Wavelet Kernel - Extreme Learning Machine

Derya Avci, Eser Sert

Vocational School of Technical Sciences, Firat University, Elazig 23119, Turkey

Department of Computer Engineering, Malatya Turgut Özal University, Malatya 44210, Turkey

Corresponding Author Email: davci@firat.edu.tr

Pages: 1229-1235 | DOI: https://doi.org/10.18280/ts.380434

Received: 27 April 2021 | Revised: 5 July 2021 | Accepted: 25 July 2021 | Available online: 31 August 2021

© 2021 IIETA. This article is published by IIETA and is licensed under the CC BY 4.0 license (http://creativecommons.org/licenses/by/4.0/).

Abstract: 

Marble is one of the most popular decorative elements. Marble quality and class depend mainly on vein pattern and color, the two most important factors in its evaluation. Manual classification of marble is prone to various mistakes caused by optical illusions, whereas computer vision minimizes these mistakes thanks to artificial intelligence and machine learning. The present study proposes a Convolutional Neural Network (CNN) with Genetic Algorithm (GA) optimized Wavelet Kernel (WK) Extreme Learning Machine (ELM) approach, abbreviated CNN–GA-WK-ELM. Using CNN architectures, namely AlexNet, VGG-19, SqueezeNet, and ResNet-50, the proposed approach obtained four different feature vectors from images of 10 different marble types. Later, the Genetic Algorithm (GA) was used to optimize the adjustable parameters of the Wavelet Kernel Extreme Learning Machine (WK-ELM), i.e., k, l, and m, and the hidden layer neuron number, in order to increase the performance of ELM. Finally, the four feature vectors were classified using the WK-ELM classifier with the optimized parameters. The proposed CNN–GA-WK-ELM yielded accuracy rates of 98.20%, 96.40%, 96.20%, and 95.60% using AlexNet, SqueezeNet, VGG-19, and ResNet-50, respectively.

Keywords: 

CNN, genetic algorithm, wavelet kernel-extreme learning machine, marble classification

1. Introduction

Marble is formed when limestone is exposed to heat and intense pressure during metamorphic processes. It is usually composed of the mineral calcite (CaCO3) and contains other minerals such as clay minerals, mica, quartz, pyrite, iron oxide, and graphite. It is often white or grayish in color. However, due to impurities, it may also occur in different colors such as yellow, pink, red, blue, and black [1].

Marble is a decorative element preferred in several environments. It is widely used in interior and exterior wall sidings because of its aesthetic and durable structure. Marble quality varies depending on vein pattern and color, the two most important factors affecting its class [2]. Marble is usually classified manually by factory workers, which results in several classification mistakes due to the eye strain that workers suffer from during longer shifts. Quality differences among marble classes lead to visible differences in marbled environments. Therefore, customers may sometimes return exported marble due to visible differences in quality, which influences export figures negatively.

Barbon et al. [3] performed segmentation, normalization, and contrast adjustment for three different marble types and applied this pre-processing to the marble images. Later, k-Nearest Neighbor (k-NN) was used to classify the three marble types. Martínez-Alajarín et al. [4] analyzed marble texture using four different color spaces, i.e., RGB, XYZ, YIQ, and K-L, and sum and difference histograms. Following the texture analysis with sum and difference histograms, feature extraction was performed through principal component analysis. Finally, a multi-layer perceptron neural network trained with a backpropagation algorithm with an adaptive learning rate was used to classify marble plates into three different categories, which yielded an accuracy rate of 98.9%.

Torun et al. [5] classified 3 different marble types using the AlexNet model and used LBP to obtain texture information. LBP was used to create color and pattern features, while a Support Vector Machine (SVM) classifier was used for classification. On test data excluded from the training data, AlexNet and LBP-SVM achieved accuracy rates of 99.21% and 99.8%, respectively.

Pençe and Çeşmeli [6] classified marble images as first- and second-quality marble using a deep learning method. Once the components of the fully-connected (FC) layers in the related deep network were determined, the FC and softmax outputs were obtained for the marble images. The Sgdm, Rmsprop, and Adam training algorithms were used to test the classification performance. Test performances after 500 iterations were calculated as 66.25%, 62.50%, and 75% for Sgdm, Rmsprop, and Adam, respectively.

Ather et al. [7] performed the automatic classification of natural stones using pre-trained Convolutional Neural Network (CNN) architectures, namely AlexNet and VGGNet, and compared the performance of their fine-tuned versions. Kaynar et al. [8] classified natural stones into 6 different classes. First, the Gray-Level Co-occurrence Matrix (GLCM) and Local Binary Pattern (LBP) were used to obtain tile pattern information. Later, Artificial Neural Network (ANN), SVM, and Naive Bayes classifiers were used to compare performances.

Ferreira and Giraldi [9] used a CNN architecture for the classification of granite tiles. During the pre-processing stage, color information was removed and the images were converted to gray level in order to describe textures. Later, the softmax layer of the CNN architecture was used to classify granites with a performance rate of 86%. Sudakov et al. [10] analyzed the use of machine learning techniques for permeability prediction and benefited from different approaches such as CNN, Deep Neural Networks (DNN), Minkowski functionals, and Gradient Boosting.

Wang et al. [11] applied Super Resolution (SR) method to 3D rock images in order to increase image quality. 64 filters were used to capture different features in each 3D-CNN. Thus, low-resolution images passed through the layers and were converted to high-resolution images. Peak Signal to Noise Ratio (PSNR) and Structural Similarity Index Measure (SSIM) values of the obtained images were calculated and compared with different SR methods.

Shu et al. [12] used a pre-trained CNN model for feature extraction from rock images. Features obtained using CNN were classified using a SVM classifier, which displayed a performance rate of 89%. Młynarczuk et al. [13] classified nine different rock samples automatically using k-NN in four different color spaces, namely RGB, CIELab, YIQ, and HS. Rock samples were classified with an accuracy rate of 99.8%.

The main aspects and innovations of the approach proposed in the present study are summarized below:

1) A database was created for 10 of the most widely used marble types in Turkey, and an effective approach was proposed for the classification of the marbles in this database.

2) A Convolutional Neural Network (CNN) with Genetic Algorithm (GA) optimized Wavelet Kernel (WK) Extreme Learning Machine (ELM), i.e., CNN–GA-WK-ELM, was proposed. This approach has not appeared in the existing literature before and was used for marble classification for the first time.

3) AlexNet, VGG-19, SqueezeNet, and ResNet-50 were used to obtain four different feature vectors from images of 10 different marble types.

4) GA was used to optimize the adjustable parameters of WK-ELM, i.e., k, l, and m, and its hidden layer neuron number, thus creating a more efficient ELM.

5) AlexNet, SqueezeNet, VGG-19, and ResNet-50 architectures were used in CNN-GA-WK-ELM, which was proposed for marble classification, and these approaches were compared with each other experimentally.

The organization of the present study is as follows: Section 2 presents the database used in the present study. Section 3 describes CNN, the Genetic Algorithm, the Wavelet Kernel-Extreme Learning Machine, and the proposed approach. Sections 4 and 5 present the metrics and the experimental findings, respectively. Section 6 concludes the study.

2. Database

The Turkey Marble Database (TMD) was created in the present study and covers the 10 most widely used marble types in Turkey:

1.  Afyon Violet

2.  Barred Marmara

3.  Burdur Coffee

4.  Bursa Dark Beige

5.  Diana Rose

6.  Ege Coffee

7.  Elazig Cherry

8.  Karacabey Black

9.  Toros Black

10. Tundra Gray

50 marble images were captured for each marble type in TMD, creating a total of 500 images. Sample images for each marble class in TMD are shown in Figure 1. In the present study, 70% of the 500 marble images in the TMD database were used for the training process, while the remaining 30% were used for the testing process.

Figure 1. Sample images for each marble class in TMD
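As a minimal sketch of this 70/30 split (assuming the TMD images are stored in one folder per marble class, a hypothetical layout not stated in the paper):

```python
import random
from pathlib import Path

# Hypothetical layout: TMD/<class_name>/*.jpg with 50 images per class.
random.seed(0)
train_set, test_set = [], []
for class_dir in sorted(Path("TMD").iterdir()):
    images = sorted(class_dir.glob("*.jpg"))
    random.shuffle(images)
    split = int(0.7 * len(images))                                   # 35 images per class for training
    train_set += [(img, class_dir.name) for img in images[:split]]
    test_set += [(img, class_dir.name) for img in images[split:]]   # 15 images per class for testing

print(len(train_set), len(test_set))                                 # 350 and 150 for the full TMD
```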

3. Materials and Methods

3.1 Convolutional neural network

CNN is an important machine learning approach used in image processing and classification. It is widely used for classification and segmentation in various fields such as geology, health, and safety. The design of the CNN architecture was inspired by the human mind [14], which can perceive the world visually and draw various conclusions from what it sees. A CNN architecture includes Convolutional layers (CONV), Activation layers (ACT), Pool layers (POOL), Fully-connected layers (FC), and Classification layers (CLASS). Some of these layers are repeated in order to increase CNN performance [15, 16]. The present study benefited from AlexNet, VGG-19, SqueezeNet, and ResNet-50, which are among the most widely used CNN architectures, for feature extraction.
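The paper does not give the extraction code; the sketch below shows one plausible way to obtain a 1000-dimensional feature vector per image from each pretrained network using torchvision. The preprocessing pipeline and the use of the final 1000-way output layer are assumptions consistent with the 500×1000 feature size reported later.

```python
import torch
from torchvision import models, transforms
from PIL import Image

# Standard ImageNet preprocessing (assumed; not specified in the paper).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Pretrained architectures used in the paper; the 1000-dim output layer
# is taken here as the feature vector, matching the 500x1000 feature size.
nets = {
    "AlexNet": models.alexnet(weights="DEFAULT"),
    "VGG-19": models.vgg19(weights="DEFAULT"),
    "SqueezeNet": models.squeezenet1_0(weights="DEFAULT"),
    "ResNet-50": models.resnet50(weights="DEFAULT"),
}

def extract_features(net, image_path):
    """Return the 1000-dimensional output of a pretrained net for one image."""
    net.eval()
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return net(x).squeeze(0).numpy()
```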

3.2 Genetic algorithm

GA is an optimization method inspired by the evolutionary process in nature. It offers an effective way to adjust various parameters in optimization and machine learning applications [17]. It is widely preferred in different disciplines such as medicine and engineering, as it provides several candidate solutions to difficult problems rather than a single one. Thus, numerous points in the search space are evaluated, and the likelihood of finding an optimal solution increases [18]. Different sequences (chromosomes) representing candidate solutions to the problem form the population of a GA, whose individuals are generated randomly; influential individuals in this population are then selected through an evolutionary process [19]. The algorithm therefore starts from a random population with n chromosomes.
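The paper reports using GA but not its operator settings; the following is a generic binary GA sketch (tournament selection, one-point crossover, bit-flip mutation) with illustrative population size and rates, not the authors' exact configuration.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=50,
                      crossover_rate=0.8, mutation_rate=0.02):
    """Minimal binary GA: tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        # Tournament selection of parents
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(pop_size)]
        children = []
        for p1, p2 in zip(parents[::2], parents[1::2]):
            c1, c2 = p1[:], p2[:]
            if random.random() < crossover_rate:          # one-point crossover
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                        # bit-flip mutation
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        child[i] = 1 - child[i]
                children.append(child)
        pop = children
        best = max(pop + [best], key=fitness)             # keep the best individual found so far
    return best
```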

3.3 Wavelet kernel-extreme learning machine

Figure 2. The architecture of SHLFF with ELM

ELM is a method proposed for training Single Hidden Layer FeedForward (SHLFF) neural networks [20-22]. It can be considered an algorithm that offers a very high learning speed. Sigmoid, sine, Gaussian, and hard-limit activation functions are used in the hidden layer of ELM [23], while a linear function is used in the output layer [24]. In ELM, the input weights and threshold values of the SHLFF network are not tuned during training; they are randomly assigned, and the output weight values are calculated analytically. Therefore, ELM displays a higher and quicker performance compared to conventional methods. The structure of an SHLFF network trained with ELM is shown in Figure 2. Here, x1, x2, ..., xr represent the inputs; l1m, l2m, ..., lrm represent the weight vectors connecting the input neurons to the hidden neurons; N1, N2, ..., Nm are the hidden layer neurons; W represents the weight function; and Oj is the output value.

For a training dataset $\left\{\left(p_{j}, m_{j}\right) \mid p_{j} \in Q^{l}, m_{j} \in Q^{k}, j=1, \ldots, M\right\}$, the output function of an ELM with K hidden neurons is given in Eq. (1) [25].

$u_{K}(p)=\sum_{j=1}^{K} N_{j} h_{j}(p)=h(p) N$    (1)

$h(p)=\left[h_{1}(p), h_{2}(p), \ldots, h_{K}(p)\right]$ is the output vector of the hidden layer for the input p; the h vector maps the input domain to the ELM feature domain. N represents the weight vector between the target (output) neurons and the hidden layer neurons, $N=\left[N_{1}, N_{2}, \ldots, N_{K}\right]$ [26]. Various methods have been proposed to optimize ELM. The weights can be minimized in order to reduce the training and output errors, which improves the performance of the neural network.

Minimize: $\|T N-W\|,\|N\|$     (2)

$N=T^{T}\left(\frac{1}{R}+T T^{T}\right)^{-1} W$    (3)

T is the target matrix of the hidden layer. W and R represent the expected target matrix and regression coefficient, respectively. The output function of the ELM learning algorithm is given in Eq. (4).

$u(p)=h(p) T^{T}\left(\frac{1}{R}+T T^{T}\right)^{-1} W$    (4)

If the feature vector h(p) is unknown, the kernel matrix of ELM can be defined using Mercer’s conditions [27].

$K=T T^{T}: k_{j z}=h\left(p_{j}\right) h\left(p_{z}\right)=b\left(p_{j}, p_{z}\right)$    (5)

WK based on ELM function is defined in Eq. (6).

$u(p)=\left[b\left(p, p_{1}\right), \ldots ., b\left(p, p_{M}\right)\right]\left(\frac{1}{R}+K\right)^{-1} W$    (6)

$K=TT^{T}$ represents the kernel matrix, and $b(p, g)$ represents the kernel function of ELM. The wavelet kernel function given in Eq. (7) is used in WK-ELM [28, 29]:

$b(p, g)=\cos \left(k \frac{\|p-g\|}{l}\right) \exp \left(-\frac{\|p-g\|^{2}}{m}\right)$    (7)

Here, k, l, and m are the adjustable parameters of WK, and they greatly influence the classification performance; therefore, it is vital to adjust them accurately. The hidden layer feature mapping and the hidden layer neuron number are not known in advance in WK-ELM. In the present study, the hidden layer neuron number of ELM and the k, l, and m parameters of WK in Eq. (7) were optimized in order to increase ELM performance.
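A minimal NumPy sketch of the kernel form of WK-ELM implied by Eqs. (4)-(7) is given below. In this kernel formulation the hidden layer is implicit, so only k, l, m, and the regularization coefficient R appear explicitly; class and variable names are illustrative, not the authors' code.

```python
import numpy as np

def wavelet_kernel(P, G, k, l, m):
    """Wavelet kernel of Eq. (7): cos(k*||p-g||/l) * exp(-||p-g||^2/m) for all row pairs."""
    sq = np.sum(P**2, axis=1)[:, None] + np.sum(G**2, axis=1)[None, :] - 2.0 * P @ G.T
    sq = np.maximum(sq, 0.0)                  # guard against tiny negative values
    dist = np.sqrt(sq)
    return np.cos(k * dist / l) * np.exp(-sq / m)

class WKELM:
    """Kernel ELM classifier with the wavelet kernel (Eqs. (4)-(7))."""
    def __init__(self, k=16, l=15, m=12, R=1.0):
        self.k, self.l, self.m, self.R = k, l, m, R

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        y = np.asarray(y)
        W = np.eye(y.max() + 1)[y]            # one-hot target matrix W
        K = wavelet_kernel(self.X, self.X, self.k, self.l, self.m)   # kernel matrix K = TT^T
        # Solve (I/R + K) beta = W, i.e. beta = (I/R + K)^(-1) W as in Eq. (6)
        self.beta = np.linalg.solve(np.eye(len(self.X)) / self.R + K, W)
        return self

    def predict(self, X_new):
        K_new = wavelet_kernel(np.asarray(X_new, dtype=float), self.X, self.k, self.l, self.m)
        return np.argmax(K_new @ self.beta, axis=1)
```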

3.4 The proposed method

Images of 10 different marble types, namely Afyon Violet, Barred Marmara, Burdur Coffee, Bursa Dark Beige, Diana Rose, Ege Coffee, Elazig Cherry, Karacabey Black, Toros Black, and Tundra Gray, were used in the present study. As explained in Section 2, 70% of the 500 marble images in the TMD database were used for the training process, while the remaining 30% were used for the testing process. The diagram of the proposed approach is shown in Figure 3.

Figure 3. The diagram of the proposed approach

CNN–GA-WK-ELM, which is the approach proposed in the present study, consists of 3 stages.

At the first stage, AlexNet, VGG-19, SqueezeNet, and ResNet-50 were used to obtain four different feature matrices with a size of 500×1000 (500 images, each represented by 1000 features) from the marble images.

At the second stage, GA was used to optimize the adjustable parameters k, l, and m and the hidden layer neuron number of WK-ELM, and thus to improve the performance of ELM. Each individual in the GA of the proposed approach consists of 20 bits, and the first population is generated randomly. The first group of 4 bits (1st to 4th bits) represents the k parameter of WK (values 1 to 16). The second group of 4 bits (5th to 8th bits) represents the l parameter (values 1 to 16). The third group of 4 bits (9th to 12th bits) represents the m parameter (values 1 to 16). The remaining 8 bits (13th to 20th bits) represent the hidden layer neuron number (values 5 to 259). The coding values for the k, l, m parameters of WK and for the hidden layer neuron number are given in Tables 1 and 2, respectively; a decoding sketch is given after Table 2.

Table 1. Coding values for k, l, m parameters of WK

Values of k, l, m parameters | Coding
1  | 0 0 0 0
2  | 0 0 0 1
3  | 0 0 1 0
4  | 0 0 1 1
5  | 0 1 0 0
6  | 0 1 0 1
7  | 0 1 1 0
8  | 0 1 1 1
9  | 1 0 0 0
10 | 1 0 0 1
11 | 1 0 1 0
12 | 1 0 1 1
13 | 1 1 0 0
14 | 1 1 0 1
15 | 1 1 1 0
16 | 1 1 1 1

Table 2. Coding values for hidden layer neuron number

Values of HNN parameter | Coding
5   | 0 0 0 0 0 0 0 0
6   | 0 0 0 0 0 0 0 1
7   | 0 0 0 0 0 0 1 0
8   | 0 0 0 0 0 0 1 1
... | ...
259 | 1 1 1 1 1 1 1 1
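As referenced above, a 20-bit individual can be decoded into the WK-ELM parameters following Tables 1 and 2. The helper below is an illustrative sketch; the 4-bit groups are offset by 1 and the 8-bit group by 5, matching the first rows of the tables.

```python
def decode_chromosome(bits):
    """Decode a 20-bit GA individual into (k, l, m, hidden neuron number) per Tables 1 and 2.

    bits[0:4]   -> k in 1..16
    bits[4:8]   -> l in 1..16
    bits[8:12]  -> m in 1..16
    bits[12:20] -> hidden layer neuron number, offset by 5 (Table 2 starts at 5;
                   its last row is listed as 259)
    """
    def to_int(group):
        return int("".join(str(b) for b in group), 2)

    k = to_int(bits[0:4]) + 1
    l = to_int(bits[4:8]) + 1
    m = to_int(bits[8:12]) + 1
    hidden = to_int(bits[12:20]) + 5
    return k, l, m, hidden

# Illustrative fitness for the GA sketch of Section 3.2: train WK-ELM with the decoded
# parameters and return validation accuracy (X_train, y_train, X_val, y_val assumed given).
# def fitness(bits):
#     k, l, m, _ = decode_chromosome(bits)
#     pred = WKELM(k=k, l=l, m=m).fit(X_train, y_train).predict(X_val)
#     return float(np.mean(pred == y_val))
```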

At the third stage, the optimized parameters were employed to classify marble images using WK-ELM. The classification performances of AlexNet, SqueezeNet, VGG-19, and ResNet-50 in the proposed CNN–GA-WK-ELM approach are given in Section 5.

4. Metrics

In the present study, images of 10 different marble types, namely Afyon Violet, Barred Marmara, Burdur Coffee, Bursa Dark Beige, Diana Rose, Ege Coffee, Elazig Cherry, Karacabey Black, Toros Black, and Tundra Gray, were classified. The classification performance was evaluated using accuracy, precision, recall, and F1-score [30-32].

True Positive (TP): Both the true and the predicted categories are positive.

True Negative (TN): Both the true and the predicted categories are negative.

False Positive (FP): The true category is negative, while the predicted category is positive.

False Negative (FN): The true category is positive, while the predicted category is negative.

Accuracy $=\frac{T P+T N}{T P+T N+F P+F N}$   (8)

Precision $=\frac{T P}{T P+F P}$   (9)

Recall $=\frac{T P}{T P+F N}$    (10)

$\mathrm{F} 1=\frac{2 * \text { Precision } * \text { Recall }}{\text { Precision+Recall }}$   (11)
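A short sketch of how Eqs. (8)-(11) can be computed from the predictions is given below. Macro averaging over the 10 classes is assumed here, and the macro F1 is formed from the macro-averaged precision and recall, which is one of several common conventions.

```python
import numpy as np

def classification_metrics(y_true, y_pred, n_classes=10):
    """Accuracy, macro precision, macro recall, and F1 as in Eqs. (8)-(11)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    precisions, recalls = [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))   # true positives for class c
        fp = np.sum((y_pred == c) & (y_true != c))   # false positives for class c
        fn = np.sum((y_pred != c) & (y_true == c))   # false negatives for class c
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    accuracy = float(np.mean(y_true == y_pred))
    precision, recall = float(np.mean(precisions)), float(np.mean(recalls))
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1
```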

5. Results

AlexNet, VGG-19, SqueezeNet, and ResNet-50 were used to obtain four different feature matrices with a size of 500×1000 from the marble images in the present study. The optimal WK parameters selected for the 4 different architectures in the proposed CNN–GA-WK-ELM approach, together with the classification performances obtained with the optimal hidden layer neuron number of ELM, are given in Table 3. The confusion matrices for the proposed CNN–GA-WK-ELM approach with the AlexNet, VGG-19, SqueezeNet, and ResNet-50 architectures are shown in Figures 4.a, 4.b, 4.c, and 4.d, respectively. Metrics for the 4 approaches are presented in Table 4. As can be seen from the table, the Accuracy (%), Precision, Recall, and F1-score metrics obtained by CNN–GA-WK-ELM with AlexNet are higher than those obtained with the other 3 approaches. This clearly shows the superiority of the proposed CNN–GA-WK-ELM with AlexNet over the other approaches. In the present study, the TMD database created by the authors was used; therefore, there are no other studies using the same database. However, studies on marble classification in the existing literature were examined, and the number of classes and success rates reported in them are presented in Table 5. As can be seen, a very high performance rate was achieved in the present study despite classifying the largest number of classes, which proves the superiority of the proposed approach.

Table 3. Performance comparison

Architecture used in the proposed CNN–GA-WK-ELM approach | WK parameters (k, l, m) | Hidden Layer Neuron Number of ELM | Classification Performance (%)
AlexNet | 16, 15, 12 | 118 | 98.2
SqueezeNet | 2, 7, 7 | 129 | 96.4
VGG-19 | 2, 10, 9 | 142 | 96.2
ResNet-50 | 15, 7, 8 | 189 | 95.6

Figure 4. Confusion matrices

Table 4. Metrics for the 4 approaches

Method | Accuracy (%) | Macro Precision | Macro Recall | Macro F1-Score
CNN–GA-WK-ELM with AlexNet | 98.20 | 0.9820 | 0.9826 | 0.9819
CNN–GA-WK-ELM with SqueezeNet | 96.40 | 0.9640 | 0.9645 | 0.9638
CNN–GA-WK-ELM with VGG-19 | 96.20 | 0.9620 | 0.9630 | 0.9620
CNN–GA-WK-ELM with ResNet-50 | 95.40 | 0.9542 | 0.9545 | 0.9541

Table 5. Comparison of our approach with other methods

Method | Application Area | Number of Classes | Accuracy (%)
Our approach | Turkey Marble | 10 | 98.20
Computer Vision Systems (CVS) [3] | Marble | 3 | 81.59
Quality-Based [4] | Marble | 3 | 98.9
AlexNet [5] | Marble | 3 | 99.21
LBP-SVM [5] | Marble | 3 | 99.8
Deep Learning [6] | Marble | 9 | 75.00

6. Conclusion

The use of artificial intelligence technologies in image processing applications has gained importance in today's world. The classification of marble is usually performed manually by factory workers, which may result in various mistakes due to eye strain and optical illusions. In this respect, image processing techniques are becoming more and more popular for the classification of marble images in industrial environments. The selection of approaches and artificial intelligence techniques is of vital importance for a successful classification performance. In the present study, 4 different feature vectors were obtained from marble images using four different deep learning architectures, namely AlexNet, VGG-19, SqueezeNet, and ResNet-50. In addition, GA was used to optimize the WK-ELM parameters and the hidden layer neuron number, which aimed to increase the classification performance of ELM. Each of the 4 feature vectors was given to the WK-ELM classifier, and 10 different marble types were classified using the proposed CNN–GA-WK-ELM with the AlexNet, VGG-19, SqueezeNet, and ResNet-50 architectures. The experimental findings demonstrated that CNN–GA-WK-ELM with the AlexNet architecture displayed the highest marble classification performance compared to the other models.

References

[1] Hurok Marble. https://hurok.com/mermer-nedir/, accessed on 11 Jan. 2021.

[2] Maden Tetkik Arama. https://www.mta.gov.tr/v3.0/bilgi-merkezi/mermer, accessed on 11 Jan. 2021.

[3] Barbon, A.P.A.C., Junior, S.B., Campos, G.F.C., Seixas Jr, J.L., Peres, L.M., Mastelini, S.M., Andreoa, N., Ulricic, A., Bridi, A.M. (2017). Development of a flexible computer vision system for marbling classification. Computers and Electronics in Agriculture, 142: 536-544. https://doi.org/10.1016/j.compag.2017.11.017

[4] Martínez-Alajarín, J., Luis-Delgado, J.D., Tomás-Balibrea, L.M. (2005). Automatic system for quality-based classification of marble textures. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 35(4): 488-497. https://doi.org/10.1109/TSMCC.2004.843236

[5] Torun, Y., Akbaş, M.R., Çelik, M.A., Kaynar, O. (2019). Development a machine vision system for marble classification. In 2019 27th Signal Processing and Communications Applications Conference (SIU), Sivas, Turkey, pp. 1-4.

[6] Pençe, İ., Çeşmeli, M.Ş. (2019). Deep learning in marble slabs classification. Scientific Journal of Mehmet Akif Ersoy University, 2(1): 21-26. 

[7] Ather, M., Khan, B., Wang, Z., Song, G. (2019). Automatic recognition and classification of granite tiles using convolutional neural networks (CNN). 2019 3rd International Conference on Advances in Artificial Intelligence, New York, United States, pp. 193-197.

[8] Kaynar, O., Torun, Y., Temiz, M., Görmez, Y. (2018). Automatic classification of natural stone tiles with computer vision. 3rd International Conference on Computer Science and Engineering (UBMK), Sarajevo, Bosnia and Herzegovina, pp. 527-532. 

[9] Ferreira, A., Giraldi, G. (2017). Convolutional neural network approaches to granite tiles classification. Expert Systems with Applications, 84: 1-11. https://doi.org/10.1016/j.eswa.2017.04.053

[10] Sudakov, O., Burnaev, E., Koroteev, D. (2019). Driving digital rock towards machine learning: Predicting permeability with gradient boosting and deep neural networks. Computers & Geosciences, 127: 91-98. https://doi.org/10.1016/j.cageo.2019.02.002

[11] Wang, Y., Teng, Q., He, X., Feng, J., Zhang, T. (2019). CT-image of rock samples super resolution using 3D convolutional neural network. Computers & Geosciences, 133: 104314. https://doi.org/10.1016/j.cageo.2019.104314

[12] Shu, L., Osinski, G.R., McIsaac, K., Wang, D. (2018). An automatic methodology for analyzing sorting level of rock particles. Computers & Geosciences, 120: 97-104. https://doi.org/10.1016/j.cageo.2018.08.001

[13] Młynarczuk, M., Górszczyk, A., Ślipek, B. (2013). The application of pattern recognition in the automatic classification of microscopic rock images. Computers & Geosciences, 60: 126-133. https://doi.org/10.1016/j.cageo.2013.07.015

[14] Lumini, A., Nanni, L. (2019). Deep learning and transfer learning features for plankton classification. Ecological Informatics, 51: 33-43. https://doi.org/10.1016/j.ecoinf.2019.02.007

[15] Özyurt, F., Avcı, E., Sert, E., (2020). UC-Merced image classification with CNN feature reduction using wavelet entropy optimized with genetic algorithm. Traitement du Signal, 37(3): 347-353. https://doi.org/10.18280/ts.370301

[16] Dong, J., Li, X. (2020). An image classification algorithm of financial instruments based on convolutional neural network. Traitement du Signal, 37(6): 1055-1060. https://doi.org/10.18280/ts.370618

[17] Hosseinabadi, A.A.R., Vahidi, J., Saemi, B., Sangaiah, A.K., Elhoseny, M. (2019). Extended genetic algorithm for solving open-shop scheduling problem. Soft Computing, 23(13): 5099-5116. https://doi.org/10.1007/s00500-018-3177-y

[18] Mirjalili, S. (2019). Genetic algorithm. In: Evolutionary Algorithms and Neural Networks. Studies in Computational Intelligence. Springer, Cham.

[19] Zhi, H., Liu, S. (2019). Face recognition based on genetic algorithm. Journal of Visual Communication and Image Representation, 58: 495-502. https://doi.org/10.1016/j.jvcir.2018.12.012

[20] Zhu, Q.Y., Qin, A.K., Suganthan, P.N., Huang, G.B. (2005). Evolutionary extreme learning machine. Pattern Recognition, 38(10): 1759-1763. https://doi.org/10.1016/j.patcog.2005.03.028

[21] Sotelo, D., Velásquez, D., Cobos, C., Mendoza, M., Gómez, L. (2019). Optimization of neural network training with ELM based on the iterative hybridization of differential evolution with local search and restarts. In: Nicosia G., Pardalos P., Giuffrida G., Umeton R., Sciacca V. (eds) Machine Learning, Optimization, and Data Science. LOD 2018. Lecture Notes in Computer Science, Springer, Cham. https://doi.org/10.1007/978-3-030-13709-0_4

[22] Wu, D., Qu, Z.S., Guo, F.J., Zhu, X.L., Wan, Q. (2019). Hybrid intelligent deep kernel incremental extreme learning machine based on differential evolution and multiple population grey wolf optimization methods. Automatika, 60(1): 48-57. https://doi.org/10.1080/00051144.2019.1570642

[23] Jankowski, N. (2018). Prototype-based kernels for extreme learning machines and radial basis function networks. International Conference on Artificial Intelligence and Soft Computing, Zakopane, Poland, pp. 70-75.

[24] Suresh, S., Saraswathi, S., Sundararajan, N. (2010). Performance enhancement of extreme learning machine for multi-category sparse data classification problems. Engineering Applications of Artificial Intelligence, 23(7): 1149-1157. https://doi.org/10.1016/j.engappai.2010.06.009

[25] Li, B., Rong, X., Li, Y. (2014). An improved kernel based extreme learning machine for robot execution failures. The Scientific World Journal, 2014: 1-7. https://doi.org/10.1155/2014/906546

[26] Deng, C.H., Wang, X.J., Gu, J., Wang, W. (2019). A review of online sequential extreme learning machines. Journal of Physics: Conference Series, 1302(3): 1-5. https://doi.org/10.1088/1742-6596/1302/3/032054 

[27] Zou, W., Yao, F., Zhang, B., Guan, Z. (2018). Improved Meta-ELM with error feedback incremental ELM as hidden nodes. Neural Computing and Applications, 30(11): 3363-3370. https://doi.org/10.1007/s00521-017-2922-y

[28] Ertam, F., Avcı, E. (2017). A new approach for internet traffic classification: GA-WK-ELM. Measurement, 95: 135-142. https://doi.org/10.1016/j.measurement.2016.10.001

[29] Ding, S., Zhang, J., Xu, X., Zhang, Y. (2016). A wavelet extreme learning machine. Neural Computing and Applications, 27(4): 1033-1040. https://doi.org/10.1007/s00521-015-1918-8

[30] Ni, S., Qian, Q., Zhang, R. (2018). Malware identification using visualization images and deep learning. Computers & Security, 77: 871-885. https://doi.org/10.1016/j.cose.2018.04.005

[31] Namanya, A.P., Awan, I.U., Disso, J.P., Younas, M. (2020). Similarity hash based scoring of portable executable files for efficient malware detection in IoT. Future Generation Computer Systems, 110: 824-832. https://doi.org/10.1016/j.future.2019.04.044

[32] Vasan, D., Alazab, M., Wassan, S., Safaei, B., Zheng, Q. (2020). Image-based malware classification using ensemble of CNN architectures (IMCEC). Computers & Security, 92: 101748. https://doi.org/10.1016/j.cose.2020.101748