Application of Ordinal Classification Prediction in Evolutionary Algorithms
Abstract:

    Solving expensive optimization problems often incurs prohibitive computational cost. To reduce the number of true evaluations of the objective function, this study applies an ordinal prediction method to the selection of candidate solutions in evolutionary algorithms. The relative quality of candidate solutions is obtained directly through classification prediction, which avoids having to build an accurate surrogate model of the objective function. In addition, a reduction method for the ordinal sample set is designed to lower the redundancy of the sample set and improve the training efficiency of the ordinal prediction model. The ordinal prediction method is then combined with a genetic algorithm. Simulation experiments with the ordinal-prediction-assisted genetic algorithm on expensive optimization test functions show that ordinal prediction can effectively reduce the computational cost of solving expensive optimization problems.
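    The selection scheme described above can be sketched as pairwise ordinal classification: pairs of truly evaluated solutions provide training labels ("better"/"worse"), and the trained comparator then ranks new candidates without calling the expensive objective. The sketch below is illustrative only, not the authors' implementation: it replaces the SVM ordinal model with a simple perceptron comparator, uses a cheap linear function as a stand-in for the expensive objective, and all function names are invented.

```python
def expensive_objective(x):
    """Stand-in for the expensive objective (here a cheap linear function)."""
    return sum(x)

def make_ordinal_samples(archive):
    """Turn truly evaluated solutions into pairwise ordinal samples:
    feature = xi - xj (componentwise), label = +1 if f(xi) < f(xj), else -1."""
    samples = []
    for xi, fi in archive:
        for xj, fj in archive:
            if fi != fj:
                feat = [a - b for a, b in zip(xi, xj)]
                samples.append((feat, 1 if fi < fj else -1))
    return samples

def train_comparator(samples, epochs=200, lr=0.1):
    """Train a linear comparator with the perceptron rule
    (a stand-in for the SVM ordinal model)."""
    w = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for feat, y in samples:
            # Update the weights only on misclassified pairs.
            if y * sum(wi * fi for wi, fi in zip(w, feat)) <= 0:
                w = [wi + lr * y * fi for wi, fi in zip(w, feat)]
    return w

def predict_better(w, xa, xb):
    """Predict whether f(xa) < f(xb) WITHOUT evaluating the objective."""
    feat = [a - b for a, b in zip(xa, xb)]
    return sum(wi * fi for wi, fi in zip(w, feat)) > 0

# Usage: evaluate a few solutions for real, train the comparator, then
# compare new candidates by predicted order instead of true evaluations.
evaluated = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0],
             [1.0, 1.0, 1.0], [2.0, 2.0, 0.0]]
archive = [(x, expensive_objective(x)) for x in evaluated]
w = train_comparator(make_ordinal_samples(archive))
print(predict_better(w, [0.0, 0.0, 0.0], [1.0, 1.0, 1.0]))  # → True
```

    In a GA loop, only the winners of such predicted comparisons would be passed to the true objective, which is how the number of real evaluations is reduced.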

Get Citation

Mao LW, He HF, Li WB, Guo GQ. Application of ordinal classification prediction in evolutionary algorithms. Computer Systems & Applications, 2022, 31(11): 199–206. (in Chinese)
History
  • Received: January 27, 2022
  • Revised: February 24, 2022
  • Online: July 14, 2022
Copyright: Institute of Software, Chinese Academy of Sciences