Remaining Time Prediction Based on Improved Transformer
Author:
Funding: Fundamental Research Funds for Provincial Undergraduate Universities of Heilongjiang Province (2022TSTD-03)

    Abstract:

    Remaining time prediction helps enterprises improve the quality and efficiency of business process execution. Although existing deep learning methods have improved remaining time prediction to some extent, they still face challenges when dealing with complex business processes: time features are underutilized and the ability to extract local features is limited, so prediction accuracy leaves room for improvement. This study proposes a remaining time prediction method based on an improved Transformer encoder model. To address the tendency of existing methods to ignore event time features and their difficulty in capturing local dependencies, this study introduces a time feature encoding module and a local dependency enhancement module into the model. The time encoding module constructs a semantically rich and discriminative representation of event time through embedding learning and multi-granularity concatenation. The local dependency enhancement module applies a convolutional neural network after the Transformer encoder to extract fine-grained local features from the trace prefix. Experiments show that integrating time features and local dependency enhancement improves the accuracy of remaining time prediction for complex business processes.
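
    As a rough illustration of the architecture described above, the following PyTorch sketch combines an activity embedding, a multi-granularity time feature encoding, a Transformer encoder, and a convolution-based local dependency module into a remaining-time regressor. The chosen time granularities (hour of day, day of week, elapsed-time bucket), the dimensions, the mean pooling, and all module and parameter names are illustrative assumptions, not the authors' exact implementation.

    import torch
    import torch.nn as nn

    class TimeFeatureEncoding(nn.Module):
        # Embeds several coarse-to-fine time attributes of each event
        # (assumed here: hour of day, day of week, elapsed-time bucket)
        # and concatenates them into one time representation per event.
        def __init__(self, bucket_sizes=(24, 7, 100), dim_per_feature=16):
            super().__init__()
            self.embeddings = nn.ModuleList(
                [nn.Embedding(n, dim_per_feature) for n in bucket_sizes]
            )
            self.out_dim = dim_per_feature * len(bucket_sizes)

        def forward(self, time_ids):
            # time_ids: (batch, prefix_len, num_time_features) integer indices
            parts = [emb(time_ids[..., i]) for i, emb in enumerate(self.embeddings)]
            return torch.cat(parts, dim=-1)        # (batch, prefix_len, out_dim)

    class LocalDependencyEnhancement(nn.Module):
        # A 1D convolution applied after the Transformer encoder to capture
        # local dependencies between neighbouring events in the trace prefix.
        def __init__(self, d_model, kernel_size=3):
            super().__init__()
            self.conv = nn.Conv1d(d_model, d_model, kernel_size,
                                  padding=kernel_size // 2)
            self.act = nn.ReLU()

        def forward(self, x):
            # x: (batch, prefix_len, d_model); Conv1d expects channels first
            return self.act(self.conv(x.transpose(1, 2))).transpose(1, 2)

    class RemainingTimePredictor(nn.Module):
        # Activity embedding + time feature encoding -> Transformer encoder
        # -> local dependency enhancement -> pooled regression head that
        # outputs the predicted remaining time of the running case.
        def __init__(self, num_activities, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            self.activity_emb = nn.Embedding(num_activities, d_model)
            self.time_enc = TimeFeatureEncoding()
            self.input_proj = nn.Linear(d_model + self.time_enc.out_dim, d_model)
            layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                               batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
            self.local = LocalDependencyEnhancement(d_model)
            self.head = nn.Linear(d_model, 1)

        def forward(self, activities, time_ids, padding_mask=None):
            x = torch.cat([self.activity_emb(activities),
                           self.time_enc(time_ids)], dim=-1)
            h = self.encoder(self.input_proj(x),
                             src_key_padding_mask=padding_mask)
            h = self.local(h).mean(dim=1)          # pool over the trace prefix
            return self.head(h).squeeze(-1)        # (batch,) remaining times

    if __name__ == "__main__":
        model = RemainingTimePredictor(num_activities=30)
        acts = torch.randint(0, 30, (8, 12))       # 8 prefixes, 12 events each
        times = torch.stack([torch.randint(0, n, (8, 12)) for n in (24, 7, 100)],
                            dim=-1)
        print(model(acts, times).shape)            # torch.Size([8])

    Placing the convolution after the encoder, as the abstract describes, lets self-attention model long-range dependencies across the prefix first, while the convolution then sharpens the local details between neighbouring events.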

Cite this article

Liu HZ, Gao JT. Remaining time prediction based on improved Transformer. Computer Systems & Applications, 2024, 33(12): 231–239. (in Chinese)

History
  • Received: 2024-05-29
  • Revised: 2024-06-28
  • Published online: 2024-10-31