LSTM-Based Unsupervised Domain Adaptive Person Re-Identification


    Abstract:

    In this study, we propose an unsupervised domain adaptive person re-identification method. Given a labeled source-domain training set and an unlabeled target-domain training set, we explore how to improve the generalization of the re-identification model on the target-domain test set. To this end, the source-domain and target-domain training sets are fed into the model simultaneously during training, and local features are extracted alongside global features to describe person images and learn finer-grained representations. Furthermore, we apply a Long Short-Term Memory (LSTM) network to person modeling in an end-to-end manner, treating a person image as a sequence of body parts from head to feet. The method consists of two main steps: (1) StarGAN is adopted to augment the unlabeled target-domain images; (2) the source-domain and target-domain datasets are fed jointly into a global branch and an LSTM-based local branch for training. The proposed model achieves strong performance on both the Market-1501 and DukeMTMC-reID datasets, demonstrating its effectiveness.
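
    The abstract describes a dual-branch architecture: a global branch plus an LSTM-based local branch that reads the person image as a head-to-feet sequence of body parts. The following is a minimal PyTorch sketch of such a model, given only to illustrate the idea; the ResNet-50 backbone, the number of body-part stripes, the hidden size, and all identifier names are assumptions rather than the authors' implementation, and the StarGAN augmentation step is only indicated in comments.

import torch
import torch.nn as nn
import torchvision

class DualBranchReID(nn.Module):
    """Global branch + LSTM-based local branch over head-to-feet body-part stripes (illustrative sketch)."""

    def __init__(self, num_classes, num_parts=6, lstm_hidden=256):
        super().__init__()
        backbone = torchvision.models.resnet50()  # backbone choice is an assumption
        # Keep layers up to the last convolutional stage (output: B x 2048 x H x W).
        self.backbone = nn.Sequential(*list(backbone.children())[:-2])
        self.num_parts = num_parts

        # Global branch: global average pooling + identity classifier.
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.global_fc = nn.Linear(2048, num_classes)

        # Local branch: split the feature map into horizontal stripes (top of the map
        # corresponds to the head, bottom to the feet), pool each stripe, and feed the
        # stripe sequence to an LSTM.
        self.part_pool = nn.AdaptiveAvgPool2d((num_parts, 1))
        self.lstm = nn.LSTM(input_size=2048, hidden_size=lstm_hidden, batch_first=True)
        self.local_fc = nn.Linear(lstm_hidden, num_classes)

    def forward(self, x):
        feat = self.backbone(x)                        # B x 2048 x H x W

        # Global feature and identity logits.
        g = self.global_pool(feat).flatten(1)          # B x 2048
        global_logits = self.global_fc(g)

        # Body-part sequence, ordered from head to feet: B x num_parts x 2048.
        parts = self.part_pool(feat).squeeze(-1).permute(0, 2, 1)
        seq_out, _ = self.lstm(parts)                  # B x num_parts x lstm_hidden
        local_feat = seq_out[:, -1, :]                 # last hidden state summarizes the sequence
        local_logits = self.local_fc(local_feat)

        return global_logits, local_logits, torch.cat([g, local_feat], dim=1)

# Usage sketch: labeled source-domain batches and (StarGAN-augmented) unlabeled
# target-domain batches would be forwarded through the same model; identity losses
# are computed on the source labels. Numbers below are illustrative.
model = DualBranchReID(num_classes=751)                # 751 identities in Market-1501
images = torch.randn(8, 3, 256, 128)                   # typical re-ID input size
global_logits, local_logits, embedding = model(images)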

Cite this article:

胡卓晶, 王敏. LSTM-Based Unsupervised Domain Adaptive Person Re-Identification. 计算机系统应用 (Computer Systems & Applications), 2021, 30(2): 182–187.

History
  • Received: 2020-06-10
  • Revised: 2020-07-10
  • Published online: 2021-01-29