Abstract: As a significant task in natural language processing, cross-lingual sentiment analysis leverages the data and models available in resource-rich languages to address problems in resource-scarce settings, and has therefore attracted widespread attention. Its core challenge is establishing connections between languages. In this respect, transfer learning performs better than traditional translation-based methods and can be further enhanced by high-quality cross-lingual text vectors. We therefore propose the LAAE model, which combines a Long Short-Term Memory (LSTM) network with an Adversarial AutoEncoder (AAE) to generate contextual cross-lingual vectors, and then applies a Bidirectional Gated Recurrent Unit (BiGRU) for sentiment classification. Specifically, a classifier trained on the source language is transferred to the target language for classification. Experimental results demonstrate the effectiveness of the proposed method.
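The abstract does not give implementation details, so the following is only a minimal sketch of the described pipeline, assuming a PyTorch implementation: an LSTM encoder for contextual vectors, an AAE-style discriminator that adversarially aligns source- and target-language representations, and a BiGRU sentiment classifier. All layer sizes, module names, and the toy batches below are hypothetical and not taken from the paper.

```python
# Illustrative sketch only: LSTM encoder + adversarial alignment + BiGRU classifier.
# Hyperparameters and the mini training step are assumptions, not the paper's setup.
import torch
import torch.nn as nn

class LSTMEncoder(nn.Module):
    """Encodes token ids into contextual vectors with an LSTM (assumed sizes)."""
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)

    def forward(self, x):
        out, _ = self.lstm(self.embed(x))    # (batch, seq, hid_dim)
        return out

class Discriminator(nn.Module):
    """AAE-style discriminator: guesses whether a pooled sentence
    representation comes from the source or the target language."""
    def __init__(self, hid_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(hid_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    def forward(self, h):
        return self.net(h.mean(dim=1))       # pool over time, then score

class BiGRUClassifier(nn.Module):
    """Bidirectional GRU over the aligned contextual vectors for sentiment."""
    def __init__(self, hid_dim=128, num_classes=2):
        super().__init__()
        self.gru = nn.GRU(hid_dim, hid_dim, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, h):
        out, _ = self.gru(h)
        return self.fc(out[:, -1, :])        # last time step -> class logits

if __name__ == "__main__":
    src = torch.randint(0, 10000, (4, 20))   # toy source-language batch
    tgt = torch.randint(0, 10000, (4, 20))   # toy target-language batch
    encoder, disc, clf = LSTMEncoder(), Discriminator(), BiGRUClassifier()

    h_src, h_tgt = encoder(src), encoder(tgt)
    # Adversarial signal: the encoder is pushed to make the two languages
    # indistinguishable, while the discriminator tries to tell them apart.
    adv_loss = nn.BCEWithLogitsLoss()(
        torch.cat([disc(h_src), disc(h_tgt)]),
        torch.cat([torch.ones(4, 1), torch.zeros(4, 1)]))
    logits = clf(h_src)                      # sentiment supervised on source labels
    print(adv_loss.item(), logits.shape)
```

Under this assumed setup, sentiment supervision comes only from the source language, and the adversarially aligned vector space is what lets the BiGRU classifier transfer to target-language input.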