Abstract: Multi-task learning is widely used in natural language processing, but multi-task models tend to be sensitive to the relevance between tasks: when tasks are weakly related or information is transferred inappropriately, performance can degrade severely. This study proposes a new shared-private multi-task learning model, BERT-BiLSTM multi-task learning (BB-MTL). Drawing on ideas from meta-learning, it designs a dedicated parameter optimization method for the model, the meta-learning-like training method (MLL-TM). Further, a new information fusion gate, the Softmax weighted linear gate (SoWLG), is introduced to selectively fuse the shared and private features of each task. To validate the proposed method, a series of experiments combines the tasks of hate-speech detection, personality detection, and emotion detection, since user behavior on the Internet is closely related to individual characteristics. The experimental results show that BB-MTL effectively learns feature information from related tasks, reaching accuracies of 81.56%, 77.09%, and 70.82% on the three tasks, respectively.
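The abstract only names SoWLG and describes it as a gate that selectively fuses each task's shared and private features; the exact formulation is given in the body of the paper. As a rough illustration only, the sketch below (class name, layer shapes, and the single-linear-layer gating are assumptions, not the authors' implementation) computes softmax weights from the concatenated shared and private vectors and returns their weighted linear combination:

```python
import torch
import torch.nn as nn

class SoWLG(nn.Module):
    """Hypothetical sketch of a softmax weighted linear gate.

    Fuses a task's shared and private feature vectors by computing
    two softmax weights from their concatenation and taking the
    weighted sum of the two streams.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Maps the concatenated [shared; private] vector to two gate logits.
        self.gate = nn.Linear(2 * hidden_dim, 2)

    def forward(self, shared: torch.Tensor, private: torch.Tensor) -> torch.Tensor:
        # shared, private: (batch, hidden_dim)
        logits = self.gate(torch.cat([shared, private], dim=-1))  # (batch, 2)
        weights = torch.softmax(logits, dim=-1)                   # (batch, 2)
        # Weighted linear combination of the shared and private features.
        return weights[:, :1] * shared + weights[:, 1:] * private
```

Under this reading, the softmax constrains the two mixing coefficients to be positive and sum to one, so each task learns how much shared versus task-specific information to pass downstream.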