References:
[1]HONG Y,ZHANG J F,MA B,et al. Using cross-entity inference to improve event extraction[C]//Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics:Human Language Technologies. Portland,USA:ACL,2011.
[2]LI Q,JI H,HUANG L. Joint event extraction via structured prediction with global features[C]//Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. Sofia,Bulgaria:ACL,2013.
[3]吴家皋,周凡坤,张雪英. Event attribute information extraction combining HMM and syntactic analysis[J]. Journal of Nanjing Normal University(Natural Science Edition),2014,37(1):30-34.
[4]CHEN Y B,XU L H,LIU K,et al. Event extraction via dynamic multi-pooling convolutional neural networks[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics. Beijing,China:ACL,2015.
[5]FENG X C,QIN B,LIU T. A language-independent neural network for event detection[J]. Science China Information Sciences,2018,61(9):81-92.
[6]NGUYEN T H,CHO K,GRISHMAN R. Joint event extraction via recurrent neural networks[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. San Diego,USA:ACL,2016.
[7]REN P Z,XIAO Y,CHANG X J,et al. A survey of deep active learning[J]. arXiv preprint arXiv:2009.00236,2020.
[8]SEUNG H S,OPPER M,SOMPOLINSKY H. Query by committee[C]//Proceedings of the Fifth Annual Workshop on Computational Learning Theory. Pittsburgh,USA:ACM,1992.
[9]LIAO S S,GRISHMAN R. Using prediction from sentential scope to build a pseudo co-testing learner for event extraction[C]//Proceedings of the 5th International Joint Conference on Natural Language Processing. Chiang Mai,Thailand:ACL,2011.
[10]邱盈盈,洪宇,周文瑄,et al. A joint deep and active learning method for event extraction[J]. Journal of Chinese Information Processing,2018,32(6):98-106.
[11]DEVLIN J,CHANG M W,LEE K,et al. BERT:pre-training of deep bidirectional transformers for language understanding[C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Minneapolis,USA:ACL,2019.
[12]PETERS M,NEUMANN M,IYYER M,et al. Deep contextualized word representations[C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. New Orleans,USA:ACL,2018.
[13]VASWANI A,SHAZEER N,PARMAR N,et al. Attention is all you need[C]//Proceedings of the 31st Conference on Neural Information Processing Systems. Long Beach,USA:Curran Associates,2017.
[14]HUANG S J,ZHAO J W,LIU Z Y. Cost-effective training of deep CNNs with active model adaptation[C]//Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. London,UK:ACM,2018.
[15]MERCHANT A,RAHIMTOROGHI E,PAVLICK E,et al. What happens to BERT embeddings during fine-tuning?[J]. arXiv preprint arXiv:2004.14448,2020.
[16]MARTINEZ-CANTIN R,DE FREITAS N,DOUCET A,et al. Active policy learning for robot planning and exploration under uncertainty[C]//Proceedings of Robotics:Science and Systems III. Atlanta,USA:MIT Press,2007.
[17]SCHEFFER T,DECOMAIN C,WROBEL S. Active hidden Markov models for information extraction[C]//Proceedings of the 4th International Conference on Advances in Intelligent Data Analysis. Berlin,Germany:Springer,2001.
[18]LIU M Y,TU Z Y,ZHANG T,et al. LTP:a new active learning strategy for CRF-based named entity recognition[J]. arXiv preprint arXiv:2001.02524,2020.
[19]LIU J,CHEN Y B,LIU K. Exploiting the ground-truth:an adversarial imitation based knowledge distillation approach for event detection[J]. Proceedings of the AAAI Conference on Artificial Intelligence,2019,33(1):6754-6761.
[20]SCHEIN A I,UNGAR L H. Active learning for logistic regression:an evaluation[J]. Machine Learning,2007,68(3):235-265.