General

Yang Feng

Associate Professor

Natural Language Processing Group
Key Laboratory of Intelligent Information Processing
Institute of Computing Technology, Chinese Academy of Sciences (ICT/CAS)

No. 6 Kexueyuan South Road, Haidian District, Beijing, China
fengyang [at] ict [dot] ac [dot] cn


Bio

Yang Feng is an Associate Professor at the Institute of Computing Technology, Chinese Academy of Sciences (ICT/CAS), where she received her Ph.D. degree in 2011. From 2011 to 2014, she worked at the University of Sheffield and at the Information Sciences Institute, University of Southern California. She now leads the Natural Language Processing Group at ICT/CAS. Her research interests are in natural language processing, with a focus on machine translation and dialogue. She received the Best Long Paper Award at ACL 2019.


[My homepage in Chinese]

News

  • Enrollment of graduate students for 2021 has started. Please email me if you are interested.

  • We are recruiting postdocs and interns.

Honors & Distinctions

  • ICT/CAS “Brilliant Star”, 2019

  • Best Long Paper Award of ACL 2019

  • CCF NLPCC Distinguished Young Scientist, 2019

  • CAAI Outstanding Member, 2019

  • ICT/CAS “Outstanding Researcher”, 2018

  • ICT/CAS “Baixing Talent Introduction Program”, 2017

Publications

Recent Publications:  [More Publications]

  • CDL: Curriculum Dual Learning for Emotion-Controllable Response Generation
    Lei Shen, Yang Feng*. In Proceedings of ACL 2020.

  • A Contextual Hierarchical Attention Network with Adaptive Objective for Dialogue State Tracking
    Yong Shan, Zekang Li, Jinchao Zhang, Fandong Meng, Yang Feng*, Cheng Niu, Jie Zhou. In Proceedings of ACL 2020.

  • A Universal Multimodal Transformer for Video-Audio Scene-Aware Dialog [paper]
    Zekang Li, Zongjia Li, Jinchao Zhang, Yang Feng*, Cheng Niu, Jie Zhou. In Proceedings of AAAI 2020 DSTC8 Workshop.

  • Modeling Fluency and Faithfulness for Diverse Neural Machine Translation [paper] [code]
    Yang Feng, Wanying Xie, Shuhao Gu, Chenze Shao, Wen Zhang, Zhengxin Yang, Dong Yu. In Proceedings of AAAI 2020.

  • Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation [paper] [code]
    Chenze Shao, Jinchao Zhang, Yang Feng*, Fandong Meng and Jie Zhou. In Proceedings of AAAI 2020.

  • Enhancing Context Modeling with a Query-Guided Capsule Network for Document-level NMT [paper]
    Zhengxin Yang, Jinchao Zhang, Fandong Meng, Shuhao Gu, Yang Feng* and Jie Zhou. In Proceedings of EMNLP 2019.

  • Bridging the Gap between Training and Inference for Neural Machine Translation [paper]
    Wen Zhang, Yang Feng*, Fandong Meng, Di You, Qun Liu. In Proceedings of ACL 2019.
    Winner of Best Long Paper Award

  • Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation [paper]
    Chenze Shao, Yang Feng*, Jinchao Zhang, Fandong Meng, Xilin Chen, Jie Zhou. In Proceedings of ACL 2019.

  • Modeling Semantic Relationship in Multi-turn Conversations with Hierarchical Latent Variables [paper]
    Lei Shen, Yang Feng*, Haolan Zhan. In Proceedings of ACL 2019.

  • Incremental Transformer with Deliberation Decoder for Document Grounded Conversations [paper]
    Zekang Li, Cheng Niu, Fandong Meng, Yang Feng, Qian Li, Jie Zhou. In Proceedings of ACL 2019.

  • Improving Domain Adaptation Translation with Domain Invariant and Specific Information [paper]
    Shuhao Gu, Yang Feng*, Qun Liu. In Proceedings of NAACL 2019.

Talks

  • A Review of the Progress of Machine Translation (in Chinese). 2019. YSSNLP invited talk. [slides]
  • An Overview of Machine Translation Frontier (in Chinese). 2019. CCL invited talk. [slides]
  • Diving into the Training for the Sequence-to-sequence Model. 2019. NLPCC NLG Workshop. [slides]

Professional Services

  • Editorial Board Member:   the Northern European Journal of Language Technology
  • Area Chair:   EMNLP 2020, COLING 2018

Students

Ph.D. candidates:   Shuman Liu (co-advised with Prof. Qun Liu);   Jiao Ou;   Peerachet Porkaew;   Shuhao Gu;   Lei Shen;   Zhengxin Yang;   Chenze Shao.
Master's candidates:   Shugen Wang;   Jicheng Li;   Yong Shan;   Dengji Guo;   Zekang Li.

Alumni

Ph.D.:   Wen Zhang (co-advised with Prof. Qun Liu).
Master's:   Jingyu Li;   Haiyang Xue.