Machine Translation and Natural Language Generation

Vision: Making Communication Easier among All Languages & Making Techniques Available for All Languages

Course Description

The course is designed for beginners in Natural Language Generation, Natural Language Processing, and Machine Translation. The main aim is to explore the theories and methods for automatically understanding and generating natural language text, with a special focus on multilingualism.

Participation requires basic knowledge of machine learning. Please consider taking online courses from Coursera or watching online videos.

The course has been taught every Spring semester at the School of Computer Science, Nanjing University, since 2020. It was initially designed for graduate students and later opened to both graduate and undergraduate students.

Objectives

Outline

1. Introduction

  1. Problems in Natural Language Processing
  2. NLP as Classifications
  3. NLP as Structured Predictions
  4. Natural Language Generation

2. Language Models

  1. Probabilistic Modeling of Natural Language
  2. Statistical Language Models
  3. Neural Language Models and Pretraining
  4. Large Language Models (LLMs)
  5. LLMs and Reinforcement Learning
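As a concrete illustration of the statistical language models covered in this section, here is a minimal maximum-likelihood bigram model. This is a toy sketch (the function name is our own, and real systems add smoothing and train on far larger corpora), not part of the course materials:

```python
from collections import Counter, defaultdict

def train_bigram_lm(corpus):
    """Estimate bigram probabilities P(w_i | w_{i-1}) by maximum likelihood."""
    bigram_counts = defaultdict(Counter)
    for sentence in corpus:
        # Sentence-boundary markers let the model score starts and ends.
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            bigram_counts[prev][cur] += 1
    # Normalize counts into conditional probabilities.
    return {
        prev: {w: c / sum(counts.values()) for w, c in counts.items()}
        for prev, counts in bigram_counts.items()
    }

corpus = ["the cat sat", "the dog sat", "the cat ran"]
lm = train_bigram_lm(corpus)
print(lm["the"]["cat"])  # "the" is followed by "cat" in 2 of 3 cases -> 0.666...
```

Neural language models and LLMs replace these count-based estimates with learned parametric distributions, but the underlying objective, modeling P(w_i | context), is the same.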

3. Machine Translation

  1. Traditional Machine Translation (Rule-based Machine Translation, Statistical Machine Translation)
  2. Machine Translation and Deep Learning
  3. Machine Translation and LLMs
  4. *Machine Translation with Less Parallel Data (Low-resource, Unsupervised Machine Translation)
  5. *Non-Autoregressive Machine Translation (Parallel Generation)
  6. *Interactive Machine Translation (Human-involved Generation)
  7. *Translation Quality Evaluation (Cross-task Knowledge Transfer)

4. Multilingual LLMs

  1. Evaluation of Multilinguality
  2. Extending LLMs to New Languages
  3. Aligning Language Abilities among Different Languages
  4. Understanding Multilinguality in LLMs

5. Other Topics in LLMs

  1. Reasoning Abilities in LLMs
  2. Alignment and Safety in LLMs
  3. *LLMs for Biology (Generating Proteins)
  4. *Multi-modal Large Language Models

6. Other Generation Tasks

  1. Summarization: Content Selection
  2. Paraphrase: Semantic Equivalence (Variational Auto-Encoders)
  3. Style Transfer: Controlled Generation (Generative Adversarial Networks)
  4. Image Captioning: Multi-modal Interaction

Note: * marks topics that may change in different semesters.

Assignments and Assessment

Assessment involves a written report as well as an in-class presentation and discussion.

Instructor Contact Information

History

MTNLG2024

Acknowledgement

The course is constantly improved with help from wonderful teaching assistants: Zaixiang Zheng (2020), Yu Bao (2021), Jiahuan Li (2022), Wenhao Zhu (2023), Changjiang Gao (2024), Peng Ding (2025).