|TITLE||Research Topics in Natural Language Processing|
|LEVEL||05 - Postgraduate Modular Diploma or Degree Course|
|DESCRIPTION||This study-unit will expose students to current relevant research in Natural Language Processing and should serve to provide them with a comprehensive understanding of the topics at hand. Students will be provided with a number of key publications covering different topics of interest in the following fields:
- Statistical language learning
- Text analysis for Computational Social Science
- Text-driven Forecasting
- Core NLP algorithms (semantics, syntax, morphology, and others)
- Question Answering
- Text Summarisation
- Machine Translation
- Machine learning for NLP applications
- Speech Processing
- Vision and Language Processing
Students will be supported through meetings and online class discussions that compare and contrast different works. They will then be expected to produce a report and work on a mini-project.
The aim of this study-unit is to expose students to state-of-the-art research in NLP while helping them organise their readings and research efforts. Through this study-unit, students will have the opportunity to learn how to write a proper literature review, work on a mini-project, and present their findings in a seminar. This will give students invaluable background knowledge in the topic, whilst also giving them the opportunity for hands-on research.
1. Knowledge & Understanding:
Given one of the research topics covered in the course, the student will be able to:
- Demonstrate a clear understanding of the topic, its practical applications and future research directions;
- Point out and discuss the current state of the art in the research topic;
- Contribute to a possible solution in the field of research and suggest a prototype application;
- Conduct a critical review of the relevant literature;
- Compare different approaches to a specific problem and discuss their pros and cons;
- Make quantitative and qualitative comparisons on the basis of the evaluation of the literature review.
By the end of the study-unit the student will be able to:
- Discuss and critically analyse the motivation, techniques and results with specialisation in more advanced topics;
- Explore a subject area in depth within a particular application context.
Main Text/s and any supplementary readings:
- Justin Zobel. Writing for Computer Science. 3rd edition, Springer, 2014.
The following are a few sample papers/articles that will be used; an updated list will be made available to the students:
- Michael Collins and Yoram Singer. Unsupervised models for named entity classification. In Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora, 1999.
- Daniel Gildea and Daniel Jurafsky. 2002. Automatic labeling of semantic roles. Computational Linguistics, 28(3):245–288, September 2002.
- James Clarke and Mirella Lapata. 2007. Modelling Compression with Discourse Constraints. In Proceedings of the Conference on Empirical Methods in Natural Language Processing and on Computational Natural Language Learning, pages 1–11. Prague, Czech Republic.
- John D. Lafferty, Andrew McCallum, and Fernando C. N. Pereira. 2001. Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data. In Proceedings of the Eighteenth International Conference on Machine Learning (ICML '01), Carla E. Brodley and Andrea Pohoreckyj Danyluk (Eds.). Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 282-289.
- Patrick Pantel and Dekang Lin. 2002. Discovering word senses from text. In Proceedings of the Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '02). ACM, New York, NY, USA, 613-619.
- John Goldsmith. Unsupervised learning of the morphology of a natural language. Computational Linguistics, 27:153–198, June 2001.
- Harald Hammarström and Lars Borin. Unsupervised learning of morphology. Computational Linguistics, 37:309–350, 2011.
- Jacob Devlin, Rabih Zbib, Zhongqiang Huang, Thomas Lamar, Richard M. Schwartz, John Makhoul: Fast and Robust Neural Network Joint Models for Statistical Machine Translation. ACL (1) 2014: 1370-1380.
- Proceedings from these conferences: Association for Computational Linguistics (ACL), International Joint Conference on Artificial Intelligence, Conference on Natural Language Learning, International Conference on Natural Language Generation, International Conference on Computational Linguistics, and others.
|STUDY-UNIT TYPE||Lecture, Independent Study, Project and Seminar|
|METHOD OF ASSESSMENT||
|LECTURER/S||Lonneke van der Plas
The University makes every effort to ensure that the published Courses Plans, Programmes of Study and Study-Unit information are complete and up-to-date at the time of publication. The University reserves the right to make changes in case errors are detected after publication.
The availability of optional units may be subject to timetabling constraints.
Units not attracting a sufficient number of registrations may be withdrawn without notice.
It should be noted that all the information in the study-unit description above applies to the academic year 2019/20, if the study-unit is available during this academic year, and may be subject to change in subsequent years.