GUCL: Computational Linguistics @ Georgetown
We are a group of Georgetown University faculty, student, and staff researchers at the intersection of language and computation. Our areas of expertise include natural language processing, corpus linguistics, information retrieval, text mining, and more, with participation from both the Linguistics and Computer Science departments.
GU research groups: Corpling, NERT, IRLab, InfoSense, Singh lab
Other GU groups: GU-HLT Group, GU Women Coders, Massive Data Institute, Tech & Society Initiative
Related academic groups in the DC/Baltimore region: Howard NLP, JHU CLSP, UMD CLIP, George Mason NLP
- 10/24/24: Op-ed and brief in Supreme Court ghost gun case (Kevin Tobia, Nathan Schneider, Brandon Waldon)
- 9/15/23: [Job] The Linguistics department will be hiring an Assistant Professor of Computational Linguistics.
- 9/8/21: Congratulations to the Corpling lab on winning the DISRPT 2021 shared task on discourse processing!
- 8/27/20: First-Year Student Presented Paper at Prestigious Computational Linguistics Conference (Aryaman Arora)
- 9/10/18: #MeToo Movement on Twitter (Lisa Singh)
- 8/29/18: Cliches in baseball (Nathan Schneider)
- 1/20/18: The Coptic Scriptorium project (Amir Zeldes)
- Congratulations to Arman Cohan, Nazli Goharian, and Georgetown alum Andrew Yates for winning a Best Long Paper award at EMNLP 2017!
- Congratulations to Ophir Frieder, who has been named to the European Academy of Sciences and Arts (EASA)!
- 9/19/16: "Email" Dominates What Americans Have Heard About Clinton (Lisa Singh)
- 7/12/16: Searching Harsh Environments (Ophir Frieder)
Mailing list: Contact Nathan Schneider to subscribe!
Upcoming talks/events
- Maciej Ogrodniczuk (IPI PAN Warsaw): Linguistics, 9/6/24, 3:30 in Poulton 230
- Alexis Palmer (Colorado Boulder): Linguistics, 9/20/24, 3:30 in Poulton 230
- Barbara Plank (LMU Munich): CS, Thurs. 10/10/24, 1:00 in STM 414
- Eugene Yang (JHU): CS, 11/1/24, 12:15 in STM 107
- Kyle Mahowald (UT Austin): Linguistics, 11/1/24, 3:30 in Poulton 230
- William Schuler (OSU): Linguistics, 3/21/25, 3:30 in Poulton 230
- Ellie Pavlick (Brown): Linguistics, 4/4/25, 3:30 in Poulton 230
- Previous talks
Courses
Overview of CL course offerings (note: old numbering system)
Document listing courses in CS, Linguistics, and other departments that are most relevant to students interested in computational linguistics. Includes estimates of when each course will be offered.
Fall 2023
COSC-3440 | Deep Reinforcement Learning
Grace Hui Yang Undergraduate
Deep reinforcement learning is an area of machine learning in which an agent learns, using deep neural networks, to make optimal decisions by interacting with an environment. An intelligent agent observes the consequences of its actions in the environment and alters its behavior to maximize the expected return. We study algorithms and applications in deep reinforcement learning. Topics include deep neural networks, Markov decision processes, policy gradient methods, Q-learning (DQN), actor-critic methods, imitation learning, and other advanced topics. The course has lectures, readings, programming assignments, and exams.
COSC-4550 (was 488) | Information Retrieval
Nazli Goharian Upperclass Undergraduate & Graduate
Information retrieval is the identification of textual components, be they web pages, blogs, microblogs, documents, medical transcriptions, mobile data, or other big data elements, relevant to the needs of the user. Relevancy is determined either as a global absolute or within a given context or viewpoint. Practical yet theoretically grounded foundational and advanced algorithms needed to identify such relevant components are taught.
Information-retrieval techniques and theory are covered, addressing both the effectiveness and the run-time performance of information-retrieval systems. The focus is on algorithms and heuristics used to find textual components relevant to the user request, and to find them fast. The course covers the architecture and components of a search engine, such as the parser, index builder, and query processor. Along the way, various retrieval models, relevance ranking, evaluation methodologies, and efficiency considerations will be covered. Students learn the material by building a prototype of such a search engine. These approaches are in daily use by all search and social media companies.
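For a concrete feel for the components named above, here is a minimal sketch (ours, not course material) of the core data structure behind the index builder and query processor: an inverted index serving conjunctive (AND) queries over a hypothetical toy collection.

```python
from collections import defaultdict

# Toy document collection (hypothetical examples).
docs = {
    1: "information retrieval finds relevant documents",
    2: "search engines rank documents by relevance",
    3: "microblogs are short social media documents",
}

# Index builder: map each term to the set of documents containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

# Query processor: a conjunctive (AND) query intersects the posting sets.
def search(query):
    postings = [index[term] for term in query.lower().split()]
    return set.intersection(*postings) if postings else set()

print(search("relevant documents"))  # {1}
```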
COSC-5455 (was 576) | Introduction to Deep Learning
Joe Garman Graduate
Recent advances in hardware have made deep learning with neural networks practical for real-world problems. Neural networks are powerful tools that have shown benefit in a wide range of fields. Deep learning involves creating artificial neural networks with greater layer depth, or deep neural nets (DNNs) for short. These DNNs can find patterns in complex data and are useful in a wide variety of situations. In numerous fields, state-of-the-art solutions have been achieved with DNNs, and DNN systems dominate head-to-head competitions. This course will introduce students to neural networks, explain different neural network architectures, and then demonstrate the use of these networks on a wide array of tasks.
COSC-6440 (was 689) | Deep Reinforcement Learning
Grace Hui Yang Graduate
Deep reinforcement learning is an area of machine learning that learns how to make optimal decisions from interacting with an environment. From the environment, an agent observes the consequences of its actions and alters its behavior to maximize the amount of reward received in the long term. Reinforcement learning has developed strong mathematical foundations and impressive applications in diverse disciplines such as psychology, control theory, artificial intelligence, and neuroscience. An example is AlphaGo's victory over world-class human Go players, achieved using Monte Carlo tree search and deep neural networks. The overall problem of learning from interaction to achieve goals is still far from solved, but our understanding of it has improved significantly. In this course, we study fundamentals, algorithms, and applications in deep reinforcement learning. Topics include Markov decision processes, multi-armed bandits, Monte Carlo methods, temporal-difference learning, function approximation, deep neural networks, actor-critic methods, deep Q-learning, policy gradient methods, and connections to psychology and neuroscience. The course has lectures, mathematical and programming assignments, and exams.
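For readers new to the paradigm, here is a minimal sketch (ours, not course material) of tabular Q-learning on a hypothetical toy chain environment, illustrating the learn-from-interaction loop described above.

```python
import random
from collections import defaultdict

# Toy chain environment: states 0..4; reaching state 4 yields reward 1 and
# ends the episode. Action 0 moves left, action 1 moves right (bounded at ends).
def step(state, action):
    next_state = max(0, min(4, state + (1 if action == 1 else -1)))
    reward = 1.0 if next_state == 4 else 0.0
    return next_state, reward, next_state == 4

Q = defaultdict(float)              # Q[(state, action)]: estimated long-term return
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for episode in range(200):
    state, done = 0, False
    while not done:
        if random.random() < epsilon:            # explore occasionally...
            action = random.choice([0, 1])
        else:                                    # ...otherwise act greedily (random ties)
            action = max([0, 1], key=lambda a: (Q[(state, a)], random.random()))
        next_state, reward, done = step(state, action)
        # Temporal-difference update: move Q toward reward + discounted future value.
        best_next = max(Q[(next_state, a)] for a in [0, 1])
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = next_state

print(max([0, 1], key=lambda a: Q[(0, a)]))      # learned action in state 0: 1 (right)
```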
COSC-8530 (was 883) | Search and Mining of Textual Data
Nazli Goharian Graduate: Doctoral [2 credits]
In this doctoral seminar, students read, present, and discuss research papers on search and mining methodologies for processing textual data of any form: short or long, general or domain-specific, formal scientific text or informal social media text. Student groups are assigned projects aimed at developing research insights.
LING-4400 (was 362) | Introduction to Natural Language Processing
Amir Zeldes Upperclass Undergraduate & Graduate
This course will introduce students to the basics of Natural Language Processing (NLP), a field which combines insights from linguistics and computer science to produce applications such as machine translation, information retrieval, and spell checking. We will cover a range of topics that will help students understand how current NLP technology works and will provide students with a platform for future study and research. We will learn to implement simple representations such as finite-state techniques, n-gram models and basic parsing in the Python programming language. Previous knowledge of Python is not required, but students should be prepared to invest the necessary time and effort to become proficient over the course of the semester. Students who take this course will gain a thorough understanding of the fundamental methods used in natural language understanding, along with an ability to assess the strengths and weaknesses of natural language technologies based on these methods.
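As a taste of one of the simple representations mentioned above (an illustrative sketch of ours, not course material; the tiny corpus is made up), a bigram language model can be built in a few lines of Python:

```python
from collections import Counter

# Toy training corpus (hypothetical sentences).
corpus = ["the cat sat", "the cat ran", "the dog sat"]

bigrams = Counter()
unigrams = Counter()
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for prev, word in zip(tokens, tokens[1:]):
        bigrams[(prev, word)] += 1
        unigrams[prev] += 1

# Maximum-likelihood estimate: P(word | prev) = count(prev, word) / count(prev)
def prob(prev, word):
    return bigrams[(prev, word)] / unigrams[prev] if unigrams[prev] else 0.0

print(prob("the", "cat"))  # 2/3: "cat" follows "the" in two of three sentences
```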
LING-4427 (was 367) | Computational Corpus Linguistics
Amir Zeldes Upperclass Undergraduate & Graduate
Digital linguistic corpora, i.e. electronic collections of written, spoken, or multimodal language data, have become an increasingly important source of empirical information for theoretical and applied linguistics in recent years. This course is meant as a theoretically founded, practical introduction to corpus work with a broad selection of data, including non-standardized varieties such as language on the Internet, learner corpora, and historical corpora. We will discuss issues of corpus design, annotation, and evaluation using quantitative methods and both manual and automatic annotation tools for different levels of linguistic analysis, from parts of speech through syntax to discourse annotation. Students in this course participate in building the corpus described here: https://corpling.uis.georgetown.edu/gum/
LING-4461 (was 461) | Signal Processing
Corey Miller Upperclass Undergraduate & Graduate
How do things like Amazon Echo and Siri work? What kinds of linguistics went into them, and how could they be made better? To explore these questions, this course will survey speech technology from a computational linguistic perspective. Both speech recognition, also known as speech-to-text (STT), and speech synthesis, also known as text-to-speech (TTS), will be investigated, along with related technologies like speaker/dialect/accent/language identification. While communicating the basic algorithms employed by these technologies, the course will emphasize hands-on project work, allowing you to use web-based and open-source tools to build your own components, evaluate existing products, and explore linguistic questions. Students from a variety of backgrounds are encouraged to take this course. Helpful background includes natural language processing, phonetics, phonology, and sociolinguistics. While not required, helpful technical background includes familiarity with speech analysis software such as Praat, as well as with Linux, shell scripting, and coding/scripting in languages like Python, Java, or C++.
DSAN-5800 (was ANLY-580) | Advanced NLP
Chris Larson Graduate
This course provides a formalism for understanding the statistical machine learning methods that have come to dominate natural language processing. Divided into three core modules, the course explores (i) how language understanding is framed as a tractable statistical inference problem, (ii) a formal yet practical treatment of the DNN architectures and learning algorithms used in NLP, and (iii) how these components are leveraged in modern AI systems such as information retrieval, recommender systems, and conversational agents. In exploring these topics, the course exposes students to the foundational math, practical applications, current research directions, and software design that are critical to gaining proficiency as an NLP/ML practitioner. The course culminates in a capstone project, conducted over its final six weeks, in which students apply NLP to an interesting problem of their choosing. In past semesters, students have built chatbots, code completion tools, and stock trading algorithms, to name a few. This course assumes a basic understanding of linear algebra, probability theory, and first-order optimization methods, as well as proficiency in Python.
This is an advanced course. Suggested prerequisites are DSAN 5000, DSAN 5100, and DSAN 5400. However, first-year students with the necessary math, statistics, and deep learning background will be considered.
ICOS-7710 (was 710) | Cognitive Science Core Course
Abigail Marsh & Elissa Newport Graduate
A seminar in which important topics in cognitive science are taught by participating Georgetown faculty from the main and medical campuses. Required for the Cognitive Science concentration, available for Ph.D. students in other programs with instructor permission. (Can be taken more than once for credit.)
Spring 2024
COSC-3450 (was 270) | Artificial Intelligence
Mark Maloof Undergraduate
Artificial Intelligence (AI) is the branch of computer science that studies how to program computers to reason, learn, perceive, and understand. The lecture portion of the class surveys basic and advanced concepts and techniques of artificial intelligence, including search, knowledge representation, automated reasoning, uncertain reasoning, and machine learning. Specific topics include symbolic computing, state-space search, game playing, theorem proving, rule-based systems, Bayesian networks, probability estimation, rule induction, Markov decision processes, reinforcement learning, and ethical and philosophical issues. Applications of artificial intelligence are also discussed in domains such as medicine and computer security. Students complete midterm and final exams, and five programming projects using the Java programming language.
COSC/LING-5402 (was 572) | Empirical Methods in Natural Language Processing
Nathan Schneider Graduate
Systems of communication that come naturally to humans are thoroughly unnatural for computers. For truly robust information technologies, we need to teach computers to unpack our language. Natural language processing (NLP) technologies facilitate semi-intelligent artificial processing of human language text. In particular, techniques for analyzing the grammar and meaning of words and sentences can be used as components within applications such as web search, question answering, and machine translation.
This course introduces fundamental NLP concepts and algorithms, emphasizing the marriage of linguistic corpus resources with statistical and machine learning methods. As such, the course combines elements of linguistics, computer science, and data science. Coursework will consist of lectures, programming assignments (in Python), and a final team project. The course is intended for students who are already comfortable with programming and have some familiarity with probability theory.
COSC-5480 | Large Language Models
Grace Hui Yang Graduate
This course delves deep into the intricacies of Large Language Models (LLMs), offering students an understanding of their design, implementation, and applications. Beginning with the foundational architectures such as transformers and attention mechanisms, students will journey through the evolution from the fundamental models to contemporary marvels like GPT-3, ChatGPT, and GPT-4. The course aims to provide a comprehensive overview of the historical and current state of LLMs, equipping students with the knowledge to design, train, and fine-tune LLMs for custom applications. It will also encourage critical discussions on the ethical, societal, and technical challenges associated with LLMs. Key topics covered in the course include (1) Foundations: Review of RNNs, LSTMs, Attention Mechanisms, and Transformers. (2) Architectural Deep Dive: Behind the design of GPT-3, BERT, and other leading models. (3) Training Paradigms: Techniques and challenges in training massive models. (4) Applications: chatbots, content generation, recommendation systems, and beyond. (5) Societal Impact: Ethical considerations, fairness, and bias in LLMs. (6) Technical Challenges: Model explainability, controllability, and safety concerns. (7) Future Directions: Where LLMs are headed and emerging research areas. The course assessments consist of monthly assignments involving practical implementations and model evaluations, exams covering theoretical and applied concepts, and one optional final project focusing on designing a custom application utilizing LLMs. Class participation and critical discussion sessions are also important components in student assessments.
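As a hint of what the foundations module covers, here is a minimal NumPy sketch (ours, not the course's; the toy data is random) of scaled dot-product attention, the operation at the heart of the transformer:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # similarity of queries to keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the keys
    return weights @ V                                # weighted mix of value vectors

# Three positions, four-dimensional vectors (random toy data).
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)    # (3, 4)
```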
COSC-640 (was 688) | Experimental Artificial Intelligence (AI)
Joe Garman Graduate
This course offers opportunities for students to gain an in-depth understanding of, and hands-on experience with, practical AI systems for state-of-the-art evaluation campaigns. It includes seminar-style classroom presentations and a significant project component. Students will be guided through the design and implementation of AI systems in different domains. The course will review recent AI and Machine Learning publications and lead students to work in small groups to build systems. Students are expected to have strong programming skills and previous experience in machine learning, deep learning, and/or AI.
LING-4401/DSAN-5400 (was LING-472/ANLY-521) | Computational Linguistics with Advanced Python
Trevor Adriaanse Upperclass Undergraduate & Graduate
This course presents topics in Natural Language Processing (NLP) and Python programming for both text processing and analysis. The goal of this class is to explore both classical and modern techniques in NLP, with emphasis on hands-on application. We will examine topics such as text classification, model evaluation, nearest neighbors, and distributed representations. Applications include authorship identification, structured prediction, and semantic textual similarity, to name a few.
Programming topics include Python best practices, scientific computing libraries (e.g., NumPy and sklearn), exception handling, object-oriented programming, and more. By the end of this course, students will be able to program proficiently in Python, with enough comfort to reference software documentation and pseudocode to write sophisticated programs from scratch.
Requirements: Basic Python programming skills are required (satisfied, for example, by LING-362, Intro to NLP)
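To give a flavor of the hands-on applications described above (a sketch of our own using scikit-learn's standard Pipeline API; the texts and author labels are hypothetical, not a course dataset), a text classifier for authorship identification fits in a few lines:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy authorship-identification data (hypothetical labels and texts).
texts = ["call me ishmael", "it was the best of times",
         "the whale surfaced again", "it was the worst of times"]
authors = ["melville", "dickens", "melville", "dickens"]

# Pipeline: bag-of-words TF-IDF features feeding a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, authors)
print(model.predict(["the times were hard"]))  # likely ['dickens']
```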
LING-4466 | Multilingual NLP
Kenton Murray Upperclass Undergraduate & Graduate
This is a project-based course focusing on the design and implementation of systems that scale Natural Language Processing methods beyond English. The course will cover both multilingual and cross-lingual methods with an emphasis on zero-shot and few-shot approaches, as well as ‘silver’ dataset creation. Modules will include Cross-Lingual Information Extraction & Semantics, Cross-Language Information Retrieval, Multilingual Question Answering, Multilingual Structured Prediction, Multilingual Automatic Speech Recognition, Typology, and Morphology, as well as other non-English-centric NLP methods. Students will be expected to work in small groups and pick one of the modules to create a model based on state-of-the-art methods covered in the class. The course will be roughly three-quarters lecture-based, with the remaining quarter devoted to students presenting project updates periodically throughout the semester.
LING-5444 (was 504) | Machine Learning for Linguistics
Amir Zeldes Graduate
In the past few years, the advent of abundant computing power and data has catapulted machine learning to the forefront of a number of fields of research, including Linguistics and especially Natural Language Processing. At the same time, general machine learning toolkits and tutorials make handling ‘default cases’ relatively easy, but are much less useful in handling non-standard data, less studied languages, low-resource scenarios and the need for interpretability that is essential for drawing robust inferences from data. This course gives a broad overview of the machine learning techniques most used for text processing and linguistic research. The course is taught in Python, covering both general statistical ML algorithms, such as linear models, SVMs, decision trees and ensembles, and current deep learning models, such as deep neural net classifiers, recurrent networks and contextualized continuous meaning representations. The course assumes good command of Python (ability to implement a program from pseudo-code) but does not require previous experience with machine learning.
Requirements: Intermediate Python (courses such as LING-472: Computational Linguistics with Advanced Python provide a good preparation)
LING-8415 (was 765) | Computational Discourse Models
Amir Zeldes Graduate
Recent years have seen an explosion of computational work on higher level discourse representations, such as entity recognition, mention and coreference resolution and shallow discourse parsing. At the same time, the theoretical status of the underlying categories is not well understood, and despite progress, these tasks remain very much unsolved in practice. This graduate level seminar will concentrate on theoretical and practical models representing how referring expressions, such as mentions of people, things and events, are coded during language processing. We will begin by exploring the literature on human discourse processing in terms of information structure, discourse coherence and theories about anaphora, such as Centering Theory and Alternative Semantics. We will then look at computational linguistics implementations of systems for entity recognition and coreference resolution and explore their relationship with linguistic theory. Over the course of the semester, participants will implement their own coding project exploring some phenomenon within the domain of entity recognition, coreference, discourse modeling or a related area.
DSAN-5810 | NLP with Large Language Models
Trevor Adriaanse & Chris Larson Graduate
In recent times, Large Language Models (LLMs) have earned the attention of the world. OpenAI’s famous generative LLM, ChatGPT, became the fastest-growing consumer application in history in only two months, and the feverish interest around LLMs continues to grow. This course is concerned with learning how to apply LLMs to natural language processing (NLP) problems in real-life settings.
To begin, we will study the transformer architecture that underlies LLMs and describe its prominent role in modern NLP. Then, we will discuss NLP system design considerations, including the training and scaling of transformer-based models, transfer learning in low-resource settings, model deployment, distributed systems, and more. Meta-learning, multimodal learning, and societal impact will also be covered. Students will work on applications such as cross-language information retrieval, machine translation, prompt engineering, and tasks outside of NLP.
By the end of the course, students will have mastered transformer-based models and will be poised to use them in resource-intensive scenarios that resemble what is done at the cutting edge of NLP today.
ICOS-7712 (was 712) | Cognitive Science Seminar
Abigail Marsh & Elissa Newport Graduate
A seminar in which graduate students and faculty interested in the cognitive sciences will read and discuss prominent articles across our fields. Can be repeated for credit.