Research Labs and Groups at UofA CS¶
This is a practical guide to the main research groups in the UofA CS department — what they work on, who the key people are, how they rank internationally, and how you can realistically get involved as an undergrad.
This is not an exhaustive list of every faculty member. It's the labs and groups you'll most commonly hear about, and the ones with the clearest paths for undergrad involvement.
RLAI — Reinforcement Learning and Artificial Intelligence¶
Prestige level: International. Genuinely one of the best in the world for RL.
This is the most globally recognized research group at UofA. If you've taken any course involving reinforcement learning, you've almost certainly encountered work that came out of this lab.
Who's here:

- Rich Sutton — co-author, with Andrew Barto, of Reinforcement Learning: An Introduction, the definitive textbook on RL, freely available at incompleteideas.net. Sutton is one of the foundational figures in the entire field. The "Reward is Enough" hypothesis (co-authored with David Silver, Satinder Singh, and Doina Precup, published in 2021 in Artificial Intelligence) argues that reward maximization is sufficient to account for intelligence — a bold, contested, and influential claim.
- Michael Bowling — known for solving heads-up limit Texas Hold'em poker (the first time a competitively played poker variant was solved), along with work on game theory, RL, and multiagent systems
- Patrick Pilarski — prosthetics, health applications of RL, and continual machine learning in real-world systems
- Martha White — representation learning, continual learning, and stability-plasticity problems in neural networks
- Adam White — practical RL, teaching (he runs CMPUT 365), and student-accessible research
- Csaba Szepesvári — theoretical foundations of RL, bandit problems, and statistical learning theory
What they work on: RL theory and algorithms, policy gradient methods, deep RL, continual learning (learning that doesn't forget), AI safety, representation learning, and applications ranging from game-playing AI to real-world control systems.
Industry connections: DeepMind has deep historical ties to this lab — in 2017 it opened DeepMind Alberta in Edmonton, its first research office outside the UK, led by Sutton, Bowling, and Pilarski. AlphaGo and subsequent work built on RL ideas developed here and at related groups. Researchers from RLAI have gone on to Google Brain, DeepMind, Microsoft Research, OpenAI, and top academic institutions globally.
How to get involved as an undergrad: Take CMPUT 365 (Reinforcement Learning) first — it's the clearest signal of genuine interest and gives you foundational vocabulary. Read the Sutton & Barto textbook (or at least the first four chapters) before approaching anyone. The lab is competitive for undergrad positions but they do take motivated students. Email Adam White or Martha White first if you're an undergrad — they tend to be more accessible and have experience mentoring undergrads. Show that you've engaged with the material, not just that you want a research credit.
Amii — Alberta Machine Intelligence Institute¶
Prestige level: National. The hub of AI research in Alberta, one of three Pan-Canadian AI strategy nodes.
Amii isn't a single research lab — it's an institute that spans the university and connects academic AI research to government, industry, and the public. UofA is the core of it.
What it is: Canada's federal government funded three AI institutes as part of the Pan-Canadian AI Strategy: the Vector Institute (Toronto, with Geoffrey Hinton as a founding figure), Mila (Montreal, founded by Yoshua Bengio), and Amii (Edmonton, tied to RLAI and UofA broadly). This federal backing means real funding for researchers, students, and infrastructure.
What Amii does:

- Funds graduate students and postdocs across participating professors
- Runs industry partnership programs (companies in Alberta and nationally work with Amii researchers)
- Hosts public seminars, workshops, and panels — most are free and open to students
- Connects students with industry opportunities — internships, collaborations, introductions
For undergrads: Check amii.ca regularly. Their events calendar is consistently interesting — guest speakers from Google, DeepMind, Canadian tech companies, and international research labs. Showing up at these events is a low-barrier way to get known in the AI research community in Edmonton and to meet grad students and professors informally. You don't need to be doing research to attend most Amii events.
If you're interested in AI/ML research or industry, treat Amii as a resource. Subscribe to their newsletter, attend events, and look for any student programs they're running in a given year.
BSAIL — Biological and Statistical AI Lab¶
Prestige level: Niche but respected. Good for CS-biology intersections.
BSAIL works at the intersection of statistical machine learning, computational biology, and bioinformatics. If you're interested in health tech, genomics, drug discovery, or biological data — this is the lab to look at.
What they work on: Statistical modeling of biological processes, bioinformatics algorithms, applications of ML to health data, cancer genomics, and related problems.
How to get involved: A background in statistics and probability (CMPUT 267 is relevant) helps, and some biology is useful but not always required. Email the faculty involved directly. This lab faces less competition than RLAI, which means motivated undergrads have more realistic access.
Database Research Group¶
Prestige level: International. M. Tamer Özsu is one of the most cited database researchers in the world.
The database group is smaller than RLAI but internationally recognized, primarily through Özsu's career-long work on distributed data management.
Who's here:

- M. Tamer Özsu — distributed database systems, NoSQL, graph databases, and big data query processing. Author of Principles of Distributed Database Systems (the standard graduate textbook on the subject, now in its 4th edition). If you've taken a graduate databases course anywhere in the world, his textbook was likely on the reading list.
What they work on: Distributed query processing, transaction management in distributed systems, graph data management, and scaling databases to modern workloads.
Relevant courses to take first: CMPUT 291 (intro databases), CMPUT 391 (database management systems).
How to get involved: Email the lab directly. The database group is smaller, so it's potentially more accessible. Interest in distributed systems and solid SQL/query processing foundations will help you make a case.
Systems, Networking, and Architecture¶
Prestige level: Solid domestic reputation. Less internationally prominent than RLAI or databases but good technical work.
Multiple professors work on various aspects of systems: operating systems, computer networks, compilers, programming languages, hardware-software interfaces, and distributed systems engineering.
What they work on: Network protocol design, OS kernel-level work, compiler optimizations, programming language theory and implementation, and distributed systems at the infrastructure level.
Relevant courses: CMPUT 379 (operating systems), CMPUT 313 (computer networks), CMPUT 415 (compiler design), CMPUT 481 (distributed systems).
How to get involved: Systems research is less accessible to first- or second-year students because it requires deeper background (you need to understand OS concepts, networking stacks, and ideally low-level programming in C or Rust). By third year, with the relevant courses done, you're in a solid position to approach faculty. Check the department website for the faculty listing under the "Systems and Theory" research area.
HCI — Human-Computer Interaction and Interactive Systems¶
Prestige level: Respected nationally. Accessible to a broad range of students.
HCI research spans computer science, design, cognitive science, and social science. The work involves understanding how people interact with technology and designing better systems as a result.
What they work on: Accessibility (assistive technology for users with disabilities), novel interface paradigms, user studies and evaluation methods, visualization, and educational technology.
Relevant courses: CMPUT 302 (introduction to HCI and user interfaces).
How to get involved: HCI labs are often among the most accessible for undergrads because many projects involve running user studies, building prototypes, and analyzing data — tasks that motivated undergrads can contribute to meaningfully without years of prerequisite knowledge. If you're interested, reach out directly and emphasize any design, user research, or human-centered projects you've done.
Natural Language Processing Group¶
Prestige level: Growing. Increasingly relevant as large language models dominate the field.
NLP at UofA has grown as language models have become central to AI. The group works on text analysis, information extraction, language model development, and related problems.
What they work on: Named entity recognition, question answering, text classification, multilingual NLP, and increasingly, the analysis and development of large language models.
Relevant courses: CMPUT 461 (introduction to NLP).
How to get involved: Take CMPUT 461 first. NLP intersects heavily with ML, so a solid ML background (CMPUT 267 or CMPUT 361) helps. This is a growing area at UofA, and there's genuine opportunity for undergrads as the group expands.
Getting Involved: General Advice¶
Regardless of which group interests you, the approach is the same.
Attend department seminars. The CS department and Amii host regular talks by visiting researchers and faculty. These are almost always free and open to undergrads. Showing up is how you learn what questions researchers are actually asking — and occasionally how you get noticed. The schedule is posted on the CS department website and amii.ca.
Read papers before reaching out. This cannot be overstated. An email that says "I read your recent paper on [specific topic] and had a question about [specific aspect]" gets a response. An email that says "I'm interested in your work in machine learning" does not.
Lab reading groups. Many labs run weekly or biweekly reading groups where members present papers. Ask if you can attend as an observer. This is a zero-risk way to get known in the lab and to understand what problems they're working on.
Start in Year 2, not Year 4. You need time to ramp up, contribute meaningfully, and get a reference letter worth something. Starting in your final year means you'll leave before you've made a real impact.
Don't ignore smaller or less-famous groups. RLAI is amazing, but it's also the most competitive for undergrad positions. A professor in a smaller group might give you more mentorship, more ownership over a project, and a stronger reference letter. Famous lab + peripheral involvement is often less valuable than smaller lab + real responsibility.
The goal for Year 2: Get your foot in the door somewhere, even as a volunteer. Attend seminars. Read papers in an area you care about.
The goal for Year 3: Be meaningfully contributing to a project. Apply for NSERC USRA if eligible.
The goal for Year 4: Have a result to show — a paper, a poster, a codebase, a thesis contribution. If grad school is on the table, this is what your application is built on.