Week 9/10: Generalized Linear Mixed Models (GLMMs)

March 12-13 and March 20, 2018

When the assumption of independence is violated and your Y variable does not fit the assumptions of a Gaussian model, you can apply the same logic we covered in the GLM unit to data that include both random and fixed effects. The resulting models are called Generalized Linear Mixed Models, or GLMMs.

GLMMs are simple in principle, but the underlying mathematics are complex. In this lecture we will focus on the basic logic of the models, and I will briefly review some of the R packages that can fit them.
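As a concrete sketch of that logic, here is a minimal Poisson GLMM fit with lme4 on simulated data. All variable names, sample sizes, and effect sizes below are invented for illustration, not the class dataset:

```r
# Simulated example: counts depend on a fixed effect (bait type)
# plus a random intercept for site. All numbers are made up.
library(lme4)

set.seed(1)
site  <- factor(rep(1:10, each = 20))          # 10 sites, 20 traps each
bait  <- factor(rep(c("fish", "squid"), 100))  # fixed effect
site_eff <- rnorm(10, sd = 0.5)                # random site intercepts
eta   <- 0.5 + 0.8 * (bait == "squid") + site_eff[site]
count <- rpois(200, lambda = exp(eta))         # Poisson response
d <- data.frame(count, bait, site)

# Fixed effect of bait, random intercept for each site
m <- glmer(count ~ bait + (1 | site), data = d, family = poisson)
summary(m)
```

The formula reads just like a GLM, with `(1 | site)` adding the random intercept — this is the same logic we will apply to the real dataset in class.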

Our primary activity will be to conduct a deep dive into a novel study, using GLMMs to answer a research question.

Lecture Topics

  • Introduction to GLMMs
  • Review: Random vs. fixed effects
  • Comparison of packages that run GLMMs in R
  • GLMM exercise: Green crab catch rates versus bait type in a field study
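As a preview of the package comparison, the same model specified in lme4 and glmmTMB differs only in the fitting function; the formula syntax carries over unchanged. The data here are simulated stand-ins, not the class dataset:

```r
# Fit the same Poisson GLMM in two packages and compare estimates.
# Simulated data; variable names are invented for illustration.
library(lme4)
library(glmmTMB)

set.seed(2)
site  <- factor(rep(1:8, each = 25))
bait  <- factor(sample(c("fish", "squid"), 200, replace = TRUE))
eta   <- 1 + 0.5 * (bait == "squid") + rnorm(8, sd = 0.3)[site]
count <- rpois(200, exp(eta))
d <- data.frame(count, bait, site)

m1 <- glmer(count ~ bait + (1 | site), data = d, family = poisson)
m2 <- glmmTMB(count ~ bait + (1 | site), data = d, family = poisson)

# Point estimates from the two packages should agree closely
cbind(lme4 = fixef(m1), glmmTMB = fixef(m2)$cond)
```

Both packages maximize the same likelihood here, so the estimates should match to several decimal places; the packages differ mainly in speed, supported families, and extras like zero inflation.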

In-class Activities

We will spend much of this week going over a real (as yet unpublished) dataset that is best analyzed in a GLMM framework.

We will use all of the skills learned so far in this class to conduct the analysis: setting up the data, exploring it, and running and interpreting the model output.


Please read Ben Bolker’s introduction to GLMMs


GLMMs are more complex than anything we have covered so far, and self-study is recommended.

Some excellent course materials:

Primary literature:

I cite this book a lot in this class, but where random effects are concerned, its reference guide is especially useful:

See also:

In the notes I use ggeffects to plot GLMM output. For more info:
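A minimal ggeffects sketch, fit to simulated data in the same spirit as the class exercise (names and numbers are invented):

```r
# Plot model-predicted counts per bait type from a fitted GLMM.
# Simulated data; not the class dataset.
library(lme4)
library(ggeffects)

set.seed(3)
site  <- factor(rep(1:6, each = 30))
bait  <- factor(rep(c("fish", "squid"), 90))
count <- rpois(180, exp(0.5 + 0.6 * (bait == "squid") +
                          rnorm(6, sd = 0.4)[site]))
d <- data.frame(count, bait, site)
m <- glmer(count ~ bait + (1 | site), data = d, family = poisson)

# Marginal predicted counts per bait type, back-transformed to the
# response scale, with confidence intervals
pred <- ggpredict(m, terms = "bait")
pred          # data frame of predictions
plot(pred)    # ggplot of predicted catch by bait type
```

`ggpredict()` handles the inverse link for you, so the plot is on the count scale rather than the log scale — usually what you want when presenting results.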




Slides available on Speakerdeck