
Spring 2025 Interview

Photo of Lori Barcliff Baptista.

Lori D. Barcliff Baptista

Associate Dean for Undergraduate Programs and Advising, School of Communication

Professor of Instruction, Department of Performance Studies

By Laura Ferdinand, Assistant Director of Content and Communications

In mid-March, I had the honor of speaking with Lori Barcliff Baptista about the culture of assessment at Northwestern. We met in her sunny office on the top floor of the Ryan Center for the Musical Arts, which offers a stunning view of Lake Michigan.

The building is affectionately known as the "S.S. Ryan" by the students who study and perform there, and it felt fitting to be at the helm of this great "ship," given Dr. Barcliff Baptista's significant contributions to leading assessment initiatives at Northwestern. The following are excerpts from our conversation.

As an active scholar/arts practitioner whose work is deeply connected to community, collaboration, and activism, how have your scholarship, arts practice, and assessment work shaped one another?

I'm an ethnographer by training, so I use a lot of different methods to study a question or concern that impacts a community more broadly or maybe individuals within a community. One of the methods that I have always found especially useful or important—particularly if the goal of what you're trying to understand is some problem that people need to work towards solving collectively—is participatory action research.

The whole premise behind participatory action research is that expertise lives in many places, and it is important to take an assets-based approach to understand how the people who are experiencing or are impacted by an issue understand it and how it fits within their values, their purpose, their mission (if it's in an organization), and their priorities. 

Then what are the structures that they already utilize or work within to get and receive information to function as an organization? Where do they get information? Who are their stakeholders? In order to understand what you're measuring, you have to understand the organization itself and how it's structured. And then you could understand where assessment—which sounds like a very abstract thing—matters, and how they're probably already doing it.

You have experience with assessment at so many levels: as a faculty member, Associate Dean, and member of the Assessment and Accreditation Council (AAC). How do you define a “culture of assessment,” and what does it look like in practice within your school and across the institution?

I came to assessment in my many roles in a couple different places. One, as a member of the Student Surveys Planning Group, which is this broader consortium of folks across the University that are all doing assessment in some way: whether it's the COFHE surveys, the climate surveys, the senior survey that students take as they're leaving, the incoming student survey when students are matriculating, or the student satisfaction survey we do in the School of Communication.

And what's productive about that group—which includes many of the people who are also involved in the Assessment and Accreditation Council—is that we think about, "of all the things that are measured, what is useful for us, and what do we need in order to inform decisions that we're making at the school level?" In very practical ways, I'd say many of our curricular initiatives in the School of Communication have come out of data, a lot of data, that our unit and the University collect.

Being a part of a community that's figuring out how to use that data, how you can slice off the piece of your data so you can do something with it, helps us. So, the culture of assessment isn't just defensive, "oh no, we need to do this thing to be accredited," but it's acknowledging we already measure stuff all the time. We already make decisions based upon what we measure.

In our school and at the departmental level, we use the data in a practical way: it's important to know what's working well. We know that we use CTEC data, for example. It's a part of tenure and promotion. “Oh, I love this class. It was fantastic.” It's a part of how we determine if we need to offer more classes or if this class size is too big for what we're trying to accomplish.

How do we think about the learning? Did we hit the learning goals? If the students are confused at the end of the class, then maybe we didn't. Then it’s a way to go back and tweak it. So, I think the culture of assessment, in this context, is thinking about what do we already do? Again, participatory. Why are these things valuable? Who uses them? Where do we get it from? And how can we build upon this information to evolve in a very responsive way, but not reactive or defensive? I think high-level assessment is saying, “this is what we do, and this is the information we've used to decide why to do it that way.”

You joined the AAC in 2018. What has your experience as an AAC member been like, and what aspects of the experience have you valued most?

AAC gives us an opportunity to showcase—and Covid did this too, inadvertently—how we each do our unique things [in each Northwestern school]. For example, much of McCormick's curriculum, as I understand it, is based in some way on certain certification and accreditation standards that they have to meet. Knowing that they had those kinds of frameworks that they were working towards was a different point of view.

That's useful for me to come back to my faculty and ask, certification aside, are there industry standards? This was especially true during the pandemic with production protocols. In Radio, TV, and Film, we have student filmmakers and students who have grants and coursework to make student films. If Hollywood was saying, "okay, here's our Covid protocol in terms of how you would shoot with distancing, clean equipment, set turnaround times," we were able to reference—it wasn't a certification standard, and it wasn't necessarily a licensure standard, but—an industry standard that could govern how we were approaching our curriculum.

As an AAC Member, you champion assessment in your school and unit communities. What do instructors find most compelling about enhancing assessment? Can you share an instance when a colleague recognized the rewards or benefits of assessment? 

In the fall of 2020 during Covid, our partners in teaching and learning [Teaching and Learning Technologies, Distance Learning in the School of Professional Studies, the University Libraries, the Searle Center, and AccessibleNU] and the Office of the Provost helped us all pivot from doing all the things that we did in person to doing them remotely. They came up with these training modules and learning communities for those of us who were teaching called the Practicum on the Foundations of Teaching Online [the inaugural University Practicum].

And what was really wonderful about that project was that it was pulling those of us together from across disciplinary areas and fields, and it asked us to think about [teaching online] in one particular course that you're teaching. What are your learning objectives, and how do you scaffold your assignments to do this? And how can we help you? One, is this assignment really preparing somebody to be able to do this thing? And two, how can we help you with the technology, whether it's Panopto or asynchronous learning? What kinds of tools are baked into Canvas, or can we introduce you to in this environment, to help you meet your learning goals? And that's really helpful because there are some things, even after the pandemic, that many of us have kept.

The digital accessibility project is another way of thinking about how you build upon this comfort that you may not have had before with using some things in Canvas. Again, it's taking into consideration the main things you're trying to achieve in this course. And it's very nuts and bolts. You have ten weeks, you have these three units, how are you breaking it apart? The big picture of assessment doesn't work if all of these little pieces don't cumulatively lead up to what it is students should know.

And so that was very helpful because you could be in a breakout group with someone from anthropology, someone from journalism, someone from McCormick, and you're all saying, “oh, in my class I do this exercise where I do these surveys, or I use this pop quiz in this way.” And so again, using folks here as resources and being in community.

It's not just you alone being responsible for measuring and assessing if what you're doing works. Maybe it's implementing some other strategies to make it more effective and then sharing that and being an ambassador for it.

What advice would you give to faculty members or administrators looking to build a strong and lasting culture of assessment within their own schools or colleges?

It's a way of going back to participatory action. What is it we already do that we value that helps us make decisions, inform things, and sustain it? How do we expand it? How do we make the information available? We have a SharePoint site that is updated to make sure that people have access to a lot more of these tools.

Particularly, Searle does a wonderful job in terms of providing outlets for instructors to explore best practices in teaching and instruction and thinking about different modalities. I think part of it is realizing people don't know what they don't know about what's available to be supportive, and then with what we do know, being transparent about how we're using it to make other decisions.

There are so many ways we were already using assessment, and I think what the HLC accreditation process provided was an opportunity, perhaps in a condensed way, to share back out what we do.