Dr. Ben Butina interviews Dr. Logan Watts, Assistant Professor of Psychology at the University of Texas at Arlington, about his background and research. They discuss the concept of “side effects” in organizational interventions, which are unintended negative consequences that may occur as a result of a treatment or intervention. They also explore the distinction between side effects and primary effects, which are the intended consequences of an intervention. Dr. Watts shares an example of how he and his colleagues discovered negative side effects of an assessment tool they created: a negative correlation between how negative the feedback was and how much participants liked the tool. He also addresses the challenges in detecting side effects in the workplace, such as collecting data and disentangling the effects of an intervention, and suggests that organizations may already be collecting a lot of relevant data that could be used to detect side effects.
- Faculty Page: Logan Watts, Ph.D.
- PELICAN Lab
- Side Effects Associated with Organizational Interventions: A Perspective
Logan L. Watts, Ph.D. is an industrial-organizational (I-O) psychologist and an Assistant Professor of Psychology at The University of Texas at Arlington, where he teaches graduate courses in Leadership, Organizational Behavior, and Applied Research Methods and supervises research in the PELICAN Lab. The PELICAN Lab generates evidence-based insights on the psychology of ethics, leadership, innovation/creativity, and narratives (PELICAN) in order to help managers and employers make smarter decisions. He has published over 50 peer-reviewed articles and book chapters on these topics in top outlets such as Journal of Applied Psychology, The Leadership Quarterly, Journal of International Business Studies, and Journal of Business Ethics (Google Scholar citations = 1,800; h-index = 22). In 2021, he published the book Ethics Training for Managers: Best Practices and Techniques to synthesize and translate best practices in ethics training and education. Dr. Watts serves as an Associate Editor of Creativity Research Journal and volunteers as a member of SIOP’s Committee for the Advancement of Professional Ethics.
This transcript is AI-generated and may not be completely accurate. Please do not quote me or any of my guests based on this transcript.
[00:00:00] Ben Butina, Ph.D.: Hello and welcome to the Department 12 Podcast where we talk about everything I-O psych. I’m your host, Dr. Ben Butina, and joining me today is Dr. Logan Watts. How’s it going today, Logan?
[00:00:11] Logan Watts, Ph.D.: Great. Thanks so much for having me, Ben.
[00:00:14] Ben Butina, Ph.D.: Thanks for being here. So Logan is an assistant professor of psychology at the University of Texas at Arlington, where among other things, he supervises research in the Pelican Lab. Let’s start with where you grew up.
[00:00:27] Logan Watts, Ph.D.: Sure. So I grew up in Georgetown, Texas, which is about 20 miles north of Austin. I spent most of my childhood through high school there, and then did my undergraduate work in Abilene, Texas, at Abilene Christian University.
[00:00:43] Ben Butina, Ph.D.: At what point in this journey did you find out about I-O psychology?
[00:00:47] Logan Watts, Ph.D.: It was while I was an undergrad, and I want to say I was a junior, before I had ever heard of it. I’ve noticed this is a theme among some of the guests you’ve interviewed.
[00:00:58] Ben Butina, Ph.D.: I’m not shocked. In fact, there are [00:01:00] people who are probably like, wait a minute, didn’t I already hear this episode? No, that’s right. I promise you this is a new episode.
[00:01:05] Logan Watts, Ph.D.: Yeah, it’s a relatively small school. We have about 5,000 undergrads and the psychology department was very focused on counseling and clinical professions of psychology at the time. So I don’t think we even touched on I-O psychology as a topic in the intro course or anything like that.
It wasn’t until I took one of those “discover your career” type classes in psychology, where I did a whole bunch of self-assessments, and they exposed me to O*NET. I just remember going on and looking up all these different types of careers that I could do with an undergrad in psychology. And that’s where I found out about I-O psychology.
[00:01:41] Ben Butina, Ph.D.: You have published very extensively, over 50 peer-reviewed articles and book chapters, and I’ll link to your faculty page because I think listeners will want to check that out.
But the article I wanted to talk to you about today is one that you co-wrote with Bradley Gray and Kelsey Medeiros around March of last year: Side Effects Associated with Organizational Interventions: A Perspective. And this was in the journal Industrial and Organizational Psychology.
When I think about a side effect, I think about a medication that I take to help me sleep, but it upsets my stomach. That’s a side effect, because that’s not what I wanted to happen. Is it the same idea for organizational interventions?
[00:02:27] Logan Watts, Ph.D.: Yeah, that’s exactly right. Anytime somebody experiences a negative event that is unintended and that’s associated with some kind of treatment or organizational intervention, that’s how we defined a side effect within I-O psychology.
[00:02:43] Ben Butina, Ph.D.: Okay. So the intention does matter. It has to be an unintended effect.
[00:02:49] Logan Watts, Ph.D.: That’s right. Yeah. It has to be an unintended effect to be considered a side effect. When it’s an intended effect, we call those primary effects in the paper, to kind of distinguish the two.[00:03:00]
[00:03:01] Ben Butina, Ph.D.: I’m about to ask a question that may be really, really stupid, but are there any instances where an effect is both intended and negative? I’m thinking of something like, you know, I go to the gym and my trainer prescribes me this workout, and as a result I have this muscle soreness. That’s negative for me, but it was intended; I’m supposed to get sore because it was a good workout.
Is there anything equivalent to that in our world?
[00:03:29] Logan Watts, Ph.D.: Huh, that’s actually one I haven’t really thought through. I don’t know if I would agree with you that the soreness is a negative, though, because, yes, it’s painful, but you probably process it and frame it as a positive kind of pain that’s actually doing good things for you.
I don’t know. I guess some people could go to the gym and [00:04:00] work out and get really sore and completely regret their decision to go to the gym and never go back.
[00:04:06] Ben Butina, Ph.D.: No, I don’t completely reject it, but I know what the soreness means, you know? There are little micro-tears in the muscle, and that’s what’s building back.
So intellectually I know that, but I experience it negatively, right? In the same way that, you know, with someone getting feedback, the intention is to get them to change their behavior in some way, but we know that by sharing this feedback, they’re likely to feel some negative emotions.
[00:04:30] Logan Watts, Ph.D.: Yeah, that’s a great example.
I actually have a story I can share really briefly about that. For a time after grad school, I started an ethics consulting company with some of my colleagues. One of the things we did is we built an assessment that was designed to help people learn about the unique cognitive biases
they have, kind of their blind spots, and then give them feedback and help them learn from that and grow. One of the things we discovered early on was that after people took our assessment [00:05:00] and received that feedback, even though we took a lot of steps to be careful about how we framed everything so as not to make people feel bad, we found, when we measured people’s reactions to the tool, a very strong negative correlation between
how people scored on that assessment, kind of how negative that feedback was, and how much they liked the tool. It was something like a negative .50 correlation. The people who probably need to hear the feedback the most are the most likely to reject it. At least that’s what we found in the
particular sample that we assessed.
[00:05:42] Ben Butina, Ph.D.: Anecdotally? Yeah, every time. Okay. So a side effect we can think of as an unintended negative consequence of a planned intervention. We throw that word “intervention” around a lot, but how do we mean it in this sense?
[00:05:59] Logan Watts, Ph.D.: Yeah, [00:06:00] so we use the term pretty broadly. It can be any kind of planned change: a policy, a program, or different procedures that are intended to affect employees’ behaviors, attitudes, et cetera. So it can mean any number of things, ranging from, you know, leadership development to personnel selection, to culture change interventions, to new HR policies around flex work, things like that.
[00:06:29] Ben Butina, Ph.D.: When I think about side effects in the pharmaceutical context, I know that they run these trials, and they ask you to report anything unusual that happens in your body.
When I think about our field, I’m struggling to figure out how do we do the equivalent of that? We can’t possibly just say, tell us everything about your behavior during the time of the intervention, and then look for patterns. So how do we come across side effects to begin with?
[00:06:56] Logan Watts, Ph.D.: Yeah, great question.
And there are a lot of [00:07:00] challenges to thinking about how to detect these, practically speaking. But one thing I would say is that a lot of organizations are already doing quite a bit of work to collect continuous information about their employees’ attitudes and health and behaviors and things like that.
Not just through, you know, annual or biannual culture surveys, engagement surveys, things like that. You’re seeing some companies even go to much greater lengths to track more daily, weekly, monthly behaviors and attitudes.
[00:07:32] Ben Butina, Ph.D.: So am I supposed to be scared at this point? Because I am, a little bit.
[00:07:40] Logan Watts, Ph.D.: I’m not necessarily advocating that organizations should be collecting data on every facet of employee wellbeing and performance all the time just so that we could detect side effects. But I think that a lot of organizations are tracking a lot of that information already.
So at least [00:08:00] in those cases, the data is there if you wanted to try to disentangle things, using a quasi-experimental approach where we know when the intervention was introduced or when it was taken away, when the policy was implemented or when the policy was removed,
and we could correlate that with changes across these different attitudes and behaviors.
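The quasi-experimental idea Dr. Watts sketches here, knowing when an intervention was switched on or off and comparing tracked metrics around that point, can be illustrated with a minimal pre/post comparison. Everything below is hypothetical (a made-up weekly engagement metric and rollout week), just to show the shape of the check:

```python
# Minimal sketch of a quasi-experimental (interrupted time series) check:
# compare a continuously tracked metric before vs. after a known
# intervention date. All numbers here are invented for illustration.
from statistics import mean

# Weekly engagement-survey averages (1-5 scale).
weekly_scores = [4.1, 4.0, 4.2, 4.1, 4.0,   # before the policy change
                 3.6, 3.5, 3.4, 3.5, 3.3]   # after the policy change

INTERVENTION_WEEK = 5  # hypothetical rollout week

before = weekly_scores[:INTERVENTION_WEEK]
after = weekly_scores[INTERVENTION_WEEK:]

shift = mean(after) - mean(before)
print(f"mean before: {mean(before):.2f}")
print(f"mean after:  {mean(after):.2f}")
print(f"shift:       {shift:+.2f}")  # a sustained drop flags a candidate side effect
```

In practice one would add a comparison group and a proper significance test, but the point stands: data many organizations already collect can support this kind of retrospective check.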
[00:08:23] Ben Butina, Ph.D.: Can theory be a useful guide in identifying potential side effects? In other words, rather than taking this huge lump of data and looking for a pattern after the intervention, can we say that, based on what we know about goal-setting theory or whatever, we could expect this side effect to emerge?
[00:08:44] Logan Watts, Ph.D.: I think the paper comes across with a pessimistic tone about the promise of theory in terms of helping us predict where these side effects are going to be.
The way that I conceptualize the detection of side [00:09:00] effects is that very often it is inductive, kind of post hoc. A lot of the time that’s because these are things that we didn’t intend, and they also tend to be things that we did not anticipate. So it’s not even on our radar until you start seeing examples of the intervention not working out, or lots of complaints coming from the intervention,
and then those stories spreading across the field and getting reported across lots of different anecdotal situations. Which, if you think about it, is not much different from how side effects are detected in the medical world, where a lot of the time it is through reporting after the fact.
[00:09:38] Ben Butina, Ph.D.: Can you provide us another example or two of, side effects?
[00:09:42] Logan Watts, Ph.D.: The example that got me started with being interested in this area is goal setting. While I was a graduate student at Oklahoma, I was assigned to read [00:10:00] this paper by Lisa Ordóñez and colleagues that came out in 2009, called Goals Gone Wild. It was an intriguing paper that reviewed
several studies and found that goal setting can backfire. Even though we know that it’s one of the most proven techniques in the organizational sciences for improving task motivation and focus, it comes with certain side effects, like people tending to completely ignore
behaviors that you don’t incentivize or that you don’t clarify are part of the goal. So a practical example of this is the Wells Fargo scandal. News broke about this around 2016: basically, customer service people working for Wells Fargo created, I believe it was, over 2 million fake bank accounts.
[00:10:48] Ben Butina, Ph.D.: Wait, let me see if I can guess. Yeah, bank accounts. Okay, so they were being incentivized to open new accounts, so they found some way to create lots of fake accounts, [00:11:00] get a bonus, and then somehow deactivate them.
[00:11:02] Logan Watts, Ph.D.: That’s right. And it wasn’t just employees finding a way in. In some branches, they even found that managers were training their employees to create the fake accounts, because their incentives were tied to it as well.
If you want a more famous paper about goal setting gone wrong, there’s the Kerr paper from, I think it was, 1975, the management classic On the Folly of Rewarding A, While Hoping for B. That brings up all kinds of other examples of how, when you set goals, and especially when you tie valued incentives to those goals, you have to watch out for unanticipated consequences.
Because it really is going to motivate people, but there’s a cost to it as well.
[00:11:41] Ben Butina, Ph.D.: I think the fact that you started describing the situation at Wells Fargo and, based on what you said, I was able to guess what happened from the incentives suggests that people really can predict this stuff in some cases.
Do you recommend sitting down when you’re planning a new intervention and saying, okay, [00:12:00] what else could this affect?
[00:12:02] Logan Watts, Ph.D.: Yes, that’s exactly right. And actually, one of our anonymous reviewers, when we submitted the paper, prodded us to think more along those lines in terms of, practically, how would we implement this?
And so we came up with this figure that shows all these various decision points that you might consider as a researcher or as a practitioner as you’re preparing to either plan an intervention or implement one, to think about on the front end what possible side effects might result.
And in a lot of cases, we might not have strong theory to inform what those would be. So you have to create your own; you have to use your imagination when there isn’t a strong literature telling you exactly what’s going to happen.
And I think if you use your imagination, you can come up with some ideas, and I don’t think they’re going to be bad ideas. They’re probably going to be useful things to track.
[00:12:55] Ben Butina, Ph.D.: Absolutely. I think you might be the first [00:13:00] guest in all the years I’ve been doing this show who said, “I got feedback from peer review, I thought it was good feedback, so I incorporated it, and it made the paper better.” I feel like I should be dropping balloons and confetti on you or something right now.
[00:13:17] Logan Watts, Ph.D.: You know, I’ll take that. When I get negative reviews, I cower and don’t look at them for several weeks. I have to pick myself back up off the floor.
[00:13:29] Ben Butina, Ph.D.: Well, I appreciate you sharing that too, because it’s one of those things that people don’t talk about enough, and you don’t realize, hey, other people feel that way too. One of your other suggestions is to advertise known side effects, just like medications do.
What would that look like in your mind? How could we do that?
[00:13:48] Logan Watts, Ph.D.: That’s a great question. That’s actually the recommendation that we got the most pushback on, in terms of, is this realistic? How would this work? You know, I accept that, yeah, there are a lot of [00:14:00] unknowns there.
One of the big challenges is that we don’t have standardized treatments in I-O psychology like they do in medicine, right? If the bottle says acetaminophen on it, you’re getting acetaminophen. It doesn’t matter which pharmacy you go to, doesn’t matter what brand you buy it under.
With I-O psychology, we use labels for things really loosely. When I’m proposing to implement a new leadership training program at a company, and I call it, let’s say, charismatic leadership training,
that can look very different, right? There are different models out there for what charismatic leadership is, and in fact, the differences could be critically important, because some models of charisma suggest that if the leader isn’t moral and holding up these kind of socialized values, then they’re not charismatic, whereas other models suggest, no, charismatic leadership is value-neutral.
You don’t necessarily have to be moral to be a charismatic leader.
[00:14:59] Ben Butina, Ph.D.: [00:15:00] That’s a pretty big gap.
[00:15:01] Logan Watts, Ph.D.: Yeah, I think so.
[00:15:04] Ben Butina, Ph.D.: When I look at the ingredients on food or medicine, or the side effects on medicine, it’s just printed right there on the bottle. So I wonder what the bottle is for us. When you think about doing this, do you think of it as a centralized database, or maybe an ethical obligation on the practitioner to share side effects?
[00:15:23] Logan Watts, Ph.D.: Something along those lines. Some kind of public database that can be updated as practitioners and researchers working in these areas get new information.
[00:15:34] Ben Butina, Ph.D.: It was great talking to you. I really appreciate you coming on the show.
[00:15:37] Logan Watts, Ph.D.: I really appreciate it, Ben. Thanks a lot.