What Administrators Don’t Get About Data
For campus leaders, too much information is part of the job. Here’s how to get better at collecting and using it.
Data is everywhere. It permeates our lives. And higher education, like any economic sector, has droves of seemingly objective statistics ripe for analysis. Yet too many campus leaders have trouble discerning useful data from the kind that is vague, irrelevant, or even specious.
I’m a dean now, but as a social scientist I was trained extensively in data — how to collect it, how to design surveys, how to ensure validity and reliability, and how to interpret it accurately. It’s that last one that poses the biggest challenge for campus leaders. Most administrators recognize data as a valuable argumentative lever to pull. But they tend to clamor for the “right” data, or they fall into confirmation bias by looking only for information that supports whatever claim they’re trying to make. Data is like a sharp knife in a kitchen. In the right hands it can be a highly useful tool. Handle it carelessly, and you might cut yourself or someone else.
I consider myself a data lover. I was drawn to the field of experimental psychology in my first statistics course, when my professor, a quantitative psychologist, explained linear regression. At once I understood, and was thrilled by, the idea that we could, within a range of error, predict human behavior with data. It has become a lifelong passion. I continue to use data in almost every aspect of my leadership and managerial career in higher education.
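For readers who never took that statistics course, the idea is easy to see in a few lines of code. This is a minimal sketch, not a real study: the "hours studied versus exam score" numbers below are invented purely for illustration, and the residual standard error stands in for the "range of error" around any prediction.

```python
import numpy as np

# Hypothetical data: hours studied vs. exam score for ten students.
hours = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
score = np.array([52, 55, 61, 58, 67, 71, 70, 78, 82, 85], dtype=float)

# Ordinary least squares: score is approximately slope * hours + intercept.
slope, intercept = np.polyfit(hours, score, 1)

# Residuals: how far each observed score falls from the fitted line.
predicted = slope * hours + intercept
residuals = score - predicted

# The "range of error": residual standard error of the fit
# (n - 2 degrees of freedom for a two-parameter line).
rse = np.sqrt(np.sum(residuals**2) / (len(hours) - 2))

print(f"predicted score for 6.5 hours of study: "
      f"{slope * 6.5 + intercept:.1f} plus or minus {rse:.1f}")
```

The point is not the arithmetic but the shape of the claim: the model predicts, and the error term says how much the prediction should be trusted, which is exactly the habit of mind this essay argues administrators need.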
But you don’t have to be a data lover to get better at using it. By taking the following steps in project planning, you should be able to gather meaningful data, grasp what it means (and what it doesn’t), and understand how to present it to the right audience.
Step 1: Know what you want to know. If that seems obvious, let me just note that overreach is one of the most common mistakes made by administrators dipping a toe into the world of data-informed decision making. When you move up the management ranks and acquire new authority, you naturally want to know more: to analyze everything, everywhere, all the time. But you don’t have the bandwidth to do that, and neither do your direct reports, who will end up doing the legwork to produce data you don’t really need and will never use.
No doubt you have a shortlist of key priorities, projects, and areas for improvement. Use that list to guide your data requests. For each item on your shortlist, compile key performance indicators — specific things you want to know.
For example, if you want to know about student attitudes toward your college’s gen-ed courses, you don’t need to spend time measuring their academic performance in those classes. What you want to know is how they feel. Plenty of students do well in courses they don’t particularly like, and vice versa. The focus of this scenario is on student perceptions, not performance, so don’t muddy the waters by asking for data about the latter.
With your newly narrow list of things you want to know in hand, you can move to the next step: What data can realistically be collected, and how?
Step 2: Lean on experts for design, collection, and analysis. Administrators have a wide range of skill sets and backgrounds, but only a small subset of us learn — at a doctoral level of training — the technical aspects of survey creation, experimental design, and data analysis.
Don’t let relative inexperience scare you away from using data or approaching the appropriate campus offices for help. Most institutions have dedicated departments for institutional research, continuous improvement, or some variation. Additionally, most have faculty experts in the disciplines of applied statistics, experimental design, and survey creation from mathematics, business, and social-science departments, among others.
Bring those in-house experts into the conversation in your office, and share your data wish list. An institutional-research department will have a very firm idea of the nuances of data collection and what’s possible. Ask questions such as “How would you gather this data?” or “What is the best way for me to get this information?” Then it will be your turn to field questions from the experts about what you really want to know. Your answers will help institutional-research staff and other data experts narrow down the possible methods, frame the questions even more specifically, and get you the feedback you’re after.
On many campuses, this kind of inquiry is stated in the institutional-research department’s mission, so reach out. Part of its mandate is to do precisely the kind of work you are soliciting.
Step 3: Decide what to share. At some point you are going to begin receiving data, whether raw and unanalyzed or compiled into a full report by a faculty member or by a staff member in institutional research. It is easy to be overwhelmed to the point of inaction (“What do I do with all of this stuff now?”), or tempted to trumpet every single finding (patting yourself on the back for how smart you were to ask the right questions).
Once you begin receiving data, it is time to look back at your list of priorities and what you wanted to know. Think critically about why you wanted to know those things. It’s very likely that the “why” is rooted in solving some problem, for you, for students, for faculty and staff members, or some combination. The “why” should help narrow your focus on the next step: how to share the results.
Even if you are collecting the data for your own knowledge and edification, you will probably want to share some results with a small group of people (e.g., your leadership team, your supervisor). Keep in mind that, while most of those folks are highly intelligent, and many have research backgrounds and doctorates, they already have a lot to read. Share too much information, and people may miss the message you want them to focus on.
Consider these steps as you reveal the findings:
- Try to condense the data you’re sharing down to key points. Constantly refer back to your priority list. Recall the core reason why you wanted to know something, and then convey the core back to your target audience.
- Remember that your audience — whether it’s a few senior leaders, a group of faculty senators, or the entire student body — needs to hear only what you intend for them to hear. I’m not arguing in favor of opacity or deception. However, the purpose of the sharing (whether it happens via a meeting, a blog post, or a one-on-one conversation) is to convey a core message that refers back to your original list.
- Always be willing to share the full information with everyone, perhaps even before a planned meeting.
Step 4: Understand the data’s limitations. I see this as the most important step in the process — making sure you and your audience know what the data doesn’t mean.
In my doctoral program, we were taught always to write a “limitations” section at the end of an experimental paper. When I first heard about that tradition, I thought: “But aren’t we just telling everyone why our study is garbage?” And in a way, we were, and still are. But as I learned later on, the limitations section is an important component of the scientific enterprise. It builds trust in the researcher and demonstrates transparency about both what the paper claims and what it doesn’t.
Perhaps as administrators, after presenting any data, we, too, should present a limitations section that explains: Here’s what the data may not mean. Whether you do that in a tongue-in-cheek way in a PowerPoint or in a straightforward acknowledgment to your audience, the point is to spell out the potential problems with your data.
To do that effectively, spend some time reflecting on what a reasonable person in your target audience might infer from your data. Put yourself in the position of your listeners or readers and consider not what you want them to get from the information but what they might actually take away from it. That can be a difficult exercise, so you may want to enlist a small focus group drawn from your target audience. Make your case to that small group, and get a sense of their key questions about your results. Any incorrect inferences they make can help you fine-tune your presentation and be clearer in sharing your data with the full group.
But, you may be thinking: Why would I want to undermine my own case by underscoring the limitations of my results?
Sure, acknowledging the weaknesses of your data can detract from the strength of your argument. But doing so will earn you more credibility as a leader among faculty members, who were also trained to evaluate arguments critically. I recall being a young faculty member and feeling embarrassed for one administrator who had severely misjudged her audience and its ability to see through the facade to the many holes in her argument. By being upfront about data and its value, including its shortcomings, you steal the thunder of counterarguments and show that you are on top of the issue.
Data has tremendous potential to improve the quality of higher education, as well as the lives of our students and faculty and staff members. But you, as an administrator, can realize its full potential only if you use data in a thoughtful, ethical, and responsible way.