2024 AIRUM Concurrent Sessions Listing
Friday, November 8, 2024

8:00 am - 8:45
9:00 am - 9:45


8:00 am - 8:45

24. Automating Workflows and Building Data Dashboards

Presenter: Christopher Petrie, Northwestern Health Sciences University

Description: We often engage in repetitive tasks that are structured in nature and follow a series of rule-based steps. These tasks might include sorting and aggregating data from one system to import into another, or using data from one system to create entries in another. They can be labor intensive, with significant amounts of copying and pasting or switching between windows and applications. They can also be tedious and boring, leading to inadvertent errors. These are exactly the sorts of tasks that computer systems handle well. Historically, however, automating this type of work required extensive programming or scripting knowledge to build systems that ingested and manipulated data from otherwise disconnected sources. Today, automated workflow tools bring the ability to perform these tasks to non-programmers, freeing us to devote our time to work that requires more nuance and variability, work that we humans are great at but where computers still often struggle. In addition, once the data is compiled automatically, we can build reporting dashboards that update in real time with the latest data these workflows have added. This workshop will introduce attendees to designing their own automated workflows that connect to data dashboards.

25. Building a Cost-Effective CRM Alternative using Power BI

Presenter: Ian Dahlinghaus, MSOE

Description: Effective management of student relationships is crucial for retention and overall institutional success. This presentation focuses on developing a cost-effective Customer Relationship Management (CRM) system using Microsoft Power BI, tailored specifically for smaller institutions. Traditional CRM platforms can be prohibitively expensive and often come with a range of features that may not align perfectly with the unique needs of some institutions. By leveraging Power BI, institutions can create a customizable and scalable CRM solution that meets their specific requirements without incurring high costs. 

The session will delve into practical steps for integrating data from various sources, including student information systems, surveys, and support services. It will demonstrate how to utilize Power BI's powerful data visualization and reporting tools to monitor and analyze key metrics such as student engagement and support interactions. Additionally, the presentation will cover automation techniques to streamline workflows and improve efficiency. 

By the end of the session, participants will be equipped with the knowledge to develop their own cost-effective CRM solutions, ultimately enhancing their institution's decision-making processes and student relationship management.

26. Analyzing the Instructional Cost of a Degree: A Major-Level Analysis at UW-Eau Claire

Presenters:
Casey Rozowski, UW-Eau Claire
Michael Carney, UW-Eau Claire

Description: Budget challenges in higher education necessitate a deeper understanding of the instructional costs associated with different academic programs. This study at UW-Eau Claire aims to identify and analyze the instructional costs per major, providing insights into the financial demands of various academic disciplines. 

We utilized student data, including the courses taken by graduates from Fall 2017 to Spring 2022 (though some of those courses may have been taken earlier), along with faculty data from the same period. The methodology involved calculating the per-student cost of instruction for each course and estimating the instructional cost per major. This approach allowed us to identify significant cost drivers, such as average class size and faculty workload. 
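The calculation described above can be sketched roughly as follows. This is an illustrative outline only, not UW-Eau Claire's actual model: the data structures, field names, and dollar figures are hypothetical, and a real analysis would draw from institutional course and payroll records.

```python
# Hypothetical sketch: per-student course cost, rolled up to an average
# instructional cost per major. All values are illustrative.

from collections import defaultdict

# Instructional cost and enrollment for each course (illustrative values)
courses = {
    "MATH101": {"instructional_cost": 60000.0, "enrollment": 120},
    "BIOL210": {"instructional_cost": 45000.0, "enrollment": 30},
}

# Courses taken by each graduate, with the graduate's declared major
graduates = {
    "s1": {"major": "Biology", "courses": ["MATH101", "BIOL210"]},
    "s2": {"major": "Biology", "courses": ["BIOL210"]},
    "s3": {"major": "Mathematics", "courses": ["MATH101"]},
}

def per_student_course_cost(course_id: str) -> float:
    """Per-student cost of instruction for one course."""
    c = courses[course_id]
    return c["instructional_cost"] / c["enrollment"]

def cost_per_major() -> dict:
    """Average total instructional cost of courses taken, grouped by major."""
    totals = defaultdict(list)
    for student in graduates.values():
        total = sum(per_student_course_cost(c) for c in student["courses"])
        totals[student["major"]].append(total)
    return {major: sum(v) / len(v) for major, v in totals.items()}

print(cost_per_major())
```

Sensitivity analysis of the kind the authors mention could then be done by perturbing the enrollment or cost inputs and re-running the roll-up.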

Our findings reveal substantial variances in instructional costs across majors, offering insights that can inform strategic budgeting and resource allocation decisions. Additionally, our analysis includes the sensitivity of these costs to variations in the identified drivers, providing a robust model for financial planning. By presenting our methodology and results, we aim to contribute to ongoing discussions on cost management in higher education and provide a model that other institutions can use to analyze their instructional costs.

27. Analyze & Publicize: Validating Strategic Planning Efforts through Data Storytelling

Presenters:
Heather Rondeau, Alexandria Technical & Community College
Nichole Aber, Alexandria Technical & Community College

Description: In an effort to win an award (and money) for the college, the ATCC Director of Research & Institutional Effectiveness and the Director of Marketing & Communications partnered to complete an 8-page narrative about innovation that resulted from a threat or opportunity. We wanted to share the story of adding student success coaches, athletics, and other initiatives of our Strategic Enrollment Management (SEM) plan, and their impact. We had seen increased enrollment and increased retention, so we figured these initiatives had improved student success for Hispanic/Latino students, but we had no concrete evidence of this. Through a quick, 3-day process, we uncovered an amazing and unexpected story that not only validated our college's strategic planning efforts (hiring student success coaches, adding athletics, adding a bilingual admissions/success coach, adding an ELL tutor, etc.) but also validated the work of our newly hired staff. We will talk about buy-in for the initiatives and how this data shaped buy-in for future initiatives and hiring as well.

9:00 am - 9:45

28. When Graphs Go Bad

Presenter: Susan Ray, Iowa State University

Description: Graphs help grab attention and make a point. However, poorly designed graphs can confuse or outright lie. Sadly, bad graphs are easy to create by accident. 

In this tool-agnostic presentation, we will look at:

    • A brief discussion of the science behind how visualizations help build understanding.
    • Real-world examples of bad graphs and how they deceive.
    • The unfortunate results of overvaluing design at the expense of understanding.
    • Simple rules to take your graphs from not bad to really good.

29. More General or More Specific? Implications of Specificity of Race/Ethnicity Disaggregation for Policy and Practice

Presenters:
Peter (Jun) Li, University of Minnesota, TCI
Qian Zhao, University of Minnesota, TCI

Description: Although many studies have examined students' noncognitive and academic outcomes among general race/ethnicity groups (typically American Indian, Asian, Black, Latino, and White), few researchers have tested the source of variance in students' noncognitive measures and education outcomes between more specific national origin groups (such as Somali and Ojibwe). Using data from 157,757 students in the 2019 Minnesota Student Survey (MSS), we measured the variance of students' commitment to learning (CtL) and GPA between 61 general race/ethnicity groups and between 984 specific national origin groups using two-level Hierarchical Linear Models (HLM). Larger differences were found in student CtL and GPA between the specific national origin groups than between the general racial/ethnic groups overall. In addition, complex combinations of identities had a significant impact on the proportion of variance between different national origin groups and between different racial/ethnic groups, although many group sample sizes were relatively small. These results suggest that, to address academic performance gaps as seen in GPA group differences, and to know which groups may need additional support, we may need to focus more on student cultural characteristics and experiences as defined by national origin, rather than on the typical heterogeneous racial/ethnic groups.

30. Closing the Loop: Lessons Learned From a Year of Pocket Surveys 

Presenter: Trina Smith, Luther Seminary 

Description: Adopting the idea of utilizing shorter, more frequent 'micro-surveys' to gather information about the student experience, this past year we implemented a 'Pocket Survey' program at Luther Seminary. One of our aims was to consistently 'close the loop' by providing timely, action-oriented feedback to students in order to encourage participation and increase trust. Our strategy required collaboration with different sponsoring departments and a shared communication plan for delivering timely survey feedback of various kinds to students and staff. 

In general, the Pocket Surveys were popular with both students and staff, but the lessons we learned during this first year inspired us to make significant changes for the second year. In this presentation, we will outline the ways in which our strategy for closing the loop was successful and other ways in which it was not. Some of the issues we address include collaboration with sponsoring departments, data transparency, participation patterns, question formats, survey length, scheduling/frequency, and forms of feedback/communication. We believe the lessons we have learned would be helpful for others attempting to implement similar kinds of student experience surveys.

31. Essential Metrics—and a Framework—for New Program Identification and Selection

Presenters:
Dr. Seth Houston, UQ Solutions
Dr. Cheryl Norman, Central Lakes College

Description: Many institutions see new programs as a pathway to enrollment growth. And sometimes they are. But how can you know which programs will drive growth at your institution, and which will fall short of expectations? In this workshop, we will share essential metrics, a framework, and a recommended process for identifying new program opportunities and selecting those that will drive market- and mission-aligned enrollment growth at your institution. Along the way, we will share practical advice from the experience of Central Lakes College, which recently completed a new program selection process. 

We will start with goal setting, process design, and market definitions. Then, we will walk you through processes for identifying potential programs to consider and assessing their likely performance at your institution. Finally, we will discuss strategies for planning, final decisions, implementation, and ongoing assessment. Throughout, we will share the metrics—from student demand to success among peers, competition, and workforce need—that we have found to be most critical at each stage of the process.

Participants will receive materials that they can bring back to their home institutions to help them develop their own frameworks for effectively identifying, and then launching, new programs that drive enrollment and revenue growth.