Wednesday, April 2, 2014

Findings from the National Survey of Student Engagement (NSSE)

UMA has participated in NSSE in 2007, 2010 and 2013.

NSSE has identified student engagement as representing two important attributes of quality:
  1. The amount of time and effort UMA students put into their educational activities
  2. How UMA provides opportunities for, and encourages participation in, activities that accumulated research has shown are linked to successful student learning
NSSE provides the opportunity for UMA to compare our students' responses with those of students at self-selected groups of comparison institutions. The survey questions are informed by accumulated research on student learning behaviors associated with successful outcomes of an education.

UMA uses the survey data to identify aspects of the educational experience, inside and outside the classroom, that can be improved through policies and practices more consistent with good practice in undergraduate education.

Click here to read about the survey administration and responses from the Spring 2013 administration. Overall, the survey achieved a 47% response rate, with 508 first-year students and seniors completing the survey.

Executive Summary No. 1 highlights the 5 Strengths and 5 Challenges among the 53 survey questions relative to UMA's selected comparison group.

Strengths: Where UMA Out-Performed Comparison Institutions
  • Quality of interactions with faculty, academic advisors, student services, and other administrative offices (e.g., Registrar, Financial Aid, etc.)
  • Faculty provided prompt and detailed feedback on assignments and were effective in using examples to explain difficult concepts
Challenges: Where UMA Fell Short of Comparison Institutions
  • Discussions with people from different race/ethnic, economic and religious backgrounds
  • Opportunities to collaborate with other students on assignments and explain course material to other students
Over the coming weeks and months, the Office of Institutional Research and Planning will share additional snapshot analyses and a more comprehensive review of progress since the 2007 survey administration.

Source: NSSE. (2014, April 2). About NSSE. Retrieved from http://nsse.iub.edu/html/about.cfm

Thursday, March 20, 2014

A Continued Focus on Student Retention

The process of NEASC reaccreditation asks UMA to share information on student retention appropriate to our mission. Clearly, the traditional student cohort used in calculating retention rates, that is, first-year, full-time associate or bachelor's degree students (i.e., the IPEDS cohort), represents a very small percentage of our 6,500+ student body.

Consequently, in preparing our report we have made a concerted effort to analyze the performance of several different student cohorts (e.g., part-time students), recognizing the value of trend data and the importance these cohorts of students have to our goals for success.

Click here to retrieve a snapshot trend report of student cohort retention rates. 

The report provides a 4-year average of retention rates for new students/admissions attending UMA, including semester-to-semester and annual analyses.

The current cohorts available are:
  • Total New Students/Admissions
    • First-year students
    • Readmitted students
    • Transfer students
  • Full-time or Part-time status
  • Location/Campus: Augusta, Bangor, University College or Online
  • Degree-level
  • Outcomes-Based Funding Programs
  • IPEDS or Traditional Students
  • Transfer Credits: greater than 30 or 90 credits
  • Maine Community College transfer students
As an example:

The middle columns of the report represent the 4-year average performance of Fall semester new students/admissions.

Over the 4 years, UMA enrolled 5,057 new students in the Fall semester, with 3,950 (78%) returning the following Spring semester and 2,809 (56%) returning one year later, in the following Fall semester (first red row in the report).

Additionally, over this same period, UMA enrolled 1,037 new Fall students who transferred credits to UMA from a Maine Community College, with 839 (81%) returning the following Spring semester and 652 (63%) returning one year later, in the following Fall semester (last orange row in the report).
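To make the retention arithmetic concrete, here is a minimal sketch in Python using the two example rows above; the dictionary layout is an illustrative assumption, not UMA's actual data model.

```python
# Minimal sketch of the semester-to-semester and annual retention
# arithmetic described above. Counts come from the two example rows
# in the report; the data structure itself is hypothetical.

cohorts = {
    "All new students/admissions (Fall)": {
        "enrolled": 5057,
        "returned_spring": 3950,     # following Spring semester
        "returned_next_fall": 2809,  # one year later
    },
    "Maine Community College transfers": {
        "enrolled": 1037,
        "returned_spring": 839,
        "returned_next_fall": 652,
    },
}

for name, c in cohorts.items():
    fall_to_spring = c["returned_spring"] / c["enrolled"]
    fall_to_fall = c["returned_next_fall"] / c["enrolled"]
    print(f"{name}: Fall-to-Spring {fall_to_spring:.0%}, "
          f"Fall-to-Fall {fall_to_fall:.0%}")

# All new students/admissions (Fall): Fall-to-Spring 78%, Fall-to-Fall 56%
# Maine Community College transfers: Fall-to-Spring 81%, Fall-to-Fall 63%
```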

Additional views of the data are available by semester, and other cohorts of interest are in development.

Tuesday, March 4, 2014

Graduates Pursuing Higher Education

The landscape of higher education is changing rapidly and disruptively. The NEASC reaccreditation process provides the opportunity for UMA to evaluate our position in this increasingly competitive environment and our performance on the 'attainment goal'.

NEASC has designed the Documenting Student Success Forms (S-Series) for institutions to share data on measures of student success. One measure is the incidence of students pursuing higher learning after graduation. 

Subsequent student enrollment and degree data were provided by the National Student Clearinghouse, allowing the Office of Institutional Research and Planning to query and track post-secondary enrollment and graduation of our alumni.

The primary interest of our analysis was to better understand the percentage of graduates who continue their education at a higher level, the institutions our graduates are attending, and the programs or academic disciplines graduates are enrolled in after graduation.

Click here to retrieve the full snapshot trend report, with segmentation by awards and institutions.

As an example:
In 2009-10 (Summer 09, Fall 09, Spring 10), UMA awarded an associate degree, bachelor's degree, or certificate to 598 students.

As of January 2014, 187 or 31% of the 598 graduates had enrolled at a higher education level and/or graduated with a higher degree/certificate than their UMA award. 

319 of the 598 graduates earned a UMA bachelor's degree in 2009-10. As of January 2014, 54 or 17% had enrolled in and/or graduated with a graduate certificate, master's, or doctoral degree. These UMA graduates attended/graduated from the following institutions:
  • University of Maine, 24% (13)
  • University of Southern Maine, 11% (6)
  • Non-University of Maine System Institutions, 72% (39)
In our analysis, a student who enrolled in or graduated from more than one institution (e.g., University of Maine and Thomas College) is counted once in each applicable category, here both University of Maine and Non-University of Maine System; therefore, the total percentage may be greater than 100%.
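As a sketch of this counting rule, the Python below counts each graduate at most once per category, assuming each graduate's subsequent institutions are available as a list; the sample records and the UMS membership set are illustrative assumptions.

```python
# Sketch of the per-category counting rule described above: a graduate
# who attends institutions in two categories is counted once in each,
# so category percentages can sum past 100%. Sample records are invented.

graduates = [
    {"id": 1, "institutions": ["University of Maine"]},
    {"id": 2, "institutions": ["University of Maine", "Thomas College"]},
    {"id": 3, "institutions": ["Thomas College"]},
]

UMS = {"University of Maine", "University of Southern Maine"}

counts = {"University of Maine": 0,
          "University of Southern Maine": 0,
          "Non-University of Maine System": 0}

for g in graduates:
    attended = set(g["institutions"])   # dedupe within one student
    for inst in attended & UMS:
        counts[inst] += 1
    if attended - UMS:                  # any non-UMS institution attended
        counts["Non-University of Maine System"] += 1

total = len(graduates)
for category, n in counts.items():
    print(f"{category}: {n / total:.0%} ({n})")
# Graduate 2 is counted under both University of Maine and
# Non-University of Maine System, so the percentages exceed 100%.
```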

Furthermore, 221 of the 598 graduates earned a UMA associate degree in 2009-10. 46% of associate degree graduates (101 of 221) continued their education at the bachelor's degree level, with 96% of those continuing at UMA.

Additional reports, including by academic program, are available upon request.

Thursday, February 13, 2014

Improving Institutional Engagement: Don't Know?

[Chart: percentage of 'don't know' responses by category, 2008-09 vs. 2012-13. © University of Maine at Augusta]
UMA's institutional assessment survey, modeled after the Baldrige Performance Excellence Program, helps us look for ways to effectively and efficiently meet our mission and achieve our vision. Our continued NEASC accreditation requires periodic review not only of academic programs but of administrative areas as well. UMA launched this continuous improvement process in 2008-09, and it is linked with UMA's Strategic Plan Key Goal 7.

This analysis follows the category Index Score Comparison posted on January 8, 2014. The survey asked faculty and staff to answer each question with a 'yes', 'somewhat', 'no', or 'don't know' response. The first three answers were used in the Index Comparison analysis, while this analysis focuses on the last answer, 'don't know'.

The chart above shows, for each of the 7 categories, the percentage of respondents who on average replied 'don't know' to the category's questions in 2008-09 vs. 2012-13. In this context, a 'don't know' percentage of 100% would mean that all respondents answered 'don't know' to all questions in the category, whereas 0% would mean that no respondent answered 'don't know' to any question. As an example, while this percentage was higher for some Student, Stakeholder and Market Focus category questions and lower for others, the average percentage responding 'don't know' to a Student, Stakeholder and Market Focus question was 32% in 2008-09 and 31% in 2012-13.

Referring back to the Index Score Comparison, although most index scores improved, there are still comparable proportions of 'don't know' responses between the two surveys. The weighted 'don't know' percentage is critical to the survey analysis because it implies that some faculty and staff did not have sufficient information to respond to certain questions.
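As a sketch of how such a category-level 'don't know' percentage can be computed, assuming responses are coded per question (the sample answers below are invented):

```python
# Sketch: average 'don't know' percentage for one category.
# Each inner list is one respondent's answers to the category's
# questions; the values are illustrative.

responses = [
    ["yes", "don't know", "somewhat"],
    ["don't know", "don't know", "no"],
    ["yes", "yes", "don't know"],
]

n_respondents = len(responses)
n_questions = len(responses[0])

# Per-question 'don't know' rate, then the category average.
per_question = [
    sum(r[q] == "don't know" for r in responses) / n_respondents
    for q in range(n_questions)
]
category_rate = sum(per_question) / n_questions
print(f"category 'don't know' rate: {category_rate:.0%}")  # 44%
```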

With this in mind, as work continues on this year's category, Student, Stakeholder and Market Focus, the Baldrige Improvement Committee is committed to identifying strategies to improve the transparency, communication, and availability of information related to this category and to reduce the 'don't know' percentage.

Monday, February 10, 2014

Online Learning Growing and Among the Nation's Best in Online Education

[Bar graph: student course registrations by primary instruction modality, last four years. © University of Maine at Augusta]
The bar graph above shows changes in student course registrations by each primary instruction modality over the last four years. The bar graph below shows changes in course sections by each instruction modality.

In 2012-13 (i.e., Summer 12, Fall 12, Spring 13), UMA recorded a total of 32,265 student course registrations across 1,678 course sections. In comparison, 30,624 student course registrations were recorded across 1,626 course sections in 2009-10.

Over the last 4 years, the contribution of each instruction modality to total student registrations and course sections has changed with the rapid growth of online learning. Most recently, 32% (up from 20% in 2009-10) of all student course registrations were Online, and 22% (up from 13% in 2009-10) of all course sections offered were Online.

[Bar graph: course sections by instruction modality, last four years. © University of Maine at Augusta]

Early this year, UMA was again ranked among the top institutions in U.S. News & World Report's 2014 Best Online Bachelor's Programs. UMA was ranked 61st out of almost 300 programs listed. Last year, UMA was ranked for the first time, at #103.

As online enrollment continues to grow, these latest rankings of online programs further affirm UMA's role as a longtime leader in delivering top-quality online education.

Tuesday, January 28, 2014

Tracking Course Completion

[Chart: course completion rates by campus and distance education, last 4 years. © University of Maine at Augusta]
Accreditation is a process of self-regulation. It requires UMA to periodically engage in a comprehensive and candid self-study. NEASC has designed the Documenting Student Success Forms (S-Series) for institutions to share data on measures of student success. One measure is course completion rates by campus and distance education (i.e., Online).

On average, the UMA-wide course completion rate is 75%, that is, the percentage of students earning a final grade of A through C-, Pass, or Audit. This analysis considered ALL grades in the denominator, including Incomplete and Withdrawal grades, when calculating the course completion rate.
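A minimal sketch of this completion-rate definition, assuming final grades are available per course registration (the sample grade list and the set of completing grades are illustrative assumptions):

```python
# Sketch: course completion rate as defined above. Completion means a
# final grade of A through C-, Pass (P), or Audit (AU); the denominator
# includes ALL grades, even Incomplete (I) and Withdrawal (W).
# Sample grades are invented.

COMPLETING = {"A", "A-", "B+", "B", "B-", "C+", "C", "C-", "P", "AU"}

grades = ["A", "B-", "W", "C", "I", "P", "D", "F", "AU", "C-"]

completed = sum(g in COMPLETING for g in grades)
rate = completed / len(grades)          # ALL grades in the denominator
print(f"completion rate: {rate:.0%}")   # 70% for this sample
```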

As an example, in 2009-10 (i.e., Summer 09, Fall 09, Spring 10) students successfully completed 74% of Online courses, compared to 73% in 2012-13. A student's course location is defined by the physical location where the student completes the course; that is, a student completing an ITV Broadcast course in Augusta is reflected as Augusta, while a student completing the same ITV course at the Rockland Center is reflected as University College.

This basic, aggregate analysis illustrates that course completion rates are comparable across locations and distance education, and that this resemblance has been consistent over the last 4 years.

Wednesday, January 8, 2014

Striving to Continuously Improve

[Bar graph: category index scores, 2008-09 vs. 2012-13. © University of Maine at Augusta]
UMA's institutional assessment survey, modeled after the Baldrige Performance Excellence Program, was launched in 2008-09 to assess administration and improve performance.

Over this period, UMA has collected feedback from faculty and staff on four occasions. This has resulted in the development and monitoring of action plans, with the last progress report conducted in the Spring of 2012. The institutional assessment process has evolved over this period too: the questionnaire has been modified, the process formalized, and campus participation has increased.

The bar graph above shows each category's 2008-09 vs. 2012-13 index score, the weighted average of the individual questions within each category. Respondents were asked to answer each question with a 'yes' (= 1), 'somewhat' (= 0.5), or 'no' (= 0) response. Therefore, a category index score of 1.0 means that all respondents answered 'yes' to all questions within a category, and 0 would mean that all respondents said 'no' to all questions in a category.
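As a simplified sketch of that coding (averaging across all coded responses rather than the per-question weighting described above; the sample answers are invented):

```python
# Sketch: category index score from coded responses.
# 'yes' = 1.0, 'somewhat' = 0.5, 'no' = 0.0; 'don't know' is excluded
# from the index calculation. Sample answers are illustrative.

CODE = {"yes": 1.0, "somewhat": 0.5, "no": 0.0}

answers = ["yes", "somewhat", "no", "yes", "don't know", "somewhat"]

scored = [CODE[a] for a in answers if a in CODE]  # drops 'don't know'
index_score = sum(scored) / len(scored)
print(f"index score: {index_score:.2f}")  # 0.60
```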

Over time, the Results category saw the largest absolute change in performance: 0.35 in 2008-09 vs. 0.58 in 2012-13. This category provides measures of progress for the evaluation and improvement of processes and for information review and analysis, in alignment with our overall institutional strategy.

Still, there was a large percentage of 'don't know' answers to questions in the survey, which are not included in this index score analysis. As an example, on average 55% of survey respondents (80 out of 144) in 2008-09 vs. 49% (146 out of 300) in 2012-13 answered 'don't know' to each of the questions within the Results category.

The 'don't know' percentages for any one of the Results category questions ranged from 27% to 74% in 2008-09 and from 29% to 62% in 2012-13. This implies that some faculty and staff did not have sufficient information to respond to certain questions. In the next week, I will share with you a summary analysis of the 'don't know' responses across each category index for both years.

Cronbach's alpha was used to determine the internal consistency, or average correlation, of the questions in each category in order to gauge its reliability. In 2008-09, category reliability values ranged from 0.85 to 0.94. In 2012-13, category reliability values ranged from 0.92 to 0.94, except for the Results index category (0.79).

These alpha statistics for all index categories satisfied the generally accepted threshold of 0.70 for a new scale, suggesting that the questions within each category are closely related as a group and measure an underlying construct or category.
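For reference, a minimal sketch of the Cronbach's alpha computation on coded survey items; the score matrix below is invented for illustration.

```python
# Sketch: Cronbach's alpha for one category,
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
# where k is the number of items (questions). Rows are respondents,
# columns are coded item scores (0, 0.5, 1); the data is invented.

from statistics import pvariance

scores = [
    [1.0, 1.0, 0.5],
    [0.5, 0.5, 0.5],
    [0.0, 0.5, 0.0],
    [1.0, 0.5, 1.0],
]

k = len(scores[0])
item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
total_var = pvariance([sum(row) for row in scores])

alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")  # 0.75, above the 0.70 threshold
```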