Academic Program Assessment Report
Academic Year(s) Assessed: 2024-2025
College: Letters and Science
Department: Mathematical Sciences
Department Head: Elizabeth Burroughs
Submitted by: Elizabeth Burroughs, on behalf of Undergraduate Program Committee
Program(s) Assessed:
List all majors (including each option), minors, and certificates that are included in this assessment:
Minors, Options, etc.
1. Past Assessment Summary.
Response:
Recent assessments have found that 80% or more of our students are meeting our PLOs. For the 2022-23 assessment cycle we used survey data rather than classroom artifacts. In the 2023-24 cycle we returned to using student artifacts from signature assignments to assess PLO 3 and again found that 80% or more of our students meet this PLO. This cycle (2024-25) we return to using a survey, based on feedback from last year's assessment, and we focus on the department's culture, community, and faculty advising.
2. Institutional Assessment Data Request
Based on the rationale on the Instructions page, please review your program learning outcomes (PLOs) and identify whether you have PLOs that address the Core Qualities. There are no right or wrong answers.
Identify 1-2 major-required courses that might have student assignments designed to meet these objectives at least at a surface level. If you cannot identify a course in your program that aligns with this request, please check the appropriate box. At this juncture, this is for information gathering as we plan future institutional assessment endeavors.
We do have major courses that address the core qualities of Thinkers & Problem Solvers and Effective Communicators. Our learning outcomes already refer to these core qualities:
1. Students will demonstrate mathematical reasoning or statistical thinking.
2. Students will demonstrate effective mathematical or statistical communication.
3. Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.
| Core Quality (LOs are Institutional Learning Outcomes (ILOs)) | PLO overlaps with MSU Core Quality (mark X if program has at least one PLO that overlaps with an ILO) | Beginning Level (e.g., CORE Courses: US, W, Q, IN, CS, IA, IH, IS, D) | Developing Level (e.g., one 200- or 300-level course) | Proficient Level (e.g., one 300- or 400-level course, Capstone, Research (R) Core Courses) | Not Applicable (N/A): no course exists in our program that addresses this Core Quality/ILO |
|---|---|---|---|---|---|
| Thinkers & Problem Solvers | X | Core Classes are designed to address an introductory, foundational level of Core Qualities. Some may overlap into the developing level, but most intermediate-to-developing or proficient/mastery level courses will exist within the majors. | M 242 | M 384, M 329, Stat 412 | |
| Effective Communicators | X | | M 242 | M 384, M 329, Stat 412 | |
| Local & Global Citizen | | | | | X |
3. Actionable Research Question for Your Assessment.
Response:
In what ways do the department’s extra-curricular activity offerings and advising practices support students to engage with the discipline beyond the classroom?
4. Assessment Plan, Schedule, and Data Sources.
a) Did you change the previously established Assessment Plan Schedule? If yes, how was it changed?
No. We are gathering indirect evidence (as recommended in last year’s AOC feedback) about how students engage with advising and extra-curricular opportunities as part of our assessment of PLO 3.
b) Please provide a multi-year assessment schedule that will show when all program learning outcomes will be assessed, and by what criteria (data). List your PLOs in full for reference. Add rows as necessary.
| PROGRAM LEARNING OUTCOME | 2023-24 | 2024-25 | 2025-26 | 2026-27 | Data Source* |
|---|---|---|---|---|---|
| 1. Students will demonstrate mathematical reasoning or statistical thinking. | | | X | | M 242 Signature Assignment |
| 2. Students will demonstrate effective mathematical or statistical communication. | | | X | | M 242 Signature Assignment |
| 3. Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling. | X | X | | X | M 384, M 329, and Stat 412 Signature Assignments; recent student survey data |
c) What are the threshold values for which your program demonstrates student achievement? Provide a rationale for your threshold values.
This assessment cycle we are examining qualitative aspects of student participation in our department community and have not set quantitative thresholds. We will use the results of this survey to set thresholds for future surveys.
5. What Was Done.
a) Self-reporting Metric (required answer): Was the completed assessment consistent with the program's assessment plan? If not, please explain the adjustments that were made.
_X_ Yes ___ No
b) How were data collected and analyzed and by whom? Please include method of collection and sample size.
In M 472 (math and applied math majors), Stat 408 (stat majors), M 384 (math and applied
math) and EDU 495 (math ed), we distributed a survey collecting information about
the level to which students participate in department extra-curricular opportunities,
how they engaged with their professors in office hours, and how they viewed the guidance
they received from their faculty advisor. We also distributed the survey to graduating
seniors.
We had 28 students respond via Qualtrics: 7 applied math, 9 math, 5 math teaching, and 2 statistics majors, along with 2 math minors and 3 statistics minors.
c) Please provide a rubric that demonstrates how your data was evaluated.
Because we are using these data to set benchmarks for a rubric, we present the data summary here:
| Do you/did you participate in the following? | Yes | No | I'm unaware of this |
|---|---|---|---|
| Math Club | 5 | 20 | 3 |
| ASA Student Chapter | 2 | 20 | 6 |
| Tails Seminar | 1 | 16 | 11 |
| Putnam Competition | 0 | 14 | 8 |
| Directed Reading Program | 5 | 17 | 6 |
| Data Fest | 3 | 17 | 8 |
| Undergraduate research with a local faculty member | 8 | 18 | 2 |
| External REU | 1 | 16 | 11 |
| Attend a math/stat/teaching conference, e.g., MathFest or MT Educators Conference | 5 | 16 | 7 |
| Modeling Competition, e.g., COMAP | 2 | 14 | 12 |
| Work as a tutor in the MSC | 8 | 19 | 1 |
| Attend department seminars or colloquia | 4 | 20 | 4 |
| Attend department tea time | 6 | 20 | 2 |
| Work on homework in common departmental spaces | 18 | 7 | 3 |
Overall, only 2 of the 28 students answered no or "I'm unaware of this" to every one of these opportunities.
Many students indicated that, in reflecting on their time in the department, they would have been interested in being more engaged, so we see an opportunity to advertise these opportunities in ways that reach more students.
| Over the course of your time at MSU, how often did you use the following? | Often (most weeks) | Sometimes | Rarely/Never |
|---|---|---|---|
| Instructor/TA Office Hours | 3 | 19 | 4 |
| Tutoring in Math/Stat Center | 2 | 9 | 15 |
Overall, only 3 students said they never went to office hours or the MSC; the rest used one or the other or both. The most common reason they selected for attending was that they had homework questions.
The most common reason students gave for not going to office hours was "because the office hours didn't fit my schedule" (21 responses); "I felt intimidated" was chosen 4 times. We see an opportunity for faculty to be more proactive in describing how students can meet with them when scheduled office hours conflict with work or class schedules.
Regarding Advising
In response to the question "Do you feel that at least one faculty member is personally invested in your education?", 17 students responded yes, 6 said maybe, and 4 said no.
6. What Was Learned.
a) Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?
| Indicators | Unacceptable | Acceptable | Excellent |
|---|---|---|---|
| Department support for out-of-classroom opportunities to engage in the discipline | More than 50% of students are unaware of out-of-classroom opportunities | More than 50% of students take advantage of at least one out-of-classroom opportunity | More than 80% of students take advantage of at least one out-of-classroom opportunity |
| Availability of out-of-classroom support for in-classroom learning | More than 50% of students do not attend office hours or tutoring | More than 50% of students attend office hours or tutoring, at least sometimes | More than 80% of students attend office hours or tutoring, at least sometimes |
| Advising Support | More than 50% of students do not identify that at least one faculty member is invested in their education | More than 50% of students identify that at least one faculty member is invested in their education | More than 80% of students identify that at least one faculty member is invested in their education |
b) What areas of strength in the program were identified from this assessment process?
We offer many extra-curricular opportunities and most students participate in at least some way and feel supported by faculty and advisors.
The majority of students express that at least one faculty member is invested in their
future success.
The majority of students make use of either office hours or tutoring.
We work hard to make our corner of Wilson Hall inviting and supportive of studying and learning, and those efforts appear to be successful.
c) What areas were identified that either need improvement or could be improved in a different way from this assessment process?
We could do a better job making all students aware of the many opportunities.
We could do a better job at scheduling office hours by optimizing our availability
alongside students’ availability.
We could do a better job communicating that we are invested in their success.
7. How We Responded.
a) Describe how “What Was Learned” was communicated to the department, or program faculty. How did faculty discussions re-imagine new ways program assessment might contribute to program growth/improvement/innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?
Our undergraduate program committee has one member from each faculty group, and these members discussed the data and findings with their small groups. The results of these discussions have been incorporated in the what was learned section of this report. The entire completed report was circulated to the faculty as a whole.
b) How are the results of this assessment informing changes to enhance student learning in the program?
We are currently implementing M 194, Introduction to Mathematical Sciences, a first-year seminar that resulted from prior assessment findings. We will find ways to address all three of our "needs improvement" areas in that course, since it is an avenue that systematically reaches our first-year students.
c) If information outside of this assessment is informing programmatic changes, please describe that.
Our mathematics education and statistics faculty, through their own scholarship/expertise and that of disciplinary colleagues, are helping to keep department faculty abreast of current evidence-based practices in survey design, instruction, and assessment.
Several of our faculty, from across research specialty groups and including both TT
and NTT, are exploring Artificial Intelligence in mathematics and statistics teaching
through the “AI Teaching Seminar.” The discussion was initiated at our department
retreat, and we expect to formally return to the discussion at next fall’s teaching
retreat.
d) What support and resources (e.g., workshops, training, etc.) might you need to make these adjustments?
None currently, outside of what is already offered by CFE.
8. Closing the Loop(s).
Reflect on the program learning outcomes, how they were assessed in the previous cycle (refer to #1 of the report), and what was learned in this cycle about any actions stemming from the previous cycle.
a) Self-Reporting Metric (required answer): Based on the findings and/or faculty input, will there be any changes made (such as plans for measurable improvements, realignment of learning outcomes, curricular changes, etc.) in preparation for upcoming assessments?
_X_ Yes ___ No
b) In reviewing the last report that assessed the PLO(s) in this assessment cycle, what changes proposed were implemented and will be measured in future assessment reports? What action will be taken to improve student learning objectives going forward?
In the upcoming cycle, we will return to assessing proof-and-proving artifacts from M 242. That course touches on all the learning outcomes, and we can examine how the findings from our prior course-based assessment are reflected or not in our assessment of 200-level coursework. For reference, the prior findings were:
M 329: In the math teaching option, instruction could attend to this issue: students
learn communication strategies in M 242, but don’t seem to carry many of the details
about constructing rigorous proofs into their work in M 329. Instruction should specifically
engage students in this way.
STAT 412: This assessment revealed an opportunity to improve student understanding
on if/where error terms belong in specifications of generalized linear/linear models
(especially for
linear mixed models). In this course and in subsequent coursework instruction should
continue to focus on translating research questions into correctly specified models
including the appropriate error structure and distribution for the response.
M 384: That only 20% of the students were assessed as "proficient" in their proof technique indicates an opportunity for growth in rigorous mathematical communication. In combination with the comments for M 329, this indicates a need for rigorous communication to be further emphasized in the prerequisite courses M 242 and M 383.
c) Have you seen a change in student learning based on other program adjustments made in the past? Please describe the adjustments made and subsequent changes in student learning.
Over the past four years there has been a notable shift among many of our upper division courses toward more active learning opportunities. This has resulted in some students being more comfortable with initiating mathematical and statistical arguments and in being more self-reliant. Our faculty continue to share their ideas about and strategies for success with instruction that goes beyond lecturing.
d) If the program sees anything emerging from this assessment cycle that it anticipates would be a factor or an item of discussion in its 7-year program review cycle, please use this space to document that for future reference.
We expect to examine our advising practices and standards in the 2028-2029 cycle.
