Assessment Report

Academic Year(s) Assessed: 2023-2024
College: Letters and Science
Department: Mathematical Sciences
Department Head: Elizabeth Burroughs
Submitted by: Ryan Grady, Associate Professor of Mathematics

 

 

Program(s) Assessed: 

List all majors (including each option), minors, and certificates that are included in this assessment:

Majors
       Mathematics - Math Option
       Mathematics - Applied Option
       Mathematics - Teaching Option
       Mathematics - Statistics Option

Minors, Options, etc.
       Math Minor
       Statistics Minor

 

1. Past Assessment Summary.


  Recent assessments have found that 80% or more of our students are meeting our PLOs. For the 2022-23 assessment cycle we attempted to use survey data to address the ARQ "What can we learn from students' perceptions of our programs regarding content, rigor, support, and preparation for future goals?"
  This cycle (2023-24) we are returning to using student artifacts from signature assignments to assess PLO 3.
  While our first attempt at using survey data was a useful first step, following the AOC's recommendations we will need to do the following for the 2024-25 assessment cycle:


       1. Develop an action research question that is more specific to PLO3.
       2. Develop a rubric and/or a coding scheme to more objectively analyze survey data.


  Both tasks are being undertaken by the current Undergraduate Program Committee within the
department in preparation for the next assessment cycle. 

2. Action Research Question.

Can students demonstrate a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling?

3. Assessment Plan, Schedule, and Data Sources.

       a)  Please provide a multi-year assessment schedule that will show when all program learning outcomes will be assessed, and by what criteria (data).

Assessment Planning Chart (2023-24 through 2026-27)

1. Students will demonstrate mathematical reasoning or statistical thinking.
       Assessed: 2025-26.  Data source: M 242 Signature Assignment.

2. Students will demonstrate effective mathematical or statistical communication.
       Assessed: 2025-26.  Data source: M 242 Signature Assignment.

3. Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.
       Assessed: 2023-24 and 2026-27.  Data source: M 384, M 329, and Stat 412 Signature Assignments.
       Assessed: 2024-25.  Data source: Recent graduate survey data.

 

       b)  What are the threshold values for which your program demonstrates student achievement?

Threshold Values

1. Students will demonstrate mathematical reasoning or statistical thinking.
       Threshold value: 70% of assessed students score acceptable or proficient on the scoring rubric.
       Data source: Not assessed this cycle.

2. Students will demonstrate effective mathematical or statistical communication.
       Threshold value: 70% of assessed students score acceptable or proficient on the scoring rubric.
       Data source: Not assessed this cycle.

3. Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.
       Threshold value: 70% of assessed students score acceptable or proficient on the scoring rubric.
       Data source: M 384, M 329, and Stat 412 Signature Assignments.

 

4. What Was Done. 

       a)  Self-reporting Metric (required answer): Was the completed assessment consistent with the program's assessment plan?  If not, please explain the adjustments that were made.

                                         _X_  Yes               ___ No

       b)  How were data collected and analyzed and by whom?  Please include method of collection and sample size.

The Undergraduate Program Committee is responsible for annually assigning a program assessment task force. Members of the task force will be the two most recent faculty members to have taught the course in question; if they are not available, the Department Head will make a suitable alternate appointment.

           Megan Wickstrom and Elizabeth Burroughs assess M 329.

           Ryan Grady and Blair Davey assess M 384.

           Katie Banner and Mark Greenwood assess Stat 412.

The assessment task force will select the signature assignments to assess from the bank of signature assignments for each course. The bank is initially populated with the signature assignments that have been used in the past five years and will be updated by the committee as necessary, based on results of the assessment.

The task force will determine whether to assess a census of the assignments from Math/Stat majors/minors in the course, or whether to assess a random selection. Where possible, a minimum of 10 student assignments should be assessed for each course. For this assessment cycle:

                Only 3 Math Teaching option students enrolled in M 329, so we used a census.

                Only 5 Stat option students enrolled in Stat 412, so we used a census.

                Approximately 15 Math and Applied option students enrolled in M 384, so we used a random (blinded) selection of 10.

       c)  Please provide a rubric that demonstrates how your data were evaluated.

 
M 329 Rubric

Outcome 3: Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.

Unacceptable: Displays limited or inappropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is not able to create an example to investigate the claim.
       Proof: Student cannot prove the claim.

Acceptable: Adequately displays appropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is able to create specific examples to support claims made in their argument.
       Proof: Student has not clearly stated which definition they are using, OR they use a naïve definition, such as "four congruent sides and four congruent angles," but has the outline of an argument.

Proficient: Displays thorough and appropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is able to create examples and counterexamples and generalizes from those examples.
       Proof: Student has clearly stated which definition of square they are using, and either they discuss/choose a minimal definition (e.g., an equiangular quadrilateral with two congruent adjacent sides) OR they produce a proof nuanced for mathematical knowledge for teaching.

 

STAT 412 Scoring Rubric: Criteria for demonstrating understanding. Each student is assessed on specifying appropriate methodology (a-d) and executing the specified methodology (e-f) for each of three scenarios.

  (a) The distribution of the response is appropriate given the scenario, and the proposed model reflects the measurement structure of the design (i.e., a linear mixed model is appropriately specified if there are repeated measures in the design).
  (b) The link function matches the choice of distribution (dependent on the choice in (a), even if (a) is incorrect).
  (c) The systematic component accurately reflects the research question (i.e., is additive or interactive where appropriate).
  (d) All variables are defined completely.
  (e) The R code is consistent with the model specified in the answer (even if parts of (a)-(d) are incorrect).
  (f) The R code runs.

 

Students achieving a score of 2 or 3 in at least 2 scenarios demonstrate acceptable understanding of the program assessment outcome; students achieving a score of 3 in at least 2 scenarios demonstrate proficiency in the assessment outcome. The outcome was assessed in two areas: specification and execution.

Outcome 3: Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.

Unacceptable (1): Displays limited or inappropriate reasoning strategies in the statistical content focus, including how to estimate parameters in statistical models from data in R.
       Specifying appropriate statistical methods/models: Missing more than 2 elements of (a)-(d) in the answer specification; specifically, missing on both (a)-(b) and (c)-(d).
       Execution/modeling: R code is inconsistent with the model specified in the answer and does not run; missing on both (e) and (f).

Acceptable (2): Adequately displays reasoning strategies in the statistical content focus, including how to estimate parameters in statistical models from data in R.
       Specifying appropriate statistical methods/models: Consistently correct choice of (a) and (b), but issues with (c) and (d), or vice versa, in the answer specification.
       Execution/modeling: R code is consistent with the model specified in the answer (e); missing on just (f).

Proficient (3): Displays thorough and appropriate reasoning strategies in the statistical content focus, including how to estimate parameters in statistical models from data in R.
       Specifying appropriate statistical methods/models: Consistently correct choice of ALL of (a)-(d) in the answer specification.
       Execution/modeling: R code is consistent with the model specified in the answer (e) and runs (f).
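
As an illustration only (not drawn from an actual signature assignment), the following R sketch shows the kind of specification and execution that criteria (a)-(f) describe: a binomial response with repeated measures, an interactive research question, and matching code. The data set, variable names, and model here are hypothetical.

    # Hypothetical data: 12 trays, each observed under 4 moisture x light
    # combinations (repeated measures on tray).
    library(lme4)
    set.seed(1)
    dat <- expand.grid(tray     = factor(1:12),
                       moisture = c("low", "high"),
                       light    = c("shade", "sun"))
    dat$germinated <- rbinom(nrow(dat), size = 1, prob = 0.6)

    # (a) Binomial distribution for a 0/1 response; the random intercept for
    #     tray reflects the repeated-measures structure (a generalized linear
    #     mixed model).
    # (b) Logit link matches the binomial choice.
    # (c) moisture * light encodes an interactive research question.
    # (d) Variables are defined above.
    # (e)-(f) The call mirrors the specified model and runs.
    fit <- glmer(germinated ~ moisture * light + (1 | tray),
                 family = binomial(link = "logit"), data = dat)
    summary(fit)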

 

 
M 384 Rubric

Outcome 3: Students will develop a range of appropriate mathematical or statistical methods for proving, problem solving, and modeling.

Unacceptable: Displays limited or inappropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is not able to create counterexamples to disprove the claim.
       Proof: Student cannot prove the claim.

Acceptable: Adequately displays appropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is able to create specific counterexamples to disprove the claim.
       Proof: Student demonstrates understanding of relevant definitions and is able to use elementary arguments, e.g., the triangle inequality, to make a plausibility argument for the claim.

Proficient: Displays thorough and appropriate proof, problem solving, or modeling strategies in the mathematical content focus.
       Problem solving: Student is able to create specific counterexamples to disprove the claim.
       Proof: Student is able to completely prove the claim using the triangle inequality and properties of the maximum function.
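
The specific claim assessed in the M 384 signature assignment is not reproduced in this report. Purely as a generic illustration of the kind of argument the Proficient level describes (combining the triangle inequality with properties of the maximum function), one standard example is verifying the triangle inequality for the maximum metric on $\mathbb{R}^2$:

    \[
    d_\infty(x,z) = \max\big(|x_1 - z_1|,\ |x_2 - z_2|\big)
    \le \max\big(|x_1 - y_1| + |y_1 - z_1|,\ |x_2 - y_2| + |y_2 - z_2|\big)
    \le d_\infty(x,y) + d_\infty(y,z),
    \]

which uses the triangle inequality for the absolute value, $|x_i - z_i| \le |x_i - y_i| + |y_i - z_i|$, together with the property of the maximum that $\max(a_1 + b_1,\ a_2 + b_2) \le \max(a_1, a_2) + \max(b_1, b_2)$.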

5. What Was Learned.

           a)  Based on the analysis of the data, and compared to the threshold values established, what was learned from the assessment?

In M 329 all three students were acceptable in both problem solving and proving.

In STAT 412, four of five students (80%) were acceptable in specifying appropriate statistical models. Students consistently specified models appropriately to reflect additive and interactive research questions and chose appropriate probability distributions for modeling the response variable. Students most commonly fell short of proficiency due to confusion about if/where error terms belong in specifications of generalized linear/linear models and linear mixed models. Additionally, four of five students (80%) demonstrated proficiency in executing the fitting of statistical models to data using R Statistical Software.

In M 384, two students (20%) performed unacceptably, six students (60%) performed at an acceptable level, and an additional two students (20%) demonstrated full proficiency. In summary, 80% of the students demonstrated acceptable or proficient performance in problem solving and proof.

 

           b)  What areas of strength in the program were identified from this assessment process?

STAT 412: This was the first year we assessed students' ability to write code in R (R Statistical Software) to reflect the model they specified to address the research question presented to them. Four of five students demonstrated proficiency in this area. Students learn the basics of programming in STAT 337, STAT 408, and STAT 411 and seem to carry much of what they learn into STAT 412.

M 384: 80% of the students were able to create and justify a counterexample to the claim under consideration.  The variety of the counterexamples given points to success in students moving beyond memorization and regurgitation to more creative aspects of the learning process.

 

           c)  What areas were identified that either need improvement or could be improved in a different way from this assessment process?

M 329: In the math teaching option, students learn communication strategies in M 242 but do not seem to carry many of the details about constructing rigorous proofs into their work in M 329; instruction in M 329 should specifically engage students on this point.

STAT 412: This assessment revealed an opportunity to improve student understanding of if/where error terms belong in specifications of generalized linear/linear models (especially linear mixed models). In this course and in subsequent coursework, instruction should continue to focus on translating research questions into correctly specified models, including the appropriate error structure and distribution for the response.
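
For reference, and as a general summary rather than material taken from the STAT 412 signature assignments, the distinction at issue is where a separate error term appears in each model class:

    \[
    \text{Linear model:}\quad y_i = \mathbf{x}_i^\top \boldsymbol\beta + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2)
    \]
    \[
    \text{Generalized linear model:}\quad y_i \sim \text{exponential family}(\mu_i), \qquad g(\mu_i) = \mathbf{x}_i^\top \boldsymbol\beta \quad \text{(no additive error term; variability enters through the response distribution)}
    \]
    \[
    \text{Linear mixed model:}\quad y_{ij} = \mathbf{x}_{ij}^\top \boldsymbol\beta + b_j + \varepsilon_{ij}, \qquad b_j \sim N(0, \tau^2),\ \varepsilon_{ij} \sim N(0, \sigma^2)
    \]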

M 384: That only 20% of the students were assessed as "proficient" in their proof technique indicates an opportunity for growth in rigorous mathematical communication. In combination with the comments for M 329, this indicates a need for rigorous communication to be further emphasized in the prerequisite courses M 242 and M 383.

6. How We Responded.

           a)  Describe how "What Was Learned" was communicated to the department or program. How did faculty discussions re-imagine new ways program assessment might contribute to program growth/improvement/innovation beyond the bare minimum of achieving program learning objectives through assessment activities conducted at the course level?

A summary of assessment data will be presented to faculty at the department's October faculty meeting. Moreover, at this meeting "seed conversations" will take place with the goal of aligning our approaches and assessments in M 242 (and M 383) with evidence-based best practices, e.g., rough draft mathematics. Similar conversations are being held among the statistics faculty regarding the fundamental statistics courses STAT 216 and STAT 337. The math ed group discussed the undergraduate mathematics teaching curriculum and assessment results at its annual retreat in early October.

 

           b)  How are the results of this assessment informing changes to enhance student learning in the program?

Integration of evidence-based practices in M 242 with regard to formative student assessment is active and ongoing. Similarly, review of the efficacy of modeling-based statistical coursework is ongoing. As noted above, the math ed group discussed the undergraduate mathematics teaching curriculum and assessment results at its annual retreat, and the math education group will propose changes to the teaching option of the major to ensure a greater focus on problem solving and proving.

 

           c)  If information outside of this assessment is informing programmatic change, please describe that.

Our mathematics education faculty, through their own scholarship/expertise and that of colleagues in the discipline, are helping to keep department faculty abreast of current evidence-based practices in instruction and assessment.

 

           d)  What support and resources (e.g., workshops, training) might you need to make these adjustments?

The UPC recently met with the assessment coordinator from the Provost’s office and found the meeting useful.

7. Closing the Loop(s).

Reflect on the program learning outcomes, how they were assessed in the previous cycle (refer to #1 of the report), and what was learned in this cycle.  What action will be taken to improve student learning objectives going forward?

           a)  Self-Reporting Metric (required answer): Based on the findings and/or faculty input, will there be any curricular or assessment changes (such as plans for measurable improvements, or realignment of learning outcomes)?

Yes

 

           b)  In reviewing the last report that assessed the PLO(s) covered in this assessment cycle, which of the proposed changes were implemented and will be measured in future assessment reports?

Our 2021-22 report indicated that M 384 would attend to more intricate arguments, M 329 would focus on mathematical knowledge for teaching about proof, and STAT 412 would focus on interactions in models. Our current assessment indicates that some of these issues persist, and it is incumbent upon department faculty to redouble their efforts in engaging these issues of instruction.

 

           c)  Have you seen a change in student learning based on other program adjustments made in the past? Please describe the adjustments made and subsequent changes in student learning.

Over the past four years there has been a notable shift among many of our upper division courses toward more active learning opportunities. This has resulted in some students being more comfortable with initiating mathematical and statistical arguments and in being more self-reliant. Our faculty continue to share their ideas about and strategies for success with instruction that goes beyond lecturing.
