Tuesday, July 6, 2010

Some Substantial Stuff From Week 1

Here is some interesting information from the first week (assessment/placement):

1. The results of student assessment/placement tests (e.g., Accuplacer) are designed to be implemented using a RANGE of scores rather than one cut score. Currently, FCC uses 90 as the cut score, apparently because of a "formal agreement" among community colleges statewide. After a little investigation I found out: a) MD community colleges are wise to publish one cut score to students to avoid confusion, b) the 90 cut score in MD was determined as a "benchmark," not a hard-and-fast rule, c) therefore, many MD community colleges do not use one cut score to determine placement, and d) since we don't have common courses or course design in developmental classes and EN 101/102, it makes sense that we tailor assessment to our needs.

2. The Accuplacer should not be considered a perfect assessment method. It is valid and reliable, but the standard error of measurement is highest for students who score in the "gray area" (just below or above the score that would place them in EN 101)--and those are the students we are most trying to serve! Using a range is one helpful way to address this, but other means are necessary. Here are the important factors in making placement decisions (in priority order):
1. Placement test scores--if a student scored in the gray area on the Accuplacer sentence skills section, consider how he or she scored on the Accuplacer reading section to help determine placement. Look for consistency across scores.
2. Other available test information--our in-house essay exam* and/or SAT/ACT scores can help (although the latter are aptitude tests, not assessment tests, and are not as valid and reliable). We can use SAT/ACT data to help with placement, not just to determine who is exempt from taking the placement tests.
3. High school background data--high school rank (percentile/tier in class) is the most effective; courses completed and GPA are less reliable data.
4. Age/maturity--older students may be more committed; younger students tend to overestimate their abilities.
5. Student opinion--not necessarily self-placement; use only if other data are unavailable.
6. Additional testing
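To make the logic of a range-plus-secondary-data approach concrete, here is a minimal sketch in Python. The cut score of 90 comes from the post, but the gray-band width, the 50th-percentile threshold for class rank, and all function/field names are purely illustrative assumptions, not FCC policy:

```python
# Hypothetical sketch of range-based placement using the priority list above.
# CUT_SCORE reflects the statewide benchmark mentioned in the post;
# GRAY_BAND and the hs_rank threshold are invented for illustration.

CUT_SCORE = 90   # statewide "benchmark" cut score
GRAY_BAND = 5    # assumed width of the gray area around the cut


def place_student(sentence_skills, reading=None, hs_rank_pct=None):
    """Recommend EN 101 vs. developmental writing from available data."""
    # Clear cases: score is well above or well below the cut.
    if sentence_skills >= CUT_SCORE + GRAY_BAND:
        return "EN 101"
    if sentence_skills <= CUT_SCORE - GRAY_BAND:
        return "developmental"
    # Gray area: consult secondary data in priority order.
    if reading is not None:  # factor 1: look for consistency across scores
        return "EN 101" if reading >= CUT_SCORE else "developmental"
    if hs_rank_pct is not None:  # factor 3: high school rank (percentile)
        return "EN 101" if hs_rank_pct >= 50 else "developmental"
    return "additional testing"  # factor 6: no other data available
```

The point of the sketch is the shape of the decision, not the numbers: scores outside the band place directly, and only gray-area students trigger the secondary-data cascade.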

*Grading of in-house essay exams was discussed, and I can share some of that information at a later date. Just a peek: most research supports using a 6-point scale rubric that is cross-graded. Oh, joy.

2 comments:

  1. Hi Joe
    I am jealous. I think there is something very exhilarating about an intense intellectual experience like the Kellogg. I also think that the MD community colleges may need to revisit the assessment agreement. It is important that we be consistent but after fifteen years, there is bound to be drift.

  2. I am looking forward to what you find out about exit exams.
