Hi all.
I have been noticing bits and pieces of speculation here about how
grades are assigned in national examinations (for example, the O and
A Levels).
Various 'theories' have been proposed.
Now, now... I find that these speculations arise because the whole
process of marking and grading is not transparent enough. That is to
say, from the moment scripts are collected, it is anyone's guess what
happens to them.
To illustrate, quoted from SgForums:
So far I am damned nervous... coz my initial estimate of A1/A2 cut off seems to be accurate... Got results from 2 students so far... A girl who scored >90 got A1s for both maths. Another girl who scored 86% got A2 for both... And I got 2 poorer performing students who scored around 81%... So I am hoping against all hope that 86% is the upper range for A2 then the 2 of them can get A2 also...zzz... damned scary how consistent you have to be these days to get a distinction. I prefer the old days where the questions were tougher and scores required lower.
We can infer that the post was made by a teacher or tutor. Wait a moment... did I see that right? 86% is an A2?
Now... if I remember correctly, in my bygone O Level days we were
ALWAYS told that 75% was perfectly well worth an A1. Whatever
happened to the prescribed mark boundaries? Again, it is anyone's
guess; such information has never been disclosed to students. Ergo,
speculation is rife.
Anyone, with or without experience in pedagogy, will notice two (or maybe more) rather disturbing trends that ensue.
Firstly, speculation gives rise to alarmist tendencies. That is,
students are drilled even harder to ensure that they score high enough
to secure that A1 and bring a good reputation to the school.
Secondly, a herd mentality ensues: more and more people jump on
the panic bandwagon. This might be one factor behind the stress
levels in junior colleges.
In Singapore, the examinations we take (e.g. the A Levels) are titled
'Singapore-Cambridge'. I remember reading a press release a few years
back which essentially introduced the abovementioned title for our
national examinations. If I recall correctly, UCLES is in charge of
setting the questions and marking the scripts, while MOE/SEAB is in
charge of designing syllabuses and awarding grades. That is the rough
division of labour I gathered.
This move was applauded, as it meant that we have a greater say in our
national examinations. It explains why we have unique subjects like
Social Studies (to complement National Education, as far as I know)
and General Paper essay questions that pertain to Singapore.
However, the opaque veil persists. For instance, although appeals
against grades are allowed, the marked papers cannot be returned.
This is a far cry from the practices of other examination
authorities. Quoting the QCA (the UK body that oversees A Level
qualifications):
Q: Am I able to see my marked papers? A: Yes you are, and arrangements can be made through your school.
Of course, the reason might be that we want to prevent parents from
whining. (This seems to be a perennial occurrence after the PSLE
Mathematics papers.)
Yet there are those who believe that it is to students' benefit
to have access to their marked papers, so that they can learn from
their mistakes, especially for the N Level examinations.
Students are stakeholders in these examinations; they pay to sit
for them. It is therefore only reasonable that students have a better
awareness of what they are in for, and know exactly how their grades
came about.
Well, what do you think?
DISCLAIMER: Suggested methods and practices mentioned above and therein
reflect neither current practices nor the opinions of examining boards
and authorities. This post is intended to seek opinions, NOT to malign,
defame, or libel any group, club, organization, company, or individual,
or anyone or anything, especially those with the ability and desire to
fight back and launch ad hominem attacks.
Nor is any such group, club, organization, company, or individual, or
anyone or anything, represented in idea, form, opinion, or view in this
post. I am not responsible, nor will I be held liable, for anything
anyone says or comments, nor for the laws they may break in this
country or theirs through their comments' content, implication, or
intent. Inclusion of any copyrighted content is unintentional, and the necessary amends will be made upon notification.
Bell curve... learn it.
Flooding the market with too many A1s will make the A1 lose its value... understand that...
Yup yup.. that far I deduced myself =D
But what we want to know is... how exactly is it done?
Some terms came to my mind. Here is a plausible method that I encountered in my 'research' on pedagogy:
Let's say the mean = 55 and the standard deviation = 10.
Grade boundaries are then obtained this way: (mean + 1/2 s.d.), (mean - 1/2 s.d.), and so on.
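To make the idea concrete, here is a minimal sketch of such a norm-referenced scheme in Python. The half-s.d. steps and the grade labels are purely illustrative assumptions, not SEAB's or Cambridge's actual method; the point is only to show how cut-offs would shift with the cohort's mean and spread.

```python
# Hypothetical norm-referenced grading: cut-offs are placed at
# half-standard-deviation steps around the cohort mean. This is an
# ILLUSTRATION of the scheme described above, not the real method.

def grade_boundaries(mean, sd):
    """Return illustrative raw-mark cut-offs, best grade first."""
    return {
        "A1": mean + 1.5 * sd,
        "A2": mean + 1.0 * sd,
        "B3": mean + 0.5 * sd,
        "B4": mean,
        "C5": mean - 0.5 * sd,
        "C6": mean - 1.0 * sd,
    }

def assign_grade(score, boundaries):
    """Award the best grade whose cut-off the score meets, else F9."""
    for grade, cutoff in boundaries.items():  # dicts keep insertion order
        if score >= cutoff:
            return grade
    return "F9"

cuts = grade_boundaries(mean=55, sd=10)
print(cuts["A1"])              # 70.0 -- A1 needs 70 for this cohort
print(assign_grade(86, cuts))  # A1
print(assign_grade(58, cuts))  # B4
```

Notice that under this toy scheme a stronger cohort pushes the cut-offs up: with a mean of 70 and the same s.d., the A1 boundary lands at 85, which is in the same region as the anecdotal cut-offs mentioned in this thread.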
This question is pretty hard to answer.
What if there are too many people with the same score?
I think the reason for not returning the papers is that it would be too costly and troublesome.
Just take it that 1% of the total number of papers marked have one marking error in them. Imagine the manpower needed just to transport the scripts, remark them, and re-enter the data again.
Also, MOE/Cambridge want to protect their reputation. Imagine taking up your paper and finding marking errors in it. I'm sure you'll find that they are not as "perfect" as they seem; if that's the case, their reputation will surely take a beating.
Still, at least the majority of the papers are marked correctly, and measures are taken to reduce such marking errors. All I can say is: too bad for those who happen to have marking errors.
As far as I know, moderation is confirmed... ...
For example, in 2007 (I think) my teacher said E Maths' cut-off for an A was 85... not 75...
And when his student got the paper back with 84... he was like... ...
Originally posted by Chelseahpk:
Hi all.
Now, now... I find that these speculations arise because the whole process of marking and grading is not transparent enough. That is to say, from the moment scripts are collected, it is anyone's guess what happens to them.
To illustrate, quoted from SgForums:
Quote: So far I am damned nervous... coz my initial estimate of A1/A2 cut off seems to be accurate... Got results from 2 students so far... A girl who scored >90 got A1s for both maths. Another girl who scored 86% got A2 for both... And I got 2 poorer performing students who scored around 81%... So I am hoping against all hope that 86% is the upper range for A2 then the 2 of them can get A2 also...zzz... damned scary how consistent you have to be these days to get a distinction. I prefer the old days where the questions were tougher and scores required lower.
We can infer that the post was made by a teacher or tutor. Wait a moment... did I see that right? 86% is an A2?
Firstly, speculation gives rise to alarmist tendencies. That is, students are drilled even harder to ensure that they score high enough to secure that A1 and bring a good reputation to the school.
Secondly, a herd mentality ensues: more and more people jump on the panic bandwagon. This might be one factor behind the stress levels in junior colleges.
Ah yes, I so totally agree. The standard grading rubric (i.e. A1 = 75, etc.) is totally non-existent in the national examinations. In my O Level year, I remember distinctly that the A1 cut-off for A Maths was ~88/89. One of my friends got 86/87 and was left reeling in complete gloom. Having said that, I support adjusting the benchmark according to the cohort's performance, as, after all, A1s lose their worth if everyone gets an A1.
I think SEAB should really be more transparent when it comes to the reviewing of scripts. Every candidate should have the right, at the very least, to view their script if there are sufficient grounds to believe it has been unduly marked. Though whining and inane re-appealing (especially for borderline cases) may occur after a student has viewed his or her script, at least the student can see whether it was marked fairly or unfairly, instead of seeing nothing at all and guessing blindly.
For example, if a student firmly believes he should score an A1 in Physics but obtains a run-of-the-mill B3 instead, allowing him to see his script (assuming it is properly and fairly marked) lets him at least know where his mistakes lie and gives him the assurance that the B3 is deserved. Without that, he might have a strong, lingering feeling that his script was not marked properly, hence the undesired grade.
As for more transparency in grade allocation, I'm ambivalent. On one hand, more transparency is of course good, as elaborated above. On the other hand, the smarter ones may just slack off upon reaching the benchmark. It might also indirectly inculcate in some students the mentality that "just hit the benchmark can already la, no need to try your utmost best".
Gotta remember that teachers always say:
"Cambridge professors mark in winter in the dark; if they can't read your handwriting they tikam (pick at random) a grade."
If the grade was tikam-ed, how can they return the paper?
@ tr@nsp0rt_F3V3R
As far as I know, moderation is confirmed... ...
For example, in 2007 (I think) my teacher said E Maths' cut-off for an A was 85... not 75...
And when his student got the paper back with 84... he was like... ...
@ gohby
In my O Level year, I remember distinctly that the A1 cut-off for A Maths was ~88/89. One of my friends got 86/87 and was left reeling in complete gloom.
Hang on... those cut-offs are never disclosed, aren't they? So am I right to say that these values are just estimates?
@ crimsontactics
I think the reason for not returning the papers is that it would be too costly and troublesome.
Just take it that 1% of the total number of papers marked have one marking error in them. Imagine the manpower needed just to transport the scripts, remark them, and re-enter the data again.
True... but with the introduction of on-screen marking, everything can be done electronically (that is, at light speed).
@ gohby
For example, if a student firmly believes he should score an A1 in Physics but obtains a run-of-the-mill B3 instead, allowing him to see his script (assuming it is properly and fairly marked) lets him at least know where his mistakes lie and gives him the assurance that the B3 is deserved. Without that, he might have a strong, lingering feeling that his script was not marked properly, hence the undesired grade.
As for more transparency in grade allocation, I'm ambivalent. On one hand, more transparency is of course good, as elaborated above. On the other hand, the smarter ones may just slack off upon reaching the benchmark. It might also indirectly inculcate in some students the mentality that "just hit the benchmark can already la, no need to try your utmost best".
I can generally agree with that =D
Hi,
Transparency is good, for it forestalls speculation.
UK exam boards often provide exam statistics for the public's reference. I quoted this example when I emailed SEAB at the end of last year, urging it to improve its transparency. The person who replied said that SEAB would take my suggestion into consideration when they review the policy.
So let's wait and see. Perhaps we could write to the ST forum to raise public awareness and accelerate SEAB's pace in improving transparency?
Thanks!
Cheers,
Wen Shih
Hi,
This was the email I wrote to SEAB:
I'm an educator, and oftentimes students and parents ask me about exam-related matters like the level of difficulty of exams and the moderation of marks/grades, which I am unable to answer adequately due to a lack of substantial information.
I believe that this may be addressed through FAQs posted on SEAB's website or providing key information pertaining to exam statistics like this one I found on the Internet that I'd like to share:
http://www.aqa.org.uk/over/stat.php
Thank you for reading and hopefully students and parents would be better informed on such matters in the near future should SEAB consider my suggestion :)
Have a nice day and Merry Christmas in advance to all at SEAB!
This was the reply I got:
Dear Wen Shih,
Thank you for your feedback.
We would consider this during our review of the policy on data release.
We would also like to wish you Merry Christmas and Happy New Year 2009.
Best Regards
Great, it's Mr Wee finally =D Thanks for dropping by.
I suppose the problem with increased transparency is the heavier responsibility and accountability MOE/SEAB/UCLES would need to assume. For instance, PW distinction percentages are not disclosed to the public, but resourceful students who network well can blow the whistle anyway. That might explain the hoo-ha last year. Remember the article that said MOE was sticking to its guns over PW, in defence of the sharp jumps in distinction percentages and the gaping contrasts among the JCs? AFAIK, the two possible explanations are: the JCs really do sit at either end of the spectrum, OR the moderation process went wrong behind the veil. IMHO, the latter is more likely here. Based on what I have heard from JC teachers, they have no idea what happens after selected projects are sent for external moderation, supposedly to 'ensure rigorous consistency'.
Just to digress a bit... did you notice a slight increase in difficulty in the 2008 A Level Maths papers, compared to previous years? My tuition students were complaining that the papers were tough and that they were made to think harder for their As. Which is good, of course, as there needs to be a way to distinguish the great from the good. Surprisingly, though, I still expect the distinction rates not to vary much from last year's. That's moderation at work, I suppose.