Orton's Arm (Author) Posted December 23, 2006 Assume the real IQ is 140 and the test has an abnormally distributed error as follows: (Again, the real IQ is known from other, more accurate tests. We then take another test with an abnormally distributed error. I change the error distribution again to make it extremely abnormal.) 135: 50% 145: 50% Will "regression toward the mean" happen? (A) Yes (B) No My answers are (B) for both questions; what are your answers? 875282[/snapback] Under the circumstances you've outlined, someone who scored a 135 on the first test would, on average, score a 140 on retaking the test. Someone who scored a 145 on the first test would, on average, score a 140 upon being retested. In both cases, people who scored above or below the population's mean I.Q. of 140 are expected to regress toward the population's mean upon being retested.
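The point of disagreement can be checked with a quick simulation (a sketch of the two-outcome example as stated; the 100,000-person sample size is my own choice): among people whose first score was 145, the average retest score comes out near 140, even though no individual retest score can actually be 140.

```python
import random

random.seed(0)

def noisy_test():
    # The test has only two possible outcomes for a true IQ of 140:
    # 135 or 145, each with probability 50%
    return random.choice([135, 145])

N = 100_000
first = [noisy_test() for _ in range(N)]
second = [noisy_test() for _ in range(N)]

# Retest scores of everyone who scored 145 the first time around
retests = [s for f, s in zip(first, second) if f == 145]
avg_retest = sum(retests) / len(retests)

print(f"average retest score: {avg_retest:.1f}")        # close to 140
print(f"possible retest scores: {sorted(set(retests))}")  # only 135 and 145
```

Both sides of the argument show up in the output: the conditional average is near 140, but every individual retest score is 135 or 145.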
Orton's Arm (Author) Posted December 23, 2006 and you keep insisting on this point? jeez. Could you picture what it would mean if the variance of the retests of these 140s were not regressing toward the mean? . . . OK, now that you've imagined it, realize that this is reality, take note of what the others are saying, and try to learn from your mistakes and their mistakes alike. 875313[/snapback] That's your opinion, and you're welcome to it. Others see the issue differently: Take people who scored 140 on the test. Two possibilities:
• a) true score below 140, positive chance error (T < 140, + error), e.g. 135 + 5
• b) true score above 140, negative chance error (T > 140, - error), e.g. 145 - 5
A plot of the normal curve shows that the first explanation is more likely – the true score is most likely lower, and so on average, the scores on the second test will be a bit lower than the first. Suppose you were told that when any group of subjects with low values on some measurement is later remeasured, their mean value will increase without the use of any treatment or intervention. Would this worry you? It sure had better! . . . The behavior described in the first paragraph is real. It is called the regression effect. In most test-retest situations . . . those who perform best usually do so with a combination of skill (which will be present in the retest) and exceptional luck (which will likely not be so good in a retest). Those who perform worst usually do so as the result of a combination of lack of skill (which will still be present in a retest) and bad luck (which is likely to be better in a retest). . . . A particularly high score could have come from someone with an even higher true ability but who had bad luck, or someone with a lower true ability who had good luck.
Because more individuals are near average, the second case is more likely; when the second case occurs on a retest, the individual's luck is just as likely to be bad as good, so the individual's second score will tend to be lower. The same argument applies, mutatis mutandis, to the case of a particularly low score on the first test. Regression effects: the tendency of subjects who are initially selected due to extreme scores to have subsequent scores move inward toward the mean. Also known as statistical regression, regression to the mean, or the regression fallacy. Regression effect: in almost all test-retest situations:
- The bottom group on the first test will on average show some improvement on the second test
- The top group on the first test will do a bit worse on the second test
- Regression fallacy: thinking that the regression effect must be due to something important, not just spread around the SD line
If two successive trait measurements have a less-than-perfect correlation, individuals or populations will, on average, tend to be closer to the mean on the second measurement (the so-called regression effect). Hmmm . . . should I take the word of jzmack, or should I take the combined word of Stanford, Tufts, Berkeley, the Environmental Protection Agency, the University of Chicago, and the University of Washington? Hmmm, let me think about this . . .
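The quoted claim (two measurements with less-than-perfect correlation imply that extreme groups drift toward the mean on remeasurement) is easy to demonstrate. Here's a sketch with made-up parameters: hypothetical true scores ~ N(100, 15), and each test sitting adds independent N(0, 5) chance error.

```python
import random

random.seed(1)

N = 100_000
# Hypothetical population of true scores, plus per-sitting chance error
true_scores = [random.gauss(100, 15) for _ in range(N)]
test1 = [t + random.gauss(0, 5) for t in true_scores]
test2 = [t + random.gauss(0, 5) for t in true_scores]

# Select the top 10% on the first test, then look at their second test
top = sorted(zip(test1, test2), key=lambda p: p[0], reverse=True)[:N // 10]
mean1 = sum(t1 for t1, _ in top) / len(top)
mean2 = sum(t2 for _, t2 in top) / len(top)

print(f"top group, first test:  {mean1:.1f}")
print(f"top group, second test: {mean2:.1f}")  # lower, but still above 100
```

The top group's second-test average falls below its first-test average (the regression effect) while remaining well above the population mean of 100, which is exactly the pattern the quoted sources describe.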
syhuang Posted December 23, 2006 The statement is completely wrong. Even though the average is 140, you cannot score 140 on this test. You can only score 135 or 145 here. If a person gets a 145 and then retakes the test, he won't get a score closer to the mean. You won't regress toward anything even if you keep retaking the test. Would you call a score sequence like the following "regression toward the mean"? 135, 135, 145, 135, 145, 145, 135, 145, ... As you can see, it doesn't regress toward any value. ------ EDIT: HA, I have a feeling that you may use "average score" to argue that "regression toward the mean" still applies here. Thus, before you do that, please make sure you don't change your definition of "regression toward the mean" from the past 50+ pages. If you want to use "the average score of a group of people", please make sure you're not saying that "regression toward the mean" doesn't apply to individuals and only applies to the average score of a group. Please don't say that if a person's real IQ is 140 (known from other, more accurate tests) and he scores a 160 or 120 on a test with zero-mean, normally distributed error, he is NOT likely to get a score closer to the mean when retaking the test. If you want to use "the average of all past scores", please make sure you're not changing your definition of "regression toward the mean" from "...... the next score is likely closer to the mean upon retaking the test" to "...... the average of the next score and all past scores is likely closer to the mean".
Last, please don't say that a person can score 140 in a test which only has two outcomes, 135 and 145.
Bungee Jumper Posted December 23, 2006 That's your opinion, and you're welcome to it. Others see the issue differently: Hmmm . . . should I take the word of jzmack, or should I take the combined word of Stanford, Tufts, Berkeley, the Environmental Protection Agency, the University of Chicago, and the University of Washington? Hmmm, let me think about this . . . 875390[/snapback] You could take no one's word for it, and calculate it yourself... Oh, wait, that's right. You're too much of a !@#$ing nitwit to calculate it yourself.
Bungee Jumper Posted December 23, 2006 The statement is completely wrong. Even though the average is 140, you cannot score 140 on this test. You can only score 135 or 145 here. If a person gets a 145 and then retakes the test, he won't get a score closer to the mean. You won't regress toward anything even if you keep retaking the test. Would you call a score sequence like the following "regression toward the mean"? 135, 135, 145, 135, 145, 145, 135, 145, ... As you can see, it doesn't regress toward any value. 875438[/snapback] For what it's worth, I thought your example was perfectly clear (unrealistic - but you knew that. Perfectly clear nonetheless). Every time I think HA's bottomed out, he gets stupider.
Dibs Posted December 23, 2006 I've only read the first few pages here, but I assume it's been the same thing over and over till now. I think everyone is assuming that HA is proposing something far more complicated than he actually is. As far as I can see, he is simply saying . . . that out of a large sample size, those (as a group) that score very high on a test that involves an element of luck, when re-tested, will end up (on average) with a lower average (as a group) than when first tested. This is just simple logic.
/dev/null Posted December 23, 2006 I've only read the first few pages here but I assume it's the same thing over & over till now. 875493[/snapback] yep
Bungee Jumper Posted December 23, 2006 I've only read the first few pages here but I assume it's the same thing over & over till now. 875493[/snapback] Oh, yeah. It hasn't changed since the middle of the Err America thread.
Ramius Posted December 23, 2006 The statement is completely wrong. Even though the average is 140, you cannot score 140 on this test. You can only score 135 or 145 here. If a person gets a 145 and then retakes the test, he won't get a score closer to the mean. You won't regress toward anything even if you keep retaking the test. Would you call a score sequence like the following "regression toward the mean"? 135, 135, 145, 135, 145, 145, 135, 145, ... As you can see, it doesn't regress toward any value. 875438[/snapback] I understood your point perfectly clearly. Good job.
Bungee Jumper Posted December 23, 2006 I understood your point perfectly clearly. Good job. 875652[/snapback] According to a Berkeley web page, it still regresses to 140, which is regression toward the mean even though the mean is 100...
Bungee Jumper Posted December 23, 2006 In both cases, people who scored above or below the population's mean I.Q. of 140 are expected to regress toward the population's mean upon being retested. 875386[/snapback] What the !@#$ is this mental retardation? You've just spent a hundred pages saying they wouldn't, that they'd regress FROM the population mean and on average have lower scores.
Orton's Arm (Author) Posted December 23, 2006 I've only read the first few pages here, but I assume it's been the same thing over and over till now. I think everyone is assuming that HA is proposing something far more complicated than he actually is. As far as I can see, he is simply saying . . . that out of a large sample size, those (as a group) that score very high on a test that involves an element of luck, when re-tested, will end up (on average) with a lower average (as a group) than when first tested. This is just simple logic. 875493[/snapback] Yes, that's about the shape of things.
Orton's Arm (Author) Posted December 23, 2006 What the !@#$ is this mental retardation? You've just spent a hundred pages saying they wouldn't, that they'd regress FROM the population mean and on average have lower scores. 875682[/snapback] In his example, the people being tested had an average true I.Q. of 140. Hence, someone who scored above or below 140 is, on average, expected to regress toward the population's mean I.Q. upon being retested.
Orton's Arm (Author) Posted December 23, 2006 The statement is completely wrong. Even though the average is 140, you cannot score 140 on this test. You can only score 135 or 145 here. If a person gets a 145 and then retakes the test, he won't get a score closer to the mean. You won't regress toward anything even if you keep retaking the test. Would you call a score sequence like the following "regression toward the mean"? 135, 135, 145, 135, 145, 145, 135, 145, ... As you can see, it doesn't regress toward any value. ------ EDIT: HA, I have a feeling that you may use "average score" to argue that "regression toward the mean" still applies here. Thus, before you do that, please make sure you don't change your definition of "regression toward the mean" from the past 50+ pages. If you want to use "the average score of a group of people", please make sure you're not saying that "regression toward the mean" doesn't apply to individuals and only applies to the average score of a group. Please don't say that if a person's real IQ is 140 (known from other, more accurate tests) and he scores a 160 or 120 on a test with zero-mean, normally distributed error, he is NOT likely to get a score closer to the mean when retaking the test. If you want to use "the average of all past scores", please make sure you're not changing your definition of "regression toward the mean" from "...... the next score is likely closer to the mean upon retaking the test" to "...... the average of the next score and all past scores is likely closer to the mean". Last, please don't say that a person can score 140 on a test which only has two outcomes, 135 and 145. 875438[/snapback] You pretty much took the words right out of my mouth. If you select, say, 100 people who scored a 145 the first time around, that group's average upon being retested will be around 140.
Bungee Jumper Posted December 23, 2006 In his example, the people being tested had an average true I.Q. of 140. Hence, someone who scored above or below a 140 is, on average, expected to regress towards the population's mean I.Q. upon being retested. 875719[/snapback] Jesus Christ. Did you even read his example?
Chilly Posted December 23, 2006 You pretty much took the words out of my mouth. If you select, say, 100 people who scored a 145 the first time around, that group's average on being retested will be around 140. 875720[/snapback] Uhm, did you read this sentence? "Last, please don't say that a person can score 140 in a test which only has two outcomes, 135 and 145." Yeah, he sure took the words right out of your mouth.
Orton's Arm (Author) Posted December 23, 2006 Uhm, did you read this sentence? Yeah, he sure took the words right out of your mouth. 875742[/snapback] Reread what I wrote about his example. I said that someone who scored a 145 on the first test would, on average, score a 140 the second time around. Obviously, 50% of the time those retaking the test would get a 145 again, and the other 50% of the time they'd get a 135. That works out to an average outcome of 140.
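The arithmetic behind "an average outcome of 140" is just a weighted average of the two possible scores:

```python
# Expected retest score under the two-outcome example:
# 50% chance of scoring 145 again, 50% chance of scoring 135
expected = 0.5 * 145 + 0.5 * 135
print(expected)  # 140.0
```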
Bungee Jumper Posted December 23, 2006 Reread what I wrote about his example. I said that someone who scored a 145 on the first test would, on average, score a 140 the second time around. Obviously, 50% of the time those retaking the test would get a 145 again, and the other 50% they'd get a 135. That works out to an average outcome of 140. 875749[/snapback] What would the correlation coefficient of the two test scores be, then?
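For what it's worth, in the example as stated, the two sittings are independent coin flips, so the correlation between first and second scores comes out near zero (a quick check; the 100,000-pair sample size is my own choice):

```python
import random

random.seed(2)

N = 100_000
# Each sitting is an independent 50/50 draw of 135 or 145,
# so the two scores should be uncorrelated
x = [random.choice([135, 145]) for _ in range(N)]
y = [random.choice([135, 145]) for _ in range(N)]

mx = sum(x) / N
my = sum(y) / N
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / N
sx = (sum((a - mx) ** 2 for a in x) / N) ** 0.5
sy = (sum((b - my) ** 2 for b in y) / N) ** 0.5
r = cov / (sx * sy)  # Pearson correlation coefficient

print(f"r = {r:.3f}")  # near 0
```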
syhuang Posted December 23, 2006 You pretty much took the words out of my mouth. If you select, say, 100 people who scored a 145 the first time around, that group's average on being retested will be around 140. 875720[/snapback] Do you mean that "regression toward the mean" only applies to a group of people and doesn't apply to individuals? Anyway, answer this question: does "regression toward the mean" apply to individuals? Or if you want an example, here is one. If a person's real IQ is 140 (known from other, more accurate tests) and he scores a 160 or 120 on a test with zero-mean, normally distributed error, will he likely get a score closer to the mean when retaking the test? If your answer is yes, please answer the next question. Q: If a person (not a group of people) gets a 135 on this test with abnormally distributed error, is he likely to get a score (either 135 or 145) closer to the mean when retaking the test?
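The normally-distributed-error case in that question can also be simulated (a sketch; the error standard deviation of 7.5 is a hypothetical value, not from the thread): a person with true IQ 140 who happened to score 160 will, on retest, almost always land closer to 140 than that first score did.

```python
import random

random.seed(3)

TRUE_IQ = 140
ERROR_SD = 7.5      # hypothetical zero-mean normal test error
FIRST_SCORE = 160   # an unusually lucky first result

# Retest this person many times and count how often the retest score
# lands closer to the true IQ than the first score of 160 did
retests = [TRUE_IQ + random.gauss(0, ERROR_SD) for _ in range(100_000)]
closer = sum(abs(s - TRUE_IQ) < abs(FIRST_SCORE - TRUE_IQ) for s in retests)

print(f"closer to 140 on retest: {closer / len(retests):.1%}")
```

This illustrates the individual case with continuous error; it says nothing either way about the two-outcome test, which is the point of the follow-up question.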