TJHairball
I love this place
I tried to include a few of my more moderate examples among the ones I detailed for you. Here's how this worked: I took a small batch of ready-at-hand data (on hand for other purposes), looked it over, and spotted what seem to be some nice funny correlations that match up well with the anecdotal evidence I've been hearing about - for example - young Republicans, Southern Baptist youth, etc etc etc. Then I came to you guys with my conclusions and a dozen or so examples out of the hundreds available. I'm not looking to run a rigorous study, just to describe the character behind the curtain; if you want studies, I can cite study after study that's already been done supporting each of the individual symptoms that, taken in combination, fit my thesis here: there is a culture of irresponsibility among "conservative" youth.

Dark Link said:
Well, first, you completely ignored any research process. The normal ideal process is:
Problem -> Method -> Data collection and analysis -> Support or reject Hypothesis
It seems your main type of study is an internal state-behavioral study, meaning it looks at the interaction between people's beliefs and their reported or observed behaviors. You could do well continuing research on it, as long as it's done scientifically and ethically.
What you fail to do is take a random sample for your study. You have both extremes with no middle ground. Some of your sampled people are a bit unbelievable.
Frankly, if you find them unbelievable... welcome to the real world. If I included a list of all of them, it would take too long. And probably run over the post length limit here.
That's because I'm using these particular people as examples. Since I was particularly looking at conservatives, I mostly included people who had identified themselves as conservative.

Dark Link said:
You also fail to explain why you have these people in a research design. You describe them, then completely forget about them.
Remember, I selected those particular examples out of a few hundred to try and give a representation of the sort of character I discussed. I could write up and post literally hundreds more (which would be the "representative" sampling). What you have here is an illustration.
That's funny... because when I look at the details, I have no trouble finding what everybody found notable about the study (which is an incidental finding, yes, but those are usually the most interesting bits).

Dark Link said:
No, BMJ article: "Interventions to reduce unintended pregnancies among adolescents: systematic review of randomised controlled trials."
Taking from the abstract, because I don't feel like dealing with SAA endnoting:
"Objective: To review the effectiveness of primary prevention strategies aimed at delaying sexual intercourse, improving use of birth control, and reducing incidence of unintended pregnancy in adolescents."
"Conclusions: Primary prevention strategies evaluated to date do not delay the initiation of sexual intercourse, improve use of birth control among young men and women, or reduce the number of pregnancies in young women. "
All this article was meant to explain was how our current method isn't working.
BUT, if you look at the data, it shows that the abstinence-only programs had the same successes as many of the school/agency-based education programs, negating your claim. That, and you're automatically assuming that all areas with abstinence-only programs are conservative, when many are in fact not, but are following a state-mandated curriculum.
"Four abstinence programmes and one school based sex education programme were associated with an increase in number of pregnancies among partners of young male participants."
Whoops. Guess you didn't read past the abstract.
This, with four abstinence program studies listed among the studies reviewed. Looking at the details, I find those studies were conducted in California, which does not mandate abstinence-only education, so the state-mandated-curriculum excuse doesn't apply there. The logic therefore holds.
Oh, sorry... try one of the studies inspiring that newspaper article, which shows, incidentally, that the vast majority of "promise pledges" are broken and that overall STD rates don't differ in the slightest... which means that, since those keeping the pledge are a statistically significant minority, the vast majority of pledgers (88%), the ones breaking the pledge, must experience an increase in STD risk (the arithmetic is sketched below).

Dark Link said:
And your other example was a school newspaper... Do I need to inform you about how stupid that is? Columbia University or not, you'd be laughed out of any place, like you are here. When giving sources to back up claims, use scholarly journals, not newspaper articles. If you want to get articles, use engines like PubMed, Medline, ERIC, NTIS, OCLC, and others. The articles you'll find there are, like your BMJ article, done by doctors in different fields.
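(A minimal sketch of that weighted-average arithmetic. The 88/12 split between pledge-breakers and pledge-keepers and the equal overall STD rates for pledgers and non-pledgers are taken as described above; treating the pledge-keepers' risk as roughly zero is an illustrative assumption, not a number from the studies.)

\[ r_{\text{pledgers}} = 0.12\, r_{\text{keep}} + 0.88\, r_{\text{break}} = r_{\text{non-pledgers}} \]

\[ r_{\text{keep}} \approx 0 \;\Rightarrow\; r_{\text{break}} \approx \frac{r_{\text{non-pledgers}}}{0.88} \approx 1.14\, r_{\text{non-pledgers}} \]

More generally, any pledge-keeper rate below the non-pledger rate forces the pledge-breaker rate above it, which is all the claim requires.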
And while you're chewing on that, nibble around on the CAS material I linked to. While the "promise pledge" and abstinence education data show the behavior of "religious conservative" youth, the "secular conservative" youth - the prosperous ones - are indicted incidentally in a number of those studies through their noticeable membership in social fraternal organizations... etc etc etc.
The most virtuous of all the "young conservative" groups seem to be a peculiar minority. (TQ would like to call these the "real" conservatives, probably.)