J Adolesc Health. Author manuscript; available in PMC Mar 1, 2009.
PMCID: PMC2359934
NIHMSID: NIHMS41506

Inconsistent Reports of Sexual Intercourse Among South African High School Students

Lori-Ann Palen, M.S.,1 Edward A. Smith, Dr.PH.,2 Linda L. Caldwell, Ph.D.,3 Alan J. Flisher, Ph.D., F.C.Psych. (S.A.),4,5 Lisa Wegner, M.Sc.O.T.,6 and Tania Vergnani, Ph.D.7

Abstract

Purpose

This study aims to describe patterns of inconsistent reports of sexual intercourse among a sample of South African adolescents.

Methods

Consistency of reported lifetime sexual intercourse was assessed using five semi-annual waves of data. Odds ratios related inconsistent reporting to demographic variables and potential indicators of general and risk-behavior-specific reliability problems.

Results

Of the sexually active participants in the sample, nearly 40% reported being a virgin after sexual activity had been reported at an earlier assessment. Inconsistent reporting could not be predicted by gender or race, nor by general indicators of poor reliability (inconsistent reporting of gender and birth year). However, those with inconsistent reports of sexual intercourse were more likely to be inconsistent reporters of substance use.

Conclusions

These results suggest that researchers need to undertake efforts to deal specifically with inconsistent risk behavior data. These may include the modification of data collection procedures and the use of statistical methodologies that can account for response inconsistencies.

Keywords: adolescence, sexual behavior, methods, reliability of results

Inconsistent Reports of Sexual Intercourse Among South African High School Students

HIV/AIDS is a pressing public health concern for youth in South Africa. Nationwide, between 8 and 10% of 15-24 year-olds are infected with HIV [1, 2]. Delaying the initiation of sexual intercourse, via policy and programmatic intervention, may help to reduce prevalence rates. However, effective intervention requires valid data on adolescent sexual behavior. These data implicate target behaviors, assist in the identification of high-risk populations, and suggest appropriate goals for behavior change [3].

Given the private nature of sexual behavior, scientists must rely heavily on self-reports. Unfortunately, there are few objective criteria against which to judge their validity. Those tools that are available, ranging from urine tests for sperm presence to monitoring of local condom sales, tend to be imprecise or impractical. Therefore, scientists have turned to repeated measures research designs as one alternative method for assessing the validity of self-reported sexual behavior [4].

Inconsistency in the U.S. Context

Several studies have examined the short-term stability of reported sexual intercourse. In one such study, within a single assessment, between 4 and 8% of youth provided inconsistent responses as to whether they had ever engaged in sexual intercourse [5]. In another study, in which two assessments occurred up to three weeks apart, the kappa for responses to a lifetime sexual intercourse item was 90.5% [6]. Therefore, while short-term consistency is high, it is not perfect.

It is also possible to examine inconsistency longitudinally. If a person has his or her sexual debut in adolescence, there will be a valid inconsistency over time (e.g., virgin at Time 1, sexually active at Time 2). However, once an adolescent has had sex, we would logically expect him or her to continue reporting lifetime sexual intercourse at subsequent time points. In this case, one indicator of invalid reporting would be when an adolescent reports having engaged in sexual intercourse at one time point and then reports being a virgin at a later time point. On average, over periods of a year or more, research shows that 4 to 12% of adolescent participants rescind their initial reports of having engaged in sexual intercourse [5, 7-9].

Causes of Inconsistent Reporting of Sexual Behavior

There are several potential explanations for inconsistent reporting of sexual behavior. First, participants may not have accurate or complete memories of their sexual health and behavior histories [3, 4]. This argument is especially plausible for reports of repeated or complicated behavior; it is less applicable to reports on whether a simple and salient one-time event like first intercourse has ever occurred.

Alternatively, inconsistencies in reporting sexual activity may also arise because certain participants lack either the motivation or the ability to understand survey questions and respond accurately [3, 4]. Evidence for this explanation, in relation to adolescent reports of lifetime sexual activity, is mixed. In one study, youth with low vocabulary comprehension were more likely to be inconsistent reporters of sexual intercourse [7], while in another, inconsistency was unrelated to reading ability [8].

Another potential source of longitudinal inconsistency is participants responding to questions in ways that they feel are desirable or acceptable, given the norms and values within their social context. This may be one reason why inconsistent reporting is more common among people with certain demographic attributes. For example, younger adolescents, for whom sexual intercourse is less normative, are less consistent reporters than older adolescents [7, 8]. Some youth may use false reports of sexual intercourse as a way to bolster their status with peers, while others may understate their experience as a way to avoid stigma and embarrassment. As adolescents get older, sexual experience may have less relevance for positive or negative social status, leading to more honest reporting of sexual behavior. At least one study supports this notion, with self-reported sexual honesty being higher for older participants [10].

Similarly, norms and values have been advanced as an explanation for gender and racial/ethnic differences in the consistency of reported sexual intercourse [4]. There is some evidence that boys are especially likely to inflate reports of sexual experience [7, 10], whereas girls who are dishonest in their sexual reporting tend to underreport their experience [10]. In the U.S., Alexander and colleagues found that White adolescents were more inconsistent reporters of sex than their Black counterparts [8]; however, others have found that this is true only for girls [5].

Inconsistent Responses in the South African Context

A few studies have measured the short-term consistency of reported sexual intercourse among South African adolescents. The high school participants in a study by Flisher and colleagues [11] showed over 95% agreement in their responses to a lifetime sexual intercourse item across assessments that occurred up to two weeks apart. In their sample of 11 to 19 year-olds, Jaspan and colleagues [12] found slightly lower agreement (86%) over a two-week period; however, this may be due in part to the use of different modes of data collection (paper-and-pencil, then Palm Pilot) at the two assessments. To our knowledge, however, there have been no previous studies of the long-term consistency of self-reported sexual behavior among South African adolescents. We are also unaware of studies that have attempted to predict inconsistency in this population using demographics or other variables.

The Current Study

The current study examines longitudinal reports of lifetime sexual intercourse in a sample of South African high school students. Specifically, we examine the degree to which reports are consistent over time and whether inconsistency can be predicted by demographics, indicators of general reliability problems, or inconsistency in reporting other types of risk behavior.

In addition to studying a unique population, this study is novel in the types of predictors that it examines; we are unaware of any other studies that have related inconsistent reporting of sex to inconsistent reporting of other information, including demographics and substance use. We believe that exploring these associations will help to elucidate plausible mechanisms for inconsistency, which in turn may allow us to reduce inconsistency in future studies.

Method

Sample

Participants were high school students from Mitchell's Plain, a low-income township near Cape Town, South Africa. Students (N = 2,414) were participating in a research trial of a classroom-based leisure, life skill, and sexuality education program [13]. The sample for the present study was restricted to participants who reported lifetime sexual intercourse in at least one of the first four survey assessments (n = 713), because these were the only participants for whom an inconsistent sexual response was possible. This subsample was mostly male (69%) and Colored (a mix of African, Asian, and European ancestry) and had a mean age at baseline of 14 years (SD = 0.90; range = 13-17). Demographic information for the subsample appears in Table 1.

Table 1
Descriptive statistics and corresponding response inconsistency for sample (N = 713)

Procedure

This study was approved by the Institutional Review Boards of both the Pennsylvania State University and Stellenbosch University. Passive parental consent and adolescent assent procedures were employed. Researchers outlined the research process, including issues of privacy and confidentiality, at the start of each survey administration. Participants were identified using unique ID numbers rather than names, each was seated at an individual desk with sufficient space to ensure privacy, and trained fieldworkers monitored the administration to be sure that participants did not communicate with each other. Fieldworkers were trained in the referral process for sexual or substance use problems, and, at each administration, participants were provided with a list of community organizations that could assist with these types of issues.

Beginning in 2004, participants completed identical semi-annual surveys on personal digital assistants (PDAs). Five waves of data were used in the current study: those from the beginning and end of 8th and 9th grades and the beginning of 10th grade. Items assessed both behavior and attitudes in a number of areas, including substance use, sex, leisure, and life skills. The survey was self-administered in each participant's home language, either English (62% of all baseline surveys) or Afrikaans (38%). Assessments were completed in classrooms (and other school spaces, as needed) during school hours. At the start of each session, fieldworkers instructed participants in the use of the PDAs; they were also available to assist with any difficulties that arose during the assessment. Initially, the surveys took between 45 minutes and 1 hour to complete; however, completion time decreased to 20-30 minutes by the fifth assessment as students became more familiar with the PDAs and their reading/comprehension skills improved. As is customary in South African research, students were not compensated for their participation. However, schools received annual monetary compensation for their participation in the research project.

There was little missing data within the surveys that were actually administered [<1% of items missing; 14]. However, school absence and drop-out are relatively high among the target population. Therefore, between 10 and 15% of the sample was lost to attrition between each semi-annual assessment, with 1,364 students from the full sample participating in the fifth assessment.1

Measures

Retracted report of sexual behavior

At each assessment, participants were asked: “Have you ever had sex? This means intimate contact with someone during which the penis enters the vagina (female private parts).” Participants could respond either yes or no. Based on these data, each participant was coded as providing either a consistent or inconsistent response-set. If, after first reporting sexual intercourse, a participant denied lifetime sexual intercourse at any assessment that followed, that individual was coded as providing an inconsistent response-set. Otherwise, a participant was coded as providing consistent reports. Participants who had missing data for the sexual intercourse variable but were consistent in all available sex responses were coded as consistent reporters.
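
To make this coding rule concrete, the following is a minimal sketch (in Python; not the authors' analysis code) of the classification just described. The function name and the yes/no/None coding of responses are our own illustrative choices.

    def is_inconsistent(reports):
        """reports: list of 'yes', 'no', or None (missing), ordered by wave."""
        seen_yes = False
        for r in reports:
            if r is None:
                continue                 # missing wave: judge on available data only
            if r == "yes":
                seen_yes = True
            elif seen_yes:               # a 'no' after an earlier 'yes'
                return True
        return False

    # Example: intercourse reported at Wave 2, virginity reported again at Wave 4
    print(is_inconsistent(["no", "yes", None, "no", "no"]))    # True (inconsistent)
    print(is_inconsistent(["no", "no", "yes", None, "yes"]))   # False (consistent)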

Demographics

Participants reported on their gender, birth year (from which age at baseline was computed), and race (response options of Colored, White, Black, Indian, other) at each assessment. A participant's demographic predictors were only included if they were consistent over time (e.g., reporting “girl” for gender at every non-missing assessment). Otherwise, the participant was excluded from the bivariate analysis relating inconsistency to that demographic predictor.

Indicators of general validity problems

At each wave, participants indicated whether or not they had ever failed a grade in school. We collapsed these items into a single dichotomous indicator of whether a participant ever reported failing a grade. We used this indicator as a proxy for lower ability and/or motivation, which might reduce the validity of survey responses.

Participants reported their gender and birth year at each assessment. For each of these variables, we created an indicator of whether participants were consistent or inconsistent reporters of this information across available assessments. Including these indicators as predictors of inconsistently reported sex may help to determine whether participants exhibit general patterns of inconsistent responses or whether there is something unique about sexually-related responses.

Indicators of risk behavior validity problems

At each wave, participants responded to closed-ended items about the frequency of their lifetime alcohol, cigarette, marijuana, and inhalant use. These responses were collapsed into dichotomous (yes/no) indicators of lifetime use of each substance at each wave. Then, following the same logic described previously for sexual behavior, we determined whether reports of each type of substance use were consistent or inconsistent across subsequent assessments. This allowed us to examine whether inconsistency of reported sexual intercourse is part of a broader problem in the reporting of risk behavior or whether it is a phenomenon specific to sexual behavior.
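
As a hedged illustration of this step (the frequency codes below are invented and do not reproduce the survey's actual response options), lifetime-use frequencies can be collapsed to a per-wave yes/no and then screened with the same retraction rule used for the sexual intercourse item.

    def substance_inconsistent(freq_by_wave):
        """freq_by_wave: per-wave lifetime-use frequency codes (0 = never), or None if missing."""
        seen_use = False
        for f in freq_by_wave:
            if f is None:
                continue                 # missing wave: use available data only
            if f > 0:
                seen_use = True          # any lifetime use reported
            elif seen_use:
                return True              # 'never used' after earlier reported use
        return False

    # Example: lifetime alcohol use reported at Wave 2, denied at Wave 4
    print(substance_inconsistent([0, 2, None, 0, 1]))   # True (inconsistent)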

Results

Of the students who reported sexual intercourse at some point in the assessment process, nearly 40% (n = 283) eventually went on to report that they had never had sexual intercourse. Table 1 shows rates of inconsistency for the demographic subgroups of interest, as well as for individuals with inconsistent reports of other types of information. Intervention participants were no more or less likely to be inconsistent reporters than were members of the control group (not tabled; β = −.04, p = .801).

Examined over the full duration of the study, earlier reports of sex were more likely to be retracted than later reports; 46% of youth who reported sexual intercourse at Wave 1 later rescinded that report, compared with 17% of youth who reported being sexually active at Wave 4 (Table 2). We also examined adjacent pairs of assessments. Between 14 and 18% of reports of sexual intercourse were followed by a report of no sexual intercourse at the subsequent assessment.
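
To illustrate how the adjacent-pair figures can be computed (the toy response sets below are invented, not study data), the rate for a given pair of waves is the share of intercourse reports at the earlier wave that are followed by a report of no lifetime intercourse at the next wave.

    def adjacent_retraction_rate(response_sets, wave):
        """response_sets: per-participant lists of 'yes'/'no'/None; wave: 0-based index."""
        at_risk = [r for r in response_sets
                   if r[wave] == "yes" and r[wave + 1] is not None]
        retracted = [r for r in at_risk if r[wave + 1] == "no"]
        return len(retracted) / len(at_risk) if at_risk else float("nan")

    toy = [
        ["yes", "no",  "no",  "no",  "no"],    # retracts between Waves 1 and 2
        ["yes", "yes", "yes", "yes", "yes"],   # consistent
        ["no",  "yes", "yes", None,  "yes"],   # not yet sexually active at Wave 1
    ]
    print(adjacent_retraction_rate(toy, 0))    # 0.5: one of two Wave-1 reports retracted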

Table 2
Reports of sexual intercourse by assessment

Table 3 shows the associations between inconsistent reporting of sexual intercourse and both demographics and various indicators of reliability problems. Inconsistent reporting of lifetime sexual activity had no significant association with gender or race. Younger participants were somewhat more likely to be inconsistent, although this finding did not reach statistical significance. Inconsistency of sex responses was also unrelated to the three indices of general reliability problems (academic failure, inconsistent reporting of gender, and inconsistent reporting of birth year). However, participants with inconsistent reports of any of the four substances assessed in this study (alcohol, cigarettes, marijuana, inhalants) were 2 to 3 times more likely to be inconsistent reporters of sexual intercourse.
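
For readers less familiar with unconditional odds ratios, each entry in Table 3 amounts to a cross-tabulation of one predictor against inconsistent sex reporting. The counts below are invented solely to illustrate the arithmetic; they are not the study's data.

    # 2 x 2 table (illustrative counts only):
    #                          inconsistent sex   consistent sex
    # inconsistent substance          a                  b
    # consistent substance            c                  d
    a, b = 60, 40
    c, d = 120, 200

    odds_ratio = (a / b) / (c / d)
    print(round(odds_ratio, 2))   # 2.5: inconsistent substance reporters ~2.5 times as likely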

Table 3
Unconditional odds ratios for predictors of inconsistent reports of sexual activity

Discussion

The results of this study show that the majority of adolescent participants were able to provide consistent reports of their sexual activity over the course of two years. However, a sizable proportion of youth provided contradictory reports. To best address sexually-related public health concerns for youth in South Africa, it is crucial to identify explanations for this inconsistency so that the reliability of data intended to inform prevention efforts can be improved.

Almost 40% of the sexually active participants in our sample were inconsistent in their reporting of lifetime sexual intercourse. Although this proportion may seem large in light of U.S. findings [5, 7-9], these previous studies all had fewer assessments (and hence fewer opportunities for inconsistent responding) than the current study. If we instead examine inconsistent reporting between any two adjacent assessments in the present study, inconsistency drops to 14-18%, which is comparable to what has been found in previous studies. In addition, our assessments occurred up to 6 months apart, and a number of previous studies with higher consistency used more closely spaced assessments [e.g., 5, 6, 11]. Therefore, we believe that our inconsistency rate of 40% is not necessarily an indicator of comparatively low data quality or of the uniqueness of this sample. Rather, it suggests that inconsistent reporting is a problem that compounds with additional widely spaced longitudinal assessments. This underscores the importance of addressing response inconsistency in long-term studies of sexual behavior.

Previous literature has put forth a number of potential explanations for inconsistent reporting of sexual behavior (i.e., memory, ability, motivation, norms/values). However, it is important to acknowledge that there may be social and historical explanations for inconsistent reporting that are unique to South African youth. In particular, forced sexual intercourse may impact reports of sexual activity. Non-consensual sexual intercourse is much more common in South Africa than in the United States [15]. For adolescents who have only engaged in non-consensual sex, the answers to general questions about sexual history may not be straightforward, and their interpretation of these questions and associated response options may change over time.

During our own data collection, the question most frequently asked by participants was how to respond to the lifetime sexual intercourse item when their only experience was forced intercourse. (In this situation, survey administrators instructed these participants to enter “no” for the item.) However, because the survey did not directly ask about forced sexual experiences, there may be invalid data for students who needed but did not seek clarification, and we cannot investigate the degree to which forced sexual experiences were related to response inconsistency. Survey instruments used in future studies of this population should be revised in a way that makes all questions relevant and clear for those who have non-consensual sexual experiences.

Our results differ from previous studies of U.S. adolescents in that inconsistency had no significant association with gender or race. Cross-sectionally, there was a trend toward younger participants being more likely to be inconsistent; however, this finding did not achieve statistical significance. The rates of inconsistency between adjacent assessments were fairly stable over time, meaning that consistency had little longitudinal association with age. In total, these results suggest that, even though there may be subgroup differences in sexual values and norms, these differences do not affect patterns of inconsistent responses in a research context. Therefore, any efforts to address inconsistent reporting in this population will need to target members of all demographic subgroups.

We found no association between inconsistent reporting of sex and either school failure or inconsistent reporting of demographic information. If these indicators are indeed good proxies for low motivation or ability, this suggests that inconsistent reporting of sexual activity cannot be attributed to factors such as low literacy or a disinterest in providing valid responses. It is therefore unlikely that consistency can be improved by general efforts to simplify assessment procedures, such as reading survey questions aloud or simplifying their wording.

As shown in Table 1, between 17 and 24% of the sample inconsistently reported their use of each of four substances. This is compatible with results of other studies examining longitudinal reports of substance use [e.g., 16]. In our data, inconsistent reports of substance use also exhibit patterns over time that are similar to those shown for sex in Table 2, with inconsistency compounding over time but remaining fairly stable between pairs of adjacent assessments (results not shown).

The significant associations between inconsistent reports of sexual intercourse and inconsistent reports of substance use suggest that the reporting of both types of risk behavior may be influenced by similar factors. For example, in this particular context, there may be similar norms and expectations for multiple types of risk behavior. Sex and drugs may carry similar risk of embarrassment or stigma. It is also possible that both types of behavior have similar value for attaining status with peers. Given that the two types of inconsistency are related, it suggests that the same strategies may be effective in reducing both.

Strategies for Reducing Inconsistency

In risk behavior research, efforts to protect the confidentiality of responses need to be visible to adolescents, and adolescents need to perceive those efforts as effective. Researchers should also strive to foster motivations that offset the desire to provide socially desirable responses. This might include appealing to adolescents' sense of civic responsibility by sharing how results will be used to improve the lives of both participants and other youth, and then stressing that accurate responses are more helpful than inaccurate ones.

The increasing popularity of electronic modes of assessment, coupled with improving infrastructure for technologies like wireless Internet, may make it possible to reduce inconsistencies by incorporating adolescents' previous responses into subsequent assessments. For example, an adolescent might report being sexually active at her first assessment. This response could be electronically stored, and then retrieved (either directly from a hard drive or remotely via the Internet) at the beginning of the next assessment. If and when an inconsistent response is given, the adolescent could be presented with one or more questions to follow up on the inconsistency. According to the adolescent, which of the two responses is correct? Was the inconsistency due to dishonesty, error, or a change in how the adolescent interpreted the question? This would allow the researcher to have more valid data, as well as give greater insight into adolescents as participants in risk behavior research.
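
A minimal sketch of this idea follows; it is hypothetical and does not describe the PDA software used in this study. The previous wave's stored answer is retrieved when the item is administered, and an inconsistent answer triggers clarifying follow-up questions before the new response is saved.

    previous_answers = {"id_042": "yes"}     # loaded from local storage or a server

    def administer_lifetime_sex_item(participant_id, current_answer):
        prior = previous_answers.get(participant_id)
        record = {"answer": current_answer, "follow_up": []}
        if prior == "yes" and current_answer == "no":
            # Inconsistency detected: ask the participant to resolve it.
            record["follow_up"] = [
                "Which of your two answers is correct?",
                "Was the earlier answer a mistake, or did you understand the question differently this time?",
            ]
        previous_answers[participant_id] = current_answer
        return record

    print(administer_lifetime_sex_item("id_042", "no"))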

Even if it is impossible to completely eliminate inconsistent responses, there is at least one analytic strategy in which inconsistent responses can be retained for analyses meant to describe only logical patterns of development. In latent transition analysis, the probability of illogical transitions (i.e., sexual activity to virginity) can be constrained to zero. Any individuals actually exhibiting this type of transition are included in analysis, but contribute to reduced model fit [17]. Unfortunately, for longitudinal analyses using techniques other than latent transition analysis, it is unclear how best to handle inconsistent data.
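
As a conceptual illustration of this constraint (not a full latent transition model; the probabilities below are invented), the transition probability from the sexually active status to the virgin status is fixed at zero so that only logically possible transitions receive probability mass.

    import numpy as np

    statuses = ["virgin", "sexually active"]

    # Rows: status at wave t; columns: status at wave t + 1.
    transition = np.array([
        [0.80, 0.20],   # virgin -> virgin, virgin -> sexually active
        [0.00, 1.00],   # sexually active -> virgin is constrained to 0
    ])
    assert np.allclose(transition.sum(axis=1), 1.0)   # each row is a probability distribution
    print(dict(zip(statuses, transition.tolist())))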

In the past, researchers have often dealt with inconsistent responses by recoding them according to logical rules. For example, a researcher might recode all subsequent responses to be consistent with an earlier report of sexual intercourse, on the assumption that the first report is more proximal to the event of interest and therefore more likely to be accurately recalled. However, the use of such rules is rarely reported, and their implications for statistical inference have not been explored. The issue of inconsistent reporting is therefore one that methodologists should strive to address, ultimately providing recommendations for data analysts.
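
One such rule might be sketched as follows (our illustration of the general idea, not a recommended practice): after the first report of intercourse, all later non-missing responses are recoded to agree with it.

    def recode_forward(reports):
        """reports: list of 'yes', 'no', or None; later responses recoded to match the first 'yes'."""
        recoded, seen_yes = [], False
        for r in reports:
            if r == "yes":
                seen_yes = True
            recoded.append("yes" if (seen_yes and r is not None) else r)
        return recoded

    print(recode_forward(["no", "yes", "no", None, "no"]))
    # ['no', 'yes', 'yes', None, 'yes']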

Limitations and Future Directions

It is important to acknowledge that by operationalizing consistency in the way that we have, we are limited to stating that an invalid response is present. We do not know whether it is the initial report of sexual activity or the subsequent report of virginity (or, indeed, both) that is false. In addition, this indicator only captures one type of false reporting: moving from a report of sexual activity to one of virginity. We are unable to determine which participants have responses that are false yet logically possible. For example, a participant may report being a virgin at one time point and then lie about being sexually active at a later time point. Similarly, a participant who consistently reports being sexually active may have misreported at initial assessments, and a participant who consistently reports being a virgin may be misreporting at later assessments. Unfortunately, this type of error is undetectable in our data.

In the present study, the majority of participants had missing data for at least one assessment. As long as they did not provide any inconsistent responses, we considered participants with missing data to be consistent responders. This use of only the available data contrasts with a strategy in which missing data are incorporated into analysis by way of a procedure such as multiple imputation or the EM algorithm [e.g., 18]. Our chosen strategy does have its limitations; depending on the degree to which inconsistency and its correlates are related to attrition, our results may not be fully representative of the general population. However, it is only youth who explicitly provide inconsistent data who cause analytic problems for researchers, because it is these data on which missing data procedures and inference are based. Explanations of inconsistencies among these participants can potentially be used to reduce reported inconsistencies in future research. In contrast, we would argue that predicting potential inconsistencies in data that are never collected is of little practical utility.

In addition, we acknowledge that the results presented here are only representative of youth who are enrolled in and attend secondary school. In South Africa, education is only compulsory through grade 9 or age 15 [19]. National educational statistics [20] show that, beginning in grade 9, South African schools tend to be under-enrolled, meaning that there are fewer enrolled students than one would expect given the number of youth of the corresponding age within the population. Only about 40% of grade 1 students remain in school through grade 12. It is possible that the youth who either do not enroll in or do not attend school differ in the degree to which they are consistent reporters of risk behavior. Therefore, our results may not generalize to research participants in non-school settings.

The degree to which our results are unique to our PDA assessment format is also unclear. There is some evidence that reported prevalence of sexual behaviors differs between electronic assessments and either interviews or paper-and-pencil self-reports, and that participants find electronic assessments both more confidential and more conducive to honest responding [21-23]. Therefore, with non-electronic assessment modalities, rates of inconsistent responding and the associations between inconsistent responses and other constructs may differ from what we have reported here.

Despite these limitations, our study provides evidence that inconsistent reporting of sexual behavior is a genuine problem in research with South African youth. It is an issue that cuts across the demographic groups present in this study, suggesting some degree of universality within our target population. At the same time, the relationship between inconsistent reporting of sex and inconsistent reporting of substance use implies that researchers need to address motivations for invalid responses in the risk behavior domain specifically. Methodologists also need to provide solutions for dealing with any inconsistencies that remain after research protocols have been modified.

ACKNOWLEDGEMENTS

This research was funded by NIH Grants R01 DA01749 and T32 DA017629-01A1. The authors wish to thank the PSU Prevention and Methodology Training fellows and advisors for their comments on an earlier version of this manuscript.

Footnotes


1Of the 713 participants in the sexually-active subsample, 252 completed all five assessments. Poisson regressions predicting a count of missing assessments showed that boys, older participants, Black and White participants, participants who had ever failed a grade, participants who were sexually active at baseline, participants who had not used alcohol by Assessment 4, and participants who had used inhalants by Assessment 4 had significantly more missing assessments (p < .05).

References

1. Shisana O, Rehle T, Simbayi LC, et al. South African national HIV prevalence, HIV incidence, behaviour and communication survey, 2005. HSRC Press; Cape Town: 2005.
2. Pettifor AE, Rees HV, Steffenson A, et al. HIV and sexual behaviour among young South Africans: A national survey of 15-24 year olds. Reproductive Health Research Unit, University of the Witwatersrand; Johannesburg: 2004.
3. Brener ND, Billy JOG, Grady WR. Assessment of factors affecting the validity of self-reported health-risk behavior among adolescents: Evidence from the scientific literature. J Adolesc Health. 2003;33:436–457. [PubMed]
4. Catania JA, Gibson DR, Chitwood DD, et al. Methodological problems in AIDS behavior research: Influences on measurement error and participation bias in studies of sexual behavior. Psychol Bull. 1990;108:339–362. [PubMed]
5. Rodgers JL, Billy JOG, Udry JR. The rescission of behaviors: Inconsistent responses in adolescent sexuality data. Soc Sci Res. 1982;11:280–296.
6. Brener ND, Kann L, McManus T, et al. Reliability of the 1999 Youth Risk Behavior Survey questionnaire. J Adolesc Health. 2002;31:336–342. [PubMed]
7. Upchurch DM, Lillard LA, Aneshensel CS, et al. Inconsistencies in reporting the occurrence and timing of first intercourse among adolescents. J Sex Res. 2002;39:197–206. [PubMed]
8. Alexander CS, Somerfield MR, Ensminger ME, et al. Consistency of adolescents' self-report of sexual behavior in a longitudinal study. J Youth Adolesc. 1993;22:455–471.
9. Capaldi DM. The reliability of retrospective report for timing first sexual intercourse for adolescent males. J Adolesc Res. 1996;11:375–387.
10. Siegel DM, Aten MJ, Roghmann KJ. Self-reported honesty among middle and high school students responding to a sexual behavior questionnaire. J Adolesc Health. 1998;23:20–28. [PubMed]
11. Flisher AJ, Evans J, Muller M, et al. Brief report: Test-retest reliability of self-reported adolescent risk behaviour. Journal of Adolescence. 2004;27:207–212. [PubMed]
12. Jaspan HB, Flisher AJ, Myer L, et al. Brief report: Methods for collecting sexual behaviour information from South African adolescents - a comparison of paper versus personal digital assistant questionnaires. Journal of Adolescence. 2007;30:353–359. [PubMed]
13. Caldwell LL, Smith E, Wegner L, et al. HealthWise South Africa: Development of a life skills curriculum for young adults. World Leis. 2004;3:4–17.
14. Palen L, Graham JW, Smith EA, et al. Rates of missing responses in PDA versus paper assessments. Evaluation Review. in press. [PubMed]
15. United Nations Office on Drugs and Crime . Seventh United Nations survey on crime trends and operations of criminal justice systems. Vienna, Austria: 2004.
16. Percy A, McAlister S, Higgins K, et al. Response consistency in young adolescents' drug use self-reports: A recanting rate analysis. Addiction. 2005;100. [PubMed]
17. Lanza ST, Flaherty BP, Collins LM. Latent class and latent transition analysis. In: Schinka J, Velicer W, editors. Handbook of psychology. Wiley; New York: 2002. pp. 663–685.
18. Graham JW, Cumsille PE, Elek-Fisk E. Handbook of psychology: Research methods in psychology. Wiley & Sons; Hoboken, NJ: 2003. Methods for handling missing data. pp. 87–114.
19. South African Schools Act. 1996.
20. Department of Education . Education statistics in South Africa at a glance in 2001. Pretoria: 2003.
21. Mensch BS, Hewett PC, Erulkar AS. The reporting of sensitive behavior by adolescents: A methodological experiment in Kenya. Demography. 2003;40:247–268. [PubMed]
22. Metzger DS, Koblin B, Turner C, et al. Randomized controlled trial of Audio Computer-assisted Self-interviewing: Utility and acceptability in longitudinal studies. American Journal of Epidemiology. 2000;152:99–106. [PubMed]
23. Zwarenstein M, Seebregts C, Mathews C, et al. Handheld computers for survey and trial data collection in resource-poor settings: Development and evaluation of PDACT, a Palm Pilot interviewing system. Manuscript submitted for publication 2006. [PubMed]