Quality of health literacy instruments used in children and adolescents: a systematic review
============================================================================================

* Shuaijun Guo
* Rebecca Armstrong
* Elizabeth Waters
* Thirunavukkarasu Sathish
* Sheikh M Alif
* Geoffrey R Browne
* Xiaoming Yu

## Abstract

**Objective** Improving health literacy at an early age is crucial to personal health and development. Although health literacy in children and adolescents has gained momentum in the past decade, it remains an under-researched area, particularly health literacy measurement. This study aimed to examine the quality of health literacy instruments used in children and adolescents and to identify the best instrument for field use.

**Design** Systematic review.

**Setting** A wide range of settings including schools, clinics and communities.

**Participants** Children and/or adolescents aged 6–24 years.

**Primary and secondary outcome measures** Measurement properties (reliability, validity and responsiveness) and other important characteristics (eg, health topics, components or scoring systems) of health literacy instruments.

**Results** There were 29 health literacy instruments identified from the screening process. When measuring health literacy in children and adolescents, researchers mainly focus on the functional domain (basic skills in reading and writing) and consider participant characteristics of developmental change (of cognitive ability), dependency (on parents) and demographic patterns (eg, racial/ethnic backgrounds), and less so on differential epidemiology (of health and illness). The methodological quality of included studies as assessed via measurement properties varied from poor to excellent. More than half (62.9%) of measurement properties were unknown, due to either poor methodological quality of included studies or a lack of reporting or assessment. The 8-item Health Literacy Assessment Tool (HLAT-8) showed the best evidence of construct validity, and the Health Literacy Measure for Adolescents showed the best evidence of reliability.

**Conclusions** More rigorous and high-quality studies are needed to fill the knowledge gap in measurement properties of health literacy instruments. Although it is challenging to draw a robust conclusion about which instrument is the most reliable and the most valid, this review provides important evidence that supports the use of the HLAT-8 to measure childhood and adolescent health literacy in future school-based research.

* measurement properties
* health literacy
* children
* adolescents
* systematic review

### Strengths and limitations of this study

* The COSMIN (COnsensus-based Standards for the selection of health Measurement INstruments) checklist was used as a methodological framework to rate the methodological quality of included studies.
* This review has updated the three previous reviews of childhood and adolescent health literacy measurement tools and identified 19 additional new health literacy instruments.
* Including only studies that aimed to develop or validate a health literacy instrument may eliminate studies that used a health literacy instrument for other purposes.
* Individual subjectivity exists in the screening and data synthesis stages.
## Introduction

Health literacy is a personal resource that enables an individual to make decisions for healthcare, disease prevention and health promotion in everyday life.1 As defined by the WHO,2 health literacy refers to ‘*the cognitive and social skills which determine the motivation and ability of individuals to gain access to, understand and use information in ways which promote and maintain good health*’. The literature has shown that health literacy is an independent and more direct predictor of health outcomes than sociodemographics.3 4 People with low health literacy are likely to have more health-compromising behaviours, higher healthcare costs and poorer health status.5 Given the close relationship between health literacy and health outcomes, many countries have adopted health literacy promotion as a key strategy to reduce health inequities.6

From a health promotion perspective, improving health literacy at an early age is crucial to childhood and adolescent health and development.7 As demonstrated by Diamond *et al*8 and Robinson *et al*,9 health literacy interventions for children and adolescents can bring about improvements in healthy behaviours and decreased use of emergency department services. Although health literacy in young people has gained increasing attention, with a rapidly growing number of publications in the past decade,10–13 childhood and adolescent health literacy is still under-researched. According to Forrest *et al*’s 4D model,14 15 health literacy in children and adolescents is mediated by four additional factors compared with adults: (1) *developmental* change: children and adolescents have less well-developed cognitive ability than adults; (2) *dependency*: children and adolescents depend more on their parents and peers than adults do; (3) *differential* epidemiology: children and adolescents experience a unique pattern of health, illness and disability; and (4) *demographic* patterns: many children and adolescents living in poverty or in single-parent families are neglected and so require additional care. These four differences pose significant challenges for researchers when measuring health literacy in children and adolescents.

Health literacy is a broad and multidimensional concept with varying definitions.16 This paper uses the definition by Nutbeam,17 who states that health literacy consists of three domains: functional, interactive and critical. The *functional* domain refers to basic skills in reading and writing health information, which are important for functioning effectively in everyday life. The *interactive* domain represents advanced skills that allow individuals to extract health information and derive meaning from different forms of communication. The *critical* domain represents more advanced skills that can be used to critically evaluate health information and take control over health determinants.17 Although health literacy is sufficiently explained in terms of its definitions17–19 and theoretical models,4 7 its measurement remains a contested issue. There are two possible reasons for this.
One reason is the large variety of health literacy definitions and conceptual models,12 16 and the other reason is that researchers may have different study aims, populations and contexts when measuring health literacy.20 21

Currently, there are three systematic reviews describing and analysing the methodology and measurement of childhood and adolescent health literacy.10 11 13 In 2013, Ormshaw *et al*10 conducted a systematic review of child and adolescent health literacy measures. This review used four questions to explore health literacy measurement in children and adolescents: ‘*What measurement tools were used? What health topics were involved? What components were identified?* and *Did studies achieve their stated aims?*’ The authors identified 16 empirical studies, with only 6 of them evaluating health literacy measurement as their primary aim. The remaining studies used health literacy measures as either a comparison tool when developing other new instruments or as a dependent variable to examine the effect of an intervention programme. Subsequently, in 2014, Perry11 conducted an integrative review of health literacy instruments used in adolescents. In accordance with the eligibility criteria, five instruments were identified. More recently, Okan *et al*13 conducted another systematic review on generic health literacy instruments used for children and adolescents, with the aim of identifying and assessing relevant instruments for first-time use. They found 15 generic health literacy instruments used for this target group.

Although these three reviews provide general knowledge about the methodology and measurement of health literacy in young people, they all have limitations. Ormshaw *et al*10 did not evaluate measurement properties of each health literacy instrument. Although Perry11 and Okan *et al*13 summarised the measurement properties of each instrument, the information provided was limited, mostly descriptive and lacked a critical appraisal. Notably, none of the three reviews considered the methodological quality of included studies.10 11 13 A lack of quality assessment of studies raises concerns about the utility of such reviews for evaluating and selecting health literacy instruments for children and adolescents. Therefore, it is still unclear which instrument is the best in terms of its validity, reliability and feasibility for field use. In addition, it is also unclear how Nutbeam’s17 three-domain health literacy model and Forrest *et al*’s14 15 4D model are considered in existing health literacy instruments for children and adolescents. To fill these knowledge gaps, this systematic review aimed to examine the quality of health literacy instruments used in the young population and to identify the best instrument for field use. We expect the findings will assist researchers in identifying and selecting the most appropriate instrument for different purposes when measuring childhood and adolescent health literacy.

## Methods

Following the methods for conducting systematic reviews outlined in the Cochrane Handbook,22 we developed a review protocol (see online supplementary appendix 1, PROSPERO registration number: CRD42018013759) prior to commencing the study. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement23 (see *Research Checklist*) was used to ensure the reporting quality of this review.
### Supplementary data

[bmjopen-2017-020080-SP1.pdf]

### Literature search

The literature search was conducted in two rounds: the initial search covered the period between 1 January 1974 and 16 May 2014 (period 1). The start date of 1974 was chosen because this was the year in which the term ‘*health literacy*’ was first used.24 A second search was conducted in February 2018 to update the review; it covered the period from 17 May 2014 to 31 January 2018 (period 2). The databases searched were Medline, PubMed, Embase, PsycINFO, Cumulative Index to Nursing and Allied Health Literature, Education Resources Information Center, and the Cochrane Library. The search strategy was designed on the basis of previous reviews5 10 25 26 and in consultation with two expert librarians. Three types of search terms were used: (1) construct-related terms: ‘health literacy’ OR ‘health and education and literacy’; (2) outcome-related terms: ‘health literacy assess*’ OR ‘health literacy measure*’ OR ‘health literacy evaluat*’ OR ‘health literacy instrument*’ OR ‘health literacy tool*’; and (3) age-related terms: ‘child*’ OR ‘adolescent*’ OR ‘student*’ OR ‘youth’ OR ‘young people’ OR ‘teen*’ OR ‘young adult’. No language restriction was applied. The detailed search strategy for each database is available in online supplementary appendix 2; an illustrative sketch of how these term groups can be combined is given after the ‘Selection process’ subsection below. As per the PRISMA flow diagram,23 the references from included studies and from six previously published systematic reviews on health literacy5 10 25–28 were also included.

### Eligibility criteria

Studies had to fulfil the following criteria to be included: (1) the stated aim of the study was to develop or validate a health literacy instrument; (2) participants were children or adolescents aged 6–24—this broad age range was used because the age range for ‘*children*’ (under the age of 18) and ‘*adolescents*’ (aged 10–24) overlap,29 and also because children aged over 6 are able to learn and develop their own health literacy30; (3) the term ‘*health literacy*’ was explicitly defined, although studies assessing health numeracy (the ability to understand and use numbers in healthcare settings) were also considered; and (4) at least one measurement property (reliability, validity and responsiveness) was reported in the outcomes. Studies were excluded if (1) the full paper was not available (ie, only a conference abstract or protocol was available); (2) they were not peer-reviewed (eg, dissertations, government reports); or (3) they were qualitative studies.

### Selection process

All references were imported into EndNote V.X7 software (Thomson Reuters, New York, New York) and duplicate records were initially removed before screening. Next, one author (SG) screened all studies based on the title and abstract. Full-text papers of the remaining titles and abstracts were then obtained separately for each review round (period 1 and period 2). All papers were screened by two independent authors (SG and SMA). At each major step of this systematic review, discrepancies between authors were resolved through discussion.
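To make the combination of search terms concrete, the sketch below assembles the three groups of terms from the ‘Literature search’ subsection into a single boolean query. This is an illustration only: the OR-within-groups and AND-between-blocks structure is an assumption made here for demonstration, and the exact syntax and grouping used for each database is reported in online supplementary appendix 2.

```python
# Illustrative sketch only: one plausible way of combining the three groups of
# search terms into a single boolean query string. The grouping below
# (construct and outcome terms pooled with OR, then restricted by the
# age-related block with AND) is an assumption, not the appendix 2 strategy.

construct_terms = ['"health literacy"', '"health and education and literacy"']
outcome_terms = [
    '"health literacy assess*"', '"health literacy measure*"',
    '"health literacy evaluat*"', '"health literacy instrument*"',
    '"health literacy tool*"',
]
age_terms = [
    '"child*"', '"adolescent*"', '"student*"', '"youth"',
    '"young people"', '"teen*"', '"young adult"',
]


def or_block(terms):
    """Join a group of terms with OR and wrap the result in parentheses."""
    return "(" + " OR ".join(terms) + ")"


# Construct-related and outcome-related terms both target the concept of
# health literacy, so they are pooled; the age block restricts the population.
query = or_block(construct_terms + outcome_terms) + " AND " + or_block(age_terms)
print(query)
```

In such queries the asterisk is a truncation wildcard, so that, for example, ‘adolescent*’ also retrieves ‘adolescents’ and ‘adolescence’.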
### Data extraction

The data that were extracted from papers were the characteristics of the included studies (eg, first author, published year and country), general characteristics of instruments (eg, health topics, components and scoring systems), methodological quality of the study (eg, internal consistency, reliability and measurement error) and ratings of measurement properties of included instruments (eg, internal consistency, reliability and measurement error). Data extraction from full-text papers published during period 1 was performed by two independent authors (SG and TS), whereas data extraction from full-text papers published during period 2 was conducted by one author (SG) and then checked by a second author (TS).

### Methodological quality assessment of included studies

The methodological quality of included studies was assessed using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) checklist.31 The COSMIN checklist is a critical appraisal tool containing standards for evaluating the methodological quality of studies on measurement properties of health measurement instruments.32 Specifically, nine measurement properties (internal consistency, reliability, measurement error, content validity, structural validity, hypotheses testing, cross-cultural validity, criterion validity and responsiveness) were assessed.32 Since there is no agreed-upon ‘*gold standard*’ for health literacy measurement,33 34 criterion validity was not assessed in this review. Each measurement property section contains 5–18 evaluating items. For example, ‘*internal consistency*’ is evaluated against 11 items. Each item is scored using a 4-point scoring system (‘*excellent*’, ‘*good*’, ‘*fair*’ or ‘*poor*’). The overall methodological quality of a study is obtained for each measurement property separately, by taking the lowest rating of any item in that section (ie, ‘*worst score counts*’). Two authors (SG and TS) independently assessed the methodological quality of included studies published during period 1, whereas the quality of included studies published during period 2 was assessed by one author (SG) and then checked by another (TS).

### Evaluation of measurement properties for included instruments

The quality of each measurement property of an instrument was evaluated using the quality criteria proposed by Terwee *et al*,35 who are members of the group that developed the COSMIN checklist (see online supplementary appendix 3). Each measurement property was given a rating result (‘*+*’ positive, ‘−’ negative, ‘?’ indeterminate and ‘na’ no information available).

### Best evidence synthesis: levels of evidence

As recommended by the COSMIN checklist developer group,32 ‘*a best evidence synthesis*’ was used to synthesise all the evidence on measurement properties of different instruments. The procedure used was similar to the Grading of Recommendations, Assessment, Development and Evaluation (GRADE) framework,36 a transparent approach to rating quality of evidence that is often used in reviews of clinical trials.37 Given that this review did not target clinical trials, the GRADE framework adapted by the COSMIN group was used.38 Under this procedure, the possible overall rating for a measurement property is ‘*positive*’, ‘*negative*’, ‘*conflicting*’ or ‘*unknown*’, accompanied by levels of evidence (‘*strong*’, ‘*moderate*’ or ‘*limited*’) (see online supplementary appendix 4). Three steps were taken to obtain the overall rating for a measurement property.
First, the methodological quality of a study on each measurement property was assessed using the COSMIN checklist. Measurement properties from ‘*poor*’ methodological quality studies did not contribute to ‘*the best evidence synthesis*’. Second, the quality of each measurement property of an instrument was evaluated using Terwee’s quality criteria.35 Third, the rating results of measurement properties in different studies on the same instrument were examined to determine whether they were consistent. This best evidence synthesis was performed by one author (SG) and then checked by a second author (TS).

### Patient and public involvement

Children and adolescents were not involved in setting the research question, the outcome measures, or the design or implementation of this study.

## Results

The initial search identified 2790 studies. After removal of duplicates and initial title/abstract screening, 361 full-text articles were identified and obtained. As per the eligibility criteria, 29 studies were included,39–53 yielding 29 unique health literacy instruments used in children and adolescents (see figure 1).

Figure 1 Flow chart of search and selection process according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses flow diagram.

### Characteristics of included studies

Of the 29 studies identified, 25 were published between 2010 and 2017 (see table 1). Most included studies were conducted in Western countries (n=20), with 11 studies carried out in the USA. The target population (aged 7–25) could be roughly classified into three subgroups: children aged 7–12 (n=5), adolescents aged 13–17 (n=20) and young adults aged 18–25 (n=4). Schools (n=17) were the most common recruitment settings, compared with clinical settings (n=8) and communities (n=4).

Table 1 Characteristics of included studies

### General characteristics of included instruments

Compared with previous systematic reviews,10 11 13 this review identified 19 additional new health literacy instruments (eHealth Literacy Scale (eHEALS), short-form Test of Functional Health Literacy in Adults (s-TOFHLA), Diabetes Numeracy Test (DNT)-39, DNT-14, 51-item Health Literacy Assessment Tool (HLAT-51), 8-item Health Literacy Assessment Tool (HLAT-8), Child Health Literacy Test (CHLT), Visual Oral Health Literacy (VOHL), Health Literacy Assessment Scale for Adolescents (HAS-A), Questionnaire for Assessment of Mental Health Literacy (QuALiSMental), Functional, Communicative, and Critical Health Literacy-Adolescents and Young Adults Cancer (FCCHL-AYAC), Interactive and Critical Health Literacy (ICHL), Health Literacy Measure for Adolescents (HELMA), Health Literacy for School-aged Children (HLSAC), Rapid Estimate of Adolescent Literacy in Medicine Short Form (REALM-TeenS), Functional Health Literacy Scale for Young Adults (funHLS-YA), Health Literacy Scale for Thai Childhood Overweight (HLS-TCO), Health Literacy and Resiliency Scale: Youth Version (HLRS-Y), and the Portuguese version of the 8-item Health Literacy Assessment Tool (p_HLAT-8)).
The 29 health literacy instruments were classified into three groups based on whether the instrument was developed bespoke for the study or not (see table 2).10 The three groups were (1) newly developed instruments for childhood, adolescent and youth health literacy (n=20)40–47 49 50 54–63; (2) adapted instruments that were based on previous instruments for adult/adolescent health literacy (n=6)51 53 64–67; and (3) original instruments that were developed for adult health literacy (n=3).39 48 50 52

Table 2 General and important characteristics of included instruments used in children and adolescents

#### Health literacy domains and components

Next, Nutbeam’s three-domain health literacy model17 was used to classify the 29 instruments according to which of the commonly used components of health literacy were included. Results showed that ten instruments measured only functional health literacy39 41 48 50–53 55 61 66 and one instrument measured only critical health literacy.47 There was one instrument measuring functional and interactive health literacy,46 one measuring functional and critical health literacy,40 and one measuring interactive and critical health literacy.58 Fifteen instruments measured health literacy by all three domains (functional, interactive and critical).42–45 49 54 56 57 59 60 62–65 67

#### Consideration of participants’ characteristics

As per Forrest *et al*’s 4D model,14 15 the 29 included instruments were examined for whether participant characteristics were considered when developing a new instrument or validating an existing instrument. The results showed most of the health literacy instruments considered developmental change, dependency and demographic patterns. In contrast, only seven instruments considered differential epidemiology.53 57 58 62 63 65

#### Health topics, contents and readability levels

Health literacy instruments for children and adolescents covered a range of health topics such as nutrition and sexual health. Most instruments (n=26) measured health literacy in healthcare settings or health promotion contexts (eg, general health topics, oral health or mental health), while only three instruments measured health literacy in the specific context of eHealth or media health.42 43 49 In relation to the readability of tested materials, only eight health literacy instruments reported their readability levels, ranging from 2nd to 19.5th grade.

#### Burden and forms of administration

The time to administer was reported for seven instruments, ranging from 3 to 90 min. There were three forms of administration: self-administered instruments (n=19), interviewer-administered instruments (n=9), and video-assisted, interviewer-administered instruments (n=1). Regarding the method of assessment, 15 instruments were performance-based, 11 instruments were self-report and 3 included both performance-based and self-report items.

### Evaluation of methodological quality of included studies

According to the COSMIN checklist, the methodological quality of each instrument as assessed by each study is presented in table 3. Almost all studies (n=28) examined content validity, 24 studies assessed internal consistency and hypotheses testing, 17 studies examined structural validity, 8 studies assessed test–retest/inter-rater reliability, 2 studies assessed cross-cultural validity and only 1 study assessed responsiveness.
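The scoring rule behind the ratings in table 3 can be made concrete. As described in the Methods, the overall methodological quality of a study for each measurement property is the lowest rating given to any COSMIN item in that property’s section (‘worst score counts’), and properties rated ‘poor’ do not contribute to the best evidence synthesis. The minimal sketch below illustrates this rule; the item ratings are hypothetical and are not taken from any included study.

```python
# Minimal sketch of the COSMIN 'worst score counts' rule described in the
# Methods: the overall methodological quality of a study for one measurement
# property is the lowest rating given to any item in that property's section.
# The item ratings below are hypothetical, not taken from any included study.

RATING_ORDER = ["poor", "fair", "good", "excellent"]  # worst to best


def worst_score_counts(item_ratings):
    """Return the overall rating for one measurement property section."""
    return min(item_ratings, key=RATING_ORDER.index)


# Hypothetical COSMIN item ratings for one study
study_ratings = {
    "internal consistency": ["excellent", "good", "good", "fair"],
    "reliability": ["good", "good", "poor"],
    "structural validity": ["excellent", "excellent", "good"],
}

overall = {prop: worst_score_counts(items) for prop, items in study_ratings.items()}
print(overall)
# {'internal consistency': 'fair', 'reliability': 'poor', 'structural validity': 'good'}

# Properties rated 'poor' do not contribute to the best evidence synthesis.
eligible = [prop for prop, rating in overall.items() if rating != "poor"]
print(eligible)  # ['internal consistency', 'structural validity']
```

Because a single weak design element downgrades the whole property, the rule is deliberately conservative, which is one reason why so much of the information on measurement properties ends up rated as unknown in table 5.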
Table 3 Methodological quality of each study for each measurement property according to the COSMIN checklist

### Evaluation of instruments’ measurement properties

After the methodological quality assessment of included studies, the measurement properties of each health literacy instrument were examined according to Terwee’s quality criteria (see online supplementary appendix 5).35 The rating results of the measurement properties of each instrument are summarised in table 4.

Table 4 Evaluation of measurement properties for included instruments according to Terwee’s quality criteria

### The synthesised evidence for the overall rating of measurement properties

Finally, a synthesis was conducted for the overall rating of measurement properties for each instrument according to ‘*the best evidence synthesis*’ guidelines recommended by the COSMIN checklist developer group.32 This synthesis result was derived from information presented in table 3 and table 4. The overall rating of each measurement property for each health literacy instrument is presented in table 5. In summary, most information (62.9%, 146/232) on measurement properties was unknown due to either poor methodological quality of studies or a lack of information on reporting or assessment.

Table 5 The overall quality of measurement properties for each health literacy instrument used in children and adolescents

## Discussion

### Summary of the main results

This study identified and examined 29 health literacy instruments used in children and adolescents and exemplified the large variety of methods used. Compared with the three previous systematic reviews,10 11 13 this review identified 19 additional new health literacy instruments and critically appraised the measurement properties of each instrument. It showed that, to date, only half of the included health literacy instruments (15/29) measure all three domains (functional, interactive and critical) and that the functional domain is still the focus of attention when measuring health literacy in children and adolescents. Additionally, researchers mainly focus on participant characteristics of developmental change (of cognitive ability), dependency (on parents) and demographic patterns (eg, racial/ethnic backgrounds), and less so on differential epidemiology (of health and illness). The methodological quality of included studies as assessed via measurement properties varied from poor to excellent. Most information (62.9%) on measurement properties was unknown due to either the poor methodological quality of studies or a lack of reporting or assessment. It is therefore difficult to draw a robust conclusion about which instrument is the best.

### Health literacy measurement in children and adolescents

This review found that health literacy measurement in children and adolescents tends to include Nutbeam’s three-domain health literacy construct (ie, functional, interactive and critical), especially in the past 5 years. However, almost one-third of included instruments focused only on the functional domain (n=10). Unlike health literacy research for patients in clinics, health literacy research for children and adolescents (a comparatively healthy population) should be considered from a health promotion perspective,68 rather than a healthcare or disease management perspective.
Integrating interactive and critical domains into health literacy measurement is aligned with the rationale of emphasising empowerment in health promotion for children and adolescents.69 The focus of health literacy for this population group should therefore include all three domains and so there is a need for future research to integrate the three domains within health literacy instruments.

Similar to previous findings by Ormshaw *et al*10 and Okan *et al*,13 this review also revealed that childhood and adolescent health literacy measurement varied by its dimensions, health topics, forms of administration and by the level to which participant characteristics were considered. There are likely four main reasons for these disparities. First, definitions of health literacy were inconsistent. Some researchers measured general health literacy,40 45 while others measured eHealth literacy or media health literacy.43 49 Second, researchers had different research purposes for their studies. Some researchers used what were originally adult instruments to measure adolescent health literacy,39 48 52 whereas others developed new or adapted instruments.40–42 53 Third, the research settings affected the measurement process. Because clinical settings are busy, short surveys were more appropriate than long surveys.39 41 44 On the other hand, health literacy in school settings was often measured using long and comprehensive surveys.40 42 47 Fourth, researchers considered different participant characteristics when measuring health literacy in children and adolescents. For example, some researchers took into consideration students’ cognitive development,40 41 44 46 51 some focused on adolescents’ resources and environments (eg, friends and family contexts, eHealth contexts, media contexts),43 45 49 and others looked at the effect of different cultural backgrounds and socioeconomic status.40 41 43 44 46 47 49–52 Based on Forrest *et al*’s 4D model,14 15 this review showed that most health literacy instruments considered participants’ development, dependency and demographic patterns, with only seven instruments considering differential epidemiology.53 57 58 62 63 65 Although the ‘4D’ model cannot be used to reduce the disparities in health literacy measurement, it does provide an opportunity to identify gaps in current research and assist researchers to consider participants’ characteristics comprehensively in future research.

### The methodological quality of included studies

This review included a methodological quality assessment of included studies, which was absent from previous reviews on this subject.10 11 Methodological quality assessment is important because strong conclusions about the measurement properties of instruments can only be drawn from high-quality studies. In this review, the COSMIN checklist was shown to be a useful framework for critically appraising the methodological quality of studies via each measurement property. Findings suggested that there was wide variation in the methodological quality of studies for all instruments. Poor methodological quality of studies was often seen in the original or adapted health literacy instruments (the Newest Vital Sign (NVS), the Test of Functional Health Literacy in Adults (TOFHLA), the s-TOFHLA, the DNT-39 and the DNT-14) for two main reasons. The first reason was the vague description of the target population involved.
This suggested that researchers were less likely to consider an instrument’s content validity when using an original adult instrument with children and/or adolescents. Given that children and adolescents have less well-developed cognitive abilities, it is essential that future studies assess whether all items within an instrument are understood by this population. The second reason was a lack of unidimensionality analysis for internal consistency. As explained by the COSMIN group,70 a set of items can be inter-related and multidimensional, whereas unidimensionality is a prerequisite for a clear interpretation of the internal consistency statistics (Cronbach’s alpha). Future research on the use of health literacy instruments therefore needs to assess and report both internal consistency statistics and unidimensionality analysis (eg, factor analysis).

### Critical appraisal of measurement properties for included instruments

This review demonstrated that of all instruments reviewed, three instruments (the Chinese version of short-form Test of Functional Health Literacy in Adolescents (c-sTOFHLAd), the HELMA and the HLSAC) showed satisfactory evidence about internal consistency and test–retest reliability. Based on the synthesised evidence, the HELMA showed moderate evidence and positive results of internal consistency (α=0.93) and test–retest reliability (intraclass correlation coefficient (ICC)=0.93), whereas the HLSAC (α=0.93; standardised stability estimate=0.83) and the c-sTOFHLAd (α=0.85; ICC=0.95) showed limited evidence and positive results. Interestingly, compared with the overall reliability rating of the s-TOFHLA,50 the c-sTOFHLAd showed better results.51 The reason for this was probably the different methodological quality of the studies that examined the s-TOFHLA and the c-sTOFHLAd. The c-sTOFHLAd study had fair methodological quality in terms of internal consistency and test–retest reliability, whereas the original s-TOFHLA study had poor methodological quality for internal consistency and unknown information for test–retest reliability. Given the large disparity of rating results between the original and translated instrument, further evidence is needed to confirm whether the s-TOFHLA has the same or a different reliability within different cultures, thus assisting researchers to understand the generalisability of the s-TOFHLA’s reliability results.

Four instruments were found to show satisfactory evidence about both content validity and construct validity (structural validity and hypotheses testing). Construct validity is a fundamental aspect of psychometrics and was examined in this review for two reasons. First, it enables an instrument to be assessed for the extent to which operational variables adequately represent underlying theoretical constructs.71 Second, the overall rating results of content validity for all included instruments were similar (ie, unknown or moderate/strong evidence and positive result). The only difference was whether the target population was involved. Given that all instruments’ items reflected the measured construct, in this review, construct validity was determined to be key to examining the overall validity of included instruments. In this context, only the HLAT-8 showed strong evidence and a positive result for structural validity (CFI=0.99; TLI=0.97; RMSEA=0.03; SRMR=0.03) and moderate evidence on hypotheses testing (known-group validity results showed differences in health literacy by gender, educational status and health valuation).
However, in the original paper,45 the HLAT-8 was only tested for its known-group validity, not for convergent validity. Examination of convergent validity is important because it assists researchers in understanding the extent to which two examined measures’ constructs are theoretically and practically related.72 Therefore, future research on the convergent validity of the HLAT-8 would usefully complement the existing evidence for its construct validity.

Similar to a previous study by Jordan *et al*,26 this review demonstrated that only one included study contained evidence of responsiveness. Ueno *et al*55 developed a visual oral health literacy instrument and examined responsiveness by comparing changes in health literacy before and after oral health education. Their results showed students’ health literacy scores increased significantly after health education. Responsiveness is the ability of an instrument to detect change over time in the construct being measured, and it is particularly important for longitudinal studies.31 However, most studies included in this review were cross-sectional studies, and only one study (on the Multidimensional Measure of Adolescent Health Literacy44) discussed the potential to measure health literacy over time. Studies that measure health literacy over time in populations are needed, both as a prerequisite for longitudinal research and so that the responsiveness of instruments can be monitored and improved.

### Feasibility issues for included instruments

This review showed that the feasibility aspects of instruments varied markedly. In relation to forms of administration, this review identified 19 self-administered instruments and 10 interviewer-administered instruments. This suggests that self-administered instruments are more commonly used in practice than interviewer-administered instruments. However, both administration modes have limitations. Self-administered instruments are cost-effective and efficient, but may bring about respondent bias, whereas interviewer-administered instruments, while able to ensure high response rates, are typically resource-intensive and expensive to administer.73 Although the literature showed that there was no significant difference in scores between these two administration modes,74 75 the relevant studies mostly concerned health-related quality of life instruments. It is still unknown whether the same is true for health literacy instruments. Among children and adolescents, health literacy research is more likely to be conducted through large-scale surveys in school settings. Therefore, the more cost-effective, self-administered mode seems to have great potential for future research. To further support the wide use of self-administered instruments, future research needs to confirm that self-administered and interviewer-administered modes yield equivalent results for health literacy instruments.

With regard to the type of assessment method, this review revealed that performance-based health literacy instruments (n=15) are used more often than self-report instruments (n=11). There might be two reasons for this. The first relates to participant characteristics. Compared with adults, children and adolescents are more dependent on their parents for health-related decisions.15 Measurement error is more likely to occur when children and adolescents answer self-report items.76 Therefore, performance-based assessment is often selected to avoid such inaccuracy.
Second, performance-based instruments are objective, whereas self-report instruments are subjective and may bring about overestimated results.77 However, the frequent use of performance-based instruments does not mean that they are more appropriate than self-report instruments when measuring childhood and adolescent health literacy. Compared with performance-based instruments, self-report instruments are generally time-efficient and help to preserve respondents’ dignity.21 The challenge in using self-report instruments is to ensure the readability of the tested materials. If children and adolescents can understand what a health literacy instrument measures, then they are more able to accurately self-assess their own health literacy skills.69 The difference between self-report and performance-based instruments of health literacy has been discussed in the literature,78 but the evidence remains limited because few studies have been specifically designed to explore this difference. Further studies are needed to fill this knowledge gap.

### Recommendations for future research

This review identified 18 instruments (the Rapid Estimate of Adolescent Literacy in Medicine (REALM-Teen), the NVS, the s-TOFHLA, the c-sTOFHLAd, the eHEALS, the Critical Health Competence Test (CHC Test), the Health Knowledge, Attitudes, Communication and Self-efficacy Scale (HKACSS), the Health Literacy Assessment Booklet (HLAB), the Media Health Literacy (MHL), the HLAT-51, the CHLT, the VOHL, the QuALiSMental, the HELMA, the HLSAC, the funHLS-YA, the HLS-TCO and the p_HLAT-8) that were used to measure health literacy in school settings. Although it is difficult to categorically state which instrument is the best, this review provides useful information that will assist researchers to identify the most suitable instrument to use when measuring health literacy in children and adolescents in school contexts. Among the 18 instruments, 6 tested functional health literacy (the REALM-Teen, the NVS, the s-TOFHLA, the c-sTOFHLAd, the VOHL and the funHLS-YA), 1 examined critical health literacy (the CHC Test), 1 measured functional and interactive health literacy (the HKACSS), 1 examined functional and critical health literacy (the HLAB), and 9 tested health literacy comprehensively focusing on functional, interactive and critical domains (the eHEALS, the MHL, the HLAT-51, the CHLT, the QuALiSMental, the HELMA, the HLSAC, the HLS-TCO and the p_HLAT-8). However, only one of these three-domain instruments (the HLSAC) was considered appropriate for use in schools because of its quick administration, satisfactory reliability and one-factor validity. Eight three-domain instruments were excluded because they focused on non-general health literacy (the eHEALS, the MHL, the QuALiSMental, the HLS-TCO), were burdensome to administer (the HLAT-51, the HELMA-44) or were not published in English (the CHLT and the p_HLAT-8). Compared with the HLSAC, the HLAT-8 examines the construct of health literacy via three domains rather than a one-factor structure, thus enabling a more comprehensive examination of the construct. Meanwhile, although the p_HLAT-8 (Portuguese version) is not available in English, the original HLAT-8 is.
After comparing measurement domains and measurement properties, the HLAT-8 was deemed to be more suitable for measuring health literacy in school settings for four reasons: (1) it measures health literacy in the context of family and friends,45 a highly important attribute because children and adolescents often need support for health decisions from parents and peers7 15; (2) it is a short but comprehensive tool that captures Nutbeam’s three-domain nature of health literacy17; (3) it showed satisfactory structural validity (RMSEA=0.03; CFI=0.99; TLI=0.97; SRMR=0.03)45; and (4) it has good feasibility (eg, the p_HLAT-8 is self-administered and time-efficient) in school-based studies. However, there are still two main aspects that need to be considered in future research. One aspect is its use in the target population. Given that the HLAT-8 has not been tested in children and adolescents under 18, its readability and measurement properties need to be evaluated in this age group. The other aspect is that its convergent validity (the strength of association between two measures of a similar construct, an essential part of construct validity) has not been examined. Testing the convergent validity of the HLAT-8 is important because high convergent validity assists researchers to understand the extent to which two examined measures’ constructs are theoretically and practically related.

### Limitations

This review was not without limitations. First, we restricted the search to studies aiming to develop or validate a health literacy instrument. Thus we may have missed relevant instruments in studies that were not aiming to develop instruments.79 80 Second, although the COSMIN checklist provided us with strong evidence of the methodological quality of a study via an assessment of each measurement property, it cannot evaluate a study’s overall methodological quality. Third, criterion validity was not examined due to the lack of a ‘gold standard’ for health literacy measurement. However, we examined convergent validity under the domain of ‘hypotheses testing’. This allows the validity of newly developed instruments to be assessed against existing, commonly used instruments. Finally, individual subjectivity inevitably played a part in the screening, data extraction and synthesis stages of the review. To reduce this subjectivity, two authors independently managed the major stages.

## Conclusion

This review updated previous reviews of childhood and adolescent health literacy measurement (cf Ormshaw *et al*, Perry and Okan *et al*)10 11 13 to incorporate a quality assessment framework. It showed that most information on measurement properties was unknown due to either the poor methodological quality of studies or a lack of assessment and reporting. Rigorous and high-quality studies are needed to fill the knowledge gap in relation to health literacy measurement in children and adolescents. Although it is challenging to draw a robust conclusion about which instrument is the best, this review provides important evidence that supports the use of the HLAT-8 to measure childhood and adolescent health literacy in future research.

## Acknowledgments

The authors appreciate the helpful comments received from the reviewers (Martha Driessnack and Debi Bhattacharya) at *BMJ Open*.

## Footnotes

* Contributors SG conceived the review approach. RA and EW provided general guidance for the drafting of the protocol. SG and SMA screened the literature. SG and TS extracted the data. SG drafted the manuscript.
SG, GRB, RA, EW, XY, SMA and TS reviewed and revised the manuscript. All authors contributed to the final manuscript.
* Funding This paper is part of SG’s PhD research project, which is supported by the Melbourne International Engagement Award. This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
* Competing interests None declared.
* Patient consent Not required.
* Provenance and peer review Not commissioned; externally peer reviewed.
* Data sharing statement There are no additional data available.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: [http://creativecommons.org/licenses/by-nc/4.0/](http://creativecommons.org/licenses/by-nc/4.0/)

## References

1. Nutbeam D. The evolving concept of health literacy. Soc Sci Med 2008;67:2072–8. doi:10.1016/j.socscimed.2008.09.050
2. Nutbeam D. Health Promotion Glossary. Health Promot Int 1998;13:349–64. doi:10.1093/heapro/13.4.349
3. Paasche-Orlow MK, Wolf MS. The causal pathways linking health literacy to health outcomes. Am J Health Behav 2007;31(Suppl 1):19–26. doi:10.5993/AJHB.31.s1.4
4. Squiers L, Peinado S, Berkman N, et al. The health literacy skills framework. J Health Commun 2012;17(Suppl 3):30–54. doi:10.1080/10810730.2012.713442
5. Berkman ND, Sheridan SL, Donahue KE, et al. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med 2011;155:97–107. doi:10.7326/0003-4819-155-2-201107190-00005
6. Dodson S, Good S, Osborne R. Health literacy toolkit for low- and middle-income countries: a series of information sheets to empower communities and strengthen health systems. New Delhi: World Health Organization, Regional Office for South-East Asia, 2015.
7. Manganello JA. Health literacy and adolescents: a framework and agenda for future research.
Health Educ Res 2008;23:840–7. doi:10.1093/her/cym069
8. Diamond C, Saintonge S, August P, et al. The development of building wellness, a youth health literacy program. J Health Commun 2011;16(Suppl 3):103–18. doi:10.1080/10810730.2011.604385
9. Robinson LD, Calmes DP, Bazargan M. The impact of literacy enhancement on asthma-related outcomes among underserved children. J Natl Med Assoc 2008;100:892–6. doi:10.1016/S0027-9684(15)31401-2
10. Ormshaw MJ, Paakkari LT, Kannas LK. Measuring child and adolescent health literacy: a systematic review of literature. Health Educ 2013;113:433–55. doi:10.1108/HE-07-2012-0039
11. Perry EL. Health literacy in adolescents: an integrative review. J Spec Pediatr Nurs 2014;19:210–8. doi:10.1111/jspn.12072
12. Bröder J, Okan O, Bauer U, et al. Health literacy in childhood and youth: a systematic review of definitions and models. BMC Public Health 2017;17:361. doi:10.1186/s12889-017-4267-y
13. Okan O, Lopes E, Bollweg TM, et al. Generic health literacy measurement instruments for children and adolescents: a systematic review of the literature. BMC Public Health 2018;18:166. doi:10.1186/s12889-018-5054-0
14. Forrest CB, Simpson L, Clancy C. Child health services research. Challenges and opportunities. JAMA 1997;277:1787–93.
15. Rothman RL, Yin HS, Mulvaney S, et al. Health literacy and quality: focus on chronic illness care and patient safety. Pediatrics 2009;124(Suppl 3):S315–26. doi:10.1542/peds.2009-1163H
16. Sørensen K, Van den Broucke S, Fullam J, et al.
Health literacy and public health: a systematic review and integration of definitions and models. BMC Public Health 2012;12:80. doi:10.1186/1471-2458-12-80
17. Nutbeam D. Health literacy as a public health goal: a challenge for contemporary health education and communication strategies into the 21st century. Health Promot Int 2000;15:259–67. doi:10.1093/heapro/15.3.259
18. Berkman ND, Davis TC, McCormack L. Health literacy: what is it? J Health Commun 2010;15(Suppl 2):9–19. doi:10.1080/10810730.2010.499985
19. Nutbeam D. Defining and measuring health literacy: what can we learn from literacy studies? Int J Public Health 2009;54:303–5. doi:10.1007/s00038-009-0050-x
20. Abel T. Measuring health literacy: moving towards a health-promotion perspective. Int J Public Health 2008;53:169–70. doi:10.1007/s00038-008-0242-9
21. Pleasant A. Advancing health literacy measurement: a pathway to better health and health system performance. J Health Commun 2014;19:1481–96. doi:10.1080/10810730.2014.954083
22. Higgins JP, Green S, eds. Cochrane handbook for systematic reviews of interventions version 5.1.0 [updated Mar 2011]. London, UK: The Cochrane Collaboration, 2011.
23. Moher D, Liberati A, Tetzlaff J, et al. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. Ann Intern Med 2009;151:264–9. doi:10.7326/0003-4819-151-4-200908180-00135
24. Simonds SK. Health education as social policy.
Health Educ Monogr 1974;2(1_suppl):1–10. doi:10.1177/10901981740020S102
25. Mancuso JM. Assessment and measurement of health literacy: an integrative review of the literature. Nurs Health Sci 2009;11:77–89. doi:10.1111/j.1442-2018.2008.00408.x
26. Jordan JE, Osborne RH, Buchbinder R. Critical appraisal of health literacy indices revealed variable underlying constructs, narrow content and psychometric weaknesses. J Clin Epidemiol 2011;64:366–79. doi:10.1016/j.jclinepi.2010.04.005
27. Sanders LM, Federico S, Klass P, et al. Literacy and child health: a systematic review. Arch Pediatr Adolesc Med 2009;163:131–40. doi:10.1001/archpediatrics.2008.539
28. DeWalt DA, Hink A. Health literacy and child health outcomes: a systematic review of the literature. Pediatrics 2009;124(Suppl 3):S265–74. doi:10.1542/peds.2009-1162B
29. Sawyer SM, Afifi RA, Bearinger LH, et al. Adolescence: a foundation for future health. Lancet 2012;379:1630–40. doi:10.1016/S0140-6736(12)60072-5
30. Fok MS, Wong TK. What does health literacy mean to children? Contemp Nurse 2002;13:249–58. doi:10.5172/conu.13.2-3.249
31. Terwee CB, Mokkink LB, Knol DL, et al.
Rating the methodological quality in systematic reviews of studies on measurement properties: a scoring system for the COSMIN checklist. Qual Life Res 2012;21:651–7. [doi:10.1007/s11136-011-9960-1](http://dx.doi.org/10.1007/s11136-011-9960-1)
32. Mokkink LB, Prinsen CA, Bouter LM, et al. The COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) and how to select an outcome measurement instrument. Braz J Phys Ther 2016;20:105–13. [doi:10.1590/bjpt-rbf.2014.0143](http://dx.doi.org/10.1590/bjpt-rbf.2014.0143)
33. McCormack L, Haun J, Sørensen K, et al. Recommendations for advancing health literacy measurement. J Health Commun 2013;18(Suppl 1):9–14. [doi:10.1080/10810730.2013.829892](http://dx.doi.org/10.1080/10810730.2013.829892)
34. Pleasant A, McKinney J, Rikard RV. Health literacy measurement: a proposed research agenda. J Health Commun 2011;16(Suppl 3):11–21. [doi:10.1080/10810730.2011.604392](http://dx.doi.org/10.1080/10810730.2011.604392)
35. Terwee CB, Bot SD, de Boer MR, et al. Quality criteria were proposed for measurement properties of health status questionnaires. J Clin Epidemiol 2007;60:34–42. [doi:10.1016/j.jclinepi.2006.03.012](http://dx.doi.org/10.1016/j.jclinepi.2006.03.012)
36. Atkins D, Best D, Briss PA, et al. Grading quality of evidence and strength of recommendations. BMJ 2004;328:1490–4. [doi:10.1136/bmj.328.7454.1490](http://dx.doi.org/10.1136/bmj.328.7454.1490)
37. Guyatt GH, Oxman AD, Vist GE, et al. GRADE: an emerging consensus on rating quality of evidence and strength of recommendations.
BMJ 2008;336:924–6. [doi:10.1136/bmj.39489.470347.AD](http://dx.doi.org/10.1136/bmj.39489.470347.AD)
38. Schellingerhout JM, Verhagen AP, Heymans MW, et al. Measurement properties of disease-specific questionnaires in patients with neck pain: a systematic review. Qual Life Res 2012;21:659–70. [doi:10.1007/s11136-011-9965-9](http://dx.doi.org/10.1007/s11136-011-9965-9)
39. Warsh J, Chari R, Badaczewski A, et al. Can the Newest Vital Sign be used to assess health literacy in children and adolescents? Clin Pediatr 2014;53:141–4. [doi:10.1177/0009922813504025](http://dx.doi.org/10.1177/0009922813504025)
40. Wu AD, Begoray DL, Macdonald M, et al. Developing and evaluating a relevant and feasible instrument for measuring health literacy of Canadian high school students. Health Promot Int 2010;25:444–52. [doi:10.1093/heapro/daq032](http://dx.doi.org/10.1093/heapro/daq032)
41. Davis TC, Wolf MS, Arnold CL, et al. Development and validation of the Rapid Estimate of Adolescent Literacy in Medicine (REALM-Teen): a tool to screen adolescents for below-grade reading in health care settings. Pediatrics 2006;118:e1707–e14. [doi:10.1542/peds.2006-1139](http://dx.doi.org/10.1542/peds.2006-1139)
42. Harper R. Development of a health literacy assessment for young adult college students: a pilot study. J Am Coll Health 2014;62:125–34. [doi:10.1080/07448481.2013.865625](http://dx.doi.org/10.1080/07448481.2013.865625)
43. Norman CD, Skinner HA. eHEALS: the eHealth Literacy Scale. J Med Internet Res 2006;8:e27. [doi:10.2196/jmir.8.4.e27](http://dx.doi.org/10.2196/jmir.8.4.e27)
44. Massey P, Prelip M, Calimlim B, et al. Findings toward a multidimensional measure of adolescent health literacy.
Am J Health Behav 2013;37:342–50. [doi:10.5993/AJHB.37.3.7](http://dx.doi.org/10.5993/AJHB.37.3.7)
45. Abel T, Hofmann K, Ackermann S, et al. Health literacy among young adults: a short survey tool for public health and health promotion research. Health Promot Int 2015;30:725–35. [doi:10.1093/heapro/dat096](http://dx.doi.org/10.1093/heapro/dat096)
46. Schmidt CO, Fahland RA, Franze M, et al. Health-related behaviour, knowledge, attitudes, communication and social status in school children in Eastern Germany. Health Educ Res 2010;25:542–51. [doi:10.1093/her/cyq011](http://dx.doi.org/10.1093/her/cyq011)
47. Steckelberg A, Hülfenhaus C, Kasper J, et al. How to measure critical health competences: development and validation of the Critical Health Competence Test (CHC Test). Adv Health Sci Educ 2009;14:11–22. [doi:10.1007/s10459-007-9083-1](http://dx.doi.org/10.1007/s10459-007-9083-1)
48. Chisolm DJ, Buchanan L. Measuring adolescent functional health literacy: a pilot validation of the Test of Functional Health Literacy in Adults. J Adolesc Health 2007;41:312–4. [doi:10.1016/j.jadohealth.2007.04.015](http://dx.doi.org/10.1016/j.jadohealth.2007.04.015)
49. Levin-Zamir D, Lemish D, Gofin R. Media Health Literacy (MHL): development and measurement of the concept among adolescents. Health Educ Res 2011;26:323–35. [doi:10.1093/her/cyr007](http://dx.doi.org/10.1093/her/cyr007)
50. Hoffman S, Trout A, Nelson T, et al. A psychometric assessment of health literacy measures among youth in a residential treatment setting. J Stud Soc Sci 2013;5:288–300.
51. Chang LC, Hsieh PL, Liu CH. Psychometric evaluation of the Chinese version of short-form Test of Functional Health Literacy in Adolescents.
J Clin Nurs 2012;21:2429–37. [doi:10.1111/j.1365-2702.2012.04147.x](http://dx.doi.org/10.1111/j.1365-2702.2012.04147.x)
52. Driessnack M, Chung S, Perkhounkova E, et al. Using the "Newest Vital Sign" to assess health literacy in children. J Pediatr Health Care 2014;28:165–71. [doi:10.1016/j.pedhc.2013.05.005](http://dx.doi.org/10.1016/j.pedhc.2013.05.005)
53. Mulvaney SA, Lilley JS, Cavanaugh KL, et al. Validation of the Diabetes Numeracy Test with adolescents with type 1 diabetes. J Health Commun 2013;18:795–804. [doi:10.1080/10810730.2012.757394](http://dx.doi.org/10.1080/10810730.2012.757394)
54. Liu CH, Liao LL, Shih SF, et al. Development and implementation of Taiwan’s child health literacy test. Taiwan J Public Health 2014;33:251–70.
55. Ueno M, Takayama A, Adiatman M, et al. Application of visual oral health literacy instrument in health education for senior high school students. Int J Health Promot Educ 2014;52:38–46. [doi:10.1080/14635240.2013.845412](http://dx.doi.org/10.1080/14635240.2013.845412)
56. Manganello JA, DeVellis RF, Davis TC, et al. Development of the Health Literacy Assessment Scale for Adolescents (HAS-A). J Commun Healthc 2015;8:172–84. [doi:10.1179/1753807615Y.0000000016](http://dx.doi.org/10.1179/1753807615Y.0000000016)
57. Guttersrud Ø, Naigaga MD, Pettersen KS. Measuring maternal health literacy in adolescents attending antenatal care in Uganda: exploring the dimensionality of the health literacy concept studying a composite scale. J Nurs Meas 2015;23:E50–E66. [doi:10.1891/1061-3749.23.2.E50](http://dx.doi.org/10.1891/1061-3749.23.2.E50)
58. Smith SR, Samar VJ. Dimensions of deaf/hard-of-hearing and hearing adolescents' health literacy and health knowledge. J Health Commun 2016;21:141–54. [doi:10.1080/10810730.2016.1179368](http://dx.doi.org/10.1080/10810730.2016.1179368)
59. Ghanbari S, Ramezankhani A, Montazeri A, et al. Health Literacy Measure for Adolescents (HELMA): development and psychometric properties. PLoS One 2016;11:e0149202. [doi:10.1371/journal.pone.0149202](http://dx.doi.org/10.1371/journal.pone.0149202)
60. Paakkari O, Torppa M, Kannas L, et al. Subjective health literacy: development of a brief instrument for school-aged children.
Scand J Public Health 2016;44:751–7. [doi:10.1177/1403494816669639](http://dx.doi.org/10.1177/1403494816669639)
61. Tsubakita T, Kawazoe N, Kasano E. A new functional health literacy scale for Japanese young adults based on item response theory. Asia Pac J Public Health 2017;29:149–58. [doi:10.1177/1010539517690226](http://dx.doi.org/10.1177/1010539517690226)
62. Intarakamhang U, Intarakamhang P. Health literacy scale and causal model of childhood overweight. J Res Health Sci 2017;17:e00368.
63. Bradley-Klug K, Shaffer-Hudkins E, Lynn C, et al. Initial development of the Health Literacy and Resiliency Scale: Youth Version. J Commun Healthc 2017;10:100–7. [doi:10.1080/17538068.2017.1308689](http://dx.doi.org/10.1080/17538068.2017.1308689)
64. de Jesus Loureiro LM. Questionnaire for assessment of mental health literacy (QuALiSMental): study of psychometric properties. Revista de Enfermagem Referência 2015;4:79–88. [doi:10.12707/riv14031](http://dx.doi.org/10.12707/riv14031)
65. McDonald FE, Patterson P, Costa DS, et al. Validation of a health literacy measure for adolescents and young adults diagnosed with cancer. J Adolesc Young Adult Oncol 2016;5:69–75. [doi:10.1089/jayao.2014.0043](http://dx.doi.org/10.1089/jayao.2014.0043)
66. Manganello JA, Colvin KF, Chisolm DJ, et al. Validation of the Rapid Estimate for Adolescent Literacy in Medicine Short Form (REALM-TeenS). Pediatrics 2017;139:e20163286. [doi:10.1542/peds.2016-3286](http://dx.doi.org/10.1542/peds.2016-3286)
67. Quemelo PR, Milani D, Bento VF, et al. [Health literacy: translation and validation of a research instrument on health promotion in Brazil]. Cad Saude Publica 2017;33:e00179715. [doi:10.1590/0102-311X00179715](http://dx.doi.org/10.1590/0102-311X00179715)
68. Kilgour L, Matthews N, Christian P, et al. Health literacy in schools: prioritising health and well-being issues through the curriculum. Sport Educ Soc 2015;20:485–500. [doi:10.1080/13573322.2013.769948](http://dx.doi.org/10.1080/13573322.2013.769948)
69. Velardo S, Drummond M. Emphasizing the child in child health literacy research. J Child Health Care 2017;21:5–13. [doi:10.1177/1367493516643423](http://dx.doi.org/10.1177/1367493516643423)
70. Mokkink LB, Terwee CB, Patrick DL, et al. The COSMIN checklist manual. Amsterdam: VU University Medical Centre, 2012.
71. Steckler A, McLeroy KR. The importance of external validity. Am J Public Health 2008;98:9–10. [doi:10.2105/AJPH.2007.126847](http://dx.doi.org/10.2105/AJPH.2007.126847)
72. Akpa OM, Bamgboye EA, Baiyewu O. The Adolescents' Psychosocial Functioning Inventory (APFI): scale development and initial validation using exploratory and confirmatory factor analysis. Afr J Psychol Study Soc Issues 2015;18:1–21.
73. Bowling A. Mode of questionnaire administration can have serious effects on data quality.
J Public Health 2005;27:281–91. [doi:10.1093/pubmed/fdi031](http://dx.doi.org/10.1093/pubmed/fdi031)
74. Lozano F, Lobos JM, March JR, et al. Self-administered versus interview-based questionnaires among patients with intermittent claudication: do they give different results? A cross-sectional study. Sao Paulo Med J 2016;134:63–9. [doi:10.1590/1516-3180.2015.01733009](http://dx.doi.org/10.1590/1516-3180.2015.01733009)
75. Dujaili JA, Sulaiman SAS, Awaisu A, et al. Comparability of interviewer-administration versus self-administration of the Functional Assessment of Chronic Illness Therapy-Tuberculosis (FACIT-TB) health-related quality of life questionnaire in pulmonary tuberculosis patients. Pulm Ther 2016;2:127–37. [doi:10.1007/s41030-016-0016-0](http://dx.doi.org/10.1007/s41030-016-0016-0)
76. Vaz S, Parsons R, Passmore AE, et al. Internal consistency, test-retest reliability and measurement error of the self-report version of the Social Skills Rating System in a sample of Australian adolescents. PLoS One 2013;8:e73924. [doi:10.1371/journal.pone.0073924](http://dx.doi.org/10.1371/journal.pone.0073924)
77. Altin SV, Finke I, Kautz-Freimuth S, et al. The evolution of health literacy assessment tools: a systematic review. BMC Public Health 2014;14:1207. [doi:10.1186/1471-2458-14-1207](http://dx.doi.org/10.1186/1471-2458-14-1207)
78. Kiechle ES, Bailey SC, Hedlund LA, et al. Different measures, different outcomes? A systematic review of performance-based versus self-reported measures of health literacy and numeracy. J Gen Intern Med 2015;30:1538–46. [doi:10.1007/s11606-015-3288-4](http://dx.doi.org/10.1007/s11606-015-3288-4)
79. Paek HJ, Reber BH, Lariscy RW. Roles of interpersonal and media socialization agents in adolescent self-reported health literacy: a health socialization perspective. Health Educ Res 2011;26:131–49. [doi:10.1093/her/cyq082](http://dx.doi.org/10.1093/her/cyq082)
80. Hubbard B, Rainey J. Health literacy instruction and evaluation among secondary school students. Am J Health Educ 2007;38:332–7. [doi:10.1080/19325037.2007.10598991](http://dx.doi.org/10.1080/19325037.2007.10598991)