000 10053cam a22004694i 4500
005 20221101190113.0
008 040517s2004    njua     b    001 0 eng d
010 _a 2003063992
020 _a0471458414
_qpbk.
020 _a9780471458418
_qpbk.
035 _a(ATU)b10880124
035 _a(OCoLC)53325055
040 _aDLC
_beng
_erda
_cDLC
_dPMC
_dMUQ
_dSBH
_dBAKER
_dNLGGC
_dP#O
_dYDXCP
_dOCLCQ
_dBTCTA
_dUQ1
_dOCLCG
_dUKM
_dTEX
_dDEBBG
_dIG#
_dATU
042 _apcc
050 0 0 _aHM538
_b.M48 2004
082 0 0 _a300.723
_222
245 0 0 _aMethods for testing and evaluating survey questionnaires /
_cedited by Stanley Presser [and others].
264 1 _aHoboken, NJ :
_bJohn Wiley & Sons,
_c[2004]
264 4 _c©2004
300 _axvi, 606 pages :
_billustrations ;
_c24 cm.
336 _atext
_btxt
_2rdacontent
337 _aunmediated
_bn
_2rdamedia
338 _avolume
_bnc
_2rdacarrier
490 1 _aWiley series in survey methodology
504 _aIncludes bibliographical references (pages 547-602) and index.
505 0 _aPreface -- Chapter 1. Methods for testing and evaluating survey questions / Stanley Presser, Mick P. Couper, Judith T. Lessler, Elizabeth Martin, Jean Martin, Jennifer M. Rothgeb, and Eleanor Singer -- Part I. Cognitive interviews. Chapter 2. Cognitive interviewing revisited : a useful technique, in theory? / Gordon B. Willis -- Chapter 3. The dynamics of cognitive interviewing / Paul Beatty -- Chapter 4. Data quality in cognitive interviews : the case of verbal reports / Frederick G. Conrad and Johnny Blair -- Chapter 5. Do different cognitive interview techniques produce different results? / Theresa J. DeMaio and Ashley Landreth -- Part II. Supplements to conventional pretests. Chapter 6. Evaluating survey questions by analyzing patterns of behavior codes and question-answer sequences : a diagnostic approach / Johannes van der Zouwen and Johannes H. Smit -- Chapter 7. Response latency and (para)linguistic expressions as indicators of response error / Stasja Draisma and Wil Dijkstra -- Chapter 8. Vignettes and respondent debriefing for questionnaire design and evaluation / Elizabeth Martin -- Part III. Experiments. Chapter 9. The case for more split-sample experiments in developing survey instruments / Floyd Jackson Fowler, Jr. -- Chapter 10. Using field experiments to improve instrument design : the SIPP methods panel project / Jeffrey Moore, Joanne Pascale, Pat Doyle, Anna Chan, and Julia Klein Griffiths -- Chapter 11. Experimental design considerations for testing and evaluating questionnaires / Roger Tourangeau -- Part IV. Statistical modeling. Chapter 12. Modeling measurement error to identify flawed questions / Paul Biemer -- Chapter 13. Item response theory modeling for questionnaire evaluation / Bryce B. Reeve and Louise C. Mâsse -- Chapter 14. Development and improvement of questionnaires using predictions of reliability and validity / Willem E. Saris, William van der Veld, and Irmtraud Gallhofer -- Part V. Mode of administration. Chapter 15. Testing paper self-administered questionnaires : cognitive interview and field test comparisons / Don A. Dillman and Cleo D. Redline -- Chapter 16. Methods for testing and evaluating computer-assisted questionnaires / John Tarnai and Danna L. Moore -- Chapter 17. Usability testing to evaluate computer-assisted instruments / Sue Ellen Hansen and Mick P. Couper -- Chapter 18. Development and testing of web questionnaires / Reginald P. Baker, Scott Crawford, and Janice Swinehart -- Part VI. Special populations. Chapter 19. Evolution and adaptation of questionnaire development, evaluation, and testing methods for establishment surveys / Diane K. Willimack, Lars Lyberg, Jean Martin, Lilli Japec, and Patricia Whitridge -- Chapter 20. Pretesting questionnaires for children and adolescents / Edith de Leeuw, Natacha Borgers, and Astrid Smits -- Chapter 21. Developing and evaluating cross-national survey instruments / Tom W. Smith -- Chapter 22. Survey questionnaire translation and assessment / Janet Harkness, Beth-Ellen Pennell, and Alisú Schoua-Glusberg -- Part VII. Multimethod applications. Chapter 23. A multiple-method approach to improving the clarity of closely related concepts : distinguishing legal and physical custody of children / Nora Cate Schaeffer and Jennifer Dykema -- Chapter 24. Multiple methods for developing and evaluating a stated-choice questionnaire to value wetlands / Michael D. Kaplowitz, Frank Lupi, and John P. Hoehn -- Chapter 25. Does pretesting make a difference? An experimental test / Barbara Forsyth, Jennifer M. Rothgeb, and Gordon B. Willis -- References -- Index.
505 0 0 _g1.
_tMethods for testing and evaluating survey questions /
_rStanley Presser, Mick P. Couper, Judith T. Lessler, Elizabeth Martin, Jean Martin, Jennifer M. Rothgeb and Eleanor Singer --
_gPt. I.
_tCognitive interviews --
_g2.
_tCognitive interviewing revisited : a useful technique, in theory? /
_rGordon B. Willis --
_g3.
_tThe dynamics of cognitive interviewing /
_rPaul Beatty --
_g4.
_tData quality in cognitive interviews : the case of verbal reports /
_rFrederick G. Conrad and Johnny Blair --
_g5.
_tDo different cognitive interview techniques produce different results? /
_rTheresa J. DeMaio and Ashley Landreth --
_gPt. II.
_tSupplements to conventional pretests --
_g6.
_tEvaluating survey questions by analyzing patterns of behavior codes and question-answer sequences : a diagnostic approach /
_rJohannes van der Zouwen and Johannes H. Smit --
_g7.
_tResponse latency and (para)linguistic expressions as indicators of response error /
_rStasja Draisma and Wil Dijkstra --
_g8.
_tVignettes and respondent debriefing for questionnaire design and evaluation /
_rElizabeth Martin --
_gPt. III.
_tExperiments --
_g9.
_tThe case for more split-sample experiments in developing survey instruments /
_rFloyd Jackson Fowler, Jr. --
_g10.
_tUsing field experiments to improve instrument design : the SIPP methods panel project /
_rJeffrey Moore, Joanne Pascale, Pat Doyle, Anna Chan and Julia Klein Griffiths --
_g11.
_tExperimental design considerations for testing and evaluating questionnaires /
_rRoger Tourangeau --
_gPt. IV.
_tStatistical modeling --
_g12.
_tModeling measurement error to identify flawed questions /
_rPaul Biemer --
_g13.
_tItem response theory modeling for questionnaire evaluation /
_rBryce B. Reeve and Louise C. Mâsse --
_g14.
_tDevelopment and improvement of questionnaires using predictions of reliability and validity /
_rWillem E. Saris, William van der Veld and Irmtraud Gallhofer --
_gPt. V.
_tMode of administration --
_g15.
_tTesting paper self-administered questionnaires : cognitive interview and field test comparisons /
_rDon A. Dillman and Cleo D. Redline --
_g16.
_tMethods for testing and evaluating computer-assisted questionnaires /
_rJohn Tarnai and Danna L. Moore --
_g17.
_tUsability testing to evaluate computer-assisted instruments /
_rSue Ellen Hansen and Mick P. Couper --
_g18.
_tDevelopment and testing of web questionnaires /
_rReginald P. Baker, Scott Crawford and Janice Swinehart --
_gPt. VI.
_tSpecial populations --
_g19.
_tEvolution and adaptation of questionnaire development, evaluation, and testing methods for establishment surveys /
_rDiane K. Willimack, Lars Lyberg, Jean Martin, Lilli Japec and Patricia Whitridge --
_g20.
_tPretesting questionnaires for children and adolescents /
_rEdith de Leeuw, Natacha Borgers and Astrid Smits --
_g21.
_tDeveloping and evaluating cross-national survey instruments /
_rTom W. Smith --
_g22.
_tSurvey questionnaire translation and assessment /
_rJanet Harkness, Beth-Ellen Pennell and Alisú Schoua-Glusberg --
_gPt. VII.
_tMultimethod applications --
_g23.
_tA multiple-method approach to improving the clarity of closely related concepts : distinguishing legal and physical custody of children /
_rNora Cate Schaeffer and Jennifer Dykema --
_g24.
_tMultiple methods for developing and evaluating a stated-choice questionnaire to value wetlands /
_rMichael D. Kaplowitz, Frank Lupi and John P. Hoehn --
_g25.
_tDoes pretesting make a difference? : an experimental test /
_rBarbara Forsyth, Jennifer M. Rothgeb and Gordon B. Willis.
520 _aWritten and edited by leading experts, this volume offers an overview of, and a solid foundation in, current issues, concerns, and responses in survey questionnaire testing and evaluation. The work was prepared in conjunction with an international conference on the topic, held in November 2002, by the Survey Research Methods Section of the American Statistical Association, the American Association for Public Opinion Research, the International Association of Survey Statisticians, the Council of American Survey Research Organizations, and the Council of Marketing and Opinion Research. The book covers cognitive interviewing, interaction analysis, response latency, respondent debriefings, vignette analysis, split-sample comparisons, statistical modeling, mode of administration, and special populations, and considers these topics in light of emerging techniques and technologies. Its contributors include more than two dozen eminent professionals in a variety of fields related to survey methodology and questionnaire development. Many tables, figures, and references, as well as an extensive glossary, supplement the high-quality discussion throughout the text.
588 _aMachine converted from AACR2 source record.
650 0 _aQuestionnaires
_xMethodology.
_9738894
650 0 _aSocial sciences
_xResearch
_xMethodology.
_9370737
650 0 _aSocial sciences
_xMethodology.
_9370624
650 2 _aData Collection
_xmethods
_9357334
700 1 _aPresser, Stanley,
_d1950-
_eeditor.
_9416300
830 0 _aWiley series in survey methodology.
_91040201
856 4 2 _3Contributor biographical information
_uhttp://catdir.loc.gov/catdir/bios/wiley047/2003063992.html
907 _a.b10880124
_b10-06-19
_c27-10-15
942 _cB
945 _a300.723 MET
_g1
_iA260138B
_j0
_lnmain
_o-
_p$71.36
_q-
_r-
_s-
_t0
_u25
_v1
_w0
_x2
_y.i1205754x
_z29-10-15
998 _a(2)b
_a(2)n
_b23-03-18
_cm
_da
_feng
_gnju
_h0
999 _c1150030
_d1150030