IV. SURVEY RESEARCH STRATEGIES - PowerPoint PPT Presentation

1
IV. SURVEY RESEARCH STRATEGIES
2
A. Introduction
  • 1. most social science knowledge is based on the process of asking questions and processing answers
  • 2. the trick is asking the right questions and getting the right answers
  • 3. survey research is the basic investigative tool used to conduct social science investigation
  • 4. consideration must be given to the type of data that are best for the research conducted
  • > quantitative or qualitative

3
B. Types of Research Models
  • 1. exploratory studies
  • a. useful in gaining preliminary information and insights
  • b. also called pilot studies
  • c. initial general description(s) of a particular phenomenon or research question
  • > most research contains at least some description of an exploratory nature
  • d. fact-finding explorations
  • e. usually univariate in nature
  • f. can be, or lead to, one-shot or longitudinal applications

4
  • 2. explanatory research
  • a. generally takes the form of correlation research
  • b. an examination of the relationship between variables leading to uni- / bi- / multivariate analysis
  • c. process of identifying specific constructs / variables that will be measured and compared to another construct or variable
  • d. this relationship is then examined as a strong or weak relationship
  • > i.e., what is the probability that A → B?

5
  • 3. one-shot / fixed-point / snap-shot studies
  • a. data are collected at one point in time only, at a particular place, and produce a particular array of data
  • > creates methodological problems with sampling, i.e., questions the validity of the information collected
  • b. problems arise in trying to generalize these results across a wider population or in other situations
  • c. generally this research is either descriptive / exploratory or explanatory

6
  • 4. case studies
  • a. an extensive theoretical, behavioral, operational, comprehensive examination of an individual, organization, incident, legal case, etc.
  • b. an example of a particular situation to demonstrate a positive or negative procedure
  • c. the province of law schools, business schools, and most police training academies
  • d. Gideon v. Wainwright (372 U.S. 335, 1963)
  • e. the parental context: Why can't you be more like ...?

7
  • 5. true experiments
  • a. the classic experiment
  • b. a test / treatment is administered / given to a coterie of research subjects in a controlled environment designed to examine specific conditions or test conditions
  • c. all operational definitions are established prior to the onset of research and their results are assessed
  • d. design utilizes two unique reporting groups (samples): experimental and control
  • > Kansas City Preventive Patrol Study (Kelling, Pate, Dieckman, and Brown, 1974)

8
  • 6. longitudinal studies
  • a. investigations examining change occurring over time
  • b. basic model
    T1 (base-line measure taken) → X (treatment administered) → T2 (measure of change, if any)
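A minimal sketch of this T1 → X → T2 design in Python, using invented subject labels and scores purely for illustration (not data from any study):

    # Hypothetical illustration of the T1 -> X -> T2 longitudinal model.
    # T1: base-line measure taken; X: treatment administered; T2: follow-up measure.
    baseline = {"A": 12, "B": 15, "C": 9}    # T1 scores (invented)
    followup = {"A": 14, "B": 15, "C": 13}   # T2 scores for the same subjects

    # Measure of change, if any, between T1 and T2 for each subject.
    change = {s: followup[s] - baseline[s] for s in baseline}
    print(change)                              # per-subject change scores
    print(sum(change.values()) / len(change))  # average change across the panel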

9
  • c. types
  • 1) trend studies
  • a) research seeking to determine attitudinal, behavioral, cognitive, or opinion change, if any, over time
  • b) repeated (same/similar) measures taken independently from similar units of analysis, over time
  • c) e.g., What is the effect of gaining a college education? What changes occur based on acquiring more knowledge / experience?

10
  • 2) panel / cohort studies
  • a) examination of change, if any, in a specific sample over time
  • b) repeated measures taken over the same sample at different points in time
  • c) e.g. 1: Wolfgang's (1972) Delinquency in a Birth Cohort study
  • d) e.g. 2: experiencing your 10-year high school reunion
  • e) caution: the Hawthorne Effect

11
  • 7. to be discussed more thoroughly later (perhaps)
  • a. field research / ethnography
  • 1) generally a more qualitative than quantitative first-hand phenomenological account of the researcher's experiences
  • 2) e.g., testimonial narratives, oral histories, or memoirs
  • b. program evaluation research
  • 1) determinations of the success or failure of any policy initiative
  • > does community policing work?

12
  • 8. secondary analysis
  • a. overview
  • 1) analyzing available information collected by someone else
  • > a different unit of analysis
  • 2) can take the form of document or content analysis
  • > i.e., examining various media for underlying themes or significance
  • 3) e.g. 1: the Unabomber Manifesto
  • 4) e.g. 2: the John Gotti tapes

13
  • b. advantages
  • 1) cheaper and faster to conduct than original / primary analysis
  • 2) can examine existing information and tailor-make your analysis / examination
  • c. disadvantages
  • 1) validity concerns
  • > how can information collected by someone else, possibly for other purposes, be relevant for your study?
  • 2) question of accuracy and precision

14
  • d. a sampling of sources of secondary data available for criminal justice research
  • 1) National Opinion Research Center
  • a) national sample
  • b) questions social concerns of education, public policy, environment, crime, and health
  • 2) Juvenile Court Daily Statistics
  • a) compiled by the Office of Juvenile Justice and Delinquency Prevention
  • b) national aggregate compilation of juvenile and family court cases

15
  • 3) National Jail Census
  • a) compiled by the National Institute of Justice
  • b) aggregate head count, charges, and socio-demographic information of the 100 largest lock-ups in the country
  • 4) Sourcebook of Criminal Justice Statistics
  • a) compiled by the Bureau of Justice Statistics
  • b) a comprehensive compendium of a range of official criminal justice data generated by BJS and agency-funded research

16
  • 5) National Crime Victimization Survey
  • a) generated by the U.S. Census Bureau
  • b) the crime rate (a worked example follows this slide)
    CR = (Number of reported offenses / Total U.S. Population) × 100,000
  • c) benefits
  • 1. amelioration of the Dark Figure
  • 2. because of research techniques, the survey goes to more representative households
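A quick arithmetic check of the CR formula in item b) above, using invented counts rather than actual survey figures: a jurisdiction reporting 4,500 offenses among 1,200,000 residents has a rate of 375 per 100,000.

    # Hypothetical numbers only, to illustrate the CR formula above.
    reported_offenses = 4_500
    population = 1_200_000

    crime_rate = reported_offenses / population * 100_000
    print(crime_rate)   # 375.0 reported offenses per 100,000 residents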

17

  • d) limitations
  • 1. telescoping
  • 2. suspect validity
  • 3. trivial v. serious crime reports
  • 6) Uniform Crime Reports
  • a) compiled by the Federal Bureau of Investigation
  • b) reported major felonies
  • > Murder / Non-negligent Manslaughter, Forcible Rape, Aggravated Assault, Burglary, Larceny-Theft (≥ $250), Motor Vehicle Theft, and Arson

18
  • c) benefits
  • 1. makes sense of a large array of local data
  • 2. standardized reporting of official data
  • 3. good indicator of big-city police bookkeeping procedures
  • d) limitations
  • 1. initial bias / discretion
  • 2. rural v. urban bias
  • 3. under-reporting of Part II and white-collar crime
  • 4. reporting hierarchy problem

19
  • 7) National Incident-Based Reporting System
  • a) begun in 1991 by the Department of Justice as an upgrade of the UCR
  • b) reports individual crimes, not an aggregated summary of crime types
  • c) no offense hierarchy
  • d) taxes the resources of small departments and the clearinghouse locale

20
  • 9. self-report questionnaires
  • a. overview
  • 1) traditional paper-pencil ask question / get answer
  • 2) most popular form of social science data-gathering strategy
  • b. advantages
  • 1) allows great freedom in seeking and acquiring information to support theoretical constructs
  • 2) strong validity
  • 3) good for large samples with lots of variables

21
  • 5) ameliorates the Dark Figure
  • c. limitations
  • 1) no standardization
  • 2) much research comes from juvenile populations
  • 3) fear of social desirability / Hawthorne Effect
  • 4) sampling bias
  • d. types
  • 1) survey face-to-face interviews
  • a) predetermined questions are constructed and asked in a specific order

22

  • c) advantages
  • 1. high response rate
  • 2. guards against confusing and ambiguous questions
  • 3. interviewer can interpret the respondent's physical cues
  • d) limitations
  • 1. hermeneutics
  • > how good a listener are you?
  • 2. poor reliability

23
  • 2) interactive interviews
  • a) focused conversation
  • b) must remember what areas (questions) are important to the research endeavor
  • c) listen and respond
  • > focus must be on the answer the respondent is providing
  • 3) mailed surveys
  • a) best strategy for a broad, well-funded project
  • > large number of variables (questions), large number of respondents

24
  • b) funding is critical due to reproduction and postal charges and considerations
  • c) strong validity / weak reliability
  • d) process
  • 1. cover letter, questionnaire, instructions, self-addressed stamped envelope
  • 2. addresses
  • 3. consider follow-ups
  • e) concern over response rate
  • > likely best rate to expect is between 40-60%

25
  • 4) the self-report
  • a) by far the most popular data-gathering strategy
  • b) questionnaires do NOT have to be a collection of questions
  • c) use various scaling techniques
  • d) open- / closed-ended questions
  • 1. include both
  • 2. OE items allow the respondent a chance to interact with the questionnaire
  • > CE items are forced choices

26
  • 3. OE questions may generate unforeseen, unanticipated, interesting, or irrelevant responses
  • > CE items may not provide enough choices
  • 4. a caution on CE items: remember to ensure that the attributes (responses) are mutually exclusive and exhaustive
  • > allow the respondent an opportunity to answer the question honestly and appropriately

27
  • e) which type of answer will generate the most / best information?
  • f) lack of response may be as informative as an abundance of garbled responses
  • g) issues
  • 1. clarity
  • 2. avoid double-barreled questions
  • 3. beware of respondent competency
  • 4. be parsimonious
  • 5. avoid biased questions

28

  • h) questionnaire construction
  • 1. cosmetically attractive
  • 2. order of items
  • 3. clear instructions
  • i) remember this is your test instrument
  • 1. what is it you are looking for?
  • 2. what are you trying to prove?
  • 3. you are trying to ameliorate the Dark Figure
  • 4. SRs are good at maintaining confidentiality

29
  • 5. SRs are good on validity, less so on reliability
  • > secondary sources are vice versa
  • 6. remember, this is your attempt at gaining The Truth

30
C. Experimental / Internal Validity
  • 1. overview
  • a. addresses the issue of relevance
  • > is your study valid?
  • b. def.: the relevance between the dependent variable and the operational definition created to explain it
  • > obtaining the results you actually intended / hoped to obtain
  • 2. problems that may affect respondents' responses
  • a. history / current events
  • 1) some life issue or unanticipated event may directly affect performance or response
  • 2) e.g., physical illness, epiphany, effects of substance abuse

31
  • 2) increased awareness / sophistication in completing the examination / boredom
  • c. statistical regression (see the simulation sketch at the end of this slide)
  • 1) a mathematical convention
  • 2) in a normal distribution, all responses will eventually form a bell-shaped curve
  • 3) responses will regress toward the mean
  • 4) scores at the extremes of the distribution are rare
  • d. sample selection bias
  • > do those elements that / who participate actually
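One way to see the statistical regression threat noted in item c. above is with simulated (not real) scores: subjects chosen because they scored at an extreme on a first noisy measurement will, on average, score closer to the overall mean on a second measurement, even with no treatment at all. The sketch below assumes a stable underlying trait plus random measurement error:

    # Simulated data only: regression toward the mean with no treatment effect.
    import random
    random.seed(0)

    true_scores = [random.gauss(100, 10) for _ in range(10_000)]   # stable trait
    test1 = [t + random.gauss(0, 10) for t in true_scores]         # first noisy measure
    test2 = [t + random.gauss(0, 10) for t in true_scores]         # second noisy measure

    # The 500 subjects with the highest first-test scores (the extreme group).
    extreme = sorted(range(len(test1)), key=lambda i: test1[i])[-500:]

    mean_t1 = sum(test1[i] for i in extreme) / len(extreme)
    mean_t2 = sum(test2[i] for i in extreme) / len(extreme)
    print(mean_t1, mean_t2)   # the second mean falls back toward the overall mean of 100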

32
  • e. subject mortality
  • 1) respondent drop-outs cause statistical quandaries
  • > most social science statistical models assume 100% completion rates from samples
  • 2) a form of sampling bias

33
D. External (In)Validity
  • 1. addresses the real concern of whether or not the results gained in any one experiment can be used in any other research setting
  • 2. can the results of your experiment be generalized beyond the parameters of your research setting?
  • > will the results be useful beyond the parameters of your research setting?
  • 3. do results come from a Hawthorne Effect?
  • 4. some programs and policy initiatives are effective specifically because of the charismatic personality in charge

34
E. Summary
  • 1. research design is based SOLELY on the type of information desired
  • > remember what your research question is
  • 2. does your research actually reflect that which you had intended?
  • 3. there is no guarantee that the perfect model or project can be designed
  • > problems will inevitably occur