Survey Research (Transcript and Presenter's Notes)
1
Survey Research
  • Lesley Cottrell, Ph.D.
  • Department of Pediatrics, WVU

2
Objectives
  • Identify and understand the individual components
    of survey research
  • Distinguish between types of surveys
  • Identify design issues (+/-)
  • Understand the difference between reliable and
    valid survey methods
  • Apply survey research principles to real-world
    ideas

3
Survey Types
  • Written
  • Mail
  • Group administered
  • Drop-off
  • Oral
  • Electronic

4
Written Surveys
  • Mail
    • Large samples at limited cost
    • Response rate issues
  • Group Administered
    • Higher response rate at one point
    • Must gather participants
  • Drop-off Survey
    • Allows for longer response time
    • Provides introduction for study purpose

5
Oral Surveys
  • Interaction with participants
  • Collection of detailed responses
  • Can be done through various media (e.g., phone,
    face-to-face)
  • Example: group interviews

6
Oral Surveys: Strengths and Weaknesses
  • Strengths
    • Flexibility to react to the respondent's
      situation
    • Probe for more detail
    • Seek reflective replies
    • Ask questions which are complex or personally
      intrusive
  • Glastonbury & MacKean (p. 228)

7
Oral Surveys: Strengths and Weaknesses (cont.)
  • Weaknesses
    • Cost
    • Bias
    • Limitations in question range
    • Attitude

8
Electronic Surveys
  • Email
  • Web-based
  • Kiosks
  • Laptops

9
Electronic Surveys
  • Strengths
    • Fewer costs
    • Rapidly changing format
    • Real-time processing
    • Higher response rate
    • Diverse samples
    • More candid responses

10
Electronic Surveys
  • Weaknesses
    • Higher response rates early in process only
    • Technical issues
    • Comfort and accessibility with format
    • Confidentiality issues
    • Layout issues

11
Survey Design Issues
  • Theory
  • Survey purpose
  • Respondent characteristics
  • Survey Quality
  • Cost

12
Theory
  • A theory is an explanation of events in terms of
    the structures and processes that are presumed to
    underlie them.
  • A theory consists of constructs and specifies
    how the constructs are related.

13
What can a theory do?
  • Describe: e.g., students' academic success across
    demographics
  • Predict: e.g., SAT and future academic success
  • Improve: e.g., effectiveness of an intervention
  • Explain: subsumes all three

14
Survey Purpose
  • What are you trying to measure?
  • What questions are you attempting to answer?

15
Respondent Characteristics
  • Attitude
  • How easy is it to ignore?
  • What will catch your eye?
  • What kind of survey would be appealing?

16
Survey Quality
  • Find a balance between
    • Survey approach
    • Question type
      • Closed-ended
      • Open-ended
      • Multiple answer
      • Likert scales

17
Cost
  • Issues to consider include
    • Geographic logistics
    • Population characteristics
    • Types of questions

18
Creating a Survey
  • Developing a recipe
  • Identify construct
  • Operationalize the construct: how would you
    measure it?
  • Consider representative items

19
Constructs
  • A construct is a type of concept used to
    describe a structure or process that is
    hypothesized to underlie a particular observable
    phenomenon.
  • E.g., motivation, intelligence, decision making,
    leadership, student professionalism

20
Example: Emotional Development
  • Construct: Emotional Intelligence
    • Personal competence
      • Self-awareness: emotional awareness, accurate
        self-assessment, self-confidence
      • Self-regulation: self-control, trustworthiness,
        adaptability, innovation
    • Social competence
      • Empathy: understanding others, developing
        others, leveraging diversity
      • Social skills: influence, communication,
        conflict management, leadership

21
Operational Definitions
  • Constructs are usually defined in operational
    terms because they are often latent or difficult
    to observe.
  • When a construct is thought of as a
    characteristic that can vary in quantity or
    quality, it is called a variable.
  • E.g., WAIS-R and intelligence

22
Scales and Items
  • Classical measurement theory: individual items
    are comparable indicators of an underlying
    construct
  • Typically, researchers are interested in
    constructs rather than items or scales. Items
    are a means to the end of construct assessment.
  • Measures are proxies for variables that cannot be
    directly observed.
  • Robert DeVellis

23
Scale Development
  • Determine clearly what you want to measure
  • Generate an item pool
  • Determine measurement format
  • Experts review item pool
  • Administer to sample
  • Conduct reliability and validity analyses (see
    the reliability sketch below)
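
A minimal sketch (not from the slides) of the reliability step: Cronbach's
alpha treats the items as comparable indicators of a single underlying
construct. The NumPy dependency, function name, and pilot data below are
illustrative assumptions.

import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 5 respondents answering 4 Likert items coded 1-5.
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 1, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")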

24
Generate Item Pool
  • Items should be aligned with the goal of
    measurement
  • Consider items as overt manifestations of a
    common latent variable
  • Identify a variety of ways in which construct can
    be stated

25
Determine Measurement Format
  • Opinions may be measured on an ordinal scale, e.g.
    • Strongly agree
    • Agree
    • Somewhat agree
    • Strongly disagree
  • More objective behaviors may be aligned with
    • Yes or No items, or
    • Behavioral checklists (see the coding sketch
      below)
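
A minimal sketch, assuming hypothetical item names and the response labels
listed above: ordinal opinion responses are coded as ranked integers, while a
Yes/No behavioral item is coded 0/1 before analysis.

LIKERT_CODES = {
    "Strongly agree": 4,
    "Agree": 3,
    "Somewhat agree": 2,
    "Strongly disagree": 1,
}
YES_NO_CODES = {"Yes": 1, "No": 0}

raw_responses = [
    {"opinion": "Agree", "attended_event": "Yes"},
    {"opinion": "Strongly disagree", "attended_event": "No"},
]

coded = [
    {"opinion": LIKERT_CODES[r["opinion"]],
     "attended_event": YES_NO_CODES[r["attended_event"]]}
    for r in raw_responses
]
print(coded)  # [{'opinion': 3, 'attended_event': 1}, {'opinion': 1, 'attended_event': 0}]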

26
Item Exercise
  • 1. If you found a wallet, would you
    • Turn it in to police
    • Keep it
    • Undecided
  • 2. Who do you feel is most responsible when a
    student fails?
    • The student
    • The parents
    • Unsympathetic faculty members
  • 3. How often do you attend campus sporting
    events?
    • Never / Rarely / Occasionally / Regularly

27
Experts Review Item Pool
  • Administer scale to experts
  • Conduct a focus group
  • Identify themes and consider modifications

28
Administer to Sample
  • Piloting
    • Participating
      • Inform respondents of pilot test
      • Respondents are asked about question form,
        wording, item order
    • Undeclared
      • Respondents do not know of pilot test
      • Assess time constraints

29
Pilot Assessments
  • Assess
    • Question variation
    • Meaning
    • Task difficulty
    • Respondent interest and attention
    • Flow, order, skip patterns, timing, overall
      respondent response

30
Administer to Sample
  • Sampling
  • Administration
  • Analysis
    • Primary
    • Secondary
  • Presenting Findings

31
Sampling
  • Unless the population is very small, there is no
    way to survey all of its members.
  • A representative sample can be used to make
    generalizations about the larger population's
    opinions, beliefs, or behaviors (a simple
    random-sampling sketch follows).
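
A minimal sketch of one common approach, simple random sampling, using a
hypothetical sampling frame of 2,000 students; the frame, seed, and sample
size are illustrative assumptions.

import random

population = [f"student_{i:04d}" for i in range(1, 2001)]  # hypothetical frame

random.seed(42)                            # fixed seed so the draw is reproducible
sample = random.sample(population, k=200)  # 10% simple random sample

print(len(sample), sample[:3])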

32
Administration
  • Find the most convenient time
    (interview and group-administered surveys)
  • Work out logistical issues during the pilot phase
    • Seating arrangements
    • Number of times to call
    • How to classify responses
    • How to respond to questions

33
Analysis
  • Data entering
    • Scanning
    • Double entry
    • Checks and balances
  • Data cleaning (see the cleaning sketch below)
    • Outliers
    • Missing data
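
A minimal sketch of the entry and cleaning steps above, with hypothetical
column names; pandas and the small example data are illustrative assumptions.

import pandas as pd

# Double entry: rows where the two data-entry passes disagree need review.
entry1 = pd.DataFrame({"id": [1, 2, 3], "score": [12, 45, 33]})
entry2 = pd.DataFrame({"id": [1, 2, 3], "score": [12, 54, 33]})
print("Entry mismatches:\n", entry1.compare(entry2))

df = pd.DataFrame({"age": [21, 22, 23, 210, None],
                   "satisfaction": [4, 5, None, 3, 2]})

# Outliers: flag values outside a plausible range for the respondents.
print("Possible outliers:\n", df[(df["age"] < 17) | (df["age"] > 80)])

# Missing data: count blanks per item before deciding how to handle them.
print("Missing values per column:\n", df.isna().sum())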

34
Analysis
  • Correlation: relationship between 2 or more
    constructs
  • ANOVA: differences in outcome based on groupings
  • Regression: predicting an outcome using 2 or more
    constructs (a sketch of all three follows)
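
A minimal sketch of all three analyses on simulated survey data; SciPy,
statsmodels, and the variable names are illustrative assumptions.

import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "motivation": rng.normal(50, 10, 90),
    "study_hours": rng.normal(10, 3, 90),
    "survey_mode": np.repeat(["mail", "oral", "web"], 30),
})
df["outcome"] = 0.5 * df["motivation"] + 2 * df["study_hours"] + rng.normal(0, 5, 90)

# Correlation: relationship between two constructs.
r, p = stats.pearsonr(df["motivation"], df["outcome"])
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# ANOVA: does the outcome differ across groups (here, survey mode)?
f, p = stats.f_oneway(*(g["outcome"] for _, g in df.groupby("survey_mode")))
print(f"ANOVA F = {f:.2f}, p = {p:.3f}")

# Regression: predict the outcome from two or more constructs.
model = smf.ols("outcome ~ motivation + study_hours", data=df).fit()
print(model.params)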

35
Present Findings
  • Formal report
  • Departmental meetings
  • Peer-reviewed manuscript
  • Cost-analyses
  • Process evaluation
  • Outcomes research

36
Additional Considerations
  • Reliability/Validity
  • Ethical considerations
  • Sample representativeness
  • Data analysis
  • Confidentiality versus anonymity
  • Response rates