Implementing Developmental Screening in the Medical Home
Transcript and Presenter's Notes
1
Implementing Developmental Screening in the
Medical Home
Medical Home Implementation Teleconference Series
April 20, 2009, 11:00 am CT
2
  • Paul H. Lipkin, MD
  • Kennedy Krieger Institute
  • Johns Hopkins University School of Medicine
  • Timothy Geleske, MD
  • North Arlington Pediatrics, IL
  • Tracy M. King, MD, MPH
  • Johns Hopkins University School of Medicine

The speakers have no relevant financial
relationships with the manufacturer(s) of any
commercial product(s) and/or provider(s) of
commercial services discussed in this CME
activity.
3
Session Objectives
  • Understand the motivation and planning of
    practices choosing to implement the AAP's policy
    statement on developmental surveillance and
    screening.
  • Recognize the wide range of implementation
    strategies used by D-PIP practices and some of
    the challenges practices faced in adopting these
    strategies.
  • Describe the implications of these findings for
    the sustainability of developmental surveillance
    and screening efforts within the medical home.

4
Pediatrics. 2006;118:405-420
5
The 2006 AAP Policy Statement on Surveillance and
Screening: Goals
  • Increase identification of children with
    developmental disorders by child health
    professionals
  • Improve surveillance and screening
  • Concrete guidelines (algorithm)
  • Eliminate barriers (e.g. reimbursement, time)
  • Improve medical assessment

6
Definitions (AAP, 2006)
  • Developmental surveillance
  • A flexible, longitudinal, continuous, and
    cumulative process whereby knowledgeable health
    care professionals identify children who may have
    developmental problems
  • Developmental screening
  • The administration of a brief standardized tool
    aiding the identification of children at risk of
    a developmental disorder
  • Not diagnostic!
  • Developmental evaluation
  • Aimed at identifying the specific developmental
    disorder or disorders affecting the child

7
Figure: Developmental surveillance timeline, with
developmental screening at the 9-month, 18-month,
and 24/30-month visits
8
Policy Statement Recommendations
  • Developmental surveillance
  • Every well-child visit
  • Developmental screening using a standardized
    screening tool
  • 9, 18, and 30 months
  • When concern is expressed
  • Autism screening
  • 18 (and 24) months

9
Why screen at 9, 18 and 30 months?
  • Time availability
  • Limited other requirements
  • Key developmental stages
  • Early Intervention
  • Medical interventions

10
When screening results are concerning: Referrals
  • Developmental evaluations
  • Identify disability
  • Medical evaluations
  • Identify etiology
  • Counsel around diagnosis/prognosis
  • Genetics and family planning issues
  • Implement medical treatments
  • Early intervention/other services
  • Service delivery

11
Developmental Screening Instruments
  • Domains
  • General
  • Domain-specific (motor, language)
  • Disorder-specific (autism)
  • Administration
  • Parent-completed
  • Directly administered

Acceptable sensitivity and specificity: 70%-80%
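For reference, sensitivity and specificity come from comparing screening results against children's true developmental status. A minimal sketch with hypothetical counts (illustrative only, not data on any particular instrument):

    # Hypothetical counts for an illustrative screening tool (not real instrument data)
    true_positives = 40    # children with a disorder who screen positive
    false_negatives = 10   # children with a disorder who screen negative
    true_negatives = 160   # typically developing children who screen negative
    false_positives = 40   # typically developing children who screen positive

    sensitivity = true_positives / (true_positives + false_negatives)  # 0.80
    specificity = true_negatives / (true_negatives + false_positives)  # 0.80

    print(f"Sensitivity: {sensitivity:.0%}, Specificity: {specificity:.0%}")

Both hypothetical values land at the top of the 70%-80% range considered acceptable for screening instruments.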
12
Algorithm: Surveillance to Screening to Referral
13
(No Transcript)
14
Why an implementation project?
  • Quality improvement framework
  • To see if the guidelines can be effectively
    implemented in a variety of practice settings,
    specifically with regard to
  • Developmental Surveillance
  • Developmental Screening
  • Referral Practices

15
Developmental Surveillance and Screening Policy
Implementation Project (D-PIP)
  • Aim
  • Implement policy statement in pilot practices
  • Goals
  • Determine whether the policy statement can be
    efficiently and effectively implemented in practice
  • Recognize strategies for implementation
  • Examine outcomes of implementation
  • Pilot sites to serve as best-practice sites

16
Participating Sites
  • Setting
  • 9 urban
  • 5 suburban
  • 3 rural
  • Practice type
  • 7 private practice
  • 5 residency programs
  • 5 community health centers

17
Training of Practices: Pre-Implementation Workshop
  • 3 members from each practice
  • (pediatrician, office staff, other)
  • Review of AAP guidelines
  • Screening test examples
  • Principles of implementation
  • (Bright Futures, Medical Home models)

18
Developmental Screening
  • One practice's experience

19
North Arlington Pediatrics
  • Primary care pediatric practice in a middle class
    suburban setting
  • Five full-time and three part-time physicians
  • Emphasis on surveillance, with health maintenance
    visits at 1-6, 8, 10, 12, 15, 18, 24, 30, and 36
    months, and yearly thereafter
  • No standardized developmental screening performed

20
North Arlington Pediatrics
  • Developmental Screening Policy Implementation
    Project (D-PIP) sponsored by the AAP - Spring
    2006
  • Implementation team: nurse, front office staff,
    and physician champions
  • Developmental screening introduced in July of
    2006 by utilizing PDSA cycles and small tests of
    change

21
PDSA: Plan, Do, Study, Act
22
PDSA: Plan, Do, Study, Act
Incremental Improvement
23
Developmental Screening Implementation
  • The Ages and Stages Questionnaire (ASQ) was chosen
    because of its high sensitivity and specificity
    and its relative ease of use in the practice
    setting
  • Parents receive the screener upon arriving at the
    office and fill it out in the waiting or exam room.
    Nursing staff score the screener before the
    physician enters the room

24
Developmental Screening Implementation
  • Started July 2006 with one physician and expanded
    accordingly
  • By November 2006 all 10-, 18-, and 30-month-old
    children were routinely screened using the Ages
    and Stages Questionnaires
  • Developmental screens were performed at other
    visits based on surveillance according to the
    Developmental Screening Algorithm

25
Impact on Referrals
  • To determine referral patterns, we looked at the
    total number of referrals to early intervention,
    subspecialists, or other diagnostic evaluations
    in our practice
  • A retrospective chart review of all health
    maintenance visits at 10, 12, 15, 18, 24, and 30
    months from March 2006 served as baseline data

26
Impact on Referrals
  • Data were collected for D-PIP by keeping a running
    tally of referrals
  • A chart review of all health maintenance visits
    at 10, 12, 15, 18, 24, and 30 months from March
    2007 was also performed to ensure no referrals
    were missed
  • The tally from March 2007 was compared with the
    March 2006 baseline (see the comparison sketch
    below)
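A purely illustrative sketch of this kind of before/after comparison (the counts below are hypothetical, not the practice's actual tallies):

    # Hypothetical monthly referral tallies (not North Arlington Pediatrics' actual data)
    referrals_march_2006 = 6   # baseline month, before routine screening
    referrals_march_2007 = 9   # same month, one year after implementation

    change = referrals_march_2007 - referrals_march_2006
    percent_change = 100 * change / referrals_march_2006

    print(f"Referrals changed by {change:+d} ({percent_change:+.0f}%) from baseline")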

27
Impact on Referrals
28
Impressions
  • To determine the impressions of participants in
    the developmental screening process, a
    questionnaire was distributed to physicians,
    nursing and office staff, and families at the
    conclusion of D-PIP

29
Impressions
30
Physicians' Impressions
  • Overall, found the screening tool helpful and
    viewed parents' impressions as favorable
  • Of those physicians who believed that their
    referral patterns had changed, referring sooner
    was given as the area of change
  • Provided parental reassurance
  • Allowed more time to be spent on parents'
    questions and less time spent on surveillance

31
Nurses' Impressions
  • Generally had a favorable impression of
    developmental screening, found it easy to score,
    and viewed parents' impressions as favorable

32
Nurses' Impressions
  • Generally had a favorable impression of
    developmental screening
  • Easy to score
  • Viewed parents' impressions as favorable
  • Identified challenges and opportunities to improve
  • May reassure the parent or make the parent anxious
  • Parents may not understand the questions or the
    intent of screening
  • Not enough time to complete
  • Difficult to fill out while watching the
    child/children
  • Nurse needs to come back to the room to score

33
Families' Impressions
  • Overall, parents had a favorable impression and
    found it to be helpful in understanding their
    child's development
  • Parents felt they had enough information about
    their child's development to adequately complete
    the screener
  • Rated it as easy to complete
  • Expressed a desire to experiment with the
    questions ahead of time

34
Conclusion
  • Referrals and patients identified for potential
    referral to early intervention increased with
    developmental screening
  • Physicians, nursing staff, and families found
    developmental screening to be helpful

35
D-PIP Results
36
Data Collection
Quantitative
  • Screening (test chosen, frequency of screening)
  • Frequency of referral

Qualitative
  • 3 representatives from each practice
  • 2 time points (mid-, post-implementation)
  • Analysis

37
RESULTS: QUANTITATIVE DATA
38
General developmental screening instruments (n=17
practices)
Note: includes use of multiple instruments by some
practices
39
Rates of screening
40
Rates of referral (among children with failed
screens)
41
Referral Sites
(N=214 total referrals, all 9 months)
42
RESULTS: QUALITATIVE INTERVIEWS
43
Theme: Considerations in choosing screening
instruments
44
Concerns about clinic flow
  • We chose the PEDS because of the simplicity of
    it...we've got a busy practice and you've got to
    move fast or you'll get trampled.
  • We've been real happy with the Ages and Stages
    because it hasn't slowed us down significantly,
    it's easy to score...once we became familiar with
    it, then it's made using the tool very easy.

45
Alignment with community-based programs
  • The biggest reason we went with the ASQ is
    because it's currently used by our state early
    intervention program and so we thought if we
    were using the same tool we would have some
    consistency with them.

46
Support of teaching
  • The ASQ gives us a little more opportunity
    for teaching...both teaching parents and teaching
    students about appropriate developmental
    expectations.

47
Theme: Need for practice-wide implementation
systems
48
Distributing responsibilities among multiple staff
  • Our front desk staff puts the screener in the
    chart...and then the nurses are just giving out
    the screener, going back and checking and scoring
    it.

49
Modifying implementation in response to data
  • I was looking at the numbers and the forms
    weren't getting back so I was asking front
    desk staff and they said, well, we're so busy
    checking insurance that we just can't always get
    those forms in here...Taking it off of them and
    putting it with nursing seemed to work better.

50
Theme: Frequent challenges in implementation
51
Capturing children at target visits
  • ...that was the hardest piece, absolutely by
    far...was to remember to screen in those isolated
    3 visits.

52
Keeping up screening during busy times
  • ...toward the winter months when we started
    getting a lot of sick kids coming in and it got
    very crazy sometimes we would just forget.

53
Coping with staff turnover
  • When the staff changed the office was obviously
    in chaos...so I had to put screening on a back
    burner.

54
Theme: Deviations from the AAP algorithm
55
Not implementing a 30-month visit
  • We're not doing a two-and-a-half-year checkup
    because insurance companies won't reimburse for
    it.

56
Not screening when surveillance suggests delays
  • If there's something there on surveillance, we
    go right to a referral.

57
Stratifying referrals (1)
  • We try to refer directly to early intervention
    if it's multiple significant developmental
    delays. If it's just speech and language then we
    will refer for a hearing screen and speech and
    language therapy, but not to early
    intervention.

58
Stratifying referrals (2)
  • If it seems like it's something that is
    relatively minor, and it's not going to entail
    that much of evaluation, then we go with our
    state early intervention program. If it seems
    more serious, more concerning, we may start doing
    some work up and tests on our own, while we get
    them lined up to go in and see a developmental
    pediatrician.

59
Theme: Lessons learned from referral tracking
efforts
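The next slides describe what practices learned from trying to track referrals. As a purely illustrative sketch (not any D-PIP practice's actual system), a minimal referral log might record each referral and its follow-up status so that the information does not live only in the chart:

    from dataclasses import dataclass, field
    from datetime import date

    # Hypothetical referral-tracking log (illustrative only)
    @dataclass
    class Referral:
        child_id: str
        referred_to: str          # e.g. "early intervention", "developmental pediatrician"
        referral_date: date
        family_contacted_agency: bool = False
        report_received: bool = False

    @dataclass
    class ReferralLog:
        referrals: list = field(default_factory=list)

        def add(self, referral: Referral) -> None:
            self.referrals.append(referral)

        def needing_follow_up(self) -> list:
            # Referrals with no report back yet are flagged for a follow-up call
            return [r for r in self.referrals if not r.report_received]

    log = ReferralLog()
    log.add(Referral("A123", "early intervention", date(2007, 3, 5)))
    print(len(log.needing_follow_up()), "referral(s) still need follow-up")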
60
Referrals cannot be tracked without a system
  • We were just putting the referral in the
    chart, no follow up, no nothing...we just didn't
    know what happened because it was only in the
    chart, and of course the chart doesn't speak for
    itself.

61
Referral tracking requires people and time
  • Unfortunately, we lose track of many
    referrals...We don't have the number of people
    we need to make sure that these families follow
    up.

62
Many families don't follow through with referrals
  • I did keep a list of who was referred...but when
    I got around to following up I found out
    that a lot of people didn't bother with it,
    contacting early intervention.

63
Families often don't understand why they're being
referred
  • They didn't understand who exactly was calling
    them. So sometimes we have to re-explain the
    process, that these are the people we talked to
    you about that are going to help you and evaluate
    the baby...Usually once the docs call again and
    re-explain, then they are pretty good to go with
    it but it sometimes requires additional
    reassurance on our part.

64
Tracking leads to better communication
  • I can tell you I get a lot more stuff back from
    early intervention than I ever had before...And
    I think it's because we put that referral piece
    in place.

65
Tracking can show that more children are being
identified
  • We know that we're identifying more children
    based on our referrals to early intervention
    being increased by 60% with no decline in
    eligibility.

66
IMPLICATIONS
67
Implications for practices
  • To fully implement the AAP policy statement,
    practices need two distinct implementation
    systems:
  • Screening
  • Referral
  • Implementation requires consistent and ongoing
    monitoring

68
Implications for policymakers
  • Guidelines alone are not enough to ensure their
    widespread adoption
  • Tools/toolkits
  • Technical assistance/mentoring
  • Ongoing revision of guidelines to reflect new
    knowledge (especially regarding implementation)

69
Implications for researchers
  • Prior research has failed to link universal
    developmental screening with improved outcomes
    for children
  • Do failures in the referral process (partially)
    account for this gap in evidence?
  • How can gaps in referrals be minimized in future
    research efforts?

70
Acknowledgments
  • AAP Policy Revision Committee
  • John Duby, MD
  • Michelle Macias, MD
  • Lynn Wegner, MD
  • Paula Duncan, MD
  • Joseph Hagan, Jr., MD
  • W. Carl Cooley, MD
  • Nancy Swigonski, MD
  • Paul Biondich, MD, MS
  • American Academy of Pediatrics
  • Thomas Tonniges, MD
  • Stephanie Skipper, MPH
  • Jill Ackermann Healy, MS
  • Holly Griffin
  • Amy Brin, MA
  • Mary Crane, PhD, LSW
  • Amy Gibson, MS, RN
  • Darcy Steinberg, MPH
  • Ginny Chanda
  • D-PIP Studies
  • S. Darius Tandon, PhD
  • 17 Practice site personnel
  • PRC Liaisons and Consultant
  • Donald Lollar, EdD - Centers for Disease Control
    and Prevention
  • Bonnie Strickland, PhD - Maternal and Child Health
    Bureau
  • Melissa Capers, MA, MFA
  • Other
  • Ed Schor, MD - Commonwealth Fund

71
(No Transcript)
72
  • Questions?