NCA Residency Session 7 – March 8, 2017
1. AGENDA – Learning Collaborative Session 7
March 8, 3:00–4:30 pm (EST)
• Welcome and Review Moodle/Assignments
• Questions on NP Finances
• Residency Program Policies and Procedures
• Curriculum: Evaluation of the Learner
• How and Why to Assess Your Residents
• Assessment Tools and Process
• Action Period Items
• Begin working on policies and procedures
• Continue curriculum development
• Progress Checklist and MONTHLY REPORTS!
Monthly Reports due EVERY MONTH!
2. ⢠1) Is the Program set up as a separate Cost Center. If so,
what is costed directly to the Residency Program Cost
Center?
â Are the revenues from the residents credited to this cost
center?
â Are the salaries of the mentors and continuity clinics charged to
the cost center?
⢠2) Is the program considered fully in scope, was this
separately added to your scope â how has this been
handled with respect to UDS and FFR and SAC 330 budget?
⢠3) Is the work of the residents at the off site specialty
rotations covered by FTCA or Gap policies?
3. Creating policy and procedures for your program
⢠Policies VS. Procedures
⢠What policies do you need to create for the
residency?
⢠What procedures may you need to adapt to fit
program? (ie PTO)
⢠What policies does your organization already
have?
Policies and Procedures
4. Policies and Procedures
⢠Residency Specific Policies
1. Ramp Up Policy
2. Precepting Policy
3. Patient Panel Transfer Policy
⢠Accreditation Policies
⢠Check accreditation standards on what policies are
required
5. Policies and Procedures
⢠Start putting pen to paper to develop Policies and
Procedures
⢠Create program manuals
⢠Program staff
⢠Residents
⢠Important to have these established
for training new staff
9. Learning Objectives
Knowledge:
– Understand the purpose of assessment
– Know the characteristics of good assessment
– Understand how assessment builds trainee and programmatic performance
Attitude:
– Appreciate the importance of good assessment
– Embrace the challenge
Skills:
– To be gained by independent/group work building on information provided in the presentation
10. Overview of the Session
• Defining terms: the difference between evaluation and assessment
• How assessment/evaluation fits in the bigger picture of curriculum and program development
– Integrated throughout the program
– Creates explicit expectations for the trainee
– Building blocks for program evaluation
– Engine for trainee and program improvement
• Characteristics of effective assessment and evaluation
• Examples of techniques/methods
• Discussion
11. Definitions
• Assessment
– Process of measuring learning (describing, collecting, recording, and scoring information), generally focusing on observable KSAs
– Gathering of information about learner performance that is relevant to stated competencies/outcomes
– The goal of assessment is performance improvement, as opposed to simply being judged
– Provides information for changes/interventions that improve learner performance
– Formative
• Evaluation
– Process of making judgments; of comparing assessment data against established criteria, evidence, or standards to determine the extent to which learner competencies/outcomes and program goals have been met
– Provides information for changes/interventions that improve learner/program performance
– Summative
12. Definitions cont'd
• Program Goals
– General and "fuzzy"; they are aspirational
– Overall outline of what the program will accomplish
• Program Objectives
– Measurable and specific
– Introduce the curricular domains of the program, e.g., Patient-Centered Care, Professionalism, Clinical Practice
– Within the domains are sub-domains which contain specific learner outcomes
• Learner Outcomes
– Measurable benchmarks, the intended results of the curriculum
– Describe what the learner will actually do, and often use Bloom's taxonomy of action verbs
– Summative (final) data describing learner performance is compared to the benchmarks. It is an indicator of achieving outcomes. It is your evidence that your residents are learning and doing what you said they would learn and do.
13. The Relationship between Assessment and Evaluation
• Formative Assessment for Learner Feedback
• Summative Evaluation for Improvement
• Summative Evaluation for Programmatic Improvement
14. APA Guidelines
Domain E: Resident–Supervisor Relations
At least semiannually, written feedback re: meeting performance requirements:
(a) Initial written evaluation provided early enough for self-correction;
(b) Second written evaluation early enough to provide time for continued correction or development;
(c) Discussion/signing of the evaluation by resident and supervisor;
(d) Timely written notification of problems, opportunity to discuss them, and guidance re: remediation; and
(e) Substantive written feedback on the extent to which corrective actions are or are not successful.
15. NNPRFTC Standard 3: Evaluation
Evaluation components:
• Institutional performance
• Programmatic performance
• Trainee performance
• Instructor and staff performance
• Assessment based on the Program's core elements, competencies, and curriculum components
• Assess the performance of each trainee through periodic and objective assessment (formative and summative)
• Include identification of any deficiencies or performance concerns
• Process for trainee performance concerns, incl. an improvement plan with measurable goals
16. Models of Learner Assessment
– Learner assessment is anchored in the learning theory or model used to create the curriculum;
– Measure important milestones specified by the learning theory in the context of the curriculum.
_____________________________________________________
– Malcolm Knowles – Andragogy: "adult learning"
• Involve the learner in the planning and evaluation of their instruction.
• Experience (including mistakes) provides the basis for the learning activities.
• Adult learning is problem-centered rather than content-oriented. (Kearsley, 2010)
– Englander et al. (2013) – 8 clinical competency domains
– Dreyfus/Benner
• Novice to expert
• Assessment tailored to each level of proficiency
22. Examples: APA Accreditation
• Competency/domain: Professionalism
• Learner Outcome: Demonstrates in behavior and comportment the professional values and attitudes of the discipline of psychology
• Subdomains: Professional Values and Attitudes, Cultural Diversity, Ethics, Reflective Practice/Self-Assessment
• Measurable outcome for subdomains:
– CHCI: Dreyfus Novice to Expert
23. Example: APA Accreditation w/ CHCI Outcomes
• Subdomain: Professional Values and Attitudes
• Components of subdomain: Integrity, Accountability, Concern for the welfare of others
• Outcome for Integrity: Monitors and independently resolves situations that challenge professional values and integrity
• Outcome for Accountability: Independently accepts personal responsibility across settings and contexts
24. CHCI Rating Scale for Post-doc Psychologists
1) Novice – Entry-level skills, knowledge, attitudes
2) Advanced Beginner – Developing skills, knowledge, and attitudes
3) Competent – Developed skills, knowledge, and attitudes
4) Proficient – Advanced skills, knowledge, and attitudes
5) Expert – Authority for skills, knowledge, and attitudes
0) No interaction
25.
26. Example: NNPRFTC Accreditation
• Competency/domain: Patient Care/Knowledge for Practice
• Learner Outcome: Provide effective, evidence-based, patient-centered care for the treatment of health problems and the promotion of health
• Subdomains: diagnostic tests, history & physical, prescribing, plan of care
• CHCI's model for assessment measurement: Dreyfus/Benner Novice to Expert
27. NNPRFTC Accreditation w/ CHCI Outcomes
• Subdomain: History & physical
• Outcome for history & physical: Perform a comprehensive history and physical exam
• Outcome for diagnostic tests: Order appropriate screening and diagnostic tests
• Outcome for prescribing: Order appropriate medications
28. CHCI NP Residency Rating Scale
1 – Novice: Observes task only; entry-level skills, knowledge, attitudes
2 – Advanced Beginner: Needs direct supervision; developing skills, knowledge, attitudes
3 – Competent: Needs supervision periodically; developed skills, knowledge, attitudes
4 – Proficient: Able to perform without supervision; advanced skills, knowledge, attitudes
5 – Expert: Able to supervise others; authority for skills, knowledge, attitudes
0 – N/A: Not applicable, not observed, or not performed
29. CHCI Assessment Protocol
• Residents are assessed in 8 competency domain areas (based on NNPRFTC accreditation curriculum standards)
• Residents complete a self-assessment at baseline, 6 months, and 12 months
• Preceptors complete an assessment at 6 and 12 months
• The preceptor team develops 1 final assessment for each resident
30. Creating Your Assessment Process
• Anchor in the curriculum and program objectives
• What is the evidence/documentation?
• What methods do you want to use?
• Use reliable and valid techniques
• When are you going to collect data?
• Conduct systematic formative (ongoing) and summative (final) data collection
• Create a feedback loop – remediation and using the information
• Measure the impact
31. ⢠Pell Institute: user-friendly toolbox that steps through every
point in the evaluation process: designing a plan, data
collection and analysis, dissemination and communication,
program improvement.
⢠CDC has an evaluation workbook for obesity programs;
concepts and detailed work products can be readily adapted to
NP postgraduate programs.
⢠The Community Tool Box, (Work Group for Community Health
at the U of Kansas): incredibly complete and understandable
resource, provides theoretical overviews, practical suggestions,
a tool box, checklists, and an extensive bibliography.
Resources:
32. Resources cont'd
• Another wonderful resource, Designing Your Program Evaluation Plans, provides a self-study approach to evaluation for nonprofit organizations and is easily adapted to training programs. There are checklists and suggested activities, as well as recommended readings.
• http://edglossary.org/assessment/
• NNPRFTC website – blogs: http://www.nppostgradtraining.com/Education-Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-Evaluation
34. VETERANS HEALTH ADMINISTRATION
• Explain the development of the NP Residency competency tool
• Describe the validation of the NP Residency competency tool
Objectives
35. VETERANS HEALTH ADMINISTRATION
• Demonstrate program effectiveness
• Standardization across 5 sites
• Document competence in 7 domains
• Prepare for site accreditation
NP Competency Tool
36. VETERANS HEALTH ADMINISTRATION
â AACN/CCNE Masters and DNP Essentials
â AACN/NONPF Adult-Gerontology Nurse Practitioner
Core Competencies
â NCQA PCMH Standards
â Core Competencies for Interprofessional
Collaborative Practice (IPEC)
â ACGME competencies
â VA top outpatient diagnoses
â COE education core domains
â Entrustable Professional Activities
Development
37. VETERANS HEALTH ADMINISTRATION
• Iterative process
– VA NP experts at each site and an MD education consultant
– A post-graduate NP trainee reviewed the tool and offered suggestions
– Solicited input from experienced and new NPs throughout VA Primary Care
Content Validity
38. VETERANS HEALTH ADMINISTRATION
• Clinical competency in planning and managing care
• Leadership
• Interprofessional team collaboration
• Patient-centered care
• Shared decision making
• Sustain relationships
• Quality improvement and population management
Domains
39. VETERANS HEALTH ADMINISTRATION
• NP resident and mentor complete the competency tool at 1, 6, and 12 months (69 items total)
• Rate on a 0–5 scale:
– 0 = not observed or not performed
– 1 = observes task only
– 2 = needs direct supervision
– 3 = needs supervision periodically
– 4 = able to perform without supervision
– 5 = able to supervise others (aspirational!)
• NP resident responds to open-ended questions
Methods
40. VETERANS HEALTH ADMINISTRATION
• Evaluation questions:
– Identify the items and domains in which NP residents are strongest and weakest
– Determine how NP residents progress over time
– Determine agreement between trainee and mentor ratings
• Descriptive statistics to evaluate the distributional characteristics of each item and domain, and the impact of time on trainee and mentor ratings
• T-tests and general linear models to assess the relationship between NP resident and mentor ratings over time
Analysis
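The trainee-vs-mentor comparison above can be sketched in a few lines. This is a minimal illustration, assuming ratings at one time point are stored as parallel arrays of item scores; the function name `paired_t` is hypothetical, and a production analysis would typically use a statistics package (e.g., `scipy.stats.ttest_rel`) rather than a hand-rolled formula:

```python
import numpy as np

def paired_t(trainee, mentor):
    """Paired t statistic comparing NP resident self-ratings with mentor
    ratings on the same competency items at a single time point.

    t = mean(d) / (sd(d) / sqrt(n)), where d is the per-item difference.
    """
    d = np.asarray(trainee, dtype=float) - np.asarray(mentor, dtype=float)
    n = d.size
    return d.mean() / (d.std(ddof=1) / np.sqrt(n))
```

A positive t indicates residents rating themselves higher than their mentors on average; repeating the computation at 1, 6, and 12 months shows whether the two sets of ratings converge over time.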
44. VETERANS HEALTH ADMINISTRATION
Leadership Competency
[Bar chart: mean mentor and trainee ratings on the 0–5 scale at 1, 6, and 12 months for leadership items 2.1–2.7: lead PACT team huddle; lead case conference; lead team meeting using conflict management/resolution; lead group education activities for patients/families, PACT team, and peers; lead PACT team quality improvement project; lead shared/group medical appointments; apply leadership strategies to support collaborative practice/team effectiveness.]
45. VETERANS HEALTH ADMINISTRATION
• At 1 month, 24 of 28 items were rated between 2 and 3 (2 = needs direct supervision; 3 = needs supervision periodically); only four items were rated higher than 3 by the NP residents.
• The four items rated higher than 3 were "perform comprehensive history and physical exam" (3.48), "perform medication reconciliation" (3.54), "management of hypertension" (3.13), and "management of obesity" (3.35).
• At the 12-month time point, all items were rated higher than 3, and seven of 28 items were rated 4 or higher (4 = able to perform without supervision) by the NP residents.
• The seven items rated 4 or higher were "perform comprehensive history and physical exam" (4.17), "order appropriate consults" (4.11), "perform medication reconciliation" (4.14), "management of hypertension" (4.08), "management of obesity" (4.11), "management of gastroesophageal reflux" (4.02), and "management of osteoarthritis" (4.00).
• At the 12-month time point, the mentors' ratings were all above 4 except for two items, "management of military sexual trauma" (3.58) and "management of traumatic brain injury" (3.66).
Item Analysis – Clinical Competence
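The item-level analysis above (mean rating per item, then flagging items whose mean clears a supervision threshold) can be sketched as follows. The helper `items_above` and the sample labels are hypothetical, not the VA's actual analysis code:

```python
import numpy as np

def items_above(ratings, labels, threshold):
    """Mean rating per item across residents, returning only the items
    whose mean exceeds the threshold (e.g., 3 = needs supervision
    periodically, 4 = able to perform without supervision).

    ratings: rows = residents, columns = items, on the 0-5 scale.
    labels:  item names, one per column.
    """
    means = np.asarray(ratings, dtype=float).mean(axis=0)
    return [(label, float(round(m, 2)))
            for label, m in zip(labels, means) if m > threshold]
```

Running this once per time point (1, 6, 12 months) reproduces the kind of summary reported above: which items residents can already perform with only periodic supervision, and which still need direct oversight.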
46. VETERANS HEALTH ADMINISTRATION
Psychometric Analysis
• Internal consistency – the degree to which the items are measuring the same attribute
• Cronbach's (coefficient) alpha ranges from 0.00 to 1.00; the higher the value, the higher the internal consistency
• Internal consistency was calculated by NP resident and mentor for each domain at each time point; α = 0.82–0.96
• Triangulating quantitative data, qualitative data, and the end-of-program evaluation further enhances content validity
• Factor analysis will be used for construct validation – it identifies clusters of related variables
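Cronbach's alpha as reported above is straightforward to compute from a residents-by-items score matrix. A minimal sketch, assuming complete data with no missing ratings (not the study's actual analysis code):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's (coefficient) alpha: internal consistency of a scale.

    item_scores: rows = respondents, columns = items. Returns a value
    between 0.0 and 1.0; higher means the items hang together better.
    """
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Perfectly correlated items give alpha = 1.0; the study's reported range of 0.82–0.96 per domain indicates high internal consistency for both resident and mentor ratings.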
• Is the trainee performing based on stated outcomes, competencies, and expectations?
– Assessment needs to be anchored in the program objectives
– Outcomes/expectations need to be very clear up front; assessment/evaluation flows from these
• How do we know that the trainee is performing?
– What tools/methods will we use?
– When/how often will we use these tools?
– How does the assessment occur? Who does it?
• What do we do with this information?
– What if the trainee is NOT performing as expected?
– What does it mean for program effectiveness?