
Service Measures and Outcomes Working Group: May 8, 2020

In Attendance: Sandy, Mary, Melanie, Don, Avra, Janet, Heather, Dayna, Laura

Reviewing the “Practice Profile for NAVIGATE”


- The team has been working on this core components document for a long time

- The plan is to share it with the NAVIGATE trainers in the US and to get feedback from them as to whether this is important and relevant

- Tried to identify guiding principles for NAVIGATE and core components

- The data source will be staff filling out information, which is a limited data source compared to what was done for the RAISE study in the US, but is within the scope of what we can do. It may be something programs can sustain when the research project is over

- The beginning details the background, rationale, and literature

- It has been narrowed down to 7 core components, and caseload size was recently added. Clinicians need to have fewer than 20 clients for the program to get the most points, but it’s important to note that different roles carry different numbers of clients, so it will be interesting to capture the caseload for each NAVIGATE role

- We combined the integrated activities and integration role into the director role

- Often we think about interventions in terms of manuals, but we need a higher organizing frame. We want people who don’t know about NAVIGATE to be able to understand it and take it up so that they can put it in their EPI program

- In the US model, fidelity was based on certification of staff; in Ontario it is based on expert consultation in a peer-support model of competency. This captures adherence and penetration

- It’s not clear whether this document is descriptive or evaluative; at this point we don’t know exactly what the benchmark will be

- It may be helpful to prepare descriptive profiles of each site, see the variances, and have other data sources to understand how well they’re implementing NAVIGATE

- Descriptive and implementation/clinical data need to be looked at with respect to the operational definitions of clinical success and implementation success

- On the clinical side this is already somewhat defined: clinicians look at clinical success at the level of the client’s improvement

- On the implementation side, we need to come up with an operational definition of implementation success, not just a penetration measure

- This ideally would have been set out from the beginning

- Don mentioned that this is a more granular version of the fidelity scale we are already using, and that caseload size is not likely to be related to outcomes

- Some items are fairly clear

- Dividing the IRT role into CBT and psychoeducation would make the findings more generalizable and comparable to other research, so it may be helpful to parse out the IRT modules: some are psychoeducation, some are about substances, and some are CBT

- We could capture how it’s provided but not what’s in it, as the content is set out in the manuals

- We don’t want this to be only from a research lens; we need to be clear about what we are doing in research terms so that sites can still do program evaluation when the research project is over

- We need to leave them with evaluation tools they can actually use

- REDCap is the research database they will be using to record the use of NAVIGATE during the research project

- If we want program evaluation to survive after the research project is finished, we need to teach the sites how to do this on their own and encourage them to do so

- Some sites want to incorporate something into their own EMRs to track service delivery

- We started talking about an Access database, then went back to REDCap

- With help from the ISs, we can learn what provided clinical utility for them, as it may be difficult for clinicians to keep up with evaluation

- Lots of EPI programs have administrative supervision, but not clinical supervision

- There are a lot of metrics, and if this is intended to be maintained by the program, fewer items will be better

- We want to focus on process measures (Did you deliver the NAVIGATE material?), so we will remove outcome measures

- We want to highlight quality assurance pieces as these will be helpful for sites

- It may be helpful to ask the NAVIGATE trainers in the US which components are the most important

- A great conversation was had today, and the implementation team will be in contact with everyone via email to continue the discussion
