Implementation Committee: February 20, 2019

In Attendance: Janet Durbin, Mary Hanna, Jasmyn Cunningham, Eva Serhal, Sanjeev Sockalingam, Christine Mitchell, Dayna Rossi, Nandini Saxena, Sandy Brooks, Kelsey Jones, George Foussias, Sarah Bromley, Chelsi Major, Jeff Rocca

Action Items

Site Technology Assessment

  • Send ECHO registration/technology assessment to Chelsi for review - Eva

  • REB amendment for Zoom for remote patient assessment – Jasmyn

  • Share how this was approached in the other REB application - George

  • Troubleshoot Zoom for patients/networks at sites – PSSP

Progress Report

  • Draw up a 1-page progress report for the sites, for site leads and higher-level management – Jeff

In-person Training Evaluation and Readiness Assessment

  • Send higher-level training objectives to Sanjeev – George and Sarah (done)

  • Update with likely evaluation draft timeline – Sanjeev (ASAP)

  • Draft initial version of evaluation (likely 1-1.5 pages) - Sanjeev

  • Data team meet for REDCap evaluation platform - George, Jasmyn, Steve, Dielle, Chelsi, Nadia (tentatively scheduled)

  • Review draft site readiness assessment and propose any desired changes – Implementation Committee (email to Janet)

In-person Training Notes

  • Send note categories out for in-person training notes, or send what notes they have so far so others can add to it - PSSP



Updates from Steering Committee

  • Today's Steering Committee Meeting was mainly follow-up from the in-person training. A follow-up email will be circulated, with instructions for receipts, etc.

  • Chelsi will update the web portal and will email a summary of the updates being made, including PowerPoints, links to training manuals, a new booster call tab, etc.

  • Discussed some ECHO agenda items (see below)

ECHO discussion

  • In the Steering Committee Meeting, Eva indicated that ECHO has registration forms for technological assessment at the sites

  • The question on the research side was whether this will be an ECHO-specific assessment, or whether it will also assess patient data collection needs

  • George: we will be using Zoom for patient assessment

  • There are barriers for the sites using Zoom for research data collection purposes – are patients able to access it?

  • Who is responsible for assessing these technological barriers?

  • Eva: ECHO is only looking at Zoom and has an existing set of questions; they can consider how to frontload those questions and ask them earlier. If a site expressed concern about not being able to use its network, is transfer of private data the issue, or is it about onboarding people onto Zoom?

  • ECHO doesn’t see patients directly. When patients are seen directly, the Ontario Telemedicine Network (OTN) is used – what would be used for direct patient consultation in our research?

  • George: Zoom is HIPAA- and PHIPA-compliant

  • Eva: Zoom isn’t used for patients; it is up to the research team to do an assessment. You can’t bill OHIP for it – maybe that isn’t relevant. ECHO is currently in conversation with privacy/IMG/others to assess using Zoom for client care and should have answers, and a framework, by March 31. Until then, ECHO is not comfortable assessing the use of Zoom for direct patient communication. For research, it is up to the Steering Committee whether we want to use Zoom with patients; research would need an REB amendment to use Zoom for remote assessment.

  • Eva: ECHO can help onboard people, but questions around the research piece need to come from the research group, following the same process George followed for the other study he referenced.

  • Eva will circulate the assessment given to the sites

  • The ECHO evaluation covers technology and includes a 15-20 minute orientation to troubleshoot before starting the session.

  • The registration includes no question about network access; usually every participant is a staff member, so issues are likely to arise with clients/patients. More considerations will need to be factored in.

  • This project will be using Zoom, not OTN. This likely wasn’t specified in the site contracts, but we would need to confirm.

  • Since OHIP is not being billed for this study, Zoom needs to be specified in the REB application, as the study is separate from clinical service (and so has different capabilities than core clinical services)

  • The protocol states that we are using ECHO infrastructure, which we are not (ECHO doesn’t use Zoom for direct clinical contact, and ECHO’s intention is to connect providers, not patients), so the protocol will need to be amended for REB

  • Patients will be at their site, sitting in front of a webcam, and remote CAMH research staff will be assessing them (not necessarily clinicians or physicians, so it is not a clinical consultation)

  • From IT perspective, having a patient in front of a computer logged in to an internal network is the concern

  • The technology piece needs to be done anyway for ECHO – the outstanding piece is the availability of a different network, such as a guest network, that participants can log into, which is more troubleshooting than onboarding. This would be the responsibility of PSSP.

  • Christine can connect with the team to share approach around onboarding, talk about role of onboarding vs troubleshooting

Progress Report

  • Jeff will complete a 1-page progress report for the sites, for site leads and higher-level management

In-person Training Evaluation

  • Sanjeev: the education evaluation team standardizes evaluations across education programs. They could review what will be sent out and provide feedback. This needs to happen quickly, without delay.

  • We hope to send it out at the same time as the “thank you” email to the sites, within the week

  • Janet: the readiness assessment is close to finalization. We need to review it to ensure its content does not overlap with the training evaluation; it is not ready to go out today

  • Kelsey: part of the role of engaging with sites is clarifying these different pieces going out to them

  • Janet/PSSP could take a first look at the evaluation, then Sanjeev could take a look at it

  • Sanjeev: for what purpose are we evaluating the training they received last week? For the development of future training? For the presenters? For the Navigate team? What is it for, will it help determine the future approach, and who is the audience for the data?

  • George: The audience is us, not the Navigate trainers. We want to understand whether there are benefits in people’s capacity, or sense of capacity, around the content of Navigate, and the familiarity and ease with which they feel they can implement it following training. This overlaps somewhat with the readiness assessment, and it is not possible to entirely tease them apart – don’t leave it all for the readiness assessment

  • Training evaluation is more content-specific than the readiness assessment

  • Mary: The trainers also expressed interest in reviewing the evaluations, but she didn’t commit to that

  • The event evaluation is for our own internal event planning and focuses on the level of preparedness and readiness in specific content areas. We can draft training objectives and assess whether people feel they met them – the objectives need to reflect specific content areas

  • Sarah: For evaluations, the question is: did this help improve competence in delivery? For sustainability, this will help drive the training and indicate whether we need to modify things

  • Sanjeev: self-assessment tends to be poor, so what we will capture is confidence or self-efficacy, which will overlap somewhat with readiness. It will be tied to specific content, skill, and knowledge domains in broad categories, from which we can gauge perceived confidence. It might be good to reassess, align this with where we want ECHO to go, and use this evaluation as a baseline.

  • Janet: also, does it have to be 3 days? Was it all valuable? Going forward, when onboarding new people, there won’t be a face-to-face 3-day event, so we can ask about other options or ways the training could be delivered or shortened – this will also be important for future scale-up

  • Sanjeev: there are standard questions about delivery satisfaction, specific delivery components, and recommendations for changing the training format. We will want to include open-ended questions.

  • The two surveys (evaluation and readiness) will be sent separately, a couple of weeks apart, and both will be short, so it hopefully won’t be overwhelming. Training and conference evaluations are standard.

  • Sanjeev will do the evaluation part but needs guidance from the content side on what trainees should have walked away with. If that information can be sent by those trained on Navigate, the rest can be developed into a 1-1.5 page evaluation by Sanjeev’s team

  • Sanjeev will draft an initial version for review but needs guidance on objectives for training

  • George and Sarah have high-level objectives that were put together; they can share those.

Readiness Assessment

  • A draft of the readiness survey was circulated and needs everyone’s input. It is a 38-item standardized tool from a group in the US; the language was adapted to make it relevant to Navigate. Bolded items come from tools used in the implementation of Navigate at Slaight.

  • Look at the added red items and decide whether to keep them

  • Add extra items based on what we know about the sites, if you feel items should be added

  • Edit “about you” section if we want to know anything else about the person completing it.

  • Once everybody has gone through it, we will have a final or near-final version

  • Data team (George, Jasmyn, Steve, Dielle, Chelsi, Nadia) will try to meet soon to pull together in REDCap

  • We should have it centralized so that everything is similar and consistent from a messaging perspective

  • When will the survey be completed, and what is the timeline for the training evaluation? Sanjeev will find out; it shouldn’t take too long since it is a modification of a template

Information from In-Person Training

  • PSSP: A few people took notes during the training, and PSSP will determine the most useful way to share them with others. They are working on a summary document classifying the information as supports, questions, helpful practices/recommendations, recommended modifications to the manual, additional informational leads, and positive reflections from clinicians.

  • They will work on this and share it. They couldn’t get full coverage of the sessions – each person participated in all types of sessions but couldn’t attend the full length of any one – but they will share back as soon as possible.

  • Kelsey also has notes from end of day reflection meetings.

  • They can send the categories out now, or send what they have so far, so others can add to it if they want to.
