This study was reviewed and approved by the Ethics Committee of the Faculty of Computer Science, Engineering and the Built Environment, Ulster University (Reference: 15.07 Bond, Raymond [Torney, Hannah]). The study was conducted in two parts: Protocol 1 was an in-person usability study of the HeartSine SAM 500P (HeartSine, Belfast, Northern Ireland), and Protocol 2 was a remote usability study of the same device conducted using video conferencing software. Sample sizes for each protocol were based on usability and human factors guidance, which recommends a minimum of 15 participants per group7,9. Figure 1 shows a flow chart depicting each study protocol.
Protocol 1: In-Person Usability Study
The simulation study was carried out in an office space at Ulster University, Northern Ireland, in January 2017, before the COVID-19 pandemic. Eighteen participants were randomly recruited from a public space at Ulster University and gave informed consent. Parental consent was obtained for participants under the age of 18.
This study evaluated the usability of the HeartSine SAM 500P. To ensure the safety of participants and researchers, the AED used throughout the study was modified by removing hardware so that the device could not physically deliver a shock. Otherwise, the device was fully representative of the marketed SAM 500P. A vital signs manikin (TruCorp, Craigavon, Northern Ireland) capable of simulating shockable arrhythmias was placed on the floor with the SAM 500P alongside. The equipment setup for Protocol 1 is shown in Fig. 2.
Upon recruitment, participants received a summary of the study protocol with basic background information about the study. All participants were assigned an identification number and were asked to complete a pre-test questionnaire that collected the participants’ demographics and whether they had previously been trained in cardiopulmonary resuscitation (CPR) or defibrillation. In addition, participants were asked to rate how difficult they expected the test to be on a scale of 1 to 10, with 1 being very easy and 10 being very difficult.
The participants were instructed to enter the testing room, where they found a simulated cardiac arrest victim (the clothed manikin described above) lying on the floor. They were instructed to assume that emergency services had already been called and that the victim was not breathing and had no pulse. They were then told they would find an automated external defibrillator, or AED, next to the victim, which could be used to deliver an electric shock to a patient’s heart to get it pumping again. Participants were informed that one or more people would be observing and videotaping their actions; however, they were not permitted to ask the observers questions or request help until told that the test was complete. They were told to behave in the same way as they would in a real emergency, where time is of the essence and every second counts. Participants were assured that the AED was being evaluated, not them, but that it was important that they act with the same urgency, determination and care that they would in a true emergency of this nature. The examiner indicated the beginning of the test with the words “You can begin”. This statement specified the start time for all time-based metrics in the study. Each test was conducted in one room without interruption, and the participant was videotaped for the duration of the test to allow for retrospective analysis.
Participants were expected to turn on the AED, locate and remove the pads from the AED, remove the plastic wrap and apply the pads to the manikin, break contact with the manikin as directed by the device, activate shock delivery, and complete a two-minute cycle of CPR.
Upon completion, participants were asked to rate the actual difficulty of the test on a scale of 1 to 10, with 1 being very easy and 10 being very difficult. They were then invited to a short interview to give their opinion on the test and the device’s ease of use.
Protocol 2: Remote Usability Study
This simulation study was conducted remotely in September 2020 using video conferencing software. Eighteen participants were recruited via a social media post and gave informed consent. There were no participants under the age of 18 in this study.
To allow comparison with Protocol 1, this study also evaluated the ease of use of the HeartSine SAM 500P. To ensure the safety of the examiner, the SAM 500P used during the tests was modified in the same way as in Protocol 1. Otherwise, the device was fully representative of the marketed SAM 500P. A webcam connected to Microsoft Teams (Microsoft, Redmond, USA) was placed over the AED to provide a top-down view of the device. A vital signs manikin (TruCorp, Craigavon, Northern Ireland) was placed on the floor, also with a webcam connected to Microsoft Teams providing an overhead view, as shown in Fig. 2. Participants were invited to attend individual Microsoft Teams meetings. Two participants were unable to join the Microsoft Teams meeting and used Google Meet instead. There were no differences in study design between Microsoft Teams and Google Meet.
Participants received a brief overview of the study that provided information about AEDs and their use. All participants were assigned an identification number and asked to complete a pre-test questionnaire identical to that used in Protocol 1. Participants were also asked to rate the expected difficulty of the test on a scale of 1 to 10, where 1 was very easy and 10 was very difficult.
Participants were informed that the study was conducted to assess the usability of the device and the feasibility of remote usability testing. They were informed that they would be asked to guide the investigator in using a publicly available automated external defibrillator, issuing instructions both verbally and visually by moving their computer mouse over the video feed. The investigator asked the participant to share their screen so that the participant’s mouse movements were visible to the investigator.
Before starting the testing phase, a two-step check was performed with each participant to assess video latency and to provide a practice scenario. First, the investigator placed a piece of paper with the letters A and B written on it under the webcam view and instructed the participant to move their mouse between the letters, stating aloud which letter they were pointing to as they moved their mouse. This revealed any latency in the video feed. Second, the investigator placed a television remote control under the webcam view and asked the participant to instruct the investigator visually (with their computer mouse) and verbally to turn on the television and open a specific application. This allowed the participant to practice guiding the examiner.
Before the start of the test, the investigator informed the participant that they had encountered a person who had suffered sudden cardiac arrest. Participants were told that 999 had been called and that the investigator had a publicly available AED, but that the investigator did not know how to use it and needed the participant’s guidance. They were told that sudden cardiac arrest is time sensitive and that they must think quickly and give clear instructions. The examiner indicated the beginning of the test by saying “Let’s begin”. This statement specified the start time for all time-based metrics in the study.
The investigator performed all tasks as instructed and indicated. For example, a participant might instruct the investigator: “Press the button in the center of the device”. To avoid bias, the investigator did not make assumptions but instead sought clarification from the participant and asked for further instructions, for example: “Which button should I press?”.
Similar to Protocol 1, participants were expected to instruct the investigator to turn on the AED, locate and remove the pads from the AED, remove the plastic wrap and apply the pads to the manikin, break contact with the manikin when prompted by the device, and activate shock delivery. Upon completion, participants were asked to rate the actual difficulty of the test and took part in a brief interview to provide their opinion on the test protocol and the device’s ease of use.
The key metrics in this study were the time taken to turn on the AED, the time taken to place the pads, and the time taken to deliver a shock. The acceptability of the electrode placement was evaluated by noting the quadrant of the chest on which each electrode was placed. The placement of the sternum pad was considered correct when it was between the clavicle and the nipple to the right of the sternum, and the placement of the apical pad was considered correct when it was positioned below the left nipple near the left midaxillary line. In addition, safety was assessed based on the proportion of participants who did not touch the manikin during the algorithm’s analysis period in Protocol 1 and those who indicated that the examiner should “stand away” in Protocol 2. Note that both Protocol 1 and Protocol 2 used the same investigator who, beyond introducing the scenario, provided no guidance in either study but asked for clarification in the remote study (Protocol 2) if the instructions were not clear.
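The three time-based metrics above are all measured from the start cue that marks time zero. As a minimal illustrative sketch (not the authors' actual analysis pipeline; all event names and timestamps below are hypothetical), the metrics could be derived from annotated video timestamps as follows:

```python
# Hypothetical sketch: deriving the study's three time-based metrics from
# annotated video timestamps. All times are in seconds, with the examiner's
# start cue ("You can begin" / "Let's begin") defining time zero.
from dataclasses import dataclass


@dataclass
class TrialEvents:
    test_start: float       # examiner's start cue (time zero for all metrics)
    aed_on: float           # AED powered on
    pads_placed: float      # both pads applied to the manikin
    shock_delivered: float  # shock delivery activated


def time_metrics(t: TrialEvents) -> dict:
    """Return the three key metrics, each relative to the start cue."""
    return {
        "time_to_power_on": t.aed_on - t.test_start,
        "time_to_pad_placement": t.pads_placed - t.test_start,
        "time_to_shock": t.shock_delivered - t.test_start,
    }


# Hypothetical trial: AED on at 8 s, pads placed at 45 s, shock at 70 s.
m = time_metrics(TrialEvents(0.0, 8.0, 45.0, 70.0))
print(m)
```

Expressing each metric relative to the same start cue keeps the two protocols directly comparable, since both defined time zero by the examiner's spoken cue.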
Minitab 19 was used for data analysis. Summary statistics were calculated for participants’ age, gender, educational level, and prior CPR and/or defibrillation training. Pearson’s χ2 test and the Mann-Whitney U test were used where appropriate. A p-value of <0.05 was considered statistically significant.
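To illustrate the two tests named above, the following sketch applies them to hypothetical data using scipy.stats as a stand-in for Minitab (the group sizes match the study's n = 18 per protocol, but all values and the contingency table are invented for illustration):

```python
# Illustrative sketch only: Mann-Whitney U and Pearson's chi-squared tests on
# hypothetical data, mirroring the analyses described in the text.
from scipy import stats

# Hypothetical time-to-shock values (seconds), one per participant (n = 18 each).
protocol_1_times = [52, 61, 58, 70, 49, 66, 73, 55, 60, 68,
                    57, 63, 71, 54, 59, 65, 62, 69]
protocol_2_times = [80, 95, 88, 102, 77, 91, 99, 84, 90, 97,
                    85, 93, 100, 82, 89, 96, 92, 98]

# Mann-Whitney U test for continuous/ordinal outcomes (e.g. times, ratings).
u_stat, u_p = stats.mannwhitneyu(protocol_1_times, protocol_2_times,
                                 alternative="two-sided")

# Pearson's chi-squared test for categorical outcomes, e.g. counts of correct
# vs. incorrect pad placement per group (rows: protocol; columns: outcome).
table = [[16, 2],   # hypothetical Protocol 1: 16 correct, 2 incorrect
         [13, 5]]   # hypothetical Protocol 2: 13 correct, 5 incorrect
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

print(f"Mann-Whitney U = {u_stat:.1f}, p = {u_p:.4f}")
print(f"Chi-squared = {chi2:.2f}, df = {dof}, p = {chi_p:.4f}")
```

The Mann-Whitney U test is the appropriate nonparametric choice here because the timing data and 1-to-10 difficulty ratings are not assumed to be normally distributed, while the chi-squared test handles categorical comparisons such as placement acceptability between the two groups.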