Introduction
The current generation of learners has matured in an ever-advancing technological world, with access to practically limitless digital information ‘at their fingertips’ [1]. Educators must now consider learners’ preferences for technological approaches when adapting to different learning styles [2,3]. As a large portion of learning occurs at the bedside, medicine exhibits an ever-present need for educational resources at the point of care. Readily accessible mobile devices and applications are commonly used for real-time clinical decision support [4,5]. Prior work has explored the use of mobile devices for the education of medical students, residents, and other health professions students. However, metrics regarding these devices and applications have generally been limited to post hoc or periodic surveys of usage and user satisfaction [6–13]. Thus, we lack prospectively gathered, real-time data demonstrating how these tools are actively used.
Using a mobile application (‘app’) custom built as an educational resource for pediatric anesthesia, we designed this study to describe and evaluate patterns of utilization in a mixed sample of learners (‘trainees’): anesthesiology residents and anesthesiologist assistant (AA) students. We used quantifiable data collected by an in-app analytics platform. Our primary aim was to test the null hypothesis that app usage would not differ between these trainee groups.
Results
The participant flow is shown in Fig. 2. Notably, a substantial proportion of participants did not complete the post-rotation survey (16/30, 53%). For the 30 participants who completed the initial survey, basic demographics, as well as attitudes about technology and apps, are shown in Tables 1 and 2.
Primary outcome
The median app usage frequency for AA students was 1.0 times per day (interquartile range [IQR], 0.9–1.8 times per day). This was more frequent than the usage exhibited by residents (P = 0.025), whose median was 0.4 times per day (IQR, 0.3–0.7 times per day).
Conversely, the amount of time spent in the app was significantly longer amongst residents than amongst AA students (P < 0.001): the median in-app time was 3.4 minutes (IQR, 0.6–12.1 min) for residents versus 1.4 minutes (IQR, 0.2–4.3 min) for AA students. Total aggregated in-app time per trainee was 112 minutes amongst AA students and 236 minutes amongst residents.
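As a minimal illustration of how such a between-group comparison can be computed from per-participant usage rates, the following Python sketch applies a rank-based (Mann–Whitney U) test to entirely hypothetical values; the study’s actual statistical methods may differ.

```python
# Minimal sketch: rank-based comparison of usage rates between groups.
# All values are hypothetical and do not reproduce the study data.
import numpy as np
from scipy.stats import mannwhitneyu

aa_students = np.array([0.9, 1.0, 1.0, 1.2, 1.8])  # uses/day (hypothetical)
residents = np.array([0.3, 0.3, 0.4, 0.5, 0.7])    # uses/day (hypothetical)

stat, p = mannwhitneyu(aa_students, residents, alternative="two-sided")
print(f"AA median = {np.median(aa_students):.1f}/day, "
      f"resident median = {np.median(residents):.1f}/day, P = {p:.3f}")
```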
Fig. 3 demonstrates the decline in the use of the app over the course of the rotation, both overall and by trainee type. After the first week of the rotation, app usage stabilized at approximately 15 uses per day for the entire cohort. Many of the devices were returned with completely discharged batteries, although specific data on this aspect were not recorded.
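The aggregation underlying a figure such as Fig. 3 can be sketched as follows; the event schema and values are illustrative assumptions, not the analytics platform’s actual export format.

```python
# Minimal sketch: reduce raw analytics events to daily usage counts by
# trainee type. Column names and values are assumptions for illustration.
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2019-03-04 06:45", "2019-03-04 12:10", "2019-03-11 07:05",
    ]),
    "trainee_type": ["AA student", "Resident", "Resident"],
})

# Count app uses per calendar day within each trainee type.
daily = (
    events.assign(day=events["timestamp"].dt.date)
          .groupby(["day", "trainee_type"])
          .size()
          .rename("app_uses")
)
print(daily)
```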
Secondary outcomes
Participants’ scores on the didactic test improved from pre-test to post-test, but the increase was not statistically significant (P = 0.104). Excluding one participant with a post-test score of 5 (manual review showed that this participant did not complete the post-test), the mean score for residents (n = 6) improved from 16.8 to 18.8, whereas the mean score for AA students (n = 6) improved from 11.3 to 13.6. The overall mean score improved from 14.1 to 16.3, an overall improvement of 2.2 ± 3.3 (mean ± SD).
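For illustration, a paired pre-/post-test comparison of this kind might be computed as below, using hypothetical scores; the specific test used in the study is not restated here.

```python
# Minimal sketch: paired pre-/post-test comparison on hypothetical scores.
import numpy as np
from scipy.stats import ttest_rel

pre = np.array([12, 15, 14, 17, 11, 16, 13, 18, 10, 15, 14, 16])
post = np.array([14, 16, 17, 18, 12, 19, 14, 20, 11, 17, 16, 18])

t_stat, p = ttest_rel(post, pre)  # paired t-test on the differences
diff = post - pre
print(f"improvement = {diff.mean():.1f} ± {diff.std(ddof=1):.1f} "
      f"(mean ± SD), P = {p:.3f}")
```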
We used user alteration of the age and weight fields in the app as a proxy for patient care and clinical decision support. As shown in Fig. 4D, the performance of these calculations peaked in the morning.
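A time-of-day analysis such as that shown in Fig. 4D can be sketched by binning calculation events by hour; the event label and timestamps below are illustrative assumptions.

```python
# Minimal sketch: bin age/weight-calculation events by hour of day.
# The event label and timestamps are illustrative assumptions.
import pandas as pd

calc_events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2019-03-05 06:50", "2019-03-05 07:10", "2019-03-05 13:20",
    ]),
    "event": ["set_age_weight"] * 3,
})

# Tally events per hour; the tallest bin marks peak calculator use.
by_hour = calc_events["timestamp"].dt.hour.value_counts().sort_index()
print(by_hour)
```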
In-app clicks on drugs provided information about areas where trainees felt they needed additional information. These clicks focused primarily on drugs that are commonly used in anesthesia, such as succinylcholine and ondansetron (Table 3). We also collected information regarding which didactic materials were accessed, and when. Lectures tended to be accessed most frequently between 12 pm and 2 pm and between 4 pm and 5 pm (Fig. 4C). The didactic materials accessed most often (Table 4) included orientation materials and information on core topics (e.g., ‘Preoperative Evaluation of the Pediatric Patient’).
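A content ranking such as Table 3 reduces to counting clicks per item; a minimal sketch with invented counts follows (drug names beyond those mentioned above are also invented).

```python
# Minimal sketch: rank drug entries by click count. Counts, and any names
# beyond those mentioned in the text, are invented for illustration.
import pandas as pd

drug_clicks = pd.Series([
    "succinylcholine", "ondansetron", "succinylcholine",
    "propofol", "ondansetron", "succinylcholine",
])

print(drug_clicks.value_counts().head(10))  # most-clicked drugs first
```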
We assessed general attitudes towards various educational modalities for instruction in the field of anesthesia (Table S4, top). We also assessed whether participants felt that their use of the app was viewed as distracting by staff or surgeons (Table S4, bottom).
Discussion
We found that, while on clinical rotation, AA students used a mobile app providing both clinical decision support and educational resources customized for pediatric anesthesia more frequently than anesthesiology residents did. In both groups, app usage declined notably by the end of the rotation. On a daily basis, use of the app occurred primarily at the beginning of the workday (when participants were likely preparing for the day’s cases); the most frequently accessed educational resources related to clinical care (e.g., protocols) and core didactics (e.g., preoperative evaluation).
Detailed analytics related to the use of educational apps have not been previously described. Where usage data have been collected, collection has typically occurred through self-reports or other forms of retrospective survey [5,10]. Therefore, the phenomenon of attrition has not been reported, nor have details regarding the specific timing or content accessed. Consistent with a recent review of the use of mobile devices by health professions students, we found that participants used our app both for clinical decision support and to support self-directed learning [11]. Our survey of attitudes towards various educational modalities also supported previous findings by Ellaway and colleagues; specifically, even with the increased ease of access at the point of care, the use of apps on mobile devices may augment conventional learning approaches in medical education but will likely not replace them [10,17].
AAs are a group of anesthesia providers unique to the United States; Table 5 summarizes the educational requirements, training experience, and healthcare system roles of AAs, certified registered nurse anesthetists, and physician anesthesiologists in the United States. This context is important because the difference in the rate of app use between residents and AA students has several potential explanations. First, at the time of the rotation, AA trainees had objectively less medical training than their resident counterparts, which could lead to greater use of adjuncts for decision support in a new environment such as a pediatric anesthesia rotation. Second, the AA students were younger (P = 0.075), which may also, in part, explain the greater app use in this group; however, this explanation is weakened by the similar levels of comfort with mobile technology reported by both groups (Table 2; P = 0.613).
The contrast between the frequency of app use, which was higher amongst AA students, and the duration of time spent in the app, which was longer amongst residents, was also an interesting observation. The duration of time spent in the app, on the order of minutes, is consistent with known app-use patterns [18]. The net result was that residents exhibited nearly double the total exposure to the app (236 vs 112 minutes per trainee) compared with AA students. We speculate that residents used the app less frequently but for deeper investigation of specific topics; however, more work is needed to truly understand these differences.
An overall decline in app use after the first week of the rotation was not completely unexpected, and the fact that many devices were returned with discharged batteries is consistent with this decline. The decline may reflect the accumulation of knowledge by trainees during the rotation, with a decreasing need to access the app for reference. Alternatively, it may reflect a preference for accessing resources on personal devices rather than tracking and charging a separate device. Learners’ preference for personal devices has been reported by others; further, the use of technological devices in the workplace is influenced by ease of use, speed of access, and reliability [10].
Although the majority of app use occurred during the day, participants also used the app late in the evening and overnight, suggesting a role for the app in providing just-in-time support for on-call and emergency cases. Our data also suggest that participants used nonclinical downtime to access didactic materials. These resources were accessed primarily before the start of the first case, at midday (e.g., during the lunch break), and in the early evening (after scheduled cases were complete). Survey results suggest that trainees are not completely beholden to digital learning: participants rated a variety of other educational resources more highly, including traditional lectures and intraoperative teaching.
Given the importance of vigilance to the safe practice of anesthesia, we assessed perceptions of tablet usage as a distraction in the operating room. It is concerning that any trainee, let alone one-third of our participants, believed that operating room staff and surgeons considered their use of the device distracting. Although recent literature supports the concept that the use of mobile devices tends to occur during times of low cognitive load (e.g., during the maintenance phase of the anesthetic) [19], the use of these devices may adversely affect how trainees are perceived by our perioperative colleagues.
This study has several limitations. A major limitation was the low post-rotation survey response rate, which may have introduced sampling bias and which may have resulted from several factors. From a process perspective, participants were reminded within the app to complete the post-rotation survey one week prior to completing their rotation. However, completion of the survey was voluntary and, to preserve participant anonymity, we included no mechanism to identify who had completed their surveys; unfortunately, this limited our ability to follow up. This voluntary aspect of the survey, likely combined with the aforementioned decline in app usage, may explain the observed attrition. We plan to investigate whether post-rotation survey participation improves if the app is available on a participant’s own device rather than on a provided tablet computer, or if survey completion is made mandatory.
Several other limitations are noteworthy. First, and most significantly, the use of the mobile app was added to the existing educational curriculum for the pediatric anesthesiology rotation at our institution. As we did not collect pre-rotation and post-rotation didactic test data from a control group of non-users, we cannot draw conclusions about the effectiveness of the app beyond our within-group comparisons. Given the complexities inherent in analyzing educational interventions and learning preferences, however, the inclusion of a control group would not necessarily provide widely generalizable data [12,20]. Second, the study involved a relatively small number of participants, which could represent another source of sampling bias. Third, the results related to patterns of use apply only to the app studied and should not be extrapolated to other medical education apps.
In conclusion, this study employed quantifiable data from in-app analytics to characterize the usage patterns of a pediatric anesthesia mobile app; notably, it showed a greater frequency of use by AA students compared with anesthesiology residents. Further research is needed to determine trainees’ preferred device, user experience, and content for the full range of clinical and nonclinical purposes. More work is also warranted to establish whether the use of mobile apps in the operating room is distracting and presents risks to patient care.