Globally, roughly half of the world’s population lacks access to essential health care due to the shortage of trained human resources in healthcare.1 In Low and Middle Income Country (LMIC) settings, one way to address this critical gap is to train community health workers. Community health workers, often referred to as Frontline Workers (FLW), can be trained in disease prevention and health promotion.2 These FLW play an important role in community mobilization, outreach activities, and service delivery at the last mile. Major donor agencies such as the United States Agency for International Development (USAID) support community health projects focusing on marginalized and vulnerable groups in countries in Asia and Africa.3
In rural India, the number of internet users has been increasing, with year-on-year growth of 35% in 2018, and smartphone penetration rose from 9% in 2015 to 25% in 2018. The smartphone base in India is expected to grow further to about 820 million in the coming two years.4 India also has some of the cheapest internet data plans in the world; the low cost of data makes access to the internet more affordable.5 From the point of view of public health practitioners and policymakers, it is therefore imperative to use this vast network of mobile phone users to disseminate key health information and impact more lives.
There have been several mobile platforms for clinical decision support; however, there is a dearth of evidence and understanding on the use of mobile phones for training FLW in LMICs on the thematic areas of their work. The current evidence suggests that most mobile-based training for FLW has targeted topics such as workforce management and on-the-job aids and tools. Only eight published studies cover formal FLW training courses in thematic areas of health, as against 44 on workforce management.3
The current evidence base shows a good degree of progress towards creating and using more interactive and engaging pedagogical approaches across several areas of global health, but significant work is needed to improve the quality of mobile-based training programs.3
USAID-funded Project Samvad, implemented by Digital Green, used mobile phone-based virtual training to enhance the skills and knowledge of frontline workers on family planning. This descriptive study addresses the feasibility of virtual training at scale, the engagement of the participants with the content, and the quality of the learning. Furthermore, we explore how the enrollment, rollout, and promotion of mobile training programs can be made simpler and more effective from the participants’ perspectives. We also explore issues such as course design, content creation, and community-level support systems that may help participants in the process of learning, and the factors that program managers should consider for scaling up virtual training programs.
METHODS
This virtual training experiment under Project Samvad was implemented in the Uttarakhand, Bihar, Jharkhand, Chhattisgarh, Odisha, and Assam states of India. In these six states, three languages are spoken most frequently: Hindi is spoken in Uttarakhand, Bihar, Jharkhand, and Chhattisgarh with some local variations, Odia is spoken in Odisha, and Assamese is spoken in Assam. Modern family planning methods still need to be promoted in these states. The current use of any modern method of family planning among currently married women aged 15–49 years is 44.4% in Bihar and 45.3% in Assam (National Family Health Survey 2019-20).6 In the states of Uttarakhand, Jharkhand, Odisha, and Chhattisgarh it is 49.3%, 37.5%, 45.4%, and 54.5% respectively (National Family Health Survey 2015-16).7 The rates of contraceptive use in many of these states are lower than the national average of 47.8% for the year 2015-16; the national average for the year 2019-20 is not available.
The method employed in this study is divided into two broad categories: (1) course development, promotion, and rollout processes, and (2) course evaluation. Both qualitative and quantitative approaches were used to collect and analyze data. Participation in the course and in the study was voluntary. All FLW appointed by the partner organizations who owned or had access to smartphones and the internet were invited to take the course and participate in the study. Further details of the steps followed are explained in the following sections.
Course development, promotion, and rollout
The methodology for capturing the processes that we undertook to develop and roll out the virtual training course comprised small group interviews, internal discovery workshops, notes from direct process observations, and a review and analysis of existing program documents.
Course description
We developed a four-module course. Each module could be completed in 15 to 20 minutes, depending on the participant’s previous knowledge. The total duration of the course is roughly one hour of learning, including the pre-test and post-test. The details of the course can be found in Table 1.
Online course platform
We used Learn.ink to host the training material. The Learn.ink platform is designed to rapidly distribute this training by providing access to a unique URL that can be shared with end-users on any digital communication channel, e.g. blast SMS, WhatsApp groups, and much more. Once users engage with the training, analytics relating to their usage and learning performance are available in real-time within the online platform. Improvement in learning outcomes can be assessed by testing knowledge before and after exposure to the learning material.
Course development process
We developed a template to map the course content and its flow. The template comprised details such as the module name, module stages, the name of a specific stage, learning objectives, learning content, the conversation, and the lesson card type, i.e. the way a specific piece of content would be delivered. The types of lesson cards available on the platform were multiple-choice questions, polls, a video, an image or slide, or simply a conversation to deliver the content to the participants. After this template was populated for the course, it was reviewed to check the flow of the course and to ensure that the lesson cards chosen delivered the content and the stages of the course effectively. After the review, the template was used to build the course on the platform directly by copying the content from the Excel sheet to the platform and applying the tools and lesson cards. The first draft of the course was developed in a week. After the course upload was completed, a preview of the course was generated and tested with the field team. The team’s feedback included improving the general flow of the content, giving context to the participants at the beginning of each module, using more images and videos, revising the sequence of the questions, building the conversation around the options that learners may choose, and ensuring that the pre-test and post-test had been defined for each module. After all of these changes, the test version was rechecked and the course was then translated into Hindi, Odia, and Assamese for testing and rollout. The translated versions of the course were also tested, and changes were made before rolling them out in the field.
Course enrollment and roll-out processes
The process of enrolling in and rolling out the course was mostly uniform across the project locations, with some state-specific variations. Figure 1 shows this process in general. The project team at the state level discussed the course and the processes involved in completing it with state partners to secure their buy-in, and shared the course link and a video tutorial with the partners. The partner staff took the test version of the course. The next steps in rolling out the course were to orient the block-level staff over the phone and then in a face-to-face meeting. In the state of Chhattisgarh, the district-level head of the health program issued a letter asking field functionaries to undertake the course. The block-level functionaries encouraged FLW to complete the course during their monthly meetings. To disseminate the course link to the participants, existing WhatsApp groups were used, and the message with the course link was shared over these groups. Video tutorials were also shared in these groups to show the participants how to register and navigate through the course. In the state of Bihar, the launch of the course was linked with the Family Planning campaign. The partners and FLW were informed that the course is free and does not take much time to complete. Regular follow-up took place in almost all the states to encourage completion of the course. In the state of Uttarakhand, handholding and on-the-ground support were also provided during field visits to help participants complete the course.
Course promotion and onboarding of the participants
Figure 2 shows the process we used for the promotion of the course. The course was promoted jointly by Digital Green and state partners.
Course evaluation
The course-related data on enrollment, interaction, course participation, and course completion were downloaded from the server of the training platform in Excel format, and descriptive statistics were calculated. The qualitative data were collected using a monitoring questionnaire from the course implementers, representatives of partner organizations, and course participants. The qualitative data were collected in face-to-face meetings and core team review meetings. Responses were received from ten Digital Green staff who implemented and rolled out the course in the field, six partner staff from the six implementing states, and 15 course participants (five each from those who completed the course, those who partially completed it, and those who did not complete it). In total, 90 course participants were approached for interviews across the six states. Content analysis was done using an Excel sheet to identify common themes, including topics, ideas, and patterns of responses that were repeated. The course analytics are divided into two major categories: the participants’ profile, and the quality and engagement of the participants in the process of learning.
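As an illustration of how such descriptive statistics can be derived from the platform export, the sketch below uses Python with pandas. It is a minimal sketch under stated assumptions: the file name and the column names (language, completed, modules_completed, total_modules, baseline_pct, endline_pct) are hypothetical placeholders, not the actual Learn.ink export fields.

```python
import pandas as pd

# Hypothetical export file and column names; the real Learn.ink export may differ.
df = pd.read_excel("course_analytics_export.xlsx")

# 'completed' is assumed to be coded 0/1 (finished all modules or not).
summary = df.groupby("language").agg(
    registered=("completed", "size"),              # participants registered per language version
    completion_rate=("completed", "mean"),         # share who completed the whole course
    avg_baseline_accuracy=("baseline_pct", "mean"),
    avg_endline_accuracy=("endline_pct", "mean"),
)

# Average course progress: mean share of modules completed per participant.
summary["avg_course_progress"] = (
    df.assign(progress=df["modules_completed"] / df["total_modules"])
      .groupby("language")["progress"]
      .mean()
)

# Gain in question accuracy from baseline to endline, in percentage points.
summary["accuracy_gain_pct_points"] = (
    summary["avg_endline_accuracy"] - summary["avg_baseline_accuracy"]
)

print(summary.round(2))
```

In practice, the same quantities were computed from the Excel export; the sketch only shows one reproducible way to obtain them per language version.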
Limitations
One of the limitations of the study is that only those FLW who had access to smartphones and the internet participated in the course. Therefore, we do not know about the usefulness of the online training for FLW who do not have access to smartphones or the internet. Participation in the course was voluntary; only a fraction of the FLW who own smartphones participated, and we do not know the reasons why the others chose not to participate.
RESULTS
Course participants
The course participants who enrolled on the platform and took the course included a variety of FLW working with nongovernment organizations and state partners. Most of the course participants were women. A total of 1211 participants registered on the online learning platform over a period of two months (January to February 2021); of these, 86% were female FLW and the remaining 14% were male FLW. Most FLW were in the 36 to 45 years age group, followed by the 26 to 35 years age group (Figure 3).
Quality of learning and engagement in the virtual training
The quality of learning and the engagement of the participants in the virtual training were evaluated using question accuracy, successful course completion, average course progress, and average endline accuracy. These indicators are defined below.
- Question accuracy – the average percentage of questions answered correctly before (baseline) and after (endline) the lessons.
- Course completion – the total number of unique users who completed the course in the timeframe.
- Average course progress – the average proportion of the course completed by users.
- Average endline accuracy – the average percentage of questions answered correctly when asked after covering all the modules.
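To make these definitions concrete, they can be written as simple proportions. The notation below is ours and is meant only as an illustrative formalization of the indicators reported in the figures; course completion is shown as the rate used in the Results.

```latex
% Illustrative formalization of the indicators (notation is ours, not the platform's).
\[
\text{Question accuracy} \;=\; \frac{\text{questions answered correctly}}{\text{questions answered}} \times 100
\qquad \text{(computed separately at baseline and at endline)}
\]
\[
\text{Course completion rate} \;=\; \frac{\text{unique users who completed all modules}}{\text{unique users who registered}} \times 100
\]
\[
\text{Average course progress} \;=\; \frac{1}{N}\sum_{i=1}^{N}\frac{\text{modules completed by participant } i}{\text{total modules}} \times 100
\]
\[
\text{Average endline accuracy} \;=\; \frac{1}{N}\sum_{i=1}^{N}\frac{\text{endline questions answered correctly by participant } i}{\text{endline questions answered by participant } i} \times 100
\]
```

Here, N denotes the number of participants registered on the platform for a given language version.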
Figure 4 (plates A-C) shows an increase in question accuracy for all four modules across all three versions of the family planning course, which the program team developed considering the geographical spread of the project. The range of the increase in question accuracy from baseline to endline across the four modules was 10 percentage points for the Assamese version, 3 percentage points for the Odia version, and 9 percentage points for the Hindi version of the course. The average increase in question accuracy from baseline to endline across all four modules and all three versions of the family planning course was 14.8 percentage points. At baseline, participants answered roughly 52% to 73% of the questions correctly across the four modules in the three versions of the course.
Out of the six states where the project was implemented, four were Hindi-speaking states and the remaining two were Assamese- and Odia-speaking. Hence, the majority of participants enrolled in and completed the Hindi version of the course. The course completion rate reflects the percentage of participants who completed all the exercises and modules out of those who registered for the course. Figure 5 shows course completion by language.
Average course progress reflects the progress made by a learner at the individual level and hence shows individual engagement with the content. It was calculated as the average proportion of course modules completed by the participants who registered on the portal. The average course progress for the Hindi version of the course was 87%. This means that, on average, participants completed 87% of the course content, which is indeed high content coverage. The average course progress for the Assamese and Odia versions of the course was 77% (Figure 6).
Another indicator of the quality of learning that we used was average endline accuracy (Figure 7). Average endline accuracy is the percentage of correct answers given by the participants out of the total number of questions answered at the end of the course. The average endline accuracy was 78% for the Hindi (HN) version of the course, followed by 74% for the Odia (OD) version and 71% for the Assamese (AS) version.
The opportunity created
Using a training platform gives health systems an opportunity to create more such learning courses that suit the requirements of frontline workers. The learning platform had merits such as a simple process for registering and navigating the course content. The course site was easy to access and could be shared with learners over social media platforms such as WhatsApp. The course used a conversational style of communication with the participants that made it easy to understand, despite some challenges faced by the participants, which are discussed further in the results section. The content allowed the participants to share what they knew and validated it, which boosted participants’ confidence and motivated them to pursue the course further. We feel that more such micro-modules can be created jointly with the partners to enhance the knowledge and skills of frontline workers.
Areas of improvement
We received feedback from the partners and learners that the course could be further simplified to help learners who have only basic educational qualifications. Although the course used several images and slides to provide information, more images could be used. The partners suggested having more video content for easy and simple delivery of the course content. They further added that voice messages may also help to deliver key messages related to the course topics.
Some other areas of improvement include the design of the registration page, which should select the country code automatically and auto-read the OTP, as some of the participants experienced difficulty in registration. The design of the course pages should also allow participants to see the score they obtained in each session, which would give them a sense of satisfaction and accomplishment as they move forward in the course.
Engagement with the partners
Interestingly, the concerns from partners related to the topic of the course and to issues such as the design of the course certificates and the use of their logos. Partners preferred topics more closely linked to their areas of work, such as nutrition or child health. During the rollout of the course, partners’ frontline functionaries were engaged in COVID-19 prevention activities such as vaccination and had less time to take a course on family planning. Some of the other concerns that partners shared related to the cost of internet use and bandwidth in the areas where the FLW were working. Despite cheap internet data, the cost was an inhibiting factor for many of the FLW. NGO partners in the state of Odisha did not share any specific concerns about the roll-out of the course; the NGO partner mentioned that the experience of using digital tools during the COVID-19 pandemic helped them to roll out the course. In the state of Uttarakhand, the National Health Mission was also conducting online training, so this opportunity was used to orient the FLW and inform them about the online training opportunity.
Partners’ opinion
The state partners provided feedback that such online training programs can supplement their existing training programs by creating micro-modules for refresher training, upgrading existing modules, and assessing the skill sets of the FLW by sending quizzes and short questions over time. They noted that these online courses can be embedded in their training plans and used for continuous skills enhancement, complementing the face-to-face training of the FLW.
Partners noted that a variety of short courses can be created on health, nutrition, and water and sanitation issues that are hyperlocal in content and help address the region-specific learning needs of the participants. Moving forward, it will be important to use simple language, easy expressions, and terms used by local people. Further, voice-overs on the slides to explain certain terms and concepts may also be helpful for the learners.
Partners showed interest in scaling and institutionalizing such a learning and capacity-building approach by jointly working on course content development and aligning such initiatives with the priority areas of the partners.
Program factors and participants’ engagement in learning
Table 2 delineates the factors that were found to be linked to the varying degrees of participants’ engagement with the platform and to course completion, which is defined as completing all the modules and participating in all the course exercises. We observed that those participants who completed the course had smartphones or family phones, watched the tutorial videos, and sought remote or face-to-face support. These participants also experienced technology-related challenges and issues, but they successfully resolved them by reaching out to family and friends for support. In contrast, those who left the course midway or could not start it did not report seeking or getting support to resolve such issues. We feel that a community-level support system for resolving technology issues should be considered when developing and rolling out courses. Those who completed the course also reported peer recognition and the certificate of course completion as their motivation, whereas those who could not complete the course did not mention any motivation. It follows that, beyond course certification, other sources of motivation should be identified for the participants and used.
The participants who did not complete the course mentioned that they did not have a personal phone, did not get support from their supervisor, had some technology-related issues, found the system difficult, or could not prioritize taking up the course. We feel that those who could not complete the course may be the ones who need training more than others. It would therefore be helpful for program managers to dig deeper into the issues and motivations that these workers have and to design the program in a manner that enables them to take up such learning opportunities.
Key technical challenges experienced
Some of the technology-related issues experienced by the FLW included an unsupported browser, an unknown language appearing on screen, not receiving the One Time Password (OTP) on time, a course interface in English, and outdated phone software. These issues were discussed over the phone by block-level staff or the project team, and some of them were resolved. Another technology-related issue experienced by the FLW was automatic language change: the regional language set in the browser translated the page and changed the meaning of the text.
Some of the participants also faced difficulties during registration, despite watching the video tutorial. For many of the FLW, access to a smartphone at the time of starting the course was an issue as they shared the device with family members when at home, and as a result they took a couple of days to complete the course. A few participants had challenges understanding the questions asked in the course and had to take support from fellow participants to understand them. In some remote locations, the FLW experienced network issues that affected their ability to complete the course. Some FLW, including those in the core Hindi-speaking states, found it difficult to understand the Hindi version of the course and its technical content, requiring support from other participants and colleagues; this is probably because of the different dialects spoken in the different Hindi-speaking states.
DISCUSSION
The online family planning course was developed to deliver the content in a conversational style. Several pedagogical tools and techniques, such as polls, quizzes, videos, images, and questions, were used to engage the participants in the process of learning. The conversational style of content delivery enhanced learners’ attention and reduced cognitive load to improve the learning outcomes.8 This may have helped improve the post-test scores, which were roughly 15 percentage points higher than the pre-test scores.
For rolling out the course, the partners’ staff were engaged at different levels. They were oriented and then supported the initiative at the ground level. In some of the states, the course rollout was linked with health campaigns that matched the theme of the course. This may have helped in the promotion and quick uptake of the course.9
Some of the challenges that the participants experienced related to the language, the expressions in the conversation, and the terms used in the technical content, which the participants found hard to understand. One of the reasons for this may be the different dialects spoken in different pockets of the project locations. Hence, moving forward, it would be useful to keep the language simple and use the local terms that FLW use in community settings. This becomes more important when the participants interact with self-learning modules.10 We observed that the participants took support from other participants and colleagues to respond to some of the questions that they found hard to understand. We believe that this presented an opportunity for the participants to discuss the technical content among themselves, which in turn encouraged peer-to-peer support and learning. Designing similar courses at scale should consider this observation and build on it to promote peer-to-peer support in the process of learning. Some of the participants experienced technology-related challenges, but they took support from their peers to resolve them. For further scale-up of such interventions, it would be very useful to envisage a peer-to-peer, community-led support system at the community level to help participants resolve technology issues.9
The partners found the course useful, and their concerns related to the cost of internet data and internet bandwidth in some of the areas. Some of the project partners were supporting FLW with internet data packages, but in some areas this support was not available and FLW had to use their own internet data packages to access the course. Going forward, and for further scale-up of the intervention, support for internet packages11 would be helpful, as the cost may be a barrier for many FLW who would like to take the course but would not be able12 to do so because of the cost of internet data plans. Another important point relates to embedding the online courses in the existing training plans prepared by the partners. Any such virtual training should complement the existing capacity-building plans of the partner organizations by creating need-based, region-specific, and hyperlocal course content. Such virtual training plans should be aligned with the partners’ requirements and face-to-face training plans and should fit into the workflow of the health workers for easy absorption of the initiative into the system.13 Virtual training using micro-modules, such as the one we used, can supplement and complement the large-scale training that partners conduct for frontline workers.14
We learned that the course certification15 was a motivating factor for many FLW. Moving forward, further scale-up of the program should also consider learning about other sources of motivation for the frontline workers and using them in designing virtual training opportunities.
A high proportion of the course participants completed the course, but there remains a set of participants who either completed the course partially or did not complete it at all. Those who could not complete the course may be those who need training more than others. It would be helpful for program managers to dig deeper into the issues and practical challenges that these workers may have. Some of the issues that we identified include access to a smartphone, busy work schedules, technology-related challenges, and the handholding support needed to understand the modules and navigate the course to completion.16 Virtual learning initiatives should consider all such practical issues and address them programmatically.
The program data show that 86% of the participants who enrolled in the course were female. This figure reflects that the course successfully reached a largely female group of frontline workers engaging with the communities. But it may also be true that the course reached only those who own a smartphone, or have access to one along with an internet package, and not all the FLW who wanted to use such learning opportunities.
We developed three versions of the same course in three different languages. The overall average course completion rate for all three versions of the course was 82% of those who registered on the platform, reflecting that a large proportion of the participants completed the course. This course completion rate is roughly six times higher than the reported completion rate17 (13%) of Massive Open Online Courses (MOOCs). Other recently documented evidence also suggests a low completion rate for MOOCs.18 ‘Inside Higher Ed’ reports that in the year 2017-18 only 3.13% of all MOOC participants completed their courses.19 Compared with that MOOC completion rate, the completion rate of Project Samvad’s short course on family planning was roughly 26 times higher.
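As a simple worked comparison using the rounded figures cited above (the exact ratios depend on the underlying unrounded rates):

```latex
\[
\frac{82\%}{13\%} \approx 6.3
\qquad\qquad
\frac{82\%}{3.13\%} \approx 26
\]
```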
For the promotion of the course and to encourage the participants to complete it, several initiatives were taken under the project. Some of the most effective ones, we believe, were the promotional messages sent over WhatsApp groups, the tutorial videos, and remote support over mobile phones. The short duration of the course may also be one of the reasons that helped to improve the course completion rate. Audio testimonials, supervisors’ support, and sharing the course completion certificates15 in the WhatsApp groups were also found effective in motivating the course participants. We also found that simplifying the registration process on the learning platform by keeping only a few data points in the form may help the users.
Successful completion of the course is not enough in itself unless the engagement with the content is meaningful and helps in improving the knowledge and skills of the participants.20 For every module, we had baseline and endline questions. We found that, across all the modules, the average response accuracy at endline was around 15 percentage points higher than baseline accuracy. It is also important to note that baseline accuracy for one of the modules in the Odia version of the course was as high as 73%. This means that the course helped the participants gain knowledge, though in certain cases they already had a good level of knowledge on the subject. We also assessed average endline accuracy, the percentage of correct answers given by the participants out of the total number of questions answered at the end of the course; for the Hindi version of the course, it was 78%. All of this put together suggests that the course helped the participants gain knowledge. Another indicator of training quality that we used was average course progress, which reflects the progress made by an individual learner and hence shows individual-level engagement with the course content; it is calculated as the average proportion of course modules completed by the participants. For the Hindi version of the course, it was 87%. This means that the participants engaged with the content at a very high level, and we deduce that such conversational-style online training can help improve the participants’ engagement with the content and, in turn, their knowledge gain.21
CONCLUSIONS
The experiment under Project Samvad demonstrates that a mobile-based short online course appears feasible22 for virtually training FLW at scale, as both the course completion rate and the participants’ engagement with the content were found to be high. The experiment demonstrates that a high proportion of the FLW can complete the course, and high-quality engagement with the content can be achieved, by carefully designing the content and by using a variety of tools to effectively engage participants in the process of learning. It is important to underline that any such initiative needs to consider the challenges and issues associated with the development, design, and rollout of the initiative. In this experiment, we could only capture the insights and learning from those participants who own or have access to smartphones and the internet. Further study should explore the perspectives of those FLW who do not own devices, so as to develop a comprehensive plan and policy on virtual training for the FLW. In the case of Project Samvad, we attempted to understand the reasons why some of the participants could not complete the course; their responses related to lack of technical support, issues of internet connectivity, and other challenges such as work schedules, work-related engagements, and ownership of devices.12,23 Project Samvad took cognizance of such issues and attempted to respond to them programmatically, for example by supporting the participants through their supervisors, though we could not address all of them.
Moving forward, and to further scale and replicate an initiative of a similar nature, a virtual training program should carefully consider understanding partners’ training requirements, co-creating hyperlocal content, establishing community and peer-to-peer support systems, and addressing issues concerning mobile devices and internet packages. It should aim to complement and supplement the existing training programs that the partners have developed over the years, and it may be seen as contributing to the continuing professional development of the health workforce.
Acknowledgements
Sudha Jha, Jagdish Rana, Shams Tariq, Anila Heritu Samuel, Smarika Chandrakar, Satyapriya Sahu, and Azmol Hussain rolled out the course and collected the monitoring data. Erica Arya contributed to the assessment questions.
Funding
We duly acknowledge the funding from USAID via Agreement Number AID-386-A-15-00008 to implement Project Samvad.
Authorship contributions
FA conceptualized the manuscript, wrote the first draft, led the process of coordination and review with other authors, ensured that inputs and feedback are included, and finalized the manuscript for publication. He also wrote the course content and developed the flow of the online course and assessment questions. GB, AW, VP, and SK reviewed the manuscript and shared their feedback. ST did a final edit and review.
Competing interests
The authors completed the Unified Competing Interest form at www.icmje.org/coi_disclosure.pdf and declare no conflicts of interest.
Correspondence to:
Farhad Ali, Master of Science in Public Health, Digital Green, Avanta Business Centre, Office no. 1208, 12th Floor, Ambadeep Building, KG Marg, Connaught Place, New Delhi 110001. [email protected]