Year : 2018 | Volume : 43 | Issue : 4 | Page : 260-265
Development of the sexually transmitted infection service evaluation tool: Use of Delphi approach
Bansari L Chawada1, Jayendrakumar K Kosambiya2, Vipul P Chaudhari2, Kristen J Wells3
1 Department of Community Medicine, Medical College, Baroda, Vadodara, Gujarat, India
2 Department of Community Medicine, Government Medical College, Surat, Gujarat, India
3 Department of Psychology, San Diego State University, San Diego, California, USA
Date of Submission: 01-Feb-2018
Date of Acceptance: 14-Nov-2018
Date of Web Publication: 21-Dec-2018
Dr. Jayendrakumar K Kosambiya
Department of Community Medicine, Government Medical College, Majura Gate, Surat - 395 001, Gujarat
Source of Support: None, Conflict of Interest: None
Abstract
Background: With the wide variation in sexually transmitted infection (STI) services, it is a challenge to devise strategies for ensuring effective service delivery. The objectives of this study were to develop a standard tool for STI services evaluation and to use the Delphi method to derive a weighted factor for each parameter of the tool. Methods: A review of existing guidelines for quality STI care services was conducted, and parameters were selected to form the content of the measurement tool. The Delphi technique was used to derive a weighted factor for each STI service delivery parameter using a Likert scale of 1–7. A heterogeneous group of 18 anonymous experts was invited to rate the parameters. All responses were collected online. A Cronbach's alpha level of ≥0.8 was selected to define consensus among the experts. Results: The STI service evaluation tool (SSET) was created, comprising ten parameters to evaluate the performance of a clinic against standard national guidelines. The SSET was standardized by deriving a weighted factor for each evaluation parameter. Three rounds of Delphi were required to achieve consistency. Response rates for the Delphi rounds were 77.78%, 100%, and 78.57%, respectively. Each parameter (workforce, materials, consumables, consultation, counseling, coverage, referral, records, information education and communication, and supervision) was assigned a weighted factor derived by converting the final score onto a total standard score of 100. Conclusion: The Delphi method represents a novel approach to developing standardized tools for evaluating the performance of service delivery.
Keywords: Delphi method, program evaluation, sexually transmitted infection clinic, standardization
How to cite this article:
Chawada BL, Kosambiya JK, Chaudhari VP, Wells KJ. Development of the sexually transmitted infection service evaluation tool: Use of Delphi approach. Indian J Community Med 2018;43:260-5
How to cite this URL:
Chawada BL, Kosambiya JK, Chaudhari VP, Wells KJ. Development of the sexually transmitted infection service evaluation tool: Use of Delphi approach. Indian J Community Med [serial online] 2018 [cited 2022 May 28];43:260-5. Available from: https://www.ijcm.org.in/text.asp?2018/43/4/260/248190
Introduction
Sexually transmitted infections (STIs) are a group of communicable diseases transmitted predominantly by sexual contact. The World Health Organization's 1999 global estimates state that 340 million new cases of STIs occurred worldwide, with the largest number of new infections (151 million) occurring in South and Southeast Asia. Various national surveys document the prevalence of STI-related symptoms as high as 11%–17% among women and 5% among men, along with poor treatment-seeking behavior. STIs are among the top ten causes of healthy life lost in young adults aged 15–44 years.
Across India, there is large variation in STI service delivery. STIs are diagnosed and treated in a network of clinics, including National AIDS Control Organization (NACO)-affiliated clinics (which are stand-alone STI clinics), facility-based clinics run at public health facilities, general practitioners, private practitioners, and agencies implementing targeted interventions among high-risk populations. Broadly, these services can be considered as public (government affiliated) and private. Existing NACO operational guidelines for assessing quality STI/RTI services describe the basic components of high-quality STI/RTI care, including counseling, history assessment, diagnosis with examination, blood reports, treatment, and follow-up of all clients. Evaluation of STI services is usually done as a subjective assessment covering a few priority parameters. Studies have described the use of existing NACO guidelines for assessing the quality of STI services, but such evaluation reports are often not sufficient to monitor the overall progress of a clinic's services. Although standard guidelines are available, there is no standard tool that can be replicated to assess all components of STI services. The Delphi technique is one of the most widely used methods to form and standardize new policies with the help of a diverse group of experts. Based on the assumption that an organization's performance depends on individual task performance, the performance of an STI clinic would depend on various subparameters. Evaluation and feedback help strengthen services by identifying gaps in service delivery. To allocate limited resources effectively, each health-care clinic must be evaluated using a standard method; a standardized tool that can be applied at any STI service delivery clinic is therefore necessary for reliable evaluation.
This study describes the development of a standardized evaluation tool to assess STI service across health centers. In this study, experts discussed the available NACO guidelines describing STI services, and from these guidelines, parameters of assessment were identified. It was assumed that each parameter might not have equal priority in the evaluation of the clinic performance; hence, experts who were familiar with the provision of STI services were engaged using a Delphi approach to prepare the assessment tool. The objectives of this study were to develop a standard tool for STI services evaluation and use a Delphi method to derive a weighted factor for each parameter of the tool.
Methods
The STI service evaluation tool (SSET) was developed to serve as an objective evaluation tool for STI clinics [Figure 1]. NACO recommends direct observation of clinics, as well as interviews with personnel, to obtain information regarding the clinic's procedures and resources. The study authors reviewed the NACO guidelines and organized them into 10 parameters. Opinions from experts were obtained to develop these 10 parameters based on their importance in STI services. The evaluation tool with 10 parameters was pilot tested at two different public clinics run by a corporation. The 10 parameters span three domains: input (workforce, materials, consumables, information education and communication [IEC], and supervision), process (counseling and consultation), and output (records, referral, and coverage) [Table 1].
Figure 1: Sexually transmitted infection service evaluation tool development workflow
When evaluating a clinic with the SSET, the evaluator scores all 10 parameters by assessing their defined subcomponents. The evaluator assigns a unit score of “one” to each subcomponent after verifying its quality, and the subcomponent scores are then summed to derive a parameter score. Because not all parameters have equal importance in evaluating clinic performance, standardization is required to derive scores on the SSET; the weights for standardization are derived from the mode ratings of importance obtained through the Delphi method. The study adopted the approach of a “Decision” Delphi. The features of anonymity, iteration with controlled feedback, statistical group response, and expert input can facilitate consensus and effective decision-making.
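The unit-score mechanics described above can be sketched briefly; the subcomponent names below are hypothetical, as the actual checklist items are given in [Table 1].

```python
def parameter_score(subcomponent_checks):
    """Sum unit scores: each subcomponent verified as adequate
    contributes a score of "one" to its parameter."""
    return sum(1 for verified in subcomponent_checks.values() if verified)

# Hypothetical subcomponents for the "workforce" parameter
workforce_checks = {
    "medical_officer_available": True,
    "counselor_available": True,
    "lab_technician_available": False,
}
workforce_score = parameter_score(workforce_checks)  # 2 of 3 verified
```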
To conduct the Delphi process, experts were sampled from across India and internationally. To be included as an expert, a person had to have more than 10 years' experience in public health management of STIs and program implementation. The sample size of experts for a Delphi, as described in standard guidelines, ranges between 7 and 21. To achieve a heterogeneous group, a list of 18 experts was prepared spanning public health experts in STIs, community medicine specialists, researchers, NACO officials, State AIDS Control Society officials, and officials from health agencies implementing STI programs. These 18 anonymous experts were invited by E-mail to take part in the study online. In the initial E-mail contact, all 18 experts were provided with a brief project synopsis and details on evaluating the importance of each parameter to be included in the SSET.
The Delphi questionnaire listed all 10 parameters, and experts were asked to rate each parameter using a Likert scale, which ranges from 1 to 7 (1 = unimportant, 2 = little important, 3 = mild important, 4 = moderately important, 5 = important, 6 = very important, and 7 = strongly important). A specific definition for each response on the scale was shared with experts before collecting responses. For example, the “Unimportant” response was defined as “Absence can be tolerated throughout the project period without interruption of the services and its presence cannot improve the standard of quality of services.” In addition, participants' written feedback was also collected as free text using the following item “Please provide your extra comments if any, in this space for the given parameters…” Using the Delphi process, multiple rounds of rating were conducted. In the first round, experts shared their first responses online for 10 parameters. In subsequent rounds, information compiled from previous round along with comments was shared anonymously with the experts. This process was repeated until consensus was reached.
All data were entered into SPSS version 19 for analysis (IBM Corp. Released 2010. IBM SPSS Statistics for Windows, Version 19.0. Armonk, NY: IBM Corp.). Data for each parameter were summarized descriptively (e.g., frequencies, mode, and range). Combining the ratings of all participating experts, Cronbach's alpha was calculated. This process was repeated for each Delphi round until a Cronbach's alpha of ≥0.8 was achieved, ensuring internal consistency.
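For transparency, the internal-consistency check can be reproduced outside SPSS. A minimal sketch of Cronbach's alpha, treating each parameter as an item and each expert as a respondent (the rating matrix below is illustrative, not study data):

```python
def cronbach_alpha(ratings):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
    of total scores), where k is the number of items (parameters).

    ratings: list of lists; each inner list holds one expert's ratings
    across all items (parameters as columns).
    """
    k = len(ratings[0])

    def pvar(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [pvar([row[j] for row in ratings]) for j in range(k)]
    total_var = pvar([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Three experts rating two parameters in perfect agreement -> alpha = 1.0
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Here a round's ratings would be re-entered each iteration until the alpha crosses the 0.8 threshold.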
Once consensus was reached, as indicated by Cronbach's alpha, the mode of the score on each parameter was used to generate a total weighted score. The process of weighting consisted of two steps. First, we converted the parameter mode score to a standard scale of 1–10 using the following formula:
Second, we summed the weighted parameters for a total score of 100 to create the SSET. The study was performed and reported according to the Standards for Quality Improvement Reporting Excellence (SQUIRE) guidelines for reporting on health-care improvement. No human intervention was involved in the current report. Ethical permission was obtained from the Institutional Ethical Committee. The Delphi expert participants will remain permanently anonymous.
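The published rescaling formula for step one appears as an image in the original article and is not reproduced here, so the proportional mapping below is an assumption; only the two-step structure (rescale the Round-3 mode onto a standard 1–10 scale, then normalize so the weights sum to 100) follows the text. The mode values used are hypothetical.

```python
def sset_weights(mode_scores, scale_max=7, target_total=100.0):
    """Two-step weighting for the SSET.

    Step 1 (exact form assumed): rescale each parameter's Delphi mode
    rating from the 1-7 Likert scale onto a standard 1-10 scale.
    Step 2: normalize the rescaled scores so they sum to target_total.
    """
    standard = {p: m * 10.0 / scale_max for p, m in mode_scores.items()}
    total = sum(standard.values())
    return {p: s * target_total / total for p, s in standard.items()}

# Hypothetical Round-3 mode ratings for four of the ten parameters
modes = {"workforce": 7, "materials": 7, "counseling": 5, "referral": 3}
weights = sset_weights(modes)  # the four weights sum to 100
```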
Results
Three rounds of Delphi were required to achieve consensus with consistency in the Delphi result (Cronbach's alpha of ≥0.8). Participants who did not respond in the initial rounds were considered nonrespondents and were not contacted in subsequent rounds. Of the 18 expert participants, the response rates achieved for the three Delphi rounds were 77.78%, 100%, and 78.57%, respectively. Cronbach's alpha in the first round (in which participating experts were not able to view each other's ratings) was 0.63. In the second and third rounds, participating experts were able to review each other's comments, and the internal consistency of the importance ratings increased to 0.66 in the second round and 0.83 in the third round. The Delphi process therefore concluded at the end of the third round.
The parameters in the input domain (workforce, materials, consumables, IEC, and supervision) showed relatively little variation across the rounds. The variation for workforce and consumables decreased across the rounds, with participating experts rating these parameters as more important. The materials parameter remained stable, with most participating experts rating it as “important to very important.” For the IEC parameter, there was considerable variation in expert participants' ratings in Round 1, with an increase in importance in Round 2. By Round 3, most expert participants rated this parameter as important or very important. Finally, for the supervision parameter, ratings in Round 1 ranged from moderately to strongly important. In Round 2, the variation increased. By Round 3, most participants rated supervision as mildly important to important [Table 2].
The two parameters in the process domain are counseling and consultation. In Round 1, counseling skills were rated by most participating experts as very important or strongly important. In Round 2, the variation of the ratings increased. By Round 3, most raters indicated that counseling skills were moderately important or important. In contrast, there was considerable variation in the expert participants' ratings of the importance of consultation in Round 1. This variation decreased, and the importance of the parameter increased in Rounds 2 and 3, with the majority of expert participants rating it as strongly important [Table 2].
There was considerable variation for the records parameter in Rounds 1 and 2. By Round 3, the majority of expert participants rated it as moderately important. Wide variation was also seen for the referral parameter. This variation decreased in Rounds 2 and 3, with the majority of experts rating it as mildly important in Round 3. For the coverage parameter, after wide variation in Rounds 1 and 2, the majority of experts rated it as mildly important in Round 3 [Table 2].
Using the mode scores from Round 3, weighted factors were calculated to provide a total score of 100 on the SSET. As shown in [Table 3], the workforce, materials, consumables, and consultation parameters carried the highest and equal weight of 14.14 in the evaluation of an STI clinic, followed by IEC (10.00), counseling (7.93), and records (7.93). The lowest weight of 5.86 was shared equally by referral, coverage, and supervision, indicating less importance compared to the other parameters. The score for the overall performance of a clinic was calculated by summing the weighted scores of all parameters.
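The text does not spell out how a parameter's raw score combines with its weighted factor; assuming each weight is scaled by the proportion of that parameter's subcomponents the evaluator verified, the overall score can be sketched as follows (weighted factors taken from [Table 3]; the achievement figures are hypothetical):

```python
# Weighted factors for the ten SSET parameters (sum to 100), per Table 3
WEIGHTS = {
    "workforce": 14.14, "materials": 14.14, "consumables": 14.14,
    "consultation": 14.14, "IEC": 10.00, "counseling": 7.93,
    "records": 7.93, "referral": 5.86, "coverage": 5.86,
    "supervision": 5.86,
}

def clinic_score(achieved, weights=WEIGHTS):
    """Overall SSET score on a 0-100 scale.

    achieved: parameter -> (subcomponents verified, subcomponents total).
    Assumption: each parameter contributes its weight scaled by the
    fraction of its subcomponents that the evaluator verified.
    """
    return sum(w * achieved[p][0] / achieved[p][1] for p, w in weights.items())

# A clinic meeting every subcomponent would score the full 100
perfect = {p: (1, 1) for p in WEIGHTS}
```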
Discussion and Conclusion
In the present study, we developed a 10-parameter STI service evaluation tool, which can be used to objectively evaluate the quality of services in any STI clinic. The ten parameters are workforce, materials, consumables, IEC, supervision, counseling, consultation, records, referral, and coverage. After developing this instrument based on the NACO guidelines, we used a Delphi method to standardize the tool using experts' ratings of the importance of each parameter. A study published by Ellerton et al. in 2011 used a similar Delphi process to develop and validate a pediatric cardiopulmonary physiotherapy discharge tool. To develop that tool, they asked experts to rank definitions for defined items, whereas for the SSET, we asked experts to rate the parameters on their importance. The Delphi technique has been used and extensively modified by researchers over the years to obtain opinions from people with expertise. The SSET includes performance indicators to evaluate STI services; the tool was formed and standardized through the collective-intelligence approach of the decision Delphi method. A similar Delphi method was used previously to develop indicators for evaluating the performance of laboratories and family practice by Zinn and Zalokowski and Barnsley et al., respectively.
This tool can be used to compare the quality of services delivered in various clinics, allowing us to understand best practices in higher quality clinics. The tool has numerous implications, including future research into the association between clinic quality and health outcomes, as well as allowing one to see the progress of a clinic over time.
How can sexually transmitted infection service evaluation tool be used and interpreted?
As the SSET is a standardized tool that includes all of the parameters specified as critical according to the NACO guidelines, none of the parameters should be missed while interpreting the result. This standard tool will make STI clinic evaluation more efficient, as an evaluator will only need to verify the resources and services being provided according to the parameters on the SSET. When the evaluation is complete, the evaluator will obtain a calculated weighted score for each parameter of the clinic. Scores on the SSET can be used for (a) establishing new STI clinics in places where services are not available, (b) further developing STI services where clinics are functioning, (c) creating a benchmark for quality STI services, (d) comparing a clinic's progress over time against the benchmark, (e) comparing one STI service to another service elsewhere, and (f) developing focused improvement strategies for specific weak parameters.
To the best of our knowledge, the development of a tool to evaluate the services provided by STI clinics using the Delphi method represents a novel approach and contribution. However, the study has some limitations. First, there is no formal literature available for sample size calculation for a Delphi; the SSET was developed with a sample size of 18 experts. The response rate was reduced in Rounds 1 and 3, and the influence of these missing data cannot be calculated. Second, as this tool was developed with the help of national NACO guidelines, the SSET does not incorporate any other guidelines.
This tool represents a first step in evaluating the quality of STI clinic services. Efforts to improve the delivery of STI-related health care can lead to meaningful reductions in STI prevalence and improved treatment. Reducing the burden of STIs is an important health-care goal for NACO and contributes ultimately to the Millennium Development Goals. In conclusion, the SSET is a rigorously designed tool to evaluate service provision in STI clinics. Future research is necessary to determine its utility in STI research and quality improvement efforts.
The authors are thankful to the anonymous Delphi expert participants for their valuable opinions. The authors acknowledge the technical support received from the Fogarty International Grant/USNIH: Grant # 1D43TW006793-01A2-AITRP.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
Philip PS, Benjamin AI, Sengupta P. Prevalence of symptoms suggestive of reproductive tract infections/sexually transmitted infections in women in an urban area of Ludhiana. Indian J Sex Transm Dis AIDS 2013;34:83-8.
Adler M, Cowan F, French P, Mitchell H, Richens J. ABC of Sexually Transmitted Infections. 5th ed. London: BMJ Publishing Group Ltd.; 2004. p. 2.
National AIDS Control Organisation. Guidelines on STI/RTI Service Delivery for High Risk Groups and Bridge Population in TI NGOs. New Delhi: National AIDS Control Organisation; 2012.
Rath RS, Singh M, Rizwan SA, Lohiya A, Gopal G, Silan V, et al. Evaluation of state-run STI/RTI clinics in the state of Haryana, India through a supportive supervision approach. Indian Dermatol Online J 2014;5:446-8.
Ganju SA, Sharma NL, Kanga A. Towards quality improvement: Training and supportive supervision in STI control programme, Himachal Pradesh. Indian Dermatol Online J 2012;3:221-2.
Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and reporting the delphi method for selecting healthcare quality indicators: A systematic review. PLoS One 2011;6:e20476.
Wakai A, O'Sullivan R, Staunton P, Walsh C, Hickey F, Plunkett PK, et al. Development of key performance indicators for emergency departments in Ireland using an electronic modified-Delphi consensus approach. Eur J Emerg Med 2013;20:109-14.
Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs 2000;32:1008-15.
Linstone HA, Turoff M. The Delphi Method – Techniques and Applications. Boston: Addison-Wesley Publishing; 1975.
Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika 1951;16:297-334.
Henson RK. Understanding internal consistency reliability estimates: A conceptual primer on coefficient alpha. Meas Eval Couns Dev 2001;34:177-89.
George D, Mallery P. SPSS for Windows Step by Step: A Simple Guide and Reference. 11.0 update. Boston: Allyn & Bacon; 2003. p. 400-1.
Ogrinc G, Davies L, Goodman D, Batalden P, Davidoff F, Stevens D, et al. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): Revised publication guidelines from a detailed consensus process. BMJ Qual Saf 2016;25:986-92.
Ellerton C, Davis A, Brooks D. Preliminary development and validation of a paediatric cardiopulmonary physiotherapy discharge tool. Physiother Can 2011;63:34-44.
Zinn J, Zalokowski A. Use of the Delphi panel method to develop consensus on laboratory performance indicators. Clin Lab Manag Rev 1999;13:97-105.
Barnsley J, Berta W, Cockerill R, MacPhail J, Vayda E. Identifying performance indicators for family practice: Assessing levels of consensus. Can Fam Physician 2005;51:700-1.