Real-world Effectiveness of HIV Treatment

A recent study aimed to evaluate the outcomes of standard-of-care triple therapy versus two-drug treatment for HIV, following the observation of potential discrepancies between randomised clinical trials and effectiveness in the real-world setting.


In 2016, there were over 160,000 people newly diagnosed with HIV in 51 of the 53 countries in the World Health Organisation European Region. More specifically, this corresponds to a rate of 18.2 newly diagnosed infections per 100,000 population. It is estimated that approximately 15% of those living with HIV in the EU and EEA are not aware of their status. The majority of the patients who do not receive antiretroviral therapy (ART) die within 2 years of the onset of AIDS.

Improvements in ART have transformed HIV from a terminal illness into a chronic disease. ART has evolved over time from monotherapy to triple therapy (TT). The recommended combination regimen for treatment initiation consists of two nucleoside analogue reverse transcriptase inhibitors (NRTIs) paired with a third agent from another drug class. TT has been the standard of care in HIV treatment since 1996. Despite improvements in TT, the long-term toxicity of tenofovir disoproxil fumarate (TDF) and abacavir (ABC) has led clinicians and researchers to seek ways to improve the long-term health of HIV patients. Exploratory strategies such as two-drug combinations (2DC) have been considered in cases where neither TDF nor ABC is optimal.

Two-drug combination treatment for HIV

Aside from a few exceptions, 2DC strategies have been associated with higher rates of virologic failure than TT in the clinical setting. However, the SWORD-1 and SWORD-2 phase III randomised clinical trials showed that patients switching to dolutegravir (DTG) + rilpivirine (RPV) did not have worse outcomes than those remaining on TT for at least 6 months. In line with this, the TANGO randomised trial demonstrated non-inferiority of switching to DTG/lamivudine (3TC) compared with maintaining a tenofovir alafenamide (TAF)-containing regimen of at least three drugs.

One of the main concerns with 2DC is a potentially lower barrier to resistance, which may be more problematic when adherence to treatment is poor. Exclusion criteria for HIV treatment trials can be strict, and selection biases exist, making it difficult to extrapolate results to the real-world setting. In clinical trials, average adherence is at least 95%, whereas in the real world it is often below 80%. While traditional TT regimens required 95% adherence for optimal effectiveness, current evidence suggests that modern TT regimens are more tolerant of suboptimal adherence, with >80% adherence estimated to be enough to maintain viral suppression.

Given this potential discrepancy between randomised clinical trials and the effectiveness of HIV treatment in the real-world setting, the researchers behind this study aimed to compare the outcomes of integrase strand transfer inhibitor (INSTI)-containing TT with those of DTG- and/or boosted protease inhibitor (bPI)-containing 2DC in a large Spanish cohort of HIV patients.

Assessing real-world effectiveness of HIV treatment

To compare the effectiveness of INSTI-containing TT with that of DTG- and/or bPI-containing 2DC, the researchers carried out a retrospective analysis using data from the VACH cohort, a multicentre Spanish cohort of adult HIV patients. All patients initiating either a TT consisting of an INSTI combined with two nucleoside analogue reverse transcriptase inhibitors (NRTIs), or a 2DC containing DTG and/or a bPI, over a 5-year period (01/01/2012 – 01/06/2017) were included in the study. Patient regimen was used as the unit of analysis, and the overall sample analysis was complemented by two sub-analyses. The first sub-analysis focused on patients treated with a backbone plus DTG compared with those treated with DTG plus one other antiretroviral. The second sub-analysis focused on patients with HIV RNA <50 copies/mL at baseline, irrespective of the treatment regimen used.

The endpoints assessed were: time to discontinuation, time to switch due to virologic failure, and time to switch due to toxicity. Time-to-event analyses were conducted using Kaplan-Meier survival curves, which estimate the probability of remaining event-free over time by considering time in many small intervals, and Cox regression models, which assess the association between patients' survival time and one or more predictor variables.
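To illustrate the Kaplan-Meier approach described above, the sketch below implements a minimal product-limit estimator in Python. The data are purely hypothetical and not taken from the study; an "event" here would correspond to a treatment switch, while censored observations (patients still on their regimen at the end of follow-up) reduce the number at risk without lowering the survival probability.

```python
# Minimal Kaplan-Meier (product-limit) estimator sketch.
# Each observation is (time, event): event=1 means the endpoint occurred
# (e.g. a treatment switch); event=0 means the observation was censored.

def kaplan_meier(observations):
    """Return a list of (time, survival_probability) at each event time."""
    observations = sorted(observations)
    n_at_risk = len(observations)
    survival = 1.0
    curve = []
    i = 0
    while i < len(observations):
        t = observations[i][0]
        events = 0
        removed = 0
        # Group all observations tied at this time point.
        while i < len(observations) and observations[i][0] == t:
            events += observations[i][1]
            removed += 1
            i += 1
        if events:
            # Multiply in the conditional probability of surviving past t.
            survival *= (n_at_risk - events) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed  # events and censored both leave the risk set
    return curve

# Hypothetical example: times in years; 1 = switched regimen, 0 = censored.
data = [(0.5, 1), (1.0, 0), (1.5, 1), (2.0, 1), (2.5, 0), (3.0, 1)]
for time, prob in kaplan_meier(data):
    print(f"t={time}: S(t)={prob:.3f}")
```

Note how the censored observation at t=1.0 does not produce a step in the curve, but it does shrink the at-risk denominator used for later event times, which is what distinguishes this estimator from a naive proportion.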

Observed real-world effectiveness of HIV treatment

In total, 7,481 patients were included in the study, contributing 9,243 patient regimens. Patient characteristics at baseline differed among groups: the 2DC group was significantly older, had a higher proportion of women, had spent longer on ART, and had experienced more previous virologic failures. Median time to switch therapies was 2.5 years in the 2DC group versus 2.9 years in the TT group. Adjusted hazard ratios for discontinuation due to any reason, virologic failure and toxicity in the 2DC versus TT group were 1.29, 2.06 and 1.18, respectively. These findings were consistent across the two sub-analyses.

Overall, this study identified that patients treated with 2DC had a significantly higher probability of virologic failure than those treated with TT. However, there was no statistically significant difference in discontinuations due to adverse events. This finding appears to be at odds with the SWORD clinical trials, which did not find any difference in outcomes between TT and 2DC after 6 months. A possible explanation is the difference in baseline characteristics between the populations studied in the SWORD trials and those in this study. Participants in the SWORD trials were on their first or second line of treatment and were only included if they had never experienced virologic failure and had shown excellent virological control in the months leading up to the trial. By contrast, the patients in this real-world study were unselected for past antiretroviral treatment history or virologic failure, and were not excluded if their viral load was detectable at the baseline visit.

Taken together, both sets of analyses lead to a coherent hypothesis that, in carefully selected patients, DTG 2DC is as effective as INSTI-based TT, whereas in cases with a less favourable ART background, TT is a more effective treatment option.


This study was designed to assess the real-world effectiveness of two types of HIV treatment: 2DC and TT. The researchers found that time to discontinuation and the probability of remaining free of virologic failure were significantly higher in patients treated with TT than in those treated with 2DC, with no difference in toxicity. However, the observed difference between this real-world study and previous clinical trials could be explained by differences in participant characteristics. As new ART regimens are developed, care should be taken to establish real-world effectiveness and viral suppression prior to implementation, to provide quality care, minimise adverse events and improve the lives of patients living with HIV.
