Radiotherapy and 3 years of androgen suppression provide higher survival than radiotherapy and 6 months of androgen suppression in locally advanced prostate cancer

The researchers of the European Organization for Research and Treatment of Cancer (EORTC) Radiation Oncology Group and Genito-Urinary Tract Cancer Group conducted a trial to investigate whether short-term androgen suppression (radiotherapy and 6 months of androgen suppression) could preserve quality of life while achieving the overall survival obtained with long-term androgen suppression (radiotherapy and 3 years of androgen suppression) in patients with locally advanced prostate cancer. Almost 1,000 patients (970) who had received external-beam radiotherapy and 6 months of androgen suppression were randomly allocated to two groups, one receiving no further treatment (short-term suppression, 483 patients) and the other receiving further treatment with a luteinizing hormone-releasing hormone agonist for 2.5 years (long-term suppression, 487 patients). After a follow-up period of 6.4 years, 98 patients in the long-term suppression group and 132 patients in the short-term suppression group had died; 29 died of prostate cancer in the long-term suppression group, compared with 47 in the short-term group. Overall mortality at 5 years was 15.2% in the long-term suppression group and 19% in the short-term suppression group, with an observed hazard ratio of 1.42. Statistically significant differences between the groups were documented with regard to insomnia (P = 0.006), hot flushes (P < 0.001) and sexual activity (P < 0.001). The authors conclude that, in patients with locally advanced prostate cancer, radiotherapy and 3 years of androgen suppression achieve higher survival than radiotherapy and 6 months of androgen suppression.

Reference

Bolla M et al (2009) EORTC Radiation Oncology Group and Genito-Urinary Tract Cancer Group. Duration of androgen suppression in the treatment of prostate cancer. N Engl J Med 360:2516–2527

Standard adjuvant chemotherapy was superior to capecitabine in elderly women with early-stage breast cancer

Older women are often under-represented in clinical trials, and elderly women with breast cancer in particular do not seem to be adequately represented in controlled clinical trials. Since information on the impact of adjuvant chemotherapy in these patients is limited, the researchers of the Cancer and Leukemia Group B (CALGB) 49907 trial evaluated whether capecitabine was non-inferior to standard adjuvant chemotherapy in women with breast cancer aged at least 65 years. Women with stage I, II, IIIA or IIIB breast cancer were randomly allocated to standard chemotherapy (cyclophosphamide, methotrexate and fluorouracil, or cyclophosphamide and doxorubicin) or to capecitabine, and the primary end point of the trial was relapse-free survival. Recruitment was stopped when the 600th patient was enrolled, because at that time the probability that capecitabine would prove inferior to standard adjuvant chemotherapy with extended follow-up had met a prespecified threshold. After an additional year of follow-up, the hazard ratio for relapse or death in the capecitabine group was 2.09 (95% confidence interval 1.38–3.17; P < 0.001). At the 3-year analysis, overall survival was 91% in the standard adjuvant chemotherapy group and 86% in the capecitabine group, while relapse-free survival was 85% in the former group and 68% in the capecitabine group. Sixty-four percent of subjects in the standard chemotherapy group had moderate to severe toxic effects, compared with 33% of patients in the capecitabine group. In this trial, standard adjuvant chemotherapy was superior to capecitabine in elderly women with early-stage breast cancer.

Reference

Muss HB et al (2009) CALGB investigators. Adjuvant chemotherapy in older women with early-stage breast cancer. N Engl J Med 360:2055–2065

The Score for the Targeting of Atrial Fibrillation may be adopted as a useful element of a combined approach to the detection of atrial fibrillation in the secondary prevention of stroke

In Italy, stroke is the third leading cause of death, accounting for more than 10% of total mortality. Mortality within 30 days of a stroke is at least 20%, and mortality within 12 months ranges from 30 to 40%; moreover, stroke is the main cause of disability. Stroke, particularly when associated with atrial fibrillation, also carries a significant risk of recurrence, and a timely diagnostic work-up is therefore mandatory to begin appropriate anticoagulant therapy.

The authors of this French study documented the clinical features of consecutive patients with ischaemic stroke and used independent predictors to construct a grading score for the diagnosis of atrial fibrillation, called the Score for the Targeting of Atrial Fibrillation (STAF). STAF is the sum of the points of four items (the total score ranges from 0 to 8): absence of symptomatic intra- or extra-cranial stenosis ≥50% or of a clinico-radiological lacunar syndrome (3 points), age over 62 years (2 points), NIHSS score ≥8 (1 point) and left atrial dilatation (2 points). In this study a STAF of at least 5 points had 88% specificity and 89% sensitivity in detecting subjects with atrial fibrillation, and the researchers therefore propose STAF as a useful element of a combined approach to the detection of atrial fibrillation in the secondary prevention of stroke, while acknowledging that a multicentre trial is needed to validate the score.
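
As a minimal illustration of how the score is tallied, the following sketch (ours, not the authors'; the function and parameter names are hypothetical) applies the four items and point values listed above:

def staf_score(age_years, nihss, left_atrial_dilatation, stenosis_or_lacunar_syndrome):
    """Return the STAF score (range 0-8) from the four items described above."""
    score = 0
    if age_years > 62:
        score += 2  # age over 62 years
    if nihss >= 8:
        score += 1  # NIHSS score of 8 or more
    if left_atrial_dilatation:
        score += 2  # left atrial dilatation
    if not stenosis_or_lacunar_syndrome:
        score += 3  # absence of symptomatic intra-/extra-cranial stenosis >=50%
                    # or of a clinico-radiological lacunar syndrome
    return score

# Example: age 70, NIHSS 10, left atrial dilatation, no stenosis or lacunar
# syndrome gives 2 + 1 + 2 + 3 = 8; in the study a score of at least 5
# detected atrial fibrillation with 89% sensitivity and 88% specificity.
print(staf_score(70, 10, True, False))  # prints 8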

Reference

Suissa L et al (2009) Score for the Targeting of Atrial Fibrillation (STAF). A new approach to the detection of atrial fibrillation in the secondary prevention of ischemic stroke. Stroke 40:2866–2868

Improved cardiorespiratory fitness is linked to a decreased risk of coronary heart disease and to lower overall mortality

Although cardiorespiratory fitness (CRF), assessed by means of exercise tolerance testing, is a generally satisfactory expression of physical fitness, physicians infrequently weigh it when assessing overall cardiovascular risk. Since data from the medical literature suggest an inverse link between cardiorespiratory fitness and coronary heart disease, the authors of this Japanese study investigated the relationship between cardiorespiratory fitness and coronary heart disease, cardiovascular disease events and total mortality in healthy adults. The MEDLINE (1966–2008) and EMBASE (1980–2008) electronic archives were searched to retrieve cohort studies reporting the association of cardiorespiratory fitness, evaluated as maximal aerobic capacity (MAC), with coronary heart disease, cardiovascular disease and all-cause mortality. The healthy subjects of the retrieved studies were subdivided into categories on the basis of their MAC, expressed in metabolic equivalent (MET) units: high CRF was defined as ≥10.9 METs, intermediate CRF as 7.9–10.8 METs and low CRF as <7.9 METs. Thirty-three studies were included. Compared with people with high CRF, individuals with low CRF had a risk ratio of 1.70 (95% confidence interval 1.51–1.92, P < 0.001) for total mortality and of 1.56 (95% confidence interval 1.39–1.75, P < 0.001) for coronary heart disease and cardiovascular events. Compared with people with intermediate CRF, individuals with low CRF had a risk ratio of 1.40 (95% confidence interval 1.32–1.48, P < 0.001) for total mortality and of 1.47 (95% confidence interval 1.35–1.61, P < 0.001) for coronary heart disease and cardiovascular events.
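
As a minimal illustration of the categorisation, the following sketch (ours, not the authors'; the function name is hypothetical) assigns the CRF category from a MAC value using the thresholds reported above:

def crf_category(mets):
    """Return the CRF category ('high', 'intermediate' or 'low') from MAC in METs."""
    if mets >= 10.9:
        return "high"          # >= 10.9 METs
    if mets >= 7.9:
        return "intermediate"  # 7.9-10.8 METs
    return "low"               # < 7.9 METs

# Example: a MAC of 9.5 METs falls into the intermediate CRF category.
print(crf_category(9.5))  # prints "intermediate"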

The authors conclude that improved cardiorespiratory fitness is linked to a decreased risk of overall mortality and of coronary heart disease and cardiovascular disease events, and that healthy subjects with a MAC of at least 7.9 METs have significantly lower rates of total mortality and of coronary heart disease and cardiovascular disease events than people with a MAC below 7.9 METs.

Reference

Kodama S et al (2009) Cardiorespiratory fitness as a quantitative predictor of all-cause mortality and cardiovascular events in healthy men and women: a meta-analysis. JAMA 301:2024–2035

An evaluation of the reduction in coronary heart disease mortality obtainable by enhancing targeted medical and surgical therapy

Cardiovascular diseases are the main cause of mortality worldwide, and coronary heart disease is estimated to cost individual European national economies billions of euros each year. The authors of this study used information on the number of patients with coronary heart disease, together with data on medical and surgical therapy uptake levels, to explore the decrease in coronary heart disease mortality potentially obtainable by extending dedicated medical and surgical therapy to eligible subjects in the USA. They found that in 2000 no more than 60% of eligible individuals in the USA received appropriate treatment, and that increasing the treatment of eligible patients would have prevented almost 300,000 deaths (297,470, range 118,360–628,120), approximately 134,635 more than were actually prevented in 2000, with 15% of this additional benefit deriving from primary prevention with statins, 19% from the management of acute coronary syndromes, 30% from secondary prevention and 32% from heart failure treatment. Almost 135,000 additional deaths would therefore have been avoided in 2000 by appropriately treating more of the eligible individuals with coronary heart disease, and consequently current and future care models should aim to provide correct, timely and appropriate treatment to every eligible subject affected by coronary heart disease. In this scenario, secondary prevention and heart failure treatment in particular should be optimised, since they account for the highest proportions of the achievable benefit.

Reference

Capewell S et al (2009) Potential reductions in United States coronary heart disease mortality by treating more patients. Am J Cardiol 103:1703–1709

In primary cardiovascular prevention the benefit of aspirin has not been fully clarified

While the benefit of aspirin in secondary cardiovascular prevention is clear and well defined, its role in primary prevention has not been fully elucidated. The Antithrombotic Trialists’ (ATT) Collaboration therefore conducted a meta-analysis to evaluate the risks and benefits of aspirin in primary prevention. The authors analysed major cardiovascular events and serious bleeding episodes in six primary prevention studies and 16 secondary prevention trials. In the primary prevention trials the effect of aspirin on stroke was not significant (0.20 vs. 0.21% per year; P = 0.4), and mortality from vascular causes also did not differ significantly (0.19 vs. 0.19% per year; P = 0.7) between aspirin and control. The 12% proportional reduction in major cardiovascular events with aspirin (0.51% aspirin vs. 0.57% control per year; P = 0.0001) was largely attributable to a decrease in non-fatal myocardial infarction (0.18 vs. 0.23% per year; P < 0.0001). Aspirin was significantly associated with an increase in major extracranial and gastrointestinal haemorrhages (0.10 vs. 0.07% per year; P < 0.0001). The authors conclude that, given the increase in major bleedings, the net benefit of aspirin in primary cardiovascular prevention remains unclear.

Reference

Antithrombotic Trialists’ (ATT) Collaboration, Baigent C et al (2009) Aspirin in the primary and secondary prevention of vascular disease: collaborative meta-analysis of individual participant data from randomised trials. Lancet 373:1849–1860