ORIGINAL RESEARCH
Support Vector Machine Classification of Brain Metastasis and Radiation Necrosis Based on Texture Analysis in MRI

Andrés Larroza, MS,1 David Moratal, PhD,2* Alexandra Paredes-Sánchez, MS,2 Emilio Soria-Olivas, PhD,3 María L. Chust, MD,4 Leoncio A. Arribas, MD, PhD,4 and Estanislao Arana, MD, PhD5

Purpose: To develop a classification model using texture features and a support vector machine in contrast-enhanced T1-weighted images to differentiate between brain metastasis and radiation necrosis.

Methods: Texture features were extracted from 115 lesions: 32 previously diagnosed as radiation necrosis, 23 as radiation-treated metastasis, and 60 as untreated metastasis, yielding a total of 179 features derived from six texture analysis methods. A feature selection technique based on support vector machines was used to obtain a subset of features providing optimal performance.

Results: The highest classification accuracy evaluated on test sets was achieved with a subset of ten features when the untreated metastases were not considered, and with a subset of seven features when the classifier was trained on untreated metastases and tested on treated ones. Receiver operating characteristic curves yielded an area under the curve (mean ± standard deviation) of 0.94 ± 0.07 in the first case and 0.93 ± 0.02 in the second.

Conclusion: High classification accuracy (AUC > 0.9) was obtained using texture features and a support vector machine classifier in an approach based on conventional MRI to differentiate between brain metastasis and radiation necrosis.

J. MAGN. RESON. IMAGING 2015;42:1362–1368.
Brain metastases are the most common brain tumors, and their treatment usually comprises a combination of surgery, radiosurgery, radiotherapy, and chemotherapy.1 Stereotactic radiosurgery (SRS) for brain metastases may lead to delayed radiation necrosis, with symptoms and imaging findings usually indistinguishable from persistent tumor.2–4 Brain metastases show heterogeneous imaging findings regardless of their origin and therapy.5 Multiparametric MRI and positron emission tomography (PET) have been advocated to differentiate metastatic recurrence from radiation-induced changes.4,6,7 Even so, accurate discrimination remains difficult compared with histopathology.2,7 To avoid invasive diagnosis, and because of the incomplete availability of advanced imaging such as MR spectroscopy, perfusion MRI, and PET, there is great interest
in identifying reliable imaging features from routine MRI that could differentiate metastasis from radiation necrosis.2,8 Texture analysis describes a wide range of techniques that enable quantification of gray-level patterns, pixel interrelationships, and spectral properties within an image, deriving features that provide a measure of intralesional heterogeneity.9 Texture analysis has been applied in MRI for classification of tumors and other diseases.10–15 Different tumor areas exhibit different textural patterns, which are beyond human visual perception.16 The vast variety of texture analysis methods makes it possible to obtain a myriad of features that can be used in combination with machine learning techniques to obtain a reliable diagnostic tool.17,18 The purpose of the present work was to develop a classification model using texture
View this article online at wileyonlinelibrary.com. DOI: 10.1002/jmri.24913
Received Jan 23, 2015, and in revised form Mar 26, 2015. Accepted for publication Mar 26, 2015.
*Address reprint requests to: D.M., Center for Biomaterials and Tissue Engineering, Universitat Politècnica de València, Camí de Vera, s/n. 46022, Valencia, Spain. E-mail: [email protected]
From the 1Department of Medicine, Universitat de València, Valencia, Spain; 2Centre for Biomaterials and Tissue Engineering, Universitat Politècnica de València, Valencia, Spain; 3Intelligent Data Analysis Laboratory, Electronic Engineering Department, Universitat de València, Valencia, Spain; 4Department of Radiation Oncology, Fundación Instituto Valenciano de Oncología, Valencia, Spain; and 5Department of Radiology, Fundación Instituto Valenciano de Oncología, Valencia, Spain
features and a support vector machine (SVM) in contrast-enhanced T1-weighted images to differentiate between brain metastasis and radiation necrosis.
MATERIAL AND METHODS

Patients
This retrospective, single-center study was approved by the Institutional Review Board. All subjects provided written informed consent before the research study. Patients followed up for brain metastases treated with SRS, with or without whole brain radiotherapy (WBRT), between September 2007 and June 2013 were included. Patients were eligible if they (i) had a pathologically proven primary extra-cerebral tumor; (ii) had undergone treatment of cerebral metastases with SRS; and (iii) had at least two consecutive MRI scans showing enlarging enhancing lesions within the radiation field. For purposes of analysis, scans were grouped into 3-month intervals and evaluated up to 36 months post-SRS. Patients were excluded for any of the following reasons: brain metastases that were leptomeningeal or showed any extra-axial extension, SRS performed on the basis of CT imaging, or SRS delivered only for consolidation of a surgical resection bed. Seventy-three patients (36 women and 37 men) with an age of 56.8 ± 10.3 years (mean ± standard deviation) were included. One hundred fifteen lesions were analyzed; 83 were diagnosed as metastasis: 60 analyzed before treatment, 10 treated with SRS with a median dose of 20 Gy, and 13 treated with SRS plus WBRT with a median dose of 53 Gy; and 32 as radiation necrosis: 14 treated with SRS in single or multiple sessions with a median dose of 22 Gy and the remaining 18 with SRS plus WBRT with a median dose of 49 Gy.
Clinical Outcome
Patients were followed by clinical examination and MRI at 3-month intervals. A histological diagnosis was obtained in 12 of the 115 lesions (10.4%); in the remaining patients, the diagnosis was established by clinical and radiological follow-up, considering neurological condition, size changes in serial MRI scans, and PET findings.5 Metastatic progression was diagnosed when there was a rapidly deteriorating neurological condition, short survival due to neurological progression, or ongoing progression in subsequent MRI scans (at least a 20% increase in the sum of diameters of the target lesions). Radiation necrosis was diagnosed when the lesion showed: (a) a complete, partial (at least a 30% decrease in the sum of diameters of target lesions), or stable response, according to the Response Evaluation Criteria in Solid Tumors (RECIST 1.1); (b) an enlarging lesion with reduced relative cerebral blood volume (rCBV) and F-18 fluorodeoxyglucose (FDG) uptake.5
Imaging Protocol
All MRI examinations were performed on a 1.5 Tesla (T) scanner with a multichannel phased-array coil (Magnetom Symphony; Siemens Healthcare, Erlangen, Germany). The MRI protocol included T1-weighted axial images with gadolinium and T2-weighted FLAIR axial images. Three-dimensional (3D) spoiled gradient recalled echo (SPGR) T1-weighted images of the whole brain were acquired without magnetization transfer, following intravenous administration of a single dose of gadobenate dimeglumine (0.1 mmol/kg, MultiHance, Bracco; Milan, Italy) with a 6-min delay. Imaging parameters were: repetition time/echo time (TR/TE), 11/4.8 ms; flip angle, 25°; matrix, 256 × 256; voxel size, 0.5 × 0.5 × 1.3 mm³; sections per slab, 224; and acquisition time, 4 min 15 s. Parameters for the FLAIR images were: TR/TE, 8500/114 ms; inversion time, 2500 ms; voxel size, 0.43 × 0.43 × 5 mm³; slice spacing, 6.5 mm.

FIGURE 1: General approach of the proposed classification model. Texture features were extracted from regions of interest (ROIs) of all lesion samples. An optimal subset of features was then used to train a support vector machine (SVM) classifier.
Regions of Interest
The classification process (Fig. 1) began with the definition of regions of interest (ROIs) on the slice containing the most solid lesion component of the contrast-enhanced T1-weighted image. Images were coregistered to the corresponding FLAIR image in a viewing station (Syngo; Siemens Healthcare, Erlangen, Germany) to identify and exclude peripheral blood vessels adjacent to the enhancing lesion; perilesional edema was also excluded. Manual delineation of ROIs was performed by a radiologist (E.A., 20 years of experience). The longest axial diameters were normally distributed and did not differ significantly (Student's t-test, P = 0.297) between metastasis and radiation necrosis, with mean ± standard deviation of 22.9 ± 8.2 and 20.3 ± 8.1, respectively.
Texture Analysis
Texture features were extracted from the contrast-enhanced T1-weighted images using the MaZda software, version 4.6 (Institute of Electronics, Technical University of Lodz, Lodz, Poland).19 Image normalization is necessary to minimize the influence of contrast and brightness variation and to obtain reproducible results under different MRI acquisition protocols.19 Normalization was performed within MaZda using a method that remaps the histogram to fit within μ ± 3σ (μ: mean gray level within the ROI; σ: gray-level standard deviation). For the same reason, image intensity (gray level) was encoded to 8 bits/pixel. A total of 179 features were computed for each ROI based on the six texture analysis methods available in MaZda: (i) histogram, based on the count of pixels in the ROI that have a given gray-level value;20 (ii) absolute gradient, which quantifies abrupt or smooth gray-level variation across the ROI;20 (iii) co-occurrence matrix, which contains the probabilities of co-occurrence of pixel pairs with given gray levels;21 (iv) run-length matrix, which represents runs of pixels having the same gray-level value;22 (v) autoregressive model, which assumes a local interaction between image pixels in that a pixel intensity is a weighted sum of the neighboring pixel intensities;23 and (vi) wavelets, which analyze the frequency content of an image at different scales and frequency directions.20 The computed features are listed in Table 1; please refer to the MaZda user's manual for further details.24

TABLE 1. List of Texture Features Used in This Study

Method | Features | N
Histogram | Mean, variance, skewness, kurtosis, percentiles (1, 10, 50, 90, and 99%) | 9
Absolute gradient | Gradient mean, variance, skewness, kurtosis, non-zeros | 5
Co-occurrence matrix | Angular second moment, contrast, correlation, sum of squares, inverse difference moment, sum average, sum variance, sum entropy, entropy, difference variance, difference entropy; matrices calculated for one to three inter-pixel distances in four directions (horizontal, vertical, 45°, and 135°) | 132
Run-length matrix | Run-length non-uniformity, gray-level non-uniformity, long run emphasis, short run emphasis, fraction of image in runs; in four directions (horizontal, vertical, 45°, and 135°) | 20
Autoregressive model | Theta 1-4, sigma | 5
Wavelets | Haar basis functions in sub-bands LL, LH, HL, and HH for two subsampling factors | 8
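MaZda itself is closed-source, but the normalization and co-occurrence features described above can be illustrated with a short sketch. The following Python example is not from the original work: the function names normalize_roi and glcm_features are illustrative, the ROI is assumed to be available as a rectangular 2D NumPy array, and scikit-image's gray-level co-occurrence utilities are used to obtain the kind of contrast, correlation, and entropy measures listed in Table 1.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

def normalize_roi(roi, n_bits=8):
    """Remap gray levels to mean +/- 3*std (the +/-3 sigma normalization described
    above) and quantize to n_bits; values outside the window are clipped."""
    mu, sigma = roi.mean(), roi.std()
    lo, hi = mu - 3 * sigma, mu + 3 * sigma
    levels = 2 ** n_bits
    clipped = np.clip(roi.astype(float), lo, hi)
    quantized = ((clipped - lo) / (hi - lo + 1e-12) * (levels - 1)).astype(np.uint8)
    return quantized, levels

def glcm_features(roi):
    """Co-occurrence features for 1-3 pixel distances and four directions
    (0, 45, 90, and 135 degrees), as in Table 1."""
    img, levels = normalize_roi(roi)
    glcm = graycomatrix(img,
                        distances=[1, 2, 3],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    feats = {p: graycoprops(glcm, p)
             for p in ("contrast", "correlation", "ASM", "homogeneity")}
    p = glcm + 1e-12                                 # avoid log(0)
    feats["entropy"] = -(p * np.log2(p)).sum(axis=(0, 1))
    return feats                                     # each value: (n_distances, n_angles) array
```

Features such as sum entropy and difference variance, which appear in Table 2, are derived from the same co-occurrence matrices through marginal sums over gray-level pairs.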
Data Preparation
Our data consisted of 83 metastases, including 60 lesions at diagnosis and 23 lesions that had received radiation treatment. The main objective in clinical practice is to discriminate radiation necrosis from recurrent metastasis after radiation therapy. Given the scarcity of biopsy-proven treated cases, we wanted to take advantage of the available well-characterized lesions (untreated metastases). We performed three statistical verifications to compare treated and untreated metastases. First, a Mann-Whitney U-test with Benjamini-Hochberg correction showed only 37 statistically different features (P < 0.05) between the two types of metastasis. Second, the first two principal components of both types had a dot product > 0.9, indicating a similar distribution. Lastly, k-means clustering showed that both types were equally distributed across the two generated clusters.25

These verifications indicated similarities between treated and untreated metastases; nevertheless, we did not mix them during classification but instead approached the problem with two datasets. Each dataset was split into training (70%) and test (30%) subsets. Only treated cases were included in dataset 1 (23 metastases and 32 radiation necroses). In dataset 2, all untreated metastases (60 lesions) were placed in the training subset and the treated metastases (23 lesions) in the test subset, while the radiation necroses (32 lesions) were randomly split. The aim of this second approach was to evaluate whether the discrimination problem can be addressed by training the classifier on the well-characterized untreated metastases.
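As an illustration of the first of these verifications, a minimal sketch of the per-feature comparison is given below (Python with SciPy and statsmodels; the function name and the array layout, lesions as rows and the 179 features as columns, are assumptions made for the example).

```python
import numpy as np
from scipy.stats import mannwhitneyu
from statsmodels.stats.multitest import multipletests

def compare_feature_distributions(treated, untreated, alpha=0.05):
    """Mann-Whitney U test per feature with Benjamini-Hochberg (FDR) correction.

    treated, untreated: (n_lesions, n_features) arrays of texture features.
    Returns a boolean mask of features still significant after correction,
    plus the corrected P-values.
    """
    pvals = np.array([
        mannwhitneyu(treated[:, j], untreated[:, j], alternative="two-sided").pvalue
        for j in range(treated.shape[1])
    ])
    reject, pvals_corrected, _, _ = multipletests(pvals, alpha=alpha, method="fdr_bh")
    return reject, pvals_corrected
```

The principal component and k-means checks can be run on the same feature matrices, for example with scikit-learn's PCA and KMeans classes.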
FIGURE 2: Average area under the curve (AUC) values over test sets for different subsets of features ranked by the mSVM-RFE algorithm. The best-performing feature subsets are indicated with vertical dotted lines.
TABLE 2. Features Ranked by the mSVM-RFE Algorithma

Rank | Dataset 1 | Dataset 2
1 | Percentile 90% | Difference variance (1 pixel, 45°)
2 | Sum entropy (3 pixels, horizontal) | Sum entropy (2 pixels, horizontal)
3 | Kurtosis | Contrast (3 pixels, 135°)
4 | Sum entropy (2 pixels, 45°) | Difference variance (1 pixel, horizontal)
5 | Variance | Percentile 50%
6 | Mean | Short run emphasis (vertical)
7 | Sum entropy (2 pixels, horizontal) | Variance
8 | Wavelet LH (sub-band 1) | Fraction (45°)
9 | Correlation (1 pixel, vertical) | Entropy (1 pixel, vertical)
10 | Short run emphasis (135°) | Inverse difference moment (2 pixels, 135°)

a The features forming the optimal subsets (the ten listed for dataset 1 and the top seven for dataset 2) provided the highest classification with a linear SVM for each dataset.
Feature Selection
A simple statistical comparison between treated metastases and radiation necroses showed 115 statistically different features (Mann-Whitney U test with Benjamini-Hochberg correction, P < 0.05). Using a high-dimensional feature set to develop a classifier may decrease performance owing to the curse of dimensionality, so finding an appropriate subset of features is essential to achieve optimal classification performance.26,27 The support vector machine (SVM) is one of the best-known classification techniques and usually provides the best classification for computer-aided detection in radiology.18 We therefore implemented a robust feature selection technique based on SVM and recursive feature elimination (RFE) that has been successfully applied to gene selection for cancer classification.28

The SVM-RFE algorithm returns a ranking of features by recursively training a linear SVM and removing the feature with the smallest ranking score. At each iteration, the coefficients of the weight vector w of a linear SVM are used to compute the feature ranking score, and the feature with the smallest score c_i = (w_i)^2 is eliminated. This criterion removes the feature whose elimination least affects the objective function J = (1/2)||w||^2.28 A modified version of SVM-RFE (mSVM-RFE), which incorporates resampling at each step of the algorithm to stabilize the feature selection, was proposed by Duan et al.29 and implemented in the present work. In mSVM-RFE, t linear SVMs are trained on different subsamples of the original data: let w^j be the weight vector of the jth SVM, w_i^j the weight associated with the ith feature, and v_i^j = (w_i^j)^2. The feature ranking score becomes:

c_i = \bar{v}_i / \sigma_{v_i}    (1)

where \bar{v}_i and \sigma_{v_i} are the mean and standard deviation of v_i over the t SVMs. The weight vectors are normalized, w^j = w^j / ||w^j||, before computing the ranking score. The algorithm described by Duan et al.29 is:

1. Start: ranked feature set R = [ ]; feature set S = [1, ..., d];
2. Repeat until all features are ranked:
   a. Train t linear SVMs on subsamples of the original training data, with the features in set S as input variables;
   b. Compute and normalize the weight vectors;
   c. Compute the ranking scores c_i for the features in S using Eq. [1];
   d. Find the feature with the smallest ranking score: e = arg min_i c_i;
   e. Update: R = [e, R]; S = S - [e];
3. Output: ranked feature list R.

Feature values were standardized to zero mean and unit variance across training samples. Five-fold cross-validation with 10 repetitions was used as the resampling method to reduce the variability of the generalization-error estimate; thus, at each step t = 50 linear SVMs were trained with a fixed value of the C parameter. The C parameter sets the trade-off between the complexity of the decision rule and training errors of the SVM,30 and was determined beforehand as the value giving the highest classification accuracy estimated with the resampling method when including the full set of features.
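The ranking loop above can be sketched as follows. This Python/scikit-learn version (the function msvm_rfe_rank is a name chosen for the example, not from the paper) follows the same steps with repeated stratified 5-fold resampling, but it is a simplified illustration under those assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import RepeatedStratifiedKFold

def msvm_rfe_rank(X, y, C=0.01, n_splits=5, n_repeats=10, random_state=0):
    """mSVM-RFE: at each step train t linear SVMs on resampled training folds,
    score every remaining feature by mean(w_i^2) / std(w_i^2), and eliminate
    the feature with the smallest score. Returns features ranked best-first."""
    remaining = list(range(X.shape[1]))
    ranked = []                                 # filled least-important first, reversed at the end
    cv = RepeatedStratifiedKFold(n_splits=n_splits, n_repeats=n_repeats,
                                 random_state=random_state)
    while len(remaining) > 1:
        v = []
        for train_idx, _ in cv.split(X, y):     # t = n_splits * n_repeats SVMs per step
            svm = SVC(kernel="linear", C=C)
            svm.fit(X[np.ix_(train_idx, remaining)], y[train_idx])
            w = svm.coef_.ravel()
            w = w / np.linalg.norm(w)           # normalize the weight vector
            v.append(w ** 2)
        v = np.asarray(v)
        scores = v.mean(axis=0) / (v.std(axis=0) + 1e-12)   # Eq. [1]
        worst = int(np.argmin(scores))
        ranked.append(remaining.pop(worst))     # eliminate the lowest-scoring feature
    ranked.append(remaining.pop())
    return ranked[::-1]
```

As described above, X should already be standardized to zero mean and unit variance computed on the training samples before ranking.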
TABLE 3. Classification Performance of the Linear SVM With and Without Feature Selection for Discrimination of Treated Metastasis and Radiation Necrosisa

Dataset | Number of features | AUC | Sensitivity | Specificity
Dataset 1 | Full (179) | 0.89 ± 0.09 | 0.81 ± 0.17 | 0.84 ± 0.11
Dataset 1 | mSVM-RFE (10) | 0.94 ± 0.07 | 0.84 ± 0.15 | 0.85 ± 0.14
Dataset 2 | Full (179) | 0.82 ± 0.07 | 0.73 ± 0.07 | 0.74 ± 0.16
Dataset 2 | mSVM-RFE (7) | 0.93 ± 0.02 | 0.83 ± 0.12 | 0.82 ± 0.25

a Values are shown as mean ± standard deviation across resamples on the test sets. Sensitivity and specificity were computed at the optimal cutoff on the ROC curve. Metastasis was considered the positive class, so sensitivity measures the ability of the classifier to detect treated metastases, whereas specificity measures its ability to detect radiation necroses.
Values of C were chosen from the finite set C ∈ {10^-10, ..., 10^0, ..., 10^10}, resulting in C = 0.01 (dataset 1) and C = 100,000 (dataset 2).

Classification
The feature selection process was performed solely on training sets, whereas test sets were used to estimate classification performance. In small datasets such as ours, a single AUC value on a test set may be unreliable because of an unfortunate data partition. For this reason, we merged the training and test sets and performed 100 repeated training/test splits, holding 70% of the data for training, and then averaged the AUC values on the test sets.29 In dataset 2, untreated metastases were always kept in the training sets and treated metastases in the test sets. The ranked features returned by mSVM-RFE were added progressively, one by one, from most to least important. The area under the curve (AUC) of the receiver operating characteristic (ROC) was used as the index of classification accuracy. We plotted the average AUC value on test sets versus the size of the feature subset to find the subset with the highest generalization accuracy (Fig. 2).

Class probabilities of the SVM, computed with the method described by Platt,31 were used as cutoffs on the ROC curves. Sensitivity and specificity were computed at the cutoff on the ROC curve that maximizes the product of both measures. Sensitivity was defined as the number of true positives (metastases classified as metastasis) divided by the sum of true positives and false negatives (metastases classified as radiation necrosis), whereas specificity was defined as the number of true negatives (radiation necroses classified as radiation necrosis) divided by the sum of true negatives and false positives (radiation necroses classified as metastasis). The methods described were implemented using the caret package 32 in the R language, version 3.0.1 (R Development Core Team, Vienna, Austria).

FIGURE 3: Average ROC curves over test sets. The highlighted points on the curves indicate the cutoff for achieving the highest product between sensitivity and specificity.
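The repeated 70/30 evaluation and the cutoff choice can be outlined as below. The paper's implementation used the caret package in R; this scikit-learn sketch (evaluate_subset is an illustrative name) corresponds to the dataset 1 protocol, with Platt-scaled SVM probabilities, test-set AUC averaged over splits, and sensitivity/specificity taken at the cutoff maximizing their product.

```python
import numpy as np
from sklearn.model_selection import StratifiedShuffleSplit
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score, roc_curve

def evaluate_subset(X, y, C=0.01, n_splits=100, test_size=0.30, seed=0):
    """Average test-set AUC, sensitivity, and specificity over repeated 70/30
    stratified splits; metastasis is the positive class (y = 1)."""
    splitter = StratifiedShuffleSplit(n_splits=n_splits, test_size=test_size,
                                      random_state=seed)
    aucs, sens, spec = [], [], []
    for tr, te in splitter.split(X, y):
        scaler = StandardScaler().fit(X[tr])               # standardize on training samples only
        clf = SVC(kernel="linear", C=C, probability=True)  # Platt-scaled class probabilities
        clf.fit(scaler.transform(X[tr]), y[tr])
        prob = clf.predict_proba(scaler.transform(X[te]))[:, 1]
        aucs.append(roc_auc_score(y[te], prob))
        fpr, tpr, _ = roc_curve(y[te], prob)
        best = np.argmax(tpr * (1 - fpr))                  # cutoff maximizing sensitivity * specificity
        sens.append(tpr[best])
        spec.append(1 - fpr[best])
    summary = lambda v: (float(np.mean(v)), float(np.std(v)))
    return summary(aucs), summary(sens), summary(spec)
```

For the dataset 2 protocol, the untreated metastases would be forced into every training split and the treated metastases into every test split, so a custom splitter would replace StratifiedShuffleSplit.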
RESULTS
Ten features provided the largest AUC for dataset 1 and seven features for dataset 2. The feature "sum entropy", computed for different inter-pixel distances and directions, appeared three times in the optimal subset for dataset 1, while the feature "difference variance", at a one-pixel distance but in different directions, appeared twice in the optimal subset for dataset 2. Only two features appeared in both optimal subsets: "sum entropy" at a two-pixel distance in the horizontal direction, and the histogram variance (Table 2). Performance measures of the SVM classifiers for each dataset are reported in Table 3 according to their respective ROC curves (Fig. 3). High classification accuracy (AUC > 0.9) was achieved with the reduced subset of features in both datasets, with improved performance over the full set of features. However, the selected features are not necessarily discriminative when taken individually: classification accuracy decreased notably for the top three ranked features, with AUC values (mean ± standard deviation) of 0.71 ± 0.13, 0.68 ± 0.13, and 0.70 ± 0.14 in dataset 1, and 0.79 ± 0.07, 0.63 ± 0.07, and 0.71 ± 0.08 in dataset 2. Furthermore, clear visual discrimination is not feasible from the texture feature maps shown in Figure 4.

FIGURE 4: Feature maps of the top three ranked features in dataset 2 for a treated metastasis (top row) and a radiation necrosis lesion (bottom row): T1-weighted MRI (a), difference variance (1 pixel, 45°) (b), sum entropy (2 pixels, horizontal) (c), and contrast (3 pixels, 135°) (d).
DISCUSSION
Our SVM classification approach differentiated recurrent brain metastasis from radiation necrosis on test sets with an AUC of 0.94 ± 0.07 using a dataset composed only of radiation-treated lesions, and with an AUC of 0.93 ± 0.02 when the classifier was trained on untreated metastases. A feature selection process was necessary to achieve the reported results. As a reference, although different datasets and methods were used in each study, results reported with other techniques are: perfusion MRI (AUC = 0.80),33 MR spectroscopy (AUC = 0.92),34 and PET (AUC = 0.78).35 These advanced MRI and functional imaging techniques, commonly proposed for discrimination of brain metastasis and radiation necrosis,7,36 are costly and/or time-consuming (especially MR spectroscopy) and not widely available (as in the case of PET imaging), which precludes their application in clinical routine. Standard measures from conventional MRI do not reliably discriminate between tumor progression and radiation necrosis after treatment with SRS.8 The usefulness of other parameters, such as texture features, has not been incorporated into clinical practice.36 In the study by Tiwari et al,37 texture features were scored for distinguishing brain metastasis from radiation necrosis; however, they aimed to find the most important features and did not thoroughly analyze classification accuracy. In line with our practice, they identified contrast-enhanced T1-weighted images as the most important sequence for discrimination of metastasis and radiation necrosis, with higher accuracy (AUC = 0.71) than T2-weighted and FLAIR MRI. The texture features they found relevant for discrimination were not analyzed in our work, as they were not available in the MaZda software.

The implemented mSVM-RFE feature selection algorithm 29 enhanced classification accuracy when the best features of an optimal subset were taken together.28 Most of the selected features are associated with entropy and variance, which are mainly related to image heterogeneity and pixel disorganization.20 However, a direct relation to imaging findings that could aid visual discrimination is not usually possible.37 Our study was limited to six texture analysis methods; including more could further improve classification performance. Features derived from 3D texture analysis should also be examined, given their reported improvement over 2D approaches.38 We extracted texture features from MR images acquired with the same scanner and acquisition parameters; nevertheless, good reproducibility may be expected in multicenter studies irrespective of the scanner and acquisition parameters.39 The main limitation of our work was the limited number of cases with histological confirmation of the diagnosis, a weakness shared by these studies,2,7 as standard clinical practice in the follow-up of brain metastasis usually does not require pathological examination.5 Tumor recurrence and radiation necrosis commonly coexist within a lesion, making histological diagnosis difficult.40 In ongoing work, a third class for mixed lesions could be included, although this is hampered by the known limitations of pathological analysis.

In conclusion, our results show that texture analysis and SVM classification based on contrast-enhanced T1-weighted MRI allow differentiation of brain metastasis and radiation necrosis with high accuracy when the proper features are selected; however, the developed model should be validated with more histologically proven cases. Patients showing radiation necrosis should be properly treated and refrained from further radiotherapy; conversely, patients with tumor progression could benefit from more aggressive treatments.
ACKNOWLEDGMENT
Contract grant sponsor: Spanish Ministerio de Educación, Cultura y Deporte (MECD); Contract grant number: FPU12/01140; Contract grant sponsor: Spanish Ministerio de Economía y Competitividad (MINECO); Contract grant sponsor: FEDER funds; Contract grant number: TEC2012-33778. A.L. was funded by the Spanish Ministerio de Educación, Cultura y Deporte (MECD).
REFERENCES
1. Bauer S, Wiest R, Nolte LP, Reyes M. A survey of MRI-based medical image analysis for brain tumor studies. Phys Med Biol 2013;58:R97–R129. doi: 10.1088/0031-9155/58/13/R97.
2. Chao ST, Ahluwalia MS, Barnett GH, et al. Challenges with the diagnosis and treatment of cerebral radiation necrosis. Int J Radiat Oncol Biol Phys 2013;87:449–457. doi: 10.1016/j.ijrobp.2013.05.015.
3. Selek U, Lo SS, Chang EL. Radiation therapy for brain metastasis. In: Lu JJ, Brady LW, editors. Decision making in radiation oncology. Medical radiology. Berlin, Heidelberg: Springer Berlin Heidelberg; 2011. p 3–23.
4. Patronas NJ. Brain metastasis. In: Drevelegas A, editor. Imaging of brain tumors with histological correlations. Berlin, Heidelberg: Springer Berlin Heidelberg; 2011. p 373–400.
5. Patel TR, McHugh BJ, Bi WL, Minja FJ, Knisely JPS, Chiang VL. A comprehensive review of MR imaging changes following radiosurgery to 500 brain metastases. AJNR Am J Neuroradiol 2011;32:1885–1892. doi: 10.3174/ajnr.A2668.
6. Jain R, Narang J, Sundgren PM, et al. Treatment induced necrosis versus recurrent/progressing brain tumor: going beyond the boundaries of conventional morphologic imaging. J Neurooncol 2010;100:17–29. doi: 10.1007/s11060-010-0139-3.
7. Kickingereder P, Dorn F, Blau T, et al. Differentiation of local tumor recurrence from radiation-induced changes after stereotactic radiosurgery for treatment of brain metastasis: case report and review of the literature. Radiat Oncol 2013;8:52. doi: 10.1186/1748-717X-8-52.
8. Stockham AL, Tievsky AL, Koyfman SA, et al. Conventional MRI does not reliably distinguish radiation necrosis from tumor recurrence after stereotactic radiosurgery. J Neurooncol 2012;109:149–158. doi: 10.1007/s11060-012-0881-9.
9. Davnall F, Yip CSP, Ljungqvist G, et al. Assessment of tumor heterogeneity: an emerging imaging tool for clinical practice? Insights Imaging 2012;3:573–589. doi: 10.1007/s13244-012-0196-6.
10. Doan NT, Van Den Bogaard SJA, Dumas EM, et al. Texture analysis of ultrahigh field T2*-weighted MR images of the brain: application to Huntington's disease. J Magn Reson Imaging 2014;39:633–640. doi: 10.1002/jmri.24199.
11. House MJ, Bangma SJ, Thomas M, et al. Texture-based classification of liver fibrosis using MRI. J Magn Reson Imaging 2015;41:322–328. doi: 10.1002/jmri.24536.
12. Bahl G, Cruite I, Wolfson T, et al. Noninvasive classification of hepatic fibrosis based on texture parameters from double contrast-enhanced magnetic resonance images. J Magn Reson Imaging 2012;36:1154–1161. doi: 10.1002/jmri.23759.
13. Ahmed A, Gibbs P, Pickles M, Turnbull L. Texture analysis in assessment and prediction of chemotherapy response in breast cancer. J Magn Reson Imaging 2013;38:89–101. doi: 10.1002/jmri.23971.
14. Juntu J, Sijbers J, De Backer S, Rajan J, Van Dyck D. Machine learning study of several classifiers trained with texture analysis features to differentiate benign from malignant soft-tissue tumors in T1-MRI images. J Magn Reson Imaging 2010;31:680–689. doi: 10.1002/jmri.22095.
15. Zacharaki EI, Wang S, Chawla S, et al. Classification of brain tumor type and grade using MRI texture and shape in a machine learning scheme. Magn Reson Med 2009;62:1609–1618. doi: 10.1002/mrm.22147.
16. Kassner A, Thornhill RE. Texture analysis: a review of neurologic MR imaging applications. AJNR Am J Neuroradiol 2010;31:809–816. doi: 10.3174/ajnr.A2061.
17. Materka A. Texture analysis methodologies for magnetic resonance imaging. Dialogues Clin Neurosci 2004;6:243–250.
18. Wang S, Summers RM. Machine learning and radiology. Med Image Anal 2012;16:933–951. doi: 10.1016/j.media.2012.02.005.
19. Szczypiński PM, Strzelecki M, Materka A, Klepaczko A. MaZda–a software package for image texture analysis. Comput Methods Programs Biomed 2009;94:66–76. doi: 10.1016/j.cmpb.2008.08.005.
20. Castellano G, Bonilha L, Li LM, Cendes F. Texture analysis of medical images. Clin Radiol 2004;59:1061–1069. doi: 10.1016/j.crad.2004.07.008.
21. Haralick R, Shanmugam K, Dinstein I. Textural features for image classification. IEEE Trans Syst Man Cybern 1973;SMC-3:610–621.
22. Galloway MM. Texture analysis using gray level run lengths. Comput Graph Image Process 1975;4:172–179. doi: 10.1016/S0146-664X(75)80008-6.
23. Kashyap R, Chellappa R. Estimation and choice of neighbors in spatial-interaction models of images. IEEE Trans Inf Theory 1983;29:60–72. doi: 10.1109/TIT.1983.1056685.
24. Materka A. MaZda user's manual. MaZda software website. www.eletel.p.lodz.pl/mazda/download/mazda_manual.pdf. Accessed January 10, 2015.
25. Hastie T, Tibshirani R, Friedman J. The elements of statistical learning: data mining, inference, and prediction, 2nd edition. New York: Springer; 2009. 739 p.
26. Chu C, Hsu A, Chou KH, Bandettini P, Lin C, for the Alzheimer's Disease Neuroimaging Initiative. Does feature selection improve classification accuracy? Impact of sample size and feature selection on classification using anatomical magnetic resonance images. Neuroimage 2012;60:59–70. doi: 10.1016/j.neuroimage.2011.11.066.
27. Guyon I, Elisseeff A. An introduction to variable and feature selection. J Mach Learn Res 2003;3:1157–1182.
28. Guyon I, Weston J, Barnhill S, Vapnik V. Gene selection for cancer classification using support vector machines. Mach Learn 2002;46:389–422. doi: 10.1023/A:1012487302797.
29. Duan K-B, Rajapakse JC, Wang H, Azuaje F. Multiple SVM-RFE for gene selection in cancer classification with expression data. IEEE Trans Nanobioscience 2005;4:228–234.
30. Cortes C, Vapnik V. Support-vector networks. Mach Learn 1995;20:273–297. doi: 10.1023/A:1022627411411.
31. Platt J. Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods. In: Smola J, Barlett P, Schölkopf B, Schuurmans D, editors. Advances in large margin classifiers. Cambridge, MA: MIT Press; 1999;10:61–74.
32. Kuhn M. Building predictive models in R using the caret package. J Stat Softw 2008;28:1–26.
33. Huang J, Wang A-M, Shetty A, et al. Differentiation between intra-axial metastatic tumor progression and radiation injury following fractionated radiation therapy or stereotactic radiosurgery using MR spectroscopy, perfusion MR imaging or volume progression modeling. Magn Reson Imaging 2011;29:993–1001. doi: 10.1016/j.mri.2011.04.004.
34. Elias AE, Carlos RC, Smith EA, et al. MR spectroscopy using normalized and non-normalized metabolite ratios for differentiating recurrent brain tumor from radiation injury. Acad Radiol 2011;18:1101–1108. doi: 10.1016/j.acra.2011.05.006.
35. Terakawa Y, Tsuyuguchi N, Iwai Y, et al. Diagnostic accuracy of 11C-methionine PET for differentiation of recurrent brain tumors from radiation necrosis after radiotherapy. J Nucl Med 2008;49:694–699. doi: 10.2967/jnumed.107.048082.
36. Verma N, Cowperthwaite MC, Burnett MG, Markey MK. Differentiating tumor recurrence from treatment necrosis: a review of neuro-oncologic imaging strategies. Neuro Oncol 2013;15:515–534. doi: 10.1093/neuonc/nos307.
37. Tiwari P, Prasanna P, Rogers L, et al. Texture descriptors to distinguish radiation necrosis from recurrent brain tumors on multiparametric MRI. Proc SPIE 2014;9035:90352B. doi: 10.1117/12.2043969.
38. Depeursinge A, Foncubierta-Rodriguez A, Van De Ville D, Müller H. Three-dimensional solid texture analysis in biomedical imaging: review and opportunities. Med Image Anal 2014;18:176–196. doi: 10.1016/j.media.2013.10.005.
39. Herlidou-Même S, Constans JM, Carsin B, et al. MRI texture analysis on texture test objects, normal brain and intracranial tumors. Magn Reson Imaging 2003;21:989–993. doi: 10.1016/S0730-725X(03)00212-1.
40. Shah R, Vattoth S, Jacob R. Radiation necrosis in the brain: imaging features and differentiation from tumor recurrence. Radiographics 2012;32:1343–1359. doi: 10.1148/rg.325125002.