I'm using Rgtsvm to build SVM models on a large dataset (nrows = 2,500,000).
The problem occurs only with a particular number of columns.
For example:
- 1-10 columns with 2.5 million rows -> no problem creating the SVM model
- 11-19 columns with 2.5 million rows ->
Error: An iteration made no progress
Error in gtsvmtrain.classfication.call(y, x, param, verbose = verbose) : Error in GPU process.
- 20-132 columns with 2.5 million rows -> no problem creating the SVM model
Any ideas?
There shouldn't be a semantic error, because I attach the columns in a for loop, so the dataset always has the same layout:
| feature 1 | feature 2 | ... | target column |
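For reference, a minimal sketch of how the data is assembled and trained. The column counts and random data are placeholders, and I'm assuming Rgtsvm's e1071-style `svm()` interface; the actual feature values come from my real dataset:

```r
library(Rgtsvm)  # GPU-accelerated drop-in replacement for e1071::svm()

set.seed(1)
n_rows <- 2500000
n_feat <- 15  # any count in the failing 11-19 range reproduces the error for me

# Build the feature matrix column by column, as in my for loop
x <- matrix(0, nrow = n_rows, ncol = n_feat)
for (j in seq_len(n_feat)) {
  x[, j] <- rnorm(n_rows)  # placeholder for the real feature values
}
y <- factor(sample(c(0, 1), n_rows, replace = TRUE))  # target column

# This call is where "Error in GPU process." is raised for 11-19 columns
model <- svm(x, y, type = "C-classification", kernel = "radial")
```

With n_feat set to 10 or 20 the same code completes without error.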