Classical support vector regression (C-SVR) is a powerful function approximation method that is robust against noise and generalizes well, since it is formulated through a regularized error function employing the ε-insensitivity property. To exploit the kernel trick, C-SVR is generally trained by solving the Lagrangian dual problem. In this paper, an efficient sequential minimal optimization (SMO) algorithm is proposed with a novel, easy-to-compute working set selection (WSS) rule based on minimizing an upper bound on the difference between consecutive loss function values. The algorithm solves a convex non-smooth dual optimization problem obtained by reformulating the dual problem of C-SVR with the ℓ2 error loss function, which is equivalent to the ε-insensitive version of LSSVR. The asymptotic convergence of the proposed SMO algorithm to the optimum is also proved. This SMO algorithm for the non-smooth problem subsumes the SMO algorithms for both LSSVR and C-SVR; indeed, it reduces to the SMO algorithm with second-order WSS for LSSVR when ε = 0. The proposed algorithm has the advantage of handling half as many optimization variables as C-SVR, which results in fewer kernel matrix evaluations than the standard SMO algorithm developed for C-SVR and increases the probability that matrix entries have already been precomputed and cached. Therefore, the proposed SMO algorithm achieves better training times than the standard SMO algorithm for C-SVR, especially when caching is used. Moreover, the superiority of the proposed WSS over its first-order counterpart for solving the non-smooth optimization problem is demonstrated.
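To make the SMO idea referenced above concrete, the following is a minimal, hedged sketch of a generic two-variable SMO loop with first-order (maximal-violating-pair) working set selection, applied to an equality-constrained quadratic dual of the form min ½αᵀQα − pᵀα subject to Σαᵢ = 0. It illustrates only the basic SMO mechanics (analytic two-variable update, O(n) gradient maintenance); the paper's specific non-smooth dual, its ε-insensitive loss term, and its proposed second-order-style WSS rule are not reproduced here, and the function name `smo_qp` and the RBF test problem are illustrative assumptions.

```python
# Illustrative two-variable SMO loop for min 0.5*a'Qa - p'a  s.t. sum(a) = 0.
# NOT the paper's algorithm: a generic smooth QP with first-order WSS,
# shown only to clarify how SMO updates pairs of dual variables.
import numpy as np

def smo_qp(Q, p, max_iter=10000, tol=1e-8):
    """SMO with maximal-violating-pair (first-order) working set selection."""
    n = len(p)
    a = np.zeros(n)
    g = Q @ a - p                 # gradient of the objective at a
    for _ in range(max_iter):
        i = int(np.argmax(g))     # first-order WSS: most violating pair
        j = int(np.argmin(g))
        if g[i] - g[j] < tol:     # KKT: gradient components equal at optimum
            break
        eta = Q[i, i] + Q[j, j] - 2.0 * Q[i, j]   # curvature along e_i - e_j
        delta = -(g[i] - g[j]) / eta              # analytic 1-D minimizer
        a[i] += delta             # opposite-signed steps keep sum(a) = 0
        a[j] -= delta
        g += delta * (Q[:, i] - Q[:, j])          # O(n) gradient update
    return a, g

# Tiny demo: RBF Gram matrix plus a ridge term, SPD by construction.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Q = np.exp(-D2) + 0.1 * np.eye(30)
p = rng.normal(size=30)
a, g = smo_qp(Q, p)
```

Note how each iteration touches only two columns of Q; this is the property the abstract's caching argument builds on, since halving the number of dual variables halves the set of kernel columns that ever need to be evaluated.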