The twin support vector machine (TWSVM) exhibits faster training with better classification ability than the standard SVM. However, it suffers from the following drawbacks: (i) the objective functions of TWSVM comprise only empirical risk, and may therefore overfit and yield suboptimal solutions in some cases; (ii) two convex quadratic programming problems (QPPs) need to be solved, which is relatively complex to implement. To address these problems, we propose two smoothing approaches for an implicit Lagrangian TWSVM classifier by formulating a pair of unconstrained minimization problems in dual variables whose solutions are obtained by solving two systems of linear equations rather than the two QPPs of TWSVM. Our proposed formulation adds a regularization term to each objective function, realizing the idea of margin maximization. In addition, this term makes the model well-posed by ensuring invertibility in the dual formulation. Moreover, the structural risk minimization principle is implemented in our formulation, which embodies the essence of statistical learning theory. Experimental results on several benchmark datasets show better performance of the proposed approach over existing approaches in terms of estimation accuracy with less training time.
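To make the computational claim concrete, the sketch below shows how a twin-SVM-style classifier can be fit by solving two regularized linear systems instead of two QPPs. This is a minimal illustration in the spirit of least-squares twin SVM, not the paper's exact implicit Lagrangian formulation; the function names, the hyperparameters `c1`, `c2`, and the ridge term `eps` are illustrative assumptions. Note how `eps * I` keeps each system matrix invertible, mirroring the well-posedness role of the regularization term described above.

```python
import numpy as np

def fit_twin_planes(A, B, c1=1.0, c2=1.0, eps=1e-4):
    """Fit two non-parallel hyperplanes w1.x + b1 = 0 and w2.x + b2 = 0.

    A: samples of class +1; B: samples of class -1.
    Each plane is close to its own class and far from the other.
    The ridge term eps * I guarantees the system matrix is invertible.
    """
    m1, m2 = A.shape[0], B.shape[0]
    E = np.hstack([A, np.ones((m1, 1))])   # augmented matrix [A  e]
    F = np.hstack([B, np.ones((m2, 1))])   # augmented matrix [B  e]
    I = np.eye(E.shape[1])
    # Plane 1: minimize (1/2)||E z||^2 + (c1/2)||F z + e||^2  =>  linear system
    z1 = -np.linalg.solve((1.0 / c1) * E.T @ E + F.T @ F + eps * I,
                          F.T @ np.ones(m2))
    # Plane 2: minimize (1/2)||F z||^2 + (c2/2)||E z - e||^2  =>  linear system
    z2 = np.linalg.solve((1.0 / c2) * F.T @ F + E.T @ E + eps * I,
                         E.T @ np.ones(m1))
    return (z1[:-1], z1[-1]), (z2[:-1], z2[-1])

def predict(X, planes):
    """Assign each point to the class whose plane lies nearer."""
    (w1, b1), (w2, b2) = planes
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

Each fit reduces to one call to `np.linalg.solve` on an (n+1)-by-(n+1) system, which is typically much cheaper than iterating a QP solver over all training samples.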