
Industrial laboratory. Diagnostics of materials


Modified gradient descent algorithm along nodal straight lines in regression analysis problem

https://doi.org/10.26896/1028-6861-2025-91-3-83-92

Abstract

The paper addresses the construction of computationally efficient algorithms implementing the least absolute deviation (LAD) method for estimating regression dependences. The purpose of the work is to improve the performance of gradient descent along nodal straight lines by taking into account the geometry of the objective function near the minimum, and to compare the approach with the gradient projection algorithm. A modified gradient descent algorithm for LAD regression estimation is proposed. Efficiency is achieved by avoiding evaluation of the objective function at the minima of the nodal lines and by determining an improved initial approximation on a part of the sample. As a result, the dependence of computation time on the sample size is reduced and the application area of the least absolute deviation method is expanded. The gradient projection algorithm for constructing linear regression dependences does not guarantee an exact solution and is significantly inferior in performance to the algorithms of descent along nodal lines.
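The nodal-line approach rests on a classical property of LAD estimation: for a simple linear regression, some optimal line passes through at least two sample points, so only lines through pairs of observations ("nodal" lines) need to be considered. The following sketch illustrates this property with a naive exhaustive search; it is not the authors' descent algorithm, whose whole point is to avoid this O(n³) enumeration, and the function name `lad_fit_naive` is illustrative only.

```python
import itertools


def lad_fit_naive(x, y):
    """Exact LAD fit for simple linear regression y ~ a + b*x.

    Uses the nodal property: an LAD-optimal line passes through at
    least two sample points.  Every pair of points is tried, which is
    purely illustrative -- descent along nodal lines reaches the same
    optimum without the exhaustive search.
    """
    best = None  # (sum of absolute deviations, intercept, slope)
    for (xi, yi), (xj, yj) in itertools.combinations(zip(x, y), 2):
        if xi == xj:
            continue  # vertical line: not representable as y = a + b*x
        b = (yj - yi) / (xj - xi)
        a = yi - b * xi
        sad = sum(abs(yk - (a + b * xk)) for xk, yk in zip(x, y))
        if best is None or sad < best[0]:
            best = (sad, a, b)
    return best[1], best[2]  # intercept, slope
```

Unlike least squares, the LAD criterion is robust to outliers: a single aberrant observation pulls the fitted line far less, since its residual enters the objective linearly rather than quadratically.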

About the Authors

O. A. Golovanov
Institute of Economics, Ural Branch of RAS
Russian Federation

Oleg A. Golovanov

29, Moskovskaya ul., Yekaterinburg, 620014



A. N. Tyrsin
Institute of Economics, Ural Branch of RAS; Ural Federal University named after the First President of Russia B. N. Yeltsin
Russian Federation

Alexander N. Tyrsin

29, Moskovskaya ul., Yekaterinburg, 620014; 19, ul. Mira, Yekaterinburg, 620002



References

1. Probability and mathematical statistics: Encyclopedia / Yu. V. Prokhorov, Chief editor. — Moscow: Bol’shaya Rossiiskaya Éntsiklopediya, 1999. — 910 p. [in Russian].

2. Akimov P. A., Matasov A. I. An iterative algorithm for l1-norm approximation in dynamic estimation problems / Automation and Remote Control. 2015. Vol. 76. No. 5. P. 733 – 748. DOI: 10.1134/S000511791505001X

3. Darlington R. B., Hayes A. F. Regression Analysis and Linear Models. Concepts, Applications, and Implementation. — New York – London: The Guilford Press, 2017. — 661 p.

4. Christensen R. Analysis of Variance, Design, and Regression: Linear Modeling for Unbalanced Data. 2nd ed. — CRC Press, 2016. — 603 p.

5. Mudrov V. I., Kushko V. L. Measurement processing methods. Quasi-plausible estimates. — Moscow: Radio i svyaz’, 1983. — 304 p. [in Russian].

6. Orlov A. I. Diversity of the models for regression analysis (generalizing article) / Industr. Lab. Mater. Diagn. 2018. Vol. 84. No. 5. P. 63 – 73 [in Russian]. DOI: 10.26896/1028-6861-2018-84-5-63-73

7. Nelyubin A. P., Podinovski V. V. Approximation of functions defined in tabular form: multicriteria approach / Computational Mathematics and Mathematical Physics. 2023. Vol. 63. No. 5. P. 730 – 742. DOI: 10.1134/S0965542523050147

8. Sheynin O. B. R. J. Boscovich’s work on probability / Archive for History of Exact Sciences. 1973. Vol. 9. Nos. 4 – 5. P. 306 – 324. DOI: 10.1007/BF00348366

9. Laplace P. S. Sur quelques points du système du monde / Mémoires de l’Académie royale des sciences de Paris, 1789. — Paris: Gauthier-Villars, 1895.

10. Stigler S. M. Studies in the History of Probability and Statistics. XXXII: Laplace, Fisher and the Discovery of the Concept of Sufficiency / Biometrika. 1973. Vol. 60. No. 3. P. 439 – 445. DOI: 10.1093/biomet/60.3.439

11. Bassett G., Koenker R. Asymptotic Theory of Least Absolute Error Regression / J. Am. Statist. Assoc. 1978. Vol. 73. No. 363. P. 618 – 622. DOI: 10.1080/01621459.1978.10480065

12. Boldin M. V., Simonova G. I., Tyurin Yu. N. Sign-based methods in linear statistical models. Translations of Mathematical Monographs. Vol. 162. — Providence, RI: American Mathematical Society, 1997. — 234 p.

13. Tyrsin A. N., Azaryan A. A. Exact evaluation of linear regression models by the least absolute deviations method based on the descent through the nodal straight lines / Vestn. YuUrGU. Ser. «Matem. Mekh. Fiz». 2018. Vol. 10. No. 2. P. 47 – 56 [in Russian]. DOI: 10.14529/mmph180205

14. Tyrsin A. N. Algorithms for descending along nodal lines in the problem of estimating regression equations by the method of least modules / Industr. Lab. Mater. Diagn. 2021. Vol. 87. No. 5. P. 68 – 75 [in Russian]. DOI: 10.26896/1028-6861-2021-87-5-68-75

15. Narula S. C., Wellington J. F. Algorithm AS108: Multiple linear regression with minimum sum of absolute errors / Appl. Statistics. 1977. Vol. 26. P. 106 – 111.

16. Armstrong R. D., Kung D. S. Algorithm AS132: Least absolute value estimates for a simple linear regression problem / Appl. Statistics. 1978. Vol. 27. P. 363 – 366.

17. Panyukov A. V., Mezal Ya. A. Parametric identification of quasilinear difference equation / Vestn. YuUrGU. Ser. «Matem. Mekh. Fiz». 2019. Vol. 11. No. 4. P. 32 – 38 [in Russian]. DOI: 10.14529/mmph190404

18. Mezaal Ya. A. Quasi-linear analysis of discrete models of nonlinear dynamics (time series): Candidate’s Thesis. — Chelyabinsk, 2020. — 134 p. [in Russian].

19. Tyrsin A. N., Golovanov O. A. Descent along nodal straight lines and simplex algorithm: two variants of regression analysis based on the least absolute deviation method / Industr. Lab. Mater. Diagn. 2024. Vol. 90. No. 5. P. 79 – 87 [in Russian]. DOI: 10.26896/1028-6861-2024-90-5-79-87

20. Rosen J. B. The gradient projection method for nonlinear programming. Part 1: Linear constraints / J. Soc. Industr. Appl. Math. 1960. Vol. 8. No. 1. P. 181 – 217. DOI: 10.1137/0108011

21. The travelling salesman problem: a guided tour of combinatorial optimization / E. L. Lawler, J. K. Lenstra, A. H. G. Rinnooy Kan, D. B. Shmoys, Eds. — J. Wiley & Sons, 1985. — 465 p.

22. Minoux M. Mathematical Programming: Theory and Algorithms. — New York: Wiley, 1986. — 489 p.

23. Barbu A., Zhu S.-C. Monte Carlo Methods. — Springer Nature Singapore Pte Ltd., 2020. — 422 p. DOI: 10.1007/978-981-13-2971-5_1

24. Tyrsin A. N. The method of selecting the best distribution law for continuous random variables on the basis of inverse mapping / Vestn. YuUrGU. Ser. «Matem. Mekh. Fiz». 2017. Vol. 9. No. 1. P. 31 – 38 [in Russian]. DOI: 10.14529/mmph170104



For citations:


Golovanov O.A., Tyrsin A.N. Modified gradient descent algorithm along nodal straight lines in regression analysis problem. Industrial laboratory. Diagnostics of materials. 2025;91(3):83-92. (In Russ.) https://doi.org/10.26896/1028-6861-2025-91-3-83-92



ISSN 1028-6861 (Print)
ISSN 2588-0187 (Online)