This repository has been archived by the owner on Aug 27, 2019. It is now read-only.

Quantile regression #9

tlienart opened this issue Aug 24, 2019 · 0 comments
tlienart commented Aug 24, 2019

algos

> ADMM, MM, and CD approaches, while MM and CD are faster and ADMM slower than the IP algorithm available in quantreg. The results so far suggest that the MM algorithm is the best-suited for non-regularized (composite) quantile regression among the four methods tested, especially for data sets with large n and relatively small p. In regularized quantile regression, all methods perform similarly in terms of variable selection, but CD and ADMM show clear superiority in run time, particularly relative to the IP and MM methods when p is large. In the case of regularized composite quantile regression, CD and ADMM dis

> (...)

> Applying existing optimization algorithms to (composite) quantile regression requires a non-trivial reformulation of the problem due to the non-linearity and non-differentiability of the loss and regularization terms of the objective. The well-known quantreg package for R (Koenker, 2017) uses an interior point (IP) approach for quantile and composite quantile regression with the option of l1 (lasso) regularization for the former and no regularization options for the latter. Although advanced IP algorithms in quantreg, such as the one using prediction-correction (Mehrotra, 1992) for non-regularized quantile regression, have greatly improved upon earlier attempts using simplex methods, the time spent on matrix inversion in IP approaches (Chen and Wei, 2005) motivates us to seek faster algorithms for quantile and composite quantile regression, particularly for high-dimensional data where regularization is required. In addition, following the conjectures of Fan and Li (2001), Zou (2006) showed lasso variable selection—currently the most commonly-implemented penalty for quantile regression—to be inconsistent in certain situations and presented adaptive lasso regularization as a solution. Our work in the present paper is thus motivated by both a search for faster quantile regression algorithms as well as the lack of publicly-available methods for adaptive-lasso regularized quantile and composite quantile regression, particularly for high-dimensional data.

from https://arxiv.org/pdf/1709.04126.pdf (the cqreg package in R)
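For reference, the non-regularized problem the paper reformulates is the minimization of the check loss `rho_tau(r) = r * (tau - 1{r < 0})` over the coefficients. The classic IP/simplex route works because this is equivalent to a linear program. Below is a minimal LP sketch (my own illustration, not code from cqreg or quantreg): split each residual into positive and negative parts `u, v >= 0` with `X @ beta + u - v = y` and minimize `tau * sum(u) + (1 - tau) * sum(v)`.

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression_lp(X, y, tau=0.5):
    """Quantile regression as a linear program.

    min  tau * sum(u) + (1 - tau) * sum(v)
    s.t. X @ beta + u - v = y,  u >= 0, v >= 0, beta free.
    """
    n, p = X.shape
    # Decision vector: [beta (p, free), u (n, >= 0), v (n, >= 0)]
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

# Toy check: on noiseless data the median (tau = 0.5) fit recovers the
# generating coefficients exactly, since a zero-loss interpolant exists.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true
beta_hat = quantile_regression_lp(X, y, tau=0.5)
```

This is the formulation IP solvers exploit; the point of the paper is that the per-iteration matrix inversions inside IP make MM/CD/ADMM attractive alternatives at scale.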

refs

- Hunter & Lange, MM algorithm for quantile regression: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.206.1351&rep=rep1&type=pdf
