Quantile regression is a flexible approach for modelling the impact of several covariates on the conditional distribution of the dependent variable, and it does not require any parametric assumption on the density of the observations. However, fitting quantile regression models using the traditional pinball loss is computationally expensive, due to the non-differentiability of this function. In addition, if this loss is used, extending quantile regression to the context of non-parametric additive models becomes difficult. In this talk we will describe how the computational burden can be reduced by approximating the pinball loss with a differentiable function. This allows us to exploit the computationally efficient approach described by [1], and implemented in the mgcv R package, to fit smooth additive quantile models. Besides this, we will show how the smoothing parameters can be selected in a robust fashion, and how reliable uncertainty estimates can be obtained, even for extreme quantiles. We will demonstrate this approach, which is implemented in an upcoming extension of mgcv, in the context of probabilistic forecasting of electricity demand.

[1] Wood, S. N., N. Pya, and B. Safken (2015). Smoothing parameter and model selection for general smooth models. http://arxiv.org/abs/1511.03864
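To illustrate the idea of smoothing the pinball loss, the sketch below contrasts the standard (non-differentiable) pinball loss with one possible differentiable approximation based on a log-sum-exp softening of the kink at zero. The specific functional form and the bandwidth parameter `lam` are illustrative assumptions, not the exact loss used in the talk or in mgcv:

```python
import numpy as np

def pinball(u, tau):
    # Standard pinball (check) loss: u * (tau - 1{u < 0}).
    # Non-differentiable at u = 0, which complicates fast fitting methods.
    return u * (tau - (u < 0))

def smooth_pinball(u, tau, lam=0.1):
    # A differentiable approximation (illustrative choice):
    # tau * u + lam * log(1 + exp(-u / lam)).
    # As lam -> 0 this converges pointwise to the pinball loss,
    # since lam * log(1 + exp(-u / lam)) -> max(-u, 0).
    # np.logaddexp(0, x) computes log(1 + exp(x)) stably.
    return tau * u + lam * np.logaddexp(0.0, -u / lam)

# For small lam, the two losses are close away from the kink:
u = np.array([-2.0, -0.5, 0.5, 2.0])
print(pinball(u, 0.7))
print(smooth_pinball(u, 0.7, lam=1e-4))
```

Because the smoothed loss has continuous derivatives, gradient-based fitting and the smoothing-parameter selection machinery of [1] can be applied directly.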