From noreply at r-forge.r-project.org Mon Nov 3 23:39:10 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Mon, 3 Nov 2014 23:39:10 +0100 (CET) Subject: [Returnanalytics-commits] r3547 - in pkg/FactorAnalytics: . R man Message-ID: <20141103223910.BE8B4183E82@r-forge.r-project.org> Author: pragnya Date: 2014-11-03 23:39:10 +0100 (Mon, 03 Nov 2014) New Revision: 3547 Modified: pkg/FactorAnalytics/DESCRIPTION pkg/FactorAnalytics/R/fitFundamentalFactorModel.R pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/man/fitFundamentalFactorModel.Rd pkg/FactorAnalytics/man/fitTsfm.Rd Log: Minor fix in fitTsfm, edits to description of fitFfm Modified: pkg/FactorAnalytics/DESCRIPTION =================================================================== --- pkg/FactorAnalytics/DESCRIPTION 2014-10-16 02:16:08 UTC (rev 3546) +++ pkg/FactorAnalytics/DESCRIPTION 2014-11-03 22:39:10 UTC (rev 3547) @@ -1,8 +1,8 @@ Package: factorAnalytics Type: Package Title: Factor Analytics -Version: 2.0.0.99 -Date: 2014-07-30 +Version: 2.0.1 +Date: 2014-11-03 Author: Eric Zivot, Yi-An Chen and Sangeetha Srinivasan Maintainer: Sangeetha Srinivasan Description: An R package for the estimation and risk analysis of linear factor Modified: pkg/FactorAnalytics/R/fitFundamentalFactorModel.R =================================================================== --- pkg/FactorAnalytics/R/fitFundamentalFactorModel.R 2014-10-16 02:16:08 UTC (rev 3546) +++ pkg/FactorAnalytics/R/fitFundamentalFactorModel.R 2014-11-03 22:39:10 UTC (rev 3547) @@ -1,27 +1,32 @@ -#' fit fundamental factor model by classic OLS or Robust regression technique +#' @title Fit a fundamental factor model using classic OLS or Robust regression #' -#' fit fundamental factor model or cross-sectional factor model by -#' classic OLS or Robust regression. Fundamental factor models use +#' @description Fit a fundamental (cross-sectional) factor model using Ordinary +#' Least Squares (OLS) or Robust regression. Fundamental factor models use #' observable asset specific characteristics (fundamentals) like industry #' classification, market capitalization, style classification (value, growth) -#' etc. to calculate the common risk factors. The function creates the class -#' "FundamentalFactorModel". +#' etc. to calculate the common risk factors. An object of class \code{"ffm"} +#' is returned. #' #' @details -#' If style factor exposure is standardized to regression-weighted mean zero, this makes -#' style factors orthogonal to the world factor (intercept term), which in turn facilitted -#' interpretation of the style factor returns. See Menchero 2010. +#' If style factor exposure is standardized to have a regression-weighted mean +#' of zero, style factors become orthogonal to the world factor (intercept +#' term), which in turn facilitates the interpretation of the style factor +#' returns. See Menchero (2010). #' -#' The original function was designed by Doug Martin and originally implemented -#' in S-PLUS by a number of UW Ph.D. students: Christopher Green, Eric Aldrich, -#' and Yindeng Jiang. Guy Yullen re-implemented the function in R. Yi-An Chen from -#' University of Washington re-writes the codes and finalizes the function. +#' The original function was designed by Doug Martin and initially implemented +#' in S-PLUS by a number of University of Washington Ph.D. students: +#' Christopher Green, Eric Aldrich, and Yindeng Jiang. Guy Yollin +#' re-implemented the function in R. 
Yi-An Chen and Sangeetha Srinivasan +#' (UW PhD students; as part of Google Summer of Code 2013 & 2014 respectively) +#' further updated the code. #' #' -#' @param data data.frame, data must have \emph{assetvar}, \emph{returnvar}, \emph{datevar} -#' , and exposure.names. Generally, data has to look like panel data. It needs firm variabales -#' and time variables. Data has to be a balanced panel. -#' @param exposure.names a character vector of exposure names for the factor model +#' @param data data.frame, data must have \emph{assetvar}, \emph{returnvar}, +#' \emph{datevar}, and exposure.names. Generally, data has to look like panel +#' data. It needs firm variabales and time variables. Data has to be a balanced +#' panel. +#' @param exposure.names a character vector of exposure names for the factor +#' model #' @param wls logical flag, TRUE for weighted least squares, FALSE for ordinary #' least squares #' @param regression A character string, "robust" for regression via lmRob, @@ -39,47 +44,49 @@ #' the data. #' @param assetvar A character string gives the name of the asset variable in #' the data. -#' @param standardized.factor.exposure logical flag. Factor exposure will be standardized -#' to regression weighted mean 0 and standardized deviation to 1 if \code{TRUE}. -#' Default is \code{FALSE}. See Detail. -#' @param weight.var A character strping gives the name of the weight used for standarizing factor exposures. -#' @return an S3 object containing -#' \itemize{ -#' \item returns.cov A "list" object contains covariance information for -#' asset returns, includes covariance, mean and eigenvalus. Beta of taken as latest -#' date input. -#' \item factor.cov An object of class "cov" or "covRob" which -#' contains the covariance matrix of the factor returns (including intercept). -#' \item resids.cov An object of class "cov" or "covRob" which contains +#' @param standardized.factor.exposure logical flag. Factor exposure will be +#' standardized to regression weighted mean 0 and standardized deviation to 1 +#' if \code{TRUE}. Default is \code{FALSE}. See Details. +#' @param weight.var A character strping gives the name of the weight used for +#' standarizing factor exposures. +#' +#' @return An object of class \code{"ffm"} is a list containing the following +#' components: +#' \item{returns.cov}{A "list" object contains covariance information for +#' asset returns, includes covariance, mean and eigenvalus. Beta of taken as +#' latest date input.} +#' \item{factor.cov}{An object of class "cov" or "covRob" which contains the +#' covariance matrix of the factor returns (including intercept).} +#' \item{resids.cov}{An object of class "cov" or "covRob" which contains #' the covariance matrix of the residuals, if "full.resid.cov" is TRUE. NULL -#' if "full.resid.cov" is FALSE. -#' \item returns.corr Correlation matrix of assets returns. -#' \item factor.corr An object of class "cov" or "covRob" which -#' contains the correlation matrix of the factor returns (including intercept). -#' \item resids.corr Correlation matrix of returns returns. 
-#' \item resid.variance A vector of variances estimated from the OLS +#' if "full.resid.cov" is FALSE.} +#' \item{returns.corr}{Correlation matrix of assets returns.} +#' \item{factor.corr}{An object of class "cov" or "covRob" which contains the +#' correlation matrix of the factor returns (including intercept).} +#' \item{resids.corr}{Correlation matrix of returns returns.} +#' \item{resid.variance}{A vector of variances estimated from the OLS #' residuals for each asset. If "wls" is TRUE, these are the weights used in #' the weighted least squares regressions. If "cov = robust" these values are -#' computed with "scale.tau". Otherwise they are computed with "var". -#' \item factor.returns A "xts" object containing the times series of -#' estimated factor returns and intercepts. -#' \item residuals A "xts" object containing the time series of residuals -#' for each asset. -#' \item tstats A "xts" object containing the time series of t-statistics -#' for each exposure. -#' \item call function call -#' \item exposure.names A character string giving the name of the exposure variable in -#' the data. -#' } -#' @author Guy Yullen and Yi-An Chen +#' computed with "scale.tau". Otherwise they are computed with "var".} +#' \item{factor.returns}{A "xts" object containing the times series of +#' estimated factor returns and intercepts.} +#' \item{residuals}{A "xts" object containing the time series of residuals for +#' each asset.} +#' \item{tstats}{A "xts" object containing the time series of t-statistics +#' for each exposure.} +#' \item{call}{function call} +#' \item{exposure.names}{A character string giving the name of the exposure +#' variable in the data.} +#' +#' @author Guy Yollin, Yi-An Chen and Sangeetha Srinivasan +#' #' @references -#' \itemize{ -#' \item "The Characteristics of Factor Portfolios", Fall 2010, MENCHERO Jose, -#' Journal of Performance Measurement. -#' \item Grinold,R and Kahn R, \emph{Active Portfolio Management}. -#' } +#' Menchero, J. (2010). The Characteristics of Factor Portfolios. Journal of +#' Performance Measurement, 15(1), 52-62. #' -#' @export +#' Grinold, R. C., & Kahn, R. N. (2000). Active portfolio management (Second +#' Ed.). New York: McGraw-Hill. +#' #' @examples #' #' # BARRA type factor model @@ -125,20 +132,16 @@ #' test.fit2$tstats #' test.fit2$call #' -#' -#' -#' +#' @export - - -fitFundamentalFactorModel <- - function(data,exposure.names, datevar, returnsvar, assetvar, - wls = TRUE, regression = "classic", - covariance = "classic", full.resid.cov = FALSE, robust.scale = FALSE, - standardized.factor.exposure = FALSE, weight.var) { +fitFundamentalFactorModel <- function(data, exposure.names, datevar, + returnsvar, assetvar, wls=TRUE, + regression="classic", + covariance="classic", + full.resid.cov=FALSE, robust.scale=FALSE, + standardized.factor.exposure=FALSE, + weight.var) { - - assets = unique(data[[assetvar]]) timedates = as.Date(unique(data[[datevar]])) data[[datevar]] <- as.Date(data[[datevar]]) Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-10-16 02:16:08 UTC (rev 3546) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-03 22:39:10 UTC (rev 3547) @@ -5,7 +5,7 @@ #' Users can choose between ordinary least squares-OLS, discounted least #' squares-DLS (or) robust regression. Several variable selection options #' including Stepwise, Subsets, Lars are available as well. An object of class -#' \code{tsfm} is returned. +#' \code{"tsfm"} is returned. 
#' #' @details #' Typically, factor models are fit using excess returns. \code{rf.name} gives @@ -76,7 +76,7 @@ #' \code{\link{fitTsfm.control}} for details. #' @param ... arguments passed to \code{\link{fitTsfm.control}} #' -#' @return fitTsfm returns an object of class \code{tsfm} for which +#' @return fitTsfm returns an object of class \code{"tsfm"} for which #' \code{print}, \code{plot}, \code{predict} and \code{summary} methods exist. #' #' The generic accessor functions \code{coef}, \code{fitted} and @@ -84,7 +84,7 @@ #' Additionally, \code{fmCov} computes the covariance matrix for asset returns #' based on the fitted factor model #' -#' An object of class \code{tsfm} is a list containing the following +#' An object of class \code{"tsfm"} is a list containing the following #' components: #' \item{asset.fit}{list of fitted objects for each asset. Each object is of #' class \code{lm} if \code{fit.method="OLS" or "DLS"}, class \code{lmRob} if @@ -186,6 +186,9 @@ stop("Invalid argument: variable.selection must be either 'none', 'stepwise','subsets' or 'lars'") } + if (missing(factor.names) && !is.null(mkt.name)) { + factor.names <- NULL + } # extract arguments to pass to different fit and variable selection functions decay <- control$decay Modified: pkg/FactorAnalytics/man/fitFundamentalFactorModel.Rd =================================================================== --- pkg/FactorAnalytics/man/fitFundamentalFactorModel.Rd 2014-10-16 02:16:08 UTC (rev 3546) +++ pkg/FactorAnalytics/man/fitFundamentalFactorModel.Rd 2014-11-03 22:39:10 UTC (rev 3547) @@ -1,7 +1,7 @@ % Generated by roxygen2 (4.0.1): do not edit by hand \name{fitFundamentalFactorModel} \alias{fitFundamentalFactorModel} -\title{fit fundamental factor model by classic OLS or Robust regression technique} +\title{Fit a fundamental factor model using classic OLS or Robust regression} \usage{ fitFundamentalFactorModel(data, exposure.names, datevar, returnsvar, assetvar, wls = TRUE, regression = "classic", covariance = "classic", @@ -9,11 +9,13 @@ standardized.factor.exposure = FALSE, weight.var) } \arguments{ -\item{data}{data.frame, data must have \emph{assetvar}, \emph{returnvar}, \emph{datevar} -, and exposure.names. Generally, data has to look like panel data. It needs firm variabales -and time variables. Data has to be a balanced panel.} +\item{data}{data.frame, data must have \emph{assetvar}, \emph{returnvar}, +\emph{datevar}, and exposure.names. Generally, data has to look like panel +data. It needs firm variabales and time variables. Data has to be a balanced +panel.} -\item{exposure.names}{a character vector of exposure names for the factor model} +\item{exposure.names}{a character vector of exposure names for the factor +model} \item{wls}{logical flag, TRUE for weighted least squares, FALSE for ordinary least squares} @@ -40,59 +42,62 @@ \item{assetvar}{A character string gives the name of the asset variable in the data.} -\item{standardized.factor.exposure}{logical flag. Factor exposure will be standardized -to regression weighted mean 0 and standardized deviation to 1 if \code{TRUE}. -Default is \code{FALSE}. See Detail.} +\item{standardized.factor.exposure}{logical flag. Factor exposure will be +standardized to regression weighted mean 0 and standardized deviation to 1 +if \code{TRUE}. Default is \code{FALSE}. 
See Details.} -\item{weight.var}{A character strping gives the name of the weight used for standarizing factor exposures.} +\item{weight.var}{A character strping gives the name of the weight used for +standarizing factor exposures.} } \value{ -an S3 object containing -\itemize{ -\item returns.cov A "list" object contains covariance information for -asset returns, includes covariance, mean and eigenvalus. Beta of taken as latest -date input. -\item factor.cov An object of class "cov" or "covRob" which -contains the covariance matrix of the factor returns (including intercept). -\item resids.cov An object of class "cov" or "covRob" which contains +An object of class \code{"ffm"} is a list containing the following +components: +\item{returns.cov}{A "list" object contains covariance information for +asset returns, includes covariance, mean and eigenvalus. Beta of taken as +latest date input.} +\item{factor.cov}{An object of class "cov" or "covRob" which contains the +covariance matrix of the factor returns (including intercept).} +\item{resids.cov}{An object of class "cov" or "covRob" which contains the covariance matrix of the residuals, if "full.resid.cov" is TRUE. NULL -if "full.resid.cov" is FALSE. -\item returns.corr Correlation matrix of assets returns. -\item factor.corr An object of class "cov" or "covRob" which -contains the correlation matrix of the factor returns (including intercept). -\item resids.corr Correlation matrix of returns returns. -\item resid.variance A vector of variances estimated from the OLS +if "full.resid.cov" is FALSE.} +\item{returns.corr}{Correlation matrix of assets returns.} +\item{factor.corr}{An object of class "cov" or "covRob" which contains the +correlation matrix of the factor returns (including intercept).} +\item{resids.corr}{Correlation matrix of returns returns.} +\item{resid.variance}{A vector of variances estimated from the OLS residuals for each asset. If "wls" is TRUE, these are the weights used in the weighted least squares regressions. If "cov = robust" these values are -computed with "scale.tau". Otherwise they are computed with "var". -\item factor.returns A "xts" object containing the times series of -estimated factor returns and intercepts. -\item residuals A "xts" object containing the time series of residuals -for each asset. -\item tstats A "xts" object containing the time series of t-statistics -for each exposure. -\item call function call -\item exposure.names A character string giving the name of the exposure variable in -the data. +computed with "scale.tau". Otherwise they are computed with "var".} +\item{factor.returns}{A "xts" object containing the times series of +estimated factor returns and intercepts.} +\item{residuals}{A "xts" object containing the time series of residuals for +each asset.} +\item{tstats}{A "xts" object containing the time series of t-statistics +for each exposure.} +\item{call}{function call} +\item{exposure.names}{A character string giving the name of the exposure +variable in the data.} } -} \description{ -fit fundamental factor model or cross-sectional factor model by -classic OLS or Robust regression. Fundamental factor models use +Fit a fundamental (cross-sectional) factor model using Ordinary +Least Squares (OLS) or Robust regression. Fundamental factor models use observable asset specific characteristics (fundamentals) like industry classification, market capitalization, style classification (value, growth) -etc. to calculate the common risk factors. 
The function creates the class -"FundamentalFactorModel". +etc. to calculate the common risk factors. An object of class \code{"ffm"} +is returned. } \details{ -If style factor exposure is standardized to regression-weighted mean zero, this makes -style factors orthogonal to the world factor (intercept term), which in turn facilitted -interpretation of the style factor returns. See Menchero 2010. +If style factor exposure is standardized to have a regression-weighted mean +of zero, style factors become orthogonal to the world factor (intercept +term), which in turn facilitates the interpretation of the style factor +returns. See Menchero (2010). -The original function was designed by Doug Martin and originally implemented -in S-PLUS by a number of UW Ph.D. students: Christopher Green, Eric Aldrich, -and Yindeng Jiang. Guy Yullen re-implemented the function in R. Yi-An Chen from -University of Washington re-writes the codes and finalizes the function. +The original function was designed by Doug Martin and initially implemented +in S-PLUS by a number of University of Washington Ph.D. students: +Christopher Green, Eric Aldrich, and Yindeng Jiang. Guy Yollin +re-implemented the function in R. Yi-An Chen and Sangeetha Srinivasan +(UW PhD students; as part of Google Summer of Code 2013 & 2014 respectively) +further updated the code. } \examples{ # BARRA type factor model @@ -139,13 +144,13 @@ test.fit2$call } \author{ -Guy Yullen and Yi-An Chen +Guy Yollin, Yi-An Chen and Sangeetha Srinivasan } \references{ -\itemize{ -\item "The Characteristics of Factor Portfolios", Fall 2010, MENCHERO Jose, -Journal of Performance Measurement. -\item Grinold,R and Kahn R, \emph{Active Portfolio Management}. +Menchero, J. (2010). The Characteristics of Factor Portfolios. Journal of +Performance Measurement, 15(1), 52-62. + +Grinold, R. C., & Kahn, R. N. (2000). Active portfolio management (Second +Ed.). New York: McGraw-Hill. } -} Modified: pkg/FactorAnalytics/man/fitTsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.Rd 2014-10-16 02:16:08 UTC (rev 3546) +++ pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-03 22:39:10 UTC (rev 3547) @@ -53,7 +53,7 @@ \code{fitTsfm}} } \value{ -fitTsfm returns an object of class \code{tsfm} for which +fitTsfm returns an object of class \code{"tsfm"} for which \code{print}, \code{plot}, \code{predict} and \code{summary} methods exist. The generic accessor functions \code{coef}, \code{fitted} and @@ -61,7 +61,7 @@ Additionally, \code{fmCov} computes the covariance matrix for asset returns based on the fitted factor model -An object of class \code{tsfm} is a list containing the following +An object of class \code{"tsfm"} is a list containing the following components: \item{asset.fit}{list of fitted objects for each asset. Each object is of class \code{lm} if \code{fit.method="OLS" or "DLS"}, class \code{lmRob} if @@ -88,7 +88,7 @@ Users can choose between ordinary least squares-OLS, discounted least squares-DLS (or) robust regression. Several variable selection options including Stepwise, Subsets, Lars are available as well. An object of class -\code{tsfm} is returned. +\code{"tsfm"} is returned. } \details{ Typically, factor models are fit using excess returns. 
\code{rf.name} gives From noreply at r-forge.r-project.org Fri Nov 14 20:47:14 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 14 Nov 2014 20:47:14 +0100 (CET) Subject: [Returnanalytics-commits] r3548 - in pkg/FactorAnalytics: R man sandbox vignettes Message-ID: <20141114194714.8B0C318663C@r-forge.r-project.org> Author: pragnya Date: 2014-11-14 20:47:13 +0100 (Fri, 14 Nov 2014) New Revision: 3548 Removed: pkg/FactorAnalytics/R/factorModelEsDecomposition.R pkg/FactorAnalytics/R/factorModelMonteCarlo.R pkg/FactorAnalytics/R/factorModelSdDecomposition.R pkg/FactorAnalytics/R/factorModelVaRDecomposition.R pkg/FactorAnalytics/R/fitFundamentalFactorModel.R pkg/FactorAnalytics/R/fitStatisticalFactorModel.R pkg/FactorAnalytics/R/plot.FundamentalFactorModel.r pkg/FactorAnalytics/R/plot.StatFactorModel.r pkg/FactorAnalytics/R/predict.FundamentalFactorModel.r pkg/FactorAnalytics/R/predict.StatFactorModel.r pkg/FactorAnalytics/R/print.FundamentalFactorModel.r pkg/FactorAnalytics/R/print.StatFactorModel.r pkg/FactorAnalytics/R/summary.FundamentalFactorModel.r pkg/FactorAnalytics/R/summary.StatFactorModel.r pkg/FactorAnalytics/man/factorModelEsDecomposition.Rd pkg/FactorAnalytics/man/factorModelMonteCarlo.Rd pkg/FactorAnalytics/man/factorModelSdDecomposition.Rd pkg/FactorAnalytics/man/factorModelVaRDecomposition.Rd pkg/FactorAnalytics/man/fitFundamentalFactorModel.Rd pkg/FactorAnalytics/man/fitStatisticalFactorModel.Rd pkg/FactorAnalytics/man/plot.FundamentalFactorModel.Rd pkg/FactorAnalytics/man/plot.StatFactorModel.Rd pkg/FactorAnalytics/man/predict.FundamentalFactorModel.Rd pkg/FactorAnalytics/man/predict.StatFactorModel.Rd pkg/FactorAnalytics/man/print.FundamentalFactorModel.Rd pkg/FactorAnalytics/man/print.StatFactorModel.Rd pkg/FactorAnalytics/man/summary.FundamentalFactorModel.Rd pkg/FactorAnalytics/man/summary.StatFactorModel.Rd pkg/FactorAnalytics/sandbox/example.menu.plot.r pkg/FactorAnalytics/sandbox/plotFactorModelFit.r pkg/FactorAnalytics/sandbox/test.vignette.r pkg/FactorAnalytics/sandbox/testfile.r pkg/FactorAnalytics/vignettes/equity.Rdata pkg/FactorAnalytics/vignettes/equity.csv pkg/FactorAnalytics/vignettes/fundamentalFM.Rnw pkg/FactorAnalytics/vignettes/fundamentalFM.pdf Log: Deleted outdated scripts, examples etc. Deleted: pkg/FactorAnalytics/R/factorModelEsDecomposition.R =================================================================== --- pkg/FactorAnalytics/R/factorModelEsDecomposition.R 2014-11-03 22:39:10 UTC (rev 3547) +++ pkg/FactorAnalytics/R/factorModelEsDecomposition.R 2014-11-14 19:47:13 UTC (rev 3548) @@ -1,135 +0,0 @@ -#' Compute Factor Model ES Decomposition -#' -#' Compute the factor model factor expected shortfall (ES) decomposition for an -#' asset based on Euler's theorem given historic or simulated data and factor -#' model parameters. The partial derivative of ES with respect to factor beta -#' is computed as the expected factor return given fund return is less than or -#' equal to its value-at-risk (VaR). VaR is compute as the sample quantile of -#' the historic or simulated data. -#' -#' The factor model has the form \cr -#' \code{R(t) = beta'F(t) + e(t) = beta.star'F.star(t)}\cr -#' where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' By Euler's -#' theorem: \cr \code{ES.fm = sum(cES.fm) = sum(beta.star*mES.fm)} \cr -#' -#' @param Data \code{B x (k+2)} matrix of historic or simulated data. 
The first -#' column contains the fund returns, the second through \code{k+1}st columns -#' contain the returns on the \code{k} factors, and the \code{(k+2)}nd column -#' contain residuals scaled to have unit variance. -#' @param beta.vec \code{k x 1} vector of factor betas. -#' @param sig.e scalar, residual variance from factor model. -#' @param tail.prob scalar, tail probability for VaR quantile. Typically 0.01 -#' or 0.05. -#' @param VaR.method character, method for computing VaR. Valid choices are -#' one of "modified","gaussian","historical", "kernel". computation is done -#' with the \code{VaR} in the PerformanceAnalytics package. -#' -#' -#' @return A list with the following components: -#' \itemize{ -#' \item{VaR} {Scalar, nonparametric VaR value for fund reported as a -#' positive number.} -#' \item{n.exceed} Scalar, number of observations beyond VaR. -#' \item{idx.exceed} n.exceed x 1 vector giving index values of exceedences. -#' \item{ES.fm} Scalar. nonparametric ES value for fund reported as a positive -#' number. -#' \item{mES.fm} (K+1) x 1 vector of factor marginal contributions to ES. -#' \item{cES.fm} (K+1) x 1 vector of factor component contributions to ES. -#' \item{pcES.fm} (K+1) x 1 vector of factor percentage component contributions -#' to ES. -#' } -#' @author Eric Zviot and Yi-An Chen. -#' @references \enumerate{ -#' \item Hallerback (2003), "Decomposing Portfolio Value-at-Risk: A -#' General Analysis", The Journal of Risk 5/2. -#' \item Yamai and Yoshiba (2002)."Comparative Analyses of Expected Shortfall -#' and Value-at-Risk: Their -#' Estimation Error, Decomposition, and Optimization Bank of Japan. -#' \item Meucci (2007). "Risk Contributions from Generic User-Defined Factors". -#' \item Epperlein and Smillie (2006) "Cracking VAR with Kernels," Risk. 
-#' } -#' @examples -#' \dontrun{ -#' data(managers) -#' fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]), -#' factor.names=c("EDHEC LS EQ","SP500 TR"),data=managers) -#' # risk factor contribution to ETL -#' # combine fund returns, factor returns and residual returns for HAM1 -#' tmpData = cbind(managers[,1],managers[,c("EDHEC LS EQ","SP500 TR")] , -#' residuals(fit.macro$asset.fit$HAM1)/sqrt(fit.macro$resid.sd[1])) -#' colnames(tmpData)[c(1,4)] = c("HAM1", "residual") -#' factor.es.decomp.HAM1 = factorModelEsDecomposition(tmpData, fit.macro$beta[1,], -#' fit.macro$resid.sd[1], tail.prob=0.05, -#' VaR.method="historical" ) -#' -#' # fundamental factor model -#' # try to find factor contribution to ES for STI -#' data(Stock.df) -#' fit.fund <- fitFundamentalFactorModel(exposure.names=c("BOOK2MARKET", "LOG.MARKETCAP") -#' , data=stock,returnsvar = "RETURN",datevar = "DATE", -#' assetvar = "TICKER", -#' wls = TRUE, regression = "classic", -#' covariance = "classic", full.resid.cov = FALSE) -#' idx <- fit.fund$data[,fit.fund$assetvar] == "STI" -#' asset.ret <- fit.fund$data[idx,fit.fund$returnsvar] -#' tmpData = cbind(asset.ret, fit.fund$factor.returns, -#' fit.fund$residuals[,"STI"]/sqrt(fit.fund$resid.variance["STI"]) ) -#' colnames(tmpData)[c(1,length(tmpData[1,]))] = c("STI", "residual") -#' factorModelEsDecomposition(tmpData, -#' fit.fund$beta["STI",], -#' fit.fund$resid.variance["STI"], tail.prob=0.05,VaR.method="historical") -#' } -#' -#' @export -#' -factorModelEsDecomposition <- -function(Data, beta.vec, sig.e, tail.prob = 0.05, - VaR.method=c("modified", "gaussian", "historical", "kernel")) { - - Data = as.matrix(Data) - ncol.Data = ncol(Data) - if(is.matrix(beta.vec)) { - beta.names = c(rownames(beta.vec), "residual") - } else if(is.vector(beta.vec)) { - beta.names = c(names(beta.vec), "residual") - } else { - stop("beta.vec is not an n x 1 matrix or a vector") - } - beta.names = c(names(beta.vec), "residual") - beta.star.vec = c(beta.vec, sig.e) - names(beta.star.vec) = beta.names - - ## epsilon is calculated in the sense of minimizing mean square error by Silverman 1986 - epi <- 2.575*sd(Data[,1]) * (nrow(Data)^(-1/5)) - VaR.fm = as.numeric(VaR(Data[, 1], p=(1-tail.prob),method=VaR.method)) - idx = which(Data[, 1] <= VaR.fm + epi & Data[,1] >= VaR.fm - epi) - - - - ES.fm = -mean(Data[idx, 1]) - - ## - ## compute marginal contribution to ES - ## - ## compute marginal ES as expected value of factor return given fund - ## return is less than or equal to VaR - mcES.fm = -as.matrix(colMeans(Data[idx, -1])) - -## compute correction factor so that sum of weighted marginal ES adds to portfolio ES -cf = as.numeric( ES.fm / sum(mcES.fm*beta.star.vec) ) -mcES.fm = cf*mcES.fm -cES.fm = mcES.fm*beta.star.vec -pcES.fm = cES.fm/ES.fm -colnames(mcES.fm) = "MCES" -colnames(cES.fm) = "CES" -colnames(pcES.fm) = "PCES" -ans = list(VaR.fm = -VaR.fm, - n.exceed = length(idx), - idx.exceed = idx, - ES.fm = ES.fm, - mES.fm = t(mcES.fm), - cES.fm = t(cES.fm), - pcES.fm = t(pcES.fm)) -return(ans) -} - Deleted: pkg/FactorAnalytics/R/factorModelMonteCarlo.R =================================================================== --- pkg/FactorAnalytics/R/factorModelMonteCarlo.R 2014-11-03 22:39:10 UTC (rev 3547) +++ pkg/FactorAnalytics/R/factorModelMonteCarlo.R 2014-11-14 19:47:13 UTC (rev 3548) @@ -1,155 +0,0 @@ -#' Simulate returns using factor model Monte Carlo method. -#' -#' Simulate returns using factor model Monte Carlo method. 
Parametric method -#' like normal distribution, Cornish-Fisher and skew-t distribution for -#' residuals can be selected. Resampling method such as non-parametric bootstrap -#' or stationary bootstrap can be selected. -#' -#' The factor model Monte Carlo method is described in Jiang (2009). -#' -#' @param n.boot Integer number of bootstrap samples. -#' @param factorData \code{n.months x n.funds} matrix or data.frame of factor -#' returns. -#' @param Beta.mat \code{n.funds x n.factors} matrix of factor betas. -#' @param Alpha.mat \code{n.funds x 1} matrix of factor alphas (intercepts). If -#' \code{NULL} then assume that all alphas are zero. -#' @param residualData \code{n.funds x n.parms} matrix of residual distribution -#' parameters. The columns of \code{residualData} depend on the value of -#' \code{residual.dist}. If \code{residual.dist = "normal"}, then -#' \code{residualData} has one column containing variance values; if -#' \code{residual.dist = "Cornish-Fisher"}, then \code{residualData} has three -#' columns containing variance, skewness and excess kurtosis values; if -#' \code{residual.dist="skew-t"}, then \code{residualData} has four columns -#' containing location, scale, shape, and df values. -#' @param residual.dist character vector specifying the residual distribution. -#' Choices are "normal" for the normal distribution; "Cornish-Fisher" for the -#' Cornish-Fisher distribution based on the Cornish-Fisher expansion of the -#' normal distribution quantile; "skew-t" for the skewed Student's t -#' distribution of Azzalini and Captiano. -#' @param boot.method character vector specifying the resampling method. -#' Choices are "random" for random sampling with replacement (non-parametric -#' bootstrap); "block" for stationary block bootstrapping. -#' @param seed integer random number seed used for resampling the factor -#' returns. -#' @param return.factors logical; if \code{TRUE} then return resampled factors -#' in output list object. -#' @param return.residuals logical; if \code{TRUE} then return simulated -#' residuals in output list object. -#' @return A list with the following components: -#' \itemize{ -#' \item{returns} \code{n.boot x n.funds} matrix of simulated fund -#' returns. -#' \item{factors} \code{n.boot x n.factors} matrix of resampled factor -#' returns. Returned only if \code{return.factors = TRUE}. -#' \item{residuals} \code{n.boot x n.funds} matrix of simulated fund -#' residuals. Returned only if \code{return.residuals = TRUE}. -#' } -#' @author Eric Zivot and Yi-An Chen. -#' @references Jiang, Y. (2009). UW PhD Thesis. 
-#' @export -#' @examples -#' -#' # load data from the database -#' \dontrun{ -#' data(managers) -#' fit <- fitTsfm(asset.names=colnames(managers[,(1:6)]), -#' factor.names=c("EDHEC LS EQ","SP500 TR"), data=managers) -#' factorData= managers[,c("EDHEC LS EQ","SP500 TR")] -#' Beta.mat=fit$beta -#' residualData=as.matrix((fit$resid.sd)^2,1,6) -#' n.boot=1000 -#' # bootstrap returns data from factor model with residuals sample from normal distribution -#' bootData <- factorModelMonteCarlo(n.boot, factorData,Beta.mat, residual.dist="normal", -#' residualData, Alpha.mat=NULL, boot.method="random", -#' seed = 123, return.factors = "TRUE", return.residuals = -#' "TRUE") -#' # Cornish-Fisher distribution -#' # build different residualData matrix -#' residualData <- cbind(c(1,2,1,3,0.1,0.5),rnorm(6),c(2,3,1,2,1,0)) -#' colnames(residualData) <- c("var","skew","ekurt") -#' rownames(residualData) <- colnames(managers[,(1:6)]) -#' bootData <- factorModelMonteCarlo(n.boot, factorData,Beta.mat, residual.dist="Cornish-Fisher", -#' residualData, Alpha.mat=NULL, boot.method="random", -#' seed = 123, return.factors = "TRUE", return.residuals = -#' "TRUE") -#' -#' -#' # skew-t distribution -#' # build residualData matrix -#' residualData <- cbind(rnorm(6),c(1,2,1,3,0.1,0.5),rnorm(6),c(2,3,1,6,10,100)) -#' colnames(residualData) <- c("xi","omega","alpha","nu") -#' rownames(residualData) <- colnames(managers[,(1:6)]) -#' bootData <- factorModelMonteCarlo(n.boot, factorData,Beta.mat, residual.dist="skew-t", -#' residualData, Alpha.mat=NULL, boot.method="random", -#' seed = 123, return.factors = "TRUE", return.residuals = -#' "TRUE") -#' } -#' -factorModelMonteCarlo <- - function (n.boot = 1000, factorData, Beta.mat, Alpha.mat = NULL, - residualData, residual.dist = c("normal", "Cornish-Fisher", - "skew-t"), boot.method = c("random", "block"), seed = 123, - return.factors = FALSE, return.residuals = FALSE) - { - boot.method = boot.method[1] - residual.dist = residual.dist[1] - set.seed(seed) - if (nrow(Beta.mat) != nrow(residualData)) { - stop("Beta.mat and residualData have different number of rows") - } - factorData = as.matrix(factorData) - n.funds = nrow(Beta.mat) - fund.names = rownames(Beta.mat) - if (is.null(Alpha.mat)) { - Alpha.mat = matrix(0, nrow(Beta.mat)) - rownames(Alpha.mat) = fund.names - } - if (boot.method == "random") { - bootIdx = sample(nrow(factorData), n.boot, replace = TRUE) - } - else { - n.samples = round(n.boot/nrow(factorData)) - n.adj = n.boot - n.samples * nrow(factorData) - bootIdx = as.vector(tsbootstrap(1:nrow(factorData), nb = n.samples)) - if (n.adj > 0) { - bootIdx = c(bootIdx, bootIdx[1:n.adj]) - } - } - factorDataBoot = factorData[bootIdx, ] - fundReturnsBoot = matrix(0, n.boot, n.funds) - residualsSim = matrix(0, n.boot, n.funds) - colnames(fundReturnsBoot) = colnames(residualsSim) = fund.names - for (i in fund.names) { - set.seed(which(fund.names == i)) - if (residual.dist == "normal") { - residualsSim[, i] = rnorm(n.boot, sd = sqrt(residualData[i, - ])) - } - else if (residual.dist == "Cornish-Fisher") { - residualsSim[, i] = rCornishFisher(n.boot, sigma = sqrt(residualData[i, - "var"]), skew = residualData[i, "skew"], ekurt = residualData[i, - "ekurt"]) - } - else if (residual.dist == "skew-t") { - residualsSim[, i] = rst(n.boot, xi = residualData[i, - "xi"], omega = residualData[i, "omega"], - alpha = residualData[i, "alpha"], nu = residualData[i, - "nu"]) - } - else { - stop("Invalid residual distribution") - } - fundReturnsBoot[, i] = Alpha.mat[i, 1] + 
factorDataBoot[, - colnames(Beta.mat)] %*% t(Beta.mat[i, , drop = FALSE]) + - residualsSim[, i] - } - ans = list(returns = fundReturnsBoot) - if (return.factors) { - ans$factors = factorDataBoot - } - if (return.residuals) { - ans$residuals = residualsSim - } - return(ans) - } - Deleted: pkg/FactorAnalytics/R/factorModelSdDecomposition.R =================================================================== --- pkg/FactorAnalytics/R/factorModelSdDecomposition.R 2014-11-03 22:39:10 UTC (rev 3547) +++ pkg/FactorAnalytics/R/factorModelSdDecomposition.R 2014-11-14 19:47:13 UTC (rev 3548) @@ -1,89 +0,0 @@ -#' Compute factor model standard deviation decomposition -#' -#' Compute the factor model factor standard deviation decomposition for an -#' asset based on Euler's theorem given factor model parameters. -#' -#' The factor model has the form \cr \code{R(t) = beta'F(t) + e(t) = beta.star'F.star(t)}\cr -#' where beta.star = (beta, sig.e)' and F.star(t) = [F(t)', z(t)]'. By Euler's -#' theorem:\cr \code{Sd.fm = sum(cSd.fm) = sum(beta.star*mSd.fm)} \cr -#' -#' -#' @param beta.vec k x 1 vector of factor betas with factor names in the -#' rownames. -#' @param factor.cov k x k factor excess return covariance matrix. -#' @param sig2.e scalar, residual variance from factor model. -#' @return an S3 object containing -#' \itemize{ -#' \item{Sd.fm} Scalar, std dev based on factor model. -#' \item{mSd.fm} (K+1) x 1 vector of factor marginal contributions to sd. -#' \item{cSd.fm} (K+1) x 1 vector of factor component contributions to sd. -#' \item{pcSd.fm} (K+1) x 1 vector of factor percentage component contributions to sd. -#' } -#' @author Eric Zivot and Yi-An Chen -#' @references -#' \enumerate{ -#' \item Hallerback (2003), "Decomposing Portfolio Value-at-Risk: A -#' General Analysis", The Journal of Risk 5/2. -#' \item Yamai and Yoshiba (2002)."Comparative Analyses of Expected Shortfall and Value-at-Risk: Their -#' Estimation Error, Decomposition, and Optimization Bank of Japan. -#' \item Meucci (2007). "Risk Contributions from Generic User-Defined Factors," Risk. 
-#' }#' -#' @examples -#' -#' # load data from the database -#' data("stat.fm.data") -#' fit.stat <- fitStatisticalFactorModel(sfm.dat,k=2) -#' cov.factors = var(fit.stat$factors) -#' names = colnames(fit.stat$asset.ret) -#' factor.sd.decomp.list = list() -#' for (i in names) { -#' factor.sd.decomp.list[[i]] = -#' factorModelSdDecomposition(fit.stat$loadings[,i], -#' cov.factors, fit.stat$resid.variance[i]) -#' } -#' -#' @export -#' -factorModelSdDecomposition <- -function(beta.vec, factor.cov, sig2.e) { - -## Remarks: -## The factor model has the form -## R(t) = beta'F(t) + e(t) = beta.star'F.star(t) -## where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' -## By Euler's theorem -## sd.fm = sum(cr.fm) = sum(beta*mcr.fm) - if(is.matrix(beta.vec)) { - beta.names = c(rownames(beta.vec), "residual") - } else if(is.vector(beta.vec)) { - beta.names = c(names(beta.vec), "residual") - } else { - stop("beta.vec is not a matrix or a vector") - } - beta.vec = as.vector(beta.vec) - beta.star.vec = c(beta.vec, sqrt(sig2.e)) - names(beta.star.vec) = beta.names - factor.cov = as.matrix(factor.cov) - k.star = length(beta.star.vec) - k = k.star - 1 - factor.star.cov = diag(k.star) - factor.star.cov[1:k, 1:k] = factor.cov - -## compute factor model sd - sd.fm = as.numeric(sqrt(t(beta.star.vec) %*% factor.star.cov %*% beta.star.vec)) -## compute marginal and component contributions to sd - mcr.fm = (factor.star.cov %*% beta.star.vec)/sd.fm - cr.fm = mcr.fm * beta.star.vec - pcr.fm = cr.fm/sd.fm - rownames(mcr.fm) <- rownames(cr.fm) <- rownames(pcr.fm) <- beta.names - colnames(mcr.fm) = "MCR" - colnames(cr.fm) = "CR" - colnames(pcr.fm) = "PCR" -## return results - ans = list(Sd.fm = sd.fm, - mSd.fm = t(mcr.fm), - cSd.fm = t(cr.fm), - pcSd.fm = t(pcr.fm)) - return(ans) -} - Deleted: pkg/FactorAnalytics/R/factorModelVaRDecomposition.R =================================================================== --- pkg/FactorAnalytics/R/factorModelVaRDecomposition.R 2014-11-03 22:39:10 UTC (rev 3547) +++ pkg/FactorAnalytics/R/factorModelVaRDecomposition.R 2014-11-14 19:47:13 UTC (rev 3548) @@ -1,104 +0,0 @@ -#' Compute factor model VaR decomposition -#' -#' Compute factor model factor VaR decomposition based on Euler's theorem given -#' historic or simulated data and factor model parameters. The partial -#' derivative of VaR wrt factor beta is computed as the expected factor return -#' given fund return is equal to its VaR and approximated by kernel estimator. -#' VaR is compute either as the sample quantile or as an estimated quantile -#' using the Cornish-Fisher expansion. -#' -#' The factor model has the form \cr \code{R(t) = beta'F(t) + e(t) = beta.star'F.star(t)}\cr -#' where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' By Euler's -#' theorem:\cr \code{VaR.fm = sum(cVaR.fm) = sum(beta.star*mVaR.fm)} \cr -#' -#' @param Data B x (k+2) matrix of bootstrap data. First column contains -#' the fund returns, second through k+1 columns contain factor returns, (k+2)nd -#' column contain residuals scaled to have unit variance . -#' @param beta.vec k x 1 vector of factor betas. -#' @param sig2.e scalar, residual variance from factor model. -#' @param tail.prob scalar, tail probability -#' @param VaR.method character, method for computing VaR. Valid choices are -#' one of "modified","gaussian","historical", "kernel". computation is done with the \code{VaR} -#' in the PerformanceAnalytics package. 
-#' @return an S3 object containing -#' \itemize{ -#' \item{VaR.fm} Scalar, bootstrap VaR value for fund reported as a -#' positive number. -#' \item{n.exceed} Scalar, number of observations beyond VaR. -#' \item{idx.exceed} n.exceed x 1 vector giving index values of -#' exceedences. -#' \item{mVaR.fm} (K+1) x 1 vector of factor marginal contributions to VaR. -#' \item{cVaR.fm} (K+1) x 1 vector of factor component contributions to VaR. -#' \item{pcVaR.fm} (K+1) x 1 vector of factor percentage contributions to VaR. -#' } -#' @author Eric Zivot and Yi-An Chen -#' @references -#' \enumerate{ -#' \item Hallerback (2003), "Decomposing Portfolio Value-at-Risk: A -#' General Analysis", The Journal of Risk 5/2. -#' \item Yamai and Yoshiba (2002)."Comparative Analyses of Expected Shortfall and Value-at-Risk: Their -#' Estimation Error, Decomposition, and Optimization Bank of Japan. -#' \item Meucci (2007). "Risk Contributions from Generic User-Defined Factors," Risk. -#' } -#' @examples -#' \dontrun{ -#' data(managers) -#' fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]), -#' factor.names=c("EDHEC LS EQ","SP500 TR"), data=managers) -#' # risk factor contribution to VaR -#' # combine fund returns, factor returns and residual returns for HAM1 -#' tmpData = cbind(managers[,1], managers[,c("EDHEC LS EQ","SP500 TR")] , -#' residuals(fit.macro$asset.fit$HAM1)/fit.macro$resid.sd[1]) -#' colnames(tmpData)[c(1,4)] = c("HAM1", "residual") -#' factor.VaR.decomp.HAM1 = factorModelVaRDecomposition(tmpData, fit.macro$beta[1,], -#' fit.macro$resid.sd[1], tail.prob=0.05, -#' VaR.method="historical") -#' } -#' @export -factorModelVaRDecomposition <- -function(Data, beta.vec, sig2.e, tail.prob = 0.01, - VaR.method=c("modified", "gaussian", "historical", "kernel")) { - - VaR.method = VaR.method[1] - Data = as.matrix(Data) - ncol.Data = ncol(Data) - if(is.matrix(beta.vec)) { - beta.names = c(rownames(beta.vec), "residual") - } else if(is.vector(beta.vec)) { - beta.names = c(names(beta.vec), "residual") - } else { - stop("beta.vec is not an n x 1 matrix or a vector") - } - beta.names = c(names(beta.vec), "residual") - beta.star.vec = c(beta.vec, sqrt(sig2.e)) - names(beta.star.vec) = beta.names - - ## epsilon is calculated in the sense of minimizing mean square error by Silverman 1986 - epi <- 2.575*sd(Data[,1]) * (nrow(Data)^(-1/5)) - VaR.fm = as.numeric(VaR(Data[, 1], p=(1-tail.prob),method=VaR.method)) - idx = which(Data[, 1] <= VaR.fm + epi & Data[,1] >= VaR.fm - epi) - - ## - ## compute marginal contribution to VaR - ## - ## compute marginal VaR as expected value of factor return given - ## triangler kernel - mVaR.fm = -as.matrix(colMeans(Data[idx, -1])) - -## compute correction factor so that sum of weighted marginal VaR adds to portfolio VaR -cf = as.numeric( -VaR.fm / sum(mVaR.fm*beta.star.vec) ) -mVaR.fm = cf*mVaR.fm -cVaR.fm = mVaR.fm*beta.star.vec -pcVaR.fm = cVaR.fm/-VaR.fm -colnames(mVaR.fm) = "MVaR" -colnames(cVaR.fm) = "CVaR" -colnames(pcVaR.fm) = "PCVaR" -ans = list(VaR.fm = -VaR.fm, - n.exceed = length(idx), - idx.exceed = idx, - mVaR.fm = t(mVaR.fm), - cVaR.fm = t(cVaR.fm), - pcVaR.fm = t(pcVaR.fm)) -return(ans) -} - Deleted: pkg/FactorAnalytics/R/fitFundamentalFactorModel.R =================================================================== --- pkg/FactorAnalytics/R/fitFundamentalFactorModel.R 2014-11-03 22:39:10 UTC (rev 3547) +++ pkg/FactorAnalytics/R/fitFundamentalFactorModel.R 2014-11-14 19:47:13 UTC (rev 3548) @@ -1,493 +0,0 @@ -#' @title Fit a fundamental factor model using classic 
OLS or Robust regression -#' -#' @description Fit a fundamental (cross-sectional) factor model using Ordinary -#' Least Squares (OLS) or Robust regression. Fundamental factor models use -#' observable asset specific characteristics (fundamentals) like industry -#' classification, market capitalization, style classification (value, growth) -#' etc. to calculate the common risk factors. An object of class \code{"ffm"} -#' is returned. -#' -#' @details -#' If style factor exposure is standardized to have a regression-weighted mean -#' of zero, style factors become orthogonal to the world factor (intercept -#' term), which in turn facilitates the interpretation of the style factor -#' returns. See Menchero (2010). -#' -#' The original function was designed by Doug Martin and initially implemented -#' in S-PLUS by a number of University of Washington Ph.D. students: -#' Christopher Green, Eric Aldrich, and Yindeng Jiang. Guy Yollin -#' re-implemented the function in R. Yi-An Chen and Sangeetha Srinivasan -#' (UW PhD students; as part of Google Summer of Code 2013 & 2014 respectively) -#' further updated the code. -#' -#' -#' @param data data.frame, data must have \emph{assetvar}, \emph{returnvar}, -#' \emph{datevar}, and exposure.names. Generally, data has to look like panel -#' data. It needs firm variabales and time variables. Data has to be a balanced -#' panel. -#' @param exposure.names a character vector of exposure names for the factor -#' model -#' @param wls logical flag, TRUE for weighted least squares, FALSE for ordinary -#' least squares -#' @param regression A character string, "robust" for regression via lmRob, -#' "classic" for regression with lm() -#' @param covariance A character string, "robust" for covariance matrix -#' computed with covRob(), "classic" for covariance matrix with covClassic() in -#' robust package. -#' @param full.resid.cov logical flag, TRUE for full residual covariance matrix -#' calculation, FALSE for diagonal residual covarinace matrix -#' @param robust.scale logical flag, TRUE for exposure scaling via robust scale -#' and location, FALSE for scaling via mean and sd -#' @param returnsvar A character string giving the name of the return variable -#' in the data. -#' @param datevar A character string gives the name of the date variable in -#' the data. -#' @param assetvar A character string gives the name of the asset variable in -#' the data. -#' @param standardized.factor.exposure logical flag. Factor exposure will be -#' standardized to regression weighted mean 0 and standardized deviation to 1 -#' if \code{TRUE}. Default is \code{FALSE}. See Details. -#' @param weight.var A character strping gives the name of the weight used for -#' standarizing factor exposures. -#' -#' @return An object of class \code{"ffm"} is a list containing the following -#' components: -#' \item{returns.cov}{A "list" object contains covariance information for -#' asset returns, includes covariance, mean and eigenvalus. Beta of taken as -#' latest date input.} -#' \item{factor.cov}{An object of class "cov" or "covRob" which contains the -#' covariance matrix of the factor returns (including intercept).} -#' \item{resids.cov}{An object of class "cov" or "covRob" which contains -#' the covariance matrix of the residuals, if "full.resid.cov" is TRUE. 
NULL -#' if "full.resid.cov" is FALSE.} -#' \item{returns.corr}{Correlation matrix of assets returns.} -#' \item{factor.corr}{An object of class "cov" or "covRob" which contains the -#' correlation matrix of the factor returns (including intercept).} -#' \item{resids.corr}{Correlation matrix of returns returns.} -#' \item{resid.variance}{A vector of variances estimated from the OLS -#' residuals for each asset. If "wls" is TRUE, these are the weights used in -#' the weighted least squares regressions. If "cov = robust" these values are -#' computed with "scale.tau". Otherwise they are computed with "var".} -#' \item{factor.returns}{A "xts" object containing the times series of -#' estimated factor returns and intercepts.} -#' \item{residuals}{A "xts" object containing the time series of residuals for -#' each asset.} -#' \item{tstats}{A "xts" object containing the time series of t-statistics -#' for each exposure.} -#' \item{call}{function call} -#' \item{exposure.names}{A character string giving the name of the exposure -#' variable in the data.} -#' -#' @author Guy Yollin, Yi-An Chen and Sangeetha Srinivasan -#' -#' @references -#' Menchero, J. (2010). The Characteristics of Factor Portfolios. Journal of -#' Performance Measurement, 15(1), 52-62. -#' -#' Grinold, R. C., & Kahn, R. N. (2000). Active portfolio management (Second -#' Ed.). New York: McGraw-Hill. -#' -#' @examples -#' -#' # BARRA type factor model -#' data(Stock.df) -#' # there are 447 assets -#' exposure.names <- c("BOOK2MARKET", "LOG.MARKETCAP") -#' test.fit <- fitFundamentalFactorModel(data=stock,exposure.names=exposure.names, -#' datevar = "DATE", returnsvar = "RETURN", -#' assetvar = "TICKER", wls = TRUE, -#' regression = "classic", -#' covariance = "classic", full.resid.cov = TRUE, -#' robust.scale = TRUE) -#' -#' names(test.fit) -#' test.fit$returns.cov -#' test.fit$resids.cov -#' names(test.fit$cov.factor) -#' test.fit$factor.cov$cov -#' test.fit$factor -#' test.fit$resid.variance -#' test.fit$resids -#' test.fit$tstats -#' test.fit$call -#' -#' # BARRA type Industry Factor Model -#' exposure.names <- c("GICS.SECTOR") -#' # the rest keep the same -#' test.fit2 <- fitFundamentalFactorModel(data=stock,exposure.names=exposure.names, -#' datevar = "DATE", returnsvar = "RETURN", -#' assetvar = "TICKER", wls = TRUE, -#' regression = "classic", -#' covariance = "classic", full.resid.cov = TRUE, -#' robust.scale = TRUE) -#' -#' names(test.fit2) -#' test.fit2$cov.returns -#' test.fit2$cov.resids -#' names(test.fit2$cov.factor) -#' test.fit2$cov.factor$cov -#' test.fit2$factor -#' test.fit2$resid.variance -#' test.fit2$resids -#' test.fit2$tstats -#' test.fit2$call -#' -#' @export - -fitFundamentalFactorModel <- function(data, exposure.names, datevar, - returnsvar, assetvar, wls=TRUE, - regression="classic", - covariance="classic", - full.resid.cov=FALSE, robust.scale=FALSE, - standardized.factor.exposure=FALSE, - weight.var) { - - assets = unique(data[[assetvar]]) - timedates = as.Date(unique(data[[datevar]])) - data[[datevar]] <- as.Date(data[[datevar]]) - - if (length(timedates) < 2) - stop("At least two time points, t and t-1, are needed for fitting the factor model.") - if (!is(exposure.names, "vector") || !is.character(exposure.names)) - stop("exposure argument invalid---must be character vector.") - if (!is(assets, "vector") || !is.character(assets)) - stop("assets argument invalid---must be character vector.") - - wls <- as.logical(wls) - full.resid.cov <- as.logical(full.resid.cov) - robust.scale <- 
as.logical(robust.scale) - standardized.factor.exposure <- as.logical(standardized.factor.exposure) - - if (!match(regression, c("robust", "classic"), FALSE)) - stop("regression must one of 'robust', 'classic'.") - if (!match(covariance, c("robust", "classic"), FALSE)) - stop("covariance must one of 'robust', 'classic'.") - this.call <- match.call() - - if (match(returnsvar, exposure.names, FALSE)) - stop(paste(returnsvar, "cannot be used as an exposure.")) - - - numTimePoints <- length(timedates) - numExposures <- length(exposure.names) - numAssets <- length(assets) - - - - - # check if exposure.names are numeric, if not, create exposures. factors by dummy variables - which.numeric <- sapply(data[, exposure.names, drop = FALSE],is.numeric) - exposures.numeric <- exposure.names[which.numeric] - # industry factor model - exposures.factor <- exposure.names[!which.numeric] - if (length(exposures.factor) > 1) { - stop("Only one nonnumeric variable can be used at this time.") - } - - if (standardized.factor.exposure == TRUE) { - if (is.na(weight.var)) { - stop("Need to assign weight variable") - } - weight = by(data = data, INDICES = as.numeric(data[[datevar]]), - function(x) x[[weight.var]]/sum(x[[weight.var]])) - data[[weight.var]] <- unlist(weight) - - for (i in exposures.numeric) { - standardized.exposure <- by(data = data, INDICES = as.numeric(data[[datevar]]), - function(x) ((x[[i]] - mean(x[[weight.var]]*x[[i]]) )*1/sd(x[[weight.var]]*x[[i]]) )) - data[[i]] <- unlist(standardized.exposure) - } - } - - - regression.formula <- paste("~", paste(exposure.names, collapse = "+")) - # "~ BOOK2MARKET" - if (length(exposures.factor)) { - regression.formula <- paste(regression.formula, "- 1") - data[, exposures.factor] <- as.factor(data[,exposures.factor]) - exposuresToRecode <- names(data[, exposure.names, drop = FALSE])[!which.numeric] - contrasts.list <- lapply(seq(length(exposuresToRecode)), [TRUNCATED] To get the complete diff run: svnlook diff /svnroot/returnanalytics -r 3548 From noreply at r-forge.r-project.org Fri Nov 14 21:12:16 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 14 Nov 2014 21:12:16 +0100 (CET) Subject: [Returnanalytics-commits] r3549 - in pkg/FactorAnalytics: . 
vignettes Message-ID: <20141114201216.6E830186882@r-forge.r-project.org> Author: pragnya Date: 2014-11-14 21:12:16 +0100 (Fri, 14 Nov 2014) New Revision: 3549 Added: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.R Modified: pkg/FactorAnalytics/NAMESPACE Log: Added R script for fitTsfm vignette Modified: pkg/FactorAnalytics/NAMESPACE =================================================================== --- pkg/FactorAnalytics/NAMESPACE 2014-11-14 19:47:13 UTC (rev 3548) +++ pkg/FactorAnalytics/NAMESPACE 2014-11-14 20:12:16 UTC (rev 3549) @@ -6,30 +6,16 @@ S3method(fmEsDecomp,tsfm) S3method(fmSdDecomp,tsfm) S3method(fmVaRDecomp,tsfm) -S3method(plot,FundamentalFactorModel) -S3method(plot,StatFactorModel) S3method(plot,pafm) S3method(plot,tsfm) -S3method(predict,FundamentalFactorModel) -S3method(predict,StatFactorModel) S3method(predict,tsfm) -S3method(print,FundamentalFactorModel) -S3method(print,StatFactorModel) S3method(print,pafm) S3method(print,summary.tsfm) S3method(print,tsfm) S3method(residuals,tsfm) -S3method(summary,FundamentalFactorModel) -S3method(summary,StatFactorModel) S3method(summary,pafm) S3method(summary,tsfm) export(dCornishFisher) -export(factorModelEsDecomposition) -export(factorModelMonteCarlo) -export(factorModelSdDecomposition) -export(factorModelVaRDecomposition) -export(fitFundamentalFactorModel) -export(fitStatisticalFactorModel) export(fitTsfm) export(fitTsfm.control) export(fmCov) Added: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.R =================================================================== --- pkg/FactorAnalytics/vignettes/fitTsfm_vignette.R (rev 0) +++ pkg/FactorAnalytics/vignettes/fitTsfm_vignette.R 2014-11-14 20:12:16 UTC (rev 3549) @@ -0,0 +1,224 @@ + +## ----message=FALSE------------------------------------------------------- +library(factorAnalytics) + + +## ------------------------------------------------------------------------ +data(managers) +colnames(managers) +range(index(managers)) + + +## ------------------------------------------------------------------------ +asset.names <- colnames(managers[,1:6]) +factor.names <- colnames(managers[,7:9]) +mkt.name <- "SP500 TR" +rf.name <- "US 3m TR" + + +## ------------------------------------------------------------------------ +data(CommonFactors) +names(factors.Q) +range(index(factors.Q)) + + +## ----tidy=TRUE----------------------------------------------------------- +args(fitTsfm) + + +## ------------------------------------------------------------------------ +fit.Sharpe <- fitTsfm(asset.names=asset.names, factor.names="SP500 TR", + rf.name="US 3m TR", data=managers) +names(fit.Sharpe) +fit.Sharpe + + +## ------------------------------------------------------------------------ +# adding up-market timing factor ("HM") to the model +fit1 <- fitTsfm(asset.names=asset.names, factor.names=factor.names, + mkt.name="SP500 TR", mkt.timing="HM", data=managers) +fit1$beta +fit1$r2 +fit1$resid.sd + + +## ------------------------------------------------------------------------ +fit2 <- fitTsfm(asset.names=asset.names, factor.names=factor.names, + mkt.name="SP500 TR", data=managers, fit.method="Robust") +fit2$beta +fit2$r2 +fit2$resid.sd + + +## ----fig.cap="HAM3 Returns: fit1-OLS (top) vs fit2-Robust (bottom)", fig.show='hold'---- +par(mfrow=c(2,1)) +plot(fit1, plot.single=TRUE, which.plot.single=1, asset.name="HAM3", loop=FALSE) +plot(fit2, plot.single=TRUE, which.plot.single=1, asset.name="HAM3", loop=FALSE) + + +## ----fig.cap="Residual vol: fit1-OLS (left) vs fit2-Robust (right)", fig.width=3, 
fig.height=2.5, out.width='.49\\linewidth', fig.show='hold'---- +par(mfrow=c(1,2)) +plot(fit1, which.plot.group=5, loop=FALSE, xlim=c(0,0.043)) +plot(fit2, which.plot.group=5, loop=FALSE, xlim=c(0,0.043)) + + +## ------------------------------------------------------------------------ +fit.lars <- fitTsfm(asset.names=colnames(managers[,(1:6)]), + factor.names=colnames(managers[,(7:9)]), data=managers, + rf.name="US 3m TR", mkt.name="SP500 TR") +fit.lars$beta +fit.lars$r2 + +fit.sub <- fitTsfm(asset.names=colnames(managers[,(1:6)]), + factor.names=colnames(managers[,(7:9)]), data=managers, + rf.name="US 3m TR", mkt.name="SP500 TR", + variable.selection="subsets", subset.size=4) +fit.sub$beta +fit.sub$r2 + + +## ----fig.cap="Factor betas: fit.lars", fig.show='hold'------------------- +plot(fit.lars, which.plot.group=2, loop=FALSE) + + +## ----fig.cap="Factor betas: fit.sub", fig.show='hold'-------------------- +plot(fit.sub, which.plot.group=2, loop=FALSE) + + +## ----tidy=TRUE----------------------------------------------------------- +args(fitTsfm.control) + + +## ------------------------------------------------------------------------ +methods(class="tsfm") + + +## ------------------------------------------------------------------------ +coef(fit.sub) +tail(fitted(fit.sub)) +tail(residuals(fit.sub)) + +# comparing data, fitted and residual values for HAM1 +tail(merge(fit.sub$data[,1], fitted(fit.sub)[,1], residuals(fit.sub)[,1])) + +# printed summary for the time series factor model +summary(fit.sub, se.type="HAC") + + + +## ----fig.cap="Factor model return correlation (pairwise complete obs)"---- +fmCov(fit.sub) +# return correlation plot; Angular Order of the Eigenvectors +plot(fit.sub, which.plot.group=7, loop=FALSE, order="AOE", method="ellipse", + tl.pos = "d") + + +## ----fig.cap="Percentage factor contribution to SD"---------------------- +decomp <- fmSdDecomp(fit.sub) +# get the factor model standard deviation for all assets +decomp$Sd.fm +# get the component contributions to Sd +decomp$cSd +# get the marginal factor contributions to Sd +decomp$mSd +# get the percentage component contributions to Sd +decomp$pcSd +# plot the percentage component contributions to Sd +plot(fit.sub, which.plot.group=8, loop=FALSE) + + +## ----fig.cap="Percentage factor contribution to VaR"--------------------- +decomp2 <- fmVaRDecomp(fit.sub) +# get the factor model value-at-risk for all assets +decomp2$VaR.fm +# get the component contributions to VaR +decomp2$cVaR +# get the marginal factor contributions to VaR +decomp2$mVaR +# get the percentage component contributions to VaR +decomp2$pcVaR +# plot the percentage component contributions to VaR +plot(fit.sub, which.plot.group=10, loop=FALSE) + + +## ----fig.cap="Percentage factor contribution to ES"---------------------- +decomp2 <- fmEsDecomp(fit.sub, method="historical") +# get the factor model expected shortfall for all assets +decomp2$ES.fm +# get the component contributions to ES +decomp2$cES +# get the marginal factor contributions to ES +decomp2$mES +# get the percentage component contributions to ES +decomp2$pcES +# plot the percentage component contributions to ES +plot(fit.sub, which.plot.group=9, loop=FALSE) + + +## ----eval=FALSE---------------------------------------------------------- +## ## S3 method for class 'tsfm' +## plot(x, which.plot.group=NULL, max.show=6, plot.single=FALSE, asset.name, +## which.plot.single=NULL, colorset=(1:12), legend.loc="topleft", las=1, +## VaR.method="historical", loop=TRUE, ...) 
+ + +## ----eval=FALSE, results='hide'------------------------------------------ +## plot(fit.sub) +## +## ## Make a plot selection (or 0 to exit): +## ## +## ## 1: Factor model coefficients: Alpha +## ## 2: Factor model coefficients: Betas +## ## 3: Actual and Fitted asset returns +## ## 4: R-squared +## ## 5: Residual Volatility +## ## 6: Factor Model Residual Correlation +## ## 7: Factor Model Return Correlation +## ## 8: Factor Contribution to SD +## ## 9: Factor Contribution to ES +## ## 10: Factor Contribution to VaR +## ## +## ## Selection: + + +## ----fig.cap="Actual and fitted factor model returns for the 1st 4 assets"---- +plot(fit.sub, which.plot.group=3, max.show=4, legend.loc=NULL, loop=FALSE) + + +## ----eval=FALSE, results='hide'------------------------------------------ +## plot(fit.sub, plot.single=TRUE, asset.name="HAM1") +## +## ## Make a plot selection (or 0 to exit): +## ## +## ## 1: Time series plot of actual and fitted asset returns +## ## 2: Time series plot of residuals with standard error bands +## ## 3: Time series plot of squared residuals +## ## 4: Time series plot of absolute residuals +## ## 5: SACF and PACF of residuals +## ## 6: SACF and PACF of squared residuals +## ## 7: SACF and PACF of absolute residuals +## ## 8: Histogram of residuals with normal curve overlayed +## ## 9: Normal qq-plot of residuals +## ## 10: CUSUM test-Recursive residuals +## ## 11: CUSUM test-OLS residuals +## ## 12: Recursive estimates (RE) test of OLS regression coefficients +## ## 13: Rolling estimates over a 24-period observation window +## ## +## ## Selection: + + +## ----fig.cap="Time series plot of residuals with standard error bands: HAM1", fig.show='hold'---- +plot(fit.sub, plot.single=TRUE, asset.name="HAM1", which.plot.single=2, + loop=FALSE) + + +## ----fig.cap="SACF and PACF of absolute residuals: HAM1", fig.show='hold'---- +plot(fit.sub, plot.single=TRUE, asset.name="HAM1", which.plot.single=7, + loop=FALSE) + + +## ----fig.cap="Histogram of residuals with normal curve overlayed for HAM1", fig.show='hold'---- +plot(fit.sub, plot.single=TRUE, asset.name="HAM1", which.plot.single=8, + loop=FALSE) + + From noreply at r-forge.r-project.org Sat Nov 15 00:22:06 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sat, 15 Nov 2014 00:22:06 +0100 (CET) Subject: [Returnanalytics-commits] r3550 - pkg/FactorAnalytics/sandbox/R Message-ID: <20141114232206.26229186952@r-forge.r-project.org> Author: gyollin Date: 2014-11-15 00:22:05 +0100 (Sat, 15 Nov 2014) New Revision: 3550 Removed: pkg/FactorAnalytics/sandbox/R/FactorAnalytics-package.R pkg/FactorAnalytics/sandbox/R/bootstrapFactorESdecomposition.r pkg/FactorAnalytics/sandbox/R/bootstrapFactorVaRdecomposition.r pkg/FactorAnalytics/sandbox/R/chart.RollingStyle.R pkg/FactorAnalytics/sandbox/R/chart.Style.R pkg/FactorAnalytics/sandbox/R/covEWMA.R pkg/FactorAnalytics/sandbox/R/factorModelFactorRiskDecomposition.r pkg/FactorAnalytics/sandbox/R/factorModelGroupRiskDecomposition.r pkg/FactorAnalytics/sandbox/R/factorModelPortfolioRiskDecomposition.r pkg/FactorAnalytics/sandbox/R/factorModelRiskAttribution.r pkg/FactorAnalytics/sandbox/R/factorModelRiskDecomposition.r pkg/FactorAnalytics/sandbox/R/factorModelSimulation.r pkg/FactorAnalytics/sandbox/R/impliedFactorReturns.R pkg/FactorAnalytics/sandbox/R/modifiedEsReport.R pkg/FactorAnalytics/sandbox/R/modifiedIncrementalES.R pkg/FactorAnalytics/sandbox/R/modifiedIncrementalVaR.R pkg/FactorAnalytics/sandbox/R/modifiedPortfolioEsDecomposition.R 
pkg/FactorAnalytics/sandbox/R/modifiedPortfolioVaRDecomposition.R pkg/FactorAnalytics/sandbox/R/modifiedVaRReport.R pkg/FactorAnalytics/sandbox/R/nonparametricEsReport.R pkg/FactorAnalytics/sandbox/R/nonparametricIncrementalES.R pkg/FactorAnalytics/sandbox/R/nonparametricIncrementalVaR.R pkg/FactorAnalytics/sandbox/R/nonparametricPortfolioEsDecomposition.R pkg/FactorAnalytics/sandbox/R/nonparametricPortfolioVaRDecomposition.R pkg/FactorAnalytics/sandbox/R/nonparametricVaRReport.R pkg/FactorAnalytics/sandbox/R/normalEsReport.R pkg/FactorAnalytics/sandbox/R/normalIncrementalES.R pkg/FactorAnalytics/sandbox/R/normalIncrementalVaR.R pkg/FactorAnalytics/sandbox/R/normalPortfolioEsDecomposition.R pkg/FactorAnalytics/sandbox/R/normalPortfolioVaRDecomposition.R pkg/FactorAnalytics/sandbox/R/normalVaRReport.R pkg/FactorAnalytics/sandbox/R/portfolioSdDecomposition.R pkg/FactorAnalytics/sandbox/R/scenarioPredictions.r pkg/FactorAnalytics/sandbox/R/scenarioPredictionsPortfolio.r pkg/FactorAnalytics/sandbox/R/style.QPfit.R pkg/FactorAnalytics/sandbox/R/style.fit.R pkg/FactorAnalytics/sandbox/R/table.RollingStyle.R Log: Cleaning up unused file Deleted: pkg/FactorAnalytics/sandbox/R/FactorAnalytics-package.R =================================================================== --- pkg/FactorAnalytics/sandbox/R/FactorAnalytics-package.R 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/FactorAnalytics-package.R 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,132 +0,0 @@ - - -#' Functions for Cornish-Fisher density, CDF, random number simulation and -#' quantile. -#' -#' \code{dCornishFisher} Computes Cornish-Fisher density from two term -#' Edgeworth expansion given mean, standard deviation, skewness and excess -#' kurtosis. \code{pCornishFisher} Computes Cornish-Fisher CDF from two term -#' Edgeworth expansion given mean, standard deviation, skewness and excess -#' kurtosis. \code{qCornishFisher} Computes Cornish-Fisher quantiles from two -#' term Edgeworth expansion given mean, standard deviation, skewness and excess -#' kurtosis. \code{rCornishFisher} simulate observations based on -#' Cornish-Fisher quantile expansion given mean, standard deviation, skewness -#' and excess kurtosis. -#' -#' CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) -#' -#' @aliases rCornishFisher dCornishFisher pCornishFisher qCornishFisher -#' @param n scalar, number of simulated values in rCornishFisher. Sample length -#' in density,distribution,quantile function. -#' @param sigma scalar, standard deviation. -#' @param skew scalar, skewness. -#' @param ekurt scalar, excess kurtosis. -#' @param seed set seed here. Default is \code{NULL}. -#' @param x,q vector of standardized quantiles. See detail. -#' @param p vector of probabilities. -#' @return n simulated values from Cornish-Fisher distribution. -#' @author Eric Zivot and Yi-An Chen. 
-#' @references A.DasGupta, "Asymptotic Theory of Statistics and Probability", -#' Springer Science+Business Media,LLC 2008 Thomas A.Severini, "Likelihood -#' Methods in Statistics", Oxford University Press, 2000 -#' @examples -#' -#' # generate 1000 observation from Cornish-Fisher distribution -#' rc <- rCornishFisher(1000,1,0,5) -#' hist(rc,breaks=100,freq=FALSE,main="simulation of Cornish Fisher Distribution", -#' xlim=c(-10,10)) -#' lines(seq(-10,10,0.1),dnorm(seq(-10,10,0.1),mean=0,sd=1),col=2) -#' # compare with standard normal curve -#' -#' # example from A.dasGupta p.188 exponential example -#' # x is iid exp(1) distribution, sample size = 5 -#' # then x_bar is Gamma(shape=5,scale=1/5) distribution -#' q <- c(0,0.4,1,2) -#' # exact cdf -#' pgamma(q/sqrt(5)+1,shape=5,scale=1/5) -#' # use CLT -#' pnorm(q) -#' # use edgeworth expansion -#' pCornishFisher(q,n=5,skew=2,ekurt=6) -#' -#' @name CornishFisher -NULL - - - - - -#' Hypothetical Alternative Asset Manager and Benchmark Data -#' -#' a data.frame format from managers dataset from package PerformanceAnalytics, -#' containing columns of monthly returns for six hypothetical asset managers -#' (HAM1 through HAM6), the EDHEC Long-Short Equity hedge fund index, the S\&P -#' 500 total returns. Monthly returns for all series end in December 2006 and -#' begin at different periods starting from January 1997. -#' -#' -#' @name managers.df -#' @docType data -#' @keywords datasets -#' @examples -#' -#' data(managers.df) -#' ## maybe str(managers.df) ; plot(managers.df) ... -#' -NULL - - - - - -#' Monthly Stock Return Data || Portfolio of Weekly Stock Returns -#' -#' sfm.dat: This is a monthly "data.frame" object from January 1978 to December -#' 1987, with seventeen columns representing monthly returns of certain assets, -#' as in Chapter 2 of Berndt (1991). sfm.apca.dat: This is a weekly -#' "data.frame" object with dimension 182 x 1618, which runs from January 8, -#' 1997 to June 28, 2000 and represents the stock returns on 1618 U.S. stocks. -#' -#' CITCRP monthly returns of Citicorp. CONED monthly returns of Consolidated -#' Edison. CONTIL monthly returns of Continental Illinois. DATGEN monthly -#' returns of Data General. DEC monthly returns of Digital Equipment Company. -#' DELTA monthly returns of Delta Airlines. GENMIL monthly returns of General -#' Mills. GERBER monthly returns of Gerber. IBM monthly returns of -#' International Business Machines. MARKET a value-weighted composite monthly -#' returns based on transactions from the New York Stock Exchange and the -#' American Exchange. MOBIL monthly returns of Mobile. PANAM monthly returns -#' of Pan American Airways. PSNH monthly returns of Public Service of New -#' Hampshire. TANDY monthly returns of Tandy. TEXACO monthly returns of -#' Texaco. WEYER monthly returns of Weyerhauser. RKFREE monthly returns on -#' 30-day U.S. Treasury bills. -#' -#' @name stat.fm.data -#' @aliases sfm.dat sfm.apca.dat -#' @docType data -#' @references Berndt, E. R. (1991). The Practice of Econometrics: Classic and -#' Contemporary. Addison-Wesley Publishing Co. -#' @source S+FinMetrics Berndt.dat & folio.dat -#' @keywords datasets -NULL - - - - - -#' constructed NYSE 447 assets from 1996-01-01 through 2003-12-31. -#' -#' constructed NYSE 447 assets from 1996-01-01 through 2003-12-31. 
-#' -#' Continuous data: PRICE, RETURN, VOLUME, SHARES.OUT, MARKET.EQUITY, LTDEBT, -#' NET.SALES, COMMON.EQUITY, NET.INCOME, STOCKHOLDERS.EQUITY, LOG.MARKETCAP, -#' LOG.PRICE, BOOK2MARKET Categorical data: GICS, GICS.INDUSTRY, GICS.SECTOR -#' Identification data: DATE, PERMNO, TICKER.x -#' -#' @name stock -#' @docType data -#' @references Guy Yollin and Yi-An Chen -#' @keywords datasets -NULL - - - Deleted: pkg/FactorAnalytics/sandbox/R/bootstrapFactorESdecomposition.r =================================================================== --- pkg/FactorAnalytics/sandbox/R/bootstrapFactorESdecomposition.r 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/bootstrapFactorESdecomposition.r 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,83 +0,0 @@ -bootstrapFactorESdecomposition <- function(bootData, beta.vec, sig2.e, tail.prob = 0.01, - method=c("average"), - VaR.method=c("HS", "CornishFisher")) { -## Compute factor model ES decomposition based on Euler's theorem given bootstrap data -## and factor model parameters. -## The partial derivative of ES wrt factor beta is computed -## as the expected factor return given fund return is less than or equal to portfolio VaR. -## VaR is computed either as the sample quantile or as an estimated quantile -## using the Cornish-Fisher expansion. -## inputs: -## bootData B x (k+2) matrix of bootstrap data. First column contains the fund returns, -## second through k+1 columns contain factor returns, column k+2 contains residuals -## scaled to have variance 1. -## beta.vec k x 1 vector of factor betas -## sig2.e scalar, residual variance from factor model -## tail.prob scalar tail probability -## method character, method for computing marginal ES. Valid choices are -## "average" for approximating E[Fj | R<=VaR] -## VaR.method character, method for computing VaR. Valid choices are "HS" for -## historical simulation (empirical quantile); "CornishFisher" for -## modified VaR based on Cornish-Fisher quantile estimate. Cornish-Fisher -## computation is done with VaR.CornishFisher in the PerformanceAnalytics -## package -## Output: -## A list with the following components: -## ES.fm scalar, bootstrap ES value for fund reported as a positive number -## mcES.fm k+1 x 1 vector of factor marginal contributions to ES -## cES.fm k+1 x 1 vector of factor component contributions to ES -## pcES.fm k+1 x 1 vector of factor percent contributions to ES -## Remarks: -## The factor model has the form -## R(t) = beta'F(t) + e(t) = beta.star'F.star(t) -## where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' -## By Euler's theorem -## ES.fm = sum(cES.fm) = sum(beta.star*mcES.fm) -## References: -## 1. Hallerbach (2003), "Decomposing Portfolio Value-at-Risk: A General Analysis", -## The Journal of Risk 5/2. -## 2. Yamai and Yoshiba (2002). "Comparative Analyses of Expected Shortfall and -## Value-at-Risk: Their Estimation Error, Decomposition, and Optimization", -## Bank of Japan. -## 3. Meucci (2007). "Risk Contributions from Generic User-Defined Factors," Risk.
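Aside: the "average" method described in the header above reduces to a conditional mean over the tail observations: VaR as a sample quantile, ES as the mean return beyond it, and marginal ES as E[Fj | R <= VaR]. A minimal self-contained sketch with toy data (all names below are illustrative, not package API):

# sketch of the "average" estimator on simulated data
set.seed(1)
Fmat <- matrix(rnorm(2000), ncol=2)                        # B = 1000 draws of k = 2 factor returns
R <- as.vector(Fmat %*% c(0.8, 0.4)) + rnorm(1000, sd=0.05) # toy fund returns under the factor model
VaR.hs <- quantile(R, probs=0.05)                          # HS VaR: the 5% sample quantile
idx <- which(R <= VaR.hs)                                  # tail observations
ES.hs <- -mean(R[idx])                                     # ES, reported as a positive number
mcES <- -colMeans(Fmat[idx, ])                             # marginal contributions: -E[Fj | R <= VaR]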
- require(PerformanceAnalytics) - VaR.method = VaR.method[1] - bootData = as.matrix(bootData) - ncol.bootData = ncol(bootData) - beta.names = c(names(beta.vec), "residual") - #beta.vec = as.vector(beta.vec) - beta.star.vec = c(beta.vec, sqrt(sig2.e)) - names(beta.star.vec) = beta.names - - if (VaR.method == "HS") { - VaR.fm = quantile(bootData[, 1], prob=tail.prob) - idx = which(bootData[, 1] <= VaR.fm) - ES.fm = -mean(bootData[idx, 1]) - } else { - VaR.fm = -VaR.CornishFisher(bootData[, 1], p=(1-tail.prob)) - idx = which(bootData[, 1] <= VaR.fm) - ES.fm = -mean(bootData[idx, 1]) - } - ## - ## compute marginal contribution to ES - ## - if (method == "average") { - ## compute marginal ES as expected value of factor return given fund - ## return is less than or equal to VaR - mcES.fm = -as.matrix(colMeans(bootData[idx, -1])) - } else { - stop("invalid method") - } - -## compute correction factor so that sum of weighted marginal ES adds to portfolio ES -#cf = as.numeric( ES.fm / sum(mcES.fm*beta.star.vec) ) -#mcES.fm = cf*mcES.fm -cES.fm = mcES.fm*beta.star.vec -pcES.fm = cES.fm/ES.fm -ans = list(ES.fm = ES.fm, - mcES.fm = mcES.fm, - cES.fm = cES.fm, - pcES.fm = pcES.fm) -return(ans) -} Deleted: pkg/FactorAnalytics/sandbox/R/bootstrapFactorVaRdecomposition.r =================================================================== --- pkg/FactorAnalytics/sandbox/R/bootstrapFactorVaRdecomposition.r 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/bootstrapFactorVaRdecomposition.r 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,90 +0,0 @@ -bootstrapFactorVaRdecomposition <- function(bootData, beta.vec, sig2.e, h=NULL, tail.prob = 0.01, - method=c("average"), - VaR.method=c("HS", "CornishFisher")) { -## Compute factor model VaR decomposition based on Euler's theorem given bootstrap data -## and factor model parameters. -## The partial derivative of VaR wrt factor beta is computed -## as the expected factor return given fund return is equal to portfolio VaR. -## VaR is computed either as the sample quantile or as an estimated quantile -## using the Cornish-Fisher expansion. -## inputs: -## bootData B x (k+2) matrix of bootstrap data. First column contains the fund returns, -## second through k+1 columns contain factor returns, column k+2 contains residuals -## scaled to have variance 1. -## beta.vec k x 1 vector of factor betas -## sig2.e scalar, residual variance from factor model -## h integer, number of obs on each side of VaR. Default is h=round(sqrt(B)/2) -## tail.prob scalar tail probability -## method character, method for computing marginal VaR. Valid choices are -## "average" for approximating E[Fj | R=VaR] -## VaR.method character, method for computing VaR. Valid choices are "HS" for -## historical simulation (empirical quantile); "CornishFisher" for -## modified VaR based on Cornish-Fisher quantile estimate.
Cornish-Fisher -## computation is done with the VaR.CornishFisher in the PerformanceAnalytics -## package -## output: -## Output: -## A list with the following components: -## VaR.fm scalar, bootstrap VaR value for fund reported as a positive number -## mcVaR.fm k+1 x 1 vector of factor marginal contributions to VaR -## cVaR.fm k+1 x 1 vector of factor component contributions to VaR -## pcVaR.fm k+1 x 1 vector of factor percent contributions to VaR -## Remarks: -## The factor model has the form -## R(t) = beta'F(t) + e(t) = beta.star'F.star(t) -## where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' -## By Euler's theorem -## VaR.fm = sum(cVaR.fm) = sum(beta.star*mcVaR.fm) -## References: -## 1. Hallerback (2003), "Decomposing Portfolio Value-at-Risk: A General Analysis", -## The Journal of Risk 5/2. -## 2. Yamai and Yoshiba (2002). "Comparative Analyses of Expected Shortfall and -## Value-at-Risk: Their Estimation Error, Decomposition, and Optimization -## Bank of Japan. -## 3. Meucci (2007). "Risk Contributions from Generic User-Defined Factors," Risk. - require(PerformanceAnalytics) - VaR.method = VaR.method[1] - bootData = as.matrix(bootData) - ncol.bootData = ncol(bootData) - beta.names = c(names(beta.vec), "residual") - #beta.vec = as.vector(beta.vec) - beta.star.vec = c(beta.vec, sqrt(sig2.e)) - names(beta.star.vec) = beta.names - - # determine number of obvs to average around VaR - if (is.null(h)) { - h = round(sqrt(nrow(bootData))) - } else h = round(h) - - if (VaR.method == "HS") { - VaR.fm = -quantile(bootData[,1], prob=tail.prob) - } else { - VaR.fm = VaR.CornishFisher(bootData[,1], p=(1-tail.prob)) - } - ## - ## compute marginal contribution to VaR - ## - if (method == "average") { - ## compute marginal VaR as expected value of fund return given portfolio - ## return is equal to portfolio VaR - r.sort = sort(bootData[,1]) - idx.lower = which(r.sort <= -VaR.fm) - idx.upper = which(r.sort > -VaR.fm) - r.vals = c(r.sort[tail(idx.lower,n=h)], r.sort[head(idx.upper,n=h)]) - idx = which(bootData[,1] %in% r.vals) - mcVaR.fm = -as.matrix(colMeans(bootData[idx,-1])) - } else { - stop("invalid method") - } - -## compute correction factor so that sum of weighted marginal VaR adds to portfolio VaR -cf = as.numeric( VaR.fm / sum(mcVaR.fm*beta.star.vec) ) -mcVaR.fm = cf*mcVaR.fm -cVaR.fm = mcVaR.fm*beta.star.vec -pcVaR.fm = cVaR.fm/VaR.fm -ans = list(VaR.fm = VaR.fm, - mcVaR.fm = mcVaR.fm, - cVaR.fm = cVaR.fm, - pcVaR.fm = pcVaR.fm) -return(ans) -} Deleted: pkg/FactorAnalytics/sandbox/R/chart.RollingStyle.R =================================================================== --- pkg/FactorAnalytics/sandbox/R/chart.RollingStyle.R 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/chart.RollingStyle.R 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,52 +0,0 @@ -chart.RollingStyle <- -function (R.fund, R.style, method = c("constrained","unconstrained","normalized"), leverage = FALSE, width = 12, main = NULL, space = 0, ...) 
-{ # @author Peter Carl - - result<-table.RollingStyle(R.fund=R.fund, R.style=R.style, method=method,leverage=leverage,width=width) - - if (is.null(main)){ - freq = periodicity(R.fund) - - switch(freq$scale, - minute = {freq.lab = "minute"}, - hourly = {freq.lab = "hour"}, - daily = {freq.lab = "day"}, - weekly = {freq.lab = "week"}, - monthly = {freq.lab = "month"}, - quarterly = {freq.lab = "quarter"}, - yearly = {freq.lab = "year"} - ) - - main = paste(colnames(R.fund)[1]," Rolling ", width ,"-",freq.lab," Style Weights", sep="") - } - - chart.StackedBar(result, main = main, space = space, ...) - -} - -############################################################################### -# R (http://r-project.org/) Econometrics for Performance and Risk Analysis -# -# Copyright (c) 2004-2007 Peter Carl and Brian G. Peterson -# -# This library is distributed under the terms of the GNU Public License (GPL) -# for full details see the file COPYING -# -# $Id: chart.RollingStyle.R 1796 2011-01-19 16:18:07Z braverock $ -# -############################################################################### -# $Log: not supported by cvs2svn $ -# Revision 1.4 2009-10-15 21:50:19 brian -# - updates to add automatic periodicity to chart labels, and support different frequency data -# -# Revision 1.3 2008-07-11 03:22:01 peter -# - removed unnecessary function attributes -# -# Revision 1.2 2008-04-18 03:59:52 peter -# - added na.omit to avoid problems with missing data -# -# Revision 1.1 2008/02/23 05:55:21 peter -# - chart demonstrating fund exposures through time -# -# -############################################################################### Deleted: pkg/FactorAnalytics/sandbox/R/chart.Style.R =================================================================== --- pkg/FactorAnalytics/sandbox/R/chart.Style.R 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/chart.Style.R 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,195 +0,0 @@ -#' calculate and display effective style weights -#' -#' Functions that calculate effective style weights and display the results in -#' a bar chart. \code{chart.Style} calculates and displays style weights -#' calculated over a single period. \code{chart.RollingStyle} calculates and -#' displays those weights in rolling windows through time. \code{style.fit} -#' manages the calculation of the weights by method. \code{style.QPfit} -#' calculates the specific constraint case that requires quadratic programming. -#' -#' These functions calculate style weights using an asset class style model as -#' described in detail in Sharpe (1992). The use of quadratic programming to -#' determine a fund's exposures to the changes in returns of major asset -#' classes is usually refered to as "style analysis". -#' -#' The "unconstrained" method implements a simple factor model for style -#' analysis, as in: \deqn{Ri = bi1*F1+bi2*F2+...+bin*Fn+ei}{R_i = -#' b_{i1}F_1+b_{i2}F_2+\dots+b_{in}F_n +e_i} where \eqn{Ri}{R_i} represents the -#' return on asset i, \eqn{Fj}{F_j} represents each factor, and \eqn{ei}{e_i} -#' represents the "non-factor" component of the return on i. This is simply a -#' multiple regression analysis with fund returns as the dependent variable and -#' asset class returns as the independent variables. The resulting slope -#' coefficients are then interpreted as the fund's historic exposures to asset -#' class returns. In this case, coefficients do not sum to 1. 
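Aside: concretely, the "unconstrained" case just described is an ordinary multiple regression. A hedged sketch, assuming R.fund and R.style are already time-aligned single-column and multi-column return objects as the arguments above require:

# sketch only, not part of the deleted file
df <- data.frame(fund = as.numeric(R.fund), as.data.frame(R.style))
fit <- lm(fund ~ ., data = df)   # fund returns regressed on style-index returns
coef(fit)[-1]                    # slope coefficients = historic style exposures
sum(coef(fit)[-1])               # need not equal 1 in the unconstrained case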
-#' -#' The "normalized" method reports the results of a multiple regression -#' analysis similar to the first, but with one constraint: the coefficients are -#' required to add to 1. Coefficients may be negative, indicating short -#' exposures. To enforce the constraint, coefficients are normalized. -#' -#' The "constrained" method includes the constraint that the coefficients sum -#' to 1, but adds that the coefficients must lie between 0 and 1. These -#' inequality constraints require a quadratic programming algorithm using -#' \code{\link[quadprog]{solve.QP}} from the 'quadprog' package, and the -#' implementation is discussed under \code{\link{style.QPfit}}. If set to -#' TRUE, "leverage" allows the sum of the coefficients to exceed 1. -#' -#' According to Sharpe (1992), the calculation for the constrained case is -#' represented as: \deqn{min var(Rf - sum[wi * R.si]) = min var(F - w*S)}{min -#' \sigma(R_f - \sum{w_i * R_s_i}) = min \sigma(F - w*S)} \deqn{s.t. sum[wi] = -#' 1; wi > 0}{ s.t. \sum{w_i} = 1; w_i > 0} -#' -#' Remembering that: -#' -#' \deqn{\sigma(aX + bY) = a^2 \sigma(X) + b^2 \sigma(Y) + 2ab cov(X,Y) = -#' \sigma(R.f) + w'*V*w - 2*w'*cov(R.f,R.s)} -#' -#' we can drop \eqn{var(Rf)}{\sigma(R_f)} as it isn't a function of weights, -#' multiply both sides by 1/2: -#' -#' \deqn{= min (1/2) w'*V*w - C'w}{= min (1/2) w'*V*w - C'w} \deqn{ s.t. w'*e = -#' 1, w_i > 0}{ s.t. w'*e = 1, w_i > 0} -#' -#' Which allows us to use \code{\link[quadprog]{solve.QP}}, which is specified -#' as: \deqn{min(-d' b + 1/2 b' D b)}{min(-d' b + 1/2 b' D b)} and the -#' constraints \deqn{ A' b >= b.0 }{ A' b >= b_0 } -#' -#' so: b is the weight vector, D is the variance-covariance matrix of the -#' styles d is the covariance vector between the fund and the styles -#' -#' The chart functions then provide a graphical summary of the results. The -#' underlying function, \code{\link{style.fit}}, provides the outputs of the -#' analysis and more information about fit, including an R-squared value. -#' -#' Styles identified in this analysis may be interpreted as an average of -#' potentially changing exposures over the period covered. The function -#' \code{\link{chart.RollingStyle}} may be useful for examining the behavior of -#' a manager's average exposures to asset classes over time, using a -#' rolling-window analysis. -#' -#' The chart functions plot a column chart or stacked column chart of the -#' resulting style weights to the current device. Both \code{style.fit} and -#' \code{style.QPfit} produce a list of data frames containing 'weights' and -#' 'R.squared' results. If 'model' = TRUE in \code{style.QPfit}, the full -#' result set is shown from the output of \code{solve.QP}. -#' -#' @aliases chart.Style chart.RollingStyle table.RollingStyle style.fit -#' style.QPfit -#' @param R.fund matrix, data frame, or zoo object with fund returns to be -#' analyzed -#' @param R.style matrix, data frame, or zoo object with style index returns. -#' Data object must be of the same length and time-aligned with R.fund -#' @param method specify the method of calculation of style weights as -#' "constrained", "unconstrained", or "normalized". For more information, see -#' \code{\link{style.fit}} -#' @param leverage logical, defaults to 'FALSE'. If 'TRUE', the calculation of -#' weights assumes that leverage may be used. For more information, see -#' \code{\link{style.fit}} -#' @param model logical. If 'model' = TRUE in \code{\link{style.QPfit}}, the -#' full result set is shown from the output of \code{solve.QP}. 
-#' @param selection either "none" (default) or "AIC". If "AIC", then the -#' function uses a stepwise regression to identify find the model with minimum -#' AIC value. See \code{\link{step}} for more detail. -#' @param unstacked logical. If set to 'TRUE' \emph{and} only one row of data -#' is submitted in 'w', then the chart creates a normal column chart. If more -#' than one row is submitted, then this is ignored. See examples below. -#' @param space the amount of space (as a fraction of the average bar width) -#' left before each bar, as in \code{\link{barplot}}. Default for -#' \code{chart.RollingStyle} is 0; for \code{chart.Style} the default is 0.2. -#' @param main set the chart title, same as in \code{\link{plot}} -#' @param width number of periods or window to apply rolling style analysis -#' over -#' @param ylim set the y-axis limit, same as in \code{\link{plot}} -#' @param \dots for the charting functions, these are arguments to be passed to -#' \code{\link{barplot}}. These can include further arguments (such as 'axes', -#' 'asp' and 'main') and graphical parameters (see 'par') which are passed to -#' 'plot.window()', 'title()' and 'axis'. For the calculation functions, these -#' are ignored. -#' @note None of the functions \code{chart.Style}, \code{style.fit}, and -#' \code{style.QPfit} make any attempt to align the two input data series. The -#' \code{chart.RollingStyle}, on the other hand, does merge the two series and -#' manages the calculation over common periods. -#' @author Peter Carl -#' @seealso \code{\link{barplot}}, \code{\link{par}} -#' @references Sharpe, W. Asset Allocation: Management Style and Performance -#' Measurement Journal of Portfolio Management, 1992, 7-19. See \url{ -#' http://www.stanford.edu/~wfsharpe/art/sa/sa.htm} -#' @keywords ts multivariate hplot -#' @examples -#' -#' data(edhec) -#' data(managers) -#' style.fit(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE) -#' chart.Style(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE, unstack=TRUE, las=3) -#' chart.RollingStyle(managers[,2,drop=FALSE],edhec[,1:11], method="constrained", leverage=FALSE, width=36, cex.legend = .7, colorset=rainbow12equal, las=1) -#' -`chart.Style` <- -function (R.fund, R.style, method = c("constrained", "unconstrained", "normalized"), leverage = FALSE, main = NULL, ylim = NULL, unstacked=TRUE, ...) -{ # @author Peter Carl - - # DESCRIPTION: - # A wrapper to create a chart of relative returns through time - - # R-Squared could deliver adjusted R-Squared if we wanted - - # FUNCTION: - - # Transform input data to a data frame - R.fund = checkData(R.fund) - R.style = checkData(R.style) - method = method[1] - - # Calculate - result = style.fit(R.fund, R.style, method = method, leverage = leverage) - weights = t(as.matrix(result$weights)) - - if(is.null(main)) - main = paste(colnames(R.fund)[1] ," Style Weights", sep="") - - if(is.null(ylim)) - if(method == "constrained" & leverage == FALSE) ylim = c(0,1) - else ylim = NULL - - chart.StackedBar(weights, main = main, ylim = ylim, unstacked = unstacked, ...) -# barplot(weights, main = main, ylim = ylim, ...) - -} - -############################################################################### -# R (http://r-project.org/) Econometrics for Performance and Risk Analysis -# -# Copyright (c) 2004-2007 Peter Carl and Brian G. 
Peterson -# -# This library is distributed under the terms of the GNU Public License (GPL) -# for full details see the file COPYING -# -# $Id: chart.Style.R 2351 2013-06-18 09:52:27Z braverock $ -# -############################################################################### -# $Log: not supported by cvs2svn $ -# Revision 1.7 2008-07-11 03:24:52 peter -# - fixed error with alignment of results -# -# Revision 1.6 2008-04-18 03:58:04 peter -# - reduced to a wrapper to chart.StackedBar -# -# Revision 1.5 2008/02/27 04:05:32 peter -# - added 'leverage' tag to eliminate sum to one constraint -# - added cex.names for controlling size of xaxis labels -# -# Revision 1.4 2008/02/26 04:49:06 peter -# - handles single column fits better -# -# Revision 1.3 2008/02/26 04:39:40 peter -# - moved legend and margin control into chart.StackedBar -# - handles multiple columns -# -# Revision 1.2 2008/02/23 05:35:56 peter -# - set ylim more sensibly depending on method -# -# Revision 1.1 2008/02/23 05:32:37 peter -# - simple bar chart of a fund's exposures to a set of factors, as determined -# by style.fit -# -# -############################################################################### Deleted: pkg/FactorAnalytics/sandbox/R/covEWMA.R =================================================================== --- pkg/FactorAnalytics/sandbox/R/covEWMA.R 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/covEWMA.R 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,79 +0,0 @@ -#' Compute RiskMetrics-type EWMA Covariance Matrix -#' -#' Compute time series of RiskMetrics-type EWMA covariance matrices of returns. -#' Initial covariance matrix is assumed to be the unconditional covariance -#' matrix. -#' -#' The EWMA covariance matrix at time \code{t} is computed as \cr \code{Sigma(t) -#' = lambda*Sigma(t-1) + (1-lambda)*R(t)t(R(t))} \cr where \code{R(t)} is the -#' \code{K x 1} vector of returns at time \code{t}. -#' -#' @param factors \code{T x K} data.frame containing asset returns, where -#' \code{T} is the number of time periods and \code{K} is the number of assets. -#' @param lambda Scalar exponential decay factor. Must lie between 0 -#' and 1. -#' @param return.cor Logical, if TRUE then return EWMA correlation matrices. -#' @return \code{T x K x K} array giving the time series of EWMA covariance -#' matrices if \code{return.cor=FALSE} and EWMA correlation matrices if -#' \code{return.cor=TRUE}. -#' @author Eric Zivot and Yi-An Chen. -#' @references Zivot, E. and J. Wang (2006), \emph{Modeling Financial Time -#' Series with S-PLUS, Second Edition}, Springer-Verlag. -#' @examples -#' -#' # compute time varying covariance of factors. -#' data(managers.df) -#' factors = managers.df[,(7:9)] -#' cov.f.ewma <- covEWMA(factors) -#' cov.f.ewma[120,,] -#' -covEWMA <- -function(factors, lambda=0.96, return.cor=FALSE) { -## Inputs: -## factors N x K numerical factors data. data is class data.frame -## N is the time length and K is the number of the factors. -## lambda scalar. exponential decay factor between 0 and 1. -## return.cor Logical, if TRUE then return EWMA correlation matrices -## Output: -## cov.f.ewma array. dimension is N x K x K. -## comments: -## 1. add optional argument cov.start to specify initial covariance matrix -## 2.
allow data input to be data class to be any rectangular data object - - -if (is.data.frame(factors)){ - factor.names = colnames(factors) - t.factor = nrow(factors) - k.factor = ncol(factors) - factors = as.matrix(factors) - t.names = rownames(factors) -} else { - stop("factor data should be saved in data.frame class.") -} -if (lambda>=1 || lambda <= 0){ - stop("exponential decay value lambda should be between 0 and 1.") -} else { - cov.f.ewma = array(,c(t.factor,k.factor,k.factor)) - cov.f = var(factors) # unconditional variance as EWMA at time = 0 - FF = (factors[1,]- mean(factors)) %*% t(factors[1,]- mean(factors)) - cov.f.ewma[1,,] = (1-lambda)*FF + lambda*cov.f - for (i in 2:t.factor) { - FF = (factors[i,]- mean(factors)) %*% t(factors[i,]- mean(factors)) - cov.f.ewma[i,,] = (1-lambda)*FF + lambda*cov.f.ewma[(i-1),,] - } - -} - # 9/15/11: add dimnames to array - dimnames(cov.f.ewma) = list(t.names, factor.names, factor.names) - - if(return.cor) { - cor.f.ewma = cov.f.ewma - for (i in 1:dim(cor.f.ewma)[1]) { - cor.f.ewma[i, , ] = cov2cor(cov.f.ewma[i, ,]) - } - return(cor.f.ewma) - } else{ - return(cov.f.ewma) - } -} - Deleted: pkg/FactorAnalytics/sandbox/R/factorModelFactorRiskDecomposition.r =================================================================== --- pkg/FactorAnalytics/sandbox/R/factorModelFactorRiskDecomposition.r 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/factorModelFactorRiskDecomposition.r 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,53 +0,0 @@ -## factorModelFactorRiskDecomposition.r -## -## purpose: Compute factor model factor risk (sd) decomposition for individual -## fund -## author: Eric Zivot -## created: August 13, 2009 -## revision history: -## July 1, 2010 -## Added comment to inputs -## June 8, 2010 -## Added percent contribution to risk as output - -factorModelFactorRiskDecomposition <- function(beta.vec, factor.cov, sig2.e) { - ## Inputs: - ## beta k x 1 vector of factor betas with factor names in the rownames - ## factor.cov k x k factor excess return covariance matrix - ## sig2.e scalar, residual variance from factor model - ## Output: - ## A list with the following components: - ## sd.fm scalar, std dev based on factor model - ## mcr.fm k+1 x 1 vector of factor marginal contributions to risk (sd) - ## cr.fm k+1 x 1 vector of factor component contributions to risk (sd) - ## pcr.fm k+1 x 1 vector of factor percent contributions to risk (sd) - ## Remarks: - ## The factor model has the form - ## R(t) = beta'F(t) + e(t) = beta.star'F.star(t) - ## where beta.star = (beta, sig.e)' and F.star(t) = (F(t)', z(t))' - ## By Euler's theorem - ## sd.fm = sum(cr.fm) = sum(beta*mcr.fm) - beta.names = c(rownames(beta.vec), "residual") - beta.vec = as.vector(beta.vec) - beta.star.vec = c(beta.vec, sqrt(sig2.e)) - names(beta.star.vec) = beta.names - factor.cov = as.matrix(factor.cov) - k.star = length(beta.star.vec) - k = k.star - 1 - factor.star.cov = diag(k.star) - factor.star.cov[1:k, 1:k] = factor.cov - - ## compute factor model sd - sd.fm = as.numeric(sqrt(t(beta.star.vec) %*% factor.star.cov %*% beta.star.vec)) - ## compute marginal and component contributions to sd - mcr.fm = (factor.star.cov %*% beta.star.vec)/sd.fm - cr.fm = mcr.fm * beta.star.vec - pcr.fm = cr.fm/sd.fm - rownames(mcr.fm) <- rownames(cr.fm) <- rownames(pcr.fm) <- beta.names - ## return results - ans = list(sd.fm = sd.fm, - mcr.fm = mcr.fm, - cr.fm = cr.fm, - pcr.fm = pcr.fm) - return(ans) -} Deleted: pkg/FactorAnalytics/sandbox/R/factorModelGroupRiskDecomposition.r 
=================================================================== --- pkg/FactorAnalytics/sandbox/R/factorModelGroupRiskDecomposition.r 2014-11-14 20:12:16 UTC (rev 3549) +++ pkg/FactorAnalytics/sandbox/R/factorModelGroupRiskDecomposition.r 2014-11-14 23:22:05 UTC (rev 3550) @@ -1,78 +0,0 @@ -## factorModelGroupRiskDecomposition.r -## -## purpose: Compute factor model risk decomposition for individual fund by risk groups -## Risk groups are equity, rates, credit, fx, commondity, strategy -## -## author: Eric Zivot -## created: July 9, 2009 -## revised: July 9, 2009 - -factorModelGroupRiskDecomposition <- function(beta.vec, factor.cov, sig2.e, - equityIds, ratesIds, creditIds, - fxIds, cmdtyIds, strategyIds) { -## Inputs: -## beta k x 1 vector of factor betas -## factor.cov k x k factor excess return covariance matrix -## sig2.e scalar, residual variance from factor model [TRUNCATED] To get the complete diff run: svnlook diff /svnroot/returnanalytics -r 3550 From noreply at r-forge.r-project.org Sat Nov 15 00:24:01 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sat, 15 Nov 2014 00:24:01 +0100 (CET) Subject: [Returnanalytics-commits] r3551 - pkg/FactorAnalytics/sandbox Message-ID: <20141114232401.8DA20186952@r-forge.r-project.org> Author: gyollin Date: 2014-11-15 00:24:01 +0100 (Sat, 15 Nov 2014) New Revision: 3551 Removed: pkg/FactorAnalytics/sandbox/R/ Log: From noreply at r-forge.r-project.org Sat Nov 15 23:54:45 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sat, 15 Nov 2014 23:54:45 +0100 (CET) Subject: [Returnanalytics-commits] r3552 - pkg/FactorAnalytics/sandbox/Man Message-ID: <20141115225445.6B76018762F@r-forge.r-project.org> Author: pragnya Date: 2014-11-15 23:54:45 +0100 (Sat, 15 Nov 2014) New Revision: 3552 Removed: pkg/FactorAnalytics/sandbox/Man/Style.Rd pkg/FactorAnalytics/sandbox/Man/chart.Style.Rd pkg/FactorAnalytics/sandbox/Man/covEWMA.Rd pkg/FactorAnalytics/sandbox/Man/impliedFactorReturns.Rd pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalES.Rd pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalVaR.Rd pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioEsDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioVaRDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/modifiedVaRReport.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricEsReport.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricIncrementalES.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricIncrementalVaR.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricPortfolioEsDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricPortfolioVaRDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/nonparametricVaRReport.Rd pkg/FactorAnalytics/sandbox/Man/normalEsReport.Rd pkg/FactorAnalytics/sandbox/Man/normalIncrementalES.Rd pkg/FactorAnalytics/sandbox/Man/normalIncrementalVaR.Rd pkg/FactorAnalytics/sandbox/Man/normalPortfolioEsDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/normalPortfolioVaRDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/normalVaRReport.Rd pkg/FactorAnalytics/sandbox/Man/portfolioSdDecomposition.Rd pkg/FactorAnalytics/sandbox/Man/scenarioPredictions.Rd pkg/FactorAnalytics/sandbox/Man/scenarioPredictionsPortfolio.Rd Log: Deleted unused file Deleted: pkg/FactorAnalytics/sandbox/Man/Style.Rd =================================================================== --- pkg/FactorAnalytics/sandbox/Man/Style.Rd 2014-11-14 23:24:01 UTC (rev 3551) +++ 
pkg/FactorAnalytics/sandbox/Man/Style.Rd 2014-11-15 22:54:45 UTC (rev 3552) @@ -1,104 +0,0 @@ -\name{chart.Style} -\alias{chart.Style} -\alias{chart.RollingStyle} -\alias{table.RollingStyle} -\alias{style.fit} -\alias{style.QPfit} -%- Also NEED an '\alias' for EACH other topic documented here. -\title{ calculate and display effective style weights } -\description{ - Functions that calculate effective style weights and display the results in a bar chart. \code{chart.Style} calculates and displays style weights calculated over a single period. \code{chart.RollingStyle} calculates and displays those weights in rolling windows through time. \code{style.fit} manages the calculation of the weights by method. \code{style.QPfit} calculates the specific constraint case that requires quadratic programming. -} -\usage{ -chart.Style(R.fund, R.style, method = c("constrained", "unconstrained", "normalized"), leverage = FALSE, main = NULL, ylim = NULL, unstacked=TRUE, ...) - -chart.RollingStyle(R.fund, R.style, method = c("constrained","unconstrained","normalized"), leverage = FALSE, width = 12, main = NULL, space = 0, ...) - -style.fit(R.fund, R.style, model=FALSE, method = c("constrained", "unconstrained", "normalized"), leverage = FALSE, selection = c("none", "AIC"), ...) - -style.QPfit(R.fund, R.style, model = FALSE, leverage = FALSE, ...) - -} -%- maybe also 'usage' for other objects documented here. -\arguments{ - \item{R.fund}{ matrix, data frame, or zoo object with fund returns to be analyzed } - \item{R.style}{ matrix, data frame, or zoo object with style index returns. Data object must be of the same length and time-aligned with R.fund } - \item{method}{ specify the method of calculation of style weights as "constrained", "unconstrained", or "normalized". For more information, see \code{\link{style.fit}} } - \item{leverage}{ logical, defaults to 'FALSE'. If 'TRUE', the calculation of weights assumes that leverage may be used. For more information, see \code{\link{style.fit}} } - \item{model}{ logical. If 'model' = TRUE in \code{\link{style.QPfit}}, the full result set is shown from the output of \code{solve.QP}. } - \item{selection}{ either "none" (default) or "AIC". If "AIC", then the function uses a stepwise regression to identify find the model with minimum AIC value. See \code{\link{step}} for more detail.} - \item{unstacked}{ logical. If set to 'TRUE' \emph{and} only one row of data is submitted in 'w', then the chart creates a normal column chart. If more than one row is submitted, then this is ignored. See examples below. } - \item{space}{ the amount of space (as a fraction of the average bar width) left before each bar, as in \code{\link{barplot}}. Default for \code{chart.RollingStyle} is 0; for \code{chart.Style} the default is 0.2. } - \item{main}{ set the chart title, same as in \code{\link{plot}} } - \item{width}{ number of periods or window to apply rolling style analysis over } - \item{ylim}{ set the y-axis limit, same as in \code{\link{plot}} } - \item{\dots}{ for the charting functions, these are arguments to be passed to \code{\link{barplot}}. These can include further arguments (such as 'axes', 'asp' and 'main') and graphical parameters (see 'par') which are passed to 'plot.window()', 'title()' and 'axis'. For the calculation functions, these are ignored. } -} -\details{ -These functions calculate style weights using an asset class style model as described in detail in Sharpe (1992). 
The use of quadratic programming to determine a fund's exposures to the changes in returns of major asset classes is usually refered to as "style analysis". - -The "unconstrained" method implements a simple factor model for style analysis, as in: -\deqn{Ri = bi1*F1+bi2*F2+...+bin*Fn+ei}{R_i = b_{i1}F_1+b_{i2}F_2+\dots+b_{in}F_n +e_i} -where \eqn{Ri}{R_i} represents the return on asset i, \eqn{Fj}{F_j} represents each factor, and \eqn{ei}{e_i} represents the "non-factor" component of the return on i. This is simply a multiple regression analysis with fund returns as the dependent variable and asset class returns as the independent variables. The resulting slope coefficients are then interpreted as the fund's historic exposures to asset class returns. In this case, coefficients do not sum to 1. - -The "normalized" method reports the results of a multiple regression analysis similar to the first, but with one constraint: the coefficients are required to add to 1. Coefficients may be negative, indicating short exposures. To enforce the constraint, coefficients are normalized. - -The "constrained" method includes the constraint that the coefficients sum to 1, but adds -that the coefficients must lie between 0 and 1. These inequality constraints require a -quadratic programming algorithm using \code{\link[quadprog]{solve.QP}} from the 'quadprog' package, -and the implementation is discussed under \code{\link{style.QPfit}}. If set to TRUE, -"leverage" allows the sum of the coefficients to exceed 1. - -According to Sharpe (1992), the calculation for the constrained case is represented as: -\deqn{min var(Rf - sum[wi * R.si]) = min var(F - w*S)}{min \sigma(R_f - \sum{w_i * R_s_i}) = min \sigma(F - w*S)} -\deqn{s.t. sum[wi] = 1; wi > 0}{ s.t. \sum{w_i} = 1; w_i > 0} - -Remembering that: - -\deqn{\sigma(aX + bY) = a^2 \sigma(X) + b^2 \sigma(Y) + 2ab cov(X,Y) = \sigma(R.f) + w'*V*w - 2*w'*cov(R.f,R.s)} - -we can drop \eqn{var(Rf)}{\sigma(R_f)} as it isn't a function of weights, multiply both sides by 1/2: - -\deqn{= min (1/2) w'*V*w - C'w}{= min (1/2) w'*V*w - C'w} -\deqn{ s.t. w'*e = 1, w_i > 0}{ s.t. w'*e = 1, w_i > 0} - -Which allows us to use \code{\link[quadprog]{solve.QP}}, which is specified as: -\deqn{min(-d' b + 1/2 b' D b)}{min(-d' b + 1/2 b' D b)} -and the constraints -\deqn{ A' b >= b.0 }{ A' b >= b_0 } - -so: -b is the weight vector, -D is the variance-covariance matrix of the styles -d is the covariance vector between the fund and the styles - -The chart functions then provide a graphical summary of the results. The underlying -function, \code{\link{style.fit}}, provides the outputs of the analysis and more -information about fit, including an R-squared value. - -Styles identified in this analysis may be interpreted as an average of potentially -changing exposures over the period covered. The function \code{\link{chart.RollingStyle}} -may be useful for examining the behavior of a manager's average exposures to asset classes over time, using a rolling-window analysis. - - The chart functions plot a column chart or stacked column chart of the resulting style weights to the current device. Both \code{style.fit} and \code{style.QPfit} produce a list of data frames containing 'weights' and 'R.squared' results. If 'model' = TRUE in \code{style.QPfit}, the full result set is shown from the output of \code{solve.QP}. -} -\references{ -Sharpe, W. Asset Allocation: Management Style and Performance Measurement Journal of Portfolio Management, 1992, 7-19. 
See \url{ http://www.stanford.edu/~wfsharpe/art/sa/sa.htm} - } -\author{ Peter Carl } -\note{ - None of the functions \code{chart.Style}, \code{style.fit}, and \code{style.QPfit} make any attempt to align the two input data series. The \code{chart.RollingStyle}, on the other hand, does merge the two series and manages the calculation over common periods. -} -\seealso{ \code{\link{barplot}}, \code{\link{par}} } -\examples{ -data(edhec) -data(managers) -style.fit(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE) -chart.Style(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE, unstack=TRUE, las=3) -chart.RollingStyle(managers[,2,drop=FALSE],edhec[,1:11], method="constrained", leverage=FALSE, width=36, cex.legend = .7, colorset=rainbow12equal, las=1) -} -% Add one or more standard keywords, see file 'KEYWORDS' in the -% R documentation directory. -\keyword{ ts } -\keyword{ multivariate } -\keyword{ hplot } Deleted: pkg/FactorAnalytics/sandbox/Man/chart.Style.Rd =================================================================== --- pkg/FactorAnalytics/sandbox/Man/chart.Style.Rd 2014-11-14 23:24:01 UTC (rev 3551) +++ pkg/FactorAnalytics/sandbox/Man/chart.Style.Rd 2014-11-15 22:54:45 UTC (rev 3552) @@ -1,191 +0,0 @@ -\name{chart.Style} -\alias{chart.RollingStyle} -\alias{chart.Style} -\alias{style.fit} -\alias{style.QPfit} -\alias{table.RollingStyle} -\title{calculate and display effective style weights} -\usage{ - chart.Style(R.fund, R.style, - method = c("constrained", "unconstrained", "normalized"), - leverage = FALSE, main = NULL, ylim = NULL, - unstacked = TRUE, ...) -} -\arguments{ - \item{R.fund}{matrix, data frame, or zoo object with fund - returns to be analyzed} - - \item{R.style}{matrix, data frame, or zoo object with - style index returns. Data object must be of the same - length and time-aligned with R.fund} - - \item{method}{specify the method of calculation of style - weights as "constrained", "unconstrained", or - "normalized". For more information, see - \code{\link{style.fit}}} - - \item{leverage}{logical, defaults to 'FALSE'. If 'TRUE', - the calculation of weights assumes that leverage may be - used. For more information, see \code{\link{style.fit}}} - - \item{model}{logical. If 'model' = TRUE in - \code{\link{style.QPfit}}, the full result set is shown - from the output of \code{solve.QP}.} - - \item{selection}{either "none" (default) or "AIC". If - "AIC", then the function uses a stepwise regression to - identify find the model with minimum AIC value. See - \code{\link{step}} for more detail.} - - \item{unstacked}{logical. If set to 'TRUE' \emph{and} - only one row of data is submitted in 'w', then the chart - creates a normal column chart. If more than one row is - submitted, then this is ignored. See examples below.} - - \item{space}{the amount of space (as a fraction of the - average bar width) left before each bar, as in - \code{\link{barplot}}. Default for - \code{chart.RollingStyle} is 0; for \code{chart.Style} - the default is 0.2.} - - \item{main}{set the chart title, same as in - \code{\link{plot}}} - - \item{width}{number of periods or window to apply rolling - style analysis over} - - \item{ylim}{set the y-axis limit, same as in - \code{\link{plot}}} - - \item{\dots}{for the charting functions, these are - arguments to be passed to \code{\link{barplot}}. 
These - can include further arguments (such as 'axes', 'asp' and - 'main') and graphical parameters (see 'par') which are - passed to 'plot.window()', 'title()' and 'axis'. For the - calculation functions, these are ignored.} -} -\description{ - Functions that calculate effective style weights and - display the results in a bar chart. \code{chart.Style} - calculates and displays style weights calculated over a - single period. \code{chart.RollingStyle} calculates and - displays those weights in rolling windows through time. - \code{style.fit} manages the calculation of the weights - by method. \code{style.QPfit} calculates the specific - constraint case that requires quadratic programming. -} -\details{ - These functions calculate style weights using an asset - class style model as described in detail in Sharpe - (1992). The use of quadratic programming to determine a - fund's exposures to the changes in returns of major asset - classes is usually refered to as "style analysis". - - The "unconstrained" method implements a simple factor - model for style analysis, as in: \deqn{Ri = - bi1*F1+bi2*F2+...+bin*Fn+ei}{R_i = - b_{i1}F_1+b_{i2}F_2+\dots+b_{in}F_n +e_i} where - \eqn{Ri}{R_i} represents the return on asset i, - \eqn{Fj}{F_j} represents each factor, and \eqn{ei}{e_i} - represents the "non-factor" component of the return on i. - This is simply a multiple regression analysis with fund - returns as the dependent variable and asset class returns - as the independent variables. The resulting slope - coefficients are then interpreted as the fund's historic - exposures to asset class returns. In this case, - coefficients do not sum to 1. - - The "normalized" method reports the results of a multiple - regression analysis similar to the first, but with one - constraint: the coefficients are required to add to 1. - Coefficients may be negative, indicating short exposures. - To enforce the constraint, coefficients are normalized. - - The "constrained" method includes the constraint that the - coefficients sum to 1, but adds that the coefficients - must lie between 0 and 1. These inequality constraints - require a quadratic programming algorithm using - \code{\link[quadprog]{solve.QP}} from the 'quadprog' - package, and the implementation is discussed under - \code{\link{style.QPfit}}. If set to TRUE, "leverage" - allows the sum of the coefficients to exceed 1. - - According to Sharpe (1992), the calculation for the - constrained case is represented as: \deqn{min var(Rf - - sum[wi * R.si]) = min var(F - w*S)}{min \sigma(R_f - - \sum{w_i * R_s_i}) = min \sigma(F - w*S)} \deqn{s.t. - sum[wi] = 1; wi > 0}{ s.t. \sum{w_i} = 1; w_i > 0} - - Remembering that: - - \deqn{\sigma(aX + bY) = a^2 \sigma(X) + b^2 \sigma(Y) + - 2ab cov(X,Y) = \sigma(R.f) + w'*V*w - 2*w'*cov(R.f,R.s)} - - we can drop \eqn{var(Rf)}{\sigma(R_f)} as it isn't a - function of weights, multiply both sides by 1/2: - - \deqn{= min (1/2) w'*V*w - C'w}{= min (1/2) w'*V*w - C'w} - \deqn{ s.t. w'*e = 1, w_i > 0}{ s.t. w'*e = 1, w_i > 0} - - Which allows us to use \code{\link[quadprog]{solve.QP}}, - which is specified as: \deqn{min(-d' b + 1/2 b' D - b)}{min(-d' b + 1/2 b' D b)} and the constraints \deqn{ - A' b >= b.0 }{ A' b >= b_0 } - - so: b is the weight vector, D is the variance-covariance - matrix of the styles d is the covariance vector between - the fund and the styles - - The chart functions then provide a graphical summary of - the results. 
The underlying function, - \code{\link{style.fit}}, provides the outputs of the - analysis and more information about fit, including an - R-squared value. - - Styles identified in this analysis may be interpreted as - an average of potentially changing exposures over the - period covered. The function - \code{\link{chart.RollingStyle}} may be useful for - examining the behavior of a manager's average exposures - to asset classes over time, using a rolling-window - analysis. - - The chart functions plot a column chart or stacked column - chart of the resulting style weights to the current - device. Both \code{style.fit} and \code{style.QPfit} - produce a list of data frames containing 'weights' and - 'R.squared' results. If 'model' = TRUE in - \code{style.QPfit}, the full result set is shown from the - output of \code{solve.QP}. -} -\note{ - None of the functions \code{chart.Style}, - \code{style.fit}, and \code{style.QPfit} make any attempt - to align the two input data series. The - \code{chart.RollingStyle}, on the other hand, does merge - the two series and manages the calculation over common - periods. -} -\examples{ -data(edhec) -data(managers) -style.fit(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE) -chart.Style(managers[97:132,2,drop=FALSE],edhec[85:120,], method="constrained", leverage=FALSE, unstack=TRUE, las=3) -chart.RollingStyle(managers[,2,drop=FALSE],edhec[,1:11], method="constrained", leverage=FALSE, width=36, cex.legend = .7, colorset=rainbow12equal, las=1) -} -\author{ - Peter Carl -} -\references{ - Sharpe, W. Asset Allocation: Management Style and - Performance Measurement Journal of Portfolio Management, - 1992, 7-19. See \url{ - http://www.stanford.edu/~wfsharpe/art/sa/sa.htm} -} -\seealso{ - \code{\link{barplot}}, \code{\link{par}} -} -\keyword{hplot} -\keyword{multivariate} -\keyword{ts} - Deleted: pkg/FactorAnalytics/sandbox/Man/covEWMA.Rd =================================================================== --- pkg/FactorAnalytics/sandbox/Man/covEWMA.Rd 2014-11-14 23:24:01 UTC (rev 3551) +++ pkg/FactorAnalytics/sandbox/Man/covEWMA.Rd 2014-11-15 22:54:45 UTC (rev 3552) @@ -1,49 +0,0 @@ -\name{covEWMA} -\alias{covEWMA} -\title{Compute RiskMetrics-type EWMA Covariance Matrix} -\usage{ - covEWMA(factors, lambda = 0.96, return.cor = FALSE) -} -\arguments{ - \item{factors}{\code{T x K} data.frame containing asset - returns, where \code{T} is the number of time periods and - \code{K} is the number of assets.} - - \item{lambda}{Scalar exponential decay factor. Must lie - between between 0 and 1.} - - \item{return.cor}{Logical, if TRUE then return EWMA - correlation matrices.} -} -\value{ - \code{T x K x K} array giving the time series of EWMA - covariance matrices if \code{return.cor=FALSE} and EWMA - correlation matrices if \code{return.cor=TRUE}. -} -\description{ - Compute time series of RiskMetrics-type EWMA covariance - matrices of returns. Initial covariance matrix is assumed - to be the unconditional covariance matrix. -} -\details{ - The EWMA covariance matrix at time \code{t} is compute as - \cr \code{Sigma(t) = lambda*Sigma(t-1) + - (1-lambda)*R(t)t(R(t))} \cr where \code{R(t)} is the - \code{K x 1} vector of returns at time \code{t}. -} -\examples{ -# compute time vaying covariance of factors. -data(managers.df) -factors = managers.df[,(7:9)] -cov.f.ewma <- covEWMA(factors) -cov.f.ewma[120,,] -} -\author{ - Eric Zivot and Yi-An Chen. -} -\references{ - Zivot, E. and J. 
Wang (2006), \emph{Modeling Financial - Time Series with S-PLUS, Second Edition}, - Springer-Verlag. -} - Deleted: pkg/FactorAnalytics/sandbox/Man/impliedFactorReturns.Rd =================================================================== --- pkg/FactorAnalytics/sandbox/Man/impliedFactorReturns.Rd 2014-11-14 23:24:01 UTC (rev 3551) +++ pkg/FactorAnalytics/sandbox/Man/impliedFactorReturns.Rd 2014-11-15 22:54:45 UTC (rev 3552) @@ -1,53 +0,0 @@ -\name{impliedFactorReturns} -\alias{impliedFactorReturns} -\title{Compute Implied Factor Returns Using Covariance Matrix Approach} -\usage{ - impliedFactorReturns(factor.scenarios, mu.factors, - cov.factors) -} -\arguments{ - \item{factor.scenarios}{m x 1 vector of scenario values - for a subset of the n > m risk factors} - - \item{mu.factors}{\code{n x 1} vector of factor mean - returns.} - - \item{cov.factors}{\code{n x n} factor covariance - matrix.} -} -\value{ - \code{(n - m) x 1} vector of implied factor returns -} -\description{ - Compute risk factor conditional mean returns for a one - group of risk factors given specified returns for another - group of risk factors based on the assumption that all - risk factor returns are multivariately normally - distributed. -} -\details{ - Let \code{y} denote the \code{m x 1} vector of factor - scenarios and \code{x} denote the \code{(n-m) x 1} vector - of other factors. Assume that \code{(y', x')'} has a - multivariate normal distribution with mean \code{(mu.y', - mu.x')'} and covariance matrix partitioned as - \code{(cov.yy, cov.yx, cov.xy, cov.xx)}. Then the implied - factor scenarios are computed as \code{E[x|y] = mu.x + - cov.xy*cov.xx^-1 * (y - mu.y)} -} -\examples{ -# get data -data(managers.df) -factors = managers.df[,(7:9)] -# make up a factor mean returns scenario for factor SP500.TR -factor.scenarios <- 0.1 -names(factor.scenarios) <- "SP500.TR" -mu.factors <- mean(factors) -cov.factors <- var(factors) -# implied factor returns -impliedFactorReturns(factor.scenarios,mu.factors,cov.factors) -} -\author{ - Eric Zivot and Yi-An Chen. -} - Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd =================================================================== --- pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd 2014-11-14 23:24:01 UTC (rev 3551) +++ pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd 2014-11-15 22:54:45 UTC (rev 3552) @@ -1,71 +0,0 @@ -\name{modifiedEsReport} -\alias{modifiedEsReport} -\title{compute ES report via Cornish-Fisher expansion for collection of assets in a -portfolio given simulated (bootstrapped) return data.} -\usage{ - modifiedEsReport(bootData, w, delta.w = 0.001, - tail.prob = 0.01, method = c("derivative", "average"), - nav, nav.p, fundStrategy, i1, i2) -} -\arguments{ - \item{bootData}{B x n matrix of B bootstrap returns on - assets in portfolio.} - - \item{w}{n x 1 vector of portfolio weights.} - - \item{delta.w}{scalar, change in portfolio weight for - computing numerical derivative. Default value is 0.010.} - - \item{tail.prob}{scalar tail probability.} - - \item{method}{character, method for computing marginal - ES. 
Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedEsReport.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,71 +0,0 @@
-\name{modifiedEsReport}
-\alias{modifiedEsReport}
-\title{Compute ES report via Cornish-Fisher expansion for a collection of assets in a
-portfolio given simulated (bootstrapped) return data.}
-\usage{
-  modifiedEsReport(bootData, w, delta.w = 0.001,
-    tail.prob = 0.01, method = c("derivative", "average"),
-    nav, nav.p, fundStrategy, i1, i2)
-}
-\arguments{
-  \item{bootData}{B x n matrix of B bootstrap returns on
-  assets in portfolio.}
-
-  \item{w}{n x 1 vector of portfolio weights.}
-
-  \item{delta.w}{scalar, change in portfolio weight for
-  computing numerical derivative. Default value is 0.001.}
-
-  \item{tail.prob}{scalar tail probability.}
-
-  \item{method}{character, method for computing marginal
-  ES. Valid choices are "derivative" for numerical
-  computation of the derivative of portfolio ES wrt fund
-  portfolio weight; "average" for approximating E[Ri |
-  Rp<=VaR]}
-
-  \item{nav}{n x 1 vector of net asset values in each
-  fund.}
-
-  \item{nav.p}{scalar, net asset value of the portfolio.}
-
-  \item{fundStrategy}{n x 1 vector of fund strategies.}
-
-  \item{i1,i2}{if an ff object is used, the ffapply functions
-  apply an EXPRession and provide two indices FROM="i1"
-  and TO="i2", which mark the beginning and end of the batch
-  and can be used in the applied expression.}
-}
-\value{
-  dataframe with the following columns: Strategy n x 1
-  strategy. Net.Asset.value n x 1 net asset values.
-  Allocation n x 1 vector of asset weights. Mean n x 1 mean
-  of each fund. Std.Dev n x 1 standard deviation of each
-  fund. Assets.ES n x 1 vector of asset specific ES
-  values. cES n x 1 vector of asset specific component ES
-  values. cES.dollar n x 1 vector of asset specific
-  component ES values in dollar terms. pcES n x 1 vector of
-  asset specific percent contribution to ES values. iES n x
-  1 vector of asset specific incremental ES values.
-  iES.dollar n x 1 vector of asset specific incremental ES
-  values in dollar terms. mES n x 1 vector of asset
-  specific marginal ES values. mES.dollar n x 1 vector of
-  asset specific marginal ES values in dollar terms.
-}
-\description{
-  Compute ES report via Cornish-Fisher expansion for a
-  collection of assets in a portfolio given simulated
-  (bootstrapped) return data. Report format follows that of
-  Excel VaR report.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)]
-modifiedEsReport (bootData= ret.assets[,1:3], w=c(1/3,1/3,1/3), delta.w = 0.001, tail.prob = 0.01,
-                 method="derivative",nav=c(100,200,100), nav.p=500, fundStrategy=c("S1","S2","S3"))
-}
-\author{
-  Eric Zivot and Yi-An Chen.
-}
-

Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalES.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalES.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalES.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,44 +0,0 @@
-\name{modifiedIncrementalES}
-\alias{modifiedIncrementalES}
-\title{Compute incremental ES given bootstrap data and portfolio weights.}
-\usage{
-  modifiedIncrementalES(bootData, w, tail.prob = 0.01, i1,
-    i2)
-}
-\arguments{
-  \item{bootData}{B x N matrix of B bootstrap returns on N
-  assets in portfolio.}
-
-  \item{w}{N x 1 vector of portfolio weights}
-
-  \item{tail.prob}{scalar tail probability.}
-
-  \item{i1,i2}{if an ff object is used, the ffapply functions
-  apply an EXPRession and provide two indices FROM="i1"
-  and TO="i2", which mark the beginning and end of the batch
-  and can be used in the applied expression.}
-}
-\value{
-  N x 1 matrix of incremental ES values for each asset.
-}
-\description{
-  Compute incremental ES given bootstrap data and portfolio
-  weights. Incremental ES is defined as the change in
-  portfolio ES that occurs when an asset is removed from
-  the portfolio and the allocation is spread equally among the
-  remaining assets. The VaR used in the ES computation is
-  computed as an estimated quantile using the Cornish-Fisher
-  expansion.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)]
-modifiedIncrementalES(ret.assets[,1:3],w=c(1/3,1/3,1/3),tail.prob = 0.05)
-}
-\author{
-  Eric Zivot and Yi-An Chen.
-}
-\references{
-  Jorion, P. (2007). Value-at-Risk, pg. 168.
-}
-
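The "average" approximation and the incremental-ES definition in the two pages above can be illustrated in a few lines. The sketch below is an editor's illustration only: marginalES and incrementalES are hypothetical names, and a plain empirical quantile stands in for the Cornish-Fisher VaR used by the deleted functions.

# marginal ES via the "average" method: E[R_i | R_p <= VaR]
marginalES <- function(bootData, w, tail.prob = 0.01) {
  rp <- as.vector(bootData %*% w)             # portfolio returns
  var.p <- quantile(rp, probs = tail.prob)    # portfolio VaR (lower quantile)
  colMeans(bootData[rp <= var.p, , drop = FALSE])
}

# incremental ES: change in portfolio ES when asset j is removed and its
# allocation is spread equally among the remaining assets
incrementalES <- function(bootData, w, j, tail.prob = 0.01) {
  es <- function(X, W) {
    rp <- as.vector(X %*% W)
    mean(rp[rp <= quantile(rp, probs = tail.prob)])
  }
  w.new <- w[-j] + w[j] / (length(w) - 1)
  es(bootData[, -j, drop = FALSE], w.new) - es(bootData, w)
}

set.seed(1)
R <- matrix(rnorm(3000, sd = 0.02), ncol = 3)  # stand-in for bootstrap returns
marginalES(R, w = rep(1/3, 3), tail.prob = 0.05)
incrementalES(R, w = rep(1/3, 3), j = 1, tail.prob = 0.05)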
Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalVaR.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalVaR.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedIncrementalVaR.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,43 +0,0 @@
-\name{modifiedIncrementalVaR}
-\alias{modifiedIncrementalVaR}
-\title{Compute incremental VaR given bootstrap data and portfolio weights.}
-\usage{
-  modifiedIncrementalVaR(bootData, w, tail.prob = 0.01, i1,
-    i2)
-}
-\arguments{
-  \item{bootData}{B x N matrix of B bootstrap returns on N
-  assets in portfolio.}
-
-  \item{w}{N x 1 vector of portfolio weights}
-
-  \item{tail.prob}{scalar tail probability.}
-
-  \item{i1,i2}{if an ff object is used, the ffapply functions
-  apply an EXPRession and provide two indices FROM="i1"
-  and TO="i2", which mark the beginning and end of the batch
-  and can be used in the applied expression.}
-}
-\value{
-  N x 1 matrix of incremental VaR values for each asset.
-}
-\description{
-  Compute incremental VaR given bootstrap data and
-  portfolio weights. Incremental VaR is defined as the
-  change in portfolio VaR that occurs when an asset is
-  removed from the portfolio and the allocation is spread
-  equally among the remaining assets. VaR is computed as an
-  estimated quantile using the Cornish-Fisher expansion.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)]
-modifiedIncrementalVaR(ret.assets[,1:3],w=c(1/3,1/3,1/3),tail.prob = 0.05)
-}
-\author{
-  Eric Zivot and Yi-An Chen.
-}
-\references{
-  Jorion, P. (2007). Value-at-Risk, pg. 168.
-}
-

Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioEsDecomposition.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioEsDecomposition.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioEsDecomposition.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,54 +0,0 @@
-\name{modifiedPortfolioEsDecomposition}
-\alias{modifiedPortfolioEsDecomposition}
-\title{Compute portfolio ES (risk) decomposition by assets.}
-\usage{
-  modifiedPortfolioEsDecomposition(bootData, w,
-    delta.w = 0.001, tail.prob = 0.01,
-    method = c("derivative", "average"))
}
-\arguments{
-  \item{bootData}{B x N matrix of B bootstrap returns on
-  assets in portfolio.}
-
-  \item{w}{N x 1 vector of portfolio weights}
-
-  \item{delta.w}{Scalar, change in portfolio weight for
-  computing numerical derivative.}
-
-  \item{tail.prob}{Scalar, tail probability.}
-
-  \item{method}{Character, method for computing marginal
-  ES. Valid choices are "derivative" for numerical
-  computation of the derivative of portfolio ES with
-  respect to fund portfolio weight; "average" for
-  approximating E[R_i | R_p<=VaR].}
-}
-\value{
-  an S3 list containing
-}
-\description{
-  Compute portfolio ES decomposition given historical or
-  simulated data and portfolio weights. Marginal ES is
-  computed either as the numerical derivative of ES with
-  respect to portfolio weight, or as the expected fund
-  return given that the portfolio return is less than or equal
-  to the portfolio VaR. VaR is computed as an estimated
-  quantile using the Cornish-Fisher expansion.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)]
-modifiedPortfolioEsDecomposition(ret.assets[,1:3], w=c(1/3,1/3,1/3), delta.w = 0.001,
-                                 tail.prob = 0.01, method=c("derivative"))
-}
-\author{
-  Eric Zivot and Yi-An Chen.
-}
-\references{
-  1. Hallerbach (2003), "Decomposing Portfolio
-  Value-at-Risk: A General Analysis", The Journal of Risk
-  5/2. 2. Yamai and Yoshiba (2002). "Comparative Analyses
-  of Expected Shortfall and Value-at-Risk: Their Estimation
-  Error, Decomposition, and Optimization", Bank of Japan.
-}
-

Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioVaRDecomposition.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioVaRDecomposition.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedPortfolioVaRDecomposition.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,57 +0,0 @@
-\name{modifiedPortfolioVaRDecomposition}
-\alias{modifiedPortfolioVaRDecomposition}
-\title{Compute portfolio VaR decomposition given historical or simulated data and
-portfolio weights.}
-\usage{
-  modifiedPortfolioVaRDecomposition(bootData, w,
-    delta.w = 0.001, tail.prob = 0.01,
-    method = c("derivative", "average"))
-}
-\arguments{
-  \item{bootData}{B x N matrix of B bootstrap returns on
-  assets in portfolio.}
-
-  \item{w}{N x 1 vector of portfolio weights}
-
-  \item{delta.w}{Scalar, change in portfolio weight for
-  computing numerical derivative.}
-
-  \item{tail.prob}{Scalar, tail probability.}
-
-  \item{method}{Character, method for computing marginal
-  VaR. Valid choices are "derivative" for numerical
-  computation of the derivative of portfolio VaR with
-  respect to fund portfolio weight; "average" for
-  approximating E[R_i | R_p = VaR].}
-}
-\value{
-  an S3 list containing
-}
-\description{
-  Compute portfolio VaR decomposition given historical or
-  simulated data and portfolio weights. The partial
-  derivative of VaR with respect to portfolio weight is
-  computed as the expected asset return given that the
-  portfolio return is equal to the portfolio VaR, and is
-  approximated by a kernel estimator. VaR is computed as an
-  estimated quantile using the Cornish-Fisher expansion.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)]
-modifiedPortfolioVaRDecomposition(ret.assets[,1:3], w=c(1/3,1/3,1/3), delta.w = 0.001,
-                                  tail.prob = 0.01, method=c("average"))
-}
-\author{
-  Eric Zivot and Yi-An Chen.
-}
-\references{
-  1. Hallerbach (2003), "Decomposing Portfolio
-  Value-at-Risk: A General Analysis", The Journal of Risk
-  5/2. 2. Yamai and Yoshiba (2002). "Comparative Analyses
-  of Expected Shortfall and Value-at-Risk: Their Estimation
-  Error, Decomposition, and Optimization", Bank of Japan. 3.
-  Epperlein and Smillie (2006) "Cracking VAR with Kernels,"
-  Risk.
-}
-

Deleted: pkg/FactorAnalytics/sandbox/Man/modifiedVaRReport.Rd
===================================================================
--- pkg/FactorAnalytics/sandbox/Man/modifiedVaRReport.Rd	2014-11-14 23:24:01 UTC (rev 3551)
+++ pkg/FactorAnalytics/sandbox/Man/modifiedVaRReport.Rd	2014-11-15 22:54:45 UTC (rev 3552)
@@ -1,73 +0,0 @@
-\name{modifiedVaRReport}
-\alias{modifiedVaRReport}
-\title{Compute VaR report via Cornish-Fisher expansion for a collection of assets in
-a portfolio given simulated (bootstrapped) return data.}
-\usage{
-  modifiedVaRReport(bootData, w, delta.w = 0.001,
-    tail.prob = 0.01, method = c("derivative", "average"),
-    nav, nav.p, fundStrategy, i1, i2)
-}
-\arguments{
-  \item{bootData}{B x n matrix of B bootstrap returns on
-  assets in portfolio.}
-
-  \item{w}{n x 1 vector of portfolio weights.}
-
-  \item{delta.w}{scalar, change in portfolio weight for
-  computing numerical derivative.
  Default value is 0.001.}
-
-  \item{tail.prob}{scalar tail probability.}
-
-  \item{method}{character, method for computing marginal
-  VaR. Valid choices are "derivative" for numerical
-  computation of the derivative of portfolio VaR wrt fund
-  portfolio weight; "average" for approximating E[Ri | Rp
-  = VaR]}
-
-  \item{nav}{n x 1 vector of net asset values in each
-  fund.}
-
-  \item{nav.p}{scalar, net asset value of the portfolio.}
-
-  \item{fundStrategy}{n x 1 vector of fund strategies.}
-
-  \item{i1,i2}{if an ff object is used, the ffapply functions
-  apply an EXPRession and provide two indices FROM="i1"
-  and TO="i2", which mark the beginning and end of the batch
-  and can be used in the applied expression.}
-}
-\value{
-  dataframe with the following columns: Strategy n x 1
-  strategy. Net.Asset.value n x 1 net asset values.
-  Allocation n x 1 vector of asset weights. Mean n x 1 mean
-  of each fund. Std.Dev n x 1 standard deviation of each
-  fund. Assets.VaR n x 1 vector of asset specific VaR
-  values. cVaR n x 1 vector of asset specific component VaR
-  values. cVaR.dollar n x 1 vector of asset specific
-  component VaR values in dollar terms. pcVaR n x 1 vector
-  of asset specific percent contribution to VaR values.
-  iVaR n x 1 vector of asset specific incremental VaR
-  values. iVaR.dollar n x 1 vector of asset specific
-  incremental VaR values in dollar terms. mVaR n x 1 vector
-  of asset specific marginal VaR values. mVaR.dollar n x 1
-  vector of asset specific marginal VaR values in dollar
-  terms.
-}
-\description{
-  Compute VaR report via Cornish-Fisher expansion for a
-  collection of assets in a portfolio given simulated
-  (bootstrapped) return data. Report format follows that of
-  Excel VaR report.
-}
-\examples{
-data(managers.df)
-ret.assets = managers.df[,(1:6)] [TRUNCATED]

To get the complete diff run:
    svnlook diff /svnroot/returnanalytics -r 3552

From noreply at r-forge.r-project.org  Wed Nov 19 11:00:09 2014
From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org)
Date: Wed, 19 Nov 2014 11:00:09 +0100 (CET)
Subject: [Returnanalytics-commits] r3553 - in pkg/FactorAnalytics: . R man
Message-ID: <20141119100009.89D7C187860@r-forge.r-project.org>

Author: pragnya
Date: 2014-11-19 11:00:09 +0100 (Wed, 19 Nov 2014)
New Revision: 3553

Added:
   pkg/FactorAnalytics/R/CornishFisher.R
Removed:
   pkg/FactorAnalytics/R/dCornishFisher.R
   pkg/FactorAnalytics/R/pCornishFisher.R
   pkg/FactorAnalytics/R/qCornishFisher.R
   pkg/FactorAnalytics/R/rCornishFisher.R
Modified:
   pkg/FactorAnalytics/DESCRIPTION
   pkg/FactorAnalytics/R/fitTsfm.R
   pkg/FactorAnalytics/R/paFm.r
   pkg/FactorAnalytics/R/plot.pafm.r
   pkg/FactorAnalytics/R/plot.tsfm.r
   pkg/FactorAnalytics/R/print.pafm.r
   pkg/FactorAnalytics/R/summary.pafm.r
   pkg/FactorAnalytics/R/summary.tsfm.r
   pkg/FactorAnalytics/man/CornishFisher.Rd
   pkg/FactorAnalytics/man/fitTsfm.Rd
Log:
Better handling of some dependencies (import vs depends), combined CornishFisher functions into one script

Modified: pkg/FactorAnalytics/DESCRIPTION
===================================================================
--- pkg/FactorAnalytics/DESCRIPTION	2014-11-15 22:54:45 UTC (rev 3552)
+++ pkg/FactorAnalytics/DESCRIPTION	2014-11-19 10:00:09 UTC (rev 3553)
@@ -16,17 +16,17 @@
 are included.
License: GPL-2 Depends: - R (>= 2.14.0), - robust, - leaps, + R (>= 3.0.0), + xts (>= 0.9), + PerformanceAnalytics(>= 1.1.0), + lmtest, + sandwich +Imports: + corrplot, + robust, + leaps, lars, - lmtest, - PerformanceAnalytics (>= 1.1.0), - sn, - tseries, - strucchange, - ellipse -Imports: corrplot + strucchange Suggests: testthat, quantmod, knitr LazyLoad: yes Added: pkg/FactorAnalytics/R/CornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/CornishFisher.R (rev 0) +++ pkg/FactorAnalytics/R/CornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -0,0 +1,114 @@ +#' @title Density, distribution function, quantile function and random +#' generation for the Cornish-Fisher distribution. +#' +#' @details CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) +#' \itemize{ +#' \item \code{dCornishFisher} Computes Cornish-Fisher density from two term +#' Edgeworth expansion given mean, standard deviation, skewness and excess +#' kurtosis. +#' \item \code{pCornishFisher} Computes Cornish-Fisher CDF from two term +#' Edgeworth expansion given mean, standard deviation, skewness and excess +#' kurtosis. +#' \item \code{qCornishFisher} Computes Cornish-Fisher quantiles from two term +#' Edgeworth expansion given mean, standard deviation, skewness and excess +#' kurtosis. +#' \item \code{rCornishFisher} simulate observations based on Cornish-Fisher +#' quantile expansion given mean, standard deviation, skewness and excess +#' kurtosis.} +#' +#' @param n scalar; number of simulated values in random simulation, sample +#' length in density, distribution and quantile functions. +#' @param sigma scalar standard deviation. +#' @param skew scalar; skewness. +#' @param ekurt scalar; excess kurtosis. +#' @param seed scalar; set seed. Default is \code{NULL}. +#' @param x,q vector of standardized quantiles. +#' @param p vector of probabilities. +#' +#' @return +#' \code{dCornishFisher} gives the density, \code{pCornishFisher} gives the +#' distribution function, \code{qCornishFisher} gives the quantile function, +#' and \code{rCornishFisher} generates \code{n} random simulations. +#' +#' @author Eric Zivot and Yi-An Chen. +#' +#' @references +#' DasGupta, A. (2008). Asymptotic theory of statistics and probability. +#' Springer. +#' Severini, T. A., (2000). Likelihood Methods in Statistics. Oxford University +#' Press. 
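One property worth keeping in mind when reading the examples that follow: with zero skewness and zero excess kurtosis the two-term expansion collapses to the standard normal. A minimal standalone check as an editor's aside (qCF simply restates the qCornishFisher formula defined later in this file):

qCF <- function(p, n, skew, ekurt) {
  zq <- qnorm(p)
  zq + ((zq^2 - 1)*skew)/(6*sqrt(n)) +
    (((zq^3 - 3*zq)*ekurt)/24 - ((2*zq^3 - 5*zq)*skew^2)/36)/n
}
qCF(c(0.01, 0.05, 0.5), n = 60, skew = 0, ekurt = 0)  # identical to qnorm()
qCF(0.05, n = 60, skew = -1, ekurt = 3)  # below qnorm(0.05): fatter left tail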
+#' +#' @examples +#' \dontrun{ +#' # generate 1000 observation from Cornish-Fisher distribution +#' rc <- rCornishFisher(1000,1,0,5) +#' hist(rc, breaks=100, freq=FALSE, +#' main="simulation of Cornish Fisher Distribution", xlim=c(-10,10)) +#' lines(seq(-10,10,0.1), dnorm(seq(-10,10,0.1), mean=0, sd=1), col=2) +#' # compare with standard normal curve +#' +#' # exponential example from A.dasGupta p.188 +#' # x is iid exp(1) distribution, sample size = 5 +#' # then x_bar is Gamma(shape=5, scale=1/5) distribution +#' q <- c(0,0.4,1,2) +#' # exact cdf +#' pgamma(q/sqrt(5)+1, shape=5, scale=1/5) +#' # use CLT +#' pnorm(q) +#' # use edgeworth expansion +#' pCornishFisher(q, n=5, skew=2, ekurt=6) +#' } +#' +#' @rdname CornishFisher +#' @export + +dCornishFisher <- function(x, n,skew, ekurt) { + density <- dnorm(x) + + 1/sqrt(n)*(skew/6*(x^3 - 3*x))*dnorm(x) + + 1/n*( (skew)^2/72*(x^6 - 15*x^4 + 45*x^2 - 15) + ekurt/24*(x^4 - 6*x^2 + 3) )*dnorm(x) + return(density) +} + +#' @rdname CornishFisher +#' @export + +pCornishFisher <- function(q, n, skew, ekurt) { + zq <- q + CDF <- pnorm(zq) + + 1/sqrt(n)*(skew/6 * (1-zq^2))*dnorm(zq) + + 1/n*( (ekurt)/24*(3*zq-zq^3) + (skew)^2/72*(10*zq^3-15*zq-zq^5) )*dnorm(zq) + return(CDF) +} + +#' @rdname CornishFisher +#' @export + +qCornishFisher <- function(p,n,skew, ekurt) { + zq <- qnorm(p) + q.cf <- zq + + 1/sqrt(n)*(((zq^2 - 1) * skew)/6) + + 1/n*( (((zq^3 - 3*zq) * ekurt)/24) - ((((2*zq^3) - 5*zq) * skew^2)/36) ) + return(q.cf) +} + +#' @rdname CornishFisher +#' @export + +rCornishFisher <- function(n, sigma, skew, ekurt, seed=NULL) { + + ## inputs: + ## n scalar, number of simulated values + ## sigma scalar, standard deviation + ## skew scalar, skewness + ## ekurt scalar, excess kurtosis + ## outputs: + ## n simulated values from Cornish-Fisher distribution + + if (!is.null(seed)) set.seed(seed) + zc <- rnorm(n) + z.cf <- zc + + (((zc^2 - 1) * skew)/6) + + (((zc^3 - 3*zc)*ekurt)/24) - + ((((2*zc^3) - 5*zc)*skew^2)/36) + return(sigma*z.cf) +} Deleted: pkg/FactorAnalytics/R/dCornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/dCornishFisher.R 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/dCornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -1,15 +0,0 @@ -#'@name CornishFisher -#'@aliases CornishFisher -#'@aliases rCornishFisher -#'@aliases dCornishFisher -#'@aliases qCornishFisher -#'@aliases pCornishFisher -#' @export -dCornishFisher <- -function(x, n,skew, ekurt) { - -density <- dnorm(x) + 1/sqrt(n)*(skew/6*(x^3-3*x))*dnorm(x) + - 1/n *( (skew)^2/72*(x^6 - 15*x^4 + 45*x^2 -15) + ekurt/24 *(x^4-6*x^2+3) )*dnorm(x) -return(density) -} - Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -15,7 +15,7 @@ #' Estimation method "OLS" corresponds to ordinary least squares using #' \code{\link[stats]{lm}}, "DLS" is discounted least squares (weighted least #' squares with exponentially declining weights that sum to unity), and, -#' "Robust" is robust regression (useing \code{\link[robust]{lmRob}}). +#' "Robust" is robust regression (using \code{\link[robust]{lmRob}}). #' #' If \code{variable.selection="none"}, all chosen factors are used in the #' factor model. 
Whereas, "stepwise" performs traditional forward/backward @@ -261,7 +261,7 @@ reg.list <- SelectStepwise(dat.xts, asset.names, factor.names, fit.method, lm.args, lmRob.args, step.args, decay) } else if (variable.selection == "subsets") { - reg.list <- SelectAllSubsets(dat.xts, asset.names, factor.names,fit.method, + reg.list <- SelectAllSubsets(dat.xts, asset.names, factor.names, fit.method, lm.args, lmRob.args, regsubsets.args, subset.size, decay) } else if (variable.selection == "lars") { @@ -317,8 +317,8 @@ lm.args$weights <- WeightsDLS(nrow(reg.xts), decay) reg.list[[i]] <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args)) } else if (fit.method == "Robust") { - reg.list[[i]] <- do.call(lmRob, c(list(fm.formula,data=reg.xts), - lmRob.args)) + reg.list[[i]] <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts), + lmRob.args)) } } reg.list @@ -349,8 +349,10 @@ lm.fit <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args)) reg.list[[i]] <- do.call(step, c(list(lm.fit),step.args)) } else if (fit.method == "Robust") { - lmRob.fit <- do.call(lmRob, c(list(fm.formula,data=reg.xts),lmRob.args)) - reg.list[[i]] <- do.call(step.lmRob, c(list(lmRob.fit),step.args)) + lmRob.fit <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts), + lmRob.args)) + reg.list[[i]] <- do.call(robust::step.lmRob, c(list(lmRob.fit), + step.args)) } } reg.list @@ -379,8 +381,8 @@ } # choose best subset of factors depending on specified subset size - fm.subsets <- do.call(regsubsets, c(list(fm.formula,data=reg.xts), - regsubsets.args)) + fm.subsets <- do.call(leaps::regsubsets, c(list(fm.formula,data=reg.xts), + regsubsets.args)) sum.sub <- summary(fm.subsets) names.sub <- names(which(sum.sub$which[as.character(subset.size),-1]==TRUE)) reg.xts <- na.omit(dat.xts[,c(i,names.sub)]) @@ -392,8 +394,8 @@ lm.args$weights <- WeightsDLS(nrow(reg.xts), decay) reg.list[[i]] <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args)) } else if (fit.method == "Robust") { - reg.list[[i]] <- do.call(lmRob, c(list(fm.formula,data=reg.xts), - lmRob.args)) + reg.list[[i]] <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts), + lmRob.args)) } } reg.list @@ -422,11 +424,12 @@ # convert to matrix reg.mat <- as.matrix(reg.xts) # fit lars regression model - lars.fit <- - do.call(lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i]),lars.args)) + lars.fit <- do.call(lars::lars, c(list(x=reg.mat[,factor.names], + y=reg.mat[,i]),lars.args)) lars.sum <- summary(lars.fit) - lars.cv <- do.call(cv.lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i], - mode="step"),cv.lars.args)) + lars.cv <- + do.call(lars::cv.lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i], + mode="step"),cv.lars.args)) # including plot.it=FALSE to cv.lars strangely gives an error: "Argument s # out of range". 
And, specifying index=seq(nrow(lars.fit$beta)-1) resolves # the issue, but care needs to be taken for small N Deleted: pkg/FactorAnalytics/R/pCornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/pCornishFisher.R 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/pCornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -1,16 +0,0 @@ -#'@name CornishFisher -#'@aliases CornishFisher -#'@aliases rCornishFisher -#'@aliases dCornishFisher -#'@aliases qCornishFisher -#'@aliases pCornishFisher -#' @export - -pCornishFisher <- -function(q,n,skew, ekurt) { -zq = q -CDF = pnorm(zq) + 1/sqrt(n) *(skew/6 * (1-zq^2))*dnorm(zq) + - 1/n *( (ekurt)/24*(3*zq-zq^3)+ (skew)^2/72*(10*zq^3 - 15*zq -zq^5))*dnorm(zq) -return(CDF) -} - Modified: pkg/FactorAnalytics/R/paFm.r =================================================================== --- pkg/FactorAnalytics/R/paFm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/paFm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -105,7 +105,6 @@ if (class(fit)=="ffm" ) { # if benchmark is provided - # # if (!is.null(benchmark)) { # stop("use fitFundamentalFactorModel instead") # } Modified: pkg/FactorAnalytics/R/plot.pafm.r =================================================================== --- pkg/FactorAnalytics/R/plot.pafm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/plot.pafm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -41,8 +41,8 @@ #' @export #' plot.pafm <- function(x, which.plot=c("none","1L","2L","3L"),max.show=6, - date=NULL,plot.single=FALSE,fundName, - which.plot.single=c("none","1L","2L","3L"),...) { + date=NULL,plot.single=FALSE,fundName, + which.plot.single=c("none","1L","2L","3L"),...) { # ... for chart.TimeSeries if (is.null(date)){ date = index(x[[3]][[1]])[1] @@ -54,24 +54,26 @@ which.plot.single<-which.plot.single[1] if (which.plot.single=="none") - which.plot.single<-menu(c("attributed cumulative returns", - paste("attributed returns","on",date,sep=" "), - "Time series of attributed returns"), - title="performance attribution plot \nMake a plot selection (or 0 to exit):\n") + which.plot.single <- menu(c("attributed cumulative returns", + paste("attributed returns","on",date,sep=" "), + "Time series of attributed returns"), + title="performance attribution plot + \nMake a plot selection (or 0 to exit):\n") switch(which.plot.single, - "1L" = { + "1L" = { bar <- c(x$cum.spec.ret[fundName],x$cum.ret.attr.f[fundName,]) names(bar)[1] <- "specific.returns" - barplot(bar,horiz=TRUE,main="cumulative attributed returns",las=1) + barplot(bar, horiz=TRUE, main="cumulative attributed returns", las=1) }, "2L" ={ bar <- coredata(x$attr.list[[fundName]][as.Date(date)]) - tryCatch( {barplot(bar,horiz=TRUE,main=fundName,las=1) - },error=function(e){cat("\nthis date is not available for this assets.\n")}) + tryCatch({barplot(bar, horiz=TRUE, main=fundName, las=1)}, + error=function(e){cat("\this date is not available for this asset.\n")}) }, "3L" = { chart.TimeSeries(x$attr.list[[fundName]], - main=paste("Time series of attributed returns of ",fundName,sep=""),... ) + main=paste("Time series of attributed returns of", + fundName, sep=" "), ...) 
}, invisible()) } @@ -82,13 +84,14 @@ n <- length(fundnames) if(which.plot=='none') - which.plot<-menu(c("attributed cumulative returns", - paste("attributed returns","on",date,sep=" "), - "time series of attributed returns"), - title="performance attribution plot \nMake a plot selection (or 0 to exit):\n") + which.plot <- menu(c("attributed cumulative returns", + paste("attributed returns","on",date,sep=" "), + "time series of attributed returns"), + title="performance attribution plot + \nMake a plot selection (or 0 to exit):\n") if (n >= max.show) { - cat(paste("numbers of assets are greater than",max.show,", show only first", - max.show,"assets",sep=" ")) + cat(paste("numbers of assets are greater than ", max.show, + "; showing only first ", max.show, " assets.", sep="")) n <- max.show } switch(which.plot, @@ -96,9 +99,9 @@ "1L" = { par(mfrow=c(2,n/2)) for (i in fundnames[1:n]) { - bar <- c(x$cum.spec.ret[i],x$cum.ret.attr.f[i,]) + bar <- c(x$cum.spec.ret[i], x$cum.ret.attr.f[i,]) names(bar)[1] <- "specific.returns" - barplot(bar,horiz=TRUE,main=i,las=1) + barplot(bar, horiz=TRUE, main=i, las=1) } par(mfrow=c(1,1)) }, @@ -106,24 +109,23 @@ par(mfrow=c(2,n/2)) for (i in fundnames[1:n]) { tryCatch({ - bar <- coredata(x$attr.list[[i]][as.Date(date)]) - barplot(bar,horiz=TRUE,main=i,las=1) + bar <- coredata(x$attr.list[[i]][as.Date(date)]) + barplot(bar, horiz=TRUE, main=i, las=1) }, error=function(e) { - cat("\nDate for some assets returns is not available.\n") - dev.off() - } ) - } + cat("\nDate for some assets returns is not available.\n") + dev.off() + } ) + } par(mfrow=c(1,1)) }, "3L" = { par(mfrow=c(2,n/2)) for (i in fundnames[1:n]) { - chart.TimeSeries(x$attr.list[[i]],main=i,...) + chart.TimeSeries(x$attr.list[[i]], main=i, ...) } par(mfrow=c(1,1)) }, invisible() ) - } } Modified: pkg/FactorAnalytics/R/plot.tsfm.r =================================================================== --- pkg/FactorAnalytics/R/plot.tsfm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/plot.tsfm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -219,7 +219,8 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.rec <- efp(formula(fit), type="Rec-CUSUM", data=fit$model) + cusum.rec <- strucchange::efp(formula(fit), type="Rec-CUSUM", + data=fit$model) plot(cusum.rec, main=paste("Recursive CUSUM test:",i), las=las, col=colorset, ...) }, "11L" = { @@ -227,7 +228,8 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.ols <- efp(formula(fit), type="OLS-CUSUM", data=fit$model) + cusum.ols <- strucchange::efp(formula(fit), type="OLS-CUSUM", + data=fit$model) plot(cusum.ols, main=paste("OLS-based CUSUM test:",i), las=las, col=colorset, ...) }, "12L" = { @@ -235,7 +237,8 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.est <- efp(formula(fit), type="RE", data=fit$model) + cusum.est <- strucchange::efp(formula(fit), type="RE", + data=fit$model) plot(cusum.est, functional=NULL, col=colorset, las=0, main=paste("RE test (Recursive estimates test):",i), ...) 
}, "13L" = { @@ -266,7 +269,8 @@ width=24, by.column=FALSE, align="right") } else if (x$fit.method=="Robust") { rollReg.Rob <- function(data.z, formula) { - coef(lmRob(formula=formula, data=as.data.frame(data.z))) + coef(robust::lmRob(formula=formula, + data=as.data.frame(data.z))) } reg.z <- zoo(fit$model, as.Date(rownames(fit$model))) rollReg.z <- rollapply(reg.z, width=24, FUN=rollReg.Rob, Modified: pkg/FactorAnalytics/R/print.pafm.r =================================================================== --- pkg/FactorAnalytics/R/print.pafm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/print.pafm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -20,8 +20,8 @@ #' @method print pafm #' @export #' -print.pafm <- function(x,...) { +print.pafm <- function(x, ...) { cat("\nMean of returns attributed to factors \n") - print(sapply(x[[3]],function(x) apply(x,2,mean))) + print(sapply(x[[3]], function(x) apply(x,2,mean))) } Deleted: pkg/FactorAnalytics/R/qCornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/qCornishFisher.R 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/qCornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -1,18 +0,0 @@ -#'@name CornishFisher -#'@aliases CornishFisher -#'@aliases rCornishFisher -#'@aliases dCornishFisher -#'@aliases qCornishFisher -#'@aliases pCornishFisher -#' @export - -qCornishFisher <- -function(p,n,skew, ekurt) { -zq = qnorm(p) -q.cf = zq + 1/sqrt(n)* (((zq^2 - 1) * skew)/6) + 1/n*((((zq^3 - 3 * zq) * - ekurt)/24) - ((((2 * zq^3) - 5 * zq) * skew^2)/36) ) -return(q.cf) - - -} - Deleted: pkg/FactorAnalytics/R/rCornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/rCornishFisher.R 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/rCornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) @@ -1,88 +0,0 @@ -#' Functions for Cornish-Fisher density, CDF, random number simulation and -#' quantile. -#' -#'@name CornishFisher -#'@aliases CornishFisher -#'@aliases rCornishFisher -#'@aliases dCornishFisher -#'@aliases qCornishFisher -#'@aliases pCornishFisher -#' -#' -#'@description -#' \itemize{ -#' \item \code{rCornishFisher} simulate observations based on -#' Cornish-Fisher quantile expansion given mean, standard -#' deviation, skewness and excess kurtosis. -#' \item \code{dCornishFisher} Computes Cornish-Fisher density -#' from two term Edgeworth expansion given mean, standard -#' deviation, skewness and excess kurtosis. -#' \item \code{pCornishFisher} Computes Cornish-Fisher CDF from -#' two term Edgeworth expansion given mean, standard -#' deviation, skewness and excess kurtosis. -#' \item \code{qCornishFisher} Computes Cornish-Fisher quantiles -#' from two term Edgeworth expansion given mean, standard -#' deviation, skewness and excess kurtosis. -#'} -#' -#'@param n Scalar, number of simulated values in rCornishFisher. Sample length in -#' density,distribution,quantile function. -#' @param sigma Scalar, standard deviation. -#' @param skew Scalar, skewness. -#' @param ekurt Scalar, excess kurtosis. -#' @param seed Set seed here. Default is \code{NULL}. -#' @param x,q Vector of standardized quantiles. See detail. -#' @param p Vector of probabilities. -#' -#' @return n Simulated values from Cornish-Fisher distribution. -#' @author Eric Zivot and Yi-An Chen. 
-#' @references -#' \enumerate{ -#' \item A.DasGupta, "Asymptotic Theory of Statistics and -#' Probability", Springer Science+Business Media,LLC 2008 -#' \item Thomas A.Severini, "Likelihood Methods in Statistics", -#' Oxford University Press, 2000 -#' } -#' @export -#' -#' @details CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) -#' -#' @examples -#' \dontrun{ -#' # generate 1000 observation from Cornish-Fisher distribution -#' rc <- rCornishFisher(1000,1,0,5) -#'hist(rc,breaks=100,freq=FALSE,main="simulation of Cornish Fisher Distribution", -#' xlim=c(-10,10)) -#'lines(seq(-10,10,0.1),dnorm(seq(-10,10,0.1),mean=0,sd=1),col=2) -#' # compare with standard normal curve -#' -#' # example from A.dasGupta p.188 exponential example -#' # x is iid exp(1) distribution, sample size = 5 -#' # then x_bar is Gamma(shape=5,scale=1/5) distribution -#' q <- c(0,0.4,1,2) -#' # exact cdf -#' pgamma(q/sqrt(5)+1,shape=5,scale=1/5) -#' # use CLT -#' pnorm(q) -#' # use edgeworth expansion -#' pCornishFisher(q,n=5,skew=2,ekurt=6) -#' } - - - -rCornishFisher <- -function(n, sigma, skew, ekurt, seed=NULL) { -## inputs: -## n scalar, number of simulated values -## sigma scalar, standard deviation -## skew scalar, skewness -## ekurt scalar, excess kurtosis -## outputs: -## n simulated values from Cornish-Fisher distribution -if (!is.null(seed)) set.seed(seed) -zc = rnorm(n) -z.cf = zc + (((zc^2 - 1) * skew)/6) + (((zc^3 - 3 * zc) * - ekurt)/24) - ((((2 * zc^3) - 5 * zc) * skew^2)/36) -ans = sigma*z.cf -ans -} Modified: pkg/FactorAnalytics/R/summary.pafm.r =================================================================== --- pkg/FactorAnalytics/R/summary.pafm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/summary.pafm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -22,15 +22,15 @@ #' #' @method summary pafm #' @export -#' -summary.pafm <- function(object ,digits = max(3, .Options$digits - 3),...) { -# n <- dim(fm.attr[[1]])[1] -# k <- dim(fm.attr[[1]])[2]+1 -# table.mat <- matrix(rep(NA,n*k*2),ncol=n) - cat("\nMean of returns attributed to factors - \n") - print(sapply(object[[3]],function(x) apply(x,2,mean)),digits = digits,...) - cat("\nStandard Deviation of returns attributed to factors - \n") - print(sapply(object[[3]],function(x) apply(x,2,sd)),digits = digits,...) + +summary.pafm <- function(object, digits=max(3, .Options$digits - 3), ...) { + + # n <- dim(fm.attr[[1]])[1] + # k <- dim(fm.attr[[1]])[2]+1 + # table.mat <- matrix(rep(NA,n*k*2),ncol=n) + + cat("\nMean of returns attributed to factors \n") + print(sapply(object[[3]], function(x) apply(x,2,mean)), digits=digits, ...) + cat("\nStandard Deviation of returns attributed to factors \n") + print(sapply(object[[3]], function(x) apply(x,2,sd)), digits=digits, ...) 
} Modified: pkg/FactorAnalytics/R/summary.tsfm.r =================================================================== --- pkg/FactorAnalytics/R/summary.tsfm.r 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/R/summary.tsfm.r 2014-11-19 10:00:09 UTC (rev 3553) @@ -74,9 +74,11 @@ # extract coefficients separately for "lars" variable.selection method for (i in object$asset.names) { if (se.type=="HC") { - sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], vcovHC)[,1:4] + sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], + vcov.=vcovHC)[,1:4] } else if (se.type=="HAC") { - sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], vcovHAC)[,1:4] + sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], + vcov.=vcovHAC)[,1:4] } } Modified: pkg/FactorAnalytics/man/CornishFisher.Rd =================================================================== --- pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-19 10:00:09 UTC (rev 3553) @@ -1,12 +1,11 @@ % Generated by roxygen2 (4.0.1): do not edit by hand -\name{CornishFisher} -\alias{CornishFisher} +\name{dCornishFisher} \alias{dCornishFisher} \alias{pCornishFisher} \alias{qCornishFisher} \alias{rCornishFisher} -\title{Functions for Cornish-Fisher density, CDF, random number simulation and -quantile.} +\title{Density, distribution function, quantile function and random +generation for the Cornish-Fisher distribution.} \usage{ dCornishFisher(x, n, skew, ekurt) @@ -17,73 +16,74 @@ rCornishFisher(n, sigma, skew, ekurt, seed = NULL) } \arguments{ -\item{n}{Scalar, number of simulated values in rCornishFisher. Sample length in -density,distribution,quantile function.} +\item{n}{scalar; number of simulated values in random simulation, sample +length in density, distribution and quantile functions.} -\item{sigma}{Scalar, standard deviation.} +\item{sigma}{scalar standard deviation.} -\item{skew}{Scalar, skewness.} +\item{skew}{scalar; skewness.} -\item{ekurt}{Scalar, excess kurtosis.} +\item{ekurt}{scalar; excess kurtosis.} -\item{seed}{Set seed here. Default is \code{NULL}.} +\item{seed}{scalar; set seed. Default is \code{NULL}.} -\item{x,q}{Vector of standardized quantiles. See detail.} +\item{x,q}{vector of standardized quantiles.} -\item{p}{Vector of probabilities.} +\item{p}{vector of probabilities.} } \value{ -n Simulated values from Cornish-Fisher distribution. +\code{dCornishFisher} gives the density, \code{pCornishFisher} gives the +distribution function, \code{qCornishFisher} gives the quantile function, +and \code{rCornishFisher} generates \code{n} random simulations. } \description{ -\itemize{ -\item \code{rCornishFisher} simulate observations based on -Cornish-Fisher quantile expansion given mean, standard -deviation, skewness and excess kurtosis. -\item \code{dCornishFisher} Computes Cornish-Fisher density -from two term Edgeworth expansion given mean, standard -deviation, skewness and excess kurtosis. -\item \code{pCornishFisher} Computes Cornish-Fisher CDF from -two term Edgeworth expansion given mean, standard -deviation, skewness and excess kurtosis. -\item \code{qCornishFisher} Computes Cornish-Fisher quantiles -from two term Edgeworth expansion given mean, standard -deviation, skewness and excess kurtosis. +Density, distribution function, quantile function and random +generation for the Cornish-Fisher distribution. 
} -} \details{ CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) +\itemize{ +\item \code{dCornishFisher} Computes Cornish-Fisher density from two term +Edgeworth expansion given mean, standard deviation, skewness and excess +kurtosis. +\item \code{pCornishFisher} Computes Cornish-Fisher CDF from two term +Edgeworth expansion given mean, standard deviation, skewness and excess +kurtosis. +\item \code{qCornishFisher} Computes Cornish-Fisher quantiles from two term +Edgeworth expansion given mean, standard deviation, skewness and excess +kurtosis. +\item \code{rCornishFisher} simulate observations based on Cornish-Fisher +quantile expansion given mean, standard deviation, skewness and excess +kurtosis.} } \examples{ \dontrun{ - # generate 1000 observation from Cornish-Fisher distribution +# generate 1000 observation from Cornish-Fisher distribution rc <- rCornishFisher(1000,1,0,5) -hist(rc,breaks=100,freq=FALSE,main="simulation of Cornish Fisher Distribution", - xlim=c(-10,10)) -lines(seq(-10,10,0.1),dnorm(seq(-10,10,0.1),mean=0,sd=1),col=2) +hist(rc, breaks=100, freq=FALSE, + main="simulation of Cornish Fisher Distribution", xlim=c(-10,10)) +lines(seq(-10,10,0.1), dnorm(seq(-10,10,0.1), mean=0, sd=1), col=2) # compare with standard normal curve -# example from A.dasGupta p.188 exponential example +# exponential example from A.dasGupta p.188 # x is iid exp(1) distribution, sample size = 5 -# then x_bar is Gamma(shape=5,scale=1/5) distribution +# then x_bar is Gamma(shape=5, scale=1/5) distribution q <- c(0,0.4,1,2) # exact cdf -pgamma(q/sqrt(5)+1,shape=5,scale=1/5) +pgamma(q/sqrt(5)+1, shape=5, scale=1/5) # use CLT pnorm(q) # use edgeworth expansion -pCornishFisher(q,n=5,skew=2,ekurt=6) +pCornishFisher(q, n=5, skew=2, ekurt=6) } } \author{ Eric Zivot and Yi-An Chen. } \references{ -\enumerate{ -\item A.DasGupta, "Asymptotic Theory of Statistics and -Probability", Springer Science+Business Media,LLC 2008 -\item Thomas A.Severini, "Likelihood Methods in Statistics", - Oxford University Press, 2000 - } +DasGupta, A. (2008). Asymptotic theory of statistics and probability. +Springer. +Severini, T. A., (2000). Likelihood Methods in Statistics. Oxford University +Press. } Modified: pkg/FactorAnalytics/man/fitTsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-15 22:54:45 UTC (rev 3552) +++ pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-19 10:00:09 UTC (rev 3553) @@ -98,7 +98,7 @@ Estimation method "OLS" corresponds to ordinary least squares using \code{\link[stats]{lm}}, "DLS" is discounted least squares (weighted least squares with exponentially declining weights that sum to unity), and, -"Robust" is robust regression (useing \code{\link[robust]{lmRob}}). +"Robust" is robust regression (using \code{\link[robust]{lmRob}}). If \code{variable.selection="none"}, all chosen factors are used in the factor model. Whereas, "stepwise" performs traditional forward/backward From noreply at r-forge.r-project.org Thu Nov 20 05:38:48 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Thu, 20 Nov 2014 05:38:48 +0100 (CET) Subject: [Returnanalytics-commits] r3554 - in pkg/FactorAnalytics: . 
R data man Message-ID: <20141120043848.E58BA18740A@r-forge.r-project.org> Author: pragnya Date: 2014-11-20 05:38:47 +0100 (Thu, 20 Nov 2014) New Revision: 3554 Added: pkg/FactorAnalytics/R/Misc.R pkg/FactorAnalytics/data/managers.rda pkg/FactorAnalytics/man/managers.Rd Modified: pkg/FactorAnalytics/DESCRIPTION pkg/FactorAnalytics/NAMESPACE pkg/FactorAnalytics/R/CornishFisher.R pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/R/fmEsDecomp.R pkg/FactorAnalytics/R/fmVaRDecomp.R pkg/FactorAnalytics/R/paFm.r pkg/FactorAnalytics/R/plot.pafm.r pkg/FactorAnalytics/R/plot.tsfm.r pkg/FactorAnalytics/R/predict.tsfm.r pkg/FactorAnalytics/R/summary.tsfm.r pkg/FactorAnalytics/man/CornishFisher.Rd Log: All packages (except xts) in the depends field moved to Imports. Functions and Namespace changed accordingly. Managers.rda and related help directly imported/copied into factorAnalytics. Modified: pkg/FactorAnalytics/DESCRIPTION =================================================================== --- pkg/FactorAnalytics/DESCRIPTION 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/DESCRIPTION 2014-11-20 04:38:47 UTC (rev 3554) @@ -17,16 +17,17 @@ License: GPL-2 Depends: R (>= 3.0.0), - xts (>= 0.9), + xts (>= 0.9) +Imports: PerformanceAnalytics(>= 1.1.0), - lmtest, - sandwich -Imports: corrplot, robust, leaps, lars, - strucchange + strucchange, + lmtest, + sandwich, + lattice Suggests: testthat, quantmod, knitr LazyLoad: yes Modified: pkg/FactorAnalytics/NAMESPACE =================================================================== --- pkg/FactorAnalytics/NAMESPACE 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/NAMESPACE 2014-11-20 04:38:47 UTC (rev 3554) @@ -26,3 +26,23 @@ export(paFm) export(qCornishFisher) export(rCornishFisher) +importFrom(PerformanceAnalytics,Return.cumulative) +importFrom(PerformanceAnalytics,VaR) +importFrom(PerformanceAnalytics,chart.ACFplus) +importFrom(PerformanceAnalytics,chart.Histogram) +importFrom(PerformanceAnalytics,chart.QQPlot) +importFrom(PerformanceAnalytics,chart.TimeSeries) +importFrom(PerformanceAnalytics,checkData) +importFrom(corrplot,corrplot) +importFrom(lars,cv.lars) +importFrom(lars,lars) +importFrom(lattice,barchart) +importFrom(lattice,panel.barchart) +importFrom(lattice,panel.grid) +importFrom(leaps,regsubsets) +importFrom(lmtest,coeftest.default) +importFrom(robust,lmRob) +importFrom(robust,step.lmRob) +importFrom(sandwich,vcovHAC.default) +importFrom(sandwich,vcovHC.default) +importFrom(strucchange,efp) Modified: pkg/FactorAnalytics/R/CornishFisher.R =================================================================== --- pkg/FactorAnalytics/R/CornishFisher.R 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/CornishFisher.R 2014-11-20 04:38:47 UTC (rev 3554) @@ -1,20 +1,21 @@ -#' @title Density, distribution function, quantile function and random -#' generation for the Cornish-Fisher distribution. +#' @title Cornish-Fisher expansion +#' +#' @aliases Cornish-Fisher dCornishFisher pCornishFisher +#' qCornishFisher rCornishFisher +#' +#' @description Density, distribution function, quantile function and random +#' generation using Cornish-Fisher approximation. #' #' @details CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) -#' \itemize{ -#' \item \code{dCornishFisher} Computes Cornish-Fisher density from two term +#' \code{dCornishFisher} Computes Cornish-Fisher density from two term Edgeworth +#' expansion given mean, standard deviation, skewness and excess kurtosis. 
+#' \code{pCornishFisher} Computes Cornish-Fisher CDF from two term Edgeworth
+#' expansion given mean, standard deviation, skewness and excess kurtosis.
+#' \code{qCornishFisher} Computes Cornish-Fisher quantiles from two term
 #' Edgeworth expansion given mean, standard deviation, skewness and excess
 #' kurtosis.
-#' \item \code{pCornishFisher} Computes Cornish-Fisher CDF from two term
-#' Edgeworth expansion given mean, standard deviation, skewness and excess
-#' kurtosis.
-#' \item \code{qCornishFisher} Computes Cornish-Fisher quantiles from two term
-#' Edgeworth expansion given mean, standard deviation, skewness and excess
-#' kurtosis.
-#' \item \code{rCornishFisher} simulate observations based on Cornish-Fisher
-#' quantile expansion given mean, standard deviation, skewness and excess
-#' kurtosis.}
+#' \code{rCornishFisher} simulates observations based on Cornish-Fisher quantile
+#' expansion given mean, standard deviation, skewness and excess kurtosis.
 #'
 #' @param n scalar; number of simulated values in random simulation, sample
 #' length in density, distribution and quantile functions.

Added: pkg/FactorAnalytics/R/Misc.R
===================================================================
--- pkg/FactorAnalytics/R/Misc.R	                        (rev 0)
+++ pkg/FactorAnalytics/R/Misc.R	2014-11-20 04:38:47 UTC (rev 3554)
@@ -0,0 +1,16 @@
+#' @title Miscellaneous Imported functions
+#'
+#' @details Only unique directives are saved to the 'NAMESPACE' file, so one
+#' can repeat them as needed to maintain a close link between the functions
+#' where they are needed and the namespace file.
+#'
+#' @importFrom PerformanceAnalytics checkData VaR chart.TimeSeries chart.ACFplus
+#' chart.Histogram chart.QQPlot Return.cumulative
+#' @importFrom robust lmRob step.lmRob
+#' @importFrom leaps regsubsets
+#' @importFrom lars lars cv.lars
+#' @importFrom lmtest coeftest.default
+#' @importFrom sandwich vcovHC.default vcovHAC.default
+#' @importFrom lattice barchart panel.barchart panel.grid
+#' @importFrom corrplot corrplot
+#' @importFrom strucchange efp
\ No newline at end of file

Modified: pkg/FactorAnalytics/R/fitTsfm.R
===================================================================
--- pkg/FactorAnalytics/R/fitTsfm.R	2014-11-19 10:00:09 UTC (rev 3553)
+++ pkg/FactorAnalytics/R/fitTsfm.R	2014-11-20 04:38:47 UTC (rev 3554)
@@ -165,8 +165,13 @@
 #'                factor.names=colnames(managers[,(7:9)]),
 #'                rf.name="US 3m TR", data=managers,
 #'                variable.selection="lars", lars.criterion="cv")
-#'
-#' @export
+#'
+#' @importFrom PerformanceAnalytics checkData
+#' @importFrom robust lmRob step.lmRob
+#' @importFrom leaps regsubsets
+#' @importFrom lars lars cv.lars
+#'
+#' @export

 fitTsfm <- function(asset.names, factor.names, mkt.name=NULL, rf.name=NULL,
                     data=data, fit.method=c("OLS","DLS","Robust"),
@@ -317,8 +322,8 @@
       lm.args$weights <- WeightsDLS(nrow(reg.xts), decay)
       reg.list[[i]] <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args))
     } else if (fit.method == "Robust") {
-      reg.list[[i]] <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts),
-                                                lmRob.args))
+      reg.list[[i]] <- do.call(lmRob, c(list(fm.formula,data=reg.xts),
+                                        lmRob.args))
     }
   }
   reg.list
@@ -349,10 +354,8 @@
       lm.fit <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args))
       reg.list[[i]] <- do.call(step, c(list(lm.fit),step.args))
     } else if (fit.method == "Robust") {
-      lmRob.fit <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts),
-                                            lmRob.args))
-      reg.list[[i]] <- do.call(robust::step.lmRob, c(list(lmRob.fit),
-                                                     step.args))
+      lmRob.fit <- do.call(lmRob,
c(list(fm.formula,data=reg.xts), lmRob.args)) + reg.list[[i]] <- do.call(step.lmRob, c(list(lmRob.fit), step.args)) } } reg.list @@ -381,8 +384,8 @@ } # choose best subset of factors depending on specified subset size - fm.subsets <- do.call(leaps::regsubsets, c(list(fm.formula,data=reg.xts), - regsubsets.args)) + fm.subsets <- do.call(regsubsets, c(list(fm.formula,data=reg.xts), + regsubsets.args)) sum.sub <- summary(fm.subsets) names.sub <- names(which(sum.sub$which[as.character(subset.size),-1]==TRUE)) reg.xts <- na.omit(dat.xts[,c(i,names.sub)]) @@ -394,8 +397,8 @@ lm.args$weights <- WeightsDLS(nrow(reg.xts), decay) reg.list[[i]] <- do.call(lm, c(list(fm.formula,data=reg.xts),lm.args)) } else if (fit.method == "Robust") { - reg.list[[i]] <- do.call(robust::lmRob, c(list(fm.formula,data=reg.xts), - lmRob.args)) + reg.list[[i]] <- do.call(lmRob, c(list(fm.formula,data=reg.xts), + lmRob.args)) } } reg.list @@ -424,12 +427,11 @@ # convert to matrix reg.mat <- as.matrix(reg.xts) # fit lars regression model - lars.fit <- do.call(lars::lars, c(list(x=reg.mat[,factor.names], - y=reg.mat[,i]),lars.args)) + lars.fit <- do.call(lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i]), + lars.args)) lars.sum <- summary(lars.fit) - lars.cv <- - do.call(lars::cv.lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i], - mode="step"),cv.lars.args)) + lars.cv <- do.call(cv.lars, c(list(x=reg.mat[,factor.names],y=reg.mat[,i], + mode="step"),cv.lars.args)) # including plot.it=FALSE to cv.lars strangely gives an error: "Argument s # out of range". And, specifying index=seq(nrow(lars.fit$beta)-1) resolves # the issue, but care needs to be taken for small N Modified: pkg/FactorAnalytics/R/fmEsDecomp.R =================================================================== --- pkg/FactorAnalytics/R/fmEsDecomp.R 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/fmEsDecomp.R 2014-11-20 04:38:47 UTC (rev 3554) @@ -78,6 +78,8 @@ #' # get the component contributions #' ES.decomp$cES #' +#' @importFrom PerformanceAnalytics VaR +#' #' @export fmEsDecomp <- function(object, ...){ Modified: pkg/FactorAnalytics/R/fmVaRDecomp.R =================================================================== --- pkg/FactorAnalytics/R/fmVaRDecomp.R 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/fmVaRDecomp.R 2014-11-20 04:38:47 UTC (rev 3554) @@ -73,6 +73,8 @@ #' # get the component contributions #' VaR.decomp$cVaR #' +#' @importFrom PerformanceAnalytics VaR +#' #' @export fmVaRDecomp <- function(object, ...){ Modified: pkg/FactorAnalytics/R/paFm.r =================================================================== --- pkg/FactorAnalytics/R/paFm.r 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/paFm.r 2014-11-20 04:38:47 UTC (rev 3554) @@ -40,6 +40,8 @@ #' # without benchmark #' fm.attr <- paFm(fit) #' +#' @importFrom PerformanceAnalytics Return.cumulative +#' #' @export #' @@ -85,8 +87,7 @@ xts(rep(NA, length(date)), dates)) } else { attr.ret.xts <- actual.xts - - xts(as.matrix(fit.lm$model[i])%*%as.matrix(fit.lm$coef[i]), - dates) + xts(as.matrix(fit.lm$model[i])%*%as.matrix(fit.lm$coef[i]), dates) cum.attr.ret[k, i] <- cum.ret - Return.cumulative(actual.xts-attr.ret.xts) attr.ret.xts.all <- merge(attr.ret.xts.all, attr.ret.xts) Modified: pkg/FactorAnalytics/R/plot.pafm.r =================================================================== --- pkg/FactorAnalytics/R/plot.pafm.r 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/plot.pafm.r 2014-11-20 04:38:47 UTC (rev 3554) @@ -3,7 +3,6 @@ #' 
Generic function of plot method for paFm. #' Either plot all assets or choose a single asset to plot. #' -#' #' @param x object of class \code{"pafm"} created by #' \code{paFm}. #' @param which.plot Integer indicates which plot to create: "none" will @@ -23,7 +22,9 @@ #' 3 = time series of attributed returns #' @param ... more arguements for \code{chart.TimeSeries} used for plotting #' time series +#' #' @author Yi-An Chen. +#' #' @examples #' \dontrun{ #' data(managers) @@ -37,12 +38,15 @@ #' plot(fm.attr, plot.single=TRUE, fundName="HAM1") #' } #' +#' @importFrom PerformanceAnalytics chart.TimeSeries +#' #' @method plot pafm #' @export #' plot.pafm <- function(x, which.plot=c("none","1L","2L","3L"),max.show=6, date=NULL,plot.single=FALSE,fundName, which.plot.single=c("none","1L","2L","3L"),...) { + # ... for chart.TimeSeries if (is.null(date)){ date = index(x[[3]][[1]])[1] @@ -63,12 +67,14 @@ "1L" = { bar <- c(x$cum.spec.ret[fundName],x$cum.ret.attr.f[fundName,]) names(bar)[1] <- "specific.returns" - barplot(bar, horiz=TRUE, main="cumulative attributed returns", las=1) + barplot(bar, horiz=TRUE, main="cumulative attributed returns", + las=1) }, "2L" ={ bar <- coredata(x$attr.list[[fundName]][as.Date(date)]) tryCatch({barplot(bar, horiz=TRUE, main=fundName, las=1)}, - error=function(e){cat("\this date is not available for this asset.\n")}) + error=function(e){cat("\this date is not available for + this asset.\n")}) }, "3L" = { chart.TimeSeries(x$attr.list[[fundName]], @@ -77,6 +83,7 @@ }, invisible()) } + # plot all assets else { which.plot<-which.plot[1] Modified: pkg/FactorAnalytics/R/plot.tsfm.r =================================================================== --- pkg/FactorAnalytics/R/plot.tsfm.r 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/plot.tsfm.r 2014-11-20 04:38:47 UTC (rev 3554) @@ -115,6 +115,13 @@ #' # group plot; type selected from menu prompt; auto-looped for multiple plots #' # plot(fit.macro) #' +#' @importFrom PerformanceAnalytics chart.TimeSeries chart.ACFplus +#' chart.Histogram chart.QQPlot +#' @importFrom lattice barchart panel.barchart panel.grid +#' @importFrom corrplot corrplot +#' @importFrom strucchange efp +#' @importFrom robust lmRob +#' #' @method plot tsfm #' @export @@ -219,8 +226,7 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.rec <- strucchange::efp(formula(fit), type="Rec-CUSUM", - data=fit$model) + cusum.rec <- efp(formula(fit), type="Rec-CUSUM", data=fit$model) plot(cusum.rec, main=paste("Recursive CUSUM test:",i), las=las, col=colorset, ...) }, "11L" = { @@ -228,8 +234,7 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.ols <- strucchange::efp(formula(fit), type="OLS-CUSUM", - data=fit$model) + cusum.ols <- efp(formula(fit), type="OLS-CUSUM", data=fit$model) plot(cusum.ols, main=paste("OLS-based CUSUM test:",i), las=las, col=colorset, ...) }, "12L" = { @@ -237,8 +242,7 @@ if (!x$fit.method=="OLS") { stop("CUSUM analysis applicable only for 'OLS' fit.method.") } - cusum.est <- strucchange::efp(formula(fit), type="RE", - data=fit$model) + cusum.est <- efp(formula(fit), type="RE", data=fit$model) plot(cusum.est, functional=NULL, col=colorset, las=0, main=paste("RE test (Recursive estimates test):",i), ...) 
}, "13L" = { @@ -269,8 +273,7 @@ width=24, by.column=FALSE, align="right") } else if (x$fit.method=="Robust") { rollReg.Rob <- function(data.z, formula) { - coef(robust::lmRob(formula=formula, - data=as.data.frame(data.z))) + coef(lmRob(formula=formula, data=as.data.frame(data.z))) } reg.z <- zoo(fit$model, as.Date(rownames(fit$model))) rollReg.z <- rollapply(reg.z, width=24, FUN=rollReg.Rob, @@ -372,13 +375,13 @@ "6L" = { ## Factor Model Residual Correlation cor.resid <- cor(residuals(x), use="pairwise.complete.obs") - corrplot::corrplot(cor.resid, ...) + corrplot(cor.resid, ...) # mtext("pairwise complete obs", line=0.5) }, "7L" = { ## Factor Model Return Correlation cor.fm <- cov2cor(fmCov(x)) - corrplot::corrplot(cor.fm, ...) + corrplot(cor.fm, ...) # mtext("pairwise complete obs", line=0.5) }, "8L" = { Modified: pkg/FactorAnalytics/R/predict.tsfm.r =================================================================== --- pkg/FactorAnalytics/R/predict.tsfm.r 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/predict.tsfm.r 2014-11-20 04:38:47 UTC (rev 3554) @@ -30,6 +30,8 @@ #' rownames(newdata) <- rownames(fit$data) #' pred.fit2 <- predict(fit, newdata, interval="confidence") #' +#' @importFrom PerformanceAnalytics checkData +#' #' @method predict tsfm #' @export #' Modified: pkg/FactorAnalytics/R/summary.tsfm.r =================================================================== --- pkg/FactorAnalytics/R/summary.tsfm.r 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/R/summary.tsfm.r 2014-11-20 04:38:47 UTC (rev 3554) @@ -53,6 +53,9 @@ #' # summary of lm fit for a single asset #' summary(fit$asset.fit[[1]]) #' +#' @importFrom lmtest coeftest.default +#' @importFrom sandwich vcovHC.default vcovHAC.default +#' #' @method summary tsfm #' @export @@ -74,11 +77,11 @@ # extract coefficients separately for "lars" variable.selection method for (i in object$asset.names) { if (se.type=="HC") { - sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], - vcov.=vcovHC)[,1:4] + sum[[i]]$coefficients <- coeftest.default(object$asset.fit[[i]], + vcov.=vcovHC.default)[,1:4] } else if (se.type=="HAC") { - sum[[i]]$coefficients <- coeftest(object$asset.fit[[i]], - vcov.=vcovHAC)[,1:4] + sum[[i]]$coefficients <- coeftest.default(object$asset.fit[[i]], + vcov.=vcovHAC.default)[,1:4] } } Added: pkg/FactorAnalytics/data/managers.rda =================================================================== (Binary files differ) Property changes on: pkg/FactorAnalytics/data/managers.rda ___________________________________________________________________ Added: svn:mime-type + application/octet-stream Modified: pkg/FactorAnalytics/man/CornishFisher.Rd =================================================================== --- pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-19 10:00:09 UTC (rev 3553) +++ pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-20 04:38:47 UTC (rev 3554) @@ -1,11 +1,11 @@ % Generated by roxygen2 (4.0.1): do not edit by hand \name{dCornishFisher} +\alias{Cornish-Fisher} \alias{dCornishFisher} \alias{pCornishFisher} \alias{qCornishFisher} \alias{rCornishFisher} -\title{Density, distribution function, quantile function and random -generation for the Cornish-Fisher distribution.} +\title{Cornish-Fisher expansion} \usage{ dCornishFisher(x, n, skew, ekurt) @@ -38,23 +38,19 @@ } \description{ Density, distribution function, quantile function and random -generation for the Cornish-Fisher distribution. +generation using Cornish-Fisher approximation. 
} \details{ CDF(q) = Pr(sqrt(n)*(x_bar-mu)/sigma < q) -\itemize{ -\item \code{dCornishFisher} Computes Cornish-Fisher density from two term +\code{dCornishFisher} Computes Cornish-Fisher density from two term Edgeworth +expansion given mean, standard deviation, skewness and excess kurtosis. +\code{pCornishFisher} Computes Cornish-Fisher CDF from two term Edgeworth +expansion given mean, standard deviation, skewness and excess kurtosis. +\code{qCornishFisher} Computes Cornish-Fisher quantiles from two term Edgeworth expansion given mean, standard deviation, skewness and excess kurtosis. -\item \code{pCornishFisher} Computes Cornish-Fisher CDF from two term -Edgeworth expansion given mean, standard deviation, skewness and excess -kurtosis. -\item \code{qCornishFisher} Computes Cornish-Fisher quantiles from two term -Edgeworth expansion given mean, standard deviation, skewness and excess -kurtosis. -\item \code{rCornishFisher} simulate observations based on Cornish-Fisher -quantile expansion given mean, standard deviation, skewness and excess -kurtosis.} +\code{rCornishFisher} simulates observations based on Cornish-Fisher quantile +expansion given mean, standard deviation, skewness and excess kurtosis. } \examples{ \dontrun{ Added: pkg/FactorAnalytics/man/managers.Rd =================================================================== --- pkg/FactorAnalytics/man/managers.Rd (rev 0) +++ pkg/FactorAnalytics/man/managers.Rd 2014-11-20 04:38:47 UTC (rev 3554) @@ -0,0 +1,37 @@ +\name{managers} +\docType{data} +\alias{managers} +\title{Hypothetical Alternative Asset Manager and Benchmark Data} +\description{ +This dataset and its documentation have been duplicated from +\code{\link[PerformanceAnalytics]{managers}} in the \code{PerformanceAnalytics} package. \code{managers} is used in the examples and vignette of the +\code{factorAnalytics} package. + +An xts object that contains columns of monthly returns for six hypothetical +asset managers (HAM1 through HAM6), the EDHEC Long-Short Equity hedge +fund index, the S\&P 500 total returns, and total return series for +the US Treasury 10-year bond and 3-month bill. Monthly returns for +all series end in December 2006 and begin at different periods starting +from January 1996. + +Note that all the EDHEC indices are available in \code{\link{edhec}}. +} +\usage{managers} +\details{ + Please note that the `managers' data set included with PerformanceAnalytics will be periodically updated with new managers and information. If you intend to use this data set in automated tests, please be sure to subset your data like \code{managers[1:120,1:6]} to use the first ten years of observations on HAM1-HAM6.
+} +\format{CSV conformed into an xts object with monthly observations} +\examples{ +data(managers) + +#preview the data +head(managers) + +#summary period statistics +summary(managers) + +#cumulative returns +tail(cumprod(1+managers),1) +} +\keyword{datasets} +\keyword{ ts } \ No newline at end of file From noreply at r-forge.r-project.org Fri Nov 21 12:25:12 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 21 Nov 2014 12:25:12 +0100 (CET) Subject: [Returnanalytics-commits] r3555 - in pkg/FactorAnalytics: R man vignettes Message-ID: <20141121112512.C5191183C81@r-forge.r-project.org> Author: pragnya Date: 2014-11-21 12:25:12 +0100 (Fri, 21 Nov 2014) New Revision: 3555 Modified: pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/R/fitTsfm.control.R pkg/FactorAnalytics/man/fitTsfm.Rd pkg/FactorAnalytics/man/fitTsfm.control.Rd pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf Log: Added option: best model from a range of subset sizes using subsets regression Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-11-20 04:38:47 UTC (rev 3554) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-21 11:25:12 UTC (rev 3555) @@ -25,11 +25,12 @@ #' Bayesian Information Criterion (BIC) or Akaike Information Criterion (AIC), #' improves. And, "subsets" enables subsets selection using #' \code{\link[leaps]{regsubsets}}; chooses the best performing subset of any -#' given size. See \code{\link{fitTsfm.control}} for more details on the -#' control arguments. \code{variable.selection="lars"} corresponds to least -#' angle regression using \code{\link[lars]{lars}} with variants "lasso", -#' "lar", "forward.stagewise" or "stepwise". Note: If -#' \code{variable.selection="lars"}, \code{fit.method} will be ignored. +#' given size or within a range of subset sizes. See +#' \code{\link{fitTsfm.control}} for more details on the control arguments. +#' \code{variable.selection="lars"} corresponds to least angle regression +#' using \code{\link[lars]{lars}} with variants "lasso", "lar", "stepwise" or +#' "forward.stagewise". Note: If \code{variable.selection="lars"}, +#' \code{fit.method} will be ignored. #' #' Arguments \code{mkt.name} and \code{mkt.timing} allow for market-timing #' factors to be added to any of the above methods. 
Market timing accounts for @@ -197,6 +198,7 @@ # extract arguments to pass to different fit and variable selection functions decay <- control$decay + nvmin <- control$nvmin subset.size <- control$subset.size lars.criterion <- control$lars.criterion m1 <- match(c("weights","model","x","y","qr"), @@ -208,7 +210,7 @@ m3 <- match(c("scope","scale","direction","trace","steps","k"), names(control), 0L) step.args <- control[m3, drop=TRUE] - m4 <- match(c("weights","nbest","nvmax","force.in","force.out","method", + m4 <- match(c("weights","nvmax","force.in","force.out","method", "really.big"), names(control), 0L) regsubsets.args <- control[m4, drop=TRUE] m5 <- match(c("type","normalize","eps","max.steps","trace"), @@ -268,7 +270,7 @@ } else if (variable.selection == "subsets") { reg.list <- SelectAllSubsets(dat.xts, asset.names, factor.names, fit.method, lm.args, lmRob.args, regsubsets.args, - subset.size, decay) + nvmin, subset.size, decay) } else if (variable.selection == "lars") { result.lars <- SelectLars(dat.xts, asset.names, factor.names, lars.args, cv.lars.args, lars.criterion) @@ -365,8 +367,8 @@ ### method variable.selection = "subsets" # SelectAllSubsets <- function(dat.xts, asset.names, factor.names, fit.method, - lm.args, lmRob.args, regsubsets.args, subset.size, - decay) { + lm.args, lmRob.args, regsubsets.args, nvmin, + subset.size, decay) { # initialize list object to hold the fitted objects reg.list <- list() @@ -387,7 +389,19 @@ fm.subsets <- do.call(regsubsets, c(list(fm.formula,data=reg.xts), regsubsets.args)) sum.sub <- summary(fm.subsets) - names.sub <- names(which(sum.sub$which[as.character(subset.size),-1]==TRUE)) + + # choose best model of a given subset.size (or) + # best model amongst subset sizes in [nvmin, nvmax] + if (!is.null(subset.size)) { + names.sub <- names(which(sum.sub$which[subset.size,-1]==TRUE)) + bic <- sum.sub$bic[subset.size - nvmin + 1] + } else { + best.size <- which.min(sum.sub$bic[nvmin:length(sum.sub$bic)]) + nvmin -1 + names.sub <- names(which(sum.sub$which[best.size,-1]==TRUE)) + bic <- min(sum.sub$bic[nvmin:length(sum.sub$bic)]) + } + + # completely remove NA cases for chosen subset reg.xts <- na.omit(dat.xts[,c(i,names.sub)]) # fit based on time series regression method chosen Modified: pkg/FactorAnalytics/R/fitTsfm.control.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-20 04:38:47 UTC (rev 3554) +++ pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-21 11:25:12 UTC (rev 3555) @@ -63,7 +63,7 @@ #' @param k the multiple of the number of degrees of freedom used for the #' penalty in \code{"stepwise"}. Only \code{k = 2} gives the genuine AIC. #' \code{k = log(n)} is sometimes referred to as BIC or SBC. Default is 2. -#' @param nbest number of subsets of each size to record for \code{"subsets"}. +#' @param nvmin minimum size of subsets to examine for \code{"subsets"}. #' Default is 1. #' @param nvmax maximum size of subsets to examine for \code{"subsets"}. #' Default is 8. @@ -77,9 +77,10 @@ #' "exhaustive". #' @param really.big option for \code{"subsets"}; Must be \code{TRUE} to #' perform exhaustive search on more than 50 variables. -#' @param subset.size number of factors required in the factor model; -#' an option for \code{"subsets"} variable selection. Default is 1. -#' Note: \code{nvmax >= subset.size >= length(force.in)}. +#' @param subset.size number of factors required in the factor model; an +#' option for \code{"subsets"} variable selection. 
\code{NULL} selects the +#' best model (BIC) from amongst subset sizes in [\code{nvmin},\code{nvmax}]. +#' Default is \code{NULL}. #' @param type option for \code{"lars"}. One of "lasso", "lar", #' "forward.stagewise" or "stepwise". The names can be abbreviated to any #' unique substring. Default is "lasso". @@ -133,11 +134,11 @@ fitTsfm.control <- function(decay=0.95, weights, model=TRUE, x=FALSE, y=FALSE, qr=TRUE, nrep=NULL, scope, scale, direction, - trace=FALSE, steps=1000, k=2, nbest=1, nvmax=8, + trace=FALSE, steps=1000, k=2, nvmin=1, nvmax=8, force.in=NULL, force.out=NULL, method, - really.big=FALSE, subset.size=1, type, + really.big=FALSE, subset.size=NULL, type, normalize=TRUE, eps=.Machine$double.eps, max.steps, - lars.criterion="Cp", K = 10) { + lars.criterion="Cp", K=10) { # get the user-specified arguments (that have no defaults) # this part of the code was adapted from stats::lm @@ -171,13 +172,15 @@ if (!is.logical(really.big) || length(really.big) != 1) { stop("Invalid argument: control parameter 'really.big' must be logical") } - if (subset.size <= 0 || round(subset.size) != subset.size) { - stop("control parameter 'subset.size' must be a positive integer") + if (!is.null(subset.size)) { + if (subset.size <= 0 || round(subset.size) != subset.size) { + stop("Control parameter 'subset.size' must be a positive integer or NULL") + } + if (nvmax < subset.size || subset.size < length(force.in)) { + stop("Invalid argument: nvmax should be >= subset.size and subset.size + should be >= length(force.in)") + } } - if (nvmax < subset.size || subset.size < length(force.in)) { - stop("Invalid argument: nvmax should be >= subset.size and subset.size - should be >= length(force.in)") - } if (!is.logical(normalize) || length(normalize) != 1) { stop("Invalid argument: control parameter 'normalize' must be logical") } @@ -187,7 +190,7 @@ # return list of arguments with defaults if they are unspecified result <- c(args, list(decay=decay, model=model, x=x, y=y, qr=qr, nrep=nrep, - trace=trace, steps=steps, k=k, nbest=nbest, + trace=trace, steps=steps, k=k, nvmin=nvmin, nvmax=nvmax, force.in=force.in, force.out=force.out, really.big=really.big, subset.size=subset.size, normalize=normalize, eps=eps, Modified: pkg/FactorAnalytics/man/fitTsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-20 04:38:47 UTC (rev 3554) +++ pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-21 11:25:12 UTC (rev 3555) @@ -108,11 +108,12 @@ Bayesian Information Criterion (BIC) or Akaike Information Criterion (AIC), improves. And, "subsets" enables subsets selection using \code{\link[leaps]{regsubsets}}; chooses the best performing subset of any -given size. See \code{\link{fitTsfm.control}} for more details on the -control arguments. \code{variable.selection="lars"} corresponds to least -angle regression using \code{\link[lars]{lars}} with variants "lasso", -"lar", "forward.stagewise" or "stepwise". Note: If -\code{variable.selection="lars"}, \code{fit.method} will be ignored. +given size or within a range of subset sizes. See +\code{\link{fitTsfm.control}} for more details on the control arguments. +\code{variable.selection="lars"} corresponds to least angle regression +using \code{\link[lars]{lars}} with variants "lasso", "lar", "stepwise" or +"forward.stagewise". Note: If \code{variable.selection="lars"}, +\code{fit.method} will be ignored.
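The nvmin/nvmax selection added in this revision reduces to a which.min() over the BIC vector reported by summary() on a regsubsets fit. A stand-alone sketch of that logic on simulated data (the variable names are illustrative, not the package's internals):

library(leaps)
set.seed(1)
X <- matrix(rnorm(200*6), 200, 6, dimnames=list(NULL, paste0("F", 1:6)))
y <- X[,1] - 0.5*X[,3] + rnorm(200)        # only F1 and F3 carry signal
ss <- summary(regsubsets(X, y, nvmax=6))
nvmin <- 2                                 # lower bound on subset size, as in the control
best.size <- which.min(ss$bic[nvmin:6]) + nvmin - 1
names(which(ss$which[best.size, -1]))      # members of the BIC-best subset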
Arguments \code{mkt.name} and \code{mkt.timing} allow for market-timing factors to be added to any of the above methods. Market timing accounts for Modified: pkg/FactorAnalytics/man/fitTsfm.control.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-20 04:38:47 UTC (rev 3554) +++ pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-21 11:25:12 UTC (rev 3555) @@ -5,9 +5,9 @@ \usage{ fitTsfm.control(decay = 0.95, weights, model = TRUE, x = FALSE, y = FALSE, qr = TRUE, nrep = NULL, scope, scale, direction, - trace = FALSE, steps = 1000, k = 2, nbest = 1, nvmax = 8, + trace = FALSE, steps = 1000, k = 2, nvmin = 1, nvmax = 8, force.in = NULL, force.out = NULL, method, really.big = FALSE, - subset.size = 1, type, normalize = TRUE, eps = .Machine$double.eps, + subset.size = NULL, type, normalize = TRUE, eps = .Machine$double.eps, max.steps, lars.criterion = "Cp", K = 10) } \arguments{ @@ -56,7 +56,7 @@ penalty in \code{"stepwise"}. Only \code{k = 2} gives the genuine AIC. \code{k = log(n)} is sometimes referred to as BIC or SBC. Default is 2.} -\item{nbest}{number of subsets of each size to record for \code{"subsets"}. +\item{nvmin}{minimum size of subsets to examine for \code{"subsets"}. Default is 1.} \item{nvmax}{maximum size of subsets to examine for \code{"subsets"}. @@ -76,9 +76,10 @@ \item{really.big}{option for \code{"subsets"}; Must be \code{TRUE} to perform exhaustive search on more than 50 variables.} -\item{subset.size}{number of factors required in the factor model; -an option for \code{"subsets"} variable selection. Default is 1. -Note: \code{nvmax >= subset.size >= length(force.in)}.} +\item{subset.size}{number of factors required in the factor model; an +option for \code{"subsets"} variable selection. \code{NULL} selects the +best model (BIC) from amongst subset sizes in [\code{nvmin},\code{nvmax}]. +Default is \code{NULL}.} \item{type}{option for \code{"lars"}. One of "lasso", "lar", "forward.stagewise" or "stepwise". The names can be abbreviated to any Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw =================================================================== --- pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-20 04:38:47 UTC (rev 3554) +++ pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-21 11:25:12 UTC (rev 3555) @@ -158,7 +158,7 @@ plot(fit2, which.plot.group=5, loop=FALSE, xlim=c(0,0.043)) @ -By adding more factors in fit1 and fit2, though the R-squared values have improved (when compared to Sharpe's single index model), one might prefer to employ variable selection methods such as \verb"stepwise", \verb"subsets" or \verb"lars" to avoid over-fitting. The method can be selected via the \code{variable.selection} argument. The default \verb"none", uses all the factors and performs no variable selection. \verb"stepwise" performs traditional forward or backward stepwise OLS regression, starting from an initial (given) set of factors and adds factors only if the regression fit, as measured by the Bayesian Information Criterion (BIC) or Akaike Information Criterion (AIC), improves. \verb"subsets" enables subsets selection using \code{regsubsets}; chooses the best performing subset of any given size. \verb"lars" corresponds to least angle regression using \code{lars} with variants "lasso", "lar", "forward.stagewise" or "stepwise". 
+By adding more factors in fit1 and fit2, though the R-squared values have improved (when compared to Sharpe's single index model), one might prefer to employ variable selection methods such as \verb"stepwise", \verb"subsets" or \verb"lars" to avoid over-fitting. The method can be selected via the \code{variable.selection} argument. The default \verb"none", uses all the factors and performs no variable selection. \verb"stepwise" performs traditional forward or backward stepwise OLS regression, starting from an initial (given) set of factors and adds factors only if the regression fit, as measured by the Bayesian Information Criterion (BIC) or Akaike Information Criterion (AIC), improves. \verb"subsets" enables subsets selection using \code{regsubsets}; chooses the best performing subset of any given size or within a range of subset sizes. \verb"lars" corresponds to least angle regression using \code{lars} with variants "lasso", "lar", "forward.stagewise" or "stepwise". Remarks: \begin{itemize} @@ -205,15 +205,16 @@ \item \verb"lm": "weights","model","x","y","qr" \item \verb"lmRob": "weights","model","x","y","nrep" \item \verb"step": "scope","scale","direction","trace","steps","k" -\item \verb"regsubsets": "weights","nbest","nvmax","force.in","force.out","method","really.big" +\item \verb"regsubsets": "weights","nvmax","force.in","force.out","method","really.big" \item \verb"lars": "type","normalize","eps","max.steps","trace" \item \verb"cv.lars": "K","type","normalize","eps","max.steps","trace" \end{itemize} -There are 3 other important arguments passed to \code{fitTsfm.control} that determine the type of factor model fit chosen. +There are 4 other arguments passed to \code{fitTsfm.control} that determine the type of factor model fit chosen. \begin{itemize} \item \verb"decay": Determines the decay factor for \code{fit.method="DLS"}, which performs exponentially weighted least squares, with weights adding to unity. -\item \verb"subset.size": Number of factors required in the factor model when performing \verb"subsets" selection. This might be meaningful when looking for the best model of a certain size (perhaps for parsimony, perhaps to compare with a different model of the same size, perhaps to avoid over-fitting/ data dredging etc.) +\item \verb"nvmin": The lower limit for the range of subset sizes from which the best model (BIC) is found when performing \verb"subsets" selection. Note that the upper limit was already passed to \verb"regsubsets" function. +\item \verb"subset.size": Number of factors required in the factor model when performing \verb"subsets" selection. This might be meaningful when looking for the best model of a certain size (perhaps for parsimony, perhaps to compare with a different model of the same size, perhaps to avoid over-fitting/ data dredging etc.) Alternately, users can specify \code{NULL} to get the best model from amongst subset sizes in the range \code{[nvmin,nvmax]}. \item \verb"lars.criterion": An option (one of "Cp" or "cv") to assess model selection for the \code{"lars"} variable selection method. "Cp" is Mallow's Cp statistic and "cv" is K-fold cross-validated mean squared prediction error. 
\end{itemize} Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf =================================================================== (Binary files differ) From noreply at r-forge.r-project.org Fri Nov 21 13:26:00 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 21 Nov 2014 13:26:00 +0100 (CET) Subject: [Returnanalytics-commits] r3556 - in pkg/FactorAnalytics: R man vignettes Message-ID: <20141121122600.6A61F183C81@r-forge.r-project.org> Author: pragnya Date: 2014-11-21 13:26:00 +0100 (Fri, 21 Nov 2014) New Revision: 3556 Modified: pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/R/fitTsfm.control.R pkg/FactorAnalytics/man/fitTsfm.Rd pkg/FactorAnalytics/man/fitTsfm.control.Rd pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf Log: Removed subset.size control parameter for parsimony. Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-11-21 11:25:12 UTC (rev 3555) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-21 12:26:00 UTC (rev 3556) @@ -159,7 +159,7 @@ #' fit.sub <- fitTsfm(asset.names=colnames(managers[,(1:6)]), #' factor.names=colnames(managers[,(7:9)]), #' data=managers, variable.selection="subsets", -#' method="exhaustive", subset.size=2) +#' method="exhaustive", nvmin=2) #' #' # example using "lars" variable selection and subtracting risk-free rate #' fit.lar <- fitTsfm(asset.names=colnames(managers[,(1:6)]), @@ -199,7 +199,6 @@ # extract arguments to pass to different fit and variable selection functions decay <- control$decay nvmin <- control$nvmin - subset.size <- control$subset.size lars.criterion <- control$lars.criterion m1 <- match(c("weights","model","x","y","qr"), names(control), 0L) @@ -270,7 +269,7 @@ } else if (variable.selection == "subsets") { reg.list <- SelectAllSubsets(dat.xts, asset.names, factor.names, fit.method, lm.args, lmRob.args, regsubsets.args, - nvmin, subset.size, decay) + nvmin, decay) } else if (variable.selection == "lars") { result.lars <- SelectLars(dat.xts, asset.names, factor.names, lars.args, cv.lars.args, lars.criterion) @@ -368,7 +367,7 @@ # SelectAllSubsets <- function(dat.xts, asset.names, factor.names, fit.method, lm.args, lmRob.args, regsubsets.args, nvmin, - subset.size, decay) { + decay) { # initialize list object to hold the fitted objects reg.list <- list() @@ -390,16 +389,12 @@ regsubsets.args)) sum.sub <- summary(fm.subsets) - # choose best model of a given subset.size (or) + # choose best model of a given subset size nvmax=nvmin (or) # best model amongst subset sizes in [nvmin, nvmax] - if (!is.null(subset.size)) { - names.sub <- names(which(sum.sub$which[subset.size,-1]==TRUE)) - bic <- sum.sub$bic[subset.size - nvmin + 1] - } else { - best.size <- which.min(sum.sub$bic[nvmin:length(sum.sub$bic)]) + nvmin -1 - names.sub <- names(which(sum.sub$which[best.size,-1]==TRUE)) - bic <- min(sum.sub$bic[nvmin:length(sum.sub$bic)]) - } + nvmax <- length(sum.sub$bic) + best.size <- which.min(sum.sub$bic[nvmin:nvmax]) + nvmin -1 + names.sub <- names(which(sum.sub$which[best.size,-1]==TRUE)) + bic <- min(sum.sub$bic[nvmin:nvmax]) # completely remove NA cases for chosen subset reg.xts <- na.omit(dat.xts[,c(i,names.sub)]) Modified: pkg/FactorAnalytics/R/fitTsfm.control.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-21 11:25:12 UTC (rev 3555) +++
pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-21 12:26:00 UTC (rev 3556) @@ -77,10 +77,6 @@ #' "exhaustive". #' @param really.big option for \code{"subsets"}; Must be \code{TRUE} to #' perform exhaustive search on more than 50 variables. -#' @param subset.size number of factors required in the factor model; an -#' option for \code{"subsets"} variable selection. \code{NULL} selects the -#' best model (BIC) from amongst subset sizes in [\code{nvmin},\code{nvmax}]. -#' Default is \code{NULL}. #' @param type option for \code{"lars"}. One of "lasso", "lar", #' "forward.stagewise" or "stepwise". The names can be abbreviated to any #' unique substring. Default is "lasso". @@ -120,7 +116,7 @@ #' @examples #' #' # check argument list passed by fitTsfm.control -#' tsfm.ctrl <- fitTsfm.control(method="exhaustive", subset.size=2) +#' tsfm.ctrl <- fitTsfm.control(method="exhaustive", nvmin=2) #' print(tsfm.ctrl) #' #' # used internally by fitTsfm @@ -128,7 +124,7 @@ #' fit <- fitTsfm(asset.names=colnames(managers[,(1:6)]), #' factor.names=colnames(managers[,(7:9)]), #' data=managers, variable.selection="subsets", -#' method="exhaustive", subset.size=2) +#' method="exhaustive", nvmin=2) #' #' @export @@ -136,8 +132,8 @@ qr=TRUE, nrep=NULL, scope, scale, direction, trace=FALSE, steps=1000, k=2, nvmin=1, nvmax=8, force.in=NULL, force.out=NULL, method, - really.big=FALSE, subset.size=NULL, type, - normalize=TRUE, eps=.Machine$double.eps, max.steps, + really.big=FALSE, type, normalize=TRUE, + eps=.Machine$double.eps, max.steps, lars.criterion="Cp", K=10) { # get the user-specified arguments (that have no defaults) @@ -172,14 +168,12 @@ if (!is.logical(really.big) || length(really.big) != 1) { stop("Invalid argument: control parameter 'really.big' must be logical") } - if (!is.null(subset.size)) { - if (subset.size <= 0 || round(subset.size) != subset.size) { - stop("Control parameter 'subset.size' must be a positive integer or NULL") - } - if (nvmax < subset.size || subset.size < length(force.in)) { - stop("Invalid argument: nvmax should be >= subset.size and subset.size + if (nvmin <= 0 || round(nvmin) != nvmin) { + stop("Control parameter 'nvmin' must be a positive integer") + } + if (nvmax < nvmin || nvmin < length(force.in)) { + stop("Invalid argument: nvmax should be >= nvmin and nvmin should be >= length(force.in)") - } } if (!is.logical(normalize) || length(normalize) != 1) { stop("Invalid argument: control parameter 'normalize' must be logical") } @@ -192,8 +186,7 @@ # return list of arguments with defaults if they are unspecified result <- c(args, list(decay=decay, model=model, x=x, y=y, qr=qr, nrep=nrep, trace=trace, steps=steps, k=k, nvmin=nvmin, nvmax=nvmax, force.in=force.in, force.out=force.out, - really.big=really.big, subset.size=subset.size, - normalize=normalize, eps=eps, + really.big=really.big, normalize=normalize, eps=eps, lars.criterion=lars.criterion, K=K)) return(result) } Modified: pkg/FactorAnalytics/man/fitTsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-21 11:25:12 UTC (rev 3555) +++ pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-21 12:26:00 UTC (rev 3556) @@ -160,7 +160,7 @@ fit.sub <- fitTsfm(asset.names=colnames(managers[,(1:6)]), factor.names=colnames(managers[,(7:9)]), data=managers, variable.selection="subsets", - method="exhaustive", subset.size=2) + method="exhaustive", nvmin=2) # example using "lars" variable selection and subtracting risk-free rate fit.lar <- fitTsfm(asset.names=colnames(managers[,(1:6)]), Modified: pkg/FactorAnalytics/man/fitTsfm.control.Rd
=================================================================== --- pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-21 11:25:12 UTC (rev 3555) +++ pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-21 12:26:00 UTC (rev 3556) @@ -6,9 +6,9 @@ fitTsfm.control(decay = 0.95, weights, model = TRUE, x = FALSE, y = FALSE, qr = TRUE, nrep = NULL, scope, scale, direction, trace = FALSE, steps = 1000, k = 2, nvmin = 1, nvmax = 8, - force.in = NULL, force.out = NULL, method, really.big = FALSE, - subset.size = NULL, type, normalize = TRUE, eps = .Machine$double.eps, - max.steps, lars.criterion = "Cp", K = 10) + force.in = NULL, force.out = NULL, method, really.big = FALSE, type, + normalize = TRUE, eps = .Machine$double.eps, max.steps, + lars.criterion = "Cp", K = 10) } \arguments{ \item{decay}{a scalar in (0, 1] to specify the decay factor for "DLS". @@ -76,11 +76,6 @@ \item{really.big}{option for \code{"subsets"}; Must be \code{TRUE} to perform exhaustive search on more than 50 variables.} -\item{subset.size}{number of factors required in the factor model; an -option for \code{"subsets"} variable selection. \code{NULL} selects the -best model (BIC) from amongst subset sizes in [\code{nvmin},\code{nvmax}]. -Default is \code{NULL}.} - \item{type}{option for \code{"lars"}. One of "lasso", "lar", "forward.stagewise" or "stepwise". The names can be abbreviated to any unique substring. Default is "lasso".} @@ -144,7 +139,7 @@ } \examples{ # check argument list passed by fitTsfm.control -tsfm.ctrl <- fitTsfm.control(method="exhaustive", subset.size=2) +tsfm.ctrl <- fitTsfm.control(method="exhaustive", nvmin=2) print(tsfm.ctrl) # used internally by fitTsfm @@ -152,7 +147,7 @@ fit <- fitTsfm(asset.names=colnames(managers[,(1:6)]), factor.names=colnames(managers[,(7:9)]), data=managers, variable.selection="subsets", - method="exhaustive", subset.size=2) + method="exhaustive", nvmin=2) } \author{ Sangeetha Srinivasan Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw =================================================================== --- pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-21 11:25:12 UTC (rev 3555) +++ pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-21 12:26:00 UTC (rev 3556) @@ -178,7 +178,7 @@ fit.sub <- fitTsfm(asset.names=colnames(managers[,(1:6)]), factor.names=colnames(managers[,(7:9)]), data=managers, rf.name="US 3m TR", mkt.name="SP500 TR", - variable.selection="subsets", subset.size=4) + variable.selection="subsets", nvmin=4, nvmax=4) fit.sub$beta fit.sub$r2 @ @@ -213,8 +213,7 @@ There are 4 other arguments passed to \code{fitTsfm.control} that determine the type of factor model fit chosen. \begin{itemize} \item \verb"decay": Determines the decay factor for \code{fit.method="DLS"}, which performs exponentially weighted least squares, with weights adding to unity. -\item \verb"nvmin": The lower limit for the range of subset sizes from which the best model (BIC) is found when performing \verb"subsets" selection. Note that the upper limit was already passed to \verb"regsubsets" function. -\item \verb"subset.size": Number of factors required in the factor model when performing \verb"subsets" selection. This might be meaningful when looking for the best model of a certain size (perhaps for parsimony, perhaps to compare with a different model of the same size, perhaps to avoid over-fitting/ data dredging etc.) Alternately, users can specify \code{NULL} to get the best model from amongst subset sizes in the range \code{[nvmin,nvmax]}. 
+\item \verb"nvmin": The lower limit for the range of subset sizes from which the best model (BIC) is found when performing \verb"subsets" selection. Note that the upper limit was already passed to \verb"regsubsets" function. By specifying \code{nvmin=nvmax}, users can obtain the best model of a particular size (meaningful to those who want a parsimonious model, or to compare with a different model of the same size, or perhaps to avoid over-fitting/ data dredging etc.). \item \verb"lars.criterion": An option (one of "Cp" or "cv") to assess model selection for the \code{"lars"} variable selection method. "Cp" is Mallow's Cp statistic and "cv" is K-fold cross-validated mean squared prediction error. \end{itemize} Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf =================================================================== (Binary files differ) From noreply at r-forge.r-project.org Fri Nov 21 15:03:49 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 21 Nov 2014 15:03:49 +0100 (CET) Subject: [Returnanalytics-commits] r3557 - pkg/PerformanceAnalytics/src Message-ID: <20141121140349.9483C1877E4@r-forge.r-project.org> Author: rossbennett34 Date: 2014-11-21 15:03:47 +0100 (Fri, 21 Nov 2014) New Revision: 3557 Added: pkg/PerformanceAnalytics/src/Makevars Log: Adding Makevars to explicitly link to BLAS and LAPACK Added: pkg/PerformanceAnalytics/src/Makevars =================================================================== --- pkg/PerformanceAnalytics/src/Makevars (rev 0) +++ pkg/PerformanceAnalytics/src/Makevars 2014-11-21 14:03:47 UTC (rev 3557) @@ -0,0 +1 @@ +PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS) \ No newline at end of file From noreply at r-forge.r-project.org Fri Nov 21 15:40:20 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Fri, 21 Nov 2014 15:40:20 +0100 (CET) Subject: [Returnanalytics-commits] r3558 - pkg/PerformanceAnalytics Message-ID: <20141121144020.9BD2E183BF8@r-forge.r-project.org> Author: braverock Date: 2014-11-21 15:40:20 +0100 (Fri, 21 Nov 2014) New Revision: 3558 Modified: pkg/PerformanceAnalytics/DESCRIPTION Log: - bump version after Makevars change Modified: pkg/PerformanceAnalytics/DESCRIPTION =================================================================== --- pkg/PerformanceAnalytics/DESCRIPTION 2014-11-21 14:03:47 UTC (rev 3557) +++ pkg/PerformanceAnalytics/DESCRIPTION 2014-11-21 14:40:20 UTC (rev 3558) @@ -12,7 +12,7 @@ , person(given="Kyle",family="Balkissoon",role="ctb") , person(given="Diethelm",family="Wuertz",role="ctb") ) -Version: 1.4.3541 +Version: 1.4.3558 Date: $Date$ Description: Collection of econometric functions for performance and risk analysis. This package aims to aid From noreply at r-forge.r-project.org Sat Nov 22 18:58:57 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sat, 22 Nov 2014 18:58:57 +0100 (CET) Subject: [Returnanalytics-commits] r3559 - in pkg/FactorAnalytics: . R inst/tests man vignettes Message-ID: <20141122175857.1DF95187645@r-forge.r-project.org> Author: pragnya Date: 2014-11-22 18:58:56 +0100 (Sat, 22 Nov 2014) New Revision: 3559 Modified: pkg/FactorAnalytics/NAMESPACE pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/R/fitTsfm.control.R pkg/FactorAnalytics/inst/tests/test-fitTSFM.r pkg/FactorAnalytics/man/fitTsfm.control.Rd pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf Log: fitTsfm control internalized. 
Re-ordered beta in fitTsfm for compatibility Modified: pkg/FactorAnalytics/NAMESPACE =================================================================== --- pkg/FactorAnalytics/NAMESPACE 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/NAMESPACE 2014-11-22 17:58:56 UTC (rev 3559) @@ -17,7 +17,6 @@ S3method(summary,tsfm) export(dCornishFisher) export(fitTsfm) -export(fitTsfm.control) export(fmCov) export(fmEsDecomp) export(fmSdDecomp) Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-22 17:58:56 UTC (rev 3559) @@ -281,14 +281,18 @@ return(result) } - # extract the fitted factor models, coefficients, r2 values and residual vol - # from returned factor model fits above + # extract coefficients from fitted factor models returned above coef.mat <- makePaddedDataFrame(lapply(reg.list, coef)) alpha <- coef.mat[, 1, drop=FALSE] # to get alpha of class numeric, do: alpha <- coef.mat[,1] beta <- coef.mat[, -1, drop=FALSE] - # reorder the columns to match factor names vector - beta <- subset(beta, select=factor.names) + # reorder and expand columns of beta to match factor.names + tmp <- matrix(NA, length(asset.names), length(factor.names)) + colnames(tmp) <- factor.names + rownames(tmp) <- asset.names + beta <- merge(beta, tmp, all.x=TRUE, sort=FALSE)[,factor.names] + rownames(beta) <- asset.names + # extract r2 and residual sd r2 <- sapply(reg.list, function(x) summary(x)$r.squared) resid.sd <- sapply(reg.list, function(x) summary(x)$sigma) # create list of return values. Modified: pkg/FactorAnalytics/R/fitTsfm.control.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-22 17:58:56 UTC (rev 3559) @@ -114,19 +114,19 @@ #' \code{\link[lars]{cv.lars}} #' #' @examples -#' +#' \dontrun{ #' # check argument list passed by fitTsfm.control #' tsfm.ctrl <- fitTsfm.control(method="exhaustive", nvmin=2) #' print(tsfm.ctrl) +#' } #' -#' # used internally by fitTsfm +#' # used internally by fitTsfm in the example below #' data(managers) #' fit <- fitTsfm(asset.names=colnames(managers[,(1:6)]), #' factor.names=colnames(managers[,(7:9)]), #' data=managers, variable.selection="subsets", #' method="exhaustive", nvmin=2) -#' -#' @export +#' fitTsfm.control <- function(decay=0.95, weights, model=TRUE, x=FALSE, y=FALSE, qr=TRUE, nrep=NULL, scope, scale, direction, Modified: pkg/FactorAnalytics/inst/tests/test-fitTSFM.r =================================================================== --- pkg/FactorAnalytics/inst/tests/test-fitTSFM.r 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/inst/tests/test-fitTSFM.r 2014-11-22 17:58:56 UTC (rev 3559) @@ -2,7 +2,6 @@ test_that("fitTsfm is as expected", { - # fit Carhart 4-factor model using lm fpath <- system.file("extdata", "timeSeriesReturns.csv", package="factorAnalytics") returns.z <- read.zoo(file=fpath, header=TRUE, sep=",", as.is=TRUE, @@ -11,6 +10,8 @@ assets <- names(returns.z)[1:30] ex.rets <- returns.z[,assets]-returns.z$rf carhart <- returns.z[,c("mktrf","smb","hml","umd")] + + # fit Carhart 4-factor model using lm ff4 <- lm(ex.rets ~ carhart) sum4 = summary(ff4) rsq4 <- as.numeric(sapply(X = sum4, FUN = "[", "r.squared")) @@ -20,11 +21,11 @@ Sigma.R <- t(beta.hat) %*% Sigma.F %*% beta.hat + Sigma.eps^2 # fit Carhart 4-factor
model via fitTsfm - ff.mod <- fitTsfm( - asset.names = assets, - factor.names = c("mktrf","smb","hml","umd"), - data = cbind(ex.rets,carhart)) + ff.mod <- fitTsfm(asset.names=assets, + factor.names=c("mktrf","smb","hml","umd"), + data=cbind(ex.rets,carhart)) + # compare beta and r2 expect_that(as.matrix(ff.mod$beta),is_equivalent_to(t(coef(ff4)[-1,]))) expect_that(as.numeric(ff.mod$r2), equals(as.numeric(sapply(X=sum4, FUN="[", "r.squared")))) Modified: pkg/FactorAnalytics/man/fitTsfm.control.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-22 17:58:56 UTC (rev 3559) @@ -138,11 +138,13 @@ \code{\link[lars]{cv.lars}}. } \examples{ +\dontrun{ # check argument list passed by fitTsfm.control tsfm.ctrl <- fitTsfm.control(method="exhaustive", nvmin=2) print(tsfm.ctrl) +} -# used internally by fitTsfm +# used internally by fitTsfm in the example below data(managers) fit <- fitTsfm(asset.names=colnames(managers[,(1:6)]), factor.names=colnames(managers[,(7:9)]), Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw =================================================================== --- pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-21 14:40:20 UTC (rev 3558) +++ pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw 2014-11-22 17:58:56 UTC (rev 3559) @@ -164,7 +164,7 @@ \begin{itemize} \item Variable selection methods \verb"stepwise" and \verb"subsets" can be combined with any of the fit methods, "OLS", "DLS" or "Robust". \item If variable selection method selected is \verb"lars", \code{fit.method} will be ignored. -\item Refer to the section on \code{fitTsfm.control} for more details on the control arguments to the different variable selection methods. +\item Refer to the next section on \code{fitTsfm control} for more details on the control arguments that can be passed to the different variable selection methods. \end{itemize} The next example uses the \verb"lars" variable selection method. The default type and criterion used are \verb"lasso" and the \verb"Cp" statistic. The \verb"subsets" variable selection method is demonstrated next for comparison using the same set of factors. However, the best subset of size 4 for each asset is chosen. Figures 3 and 4 display the factor betas from the two fits. @@ -192,15 +192,11 @@ @ \newpage -\subsection{fitTsfm.control} +\subsection{fitTsfm control} Since \code{fitTsfm} calls many different regression fitting and variable selection methods, it made sense to collect all the optional controls for these functions and process them via \code{fitTsfm.control}. This function is meant to be used internally by \code{fitTsfm} when arguments are passed to it via the ellipsis. The use of control parameters was demonstrated with subset.size in the fit.sub example earlier. -<>= -args(fitTsfm.control) -@ - -Here's an ordered list of control parameters passed by \code{fitTsfm} matched with their respective functions for easy reference. See the corresponding help files for more details on each parameter. +For easy reference, here's a classified list of control parameters accepted and passed by \code{fitTsfm} to their respective model fitting or model selection functions in other packages. See the corresponding help files for more details on each parameter.
\begin{itemize} \item \verb"lm": "weights","model","x","y","qr" \item \verb"lmRob": "weights","model","x","y","nrep" @@ -210,13 +206,15 @@ \item \verb"cv.lars": "K","type","normalize","eps","max.steps","trace" \end{itemize} -There are 4 other arguments passed to \code{fitTsfm.control} that determine the type of factor model fit chosen. +There are 3 other significant arguments that can be passed through the \code{...} argument to \code{fitTsfm}. \begin{itemize} \item \verb"decay": Determines the decay factor for \code{fit.method="DLS"}, which performs exponentially weighted least squares, with weights adding to unity. \item \verb"nvmin": The lower limit for the range of subset sizes from which the best model (BIC) is found when performing \verb"subsets" selection. Note that the upper limit was already passed to \verb"regsubsets" function. By specifying \code{nvmin=nvmax}, users can obtain the best model of a particular size (meaningful to those who want a parsimonious model, or to compare with a different model of the same size, or perhaps to avoid over-fitting/ data dredging etc.). \item \verb"lars.criterion": An option (one of "Cp" or "cv") to assess model selection for the \code{"lars"} variable selection method. "Cp" is Mallow's Cp statistic and "cv" is K-fold cross-validated mean squared prediction error. \end{itemize} +\newpage \subsection{Summary, Predict, Coefficients, Fitted values and Residuals} <<>>= Modified: pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf =================================================================== (Binary files differ) From noreply at r-forge.r-project.org Sun Nov 23 15:28:24 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sun, 23 Nov 2014 15:28:24 +0100 (CET) Subject: [Returnanalytics-commits] r3560 - in pkg/PerformanceAnalytics/sandbox: . qwafafew2014 qwafafew2014/R qwafafew2014/data Message-ID: <20141123142824.C0B47186A0C@r-forge.r-project.org> Author: peter_carl Date: 2014-11-23 15:28:24 +0100 (Sun, 23 Nov 2014) New Revision: 3560 Added: pkg/PerformanceAnalytics/sandbox/qwafafew2014/ pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/ pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/Baily_LopezdePrado_stop-out.R pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/mbb.R pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/table.RiskStats.R pkg/PerformanceAnalytics/sandbox/qwafafew2014/data/ pkg/PerformanceAnalytics/sandbox/qwafafew2014/data/Futures Trend 201409a.csv pkg/PerformanceAnalytics/sandbox/qwafafew2014/qwafafew2014.R pkg/PerformanceAnalytics/sandbox/qwafafew2014/qwafafew2014.Rpres Log: - presentation and code for Chicago chapter QWAFAFEW presentation Added: pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/Baily_LopezdePrado_stop-out.R =================================================================== --- pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/Baily_LopezdePrado_stop-out.R (rev 0) +++ pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/Baily_LopezdePrado_stop-out.R 2014-11-23 14:28:24 UTC (rev 3560) @@ -0,0 +1,108 @@ +# -------------------------------------------------------------------- +# Max Quantile Loss at Confidence level MLdP +# -------------------------------------------------------------------- + +MaxQL<-function(R, confidence=0.95, type=c("ac","normal"), ...)
{ + # @TODO Handle multi-column output + if(!is.vector(R)) + x = checkData(R[,1]) + else + x = checkData(R) + x = na.omit(x) + if(type[1]=="ac"){ + mu = mean(x, na.rm = TRUE) + sigma_infinity = StdDev(x) + phi = cov(x[-1],x[-length(x)])/(cov(x[-length(x)])) + phi=.5 + sigma = sigma_infinity*((1-phi^2)^0.5) + dPi0 = 0 + minQ = minQ(phi, mu, sigma, dPi0, confidence) + } + if(type[1]=="normal"){ + minQ = minQ_norm(x, confidence) + } + MaxQL=min(0,minQ[[1]]) +# rownames(MaxQL) = paste0("MaxQL (", confidence*100, "%)") #,"t*") + return(MaxQL) +} + +# -------------------------------------------------------------------- +# Max Quantile Loss at Confidence level - First-order AC +# -------------------------------------------------------------------- +getQ <- function(bets, phi, mu, sigma, dPi0, confidence) { + # Compute analytical solution to quantile + #1) Mean (eq 15) + mean=(phi^(bets+1)-phi)/(1-phi)*(dPi0-mu)+mu*bets # wrong? + #2) Variance (eq 15) + var=sigma^2/(phi-1)^2 + var=var*((phi^(2*(bets+1))-1)/(phi^2-1)-2*(phi^(bets+1)-1)/(phi-1)+bets+1) + #3) Quantile + q=mean+qnorm(1-confidence)*(var^0.5) + #print(sprintf("bets %g, mean %g, var %g, var1 %g, var2 %g, var3 %g, q %g", bets, mean, var, var1, var2, var3, q)) + q +} + +goldenSection<-function(a, b, FUN, minimum = TRUE, ...) { + FUN = match.fun(FUN) + tol = 10^-9 + sign = 1 + + if(minimum) sign = -1 + N = round(ceiling(-2.078087*log(tol/abs(b-a)))) + r = 0.618033989 + c = 1.0 - r + x1 = r*a + c*b + x2 = c*a + r*b + f1 = sign * FUN(x1,...=...) + f2 = sign * FUN(x2,...=...) + #print(f1); print(f2) + for(i in 1:N){ + if(f1>f2){ + a = x1 + x1 = x2 + f1 = f2 + x2 = c*a+r*b + f2 = sign*FUN(x2,...=...) + } else { + b = x2 + x2 = x1 + f2 = f1 + x1 = r*a + c*b + f1 = sign*FUN(x1,...=...) + } + } + if(f1 < f2){ + return(list(minQ=sign*f1, t=x1)) + } else { + return(list(minQ=sign*f2, t=x2)) + } +} + +minQ <- function(phi, mu, sigma, dPi0, confidence) { + q = 0 + bets = 0 + while (q <= 0) { + bets = bets + 1 + q = getQ(bets, phi, mu, sigma, dPi0, confidence) + } + #print(sprintf("bets %g, q %g", bets, q)) + goldenSection(0,bets,getQ,FALSE,phi=phi,mu=mu,sigma=sigma,dPi0=dPi0,confidence=confidence) +} + +# minQ(0.5, 1, 2, 1, 0.95) +# MinQ = -9.15585580378 +# Time at MinQ = 12.4832517718 + +# -------------------------------------------------------------------- +# Max Quantile Loss at Confidence level - Assuming IID (eq. 5) +# -------------------------------------------------------------------- + +minQ_norm<-function(x, confidence){ +# Calculate the maximum drawdown for a normal distribution, assuming iid returns + x = na.omit(x) + sd = StdDev(x) + mu = mean(x, na.rm = TRUE) + minQ = -((qnorm(1-confidence)*sd)^2)/(4*mu) + t = ((qnorm(1-confidence)*sd)/(2*mu))^2 + return(list(minQ=minQ,t=t)) +} Added: pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/mbb.R =================================================================== --- pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/mbb.R (rev 0) +++ pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/mbb.R 2014-11-23 14:28:24 UTC (rev 3560) @@ -0,0 +1,27 @@ +mbb <- function(R, N=length(R), k=6, nrep=1000, verbose=TRUE, cores=6){ +#' Moving blocks bootstrap +#' +#' Follows Efron and Tibshirani (sec. 8.6).
+#' With a nod to http://stat.wharton.upenn.edu/~buja/STAT-541/time-series-bootstrap.R + +#' N length of the resulting time series +#' k size of the moving blocks +#' nrep number of bootstrap replications + require(foreach) + require(doMC) + registerDoMC(cores) + .mdd <- function(R, N, k){ + result.mbb <- rep(NA, N) # local vector for a bootstrap replication + for(j in 1:ceiling(N/k)) { # fill the vector with random blocks + endpoint <- sample(k:N, size=1) # by randomly sampling endpoints + result.mbb[(j-1)*k+1:k] <- R[endpoint-(k:1)+1] # and copying blocks to the local vector + } + result <- as.matrix(result.mbb[1:N])# trim overflow when k doesn't divide N + return(result) + } + + result <- foreach(i=1:nrep, .combine=cbind, .inorder=TRUE) %dopar% { + .mdd(R=R, N=N, k=k) + } + return(result) +} \ No newline at end of file Added: pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/table.RiskStats.R =================================================================== --- pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/table.RiskStats.R (rev 0) +++ pkg/PerformanceAnalytics/sandbox/qwafafew2014/R/table.RiskStats.R 2014-11-23 14:28:24 UTC (rev 3560) @@ -0,0 +1,229 @@ +# Additional and re-organized tables for WB presentations + +table.RiskStats <- +function (R, ci = 0.95, scale = NA, Rf = 0, MAR = .1/12, p= 0.95, digits = 4) +{# @author Peter Carl + # Risk Statistics: Statistics and Stylized Facts + + y = checkData(R, method = "zoo") + if(!is.null(dim(Rf))) + Rf = checkData(Rf, method = "zoo") + # Set up dimensions and labels + columns = ncol(y) + rows = nrow(y) + columnnames = colnames(y) + rownames = rownames(y) + + if(is.na(scale)) { + freq = periodicity(y) + switch(freq$scale, + minute = {stop("Data periodicity too high")}, + hourly = {stop("Data periodicity too high")}, + daily = {scale = 252}, + weekly = {scale = 52}, + monthly = {scale = 12}, + quarterly = {scale = 4}, + yearly = {scale = 1} + ) + } + + # for each column, do the following: + for(column in 1:columns) { + x = na.omit(y[,column,drop=FALSE]) + # for each column, make sure that R and Rf are for the same dates + if(!is.null(dim(Rf))){ # if Rf is a column + z = merge(x,Rf) + zz = na.omit(z) + x = zz[,1,drop=FALSE] + Rf.subset = zz[,2,drop=FALSE] + } + else { # unless Rf is a single number + Rf.subset = Rf + } + + z = c( + Return.annualized(x, scale = scale), + StdDev.annualized(x, scale = scale), + SharpeRatio.annualized(x, scale = scale, Rf = Rf), + DownsideDeviation(x,MAR=0)*sqrt(scale),# Add annualization to this function + SortinoRatio(x)*sqrt(scale), # New function adds annualization + PerformanceAnalytics:::AverageDrawdown(x), + maxDrawdown(x), + SterlingRatio(x), + VaR(x, p=p,method="historical"), + ES(x, p=p,method="historical"), + skewness(x), + kurtosis(x), + VaR(x, p=p), + ES(x, p=p), + SharpeRatio(x, p=p, Rf=Rf, FUN="ES", annualize=TRUE), + length(x) + ) + znames = c( + "Annualized Return", + "Annualized Std Dev", + "Annualized Sharpe Ratio", + "Annualized Downside Deviation", + "Annualized Sortino Ratio", + "Average Drawdown", + "Maximum Drawdown", + "Sterling Ratio (10%)", + paste("Historical VaR (",base::round(p*100,1),"%)",sep=""), + paste("Historical ETL (",base::round(p*100,1),"%)",sep=""), + "Skewness", + "Excess Kurtosis", + paste("Modified VaR (",base::round(p*100,1),"%)",sep=""), + paste("Modified ETL (",base::round(p*100,1),"%)",sep=""), + paste("Annualized Modified Sharpe Ratio (ETL ", base::round(p*100,1),"%)",sep=""), + "# Obs" + ) + if(column == 1) { + resultingtable = data.frame(Value = z, row.names = 
znames) + } + else { + nextcolumn = data.frame(Value = z, row.names = znames) + resultingtable = cbind(resultingtable, nextcolumn) + } + } + colnames(resultingtable) = columnnames + ans = base::round(resultingtable, digits) + ans +} + +table.PerfStats <- +function (R, scale = NA, Rf = 0, digits = 4) +{# @author Peter Carl + # Performance Statistics: Statistics and Stylized Facts + + y = checkData(R) + if(!is.null(dim(Rf))) + Rf = checkData(Rf) + # Set up dimensions and labels + columns = ncol(y) + rows = nrow(y) + columnnames = colnames(y) + rownames = rownames(y) + + if(is.na(scale)) { + freq = periodicity(y) + switch(freq$scale, + minute = {stop("Data periodicity too high")}, + hourly = {stop("Data periodicity too high")}, + daily = {scale = 252}, + weekly = {scale = 52}, + monthly = {scale = 12}, + quarterly = {scale = 4}, + yearly = {scale = 1} + ) + } + + # for each column, do the following: + for(column in 1:columns) { + x = na.omit(y[,column,drop=FALSE]) + # for each column, make sure that R and Rf are for the same dates + if(!is.null(dim(Rf))){ # if Rf is a column + z = merge(x,Rf) + zz = na.omit(z) + x = zz[,1,drop=FALSE] + Rf.subset = zz[,2,drop=FALSE] + } + else { # unless Rf is a single number + Rf.subset = Rf + } + + z = c( + Return.cumulative(x), + Return.annualized(x, scale = scale), + StdDev.annualized(x, scale = scale), + length(subset(x, x>0)), + length(subset(x, x<=0)), + length(subset(x, x>0))/length(x), + mean(subset(x, x>0)), + mean(subset(x, x<=0)), + mean(x), + AverageDrawdown(x), + AverageRecovery(x) + ) + znames = c( + "Cumulative Return", + "Annualized Return", + "Annualized Std Dev", + "# Positive Months", + "# Negative Months", + "% Positive Months", + "Average Positive Month", + "Average Negative Month", + "Average Month", + "Average Drawdown", + "Average Months to Recovery" + ) + if(column == 1) { + resultingtable = data.frame(Value = z, row.names = znames) + } + else { + nextcolumn = data.frame(Value = z, row.names = znames) + resultingtable = cbind(resultingtable, nextcolumn) + } + } + colnames(resultingtable) = columnnames + ans = base::round(resultingtable, digits) + ans +} + +table.RiskContribution <- function(R, p, ..., weights=NULL, scale=NA, geometric = TRUE) { + + R = na.omit(R) + if(is.null(weights)) { + message("no weights passed in, assuming equal weighted portfolio") + weights = rep(1/dim(R)[[2]], dim(R)[[2]]) + } + if (is.na(scale)) { + freq = periodicity(R) + switch(freq$scale, minute = { + stop("Data periodicity too high") + }, hourly = { + stop("Data periodicity too high") + }, daily = { + scale = 252 + }, weekly = { + scale = 52 + }, monthly = { + scale = 12 + }, quarterly = { + scale = 4 + }, yearly = { + scale = 1 + }) + } + + # Returns + # ret.col = colMeans(R)*weights + ret.col = Return.annualized(R, geometric=geometric)*weights + percret.col = ret.col/sum(ret.col) + result = cbind(t(ret.col), t(percret.col)) + # Standard Deviation + sd.cols = StdDev(R, weights=weights, invert=TRUE, portfolio_method="component", p=(1-1/12)) + result = cbind(sd.cols$contribution*sqrt(scale), sd.cols$pct_contrib_StdDev, result) + # VaR? 
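# Aside (a sketch, not part of the committed file): each portfolio_method="component"
# call below returns an additive risk decomposition. For example, with the edhec data
# bundled in PerformanceAnalytics:
#   library(PerformanceAnalytics); data(edhec)
#   w <- rep(1/ncol(edhec), ncol(edhec))
#   va <- VaR(edhec, p=0.95, weights=w, method="gaussian", portfolio_method="component")
#   va$contribution      # per-asset pieces summing to the portfolio Gaussian VaR
#   va$pct_contrib_VaR   # percentage contributions, summing to one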
+ var.cols = VaR(R, weights=weights, method="gaussian", portfolio_method="component", p=(1-1/12)) + result = cbind(var.cols$contribution, var.cols$pct_contrib_VaR, result) + + mvar.cols = VaR(R, weights=weights, method="gaussian", portfolio_method="component", p=(1-1/12)) + result = cbind(mvar.cols$contribution, mvar.cols$pct_contrib_VaR, result) + + # ES + es.cols = ES(R, weights=weights, method="gaussian", portfolio_method="component", p=(1-1/12)) + result = cbind(es.cols$contribution, es.cols$pct_contrib_ES, result) + + mes.cols = ES(R, weights=weights, method="modified", portfolio_method="component", p=(1-1/12)) + result = cbind(weights, mes.cols$contribution, mes.cols$pct_contrib_MES, result) + total = colSums(result) + + result = rbind(result, colSums(result)) + rownames(result) = c(colnames(R),"Total") +# colnames(result) = c("Weights", "Contribution to mETL", "Percentage Contribution to mETL", "Contribution to gETL", "Percentage Contribution to gETL", "Contribution to Annualized StdDev", "Percentage Contribution to StdDev", "Contribution to Annualized E(R)", "Percentage Contribution to E(R)") + + colnames(result) = c("Weights", "Contribution to mETL", "%Contribution to mETL", "Contribution to gETL", "%Contribution to gETL", "Contribution to mVaR", "%Contribution to mVaR", "Contribution to gVaR", "%Contribution to gVaR", "Contribution to Annualized StdDev", "%Contribution to StdDev", "Contribution to Annualized E(R)", "%Contribution to E(R)") + return(result) + +} Added: pkg/PerformanceAnalytics/sandbox/qwafafew2014/data/Futures Trend 201409a.csv =================================================================== --- pkg/PerformanceAnalytics/sandbox/qwafafew2014/data/Futures Trend 201409a.csv (rev 0) +++ pkg/PerformanceAnalytics/sandbox/qwafafew2014/data/Futures Trend 201409a.csv 2014-11-23 14:28:24 UTC (rev 3560) @@ -0,0 +1,298 @@ +,Trend1,Trend2,Trend3,Trend4,Trend5,Trend6,Trend7,Trend8,Trend9,Trend10,Trend11,Trend12,Trend13,Trend14,Trend15,Trend16,Trend17,Trend18,Trend19,Trend20,Trend21,Trend22,EDHEC CTA,NewEdge CTA,NewEdge Trend Index,Barclay BTOP50 +1990-01-31,,,0.0373,,0.0563,,,,,,,,,,,,,,0.0365,,,,,,,0.0008 +1990-02-28,,,0.0389,,0.0245,,,,,,,,,,,,,,0.0181,,,,,,,0.0044 +1990-03-31,,,0.0308,,0.0568,,,,,,,,,,,,,,0.0945,,,,,,,0.0123 +1990-04-30,,,0.0242,,0.0834,,,,,,,,,,,,,,0.129,,,,,,,0.0299 +1990-05-31,,,-0.0519,,-0.1209,,,,,,,,,,,,,,-0.079,,,,,,,-0.0315 +1990-06-30,,,0.0347,,0.0455,,,,,,,,,,,,,,0.0249,,,,,,,0.0022 +1990-07-31,,,0.148,,0.0432,,,,,,,,,,,,,,0.2008,,,,,,,0.0363 +1990-08-31,,,0.0333,,0.0898,,-0.0216,,,,,,,,,,,,0.1854,,,,,,,0.0493 +1990-09-30,,,0.026,,0.0072,,-0.0262,,,,,,,,,,,,0.0857,,,,,,,0.0229 +1990-10-31,,,0.0703,,0.0213,,-0.1475,,,,,,,,,,,,-0.0036,,,,,,,0.0208 +1990-11-30,,,0.0291,,0.0007,,0.0901,,,,,,,,,,,,0.0031,,,,,,,0.0074 +1990-12-31,,,0.0053,,-0.0082,,-0.0112,,,,,,,,,,,,-0.0009,,,,,,,-0.0094 +1991-01-31,,,-0.056,,-0.0759,,0.0115,,,,,,,,,,,,-0.1594,,,,,,,-0.0493 +1991-02-28,,,0.0171,,-0.0258,,0.0801,,,,,,,,,,,,0.013,,,,,,,0.0024 +1991-03-31,,,0.0307,,0.1604,,0.237,,,,,,,,,,,,0.0243,,,,,,,0.0561 +1991-04-30,,,0.0018,,-0.0166,,-0.0329,,,,,,,,,,,,-0.137,,,,,,,0.0058 +1991-05-31,,,-0.0078,,0.0266,,-0.1306,,,,,,,,,,,,0.0294,,,,,,,-0.0238 +1991-06-30,,,0.0197,,0.0543,,0.1241,,,,,,,,,,,,0.0211,,,,,,,0.0287 +1991-07-31,,,-0.038,,-0.0854,,0.0899,,,,,,,,,,,,-0.0152,,,,,,,-0.0185 +1991-08-31,,,-0.0596,,-0.0292,,-0.0051,,,,,,,,,,,,-0.0633,,,,,,,-0.005 +1991-09-30,,,0.0109,,0.0211,,0.0323,,,,,,,,,,,,0.1161,,,,,,,0.0444 
+1991-10-31,,,-0.0008,,0.0031,,-0.0096,,,,,0.021,,,,,,,0.1661,,,,,,,-0.0146 +1991-11-30,,,0.0033,,-0.0209,,0.025,,,,,0.013,,,,,,,-0.0209,,,,,,,0.0146 +1991-12-31,,,0.1766,,0.1601,,0.1696,,,,,0.0963,,,,,,,0.3375,,,,,,,0.1063 +1992-01-31,,,-0.0815,,-0.0555,,-0.0677,,,,,-0.0496,,,,,,,-0.126,,,,,,,-0.0663 +1992-02-29,,,-0.0103,,-0.0504,,-0.0538,,,,,-0.0191,,,,,,,-0.06,,,,,,,-0.0232 +1992-03-31,,,-0.0044,,-0.0261,,0.0551,,,,,0.0141,,,,,,,-0.0547,,,,,,,-0.003 +1992-04-30,,,-0.0045,,-0.0222,,-0.0529,,,,,0.013,,,,,,,0.0031,,,,,,,-0.0316 +1992-05-31,,,0.0107,,-0.0226,,-0.0024,,,,,0.0212,,,,,,,-0.0571,,,,,,,0.0027 +1992-06-30,,,0.1392,,0.1064,,0.1174,,,,,0.0644,,,,,,,0.0658,,,,,,,0.0584 +1992-07-31,,,0.0816,,0.1114,,0.1656,,,,,0.0648,,,,,,,0.1652,,,,,,,0.074 +1992-08-31,,,0.0694,,0.0453,,0.0813,,,,,0.0392,,,,,,,0.0192,,,,,,,0.0515 +1992-09-30,,,0.0005,,-0.0043,,-0.1027,,,,,0.0077,,,,,,,-0.0034,,,,,,,-0.0208 +1992-10-31,,,-0.0323,,-0.0321,,0.0188,,,,,-0.0739,,,,,,,-0.0331,,,,,,,-0.0047 +1992-11-30,,,0.0265,,0.0424,,0.0233,,,,,0.0491,,,,,,,0.0465,,,,,,,0.0129 +1992-12-31,,,-0.0043,,-0.0011,,-0.025,,,,,-0.0309,,,,,,,-0.0454,,,,,,,-0.0157 +1993-01-31,,,-0.0255,,0.0031,,0.0423,,,,,-0.0429,,,,,,,-0.0421,,,,,,,-0.0091 +1993-02-28,,,0.0549,,0.1243,,0.0934,,,,,0.0394,,,,,,,0.061,,,,,,,0.0687 +1993-03-31,,,-0.0051,,-0.0309,,-0.0211,,,,,0.0171,,,,,,,0.0457,,,,,,,-0.0048 +1993-04-30,,,0.0516,,-0.0001,,0.0142,,,,,0.0449,,,,,,,0.0924,,,,,,,0.0317 +1993-05-31,,,-0.015,,0.0279,,-0.0102,,,,,0.0324,,,,,,,0.0488,,,,,,,0.0125 +1993-06-30,,,-0.0193,,0.0381,,0.0303,,,,,-0.01,,,,,,,-0.0122,,,,,,,0.0112 +1993-07-31,,,0.0478,,0.046,,0.0309,,,,,0.0436,,,,,,,0.066,,,,,,,0.0314 +1993-08-31,,,-0.0704,,-0.0612,,0.0081,,,,,-0.0328,,,,,,,-0.0528,,,,,,,-0.0015 +1993-09-30,,,0.0113,,-0.0707,,0.0361,,,,,0.0026,,,,,,,0.0116,-0.0816,,,,,,-0.0089 +1993-10-31,,,-0.0025,,-0.0545,,0.0206,,,,,-0.0486,,,,,,,-0.0659,0.0271,,,,,,-0.0092 +1993-11-30,,,0.0145,,-0.0231,,-0.0003,,,,,0.0738,,,,,,,0.0371,0.0562,,,,,,-0.0091 +1993-12-31,,,0.0824,,0.0417,,0.0284,,,,,0.0794,,,,,,,0.1283,0.032,,,,,,0.0166 +1994-01-31,,,-0.0773,,-0.0377,,0.0134,,,,,-0.0204,,,,,,,-0.0145,-0.0245,,,,,,-0.031 +1994-02-28,,,-0.0162,,-0.0845,,0.03,,,,,-0.0406,,,,,,,-0.0416,0.132,,,,,,-0.0215 +1994-03-31,,,0.0795,,0.0635,,0.0609,,,,,0.0452,,,,,,,0.0287,0.0676,,,,,,0.0209 +1994-04-30,,,-0.0205,,-0.0374,,-0.0343,,,,,-0.0203,,,,,,,-0.0839,0.0502,,,,,,-0.0083 +1994-05-31,,,0.0489,,0.0349,,-0.0291,,,,,0.0925,,,,,,,0.1501,0.0868,,,,,,0.0212 +1994-06-30,,,0.051,,0.149,,0.0028,,,,,0.0321,,,,,,,0.0147,-0.0705,,,,,,0.0348 +1994-07-31,,,-0.0254,,0.0253,,-0.117,,,,,-0.013,,,,,,,0.0098,-0.0674,,,,,,-0.0199 +1994-08-31,,,-0.0491,,-0.0335,,-0.0512,,,,,-0.0174,,,,,,,-0.0738,0.1755,,,,,,-0.0299 +1994-09-30,,,0.0355,,0.0348,,-0.0142,,,,,0.0115,,,,,,,0.0505,-0.0401,,,,,,0.0103 +1994-10-31,,,0.028,,0.005,,0.009,,,,,-0.0302,,,,,,,0.0543,0.019,,,,,,0.0087 +1994-11-30,,,0.0462,,0.0284,,0.045,,,,,0.0923,,,,,,,0.1424,0.0352,,,,,,0.0137 +1994-12-31,,,0.023,,-0.0356,,-0.0224,,,,,-0.0238,,,,,,,0.0106,0.0034,,,,,,-0.0065 +1995-01-31,,-0.0616,-0.0291,,-0.0287,,-0.0228,,,,,-0.0156,,,,,,,-0.0791,0.0393,,,,,,-0.0268 +1995-02-28,,0.0957,0.0645,,0.0485,,0.0119,,,,,-0.0149,,,,,,,0.0124,0.009,,,,,,0.0293 +1995-03-31,,0.1015,0.1461,,0.0402,,0.0452,,,,,0.0257,,,,,,,0.0663,0.0245,,,,,,0.0715 +1995-04-30,,0.0216,0.0424,,0.014,,0.0084,,,,,0.0563,,,,,,,0.0473,-0.0054,,,,,,0.0129 +1995-05-31,,0.0649,-0.0084,,-0.013,,0.0809,,,,,0.0182,,,,,,,0.0822,0.0692,,,,,,0.0148 
+1995-06-30,,0.0363,0.0102,,0.0008,,-0.0234,,,,,-0.0149,,,,,,,0.0011,-0.0966,,,,,,-0.0158 +1995-07-31,,-0.037,-0.0202,,-0.0549,,0.0104,,,,,-0.0568,,,,,,,-0.0875,0.0206,,,,,,-0.0178 +1995-08-31,,-0.0058,0.015,,0.0257,,0.068,,,,,-0.0489,,,,,,,-0.0534,-0.0578,,,,,,0.02 +1995-09-30,,0.0175,-0.0145,,-0.0275,,-0.0057,,,,,-0.0003,,,,,,,-0.0184,0.0849,,,,,,-0.0128 +1995-10-31,,-0.0381,-0.0016,,-0.0075,,0.0034,,,,,-0.0085,,,,,,,-0.0667,-0.0032,,,,,,0.0046 +1995-11-30,,0.0207,0.003,,0.0077,,0.0216,,,,,0.0239,,,,,,,-0.0019,0.0049,,,,,,0.0181 +1995-12-31,,0.057,0.0738,,0.0647,,-0.0064,0.1451,,,,0.144,,,,,,,0.1911,0.005,,,,,,0.0369 +1996-01-31,,0.0538,0.0717,,0.0377,,0.0545,0.0168,,,,-0.0426,,,,,,,-0.0685,0.0898,,,,,,0.0261 +1996-02-29,,-0.0665,-0.1056,,-0.0722,,-0.0007,-0.085,,,,-0.0673,,,,,,,-0.1378,0.0005,,,,,,-0.0495 +1996-03-31,,-0.0048,0.0106,,0.0341,,-0.003,0.1162,,,,0.0365,,,,,,,0.0966,0.0074,,,,,,0.0007 +1996-04-30,,0.0859,0.0549,,0.0515,,0.0558,0.0618,,,,0.1346,,,,0.044,,,0.1427,0.0269,,,,,,0.0374 +1996-05-31,,-0.044,-0.0547,,-0.0267,,0.0196,-0.0946,,,,-0.0352,,,,-0.024904215,,,-0.0941,-0.0079,,,,,,-0.017 +1996-06-30,,-0.003,0.0414,,0.0091,,0.0011,-0.0014,,,,0.0091,,,,0.004911591,,,0.0152,-0.0443,,,,,,0.0034 +1996-07-31,,0.0388,0.012,,-0.0113,,0.0058,-0.0723,,,,-0.0489,,,,-0.021505376,,,-0.063,0.052,,,,,,0.0002 +1996-08-31,,0.0726,-0.0129,,0.0209,,0.0304,0.0019,,,,-0.0045,,,,0.005994006,,,-0.0334,0.0082,,,,,,-0.0032 +1996-09-30,,0.0751,0.0257,,0.0173,,0.0277,-0.0152,,,,0.0906,,,,0.106256207,,,0.0603,-0.0127,,,,,,0.028 +1996-10-31,,0.1037,0.0978,,0.1336,,0.0351,0.1514,,,,0.0437,,,,0.086175943,,,0.1684,0.0481,,,,,,0.0523 +1996-11-30,,0.0146,0.0389,,0.1038,,0.0703,0.0505,,,,0.031,,,,0.098347107,,,0.0245,0.0477,,,,,,0.0478 +1996-12-31,,-0.0312,0.0123,,-0.0403,,-0.0219,0.0464,,,,-0.0409,,,,0.031602709,,,-0.0641,-0.0071,,,,,,-0.0154 +1997-01-31,,0.0964,0.0779,,0.0368,,0.0207,0.0222,,,,0.0504,,,,0.031363968,,,0.0528,0.0219,,,0.0393,,,0.0315 +1997-02-28,,0.0512,0.055,,0.0177,,-0.0041,0.0457,,,,0.0625,,,,0.064356436,,,0.0915,0.0579,,,0.0298,,,0.0251 +1997-03-31,,-0.0217,-0.0238,,-0.0208,,0.0167,0.0271,,,,0.0076,,,,-0.023255814,,,-0.015,0.0382,,,-0.0021,,,0.0027 +1997-04-30,,-0.0407,-0.0243,,-0.0256,,-0.0493,-0.0544,,,,0.0139,,,,-0.041496599,,,-0.0516,0.039,,,-0.017,,,-0.011 +1997-05-31,,-0.0062,0.0137,,-0.0174,,0.0401,0.0556,,,,-0.0036,,,,0.007806955,,,-0.0132,0.0325,,,-0.0015,,,-0.0131 +1997-06-30,,0.002,0.0065,,0.0319,,0.0034,-0.0332,,,,0.0142,,,,0.045774648,,,0.0038,0.0219,,,0.0085,,,0.0094 +1997-07-31,,0.1927,0.0753,,0.0689,,0.088,0.0275,,,,0.055,,,,0.094276094,,,0.0411,0.0512,,,0.0591,,,0.0497 +1997-08-31,,0.0102,-0.0708,,-0.0511,,-0.0221,-0.0571,,,,-0.0357,,,,-0.08,,,-0.0808,0.0033,,,-0.0473,,,-0.0303 +1997-09-30,,0.0187,0.0112,,0.0387,,0.05,-0.0283,,,,0.0119,,,,0.021404682,,,0.0495,-0.0026,,,0.0198,,,0.0063 +1997-10-31,-0.1297,-0.0858,-0.0168,,0.018,,-0.0077,0.0402,,,,-0.0911,,,,0.020956123,,,-0.0537,0.0035,,,-0.0098,,,-0.0008 +1997-11-30,0.0996,0.0572,-0.0009,,0.0039,,-0.0163,-0.0125,,,,-0.0127,,,,-0.019884541,,,0.021,-0.0383,,,0.0133,,,0.0158 +1997-12-31,0.0814,0.0795,0.0476,,0.0459,,0.0366,0.0756,,,,0.0647,,,,0.107984293,,,0.0746,-0.0118,,,0.0286,,,0.0209 +1998-01-31,0.015,0.0025,0.0289,,0.0281,,0.0166,0.0356,,,,0.0493,,,,-0.025398701,,,-0.009,0.0352,,,0.0104,,,0.0083 +1998-02-28,0.0327,0.0021,-0.0246,,-0.0259,,-0.0312,0.0269,,,-0.079,0.0471,,,,0.058181818,,,0.0409,-0.0355,,,-0.0065,,,-0.0121 
+1998-03-31,0.0738,0.0279,0.0118,,0.0412,,-0.0063,-0.0073,,,-0.026,0.0375,,,,0.012027491,,,-0.0445,-0.0449,,,0.0122,,,0.0185 +1998-04-30,-0.0163,-0.0543,-0.07,,-0.0637,,-0.1067,-0.0802,,,-0.114,-0.0429,,,,-0.027730617,,,-0.0445,-0.0154,,,-0.0296,,,-0.0313 +1998-05-31,0.0853,0.0355,0.0419,,0.0333,,0.0281,-0.0135,,,0.042,0.0137,,,,0.049476135,,,0.0261,0.0073,,,0.0193,,,0.023 +1998-06-30,0.0297,0.0136,0.0237,,0.0133,,-0.0219,0.0372,,,-0.076,0.0177,,,,0.039378813,,,-0.0234,0.0095,,,0.0051,,,0.0027 +1998-07-31,0.0151,-0.0475,-0.0459,,-0.0405,,-0.0346,-0.0222,,,-0.02,0.0162,,,,-0.003735326,,,-0.0083,0.0282,,,-0.001,,,0.0016 +1998-08-31,0.1099,0.1957,0.065,,0.0891,,0.1315,0.1543,,,0.278,0.0564,,,,0.198714515,,,0.2324,0.0948,,,0.0691,,,0.0554 +1998-09-30,0.0451,0.0193,0.0526,,0.0186,,0.0602,0.0251,,,0.106,0.0189,,,,0.00357462,,,-0.0333,0.0396,,,0.0454,,,0.046 +1998-10-31,-0.057,0.0086,-0.015,,0.0345,,0.0178,-0.0151,,,-0.095,-0.0214,,,,0.000445236,,,-0.1139,0.0042,,,0.0004,,,0.0181 +1998-11-30,0.0115,-0.0106,-0.0047,,-0.0083,,-0.0233,0.0296,,,0.058,-0.043,,,,0.024032043,,,0.0094,0.0119,,,-0.0089,,,-0.0238 +1998-12-31,0.095,0.027,0.0261,,0.0075,,0.0379,0.0569,,,-0.073,0.0379,,,,0.037809648,,,0.0467,0.008,,,0.0221,,,0.0216 +1999-01-31,-0.0138,-0.0386,-0.0396,,-0.0503,,-0.0257,-0.0769,,,-0.125,-0.0204,,,0.0119,-0.047319933,,,-0.1156,-0.0213,,,-0.0167,,,-0.0201 +1999-02-28,0.0361,0.0122,0.0324,,0.0254,,0.0327,0.0179,,,0.151,0.0442,,,0.0034,0.023296703,,,0.1335,-0.0035,,,0.0197,,,0.0241 +1999-03-31,-0.0398,-0.0277,0.013,,-0.0031,,0.0292,-0.0335,,,0.087,-0.0145,,,-0.0495,-0.021477663,,,-0.0943,-0.0079,,,-0.0065,,,-0.0066 +1999-04-30,0.1051,0.0311,0.0554,,0.0486,,0.0464,0.0592,,,-0.006,0.0328,,,0.0266,0.063652327,,,0.0752,-0.0268,,,0.021,,,0.0252 +1999-05-31,-0.0839,-0.031,-0.0297,,-0.036,,0.0044,-0.0075,,-0.0029,0.085,-0.0247,,,0.0217,-0.037969459,,-0.0266,-0.0609,0.0208,,,-0.015,,,-0.0093 +1999-06-30,0.0529,0.0451,0.0566,,0.0457,,0.0312,0.0354,,-0.0014,0.039,0.0226,,,0.0271,0.026598027,,0.0281,-0.0068,0.0069,,,0.0234,,,0.0305 +1999-07-31,-0.0201,0.0195,-0.0323,,0.005,,0.0235,0.0051,,-0.0222,0.086,0.0294,,,-0.0366,-0.027580443,,-0.0177,-0.0083,0.0232,,,-0.0051,,,-0.0163 +1999-08-31,-0.0347,-0.0251,0.0113,,0.0042,,0.0149,0.0174,,0.0213,-0.168,0.0161,,,0.0564,0.032660077,,-0.0173,0.0312,-0.0345,,,-0.0027,,,0.0099 +1999-09-30,-0.0017,0.0063,0.0086,,0.0145,,0.015,0.0002,,-0.0481,-0.073,0.0355,,,0.0156,0.026217228,,0.0112,0.0099,-0.0091,,,0.0064,,,-0.0001 +1999-10-31,-0.062,-0.0682,-0.1106,,-0.049,,-0.043,-0.055,,-0.048,-0.161,-0.0774,,,-0.059,-0.041768045,,-0.0526,-0.0957,-0.003,,,-0.0354,,,-0.0371 +1999-11-30,0.1393,0.0184,0.0236,,0.0147,,-0.0045,0.0681,,0.0701,0.094,0.037,,,0.003,0.047820567,,0.0426,0.1364,0.0263,,,0.0166,,,0.0155 +1999-12-31,0.0904,0.0429,0.0278,,0.0313,,-0.0037,0.0201,,0.0484,-0.073,0.0756,,,0.0883,0.027463651,,0.0111,0.0841,0.0072,,,0.0142,,,0.0028 +2000-01-31,-0.0396,0.0155,0.0187,,0.0306,,0.0087,-0.0038,,-0.0502,-0.003,-0.0102,,,0.0261,0.011399371,0.016249516,0.0418,0.0802,0.0708,,,0.0128,0.0234,0.0197,0.0071 +2000-02-29,0.0172,-0.0199,-0.0137,,-0.0071,,-0.061,-0.004,,0.0252,-0.098,0.0519,,,-0.0644,-0.001554606,0.019403525,-0.0154,-0.0905,-0.0039,,,-0.0022,-0.0078,-0.024,-0.019 +2000-03-31,-0.0328,-0.0229,-0.0426,,-0.0288,,0.0329,-0.0177,,-0.084,0.012,-0.0001,,,0.0282,-0.027247956,0.000101203,0.0714,-0.0416,-0.0067,,,-0.0138,-0.0188,-0.0303,-0.0251 
+2000-04-30,0.0206,0.0014,0.0085,,-0.0118,,-0.0425,-0.0252,,-0.0027,-0.009,0.0073,,,0.0178,-0.039215686,-0.02683367,-0.0285,0.0548,-0.0046,,,-0.0241,-0.0145,-0.034,-0.0255 +2000-05-31,-0.0026,0.0252,-0.016,,0.0162,,0.0453,0.0246,,0.0697,0.141,0.0031,,,-0.0315,0.069970845,0.024075047,0.0803,-0.0258,0.0161,0.0335,,0.0114,0.0013,-0.0115,0.0024 +2000-06-30,-0.0127,-0.0244,-0.0437,,0.0253,,-0.0328,-0.0118,,0.0155,0.082,0.0041,,,-0.0564,-0.053328143,0.007073294,-0.0416,-0.0219,-0.0108,-0.0225,,-0.0124,-0.0185,-0.0298,-0.0174 +2000-07-31,-0.0458,-0.0077,-0.0167,,-0.024,,-0.0656,-0.03,,-0.0125,0.165,-0.0071,,,0.0129,-0.021792763,-0.013435084,-0.0257,-0.0526,-0.0124,0.0088,,-0.0131,-0.0157,-0.0238,-0.0131 +2000-08-31,0.0323,0.0181,0.0353,,0.0291,,-0.0157,0.0264,,0.1268,0.039,0.0185,,,0.0394,0.034468264,0.041223074,0.0317,0.1176,-0.0197,-0.0429,,0.0189,0.0083,0.0318,0.004 +2000-09-30,-0.0776,0.0062,-0.0269,,-0.0332,,-0.0734,-0.0034,,-0.0436,-0.073,-0.045,,,0.0178,-0.006095083,-0.019748163,-0.0283,-0.0453,0.0093,-0.0134,,-0.0208,-0.0324,-0.0678,-0.0284 +2000-10-31,0.0209,0.0254,0.0481,,0.0307,,-0.007,0.0194,,0.0196,0.016,-0.027,,,0.0227,0.046197874,0.009497228,0.0485,0.0951,-0.0031,0.0309,,0.0075,0.0174,0.0323,0.0205 +2000-11-30,0.0733,0.0597,0.0622,,0.0602,,0.1425,0.0581,,0.0905,-0.051,0.0179,,,0.0961,0.037514654,0.122588251,0.0797,0.0858,0.0552,0.0569,,0.0425,0.0675,0.1262,0.0687 +2000-12-31,0.1681,0.0447,0.1362,,0.024,,0.1412,0.0412,,0.089,0.201,0.1078,,,0.1219,0.149152542,0.104380562,0.1805,-0.0018,0.0229,0.0754,,0.0682,0.0885,0.1441,0.0983 +2001-01-31,0.0438,0.0072,0.0071,,-0.009,,0.0063,-0.0042,,-0.0962,0.269,0.0115,,,0.0272,-0.000983284,0.047054891,-0.0522,0.0228,-0.029,0.0206,,0.0025,0.0074,0.01,0.0064 +2001-02-28,0.0056,0.006,-0.013,,0.0133,,-0.0143,0.0252,,0.1876,-0.022,-0.0161,,,0.0749,0.029199475,0.031228539,-0.0543,0.0299,0.0142,0.0269,,-0.0016,0.0013,0.0109,0.0013 +2001-03-31,0.0709,0.0675,0.0881,,0.0678,,0.0263,0.1051,,0.1346,0.032,0.0597,,,0.1296,0.118265859,0.033070988,0.1211,0.1517,0.0613,0.0709,,0.0438,0.0534,0.0833,0.0516 +2001-04-30,-0.0531,-0.0148,-0.0493,,-0.0839,,-0.0735,-0.0688,,-0.1525,-0.07,-0.034,,,-0.1015,-0.06299886,-0.031001517,-0.0559,-0.102,0.0015,-0.0466,,-0.0362,-0.0427,-0.0837,-0.042 +2001-05-31,-0.0261,0.0129,0.0185,,0.0151,,-0.0029,-0.0027,,-0.0066,0.032,0.0126,,,0.0261,-0.021296015,-0.00501932,0.0389,0.0513,0.0194,-0.0049,,0.0081,0.0106,0.0103,0.005 +2001-06-30,-0.0266,-0.0136,-0.0194,,-0.0152,,0.0156,-0.0092,,0.0539,0.038,-0.0181,,,-0.0077,0.014609885,-0.013099608,-0.022,0.0447,0.0118,-0.0268,0.0043,-0.0077,-0.0119,-0.0235,-0.0085 +2001-07-31,0.0066,0.0472,-0.0498,-0.0001,0.0124,,-0.0372,-0.0052,,-0.0126,-0.027,-0.0145,,,-0.0534,0.016237745,0.024119608,0.0368,-0.0285,0.0247,0.0077,0.0172,-0.004,-0.0073,-0.029,-0.0048 +2001-08-31,0.0056,0.0237,0.0328,0.0270027,0.0168,,0.0038,0.0185,,,0.019,-0.0289,,,0.0612,0.008139885,0.038707027,-0.0452,0.0489,0.0169,0.0802,0.0718,0.0153,0.0184,0.0285,0.0205 +2001-09-30,0.0464,0.0782,-0.0264,0.0601811,0.073,,0.0597,0.0984,,,0.093,0.0599,,,0.1991,0.163875598,0.061420024,0.0738,0.0928,0.055,0.0681,0.0837,0.0246,0.016,0.0404,0.0252 +2001-10-31,0.1375,0.0107,0.0405,0.0577753,0.0468,,0.0601,0.0535,,,0.003,0.053,,,0.1032,0.008992806,0.067656211,0.0297,0.0413,0.0307,0.0025,0.0426,0.0336,0.0371,0.0721,0.0384 +2001-11-30,-0.071,-0.0312,-0.0771,-0.0624348,-0.0995,,-0.1103,-0.0316,,,-0.076,-0.0357,,,-0.0882,-0.046600458,-0.074169939,0.0058,-0.1368,-0.0393,-0.0635,-0.0684,-0.0543,-0.0764,-0.136,-0.0696 
+2001-12-31,-0.0515,0.0486,0.0205,0.0029637,0.0357,,0.0288,0.0049,,,-0.084,0.0072,,,0.0355,-0.02457265,-0.019735697,0.1042,-0.005,-0.0166,0.0236,-0.0181,0.0148,0.0257,0.0385,0.0205 +2002-01-31,-0.1013,-0.0117,0.0214,-0.0317665,-0.0086,,0.0042,-0.0249,,,-0.131,-0.046,,,0.0159,-0.035323111,-0.026675217,0.0405,-0.0173,-0.0083,0.004,0.0096,-0.0072,-0.0044,-0.0071,-0.005 +2002-02-28,-0.0604,-0.0069,-0.0419,0.0018121,-0.0198,,-0.0079,-0.0654,,,-0.072,-0.0222,,,-0.0173,-0.046551235,-0.01643611,-0.1371,0.0133,-0.0387,-0.073,-0.0384,-0.0202,-0.0217,-0.0392,-0.0287 +2002-03-31,0.1262,0.02,0.0119,-0.0631188,-0.0176,,-0.0029,0.057,,-0.0752,0.124,0.0314,,,-0.0148,-0.019648705,0.027078247,0.1653,-0.0662,0.037,0.027,-0.0775,0.0009,0.0017,0.0032,0.0021 +2002-04-30,-0.0376,-0.0099,-0.0337,0.0353622,-0.0408,,0.0052,-0.0843,,0.0155,-0.123,-0.0329,,,-0.0959,-0.019131491,-0.003836072,-0.0144,0.0499,-0.0073,0.016,0.0147,-0.0104,-0.0146,-0.0249,-0.0172 +2002-05-31,-0.0396,0.0234,0.0659,0.0187457,0.0372,,0.0037,0.0134,,0.0675,0.003,0.0627,,,0.0546,-0.001857585,0.001136794,-0.0249,0.0151,0.0082,0.0565,-0.0037,0.027,0.038,0.0504,0.0291 +2002-06-30,0.0795,0.0841,0.1209,0.0763969,0.0793,,0.1133,0.0693,,0.0738,0.077,0.0714,,,0.1532,0.129652605,0.083505033,0.0922,0.0775,0.0787,0.1075,0.1072,0.0655,0.0728,0.131,0.0729 +2002-07-31,0.0471,0.0597,0.0573,0.0365166,0.075,,0.0408,0.049,,0.0595,0.122,0.0362,,,0.1744,0.057386052,0.094730375,0.0376,-0.0397,0.0279,0.0527,0.0465,0.0413,0.0277,0.0734,0.0358 +2002-08-31,0.0604,0.0278,0.0154,0.052068,0.0352,,0.0322,0.0426,,0.0544,0.095,0.0149,,,0.0748,0.024668917,0.065068603,0.0083,0.0986,0.0208,0.0148,0.0401,0.022,0.022,0.0386,0.0217 +2002-09-30,0.0763,0.0344,0.06,0.1049737,0.0344,,0.0473,0.0692,,0.0513,0.045,0.0143,,,0.0866,0.082615307,0.086873878,0.069,0.0329,0.0334,0.0335,0.0653,0.0284,0.0218,0.0584,0.0339 +2002-10-31,-0.0796,-0.0272,-0.0692,-0.0447151,-0.0456,,-0.09,-0.0711,,-0.0773,-0.136,-0.0442,,,-0.0609,-0.095973783,-0.046019836,0.0099,-0.1019,-0.0141,-0.0456,-0.0588,-0.0376,-0.0327,-0.0678,-0.0406 +2002-11-30,-0.0069,-0.0171,-0.0352,0.0242593,-0.0114,,-0.0364,-0.0549,,-0.0508,0.053,-0.0043,,,-0.0188,-0.046090109,-0.021584735,-0.035,-0.018,-0.014,-0.0285,-0.0554,-0.0164,-0.0225,-0.0541,-0.0257 +2002-12-31,0.1416,0.0658,0.0694,0.0810749,0.0328,,0.069,0.075,,0.078,0.201,0.0111,,,0.0837,0.104234528,0.103974663,0.1692,0.1841,0.0552,0.0473,0.148,0.0489,0.0472,0.0935,0.0576 +2003-01-31,0.0595,0.0518,0.0399,0.1011867,0.072,,0.0779,0.0418,,0.132,-0.006,0.0716,,,0.0802,0.069321534,0.051773364,-0.0184,0.2418,-0.005,0.0299,0.0978,0.0441,0.0495,0.0764,0.0522 +2003-02-28,0.1195,0.0403,0.0637,0.0779875,0.0746,,0.0516,0.0444,,0.0722,-0.022,0.0521,,,0.0921,0.08091954,0.086873445,0.0616,0.1318,0.0091,0.0399,0.0556,0.0402,0.0472,0.0741,0.0561 +2003-03-31,-0.108,-0.0504,-0.0824,-0.0950606,-0.0434,,-0.0363,-0.0297,,-0.1283,-0.031,-0.0748,,,-0.0619,-0.082730753,-0.083752593,0.0093,-0.0473,0.0082,-0.0452,-0.0556,-0.0445,-0.0552,-0.0912,-0.058 +2003-04-30,0.0245,0.0377,0.0011,0.053471,0.0269,,0.0127,-0.016,,0.0145,-0.032,0.0505,,,0.0208,-0.009738001,-0.000186325,-0.079,0.0202,-0.0362,0.0418,0.0014,0.0065,0.0165,0.0076,0.0161 [TRUNCATED] To get the complete diff run: svnlook diff /svnroot/returnanalytics -r 3560 From noreply at r-forge.r-project.org Sun Nov 23 15:29:57 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Sun, 23 Nov 2014 15:29:57 +0100 (CET) Subject: [Returnanalytics-commits] r3561 - pkg/PerformanceAnalytics/sandbox Message-ID: 
<20141123142957.7CBE4186A0C@r-forge.r-project.org>

Author: peter_carl
Date: 2014-11-23 15:29:57 +0100 (Sun, 23 Nov 2014)
New Revision: 3561

Added:
   pkg/PerformanceAnalytics/sandbox/CDaR.R
   pkg/PerformanceAnalytics/sandbox/CED.R
   pkg/PerformanceAnalytics/sandbox/MaxQL.R
   pkg/PerformanceAnalytics/sandbox/Normalize.R
   pkg/PerformanceAnalytics/sandbox/chart.RankBars.R
Log:
- function stubs and drafts from QWAFAFEW presentation

Added: pkg/PerformanceAnalytics/sandbox/CDaR.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/CDaR.R	(rev 0)
+++ pkg/PerformanceAnalytics/sandbox/CDaR.R	2014-11-23 14:29:57 UTC (rev 3561)
@@ -0,0 +1,57 @@
+#' Calculates Conditional Drawdown at Risk (CDaR)
+#'
+#' @description Conditional Drawdown at Risk (CDaR) is part of a one-parameter
+#' family of risk measures called Conditional Drawdown (CDD). These measures of
+#' risk are functionals of the portfolio drawdown (underwater) curve. For some
+#' value of the tolerance parameter \eqn{\alpha}, in the case of a single
+#' sample path, the drawdown functional is defined as the mean of the worst
+#' (1 - \eqn{\alpha})% drawdowns.
+#' @details
+#' The \bold{CDD} is related to Value-at-Risk (VaR) and Conditional
+#' Value-at-Risk (CVaR) measures studied by Rockafellar and Uryasev. By
+#' definition, with respect to a specified probability level \eqn{\alpha}, the
+#' \bold{\eqn{\alpha}-VaR} of a portfolio is the lowest amount
+#' \eqn{\epsilon(\alpha)} such that, with probability \eqn{\alpha}, the loss
+#' will not exceed \eqn{\epsilon(\alpha)} in a specified time T, whereas the
+#' \bold{\eqn{\alpha}-CVaR} is the conditional expectation of losses above that
+#' amount \eqn{\epsilon}.
+#' The CDD is similar to CVaR and can be viewed as a modification of the CVaR
+#' to the case when the loss-function is defined as a drawdown. CDD and CVaR are
+#' conceptually related percentile-based risk performance functionals.
+#'
+#' Like CVaR and ETL, CDaR is defined as the mean of the worst drawdowns above
+#' a quantile. For example, the 0.90 CDaR is the average of the worst 10%
+#' drawdowns over the period. It is a convex measure, so it is useful in
+#' optimization.
+#' @param R an xts, vector, matrix, data frame, timeSeries or zoo object of asset returns
+#' @param p confidence level for calculation, default p=.95
+#' @return scalar; the Conditional Drawdown at Risk of the series
+#' @author Peter Carl, Shubhankit Mohan
+#' @references Chekhlov, A., Uryasev, S. and Zabarankin, M. (2000). \emph{Portfolio Optimization with Drawdown Constraints.} Research Report #2000-5. Available at SSRN: \url{http://ssrn.com/abstract=223323}
+#' @references Chekhlov, A., Uryasev, S. and Zabarankin, M. (2003) \emph{Drawdown Measure in Portfolio Optimization}. Paper available at SSRN: \url{http://ssrn.com/abstract=544742}
+#' @export
+CDaR <- function(R, p=0.95, ...){
+  # Peter Carl
+  # @TODO this is just the interior function; needs multi-col framework
+  R = checkData(R)
+  R = na.omit(R)
+  nr = nrow(R)
+  # drawdowns, expressed as positive magnitudes
+  dd = coredata(-PerformanceAnalytics:::Drawdowns(R))
+  # sort the drawdown observations in increasing order
+  dd = dd[order(dd)]
+  # result = -(1/((1-p)*nr))*sum(dd[((p)*nr):nr])
+  # the p-quantile of the drawdown distribution
+  dar = quantile(dd, probs=p, type=8)
+  # CDaR is the mean of the drawdowns beyond that quantile
+  result = -1*mean(dd[dd>dar])
+  result
+}
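+
+# Usage sketch (illustrative only): `edhec` is the sample data set that ships
+# with PerformanceAnalytics; any single-column xts of returns would do here.
+#   data(edhec)
+#   CDaR(edhec[, 1], p=0.95)  # mean of the worst 5% of drawdowns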
+
+###############################################################################
+# R (http://r-project.org/) Econometrics for Performance and Risk Analysis
+#
+# Copyright (c) 2004-2014 Peter Carl and Brian G. Peterson
+#
+# This R package is distributed under the terms of the GNU Public License (GPL)
+# for full details see the file COPYING
+#
+# $Id: $
+#
+###############################################################################
\ No newline at end of file

Added: pkg/PerformanceAnalytics/sandbox/CED.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/CED.R	(rev 0)
+++ pkg/PerformanceAnalytics/sandbox/CED.R	2014-11-23 14:29:57 UTC (rev 3561)
@@ -0,0 +1,40 @@
+
+#' Calculates Conditional Expected Drawdown (CED)
+#'
+#' CED is defined as the tail mean of the distribution of maximum drawdowns.
+#' It is analogous to ETL, but accounts for autocorrelation. It is convex and
+#' supports risk attribution, but is not a monetary measure. It is calculated
+#' from the distribution of rolling maximum drawdowns.
+#' @param R an xts, vector, matrix, data frame, timeSeries or zoo object of asset returns
+#' @param p confidence level for calculation, default p=.95
+#' @return a named vector with one CED value per column of \code{R}
+#' @references Goldberg, L. and Mahmoud, O. (2014). On a Convex Measure of Drawdown Risk. Available at SSRN: http://ssrn.com/abstract=2430918

+CED <- function(R, p=0.95, ...){
+  # @TODO this is an interior function only; add multi-col support
+  x.R = checkData(R)
+  # Rolling 12-month max DD for each column
+  x.rMDD = NULL # initialize rolling maximum drawdown results
+  for(i in colnames(x.R)){
+    x.rMDD1 <- rollapply(na.omit(x.R[,i]), width = 12, align="right", FUN=maxDrawdown)
+    x.rMDD = cbind(x.rMDD, x.rMDD1)
+  }
+  # this is the p-quantile of the rolling MDD observations
+  x.qrMDD = apply(x.rMDD, MARGIN=2, FUN=quantile, probs=p, na.rm=TRUE)
+  x.CED = NULL # Calculate the CED from the rolling MDD obs > quantile MDD
+  for(i in 1:NCOL(x.R)){
+    .CED = mean(x.rMDD[x.rMDD[,i] > x.qrMDD[i], i], na.rm=TRUE)
+    x.CED = c(x.CED, .CED)
+  }
+  names(x.CED) = colnames(x.R)
+  return(x.CED)
+}
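+
+# Usage sketch (illustrative only): returns one CED estimate per column of R.
+#   data(edhec)         # PerformanceAnalytics sample data
+#   CED(edhec, p=0.95)  # tail mean of rolling 12-month maximum drawdowns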
+
+###############################################################################
+# R (http://r-project.org/) Econometrics for Performance and Risk Analysis
+#
+# Copyright (c) 2004-2014 Peter Carl and Brian G. Peterson
+#
+# This R package is distributed under the terms of the GNU Public License (GPL)
+# for full details see the file COPYING
+#
+# $Id: $
+#
+###############################################################################
\ No newline at end of file

Added: pkg/PerformanceAnalytics/sandbox/MaxQL.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/MaxQL.R	(rev 0)
+++ pkg/PerformanceAnalytics/sandbox/MaxQL.R	2014-11-23 14:29:57 UTC (rev 3561)
@@ -0,0 +1,130 @@
+#' @name MaxQL
+#' Max Quantile Loss at Confidence level MLdP
+#'
+#' @param R an xts, vector, matrix, data frame, timeSeries or zoo object of asset returns
+#' @param p confidence level for calculation, default p=.95
+#' @param method Either "ac" (default) or "normal", see details.
+#'
+#' Up to how much could a particular strategy lose with a given confidence level, regardless of the time horizon/number of bets involved?
+#' This gives closed-form expressions for the quantile-loss potential associated with a significance level. The function provides two methods: the first, "normal," assumes iid returns, and the second, "ac," incorporates first-order autoregressive conditions.
+#' As a parametric function, this is computationally efficient, and useful for optimization.
+#' Analogous to VaR.
+#' @references Bailey, D. and Lopez de Prado, M. (2014). Stop-Outs Under Serial Correlation and 'The Triple Penance Rule'. Journal of Risk, Forthcoming. Available at SSRN: http://ssrn.com/abstract=2201302

+#' @export
+MaxQL<-function(R, p = 0.95, method = c("ac","normal"), ...) {
+  # @TODO Handle multi-column output
+  if(!is.vector(R))
+    x = checkData(R[,1])
+  else
+    x = checkData(R)
+  x = na.omit(x)
+  if(method[1]=="ac"){
+    mu = mean(x, na.rm = TRUE)
+    sigma_infinity = StdDev(x)
+    # first-order autocorrelation of the return series
+    phi = as.numeric(cov(x[-1], x[-length(x)])/var(x[-length(x)]))
+    sigma = sigma_infinity*((1-phi^2)^0.5)
+    dPi0 = 0
+    q.res = minQ(phi, mu, sigma, dPi0, p)
+  }
+  if(method[1]=="normal"){
+    q.res = minQ_norm(x, p)
+  }
+  MaxQL = min(0, q.res[[1]])
+  # rownames(MaxQL) = paste0("MaxQL (", p*100, "%)") #,"t*")
+  return(MaxQL)
+}
+
+# --------------------------------------------------------------------
+# Max Quantile Loss at Confidence level - First-order AC
+# --------------------------------------------------------------------
+getQ <- function(bets, phi, mu, sigma, dPi0, confidence) {
+  # Compute analytical solution to quantile
+  #1) Mean (eq 15)
+  mean=(phi^(bets+1)-phi)/(1-phi)*(dPi0-mu)+mu*bets # wrong?
+  #2) Variance (eq 15)
+  var=sigma^2/(phi-1)^2
+  var=var*((phi^(2*(bets+1))-1)/(phi^2-1)-2*(phi^(bets+1)-1)/(phi-1)+bets+1)
+  #3) Quantile
+  q=mean+qnorm(1-confidence)*(var^0.5)
+  #print(sprintf("bets %g, mean %g, var %g, q %g", bets, mean, var, q))
+  q
+}
+
+goldenSection<-function(a, b, FUN, minimum = TRUE, ...) {
+  # golden-section search for the extremum of FUN on the interval [a, b]
+  FUN = match.fun(FUN)
+  tol = 10^-9
+  sign = 1
+
+  if(minimum) sign = -1
+  N = round(ceiling(-2.078087*log(tol/abs(b-a))))
+  r = 0.618033989
+  c = 1.0 - r
+  x1 = r*a + c*b
+  x2 = c*a + r*b
+  f1 = sign * FUN(x1,...=...)
+  f2 = sign * FUN(x2,...=...)
+  #print(f1); print(f2)
+  for(i in 1:N){
+    if(f1>f2){
+      a = x1
+      x1 = x2
+      f1 = f2
+      x2 = c*a+r*b
+      f2 = sign*FUN(x2,...=...)
+    } else {
+      b = x2
+      x2 = x1
+      f2 = f1
+      x1 = r*a + c*b
+      f1 = sign*FUN(x1,...=...)
+    }
+  }
+  if(f1 < f2){
+    return(list(minQ=sign*f1, t=x1))
+  } else {
+    return(list(minQ=sign*f2, t=x2))
+  }
+}
+
+minQ <- function(phi, mu, sigma, dPi0, confidence) {
+  # find the first number of bets at which the quantile turns positive, then
+  # search that bracket for the minimum quantile
+  q = 0
+  bets = 0
+  while (q <= 0) {
+    bets = bets + 1
+    q = getQ(bets, phi, mu, sigma, dPi0, confidence)
+  }
+  #print(sprintf("bets %g, q %g", bets, q))
+  goldenSection(0,bets,getQ,FALSE,phi=phi,mu=mu,sigma=sigma,dPi0=dPi0,confidence=confidence)
+}
+
+# minQ(0.5, 1, 2, 1, 0.95)
+# MinQ = -9.15585580378
+# Time at MinQ = 12.4832517718
+
+# --------------------------------------------------------------------
+# Max Quantile Loss at Confidence level - Assuming IID (eq. 5)
+# --------------------------------------------------------------------
+
+minQ_norm<-function(x, confidence){
+  # Calculate the maximum drawdown for a normal distribution, assuming iid returns
+  x = na.omit(x)
+  sd = StdDev(x)
+  mu = mean(x, na.rm = TRUE)
+  minQ = -((qnorm(1-confidence)*sd)^2)/(4*mu)
+  t = ((qnorm(1-confidence)*sd)/(2*mu))^2
+  return(list(minQ=minQ,t=t))
+}
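+
+# Usage sketch (illustrative only; assumes a single-column returns object
+# with a positive mean, as the "ac" search requires):
+#   data(edhec)
+#   MaxQL(edhec[, 1], p=0.95, method="ac")     # AR(1) closed-form quantile loss
+#   MaxQL(edhec[, 1], p=0.95, method="normal") # iid-normal version (eq. 5)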
+
+###############################################################################
+# R (http://r-project.org/) Econometrics for Performance and Risk Analysis
+#
+# Copyright (c) 2004-2014 Peter Carl and Brian G. Peterson
+#
+# This R package is distributed under the terms of the GNU Public License (GPL)
+# for full details see the file COPYING
+#
+# $Id: $
+#
+###############################################################################
\ No newline at end of file

Added: pkg/PerformanceAnalytics/sandbox/Normalize.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Normalize.R	(rev 0)
+++ pkg/PerformanceAnalytics/sandbox/Normalize.R	2014-11-23 14:29:57 UTC (rev 3561)
@@ -0,0 +1,51 @@
+# Two functions:
+# Return.normalize
+# chart.NDD
+
+#' Calculate mean and volatility normalized time series
+#'
+#'
+#' @param R an xts, vector, matrix, data frame, timeSeries or zoo object of asset returns
+#' @param targetMean scalar; the mean to which each series is rescaled, default 0
+#' @param targetVol scalar; the per-period volatility to which each series is rescaled, default 0
+#' @param ... passes arguments to par
+#' @return xts or other time series
+#' @author Peter Carl
+#' @references Burghardt, G. Duncan, R., and Liu, L. (2003). Deciphering Drawdowns. Risk Magazine. Available at: http://www.risk.net/data/risk/pdf/investor/0903_risk.pdf
+#' @rdname Return.normalize
+#' @export Return.normalize
+#' @export chart.NDD
+Return.normalize <- function(R, targetMean=0, targetVol=0, ...){
+  # Peter Carl
+  x = checkData(R)
+  # @TODO will this work for a vector? checkData as matrix?
+  # Apply z-score
+  x.Z = apply(x, MARGIN=2, FUN=function(x){ (x - mean(x, na.rm = TRUE))/sd(x, na.rm = TRUE) }) # x.Z has mean=0, sd=1
+  # rescale to the target mean and volatility
+  x.N = targetMean + x.Z * (rep(1, nrow(x)) %o% rep(targetVol, NCOL(x)))
+  x.N = xts(x.N, order.by=index(x))
+  x.N = reclass(x.N, R)
+  return(x.N)
+}
+
+chart.NDD <- function(R, targetMean=0, targetVol=0){
+  # Peter Carl
+  # chart the drawdowns of the normalized series
+  x.N = Return.normalize(R, targetMean=targetMean, targetVol=targetVol)
+  x.NDD = PerformanceAnalytics:::Drawdowns(x.N)
+  op <- par(mar = c(3, 5, 2, 3)+0.1) #c(bottom, left, top, right)
+  chart.TimeSeries(x.NDD, legend.loc=NULL, main="", cex.axis=1.2, cex.lab=1.5)
+  par(op)
+}
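+
+# Usage sketch (illustrative only): rescale each series to zero mean and a
+# common 4% monthly volatility, then chart the normalized drawdowns.
+#   data(edhec)
+#   x.N <- Return.normalize(edhec, targetMean=0, targetVol=0.04)
+#   chart.NDD(edhec, targetMean=0, targetVol=0.04)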
+
+###############################################################################
+# R (http://r-project.org/) Econometrics for Performance and Risk Analysis
+#
+# Copyright (c) 2004-2014 Peter Carl and Brian G. Peterson
+#
+# This R package is distributed under the terms of the GNU Public License (GPL)
+# for full details see the file COPYING
+#
+# $Id: $
+#
+###############################################################################
\ No newline at end of file

Added: pkg/PerformanceAnalytics/sandbox/chart.RankBars.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/chart.RankBars.R	(rev 0)
+++ pkg/PerformanceAnalytics/sandbox/chart.RankBars.R	2014-11-23 14:29:57 UTC (rev 3561)
@@ -0,0 +1,28 @@
+# Generalize this to take any function, determine the rankings, and draw a bar plot in order of the ranking
+# Peter Carl
+chart.RankBars <- function(R, FUN="mean", ...){
+  # @todo split dots between the function and par
+  # @todo rank on FUN; this draft is hard-wired to trailing autocorrelation
+  if(!is.function(FUN))
+    FUN = match.fun(FUN)
+  x.R = checkData(R)
+  colorset = 1:NCOL(x.R) # default palette; should become an argument
+  t.AC = table.Autocorrelation(last(x.R, 36))
+  y = colSums(t.AC[1:6,])
+  layout(matrix(1:2,ncol=1), heights=c(3,2))
+  op <- par(mar = c(1, 5, 3, 3)+0.1) #c(bottom, left, top, right)
+  barplot(y[order(y)], col=colorset[order(y)], border = NA, axisnames=FALSE, ylim=range(pretty(y)), cex.axis=1.2, cex.lab=1.5, cex.main=2, ylab="Sum of Lag 1-6 AC", main="Trailing 36-month Autocorrelation")
+  box()
+  barplot(as.numeric(t.AC[7,order(y)]), col=colorset[order(y)], ylim=range(pretty(c(0,1))), axisnames=FALSE, border=NA, cex.axis=1.2, cex.lab=1.5, ylab="Q(6) p-value")
+  box()
+  par(op)
+}
+
+###############################################################################
+# R (http://r-project.org/) Econometrics for Performance and Risk Analysis
+#
+# Copyright (c) 2004-2014 Peter Carl and Brian G. Peterson
+#
+# This R package is distributed under the terms of the GNU Public License (GPL)
+# for full details see the file COPYING
+#
+# $Id: $
+#
+###############################################################################
\ No newline at end of file

From noreply at r-forge.r-project.org  Tue Nov 25 16:07:07 2014
From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org)
Date: Tue, 25 Nov 2014 16:07:07 +0100 (CET)
Subject: [Returnanalytics-commits] r3562 - in pkg/FactorAnalytics: . R man
	vignettes
Message-ID: <20141125150707.907A4187841@r-forge.r-project.org>

Author: pragnya
Date: 2014-11-25 16:07:07 +0100 (Tue, 25 Nov 2014)
New Revision: 3562

Added:
   pkg/FactorAnalytics/R/fitSfm.R
   pkg/FactorAnalytics/man/fitSfm.Rd
Modified:
   pkg/FactorAnalytics/DESCRIPTION
   pkg/FactorAnalytics/NAMESPACE
   pkg/FactorAnalytics/R/Misc.R
   pkg/FactorAnalytics/R/fitTsfm.control.R
   pkg/FactorAnalytics/man/CornishFisher.Rd
   pkg/FactorAnalytics/man/fitTsfm.Rd
   pkg/FactorAnalytics/man/fitTsfm.control.Rd
   pkg/FactorAnalytics/man/fmCov.Rd
   pkg/FactorAnalytics/man/fmEsDecomp.Rd
   pkg/FactorAnalytics/man/fmSdDecomp.Rd
   pkg/FactorAnalytics/man/fmVaRDecomp.Rd
   pkg/FactorAnalytics/man/paFm.Rd
   pkg/FactorAnalytics/man/plot.pafm.Rd
   pkg/FactorAnalytics/man/plot.tsfm.Rd
   pkg/FactorAnalytics/man/predict.tsfm.Rd
   pkg/FactorAnalytics/man/print.pafm.Rd
   pkg/FactorAnalytics/man/print.tsfm.Rd
   pkg/FactorAnalytics/man/summary.pafm.Rd
   pkg/FactorAnalytics/man/summary.tsfm.Rd
   pkg/FactorAnalytics/vignettes/FA.bib
Log:
Refactored & updated fitSfm. Roxygen2 update modified Rd files.
Modified: pkg/FactorAnalytics/DESCRIPTION =================================================================== --- pkg/FactorAnalytics/DESCRIPTION 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/DESCRIPTION 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,8 +1,8 @@ Package: factorAnalytics Type: Package Title: Factor Analytics -Version: 2.0.1 -Date: 2014-11-03 +Version: 2.0.2 +Date: 2014-11-23 Author: Eric Zivot, Yi-An Chen and Sangeetha Srinivasan Maintainer: Sangeetha Srinivasan Description: An R package for the estimation and risk analysis of linear factor Modified: pkg/FactorAnalytics/NAMESPACE =================================================================== --- pkg/FactorAnalytics/NAMESPACE 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/NAMESPACE 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,6 +1,8 @@ -# Generated by roxygen2 (4.0.1): do not edit by hand +# Generated by roxygen2 (4.0.2): do not edit by hand +S3method(coef,sfm) S3method(coef,tsfm) +S3method(fitted,sfm) S3method(fitted,tsfm) S3method(fmCov,tsfm) S3method(fmEsDecomp,tsfm) @@ -12,10 +14,12 @@ S3method(print,pafm) S3method(print,summary.tsfm) S3method(print,tsfm) +S3method(residuals,sfm) S3method(residuals,tsfm) S3method(summary,pafm) S3method(summary,tsfm) export(dCornishFisher) +export(fitSfm) export(fitTsfm) export(fmCov) export(fmEsDecomp) Modified: pkg/FactorAnalytics/R/Misc.R =================================================================== --- pkg/FactorAnalytics/R/Misc.R 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/R/Misc.R 2014-11-25 15:07:07 UTC (rev 3562) @@ -13,4 +13,5 @@ #' @importFrom sandwich vcovHC.default vcovHAC.default #' @importFrom lattice barchart panel.barchart panel.grid #' @importFrom corrplot corrplot -#' @importFrom strucchange efp \ No newline at end of file +#' @importFrom strucchange efp +#' @importFrom MASS ginv \ No newline at end of file Added: pkg/FactorAnalytics/R/fitSfm.R =================================================================== --- pkg/FactorAnalytics/R/fitSfm.R (rev 0) +++ pkg/FactorAnalytics/R/fitSfm.R 2014-11-25 15:07:07 UTC (rev 3562) @@ -0,0 +1,410 @@ +#' @title Fit a statistical factor model using principal component analysis +#' +#' @description Fits a statistical factor model using principal component +#' analysis for one or more asset returns or excess returns. When the number of +#' assets exceeds the number of time periods, APCA (Asymptotic Principal +#' Component Analysis) is performed. This function is based on the S+FinMetric +#' function \code{mfactor}. An object of class \code{"sfm"} is returned. +#' +#' @details +#' If \code{data} is not of class \code{"xts"}, rownames must provide an +#' \code{xts} compatible time index. If the data contains missing values, +#' \code{na.rm} should be set to \code{TRUE} to remove NAs. +#' +#' Let \code{N} be the number of columns or assets and \code{T} be the number +#' of rows or observations. When \code{N < T}, Principal Component Analysis +#' (PCA) is performed. Otherwise, Asymptotic Principal Component Analysis +#' (APCA) is performed. In either case, any number of factors less than +#' \code{min(N,T)} can be chosen via argument \code{k}. Default is 1. Refer to +#' Zivot and Wang (2007) for more details and references. +#' +#' Alternately, for APCA, a method to determine the number of factors can be +#' specified: \code{k="bn"} corresponds to Bai and Ng (2002) and \code{k="ck"} +#' corresponds to Connor and Korajczyk (1993). 
User can specify the maximum
+#' number of factors, \code{max.k} to consider with these methods. If not, a
+#' default maximum is calculated from \code{min(10, T-1)}.
+#'
+#' \code{refine} specifies whether a refinement of the APCA procedure (that may
+#' improve efficiency) from Connor and Korajczyk (1988) is to be used.
+#'
+#' If \code{check=TRUE}, a warning is issued if any asset is found to have
+#' identical observations.
+#'
+#' Note about NAs: Before model fitting, incomplete cases in \code{data} are
+#' removed using \code{\link[stats]{na.omit}}. Otherwise, all observations are
+#' included.
+#'
+#' @param data vector, matrix, data.frame, xts, timeSeries or zoo object with
+#' asset returns. See details.
+#' @param k number of factors; a number (or) a method for determining the
+#' optimal number of factors, one of "bn" or "ck". See details. Default is 1.
+#' @param max.k scalar; the maximum number of factors to be considered for
+#' methods "bn" or "ck". Default is \code{NULL}. See details.
+#' @param refine logical; whether to use the Connor-Korajczyk refinement for
+#' APCA. Default is \code{TRUE}.
+#' @param sig scalar; desired level of significance when "ck" method is
+#' specified. Default is 0.05.
+#' @param check logical; to check if any asset has identical observations.
+#' Default is \code{FALSE}.
+#' @param ... arguments passed to other functions.
+#'
+#' @return fitSfm returns an object of class \code{"sfm"} for which
+#' \code{print}, \code{plot}, \code{predict} and \code{summary} methods exist.
+#'
+#' The generic accessor functions \code{coef}, \code{fitted} and
+#' \code{residuals} extract various useful features of the fit object.
+#' Additionally, \code{fmCov} computes the covariance matrix for asset returns
+#' based on the fitted factor model.
+#'
+#' An object of class \code{"sfm"} is a list containing the following
+#' components:
+#' \item{asset.fit}{list of fitted objects of class \code{lm} for each asset,
+#' from the time-series OLS regression of asset returns on estimated factors.}
+#' \item{k}{number of factors; as input or determined by "ck" or "bn" methods.}
+#' \item{factors}{T x K xts object of estimated factor realizations.}
+#' \item{loadings}{N x K matrix of factor loadings estimated by
+#' regressing the asset returns on estimated factors.}
+#' \item{alpha}{length-N vector of estimated alphas.}
+#' \item{r2}{length-N vector of R-squared values.}
+#' \item{resid.sd}{length-N vector of residual standard deviations.}
+#' \item{residuals}{T x N xts object of residuals from the OLS regression.}
+#' \item{Omega}{M x M return covariance matrix estimated by the factor model,
+#' where M = min(N,T).}
+#' \item{eigen}{length-K vector of eigenvalues of the sample covariance matrix.}
+#' \item{mimic}{N x K matrix of factor mimicking portfolio weights.}
+#' \item{call}{the matched function call.}
+#' \item{data}{T x N xts data object containing the asset returns.}
+#' \item{asset.names}{length-N vector of column names from data.}
+#' Where N is the number of assets, K is the number of factors, and T is the
+#' number of observations.
+#'
+#' @author Eric Zivot, Sangeetha Srinivasan and Yi-An Chen
+#'
+#' @references
+#' Bai, J., & Ng, S. (2002). Determining the number of factors in approximate
+#' factor models. Econometrica, 70(1), 191-221.
+#'
+#' Connor, G., & Korajczyk, R. A. (1988). Risk and return in an equilibrium
+#' APT: Application of a new test methodology. Journal of Financial Economics,
+#' 21(2), 255-289.
+#'
+#' Connor, G., & Korajczyk, R. A. (1993). A test for the number of factors in
+#' an approximate factor model. The Journal of Finance, 48(4), 1263-1291.
+#'
+#' Zivot, E., & Wang, J. (2007). Modeling Financial Time Series with S-PLUS
+#' (Vol. 191). Springer.
+#'
+#'
+#' @seealso The \code{sfm} methods for generic functions:
+#' \code{\link{plot.sfm}}, \code{\link{predict.sfm}},
+#' \code{\link{print.sfm}} and \code{\link{summary.sfm}}.
+#'
+#' And, the following extractor functions: \code{\link[stats]{coef}},
+#' \code{\link[stats]{fitted}}, \code{\link[stats]{residuals}},
+#' \code{\link{fmCov}}, \code{\link{fmSdDecomp}}, \code{\link{fmVaRDecomp}}
+#' and \code{\link{fmEsDecomp}}.
+#'
+#' \code{\link{paFm}} for Performance Attribution.
+#'
+#' @examples
+#'
+#' # load data for fitSfm.r
+#' data(stat.fm.data)
+#' # data is from S+FinMetrics berndt.dat and folio.dat
+#'
+#' # PCA is performed on sfm.dat and APCA on sfm.apca.dat
+#' class(sfm.dat)
+#' class(sfm.apca.dat)
+#'
+#' # pca
+#' args(fitSfm)
+#' sfm.pca.fit <- fitSfm(sfm.dat, k=2)
+#' class(sfm.pca.fit)
+#' names(sfm.pca.fit)
+#' head(sfm.pca.fit$factors)
+#' head(sfm.pca.fit$loadings)
+#' sfm.pca.fit$r2
+#' sfm.pca.fit$resid.sd
+#' sfm.pca.fit$mimic
+#'
+#' # apca with number of factors, k=15
+#' # sfm.apca.fit <- fitSfm(sfm.apca.dat, k=15, refine=TRUE)
+#'
+#' # apca with the Bai & Ng method
+#' sfm.apca.fit.bn <- fitSfm(sfm.apca.dat, k="bn")
+#'
+#' # apca with the Connor-Korajczyk method
+#' # sfm.apca.fit.ck <- fitSfm(sfm.apca.dat, k="ck")
+#'
+#' @importFrom PerformanceAnalytics checkData
+#'
+#' @export
+
+fitSfm <- function(data, k=1, max.k=NULL, refine=TRUE, sig=0.05, check=FALSE) {
+
+  # record the call as an element to be returned
+  call <- match.call()
+
+  # check input data type and format and coerce to desired type for use
+  R.xts <- checkData(data, method="xts")
+  R.mat <- coredata(R.xts)
+
+  # remove NAs
+  R.mat <- na.omit(R.mat)
+
+  # dim and dimnames of R.mat
+  n <- ncol(R.mat)
+  obs <- nrow(R.mat)
+  if (is.null(dimnames(data))) {
+    dimnames(R.mat) <- list(1:obs, paste("V", 1:n, sep = "."))
+    colnames(R.xts) <- paste("V", 1:n, sep = ".")
+  }
+
+  # check input validity for argument k
+  if (is.numeric(k)) {
+    if (k <= 0 || round(k) != k) {
+      stop("Invalid argument: k, the number of factors, must be a positive
+           integer.")
+    } else if (k >= min(n,obs)) {
+      stop("Invalid argument: k, the number of factors, must be less than the
+           number of variables.")
+    }
+  } else if (is.character(k) && (n > obs)) {
+    if (!(k %in% c("bn","ck"))) {
+      stop("Invalid argument: Method for determining the number of factors for
+           APCA must be one of 'ck' or 'bn'.")
+    }
+  } else {
+    stop("Invalid argument: k, the number of factors, must either be a positive
+         integer or methods 'ck' or 'bn'. The latter methods are relevant only
+         for APCA (when the number of assets >= the number of observations).")
+  }
+
+  # check input validity or assign default for argument max.k
+  if (is.null(max.k)) {
+    max.k <- min(10, obs - 1)
+  } else if (max.k >= obs) {
+    stop("Invalid argument: max.k must be less than the number of observations")
+  }
+
+  # check if any asset has identical observations
+  temp <- apply(data, 2, range)
+  if(any(abs(temp[2, ] - temp[1, ]) < .Machine$single.eps)) {
+    warning("Some variables have identical observations.")
+  }
+
+  # select method to estimate factors based on k and n
+  # in each case a partial list of return values is obtained
+  if (n < obs) {
+    result <- UsePCA(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs)
+  } else if (k == "ck") {
+    result <- UseAPCA_ck(R.xts=R.xts, R.mat=R.mat, max.k=max.k, refine=refine,
+                         sig=sig, n=n, obs=obs)
+  } else if (k == "bn") {
+    result <- UseAPCA_bn(R.xts=R.xts, R.mat=R.mat, max.k=max.k, refine=refine,
+                         n=n, obs=obs)
+  } else {
+    result <- UseAPCA(R.xts=R.xts, R.mat=R.mat, k=k, refine=refine, n=n,
+                      obs=obs)
+  }
+
+  # create list of return values.
+  input <- list(call=call, data=R.xts, asset.names=colnames(R.xts))
+  result <- c(result, input)
+  class(result) <- "sfm"
+  return(result)
+}
+
+### Principal Component Analysis when N < T
+#
+UsePCA <- function(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs) {
+
+  # demean TxN matrix of returns
+  R.mat.d <- t(t(R.mat) - colMeans(R.mat))
+  # NxN return covariance matrix
+  Omega.N <- crossprod(R.mat.d)/obs
+  # get eigen decomposition
+  eig.decomp <- eigen(Omega.N, symmetric=TRUE)
+  eig.val <- eig.decomp$values
+  X <- eig.decomp$vectors[, 1:k, drop=FALSE] # NxK
+  dimnames(X) <- list(colnames(R.xts), paste("F", 1:k, sep = "."))
+  # get TxK factor realizations
+  f <- R.mat %*% X
+  colnames(f) <- paste("F", 1:k, sep = ".")
+
+  # OLS time series regression to get B: NxK matrix of factor loadings
+  f <- xts(f, index(R.xts))
+  asset.fit <- lm(R.xts ~ f)
+  B <- t(coef(asset.fit)[-1, , drop=FALSE])
+  alpha <- coef(asset.fit)[1,]
+
+  # extract r2, residual SD and residuals
+  resid.xts <- do.call(merge, sapply(X=summary(asset.fit), FUN="[", "residuals"))
+  r2 <- as.numeric(sapply(X=summary(asset.fit), FUN="[", "r.squared"))
+  resid.sd <- as.numeric(sapply(X=summary(asset.fit), FUN="[", "sigma"))
+
+  # compute factor model return covariance: NxN
+  Omega.fm <- B %*% var(f) %*% t(B) + diag(resid.sd^2)
+
+  # compute factor mimicking portfolio weights: NxK
+  mimic <- X / colSums(X)
+
+  # assign row and column names
+  names(eig.val) = names(r2) = names(resid.sd) = colnames(R.xts)
+
+  # return list
+  list(asset.fit=asset.fit, k=k, factors=f, loadings=B, alpha=alpha, r2=r2,
+       resid.sd=resid.sd, residuals=resid.xts, Omega=Omega.fm, eigen=eig.val,
+       mimic=mimic)
+}
+
+
+### Asymptotic Principal Component Analysis when N >= T
+#
+UseAPCA <- function(R.xts=R.xts, R.mat=R.mat, k=k, refine=refine, n=n, obs=obs) {
+
+  # demean TxN matrix of returns
+  R.mat.d <- t(t(R.mat) - colMeans(R.mat))
+  # TxT return covariance matrix
+  Omega.T <- tcrossprod(R.mat.d)/n
+  # get eigen decomposition
+  eig.decomp <- eigen(Omega.T, symmetric=TRUE)
+  eig.val <- eig.decomp$values
+  # get TxK factor realizations
+  X <- eig.decomp$vectors[, 1:k, drop=FALSE] # TxK
+  dimnames(X) <- list(1:obs, paste("F", 1:k, sep = "."))
+
+  # OLS time series regression to get B: NxK matrix of factor loadings
+  f <- xts(X, index(R.xts))
+  asset.fit <- lm(R.xts ~ f)
+  B <- t(coef(asset.fit)[-1, , drop=FALSE])
+  alpha <- coef(asset.fit)[1,]
+
+  # estimate residual standard deviations
+  resid.sd <- as.numeric(sapply(X=summary(asset.fit), FUN="[", "sigma"))
+
+  if (refine) {
+    R.mat.rescaled <- t(R.mat.d)/resid.sd
+    Omega.T <- crossprod(R.mat.rescaled)/n
+    eig.decomp <- eigen(Omega.T, symmetric=TRUE)
+    eig.val <- eig.decomp$values
+    X <- eig.decomp$vectors[, 1:k, drop=FALSE]
+    dimnames(X) <- list(1:obs, paste("F", 1:k, sep = "."))
+    f <- xts(X, index(R.xts))
+    asset.fit <- lm(R.xts ~ f)
+    B <- t(coef(asset.fit)[-1, , drop=FALSE])
+    alpha <- coef(asset.fit)[1,]
+    resid.sd <- as.numeric(sapply(X=summary(asset.fit), FUN="[", "sigma"))
+  }
+
+  # compute factor model return covariance: NxN
+  Omega.fm <- B %*% var(f) %*% t(B) + diag(resid.sd^2)
+
+  # compute factor mimicking portfolio weights
+  mimic <- X / colSums(X)
+
+  # extract r2, residuals
+  resid.xts <- do.call(merge, sapply(X=summary(asset.fit), FUN="[", "residuals"))
+  r2 <- as.numeric(sapply(X=summary(asset.fit), FUN="[", "r.squared"))
+
+  # assign row and column names
+  names(eig.val) = 1:obs
+  names(r2) = names(resid.sd) = colnames(R.xts)
+
+  # return list
+  list(asset.fit=asset.fit, k=k, factors=f, loadings=B, alpha=alpha, r2=r2,
+       resid.sd=resid.sd, residuals=resid.xts, Omega=Omega.fm, eigen=eig.val,
+       mimic=mimic)
+}
+
+
+### Asymptotic Principal Component Analysis using 'ck' method to determine k
+#
+UseAPCA_ck <- function(R.xts=R.xts, R.mat=R.mat, max.k=max.k, refine=refine,
+                       sig=sig, n=n, obs=obs) {
+
+  idx <- 2*(1:(obs/2))
+
+  # dof-adjusted squared residuals for k=1
+  fit <- UseAPCA(R.xts=R.xts, R.mat=R.mat, k=1, n=n, obs=obs, refine=refine)
+  eps2 <- fit$residuals^2 / (1-2/obs-1/n)
+
+  for (k in 2:max.k) {
+    f <- fit
+    mu <- rowMeans(eps2[idx-1,,drop=FALSE])
+    # dof-adjusted squared residuals for k
+    fit <- UseAPCA(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs, refine=refine)
+    eps2.star <- fit$residuals^2 / (1-(k+1)/obs-k/n)
+    mu.star <- rowMeans(eps2.star[idx,,drop=FALSE])
+    # cross sectional differences in sqd. errors btw odd & even time periods
+    delta <- mu - mu.star
+    # test for a positive mean value for Delta
+    if(t.test(delta, alternative="greater")$p.value > sig) {return(f)}
+    eps2 <- eps2.star
+  }
+  return(fit)
+}
+
+
+### Asymptotic Principal Component Analysis using 'bn' method to determine k
+#
+UseAPCA_bn <- function(R.xts=R.xts, R.mat=R.mat, max.k=max.k, refine=refine,
+                       n=n, obs=obs) {
+  # initialize sigma
+  sigma <- rep(NA, max.k)
+
+  for (k in 1:max.k) {
+    # fit APCA for k factors
+    fit <- UseAPCA(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs, refine=refine)
+    # get cross-sectional average of residual variances
+    sigma[k] <- mean(fit$resid.sd^2)
+  }
+
+  idx <- 1:max.k
+  # Preferred criteria PC_p1 and PC_p2
+  PC_p1 <- sigma[idx] + idx*sigma[max.k]*(n+obs)/(n*obs)*log((n*obs)/(n+obs))
+  PC_p2 <- sigma[idx] + idx*sigma[max.k]*(n+obs)/(n*obs)*log(min(n,obs))
+
+  if(order(PC_p1)[1] != order(PC_p2)[1]) {
+    warning("PC_p1 and PC_p2 did not yield the same result. The smaller one was
+            used.")
+  }
+  k <- min(order(PC_p1)[1], order(PC_p2)[1])
+  UseAPCA(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs, refine=refine)
+}
+
+
+#' @param object a fit object of class \code{sfm} which is returned by
+#' \code{fitSfm}

+#' @rdname fitSfm
+#' @method coef sfm
+#' @export

+coef.sfm <- function(object, ...) {
+  # cbind alpha and beta
+  coef.mat <- cbind(object$alpha, object$loadings)
+  # name for alpha/intercept column
+  colnames(coef.mat)[1] <- "(Intercept)"
+  return(coef.mat)
+}

+#' @rdname fitSfm
+#' @method fitted sfm
+#' @export

+fitted.sfm <- function(object, ...) 
{ + fitted.xts <- object$data - object$residuals + return(fitted.xts) +} + +#' @rdname fitSfm +#' @method residuals sfm +#' @export + +residuals.sfm <- function(object, ...) { + return(object$residuals) +} Modified: pkg/FactorAnalytics/R/fitTsfm.control.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/R/fitTsfm.control.R 2014-11-25 15:07:07 UTC (rev 3562) @@ -2,7 +2,7 @@ #' #' @description Creates a list of control parameters for \code{\link{fitTsfm}}. #' All control parameters that are not passed to this function are set to -#' default values. +#' default values. This function is meant for internal use only!! #' #' @details This control function is used to process optional arguments passed #' via \code{...} to \code{fitTsfm}. These arguments are validated and defaults Modified: pkg/FactorAnalytics/man/CornishFisher.Rd =================================================================== --- pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/CornishFisher.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{dCornishFisher} \alias{Cornish-Fisher} \alias{dCornishFisher} @@ -16,20 +16,20 @@ rCornishFisher(n, sigma, skew, ekurt, seed = NULL) } \arguments{ +\item{x,q}{vector of standardized quantiles.} + \item{n}{scalar; number of simulated values in random simulation, sample length in density, distribution and quantile functions.} -\item{sigma}{scalar standard deviation.} - \item{skew}{scalar; skewness.} \item{ekurt}{scalar; excess kurtosis.} -\item{seed}{scalar; set seed. Default is \code{NULL}.} +\item{p}{vector of probabilities.} -\item{x,q}{vector of standardized quantiles.} +\item{sigma}{scalar standard deviation.} -\item{p}{vector of probabilities.} +\item{seed}{scalar; set seed. Default is \code{NULL}.} } \value{ \code{dCornishFisher} gives the density, \code{pCornishFisher} gives the Added: pkg/FactorAnalytics/man/fitSfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitSfm.Rd (rev 0) +++ pkg/FactorAnalytics/man/fitSfm.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -0,0 +1,167 @@ +% Generated by roxygen2 (4.0.2): do not edit by hand +\name{fitSfm} +\alias{coef.sfm} +\alias{fitSfm} +\alias{fitted.sfm} +\alias{residuals.sfm} +\title{Fit a statistical factor model using principal component analysis} +\usage{ +fitSfm(data, k = 1, max.k = NULL, refine = TRUE, sig = 0.05, + check = FALSE) + +\method{coef}{sfm}(object, ...) + +\method{fitted}{sfm}(object, ...) + +\method{residuals}{sfm}(object, ...) +} +\arguments{ +\item{data}{vector, matrix, data.frame, xts, timeSeries or zoo object with +asset returns. See details.} + +\item{k}{number of factors; a number (or) a method for determining the +optimal number of factors, one of "bn" or "ck". See details. Default is 1.} + +\item{max.k}{scalar; the maximum number of factors to be considered for +methods "bn" or "ck". Default is \code{NULL}. See details.} + +\item{refine}{logical; whether to use the Connor-Korajczyk refinement for +APCA. Default is \code{TRUE}.} + +\item{sig}{scalar; desired level of significance when "ck" method is +specified. Default is 0.05.} + +\item{check}{logical; to check if any asset has identical observations. 
+Default is \code{FALSE}.}

+\item{object}{a fit object of class \code{sfm} which is returned by
+\code{fitSfm}}
+}
+\value{
+fitSfm returns an object of class \code{"sfm"} for which
+\code{print}, \code{plot}, \code{predict} and \code{summary} methods exist.
+
+The generic accessor functions \code{coef}, \code{fitted} and
+\code{residuals} extract various useful features of the fit object.
+Additionally, \code{fmCov} computes the covariance matrix for asset returns
+based on the fitted factor model.
+
+An object of class \code{"sfm"} is a list containing the following
+components:
+\item{asset.fit}{list of fitted objects of class \code{lm} for each asset,
+from the time-series OLS regression of asset returns on estimated factors.}
+\item{k}{number of factors; as input or determined by "ck" or "bn" methods.}
+\item{factors}{T x K xts object of estimated factor realizations.}
+\item{loadings}{N x K matrix of factor loadings estimated by
+regressing the asset returns on estimated factors.}
+\item{alpha}{length-N vector of estimated alphas.}
+\item{r2}{length-N vector of R-squared values.}
+\item{resid.sd}{length-N vector of residual standard deviations.}
+\item{residuals}{T x N xts object of residuals from the OLS regression.}
+\item{Omega}{M x M return covariance matrix estimated by the factor model,
+where M = min(N,T).}
+\item{eigen}{length-K vector of eigenvalues of the sample covariance matrix.}
+\item{mimic}{N x K matrix of factor mimicking portfolio weights.}
+\item{call}{the matched function call.}
+\item{data}{T x N xts data object containing the asset returns.}
+\item{asset.names}{length-N vector of column names from data.}
+Where N is the number of assets, K is the number of factors, and T is the
+number of observations.
+}
+\description{
+Fits a statistical factor model using principal component
+analysis for one or more asset returns or excess returns. When the number of
+assets exceeds the number of time periods, APCA (Asymptotic Principal
+Component Analysis) is performed. This function is based on the S+FinMetric
+function \code{mfactor}. An object of class \code{"sfm"} is returned.
+}
+\details{
+If \code{data} is not of class \code{"xts"}, rownames must provide an
+\code{xts} compatible time index. If the data contains missing values,
+\code{na.rm} should be set to \code{TRUE} to remove NAs.
+
+Let \code{N} be the number of columns or assets and \code{T} be the number
+of rows or observations. When \code{N < T}, Principal Component Analysis
+(PCA) is performed. Otherwise, Asymptotic Principal Component Analysis
+(APCA) is performed. In either case, any number of factors less than
+\code{min(N,T)} can be chosen via argument \code{k}. Default is 1. Refer to
+Zivot and Wang (2007) for more details and references.
+
+Alternately, for APCA, a method to determine the number of factors can be
+specified: \code{k="bn"} corresponds to Bai and Ng (2002) and \code{k="ck"}
+corresponds to Connor and Korajczyk (1993). User can specify the maximum
+number of factors, \code{max.k} to consider with these methods. If not, a
+default maximum is calculated from \code{min(10, T-1)}.
+
+\code{refine} specifies whether a refinement of the APCA procedure (that may
+improve efficiency) from Connor and Korajczyk (1988) is to be used.
+
+If \code{check=TRUE}, a warning is issued if any asset is found to have
+identical observations.
+ +Note about NAs: Before model fitting, incomplete cases in \code{data} are +removed using \code{\link[stats]{na.omit}}. Otherwise, all observations are +included. +} +\examples{ +# load data for fitSfm.r +data(stat.fm.data) +# data is from finmetric berndt.dat and folio.dat + +# PCA is performed on sfm.dat and APCA on sfm.apca.dat +class(sfm.dat) +class(sfm.apca.dat) + +# pca +args(fitSfm) +sfm.pca.fit <- fitSfm(sfm.dat, k=2) +class(sfm.pca.fit) +names(sfm.pca.fit) +head(sfm.pca.fit$factors) +head(sfm.pca.fit$loadings) +sfm.pca.fit$r2 +sfm.pca.fit$resid.sd +sfm.pca.fit$mimic + +# apca with number of factors, k=15 +sfm.apca.fit <- fitSfm(sfm.apca.dat, k=15, refine=TRUE) + +# apca with the Bai & Ng method +sfm.apca.fit.bn <- fitSfm(sfm.apca.dat, k="bn") + +# apca with the Connor-Korajczyk method +sfm.apca.fit.ck <- fitSfm(sfm.apca.dat, k="ck") +} +\author{ +Eric Zivot, Sangeetha Srinivasan and Yi-An Chen +} +\references{ +Bai, J., & Ng, S. (2002). Determining the number of factors in approximate +factor models. Econometrica, 70(1), 191-221. + +Connor, G., & Korajczyk, R. A. (1988). Risk and return in an equilibrium +APT: Application of a new test methodology. Journal of Financial Economics, +21(2), 255-289. + +Connor, G., & Korajczyk, R. A. (1993). A test for the number of factors in +an approximate factor model. The Journal of Finance, 48(4), 1263-1291. + +Zivot, E., & Wang, J. (2007). Modeling Financial Time Series with S-PLUS +(Vol. 191). Springer. +} +\seealso{ +The \code{sfm} methods for generic functions: +\code{\link{plot.sfm}}, \code{\link{predict.sfm}}, +\code{\link{print.sfm}} and \code{\link{summary.sfm}}. + +And, the following extractor functions: \code{\link[stats]{coef}}, +\code{\link[stats]{fitted}}, \code{\link[stats]{residuals}}, +\code{\link{fmCov}}, \code{\link{fmSdDecomp}}, \code{\link{fmVaRDecomp}} +and \code{\link{fmEsDecomp}}. + +\code{\link{paFm}} for Performance Attribution. +} + Modified: pkg/FactorAnalytics/man/fitTsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fitTsfm.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fitTsfm} \alias{coef.tsfm} \alias{fitTsfm} Modified: pkg/FactorAnalytics/man/fitTsfm.control.Rd =================================================================== --- pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fitTsfm.control.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fitTsfm.control} \alias{fitTsfm.control} \title{List of control parameters for \code{fitTsfm}} @@ -110,7 +110,7 @@ \description{ Creates a list of control parameters for \code{\link{fitTsfm}}. All control parameters that are not passed to this function are set to -default values. +default values. This function is meant for internal use only!! 
} \details{ This control function is used to process optional arguments passed Modified: pkg/FactorAnalytics/man/fmCov.Rd =================================================================== --- pkg/FactorAnalytics/man/fmCov.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fmCov.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fmCov} \alias{fmCov} \alias{fmCov.tsfm} @@ -11,13 +11,13 @@ \arguments{ \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.} +\item{...}{optional arguments passed to \code{\link[stats]{cov}}.} + \item{use}{an optional character string giving a method for computing covariances in the presence of missing values. This must be (an abbreviation of) one of the strings "everything", "all.obs", "complete.obs", "na.or.complete", or "pairwise.complete.obs". Default is "pairwise.complete.obs".} - -\item{...}{optional arguments passed to \code{\link[stats]{cov}}.} } \value{ The computed \code{N x N} covariance matrix for asset returns based Modified: pkg/FactorAnalytics/man/fmEsDecomp.Rd =================================================================== --- pkg/FactorAnalytics/man/fmEsDecomp.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fmEsDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fmEsDecomp} \alias{fmEsDecomp} \alias{fmEsDecomp.tsfm} @@ -12,6 +12,9 @@ \arguments{ \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.} +\item{...}{other optional arguments passed to +\code{\link[PerformanceAnalytics]{VaR}}.} + \item{p}{confidence level for calculation. Default is 0.95.} \item{method}{method for computing VaR, one of "modified","gaussian", @@ -19,9 +22,6 @@ \item{invert}{logical; whether to invert the VaR measure. Default is \code{FALSE}.} - -\item{...}{other optional arguments passed to -\code{\link[PerformanceAnalytics]{VaR}}.} } \value{ A list containing Modified: pkg/FactorAnalytics/man/fmSdDecomp.Rd =================================================================== --- pkg/FactorAnalytics/man/fmSdDecomp.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fmSdDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fmSdDecomp} \alias{fmSdDecomp} \alias{fmSdDecomp.tsfm} @@ -11,13 +11,13 @@ \arguments{ \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.} +\item{...}{optional arguments passed to \code{\link[stats]{cov}}.} + \item{use}{an optional character string giving a method for computing covariances in the presence of missing values. This must be (an abbreviation of) one of the strings "everything", "all.obs", "complete.obs", "na.or.complete", or "pairwise.complete.obs". 
Default is "pairwise.complete.obs".} - -\item{...}{optional arguments passed to \code{\link[stats]{cov}}.} } \value{ A list containing Modified: pkg/FactorAnalytics/man/fmVaRDecomp.Rd =================================================================== --- pkg/FactorAnalytics/man/fmVaRDecomp.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/fmVaRDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{fmVaRDecomp} \alias{fmVaRDecomp} \alias{fmVaRDecomp.tsfm} @@ -12,6 +12,9 @@ \arguments{ \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.} +\item{...}{other optional arguments passed to +\code{\link[PerformanceAnalytics]{VaR}}.} + \item{p}{confidence level for calculation. Default is 0.95.} \item{method}{method for computing VaR, one of "modified","gaussian", @@ -19,9 +22,6 @@ \item{invert}{logical; whether to invert the VaR measure. Default is \code{FALSE}.} - -\item{...}{other optional arguments passed to -\code{\link[PerformanceAnalytics]{VaR}}.} } \value{ A list containing Modified: pkg/FactorAnalytics/man/paFm.Rd =================================================================== --- pkg/FactorAnalytics/man/paFm.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/paFm.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{paFm} \alias{paFm} \title{Compute cumulative mean attribution for factor models} Modified: pkg/FactorAnalytics/man/plot.pafm.Rd =================================================================== --- pkg/FactorAnalytics/man/plot.pafm.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/plot.pafm.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{plot.pafm} \alias{plot.pafm} \title{plot \code{"pafm"} object} Modified: pkg/FactorAnalytics/man/plot.tsfm.Rd =================================================================== --- pkg/FactorAnalytics/man/plot.tsfm.Rd 2014-11-23 14:29:57 UTC (rev 3561) +++ pkg/FactorAnalytics/man/plot.tsfm.Rd 2014-11-25 15:07:07 UTC (rev 3562) @@ -1,4 +1,4 @@ -% Generated by roxygen2 (4.0.1): do not edit by hand +% Generated by roxygen2 (4.0.2): do not edit by hand \name{plot.tsfm} \alias{plot.tsfm} \title{Plots from a fitted time series factor model} Modified: pkg/FactorAnalytics/man/predict.tsfm.Rd [TRUNCATED] To get the complete diff run: svnlook diff /svnroot/returnanalytics -r 3562 From noreply at r-forge.r-project.org Wed Nov 26 00:37:06 2014 From: noreply at r-forge.r-project.org (noreply at r-forge.r-project.org) Date: Wed, 26 Nov 2014 00:37:06 +0100 (CET) Subject: [Returnanalytics-commits] r3563 - in pkg/FactorAnalytics: . 
R man vignettes Message-ID: <20141125233706.89FE518784C@r-forge.r-project.org> Author: pragnya Date: 2014-11-26 00:37:06 +0100 (Wed, 26 Nov 2014) New Revision: 3563 Added: pkg/FactorAnalytics/R/predict.sfm.r pkg/FactorAnalytics/R/print.sfm.r pkg/FactorAnalytics/R/summary.sfm.r pkg/FactorAnalytics/man/predict.sfm.Rd pkg/FactorAnalytics/man/print.sfm.Rd pkg/FactorAnalytics/man/summary.sfm.Rd Modified: pkg/FactorAnalytics/NAMESPACE pkg/FactorAnalytics/R/fitSfm.R pkg/FactorAnalytics/R/fitTsfm.R pkg/FactorAnalytics/R/fmCov.R pkg/FactorAnalytics/R/fmEsDecomp.R pkg/FactorAnalytics/R/fmSdDecomp.R pkg/FactorAnalytics/R/fmVaRDecomp.R pkg/FactorAnalytics/R/paFm.r pkg/FactorAnalytics/man/fitSfm.Rd pkg/FactorAnalytics/man/fmCov.Rd pkg/FactorAnalytics/man/fmEsDecomp.Rd pkg/FactorAnalytics/man/fmSdDecomp.Rd pkg/FactorAnalytics/man/fmVaRDecomp.Rd pkg/FactorAnalytics/vignettes/fitTsfm_vignette.Rnw pkg/FactorAnalytics/vignettes/fitTsfm_vignette.pdf Log: Added method functions for fitSfm. Updated fitTsfm vignette Modified: pkg/FactorAnalytics/NAMESPACE =================================================================== --- pkg/FactorAnalytics/NAMESPACE 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/NAMESPACE 2014-11-25 23:37:06 UTC (rev 3563) @@ -4,19 +4,27 @@ S3method(coef,tsfm) S3method(fitted,sfm) S3method(fitted,tsfm) +S3method(fmCov,sfm) S3method(fmCov,tsfm) +S3method(fmEsDecomp,sfm) S3method(fmEsDecomp,tsfm) +S3method(fmSdDecomp,sfm) S3method(fmSdDecomp,tsfm) +S3method(fmVaRDecomp,sfm) S3method(fmVaRDecomp,tsfm) S3method(plot,pafm) S3method(plot,tsfm) +S3method(predict,sfm) S3method(predict,tsfm) S3method(print,pafm) +S3method(print,sfm) +S3method(print,summary.sfm) S3method(print,summary.tsfm) S3method(print,tsfm) S3method(residuals,sfm) S3method(residuals,tsfm) S3method(summary,pafm) +S3method(summary,sfm) S3method(summary,tsfm) export(dCornishFisher) export(fitSfm) @@ -43,9 +51,11 @@ importFrom(lattice,panel.barchart) importFrom(lattice,panel.grid) importFrom(leaps,regsubsets) +importFrom(lmtest,coeftest) importFrom(lmtest,coeftest.default) importFrom(robust,lmRob) importFrom(robust,step.lmRob) importFrom(sandwich,vcovHAC.default) +importFrom(sandwich,vcovHC) importFrom(sandwich,vcovHC.default) importFrom(strucchange,efp) Modified: pkg/FactorAnalytics/R/fitSfm.R =================================================================== --- pkg/FactorAnalytics/R/fitSfm.R 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/fitSfm.R 2014-11-25 23:37:06 UTC (rev 3563) @@ -68,8 +68,7 @@ #' \item{r2}{length-N vector of R-squared values.} #' \item{resid.sd}{length-N vector of residual standard deviations.} #' \item{residuals}{T x N xts object of residuals from the OLS regression.} -#' \item{Omega}{M x M return covariance matrix estimated by the factor model, -#' where M = min(N,T).} +#' \item{Omega}{N x N return covariance matrix estimated by the factor model.} #' \item{eigen}{length-K vector of eigenvalues of the sample covariance matrix.} #' \item{mimic}{N x K matrix of factor mimicking portfolio weights.} #' \item{call}{the matched function call.} @@ -131,13 +130,13 @@ #' sfm.pca.fit$mimic #' #' # apca with number of factors, k=15 -#' # sfm.apca.fit <- fitSfm(sfm.apca.dat, k=15, refine=TRUE) +#' sfm.apca.fit <- fitSfm(sfm.apca.dat, k=15, refine=TRUE) #' #' # apca with the Bai & Ng method #' sfm.apca.fit.bn <- fitSfm(sfm.apca.dat, k="bn") #' #' # apca with the Connor-Korajczyk method -#' # sfm.apca.fit.ck <- fitSfm(sfm.apca.dat, k="ck") +#' sfm.apca.fit.ck <- fitSfm(sfm.apca.dat, k="ck") #' 
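+#' # The fitted "sfm" object can be passed straight to the risk helpers.
+#' # A minimal sketch (these calls mirror the examples added to fmCov,
+#' # fmSdDecomp and fmEsDecomp in this same revision; nothing beyond
+#' # those examples is assumed):
+#' fmCov(sfm.pca.fit)            # model-implied return covariance
+#' fmSdDecomp(sfm.pca.fit)$pcSd  # percentage contributions to Sd
+#' fmEsDecomp(sfm.pca.fit)$cES   # component contributions to ES
+#'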
#' @importFrom PerformanceAnalytics checkData #' @@ -254,6 +253,7 @@ # assign row and column names names(eig.val) = names(r2) = names(resid.sd) = colnames(R.xts) + colnames(B) <- colnames(f) # return list list(asset.fit=asset.fit, k=k, factors=f, loadings=B, alpha=alpha, r2=r2, @@ -313,6 +313,7 @@ # assign row and column names names(eig.val) = 1:obs names(r2) = names(resid.sd) = colnames(R.xts) + colnames(B) <- colnames(f) # return list list(asset.fit=asset.fit, k=k, factors=f, loadings=B, alpha=alpha, r2=r2, @@ -338,7 +339,7 @@ # dof-adjusted squared residuals for k fit <- UseAPCA(R.xts=R.xts, R.mat=R.mat, k=k, n=n, obs=obs, refine=refine) eps2.star <- fit$residuals^2 / (1-(k+1)/obs-k/n) - mu.star <- rowMeans(eps2[idx,,drop=FALSE]) + mu.star <- rowMeans(eps2.star[idx,,drop=FALSE]) # cross sectional differences in sqd. errors btw odd & even time periods delta <- mu - mu.star # test for a positive mean value for Delta @@ -397,6 +398,7 @@ #' @export fitted.sfm <- function(object, ...) { + # use residuals already computed via fitSfm function fitted.xts <- object$data - object$residuals return(fitted.xts) } @@ -406,5 +408,6 @@ #' @export residuals.sfm <- function(object, ...) { + # already computed via fitSfm function return(object$residuals) } Modified: pkg/FactorAnalytics/R/fitTsfm.R =================================================================== --- pkg/FactorAnalytics/R/fitTsfm.R 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/fitTsfm.R 2014-11-25 23:37:06 UTC (rev 3563) @@ -233,8 +233,7 @@ dat.xts <- "[<-"(dat.xts,,vapply(dat.xts, function(x) x-data.xts[,rf.name], FUN.VALUE = numeric(nrow(dat.xts)))) } else { - warning("Excess returns were not computed. Returns data were used as input - for all factors and assets.") + warning("Excess returns were not computed.") } # opt add mkt-timing factors: up.market=max(0,Rm-Rf), market.sqd=(Rm-Rf)^2 @@ -290,7 +289,7 @@ tmp <- matrix(NA, length(asset.names), length(factor.names)) colnames(tmp) <- factor.names rownames(tmp) <- asset.names - beta <- merge(beta, tmp, all.x=TRUE, sort=FALSE)[,factor.names] + beta <- merge(beta, tmp, all.x=TRUE, sort=FALSE)[,factor.names, drop=FALSE] rownames(beta) <- asset.names # extract r2 and residual sd r2 <- sapply(reg.list, function(x) summary(x)$r.squared) Modified: pkg/FactorAnalytics/R/fmCov.R =================================================================== --- pkg/FactorAnalytics/R/fmCov.R 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/fmCov.R 2014-11-25 23:37:06 UTC (rev 3563) @@ -52,18 +52,12 @@ #' factor.names=c("EDHEC LS EQ","SP500 TR"), data=managers) #' fmCov(fit) #' -#' \dontrun{ #' # Statistical Factor Model #' data(stat.fm.data) #' sfm.pca.fit <- fitSfm(sfm.dat, k=2) -#' #' fmCov(t(sfm.pca.fit$loadings), var(sfm.pca.fit$factors), -#' sfm.pca.fit$resid.sd) -#' -#' sfm.apca.fit <- fitSfm(sfm.apca.dat, k=2) -#' -#' fmCov(t(sfm.apca.fit$loadings), var(sfm.apca.fit$factors), -#' sfm.apca.fit$resid.sd) -#' +#' fmCov(sfm.pca.fit) +#' +#' \dontrun{ #' # Fundamental Factor Model #' data(stock) #' # there are 447 assets @@ -121,3 +115,14 @@ return(cov.fm) } + +#' @rdname fmCov +#' @method fmCov sfm +#' @export + +fmCov.sfm <- function(object, use="pairwise.complete.obs", ...) 
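+# (note, as a sketch of why this is a one-liner: under the usual exact
+# factor model, the implied covariance is Omega = B cov(F) B' +
+# diag(resid.sd^2); fitSfm already computes and stores this as $Omega,
+# so the sfm method can simply return it)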
{
+
+  # already computed via fitSfm function
+  return(object$Omega)
+}
+

Modified: pkg/FactorAnalytics/R/fmEsDecomp.R
===================================================================
--- pkg/FactorAnalytics/R/fmEsDecomp.R 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/R/fmEsDecomp.R 2014-11-25 23:37:06 UTC (rev 3563)
@@ -73,11 +73,16 @@
 #' data(managers)
 #' fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]),
 #'                      factor.names=colnames(managers[,(7:8)]), data=managers)
-#'
 #' ES.decomp <- fmEsDecomp(fit.macro)
 #' # get the component contributions
 #' ES.decomp$cES
 #'
+#' # Statistical Factor Model
+#' data(stat.fm.data)
+#' sfm.pca.fit <- fitSfm(sfm.dat, k=2)
+#' ES.decomp <- fmEsDecomp(sfm.pca.fit)
+#' ES.decomp$cES
+#'
 #' @importFrom PerformanceAnalytics VaR
 #'
 #' @export
@@ -173,3 +178,89 @@
 
   return(fm.ES.decomp)
 }
+
+#' @rdname fmEsDecomp
+#' @method fmEsDecomp sfm
+#' @export
+
+fmEsDecomp.sfm <- function(object, p=0.95,
+                           method=c("modified","gaussian","historical",
+                                    "kernel"), invert=FALSE, ...) {
+
+  # set defaults and check input validity
+  method = method[1]
+
+  if (!(method %in% c("modified", "gaussian", "historical", "kernel"))) {
+    stop("Invalid argument: method must be 'modified', 'gaussian',
+         'historical' or 'kernel'")
+  }
+
+  # get beta.star
+  beta <- object$loadings
+  beta[is.na(beta)] <- 0
+  beta.star <- as.matrix(cbind(beta, object$resid.sd))
+  colnames(beta.star)[dim(beta.star)[2]] <- "residual"
+
+  # factor returns and residuals data
+  factors.xts <- object$factors
+  resid.xts <- as.xts(t(t(residuals(object))/object$resid.sd))
+  time(resid.xts) <- as.Date(time(resid.xts))
+
+  # initialize lists and matrices
+  N <- length(object$asset.names)
+  K <- object$k
+  VaR.fm <- rep(NA, N)
+  ES.fm <- rep(NA, N)
+  idx.exceed <- list()
+  n.exceed <- rep(NA, N)
+  names(VaR.fm) = names(ES.fm) = names(n.exceed) = object$asset.names
+  mES <- matrix(NA, N, K+1)
+  cES <- matrix(NA, N, K+1)
+  pcES <- matrix(NA, N, K+1)
+  rownames(mES)=rownames(cES)=rownames(pcES)=object$asset.names
+  colnames(mES)=colnames(cES)=colnames(pcES)=c(paste("F",1:K,sep="."),
+                                               "residuals")
+
+  for (i in object$asset.names) {
+    # return data for asset i
+    R.xts <- object$data[,i]
+    # get VaR for asset i
+    VaR.fm[i] <- VaR(R.xts, p=p, method=method, invert=invert, ...)
+    # index of VaR exceedances
+    idx.exceed[[i]] <- which(R.xts <= VaR.fm[i])
+    # number of VaR exceedances
+    n.exceed[i] <- length(idx.exceed[[i]])
+
+    # get F.star data object
+    factor.star <- merge(factors.xts, resid.xts[,i])
+    colnames(factor.star)[dim(factor.star)[2]] <- "residual"
+
+    if (!invert) {inv=-1} else {inv=1}
+
+    # compute ES as the expected value of the asset return, conditional on
+    # the return being less than or equal to its value-at-risk (VaR);
+    # estimated here by the empirical mean of the returns in that tail
+    idx <- which(R.xts <= inv*VaR.fm[i])
+    ES.fm[i] <- inv * mean(R.xts[idx], na.rm =TRUE)
+
+    # compute marginal ES as the expected value of the factor returns (and
+    # the scaled residual), conditional on the asset return being less than
+    # or equal to its VaR
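+    # (rationale, as a sketch: with R = sum_k beta.star[k] * F.star[k],
+    # Euler allocation gives ES = sum_k beta.star[k] * E[F.star[k] | R <= VaR];
+    # the correction factor cf below enforces that identity exactly in
+    # finite samples)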
+ mES[i,] <- inv * colMeans(factor.star[idx,], na.rm =TRUE) + + # correction factor to ensure that sum(cES) = portfolio ES + cf <- as.numeric( ES.fm[i] / sum(mES[i,]*beta.star[i,], na.rm=TRUE) ) + + # compute marginal, component and percentage contributions to ES + # each of these have dimensions: N x (K+1) + mES[i,] <- cf * mES[i,] + cES[i,] <- mES[i,] * beta.star[i,] + pcES[i,] <- 100* cES[i,] / ES.fm[i] + } + + fm.ES.decomp <- list(VaR.fm=VaR.fm, n.exceed=n.exceed, idx.exceed=idx.exceed, + ES.fm=ES.fm, mES=mES, cES=cES, pcES=pcES) + + return(fm.ES.decomp) +} + Modified: pkg/FactorAnalytics/R/fmSdDecomp.R =================================================================== --- pkg/FactorAnalytics/R/fmSdDecomp.R 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/fmSdDecomp.R 2014-11-25 23:37:06 UTC (rev 3563) @@ -59,10 +59,15 @@ #' fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]), #' factor.names=colnames(managers[,(7:9)]), #' rf.name="US 3m TR", data=managers) -#' #' decomp <- fmSdDecomp(fit.macro) #' # get the percentage component contributions #' decomp$pcSd +#' +#' # Statistical Factor Model +#' data(stat.fm.data) +#' sfm.pca.fit <- fitSfm(sfm.dat, k=2) +#' decomp <- fmSdDecomp(sfm.pca.fit) +#' decomp$pcSd #' #' @export @@ -122,3 +127,41 @@ return(fm.sd.decomp) } + +#' @rdname fmSdDecomp +#' @method fmSdDecomp sfm +#' @export + +fmSdDecomp.sfm <- function(object, use="pairwise.complete.obs", ...) { + + # get beta.star: N x (K+1) + beta <- object$loadings + beta[is.na(beta)] <- 0 + beta.star <- as.matrix(cbind(beta, object$resid.sd)) + colnames(beta.star)[dim(beta.star)[2]] <- "residual" + + # get cov(F): K x K + factor <- as.matrix(object$factors) + factor.cov = cov(factor, use=use, ...) + + # get cov(F.star): (K+1) x (K+1) + K <- object$k + factor.star.cov <- diag(K+1) + factor.star.cov[1:K, 1:K] <- factor.cov + colnames(factor.star.cov) <- c(colnames(factor.cov),"residuals") + rownames(factor.star.cov) <- c(colnames(factor.cov),"residuals") + + # compute factor model sd; a vector of length N + Sd.fm <- sqrt(rowSums(beta.star %*% factor.star.cov * beta.star)) + + # compute marginal, component and percentage contributions to sd + # each of these have dimensions: N x (K+1) + mSd <- (t(factor.star.cov %*% t(beta.star)))/Sd.fm + cSd <- mSd * beta.star + pcSd = 100* cSd/Sd.fm + + fm.sd.decomp <- list(Sd.fm=Sd.fm, mSd=mSd, cSd=cSd, pcSd=pcSd) + + return(fm.sd.decomp) +} + Modified: pkg/FactorAnalytics/R/fmVaRDecomp.R =================================================================== --- pkg/FactorAnalytics/R/fmVaRDecomp.R 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/fmVaRDecomp.R 2014-11-25 23:37:06 UTC (rev 3563) @@ -73,6 +73,12 @@ #' # get the component contributions #' VaR.decomp$cVaR #' +#' # Statistical Factor Model +#' data(stat.fm.data) +#' sfm.pca.fit <- fitSfm(sfm.dat, k=2) +#' VaR.decomp <- fmVaRDecomp(sfm.pca.fit) +#' VaR.decomp$cVaR +#' #' @importFrom PerformanceAnalytics VaR #' #' @export @@ -174,3 +180,94 @@ return(fm.VaR.decomp) } + +#' @rdname fmVaRDecomp +#' @method fmVaRDecomp sfm +#' @export + +fmVaRDecomp.sfm <- function(object, p=0.95, + method=c("modified","gaussian","historical", + "kernel"), invert=FALSE, ...) 
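+# (structure mirrors fmEsDecomp.sfm above; the difference is that the
+# marginal contributions are estimated with a triangular-kernel weighted
+# mean centered at the VaR estimate instead of a tail mean)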
{
+
+  # set defaults and check input validity
+  method = method[1]
+
+  if (!(method %in% c("modified", "gaussian", "historical", "kernel"))) {
+    stop("Invalid argument: method must be 'modified', 'gaussian',
+         'historical' or 'kernel'")
+  }
+
+  # get beta.star
+  beta <- object$loadings
+  beta[is.na(beta)] <- 0
+  beta.star <- as.matrix(cbind(beta, object$resid.sd))
+  colnames(beta.star)[dim(beta.star)[2]] <- "residual"
+
+  # factor returns and residuals data
+  factors.xts <- object$factors
+  resid.xts <- as.xts(t(t(residuals(object))/object$resid.sd))
+  time(resid.xts) <- as.Date(time(resid.xts))
+
+  # initialize lists and matrices
+  N <- length(object$asset.names)
+  K <- object$k
+  VaR.fm <- rep(NA, N)
+  idx.exceed <- list()
+  n.exceed <- rep(NA, N)
+  names(VaR.fm) = names(n.exceed) = object$asset.names
+  mVaR <- matrix(NA, N, K+1)
+  cVaR <- matrix(NA, N, K+1)
+  pcVaR <- matrix(NA, N, K+1)
+  rownames(mVaR)=rownames(cVaR)=rownames(pcVaR)=object$asset.names
+  colnames(mVaR)=colnames(cVaR)=colnames(pcVaR)=c(paste("F",1:K,sep="."),
+                                                  "residuals")
+
+  for (i in object$asset.names) {
+    # return data for asset i
+    R.xts <- object$data[,i]
+    # get VaR for asset i
+    VaR.fm[i] <- VaR(R.xts, p=p, method=method, invert=invert, ...)
+    # index of VaR exceedances
+    idx.exceed[[i]] <- which(R.xts <= VaR.fm[i])
+    # number of VaR exceedances
+    n.exceed[i] <- length(idx.exceed[[i]])
+
+    # # plot exceedances for asset i
+    # plot(R.xts, type="b", main="Asset Returns and 5% VaR Violations",
+    #      ylab="Returns")
+    # abline(h=0)
+    # abline(h=VaR.fm[i], lwd=2, col="red")
+    # points(R.xts[idx.exceed[[i]]], type="p", pch=16, col="red")
+
+    # get F.star data object
+    factor.star <- merge(factors.xts, resid.xts[,i])
+    colnames(factor.star)[dim(factor.star)[2]] <- "residual"
+
+    if (!invert) {inv=-1} else {inv=1}
+
+    # epsilon is approximated using Silverman's rule of thumb (bandwidth
+    # selection); the constant 2.575 corresponds to a triangular kernel
+    eps <- 2.575*sd(R.xts, na.rm =TRUE) * (nrow(R.xts)^(-1/5))
+    # compute marginal VaR as the expected value of the factor returns for
+    # observations where the asset return falls within the triangular kernel
+    # region peaked at the VaR value, with bandwidth epsilon
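+    # (equivalently, each observation t receives weight
+    #   w_t = max(0, 1 - |R_t - VaR| / eps),
+    # so the weighted mean below approximates E[F.star | R = VaR])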
+ k.weight <- as.vector(1 - abs(R.xts - VaR.fm[i]) / eps) + k.weight[k.weight<0] <- 0 + mVaR[i,] <- inv * colMeans(factor.star*k.weight, na.rm =TRUE) + + # correction factor to ensure that sum(cVaR) = portfolio VaR + cf <- as.numeric( VaR.fm[i] / sum(mVaR[i,]*beta.star[i,], na.rm=TRUE) ) + + # compute marginal, component and percentage contributions to VaR + # each of these have dimensions: N x (K+1) + mVaR[i,] <- cf * mVaR[i,] + cVaR[i,] <- mVaR[i,] * beta.star[i,] + pcVaR[i,] <- 100* cVaR[i,] / VaR.fm[i] + } + + fm.VaR.decomp <- list(VaR.fm=VaR.fm, n.exceed=n.exceed, idx.exceed=idx.exceed, + mVaR=mVaR, cVaR=cVaR, pcVaR=pcVaR) + + return(fm.VaR.decomp) +} + Modified: pkg/FactorAnalytics/R/paFm.r =================================================================== --- pkg/FactorAnalytics/R/paFm.r 2014-11-25 15:07:07 UTC (rev 3562) +++ pkg/FactorAnalytics/R/paFm.r 2014-11-25 23:37:06 UTC (rev 3563) @@ -158,21 +158,21 @@ if (class(fit)=="sfm") { # return attributed to factors - cum.attr.ret <- t(fit$loadings) + cum.attr.ret <- fit$loadings cum.spec.ret <- fit$r2 - factorNames <- rownames(fit$loadings) - fundNames <- colnames(fit$loadings) + factorNames <- colnames(fit$loadings) + fundNames <- rownames(fit$loadings) data <- checkData(fit$data) # create list for attribution attr.list <- list() # pca method - if ( dim(fit$asset.ret)[1] > dim(fit$asset.ret)[2] ) { + if ( dim(fit$data)[1] > dim(fit$data)[2] ) { for (k in fundNames) { fit.lm <- fit$asset.fit[[k]] ## extract information from lm object - date <- index(data[, k]) + date <- index(data[,k]) # probably needs more general Date setting actual.xts <- xts(fit.lm$model[1], as.Date(date)) # attributed returns @@ -181,11 +181,11 @@ # setup initial value attr.ret.xts.all <- xts(, as.Date(date)) - for ( i in factorNames ) { + for (i in factorNames) { attr.ret.xts <- actual.xts - xts(as.matrix(fit.lm$model[i])%*%as.matrix(fit.lm$coef[i]), as.Date(date)) - cum.attr.ret[k, i] <- cum.ret - + cum.attr.ret[k,i] <- cum.ret - Return.cumulative(actual.xts - attr.ret.xts) attr.ret.xts.all <- merge(attr.ret.xts.all, attr.ret.xts) } @@ -194,31 +194,30 @@ spec.ret.xts <- actual.xts - xts(as.matrix(fit.lm$model[, -1])%*%as.matrix(fit.lm$coef[-1]), as.Date(date)) - cum.spec.ret[k] <- cum.ret - Return.cumulative(actual.xts-spec.ret.xts) + cum.spec.ret[k] <- cum.ret - Return.cumulative(actual.xts- spec.ret.xts) attr.list[[k]] <- merge(attr.ret.xts.all, spec.ret.xts) colnames(attr.list[[k]]) <- c(factorNames, "specific.returns") } } else { # apca method: - # fit$loadings # f X K - # fit$factors # T X f + # fit$loadings # N X K + # fit$factors # T X K date <- index(fit$factors) - for ( k in fundNames) { + for (k in fundNames) { attr.ret.xts.all <- xts(, as.Date(date)) - actual.xts <- xts(fit$asset.ret[, k], as.Date(date)) - cum.ret <- Return.cumulative(actual.xts) + actual.xts <- xts(fit$data[,k], as.Date(date)) + cum.ret <- Return.cumulative(actual.xts) for (i in factorNames) { - attr.ret.xts <- xts(fit$factors[, i] * fit$loadings[i, k], - as.Date(date)) + attr.ret.xts <- xts(fit$factors[,i]*fit$loadings[k,i], as.Date(date)) attr.ret.xts.all <- merge(attr.ret.xts.all, attr.ret.xts) - cum.attr.ret[k, i] <- cum.ret - Return.cumulative(actual.xts - - attr.ret.xts) + cum.attr.ret[k,i] <- cum.ret - Return.cumulative(actual.xts - + attr.ret.xts) } - spec.ret.xts <- actual.xts - xts(fit$factors%*%fit$loadings[, k], + spec.ret.xts <- actual.xts - xts(fit$factors%*%t(fit$loadings[k,]), as.Date(date)) - cum.spec.ret[k] <- cum.ret - Return.cumulative(actual.xts-spec.ret.xts) 
+      cum.spec.ret[k] <- cum.ret - Return.cumulative(actual.xts - spec.ret.xts)
       attr.list[[k]] <- merge(attr.ret.xts.all, spec.ret.xts)
       colnames(attr.list[[k]]) <- c(factorNames, "specific.returns")
     }
@@ -226,7 +225,7 @@
   }
 
   ans <- list(cum.ret.attr.f=cum.attr.ret, cum.spec.ret=cum.spec.ret,
-                  attr.list=attr.list)
+              attr.list=attr.list)
   class(ans) <- "pafm"
   return(ans)
 }

Added: pkg/FactorAnalytics/R/predict.sfm.r
===================================================================
--- pkg/FactorAnalytics/R/predict.sfm.r                         (rev 0)
+++ pkg/FactorAnalytics/R/predict.sfm.r 2014-11-25 23:37:06 UTC (rev 3563)
@@ -0,0 +1,43 @@
+#' @title Predicts asset returns based on a fitted statistical factor model
+#'
+#' @description S3 \code{predict} method for object of class \code{sfm}. It
+#' calls the \code{predict} method for fitted objects of class \code{lm}.
+#'
+#' @param object an object of class \code{sfm} produced by \code{fitSfm}.
+#' @param newdata a vector, matrix, data.frame, xts, timeSeries or zoo object
+#' containing the variables with which to predict.
+#' @param ... optional arguments passed to \code{predict.lm}.
+#'
+#' @return
+#' \code{predict.sfm} produces a vector or a matrix of predictions.
+#'
+#' @author Yi-An Chen and Sangeetha Srinivasan
+#'
+#' @seealso \code{\link{fitSfm}}, \code{\link{summary.sfm}}
+#'
+#' @examples
+#' # load data from the package
+#' data(stat.fm.data)
+#' # fit the factor model with PCA
+#' fit <- fitSfm(sfm.dat, k=2)
+#'
+#' pred.fit <- predict(fit)
+#' newdata <- data.frame("EDHEC LS EQ"=rnorm(n=120), "SP500 TR"=rnorm(n=120))
+#' rownames(newdata) <- rownames(fit$data)
+#' pred.fit2 <- predict(fit, newdata, interval="confidence")
+#'
+#' @importFrom PerformanceAnalytics checkData
+#'
+#' @method predict sfm
+#' @export
+#'
+
+predict.sfm <- function(object, newdata = NULL, ...){
+
+  if (missing(newdata) || is.null(newdata)) {
+    predict(object$asset.fit, ...)
+  } else {
+    newdata <- checkData(newdata, method="data.frame")
+    predict(object$asset.fit, newdata, ...)
+  }
+}
\ No newline at end of file

Added: pkg/FactorAnalytics/R/print.sfm.r
===================================================================
--- pkg/FactorAnalytics/R/print.sfm.r                           (rev 0)
+++ pkg/FactorAnalytics/R/print.sfm.r 2014-11-25 23:37:06 UTC (rev 3563)
@@ -0,0 +1,40 @@
+#' @title Prints out a fitted statistical factor model object
+#'
+#' @description S3 \code{print} method for object of class \code{sfm}. Prints
+#' the call, factor model dimension, factor loadings, r-squared and residual
+#' volatilities from the fitted object.
+#'
+#' @param x an object of class \code{sfm} produced by \code{fitSfm}.
+#' @param digits an integer value, to indicate the required number of
+#' significant digits. Default is 3.
+#' @param ... optional arguments passed to the \code{print} method.
+#'
+#' @author Yi-An Chen and Sangeetha Srinivasan
+#'
+#' @seealso \code{\link{fitSfm}}, \code{\link{summary.sfm}}
+#'
+#' @examples
+#' data(stat.fm.data)
+#' fit <- fitSfm(sfm.dat, k=2)
+#' print(fit)
+#'
+#' @method print sfm
+#' @export
+#'
+
+print.sfm <- function(x, digits=max(3, .Options$digits - 3), ...){
+  if(!is.null(cl <- x$call)){
+    cat("\nCall:\n")
+    dput(cl)
+  }
+  cat("\nModel dimensions:\n")
+  tmp <- c(dim(t(x$loadings)), nrow(x$data))
+  names(tmp) <- c("Factors", "Assets", "Periods")
+  print(tmp)
+  cat("\nFactor Loadings:\n")
+  print(summary(x$loadings), digits=digits, ...)
+  cat("\nR-squared values:\n")
+  print(summary(x$r2), digits=digits, ...)
+  cat("\nResidual Volatilities:\n")
+  print(summary(x$resid.sd), digits=digits, ...)
+}

Added: pkg/FactorAnalytics/R/summary.sfm.r
===================================================================
--- pkg/FactorAnalytics/R/summary.sfm.r                         (rev 0)
+++ pkg/FactorAnalytics/R/summary.sfm.r 2014-11-25 23:37:06 UTC (rev 3563)
@@ -0,0 +1,96 @@
+#' @title Summarizing a fitted statistical factor model
+#'
+#' @description \code{summary} method for object of class \code{sfm}.
+#' Returned object is of class \code{summary.sfm}.
+#'
+#' @details The default \code{summary} method for a fitted \code{lm} object
+#' computes the standard errors and t-statistics under the assumption of
+#' homoskedasticity. Argument \code{se.type} gives the option to compute
+#' heteroskedasticity-consistent (HC) standard errors and t-statistics using
+#' \code{\link[lmtest]{coeftest}}.
+#'
+#' @param object an object of class \code{sfm} returned by \code{fitSfm}.
+#' @param se.type one of "Default" or "HC"; option for computing HC standard
+#' errors and t-statistics.
+#' @param x an object of class \code{summary.sfm}.
+#' @param digits number of significant digits to use when printing.
+#' Default is 3.
+#' @param ... further arguments passed to or from other methods.
+#'
+#' @return Returns an object of class \code{summary.sfm}.
+#' The print method for class \code{summary.sfm} outputs the call,
+#' coefficients (with standard errors and t-statistics), r-squared and
+#' residual volatility (under the homoskedasticity assumption) for all assets.
+#'
+#' Object of class \code{summary.sfm} is a list containing:
+#' \item{call}{the function call to \code{fitSfm}}
+#' \item{se.type}{standard error type as input}
+#' \item{coefficients}{coefficient table with standard errors and t-statistics}
+#' \item{mlm.fit.summary}{summary of the fit object of class \code{mlm} for
+#' the factor model.}
+#' \item{r.squared}{length-N vector of R-squared values}
+#' \item{sigma}{length-N vector of residual standard deviations}
+#'
+#' @note For a more detailed printed summary for each asset, refer to
+#' \code{\link[stats]{summary.lm}}, which includes F-statistics,
+#' Multiple R-squared, Adjusted R-squared, further formats the coefficients,
+#' standard errors, etc. and additionally gives significance stars if
+#' \code{signif.stars} is TRUE.
+#'
+#' @author Sangeetha Srinivasan
+#'
+#' @seealso \code{\link{fitSfm}}, \code{\link[stats]{summary.lm}}
+#'
+#' @examples
+#' data(stat.fm.data)
+#' # fit the factor model with PCA
+#' fit <- fitSfm(sfm.dat, k=2)
+#'
+#' # summary of factor model fit for all assets
+#' summary(fit, "HC")
+#'
+#' @importFrom lmtest coeftest
+#' @importFrom sandwich vcovHC
+#'
+#' @method summary sfm
+#' @export
+
+summary.sfm <- function(object, se.type="Default", ...){
+
+  # check input object validity
+  if (!inherits(object, "sfm")) {
+    stop("Invalid 'sfm' object")
+  }
+
+  # extract list of mlm summary object for the entire model
+  mlm.fit.summary <- summary(object$asset.fit)
+
+  # get coefficients; use HC standard errors and t-stats if specified
+  coefficients <- coeftest(object$asset.fit)
+  if (se.type=="HC") {
+    coefficients <- coeftest(object$asset.fit, vcov.=vcovHC)
+  }
+
+  # include the call and se.type to fitSfm
+  sum <- list(call=object$call, se.type=se.type, coefficients=coefficients,
+              mlm.fit.summary=mlm.fit.summary, r.squared=object$r2,
+              sigma=object$resid.sd)
+  class(sum) <- "summary.sfm"
+  return(sum)
+}
+
+#' @rdname summary.sfm
+#' @method print summary.sfm
+#' @export
+
+print.summary.sfm <- function(x, digits=3, ...)
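+# print method for "summary.sfm": echoes the call, then the coefficient
+# table and the per-asset fit statistics collected by summary.sfm above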
{
+
+  if(!is.null(cl <- x$call)) {
+    cat("\nCall:\n")
+    dput(cl)
+  }
+  cat("\nFactor Model Coefficients:", "\n(", x$se.type,
+      " Standard Errors & T-stats)\n\n", sep="")
+  c <- x$coefficients
+  print(c, digits=digits, ...)
+  cat("\nR-squared values:\n")
+  r2 <- x$r.squared
+  print(r2, digits=digits, ...)
+  cat("\nResidual Volatilities:\n")
+  sig <- x$sigma
+  print(sig, digits=digits, ...)
+}

Modified: pkg/FactorAnalytics/man/fitSfm.Rd
===================================================================
--- pkg/FactorAnalytics/man/fitSfm.Rd 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/man/fitSfm.Rd 2014-11-25 23:37:06 UTC (rev 3563)
@@ -36,6 +36,8 @@
 \item{object}{a fit object of class \code{sfm} which is returned by
 \code{fitSfm}}
+
+\item{...}{arguments passed to other functions.}
 }
 \value{
 fitSfm returns an object of class \code{"sfm"} for which
@@ -58,8 +60,7 @@
 \item{r2}{length-N vector of R-squared values.}
 \item{resid.sd}{length-N vector of residual standard deviations.}
 \item{residuals}{T x N xts object of residuals from the OLS regression.}
-\item{Omega}{M x M return covariance matrix estimated by the factor model,
-where M = min(N,T).}
+\item{Omega}{N x N return covariance matrix estimated by the factor model.}
 \item{eigen}{length-K vector of eigenvalues of the sample covariance matrix.}
 \item{mimic}{N x K matrix of factor mimicking portfolio weights.}
 \item{call}{the matched function call.}

Modified: pkg/FactorAnalytics/man/fmCov.Rd
===================================================================
--- pkg/FactorAnalytics/man/fmCov.Rd 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/man/fmCov.Rd 2014-11-25 23:37:06 UTC (rev 3563)
@@ -1,12 +1,15 @@
 % Generated by roxygen2 (4.0.2): do not edit by hand
 \name{fmCov}
 \alias{fmCov}
+\alias{fmCov.sfm}
 \alias{fmCov.tsfm}
 \title{Covariance Matrix for assets' returns from fitted factor model.}
 \usage{
 fmCov(object, ...)
 
 \method{fmCov}{tsfm}(object, use = "pairwise.complete.obs", ...)
+
+\method{fmCov}{sfm}(object, use = "pairwise.complete.obs", ...)
 }
 \arguments{
 \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.}
@@ -55,18 +58,12 @@
 factor.names=c("EDHEC LS EQ","SP500 TR"), data=managers)
 fmCov(fit)
 
-\dontrun{
 # Statistical Factor Model
 data(stat.fm.data)
 sfm.pca.fit <- fitSfm(sfm.dat, k=2)
-#' fmCov(t(sfm.pca.fit$loadings), var(sfm.pca.fit$factors),
-  sfm.pca.fit$resid.sd)
+fmCov(sfm.pca.fit)
 
-sfm.apca.fit <- fitSfm(sfm.apca.dat, k=2)
-
-fmCov(t(sfm.apca.fit$loadings), var(sfm.apca.fit$factors),
-  sfm.apca.fit$resid.sd)
-
+\dontrun{
 # Fundamental Factor Model
 data(stock)
 # there are 447 assets

Modified: pkg/FactorAnalytics/man/fmEsDecomp.Rd
===================================================================
--- pkg/FactorAnalytics/man/fmEsDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/man/fmEsDecomp.Rd 2014-11-25 23:37:06 UTC (rev 3563)
@@ -1,6 +1,7 @@
 % Generated by roxygen2 (4.0.2): do not edit by hand
 \name{fmEsDecomp}
 \alias{fmEsDecomp}
+\alias{fmEsDecomp.sfm}
 \alias{fmEsDecomp.tsfm}
 \title{Decompose ES into individual factor contributions}
 \usage{
@@ -8,6 +9,9 @@
 
 \method{fmEsDecomp}{tsfm}(object, p = 0.95, method = c("modified",
   "gaussian", "historical", "kernel"), invert = FALSE, ...)
+
+\method{fmEsDecomp}{sfm}(object, p = 0.95, method = c("modified",
+  "gaussian", "historical", "kernel"), invert = FALSE, ...)
 }
 \arguments{
 \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.}
@@ -67,10 +71,15 @@
 data(managers)
 fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]),
                      factor.names=colnames(managers[,(7:8)]), data=managers)
-
 ES.decomp <- fmEsDecomp(fit.macro)
 # get the component contributions
 ES.decomp$cES
+
+# Statistical Factor Model
+data(stat.fm.data)
+sfm.pca.fit <- fitSfm(sfm.dat, k=2)
+ES.decomp <- fmEsDecomp(sfm.pca.fit)
+ES.decomp$cES
 }
 \author{
 Eric Zivot, Sangeetha Srinivasan and Yi-An Chen

Modified: pkg/FactorAnalytics/man/fmSdDecomp.Rd
===================================================================
--- pkg/FactorAnalytics/man/fmSdDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/man/fmSdDecomp.Rd 2014-11-25 23:37:06 UTC (rev 3563)
@@ -1,12 +1,15 @@
 % Generated by roxygen2 (4.0.2): do not edit by hand
 \name{fmSdDecomp}
 \alias{fmSdDecomp}
+\alias{fmSdDecomp.sfm}
 \alias{fmSdDecomp.tsfm}
 \title{Decompose standard deviation into individual factor contributions}
 \usage{
 fmSdDecomp(object, ...)
 
 \method{fmSdDecomp}{tsfm}(object, use = "pairwise.complete.obs", ...)
+
+\method{fmSdDecomp}{sfm}(object, use = "pairwise.complete.obs", ...)
 }
 \arguments{
 \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.}
@@ -53,10 +56,15 @@
 fit.macro <- fitTsfm(asset.names=colnames(managers[,(1:6)]),
                      factor.names=colnames(managers[,(7:9)]),
                      rf.name="US 3m TR", data=managers)
-
 decomp <- fmSdDecomp(fit.macro)
 # get the percentage component contributions
 decomp$pcSd
+
+# Statistical Factor Model
+data(stat.fm.data)
+sfm.pca.fit <- fitSfm(sfm.dat, k=2)
+decomp <- fmSdDecomp(sfm.pca.fit)
+decomp$pcSd
 }
 \author{
 Eric Zivot, Sangeetha Srinivasan and Yi-An Chen

Modified: pkg/FactorAnalytics/man/fmVaRDecomp.Rd
===================================================================
--- pkg/FactorAnalytics/man/fmVaRDecomp.Rd 2014-11-25 15:07:07 UTC (rev 3562)
+++ pkg/FactorAnalytics/man/fmVaRDecomp.Rd 2014-11-25 23:37:06 UTC (rev 3563)
@@ -1,6 +1,7 @@
 % Generated by roxygen2 (4.0.2): do not edit by hand
 \name{fmVaRDecomp}
 \alias{fmVaRDecomp}
+\alias{fmVaRDecomp.sfm}
 \alias{fmVaRDecomp.tsfm}
 \title{Decompose VaR into individual factor contributions}
 \usage{
@@ -8,6 +9,9 @@
 
 \method{fmVaRDecomp}{tsfm}(object, p = 0.95, method = c("modified",
   "gaussian", "historical", "kernel"), invert = FALSE, ...)
+
+\method{fmVaRDecomp}{sfm}(object, p = 0.95, method = c("modified",
+  "gaussian", "historical", "kernel"), invert = FALSE, ...)
 }
 \arguments{
 \item{object}{fit object of class \code{tsfm}, \code{sfm} or \code{ffm}.}
@@ -69,6 +73,12 @@
 VaR.decomp <- fmVaRDecomp(fit.macro)
 # get the component contributions
 VaR.decomp$cVaR
+
+# Statistical Factor Model
+data(stat.fm.data)
+sfm.pca.fit <- fitSfm(sfm.dat, k=2)
+VaR.decomp <- fmVaRDecomp(sfm.pca.fit)
+VaR.decomp$cVaR
 }
 \author{
 Eric Zivot, Sangeetha Srinivasan and Yi-An Chen

Added: pkg/FactorAnalytics/man/predict.sfm.Rd
===================================================================
[TRUNCATED]

To get the complete diff run:
svnlook diff /svnroot/returnanalytics -r 3563
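Taken together, the methods added in r3563 give class "sfm" a complete
workflow. A minimal sketch using the package's bundled stat.fm.data dataset
(the calls below follow the examples shown in the diffs above; nothing
beyond those examples is assumed):

library(factorAnalytics)
data(stat.fm.data)

# fit a two-factor statistical model by PCA
fit <- fitSfm(sfm.dat, k=2)

# new S3 methods from this revision
print(fit)                  # call, dimensions, loadings, R-squared, resid sd
summary(fit, se.type="HC")  # HC standard errors via coeftest/vcovHC
pred <- predict(fit)        # predictions from the underlying lm fits

# risk decomposition now dispatches on class "sfm"
Omega  <- fmCov(fit)             # model-implied covariance (fit$Omega)
sd.dc  <- fmSdDecomp(fit)$pcSd   # percentage contributions to Sd
var.dc <- fmVaRDecomp(fit)$cVaR  # component contributions to VaR
es.dc  <- fmEsDecomp(fit)$cES    # component contributions to ES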