[Returnanalytics-commits] r3153 - in pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm: R man vignettes
noreply at r-forge.r-project.org
Sun Sep 22 18:06:39 CEST 2013
Author: shubhanm
Date: 2013-09-22 18:06:39 +0200 (Sun, 22 Sep 2013)
New Revision: 3153
Added:
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ACFSTDEV.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ACFSTDEV.rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ConditionalDrawdown.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ConditionalDrawdown.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-EmaxDDGBM.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-EmaxDDGBM.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-GLMReturn.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-GLMReturn.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-GLMSmoothIndex.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-GLMSmoothIndex.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-LoSharpeRatio.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-LoSharpeRatio.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-Managers.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-Managers.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-NormCalmar-Sterling.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-NormCalmar-Sterling.rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-OWReturn.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-OWReturn.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-OkunevWhite.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-OkunevWhite.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ShaneAcarMaxLoss.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-ShaneAcarMaxLoss.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-UnSmoothReturnAnalysis.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid-UnSmoothReturnAnalysis.pdf
Removed:
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ACFSTDEV.rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ConditionalDrawdown.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/EmaxDDGBM.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMReturn.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMSmoothIndex.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/LoSharpeRatio.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid - EmaxDDGBM.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid - LoSharpeRatio.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid - UnSmooth Return Analysis.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -ACFSTDEV.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -Commodity Index Fund Analysis.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -ConditionalDrawdown.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -GLMReturn.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -GLMSmoothIndex.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -NormCalmar-Sterling.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -OkunevWhite.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/Non-iid -ShaneAcarMaxLoss.pdf
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/NormCalmar.rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/OWReturn.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/OkunevWhite.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ShaneAcarMaxLoss.Rnw
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/UnSmoothReturnAnalysis.Rnw
Modified:
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/glmi.R
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/lmi.R
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/glmi.Rd
pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/lmi.Rd
Log:
final touches -
Modified: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/glmi.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/glmi.R 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/glmi.R 2013-09-22 16:06:39 UTC (rev 3153)
@@ -4,22 +4,20 @@
#' @details
#' see \code{\link{glm}}.
#' @param formula
-#'an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted. The details of model specification are given under ‘Details’.
-#'
+#'an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted.
#'@param family
#' a description of the error distribution and link function to be used in the model. This can be a character string naming a family function, a family function or the result of a call to a family function. (See family for details of family functions.)
#'@param data
#'an optional data frame, list or environment (or object coercible by as.data.frame to a data frame) containing the variables in the model. If not found in data, the variables are taken from environment(formula), typically the environment from which lm is called.
#'
#'@param vcov HC-HAC covariance estimation
-#'@param weights
-#'an optional vector of weights to be used in the fitting process. Should be NULL or a numeric vector. If non-NULL, weighted least squares is used with weights weights (that is, minimizing sum; otherwise ordinary least squares is used. See also ‘Details’,
+#'@param weights an optional vector of weights to be used in the fitting process.
#'@param subset
#'an optional vector specifying a subset of observations to be used in the fitting process.
#'
#'
#'@param na.action
-#'a function which indicates what should happen when the data contain NAs. The default is set by the na.action setting of options, and is na.fail if that is unset. The ‘factory-fresh’ default is na.omit. Another possible value is NULL, no action. Value na.exclude can be useful.
+#'a function which indicates what should happen when the data contain NAs. Another possible value is NULL, no action. Value na.exclude can be useful.
#'
#'@param start
#'starting values for the parameters in the linear predictor.
@@ -51,6 +49,14 @@
#'additional arguments to be passed to the low level regression fitting functions (see below).
#' @author The original R implementation of glm was written by Simon Davies working for Ross Ihaka at the University of Auckland, but has since been extensively re-written by members of the R Core team.
#' The design was inspired by the S function of the same name described in Hastie & Pregibon (1992).
+#' @examples
+#' ## Dobson (1990) Page 93: Randomized Controlled Trial :
+#' counts <- c(18,17,15,20,10,20,25,13,12)
+#' outcome <- gl(3,1,9)
+#' treatment <- gl(3,3)
+#' print(d.AD <- data.frame(treatment, outcome, counts))
+#'glm.D93 <- glmi(counts ~ outcome + treatment, family = poisson())
+#'summary(glm.D93)
#' @keywords HC HAC covariance estimation regression fitting model
#' @rdname glmi
#' @export
Modified: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/lmi.R
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/lmi.R 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/lmi.R 2013-09-22 16:06:39 UTC (rev 3153)
@@ -4,7 +4,7 @@
#' @details
#' see \code{\link{lm}}.
#' @param formula
-#'an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted. The details of model specification are given under ‘Details’.
+#'an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted.
#'
#'
#'@param data
@@ -12,13 +12,13 @@
#'
#'@param vcov HC-HAC covariance estimation
#'@param weights
-#'an optional vector of weights to be used in the fitting process. Should be NULL or a numeric vector. If non-NULL, weighted least squares is used with weights weights (that is, minimizing sum; otherwise ordinary least squares is used. See also ‘Details’,
+#'an optional vector of weights to be used in the fitting process. Should be NULL or a numeric vector. If non-NULL, weighted least squares is used with weights weights (that is, minimizing sum(w*e^2)); otherwise ordinary least squares is used.
#'
#'
#'@param subset
#'an optional vector specifying a subset of observations to be used in the fitting process.
#'@param na.action
-#'a function which indicates what should happen when the data contain NAs. The default is set by the na.action setting of options, and is na.fail if that is unset. The ‘factory-fresh’ default is na.omit. Another possible value is NULL, no action. Value na.exclude can be useful.
+#'a function which indicates what should happen when the data contain NAs. The default is set by the na.action setting of options, and is na.fail if that is unset. Another possible value is NULL, no action. Value na.exclude can be useful.
#'
#'@param method
#'the method to be used; for fitting, currently only method = "qr" is supported; method = "model.frame" returns the model frame (the same as with model = TRUE, see below).
@@ -41,6 +41,14 @@
#' @author The original R implementation of glm was written by Simon Davies working for Ross Ihaka at the University of Auckland, but has since been extensively re-written by members of the R Core team.
#' The design was inspired by the S function of the same name described in Hastie & Pregibon (1992).
#' @keywords HC HAC covariance estimation regression fitting model
+#' @examples
+#'ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14)
+#'trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69)
+#'group <- gl(2, 10, 20, labels = c("Ctl","Trt"))
+#'weight <- c(ctl, trt)
+#'lm.D9 <- lmi(weight ~ group)
+#'lm.D90 <- lmi(weight ~ group - 1) # omitting intercept
+#'summary(lm.D90)
#' @rdname lmi
#' @export
lmi <- function (formula, data,vcov = NULL, subset, weights, na.action, method = "qr",
Modified: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/glmi.Rd
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/glmi.Rd 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/glmi.Rd 2013-09-22 16:06:39 UTC (rev 3153)
@@ -11,8 +11,7 @@
\arguments{
\item{formula}{an object of class "formula" (or one that
can be coerced to that class): a symbolic description of
- the model to be fitted. The details of model
- specification are given under ‘Details’.}
+ the model to be fitted.}
\item{family}{a description of the error distribution and
link function to be used in the model. This can be a
@@ -29,20 +28,14 @@
\item{vcov}{HC-HAC covariance estimation}
\item{weights}{an optional vector of weights to be used
- in the fitting process. Should be NULL or a numeric
- vector. If non-NULL, weighted least squares is used with
- weights weights (that is, minimizing sum; otherwise
- ordinary least squares is used. See also ‘Details’,}
+ in the fitting process.}
\item{subset}{an optional vector specifying a subset of
observations to be used in the fitting process.}
\item{na.action}{a function which indicates what should
- happen when the data contain NAs. The default is set by
- the na.action setting of options, and is na.fail if that
- is unset. The ‘factory-fresh’ default is na.omit.
- Another possible value is NULL, no action. Value
- na.exclude can be useful.}
+ happen when the data contain NAs. Another possible value
+ is NULL, no action. Value na.exclude can be useful.}
\item{start}{starting values for the parameters in the
linear predictor.}
@@ -95,6 +88,15 @@
\details{
see \code{\link{glm}}.
}
+\examples{
+## Dobson (1990) Page 93: Randomized Controlled Trial :
+counts <- c(18,17,15,20,10,20,25,13,12)
+outcome <- gl(3,1,9)
+treatment <- gl(3,3)
+print(d.AD <- data.frame(treatment, outcome, counts))
+glm.D93 <- glmi(counts ~ outcome + treatment, family = poisson())
+summary(glm.D93)
+}
\author{
The original R implementation of glm was written by Simon
Davies working for Ross Ihaka at the University of
Modified: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/lmi.Rd
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/lmi.Rd 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/man/lmi.Rd 2013-09-22 16:06:39 UTC (rev 3153)
@@ -10,8 +10,7 @@
\arguments{
\item{formula}{an object of class "formula" (or one that
can be coerced to that class): a symbolic description of
- the model to be fitted. The details of model
- specification are given under ‘Details’.}
+ the model to be fitted.}
\item{data}{an optional data frame, list or environment
(or object coercible by as.data.frame to a data frame)
@@ -25,7 +24,7 @@
in the fitting process. Should be NULL or a numeric
vector. If non-NULL, weighted least squares is used with
weights weights (that is, minimizing sum; otherwise
- ordinary least squares is used. See also ‘Details’,}
+ ordinary least squares is used.}
\item{subset}{an optional vector specifying a subset of
observations to be used in the fitting process.}
@@ -33,9 +32,8 @@
\item{na.action}{a function which indicates what should
happen when the data contain NAs. The default is set by
the na.action setting of options, and is na.fail if that
- is unset. The ‘factory-fresh’ default is na.omit.
- Another possible value is NULL, no action. Value
- na.exclude can be useful.}
+   is unset. Another possible value is NULL, no action.
+ Value na.exclude can be useful.}
\item{method}{the method to be used; for fitting,
currently only method = "qr" is supported; method =
@@ -83,6 +81,15 @@
\details{
see \code{\link{lm}}.
}
+\examples{
+ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14)
+trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69)
+group <- gl(2, 10, 20, labels = c("Ctl","Trt"))
+weight <- c(ctl, trt)
+lm.D9 <- lmi(weight ~ group)
+lm.D90 <- lmi(weight ~ group - 1) # omitting intercept
+summary(lm.D90)
+}
\author{
The original R implementation of glm was written by Simon
Davies working for Ross Ihaka at the University of
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ACFSTDEV.rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ACFSTDEV.rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ACFSTDEV.rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,90 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
-%\VignetteDepends{PerformanceAnalytics}
-%\VignetteKeywords{returns, performance, risk, benchmark, portfolio}
-%\VignettePackage{PerformanceAnalytics}
-
-%\documentclass[a4paper]{article}
-%\usepackage[noae]{Sweave}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage[top=3cm, bottom=3cm, left=2.5cm]{geometry}
-%\usepackage{graphicx}
-%\usepackage{graphicx, verbatim}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage{graphicx}
-
-\title{Autocorrelated Standard Deviation}
-\author{R Project for Statistical Computing}
-
-\begin{document}
-\SweaveOpts{concordance=TRUE}
-
-\maketitle
-
-
-\begin{abstract}
-The fact that many hedge fund returns exhibit extraordinary levels of serial correlation is now well known and generally accepted. Because hedge fund strategies have exceptionally high autocorrelations in reported returns, and this is taken as evidence of return smoothing, we highlight the effect autocorrelation has on volatility, which is obscured by the square-root-of-time rule used in the industry.
-\end{abstract}
-
-<<echo=FALSE >>=
-library(PerformanceAnalytics)
-data(edhec)
-@
-
-<<echo=FALSE>>=
-require(noniid.sm) #source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/ACStdDev.annualized.R')
-@
-
-\section{Methodology}
-Given a sample of historical returns \((R_1,R_2, . . .,R_T)\), the method assumes the fund manager smooths returns in the following manner, where $t$ is the unit time interval:
-
-%Let $X \sim N(0,1)$ and $Y \sim \textrm{Exponential}(\mu)$. Let
-%$Z = \sin(X)$. $\sqrt{X}$.
-
-%$\hat{\mu}$ = $\displaystyle\frac{22}{7}$
-%e^{2 \mu} = 1
-%\begin{equation}
-%\left(\sum_{t=1}^{T} R_t/T\right) = \hat{\mu} \\
-%\end{equation}
-\begin{equation}
- \sigma_{T} = \sqrt{T} \, \sigma_{t} \\
-\end{equation}
-
-
-\section{Usage}
-
-In this example we use the edhec database to compute the autocorrelation-adjusted volatility of hedge fund returns.
-
-<<echo=T,fig=T>>=
-library(PerformanceAnalytics)
-data(edhec)
-ACFVol = ACStdDev.annualized(edhec[,1:3])
-Vol = StdDev.annualized(edhec[,1:3])
-Vol
-ACFVol
-barplot(rbind(ACFVol,Vol), main="ACF and Original Volatility",
- xlab="Fund Type",ylab="Volatility (in %)", col=c("darkblue","red"), beside=TRUE)
- legend("topright", c("ACF","Original"), cex=0.6,
- bty="n", fill=c("darkblue","red"));
-@
-
-The above figure compares the autocorrelation-adjusted annualized volatility with the original annualized volatility for each fund. One can observe how accounting for serial correlation changes the characteristics of the returns as compared to the original.
-
-\end{document}
\ No newline at end of file
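The autocorrelation-adjusted annualization used in the vignette above can be sketched in a few lines of base R. This is an illustrative sketch only: the scaling factor below follows the standard variance formula for a sum of autocorrelated terms, not necessarily the exact ACStdDev.annualized implementation, and the AR(1) parameter is an arbitrary choice.

```r
# Compare naive square-root-of-time annualization with an
# autocorrelation-adjusted scaling (illustrative sketch).
set.seed(42)
r <- as.numeric(arima.sim(model = list(ar = 0.3), n = 120, sd = 0.02))

sigma_m <- sd(r)   # monthly volatility
T <- 12            # months per year

# Naive rule: sigma_T = sqrt(T) * sigma_t
sigma_naive <- sqrt(T) * sigma_m

# Adjusted: Var(R_1 + ... + R_T) = sigma^2 * (T + 2 * sum_{k=1}^{T-1} (T - k) * rho_k)
rho <- acf(r, lag.max = T - 1, plot = FALSE)$acf[-1]
sigma_adj <- sigma_m * sqrt(T + 2 * sum((T - 1:(T - 1)) * rho))

c(naive = sigma_naive, adjusted = sigma_adj)
```

With positively autocorrelated returns, the adjusted figure exceeds the naive one, which is the effect the vignette highlights.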
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ConditionalDrawdown.Rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ConditionalDrawdown.Rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/ConditionalDrawdown.Rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,85 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
-%\VignetteDepends{PerformanceAnalytics}
-%\VignetteKeywords{returns, performance, risk, benchmark, portfolio}
-%\VignettePackage{PerformanceAnalytics}
-
-%\documentclass[a4paper]{article}
-%\usepackage[noae]{Sweave}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage[top=3cm, bottom=3cm, left=2.5cm]{geometry}
-%\usepackage{graphicx}
-%\usepackage{graphicx, verbatim}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage{graphicx}
-
-\title{Chekhlov Conditional Drawdown at Risk}
-\author{R Project for Statistical Computing}
-
-\begin{document}
-\SweaveOpts{concordance=TRUE}
-
-\maketitle
-
-
-\begin{abstract}
-A new one-parameter family of risk measures called Conditional Drawdown (CDD) has
-been proposed. These measures of risk are functionals of the portfolio drawdown (underwater) curve considered in active portfolio management. For some value of $\hat{\alpha}$ the tolerance parameter, in the case of a single sample path, drawdown functional is defined as the mean of the worst (1 \(-\) $\hat{\alpha}$)100\% drawdowns. The CDD measure generalizes the notion of the drawdown functional to a multi-scenario case and can be considered as a generalization of deviation measure to a dynamic case. The CDD measure includes the Maximal Drawdown and Average Drawdown as its limiting cases.
-\end{abstract}
-
-<<echo=FALSE >>=
-library(PerformanceAnalytics)
-data(edhec)
-@
-
-<<echo=FALSE,eval=TRUE,results=verbatim >>=
-require(noniid.sm) #source("C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/CDrawdown.R")
-@
-
-\section{Background}
-
-The model is focused on the concept of a drawdown measure, which possesses all the properties of a deviation measure and generalizes deviation measures to a dynamic case. It covers risk profiling via the Mixed Conditional Drawdown (a generalization of CDD), optimization techniques for CDD computation (reduction to a linear programming (LP) problem), and portfolio optimization with a constraint on Mixed CDD.
-The model develops the concept of a drawdown measure by generalizing the notion
-of the CDD to the case of several sample paths for the portfolio's uncompounded rate
-of return.
-
-
-\section{Methodology}
-For a given sequence ${\xi_k}$, where ${\xi}$ is a time series unit drawdown vector, the CV@R is formally defined as:
-\begin{equation}
-CV@R_{\alpha}(\xi)=\frac{\pi_{\xi}(\zeta(\alpha))-\alpha}{1-\alpha}\zeta(\alpha) + \frac{1}{(1-\alpha)N} \sum_{\xi_k > \zeta(\alpha)} \xi_k
-\end{equation}
-
-Note that the first term on the right-hand side of the equation appears because of the inequality greater than or equal to $\hat{\alpha}$. If (1 \(-\) $\hat{\alpha}$) \* 100\% of the worst drawdowns can be counted precisely, the first term on the right-hand side of the equation disappears. The equation follows from the framework of the CVaR methodology.
-
-
-\section{Usage}
-
-In this example we use the edhec database to compute the conditional drawdown of hedge fund returns.
-
-<<>>=
-library(PerformanceAnalytics)
-data(edhec)
-CDrawdown(edhec)
-@
-
-
-
-\end{document}
\ No newline at end of file
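The Conditional Drawdown described above can be sketched in base R. This is an illustrative sketch: the package's CDrawdown may differ in details such as compounding and the treatment of the zeta(alpha) boundary term.

```r
# Conditional Drawdown (CDD): the mean of the worst (1 - alpha)*100%
# drawdowns of the uncompounded cumulative return path.
cdd <- function(returns, alpha = 0.90) {
  cumret   <- cumsum(returns)            # uncompounded cumulative returns
  drawdown <- cummax(cumret) - cumret    # underwater (drawdown) curve
  zeta     <- quantile(drawdown, alpha)  # drawdown threshold at level alpha
  mean(drawdown[drawdown >= zeta])       # average of the worst tail
}

set.seed(1)
r <- rnorm(240, mean = 0.005, sd = 0.03) # simulated monthly returns
cdd(r, alpha = 0.95)
```

Setting alpha near 1 approaches the Maximal Drawdown limiting case; alpha = 0 gives the Average Drawdown.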
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/EmaxDDGBM.Rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/EmaxDDGBM.Rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/EmaxDDGBM.Rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,102 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
-%\VignetteDepends{PerformanceAnalytics}
-%\VignetteKeywords{returns, performance, risk, benchmark, portfolio}
-%\VignettePackage{PerformanceAnalytics}
-
-%\documentclass[a4paper]{article}
-%\usepackage[noae]{Sweave}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage[top=3cm, bottom=3cm, left=2.5cm]{geometry}
-%\usepackage{graphicx}
-%\usepackage{graphicx, verbatim}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage{graphicx}
-
-\title{On the Maximum Drawdown of a Brownian Motion}
-\author{Shubhankit Mohan}
-
-\begin{document}
-\SweaveOpts{concordance=TRUE}
-
-\maketitle
-
-
-\begin{abstract}
-We study the maximum possible drawdown of an asset whose return series follows a Geometric Brownian Motion process.
-
-\end{abstract}
-
-
-<<echo=FALSE>>=
-require(noniid.sm) #source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/LoSharpe.R')
-@
-
-<<echo=FALSE >>=
-library(PerformanceAnalytics)
-data(edhec)
-data(managers)
-@
-\section{Background}
- If X(t) is a random process on [0, T], the maximum
- drawdown at time T, D(T), is defined by \deqn{D(T)
- = \sup_{0 \le s \le t \le T} [X(s) - X(t)]}
- Informally, this is the largest drop
- from a peak to a bottom. In this paper, we investigate
- the behavior of this statistic for a Brownian motion with
- drift. In particular, we give an infinite series
- representation of its distribution, and consider its
- expected value. When the drift is zero, we give an
- analytic expression for the expected value, and for
- non-zero drift, we give an infinite series
- representation. For all cases, we compute the limiting
- (\eqn{T \to \infty}) behavior, which can be
- logarithmic (\eqn{\mu > 0}), square root (\eqn{\mu = 0}),
- or linear (\eqn{\mu < 0}).
-
-
-
-<<echo=F,fig=T>>=
-source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/EmaxDDGBM.R')
-data(edhec)
-Lo.Sharpe = -100*ES(edhec,.99)
-Theoretical.Sharpe= EmaxDDGBM(edhec)
-barplot(as.matrix(rbind(Theoretical.Sharpe,Lo.Sharpe)), main="Expected Shortfall(.99) and Drawdown of a Brownian Motion Asset Process",
- xlab="Fund Type",ylab="Value", col=rich6equal[1:2], beside=TRUE)
- legend("topright", c("ES","EGBMDD"), cex=0.6,
- bty="2", fill=rich6equal[1:2]);
-@
-
-We can observe that the fund \textbf{Emerging Markets}, which has the largest drawdown and serial autocorrelation, also has the highest expected drawdown, which \emph{decreases} most significantly as compared to other funds.
-
-<<echo=F,fig=T>>=
-
-data(managers)
-Lo.Sharpe = -100*ES(managers[,1:6],.99)
-Theoretical.Sharpe= EmaxDDGBM(managers[,1:6])
-barplot(as.matrix(rbind(Theoretical.Sharpe,Lo.Sharpe)), main="Expected Shortfall(.99) and Drawdown of a Brownian Motion Asset Process",
- xlab="Fund Type",ylab="Value", col=rich6equal[1:2], beside=TRUE)
- legend("topright", c("ES","EGBMDD"), cex=0.6,
- bty="2", fill=rich6equal[1:2]);
-@
-
-We can see that the model correctly ranks the highest-drawdown fund managers; \textbf{HAM2} has the largest drawdown among all the funds.
-
-\end{document}
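The expected maximum drawdown discussed in the vignette above can be checked by simulation in base R; the drift, volatility, and horizon below are arbitrary illustrative values, not parameters from the package.

```r
# Monte Carlo estimate of the expected maximum drawdown of a Brownian
# motion with drift mu and volatility sigma over a horizon of T years.
set.seed(123)
max_dd <- function(mu, sigma, T = 1, n = 252) {
  dt <- T / n
  x  <- cumsum(rnorm(n, mean = mu * dt, sd = sigma * sqrt(dt)))
  max(cummax(x) - x)   # largest peak-to-trough drop along the path
}
mean(replicate(2000, max_dd(mu = 0.05, sigma = 0.20)))
```

Varying mu reproduces the qualitative regimes described above: the expected drawdown grows logarithmically in T for mu > 0, like sqrt(T) for mu = 0, and linearly for mu < 0.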
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMReturn.Rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMReturn.Rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMReturn.Rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,135 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
-%\VignetteDepends{PerformanceAnalytics}
-%\VignetteKeywords{returns, performance, risk, benchmark, portfolio}
-%\VignettePackage{PerformanceAnalytics}
-
-%\documentclass[a4paper]{article}
-%\usepackage[noae]{Sweave}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage[top=3cm, bottom=3cm, left=2.5cm]{geometry}
-%\usepackage{graphicx}
-%\usepackage{graphicx, verbatim}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage{graphicx}
-
-\title{Getmansky Lo Makarov Return Model}
-\author{R Project for Statistical Computing}
-
-\begin{document}
-\SweaveOpts{concordance=TRUE}
-
-\maketitle
-
-
-\begin{abstract}
-The returns to hedge funds and other alternative investments are often highly serially correlated. In this paper, we explore several sources of such serial correlation and show that the most likely explanation is illiquidity exposure and smoothed returns. We propose an econometric model of return smoothing and develop estimators for the smoothing profile as well as a smoothing-adjusted Sharpe ratio.\end{abstract}
-
-<<echo=FALSE >>=
-library(PerformanceAnalytics)
-data(edhec)
-@
-
-<<echo=FALSE,eval=TRUE,results=verbatim >>=
-require(noniid.sm) #source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/Return.GLM.R')
-require(noniid.sm) #source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/na.skip.R')
-@
-
-\section{Methodology}
-Given a sample of historical returns \((R_1,R_2, . . .,R_T)\), the method assumes the fund manager smooths returns in the following manner:
-
-To quantify the impact of all of these possible sources of serial correlation, denote by \(R_t\) the true economic return of a hedge fund in period t, and let \(R_t\) satisfy the following linear single-factor model:
-
-\begin{equation}
- R_t = \mu + \beta\delta_t + \xi_t
-\end{equation}
-
-where \(\xi_t \sim N(0,1)\)
-and Var[\(R_t\)] = \(\sigma^2\).
-
-True returns represent the flow of information that would determine the equilibrium value of the fund's securities in a frictionless market. However, true economic returns are not observed. Instead, \(R_t^0\) denotes the reported or observed return in period t; and let
-%$Z = \sin(X)$. $\sqrt{X}$.
-
-%$\hat{\mu}$ = $\displaystyle\frac{22}{7}$
-%e^{2 \mu} = 1
-%\begin{equation}
-%\left(\sum_{t=1}^{T} R_t/T\right) = \hat{\mu} \\
-%\end{equation}
-\begin{equation}
- R_t^0 = \theta _0R_{t} + \theta _1R_{t-1}+\theta _2R_{t-2} + \cdots + \theta _kR_{t-k}\\
-\end{equation}
-\begin{equation}
-\theta _j \in [0,1] \mbox{ where } j = 0,1, \cdots , k \\
-\end{equation}
-
-and
-%\left(\mu \right) = \sum_{t=1}^{T} \(Ri)/T\ \\
-\begin{equation}
-\theta _0 + \theta _1 + \theta _2 + \cdots + \theta _k = 1 \\
-\end{equation}
-
-which is a weighted average of the fund's true returns over the most recent k + 1
-periods, including the current period.
-\section{Smoothing Profile Estimates}
-
-Using the methods outlined above , the paper estimates the smoothing model
-using a maximum likelihood procedure, programmed in Matlab using the Optimization Toolbox and replicated in Stata using its MA(k) estimation routine. Using the time series analysis and computational finance ("tseries") library, we fit an ARMA model to a univariate time series by conditional least squares. For exact maximum likelihood estimation, arima0 from package stats can be used.
-
-\section{Usage}
-
-In this example we use the edhec database to compute true (unsmoothed) hedge fund returns.
-
-<<Graph10,echo=T,fig=T>>=
-library(PerformanceAnalytics)
-data(edhec)
-Returns = Return.GLM(edhec[,1])
-skewness(edhec[,1])
-skewness(Returns)
-# Right shift of the returns distribution for a negatively skewed distribution
-kurtosis(edhec[,1])
-kurtosis(Returns)
-# Reduction in "peakedness" around the mean
-layout(rbind(c(1, 2), c(3, 4)))
- chart.Histogram(Returns, main = "Plain", methods = NULL)
- chart.Histogram(Returns, main = "Density", breaks = 40,
- methods = c("add.density", "add.normal"))
- chart.Histogram(Returns, main = "Skew and Kurt",
- methods = c("add.centered", "add.rug"))
-chart.Histogram(Returns, main = "Risk Measures",
- methods = c("add.risk"))
-@
-
-The above figure shows the behaviour of the distribution tending to a normal IID distribution. For comparative purposes, one can observe the change in the characteristics of the returns as compared to the original.
-
-<<Graph1,echo=T,fig=T>>=
-library(PerformanceAnalytics)
-data(edhec)
-Returns = Return.GLM(edhec[,1])
-layout(rbind(c(1, 2), c(3, 4)))
- chart.Histogram(edhec[,1], main = "Plain", methods = NULL)
- chart.Histogram(edhec[,1], main = "Density", breaks = 40,
- methods = c("add.density", "add.normal"))
- chart.Histogram(edhec[,1], main = "Skew and Kurt",
- methods = c("add.centered", "add.rug"))
-chart.Histogram(edhec[,1], main = "Risk Measures",
- methods = c("add.risk"))
-@
-
-\end{document}
\ No newline at end of file
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMSmoothIndex.Rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMSmoothIndex.Rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/GLMSmoothIndex.Rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,107 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
-%\VignetteDepends{PerformanceAnalytics}
-%\VignetteKeywords{returns, performance, risk, benchmark, portfolio}
-%\VignettePackage{PerformanceAnalytics}
-
-%\documentclass[a4paper]{article}
-%\usepackage[noae]{Sweave}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage[top=3cm, bottom=3cm, left=2.5cm]{geometry}
-%\usepackage{graphicx}
-%\usepackage{graphicx, verbatim}
-%\usepackage{ucs}
-%\usepackage[utf8x]{inputenc}
-%\usepackage{amsmath, amsthm, latexsym}
-%\usepackage{graphicx}
-
-\title{GLM Smoothing Index}
-\author{R Project for Statistical Computing}
-
-\begin{document}
-\SweaveOpts{concordance=TRUE}
-
-\maketitle
-
-
-\begin{abstract}
-The returns to hedge funds and other alternative investments are often highly serially correlated. Getmansky, Lo and Makarov propose an econometric model of return smoothing and develop estimators for the smoothing profile. The magnitude of the impact is measured by the smoothing index, a measure of the concentration of weight in lagged terms.
-\end{abstract}
-
-<<echo=FALSE >>=
-library(PerformanceAnalytics)
-data(edhec)
-@
-
-<<echo=FALSE>>=
-require(noniid.sm) #source('C:/Users/shubhankit/Desktop/Again/pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/R/GLMSmoothIndex.R')
-@
-
-\section{Background}
-To quantify the impact of all of these possible sources of serial correlation, denote by \(R_t\) the true economic return of a hedge fund in period $t$, and let \(R_t\) satisfy the following linear single-factor model:
-
-\begin{equation}
- R_t = \mu + \beta \delta_t + \xi_t
-\end{equation}
-
-where $\xi_t \sim N(0,1)$ and $\mbox{Var}[R_t] = \sigma^2$.
-
-True returns represent the flow of information that would determine the equilibrium value of the fund's securities in a frictionless market. However, true economic returns are not observed. Instead, let \(R_t^0\) denote the reported or observed return in period $t$, and let
-%$Z = \sin(X)$. $\sqrt{X}$.
-
-%$\hat{\mu}$ = $\displaystyle\frac{22}{7}$
-%e^{2 \mu} = 1
-%\begin{equation}
-%\left(\sum_{t=1}^{T} R_t/T\right) = \hat{\mu} \\
-%\end{equation}
-\begin{equation}
- R_t^0 = \theta _0R_{t} + \theta _1R_{t-1}+\theta _2R_{t-2} + \cdots + \theta _kR_{t-k}\\
-\end{equation}
-\begin{equation}
-\theta_j \in [0,1], \quad \mbox{where } j = 0, 1, \cdots, k \\
-\end{equation}
-
-and
-%\left(\mu \right) = \sum_{t=1}^{T} \(Ri)/T\ \\
-\begin{equation}
-\theta_0 + \theta_1 + \theta_2 + \cdots + \theta_k = 1 \\
-\end{equation}
-
-which is a weighted average of the fund's true returns over the most recent k + 1
-periods, including the current period.
-
-\section{Smoothing Index}
-A useful summary statistic for measuring the concentration of weights is:
-\begin{equation}
-\xi = \sum_{j=0}^{k} \theta _j^2 \\
-\end{equation}
-
-This measure is well known in the industrial organization literature as the Herfindahl index, a measure of the concentration of firms in a given industry, where $\theta_j$ represents the market share of firm $j$. Because each $\theta_j$ is confined to the unit interval and the weights sum to one, $\xi$ is minimized when all the $\theta_j$'s are identical, which implies a value of $1/(k+1)$, and is maximized when one coefficient is 1 and the rest are 0. In the context of smoothed returns, a lower value of $\xi$ implies more smoothing, and the upper bound of 1 implies no smoothing; hence we shall refer to $\xi$ as a ``\textbf{smoothing index}''.
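For illustration (with hypothetical weights, not weights estimated from any data), the index and its bounds can be checked directly in R:

```r
# Sketch: the smoothing index xi = sum(theta_j^2) for hypothetical weights.
theta <- c(0.6, 0.3, 0.1)   # k + 1 = 3 weights summing to one
xi <- sum(theta^2)          # 0.46: some smoothing
sum(rep(1/3, 3)^2)          # equal weights give the minimum, 1/(k+1) = 1/3
sum(c(1, 0, 0)^2)           # a single unit weight gives the maximum, 1
```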
-
-\section{Usage}
-
-In this example we use the edhec database to compute the smoothing index for hedge fund returns.
-<<>>=
-library(PerformanceAnalytics)
-data(edhec)
-GLMSmoothIndex(edhec)
-@
-
-
-\end{document}
\ No newline at end of file
Deleted: pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/LoSharpeRatio.Rnw
===================================================================
--- pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/LoSharpeRatio.Rnw 2013-09-22 12:49:32 UTC (rev 3152)
+++ pkg/PerformanceAnalytics/sandbox/Shubhankit/noniid.sm/vignettes/LoSharpeRatio.Rnw 2013-09-22 16:06:39 UTC (rev 3153)
@@ -1,116 +0,0 @@
-%% no need for \DeclareGraphicsExtensions{.pdf,.eps}
-
-\documentclass[12pt,letterpaper,english]{article}
-\usepackage{times}
-\usepackage[T1]{fontenc}
-\IfFileExists{url.sty}{\usepackage{url}}
- {\newcommand{\url}{\texttt}}
-
-\usepackage{babel}
-%\usepackage{noweb}
-\usepackage{Rd}
-
-\usepackage{Sweave}
-\SweaveOpts{engine=R,eps=FALSE}
-%\VignetteIndexEntry{Performance Attribution from Bacon}
[TRUNCATED]
To get the complete diff run:
svnlook diff /svnroot/returnanalytics -r 3153
More information about the Returnanalytics-commits
mailing list