[GSoC-PortA] PortfolioAnalytics Testing

Ross Bennett rossbennett34 at gmail.com
Mon Dec 2 02:47:08 CET 2013


Will do. Thanks for the suggestion.

Ross
 On Dec 1, 2013 4:32 PM, "Joshua Ulrich" <josh.m.ulrich at gmail.com> wrote:

> While you're working on unit testing, also think about performance
> benchmarking.  I've wanted to write something like Wes McKinney's
> (Python pandas author) vbench package for R ever since I was able to
> make large performance improvements to quantstrat.  I'm not suggesting
> you work on building that functionality, but rather that you keep
> notes on which functions should be monitored for performance
> regressions.
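The kind of lightweight performance monitoring suggested above can be sketched with base R alone. Note this is only an illustration: `time_median` and `slow_candidate` are hypothetical names, and the baseline figure is made up, not measured from any release.

```r
# Minimal performance-regression check using only base R.
# `slow_candidate` stands in for a function worth monitoring
# (e.g. an optimizer); a real baseline would be recorded from
# a known-good release and stored alongside the tests.
time_median <- function(f, ..., times = 5L) {
  median(vapply(seq_len(times), function(i) {
    system.time(f(...))[["elapsed"]]
  }, numeric(1)))
}

slow_candidate <- function(n) sum(rnorm(n))  # placeholder workload

baseline <- 0.5  # seconds; illustrative, not a real measurement
elapsed  <- time_median(slow_candidate, 1e5)

# Flag a possible regression if the candidate is, say, 2x slower
if (elapsed > 2 * baseline) {
  warning("possible performance regression: ", round(elapsed, 3), "s")
}
```

Keeping a small script like this next to the test suite would make it easy to re-run the timings whenever a monitored function changes.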
> --
> Joshua Ulrich  |  about.me/joshuaulrich
> FOSS Trading  |  www.fosstrading.com
>
>
> On Sun, Dec 1, 2013 at 6:21 PM, Ross Bennett <rossbennett34 at gmail.com>
> wrote:
> > Sounds good, I'll get started with tests for the demos and add tests
> > for specific functions only as needed. I like the idea of 'integration
> > testing' using the demo files. It will also be a good opportunity to
> > look closer at the demos to make sure each demo is comprehensive and
> > serves a specific purpose.
> >
> > Thanks for the guidance!
> >
> > Ross
> >
> >
> >
> > On Sun, Dec 1, 2013 at 2:46 PM, Brian G. Peterson <brian at braverock.com>
> > wrote:
> >>
> >> As you say, both approaches have merit.
> >>
> >> I would suggest starting with the demos, as this will test a broad
> >> set of functionality, and provides 'integration testing' for the
> >> entire package. It also minimizes test code to write at first.
> >>
> >> I agree that testing specific functions and inputs may be 'more
> >> robust', but it will also take (significantly) more time to design a
> >> test plan and suite of separate tests.  I don't want to discourage
> >> that by any means, but I would want to get test coverage for the
> >> functionality in the demos first.
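The demo-first approach described above could be wired up with testthat roughly as follows. This is a sketch under assumptions: the `demo/` path and the `run_demo_quietly` helper are illustrative, not existing PortfolioAnalytics code, and `expect_error(..., regexp = NA)` is testthat's idiom for asserting that no error occurs.

```r
# Sketch: treat each demo script as an integration test.
# Assumes demos live in demo/ and each is a self-contained script.
library(testthat)

run_demo_quietly <- function(path) {
  # source() the demo into a fresh environment so that state
  # from one demo cannot leak into the next
  env <- new.env(parent = globalenv())
  sys.source(path, envir = env)
  env
}

test_that("demo scripts run without error", {
  demos <- list.files("demo", pattern = "\\.R$", full.names = TRUE)
  for (d in demos) {
    expect_error(run_demo_quietly(d), regexp = NA, info = d)
  }
})
```

Because each demo runs in its own environment, a failure points at one script rather than at accumulated state from earlier demos.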
> >>
> >> Regards,
> >>
> >> Brian
> >>
> >>
> >> On 12/01/2013 02:29 PM, Ross Bennett wrote:
> >>>
> >>> With the optimization I've been doing in constrained_objective and now
> >>> with the ROI changes, I am really starting to appreciate a more formal
> >>> testing process instead of running demos and seeing what fails. I've
> >>> started playing with the testthat package and really like it.
> >>>
> >>> I've had a look at quantstrat and plyr to see how the testthat package
> >>> is used. The main difference is that quantstrat uses the demo files
> >>> whereas plyr uses manually written test files. There are pros and cons
> >>> to both approaches... using the demos as the tests minimizes duplicate
> >>> code, but manual tests allow us to test specific things we may not
> >>> need/want in the demos.
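A hand-written test in the plyr style might look like the sketch below. The function under test, `normalize_weights`, is a hypothetical stand-in used only for illustration; in PortfolioAnalytics the target would be a real function such as constrained_objective.

```r
# Sketch of a manually written unit test with testthat.
library(testthat)

# Hypothetical helper: rescale weights so they sum to one
normalize_weights <- function(w) w / sum(w)

test_that("weights are normalized to sum to one", {
  w <- normalize_weights(c(0.2, 0.3, 0.1))
  expect_equal(sum(w), 1)
  expect_equal(length(w), 3)
})
```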
> >>>
> >>> I want to write some tests using testthat to include in
> >>> PortfolioAnalytics, but would like your guidance on what framework or
> >>> approach to use before I begin.
> >>
> >> _______________________________________________
> >> GSoC-PortA mailing list
> >> GSoC-PortA at lists.r-forge.r-project.org
> >> http://lists.r-forge.r-project.org/cgi-bin/mailman/listinfo/gsoc-porta