From rossbennett34 at gmail.com  Sun Dec  1 21:29:31 2013
From: rossbennett34 at gmail.com (Ross Bennett)
Date: Sun, 1 Dec 2013 12:29:31 -0800
Subject: [GSoC-PortA] PortfolioAnalytics Testing
Message-ID:

All,

With the optimization I've been doing in constrained_objective, and now with the ROI changes, I am really starting to appreciate a more formal testing process instead of running demos and seeing what fails. I've started playing with the testthat package and really like it.

I've had a look at quantstrat and plyr to see how the testthat package is used. The main difference is that quantstrat uses the demo files, whereas plyr uses manually written test files. There are pros and cons to both approaches: using the demos as the tests minimizes duplicate code, but manual tests allow us to test specific things we may not need/want in the demos.

I want to write some tests using testthat to include in PortfolioAnalytics, but would like your guidance on which framework or approach to use before I begin.

Ross

From brian at braverock.com  Sun Dec  1 21:46:46 2013
From: brian at braverock.com (Brian G. Peterson)
Date: Sun, 01 Dec 2013 14:46:46 -0600
Subject: [GSoC-PortA] PortfolioAnalytics Testing
In-Reply-To:
References:
Message-ID: <529BA036.30706@braverock.com>

As you say, both approaches have merit.

I would suggest starting with the demos, as this will test a broad set of functionality and provide 'integration testing' for the entire package. It also minimizes the test code to write at first.

I agree that testing specific functions and inputs may be 'more robust', but it will also take (significantly) more time to design a test plan and a suite of separate tests. I don't want to discourage that by any means, but I would want to get test coverage for the functionality in the demos first.
Regards,

Brian

On 12/01/2013 02:29 PM, Ross Bennett wrote:
> [snip]

From rossbennett34 at gmail.com  Mon Dec  2 01:21:50 2013
From: rossbennett34 at gmail.com (Ross Bennett)
Date: Sun, 1 Dec 2013 18:21:50 -0600
Subject: [GSoC-PortA] PortfolioAnalytics Testing
In-Reply-To: <529BA036.30706@braverock.com>
References: <529BA036.30706@braverock.com>
Message-ID:

Sounds good, I'll get started with tests for the demos and add tests for specific functions only as needed. I like the idea of 'integration testing' using the demo files. It will also be a good opportunity to look more closely at the demos to make sure each demo is comprehensive and serves a specific purpose.

Thanks for the guidance!

Ross

On Sun, Dec 1, 2013 at 2:46 PM, Brian G. Peterson wrote:
> [snip]
>
> _______________________________________________
> GSoC-PortA mailing list
> GSoC-PortA at lists.r-forge.r-project.org
> http://lists.r-forge.r-project.org/cgi-bin/mailman/listinfo/gsoc-porta

From josh.m.ulrich at gmail.com  Mon Dec  2 01:31:37 2013
From: josh.m.ulrich at gmail.com (Joshua Ulrich)
Date: Sun, 1 Dec 2013 18:31:37 -0600
Subject: [GSoC-PortA] PortfolioAnalytics Testing
In-Reply-To:
References: <529BA036.30706@braverock.com>
Message-ID:

While you're working on unit testing, also think about performance benchmarking. I've wanted to write something like Wes McKinney's (the Python pandas author) vbench package for R ever since I was able to make large performance improvements to quantstrat. I'm not suggesting you work on building that functionality, but rather that you keep notes on which functions should be monitored for performance regressions.
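[Editor's note: the suggestion above, keeping notes on which functions should be monitored for performance regressions, could be supported by even a very small timing log. The R sketch below is a hypothetical illustration only: it is not vbench, and `benchmark_and_log` is not part of any package named in this thread. The commented usage shows the kind of call one might monitor.]

```r
# Minimal sketch: time each named expression and append the results to a
# CSV log, so elapsed times can be compared across package revisions.
# Illustrative only; substitute whatever calls are worth monitoring.
benchmark_and_log <- function(exprs, logfile = "perf_log.csv") {
  rows <- lapply(names(exprs), function(nm) {
    elapsed <- system.time(eval(exprs[[nm]]))[["elapsed"]]
    data.frame(date = as.character(Sys.time()), fn = nm, elapsed = elapsed)
  })
  rows <- do.call(rbind, rows)
  write.table(rows, logfile, sep = ",", row.names = FALSE,
              append = file.exists(logfile),
              col.names = !file.exists(logfile))
  invisible(rows)
}

# Hypothetical usage: track an optimization call across revisions.
# benchmark_and_log(list(
#   roi_opt = quote(optimize.portfolio(returns, pspec,
#                                      optimize_method = "ROI"))
# ))
```

Comparing the newest entries in the log against earlier ones is a crude but serviceable stand-in for a vbench-style harness.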
--
Joshua Ulrich | about.me/joshuaulrich
FOSS Trading | www.fosstrading.com

On Sun, Dec 1, 2013 at 6:21 PM, Ross Bennett wrote:
> [snip]

From rossbennett34 at gmail.com  Mon Dec  2 02:47:08 2013
From: rossbennett34 at gmail.com (Ross Bennett)
Date: Sun, 1 Dec 2013 19:47:08 -0600
Subject: [GSoC-PortA] PortfolioAnalytics Testing
In-Reply-To:
References: <529BA036.30706@braverock.com>
Message-ID:

Will do. Thanks for the suggestion.

Ross

On Dec 1, 2013 4:32 PM, "Joshua Ulrich" wrote:
> [snip]
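[Editor's note: the demo-driven 'integration testing' approach agreed on in this thread, in the style the thread attributes to quantstrat, could look roughly like the testthat file sketched below. This is an illustration under the assumption that the package's demos are discoverable via demo(); it is not the test file that was actually committed to PortfolioAnalytics.]

```r
# Sketch of a testthat file that treats every demo as an integration test:
# each demo is executed, and any error it raises fails the suite.
library(testthat)

test_that("all PortfolioAnalytics demos run without error", {
  # demo(package = ...) returns a packageIQR object whose $results
  # matrix lists the available demos in its "Item" column.
  demos <- demo(package = "PortfolioAnalytics")$results[, "Item"]
  for (d in demos) {
    res <- try(demo(d, package = "PortfolioAnalytics",
                    character.only = TRUE, ask = FALSE, echo = FALSE),
               silent = TRUE)
    expect_false(inherits(res, "try-error"),
                 info = paste("demo failed:", d))
  }
})
```

Using try() plus expect_false() keeps the sketch version-agnostic; recent testthat releases also offer expect_no_error() for the same purpose.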