[datatable-help] Random segfaults

Matthew Dowle mdowle at mdowle.plus.com
Sat Dec 17 01:27:52 CET 2011


It'd be good to get to the bottom of it, in case it's not a pre-2.14.0
problem.  Try this:

apt-get install valgrind  (if not already installed)
R -d valgrind
require(data.table)
test.data.table()

When I do this it runs very slowly but eventually completes ok, with just
test 120 failing. Test 120 is a timing test, and valgrind slows everything
down, so that failure is expected. Ignore the valgrind messages for R
itself that appear before R's banner comes up.

If you get the same, then proceed to run your own code that crashes it.
Hopefully valgrind will print some messages at the point the corruption
occurs.
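For example, if the crash is reproducible from a single script, something
along these lines should report the first invalid access (the extra
valgrind option and the script name here are just illustrative):

R -d "valgrind --track-origins=yes" -f your_crashing_script.R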


On Fri, 2011-12-16 at 12:37 -0500, Chris Neff wrote:
> >
> > Only other thought ... your special internal build of R ... does it
> > increase R_len_t on 64bit to allow vectors longer than 2^31, by any
> > chance?  I've used R_len_t quite a bit in data.table to future-proof for
> > when that happens, but if you've already changed it in your build then
> > that would be helpful to know, since it's never been tested afaik when
> > R_len_t != int on 64bit.  I'm also assuming R_len_t is signed; if your R
> > has R_len_t as unsigned, I'd need to know that too.
> 
> The answer to this is no, we haven't touched that.
> 
> I'm happy to keep helping, but if you'd rather not worry about that
> stuff, we will apparently be upgrading to 2.14 in the next few months,
> and I can live with 1.7.1 until then.
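
For reference, a quick way to see how R_len_t is defined in any given
build is to look at the installed headers from within that R. Just a
sketch; it assumes Rinternals.h is present under R.home("include"),
i.e. that the R headers are installed:

hdr <- readLines(file.path(R.home("include"), "Rinternals.h"))
grep("R_len_t", hdr, value = TRUE)
# in a stock build this includes:  typedef int R_len_t;
# the typedef also shows whether it is signed or unsigned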



