[datatable-help] Random segfaults

Chris Neff caneff at gmail.com
Tue Dec 20 14:45:49 CET 2011


Emailed too soon. Crashing again.  I'll re-enable debugging and see what
comes up the next time it happens. It still isn't at all consistent as to
when exactly it crashes.  I just have a script that makes a data.table
that I know will eventually crash if I use the data.table enough.
I can't reproduce it on toy sets.

Regarding the valgrind request: I ran test.data.table() with valgrind
on and everything passed.  It timed out when trying to run my script,
though, and was far slower than normal in the process.


On 20 December 2011 07:56, Chris Neff <caneff at gmail.com> wrote:
> So far so good. The state before this latest patch was that I would run my
> script, then try to mess with the resultant data.table, and almost
> immediately it would segfault. 10 minutes of playing and no segfaults
> yet.  Will update if there is one.
>
> On 19 December 2011 20:12, Chris Neff <caneff at gmail.com> wrote:
>> I definitely do that somewhere in my code. I'll patch tomorrow and try.
>>
>> On 19 December 2011 19:03, Matthew Dowle <mdowle at mdowle.plus.com> wrote:
>>> Chris,
>>>
>>> Are you returning any character or list() columns in j when grouping? If
>>> so, Jim Holtman provided a reproducible example and a fix has just been
>>> committed. It gave the same errors / segfaults, and it affects R >= 2.14.0,
>>> not just R < 2.14.0. Could this also be the same problem Timothée Carayol
>>> mentioned?
>>> Fingers crossed ...
>>>
>>> Matthew
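A minimal sketch of the pattern Matthew asks about above (grouping while
returning a character or list() column in j); the table and column names
below are made up purely for illustration:

require(data.table)
DT <- data.table(grp = rep(c("a", "b"), each = 3), x = 1:6)
DT[, list(first_x = as.character(x[1])), by = "grp"]   # character column in j
DT[, list(all_x = list(x)), by = "grp"]                # list() column in j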
>>>
>>>
>>> On Sat, 2011-12-17 at 00:27 +0000, Matthew Dowle wrote:
>>>> It'd be good to get to the bottom of it in case it's not a pre-2.14.0
>>>> problem.  Try this:
>>>>
>>>> apt-get install valgrind  (if not already installed)
>>>> R -d valgrind
>>>> require(data.table)
>>>> test.data.table()
>>>>
>>>> When I do this it runs very slowly but eventually completes ok with just
>>>> test 120 failing. Test 120 is a timing test, which takes longer because
>>>> of valgrind mode, so that's ok. Ignore the valgrind messages for R
>>>> itself that happen before R's banner comes up.
>>>>
>>>> If you get the same, then proceed to run your tests that crash it.
>>>> Hopefully you'll get some messages at the point the corruption occurs.
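One way to run a crashing script under valgrind, assuming the code lives in
a standalone file (the script name here is hypothetical), is:

R -d valgrind -f my_crashing_script.R

valgrind should then print messages about invalid reads and writes as they
happen, which usually narrows down where the corruption starts.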
>>>>
>>>>
>>>> On Fri, 2011-12-16 at 12:37 -0500, Chris Neff wrote:
>>>> > >
>>>> > > Only other thought ... your special internal build of R ... does it
>>>> > > increase R_len_t on 64bit to allow vectors longer than 2^31, by any
>>>> > > chance?  I've used R_len_t quite a bit in data.table to future-proof for
>>>> > > when that happens, but if you've done it already in your build then that
>>>> > > would help to know, since it's never been tested, afaik, when R_len_t !=
>>>> > > int on 64bit.  I'm also assuming R_len_t is signed; if your R has
>>>> > > R_len_t as unsigned, I'd need to know that too.
>>>> >
>>>> > The answer to this is no; we haven't touched that.
>>>> >
>>>> > I'm happy to keep helping, but if you'd rather not worry about that
>>>> > stuff, we will apparently be upgrading to 2.14 in the next few months,
>>>> > and I can live with 1.7.1 until then.
>>>>
>>>>
>>>> _______________________________________________
>>>> datatable-help mailing list
>>>> datatable-help at lists.r-forge.r-project.org
>>>> https://lists.r-forge.r-project.org/cgi-bin/mailman/listinfo/datatable-help
>>>
>>>

