Perl 6 - the future is here, just unevenly distributed

IRC log for #moarvm, 2016-07-27


All times shown according to UTC.

Time Nick Message
01:11 FROGGS_ joined #moarvm
03:36 doublec joined #moarvm
05:38 domidumont joined #moarvm
05:44 domidumont joined #moarvm
06:35 zakharyas joined #moarvm
07:19 domidumont joined #moarvm
07:25 zakharyas joined #moarvm
07:47 zakharyas joined #moarvm
08:39 zakharyas joined #moarvm
08:55 brrt joined #moarvm
11:37 jnthn timotimo: MSVC gives this warning: c:\consulting\moarvm\src\spesh\manipulate.c(268) : warning C4716: 'MVM_spesh_manipulate_split_BB_at' : must return a value
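MSVC's C4716 fires when a function with a non-void return type can fall off the end without returning a value. Below is a minimal standalone illustration of that situation and the usual fix; it is not the actual MoarVM code behind MVM_spesh_manipulate_split_BB_at, just a toy example of the same warning.

#include <stdio.h>

/* Without the final return, MSVC reports:
 *   warning C4716: 'midpoint' : must return a value
 * because the n <= 0 path falls off the end of a non-void function. */
static int midpoint(int n) {
    if (n > 0)
        return n / 2;
    return 0;   /* returning on every path silences C4716 */
}

int main(void) {
    printf("%d\n", midpoint(10));
    return 0;
}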
13:07 [Coke] just tried to build moar (from nqp, from rakudo) using strawberry perl + git bash; got an "unknown OS 'msys'" error
13:07 [Coke] RT #125501
13:07 synopsebot6 Link:  https://rt.perl.org/rt3//Public/Bug/Display.html?id=125501
13:15 dalek MoarVM: 10ae72c | jnthn++ | src/6model/reprs/ConcBlockingQueue.c:
13:15 dalek MoarVM: Mark blocked around some more lock acquisitions.
13:15 dalek MoarVM: review: https://github.com/MoarVM/MoarVM/commit/10ae72c979
13:15 dalek MoarVM: 583bb52 | jnthn++ | src/gc/orchestrate.c:
13:15 dalek MoarVM: Make a panic message clearer in its origin.
13:15 dalek MoarVM: review: https://github.com/MoarVM/MoarVM/commit/583bb52620
13:15 dalek MoarVM: 5b64f88 | jnthn++ | src/6model/reprs/ConcBlockingQueue.c:
13:15 dalek MoarVM: Fix various cases of out-dated pointer reads.
13:15 dalek MoarVM:
13:15 dalek MoarVM: When we come back from being blocked, GC may have taken place. This
13:15 dalek MoarVM: means that objects may have moved, and any interior pointers we hold
13:15 dalek MoarVM: must be updated.
13:15 dalek MoarVM: review: https://github.com/MoarVM/MoarVM/commit/5b64f88882
13:18 jnthn Those get things at least somewhat more reliable. Turns out that - at least so far - the bugs I expected to uncover in GC orchestration weren't actually there.
13:24 timotimo oof
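The 5b64f88 commit message above describes the hazard: while a thread is marked GC-blocked (for example around a lock acquisition), a moving collector may run and relocate objects, so an interior pointer computed before blocking can be stale afterwards and must be re-read. Here is a standalone sketch of that re-read-after-unblocking pattern in plain C with the GC simulated; it is not MoarVM source, and the real code uses MoarVM's own blocking and rooting machinery rather than these toy types.

/* Standalone sketch (not MoarVM source): an interior pointer computed before
 * a GC-blocked wait can be stale afterwards, because a moving collector may
 * have relocated the object's storage; re-read it through the handle the GC
 * keeps up to date. */
#include <stdio.h>
#include <stdlib.h>

typedef struct { int elems; } Body;
typedef struct { Body *body; } Handle;   /* the GC updates this field */

/* Stand-in for a moving GC running while the thread is blocked. */
static Body *simulate_gc_move(Handle *h) {
    Body *old   = h->body;
    Body *moved = malloc(sizeof *moved);
    *moved = *old;
    old->elems = -1;        /* old storage gets recycled for something else */
    h->body = moved;
    return old;             /* returned only so the sketch can free it */
}

int main(void) {
    Handle h;
    h.body = malloc(sizeof *h.body);
    h.body->elems = 42;

    Body *cached = h.body;  /* interior pointer taken before blocking */

    /* ... mark thread blocked, wait on the lock, GC runs, unblock ... */
    Body *old = simulate_gc_move(&h);

    printf("cached (stale): %d\n", cached->elems);  /* -1 */
    printf("re-read:        %d\n", h.body->elems);  /* 42 */

    free(old);
    free(h.body);
    return 0;
}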
13:25 lizmat jnthn: is there a reason that ShapedArrayCommon role has a .plan method?  (whereas we removed that everywhere else)
13:26 jnthn lizmat: 'cus at the time it was added, it hadn't been removed everywhere else, perhaps?
13:26 jnthn And got left behind during the removal?
13:26 lizmat ok, then I'll remove it  :-)
13:26 timotimo .plan is really oldschool
13:26 timotimo i wouldn't mind a revival
13:26 lizmat the only thing it now does is die, saying it's illegal on shaped arrays
13:27 jnthn It's possible it got removed in nom while I was working in a branch :)
13:30 mst lizmat: maybe somebody subclassed it into Dildo and without a .plan it wouldn't've had finger support?
13:31 * lizmat looks outside and sees a crow land on the garage
13:40 dalek MoarVM: bf8fc05 | jnthn++ | src/6model/reprs/ (2 files):
13:40 dalek MoarVM: A couple of further missing rootings.
13:40 dalek MoarVM:
13:40 dalek MoarVM: From auditing the various places a thread GC-blocks itself.
13:40 dalek MoarVM: review: https://github.com/MoarVM/MoarVM/commit/bf8fc0578e
13:41 timotimo that could certainly improve stability
13:45 jnthn All of those help a little
13:45 jnthn I'm still seeing the odd issue
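The bf8fc05 commit above is about rooting: a moving collector only fixes up pointers whose locations it knows about, so a local pointer held across a GC-blocked section must be registered as a root first (MoarVM does this with its MVMROOT machinery). The following is a standalone sketch of the idea with a toy root table and a simulated collector, not MoarVM source.

/* Standalone sketch (not MoarVM source): a "missed rooting" leaves a local
 * pointer invisible to the collector, so it goes stale when the object moves. */
#include <stdio.h>
#include <stdlib.h>

typedef struct { int value; } Obj;

#define MAX_ROOTS 8
static Obj **roots[MAX_ROOTS];   /* addresses of slots the GC will fix up */
static int   num_roots = 0;

static void root(Obj **slot) { roots[num_roots++] = slot; }
static void unroot(void)     { num_roots--; }

/* Stand-in for a copying collector: moves each rooted object, recycles the
 * old storage, and fixes up the rooted slot to point at the new copy. */
static void simulate_gc(void) {
    for (int i = 0; i < num_roots; i++) {
        Obj *old   = *roots[i];
        Obj *moved = malloc(sizeof *moved);
        *moved = *old;
        old->value = -1;          /* old storage gets reused */
        *roots[i] = moved;
    }
}

int main(void) {
    Obj *obj = malloc(sizeof *obj);
    obj->value = 42;

    Obj *unrooted = obj;    /* a missed rooting: the GC cannot see this one */
    root(&obj);             /* rooted: the GC will update obj when it moves */

    simulate_gc();          /* "GC runs while the thread is blocked" */

    printf("rooted slot:   %d\n", obj->value);      /* 42 */
    printf("unrooted copy: %d\n", unrooted->value); /* -1: stale */

    unroot();
    free(obj);
    free(unrooted);
    return 0;
}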
13:49 jnthn Hmm. I wonder if things may go bad if a uv_sem_t moves around in memory over its lifetime :)
13:49 jnthn We hold locks and condvars at a LoI to prevent that
13:50 nwc10 "LoI" - Level of Indirection?
13:50 jnthn aye
13:54 jnthn yeah, this seems like an improvement
13:55 timotimo ... i don't understand this. i thought we already have the LoI in place to prevent trouble, so what change have you just made that seems like an improvement? %)
13:57 dalek MoarVM: 11e02fe | jnthn++ | src/6model/reprs/Semaphore. (2 files):
13:57 dalek MoarVM: Hold uv_sem_t at a level of indirection.
13:57 dalek MoarVM:
13:57 dalek MoarVM: So its location in memory doesn't change over its lifetime. Turns out
13:57 dalek MoarVM: that it doing so can make pthreads very sad indeed.
13:57 dalek MoarVM: review: https://github.com/MoarVM/MoarVM/commit/11e02fe5e1
13:58 jnthn That one :)
13:58 jnthn We didn't have it in place for semaphores
13:58 jnthn Probably 'cus we don't, afaik, actually use them inside of CORE.setting, just provide them for end users needing low level stuff
13:58 * jnthn checks that
13:59 jnthn oh no, we use it in ThreadPoolScheduler.pm
13:59 timotimo a-ha!
13:59 timotimo good, good
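The 11e02fe commit above captures the pattern: rather than embedding the uv_sem_t directly in the GC-managed (and therefore movable) object body, the body holds a pointer to a separately malloc'd uv_sem_t, so the semaphore's address stays fixed for its whole lifetime, which is what the pthreads-backed libuv primitive needs. A sketch of the shape of that change follows; the struct and function names are illustrative rather than the actual Semaphore REPR code, though the uv_sem_* calls are the real libuv API.

/* Illustrative sketch (not the actual MoarVM Semaphore REPR code) of the
 * "level of indirection" fix: the GC-managed body holds only a pointer, and
 * the uv_sem_t itself lives in a plain malloc'd block that never moves.
 * Typical build: cc sem_sketch.c -luv */
#include <stdio.h>
#include <stdlib.h>
#include <uv.h>

typedef struct {
    /* Before the fix (problematic with a moving GC):  uv_sem_t sem;  */
    uv_sem_t *sem;   /* after the fix: stable storage behind a pointer */
} SemaphoreBody;

static void semaphore_init(SemaphoreBody *body, unsigned int permits) {
    body->sem = malloc(sizeof *body->sem);
    if (uv_sem_init(body->sem, permits) != 0) {
        fprintf(stderr, "uv_sem_init failed\n");
        exit(1);
    }
}

static void semaphore_destroy(SemaphoreBody *body) {
    uv_sem_destroy(body->sem);
    free(body->sem);
}

int main(void) {
    SemaphoreBody body;
    semaphore_init(&body, 1);

    uv_sem_wait(body.sem);   /* acquire */
    puts("acquired");
    uv_sem_post(body.sem);   /* release */

    semaphore_destroy(&body);
    return 0;
}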
14:01 zakharyas joined #moarvm
14:11 * jnthn is running, 1000 times, the thing that once could barely get it right 10 times in a row
14:21 jnthn Worked out fine :)
14:32 timotimo yays!
14:34 * jnthn has moved on to investigating a spectest that sometimes hangs, that sadly was not magically fixed by this
14:34 timotimo that just means it's still useful :)
14:35 FROGGS joined #moarvm
14:36 jnthn Oh, it'll let me close an RT and the non-semaphore fixes will have also improved overall stability
14:37 timotimo \o/
15:08 jnthn Does anyone know in GDB how you can thread apply all [2 commands here]?
16:30 zakharyas joined #moarvm
17:37 domidumont joined #moarvm
18:11 japhb jnthn: SO MANY PLUSSES for today's fixes.  I'd been writing single-threaded Perl 6 code for quite a while because I was just hitting too many odd failures; if your comment from 4 hours ago of two orders of magnitude threading reliability improvements is representative, I'm about to have a very good day.  :-)
18:12 mst joined #moarvm
18:49 nwc10 jnthn: this still deadlocks: http://paste.scsys.co.uk/528614
18:50 nwc10 no, wait, *that* one is new
18:50 nwc10 I think that t/spec/S17-supply/syntax.t was the one that sometimes hung
18:51 nwc10 it now hangs like this: http://paste.scsys.co.uk/528615
18:52 nwc10 t/spec/S17-supply/wait.t now hangs in the same way as the other two: http://paste.scsys.co.uk/528616
18:53 nwc10 and t/spec/S17-supply/act.t is doing something strange with a single thread and a lot of sleeping
18:55 nwc10 and slowly mmap'ing in more memory, but I can't work out why from strace
18:56 nwc10 http://paste.scsys.co.uk/528618
18:57 nwc10 oooh, many tests failed
18:58 nwc10 all in t/spec/S17-supply/ I think
18:58 nwc10 ASAN makes no comment
19:03 nwc10 let's try a complete clean build again...
19:04 geekosaur it's entirely possible that fixing this uncovered other issues
19:04 nwc10 that was roughly my assumption
19:05 nwc10 but I'm a bit surprised both that I see it and jnthn didn't, and that I see so much while ASAN remains silent
19:39 nwc10 repeatable.
19:46 jnthn What on earth...
19:46 * jnthn had clean spectest :/
19:47 * jnthn does a clean build of what's in nom
19:55 jnthn Moar master / NQP 255d2e9d1f1 / Rakudo 948aee0f405 just gave me a clean spectest on Linux in my VM, TEST_JOBS = 12, 4 cores available to the VM
19:57 nwc10 this is 24 cores
19:57 jnthn Building with NQP and Rakudo at HEAD now
19:57 jnthn Though NQP had no changes except to the js backend
19:58 jnthn Feels like a big discrepancy to just be "bad luck"
19:58 jnthn Or "good luck" in my case
19:58 jnthn Not that I'm especially lucky, in that I seem to be the only person around here who can debug difficult concurrency problems. :P
19:59 nwc10 a non-ASAN build then run under valgrind doesn't give any insight
20:00 nwc10 (same failures, still no barfage)
20:01 jnthn Guess I can try it on Windows too
20:01 jnthn ooh
20:01 jnthn I just got an S17-supply failure
20:01 jnthn Two
20:02 nwc10 \o/ and /o\
20:02 jnthn 3
20:02 jnthn More
20:03 jnthn So, interesting. Same MoarVM version in both runs. Same NQP code in both runs.
20:04 jnthn That ASAN doesn't barf is at least mildly suggestive "not MoarVM" too
20:05 nwc10 yes, agree. Looks like the bad logic might be further up the stack
20:06 jnthn It doesn't need load to fail
20:06 jnthn elems.t is totally repeatable for example
20:06 jnthn Slightly complicating this is that I think when I put my Supply patch in I did a pull --rebase
20:16 jnthn But it was clean even with that, it turns out. Looks like a commit after my supply changes in nom bust it
20:16 nwc10 ah, OK, oops
20:16 nwc10 so I'm mentioning this in the wrong channel
20:16 jnthn Yeah, though here was a good guess.
20:17 jnthn I mean, given what I was working on today, it was overwhelmingly likely just from the list of failed tests that it was going to be something I'd done. :)
20:18 jnthn On the other hand, everything I've worked on today was about things that were hard to reproduce :)
20:19 jnthn So when the failures were so easily reproducible I had a hunch it was going to be something else. :)
