IRC log for #marpa, 2015-08-03

All times shown according to UTC.

Time Nick Message
00:23 purmou hi all
00:24 purmou thinking about bit vector implementation of earley set
00:25 Aria oooh, I'm listening!
00:26 purmou still wrapping my head around it -- should there be a position indicating each state of each rule?
00:27 purmou or do we just care about a rule appearing at all?
00:27 purmou say you have a grammar with 5 rules...each earley set is essentially a bit vector of 5 bits, right? the first rule is indicated at vector[0], last one is indicated at vector[4]
00:28 purmou if a rule already appears in an earley set, then vector[ rule # ] will already be set
00:28 Aria An item though is a rule and origin and position within the rule.
00:29 purmou right, which is why I arrived at the notion of computing all the permutations of all the rules.
00:29 Aria Not just a rule. Though binarizing them lets position within the rule be reduced to a bit, and origin likewise.
00:29 Aria Permutations, eh?
00:30 purmou i.e. a production with 4 symbols in its expansion has 5 permutations...one with the dot at the beginning, one advanced 1 position, etc
00:31 purmou I feel like that idea is fruitless, though...we also have to define unique origins, which, when parsing a tooooon of tokens, can make the permutations of just a single rule way too large a number
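An Earley item pairs a rule with a dot position and an origin set, so a single bit per rule cannot represent a whole item. A minimal sketch in Python, with a purely illustrative toy grammar (not anyone's actual code):

    # An Earley item is a (rule, dot, origin) triple, not just a rule number.
    from collections import namedtuple

    # Illustrative toy grammar: rule index -> (lhs, rhs symbols)
    RULES = {
        0: ("S",  ("NP", "VP")),
        1: ("NP", ("Det", "N")),
        2: ("VP", ("V", "NP")),
    }

    Item = namedtuple("Item", ["rule", "dot", "origin"])

    earley_set = set()
    earley_set.add(Item(rule=1, dot=0, origin=3))  # NP -> . Det N, started at set 3
    earley_set.add(Item(rule=1, dot=1, origin=2))  # NP -> Det . N, started at set 2

    # A 3-bit vector indexed by rule number would collapse these two distinct
    # items into a single bit, losing both the dot and the origin.
    assert len(earley_set) == 2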
00:34 purmou i took a look at your code, Aria, and noticed you have bit vectors representing all the predictable rules for a given rule. am I understanding that right?
00:44 purmou hmmmm...because predictions always originate from the current origin.
00:47 Aria Yep.
00:47 Aria Predictions are a special case.
00:48 purmou the next question I need to explore is: is it possible for the same rule to appear in the same set with a different position for the dot?
00:52 purmou oh, wait. all predictions will have the dot at position 0.
00:53 purmou and this never changes...even if the first symbol in a predicted rule is a terminal, the result of that is only seen in the next set.
01:03 Aria Yeah. But the other entries in an earley set? Yes. Totally possible for the same rule to appear twice, with different positions.
01:32 purmou so the bit vectors only optimize the predictions
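This is the property the prediction bit vectors exploit: every predicted item has its dot at position 0 and its origin at the current set, so the predictions of a set are fully determined by which rules appear, and the prediction closure of each nonterminal can be precomputed from the grammar alone. A hedged sketch in Python (illustrative grammar and names, not Aria's actual code):

    # Precompute, for each nonterminal, the bit vector (as an int) of rules
    # transitively predicted when that nonterminal is expected.
    RULES = [
        ("S",  ("NP", "VP")),   # rule 0 -> bit 0
        ("NP", ("Det", "N")),   # rule 1 -> bit 1
        ("VP", ("V", "NP")),    # rule 2 -> bit 2
    ]
    NONTERMINALS = {lhs for lhs, _ in RULES}

    def prediction_bits(start_symbol):
        """Bit vector of all rules predicted, transitively, from start_symbol."""
        bits = 0
        work = [start_symbol]
        seen = set()
        while work:
            sym = work.pop()
            if sym in seen or sym not in NONTERMINALS:
                continue
            seen.add(sym)
            for i, (lhs, rhs) in enumerate(RULES):
                if lhs == sym:
                    bits |= 1 << i
                    if rhs:                  # also predict the symbol after the dot
                        work.append(rhs[0])
        return bits

    # Precompute once; at parse time the predictions for a set are just an OR
    # of these masks, since every predicted item has dot 0 and origin = here.
    PREDICT = {nt: prediction_bits(nt) for nt in NONTERMINALS}
    assert PREDICT["S"] == 0b011   # predicting S predicts rules 0 and 1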
01:33 purmou are there other unrelated optimizations that can be done elsewhere in the algo?
01:34 Aria That is a good question!
01:35 purmou likely the subject of many, many research projects. haha
01:37 purmou also, my parser works just fine on empty rules without any additional work done. why all the worry about building a nullable set to handle that?
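For context (a general note on Earley implementations, not a claim about purmou's code): the usual concern is that a nullable nonterminal can be predicted and completed inside the same Earley set, so a strict single-pass completer can miss items whose dot sits just before it. Implementations that keep re-scanning a set until it stops growing handle this implicitly, while the Aycock-Horspool fix advances the dot over nullable symbols at prediction time, which requires the nullable set to be precomputed. A small sketch of computing that set by fixpoint, with an illustrative grammar:

    # A nonterminal is nullable if some rule for it has a right-hand side
    # consisting entirely of nullable symbols (an empty RHS counts trivially).
    RULES = [
        ("S", ("A", "B")),
        ("A", ()),             # A -> (empty)
        ("A", ("a",)),
        ("B", ("A", "A")),
    ]

    def nullable_set(rules):
        nullable = set()
        changed = True
        while changed:
            changed = False
            for lhs, rhs in rules:
                if lhs not in nullable and all(sym in nullable for sym in rhs):
                    nullable.add(lhs)
                    changed = True
        return nullable

    assert nullable_set(RULES) == {"S", "A", "B"}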
01:52 ronsavage Re: http://irclog.perlgeek.de/marpa/2015-08-02#i_10992028. I've announced MarpaX::Languages::Lua::Parser on the Lua mailing list.
01:54 jeffreykegler joined #marpa
01:55 jeffreykegler ronsavage: Thanks!
04:31 ronsavage joined #marpa
04:41 ceridwen joined #marpa
06:59 jeffreykegler joined #marpa
07:01 ronsavage joined #marpa
07:02 pczarn joined #marpa
07:03 jeffreykegler joined #marpa
08:09 mvuets joined #marpa
09:25 rns generalized approach to pretty printing -- http://www.semanticdesigns.com/Products/DMS/DMSPrettyPrinters.html
09:27 rns the author defines prettyprinting according to a box model, which allows AST visitors to pretty-print by "building and composing" vertical, horizontal, and indented boxes.
09:33 rns e.g. for a '{' statements '}' AST node, an AST visitor would return V('{', I(ast.children), '}') -- V, I, and H are shortcuts for vertical, indented, and horizontal boxes
09:35 rns referring to this discussion: http://irclog.perlgeek.de/marpa/2015-07-31#i_10986869
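A hedged sketch of the box idea in Python (not the DMS implementation; here a box is just a list of lines):

    INDENT = "    "

    def V(*parts):
        """Vertical box: stack the sub-boxes' lines top to bottom."""
        lines = []
        for part in parts:
            lines.extend(part if isinstance(part, list) else [str(part)])
        return lines

    def I(part):
        """Indented box: shift a box one indentation level to the right."""
        lines = part if isinstance(part, list) else [str(part)]
        return [INDENT + line for line in lines]

    def H(*parts):
        """Horizontal box: join single-line boxes on one line."""
        return [" ".join(part[0] if isinstance(part, list) else str(part)
                         for part in parts)]

    # A visitor for a '{' statements '}' node could then return:
    body = V(H("x", "=", "1;"), H("y", "=", "2;"))
    print("\n".join(V("{", I(body), "}")))
    # {
    #     x = 1;
    #     y = 2;
    # }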
10:29 koo9 joined #marpa
12:33 koo9 joined #marpa
16:27 koo9 joined #marpa
17:03 jeffreykegler joined #marpa
17:18 pczarn joined #marpa
17:28 pczarn linguistics researchers tried to develop SPPFs in 1992: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.57.3400
17:47 koo9 joined #marpa
17:59 lwa joined #marpa
18:01 koo9 joined #marpa
18:37 koo9 joined #marpa
18:53 * mvuets blinks
18:54 mvuets at first i thought this link was posted in the #tokipona channel
18:54 jeffreykegler wrong window?
18:54 koo9 joined #marpa
19:05 koo9 joined #marpa
19:49 jeffreykegler By the way, here's the next stop on Maxim's Toki Pona / Marpa world tour: http://act.perl-workshop.ch/spw2015/talk/6355
19:50 ceridwen Most of the early papers about parse forests came out of the computational linguistics community.
19:52 ceridwen At the time, there was very little interest in generalized parsing in computer science/software engineering; everyone was convinced that LALR was the way to go.
19:54 jeffreykegler Indeed. It got to the point where you could take a full semester course on parsing at a good school, and come out of it having never even heard of Earley's algorithm.
19:55 jeffreykegler Nowadays, I'm not sure courses on parsing are offered anywhere at any level, and they are even dropping courses on compilers.
19:56 pczarn IIRC there's a course on compilers at my university
20:00 pczarn the professor at least understands Earley's algorithm in general
20:01 mvuets joined #marpa
20:03 mvuets jeffreykegler: no, not wrong window. i just happen to have #marpa and #tokipona opened next to each other. and i thought i was in #tokipona when i saw pczarn's link (-: sorry for noise
20:04 mvuets world tour - haha, thank you (-: by the way, i decided i'd drop some toki pona slides in favor of dedicating more time to Marpa and parsing
20:05 purmou joined #marpa
20:10 pczarn mvuets: Good luck! are you going to explain Markov chains?
20:47 purmou the undergrad compilers course at my university just uses CUP
20:47 purmou so you don't even really learn about LALR but end up generating one
20:48 purmou I think the prerequisite course for that, though, covers parsing. I don't think we learn about Earley there though.
20:48 purmou I have yet to take both of those courses, though
21:22 mvuets pczarn: unlikely, i should update the talk abstract. my last talk proved it's already very saturated even without Markov chains. however i still want to write a rubbish generator and pipe it into the parser to see what happens
22:24 ronsavage joined #marpa
23:03 flaviu joined #marpa
23:04 koo9 joined #marpa
