IRC log for #marpa, 2014-02-07


All times are shown in UTC.

Time Nick Message
00:03 jeffreykegler jdurand: re http://irclog.perlgeek.de/marpa/2014-02-06#i_8244315 according to the docs, they may "contain anything acceptable to Perl", and what is meant is "acceptable to whichever Perl you are running with"
00:03 jeffreykegler Had I it to do over, I would shrink from tying the DSL to Perl that intimately, but what's done is done.
00:05 jeffreykegler Bottom line: character classes should allow anything allowed by your current Perl in a character class.
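
A minimal sketch of that point, with an invented grammar and input: the character class in the last rule is handed straight to the running Perl, so Perl-specific contents such as the Unicode property \p{Alpha} are available inside it.

    #!/usr/bin/env perl
    # Sketch: SLIF character classes pass through to the running Perl,
    # so \p{Alpha} and similar constructs are usable.  The grammar and
    # input are invented for illustration.
    use strict;
    use warnings;
    use Marpa::R2;

    my $dsl = q{
        :default ::= action => ::first
        greeting ::= word
        word ~ [\p{Alpha}_]+
    };

    my $grammar = Marpa::R2::Scanless::G->new( { source => \$dsl } );
    my $slr     = Marpa::R2::Scanless::R->new( { grammar => $grammar } );
    $slr->read( \'hello_world' );
    print ${ $slr->value() }, "\n";    # expected to print "hello_world"
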
00:05 ronsavage joined #marpa
00:06 ronsavage jdurand: I could not see any reference to Jeffrey, Kegler, Marpa, parse or lex on that page. Am I missing something?
00:11 jeffreykegler ronsavage: re http://irclog.perlgeek.de/marpa/2014-02-06#i_8244315 -- that mystified me, until I noticed someone cited my "Perl and Undecidability" result, which (to my surprise) seems to have become accepted as a major finding in programming languages.
00:13 jeffreykegler I say "to my surprise" because, first, it's about a specific programming language, and findings specific to, say, COBOL, are usually regarded as of transitory interest.
00:14 jeffreykegler And, second, because it was not without precedent.  But I was the first to formalize what others had only conjectured; I initially caught a lot of heat for it, and now I get the credit.
00:16 ronsavage All is explained!
01:05 jeffreykegler joined #marpa
04:40 shadowpaste "lucs" at 70.81.138.180 pasted "Semantics package misunderstanding?" (34 lines) at http://scsys.co.uk:8002/301065
05:02 ronsavage joined #marpa
05:13 jeffreykegler joined #marpa
05:14 jeffreykegler lucs: re http://irclog.perlgeek.de/marpa/2014-02-07#i_8245469 -- fixed in commit cc8060093f54f9a8eecedbdfa02b77260f22855f
05:15 lucs jeffreykegler: Oh, so my expectation was correct?
05:15 jeffreykegler That's in the phase 3 branch.  It should make it into a developer's release in a week or so.
05:15 jeffreykegler lucs: yes your expectation was correct and the fix produces the behavior you expected
05:15 lucs Okay, thanks.
17:06 jeffreykegler joined #marpa
17:20 lucs jeffreykegler: Good morning Jeffrey. I have another problem snippet:
17:20 shadowpaste "lucs" at 70.81.138.180 pasted "Location problem?" (52 lines) at http://scsys.co.uk:8002/301418
17:20 jeffreykegler lucs: Good morning, if that's what it is your time. :-)
17:21 lucs Well, it's 12:21 here (Montreal)
17:22 jeffreykegler 3000 miles away is enough to make you the closest IRC regular to me, geographically.
17:22 lucs :)
17:22 lucs I know where ronsavage is (approximately), but not where jdurand is...
17:23 jeffreykegler jdurand lives in the French Alps, I believe, which I picture as very beautiful.
17:23 lucs Aha. Surely.
17:24 jeffreykegler Btw, I'm looking at your issue :-)
17:24 lucs Okay :)
17:28 jeffreykegler lucs: re http://irclog.perlgeek.de/marpa/2014-02-07#i_8248217 -- you are getting G1 locations, not the input stream locations which you were expecting.
17:28 lucs Okay, makes sense.
17:28 lucs Is that what you meant location() to return?
17:29 jeffreykegler I'm trying to remember if the docs say anything about what "location" means by default.
17:29 * jeffreykegler is going back and checking
17:29 lucs It's mentioned at the start of the Semantics POD.
17:29 * jeffreykegler does not always remember what he intended and relies on the documentation, test suite, etc.
17:30 jeffreykegler lucs: yes, but I was looking for language like, "Unless otherwise specified, in these docs the word location 'means' ...."
17:31 lucs Ah. I don't remember anything like that (but I wouldn't trust my memory too much...)
17:32 lucs Supposing that location() returning 0-1 and 1-2 (like they currently do) is correct, is there a way to get to the 0-3 and 4-7 values?
17:37 jeffreykegler OK. Dealing with a question earlier in the stack, in the Scanless::R POD, I say, "In this document, the word "location" refers to location in the input stream unless otherwise specified."
17:37 jeffreykegler So I really should fix the Semantics POD to be consistent.
17:37 lucs Maybe you could add that to the Vocabulary POD.
17:38 jeffreykegler The Vocabulary POD has a special intention -- I pictured lots of potential users who knew conventional parsing terminology, so it just deals with updating that terminology, not with terms which are new to Marpa.
17:39 lucs Ah.
17:39 jeffreykegler Btw, in retrospect, I am not sure the Vocabulary POD's intended audience ever actually existed.
17:39 lucs Well, for what it's worth, I read it :)
17:40 jeffreykegler Next question: I believe Context::location is behaving as intended.
17:40 jeffreykegler So it's a documentation bug.
17:40 lucs Okay.
17:41 lucs And can we get from 0-1, 1-2 to 0-3, 4-7 somehow?
17:42 jeffreykegler Next question: at the SLIF level there is g1_location_to_span(): https://metacpan.org/pod/release/JKEGL/Marpa-R2-2.078000/pod/Scanless/R.pod#g1_location_to_span
17:43 jeffreykegler That's because a G1 location is actually a span of the input stream.
17:43 jeffreykegler Things get complicated in the mapping due to, for example, discards.
17:44 jeffreykegler At the semantics level, discards mean you are skipping forward in the input stream.
17:45 jeffreykegler For that matter, you are allowed to just plain old jump around in the input stream, which means in the general case, the question from a semantics callback, "Where am I?" can have a very complicated answer.
17:46 lucs Okie doke. This is one heck of a tool :)
17:47 jeffreykegler In practice, an application may want to use $slr->pos as it goes along, and do its own mapping from G1 location to input stream location.
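
A concrete sketch of that kind of mapping follows. The grammar, the action name show_span, the $SLR and $INPUT globals, and especially the span arithmetic are assumptions made for illustration, not something taken from this log, so verify the details against the Scanless::R and Semantics PODs. The idea is to call Marpa::R2::Context::location() inside a semantics callback and then use g1_location_to_span() to translate the G1 locations into an input stream span.

    #!/usr/bin/env perl
    # Sketch only.  Grammar, action name, and the exact span arithmetic
    # are illustrative assumptions; check the Scanless::R docs before
    # relying on them.
    use strict;
    use warnings;
    use Marpa::R2;

    my $dsl = q{
        :default ::= action => show_span
        :discard ~ ws
        pair ::= word word
        word ~ [\w]+
        ws   ~ [\s]+
    };

    our $INPUT  = 'foo barbaz';
    my $grammar = Marpa::R2::Scanless::G->new( { source => \$dsl } );
    my $slr     = Marpa::R2::Scanless::R->new(
        { grammar => $grammar, semantics_package => 'My_Actions' } );

    our $SLR = $slr;    # make the recognizer visible to the action

    $slr->read( \$INPUT );
    $slr->value();

    package My_Actions;

    sub show_span {
        my ( $g1_start, $g1_end ) = Marpa::R2::Context::location();

        # Assumption: the rule instance covers the lexemes read at G1
        # locations $g1_start + 1 .. $g1_end, and g1_location_to_span()
        # maps each of those G1 locations to a (start, length) span of
        # the input stream.
        my ( $first_start, $first_length ) =
            $main::SLR->g1_location_to_span( $g1_start + 1 );
        my ( $last_start, $last_length ) =
            $main::SLR->g1_location_to_span($g1_end);

        my $start  = $first_start;
        my $length = $last_start + $last_length - $start;
        printf qq{G1 span (%d, %d) -> input span (%d, %d): "%s"\n},
            $g1_start, $g1_end, $start, $length,
            substr( $main::INPUT, $start, $length );
        return;
    }
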
17:48 lucs Okay; I'll keep reading the docs (I only skimmed the R and G PODs so far).
17:49 jeffreykegler lucs: they are written as reference docs, and I expect very few will read them from beginning to end before starting to code.  And I'd much rather folks look at a tutorial or an inspiring example and charge right in.
17:50 lucs It turns out I jumped in a bit too fast (and got really mixed up), so that's why I'm reading the docs in so much detail -- and loving it!
17:51 jeffreykegler lucs: Great.  I'm convinced that, if you are going to make a mistake, better to jump in too fast than too slow.
17:52 jeffreykegler Particularly because, to modern programmers, BNF can seem intimidating, when it's really as easy to work with as regular expressions (or easier).
17:53 jeffreykegler BNF is certainly *a lot* easier than Perl regexes in their full glory.
17:54 jeffreykegler Speaking of not reading things from beginning to end, I've never read perlre from beginning to end, despite sitting down with the intent to do so a few times.
18:00 jeffreykegler Here's the Semantics POD fix: "Marpa::R2::Context::location() returns the start and end G1 locations of the current rule instance. Note that these are G1 locations, not input stream locations."
18:01 lucs I'm not sure how clear "G1 location" is for the reader at this point...
18:04 jeffreykegler lucs: Good point.  I'll add a link back to the Scanless::R doc.
18:04 lucs Aha. Yes, there it is! :)
18:13 lucs jeffreykegler: Say, if I point my PERL5LIB (for example) to the cpan/lib directory of my local clone of the github Marpa repo, will I get a workable version of the current state of the code?
18:15 jeffreykegler lucs: Probably not is my guess.  It needs the matching Libmarpa to be built and found.
18:15 lucs Oh, right.
18:15 jeffreykegler lucs: Are you trying to get a specific commit?
18:16 lucs Nah, just wondering, no big deal.
18:18 jeffreykegler In other news, I've reached a "thinking point" in Phase 3, and I often use the inter-phases for these -- it makes sense to hack other stuff while I get my ideas in order.
18:19 jeffreykegler So I am thinking of closing out Phase 3 and starting a brief Interphase 3/4.
18:19 lucs Is this a good time to submit typo fix pull requests?
18:20 jeffreykegler lucs: Yes.  Although it is *always* a good time to submit typo fix pull requests.  I may not act on them right away, but they can sit on the queue until I am ready.
18:20 lucs Okay.
18:20 jeffreykegler One disappointment re Phase 3:
18:21 jeffreykegler I had hoped to end it with some real, visible value-added for the user, and that is not the case at this point.
18:22 lucs What was your work focused on for Phase 3? (and what will Phase 4 be about?)
18:22 jeffreykegler Phase 4 will be about what I didn't get done in Phase 3. :-)
18:22 lucs :)
18:24 jeffreykegler Phase 3 was about allowing the user to intervene in the parse more aggressively -- to undo things already seen by the parser.
18:24 lucs Advanced stuff, eh, not what beginners will care about I suppose.
18:24 jeffreykegler One effect of this (and the main target use case) is allowing LATM to be more reliably efficient.
18:25 jeffreykegler LATM = longest acceptable token matching
18:25 jeffreykegler Currently, Marpa does longest token matching (LTM) by default, which is traditional.
18:26 lucs Not sure what the difference is :/
18:26 jeffreykegler I've added the "forgiving" option, which means that tokens which ordinarily would end the parse by being rejected as unacceptable by the G1 grammar ...
18:27 jeffreykegler are "forgiven" -- they no longer end the parse, and Marpa looks for any other, acceptable, tokens which might exist.
18:27 lucs Shorter ones?
18:27 jeffreykegler lucs: There was a discussion of this on the mailing list.
18:27 jeffreykegler lucs: Yes, shorter ones
18:27 lucs Oh, I'll look it up.
18:27 lucs Aha.
18:28 jeffreykegler LTM is how traditional lexers work, because they aren't smart enough at the lexing level to know which tokens will be acceptable and which not.
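
A sketch of the difference, using an invented grammar. One caveat: the lexeme adverb spelled latm below is how this feature appears in released Marpa::R2; at the time of this log the option under discussion was still the experimental "forgiving" setting, so treat the exact syntax as an assumption.

    #!/usr/bin/env perl
    # Sketch of LTM vs. LATM with an invented grammar.  At position 0
    # both A ('a') and AAA ('aaa') match the input 'aaaa'.  Plain LTM
    # insists on the longest match, AAA, which the G1 grammar rejects
    # (it expects A first), so the parse fails.  With LATM the lexer
    # considers only tokens acceptable to G1, reads A and then AAA,
    # and the parse succeeds.
    use strict;
    use warnings;
    use Marpa::R2;

    my $dsl = q{
        lexeme default = latm => 1
        S   ::= A AAA
        A   ~ 'a'
        AAA ~ 'aaa'
    };

    my $grammar = Marpa::R2::Scanless::G->new( { source => \$dsl } );
    my $slr     = Marpa::R2::Scanless::R->new( { grammar => $grammar } );
    $slr->read( \'aaaa' );
    print defined $slr->value() ? "parse ok\n" : "no parse\n";
    # Dropping the 'lexeme default = latm => 1' line should make read()
    # fail with a token rejection at position 0.
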
18:29 jeffreykegler AFK for a few minutes
18:44 jeffreykegler1 joined #marpa
18:45 lucs I just sent a pull request, not sure if it's done correctly though (I don't have much experience with github).
18:54 jeffreykegler1 lucs: re http://irclog.perlgeek.de/marpa/2014-02-07#i_8248802 -- looks good
18:56 lucs Okay, thanks.
19:00 lucs jeffreykegler1: That pull request, is it an all-or-nothing thing, or can you easily select which of the commits you want to accept/reject?
19:13 jeffreykegler1 lucs: sort of both.  There's a very easy one-click all-or-nothing, but selection is also straightforward.
19:14 lucs Ah, good -- most of those are real typos, but there are some you may want to set aside.
19:41 jeffreykegler joined #marpa
19:52 jeffreykegler joined #marpa
23:28 jeffreykegler joined #marpa
23:30 jeffreykegler I've just uploaded Marpa-R2 2.079_014 which unless I hear otherwise will end Phase 3.  I'll move on to an interphase, and give the Github issues list some attention, before resuming with Phase 4.
23:30 jeffreykegler CPANtesters looks good, but as always your test results are much appreciated.  Thanks!
