IRC log for #gluster-dev, 2016-07-18


All times shown according to UTC.

Time Nick Message
01:30 hchiramm joined #gluster-dev
01:47 ilbot3 joined #gluster-dev
01:47 Topic for #gluster-dev is now Gluster Development Channel - http://gluster.org | For general chat go to #gluster | Patches - http://review.gluster.org/ | Channel Logs - https://botbot.me/freenode/gluster-dev/ & http://irclog.perlgeek.de/gluster-dev/
02:44 julim joined #gluster-dev
02:59 penguinRaider_ joined #gluster-dev
03:33 magrawal joined #gluster-dev
03:42 atinm joined #gluster-dev
03:50 kshlm joined #gluster-dev
03:53 kaushal_ joined #gluster-dev
03:57 itisravi joined #gluster-dev
03:59 poornimag joined #gluster-dev
04:06 sanoj joined #gluster-dev
04:12 kdhananjay joined #gluster-dev
04:14 gem joined #gluster-dev
04:16 araj_ joined #gluster-dev
04:24 nbalacha joined #gluster-dev
04:24 itisravi joined #gluster-dev
04:27 shubhendu joined #gluster-dev
04:28 ashiq joined #gluster-dev
04:42 jiffin joined #gluster-dev
04:49 aspandey joined #gluster-dev
04:49 kshlm joined #gluster-dev
04:55 kotreshhr joined #gluster-dev
05:02 penguinRaider joined #gluster-dev
05:03 csaba joined #gluster-dev
05:08 lkoranda joined #gluster-dev
05:12 prasanth joined #gluster-dev
05:13 nishanth joined #gluster-dev
05:18 Muthu_ joined #gluster-dev
05:22 mchangir joined #gluster-dev
05:23 Manikandan joined #gluster-dev
05:27 penguinRaider joined #gluster-dev
05:27 hgowtham joined #gluster-dev
05:28 ndarshan joined #gluster-dev
05:28 ppai joined #gluster-dev
05:29 Bhaskarakiran joined #gluster-dev
05:33 Manikandan joined #gluster-dev
05:33 hchiramm joined #gluster-dev
05:35 ramky joined #gluster-dev
05:35 itisravi joined #gluster-dev
05:41 nigelb We're short of netbsd machines again because we're out of disk space.
05:44 nigelb ndevos++ and kkeithley++ for jumping in during the weekend.
05:44 glusterbot nigelb: ndevos's karma is now 287
05:44 glusterbot nigelb: kkeithley's karma is now 134
05:45 aravindavk joined #gluster-dev
05:46 kshlm joined #gluster-dev
05:50 sakshi joined #gluster-dev
05:51 kaushal_ joined #gluster-dev
05:52 aspandey joined #gluster-dev
06:01 msvbhat joined #gluster-dev
06:02 karthik_ joined #gluster-dev
06:02 ashiq joined #gluster-dev
06:04 devyani7_ joined #gluster-dev
06:07 kaushal_ joined #gluster-dev
06:13 pkalever joined #gluster-dev
06:17 atalur joined #gluster-dev
06:27 mchangir joined #gluster-dev
06:29 itisravi joined #gluster-dev
06:35 msvbhat joined #gluster-dev
06:42 itisravi joined #gluster-dev
06:43 aspandey joined #gluster-dev
06:44 kdhananjay joined #gluster-dev
06:44 pur_ joined #gluster-dev
06:47 pranithk1 joined #gluster-dev
06:47 atalur joined #gluster-dev
06:48 csaba joined #gluster-dev
06:52 nishanth joined #gluster-dev
06:54 lkoranda joined #gluster-dev
07:00 mchangir joined #gluster-dev
07:00 ndarshan joined #gluster-dev
07:07 kshlm joined #gluster-dev
07:16 Saravanakmr joined #gluster-dev
07:18 rafi joined #gluster-dev
07:24 kdhananjay joined #gluster-dev
07:25 aspandey_ joined #gluster-dev
07:25 gem joined #gluster-dev
07:25 itisravi joined #gluster-dev
07:28 mchangir joined #gluster-dev
07:30 ndevos nigelb: I'd like a smoke test that checks the correctness of the commit message, preventing cherry-picks from keeping the original (un-indented) tags
07:31 ndevos nigelb: for example, see the difference between http://review.gluster.org/#/c/13863/1 and http://review.gluster.org/#/c/13863/2
07:31 itisravi_ joined #gluster-dev
07:31 ndevos nigelb: what would be the right place to request such a check? gluster-infra, BZ or a github issue?
07:32 nigelb ndevos: BZ
07:32 nigelb If you want to write said test, send a pull request to the scripts folder.
07:32 ndevos nigelb: under the tests component, or rather project-infrastructure?
07:32 nigelb tests
07:32 nigelb Actually
07:32 nigelb can we make cherry-picks keep the original Change-Id?
07:33 ndevos yes, the same Change-Id is best, it makes it easier to find backports of a change
07:33 nigelb It makes it way easier to gatekeep them as well.
07:34 ndevos it would be even better if there were an option to disable the [cherry-pick] button in the web UI, many use it incorrectly
07:35 ndevos http://gluster.readthedocs.io/en/latest/Developer-guide/Backport-Guidelines/ contains the steps, but it is most often ignored
07:35 nigelb ndevos: How about we fail smoke tests for jobs that don't follow it?
07:35 ndevos nigelb: yes, that was my idea :)
07:36 nigelb Oh excellent
07:36 nigelb so this is exactly what I wanted to propose to you after my Bangalore trip.
07:36 nigelb you and gluster-devel
07:37 ndevos ah, good, we only really need to check for ^Reviewed-on:, ^Smoke:, ^Centos-regression: or ^NetBSD-regression:
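
A minimal sketch of the check discussed above, assuming the smoke job runs inside the checked-out repository and turns a non-zero exit into a failed vote; the script layout, the git invocation and the tag list (taken from ndevos's message) are illustrative only, not the job that was eventually written:

    #!/usr/bin/env python
    # Sketch of a commit-message check for cherry-picked patches: the result
    # tags that Gerrit/Jenkins added to the *original* review (Reviewed-on,
    # Smoke, Centos-regression, NetBSD-regression) must not start a line in
    # the backported commit message.
    import re
    import subprocess
    import sys

    FORBIDDEN = re.compile(
        r'^(Reviewed-on|Smoke|Centos-regression|NetBSD-regression):',
        re.MULTILINE)

    def offending_tags(message):
        """Return the tags that appear un-indented in the commit message."""
        return FORBIDDEN.findall(message)

    if __name__ == '__main__':
        # Assumes the job runs with HEAD checked out at the change under test.
        msg = subprocess.check_output(
            ['git', 'log', '-1', '--format=%B', 'HEAD']).decode('utf-8')
        tags = offending_tags(msg)
        if tags:
            print('Commit message keeps tags from the original change: %s'
                  % ', '.join(tags))
            print('Please follow the Backport Guidelines and resubmit.')
            sys.exit(1)  # non-zero exit lets the smoke job vote -1
        sys.exit(0)
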
07:37 atalur joined #gluster-dev
07:38 nigelb Yep. I'd planned this already :D
07:38 ndevos do you happen to have a bug reported for that?
07:38 nigelb Not yet. I wanted to talk about it on the devel list before doing anything.
07:38 nigelb So file away and assign to me.
07:44 rraja joined #gluster-dev
07:45 penguinRaider joined #gluster-dev
07:51 ndevos nigelb: https://bugzilla.redhat.com/show_bug.cgi?id=1357421 is all yours now :)
07:51 glusterbot Bug 1357421: unspecified, unspecified, ---, nigelb, ASSIGNED , Fail smoke tests if cherry-picked bugs contain the old git-tags
07:52 Apeksha joined #gluster-dev
07:53 nigelb Thank you :)
07:53 nigelb I'll punt it over to you for review when I'm done :D
08:08 ndevos atinm: btw, there is no need to set Verified+1 or manually "recheck smoke" when a commit message is updated ;-)
08:14 mchangir joined #gluster-dev
08:15 pranithk1 joined #gluster-dev
08:22 pranithk1 joined #gluster-dev
08:25 pranithk1 nigelb: On http://review.gluster.org/#/c/14816, smoke succeeded but the +1 didn't come. Did we make any changes to this functionality?
08:26 rastar joined #gluster-dev
08:28 pranithk1 nigelb: I retriggered smoke
08:28 nigelb pranithk1: Nope. Nothing to affect the voting.
08:28 nigelb I disabled the test-compare job this morning, but I see that your job already has a success from it.
08:28 nigelb So it should have gotten a +1
08:29 pranithk1 nigelb: yeah, that is why I asked... I retriggered smoke for now
08:29 pranithk1 nigelb: let's see
08:29 nigelb If it fails, I'll just delete the test-compare job.
08:34 itisravi joined #gluster-dev
08:36 aspandey joined #gluster-dev
08:45 kotreshhr joined #gluster-dev
08:46 kdhananjay joined #gluster-dev
08:46 atinm ndevos, noted, I didn't know that
08:46 atinm ndevos, thanks!
08:47 mchangir joined #gluster-dev
08:47 karthik_ joined #gluster-dev
08:54 atalur joined #gluster-dev
08:56 Guest63682 joined #gluster-dev
08:59 Bhaskarakiran joined #gluster-dev
09:05 rastar joined #gluster-dev
09:05 Apeksha joined #gluster-dev
09:15 poornimag joined #gluster-dev
09:19 nbalacha joined #gluster-dev
09:20 nigelb pranithk1: That seems to have worked.
09:21 hgowtham joined #gluster-dev
09:24 devyani7_ joined #gluster-dev
09:25 araj_ joined #gluster-dev
09:26 rastar joined #gluster-dev
09:30 kdhananjay joined #gluster-dev
09:33 penguinRaider joined #gluster-dev
09:34 misc nigelb: no objection to rebooting the 2 unused servers in the RH DC? (i.e. the 2 new ones, not the gerrit/jenkins ones)
09:36 itisravi_ joined #gluster-dev
09:50 kdhananjay joined #gluster-dev
09:50 nigelb misc: no, please go ahead.
09:50 misc I am also writing the doc on it
09:50 misc well, once I finish fixing ntp on the cage
09:57 nigelb misc: do we have hard disk now?
10:00 misc nigelb: we always had harddrive
10:00 misc the problem was that I needed to verify partitions
10:00 nigelb ah.
10:00 misc where does the "we have no harddrive" idea come from?
10:00 penguinRaider joined #gluster-dev
10:01 nigelb Mistyping in my hurry.
10:01 nigelb But we have all partitions now?
10:01 misc I am trying to connect to the vnc interface via a SOCKS proxy
10:02 misc and Oracle is trying to foil my plan by providing me crappy Java :)
10:02 misc so nope, not yet
10:03 nigelb Emmanuel has given me more pointers to finding space on netbsd.
10:03 nigelb So I'm going to dig into that.
10:04 misc yep, did see that
10:04 misc good luck :)
10:05 nigelb misc: I think you fixed the issue I had with ssh, btw!
10:05 nigelb Thank you :)
10:05 hgowtham joined #gluster-dev
10:05 misc I did nothing but push back, but ok :)
10:06 misc (or maybe I did, I am not sure :/)
10:07 nbalacha joined #gluster-dev
10:08 nigelb The biggest trouble was not being able to run ansible/fabric on the machines.
10:08 nigelb which was important to figure out which machines had space and which didn't.
10:08 nigelb eventually, I resorted to checking via jenkins + manual.
10:08 misc yeah, but that's something that should have worked fine :/
10:09 misc now, I suspect ansible do not work well when disk is full
10:10 bfoster joined #gluster-dev
10:14 sakshi joined #gluster-dev
10:20 aspandey joined #gluster-dev
10:21 aravindavk pranithk1, kotreshhr last mail in "GFID to Path" conversation https://www.gluster.org/pipermail/gluster-devel/2016-January/047739.html
10:21 pranithk1 aravindavk: thanks!
10:21 nigelb misc: I take that back. I still can't automate my way into the netbsd machines :(
10:35 msvbhat joined #gluster-dev
10:35 kaushal_ joined #gluster-dev
10:37 araj_ joined #gluster-dev
10:38 misc nigelb: well, the netbsd machines are outside freeipa
10:38 misc nigelb: what are you trying?
10:38 misc (also outside of ansible, in fact, so maybe that's the issue)
10:39 misc and I have to go offline for maintainance
10:39 nigelb ah
10:40 nigelb I bet it's because I'm connecting as root.
10:40 nigelb err, as jenkins rather than root
10:40 atinm joined #gluster-dev
10:41 pranithk1 joined #gluster-dev
10:41 kshlm joined #gluster-dev
10:41 penguinRaider joined #gluster-dev
10:58 itisravi joined #gluster-dev
11:02 ppai_ joined #gluster-dev
11:03 ashiq joined #gluster-dev
11:07 mchangir joined #gluster-dev
11:15 julim joined #gluster-dev
11:25 penguinRaider joined #gluster-dev
11:33 kkeithley what did I do this weekend?
11:39 jiffin kkeithley: what did u do this weekend?
11:39 jiffin only kkeithley knows the answer
11:39 kkeithley nigelb gave me karma for doing something this weekend. I was wondering what it was.
11:40 nigelb kkeithley: netbsd machines :)
11:40 kkeithley I do know the other things I did this weekend.  (Pretty boring stuff. Working on my house.)
11:41 kkeithley ndevos: did you say you ran into the issue with python compiling .../hooks/S57glusterfind-delete-post.py when you built RPMs in CBS?
11:42 kaushal_ joined #gluster-dev
11:45 ppai_ joined #gluster-dev
11:45 karthik_ joined #gluster-dev
11:47 ndevos kkeithley: no, but I hit it on fedora (24?) at one point
11:48 ira joined #gluster-dev
11:48 kkeithley oh, okay. I had thought you mentioned it when you were building the latest packages for the Storage SIG.   hmmm.
11:58 rastar joined #gluster-dev
11:59 poornimag joined #gluster-dev
12:07 gem joined #gluster-dev
12:18 Guest62141 http://serverfault.com/questions/621919/glusterfs-replication-over-odd-numbers-of-nodes anyone please
12:21 atinm Guest62141, you have given three bricks instead of two, that's the problem
12:22 Guest62141 thanks atinm
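
The rule behind that answer is that the brick count must be a multiple of the replica count, so three bricks cannot be used with "replica 2". A small illustrative check, with made-up volume and brick names, that only builds the CLI arguments rather than invoking gluster:

    #!/usr/bin/env python
    # Illustrates the point above: with "replica 2" the number of bricks must
    # be a multiple of 2, so handing it three bricks is rejected up front.
    # Volume and brick names are made up for the example.
    def volume_create_args(volume, replica, bricks):
        """Build 'gluster volume create' arguments, or raise on a count mismatch."""
        if len(bricks) % replica != 0:
            raise ValueError('replica %d needs a multiple of %d bricks, got %d'
                             % (replica, replica, len(bricks)))
        return (['gluster', 'volume', 'create', volume, 'replica', str(replica)]
                + list(bricks))

    if __name__ == '__main__':
        bricks = ['node1:/data/brick', 'node2:/data/brick', 'node3:/data/brick']
        try:
            print(' '.join(volume_create_args('gv0', 2, bricks)))
        except ValueError as err:
            print('rejected: %s' % err)  # what happens with 3 bricks, replica 2
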
12:28 mchangir joined #gluster-dev
12:33 ira joined #gluster-dev
12:43 kshlm joined #gluster-dev
12:47 mchangir joined #gluster-dev
12:53 ashiq joined #gluster-dev
12:58 rastar joined #gluster-dev
13:00 Manikandan joined #gluster-dev
13:08 nbalacha joined #gluster-dev
13:15 post-factum http://review.gluster.org/#/c/13658/ passed all regression tests, needs code review, please
13:20 gem joined #gluster-dev
13:23 overclk nigelb: ping, bz #1357545
13:33 kotreshhr left #gluster-dev
13:37 gem joined #gluster-dev
13:43 julim joined #gluster-dev
13:43 lpabon joined #gluster-dev
13:45 nigelb overclk: is it blocking work at the moment?
13:46 nigelb Because I'm really on the verge of removing all the netbsd machines and bringing up new ones instead with a better template.
13:46 overclk nigelb: not much -- at ease..
13:46 nigelb In that case, I can set it in one place and have it synced everywhere.
13:46 overclk nigelb: no worries
13:46 nigelb I'll make a call by tomorrow on how I'll do it and reply on the bug.
13:47 hagarth joined #gluster-dev
13:59 ilbot3 joined #gluster-dev
13:59 Topic for #gluster-dev is now Gluster Development Channel - http://gluster.org | For general chat go to #gluster | Patches - http://review.gluster.org/ | Channel Logs - https://botbot.me/freenode/gluster-dev/ & http://irclog.perlgeek.de/gluster-dev/
14:08 pranithk1 joined #gluster-dev
14:14 hagarth pranithk1: ping, does the shard root have a consistent gfid?
14:14 pranithk1 hagarth: yes
14:14 hagarth pranithk1: ok
14:16 aravindavk joined #gluster-dev
14:16 hagarth pranithk1: thanks, noticed SHARD_ROOT_GFID now
14:17 pranithk1 hagarth: :-)
14:28 pkalever left #gluster-dev
14:28 mchangir joined #gluster-dev
14:46 penguinRaider joined #gluster-dev
14:51 kshlm joined #gluster-dev
15:06 wushudoin joined #gluster-dev
15:07 Manikandan joined #gluster-dev
15:09 jiffin joined #gluster-dev
15:19 shaunm joined #gluster-dev
15:21 hagarth joined #gluster-dev
15:29 Apeksha joined #gluster-dev
15:34 nbalacha joined #gluster-dev
15:35 julim joined #gluster-dev
15:39 msvbhat joined #gluster-dev
15:39 pkalever joined #gluster-dev
15:47 poornimag joined #gluster-dev
16:28 julim joined #gluster-dev
16:37 atinm joined #gluster-dev
16:39 atinm pranithk1, pm
16:40 pranithk1 atinm: I didn't get pm :-(
16:45 kaushal_ joined #gluster-dev
16:47 post-factum 3.8.1 arrived in the Arch Linux repos
16:50 hagarth joined #gluster-dev
17:05 shubhendu joined #gluster-dev
17:08 PotatoGim joined #gluster-dev
17:16 hagarth joined #gluster-dev
17:21 overclk joined #gluster-dev
17:25 devyani7 joined #gluster-dev
18:02 julim_ joined #gluster-dev
18:07 lpabon joined #gluster-dev
18:11 hchiramm joined #gluster-dev
18:27 hagarth joined #gluster-dev
18:35 overclk joined #gluster-dev
18:45 lpabon joined #gluster-dev
18:45 pkalever joined #gluster-dev
18:46 pkalever left #gluster-dev
18:48 overclk joined #gluster-dev
19:08 hchiramm joined #gluster-dev
19:13 pkalever1 joined #gluster-dev
19:24 pkalever1 left #gluster-dev
21:40 Acinonyx joined #gluster-dev
22:58 penguinRaider joined #gluster-dev
