
IRC log for #salt, 2018-04-13


All times shown according to UTC.

Time Nick Message
00:00 onslack joined #salt
00:05 Edgan hemebond: That and one deprecated/renamed salt master option so far. So about normal for a major upgrade.
00:16 esteban joined #salt
00:26 armin_ joined #salt
00:44 thelocehiliosan joined #salt
01:39 armin_ joined #salt
01:56 ilbot3 joined #salt
01:56 Topic for #salt is now Welcome to #salt! <+> Latest Versions: 2017.7.5, 2018.3.0 <+> Support: https://www.saltstack.com/support/ <+> Logs: http://irclog.perlgeek.de/salt/ <+> Paste: https://gist.github.com/ <+> See also: #salt-devel, #salt-offtopic, and https://saltstackcommunity.herokuapp.com (for slack) <+> We are volunteers and may not have immediate answers
02:10 zerocoolback joined #salt
02:19 chowmeined joined #salt
02:19 thelocehiliosan joined #salt
02:25 shiranaihito joined #salt
02:41 bigjazzsound1 joined #salt
02:43 orichards joined #salt
03:07 JacobsLadd3r joined #salt
03:10 JacobsLadd3r joined #salt
03:44 Graypup_om joined #salt
03:50 lompik joined #salt
04:01 fhKitty joined #salt
04:43 v12aml joined #salt
05:18 Hybrid joined #salt
05:28 sauvin joined #salt
05:32 briner joined #salt
05:36 Hybrid joined #salt
05:41 ProT-0-TypE joined #salt
05:42 andrew4 joined #salt
05:44 nielsk joined #salt
05:44 quantumsummers joined #salt
05:45 uncool joined #salt
05:45 legreffier joined #salt
06:25 bachler joined #salt
06:25 Ricardo1000 joined #salt
06:28 briner joined #salt
06:34 zerocoolback joined #salt
06:47 thelocehiliosan joined #salt
06:47 aldevar joined #salt
06:48 darioleidi joined #salt
06:50 briner joined #salt
06:56 xet7 joined #salt
07:06 zerocoolback joined #salt
07:08 Pjusur joined #salt
07:17 Hybrid joined #salt
07:21 EthPyth joined #salt
07:28 orichards joined #salt
07:28 Tucky joined #salt
07:33 jrenner joined #salt
07:39 Waples_ joined #salt
07:46 cewood joined #salt
07:50 hojgaard joined #salt
07:53 tys101010 joined #salt
07:59 mikecmpbll joined #salt
08:25 Elsmorian joined #salt
08:26 aT_ joined #salt
08:27 peters-tx joined #salt
08:36 Elsmorian joined #salt
08:38 EthPyth joined #salt
08:46 ventris joined #salt
08:49 EthPyth joined #salt
08:50 pf_moore joined #salt
08:55 aT__ joined #salt
09:09 cablekevin joined #salt
09:21 Pjusur joined #salt
09:27 DanyC joined #salt
09:28 DanyC joined #salt
09:40 Elsmorian joined #salt
09:43 zerocoolback joined #salt
09:51 xet7_ joined #salt
10:02 dendazen joined #salt
10:04 sploenix joined #salt
10:04 aldevar left #salt
10:04 sploenix hi. is there somewhere a formula existing for installing powershell via saltstack on windows?
10:05 Elsmoria_ joined #salt
10:06 * hemebond vomited in his mouth a little.
10:12 stack how do I specify the salt environment in a roster file? I continue to get duplicate id errors with multiple environment and salt-ssh '*' state.apply saltenv=prod pillarenv=prod
10:13 hoonetorg hi, I want to upgrade a test box to salt 2018.3 on CentOS 7.4. A newer libsodium is available in the salt repo, providing libsodium.so.23()(64bit), but zeromq in the same repo requires libsodium.so.13()(64bit)
10:23 hoonetorg why is there a newer libsodium (libsodium-1.0.16-1.el7.x86_64.rpm in the repo https://repo.saltstack.com/yum/redhat/7/x86_64/2018.3/) when zeromq-devel-4.1.4-6.el7.x86_64.rpm in the same repo still requires the old version, libsodium-1.0.5-1.el7.x86_64.rpm, which was last available on 2018/04/03 at https://repo.saltstack.com/yum/redhat/7/x86_64/2017.7/
10:23 hoonetorg ???
10:27 hoonetorg a matching version (libsodium13) would be available in EPEL, but not all my servers have EPEL activated.
10:28 Elsmorian joined #salt
10:28 Micromus joined #salt
10:33 hoonetorg ok, seems nobody is around; I'll file a bug report
10:35 rcvu joined #salt
10:36 Church- joined #salt
10:36 zerocoolback joined #salt
10:42 babilen hoonetorg: Sounds like a packaging issue, not sure if intentional or not
10:44 hoonetorg https://gist.github.com/hoonetorg/2735303d5d0f37aee83c07f634cd607a
10:47 hoonetorg seems like somebody put a new build of zeromq-4.1.4-6.el7.x86_64.rpm into the repo(s) with the same version number as the old zeromq (forgot to raise the version number)
10:47 hoonetorg babilen ^^^
10:47 babilen m(
10:48 hoonetorg old version requires libsodium.so.13(), newer version requires libsodium.so.23()
10:49 hoonetorg this is no problem for new installations, but it causes problems on machines with the "old" zeromq RPM installed
10:51 hoonetorg fix would be a rebuild of zeromq with a newer version number
10:51 hoonetorg my fix is (on all hosts):
10:51 hoonetorg yum install epel-release
10:52 babilen Sounds like you should really file that bug against salt-pack
10:52 hoonetorg yum reinstall zeromq (which installs libsodium13 to fix the dependency)
10:53 hoonetorg yum update
10:53 hoonetorg ok will do
10:55 hoonetorg ah yeah and after all that:
10:55 hoonetorg yum autoremove libsodium13 #which is not required anymore
10:56 hoonetorg yum remove epel-release
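hoonetorg's workaround above, collected into one sequence (a sketch only; it assumes EPEL carries the libsodium13 compat package and that the box can temporarily enable EPEL):

    yum install epel-release       # EPEL provides libsodium13
    yum reinstall zeromq           # pulls in libsodium13 to satisfy the old dependency
    yum update                     # upgrades to the 2018.3 packages (new zeromq, libsodium)
    yum autoremove libsodium13     # no longer required after the update
    yum remove epel-release        # drop EPEL again if it is not otherwise wanted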
11:00 zerocoolback joined #salt
11:00 J0hnSteel joined #salt
11:06 inetpro joined #salt
11:14 Elsmorian joined #salt
11:22 hoonetorg https://github.com/saltstack/salt-pack/issues/531
11:26 Elsmorian joined #salt
11:27 rominf joined #salt
11:45 EthPyth joined #salt
11:49 dendazen joined #salt
11:58 rcvu joined #salt
11:58 Church- joined #salt
11:59 Elsmorian joined #salt
12:00 Morrolan joined #salt
12:05 EthPyth joined #salt
12:06 rcvu joined #salt
12:10 Nahual joined #salt
12:13 Church- joined #salt
12:17 ventris joined #salt
12:18 brokensyntax joined #salt
12:18 Hybrid joined #salt
12:24 rcvu joined #salt
12:26 mage_ left #salt
12:26 mage_ joined #salt
12:26 mage_ hello
12:27 ProT-0-TypE joined #salt
12:27 mage_ I'm running a command through cmd.run which echoes color codes, any idea how I could preserve them?
12:27 mage_ currently it outputs something like ?[1mC?[0m: 16, 0: ?[1mLine too long (104/100)?[0m (?[1mline-too-long?[0m) :(
12:36 ProT-0-TypE joined #salt
12:42 darioleidi joined #salt
12:43 mage_ see https://github.com/saltstack/salt/issues/47050
12:46 zer0def so you're running the linter as part of state application?
12:47 mage_ yes, only if environment != "production"
12:48 zer0def is there any reason for it to be part of the state, instead of being run beforehand?
12:49 rcvu joined #salt
12:49 Elsmorian joined #salt
12:50 mage_ because I have many projects with only a single line in .gitlab-ci.yml: sudo /usr/local/bin/salt-call --retcode-passthrough --force-color state.apply webapps.lepidoptera
12:50 mage_ and I'd like to avoid having to modify 50 .gitlab-ci.yml files when I'm changing something
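For context, the one-line job mage_ describes would sit in a .gitlab-ci.yml roughly like this (the job name and stage are illustrative, not from the log):

    deploy:
      stage: deploy
      script:
        - sudo /usr/local/bin/salt-call --retcode-passthrough --force-color state.apply webapps.lepidoptera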
12:50 zer0def so basically "no"
12:51 mage_ plus all the logic (venv creation, pkg, ..) is already in my state file
12:51 mage_ I prefer to keep everything in Salt and reuse as much as possible
12:51 mage_ rather than duplicating stuff in multiple .gitlab-ci.yml scripts
12:51 zer0def I'd argue it shouldn't be part of the state, because it's not an indispensable part of deployment
12:52 zer0def and hence it can be run during integration, without the need to involve salt at all
12:53 mage_ it's only run on a dedicated VM (test machine) for environment != production
12:57 mchlumsky joined #salt
13:02 Elsmorian Seems like there is an issue with the dockerng state running containers with names longer than ~30 characters
13:03 Elsmorian you get a Go exception about an invalid argument, so it might be an issue with docker-py..
13:13 mage_ zer0def: I agree it's not part of the deployment, but I find it somewhat "elegant" that the whole CI/CD pipeline uses my "Saltstack machinery"
13:14 mage_ so that everything is handled in Saltstack
13:35 EthPyth joined #salt
13:35 mchlumsky joined #salt
13:36 cgiroua joined #salt
13:39 briner joined #salt
13:49 racooper joined #salt
13:52 Hybrid joined #salt
14:00 Elsmorian joined #salt
14:04 JPT I would like to use grains to target (or not target) machines for certain actions. In order to do this, I would like to deploy various grains with very primitive information (e.g. "autoupdate: true") on my minions.
14:04 JPT Now I would like to deploy these grains through the pillar, because it would be very easy this way.
14:05 JPT Is there an existing state (maybe from the salt-formula?) that allows me to set these types of grains within the minion config, or perhaps something like /etc/salt/grains.d?
14:05 zer0def grains.present, although in your case you might want to just straight up target from pillar
14:06 JPT Oh, wait. I can target machines from pillar data?
14:06 zer0def I don't remember whether there's an explicit option for it, but you absolutely can with compound matching
14:07 JPT Neat. There's salt -I :-)
14:07 JPT I'll take a look at it. Thanks :)
14:07 zer0def yeah, compound matching also has pcre targeting
14:07 zer0def in case you wanted to get really fancy with it
14:09 JPT I'll start with the basics and see where this goes.
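A minimal sketch of the grains.present approach and the targeting zer0def mentions; the state file placement and the pillar key are illustrative (the grain name "autoupdate" comes from the conversation above):

    # e.g. grains/autoupdate.sls (hypothetical path)
    autoupdate:
      grains.present:
        - value: {{ salt['pillar.get']('autoupdate', False) }}

    # targeting examples
    salt -G 'autoupdate:True' test.ping                              # by grain
    salt -I 'autoupdate:True' test.ping                              # by pillar data (salt -I)
    salt -C 'I@autoupdate:True and P@os:(CentOS|Ubuntu)' test.ping   # compound match with grain PCRE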
14:11 JPT Oh - can I use pillar targeting in an environment with syndics?
14:11 JPT I have one salt master which knows a few syndic masters which know their minions.
14:12 JPT And that one salt master at the very top needs to trigger actions for all of the minions depending on a thing. As far as I know, grains would work - but pillar data?
14:12 hrumph2 joined #salt
14:14 zer0def I would assume so
14:14 zer0def just try with something non-harmful, like a `test.ping`
14:14 rcvu joined #salt
14:17 DanyC joined #salt
14:17 JPT Okay, will do.
14:18 dvdmuckle joined #salt
14:21 crux-capacitor hi all, during a state run, is it possible to rerun one of the commands if it fails? as in, just call it again by it's name?
14:21 Sacro its, state.sls
14:23 zer0def crux-capacitor: you're probably looking for `onfail` and duplicating the state that needs to be re-run with `use`: https://docs.saltstack.com/en/latest/ref/states/requisites.html
14:24 crux-capacitor ok, will read through that. thanks
14:27 Church- joined #salt
14:33 DammitJim joined #salt
14:34 DammitJim how can I "rename" a minion for targeting?
14:34 DammitJim right now, I have to run: salt DammitServer test.ping
14:34 DammitJim I don't know why, but we normally use lowercase for all of our servers
14:34 DammitJim I'd like to be able to say: salt dammitjimserver test.ping
14:35 zer0def crux-capacitor: in the simplest case it's https://ghostbin.com/paste/zkdfp
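In case that paste goes away, a rough sketch of the onfail/use pattern being described (the state IDs and command are invented for illustration):

    deploy_step:
      cmd.run:
        - name: /usr/local/bin/deploy.sh      # hypothetical command

    retry_deploy_step:
      cmd.run:
        - use:
          - cmd: deploy_step                  # inherit the same arguments
        - onfail:
          - cmd: deploy_step                  # only runs if deploy_step failed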
14:36 zer0def so you'd like to change the minion's id?
14:36 zer0def DammitJim: ^
14:36 DammitJim yeah... I've done it before
14:36 DammitJim but I don't remember what I need to do
14:37 DammitJim can't I just edit the minion_id on the master?
14:37 zer0def depending on whether it's defined as the `id` value in /etc/salt/minion or the value is stashed in /etc/salt/minion_id, alter that, restart the minion, accept the key, ???, profit?
14:38 DammitJim oh, so we'd do this on the minion
14:38 DammitJim let me see. thanks
14:39 DammitJim do I have to re-generate keys? I think that's my concern
14:39 DammitJim it's defined in minion_id (not minion)
14:40 zer0def I don't think so, but I wouldn't give guarantees on this at the moment
14:43 zer0def you don't, just be sure to clean up your former minion-id's association in salt-key, so the master doesn't choke when accidentally targeting it
14:44 DammitJim how do I clean it up on the master?
14:44 DammitJim I'm afraid if I delete it, it'll delete the new one because the hash is the same?
14:46 zer0def when you change the name, it'll request approval on master, then `test.ping` on both for peace of mind and drop the old one, since it should never return
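Pulled together, the rename procedure being described looks roughly like this (a sketch assuming a default systemd-based install; the minion ids are the ones from the conversation):

    # on the minion
    echo 'dammitjimserver' > /etc/salt/minion_id   # or set `id:` in /etc/salt/minion
    systemctl restart salt-minion

    # on the master
    salt-key -a dammitjimserver                    # accept the key under the new id
    salt 'dammitjimserver' test.ping               # sanity check
    salt-key -d DammitServer                       # drop the old id once it stops responding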
14:47 crux-capacitor zer0def, thanks for the example
14:59 lordcirth_work MTecknology, re: yesterday, I was told that 18.04 defaults to netplan and doesn't read /etc/network/interfaces by default.
15:00 lordcirth_work Perhaps they only meant that /etc/network/interfaces was empty by default.
15:08 Elsmorian joined #salt
15:15 briner joined #salt
15:15 ecdhe joined #salt
15:18 tiwula joined #salt
15:27 sjorge joined #salt
15:30 Elsmorian joined #salt
15:34 dezertol joined #salt
15:43 Elsmorian joined #salt
15:44 fl3sh joined #salt
15:51 sjorge joined #salt
15:51 evle1 joined #salt
16:09 BarBQ joined #salt
16:16 zerocoolback joined #salt
17:03 pmcg joined #salt
17:04 crux-capacitor joined #salt
17:08 dendazen joined #salt
17:13 mikecmpbll joined #salt
17:33 JacobsLadd3r joined #salt
17:38 Elsmorian joined #salt
17:46 user-and-abuser joined #salt
18:13 mavhq joined #salt
18:18 MTecknology lordcirth_work: isn't that an X thing?
18:22 lordcirth_work MTecknology, as in Xorg?  No.  Desktop sets netplan to use NetworkManager, and Server uses networkd, but both still use netplan.
18:23 MTecknology netplan isn't installed on my 18.04 boxes
18:24 DanyC joined #salt
18:25 zer0def after upgrade or fresh install?
18:25 MTecknology yes
18:25 zer0def also, netplan apparently interfaces with networkd and networkmanager
18:34 pcn Weird, netplan man pages are on the laptop I just upgraded to 18.04
18:36 zer0def that reminds me that manpages may not have been part of the core distribution for 16.04
18:39 ProT-0-TypE joined #salt
19:07 gmoro joined #salt
19:09 ymasson joined #salt
19:11 sjorge joined #salt
19:26 armin joined #salt
19:32 armin joined #salt
19:32 stuhgkl joined #salt
19:35 jeffspeff joined #salt
19:38 eekrano joined #salt
19:42 eekrano joined #salt
20:01 eekrano joined #salt
20:04 scooby2 joined #salt
20:07 jeffspeff I just saw that 2018.3.0 is out. Are there any known issues when upgrading from 2017.7.3?
20:11 Trauma joined #salt
20:18 orichards joined #salt
20:26 Trauma joined #salt
20:27 jab416171 joined #salt
20:43 sjorge I broke ZFS on Linux and Solaris 10/11, not sure if there are any other issues.
20:46 hiroshi- joined #salt
20:55 eekrano joined #salt
21:00 MTecknology Is there any way to come up with a unique uid for a minion id? I'm debating '12' + cksum(minion.id), but that seems like it'll someday produce a collision.
21:04 greatgatsby uuid.uuid4() ?
21:04 MTecknology that just produces something random on every call
21:05 greatgatsby oh, you aren't keeping track?  Ok, then that's not a solution, sorry
21:05 dendazen joined #salt
21:07 MTecknology I suppose abs(zlib.crc32(minion.id)) should be sufficient..
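A small sketch of that idea in Python 3 (where zlib.crc32 already returns a non-negative value); the base/span numbers are arbitrary, and collisions remain possible:

    import zlib

    def uid_from_minion_id(minion_id, base=12000, span=50000):
        # Deterministic for a given id, but CRC32 is only 32 bits and the
        # modulo shrinks the space further, so collisions can still happen.
        return base + (zlib.crc32(minion_id.encode('utf-8')) % span)

    print(uid_from_minion_id('web01.example.com'))  # same id always yields the same uid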
21:51 Kelsar joined #salt
22:07 Kelsar joined #salt
22:15 ventris left #salt
22:30 cgiroua joined #salt
22:50 sjorge joined #salt
22:55 zulutango joined #salt
23:12 Trauma joined #salt
23:19 JacobsLadd3r joined #salt
23:24 tyx joined #salt
23:45 hoonetorg joined #salt
