One of the best tutorials on nested loops in Python

I was looking at different tutorials and I came across this one. I think it’s the best video on nested loops that I’ve seen.

Thanks to the Left Peel YouTube channel! https://www.youtube.com/channel/UC9-58hKjwQiD-RkH6Qu1lFQ / http://leftpeel.com/

Run commands remotely via ssh, using groups of hosts lists

Some time ago I made two scripts:

  1. Script: multiple_ssh_exec.sh
     * When executed, it displays a menu.
     * The first lines list the host group names.
     * As the first option you input a host group name.
     * As the second option you input a username.
     * As the third option you input the command to execute on the remote hosts.
  2. Script: multiple_ssh_exec_w_args.sh
     * This script takes its arguments from the CLI (see the example invocation right after this list).
     * First arg: -f, the file with the host groups.
     * Second arg: -s, show the host group names (requires -f).
     * Third arg: -n, the host group name (requires -f).
     * Fourth arg: -u, the username to use with ssh (requires -f and -n).
     * Fifth arg: -c, the command to run on the remote hosts (requires -f, -n and -u).
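
For example, a run of the flag-based script could look roughly like this (the group name webservers, the user admin and the command uptime are made-up placeholders; hosts_file.txt is the file name used in the script's own usage examples):

./multiple_ssh_exec_w_args.sh -f hosts_file.txt -s
./multiple_ssh_exec_w_args.sh -f hosts_file.txt -n webservers -u admin -c "uptime"

The first call only lists the group names found in the file; the second runs uptime, as the user admin, on every host of the webservers group.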

You could take a look here:

https://github.com/SiteReliabilityEngineering/sre/tree/master/multiple_ssh

Here is one of the scripts!

#!/bin/bash
# Koldo Oteo <koldo.oteo1@gmail.com>
#

### FUNCTIONS

# Print the usage message and exit.
usage()
{
cat <<EOF
Usage:   $0 [OPTION]
Example: $0 -f hosts_file.txt -s (displays the host groups)
         $0 -f hosts_file.txt -n host_group -u USER -c command (runs command on the host group)
        Options:
                -h,     Print help.
                -f,     File with host groups.
                -s,     Show host group names.
                -n,     Host group name.
                -u,     User to log in with via ssh.
                -c,     Command to run on the remote server.
EOF
exit 0
}

# Print the host group names found in the hosts file.
show_hgroups()
{
echo -e "Available host groups:\n"
echo -e "##########"
grep -e '\[[a-zA-Z]' "$f" | tr -d '[/]'
echo -e "##########"
echo -e "\n"
}

# Run the given command, over ssh, on every host of the selected group.
exec_comm()
{
for i in $(sed -n -e '/\['"$n"'\]/,/\[\/'"$n"'\]/ p' "$f" | sed -e '1d;$d')
do
	ssh "$u"@"$i" "$c"
done
}

###

while getopts ":hf:sn:u:c:" opts; do
    case "${opts}" in
        h)
            usage
            ;;
        f)
            f=${OPTARG}
            ;;
        s)
            s=1
            ;;
        n)
            n=${OPTARG}
            ;;
        u)
            u=${OPTARG}
            ;;
        c)
            c=${OPTARG}
            ;;
        *)
            usage
            ;;
    esac
done

# No arguments given: print the usage.
[[ -z "$1" ]] && usage

if [ -n "$f" ] && [ -n "$n" ] && [ -n "$u" ] && [ -n "$c" ]
then
	exec_comm
fi

if [ -n "$f" ] && [ -n "$s" ]
then
	show_hgroups
fi
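
In case the format of the hosts file is not obvious: judging from the grep and sed patterns in show_hgroups and exec_comm, each group is delimited by an opening [group] tag and a closing [/group] tag, with one host per line in between. A minimal sketch (group names and hostnames are made up):

[webservers]
web01.example.com
web02.example.com
[/webservers]
[databases]
db01.example.com
[/databases]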

I’m not always going to write about bytes…

Yesterday I saw one of the best films of 2017: Wind River

http://www.imdb.com/title/tt5362988/

I’m not going to tell you too much about this film, only that the two main characters are a hunter and an FBI agent.

Python’s map power – newbies

There are some functions that I love, and one of them is map. Surely you already know it!
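
If you have never used it: map takes a function and a sequence and applies the function to every element. A one-line example (Python 2, like the rest of the code in this post):

>>> map(float, ['1', '2', '3'])
[1.0, 2.0, 3.0]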

Imagine that you want to read the cpu line in the file /proc/stat.

The file looks similar to this:

cpu 5513 0 7271 2832299 15350 1 1327 0 0 0

cpu1 xxx xxx xxx xxx xxx xxx xxx

  1. We open the file with open and bind it to a file object using 'as'.
  2. We loop over the file object (st).
  3. If the line starts with 'cpu ' we process it. (I left a space after cpu because the file also contains cpu1, cpu2, etc., and I only want the 'cpu' line.)
  4. We assign the result of map: we split the line, keep the fields from index 1 onwards, and apply float to each one.

 

with open('/proc/stat', 'r') as st:
    for line in st:
        if line.startswith('cpu '):
            line_split = map(float, line.strip().split()[1:])
    print line_split

Another way to do it: with a list comprehension:

line_split = [float(x) for x in line.strip().split()[1:]]

Starting with Go!

I’ve been reading about Go for some time now, and I like it more every day…

Here is a very nice video tutorial.

 

What do you think about Go? I would love to see comments from SREs, DevOps engineers and sysadmins.