IBMi DevOps: Deploying objects to multiple IBMi servers with Fabric

Now it is time to create some tools with Fabric.

There are third-party tools to move data between systems, but we can also use Fabric's "get" and "put" functions for this. The transfers run over SSH in the background, so our copies are secure.

To start the new script, I will use both the PASE shell and the IBMi command line, so the definitions start again like this:


from fabric.api import *

# Two shells: PASE for Unix-style commands, "system" for native IBMi commands
IBM_PASE = "/QOpenSys/usr/bin/bsh -c"
IBM_OS = "system"
env.user = "USER"
env.password = "PASS"

The next step is to define which servers are the sources and which are the targets. In my case, I am interested in moving objects from my development server to the rest of the nodes, but it would be just as easy to define the source as "local" (our computer) and deploy a SAVF to all the servers.

env.roledefs = {                                                              
    'source': ['dev'],                                           
    'target': ['test1','test2','int1','int2','prod1','prod2'],                                 
}
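With the roles in place, you can also point any undecorated task at a whole role straight from the command line with the -R flag (this is standard Fabric behaviour; "sometask" is just a placeholder name here):

> fab -R target sometask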

The next step is to configure which library and temporary SAVF file we are going to use to deploy our packages. I define this in a function that I will call only once for all my servers.

@roles('source','target')
def initsavfile():
    env.shell = IBM_OS
    # Create the deployment library (it may already exist)
    with settings(warn_only=True):
        run("CRTLIB FABRIC")
    # Create the temporary save file used by the deployments
    with settings(warn_only=True):
        run("CRTSAVF FILE(FABRIC/SAVF)")

The decorator "@roles", define in what servers i will run the script. Because probably i created the library before, i use "with settings(warn_only=True):" to monitor errors. Fabric "get" function always works from "remote" to "local", and local is the server when Fabric scripts are running.

The next code defines the steps needed to bring a SAVF down to my local computer/server. I will pass the library name as a parameter, and the same SAVF is used for the whole operation:

def get_file(library):
    env.shell = IBM_OS
    # Remove the local file left over from a previous run
    with settings(warn_only=True):
        local("rm /mnt/c/pythonmanagement/SAVF.FILE")
    # Create the temporary SAVF on the source system.
    # I always got a CPA4067 error from the SSH session, even with the RPLLE entry
    run("CRTSAVF FILE(FABRIC/FABRIC)")
    command_savlib = "SAVLIB LIB(" + library + ") DEV(*SAVF) SAVF(FABRIC/FABRIC)"
    run(command_savlib)
    # Pull the SAVF down to the local computer/server running Fabric
    get('/QSYS.LIB/FABRIC.LIB/FABRIC.FILE', '/mnt/c/pythonmanagement/SAVF.FILE')
    # Remove the temporary SAVF on the source
    run("DLTOBJ OBJ(FABRIC/FABRIC) OBJTYPE(*FILE)")

I have not yet defined which servers this code will run on, but "get_file" should run only on the source server. As for Fabric's "put" operation, it works from "local" to the "remote" server.

def put_file(library):
    env.shell = IBM_OS
    # Upload the SAVF to the target system
    with settings(warn_only=True):
        result = put('/mnt/c/pythonmanagement/SAVF.FILE', '/QSYS.LIB/FABRIC.LIB/SAVF.FILE')
    if result.failed:
        print("Deployment of library " + library + " failed")
    else:
        # Restore the library from the uploaded SAVF
        command = "RSTLIB SAVLIB(" + library + ") DEV(*SAVF) SAVF(FABRIC/SAVF)"
        run(command)
        print("Deployment of library " + library + " succeeded")

The final step is to create our main function to deploy a library. The function accepts a library name as a parameter and executes "get_file" and "put_file" in sequence.

@task
def deploy_savf(library):
    env.shell = IBM_OS
    # Get the SAVF from the source server
    get_file.roles = ('source',)
    execute(get_file, library)
    # Put the SAVF on the target servers and restore it
    put_file.roles = ('target',)
    execute(put_file, library)

The @task decorator indicates that this function is the only "public" task of the script: once a fabfile contains @task functions, only those are exposed to the fab command.
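With everything in place, deploying a library is a single command. Arguments are passed to a Fabric task after a colon; MYLIB here is just a hypothetical library name:

> fab deploy_savf:MYLIB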


IBMi DevOps: Config management with Python and Fabric, part I

You have probably heard about DevOps many times and wondered how to get started with it on IBMi.

What is DevOps?


DevOps represents a change in IT culture; it is not only about software. DevOps focuses on rapid delivery and on people, and it seeks to improve collaboration between developer and admin teams. If we in the IBMi world want to be part of a DevOps team, we should be able to implement some automation tools and embrace the so-called "infrastructure as code".

I played a bit in the Windows and Linux world with Chef, and my colleagues just kept telling me "no, you can't do DevOps on AS400". (I know, I did my best educating them to say IBMi instead... usually, when they pronounce that AS400 word, I stop listening.)

How can I start using some DevOps tools on IBMi?


There are many tools that can help multidisciplinary teams (Windows, Linux, DB teams) work in the world of DevOps: Chef, Ansible, Salt, Puppet... If you already know one of them, try to adapt it to the IBMi world.

It takes a lot of time and effort to understand these tools, but in return you will be able to manage a huge number of nodes in your infrastructure. Some of these tools use agents or SSH sessions, and they are based on configuration instead of programming.

Recently I deployed 12 IBMi partitions... they were in an "initial state": LIC and software installed, latest PTFs applied. Then it came to my mind how many repetitive tasks I would need to perform to configure those systems!

Because all the systems are part of the same dev/integration/test/production environment, imagine adding system values, configuring subsystems, setting security values (system values, journals, firewall) and deploying software and packages (PHP, Node, tools for developers, Apache configuration, etc.).

No, I don't want to repeat myself doing all of that every time.

Python Fabric: decentralized DevOps.


After checking the alternatives, I decided to go with the simplest implementation of DevOps and not use a complicated "centralized" system.

Everything I need is my computer, a code repository and SSH access to my nodes. Simplicity does not mean bad or non-DevOps.

There are a couple of decentralized, lightweight DevOps tools that are great fun to use: Fabric and BundleWrap. Fabric is more about programming in Python what your tasks should do; BundleWrap is more about configuration, and it is also Python based.

I chose Fabric.

Fabric is a Python library and command-line tool for streamlining the use of SSH for application deployment and systems administration tasks. It handles the SSH connections via the paramiko library. It is a very productive tool, and you can start automating many tasks on your IBMi in minutes.



 With Fabric, it is simple to write some code on your computer and run it against all your nodes.



Let's start by installing Fabric (I tested it on Python 2.7):

  > pip install fabric

The next step is to create your directory and start typing tasks! For a "Hello World", I recommend using a single node and a simple command.

Note: in these examples I am using a simple password, but of course it is better to manage your connections with SSH keys and known_hosts.
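As a minimal sketch of the key-based alternative (the key path is just an example, adjust it to your own setup), Fabric reads the private key from env.key_filename instead of asking for a password:

from fabric.api import env

env.user = "ACL"
# Path to the SSH private key used for the connections
env.key_filename = "~/.ssh/id_rsa"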

Create the file "fabfile.py" in your directory and write this code:

from fabric.api import run

def test():
    run('uname -s')

Now we can run fab --list to see the tasks included in the fabfile:

fab --list

Available commands

      test


and run it with
 > fab -H (your node) test

and the output should look like this:

[myserver] Executing task 'test'
[myserver] run: uname -s
[myserver] Login password for 'andres':
[myserver] out: bsh: /bin/bash: not found
[myserver] out:


Fatal error: run() received nonzero return code 1 while executing!

Requested: uname -s
Executed: /bin/bash -l -c "uname -s"

Aborting.
Disconnecting from myserver... done.

Oops, it failed! But why? Well, Fabric is looking for the default Linux shell at "/bin/bash", so we need to tell Fabric that this is an IBMi and that we want to use another shell. We are lucky: Fabric lets us change this through its env settings.

from fabric.api import run, env

# Use the PASE shell instead of the default /bin/bash
env.shell = "/QOpenSys/usr/bin/bsh -c"
# Set the user name to my SSH user
env.user = "ACL"

def test():
    run('uname -s')

> fab -H (your node) test

C:\pythonmanagement>fab -H myserver test
[myserver] Executing task 'test'
[myserver] run: uname -s
[myserver] Login password for 'ACL':
[myserver] Login password for 'ACL':
[myserver] out: OS400
[myserver] out:


Done.
Disconnecting from myserver... done.

So that's all!

From the command line we told our program which node to connect to and which task to perform. Fabric prints output as it runs, so we know the status of our task at every moment.

Now a whole list of tasks that we could perform with Fabric comes to mind:

1. Define host names in our scripts, or even roles (development servers, production servers, etc.).

2. Define any set of tasks to perform in one run.

3. Check the status of services on our servers and react to them.

4. Deploy a shell script to all our servers and call it from Fabric. But what if we want to run IBMi OS commands? Simple: change the environment shell to the "system" command:

from fabric.api import run, env

IBM_PASE = "/QOpenSys/usr/bin/bsh -c"
IBM_OS = "system"
env.user = "ACL"

def set_hosts():
    # Define my hosts.
    env.hosts = ['disibic21', 'disibic22']

def check_lib():
    env.shell = IBM_OS
    try:
        run('crtlib test1')
    except:
        # run() aborts (raises SystemExit) when the command fails,
        # so a bare except is needed to catch it
        print('.................      Library exists')

def test():
    env.shell = IBM_PASE
    run('uname -s')

In this code:

1. We define two shells, "system" and PASE, so we can call IBMi commands or PASE commands depending on the Fabric task.

2. We define two hosts to run the tasks on.

3. We add an exception handler to deal with the error when the library already exists.

4. In each task, it is possible to change the shell setting in the environment.

To run this code:

> fab set_hosts check_lib test

C:\pythonmanagement>fab set_hosts check_lib test
[disibic21] Executing task 'check_lib'
[disibic21] run: crtlib test1
[disibic21] Login password for 'ACL':
[disibic21] out: CPF2111: Library TEST1 already exists.
[disibic21] out:


Fatal error: run() received nonzero return code 255 while executing!

Requested: crtlib test1
Executed: system "crtlib test1"

Aborting.
.................      Library exists
[disibic22] Executing task 'check_lib'
[disibic22] run: crtlib test1
[disibic22] out: CPF2111: Library TEST1 already exists.
[disibic22] out:


Fatal error: run() received nonzero return code 255 while executing!

Requested: crtlib test1
Executed: system "crtlib test1"

Aborting.
.................      Library exists
[disibic21] Executing task 'test'
[disibic21] run: uname -s
[disibic21] out: OS400
[disibic21] out:

[disibic22] Executing task 'test'
[disibic22] run: uname -s
[disibic22] out: OS400
[disibic22] out:


Done.
Disconnecting from disibic21... done.
Disconnecting from disibic22... done.

As you noticed, I am writing two tasks in one single file. By default, Fabric uses a single, serial execution method, though there is an alternative parallel mode.

The default mode performs the following:

  • set_hosts
  • check_lib on node1
  • check_lib on node2
  • test on node1
  • test on node2

Fabric is flexible, so you can run a task against one or more specific servers from the command line:

> fab -H myserver1,myserver2 test


This method is simplistic for now, but useful for understanding Fabric. If you have a huge number of servers, or tasks that take time to run, it is better to use a parallel approach. I will not explain it in depth here (check the Fabric documentation on parallel execution), but there is a small taste below.
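As a minimal sketch (this is standard Fabric 1.x, reusing the earlier test task): the @parallel decorator makes a task run on all hosts at the same time, and the -P flag forces parallel mode for every task in a run.

from fabric.api import parallel, run, env

env.shell = "/QOpenSys/usr/bin/bsh -c"
env.user = "ACL"

@parallel
def test():
    # Runs concurrently on every host instead of one after another
    run('uname -s')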

Other things we can do are getting and putting files between our file systems (over secure copy) and deploying IFS files, software, PTFs, etc.
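As an illustration only (the paths and file names here are hypothetical), a pair of tasks to move IFS files back and forth could look like this:

from fabric.api import get, put, env

env.shell = "/QOpenSys/usr/bin/bsh -c"
env.user = "ACL"

def pull_config():
    # Copy a file from the IFS down to the machine running Fabric
    get('/home/ACL/httpd.conf', '/mnt/c/pythonmanagement/httpd.conf')

def push_script():
    # Copy a local script up to the IFS
    put('/mnt/c/pythonmanagement/install.sh', '/home/ACL/install.sh')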

There is also a very interesting project that provides a web interface for Fabric: code stored in a repository can be deployed from it, and logs of the deployments are kept. This project is called FabricBolt.

That is everything for this post... I added some examples on my GitHub.

Your imagination is the limit... remember, it is just Python!