
Job Parameterisation in DS6.9 

Oct 15, 2014 11:23 AM

Contents

1. Introduction
2. Why Did I Come Up With This?
3. Implementing Job Parameters in DS6.9
   3.1. Preparing your Deployment Server for Job Parameters
   3.2. Preparing your Deployment Server for Parameter Retrieval
   3.3. Creating a Parameterised Job: A Walk-Through
4. Summary
5. Appendix: Thoughts on Task Parameterisation
   5.1. Task-by-Task Parameterisation
   5.2. The ‘Ideal’ DS Enhancement: The Job Parameters Task

This article presents a powerful technique which can enhance your DS6.9 jobs by opening the way to upgrading them into flexible, parameterised modules. By adding just two REM statements to a single script task, I’ll demonstrate how you can prep each of your jobs to have a single key area where critical parameters are defined.

The benefits of this are huge. Separating jobs into data and functional task components allows them to be designed from the outset to be portable across both servers and organisations. Jobs designed in this way also naturally become more transparent, making job upgrades and maintenance easier.

And the best news is that if you already use Deployment Server token replacement, you can be up and running with this in no time.

1. Introduction

Parameterisation is a powerful programming technique in which variables, initialised at the start of a project, are used to replace hard-wired external data references. This is good practice, as leveraging variables for external data not only makes code easier to read, but it also makes the resultant code more versatile and future-proof.

The current design of Deployment Server, however, makes it difficult to realise the full benefit of parameterisation. The problem is that variables declared within one task cannot easily be made visible within any tasks that follow. As a result, DS administrators can often find themselves managing jobs whose key configuration data is scattered and hidden across their subtasks.

The technique I demonstrate here paves the way for job parameterisation by allowing variables to be set and then read across every task that supports token replacement. One advantage of this is that it allows you to create repurposable, modular task libraries that possess a single location where variables can be specified for consumption by other tasks.

Below I show a couple of sample tasks from an image deployment job that uses this parameterisation technique; the first task sets the job’s parameters (aka global variables) and a subsequent task recalls one in order to execute a file copy operation.

image01.png

The ultimate aim here is to provide DS administrators with a mechanism to create jobs that possess a new degree of flexibility and portability. My hope is that this will enable us to work more collaboratively, as with increased job portability we can reduce our individual need to build jobs from scratch for common deployment functions.

The target audience for this article is Deployment Server administrators who are familiar with script tasks and token replacement.

2. Why Did I Come Up With This?

Recently I revisited some DS6.9 jobs with one simple objective: to update them for prospective Windows 8 deployment scenarios. Within minutes, however, I was brutally reminded why I’d been putting this off for so long; the revision process was not only cumbersome but also rather error-prone. Amending each job’s constituent tasks required a level of focus which intuitively seemed more appropriate for those working at NASA, or perhaps in the bomb disposal profession.

The fundamental reason why revising my jobs wasn’t enjoyable was this: tasks in DS are selfish by design. They are simply unable to share the information they generate with other tasks. This meant, in my case, that when I had to make alterations to a number of complex, multi-task jobs, I had to review each constituent task in depth to ensure that my intended updates were applied consistently.

This is an area of great potential improvement for DS: allowing tasks to share information so that parameters can be declared just once, at the job level, for consumption by any task that follows. Such an enhancement would provide a native and supported way to parameterise our job sets, allowing multiple tasks to share common custom variables in a user-friendly way. This would not only make DS jobs more portable, but it would also make them easier to maintain throughout their lifecycle.

3. Implementing Job Parameters in DS6.9

After prototyping several options for job parameterisation over the last few months, I decided on the following approach,

  1. To have a ‘Run Script’ task define the job’s key parameters. The script embedded by the task would be specifically crafted to allow parameters to be uploaded into the eXpress database.
  2. To retrieve the parameters using Deployment Server’s standard token replacement mechanism.

The only repercussion of this approach is a tiny security hurdle. ‘Run Script’ tasks are not permitted by default to manipulate the eXpress database directly using T-SQL; the Deployment Server engine imposes a necessary and sensible security layer which by default blocks T-SQL commands that change data. To lift that block, we need to create (and then authorise) a SQL stored procedure.

3.1. Preparing your Deployment Server for Job Parameters

In this section we’ll create a stored procedure which will permit us to upload job parameters into the eXpress database. These parameters will be stored in a table called custom_vars (which the stored procedure will create on-the-fly if it doesn’t already exist).

Open up SQL Server Management Studio on your Deployment Server, change focus to the eXpress database, and then execute the following code (also provided in the ins_parameters.sql.txt file attached to this article).

 

USE [eXpress] 
go 
SET ansi_nulls ON 
go 
SET quoted_identifier ON 
go 

IF EXISTS (SELECT * 
           FROM   sys.objects 
           WHERE  type = 'P' 
                  AND NAME = 'ins_parameters') 
  DROP PROCEDURE ins_parameters 
go 

CREATE PROCEDURE [dbo].[Ins_parameters] @myid INT 
AS 
    --- 
    --- [dbo].[ins_parameters] created by Ian Atkin 
    --- Enables full parameterisation of DS jobs 
    --- Version 1.1 (Feb 2015) 
    ---    (Bug fixed in initial 1.0 release where token replacement failed 
    ---     in scenarios where SQL targeted the wrong script task) 
    IF NOT EXISTS (SELECT * 
                   FROM   dbo.sysobjects 
                   WHERE  NAME = 'custom_vars') 
      CREATE TABLE custom_vars 
        ( 
           myid    INT, 
           varname VARCHAR(64), 
           value   VARCHAR(1024) 
        ) 

    DECLARE @script VARCHAR(max) 

    SET @script = (SELECT TOP 1 script 
                   FROM   event_schedule es 
                          JOIN script_task st 
                            ON es.event_id = st.event_id 
                               AND Isnull(es.cond_seq, 0) = st.cond_seq 
                               AND es.next_task_seq = st.task_seq 
                   WHERE  es.status_code IS NULL 
                          AND computer_id = @myid 
                          AND script LIKE '%REM SET PARAMETERS%' 
                   ORDER  BY es.schedule_id ASC) 

    --@i keeps track of the current position in the script 
    DECLARE @i INT 
    --@d is the ith newline marker, @e is the ith+1 newline marker 
    --@line will hold the string between the ith and ith+1 newline markers 
    DECLARE @d INT 
    DECLARE @e INT 
    DECLARE @line VARCHAR(max) 
    --Position marker of equals sign in located SET statements 
    DECLARE @a INT 
    --@param will hold the variable name as defined in the script's SET statement 
    --@value will hold the variable's value as defined in the script's SET statement 
    DECLARE @param VARCHAR(64) 
    DECLARE @value VARCHAR(1024) 

    --Do some housekeeping on the table for this computer 
    DELETE FROM dbo.custom_vars 
    WHERE  myid = @myid 

    IF @script LIKE '%REM SET PARAMETERS%' 
      BEGIN 
          SET @i=1 

          WHILE @i > 0 
                AND @i <= Len(@script) 
            BEGIN 
                --Let's find the string between the new line characters 
                SET @d=Charindex(Char(13) + Char(10), @script, @i) 
                --First new line characters            
                SET @e=Charindex(Char(13) + Char(10), @script, @d + 2) 
                --Second new line characters 

                --SELECT CAST(@d as varchar) + ' ' + CAST(@e as varchar) 
                IF @e = 0 
                  RETURN --if we can't get a whole line, exit now 
                --extract line and remove leading/trailing spaces 
                SET @line=Rtrim(Ltrim(Substring(@script, @d + 2, @e - ( @d + 2 ) 
                                      ))) 

                IF LEFT(@line, 4) = 'SET ' 
                  BEGIN 
                      SET @a=Charindex('=', @line) 

                      IF @a > 0 
                        BEGIN 
                            SET @param=Rtrim(Ltrim(Substring(@line, 5, @a - 5))) 
                            SET @value=Ltrim(Substring(@line, @a + 1, 
                                             Len(@line) - @a) 
                                       ) 

                            --                 SELECT @param + ' == ' + @value 
                            IF EXISTS (SELECT varname 
                                       FROM   dbo.custom_vars 
                                       WHERE  myid = @myid 
                                              AND varname = @param) 
                              BEGIN 
                                  UPDATE custom_vars 
                                  SET    [value] = @value 
                                  WHERE  myid = @myid 
                                         AND varname = @param 
                              END 
                            ELSE 
                              BEGIN 
                                  INSERT INTO custom_vars 
                                              (myid, 
                                               varname, 
                                               value) 
                                  VALUES      ( @myid, 
                                                @param, 
                                                @value) 
                              END 
                        END 
                  END 

                SET @i=@e 
            END 
      END 

 

The function of this stored procedure is simple; when called, it finds the machine’s pending script task containing the keywords ‘REM Set Parameters’ and uploads any variable declarations made within it into the eXpress database.

When called for the first time, this stored procedure will build a table in the database with three columns to store our data: myid, varname and value. It will then insert or update rows according to the parameter definitions it finds in the ‘Run Script’ task, ensuring that only a single instance of each parameter is stored for the target machine.

In order to reduce parameter clutter, every call to this procedure will also do a little housekeeping and delete any previously declared parameters for this machine.
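
If you are curious to see what the procedure actually stores, a quick query in SQL Server Management Studio will list every parameter currently held. This is just an inspection sketch; note that the custom_vars table won’t exist until ins_parameters has been called at least once.

USE [eXpress] 
go 

-- List every job parameter currently held, grouped by computer 
SELECT myid, 
       varname, 
       value 
FROM   dbo.custom_vars 
ORDER  BY myid, 
          varname 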

Once this stored procedure is in place, you’ll need to authorise it in the DS Console as follows,

  1. In the DS Console, select ‘Options’ from the ‘Tools’ menu bar
  2. Click ‘Allowed Stored Procedures’ from the ‘Custom Data Sources’ tab

image02.png

  3. Select ins_parameters from the ‘Available Stored Procedures’ pane and click the right double-chevron to move it into the ‘Allowed Stored Procedures’ pane

image03.png

  4. Click ‘OK’

One day, I might create a little script to do all this, but for now I’m afraid this is a manual process.

3.2. Preparing your Deployment Server for Parameter Retrieval

Now that we have a mechanism for uploading our parameters, we need one for getting them back. Although native token replacement can do this fine without any additional functions, it can get a bit wordy. The following function’s purpose is simply to make parameter retrieval a little bit cleaner.

Paste the following code into SQL Server Management Studio as well and execute it (also provided in the vars.sql.txt file attached to this article),

 


USE [eXpress] 
go 
SET ansi_nulls ON 
go 
SET quoted_identifier ON 
go 

IF EXISTS (SELECT * 
           FROM   sys.objects 
           WHERE  type = 'FN' 
                  AND NAME = 'vars') 
  DROP FUNCTION vars 
go 

CREATE FUNCTION [dbo].[Vars] (@VARIABLE VARCHAR(64), 
                              @COMPID   INT) 
returns VARCHAR(1024) 
AS 
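  --- [dbo].[vars] returns the value of the named job parameter for the 
  --- given computer, as uploaded into custom_vars by ins_parameters 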
  BEGIN 
      RETURN 
        (SELECT value 
         FROM   custom_vars 
         WHERE  varname = @VARIABLE 
                AND myid = @COMPID) 
  END 
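
To see the saving this function gives you, compare the raw token-replacement string you would otherwise need (native token replacement can, as noted above, query the table directly) with its dbo.vars equivalent. Both of the tokens below should retrieve the same LOCAL_SRC parameter for the machine running the task; the %ID% token is substituted with the computer’s ID by DS at deployment time,

%#*"select value from custom_vars where varname='LOCAL_SRC' and myid=%ID%"%

%#*"select dbo.vars('LOCAL_SRC',%ID%)"%

You can also exercise the function directly in SQL Server Management Studio with a test computer ID of your own, for example SELECT dbo.vars('LOCAL_SRC', 5001) (5001 here is just a placeholder ID).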

 

3.3. Creating a Parameterised Job: A Walk-Through

In this section I’m going to lead you through the steps of parameterising a job. I’m going to use a software deployment job as an example here, one which deploys Adobe Reader using our ‘standard’ approach and does the following;

  • Creates a hidden software source folder on the target machine
  • Copies the software package files into it
  • Executes an install.bat file located with the software source to manage the install

Let’s now create a job with the following 4 tasks to manage this as a fully parameterised job (note you can alternatively import the software_deploy_parameterised.bin file attached to this article),

  1. The “Set Parameters” Script Task
    This first task will store all the job’s parameters and should be run as a server-side ‘Run Script’ task. The script should be embedded and have the following contents,

    REM Set Parameters

    :: This job deploys software by copying source to client and executing
    :: install.bat file created by software packager in application root.
    ::
    :: This job has three parameters,
    ::                   APP_SHARE = UNC path to application share point
    ::                   APP_PATH   = Path to applications on share      
    ::                   LOCAL_SRC = Client root folder for software delivery
    
     
    SET APP_SHARE=\\ALTIRIS-DS2\EXPRESS
    SET APP_PATH=Apps\Adobe_Reader\11.0.09
    SET LOCAL_SRC=C:\SOURCE
    
    :: Make variables defined above available through token replacement
    REM %#*"{ call ins_parameters( %ID% )  }"%
     

    The above embedded script looks very much like a normal script, so it’s difficult to imagine that anything special is happening here to make these variables visible outside this script task. The last line, however, the one that calls the stored procedure ins_parameters, is the magic item. It tells the DS engine to parse this script, looking first for the “REM Set Parameters” string; if that’s found, it then proceeds to hunt down any local variable declarations. All variable declarations found are tucked away for later retrieval by the dbo.vars function (a verification sketch at the end of this walk-through shows how to confirm they arrived).

    In summary, you only need to insert two small lines into an embedded script in order to convert its variable declarations into job parameters. Try telling me that’s not cool ;-)

  2. The ‘Create Local Source Folder’ Script Task
    This task runs on the client and makes sure the local source folder exists as a hidden folder. Note how we now use the standard token replacement functionality to retrieve the value of the parameter LOCAL_SRC for this machine.
    REM Create local source folder
    
    SET SOURCE=%#*"select dbo.vars('LOCAL_SRC',%ID%)"%
    
    IF EXIST %SOURCE% GOTO END
    MD %SOURCE%
    ATTRIB %SOURCE% +H
    
    :END
  3. The ‘File Copy’ Task
    This is the task where the token replacement looks messiest. The file copy paths should be set up to copy the software source folder as follows,

    Source,

    %#*"select dbo.vars('APP_SHARE',%ID%)"%\%#*"select dbo.vars('APP_PATH',%ID%)"%

     

    Destination,

    %#*"select dbo.vars('LOCAL_SRC',%ID%)"%\%#*"select dbo.vars('APP_PATH',%ID%)"%

     

    It’s not pretty, but the nice thing is you’ll never need to edit this again. Just change the variables in the ‘Set Parameters’ task to change the locations!

  4. The ‘Install Software’ Script task
    This task also has a simple parameterised equivalent. All it does is change directory into the software source and run the install.bat file that drives the install.
    REM Install Software
    
    SET APP_PATH=%#*"select dbo.vars('APP_PATH',%ID%)"%
    SET LOCAL_SRC=%#*"select dbo.vars('LOCAL_SRC',%ID%)"%
    CD "%LOCAL_SRC%\%APP_PATH%"
    
    INSTALL.BAT

And that’s it; this job has now been fully parameterised. One handy point I’d like to emphasise is that the job’s data and its mechanics have now been completely divorced. This means that should I have to update a particular instance of this job in the future, it will simply be a case of slotting the new parameterised tasks into place.
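
As a final sanity check, once the ‘Set Parameters’ task has run against a machine you can confirm in SQL Server Management Studio that its three parameters landed in custom_vars. The computer ID below is only a placeholder, and the expected rows simply reflect the values defined in the ‘Set Parameters’ task above,

USE [eXpress] 
go 

-- Replace 5001 with the computer_id of the machine the job ran against 
SELECT varname, 
       value 
FROM   dbo.custom_vars 
WHERE  myid = 5001 

-- Expected rows, given the parameters defined in task 1: 
--   APP_SHARE   \\ALTIRIS-DS2\EXPRESS 
--   APP_PATH    Apps\Adobe_Reader\11.0.09 
--   LOCAL_SRC   C:\SOURCE 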

4. Summary

What I’ve tried to do today is pave the way for DS Administrators to parameterise their job sets. Enabling this feature requires just one stored procedure and one function to be imported into your MS SQL eXpress database.

Having a job parameterisation mechanism is a very positive and natural evolution in Deployment Server. Among its benefits are,

  1. The Altiris community now has a practical mechanism which allows it to share jobs; the ‘Set Parameters’ task effectively acts as a wizard for setting a job up to work in your environment.
  2. There is less need for the Altiris developers to create new wizards in order to deliver new functionality.
  3. Distribution and consumption of ‘Best Practice’ jobs becomes much easier.
  4. There is no need to maintain multiple forks of complex job sets whose only difference is their configuration parameters.
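
Should you ever want to back this customisation out, it is simply a case of dropping the three objects created in section 3 and then removing ins_parameters from the console’s ‘Allowed Stored Procedures’ list. A minimal removal sketch,

USE [eXpress] 
go 

IF EXISTS (SELECT * FROM sys.objects WHERE type = 'P' AND NAME = 'ins_parameters') 
  DROP PROCEDURE ins_parameters 
go 

IF EXISTS (SELECT * FROM sys.objects WHERE type = 'FN' AND NAME = 'vars') 
  DROP FUNCTION vars 
go 

IF EXISTS (SELECT * FROM dbo.sysobjects WHERE NAME = 'custom_vars') 
  DROP TABLE custom_vars 
go 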

As always, if you have any comments or issues with this, feel free to contact me.

5. Appendix: Thoughts on Task Parameterisation

In this section I’m going to delve into a multi-task DS job so you can understand some of the thinking which led to me creating this feature. In particular, I’m going to highlight how inefficient the positioning of configuration data across tasks can be, and the steps we can take as DS administrators to manage this. To finish, I’ll present my thoughts on how DS could handle parameterisation should it ever become a native feature.

The figure below depicts a software deployment job which delivers Adobe Reader.

image04.png

This job consists of three tasks which do the following;

  • first the local machine is prepared for the software delivery
  • next we copy the source files
  • finally the install script is executed

This is quite a simple job and forms the standard for our DS software deployment.

When we package a new software product we clone an existing software deployment job and edit the constituent tasks as appropriate. What we find in this cloning process is that every task needs to be touched to change just one highly duplicated piece of information: the software path.

In order to demonstrate this duplication of configuration data, below is a summary of each task’s function. With each task, I also present a table noting each configuration item the task consumes,

  1. A Client ‘Run Script’ Task (Create hidden local source folder)
    This script creates a local folder to act as a common, single root location for software installs. The script looks like this,
    REM Create local source folder
    IF EXIST C:\Source GOTO END
    MD C:\Source
    ATTRIB C:\Source +H
    
    :END

    The single piece of configuration data in this script is the local source folder C:\SOURCE.

    Configuration Item                        Value
    Client Software Delivery Source folder    C:\SOURCE

     

  2. A ‘File Copy’ Task
    This second task copies the software source to the local source folder. The task requires both a source and a destination folder to execute.

    In this particular instance, the source folder is,

    \\altiris-ds2\express\Apps\Adobe_Reader\11.0.09

     

    and the delivery point on the client is in the local hidden source,

    C:\Source\Apps\Adobe_Reader\11.0.09

     

    If we were to extract the configuration items out of this file copy task, we’d have,

    Configuration Item                        Value
    Client Software Delivery Source folder    C:\SOURCE
    Application Share                         \\ALTIRIS-DS2\EXPRESS
    Application path                          Apps\Adobe_Reader\11.0.09

     

  3. A Client ‘Run Script’ Task (Execute software install)
    Finally we have the client-side ‘Run Script’ task which executes the software install batch file from the local source folder. This script looks like this,
    REM Install Software
    CD C:\Source\Apps\Adobe_Reader\11.0.09
    INSTALL.BAT

    And this script uses the following configuration items,

    Configuration Item                        Value
    Client Software Delivery Source folder    C:\SOURCE
    Application path                          Apps\Adobe_Reader\11.0.09

     

So, what does this show us? Well, as DS tasks can’t share information, we are forced to duplicate it; the client source folder (C:\SOURCE), as a result, must be hardcoded in every task. This requirement to duplicate data means that, in fact, we have 6 instances of only 3 configuration items. Even in this relatively simple and common scenario, much of the configuration information held in each task is simply a repeat of information provided elsewhere. And this job isn’t even complex; the ones we have for OS image delivery are significantly more intricate than this.

This has been an issue with DS for years. I’ve created and maintained many DS job templates in my time as a DS admin, and managing these has led me to an irrefutable conclusion about DS job design: not being able to manage the location and duplication of information introduces a significant and challenging maintenance burden.

This makes DS administrators naturally wary of maintaining large job sets. Managing the inefficiencies imposed by the lack of configuration data management can, quite bluntly, be very cumbersome.

5.1. Task-by-Task Parameterisation

In cases where DS jobs reach a level of complexity at which their maintenance begins to annoy the DS administrator, the impact of that complexity can be reduced somewhat by individually parameterising each task. This is achieved by ensuring all references to external configuration data within each task are declared as variables. I personally also attempt to modularise my scripts by following these declarations, where possible, with calls to subroutines in order to improve clarity.

Below is an example snippet of a script which injects the DAgent install files onto our client machines as a post-image task,

TITLE Deliver DS6.9 Deployment Agent

:: This script delivers the DS6.9 Agent with the appropriate
:: configuration files to the correct drive

::::::::::::::::::::::::::::::
:: SET TASK PARAMETERS HERE ::
::::::::::::::::::::::::::::::

SET DAGENT=%ALTIRIS_SHARE%\AGENTS\ACLIENT\DAGENT_x64.MSI
SET DSSERVER_FQDN=MYDS1.ACME.ORG

:::::::::::::::::::::::::::::
::  END OF TASK PARAMETERS ::
:::::::::::::::::::::::::::::

:: Find the drive letter for the production Windows drive
:: Drive with C:\Windows\System32 folder which isn't WinPE
CALL :Find_Production_Drive_Letter

:: Copy the Agent install MSI to production drive and 
:: create custom DAgent configuration  file
CALL :COPYAGENTFILES
CALL :CREATEAGENTCFG

EXIT

In the above we have two sections to the script: the first declares the two variables I’ve gauged as being useful to parameterise, and the second calls the subroutines that consume them. I refer to these variables as parameters as they reflect external environmental configuration values (rather than internal data objects created purely to enable the programming flow).

This approach has the advantage that, should anyone need to modify this task, they are instantly shown which data the task regards as user-changeable. Critically, there is now no need to dig deep into the script’s routines to make changes to site data, like the Deployment Server FQDN. What this means is that this task can now be copied and consumed by other DS administrators without leaning on their programming expertise.

However, even with individual task parameterisation, complex job maintenance can still be a challenge. You can’t guarantee that only one place, in one task, needs to be updated to effect what appears to be a single change. All tasks should ideally be analysed to avoid unintended consequences.

5.2. The ‘Ideal’ DS Enhancement: The Job Parameters Task

It should be no surprise to learn that the creators of Deployment Server already understood the need to parameterise complex jobs. In fact, most of the task types available in DS are a consequence of this. Take the ‘Scripted OS Install’ task, for example. When you create a scripted OS install task, a friendly Altiris wizard leads you through all the parameters required to drive the task, and the internal scripts that execute the OS installation are hidden from view.

Critically, if you need to change anything about this fully parameterised task (such as the OS source), you can be totally confident that there is just one clear place to make that change, and the wizard will just make it work. Amazing, huh?

Let’s pretend then, just for a moment, that Deployment Server has a user-facing mechanism to define job parameters. In this fantasy, what would our jobs look like?

First, I think it’s likely that the first task in any of our more complex jobs would be a DS task called “Set Parameters”. This native task would enable us to define the job’s configuration parameters. A parameterised job to deploy Adobe Reader might therefore look like this,

image05.png

Note that the custom tokens %!APP_PATH!% and %!APP_SHARE!% in the “Copy File” task are completely fictitious; they simply illustrate how such parameters could be retrieved in this ‘ideal’ world.

In this job, the only thing we’d need to change when cloning it for a new software delivery is the set of parameters declared in the ‘Set Parameters’ task. From our previous analysis of this job, the task parameters would be as follows,

Configuration Item                        Parameter Name    Value
Client Software Delivery Source folder    LOCAL_SRC         C:\SOURCE
Application Share                         APP_SHARE         \\ALTIRIS-DS2\EXPRESS
Application path                          APP_PATH          Apps\Adobe_Reader\11.0.09

 

If we drilled down into the “Set Parameters” task, we’d find a listing of the job’s user-defined parameters. Perhaps something like this,

image06.png

This would ultimately provide a much more intuitive means to parameterise jobs than I can achieve with T-SQL and scripts alone.

Attachment(s)
DS6.9_Parameterisation_v1.0.zip (4 KB)
DS6.9_Parameterisation_v1.1.zip (4 KB)
DS6.9_Parameterisation_v1.1b.zip (4 KB)
Job Parameterisation in DS6.9.docx (240 KB)


Comments

May 07, 2015 04:54 AM

Minor update to zip today to update code to 1.1b to allow T-SQL to execute 'as is' if sp and function already exist.

Feb 04, 2015 09:01 AM

Note: Feb 4th 2015: Updated stored procedure ins_parameters to fix situations where tasks weren't getting token replacements. Attached zip now at version 1.1

Oct 16, 2014 08:51 AM

This is truly awesome! I will definitely make use of this! Ian your work is going to cause DS6.9 to live on forever in our environment.  Thanks for all your hard work! We really appreciate it.
