
Altiris Workflow Designer: read file from Linux and execute SQL component

Created: 09 Sep 2013 | 2 comments

Hi All,

We have a requirement.

In my environment, some Linux batch jobs run every day and dynamically generate log files named daily_dm_[ddmmyyhhmmss].txt.

These log files are generated on the Linux server. From an Altiris Workflow template we have to read each file, execute the SQL component based on the log content, and then move the file to an archive location.

For a single file without a date/time suffix, we can read it with "cat daily_dm.txt".

I have no idea how to read these dynamically named logs and run an Oracle SQL query (SQL component) using the log data from Altiris Workflow. Please help us resolve this.

 

Thanks



2 Comments

reecardo:

I'm unsure of Linux support for Workflow, but you can probably organize a job to copy the logs to some Windows directory. If you know the directory the logs are written to, you can use a List Files in Directory component to get the filenames. Then you can use an Iterate Text File Lines component to read each file line by line.

As far as running SQL against Oracle, I would just generate a SQL Script component against an Oracle provider. You can specify parameters for the SQL if needed, using the log data as the parameter values in the Designer once the component is generated.
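The list-then-iterate approach above can be sketched outside Workflow as plain code. This is a minimal Python sketch, not Workflow's own implementation; the directory path and the `daily_dm_*` filename pattern are assumptions based on the naming scheme described in the question:

```python
import glob
import os

def read_dynamic_logs(log_dir):
    """List every daily_dm_* log in log_dir (whatever its date/time
    suffix) and return a list of (filename, non-empty lines) pairs."""
    results = []
    for path in sorted(glob.glob(os.path.join(log_dir, "daily_dm_*"))):
        with open(path) as fh:
            lines = [line.strip() for line in fh if line.strip()]
        results.append((os.path.basename(path), lines))
    return results
```

In Workflow terms, the `glob` call plays the role of List Files in Directory (with a wildcard filter) and the inner loop plays the role of Iterate Text File Lines.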

treddy329:

Hi reecardo,

In my environment, copying files from one server to another is forbidden.

AWF does support the Linux environment; by using plink we can connect to the Linux server.
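Since the files cannot be copied off the Linux server, plink can run the remote commands in place. A minimal sketch of building such a command line from a Workflow-style process step, assuming plink is on the PATH and key-based authentication is already set up (the user, host, and remote command shown are placeholders):

```python
def build_plink_command(user, host, remote_cmd):
    """Build the argument list for a remote command over plink.
    -batch disables interactive prompts; -ssh forces the SSH protocol."""
    return ["plink", "-batch", "-ssh", f"{user}@{host}", remote_cmd]

# e.g. list the dynamically named logs, or cat one of them:
list_cmd = build_plink_command("wfuser", "linuxhost",
                               "ls /var/log/batch/daily_dm_*")
# The resulting list could then be passed to subprocess.run(...,
# capture_output=True) and the stdout parsed line by line.
```

Running `ls daily_dm_*` remotely gives the current set of filenames without any file ever leaving the server, which sidesteps the copy restriction.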

My log file contains only a date, e.g. 11-sep-2013, and a country name, e.g. USA; these two values are passed to the SQL component.

e.g. daily_dm_11092013_hhmmss.log (only one line every time):

11-sep-2013 USA

I want to read the content, then execute the workflow, and at the end of the flow the file has to move to an archive directory on the same server.

We have already achieved file reading, workflow execution, and archiving for static logs. For dynamically named logs, we do not understand how to read the files one by one in a loop.
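The per-file loop body being asked about can be sketched as follows: parse the single "date country" line into the two SQL parameters, then move the file into the archive directory. A minimal Python sketch, assuming local filesystem access; `process_log` and the paths are hypothetical names for illustration:

```python
import os
import shutil

def process_log(path, archive_dir):
    """Read the single 'date country' line from the log, return both
    values (the SQL component's parameters), then archive the file."""
    with open(path) as fh:
        run_date, country = fh.readline().split()
    # ...here the two values would feed the Oracle SQL component...
    os.makedirs(archive_dir, exist_ok=True)
    shutil.move(path, os.path.join(archive_dir, os.path.basename(path)))
    return run_date, country
```

Looping this over whatever filenames the directory listing returns handles the dynamic names: each iteration reads one file, runs the SQL with its values, and archives it before moving on.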

Please share your views.

Thanks in advance for your help.
