Welcome to ShenZhenJia Knowledge Sharing Community for programmer and developer-Open, Learning and Share
I'm trying to run a Pentaho job on a remote system using Airflow. I was able to use Airflow's SSHOperator to SSH into the remote system and run a shell script, but I'm wondering how to pass parameters to the shell script.

For example, the shell command looks like

sh "$PENT_HOME"/kitchen.sh -file="$PENT_HOME/Details.kjb" -level=Basic > "$logFolder/$(date -u +"%Y-%m-%dT%H:%M:%SZ")-Details.log" 2>&1

My DAG script for running a simple shell script without parameters is

t1 = SSHOperator(ssh_conn_id='SSH-dev',
    task_id='ssh_operator',
    command='/opt/scripts/test.sh ',
    dag=dag)


1 Answer

The command argument accepts any bash/shell command, so you can append arguments directly, for example:

t1 = SSHOperator(ssh_conn_id='SSH-dev',
    task_id='ssh_operator',
    command='/opt/scripts/test.sh arg1 arg2',
    dag=dag)

In your case:

t1 = SSHOperator(ssh_conn_id='SSH-dev',
    task_id='ssh_operator',
    command='bash "$PENT_HOME"/kitchen.sh -file="$PENT_HOME/Details.kjb" -level=Basic > "$logFolder/$(date -u +"%Y-%m-%dT%H:%M:%SZ")-Details.log" 2>&1',
    dag=dag)
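Since the full command string gets long and quote-heavy, one option is to build it with a small Python helper in the DAG file and pass the result to the operator's command argument. A minimal sketch, assuming the kitchen_command helper name and its parameters are invented for illustration; the date expansion is deliberately left as a $(...) substitution so it runs on the remote shell, not in Python:

```python
def kitchen_command(job_file, level="Basic", log_folder="$logFolder"):
    """Build the remote Kitchen invocation with a UTC-timestamped log file.

    job_file, level, and log_folder are placeholders for this sketch;
    $PENT_HOME and $logFolder are resolved by the remote shell.
    """
    job_name = job_file.rsplit(".", 1)[0]  # "Details.kjb" -> "Details"
    return (
        f'bash "$PENT_HOME"/kitchen.sh -file="$PENT_HOME/{job_file}" '
        f'-level={level} '
        f'> "{log_folder}/$(date -u +"%Y-%m-%dT%H:%M:%SZ")-{job_name}.log" 2>&1'
    )

# then in the DAG:
# t1 = SSHOperator(ssh_conn_id='SSH-dev',
#     task_id='ssh_operator',
#     command=kitchen_command('Details.kjb'),
#     dag=dag)
```

This keeps the quoting in one place and lets you reuse the same task definition for different .kjb files or log levels.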
