I am trying to write a simple shell action in Oozie that copies files from the local filesystem to HDFS, but I am getting an error.
Here is my workflow.xml:
<workflow-app name="WorkFlowCopyLocalTohdfs" xmlns="uri:oozie:workflow:0.1">
<start to="sshAction"/>
<action name="sshAction">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<exec>/user/root5/Oozie/Workflow/WorkFlowCopyLocalTohdfs/uploadFile.sh</exec>
<file>/user/root5/Oozie/Workflow/WorkFlowCopyLocalTohdfs/uploadFile.sh#uploadFile.sh</file>
<capture-output/>
</shell>
<ok to="end" />
<error to="killAction"/>
</action>
<kill name="killAction">
<message>"Killed job due to error"</message>
</kill>
<end name="end"/>
</workflow-app>
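A common fix for this kind of setup (a sketch only, and assuming uploadFile.sh really sits at that HDFS path): in a shell action, `<exec>` names the command relative to the container's working directory, while `<file>` stages the HDFS copy into that directory; the name after `#` is the symlink the file appears under, so `<exec>` should use that name rather than the full HDFS path:

```xml
<!-- Hedged sketch: exec refers to the staged script by its symlink name -->
<shell xmlns="uri:oozie:shell-action:0.1">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <exec>uploadFile.sh</exec>
    <file>/user/root5/Oozie/Workflow/WorkFlowCopyLocalTohdfs/uploadFile.sh#uploadFile.sh</file>
    <capture-output/>
</shell>
```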
My uploadFile.sh is:
#!/bin/bash -e
hadoop fs -copyFromLocal /home/root5/Desktop/Avinash_sampleData/DataFolder/Data_04-05-2016 /user/root5/Oozie/DataFolder
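With `bash -e`, any failing command makes the script exit non-zero, which is exactly what surfaces as `ShellMain ... exit code [1]`. Since the Oozie launcher runs on an arbitrary NodeManager, `/home/root5/...` will usually not exist on that machine. A defensive rewrite (paths copied from the question; the existence/CLI guards are my addition, not part of the original script) might look like:

```shell
#!/bin/bash
# Paths from the question; the guards below are additions for illustration.
SRC=/home/root5/Desktop/Avinash_sampleData/DataFolder/Data_04-05-2016
DST=/user/root5/Oozie/DataFolder
if [ -e "$SRC" ] && command -v hadoop >/dev/null 2>&1; then
  # Only attempt the copy when the local file and the hadoop CLI exist here.
  hadoop fs -copyFromLocal "$SRC" "$DST" && result="COPIED"
else
  result="SKIPPED: $SRC missing or hadoop CLI unavailable on this node"
fi
echo "$result"
```

The echoed message makes it obvious from the Oozie action logs which node the launcher landed on and why the copy did not happen there.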
My job.properties:
nameNode=hdfs://localhost:8020
jobTracker=localhost:8021
queueName=default
oozie.libpath=${nameNode}/user/root/oozie-workflows/lib
oozie.use.system.libpath=true
oozie.wf.rerun.failnodes=true
oozieProjectRoot=${nameNode}/user/root5/Oozie
appPath=${oozieProjectRoot}/Workflow/WorkFlowCopyLocalTohdfs
oozie.wf.application.path=${appPath}
#inputDir=${oozieProjectRoot}/data
[email protected]
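For reference, a workflow with these properties is typically submitted with the Oozie CLI. The server URL below is an assumption (the default Oozie port), and the command is guarded so it degrades gracefully on a machine without the Oozie client:

```shell
# Hypothetical submission command; the server URL is an assumption.
if command -v oozie >/dev/null 2>&1; then
  msg="$(oozie job -oozie http://localhost:11000/oozie -config job.properties -run 2>&1)"
else
  msg="oozie CLI not on PATH; run this from a node with the Oozie client installed"
fi
echo "$msg"
```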
and the stack trace in Oozie is:
2016-05-04 16:09:36,023 INFO ActionStartXCommand:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@:start:] Start action [0000012-160425173341619-oozie-oozi-W@:start:] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-05-04 16:09:36,023 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@:start:] [***0000012-160425173341619-oozie-oozi-W@:start:***]Action status=DONE
2016-05-04 16:09:36,023 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@:start:] [***0000012-160425173341619-oozie-oozi-W@:start:***]Action updated in DB!
2016-05-04 16:09:36,209 INFO ActionStartXCommand:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] Start action [0000012-160425173341619-oozie-oozi-W@sshAction] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-05-04 16:09:36,353 WARN ShellActionExecutor:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] credentials is null for the action
2016-05-04 16:09:37,441 INFO ShellActionExecutor:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] checking action, external ID [job_201604251732_0160] status [RUNNING]
2016-05-04 16:09:37,544 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] [***0000012-160425173341619-oozie-oozi-W@sshAction***]Action status=RUNNING
2016-05-04 16:09:37,544 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] [***0000012-160425173341619-oozie-oozi-W@sshAction***]Action updated in DB!
2016-05-04 16:09:53,082 INFO CallbackServlet:539 - USER[-] GROUP[-] TOKEN[-] APP[-] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] callback for action [0000012-160425173341619-oozie-oozi-W@sshAction]
2016-05-04 16:09:53,317 INFO ShellActionExecutor:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] action completed, external ID [job_201604251732_0160]
2016-05-04 16:09:53,346 WARN ShellActionExecutor:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.ShellMain], exit code [1]
2016-05-04 16:09:53,576 INFO ActionEndXCommand:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@sshAction] ERROR is considered as FAILED for SLA
2016-05-04 16:09:53,754 INFO ActionStartXCommand:539 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@killAction] Start action [0000012-160425173341619-oozie-oozi-W@killAction] with user-retry state : userRetryCount [0], userRetryMax [0], userRetryInterval [10]
2016-05-04 16:09:53,755 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@killAction] [***0000012-160425173341619-oozie-oozi-W@killAction***]Action status=DONE
2016-05-04 16:09:53,755 WARN ActionStartXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[0000012-160425173341619-oozie-oozi-W@killAction] [***0000012-160425173341619-oozie-oozi-W@killAction***]Action updated in DB!
2016-05-04 16:09:53,943 WARN CoordActionUpdateXCommand:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000012-160425173341619-oozie-oozi-W] ACTION[-] E1100: Command precondition does not hold before execution, [, coord action is null], Error Code: E1100
Please help me with how to proceed.
The Hive workflow.xml:
<workflow-app name="WorkFlowCopyLocalTohdfs" xmlns="uri:oozie:workflow:0.1">
<start to="hive-node"/>
<action name="hive-node">
<hive xmlns="uri:oozie:hive-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<job-xml>hive-site.xml</job-xml>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>default</value>
</property>
<property>
<name>oozie.hive.defaults</name>
<value>hive-site.xml</value>
</property>
</configuration>
<script>Hive_script.hql</script>
</hive>
<ok to="end"/>
<error to="killAction"/>
</action>
<kill name="killAction">
<message>"Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]"</message>
</kill>
<end name="end"/>
</workflow-app>
And the Hive_script.hql
LOAD DATA INPATH '/user/root5/Oozie/DataFolder/Data_04_05_2016.txt' INTO TABLE OOZIE_TABLE1;
You mean the script file /user/root5/Oozie/Workflow/WorkFlowCopyLocalTohdfs/uploadFile.sh should be placed in HDFS? If so, does the path mentioned have to be the HDFS path? I have also tried http://localhost:8020//user/root5/Oozie/Workflow/WorkFlowCopyLocalTohdfs/uploadFile.sh; neither worked. – Avinash
What I am talking about is this line, 'hadoop fs -copyFromLocal /home/root5/Desktop/Avinash_sampleData/DataFolder/Data_04-05-2016 /user/root5/Oozie/DataFolder', in your uploadFile.sh. I don't think you can run such a line in an Oozie job, because Oozie cannot know which path on which local filesystem you mean. The file already has to be on HDFS. – OberstK
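OberstK's point can be sketched as a one-time upload done from an edge node (a machine where the local file actually exists) before the workflow is submitted. Paths are taken from the question; the CLI guard is an addition for illustration:

```shell
# Run once from an edge node where the local file exists, before
# submitting the workflow (paths from the question; guard added).
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -mkdir -p /user/root5/Oozie/DataFolder
  hadoop fs -put /home/root5/Desktop/Avinash_sampleData/DataFolder/Data_04-05-2016 \
    /user/root5/Oozie/DataFolder/
  note="uploaded"
else
  note="hadoop CLI not on PATH; this is meant for a cluster edge node"
fi
echo "$note"
```

The workflow then only needs to reference HDFS paths and no longer depends on which node the launcher happens to run on.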
OK, but when I try to run the Hive workflow action as well, it returns a similar error. Instead of throwing a ShellMain error, it throws a HiveMain error: 2016-05-05 11:44:34,880 WARN HiveActionExecutor:542 - USER[root5] GROUP[-] TOKEN[] APP[WorkFlowCopyLocalTohdfs] JOB[0000029-160425173341619-oozie-oozi-W] ACTION[0000029-160425173341619-oozie-oozi-W@hive-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [40000] – Avinash