
Permission exception for Sqoop

Stack: HDP-2.3.2.0-2950 installed with Ambari 2.1

The installation was automated, since the machines (9 nodes in total) had an Internet connection and root credentials were used.

For reference, the output of an ls command (note that the sqoop user is missing):

[[email protected] ~]# hadoop fs -ls /user 
Found 7 items 
drwx------ - accumulo hdfs   0 2015-11-05 14:03 /user/accumulo 
drwxrwx--- - ambari-qa hdfs   0 2015-10-30 16:08 /user/ambari-qa 
drwxr-xr-x - hcat  hdfs   0 2015-10-30 16:17 /user/hcat 
drwxr-xr-x - hdfs  hdfs   0 2015-11-11 10:09 /user/hdfs 
drwx------ - hive  hdfs   0 2015-11-06 09:42 /user/hive 
drwxrwxr-x - oozie  hdfs   0 2015-11-05 12:53 /user/oozie 
drwxrwxr-x - spark  hdfs   0 2015-11-05 13:59 /user/spark 
[[email protected] ~]# 
[[email protected] ~]# 

Another cause for concern (again, the sqoop user is missing) is what I found when I looked at the user groups:

cat /etc/group 
root:x:0: 
bin:x:1:bin,daemon 
daemon:x:2:bin,daemon 
sys:x:3:bin,adm 
adm:x:4:adm,daemon 
tty:x:5: 
disk:x:6: 
lp:x:7:daemon 
mem:x:8: 
kmem:x:9: 
wheel:x:10: 
mail:x:12:mail 
uucp:x:14: 
man:x:15: 
games:x:20: 
gopher:x:30: 
video:x:39: 
dip:x:40: 
ftp:x:50: 
lock:x:54: 
audio:x:63: 
nobody:x:99: 
users:x:100:oozie,ambari-qa,tez,falcon 
dbus:x:81: 
utmp:x:22: 
utempter:x:35: 
floppy:x:19: 
vcsa:x:69: 
cdrom:x:11: 
tape:x:33: 
dialout:x:18: 
haldaemon:x:68:haldaemon 
ntp:x:38: 
saslauth:x:76: 
mailnull:x:47: 
smmsp:x:51: 
stapusr:x:156: 
stapsys:x:157: 
stapdev:x:158: 
sshd:x:74: 
tcpdump:x:72: 
slocate:x:21: 
ovirtagent:x:175: 
rpc:x:32: 
rpcuser:x:29: 
nfsnobody:x:65534: 
munin:x:499: 
screen:x:84: 
scotty:x:999: 
tquest:x:6382: 
fuse:x:497: 
httpfs:x:496:httpfs 
knox:x:6383: 
spark:x:6384: 
hdfs:x:6385:hdfs 
accumulo:x:495: 
falcon:x:494: 
flume:x:493: 
hbase:x:492: 
hive:x:491: 
oozie:x:490: 
storm:x:489: 

When importing a table from SQL Server to HDFS with Sqoop (as the 'sqoop' Linux user), the import fails.
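The command used was roughly of the following shape (host, database, and credentials are placeholders, not values from the post; only the table name DimSampleDesc appears in the logs further below):

sqoop import \
  --connect "jdbc:sqlserver://<sqlserver-host>:1433;databaseName=<database>" \
  --username <sql-user> --password <sql-password> \
  --table DimSampleDesc \
  --target-dir /user/sqoop/DimSampleDesc

The job fails with the following error: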

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=sqoop, access=WRITE, inode="/user/sqoop/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

When importing a table from SQL Server to HDFS using Sqoop (as the 'root' Linux user):

ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) 
     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) 
     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) 
     at java.lang.reflect.Constructor.newInstance(Constructor.java:526) 
     at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106) 
     at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3010) 
     at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2978) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1047) 
     at org.apache.hadoop.hdfs.DistributedFileSystem$21.doCall(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1043) 
     at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1036) 
     at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133) 
     at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) 
     at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) 
     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308) 
     at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169) 
     at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266) 
     at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673) 
     at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163) 
     at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497) 
     at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605) 
     at org.apache.sqoop.Sqoop.run(Sqoop.java:148) 
     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70) 
     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226) 
     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235) 
     at org.apache.sqoop.Sqoop.main(Sqoop.java:244) 
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213) 
     at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1771) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1755) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1738) 
     at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71) 
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3896) 
     at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:984) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622) 
     at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616) 
     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2137) 
     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2133) 
     at java.security.AccessController.doPrivileged(Native Method) 
     at javax.security.auth.Subject.doAs(Subject.java:415) 
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) 
     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2131) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1427) 
     at org.apache.hadoop.ipc.Client.call(Client.java:1358) 
     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) 
     at com.sun.proxy.$Proxy14.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:558) 
     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) 
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
     at java.lang.reflect.Method.invoke(Method.java:606) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187) 
     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) 
     at com.sun.proxy.$Proxy15.mkdirs(Unknown Source) 
     at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:3008) 
     ... 28 more 

When I import a table from SQL Server to HDFS using Sqoop (as the 'hdfs' Linux user), it works, but an error statement still shows up in the log:

INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.3.2.0-2950 
16/05/04 16:34:13 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead. 
16/05/04 16:34:14 INFO manager.SqlManager: Using default fetchSize of 1000 
16/05/04 16:34:14 INFO tool.CodeGenTool: Beginning code generation 
16/05/04 16:34:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [DimSampleDesc] AS t WHERE 1=0 
16/05/04 16:34:15 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/hdp/2.3.2.0-2950/hadoop-mapreduce 
Note: /tmp/sqoop-hdfs/compile/6f239d67662b5e2a3462b51268033d6e/DimSampleDesc.java uses or overrides a deprecated API. 
Note: Recompile with -Xlint:deprecation for details. 
16/05/04 16:34:17 ERROR orm.CompilationManager: Could not make directory: /root/. 

I have the following questions:

  1. Why does this error occur despite an automated installation, i.e. I did not skip any services or configurations?
  2. What is the ideal way to run a Sqoop import or an MR job (I mean, which user should be used)?

Answer


You have to create a home directory (on HDFS) for the user that launches the command. When you launch the Sqoop command, Hadoop maps the local user to an HDFS user and tries to find that user's home, which is /user/${user.name}.
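For example (an illustrative session, not taken from the post), when running as the sqoop local user, the mapped home directory does not exist yet:

$ whoami
sqoop
$ hadoop fs -ls /user/sqoop
ls: `/user/sqoop': No such file or directory

Job submission then tries to create /user/sqoop/.staging under that home, which is exactly what the error above complains about.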

Since the Hadoop superuser appears to be hdfs, you need to do:

$ su - hdfs -c 'hadoop fs -mkdir /user/sqoop'
$ su - hdfs -c 'hadoop fs -chown sqoop:hdfs /user/sqoop'

and then launch Sqoop as the sqoop user.
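A quick way to verify the new home directory (illustrative; the listing should show sqoop as the owner and hdfs as the group):

$ su - hdfs -c 'hadoop fs -ls -d /user/sqoop'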

Another alternative is to change Hive's staging directory to a different HDFS location that all users can write to (such as /tmp), in hive-site.xml:

<property> 
    <name>hive.exec.stagingdir</name> 
    <value>/tmp</value> 
</property> 

I have edited my question, could you check it and help me understand? –


Yes, exactly! When you launch the command as the 'hdfs' user, Hadoop runs it as hdfs, and since the 'hdfs' user already has its home on HDFS, the request goes through... but with the sqoop user, it tries to use the home /user/sqoop, cannot find it, so it tries to create it and discovers that sqoop has no permission to write to /user (owned by the superuser 'hdfs'), hence the error... hope that helps – user1314742


That works. I now have several follow-up doubts and another issue, but I will open a separate question for that –