Flink package missing class CheckpointCommitter - flink-connector-cassandra - fatal error

I get this error when I submit a new job to Flink (v 1.0.3) in my local environment:

Caused by: java.lang.NoClassDefFoundError: org/apache/flink/streaming/runtime/operators/CheckpointCommitter
    at org.apache.flink.streaming.connectors.cassandra.CassandraSink.addSink(CassandraSink.java:164)
    at com.xxx.yyy.sample.backend.flink.AAAAA.main(AAAAA.java:99)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
    at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:80)
    ... 33 more

Looking at the source code on GitHub

https://github.com/apache/flink/tree/master/flink-streaming-java/src/main/java/org/apache/flink/streaming/runtime/operators

I would think that the class should be available and packaged in flink-streaming-java.jar, but once that jar is imported, the class is not there.

missing CheckpointCommitter

Any ideas?

pom.xml

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

<groupId>com.AAAAA.BBB</groupId> 
<artifactId>sample-backend-flink</artifactId> 
<version>0.0.1-SNAPSHOT</version> 
<packaging>jar</packaging> 

<name>Flink Quickstart Job</name> 
<url>http://www.myorganization.org</url> 

<properties> 
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> 
    <flink.version>1.0.3</flink.version> 
    <cassandra.version>2.2.7</cassandra.version> 
    <driver.version>3.0.0</driver.version> 
</properties> 

<repositories> 
    <repository> 
     <id>apache.snapshots</id> 
     <name>Apache Development Snapshot Repository</name> 
     <url>https://repository.apache.org/content/repositories/snapshots/</url> 
     <releases> 
      <enabled>true</enabled> 
     </releases> 
     <snapshots> 
      <enabled>true</enabled> 
     </snapshots> 
    </repository> 
</repositories> 


<dependencies> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-java</artifactId> 
     <version>${flink.version}</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-streaming-java_2.10</artifactId> 
     <version>${flink.version}</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-clients_2.10</artifactId> 
     <version>${flink.version}</version> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-core</artifactId> 
     <version>${flink.version}</version> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-connector-kafka-0.8_2.10</artifactId> 
     <version>1.0.2</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-connector-cassandra_2.10</artifactId> 
     <version>1.0.2</version> 
    </dependency> 
    <dependency> 
     <groupId>org.apache.flink</groupId> 
     <artifactId>flink-connector-redis_2.10</artifactId> 
     <version>1.1-SNAPSHOT</version> 
    </dependency> 
</dependencies> 

<profiles> 
    <profile> 
     <!-- Profile for packaging correct JAR files --> 
     <id>build-jar</id> 
     <activation> 
      <activeByDefault>false</activeByDefault> 
     </activation> 
     <dependencies> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-java</artifactId> 
       <version>${flink.version}</version> 
       <scope>provided</scope> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-streaming-java_2.10</artifactId> 
       <version>${flink.version}</version> 
       <scope>provided</scope> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-streaming-scala_2.10</artifactId> 
       <version>${flink.version}</version> 
       <scope>provided</scope> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-clients_2.10</artifactId> 
       <version>${flink.version}</version> 
       <scope>provided</scope> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-core</artifactId> 
       <version>${flink.version}</version> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-connector-kafka-0.8_2.10</artifactId> 
       <version>1.0.2</version> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-connector-cassandra_2.10</artifactId> 
       <version>1.1-SNAPSHOT</version> 
      </dependency> 
      <dependency> 
       <groupId>org.apache.flink</groupId> 
       <artifactId>flink-connector-redis_2.10</artifactId> 
       <version>1.1-SNAPSHOT</version> 
      </dependency> 
     </dependencies> 

     <build> 
      <plugins> 
       <!-- disable the exclusion rules --> 
       <plugin> 
        <groupId>org.apache.maven.plugins</groupId> 
        <artifactId>maven-shade-plugin</artifactId> 
        <version>2.4.1</version> 
        <executions> 
         <execution> 
          <phase>package</phase> 
          <goals> 
           <goal>shade</goal> 
          </goals> 
          <configuration> 
           <artifactSet> 
            <excludes combine.self="override"></excludes> 
           </artifactSet> 
          </configuration> 
         </execution> 
        </executions> 
       </plugin> 
      </plugins> 
     </build> 
    </profile> 
</profiles> 

<build> 
    <plugins> 
     <plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-shade-plugin</artifactId> 
      <version>2.4.1</version> 
      <executions> 
       <!-- Run shade goal on package phase --> 
       <execution> 
        <phase>package</phase> 
        <goals> 
         <goal>shade</goal> 
        </goals> 
        <configuration> 
         <artifactSet> 
          <excludes><exclude>org.apache.flink:flink-annotations</exclude> 
           <exclude>org.apache.flink:flink-shaded-hadoop1</exclude> 
           <exclude>org.apache.flink:flink-shaded-hadoop2</exclude> 
           <exclude>org.apache.flink:flink-shaded-curator-recipes</exclude> 
           <exclude>org.apache.flink:flink-core</exclude> 
           <exclude>org.apache.flink:flink-java</exclude> 
           <exclude>org.apache.flink:flink-scala_2.10</exclude> 
           <exclude>org.apache.flink:flink-runtime_2.10</exclude> 
           <exclude>org.apache.flink:flink-optimizer_2.10</exclude> 
           <exclude>org.apache.flink:flink-clients_2.10</exclude> 
           <exclude>org.apache.flink:flink-avro_2.10</exclude> 
           <exclude>org.apache.flink:flink-examples-batch_2.10</exclude> 
           <exclude>org.apache.flink:flink-examples-streaming_2.10</exclude> 
           <exclude>org.apache.flink:flink-streaming-java_2.10</exclude> 


           <exclude>org.scala-lang:scala-library</exclude> 
           <exclude>org.scala-lang:scala-compiler</exclude> 
           <exclude>org.scala-lang:scala-reflect</exclude> 
           <exclude>com.amazonaws:aws-java-sdk</exclude> 
           <exclude>com.typesafe.akka:akka-actor_*</exclude> 
           <exclude>com.typesafe.akka:akka-remote_*</exclude> 
           <exclude>com.typesafe.akka:akka-slf4j_*</exclude> 
           <exclude>io.netty:netty-all</exclude> 
           <exclude>io.netty:netty</exclude> 
           <exclude>commons-fileupload:commons-fileupload</exclude> 
           <exclude>org.apache.avro:avro</exclude> 
           <exclude>commons-collections:commons-collections</exclude> 
           <exclude>org.codehaus.jackson:jackson-core-asl</exclude> 
           <exclude>org.codehaus.jackson:jackson-mapper-asl</exclude> 
           <exclude>com.thoughtworks.paranamer:paranamer</exclude> 
           <exclude>org.xerial.snappy:snappy-java</exclude> 
           <exclude>org.apache.commons:commons-compress</exclude> 
           <exclude>org.tukaani:xz</exclude> 
           <exclude>com.esotericsoftware.kryo:kryo</exclude> 
           <exclude>com.esotericsoftware.minlog:minlog</exclude> 
           <exclude>org.objenesis:objenesis</exclude> 
           <exclude>com.twitter:chill_*</exclude> 
           <exclude>com.twitter:chill-java</exclude> 
           <exclude>com.twitter:chill-avro_*</exclude> 
           <exclude>com.twitter:chill-bijection_*</exclude> 
           <exclude>com.twitter:bijection-core_*</exclude> 
           <exclude>com.twitter:bijection-avro_*</exclude> 
           <exclude>commons-lang:commons-lang</exclude> 
           <exclude>junit:junit</exclude> 
           <exclude>de.javakaffee:kryo-serializers</exclude> 
           <exclude>joda-time:joda-time</exclude> 
           <exclude>org.apache.commons:commons-lang3</exclude> 
           <exclude>org.slf4j:slf4j-api</exclude> 
           <exclude>org.slf4j:slf4j-log4j12</exclude> 
           <exclude>log4j:log4j</exclude> 
           <exclude>org.apache.commons:commons-math</exclude> 
           <exclude>org.apache.sling:org.apache.sling.commons.json</exclude> 
           <exclude>commons-logging:commons-logging</exclude> 
           <exclude>commons-codec:commons-codec</exclude> 
           <exclude>com.fasterxml.jackson.core:jackson-core</exclude> 
           <exclude>com.fasterxml.jackson.core:jackson-databind</exclude> 
           <exclude>com.fasterxml.jackson.core:jackson-annotations</exclude> 
           <exclude>stax:stax-api</exclude> 
           <exclude>com.typesafe:config</exclude> 
           <exclude>org.uncommons.maths:uncommons-maths</exclude> 
           <exclude>com.github.scopt:scopt_*</exclude> 
           <exclude>commons-io:commons-io</exclude> 
           <exclude>commons-cli:commons-cli</exclude> 
          </excludes> 
         </artifactSet> 
         <filters> 
          <filter> 
           <artifact>org.apache.flink:*</artifact> 
           <excludes> 
            <!-- exclude shaded google but include shaded curator --> 
            <exclude>org/apache/flink/shaded/com/**</exclude> 
            <exclude>web-docs/**</exclude> 
           </excludes> 
          </filter> 
          <filter> 
           <!-- Do not copy the signatures in the META-INF folder. 
           Otherwise, this might cause SecurityExceptions when using the JAR. --> 
           <artifact>*:*</artifact> 
           <excludes> 
            <exclude>META-INF/*.SF</exclude> 
            <exclude>META-INF/*.DSA</exclude> 
            <exclude>META-INF/*.RSA</exclude> 
           </excludes> 
          </filter> 
         </filters> 
         <transformers> 
          <!-- add Main-Class to manifest file --> 
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"> 
           <mainClass>com.demandware.unified.sample.backend.flink.PhysicalInventory</mainClass> 
          </transformer> 
         </transformers> 
         <createDependencyReducedPom>false</createDependencyReducedPom> 
        </configuration> 
       </execution> 
      </executions> 
     </plugin> 

     <plugin> 
      <groupId>org.apache.maven.plugins</groupId> 
      <artifactId>maven-compiler-plugin</artifactId> 
      <version>3.1</version> 
      <configuration> 
       <source>1.8</source> <!-- Java 8 -->
       <target>1.8</target>
      </configuration> 
     </plugin> 
    </plugins> 
</build>
</project>

Can you share your Maven pom with us? It might help solve your problem. – twalthr

I added a reference to the Redis connector and now I get the same error, but for a different class: java.lang.NoClassDefFoundError: org/apache/flink/util/Preconditions. I checked GitHub again and the class is in the right place (github.com/apache/flink/tree/master/flink-core/src/main/java/org/apache/flink/util), but when I look inside flink-core.jar, the class is not there. – osva

Answer

I had a similar problem. What fixed it for me was making sure that the Kafka consumer/producer used in the source code matches the connector version you depend on.

There seems to be a conflict between the different versions of FlinkKafkaConsumer.

When using this dependency,

<dependency> 
    <groupId>org.apache.flink</groupId> 
    <artifactId>flink-connector-kafka-0.8_2.10</artifactId> 
    <version>1.0.2</version> 
</dependency> 

make sure you use FlinkKafkaConsumer08 instead of FlinkKafkaConsumer (or the corresponding producer class, if you use the producer).
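
For reference, a minimal sketch of how the version-specific consumer is wired into a job; the topic name, broker/ZooKeeper addresses and group id below are made-up placeholders, not values from the question:

import java.util.Properties;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaSourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("zookeeper.connect", "localhost:2181"); // the 0.8 consumer also needs ZooKeeper
        props.setProperty("group.id", "sample-group");            // placeholder group id

        // FlinkKafkaConsumer08 is the class shipped by flink-connector-kafka-0.8_2.10;
        // picking a consumer class from a different connector version can surface as
        // NoClassDefFoundError when the job is submitted.
        DataStream<String> stream = env.addSource(
                new FlinkKafkaConsumer08<>("sample-topic", new SimpleStringSchema(), props));

        stream.print();
        env.execute("Kafka 0.8 source sketch");
    }
}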

The strange thing for me is that my IDE cannot resolve FlinkKafkaConsumer08, yet it compiles with the build tool and runs correctly.