Welcome to Linux教程網

ClassCastException in a First Hadoop Program under Eclipse

java.lang.ClassCastException: interface javax.xml.soap.Text 
    at java.lang.Class.asSubclass(Unknown Source) 
    at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:599) 
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:791) 
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350) 
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307) 
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177) 

Solution:

The problem turned out to be an API-version issue combined with a wrong import. The stack trace shows that `javax.xml.soap.Text` (an interface, pulled in by an auto-import) was configured as the output key class instead of `org.apache.hadoop.io.Text`, so Hadoop's cast to a `WritableComparable` key type failed. The code below uses the old `org.apache.hadoop.mapred` API; when run on hadoop-0.20.2 it produces exactly this error. On hadoop-0.20.2, rewrite it against the new `org.apache.hadoop.mapreduce` API (with the correct `Text` import) and the problem disappears.
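To see why the wrong import surfaces as a ClassCastException inside `JobConf.getOutputKeyComparator`, note that Hadoop validates the configured output key class with `Class.asSubclass(...)`, which throws when the class does not extend the expected type. A minimal sketch of that mechanism, using only JDK classes (not Hadoop itself):

    public class AsSubclassDemo {
        public static void main(String[] args) {
            // Integer extends Number, so this narrowing cast of the Class object succeeds.
            Class<? extends Number> ok = Integer.class.asSubclass(Number.class);
            System.out.println("ok: " + ok.getName());

            try {
                // String does not extend Number, so asSubclass throws ClassCastException --
                // the same way javax.xml.soap.Text fails Hadoop's key-class check.
                String.class.asSubclass(Number.class);
            } catch (ClassCastException e) {
                System.out.println("ClassCastException: " + e);
            }
        }
    }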


import java.io.IOException;
import java.util.Iterator;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;   // must be org.apache.hadoop.io.Text, NOT javax.xml.soap.Text
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class MaxTemperature {

    public static void main(String[] args) throws IOException {
        if (args.length != 2) {
            System.err.println("Usage: MaxTemperature <input path> <output path>");
            System.exit(-1);
        }

        JobConf conf = new JobConf(MaxTemperature.class);
        conf.setJobName("Max temperature");

        FileInputFormat.addInputPath(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        conf.setMapperClass(MaxTemperatureMapper.class);
        conf.setReducerClass(MaxTemperatureReducer.class);

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        JobClient.runJob(conf);
    }
}

class MaxTemperatureMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, IntWritable> {

    private static final int MISSING = 9999;   // NCDC sentinel for a missing reading

    public void map(LongWritable key, Text value,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        String line = value.toString();
        System.out.println("key: " + key);
        String year = line.substring(15, 19);
        int airTemperature;
        if (line.charAt(45) == '+') {   // skip the leading sign character
            airTemperature = Integer.parseInt(line.substring(46, 50));
        } else {
            airTemperature = Integer.parseInt(line.substring(45, 50));
        }
        String quality = line.substring(50, 51);
        System.out.println("quality: " + quality);
        // note: the regex needs the closing ']' -- "[01459" is invalid
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            output.collect(new Text(year), new IntWritable(airTemperature));
        }
    }
}

class MaxTemperatureReducer extends MapReduceBase
        implements Reducer<Text, IntWritable, Text, IntWritable> {

    public void reduce(Text key, Iterator<IntWritable> values,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
        int maxValue = Integer.MIN_VALUE;
        while (values.hasNext()) {
            maxValue = Math.max(maxValue, values.next().get());
        }
        output.collect(key, new IntWritable(maxValue));
    }
}

 

Rewriting the code above in the new-API style, as follows, avoids the problem.


import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;   // must be org.apache.hadoop.io.Text, NOT javax.xml.soap.Text
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class NewMaxTemperature {

    public static void main(String[] args)
            throws IOException, InterruptedException, ClassNotFoundException {
        if (args.length != 2) {
            System.err.println("Usage: NewMaxTemperature <input path> <output path>");
            System.exit(-1);
        }

        Job job = new Job();
        job.setJarByClass(NewMaxTemperature.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.setMapperClass(NewMaxTemperatureMapper.class);
        job.setReducerClass(NewMaxTemperatureReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

class NewMaxTemperatureMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final int MISSING = 9999;   // NCDC sentinel for a missing reading

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String line = value.toString();
        System.out.println("key: " + key);
        String year = line.substring(15, 19);
        int airTemperature;
        if (line.charAt(45) == '+') {   // skip the leading sign character
            airTemperature = Integer.parseInt(line.substring(46, 50));
        } else {
            airTemperature = Integer.parseInt(line.substring(45, 50));
        }
        String quality = line.substring(50, 51);
        System.out.println("quality: " + quality);
        if (airTemperature != MISSING && quality.matches("[01459]")) {
            context.write(new Text(year), new IntWritable(airTemperature));
        }
    }
}

class NewMaxTemperatureReducer
        extends Reducer<Text, IntWritable, Text, IntWritable> {

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        int maxValue = Integer.MIN_VALUE;
        for (IntWritable value : values) {
            maxValue = Math.max(maxValue, value.get());
        }
        context.write(key, new IntWritable(maxValue));
    }
}

Solution 2: use the Hadoop version the book specifies, i.e. a release earlier than hadoop 0.20.0, and the old-API code above will run as-is.
