REGISTER file:///c:/Syncfusion/BigDataSDK/1.1.0.8/SDK/Pig/contrib/piggybank/java/piggybank.jar;
DEFINE XMLLoader org.apache.pig.piggybank.storage.XMLLoader();
mydata = LOAD '/myxml.xml' USING XMLLoader();
DUMP mydata;
<?xml version="1.0" encoding="UTF-8" ?>
<rootnode>
  <elementnode attributenode="value">element content</elementnode>
</rootnode>
2015-03-18 10:52:57,112 [JobControl] INFO org.apache.hadoop.yarn.client.RMProxy - Connecting to ResourceManager at /0.0.0.0:8032
2015-03-18 10:52:57,175 [JobControl] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2015-03-18 10:52:58,238 [JobControl] INFO org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths to process : 1
2015-03-18 10:52:58,253 [JobControl] INFO org.apache.hadoop.mapreduce.JobSubmitter - Cleaning up the staging area /tmp/hadoop-yarn/staging/csavell/.staging/job_1426522352277_0018
2015-03-18 10:52:58,269 [JobControl] ERROR org.apache.pig.backend.hadoop23.PigJobControl - Error while trying to run jobs.
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
	at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:130)
	at org.apache.pig.backend.hadoop23.PigJobControl.run(PigJobControl.java:191)
	at java.lang.Thread.run(Thread.java:744)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:270)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.pig.backend.hadoop23.PigJobControl.submit(PigJobControl.java:128)
	... 3 more
Caused by: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
	at org.apache.pig.piggybank.storage.XMLLoader$XMLFileInputFormat.isSplitable(XMLLoader.java:615)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:352)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.getSplits(PigInputFormat.java:274)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
	at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
	... 8 more
REGISTER file:///c:/pig-0.14.0/pig-0.14.0/contrib/piggybank/java/piggybank.jar;
REGISTER file:///c:/pig-0.13.0/pig-0.13.0/contrib/piggybank/java/piggybank.jar;
REGISTER file:///c:/pig-0.12.1/pig-0.12.1/contrib/piggybank/java/piggybank.jar;
Hi Carl,
Thank you for using Syncfusion products.
Loading the XML file using piggybank.jar failed
The issue occurred due to a version incompatibility between the piggybank build and the Hadoop version in our Syncfusion platform. To apply the fix:

Pig script for loading the XML file
After applying the fix, executing the provided Pig script will submit the job and it will run successfully, but no data will be dumped because it stores 0 records. So please modify the script as shown below by passing the element of the XML file while defining XMLLoader() (a short illustrative sketch also follows after this script).

REGISTER file:///c:/Syncfusion/BigDataSDK/1.1.0.8/SDK/Pig/contrib/piggybank/java/piggybank.jar;
DEFINE XMLLoader org.apache.pig.piggybank.storage.XMLLoader('rootnode');
mydata = LOAD '/myxml.xml' USING XMLLoader();
DUMP mydata;
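For reference, here is a minimal sketch, not part of the original reply, that loads the sample /myxml.xml shown above by splitting on 'elementnode' instead of the document root, so each <elementnode> becomes one record. It reuses the jar path from this thread; the REGEX_EXTRACT step is only an assumption about how the element content could then be pulled out.

-- Illustrative sketch: split records on 'elementnode' rather than the root element.
REGISTER file:///c:/Syncfusion/BigDataSDK/1.1.0.8/SDK/Pig/contrib/piggybank/java/piggybank.jar;

-- XMLLoader('elementnode') emits one tuple per matched element; the single
-- field holds that element's raw XML text as a chararray.
raw = LOAD '/myxml.xml' USING org.apache.pig.piggybank.storage.XMLLoader('elementnode') AS (doc:chararray);

-- Example only: extract the text between the opening and closing tags with a regex.
content = FOREACH raw GENERATE REGEX_EXTRACT(doc, '<elementnode[^>]*>(.*)</elementnode>', 1) AS text;
DUMP content;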
Please let me know if you need any further assistance on this.
Regards
Praveena
Hi Carl,
We are glad that your problem has been resolved. Please let us know if you have any queries.
Regards,
Praveena.