
Sqoop import into Hive throws error

Thread ID: 132981
Created: Oct 2, 2017 01:34 PM
Updated: Oct 10, 2017 03:24 AM
Platform: Big Data Platform
Replies: 7
Tags: General
Artur Bukowski
Asked On October 2, 2017 01:37 PM

Hello,

I'm encountering an error implying that the Hive metastore hasn't been properly initialized.
This happens after a fresh desktop installation of Big Data Platform (ver. 3.2.0.20).
I've tried to re-initialize the metastore with Hive's schematool and to re-create it, but the error still occurs.
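For reference, the re-init was roughly the following, run from the Hive bin directory of the installation (derby is an assumption here; substitute the dbType your metastore actually uses):

    cd C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Hive\bin
    schematool -dbType derby -initSchema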

Here is the Sqoop job I started:
sqoop import --connect jdbc:sqlserver://localhost:1433 --username sa --password sa --hive-import --hive-table tm.securities --split-by sec_id --hive-overwrite --query "select * from test.dbo.securities WHERE $CONDITIONS" --m 1 --target-dir /Data/tm/securities

and the stack trace is attached below.

Could you please help?

Cheers,
Artur

STACK TRACE:

NestedThrowablesStackTrace:
Required table missing : "VERSION" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "VERSION" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:606)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3365)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2877)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:119)
at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1608)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:671)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2069)
at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1271)
at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3759)
at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2078)
at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1922)
..

17/10/02 19:24:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception thrown in Hive
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:360)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:333)
... 9 more
Caused by: java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)


C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Sqoop\bin>exit
Sqoop job execution completed.





Aravindraja Thinakaran [Syncfusion]
Replied On October 3, 2017 12:35 PM

Hi Artur, 

Sorry for the inconvenience caused. 

We have reproduced the same issue while loading SQL Server tables into Hive using Sqoop import. We are checking it on our side and will update you with a solution on or before October 4, 2017. 
In the meantime, you can load the SQL Server tables into Hive using the following steps (a rough sketch follows the list). 
  1. Import the SQL tables into HDFS using Sqoop import.
  2. Create Hive tables over the imported HDFS files using the Create Table option available in Hive.
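A minimal sketch of that workaround, assuming the same securities table and target directory as in the original job, and using HiveQL DDL in place of the Create Table UI option (the column list is a placeholder; adjust it to your actual schema):

    REM Step 1: import the SQL Server table into HDFS only (no --hive-import)
    sqoop import --connect jdbc:sqlserver://localhost:1433 --username sa --password sa --query "select * from test.dbo.securities WHERE $CONDITIONS" --target-dir /Data/tm/securities -m 1

    -- Step 2: in Hive, create an external table over the imported files
    -- (Sqoop writes comma-delimited text files by default)
    CREATE DATABASE IF NOT EXISTS tm;
    CREATE EXTERNAL TABLE tm.securities (
      sec_id INT,
      sec_name STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/Data/tm/securities';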

Thanks, 
Aravindraja T 


Aravindraja Thinakaran [Syncfusion]
Replied On October 4, 2017 08:36 AM

Hi Artur,

We were able to reproduce the issue at our end. A support incident to track the status of this defect has been created under your account. Please log in to our support website to check for further updates:

https://www.syncfusion.com/account/login?ReturnUrl=%2fsupport%2fdirecttrac%2fincidents

Please let me know if you have any questions.

Thanks,
Aravindraja T


Artur Bukowski
Replied On October 4, 2017 01:29 PM

Hi Aravindraja,

Thanks a lot for the quick information and for creating the defect.

When do you think it will be fixed?

Cheers,
Artur


Aravindraja Thinakaran [Syncfusion]
Replied On October 5, 2017 08:09 AM

Hi Artur,

Thank you for your patience.

The solution for the reported issue has been updated in the incident created for this defect. Could you please check that solution?

Thanks, 
Aravindraja T 


Aravindraja Thinakaran [Syncfusion]
Replied On October 9, 2017 04:43 AM

Hi Artur, 

Root cause of the issue:
The Sqoop process is not aware of the Hive configuration, including the metastore connection information.

Solution:
You can resolve this by copying the hive-site.xml file from the Hive conf directory to the Sqoop bin directory.

Please follow the step below and retry your Sqoop job.

Copy the hive-site.xml file from the following location
“<Install Drive>:\Syncfusion\BigData\<Install Version>\BigDataSDK\SDK\Hive\conf”
to the following location.
“<Install Drive>:\Syncfusion\BigData\<Install Version>\BigDataSDK\SDK\Sqoop\bin”
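For example, on the 3.2.0.20 installation shown in the log above, the copy can be done from a Windows command prompt like this (adjust the drive and version to your setup):

    copy "C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Hive\conf\hive-site.xml" "C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Sqoop\bin\"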

Please let us know if you need further assistance. 

Thanks, 
Aravindraja T


Artur Bukowski
Replied On October 9, 2017 06:23 AM

Thanks a lot, it works.

Cheers,
Artur



Aravindraja Thinakaran [Syncfusion]
Replied On October 10, 2017 03:24 AM

Hi Artur,

Thank you for the update.
Please let us know if you need further assistance.

Thanks,
Aravindraja T 

