
Sqoop import into Hive throws error

Hello,

I'm encountering an error implying that the Hive metastore hasn't been properly initialized.
This happens after a fresh desktop installation of the Big Data Platform (ver. 3.2.0.20).
I've tried to re-initialize the metastore with Hive's schemaTool and to re-create it, but the error still occurs.
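For reference, the re-initialization I attempted was roughly the following (the bin path and the derby dbType are guesses on my side; substitute whatever actually backs the metastore):

  cd C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Hive\bin
  rem dbType derby is only an assumption; use your metastore's actual backend
  schematool -initSchema -dbType derby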

Here is the Sqoop job I started:
sqoop import --connect jdbc:sqlserver://localhost:1433 --username sa --password sa --hive-import --hive-table tm.securities --split-by sec_id --hive-overwrite --query "select * from test.dbo.securities WHERE $CONDITIONS" --m 1 --target-dir /Data/tm/securities

and stack trace is attached.

Could you please help?

Cheers,
Artur

STACK TRACE:

NestedThrowablesStackTrace:
Required table missing : "VERSION" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
org.datanucleus.store.rdbms.exceptions.MissingTableException: Required table missing : "VERSION" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"
at org.datanucleus.store.rdbms.table.AbstractTable.exists(AbstractTable.java:606)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.performTablesValidation(RDBMSStoreManager.java:3365)
at org.datanucleus.store.rdbms.RDBMSStoreManager$ClassAdder.run(RDBMSStoreManager.java:2877)
at org.datanucleus.store.rdbms.AbstractSchemaTransaction.execute(AbstractSchemaTransaction.java:119)
at org.datanucleus.store.rdbms.RDBMSStoreManager.manageClasses(RDBMSStoreManager.java:1608)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getDatastoreClass(RDBMSStoreManager.java:671)
at org.datanucleus.store.rdbms.RDBMSStoreManager.getPropertiesForGenerator(RDBMSStoreManager.java:2069)
at org.datanucleus.store.AbstractStoreManager.getStrategyValue(AbstractStoreManager.java:1271)
at org.datanucleus.ExecutionContextImpl.newObjectId(ExecutionContextImpl.java:3759)
at org.datanucleus.state.StateManagerImpl.setIdentity(StateManagerImpl.java:2267)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:484)
at org.datanucleus.state.StateManagerImpl.initialiseForPersistentNew(StateManagerImpl.java:120)
at org.datanucleus.state.ObjectProviderFactoryImpl.newForPersistentNew(ObjectProviderFactoryImpl.java:218)
at org.datanucleus.ExecutionContextImpl.persistObjectInternal(ExecutionContextImpl.java:2078)
at org.datanucleus.ExecutionContextImpl.persistObjectWork(ExecutionContextImpl.java:1922)
..

17/10/02 19:24:39 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Exception thrown in Hive
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:360)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:333)
... 9 more
Caused by: java.lang.RuntimeException: Hive metastore database is not initialized. Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed, don't forget to include the option to auto-create the underlying database in your JDBC connection string (e.g. ?createDatabaseIfNotExist=true for mysql)


C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Sqoop\bin>exit
Sqoop job execution completed.





7 Replies

AT Aravindraja Thinakaran Syncfusion Team October 3, 2017 04:35 PM UTC

Hi Artur, 

Sorry for the inconvenience caused. 

We have reproduced the same issue while loading SQL Server tables into Hive using Sqoop import. We are checking on our side to resolve it and will update you with a solution on or before 4th October 2017. 
In the meantime, you can load the SQL Server tables into Hive using the following steps (a sketch follows this list). 
  1. Import the SQL tables into HDFS using Sqoop import.
  2. Create Hive tables over the imported HDFS files using the Create Table option available in Hive.
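
A rough sketch of those two steps from a command prompt, based on your job (the CREATE TABLE column list is a placeholder, and the HiveQL shown is just the command-line equivalent of the Create Table option in Hive):

  rem Step 1: import from SQL Server into HDFS only (no --hive-import)
  sqoop import --connect jdbc:sqlserver://localhost:1433 --username sa --password sa --query "select * from test.dbo.securities WHERE $CONDITIONS" --split-by sec_id -m 1 --target-dir /Data/tm/securities

  rem Step 2: create a Hive table over the imported files (column names/types are assumptions; Sqoop's default text delimiter is a comma)
  hive -e "CREATE DATABASE IF NOT EXISTS tm; CREATE EXTERNAL TABLE tm.securities (sec_id INT, sec_name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION '/Data/tm/securities'"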

Thanks, 
Aravindraja T 



AT Aravindraja Thinakaran Syncfusion Team October 4, 2017 12:36 PM UTC

Hi Artur,

We were able to reproduce the issue at our end. A support incident to track the status of this defect has been created under your account. Please log on to our support website to check for further updates:

https://www.syncfusion.com/account/login?ReturnUrl=%2fsupport%2fdirecttrac%2fincidents

Please let me know if you have any questions.

Thanks,
Aravindraja T



AB Artur Bukowski October 4, 2017 05:29 PM UTC

Hi Aravindraja,

Thanks a lot for the quick information and for creating the defect.

When do you think it will be fixed?

Cheers,
Artur



AT Aravindraja Thinakaran Syncfusion Team October 5, 2017 12:09 PM UTC

Hi Artur, 
Thank you for your patience.
 
The solution for the reported issue has been updated in the incident created for this defect. Could you please try that solution? 
Thanks, 
Aravindraja T 



AT Aravindraja Thinakaran Syncfusion Team October 9, 2017 08:43 AM UTC

Hi Artur, 

Root Cause of the Issue:
The Sqoop process is not aware of the Hive configuration, including the metastore information, so it cannot find the initialized metastore when running the Hive import.
 
Solution:
You can resolve this by copying the hive-site.xml file from the Hive conf directory to the Sqoop bin directory. 

Please follow the step below and retry your Sqoop job. 

Copy the hive-site.xml file from the following location 
“<Install Drive>:\Syncfusion\BigData\<Install Version>\BigDataSDK\SDK\Hive\conf” 
to the below location. 
“<Install Drive>:\Syncfusion\BigData\<Install Version>\BigDataSDK\SDK\Sqoop\bin” 
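
For example, with the installation shown in your log (C: drive, version 3.2.0.20), the copy from a command prompt would be:

  rem copy Hive's configuration next to the Sqoop executable so Sqoop picks up the metastore settings
  copy "C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Hive\conf\hive-site.xml" "C:\Syncfusion\BigData\3.2.0.20\BigDataSDK\SDK\Sqoop\bin"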

Please let us know if you need further assistance. 

Thanks, 
Aravindraja T



AB Artur Bukowski October 9, 2017 10:23 AM UTC

Thanks a lot, it works.

Cheers,
Artur




AT Aravindraja Thinakaran Syncfusion Team October 10, 2017 07:24 AM UTC

Hi Artur,

Thank you for the update.
Please let us know if you need further assistance.

Thanks,
Aravindraja T 

