Hi Alex,

Thank you for using Syncfusion products.
Query 1: Are the 'records' in the SQL Server table also imported to HDFS, or is only the 'table' design imported?

Response: The records of the table in SQL Server are imported to HDFS as a set of files containing a copy of the table's data, not the table design.
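For context, a minimal sketch of the Sqoop command that performs such an import is shown below; the server address, database name, credentials, and table name are placeholders for illustration, not values from this thread:

```shell
# Import the rows of the 'Members' table from SQL Server into HDFS.
# Sqoop copies the records (not the table definition) into files under --target-dir.
sqoop import \
  --connect "jdbc:sqlserver://localhost:1433;databaseName=SampleDB" \
  --username sa --password '<password>' \
  --table Members \
  --target-dir /user/SYSTEM/Members
```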
Query 2: Can I load the 'records' into the table that is created in Hive? How do I select that data from Hive?

Response: Yes, the records of the table transferred from SQL Server to HDFS using Sqoop can be loaded into a table created in Hive, and the records can then be queried from Hive. This is done with a Hive script.

Hive script for loading the imported data into a table created in Hive:

load data inpath '/user/SYSTEM/Members' into table HiveTable;

Here, HiveTable is the name of the table created in Hive, and '/user/SYSTEM/Members' is the HDFS path of the imported data.

The data in Hive can then be queried with SQL-like syntax, for example:

Select * from HiveTable;

Please refer to the Hive module in the studio for details.
Query 3: From the image 2.png, why has the data been converted into part-m-00000, and how can I read it in Hive and Pig?

Response: The Sqoop import runs as a map-only job, and the output files of the imported data are by default named part-(m)-(mapper task number), where 'm' indicates a map task.

In Pig, files in HDFS can be read with the LOAD command:

Members = LOAD '/user/SYSTEM/Members' using PigStorage(',') as (ID,NAME,AGE,EMAIL);

You can then perform operations on the data using a Pig script.

In Hive, please refer to the solution of query 2, as both queries are similar.

Please refer to the getting started samples of Pig and Hive for reference.
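As an illustration of that naming, listing the import directory shows one part-m file per map task; a minimal sketch, assuming the '/user/SYSTEM/Members' import path from query 2 and a Hadoop client on the PATH:

```shell
# List the files Sqoop wrote for the imported table.
# Each map task writes one file: part-m-00000, part-m-00001, ...,
# alongside a _SUCCESS marker file.
hadoop fs -ls /user/SYSTEM/Members

# To force the import into a single part-m-00000 file,
# run Sqoop with a single mapper:
#   sqoop import ... --num-mappers 1
```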
Query 4: Can these databases be used in Hive? If so, how are they used?

Response: This query is similar to query 2, which is addressed above. You need to import the contents of the database to HDFS using Sqoop and then load it into Hive using the 'load data' command.
Please let me know if you have any concerns.
Regards,
Praveena