Column type issue when transferring bulk data from MySQL to Hive through Sqoop


I'm trying to send data from MySQL to a Hive database through Sqoop. When it runs, the import stops with a column type error:



19/03/25 14:26:37 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table piwik_archive_blob_2019_03
19/03/25 14:26:37 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `piwik_archive_blob_2019_03` AS t LIMIT 1
19/03/25 14:26:37 WARN hive.TableDefWriter: Column date1 had to be cast to a less precise type in Hive
19/03/25 14:26:37 WARN hive.TableDefWriter: Column date2 had to be cast to a less precise type in Hive
19/03/25 14:26:37 WARN hive.TableDefWriter: Column ts_archived had to be cast to a less precise type in Hive
19/03/25 14:26:37 ERROR tool.ImportTool: Import failed: java.io.IOException: Hive does not support the SQL type for column value
    at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:191)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:189)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
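
The fatal line names the offending column (value) but not its SQL type. Checking the column definition on the MySQL side shows what Sqoop is actually trying to map; a quick sketch, reusing the connection details from the import command below:

# Inspect the MySQL-side table definition, in particular the type of the
# `value` column that the import chokes on.
mysql -h 172.18.9.81 -u root -p'root' piwik \
      -e "DESCRIBE piwik_archive_blob_2019_03;"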


My import command already includes a column mapping:



sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect jdbc:mysql://172.18.9.81:3306/piwik \
  --hive-import \
  --hive-table piwik.piwik_archive_blob_2019_03 \
  --map-column-hive date1=Date,date2=Date,ts_archived=Date \
  --password 'root' \
  --table piwik_archive_blob_2019_03 \
  --username root \
  -m 1
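
The error itself points at value, a column that the --map-column-hive list does not cover; given the table name (archive_blob), that column presumably holds a MEDIUMBLOB, a type Sqoop cannot map to Hive on its own. Below is a minimal sketch of the same import with that column mapped explicitly; it is an assumption to try, not a verified fix:

# Same import, but with the presumed BLOB column mapped to String on the
# Java side, and every overridden column given a type name Hive certainly
# accepts (Sqoop pastes these verbatim into its generated CREATE TABLE).
sqoop import \
  -Dorg.apache.sqoop.splitter.allow_text_splitter=true \
  --connect jdbc:mysql://172.18.9.81:3306/piwik \
  --username root \
  --password 'root' \
  --table piwik_archive_blob_2019_03 \
  --hive-import \
  --hive-table piwik.piwik_archive_blob_2019_03 \
  --map-column-java value=String \
  --map-column-hive value=STRING,date1=STRING,date2=STRING,ts_archived=TIMESTAMP \
  -m 1

Hive imports are text-based, so a blob mapped to String only survives intact if its bytes happen to be printable; for truly binary archives, a plain HDFS import without --hive-import may be the safer route.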









mysql hive sqoop

asked Mar 25 at 17:33

Bruno Wego
293 reputation, 1 gold badge, 5 silver badges, 18 bronze badges
  • I understand that this is related to the JDBC data types, but I have tried different types and every one of them fails with the same error.

    – Bruno Wego
    Mar 25 at 17:53
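
If every explicit type produces the same error, a common fallback is to stop fighting the type mapping at import time: let the date columns land as Hive STRING (Sqoop's default for MySQL date/time types, which is what the "less precise type" warnings refer to) and cast them back inside Hive afterwards. A sketch under that assumption; the _typed view name is made up, and only the columns mentioned in the question are listed:

# Recover proper types in Hive after a string-typed import; the view name
# is illustrative only.
hive -e "
CREATE VIEW piwik.piwik_archive_blob_2019_03_typed AS
SELECT CAST(date1 AS DATE)            AS date1,
       CAST(date2 AS DATE)            AS date2,
       CAST(ts_archived AS TIMESTAMP) AS ts_archived,
       value
FROM   piwik.piwik_archive_blob_2019_03;
"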