sqoop incremental job is failing due to org.kitesdk.data.DatasetOperationException
sqoop incremental job is failing due to org.kitesdk.data.DatasetOperationException
I am trying to import data from Oracle into a Hive table with a Sqoop incremental job, using the Parquet file format, but the job fails with the error below:
Error: org.kitesdk.data.DatasetOperationException: Failed to append
{"CLG_ID": "5",.....19/03/27 00:37:06 INFO mapreduce.Job: Task Id :
attempt_15088_130_m_000_2, Status : FAILED
Command used to create the saved job:
sqoop job -Dhadoop.security.credential.provider.path=jceks://xxxxx
--create job1 -- import --connect "jdbc:oracle:thinxxxxxx" --verbose --username user1 --password-alias alisas --query "select CLG_ID,.... from CLG_TBL where $CONDITIONS" --as-parquetfile --incremental
append --check-column CLG_TS --target-dir /hdfs/clg_data/ -m 1
Command used to execute the job:
sqoop job -Dhadoop.security.credential.provider.path=jceks://xxxxx
--exec job1 -- --connect "jdbc:oracle:xxx"
--username user1 --password-alias alisas --query "select CLG_ID,.... from CLG_TBL where $CONDITIONS" --target-dir /hdfs/clg_data/ -m 1
--hive-import --hive-database clg_db --hive-table clg_table --as-parquetfile
oracle sqoop parquet
edited Mar 27 at 10:19 by Prabhanj
asked Mar 27 at 6:16 by Prabhanj
1 Answer
This error is a known issue. We ran into the same problem a couple of weeks ago and found the following description.
Description of the problem or behavior
In HDP 3, managed Hive tables must be transactional (hive.strict.managed.tables=true). Transactional tables with Parquet format are not supported by Hive. Hive imports with --as-parquetfile must use external tables by specifying --external-table-dir.
Associated error message
Table db.table failed strict managed table checks due to the
following reason: Table is marked as a managed table but is not
transactional.
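Before applying the workaround, you can confirm that the existing table falls into this category by inspecting its metadata. This is a sketch using the database and table names from the question's --exec command; the exact property strings can vary by Hive version.

```shell
# Inspect the table metadata; a table that trips this check shows
# "Table Type: MANAGED_TABLE" without transactional=true in its properties.
hive -e "DESCRIBE FORMATTED clg_db.clg_table" | grep -iE 'Table Type|transactional'
```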
Workaround
When using --hive-import with --as-parquetfile, users must also provide --external-table-dir with a fully qualified location of the table:
sqoop import ... --hive-import
--as-parquetfile
--external-table-dir hdfs:///path/to/table
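Applied to the job in the question, this means adding --external-table-dir to the import. The jceks path, JDBC URL, and HDFS locations below are the same placeholders used above; the external-table path is an assumed example, not a value from the original post.

```shell
# Sketch: the original --exec invocation plus --external-table-dir, so Sqoop
# creates an external (non-transactional) Hive table instead of a managed one.
sqoop job -Dhadoop.security.credential.provider.path=jceks://xxxxx \
  --exec job1 -- --connect "jdbc:oracle:xxx" \
  --username user1 --password-alias alisas \
  --query "select CLG_ID,.... from CLG_TBL where \$CONDITIONS" \
  --target-dir /hdfs/clg_data/ -m 1 \
  --hive-import --hive-database clg_db --hive-table clg_table \
  --as-parquetfile \
  --external-table-dir hdfs:///hdfs/clg_data/clg_table
```

Note that \$CONDITIONS is escaped so the shell passes the literal token through to Sqoop rather than expanding it.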
Thanks, will try this solution
– Prabhanj
Apr 16 at 13:59
answered Apr 12 at 11:30 by Ali Erdem