Does building Impala depend on Hive, HBase, and Sentry or not?
I have a Hadoop cluster with one master and 3 slaves. Now I want to add Apache Impala functionality to this cluster. I've downloaded the tarball from here. I want to build Impala, but I'm not sure what the prerequisites are. There are two different sources:
1. This, from the Docs, which says the requirements are MySQL (or PostgreSQL), the Hive metastore, and Java dependencies (obviously).
2. The README.md file inside the apache-impala directory created after untarring the tarball. Quoting it:
Impala can be built with pre-built components, downloaded from S3, or
can be built with an in-place toolchain located in the thirdparty
directory (not recommended). The components needed to build Impala are
Apache Hadoop, Hive, HBase, and Sentry.
I am confused by these two sources. What should I do? A clear set of dependencies for Apache Impala would be great!
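For reference, this is roughly the build flow I am planning to run. It is only a sketch: the script names (bin/impala-config.sh, buildall.sh) are my assumption from the untarred source tree and its README, not something I have verified.
$ cd apache-impala
$ export IMPALA_HOME="$PWD"     # assumption: the build scripts expect this variable
$ source bin/impala-config.sh   # assumption: sets up the build environment and toolchain paths
$ ./buildall.sh                 # assumption: the default build downloads the pre-built (S3) components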
Tags: hadoop, install, impala
asked Mar 26 at 6:33 by Mooncrater
1 Answer
If you read the Impala Requirements carefully, you will see that Hadoop support is implied, while the Sentry requirement is buried in the Impala Security link near the bottom of the page.
Under the Java Dependencies section it says:
All Java dependencies are packaged in the impala-dependencies.jar file, which is located at /usr/lib/impala/lib/. These map to everything that is built under fe/target/dependency.
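If you already have a binary Impala installation, a quick sanity check (just a sketch, assuming the documented path /usr/lib/impala/lib/ matches your layout) is to list the bundled jar and look for the Hadoop, Hive, HBase, and Sentry packages:
$ # assumption: impala-dependencies.jar lives at the documented location
$ jar tf /usr/lib/impala/lib/impala-dependencies.jar | grep -iE 'hadoop|hive|hbase|sentry' | sort | head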
Looking at the corresponding pom.xml, you will see all the dependencies. Grepping for artifactId shows the following:
$ grep artifactId fe/pom.xml
<artifactId>impala-parent</artifactId>
<artifactId>impala-frontend</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>impala-data-source-api</artifactId>
<artifactId>hadoop-hdfs</artifactId>
<artifactId>hadoop-common</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>hadoop-auth</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>hadoop-aws</artifactId>
<artifactId>hadoop-azure-datalake</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>sentry-core-common</artifactId>
<artifactId>yarn-extras</artifactId>
<artifactId>sentry-core-model-db</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>sentry-provider-common</artifactId>
<artifactId>sentry-provider-db</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>sentry-provider-file</artifactId>
<artifactId>sentry-provider-cache</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>sentry-policy-common</artifactId>
<artifactId>sentry-binding-hive</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>sentry-policy-engine</artifactId>
<artifactId>sentry-service-api</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>parquet-hadoop-bundle</artifactId>
<artifactId>hbase-client</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>hbase-common</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>hbase-protocol</artifactId>
<artifactId>commons-lang</artifactId>
<artifactId>java-cup</artifactId>
<artifactId>libthrift</artifactId>
<artifactId>hive-service</artifactId>
<artifactId>hive-llap-server</artifactId>
<artifactId>json-smart</artifactId>
<artifactId>hive-serde</artifactId>
So the README.md is correct in stating you need Hadoop, Hive, HBase, and Sentry to build Impala.
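If you prefer to see those dependencies with their full Maven coordinates rather than grepping raw <artifactId> tags, the standard maven-dependency-plugin can print the resolved tree. A minimal sketch, assuming a working Maven installation and that the fe module resolves from the source checkout:
$ # the -Dincludes patterns filter the tree by groupId
$ mvn -f fe/pom.xml dependency:tree -Dincludes=org.apache.hadoop,org.apache.hive,org.apache.hbase,org.apache.sentry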
answered Mar 26 at 19:40 by tk421