How to access data in a Spark Dataset Column


I have a DataFrame like this:

+------+---+
|  Name|Age|
+------+---+
|   A-2| 26|
|   B-1| 30|
|   C-3| 20|
+------+---+

scala> p.select("Name", "Age")
res2: org.apache.spark.sql.DataFrame = [Name: string, Age: string]

We can clearly see here that the data in both columns is of type String.

I want to transform the Name column with a split("-")-like method so that I keep only the first part (i.e. A, B, C). But the Column type in Spark doesn't have such a method, so I'm wondering how to get at the string inside the Column so that I can perform the split operation.

Does anyone know what I should do?

api apache-spark split apache-spark-sql col

asked Mar 23 at 4:49 by mctrjalloh

2 Answers

Use the functions.split method:



          df.select(split(col("Name"), "-").getItem(0))
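To run that snippet outside the spark-shell you also need the corresponding imports. Below is a minimal, self-contained sketch of the same approach, assuming a local SparkSession; the session setup, the variable name p, and the column alias are illustrative, not from the original answer:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, split}

// Illustrative local session; in spark-shell this already exists as `spark`
val spark = SparkSession.builder().appName("split-name-column").master("local[*]").getOrCreate()
import spark.implicits._

// Recreate the DataFrame from the question (both columns as strings)
val p = Seq(("A-2", "26"), ("B-1", "30"), ("C-3", "20")).toDF("Name", "Age")

// split("-") yields an array column; getItem(0) selects its first element
p.select(split(col("Name"), "-").getItem(0).alias("Name"), col("Age")).show()

Note that the second argument of split is treated as a regular expression, which is harmless here because a plain "-" matches itself.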





answered Mar 23 at 6:21 by Grisha Weintraub

The split function is available for Spark DataFrames. See the example below.



// Creating test data (in spark-shell; otherwise import spark.implicits._ for toDF)
import org.apache.spark.sql.functions.{col, split}

val df = Seq(("A-2", 26),
  ("B-1", 30),
  ("C-3", 20)
).toDF("name", "age")

// split("-") returns an array column; (0) takes its first element
df.withColumn("new_name", split(col("name"), "-")(0)).show(false)

+----+---+--------+
|name|age|new_name|
+----+---+--------+
|A-2 |26 |A       |
|B-1 |30 |B       |
|C-3 |20 |C       |
+----+---+--------+
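As a side note: because split produces an array column, indexing with (0) here is equivalent to getItem(0) in the other answer. If you prefer to avoid the intermediate array altogether, Spark also has a substring_index function that keeps everything before the first occurrence of the delimiter; a minimal sketch, assuming the same df as above:

import org.apache.spark.sql.functions.{col, substring_index}

// substring_index(str, delim, count) returns the substring before the count-th delimiter
df.withColumn("new_name", substring_index(col("name"), "-", 1)).show(false)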





answered Mar 23 at 6:21 by Apurba Pandey






















