Google Cloud Pub/Sub - Cloud Function & Bigquery - Data insert is not happening


I am using a Google Cloud Platform Function that listens to a Pub/Sub topic and inserts the data into BigQuery.



The input data I am passing from the Pub/Sub console is in JSON format, {"NAME": "ABCD"}, but from the console log I can see that the message arrives as {NAME: ABCD}, and it errors during execution as well. The two common errors I faced:




  1. SyntaxError: Unexpected token n in JSON at position 1 at Object.parse (native) at exports.helloPubSub

  2. ERROR: { Error: Invalid value at 'rows[0].json'



Input given:



gcloud pubsub topics publish pubsubtopic1 --message {"name":"ABCD"}


I tried various formats of input data, with single quotes, square brackets, and other possible options as well; nothing helped.



Workarounds I tried, such as JSON.parse and JSON.stringify, help avoid the first issue mentioned above but end up with the rows[0] issue.



When I pass the JSON input data as hard-coded values inside the Cloud Function, like {"NAME": "ABCD"}, the data is inserted properly.



/** This is the working code, since the data is hard-coded in JSON format; the lines I tried that did not help are commented out. **/

/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event Event payload and metadata.
 * @param {!Function} callback Callback function to signal completion.
 */
exports.helloPubSub = (event, callback) => {
  const pubsubMessage = event.data;
  console.log(Buffer.from(pubsubMessage.data, 'base64').toString());
  const BigQuery = require('@google-cloud/bigquery');
  const bigquery = new BigQuery();
  // console.log(JSON.parse(Buffer.from(pubsubMessage.data, 'base64').toString()));
  var myjson = '{"NAME":"ABCD","STATE":"HHHH","AGE":"12"}';
  console.log(myjson);
  bigquery
    .dataset('DEMO')
    .table('EMP')
    .insert(JSON.parse(myjson),
      { ignoreUnknownValues: true, raw: false })
    // .insert(JSON.parse(Buffer.from(pubsubMessage.data, 'base64').toString()),
    .then((data) => {
      console.log('Inserted 1 row');
      console.log(data);
    })
    .catch(err => {
      if (err && err.name === 'PartialFailureError') {
        if (err.errors && err.errors.length > 0) {
          console.log('Insert errors:');
          err.errors.forEach(err => console.error(err));
        }
      } else {
        console.error('ERROR:', err);
      }
    });
};
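For reference, the decode-and-parse step the function relies on can be sketched in isolation; the event below is a hypothetical stand-in for a real Pub/Sub event, not an actual delivery:

```javascript
// Hypothetical Pub/Sub event: Pub/Sub delivers the message body base64-encoded.
const event = {
  data: Buffer.from('{"NAME":"ABCD","STATE":"HHHH","AGE":"12"}').toString('base64')
};

// Decode the base64 payload back to the original string ...
const json = Buffer.from(event.data, 'base64').toString();

// ... and parse it into a row object suitable for table.insert().
const row = JSON.parse(json);
console.log(row.NAME, row.STATE, row.AGE); // ABCD HHHH 12
```

If the published message is not valid JSON (for example, because the shell stripped the braces and quotes), JSON.parse throws the "Unexpected token" error shown above.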









      google-bigquery google-cloud-functions google-cloud-pubsub






      edited Mar 28 at 7:49 by Usman Maqbool

      asked Mar 28 at 6:04 by vak

























          1 Answer


































          I ran a quick test, using gcloud both to publish and to pull the message.



          Using the syntax you mentioned I get the following result:



          gcloud pubsub topics publish pubsubtopic1 --message {"name":"ABCD"}
          gcloud pubsub subscriptions pull pubsubsubscription1


          The result is:




          DATA │ {name:ABCD}




          If you use this syntax instead:



          gcloud pubsub topics publish pubsubtopic1 --message "{\"name\":\"ABCD\"}"
          gcloud pubsub subscriptions pull pubsubsubscription1


          The result is:




          DATA │ {"name":"ABCD"}




          EDIT 2019-04-01



          The workaround above is for test purposes; the need to use escape characters is a caveat of using the command line. To publish from your real application, you may use a REST call or a client library as listed here. Please note that the Pub/Sub API expects the message to be base64 encoded. For example:



          POST https://pubsub.googleapis.com/v1/projects/YOUR_PROJECT_ID/topics/YOUR_TOPIC:publish?key=YOUR_API_KEY

          {
            "messages": [
              {
                "data": "eyJuYW1lIjoiQUJDRCJ9"
              }
            ]
          }
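The base64 payload in that REST body is simply the JSON message encoded; a quick Node.js sketch of producing and round-tripping it:

```javascript
// Encode the JSON message the way the Pub/Sub publish API expects it.
const message = JSON.stringify({ name: 'ABCD' });
const encoded = Buffer.from(message).toString('base64');
console.log(encoded); // eyJuYW1lIjoiQUJDRCJ9

// Decoding recovers the original JSON, ready for JSON.parse in the function.
const decoded = Buffer.from(encoded, 'base64').toString();
console.log(JSON.parse(decoded).name); // ABCD
```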






          • Thanks a lot Mike, the method you mentioned above works perfectly, but my source data will always be in proper JSON format, without these escape characters appended. Could you please help me understand why our JSON syntax did not work? As per your suggestion, I think the only way to get this working is to find a way to append the escape characters every time we process the source JSON dataset. Kindly suggest if you find a better solution.

            – vak
            Mar 30 at 2:27











          • Hi, I updated the solution with my latest comments.

            – ch_mike
            Apr 3 at 15:26











          • Thanks Mike. Your first solution helped me a lot. I made changes in the Cloud Function to convert the JSON to escaped form using the JavaScript replace method. Messages are now being picked up and inserted properly. Testing with different data sets now; in case of any more issues, I will update here. Thanks once again.

            – vak
            Apr 4 at 2:37











          edited Apr 1 at 14:57

          answered Mar 28 at 17:55 by ch_mike














