Can't use different KDCs and realms for Kafka Connect and HDFS Sink Connector?


I have set up Kafka Connect against a Kerberized Kafka cluster (say the KDC is "kafka-auth101.hadoop.local" and the realm is "KAFKA.MYCOMPANY.COM").



Now I am trying to set up the HDFS Sink connector to write into a Kerberized Hadoop cluster that uses a different KDC (say the KDC is "hadoop-auth101.hadoop.local" and the realm is "HADOOP.MYCOMPANY.COM").



I have added both of these realms to the krb5.conf used by Kafka Connect.
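The worker JVM is pointed at that krb5.conf, with Kerberos debugging switched on (which is where the ">>>" lines below come from), along these lines; the file path is illustrative:

export KAFKA_OPTS="-Djava.security.krb5.conf=/etc/krb5.conf -Dsun.security.krb5.debug=true"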



But during initialization, the HDFS Sink connector instance fails with the error below.

Any tips on this? Basically, with this configuration, a single JVM has to work with two different KDCs and realms.



>>> KdcAccessibility: reset
>>> KeyTabInputStream, readName(): HADOOP.MYCOMPANY.COM
>>> KeyTabInputStream, readName(): hdfsuser
>>> KeyTab: load() entry length: 85; type: 18
Looking for keys for: hdfsuser@HADOOP.MYCOMPANY.COM
Found unsupported keytype (18) for hdfsuser@HADOOP.MYCOMPANY.COM
[2019-03-19 07:21:12,330] INFO Couldn't start HdfsSinkConnector: (io.confluent.connect.hdfs.HdfsSinkTask)
org.apache.kafka.connect.errors.ConnectException: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:202)
    at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:76)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:232)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:145)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:146)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:190)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Login failure for hdfsuser@HADOOP.MYCOMPANY.COM from keytab /etc/hadoop/keytab/stg.keytab: javax.security.auth.login.LoginException: Unable to obtain password from user
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromKeytab(UserGroupInformation.java:963)
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:127)
    ... 10 more
Caused by: javax.security.auth.login.LoginException: Unable to obtain password from user


krb5.conf looks like this:



[logging]
kdc = FILE:/var/log/krb5/krb5kdc.log
admin_server = FILE:/var/log/krb5/kadmin.log
default = FILE:/var/log/krb5/krb5libs.log

[libdefaults]
default_realm = KAFKA.MYCOMPANY.COM
dns_lookup_realm = false
dns_lookup_kdc = false
ticket_lifetime = 24h
forwardable = yes
allow_weak_crypto = true
renew_lifetime = 7d
kdc_timeout = 3000
max_retries = 2
clockskew = 120
default_tkt_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
default_tgs_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc
permitted_enctypes = rc4-hmac aes256-cts aes128-cts des3-cbc-sha1 des-cbc-md5 des-cbc-crc



[realms]
# KDC/realm for Kafka
KAFKA.MYCOMPANY.COM = {
    kdc = kafka-auth101.hadoop.local
    admin_server = kafka-auth101.hadoop.local:2749
}

# KDC/realm for Hadoop/HDFS
HADOOP.MYCOMPANY.COM = {
    kdc = hadoop-auth101.hadoop.local
    admin_server = hadoop-auth101.hadoop.local:2749
}

[appdefaults]
pam = {
    debug = false
    ticket_lifetime = 36000
    renew_lifetime = 36000
    forwardable = true
    krb4_convert = false
}
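For completeness, I would also expect a host-to-realm mapping so the JVM can pick the right realm per service host; a sketch only, with placeholder broker/namenode host names (since both clusters sit under the same .hadoop.local suffix, this would have to be per host rather than a domain-wide rule):

[domain_realm]
kafka-broker101.hadoop.local = KAFKA.MYCOMPANY.COM
namenode101.hadoop.local = HADOOP.MYCOMPANY.COM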










Tags: apache-kafka hdfs kerberos apache-kafka-connect confluent





