Per class weighted loss for multiclass-multilabel classification


I'm doing multiclass-multilabel classification. Namely, each example has N_labels fully independent labels, and each label takes one of N_classes mutually exclusive values. More concretely, each example is classified by an N_labels-dimensional vector whose components take values from the set {0, 1, ..., N_classes - 1}.

For example, if N_labels = 5 and N_classes = 3, each example may be classified by tags such as:

[2, 1, 0, 0, 1], [0, 0, 2, 2, 1], [0, 0, 0, 0, 0]

In addition, the classes for each label are heavily imbalanced: about 90% of the examples in the training set belong to class 0. So I'd like to apply a weighted softmax cross entropy to compute the loss for each label (and average afterwards).

I have tried:

tf.losses.sparse_softmax_cross_entropy # but it seems to weight the different labels rather than the classes within each label

tf.nn.softmax_cross_entropy_with_logits, tf.nn.softmax_cross_entropy_with_logits_v2 # have no weighting option at all

tf.nn.weighted_cross_entropy_with_logits # suitable only for binary classification

I'd like a compute_loss function that computes the loss as follows:

loss = compute_loss(logits=my_logits, labels=my_labels, weights=my_weights)

where

my_logits is of shape [batch_size, N_labels, N_classes]
my_labels is of shape [batch_size, N_labels]
my_weights is of shape [N_labels, N_classes]

Note that each label may have its own set of class weights.
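
As a minimal sketch of what such a compute_loss could look like (assuming TF 1.x, and assuming the intent is to scale each label's cross entropy by the weight of its true class; the function itself is illustrative, not an existing API beyond the tf.* calls it uses):

import tensorflow as tf

def compute_loss(logits, labels, weights):
    # logits:  [batch_size, N_labels, N_classes], float
    # labels:  [batch_size, N_labels], int, values in 0 .. N_classes - 1
    # weights: [N_labels, N_classes], float
    # Unweighted cross entropy per example and per label: [batch_size, N_labels]
    xent = tf.nn.sparse_softmax_cross_entropy_with_logits(
        labels=labels, logits=logits)
    # Select weights[label, true_class] for every entry: one_hot(labels) has
    # shape [batch_size, N_labels, N_classes]; multiplying by weights
    # broadcasts over the batch, and summing over the class axis keeps only
    # the weight of the true class, giving shape [batch_size, N_labels].
    picked = tf.reduce_sum(
        tf.one_hot(labels, depth=tf.shape(logits)[-1]) * weights, axis=-1)
    # Average the weighted per-label losses over labels and over the batch.
    return tf.reduce_mean(picked * xent)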










python tensorflow classification loss-function






asked Mar 26 at 10:37 by Mike E, edited Mar 26 at 13:09







  • Can you provide input logits, labels, weights and a desired output?

    – gorjan
    Mar 26 at 10:44

  • Hi Mike, welcome to Stack Overflow. You mention that for each label your classes are mutually exclusive; I am not sure I get this from the example you provide.

    – Gio
    Mar 26 at 10:44

  • @Gio, the labels are not mutually exclusive. Each example is ALWAYS labeled with, say, 5 labels, and each label may take 3 different values (e.g. 0, 1, 2).

    – Mike E
    Mar 26 at 10:49

  • @gorjan, my input dimensions are: my_logits.shape = [batch_size, N_labels, N_classes], my_labels.shape = [batch_size, N_labels], my_weights.shape = [N_labels, N_classes]. The logits are real numbers, the weights are positive numbers with sum(my_weights, axis=1) = ones(N_labels), and each component of my_labels may be any number from 0, 1, ..., N_classes - 1.

    – Mike E
    Mar 26 at 10:49













1 Answer
I think you need tf.losses.sigmoid_cross_entropy. It takes multi_class_labels just as you described, and it has functionality to apply weights:
https://www.tensorflow.org/api_docs/python/tf/losses/sigmoid_cross_entropy

Example:
Suppose you have a multiclass multilabel classification problem with 10 classes in total, where the label for a single example looks like [1, 3, 6], meaning the example contains classes 1, 3 and 6.

You need to use k-hot encoding:

labels = tf.reduce_max(tf.one_hot([1, 3, 6], 10, dtype=tf.int32), axis=0)

In this case the output will be [0, 1, 0, 1, 0, 0, 1, 0, 0, 0]






answered Mar 26 at 15:13 by Sharky, edited Mar 27 at 13:24
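
As a rough usage sketch of this suggestion (TF 1.x; the batch size and weight values below are made up purely for illustration), the per-class weights go through the weights argument, which broadcasts against the element-wise sigmoid loss:

import tensorflow as tf

num_classes = 10
logits = tf.random.normal([4, num_classes])   # [batch, classes]
# k-hot targets: every example in this toy batch has classes 1, 3 and 6
labels = tf.reduce_max(tf.one_hot([[1, 3, 6]] * 4, num_classes), axis=1)
# Hypothetical per-class weights, up-weighting the rare classes 1, 3 and 6
class_weights = tf.constant([[1., 5., 1., 5., 1., 1., 5., 1., 1., 1.]])

loss = tf.losses.sigmoid_cross_entropy(
    multi_class_labels=labels,
    logits=logits,
    weights=class_weights)   # shape [1, num_classes] broadcasts over the batch

Note that this treats every class as an independent binary target (one sigmoid per class) rather than a per-label softmax over N_classes values, which is the distinction raised in the comments below.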












  • Not exactly. It is designed solely for multiclass classification when it is not combined with multilabel. But it can be used for each label separately, and this is the way I'm doing it now, invoking tf.losses.sparse_softmax_cross_entropy().

    – Mike E
    Mar 27 at 9:24

  • No, this actually is the function for multiclass multilabel classification. I added an example.

    – Sharky
    Mar 27 at 13:25

  • Cool, thank you

    – Mike E
    Mar 28 at 14:06

  • If the answer was helpful, please consider accepting it: stackoverflow.com/help/someone-answers

    – Sharky
    Mar 28 at 14:13
















