How to access BERT intermediate layer outputs in TF Hub Module?
Does anybody know a way to access the outputs of the intermediate layers from BERT's hosted models on TensorFlow Hub?
The model is hosted here. I have explored the meta graph and found that the only signatures available are "tokens", "tokenization_info", and "mlm". The first two are illustrated in the examples on GitHub, and the masked language model signature doesn't help much. Some models, like Inception, let you access all of the intermediate layers, but not this one.
Right now, all I can think of to do is:
- Run `[i.values() for i in tf.get_default_graph().get_operations()]` to get the names of all the tensors, then find the ones I want (out of thousands).
- Call `tf.get_default_graph().get_tensor_by_name(name_of_the_tensor)` to access the values, stitch them together, and connect them to my downstream layers.
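A minimal, runnable sketch of the tensor-name hack described above. To keep it self-contained, a tiny stand-in `Graph` class mimics just the two methods the hack relies on (`get_operations` and `get_tensor_by_name`) instead of actually loading the hub module, and the layer-output tensor names are hypothetical, not the module's real names:

```python
# Stand-ins for the pieces of tf.Graph that the hack uses.
class FakeTensor:
    def __init__(self, name):
        self.name = name

class FakeOp:
    def __init__(self, names):
        self._tensors = [FakeTensor(n) for n in names]
    def values(self):
        return self._tensors

class FakeGraph:
    def __init__(self, ops):
        self._ops = ops
    def get_operations(self):
        return self._ops
    def get_tensor_by_name(self, name):
        for op in self._ops:
            for t in op.values():
                if t.name == name:
                    return t
        raise KeyError(name)

# Hypothetical layer-output names; the real module uses its own prefixes.
graph = FakeGraph([
    FakeOp(["bert/encoder/layer_0/output/LayerNorm:0"]),
    FakeOp(["bert/encoder/layer_1/output/LayerNorm:0"]),
    FakeOp(["bert/pooler/dense/Tanh:0"]),
])

# Step 1: dump every tensor name in the graph.
names = [t.name for op in graph.get_operations() for t in op.values()]

# Step 2: pick out the per-layer outputs and fetch them by name.
layer_outs = [graph.get_tensor_by_name(n)
              for n in names if "/output/LayerNorm" in n]
print(len(layer_outs))  # 2
```

With the real module, `graph` would be `tf.get_default_graph()` after calling the hub module, and the filter string would be whatever pattern the module's encoder layers actually use.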
Does anybody know a cleaner solution with TensorFlow?
python tensorflow nlp
Probably the hack you describe is the only way to read the intermediate layers, since they are not exposed through the hub signature. Another option is to file a feature request on GitHub asking for additional outputs.
– greeness
Mar 25 at 19:23
Thanks! That's kind of what I thought. I'll probably file a feature request. Correct me if I'm wrong, but the more I read in the BERT literature, the more it seems there are real benefits to concatenating these other layers.
– AlexDelPiero
Mar 25 at 20:46
Yeah, I've seen that for some people (in particular domains/use cases) using the intermediate layers is helpful, but your mileage may vary. Anyway, it's worth a try.
– greeness
Mar 25 at 20:50
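For reference, the layer-concatenation idea mentioned in the comments (the feature-based approach from the BERT paper, where the last four encoder layers are concatenated per token) can be sketched with NumPy on dummy data; the shapes here are illustrative, not the module's actual dimensions:

```python
import numpy as np

num_layers, seq_len, hidden = 12, 8, 16
# Pretend these are the per-layer encoder outputs fetched by tensor name.
layers = [np.random.rand(seq_len, hidden) for _ in range(num_layers)]

# Concatenate the last four layers along the feature axis, per token.
features = np.concatenate(layers[-4:], axis=-1)
print(features.shape)  # (8, 64)
```

The resulting per-token feature vector is four times the hidden size and can be fed to a downstream classifier.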
asked Mar 25 at 8:10
AlexDelPiero