

How to access BERT intermediate layer outputs in TF Hub Module?
Does anybody know a way to access the outputs of BERT's intermediate layers in the models hosted on TensorFlow Hub?



The model is hosted here. I have explored the meta graph and found that the only signatures available are "tokens", "tokenization_info", and "mlm". The first two are illustrated in the examples on GitHub, and the masked-language-model signature doesn't help much. Some models, such as Inception, let you access all of their intermediate layers, but this one does not.



Right now, all I can think of to do is:



  1. Run [i.values() for i in tf.get_default_graph().get_operations()] to list every tensor in the graph and pick out the ones I want (out of thousands), then


  2. Call tf.get_default_graph().get_tensor_by_name(name_of_the_tensor) on each of them, stitch the values together, and connect them to my downstream layers.
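The two steps above can be sketched on a toy TF1-style graph standing in for the hub module (the names `x`, `hidden`, and `out` below are illustrative placeholders, not BERT's actual tensor names, which would have to be discovered via step 1):

```python
import tensorflow as tf

tf1 = tf.compat.v1          # TF1-style graph API, as used by hub.Module
tf1.disable_eager_execution()

g = tf1.Graph()
with g.as_default():
    x = tf1.placeholder(tf.float32, shape=[None, 4], name="x")
    w1 = tf1.get_variable("w1", shape=[4, 8])
    hidden = tf1.matmul(x, w1, name="hidden")   # stands in for an encoder layer
    w2 = tf1.get_variable("w2", shape=[8, 2])
    out = tf1.matmul(hidden, w2, name="out")

    # Step 1: enumerate every tensor in the graph to find candidate names.
    names = [t.name for op in g.get_operations() for t in op.values()]

    # Step 2: fetch the intermediate tensor by name and wire it downstream.
    inter = g.get_tensor_by_name("hidden:0")
```

Against the real module the list from step 1 would be filtered down to the encoder-layer outputs before applying step 2.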

Does anybody know a cleaner solution with TensorFlow?
  • Probably the hack you describe is the only way to read the intermediate layers, since they are not exposed through the hub signature. Another option is to file a feature request on GitHub asking for additional outputs.

    – greeness
    Mar 25 at 19:23

  • Thanks! That's kind of what I thought. I'll probably file a feature request. Correct me if I'm wrong, but the more I read of the BERT literature, the more it seems there are real benefits to concatenating these other layers.

    – AlexDelPiero
    Mar 25 at 20:46

  • Yeah, I've seen that in some particular domains/use cases the intermediate layers are helpful, but your mileage may vary. Anyway, it's worth a try.

    – greeness
    Mar 25 at 20:50
Tags: python, tensorflow, nlp
asked Mar 25 at 8:10
AlexDelPiero

1946