


Dealing with data of different sizes in multi-task learning (PyTorch)


I want to create a multi-task learning model with PyTorch.



The model I designed uses hard parameter sharing (http://ruder.io/multi-task/).



The problem is that each dataset has a different size, so the tasks cannot be trained in a single shared for loop like this:



for epoch in range(EPOCHS):
    for data in dataloader_train:
        output_a = model(data.a)
        output_b = model(data.b)
        output_c = model(data.c)

        loss_a = criterion(output_a, data.target_a)
        loss_b = criterion(output_b, data.target_b)
        loss_c = criterion(output_c, data.target_c)

        optimizer.zero_grad()
        # sum the per-task losses into one scalar so a single
        # backward pass covers all three tasks
        (loss_a + loss_b + loss_c).backward()
        optimizer.step()
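
Aside: the combined backward call is usually written by summing the task losses into one scalar, as the loop above now does. Here is a self-contained toy sketch of hard parameter sharing with a summed loss; the shapes, layer sizes, and SGD settings are all hypothetical, chosen only so the snippet runs:

import torch
import torch.nn as nn

# Hypothetical hard-parameter-sharing model: one shared trunk, three task heads.
trunk = nn.Linear(4, 8)
heads = nn.ModuleList([nn.Linear(8, 2) for _ in range(3)])

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(
    list(trunk.parameters()) + list(heads.parameters()), lr=0.1
)

# One fake batch per task (16 samples, 4 features; binary class targets).
xs = [torch.randn(16, 4) for _ in range(3)]
ys = [torch.randint(0, 2, (16,)) for _ in range(3)]

# Forward each task's batch through the shared trunk and its own head.
losses = [criterion(head(trunk(x)), y) for head, x, y in zip(heads, xs, ys)]

optimizer.zero_grad()
sum(losses).backward()   # one scalar loss -> one backward pass over shared weights
optimizer.step()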


I would like to know how to deal with this situation. I thought I could use zip with itertools.cycle (Iterate over two list of different sizes in python), but it might affect the model significantly, because it changes how often certain samples are seen, especially those near the start of the shorter dataset:



data.a  data.b
a       1
b       2
c       3
a       4
b       5
c       1
a       2
b       3
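
To make that repetition concrete, here is a minimal runnable sketch of the zip-plus-itertools.cycle pairing, with plain lists standing in for the dataloaders:

from itertools import cycle

loader_a = ["a", "b", "c"]        # shorter dataset (stand-in for a DataLoader)
loader_b = [1, 2, 3, 4, 5]        # longer dataset

# cycle() restarts loader_a whenever it runs out, so zip() can pair every
# element of loader_b; early elements of loader_a repeat more often.
for item_a, item_b in zip(cycle(loader_a), loader_b):
    print(item_a, item_b)         # a 1, b 2, c 3, a 4, b 5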


On the other hand, if I regard it as just a difference in the number of epochs per dataset, there seems to be no problem: e.g. data.a completes 5 epochs while data.b completes 3 epochs when EPOCHS == 15.



Alternatively, is there any problem with designing the training loop like this?



for epoch in range(EPOCHS):
    for data, target in data_a:
        output = model(data)
        loss = criterion(output, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    for data, target in data_b:
        output = model(data)
        loss = criterion(output, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    for data, target in data_c:
        output = model(data)
        loss = criterion(output, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
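
If the sequential scheme is acceptable, the three identical loops can be collapsed into one helper. This is only a deduplication sketch; it assumes model, criterion, optimizer, EPOCHS, and the three loaders exist as in the pseudocode above:

def train_one_pass(model, loader, criterion, optimizer):
    # One full pass over a single task's dataloader.
    for data, target in loader:
        output = model(data)
        loss = criterion(output, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

for epoch in range(EPOCHS):
    for loader in (data_a, data_b, data_c):   # each task trains once per epoch
        train_one_pass(model, loader, criterion, optimizer)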


I think this would be the simplest solution. However, for some reason I am stuck on the idea that I have to build the model like the first version.



The training loops above are pseudocode.



Thank you for the advice :)










for-loop deep-learning pytorch subsampling

asked Mar 27 at 13:57 by c0mu51c4r7
  • It is not clear why you cannot concatenate data_a, data_b and data_c. What size is different? Is it only the number of training examples?

    – Cedias
    Mar 27 at 20:28
















