Building a keras model to apply a Dense network to every column in a 3-D array and return a 2-D array
I have a large number of n×m×m arrays. I would like to train a keras model that learns a single Dense NN to apply to all of the n×1 column vectors. As a concrete example, suppose A is a 6×10×10 tensor; it therefore has 100 6×1 column vectors.
I have a keras model to train a Dense neural network:

import keras as K
import keras.layers as L

def column_nn():
    layers = [12, 36, 12, 1]
    columns = L.Input(shape=(6,))
    x = L.Dense(layers[0], activation='relu')(columns)
    for l in layers[1:]:
        x = L.Dense(l, activation='relu')(x)
    return K.models.Model(inputs=columns, outputs=x)
I'd like to apply this to each of the 100 column vectors, but I want to return a 2-D 10×10 tensor that I can do other things to, such as pass through Conv2D layers.
One way that comes to mind is to use reshaping and a keras shared layer:

data = L.Input(shape=(6, 10, 10))
column_model = column_nn()            # the shared per-column model from above
x = L.Permute((2, 3, 1))(data)        # -> (10, 10, 6), channels last
x = L.Reshape((-1, 6))(x)             # -> (100, 6), one column vector per row
x = column_model(x)                   # shared weights applied to every row
x = L.Reshape((10, 10, 1))(x)         # back to a grid, with a channel axis for Conv2D
# now do a bunch of stuff to the 2-D grid, such as
x = L.Conv2D(filters=5, kernel_size=(3, 3))(x)
x = L.MaxPooling2D(pool_size=(2, 2), strides=(2, 2))(x)
x = L.Flatten()(x)
output = L.Dense(1)(x)                # output width arbitrary here
return K.models.Model(inputs=data, outputs=output)
Seem ok? I'd love to know if there is a slicker way to pull this off.
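As a sanity check of the Permute/Reshape bookkeeping, the same index shuffle can be sketched in plain numpy for a single sample (a hedged sketch; the row-major index convention here is my assumption, not something from the keras docs):

```python
import numpy as np

# One sample of shape (6, 10, 10): 100 column vectors of length 6.
n, m = 6, 10
A = np.random.rand(n, m, m)

# Permute((2, 3, 1)) on a batched (6, 10, 10) input moves channels last;
# for a single sample that is a transpose to (10, 10, 6).
x = np.transpose(A, (1, 2, 0))        # (10, 10, 6)
x = x.reshape(-1, n)                  # (100, 6): one column vector per row

# Row k of x should be the column vector A[:, i, j] with k = i*m + j.
i, j = 3, 7
assert np.allclose(x[i * m + j], A[:, i, j])
```

If the assertion holds, every row of the (100, 6) tensor really is one of the original 6×1 column vectors, so a shared layer applied row-wise is applied once per column.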
python tensorflow keras
asked Jan 4 at 14:35, edited Jan 4 at 15:31 – AstroBen
1 Answer
If you reshape and transpose your input data to (m*m, n), you can use Dense(k) in conjunction with TimeDistributed to apply the same weights to the m^2 vectors separately. The output shape would be (m*m, k), after which you can reshape again to suit your needs.
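The suggestion above can be sketched in numpy for one sample (hypothetical random weights standing in for a trained Dense(k) kernel; bias omitted for brevity):

```python
import numpy as np

# Shapes follow the question: n=6, m=10, and a Dense(1) output per column.
n, m, k = 6, 10, 1
A = np.random.rand(n, m, m)
W = np.random.rand(n, k)              # one shared Dense(k) kernel, no bias

# Transpose/reshape (n, m, m) -> (m*m, n), apply the shared weights to
# every row at once, then reshape the (m*m, k) result back to a grid.
x = np.transpose(A, (1, 2, 0)).reshape(m * m, n)   # (100, 6)
out = x @ W                                        # (100, 1): same W for all rows
grid = out.reshape(m, m)                           # (10, 10), ready for Conv2D etc.

# Each grid entry is the shared map applied to one column vector of A.
assert np.allclose(grid[2, 5], A[:, 2, 5] @ W)
```

Because the matrix multiply already acts on the last axis of every row independently, one shared weight matrix covers all m^2 vectors, which is exactly the weight sharing the question is after.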
I was just looking into reshape and transpose when you wrote this. :-) Great pointer on using TimeDistributed. I will definitely look into it. – AstroBen, Jan 4 at 15:26
This answer is ok, but TimeDistributed(Dense(...)) is equal to just Dense(...) -- the layer will be applied to the last dimension. – Daniel Möller, Jan 4 at 15:37
@DanielMöller I didn't know that, thank you for pointing it out. – BlackBear, Jan 5 at 8:15
answered Jan 4 at 15:02 – BlackBear