Dynamic column names between multiple tables when importing into MySQL
I have imported a table named book1 with column names: ID, 2018M01, 2018M02, 2018M03.
I now need to import a second table named book2 with column names: ID, 2018M05, 2018M06, 2018M07.
The column names of the two tables share a pattern, i.e. the year prefix 2018.
If possible I would prefer not to write out this part of the code, column by column, for every table, like below:
CREATE TABLE book1 (
  ID VARCHAR(10),
  `2018M01` decimal(4,2),
  `2018M02` decimal(4,2),
  `2018M03` decimal(4,2)
);
as in reality the actual tables have many more columns.
The code below handles the first table with a dynamic section; I need help doing the same for the second imported table.
DROP TABLE IF EXISTS book1;
CREATE TABLE book1 (
  ID VARCHAR(10),
  `2018M01` decimal(4,2),
  `2018M02` decimal(4,2),
  `2018M03` decimal(4,2)
);
LOAD DATA LOCAL INFILE '/Users/blytonpereira/Desktop/Book1.csv' REPLACE INTO TABLE book1
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
SELECT
GROUP_CONCAT(
CONCAT(
'SELECT id, ''', COLUMN_NAME, ''' as month, `', COLUMN_NAME, '` as QTY FROM book1 ') SEPARATOR ' UNION ALL ')
FROM
`INFORMATION_SCHEMA`.`COLUMNS`
WHERE
`TABLE_NAME` = 'book1' AND `COLUMN_NAME` LIKE '2018%'
INTO @sql;
SET @query = CONCAT('select id, month, QTY from (' , @sql , ') x order by id;');
SELECT @query;
PREPARE stmt FROM @query;
EXECUTE stmt;
DEALLOCATE PREPARE stmt;
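For clarity, the string that the GROUP_CONCAT step assembles can be sketched outside MySQL. This is a minimal Python sketch of the same construction; the function name is illustrative and not part of the original code:

```python
def build_unpivot_sql(table, columns):
    """Mimic the GROUP_CONCAT step: one SELECT per month column, glued with UNION ALL."""
    parts = [f"SELECT id, '{c}' as month, `{c}` as QTY FROM {table}" for c in columns]
    inner = " UNION ALL ".join(parts)
    return f"select id, month, QTY from ({inner}) x order by id;"
```

Running it with the first table's columns produces the same text that ends up in `@query`.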
I would like to avoid explicitly defining the column names for each and every import. Each import is similar but slightly different: the column names shift each time, e.g. the first table has 2018M01, 2018M02, 2018M03 and the second has 2018M02, 2018M03, 2018M04. I need some help writing the second part of the code, where the import of the second file happens.
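Since the column names live in the CSV header row, one way to avoid hand-writing the DDL is to generate it outside MySQL. A minimal Python sketch, assuming the first header field is the ID and every other column is a quantity; the function name and the `DECIMAL(4,2)` default are assumptions, not part of the original code:

```python
import csv

def build_import_sql(csv_path, table, id_type="VARCHAR(10)", value_type="DECIMAL(4,2)"):
    """Read the CSV header row and emit CREATE TABLE + LOAD DATA statements."""
    with open(csv_path, newline="") as f:
        header = next(csv.reader(f))  # e.g. ['ID', '2018M05', '2018M06', '2018M07']
    cols = [f"`{header[0]}` {id_type}"] + [f"`{c}` {value_type}" for c in header[1:]]
    create = f"CREATE TABLE {table} (\n  " + ",\n  ".join(cols) + "\n);"
    load = (f"LOAD DATA LOCAL INFILE '{csv_path}' REPLACE INTO TABLE {table}\n"
            f"FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' IGNORE 1 LINES;")
    return create, load
```

The two returned statements can then be run against MySQL for each new file, so no column list needs to be typed by hand.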
I don't understand what your problem is. If what you've posted is working for the first table, you seem to be on the right track. Where are you stuck?
– Dan Farrell
Jan 1 at 1:35
What help do you need? What issue are you facing with this code?
– Chetan Ranpariya
Jan 1 at 1:36
@DanFarrell for the SECOND import table from a local .csv file, I can use the LOAD DATA LOCAL INFILE statement, however that requires me to create the table in SQL beforehand (i.e. the column names etc.). I want this to be dynamic: each time I create a new import I do not want to define the column names in SQL, as there are several columns, so I need some help writing this part of the code, continuing from what I have already written.
– bp123
Jan 1 at 1:40
@ChetanRanpariya please see image ibb.co/TbDwNmp. I would like to avoid explicitly defining the column names for each and every import. Each import is similar but slightly different: the column names shift each time, e.g. the first table has 2018M01, 2018M02, 2018M03 and the second has 2018M02, 2018M03, 2018M04. I need some help writing the second part of the code, where the import of the second file happens. The similarity is that the year in the column names, 2018, is the same.
– bp123
Jan 1 at 1:45
You know MySQL has a hard limit of 4096 columns per table, and InnoDB has a limit of around 1000? Besides, incrementing column names are a great candidate for normalisation.
– Raymond Nijland
Jan 1 at 1:59
asked Jan 1 at 1:29 by bp123, edited Jan 1 at 2:01
1 Answer
This is just a suggestion, obviously with no testing and with some inherent assumptions.
It looks like you could load each file into the SAME table, and therefore avoid dynamic column names entirely; you are already ignoring the top row, which I presume holds the column names:
LOAD DATA LOCAL INFILE '/Users/blytonpereira/Desktop/Book1.csv' REPLACE INTO TABLE book1
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
So a "staging" table could be defined like so:
CREATE TABLE book_staging (
  ID VARCHAR(10),
  MNTH_A decimal(4,2),
  MNTH_B decimal(4,2),
  MNTH_C decimal(4,2)
);
And you could set up a fixed query with parameters, like this:
set @mnth_a = '2018M04';
set @mnth_b = '2018M05';
set @mnth_c = '2018M06';
select id, month, QTY
from (
SELECT id, @mnth_a as month, MNTH_A as QTY FROM book_staging WHERE MNTH_A IS NOT NULL
UNION ALL
SELECT id, @mnth_b as month, MNTH_B as QTY FROM book_staging WHERE MNTH_B IS NOT NULL
UNION ALL
SELECT id, @mnth_c as month, MNTH_C as QTY FROM book_staging WHERE MNTH_C IS NOT NULL
) x
order by id, month
;
At the end of each cycle of loading a file, the staging table can be truncated or dropped (depending on your preferred approach) so you are ready for the next run.
Just as an observation, column names that are also function names or reserved words are painful to use; personally I would not use the column name month. Similarly, I would much rather use a real date like 2018-01-01 than a string like 2018M01, but there may be other reasons for using those strings that I'm unaware of.
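The fixed UNION ALL above is just a wide-to-long unpivot. The same transformation, sketched in Python for clarity (the function and variable names are illustrative, not part of the answer's SQL):

```python
def unpivot(rows, month_labels):
    """Turn (id, v1, v2, ...) rows into (id, month, qty) rows, skipping None,
    mirroring the UNION ALL over book_staging with its IS NOT NULL filters."""
    out = []
    for row in rows:
        id_, values = row[0], row[1:]
        for label, qty in zip(month_labels, values):
            if qty is not None:  # WHERE MNTH_x IS NOT NULL
                out.append((id_, label, qty))
    return sorted(out)  # ORDER BY id, month
```

Feeding in the staging rows and the three month labels yields the same (id, month, QTY) result set the query produces.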
@usedbyalready when I am loading the next file, assuming its name is Book2.csv, I can just skip the CREATE TABLE part. I am not understanding clearly what is meant by the staging table being dropped. Does it mean we only need the staging table once, when we load the first file, and never again?
– bp123
Jan 1 at 12:07
The prime purpose of my suggestion was to show how to avoid dynamic columns. A staging table can be truncated or dropped; it depends on your preferred approach.
– Used_By_Already
Jan 1 at 22:16
answered Jan 1 at 4:09 by Used_By_Already, edited Jan 1 at 22:13