How can I insert 100000 rows in SQL Server?

INSERT INTO pantscolor_t (procode, color, pic) 
VALUES
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
('74251', 'Black', '511black.jpg'),
('74251', 'OD Green', '511odgreen.jpg'),
..........
..........
..........

INSERT INTO pantscolor_t (procode,color,pic)
VALUES
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
('74251', 'Charcoal', '511charcoal.jpg'),
.............
.............
.............

INSERT INTO........................
INSERT INTO........................
INSERT INTO........................
INSERT INTO........................


I have 100,000 rows like this, but my INSERT statements are bigger than 1,000 rows each. When I run the script in SSMS, I get an error:




The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values.
sql sql-server sql-server-2008

asked Oct 12 '11 at 8:52 by mekar10; edited Dec 29 '18 at 7:12 by marc_s


3 Answers

Create a CSV file (or any file with a defined field delimiter and row delimiter) and use BULK INSERT to load it into the database. The file can have 100,000 rows; there is no problem loading a huge file with a bulk upload.



          http://msdn.microsoft.com/en-us/library/ms188365.aspx
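A minimal sketch of that approach, assuming the rows have been exported to a comma-separated file (the file path and terminators below are placeholders for your own data):

BULK INSERT pantscolor_t
FROM 'C:\data\pantscolor.csv'   -- hypothetical path to the exported file
WITH (
    FIELDTERMINATOR = ',',      -- field delimiter used in the file
    ROWTERMINATOR = '\n'        -- row delimiter used in the file
);

A single BULK INSERT loads the whole file at once, so the 1,000-row limit on the VALUES clause never comes into play.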
answered Oct 12 '11 at 9:04 by Bala

• +1 From code, you could use the SqlBulkCopy class. – Andomar, Oct 12 '11 at 9:07
• I have a SQL file; it is coming from MySQL via export... – mekar10, Oct 12 '11 at 9:21
• Then I guess it will be easy to convert it into a CSV file. – Bala, Oct 12 '11 at 9:27
• ok! thank you.. – mekar10, Oct 12 '11 at 9:33
• You can also use bcp.exe: bcp mytable in data.txt -S (local) -d mydatabase -T -c (-T for trusted connection, -c for character data; you can use it for tab-separated files). You can see other options with bcp --help, for example -t for the field terminator and -r for the row terminator, in case your data is not tab-delimited. – endo64, Jan 15 '18 at 15:37
Another solution is to use a SELECT query with UNIONs.



INSERT INTO pantscolor_t (procode,color,pic)
SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
UNION ALL SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
UNION ALL SELECT '74251', 'Black', '511black.jpg'
UNION ALL SELECT '74251', 'OD Green', '511odgreen.jpg'
--etc....


UNION ALL is used instead of UNION to speed up the query when dealing with thousands of records. UNION ALL allows duplicate rows, whereas UNION ensures that duplicates do not exist in the result set. For this scenario we don't want to remove any possible duplicates, so UNION ALL is used. Note that the 1,000-row cap applies only to the VALUES row constructor, so an INSERT ... SELECT like this can insert all the rows in one statement.
edited May 17 '17 at 15:50 by ahsteele; answered Jan 4 '16 at 18:34 by Alex Jorgenson

By applying the following, you should not get any error:



INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251','Black','511black.jpg')
INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'OD Green', '511odgreen.jpg')
INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'Black', '511black.jpg')
INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'OD Green', '511odgreen.jpg')
INSERT INTO pantscolor_t (procode,color,pic) VALUES ('74251', 'Black', '511black.jpg')
...........


I tried it and it worked; of course, you can use Excel to concatenate the values easily.
answered Oct 24 '17 at 12:00 by Scorpion99