Complexity of creating a list with the concat operator in Python
I am starting to learn about data structures and algorithms, and I have encountered an issue. Here is the function I am testing:
def create_list_with_concat(n):
    l = []
    for i in range(n):
        l = l + [i]
Here is my thought process:
I know that the concat operator is O(k), where k is the size of the list being added to the original list. Since k is always 1 in this case (we are adding one-element lists, one at a time), the concat operation takes 1 step. Since the loop iterates n times, the algorithm performs n steps, doing 1 step per iteration. Therefore, the algorithm's time complexity should be O(n). The algorithm's actual execution time would look something like T(n) = dn, where d is the time it takes to perform one concatenation. For such a function, I would expect the following to be true: when you increase the input size by 10 times, the output (execution time) should increase by 10 times, since:
(x, dx) --> (10x, 10dx) --> 10dx/dx = 10
However, when I actually test the algorithm on real values and time the executions, this does not seem to be happening. Instead, when I increase the input size by 10 times, the execution time increases by 100 times, and when I increase the input size by 100 times, it increases by 10000 times. These results suggest a quadratic time function, i.e. O(n^2).
Here is my full code:
import timeit

def create_list_with_concat(n):
    l = []
    for i in range(n):
        l = l + [i]

t1 = timeit.Timer("create_list_with_concat(100)", "from __main__ import create_list_with_concat")
print("concat ", t1.timeit(number=1) * 1000, "milliseconds")

t1 = timeit.Timer("create_list_with_concat(1000)", "from __main__ import create_list_with_concat")
print("concat ", t1.timeit(number=1) * 1000, "milliseconds")

# OUTPUT
# concat 0.05283101927489042 milliseconds
# concat 2.8588240093085915 milliseconds
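One way to check the growth rate empirically (an illustrative sketch along the same lines, not part of the original timing code above) is to time several sizes and compare consecutive measurements; ratios near 10 would point to O(n), ratios near 100 to O(n^2):

import timeit

def create_list_with_concat(n):
    l = []
    for i in range(n):
        l = l + [i]

# Time each size a few times and keep the best run to reduce noise.
sizes = [100, 1000, 10000]
times = [min(timeit.repeat(lambda n=n: create_list_with_concat(n),
                           number=1, repeat=5))
         for n in sizes]
for small, big in zip(times, times[1:]):
    # For O(n) the ratio should be about 10; for O(n^2), about 100.
    print(f"10x larger input -> {big / small:.0f}x slower")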
Thanks so much for the help.
python algorithm data-structures time-complexity
You realize there is a difference between l = l + [i] and l += [i]?
– Willem Van Onsem
Dec 31 '18 at 23:09
@WillemVanOnsem in terms of the end result, there isn't. If you're alluding to something that's occurring at a lower-level, it would be nice if you were to share your knowledge.
– chb
Dec 31 '18 at 23:39
@chb: but the question deals with time complexity, and in terms of time complexity, for lists there is a difference.
– Willem Van Onsem
Dec 31 '18 at 23:43
@WillemVanOnsem Sorry, was in the review queue and hadn't seen that you'd posted an answer.
– chb
Dec 31 '18 at 23:45
3 Answers
The time complexity is not O(N).
The time complexity of the concat operation for two lists, A and B, is O(A + B). This is because you aren't adding to one list; instead, you are creating a whole new list and populating it with elements from both A and B, which requires you to iterate through both.
Therefore, the operation l = l + [i] is O(len(l)), leaving you with N iterations that each do an O(N) operation, for an overall complexity of O(N^2).
You are confusing concat with the append or extend functions, which don't create a new list but add to the original. If you used those, your time complexity would indeed be O(N).
An additional note:
The notation l = l + [i] can be confusing because intuitively it seems like [i] is simply being added to the existing l. This isn't true! l + [i] builds an entirely new list and then has l point to that list.
On the other hand, l += [i] modifies the original list and behaves like extend.
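To make the contrast concrete, here is a minimal sketch of the O(N) version described above, using append instead of concatenation (the function name create_list_with_append is just for illustration):

def create_list_with_append(n):
    l = []
    for i in range(n):
        # append mutates the existing list in place, so each iteration
        # is amortized O(1) and the whole loop is O(n).
        l.append(i)
    return l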
l += [i] should really just be l.append(x)
– juanpa.arrivillaga
Dec 31 '18 at 23:51
I just wanted to highlight the difference between l += [i] and l = l + [i]
– Primusa
Jan 1 at 0:19
Here is my thought process: I know that the concat operator is O(k) where k is the size of the list being added to the original list. Since the size of k is always 1 in this case because we are adding one character lists at a time, the concat operation takes 1 step.
This assumption is incorrect. If you write:
l + [i]
you construct a new list. This new list has m+1 elements, where m is the number of elements in l. Given that a list is implemented like an array, we know that constructing such a list takes O(m) time. We then assign the new list to l.
So that means that the total number of steps is:
\sum_{m=0}^{n} O(m) = O(n^2)

so the time complexity is O(n^2).
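For completeness, the standard closed form behind that sum (basic arithmetic, not spelled out in the original answer):

\sum_{m=0}^{n} m = \frac{n(n+1)}{2} \in O(n^2)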
You can however boost performance by using l += [i], or, even faster, l.append(i). The amortized cost of both l += [i] and l.append(i) is O(1), so the algorithm is then O(n). The l.append(i) version will likely be a bit faster, because we save on constructing a new list each iteration, etc.
@juanpa.arrivillaga: yes, that's correct, but after rereading, I realize the wording might be a bit "misleading", will edit.
– Willem Van Onsem
Dec 31 '18 at 23:54
>>> spam = []
>>> eggs = spam
>>> spam += [1]
>>> eggs
[1]
>>> spam = []
>>> eggs = spam
>>> spam = spam + [1]
>>> eggs
[]
There's a difference in complexity between mutating a list and making a new one.
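A small follow-up sketch (not part of the original answer) that makes the same point with object identity: += extends the same list object in place, while = with + rebinds the name to a brand-new list:

spam = []
before = id(spam)
spam += [1]                 # in-place extend: same object
print(id(spam) == before)   # True

spam = []
before = id(spam)
spam = spam + [1]           # concatenation: builds a new list, rebinds spam
print(id(spam) == before)   # False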