How to pair each element of a Seq with the rest?

I'm looking for an elegant way to combine every element of a Seq with the rest for a large collection.
Example: Seq(1,2,3).someMethod should produce something like



Iterator(
(1,Seq(2,3)),
(2,Seq(1,3)),
(3,Seq(1,2))
)


Order of elements doesn't matter. It doesn't have to be a tuple, a Seq(Seq(1),Seq(2,3)) is also acceptable (although kinda ugly).



Note the emphasis on large collection (which is why my example shows an Iterator).
Also note that this is not combinations.



Ideas?



Edit:
In my use case, the numbers are expected to be unique. If a solution can eliminate the dupes, that's fine, but not at additional cost. Otherwise, dupes are acceptable.



Edit 2: In the end, I went with a nested for-loop and skipped the case when i == j; no new collections were created. I upvoted the solutions that were correct and simple ("simplicity is the ultimate sophistication" - Leonardo da Vinci), but even the best ones are quadratic by the very nature of the problem, and some create intermediate collections via ++, which I wanted to avoid because the collection I'm dealing with has close to 50,000 elements, which means about 2.5 billion pairs when quadratic.
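For concreteness, the nested-loop approach described in Edit 2 can be sketched as follows. This is a minimal sketch, not the OP's actual code; the per-pair work is a placeholder, since the real processing isn't shown in the question:

```scala
// Sketch of the Edit 2 approach: visit every (element, other) pair
// directly by index, skipping i == j, without building new collections.
val nums = Vector(1, 2, 3)

for (i <- nums.indices; j <- nums.indices if i != j) {
  // placeholder for the real per-pair work
  println(s"${nums(i)} paired with ${nums(j)}")
}
```

For n elements this visits n * (n - 1) ordered pairs, which matches the quadratic cost the OP mentions.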










  • What if there are duplicates in your original sequence, say, Seq(1, 1, 1)? Would the output be 3x 1 -> Seq(1, 1), or do you wish to de-dup?

    – Jack Leow
    Dec 30 '18 at 15:47
scala scala-collections






edited Dec 31 '18 at 9:58 by Abhijit Sarkar

asked Dec 30 '18 at 12:34 by Abhijit Sarkar













5 Answers




















The following code runs in constant time (it does everything lazily), but accessing each element of the resulting collections carries a small constant overhead: an index shift must be computed on every access:



def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

def facets[A](simplex: Vector[A]): Seq[(A, Seq[A])] = {
  val n = simplex.size
  (0 until n).view.map { i => (
    simplex(i),
    (0 until n - 1).view.map(j => simplex(faceMap(i)(j)))
  )}
}


Example:



println("Example: facets of a 3-dimensional simplex")
for ((i, v) <- facets((0 to 3).toVector)) {
  println(i + " -> " + v.mkString("[", ",", "]"))
}


Output:



Example: facets of a 3-dimensional simplex
0 -> [1,2,3]
1 -> [0,2,3]
2 -> [0,1,3]
3 -> [0,1,2]


This code expresses everything in terms of simplices, because "omitting one index" corresponds exactly to the face maps for a combinatorially described simplex. To further illustrate the idea, here is what the faceMap does:



println("Example: how `faceMap(3)` shifts indices")
for (i <- 0 to 5) {
  println(i + " -> " + faceMap(3)(i))
}


gives:



Example: how `faceMap(3)` shifts indices
0 -> 0
1 -> 1
2 -> 2
3 -> 4
4 -> 5
5 -> 6


The facets method uses these face maps to build, for each element, a lazy view of the original collection that omits that element: indices are shifted up by one starting at the index of the omitted element.
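To see the pay-for-what-you-use behavior concretely, one can force a single facet and leave the others untouched. This sketch reuses the faceMap and facets definitions from the answer, except that the return type is left inferred so it compiles on Scala 2.13 as well, where views are no longer subtypes of Seq:

```scala
def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

// Return type inferred: a lazy view of (element, lazy view of the rest).
def facets[A](simplex: Vector[A]) = {
  val n = simplex.size
  (0 until n).view.map { i =>
    (simplex(i), (0 until n - 1).view.map(j => simplex(faceMap(i)(j))))
  }
}

// Force only the facet at index 2; the other facets remain lazy views.
val (elem, rest) = facets(Vector(10, 20, 30, 40))(2)
println(elem)        // 30
println(rest.toList) // List(10, 20, 40)
```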






– Andrey Tyukin (answered Dec 30 '18 at 13:29, edited Dec 30 '18 at 18:07)
  • Clever. But at the end of the day, getting all the results materialized is still quadratic, right? So, essentially, the same as what I have (perhaps with .view added before .filter) ...

    – Dima
    Dec 30 '18 at 13:42











  • @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

    – Andrey Tyukin
    Dec 30 '18 at 13:48











  • ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

    – Dima
    Dec 30 '18 at 13:51













  • @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

    – Andrey Tyukin
    Dec 30 '18 at 13:54





















If I understand what you want correctly, in terms of handling duplicate values (i.e., duplicate values are to be preserved), here's something that should work. Given the following input:



import scala.util.Random

val nums = Vector.fill(20)(Random.nextInt)


This should get you what you need:



for (i <- Iterator.from(0).take(nums.size)) yield {
  nums(i) -> (nums.take(i) ++ nums.drop(i + 1))
}


On the other hand, if you want to remove dups, I'd convert to Sets:



val numsSet = nums.toSet
for (num <- nums) yield {
  num -> (numsSet - num)
}
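One caveat worth flagging for the Set variant, given the duplicates discussion in the question: if the input does contain repeated values, the comprehension yields one identical entry per occurrence, and the Set also discards the ordering of the rest. A small sketch with made-up data:

```scala
// Hypothetical input containing a duplicate, to show the behavior.
val nums = Vector(1, 1, 2)
val numsSet = nums.toSet

// Each occurrence of 1 produces the same (1, Set(2)) entry.
val result = for (num <- nums) yield num -> (numsSet - num)
println(result) // Vector((1,Set(2)), (1,Set(2)), (2,Set(1)))
```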





– Jack Leow (answered Dec 30 '18 at 16:02)
seq.iterator.map { x => x -> seq.filter(_ != x) }

This is quadratic, but I don't think there is very much you can do about that, because at the end of the day, creating a collection is linear, and you are going to need N of them.
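As a comment on this answer points out, filtering by value removes every occurrence of a duplicated element. A position-based variant (a sketch, not from the original answer) keeps duplicates while staying lazy in the outer iterator:

```scala
val seq = Seq(1, 2, 2, 3)

// Pair by index instead of by value, so the other occurrences of a
// duplicated element survive in the "rest".
val pairs = seq.iterator.zipWithIndex.map { case (x, i) =>
  x -> seq.zipWithIndex.collect { case (y, j) if j != i => y }
}

pairs.foreach(println)
// (1,List(2, 2, 3))
// (2,List(1, 2, 3))
// (2,List(1, 2, 3))
// (3,List(1, 2, 2))
```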






  • filterNot is going to create a copy even for non-strict sequences, isn't it? Perhaps withFilter is better.

    – Abhijit Sarkar
    Dec 30 '18 at 13:24













  • Interesting ... I didn't know that. No idea why it does that, actually; looks like a bug. filter does the right thing. Weird. I'll update.

    – Dima
    Dec 30 '18 at 13:36











  • filter has the same behavior, see this post.

    – Abhijit Sarkar
    Dec 30 '18 at 13:43






  • They both create a new collection. But if the original collection is lazy, then the result of .filter is also lazy (while the result of .filterNot is for some reason strict).

    – Dima
    Dec 30 '18 at 13:45






  • If the sequence has more than one X, your implementation removes all of them. I don't believe this is what the question asks for.

    – Jack Leow
    Dec 30 '18 at 15:45



















import scala.annotation.tailrec

def prems(s: Seq[Int]): Map[Int, Seq[Int]] = {
  @tailrec
  def p(prev: Seq[Int], s: Seq[Int], res: Map[Int, Seq[Int]]): Map[Int, Seq[Int]] = s match {
    case x :: Nil => res + (x -> prev)
    case x :: xs  => p(x +: prev, xs, res + (x -> (prev ++ xs)))
  }

  p(Seq.empty[Int], s, Map.empty[Int, Seq[Int]])
}

prems(Seq(1, 2, 3, 4))
res0: Map[Int,Seq[Int]] = Map(1 -> List(2, 3, 4), 2 -> List(1, 3, 4), 3 -> List(2, 1, 4), 4 -> List(3, 2, 1))

(Note that the :: patterns only match List-backed sequences, and the match has no case for an empty input.)





  • Seems too complicated

    – Abhijit Sarkar
    Dec 31 '18 at 9:47











  • It's a simple recursion (essentially a reduce) that builds up the previous items and concatenates them with the tail (the rest of the sequence) as it traverses the sequence once.

    – Arnon Rotem-Gal-Oz
    Dec 31 '18 at 10:25



















I think you are looking for permutations. You can map the resulting lists into the structure you are looking for:

Seq(1,2,3).permutations.map(p => (p.head, p.tail)).toList
res49: List[(Int, Seq[Int])] = List((1,List(2, 3)), (1,List(3, 2)), (2,List(1, 3)), (2,List(3, 1)), (3,List(1, 2)), (3,List(2, 1)))

Note that the final toList call is only there to trigger the evaluation of the expressions; otherwise, the result is an iterator, as you asked for.

In order to get rid of the duplicate heads, toMap seems like the most straightforward approach:

Seq(1,2,3).permutations.map(p => (p.head, p.tail)).toMap
res50: scala.collection.immutable.Map[Int,Seq[Int]] = Map(1 -> List(3, 2), 2 -> List(3, 1), 3 -> List(2, 1))
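As the comments note, permutations scales factorially: it generates all n! orderings just to read off n distinct heads, so this approach only suits small inputs. A quick sanity check of the blow-up:

```scala
// n! permutations are generated, versus only n pairings actually needed.
val n = 5
val permCount = (1 to n).toList.permutations.size
println(permCount) // 120 (= 5!), for only 5 pairings
```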





  • This is going to create a bunch of intermediate collections. Perhaps that's why someone (not me) downvoted you.

    – Abhijit Sarkar
    Dec 30 '18 at 13:03








  • @AbhijitSarkar "bunch of" is a little bit vague. It will generate a superexponential number of irrelevant duplicates for a problem where even the completely materialized explicit solution is only of quadratic size (the factorial grows faster than any exponential function).

    – Andrey Tyukin
    Dec 30 '18 at 14:05











  • I agree that many redundant pairs are created, but I doubt that this is harmful memory-wise. The intermediate collections are only actually evaluated in the toMap step; Seq(1,2,3).permutations.map(p => (p.head, p.tail)) results in an Iterator. I am pretty sure toMap is not implemented in a way that stores all intermediate pairs in memory. Runtime complexity is another issue, obviously.

    – Carsten
    Dec 31 '18 at 8:48











    Your Answer






    StackExchange.ifUsing("editor", function () {
    StackExchange.using("externalEditor", function () {
    StackExchange.using("snippets", function () {
    StackExchange.snippets.init();
    });
    });
    }, "code-snippets");

    StackExchange.ready(function() {
    var channelOptions = {
    tags: "".split(" "),
    id: "1"
    };
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function() {
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled) {
    StackExchange.using("snippets", function() {
    createEditor();
    });
    }
    else {
    createEditor();
    }
    });

    function createEditor() {
    StackExchange.prepareEditor({
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: true,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: 10,
    bindNavPrevention: true,
    postfix: "",
    imageUploader: {
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    },
    onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    });


    }
    });














    draft saved

    draft discarded


















    StackExchange.ready(
    function () {
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f53977619%2fhow-to-pair-each-element-of-a-seq-with-the-rest%23new-answer', 'question_page');
    }
    );

    Post as a guest















    Required, but never shown

























    5 Answers
    5






    active

    oldest

    votes








    5 Answers
    5






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    3














    The following code has constant runtime (it does everything lazily), but accessing every element of the resulting collections has constant overhead (when accessing each element, an index shift must be computed every time):



    def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

    def facets[A](simplex: Vector[A]): Seq[(A, Seq[A])] = {
    val n = simplex.size
    (0 until n).view.map { i => (
    simplex(i),
    (0 until n - 1).view.map(j => simplex(faceMap(i)(j)))
    )}
    }


    Example:



    println("Example: facets of a 3-dimensional simplex")
    for ((i, v) <- facets((0 to 3).toVector)) {
    println(i + " -> " + v.mkString("[", ",", "]"))
    }


    Output:



    Example: facets of a 3-dimensional simplex
    0 -> [1,2,3]
    1 -> [0,2,3]
    2 -> [0,1,3]
    3 -> [0,1,2]


    This code expresses everything in terms of simplices, because "omitting one index" corresponds exactly to the face maps for a combinatorially described simplex. To further illustrate the idea, here is what the faceMap does:



    println("Example: how `faceMap(3)` shifts indices")
    for (i <- 0 to 5) {
    println(i + " -> " + faceMap(3)(i))
    }


    gives:



    Example: how `faceMap(3)` shifts indices
    0 -> 0
    1 -> 1
    2 -> 2
    3 -> 4
    4 -> 5
    5 -> 6


    The facets method uses the faceMaps to create a lazy view of the original collection that omits one element by shifting the indices by one starting from the index of the omitted element.






    share|improve this answer


























    • Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

      – Dima
      Dec 30 '18 at 13:42











    • @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

      – Andrey Tyukin
      Dec 30 '18 at 13:48











    • ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

      – Dima
      Dec 30 '18 at 13:51













    • @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

      – Andrey Tyukin
      Dec 30 '18 at 13:54


















    3














    The following code has constant runtime (it does everything lazily), but accessing every element of the resulting collections has constant overhead (when accessing each element, an index shift must be computed every time):



    def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

    def facets[A](simplex: Vector[A]): Seq[(A, Seq[A])] = {
    val n = simplex.size
    (0 until n).view.map { i => (
    simplex(i),
    (0 until n - 1).view.map(j => simplex(faceMap(i)(j)))
    )}
    }


    Example:



    println("Example: facets of a 3-dimensional simplex")
    for ((i, v) <- facets((0 to 3).toVector)) {
    println(i + " -> " + v.mkString("[", ",", "]"))
    }


    Output:



    Example: facets of a 3-dimensional simplex
    0 -> [1,2,3]
    1 -> [0,2,3]
    2 -> [0,1,3]
    3 -> [0,1,2]


    This code expresses everything in terms of simplices, because "omitting one index" corresponds exactly to the face maps for a combinatorially described simplex. To further illustrate the idea, here is what the faceMap does:



    println("Example: how `faceMap(3)` shifts indices")
    for (i <- 0 to 5) {
    println(i + " -> " + faceMap(3)(i))
    }


    gives:



    Example: how `faceMap(3)` shifts indices
    0 -> 0
    1 -> 1
    2 -> 2
    3 -> 4
    4 -> 5
    5 -> 6


    The facets method uses the faceMaps to create a lazy view of the original collection that omits one element by shifting the indices by one starting from the index of the omitted element.






    share|improve this answer


























    • Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

      – Dima
      Dec 30 '18 at 13:42











    • @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

      – Andrey Tyukin
      Dec 30 '18 at 13:48











    • ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

      – Dima
      Dec 30 '18 at 13:51













    • @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

      – Andrey Tyukin
      Dec 30 '18 at 13:54
















    3












    3








    3







    The following code has constant runtime (it does everything lazily), but accessing every element of the resulting collections has constant overhead (when accessing each element, an index shift must be computed every time):



    def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

    def facets[A](simplex: Vector[A]): Seq[(A, Seq[A])] = {
    val n = simplex.size
    (0 until n).view.map { i => (
    simplex(i),
    (0 until n - 1).view.map(j => simplex(faceMap(i)(j)))
    )}
    }


    Example:



    println("Example: facets of a 3-dimensional simplex")
    for ((i, v) <- facets((0 to 3).toVector)) {
    println(i + " -> " + v.mkString("[", ",", "]"))
    }


    Output:



    Example: facets of a 3-dimensional simplex
    0 -> [1,2,3]
    1 -> [0,2,3]
    2 -> [0,1,3]
    3 -> [0,1,2]


    This code expresses everything in terms of simplices, because "omitting one index" corresponds exactly to the face maps for a combinatorially described simplex. To further illustrate the idea, here is what the faceMap does:



    println("Example: how `faceMap(3)` shifts indices")
    for (i <- 0 to 5) {
    println(i + " -> " + faceMap(3)(i))
    }


    gives:



    Example: how `faceMap(3)` shifts indices
    0 -> 0
    1 -> 1
    2 -> 2
    3 -> 4
    4 -> 5
    5 -> 6


    The facets method uses the faceMaps to create a lazy view of the original collection that omits one element by shifting the indices by one starting from the index of the omitted element.






    share|improve this answer















    The following code has constant runtime (it does everything lazily), but accessing every element of the resulting collections has constant overhead (when accessing each element, an index shift must be computed every time):



    def faceMap(i: Int)(j: Int) = if (j < i) j else j + 1

    def facets[A](simplex: Vector[A]): Seq[(A, Seq[A])] = {
    val n = simplex.size
    (0 until n).view.map { i => (
    simplex(i),
    (0 until n - 1).view.map(j => simplex(faceMap(i)(j)))
    )}
    }


    Example:



    println("Example: facets of a 3-dimensional simplex")
    for ((i, v) <- facets((0 to 3).toVector)) {
    println(i + " -> " + v.mkString("[", ",", "]"))
    }


    Output:



    Example: facets of a 3-dimensional simplex
    0 -> [1,2,3]
    1 -> [0,2,3]
    2 -> [0,1,3]
    3 -> [0,1,2]


    This code expresses everything in terms of simplices, because "omitting one index" corresponds exactly to the face maps for a combinatorially described simplex. To further illustrate the idea, here is what the faceMap does:



    println("Example: how `faceMap(3)` shifts indices")
    for (i <- 0 to 5) {
    println(i + " -> " + faceMap(3)(i))
    }


    gives:



    Example: how `faceMap(3)` shifts indices
    0 -> 0
    1 -> 1
    2 -> 2
    3 -> 4
    4 -> 5
    5 -> 6


    The facets method uses the faceMaps to create a lazy view of the original collection that omits one element by shifting the indices by one starting from the index of the omitted element.







    share|improve this answer














    share|improve this answer



    share|improve this answer








    edited Dec 30 '18 at 18:07

























    answered Dec 30 '18 at 13:29









    Andrey TyukinAndrey Tyukin

    27.8k42349




    27.8k42349













    • Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

      – Dima
      Dec 30 '18 at 13:42











    • @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

      – Andrey Tyukin
      Dec 30 '18 at 13:48











    • ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

      – Dima
      Dec 30 '18 at 13:51













    • @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

      – Andrey Tyukin
      Dec 30 '18 at 13:54





















    • Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

      – Dima
      Dec 30 '18 at 13:42











    • @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

      – Andrey Tyukin
      Dec 30 '18 at 13:48











    • ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

      – Dima
      Dec 30 '18 at 13:51













    • @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

      – Andrey Tyukin
      Dec 30 '18 at 13:54



















    Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

    – Dima
    Dec 30 '18 at 13:42





    Clever. But in the end of the day, getting all the results materialized, is still quadratic, right? So, essentially, the same as what I have (perhaps, with .view added before .filter) ...

    – Dima
    Dec 30 '18 at 13:42













    @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

    – Andrey Tyukin
    Dec 30 '18 at 13:48





    @Dima If you force every view into an explicit linear data structure (e.g. array or list), then the resulting data structures will occupy quadratic space, so, if you force all the views, you will also have quadratic runtime. But maybe forcing every view is not necessary, maybe it's sufficient to leave the original vector with the data at the same position in memory, and then address the elements with the funny index-shifts described above. In this case, it only adds constant overhead for each element access, and uses almost no extra space.

    – Andrey Tyukin
    Dec 30 '18 at 13:48













    ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

    – Dima
    Dec 30 '18 at 13:51







    ... or if you iterate over each element of every view. It may still be linear (constant?) space, but quadratic time.

    – Dima
    Dec 30 '18 at 13:51















    @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

    – Andrey Tyukin
    Dec 30 '18 at 13:54







    @Dima yeah, right... Iterating over each element of the resulting views would also be O(n^2) time. The lazy views merely give you the opportunity to pay only for what you actually use - if you don't use it, it's not materialized.

    – Andrey Tyukin
    Dec 30 '18 at 13:54















    2














    If I understand what you want correctly, in terms of handling duplicate values (i.e., duplicate values are to be preserved), here's something that should work. Given the following input:



    import scala.util.Random

    val nums = Vector.fill(20)(Random.nextInt)


    This should get you what you need:



    for (i <- Iterator.from(0).take(nums.size)) yield {
    nums(i) -> (nums.take(i) ++ nums.drop(i + 1))
    }


    On the other hand, if you want to remove dups, I'd convert to Sets:



    val numsSet = nums.toSet
    for (num <- nums) yield {
    num -> (numsSet - num)
    }





answered Dec 30 '18 at 16:02 – Jack Leow























            1














seq.iterator.map { x => x -> seq.filter(_ != x) }

This is quadratic, but I don't think there is very much you can do about that, because at the end of the day, creating a collection is linear, and you are going to need N of them.
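A quick check (my addition) of the duplicate pitfall raised in the comments below: filtering by value removes every occurrence of x, not just the current one:

```scala
val seq = Seq(1, 1, 2)

val pairs = seq.iterator.map(x => x -> seq.filter(_ != x)).toList
// List((1,List(2)), (1,List(2)), (2,List(1, 1))) -- both 1s vanished from each "rest"
```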
































• filterNot is going to create a copy for even non-strict sequences, isn't it? Perhaps withFilter is better.
  – Abhijit Sarkar, Dec 30 '18 at 13:24

• Interesting ... I didn't know that. No idea why it does that, actually; looks like a bug. filter does the right thing. Weird. I'll update.
  – Dima, Dec 30 '18 at 13:36

• filter has the same behavior, see this post.
  – Abhijit Sarkar, Dec 30 '18 at 13:43

• They both create a new collection. But if the original collection is lazy, then the result of .filter is also lazy (while the result of .filterNot is for some reason strict).
  – Dima, Dec 30 '18 at 13:45

• If the sequence has more than one X, your implementation removes all of them. I don't believe this is what the question asks for.
  – Jack Leow, Dec 30 '18 at 15:45
















edited Dec 30 '18 at 13:36 · answered Dec 30 '18 at 13:20 – Dima













            0














import scala.annotation.tailrec

def prems(s: Seq[Int]): Map[Int, Seq[Int]] = {
  @tailrec
  def p(prev: Seq[Int], s: Seq[Int], res: Map[Int, Seq[Int]]): Map[Int, Seq[Int]] = s match {
    case Nil      => res // guard against an empty input, which would otherwise throw a MatchError
    case x :: Nil => res + (x -> prev)
    case x :: xs  => p(x +: prev, xs, res + (x -> (prev ++ xs)))
  }

  p(Seq.empty[Int], s, Map.empty[Int, Seq[Int]])
}

prems(Seq(1,2,3,4))
res0: Map[Int,Seq[Int]] = Map(1 -> List(2, 3, 4), 2 -> List(1, 3, 4), 3 -> List(2, 1, 4), 4 -> List(3, 2, 1))
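One caveat (my observation, not from the answer): because the result is a Map, duplicate input elements collapse into a single key, so at most one pairing per distinct value survives. The sketch below repeats the answer's definition (with a Nil guard for empty input) to show this:

```scala
import scala.annotation.tailrec

def prems(s: Seq[Int]): Map[Int, Seq[Int]] = {
  @tailrec
  def p(prev: Seq[Int], s: Seq[Int], res: Map[Int, Seq[Int]]): Map[Int, Seq[Int]] = s match {
    case Nil      => res
    case x :: Nil => res + (x -> prev)
    case x :: xs  => p(x +: prev, xs, res + (x -> (prev ++ xs)))
  }
  p(Seq.empty[Int], s, Map.empty[Int, Seq[Int]])
}

// The duplicated 1 contributes only one key; its second pairing is lost.
prems(Seq(1, 1, 2)) // Map(1 -> List(1, 2), 2 -> List(1, 1))
```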





























• Seems too complicated
  – Abhijit Sarkar, Dec 31 '18 at 9:47

• It's a simple recursion (essentially a reduce) that builds the previous items, which it always concatenates to the tail (rest of the sequence) as it traverses the sequence once.
  – Arnon Rotem-Gal-Oz, Dec 31 '18 at 10:25
















answered Dec 30 '18 at 21:22 – Arnon Rotem-Gal-Oz













            -2














            I think you are looking for permutations. You can map the resulting lists into the structure you are looking for:



            Seq(1,2,3).permutations.map(p => (p.head, p.tail)).toList
            res49: List[(Int, Seq[Int])] = List((1,List(2, 3)), (1,List(3, 2)), (2,List(1, 3)), (2,List(3, 1)), (3,List(1, 2)), (3,List(2, 1)))


            Note that the final toList call is only there to trigger the evaluation of the expressions; otherwise, the result is an iterator as you asked for.



            In order to get rid of the duplicate heads, toMap seems like the most straight-forward approach:



            Seq(1,2,3).permutations.map(p => (p.head, p.tail)).toMap
            res50: scala.collection.immutable.Map[Int,Seq[Int]] = Map(1 -> List(3, 2), 2 -> List(3, 1), 3 -> List(2, 1))
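A rotation-based alternative (my sketch, not from this answer) keeps the head/tail flavor but generates exactly n pairs instead of n! permutations:

```scala
val s = Seq(1, 2, 3)

// Rotating the sequence i places puts each element in head position exactly once.
val pairs = s.indices.map { i =>
  val rotated = s.drop(i) ++ s.take(i)
  rotated.head -> rotated.tail
}
// Vector((1,List(2, 3)), (2,List(3, 1)), (3,List(1, 2)))
```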





























• This is going to create a bunch of intermediate collections. Perhaps that's why someone (not me) downvoted you.
  – Abhijit Sarkar, Dec 30 '18 at 13:03

• @AbhijitSarkar "bunch of" is a little bit vague. It will generate a superexponential amount of irrelevant duplicates for a problem where even the completely materialized explicit solution is of quadratic size (factorial grows faster than the exponential function).
  – Andrey Tyukin, Dec 30 '18 at 14:05

• I agree that many redundant pairs are created, but I doubt that this is harmful memory-wise. The intermediate collections are only actually evaluated in the toMap step; Seq(1,2,3).permutations.map(p => (p.head, p.tail)) results in an Iterator. I am pretty sure toMap is not implemented in a way that stores all intermediate pairs in memory. Runtime complexity is another issue, obviously.
  – Carsten, Dec 31 '18 at 8:48
















answered Dec 30 '18 at 12:58 – Carsten












