SparkSQL: Search for events in a time window
I have a CSV file with events as per the table below.
+-------------------+-------+
|Created            |Name   |
+-------------------+-------+
|2018-09-30 21:00:08|Event A|
|2018-09-30 21:03:11|Event C|
|2018-09-30 21:04:17|Event 3|
|2018-09-30 21:05:27|Event Y| <<<
|2018-09-30 21:06:11|Event 5|
|2018-09-30 21:07:17|Event P|
|2018-09-30 21:08:25|Event X| <<<
|2018-09-30 21:09:26|Event B|
|2018-09-30 21:10:39|Event O|
+-------------------+-------+
I need to partition the events by timestamp into windows lasting 5 minutes, and search within each window for the occurrence of an event X. If that event occurs, I then need to search the same window for a Y event, in the span from the start of the window up to the X event found.
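For reference, since the question starts from a CSV file rather than an in-memory dataset, a minimal sketch of loading it into a DataFrame with the right column types might look like this (the file path, header option, and timestamp format are assumptions, not given in the question):

import org.apache.spark.sql.types._

// Explicit schema so Created is parsed as a timestamp rather than a string
val schema = StructType(Seq(
  StructField("Created", TimestampType),
  StructField("Name", StringType)
))

val events = spark.read.
  option("header", "true").
  option("timestampFormat", "yyyy-MM-dd HH:mm:ss").
  schema(schema).
  csv("events.csv")  // hypothetical path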
Tags: java, scala, apache-spark
edited Jan 3 at 21:17 by Leo C
asked Jan 3 at 17:05 by Marcelino Santos
1 Answer
Here's one approach that first creates the 5-minute time windows, collects an event list per time-window partition, and then applies a udf to mark the wanted events:
import org.apache.spark.sql.functions._
import org.apache.spark.sql.expressions.Window
import java.sql.Timestamp

// Needed for the $"..." column syntax and toDF (pre-imported in spark-shell;
// assumes an active SparkSession named `spark`)
import spark.implicits._

val df = Seq(
  (Timestamp.valueOf("2018-09-30 21:00:08"), "Event A"),
  (Timestamp.valueOf("2018-09-30 21:03:11"), "Event C"),
  (Timestamp.valueOf("2018-09-30 21:04:17"), "Event 3"),
  (Timestamp.valueOf("2018-09-30 21:05:27"), "Event Y"),
  (Timestamp.valueOf("2018-09-30 21:06:11"), "Event 5"),
  (Timestamp.valueOf("2018-09-30 21:07:17"), "Event P"),
  (Timestamp.valueOf("2018-09-30 21:08:25"), "Event X"),
  (Timestamp.valueOf("2018-09-30 21:09:26"), "Event B"),
  (Timestamp.valueOf("2018-09-30 21:10:39"), "Event O")
).toDF("Created", "Name")

// Partition rows by the 5-minute tumbling window they fall into
val winSpec = Window.partitionBy($"Win5m")

// Marks a row when its window contains both e1 and e2 with e1 occurring
// before e2, and the current row is one of the two events
def checkEvents(e1: String, e2: String) = udf(
  (currEvent: String, events: Seq[String]) =>
    events.contains(e1) && events.contains(e2) &&
    events.indexOf(e1) < events.indexOf(e2) &&
    (currEvent == e1 || currEvent == e2)
)

df.
  withColumn("Win5m", window($"Created", "5 minutes")).       // assign each event its 5-minute window
  withColumn("Events", collect_list($"Name").over(winSpec)).  // all event names within the window
  withColumn("marked", checkEvents("Event Y", "Event X")($"Name", $"Events")).
  where($"marked").                                           // filter before projecting away `marked`
  select($"Created", $"Name").
  show(false)
// +-------------------+-------+
// |Created |Name |
// +-------------------+-------+
// |2018-09-30 21:05:27|Event Y|
// |2018-09-30 21:08:25|Event X|
// +-------------------+-------+
Below is the same dataset, showing the intermediate columns (Win5m, Events, marked) that were excluded from the final result above:
// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
// |Created |Name |Win5m |Events |marked|
// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
// |2018-09-30 21:00:08|Event A|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3] |false |
// |2018-09-30 21:03:11|Event C|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3] |false |
// |2018-09-30 21:04:17|Event 3|[2018-09-30 21:00:00.0,2018-09-30 21:05:00.0]|[Event A, Event C, Event 3] |false |
// |2018-09-30 21:05:27|Event Y|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|true |
// |2018-09-30 21:06:11|Event 5|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:07:17|Event P|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:08:25|Event X|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|true |
// |2018-09-30 21:09:26|Event B|[2018-09-30 21:05:00.0,2018-09-30 21:10:00.0]|[Event Y, Event 5, Event P, Event X, Event B]|false |
// |2018-09-30 21:10:39|Event O|[2018-09-30 21:10:00.0,2018-09-30 21:15:00.0]|[Event O] |false |
// +-------------------+-------+---------------------------------------------+---------------------------------------------+------+
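One caveat worth noting, as an addition to the answer above: collect_list over a window spec with no explicit ordering does not guarantee that the names arrive in timestamp order, so the indexOf comparison may behave unpredictably on larger, shuffled data. A sketch of a window spec that pins the order, using the same Win5m column:

// Order each 5-minute partition by timestamp and span the whole partition,
// so the collected list is deterministically in event-time order
val orderedWinSpec = Window.
  partitionBy($"Win5m").
  orderBy($"Created").
  rowsBetween(Window.unboundedPreceding, Window.unboundedFollowing)

Substituting orderedWinSpec for winSpec in the collect_list call leaves the rest of the code unchanged.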
answered Jan 3 at 21:19 by Leo C