Spring WebFlux with MongoDB - throttling SSE clients












I am working on a simple chat service running on Spring Boot 2.1.1 with WebFlux, Reactor 3.2.3, MongoDB 3.8.2 and Netty 4.1.31.



Each chat room has two collections: a message archive and a capped collection with current events (e.g. new-message events, user-typing indicators, etc.). The capped collection holds 100 elements, and I am using the tail() method of ReactiveMongoTemplate to retrieve the latest events.



The service exposes two kinds of endpoints for retrieving the recent events: an SSE stream and a polling endpoint. I have done some stress testing with 2000 concurrent users which, apart from listening to the chat, were spamming tons of events.



The observations are:

  • Polling every 2 seconds brings a bit of stress to the service (~40% CPU usage during the test) and almost no stress to MongoDB (~4%).

  • Listening via SSE maxes out MongoDB (~90%) and also stresses the service (which tries to use the rest of the available resources), but Mongo is struggling in particular, and overall the service becomes almost unresponsive.


The observation seems obvious: when I connected via SSE during the test, it updated me almost instantly whenever a new event arrived - basically, SSE was hundreds of times more responsive than polling every 2 seconds.



The question is:



Given that the client is ultimately the subscriber (or at least I think it is, given my limited knowledge), can I somehow throttle the rate at which ReactiveMongoTemplate publishes messages? Or somehow decrease the demand for new events without having to do that client-side?



I have been trying my luck with Flux buffering and caching; however, it caused even more stress...
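
For context, what I tried looked roughly like this (a simplified sketch, not the exact code - the 500 ms window is an arbitrary example value):

// buffer(Duration) collects everything emitted within each time window into a List,
// so subscribers receive batches instead of individual events (needs java.time.Duration)
Flux<List<ChatEvent>> bufferedEvents = reactiveMongoTemplate
        .tail(chatEventsQuery, ChatEvent.class, chatId)
        .buffer(Duration.ofMillis(500));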



Code:



// ChatRepository.java

private static final Query chatEventsQuery = new Query();

public Flux<ChatEvent> getChatEventsStream(String chatId) {
    return reactiveMongoTemplate.tail(
            chatEventsQuery,
            ChatEvent.class,
            chatId
    );
}





// ChatHandler.java

public Mono<ServerResponse> getChatStream(ServerRequest request) {

    String chatId = request.pathVariable(CHAT_ID_PATH_VARIABLE);
    String username = getUsername(request);

    Flux<ServerSentEvent> chatEventsStream = chatRepository
            .getChatEventsStream(chatId)
            .map(addUserSpecificPropsToChatEvent(username))
            .map(event -> ServerSentEvent.<ChatEvent>builder()
                    .event(event.getType().getEventName())
                    .data(event)
                    .build());

    log.debug("\nExposing chat stream\nchat: {}\nuser: {}", chatId, username);

    return ServerResponse.ok().body(
            chatEventsStream,
            ServerSentEvent.class
    );
}





// ChatRouter.java

RouterFunction<ServerResponse> routes(ChatHandler handler) {
    return route(GET("/api/chat/{chatId}/stream"), handler::getChatStream);
}









Tags: spring-webflux, project-reactor, spring-mongodb, reactor-netty






asked Jan 3 at 8:10 by Jakub Malec (edited Jan 3 at 14:01 by akarnokd)
























1 Answer






The answer is:
You do it by using the Flux.buffer method. The flux will then send the events to the subscribers in batches at a defined rate.
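
To illustrate the mechanism in isolation (a standalone sketch, not code from the service - the intervals are arbitrary example values):

// buffer(Duration) turns a Flux<T> into a Flux<List<T>>: everything emitted within
// each time window is delivered downstream as a single batch
Flux<List<Long>> batches = Flux.interval(Duration.ofMillis(10)) // ~100 events per second
        .buffer(Duration.ofMillis(500));                        // ~2 batches per second

batches.subscribe(batch -> System.out.println("received " + batch.size() + " events"));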



The code I posted had two major issues:


1. Given that multiple users usually listen to one chat, I refactored the ChatRepository to take advantage of "hot", replayable Fluxes (now there is one stream per chat instead of one stream per user), which I store in a Caffeine cache. Additionally, I buffer them over short time intervals to avoid heavy resource usage when pushing events to clients on busy chats.

2. The new Query() I used in the ChatRepository was redundant. I looked at ReactiveMongoTemplate's code, and if a non-null query is provided, the logic is a bit more complex. It's better to pass null to ReactiveMongoTemplate's tail() method instead.



Code post-refactoring:



// ChatRepository.java

public Flux<List<ChatEvent>> getChatEventsStream(String chatId) {
    return Optional.ofNullable(chatStreamsCache.getIfPresent(chatId))
            .orElseGet(newCachedChatEventsStream(chatId))
            .autoConnect();
}

private Supplier<ConnectableFlux<List<ChatEvent>>> newCachedChatEventsStream(String chatId) {
    return () -> {
        ConnectableFlux<List<ChatEvent>> chatEventsStream = reactiveMongoTemplate
                .tail(null, ChatEvent.class, chatId)
                .buffer(Duration.ofMillis(chatEventsBufferInterval))
                .replay(chatEventsReplayCount);

        chatStreamsCache.put(chatId, chatEventsStream);

        return chatEventsStream;
    };
}
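
The cache and the two tuning values used above are declared along these lines (simplified sketch; the Caffeine configuration and the concrete numbers are just example values, not the exact ones from my code):

// com.github.benmanes.caffeine.cache.Cache / Caffeine; names match the snippet above
private final Cache<String, ConnectableFlux<List<ChatEvent>>> chatStreamsCache =
        Caffeine.newBuilder()
                .expireAfterAccess(Duration.ofMinutes(30)) // example eviction policy
                .build();

private final long chatEventsBufferInterval = 500; // buffer window in ms - example value
private final int chatEventsReplayCount = 1;       // batches replayed to late subscribers - example value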





// ChatHandler.java

public Mono<ServerResponse> getChatStream(ServerRequest request) {

    String chatId = request.pathVariable(CHAT_ID_PATH_VARIABLE);
    String username = getUsername(request);

    Flux<ServerSentEvent> chatEventsStream = chatRepository
            .getChatEventsStream(chatId)
            .map(addUserSpecificPropsToChatEvents(username))
            .map(event -> ServerSentEvent.<List<ChatEvent>>builder()
                    .event(CHAT_SSE_NAME)
                    .data(event)
                    .build());

    log.debug("\nExposing chat stream\nchat: {}\nuser: {}", chatId, username);

    return ServerResponse.ok().body(
            chatEventsStream,
            ServerSentEvent.class
    );
}





After applying these changes, the service performs well even with 3000 active users (the JVM uses ~50% of the CPU, Mongo ~7%, mostly due to lots of inserts - the streams are not that noticeable now).






answered Jan 3 at 23:45 by Jakub Malec































