Page 1: Reactive server with netty

Highload reactive server with Netty

Page 2: Reactive server with netty

Dmitriy Dumanskiy
Blynk, CTO

Java blog: https://habrahabr.ru/users/doom369/topics
DOU: https://dou.ua/users/DOOM/articles/

Page 3: Reactive server with netty

Makers problem

+ = ?

Page 4: Reactive server with netty

Makers problem

● HTTP/S
● MQTT
● WebSockets
● Own binary protocol

Page 5: Reactive server with netty

Blynk

10,000 req/sec
3 VMs * 2 cores, $60

25% load
10k local installations

Page 6: Reactive server with netty

Why netty?

Cassandra, Apache Spark, Elasticsearch, Graylog, Neo4j, Vert.x, HornetQ, Infinispan, Finagle, Async-http-client, Firebase, Akka, Couchbase, Play framework, Redisson

Page 7: Reactive server with netty

Why netty?

~700k servers

Page 8: Reactive server with netty

Why netty?

● Less GC

Page 9: Reactive server with netty

Why netty?

● Less GC
● Optimized for Linux based OS

Page 10: Reactive server with netty

Why netty?

● Less GC
● Optimized for Linux based OS
● High performance buffers

Page 11: Reactive server with netty

Why netty?

● Less GC
● Optimized for Linux based OS
● High performance buffers
● Well defined threading model

Page 12: Reactive server with netty

Why netty?

● Less GC
● Optimized for Linux based OS
● High performance buffers
● Well defined threading model
● HTTP, HTTP/2, SPDY, SCTP, TCP, UDP, UDT, MQTT, etc

Page 13: Reactive server with netty

When to use?

● Performance is critical

Page 14: Reactive server with netty

When to use?

● Performance is critical
● Own protocol

Page 15: Reactive server with netty

When to use?

● Performance is critical
● Own protocol
● Full control over network (so_reuseport, tcp_cork, tcp_fastopen, tcp_nodelay, etc)

Page 16: Reactive server with netty

When to use?

● Performance is critical
● Own protocol
● Full control over network
● Game engines (agario, slither, minecraft)

Page 17: Reactive server with netty

When to use?

● Performance is critical
● Own protocol
● Full control over network
● Game engines
● <3 reactive

Page 18: Reactive server with netty

Non-Blocking

● Few threads
● No context switching
● Low memory consumption

Page 19: Reactive server with netty

Non-Blocking

One Selector on one Thread serves many Channels:

new Channel → read / write
new Channel → read / write
new Channel → read / write

Page 20: Reactive server with netty

java.nio.channels.Selector

Selector selector = Selector.open();
channel.configureBlocking(false);
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select();
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        if (key.isReadable()) {
            ...
        }
    }
}

Page 21: Reactive server with netty

Selector selector = Selector.open(); // creating selector
channel.configureBlocking(false);
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select();
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        if (key.isReadable()) {
            ...
        }
    }
}

Page 22: Reactive server with netty

Selector selector = Selector.open();
channel.configureBlocking(false);
//registering channel with selector, listening for READ events only
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select();
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        if (key.isReadable()) {
            ...
        }
    }
}

Page 23: Reactive server with netty

Selector selector = Selector.open();
channel.configureBlocking(false);
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select(); //blocking until we get some READ events
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        if (key.isReadable()) {
            ...
        }
    }
}

Page 24: Reactive server with netty

Selector selector = Selector.open();
channel.configureBlocking(false);
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select(); //now we have channels with some data
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        if (key.isReadable()) {
            ...
        }
    }
}

Page 25: Reactive server with netty

Selector selector = Selector.open();
channel.configureBlocking(false);
SelectionKey key = channel.register(selector, SelectionKey.OP_READ);
while (true) {
    selector.select();
    Set<SelectionKey> selectedKeys = selector.selectedKeys();
    Iterator<SelectionKey> keyIterator = selectedKeys.iterator();
    while (keyIterator.hasNext()) {
        key = keyIterator.next();
        //do something with data
        if (key.isReadable()) {
            key.channel()
        }
    }
}

Page 26: Reactive server with netty

Flow

Selector → SelectionKey → Channel → ChannelPipeline

Page 27: Reactive server with netty

Flow

ChannelPipeline: fireEvent() → invokeChannelRead() → executor.execute() → invokeChannelRead()
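A minimal sketch, not from the slides, of how that flow looks from user code: a handler processes an inbound event and forwards it, and Netty invokes channelRead() on the next handler through that handler's executor.

public class FirstHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        //handle msg, then fire the same event further down the pipeline;
        //the next handler's channelRead() runs on that handler's executor
        ctx.fireChannelRead(msg);
    }
}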

Page 28: Reactive server with netty

Minimal setup

ServerBootstrap b = new ServerBootstrap();
b.group(
    new NioEventLoopGroup(1),
    new NioEventLoopGroup())
 .channel(NioServerSocketChannel.class)
 .childHandler(new ChannelInitializer() {...});

ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();

Page 29: Reactive server with netty

Minimal setup

ServerBootstrap b = new ServerBootstrap();
b.group(
    new NioEventLoopGroup(1), //IO thread
    new NioEventLoopGroup())
 .channel(NioServerSocketChannel.class)
 .childHandler(new ChannelInitializer() {...});

ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();

Page 30: Reactive server with netty

Minimal setup

ServerBootstrap b = new ServerBootstrap();
b.group(
    new NioEventLoopGroup(1),
    new NioEventLoopGroup()) //worker threads
 .channel(NioServerSocketChannel.class)
 .childHandler(new ChannelInitializer() {...});

ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();

Page 31: Reactive server with netty

Minimal setup

ServerBootstrap b = new ServerBootstrap();
b.group(
    new NioEventLoopGroup(1),
    new NioEventLoopGroup()) //worker threads
 .channel(NioServerSocketChannel.class)
 .childHandler(new ChannelInitializer() {...}); //pipeline init

ChannelFuture f = b.bind(8080).sync();
f.channel().closeFuture().sync();

Page 32: Reactive server with netty

Minimal setup

new ChannelInitializer<SocketChannel>() {
    @Override
    protected void initChannel(SocketChannel ch) {
        final ChannelPipeline pipeline = ch.pipeline();
        pipeline.addLast(new MyLogicHere());
    }
};

Page 33: Reactive server with netty

ChannelPipeline

Page 34: Reactive server with netty

ChannelPipeline

● Inbound event -> ChannelInboundHandler (CIHA)

● Outbound event -> ChannelOutboundHandler (COHA)
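A minimal sketch of both directions, using the stock adapter classes; the handler names are placeholders, not from the slides.

public class MyInbound extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        //inbound event: data read from the socket
        ctx.fireChannelRead(msg);
    }
}

public class MyOutbound extends ChannelOutboundHandlerAdapter {
    @Override
    public void write(ChannelHandlerContext ctx, Object msg, ChannelPromise promise) {
        //outbound event: data written towards the socket
        ctx.write(msg, promise);
    }
}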

Page 35: Reactive server with netty

ChannelInboundHandler

public interface ChannelInboundHandler extends ChannelHandler {
    ...
    void channelRegistered(ChannelHandlerContext ctx);
    void channelActive(ChannelHandlerContext ctx);
    void channelRead(ChannelHandlerContext ctx, Object msg);
    void userEventTriggered(ChannelHandlerContext ctx, Object evt);
    void channelWritabilityChanged(ChannelHandlerContext ctx);
    ...
}

Page 36: Reactive server with netty

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(new MyProtocolDecoder())
        .addLast(new MyProtocolEncoder())
        .addLast(new MyLogicHandler());
}

Own tcp/ip server

Page 37: Reactive server with netty

Own tcp/ip server

Inbound:  Channel → MyProtocolDecoder → MyLogicHandler
Outbound: MyLogicHandler → MyProtocolEncoder → Channel
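A minimal sketch, not from the slides, of what a MyProtocolDecoder might look like on top of ByteToMessageDecoder, assuming a simple length-prefixed binary frame (the framing is an assumption for illustration only):

import java.util.List;

public class MyProtocolDecoder extends ByteToMessageDecoder {
    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) {
        //wait until the 2-byte length prefix and the whole body have arrived
        if (in.readableBytes() < 2) {
            return;
        }
        in.markReaderIndex();
        int length = in.readUnsignedShort();
        if (in.readableBytes() < length) {
            in.resetReaderIndex();
            return;
        }
        //pass the decoded frame to the next handler (MyLogicHandler)
        out.add(in.readRetainedSlice(length));
    }
}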

Page 38: Reactive server with netty

Handlers

HttpServerCodec, ChannelTrafficShapingHandler, IdleStateHandler, ReadTimeoutHandler, ChunkedWriteHandler, SslHandler, LoggingHandler, RuleBasedIpFilter, StringDecoder, JsonObjectDecoder, Base64Decoder, JZlibDecoder, Lz4FrameDecoder, ProtobufDecoder, ObjectDecoder, XmlFrameDecoder

Page 39: Reactive server with netty

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(new HttpRequestDecoder())
        .addLast(new HttpResponseEncoder())
        .addLast(new MyHttpHandler());
}

Http Server

Page 40: Reactive server with netty

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(new HttpServerCodec())
        .addLast(new MyHttpHandler());
}

OR

Page 41: Reactive server with netty

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(sslCtx.newHandler(ch.alloc()))
        .addLast(new HttpServerCodec())
        .addLast(new MyHttpHandler());
}

Https Server

Page 42: Reactive server with netty

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(sslCtx.newHandler(ch.alloc()))
        .addLast(new HttpServerCodec())
        .addLast(new HttpContentCompressor())
        .addLast(new MyHttpHandler());
}

Https Server + content gzip
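The sslCtx used above is assumed to be built elsewhere; a minimal sketch with a throwaway self-signed certificate (io.netty.handler.ssl.util.SelfSignedCertificate, for local testing only):

SelfSignedCertificate ssc = new SelfSignedCertificate();
SslContext sslCtx = SslContextBuilder
        .forServer(ssc.certificate(), ssc.privateKey())
        .build();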

Page 43: Reactive server with netty

@Override
public void channelRead(Context ctx, Object msg) {
    //pass flow processing to next handler
    super.channelRead(ctx, msg);
}

Pipeline flow

Page 44: Reactive server with netty

@Override
public void channelRead(Context ctx, Object msg) {
    //stop request processing
    return;
}

Pipeline flow

Page 45: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof LoginMessage) {
        LoginMessage login = (LoginMessage) msg;
        if (isSuperAdmin(login)) {
            ctx.pipeline().remove(this);
            ctx.pipeline().addLast(new SuperAdminHandler());
        }
    }
}

Pipeline flow on the fly

Page 46: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    ChannelFuture cf = ctx.writeAndFlush(response);
    cf.addListener(new ChannelFutureListener() {
        @Override
        public void operationComplete(ChannelFuture future) {
            future.channel().close();
        }
    });
}

Pipeline futures

Page 47: Reactive server with netty

@Override
public void channelRead(Context ctx, Object msg) {
    ChannelFuture cf = ctx.writeAndFlush(response);
    //close connection after message was delivered
    cf.addListener(ChannelFutureListener.CLOSE);
}

Pipeline futures

Page 48: Reactive server with netty

@Override
public void channelRead(Context ctx, Object msg) {
    ...
    ChannelFuture cf = ctx.writeAndFlush(response);
    cf.addListener(future -> { ... });
}

Pipeline futures

Page 49: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    ChannelFuture cf = session.sendMsgToFriend(msg);
    cf.addListener(new ChannelFutureListener() {
        @Override
        public void operationComplete(ChannelFuture future) {
            future.channel().writeAndFlush("Delivered!");
        }
    });
}

Pipeline futures

Page 50: Reactive server with netty

Pipeline blocking IO

Non blocking pools: IO Event Loops, Worker Event Loops
Blocking pools: DB, Mailing, File system

Page 51: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof HttpRequest) {
        HttpRequest req = (HttpRequest) msg;
        if (req.method() == GET && req.uri().equals("/users")) {
            Users users = dbManager.userDao.getAllUsers();
            ctx.writeAndFlush(new Response(users));
        }
    }
}

Pipeline blocking IO

Page 52: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof HttpRequest) {
        HttpRequest req = (HttpRequest) msg;
        if (req.method() == POST && req.uri().equals("/email")) {
            mailManager.sendEmail();
        }
    }
}

Pipeline blocking IO

Page 53: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof HttpRequest) {
        HttpRequest req = (HttpRequest) msg;
        if (req.method() == GET && req.uri().equals("/property")) {
            String property = fileManager.readProperty();
            ctx.writeAndFlush(new Response(property));
        }
    }
}

Pipeline blocking IO

Page 54: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    ...
    blockingThreadPool.execute(() -> {
        Users users = dbManager.userDao.getAllUsers();
        ctx.writeAndFlush(new Response(users));
    });
}

Pipeline blocking IO

Page 55: Reactive server with netty

Pipeline blocking IO

● Thread.sleep()

Page 56: Reactive server with netty

Pipeline blocking IO

● Thread.sleep()
● java.util.concurrent.*

Page 57: Reactive server with netty

Pipeline blocking IO

● Thread.sleep()
● java.util.concurrent.*
● Intensive operations

Page 58: Reactive server with netty

Pipeline blocking IO

● Thread.sleep()
● java.util.concurrent.*
● Intensive operations
● Any blocking IO (files, db, smtp, etc)

Page 59: Reactive server with netty

Pipeline blocking IO

● Thread.sleep()
● java.util.concurrent.*
● Intensive operations
● Any blocking IO (files, db, smtp, etc)
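For work like the items above, besides offloading to your own blocking pool (page 54), a handler can be bound to a separate EventExecutorGroup when it is added to the pipeline; a minimal sketch, with BlockingIoHandler as a hypothetical slow handler:

EventExecutorGroup blockingGroup = new DefaultEventExecutorGroup(16);

void initChannel(SocketChannel ch) {
    ch.pipeline()
        .addLast(new HttpServerCodec())
        //this handler's callbacks run on blockingGroup, not on the IO event loop
        .addLast(blockingGroup, new BlockingIoHandler());
}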

Page 60: Reactive server with netty

@Override
public void channelInactive(Context ctx) {
    HardwareState state = getState(ctx.channel());
    if (state != null) {
        ctx.executor().schedule(
            new DelayedPush(state), state.period, SECONDS
        );
    }
}

EventLoop is Executor!

Page 61: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof FullHttpRequest) {
        FullHttpRequest request = (FullHttpRequest) msg;
        User user = sessionDao.checkCookie(request);
        ...
    }
    super.channelRead(ctx, msg);
}

Request state

Page 62: Reactive server with netty

private static AttributeKey<User> USER_KEY = AttributeKey.valueOf("user");

ctx.channel().attr(USER_KEY).set(user);

Request state

Page 63: Reactive server with netty

public void channelRead(Context ctx, Object msg) {
    if (msg instanceof FullHttpRequest) {
        FullHttpRequest request = (FullHttpRequest) msg;
        User user = sessionDao.checkCookie(request);
        ctx.channel().attr(USER_KEY).set(user);
    }
    super.channelRead(ctx, msg);
}

Request state
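Any later handler on the same channel can then read the state back; a minimal sketch:

//in a later handler on the same channel
User user = ctx.channel().attr(USER_KEY).get(); //null if never set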

Page 64: Reactive server with netty

if (isSsl(in)) {
    enableSsl(ctx);
} else {
    if (isGzip()) {
        enableGzip(ctx);
    } else if (isHttp(in)) {
        switchToHttp(ctx);
    }
}

Port unification
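A minimal sketch, not from the slides, of how such a detector can be built on ByteToMessageDecoder, modelled on Netty's port-unification example; SslHandler.isEncrypted() is the real helper, MyHttpHandler is the slide's own placeholder, and only the SSL-vs-HTTP branch is shown:

import java.util.List;

public class PortUnificationHandler extends ByteToMessageDecoder {

    private final SslContext sslCtx;

    public PortUnificationHandler(SslContext sslCtx) {
        this.sslCtx = sslCtx;
    }

    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) {
        //SslHandler.isEncrypted() needs at least 5 bytes to decide
        if (in.readableBytes() < 5) {
            return;
        }
        ChannelPipeline p = ctx.pipeline();
        if (SslHandler.isEncrypted(in)) {
            //plug in TLS, then detect the plaintext protocol again after decryption
            p.addLast(sslCtx.newHandler(ctx.alloc()));
            p.addLast(new PortUnificationHandler(sslCtx));
        } else {
            //plain connection: assume HTTP
            p.addLast(new HttpServerCodec());
            p.addLast(new MyHttpHandler());
        }
        p.remove(this);
    }
}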

Page 65: Reactive server with netty
Page 66: Reactive server with netty

Back pressure

if (channel.isWritable()) {
    channel.writeAndFlush(msg);
}
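A minimal sketch of the other half: reacting when the channel becomes writable again; flushQueued() is a hypothetical method that drains whatever was buffered while the channel was not writable.

@Override
public void channelWritabilityChanged(ChannelHandlerContext ctx) {
    if (ctx.channel().isWritable()) {
        //outbound buffer dropped below the low water mark: resume writing
        flushQueued(ctx.channel());
    }
    ctx.fireChannelWritabilityChanged();
}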

Page 67: Reactive server with netty

Back pressure

BackPressureHandler

coming soon...

Page 68: Reactive server with netty

Performance

Page 69: Reactive server with netty

Performance

https://www.techempower.com/benchmarks/#section=data-r13&hw=ph&test=plaintext

Page 70: Reactive server with netty

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-transport-native-epoll</artifactId>
    <version>${netty.version}</version>
    <classifier>${os}</classifier>
</dependency>

Native transport

Page 71: Reactive server with netty

Bootstrap b = new Bootstrap();
b.group(new EpollEventLoopGroup());
b.channel(EpollSocketChannel.class);

Native transport

Page 72: Reactive server with netty

SslContextBuilder.forServer(...).sslProvider(SslProvider.OPENSSL);

JNI OpenSslEngine

Page 73: Reactive server with netty

<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-tcnative-boringssl-static</artifactId>
    <version>${netty.boring.ssl.version}</version>
    <classifier>${os}</classifier>
</dependency>

JNI OpenSslEngine

Page 74: Reactive server with netty

● netty-tcnative
● netty-tcnative-libressl
● netty-tcnative-boringssl-static

JNI OpenSslEngine

Page 75: Reactive server with netty

Own ByteBuf

Page 76: Reactive server with netty

Own ByteBuf

● Reference counted
● Pooling by default
● Direct memory by default
● LeakDetector by default
● Reduced branches, range-checks

Page 77: Reactive server with netty

Own ByteBuf

● ByteBufAllocator.buffer(size);
● ctx.alloc().buffer(size);
● channel.alloc().buffer(size);
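Because these buffers are reference counted and pooled, the side that allocates (or retains) them must release them; a minimal sketch, with process() as a hypothetical consumer that does not take ownership of the buffer:

ByteBuf buf = ctx.alloc().buffer(16); //pooled, direct by default
try {
    buf.writeLong(42L);
    process(buf);
} finally {
    buf.release(); //returns the buffer to the pool
}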

Page 78: Reactive server with netty

Less system calls

for (Message msg : messages) {
    ctx.writeAndFlush(msg);
}

Page 79: Reactive server with netty

Less system calls

for (Message msg : messages) {
    ctx.write(msg);
}
ctx.flush();

Page 80: Reactive server with netty

Thread Model

ChannelFuture inCf = ctx.deregister();
inCf.addListener(new ChannelFutureListener() {
    @Override
    public void operationComplete(ChannelFuture cf) {
        targetLoop.register(cf.channel())
                  .addListener(completeHandler);
    }
});

Page 81: Reactive server with netty

Reusing Event Loop

new ServerBootstrap().group(
    new EpollEventLoopGroup(1),
    new EpollEventLoopGroup()
).bind(80);

Page 82: Reactive server with netty

Reusing Event Loop

EventLoopGroup boss = new EpollEventLoopGroup(1);
EventLoopGroup workers = new EpollEventLoopGroup();

new ServerBootstrap().group(
    boss,
    workers
).bind(80);

new ServerBootstrap().group(
    boss,
    workers
).bind(443);

Page 83: Reactive server with netty

Use direct buffers

ctx.writeAndFlush(
    new ResponseMessage(messageId, OK)
);

Page 84: Reactive server with netty

Use direct buffers

ByteBuf buf = ctx.alloc().buffer(3); //pool
buf.writeByte(messageId);
buf.writeShort(OK);
ctx.writeAndFlush(buf);

Page 85: Reactive server with netty

Less allocations

ByteBuf msg = makeResponse(...);
msg.retain(targets.size() - 1);

for (Channel ch : targets) {
    ch.writeAndFlush(msg);
}

Page 86: Reactive server with netty

Void promise

ctx.writeAndFlush(
    response
);

Page 87: Reactive server with netty

Void promise

ctx.writeAndFlush(
    response, ctx.voidPromise()
);

Page 88: Reactive server with netty

Reuse handlers

@Sharable
public class StringDecoder extends MessageToMessageDecoder<ByteBuf> {
    ...
}
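A @Sharable handler keeps no per-connection state, so a single instance can be added to every pipeline; a minimal sketch:

private static final StringDecoder STRING_DECODER = new StringDecoder();

void initChannel(SocketChannel ch) {
    //same instance reused across all channels
    ch.pipeline().addLast(STRING_DECODER);
}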

Page 89: Reactive server with netty

Prefer context

ctx.channel().writeAndFlush();

Page 90: Reactive server with netty

Prefer context

ctx.channel().writeAndFlush(); //starts from the tail of the pipeline

ctx.writeAndFlush(); //starts from the current handler, shorter path

Page 91: Reactive server with netty

Simpler - faster

ChannelInboundHandlerAdapter

does nothing, but fast

Page 92: Reactive server with netty

Simpler - faster

ByteToMessageDecoder

does some work, but slower

Page 93: Reactive server with netty

Simpler - faster

ReplayingDecoder

does the job for you, but slowest
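A minimal sketch of the trade-off, modelled on the example in Netty's ReplayingDecoder javadoc: no readableBytes() checks are needed, because the decoder replays decode() once more bytes arrive.

import java.util.List;

public class IntHeaderFrameDecoder extends ReplayingDecoder<Void> {
    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) {
        //no "enough bytes?" check: ReplayingDecoder retries when data is missing
        out.add(in.readBytes(in.readInt()));
    }
}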

Page 94: Reactive server with netty

Turn off leak detection

ResourceLeakDetector.setLevel(
    ResourceLeakDetector.Level.DISABLED);

Page 95: Reactive server with netty

What else?

● AsciiString
● FastThreadLocal
● Unsafe
● Optimized Encoders
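As one example, FastThreadLocal (io.netty.util.concurrent) is a ThreadLocal replacement that is faster when accessed from Netty's FastThreadLocalThread; a minimal sketch, with the date format chosen only for illustration:

private static final FastThreadLocal<SimpleDateFormat> DATE_FORMAT =
        new FastThreadLocal<SimpleDateFormat>() {
            @Override
            protected SimpleDateFormat initialValue() {
                //each thread gets its own (non-thread-safe) SimpleDateFormat
                return new SimpleDateFormat("yyyy-MM-dd");
            }
        };

String formatted = DATE_FORMAT.get().format(new java.util.Date());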

Page 96: Reactive server with netty

● Really fast
● Low GC load
● Flexible
● Rapidly evolving
● Cool support

Summary

Page 97: Reactive server with netty

● Hard
● Memory leaks
● Still has issues

Summary

Page 98: Reactive server with netty

https://github.com/blynkkk/blynk-server

