
The impact of build size on Cloudflare Workers performance

$ pnpm create cloudflare@latest sized-worker

to create the project.

The runtime is the following default code (TypeScript):

export interface Env {
}

export default {
	async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
		return new Response('Hello World!');
	},
};
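Start the local dev server for the benchmark below (assuming the dev script that create-cloudflare generates; wrangler dev serves on http://localhost:8787 by default):

$ pnpm run dev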
$ brew install k6
bench.mjs
import { check } from "k6"
import http from "k6/http"

export const options = {
	iterations: 1000,
	thresholds: {
		http_req_failed: ["rate<0.01"],
		http_req_duration: ["p(90)<2000"]
	}
}

export default function () {
	const res = http.get(
		"http://localhost:8787"
	)
	check(res, {
		'is_status_200': (r) => r.status === 200
	})
}
$ k6 run bench.mjs

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  ()  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 1000 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 1000      ✗ 0   
     data_received..................: 91 kB   8.3 kB/s
     data_sent......................: 80 kB   7.3 kB/s
     http_req_blocked...............: avg=2.76µs  min=0s       med=1µs    max=1.42ms  p(90)=2µs     p(95)=2µs    
     http_req_connecting............: avg=275ns   min=0s       med=0s     max=275µs   p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=10.87ms min=274µs    med=8.28ms max=32.94ms p(90)=26.3ms  p(95)=28.94ms
       { expected_response:true }...: avg=10.87ms min=274µs    med=8.28ms max=32.94ms p(90)=26.3ms  p(95)=28.94ms
   ✓ http_req_failed................: 0.00%   ✓ 0         ✗ 1000
     http_req_receiving.............: avg=25.87µs min=9µs      med=24µs   max=77µs    p(90)=38µs    p(95)=42µs   
     http_req_sending...............: avg=6.39µs  min=2µs      med=6µs    max=44µs    p(90)=10µs    p(95)=11µs   
     http_req_tls_handshaking.......: avg=0s      min=0s       med=0s     max=0s      p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=10.84ms min=250µs    med=8.25ms max=32.89ms p(90)=26.27ms p(95)=28.9ms 
     http_reqs......................: 1000    91.461728/s
     iteration_duration.............: avg=10.92ms min=318.37µs med=8.33ms max=33ms    p(90)=26.36ms p(95)=29.01ms
     iterations.....................: 1000    91.461728/s
     vus............................: 1       min=1       max=1 
     vus_max........................: 1       min=1       max=1

Make a script that is just big (import the TypeScript compiler so the bundle balloons):

import * as ts from "typescript";

export interface Env {
}

export default {
	async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
		const num = ts.SyntaxKind.Identifier;
		return new Response(`${num}`);
	},
};
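Importing typescript drags the whole compiler into the bundle. To see how big the bundle gets without actually uploading anything, a dry-run build should report the size (using wrangler's --dry-run flag here is my assumption of the usual way to check this):

$ pnpm wrangler deploy --dry-run   # builds and reports "Total Upload: ... / gzip: ..." without deploying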

Benchmark:

$ k6 run bench.mjs

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  ()  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 1000 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 1000      ✗ 0   
     data_received..................: 80 kB   9.6 kB/s
     data_sent......................: 80 kB   9.6 kB/s
     http_req_blocked...............: avg=7.8µs   min=0s     med=1µs    max=6.33ms  p(90)=2µs     p(95)=2µs    
     http_req_connecting............: avg=517ns   min=0s     med=0s     max=517µs   p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=8.24ms  min=2.36ms med=8.08ms max=69.13ms p(90)=13.27ms p(95)=14.62ms
       { expected_response:true }...: avg=8.24ms  min=2.36ms med=8.08ms max=69.13ms p(90)=13.27ms p(95)=14.62ms
   ✓ http_req_failed................: 0.00%   ✓ 0         ✗ 1000
     http_req_receiving.............: avg=28.81µs min=12µs   med=27µs   max=84µs    p(90)=38µs    p(95)=42µs   
     http_req_sending...............: avg=7.24µs  min=3µs    med=6µs    max=58µs    p(90)=11µs    p(95)=12µs   
     http_req_tls_handshaking.......: avg=0s      min=0s     med=0s     max=0s      p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=8.2ms   min=2.34ms med=8.04ms max=69.09ms p(90)=13.23ms p(95)=14.57ms
     http_reqs......................: 1000    120.386271/s
     iteration_duration.............: avg=8.3ms   min=2.39ms med=8.14ms max=69.21ms p(90)=13.35ms p(95)=14.72ms
     iterations.....................: 1000    120.386271/s
     vus............................: 1       min=1        max=1 
     vus_max........................: 1       min=1        max=1 


running (00m08.3s), 0/1 VUs, 1000 complete and 0 interrupted iterations
default ✓ [=================================] 1 VUs  00m08.3s/10m0s  1000/1000 shared iters

Align the conditions (have the vanilla worker return the same kind of response body):

export interface Env {
}

export default {
	async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
		// const num = ts.SyntaxKind.Identifier;
		const num = 80;
		return new Response(`${num}`);
	},
};
$ k6 run bench.mjs

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  ()  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 1000 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 1000      ✗ 0   
     data_received..................: 80 kB   65 kB/s
     data_sent......................: 80 kB   65 kB/s
     http_req_blocked...............: avg=2.66µs min=0s       med=1µs      max=1.54ms p(90)=2µs    p(95)=2µs   
     http_req_connecting............: avg=266ns  min=0s       med=0s       max=266µs  p(90)=0s     p(95)=0s    
   ✓ http_req_duration..............: avg=1.18ms min=259µs    med=959µs    max=3.92ms p(90)=2.47ms p(95)=2.57ms
       { expected_response:true }...: avg=1.18ms min=259µs    med=959µs    max=3.92ms p(90)=2.47ms p(95)=2.57ms
   ✓ http_req_failed................: 0.00%   ✓ 0         ✗ 1000
     http_req_receiving.............: avg=21.4µs min=8µs      med=20µs     max=99µs   p(90)=30µs   p(95)=36µs  
     http_req_sending...............: avg=5.11µs min=2µs      med=5µs      max=46µs   p(90)=7µs    p(95)=10µs  
     http_req_tls_handshaking.......: avg=0s     min=0s       med=0s       max=0s     p(90)=0s     p(95)=0s    
     http_req_waiting...............: avg=1.15ms min=244µs    med=937µs    max=3.89ms p(90)=2.44ms p(95)=2.53ms
     http_reqs......................: 1000    815.6374/s
     iteration_duration.............: avg=1.22ms min=282.41µs med=989.04µs max=3.96ms p(90)=2.52ms p(95)=2.61ms
     iterations.....................: 1000    815.6374/s
     vus............................: 1       min=1      max=1 
     vus_max........................: 1       min=1      max=1 


running (00m01.2s), 0/1 VUs, 1000 complete and 0 interrupted iterations
default ✓ [=================================] 1 VUs  00m01.2s/10m0s  1000/1000 shared iters

Now against wrangler dev --remote, with N=100 (iterations reduced to 100).
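For the --remote runs I assume bench.mjs stays the same except for the iteration count:

export const options = {
	iterations: 100,
	thresholds: {
		http_req_failed: ["rate<0.01"],
		http_req_duration: ["p(90)<2000"]
	}
}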


$ k6 run bench.mjs

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  ()  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 100 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 100       ✗ 0  
     data_received..................: 61 kB   26 kB/s
     data_sent......................: 8.0 kB  3.3 kB/s
     http_req_blocked...............: avg=15.87µs min=0s      med=2µs     max=1.43ms  p(90)=2µs     p(95)=3µs    
     http_req_connecting............: avg=2.81µs  min=0s      med=0s      max=282µs   p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=23.88ms min=19.64ms med=23.03ms max=53.92ms p(90)=27.83ms p(95)=29.22ms
       { expected_response:true }...: avg=23.88ms min=19.64ms med=23.03ms max=53.92ms p(90)=27.83ms p(95)=29.22ms
   ✓ http_req_failed................: 0.00%   ✓ 0         ✗ 100
     http_req_receiving.............: avg=34.06µs min=17µs    med=32µs    max=86µs    p(90)=45µs    p(95)=50.39µs
     http_req_sending...............: avg=7.38µs  min=3µs     med=7µs     max=30µs    p(90)=11µs    p(95)=12µs   
     http_req_tls_handshaking.......: avg=0s      min=0s      med=0s      max=0s      p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=23.84ms min=19.61ms med=22.97ms max=53.9ms  p(90)=27.78ms p(95)=29.18ms
     http_reqs......................: 100     41.729191/s
     iteration_duration.............: avg=23.95ms min=19.69ms med=23.09ms max=53.96ms p(90)=27.92ms p(95)=29.29ms
     iterations.....................: 100     41.729191/s
     vus............................: 1       min=1       max=1
     vus_max........................: 1       min=1       max=1


running (00m02.4s), 0/1 VUs, 100 complete and 0 interrupted iterations
default ✓ [===================================] 1 VUs  00m02.4s/10m0s  100/100 shared iters

TS

$ k6 run bench.mjs

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  ()  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 1 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 100 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 100       ✗ 0  
     data_received..................: 61 kB   25 kB/s
     data_sent......................: 8.0 kB  3.2 kB/s
     http_req_blocked...............: avg=17.33µs min=0s      med=1µs     max=1.58ms  p(90)=2µs     p(95)=2µs    
     http_req_connecting............: avg=3.34µs  min=0s      med=0s      max=334µs   p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=24.79ms min=19.2ms  med=23.7ms  max=96.79ms p(90)=28.45ms p(95)=29.13ms
       { expected_response:true }...: avg=24.79ms min=19.2ms  med=23.7ms  max=96.79ms p(90)=28.45ms p(95)=29.13ms
   ✓ http_req_failed................: 0.00%   ✓ 0         ✗ 100
     http_req_receiving.............: avg=30.57µs min=11µs    med=28µs    max=69µs    p(90)=44.1µs  p(95)=49.05µs
     http_req_sending...............: avg=7.16µs  min=2µs     med=5.5µs   max=30µs    p(90)=12.1µs  p(95)=14µs   
     http_req_tls_handshaking.......: avg=0s      min=0s      med=0s      max=0s      p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=24.76ms min=19.17ms med=23.68ms max=96.69ms p(90)=28.4ms  p(95)=29.1ms 
     http_reqs......................: 100     40.201991/s
     iteration_duration.............: avg=24.86ms min=19.26ms med=23.74ms max=98.55ms p(90)=28.52ms p(95)=29.17ms
     iterations.....................: 100     40.201991/s
     vus............................: 1       min=1       max=1
     vus_max........................: 1       min=1       max=1


running (00m02.5s), 0/1 VUs, 100 complete and 0 interrupted iterations
default ✓ [===================================] 1 VUs  00m02.5s/10m0s  100/100 shared iters

Deploy and compare.

import { check } from "k6"
import http from "k6/http"

export const options = {
	iterations: 2000,
	thresholds: {
		http_req_failed: ["rate<0.01"],
		http_req_duration: ["p(90)<2000"]
	}
}

export default function () {
	const res = http.get(
		// "http://localhost:8787"
		"https://sized-worker.mizchi.workers.dev"
	)
	check(res, {
		'is_status_200': (r) => r.status === 200
	})
}
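Each variant is deployed before benchmarking (assuming the default wrangler setup generated above; the exact invocation is my guess):

$ pnpm wrangler deploy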

Vanilla

$ k6 run bench.mjs --vus 10

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  (‾)  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 10 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 2000 iterations shared among 10 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 2000       ✗ 0   
     data_received..................: 752 kB  267 kB/s
     data_sent......................: 77 kB   27 kB/s
     http_req_blocked...............: avg=313.51µs min=0s     med=0s      max=67.06ms  p(90)=1µs     p(95)=1µs    
     http_req_connecting............: avg=41.81µs  min=0s     med=0s      max=8.62ms   p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=13.68ms  min=9.61ms med=12.74ms max=261.49ms p(90)=15.1ms  p(95)=16.47ms
       { expected_response:true }...: avg=13.68ms  min=9.61ms med=12.74ms max=261.49ms p(90)=15.1ms  p(95)=16.47ms
   ✓ http_req_failed................: 0.00%   ✓ 0          ✗ 2000
     http_req_receiving.............: avg=15.81µs  min=4µs    med=8µs     max=4.41ms   p(90)=13µs    p(95)=15µs   
     http_req_sending...............: avg=19.35µs  min=8µs    med=16µs    max=3.24ms   p(90)=23µs    p(95)=30µs   
     http_req_tls_handshaking.......: avg=129.65µs min=0s     med=0s      max=30.13ms  p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=13.65ms  min=9.59ms med=12.71ms max=261.46ms p(90)=15.04ms p(95)=16.42ms
     http_reqs......................: 2000    710.285034/s
     iteration_duration.............: avg=14.03ms  min=9.64ms med=12.78ms max=261.52ms p(90)=15.22ms p(95)=16.68ms
     iterations.....................: 2000    710.285034/s
     vus............................: 10      min=10       max=10
     vus_max........................: 10      min=10       max=10

TypeScript

$ k6 run bench.mjs --vus 10

          /\      |‾‾| /‾‾/   /‾‾/   
     /\  /  \     |  |/  /   /  /    
    /  \/    \    |     (   /   ‾‾\  
   /          \   |  |\  \ |  (‾)  | 
  / __________ \  |__| \__\ \_____/ .io

  execution: local
     script: bench.mjs
     output: -

  scenarios: (100.00%) 1 scenario, 10 max VUs, 10m30s max duration (incl. graceful stop):
           * default: 2000 iterations shared among 10 VUs (maxDuration: 10m0s, gracefulStop: 30s)


     ✓ is_status_200

     checks.........................: 100.00% ✓ 2000       ✗ 0   
     data_received..................: 752 kB  178 kB/s
     data_sent......................: 77 kB   18 kB/s
     http_req_blocked...............: avg=172.54µs min=0s     med=0s      max=38.87ms p(90)=1µs     p(95)=1µs    
     http_req_connecting............: avg=46.09µs  min=0s     med=0s      max=9.48ms  p(90)=0s      p(95)=0s     
   ✓ http_req_duration..............: avg=20.89ms  min=9.77ms med=12.66ms max=2.09s   p(90)=15.16ms p(95)=16.79ms
       { expected_response:true }...: avg=20.89ms  min=9.77ms med=12.66ms max=2.09s   p(90)=15.16ms p(95)=16.79ms
   ✓ http_req_failed................: 0.00%   ✓ 0          ✗ 2000
     http_req_receiving.............: avg=13.15µs  min=4µs    med=8µs     max=5.96ms  p(90)=12µs    p(95)=14µs   
     http_req_sending...............: avg=17.53µs  min=7µs    med=16µs    max=119µs   p(90)=23µs    p(95)=28µs   
     http_req_tls_handshaking.......: avg=120.91µs min=0s     med=0s      max=28.56ms p(90)=0s      p(95)=0s     
     http_req_waiting...............: avg=20.86ms  min=9.75ms med=12.63ms max=2.09s   p(90)=15.13ms p(95)=16.74ms
     http_reqs......................: 2000    473.263118/s
     iteration_duration.............: avg=21.1ms   min=9.8ms  med=12.7ms  max=2.12s   p(90)=15.19ms p(95)=16.82ms
     iterations.....................: 2000    473.263118/s
     vus............................: 10      min=10       max=10
     vus_max........................: 10      min=10       max=10


running (00m04.2s), 00/10 VUs, 2000 complete and 0 interrupted iterations
default ✓ [================================] 10 VUs  00m04.2s/10m0s  2000/2000 shared iters

Running it multiple times, throughput kept improving: it started at 473 req/s and eventually reached 672 req/s.


Summary

  • The 5 MB limit applies to the size after minify & gzip
  • When the bundle is large, autoscaling can't keep up; in this case it was roughly 1.5x slower
  • wrangler dev and wrangler dev --remote probably only use a single thread, so they don't work as a benchmark

Incidentally, a warning is shown when the script exceeds 1 MB after gzip:

Total Upload: 8030.32 KiB / gzip: 1296.12 KiB
▲ [WARNING] We recommend keeping your script less than 1MiB (1024 KiB) after gzip. Exceeding past this can affect cold start time

Uploading an 8 MB script (1.2 MB after gzip) succeeds, so the limit applies to the post-gzip size.
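For a rough local check of the minified + gzipped size before deploying, something like the following sketch works (size-check.mjs, the esbuild dependency, and the src/index.ts entry path are my own assumptions, not part of wrangler):

size-check.mjs
import { build } from "esbuild";
import { gzipSync } from "node:zlib";

// Bundle and minify the worker entry point, keeping the output in memory.
const result = await build({
	entryPoints: ["src/index.ts"],
	bundle: true,
	minify: true,
	format: "esm",
	write: false,
});

// Report raw and gzipped sizes, roughly what the "Total Upload: ... / gzip: ..." line shows.
const code = result.outputFiles[0].contents;
console.log(`bundled: ${(code.length / 1024).toFixed(2)} KiB`);
console.log(`gzip:    ${(gzipSync(code).length / 1024).toFixed(2)} KiB`);
$ node size-check.mjs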