handlename's blog

Code snippets and such

When decoding a large amount of JSON

I thought it would be faster to concatenate everything into a single JSON document and decode it in one go, but that turned out not to be the case. Presumably the cost of building the concatenated string outweighs whatever is saved by calling decode_json only once.

#!/usr/bin/env perl

use strict;
use warnings;

use Benchmark qw/timethese cmpthese/;
use JSON::XS;

# 1,000 small JSON strings to decode
my @jsons = map { encode_json({ hoge => 'huga' }) } 1..1000;

my $result = timethese(1000, {
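    # decode each JSON string separately, one decode_json call per fragment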
    each => sub {
        my $decoded;

        for my $index (0..$#jsons) {
            $decoded->{$index} = decode_json($jsons[$index]);
        }
    },
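    # concatenate all fragments into one big JSON object, then decode it with a single call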
    joined => sub {
        my $index = 0;
        my $joined = '{';
        $joined .= join ',', map { $index++; qq!"${index}":${_}! } @jsons;
        $joined .= '}';
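        # $joined now looks like {"1":{"hoge":"huga"},"2":{"hoge":"huga"},...}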

        my $decoded = decode_json($joined);
    }
});

cmpthese($result);
Output:

Benchmark: timing 1000 iterations of each, joined...
      each:  1 wallclock secs ( 1.35 usr +  0.00 sys =  1.35 CPU) @ 740.74/s (n=1000)
    joined:  2 wallclock secs ( 1.55 usr +  0.00 sys =  1.55 CPU) @ 645.16/s (n=1000)
        Rate joined   each
joined 645/s     --   -13%
each   741/s    15%     --
perl benchmark_decode_json.pl  2.99s user 0.02s system 99% cpu 3.019 total