Decoding large documents panics #348

@j-sv

Description

I'm decoding a large file containing multiple JSON documents, and at some point the decoder panics while trying to index past the end of a buffer. I've tried to reproduce it with slightly smaller data, but the input has to be a certain size for the bug to trigger.

package main

import (
        "bytes"

        "github.com/goccy/go-json"
)

// mk returns a 500-element slice that encodes to roughly 2.5 KB of JSON.
func mk() []int {
        data := make([]int, 500)

        for i := range data {
                data[i] = 1000
        }

        return data
}

func main() {
        buf := bytes.NewBuffer(nil)

        enc := json.NewEncoder(buf)

        // Encode two documents back to back into the same buffer.
        for i := 0; i < 2; i++ {
                if err := enc.Encode(mk()); err != nil {
                        panic(err)
                }
        }

        dec := json.NewDecoder(bytes.NewBuffer(buf.Bytes()))

        // Stream-decode the documents; go-json eventually panics inside Decode.
        for dec.More() {
                var foo interface{}

                if err := dec.Decode(&foo); err != nil {
                        panic(err)
                }
        }
}

This works when using encoding/json instead.

The actual panic appears to happen here: https://github.com/goccy/go-json/blob/master/internal/decoder/stream.go#L204

When debugging this with other data, s.filledBuffer is true and s.bufSize is increased to 1024, while s.length and s.cursor are both 1912. So the decoder ends up with a 1024-byte buffer but tries to slice it at s.buf[1912+0:].
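
A minimal sketch of that failure mode, assuming the stream state described above (the stream struct and field names here are illustrative only, not the real go-json internals): once the cursor has advanced past the end of the 1024-byte buffer, any reslice of the form s.buf[s.cursor:] has to panic.

package main

import "fmt"

// stream mirrors the decoder state reported above; illustrative only.
type stream struct {
        buf    []byte
        cursor int64
        length int64
}

func main() {
        s := &stream{
                buf:    make([]byte, 1024), // bufSize only grew to 1024...
                cursor: 1912,               // ...but the cursor already points at 1912
                length: 1912,
        }

        // Reslicing unconditionally, as the decoder does at stream.go#L204,
        // would panic with "slice bounds out of range [1912:1024]".
        if s.cursor > int64(len(s.buf)) {
                fmt.Printf("cursor %d is past the end of a %d-byte buffer; s.buf[%d+0:] would panic\n",
                        s.cursor, len(s.buf), s.cursor)
        }
}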
