JSON streaming

JSON streaming comprises communications protocols that delimit JSON objects on top of lower-level stream-oriented protocols (such as TCP), so that individual JSON objects can be recognized, provided the server and its clients agree on the same framing convention (often simply hard-coded into both).

Introduction

JSON is a popular format for exchanging object data between systems. Frequently there's a need for a stream of objects to be sent over a single connection, such as a stock ticker or application log records.[1] In these cases there's a need to identify where one JSON-encoded object ends and the next begins. Technically this is known as framing.

There are four common ways to achieve this:

  • Send the JSON objects formatted without newlines and use a newline as the delimiter.[2]
  • Send the JSON objects concatenated with a record separator control character as the delimiter.[3]
  • Send the JSON objects concatenated with no delimiters and rely on a streaming parser to extract them.
  • Send the JSON objects prefixed with their length and rely on a streaming parser to extract them.

Line-delimited JSON

Line-delimited JSON (LDJSON), newline-delimited JSON (NDJSON), and JSON lines (JSONL) are three terms for equivalent formats of JSON streaming. Streaming makes use of the fact that the JSON format does not allow raw newline or carriage return characters within primitive values (in strings these must be escaped as \n and \r, respectively) and that most JSON formatters default to not including any whitespace, including newlines and carriage returns. These features allow the newline and/or carriage return character to be used as a delimiter.

This format is specified at ndjson.org and documented at the JSON Lines website.

This example shows two JSON objects (the implicit newline characters at the end of each line are not shown):

{"some":"thing\n"}
{"may":{"include":"nested","objects":["and","arrays"]}}

The use of a newline as a delimiter enables this format to work very well with traditional line-oriented Unix tools.
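As an illustration (not part of any specification), a minimal Python sketch of a line-delimited JSON producer might look like the following; the record list and output file name are assumptions. Because json.dumps emits a single line with no embedded newlines by default, appending one "\n" per object is enough to frame it:

import json

records = [
    {"some": "thing\n"},
    {"may": {"include": "nested", "objects": ["and", "arrays"]}},
]

with open("events.ndjson", "w", encoding="utf-8") as f:  # file name is illustrative
    for record in records:
        # json.dumps produces a single line; the trailing "\n" delimits the object
        f.write(json.dumps(record) + "\n")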

A log file, for example, might look like:

{"ts":"2020-06-18T10:44:12","started":{"pid":45678}}
{"ts":"2020-06-18T10:44:13","logged_in":{"username":"foo"},"connection":{"addr":"1.2.3.4","port":5678}}
{"ts":"2020-06-18T10:44:15","registered":{"username":"bar","email":"bar@example.com"},"connection":{"addr":"2.3.4.5","port":6789}}
{"ts":"2020-06-18T10:44:16","logged_out":{"username":"foo"},"connection":{"addr":"1.2.3.4","port":5678}}

which is very easy to sort by date, grep for usernames, actions, IP addresses, etc.
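A rough Python sketch of reading such a log and filtering on a username could look like this; the file name access.log and the username "foo" are assumptions for illustration:

import json

with open("access.log", encoding="utf-8") as f:  # file name is assumed
    for line in f:
        line = line.strip()
        if not line:
            continue                      # tolerate blank lines
        event = json.loads(line)          # one complete JSON object per line
        # keep only events that involve the user "foo"
        if any(isinstance(v, dict) and v.get("username") == "foo"
               for v in event.values()):
            print(line)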

Record separator-delimited JSON

Record separator-delimited JSON streaming allows JSON text sequences to be delimited without requiring the JSON formatter to exclude whitespace. Since a valid JSON text cannot contain a raw control character, the record separator control character can be used to delimit the sequences. In addition, it is suggested that each JSON text be followed by a line feed character to allow proper handling of top-level JSON values that are not self-delimiting (numbers, true, false, and null).

This format is also known as JSON Text Sequences, uses the MIME type application/json-seq, and is formally specified in IETF RFC 7464.

The example below shows two JSON objects with <RS> representing the record separator control character and <LF> representing the line feed character:

<RS>{"some":"thing\n"}<LF>
<RS>{
  "may": {
    "include": "nested",
    "objects": [
      "and",
      "arrays"
    ]
  }
}<LF>
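A minimal Python sketch of producing and consuming such a sequence might look like the following; the RS (0x1E) and trailing line feed framing follows RFC 7464, while the helper names write_seq and read_seq are merely illustrative:

import io
import json

RS = "\x1e"  # record separator (0x1E)
LF = "\n"    # line feed

def write_seq(objects, stream):
    # Each JSON text is preceded by RS and followed by LF.
    for obj in objects:
        stream.write(RS + json.dumps(obj) + LF)

def read_seq(stream):
    # Split on RS; each non-empty chunk is one JSON text,
    # which may itself be pretty-printed across several lines.
    for chunk in stream.read().split(RS):
        chunk = chunk.strip()
        if chunk:
            yield json.loads(chunk)

buf = io.StringIO()
write_seq([{"some": "thing\n"}, {"may": {"include": "nested"}}], buf)
print(list(read_seq(io.StringIO(buf.getvalue()))))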

Concatenated JSON

Concatenated JSON streaming allows the sender to simply write each JSON object into the stream with no delimiters. It relies on the receiver using a parser that can recognize and emit each JSON object as soon as its terminating character is parsed. Concatenated JSON isn't a new format; it's simply a name for streaming multiple JSON objects without any delimiters.

The advantage of this format is that it can handle JSON objects that have been formatted with embedded newline characters, e.g., pretty-printed for human readability. For example, these two inputs are both valid and produce the same output:

{"some":"thing\n"}{"may":{"include":"nested","objects":["and","arrays"]}}
{
  "some": "thing\n"
}
{
  "may": {
    "include": "nested",
    "objects": [
      "and",
      "arrays"
    ]
  }
}

Implementations that rely on line-based input may require a newline character after each JSON object in order for the object to be emitted by the parser in a timely manner. (Otherwise the line may remain in the input buffer without being passed to the parser.) This is rarely recognised as an issue because terminating JSON objects with a newline character is very common.
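A sketch of such a receiver in Python, using json.JSONDecoder.raw_decode to pull successive objects out of a buffer, could look roughly like this; the buffering strategy is simplified for illustration and assumes the whole input is already in memory:

import json

def iter_concatenated(text):
    # raw_decode parses one JSON value starting at the given index and
    # returns the value together with the index just past it.
    decoder = json.JSONDecoder()
    idx = 0
    while idx < len(text):
        # skip any whitespace (including newlines) between objects
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx >= len(text):
            break
        obj, idx = decoder.raw_decode(text, idx)
        yield obj

stream = '{"some":"thing\\n"}{\n  "may": {"include": "nested"}\n}'
for obj in iter_concatenated(stream):
    print(obj)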

Length-prefixed JSON

Length-prefixed or framed JSON streaming allows the sender to explicitly state the length of each message. It relies on the receiver using a parser that can recognize each length n and then read the following n bytes to parse as JSON.

The advantage of this format is that it can speed up parsing due to the fact that the exact length of each message is explicitly stated, rather than forcing the parser to search for delimiters. Length-prefixed JSON is also well-suited for TCP applications, where a single "message" may be divided into arbitrary chunks, because the prefixed length tells the parser exactly how many bytes to expect before attempting to parse a JSON string.

This example shows two length-prefixed JSON objects (with each length being the byte-length of the following JSON string):

18{"some":"thing\n"}55{"may":{"include":"nested","objects":["and","arrays"]}}

Comparison

Line-delimited JSON works very well with traditional line-oriented tools.

Concatenated JSON works with pretty-printed JSON but requires more effort and complexity to parse. It doesn't work well with traditional line-oriented tools. Concatenated JSON streaming is a superset of line-delimited JSON streaming.

Length-prefixed JSON works with pretty-printed JSON. It doesn't work well with traditional line-oriented tools, but may offer performance advantages over line-delimited or concatenated streaming. It can also be simpler to parse.

Compatibility

Line-delimited JSON can be read by a parser that can handle concatenated JSON. Concatenated JSON that contains newlines within a JSON object can't be read by a line-delimited JSON parser.

The terms "line-delimited JSON" and "newline-delimited JSON" are often used without clarifying if embedded newlines are supported.

There's also a format known as NDJ ("newline-delimited JSON")[4] which allows comments to be embedded if the first two characters of a given line are "//". This can't be used with standard JSON parsers if comments are included.
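Handling such input with a standard parser requires dropping the comment lines first; a brief sketch in Python, with the helper name parse_ndj chosen purely for illustration:

import json

def parse_ndj(lines):
    for line in lines:
        line = line.strip()
        # NDJ permits comment lines beginning with "//"; skip them so the
        # remaining lines can go through a standard JSON parser.
        if not line or line.startswith("//"):
            continue
        yield json.loads(line)

text = '// a comment line\n{"some":"thing"}\n{"other":"object"}\n'
print(list(parse_ndj(text.splitlines())))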

Concatenated JSON can be converted into line-delimited JSON by a suitable JSON utility such as jq. For example:

jq --compact-output . < concatenated.json > lines.json

Applications and tools

Line-delimited JSON

Record separator-delimited JSON

  • jq can both create and read record separator-delimited JSON texts.

Concatenated JSON

  • concatjson concatenated JSON streaming parser/serializer module for Node.js
  • Jackson (API) can read and write concatenated JSON content.
  • jq lightweight flexible command-line JSON processor
  • Noggit Solr's streaming JSON parser for Java
  • Yajl – Yet Another JSON Library. YAJL is a small event-driven (SAX-style) JSON parser written in ANSI C, and a small validating JSON generator.
  • ArduinoJson is a C++ library that supports concatenated JSON.

Length-prefixed JSON

  • missive Fast, lightweight library for encoding and decoding length-prefixed JSON messages over streams
  • burro auto-packaged, length-prefixed JSON byte streams
  • Native messaging the WebExtensions native messaging mechanism

References
