A protocol buffer is a language-agnostic mechanism for sharing objects between machines that aims to reduce payload size. We are already familiar with JSON, which most RESTful APIs use to send objects to and receive objects from any kind of client. JSON is convenient and supported by many platforms, so why should we learn about protocol buffers?
Besides optimizing the payload encoding, protocol buffers (also called protobuf) introduce a schema definition that must be maintained by every machine that encodes or decodes the objects being delivered. The two main processes for delivering objects are serialization and deserialization. Serialization transforms an object instance in an application into an optimized binary payload; deserialization decodes that binary data back into the desired object.
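As a sketch of what such a schema looks like, here is a minimal `.proto` file (the `Person` message and its fields are illustrative, not from any real project):

```protobuf
// person.proto — a hypothetical schema shared by every machine
// that serializes or deserializes Person objects.
syntax = "proto3";

message Person {
  string name = 1;            // the tags (1, 2, ...) identify fields in the binary payload
  int32  age  = 2;
  repeated string emails = 3; // a list of strings
}
```

Both sides keep a copy of this file, so the payload itself only needs to carry the tags and values, not the field names.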
Let's take a look at the following table that shows a comparison of XML, JSON, and protobuf.
| | XML | JSON | protobuf |
|---|---|---|---|
| Readability | Normal | High | Low |
| Strictness | Low | Normal | High |
| Size efficiency | Low | Normal | High |
JSON is easy for humans to debug and read. However, if our object is mainly intended to be processed by machines, readability is not the focus.
JSON supports only three basic scalar types: boolean, number, and string. Protobuf, meanwhile, defines many more data types, so any platform or programming language can decode the object into the desired type automatically. Moreover, type strictness can be maintained across machines because protobuf requires every machine or application to share the data (message) schema along with a set of rules and options.
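To illustrate the richer type system, a hypothetical message might mix several of these types (all names here are made up for illustration):

```protobuf
syntax = "proto3";

// Illustrative message showing protobuf's richer type system.
message Measurement {
  int32  sensor_id   = 1;  // 32-bit signed integer
  int64  timestamp   = 2;  // 64-bit signed integer
  double value       = 3;  // 64-bit floating point
  bool   calibrated  = 4;
  bytes  raw_reading = 5;  // arbitrary binary data, which has no direct JSON equivalent
  map<string, string> labels = 6;
}
```

Every decoder that holds this schema knows, for example, that `timestamp` must come out as a 64-bit integer, with no guessing from a plain-text number.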
Unlike XML or JSON, which embed the object schema explicitly in the payload and deliver the object as plain text, protobuf encodes the object into optimized binary with compact tags that describe the object structure, while the complete schema is maintained on every machine. This is what reduces the payload size.
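One concrete piece of that optimization is the varint encoding protobuf uses for integers: small values occupy fewer bytes. Here is a minimal Python sketch of the scheme (this is a standalone illustration, not the protobuf library itself):

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative integer as a protobuf-style varint:
    little-endian base-128, 7 payload bits per byte, with the high
    bit set on every byte except the last."""
    out = bytearray()
    while True:
        byte = value & 0x7F
        value >>= 7
        if value:
            out.append(byte | 0x80)  # more bytes follow
        else:
            out.append(byte)         # final byte
            return bytes(out)

def decode_varint(data: bytes) -> int:
    """Decode a varint back into an integer."""
    result = 0
    shift = 0
    for byte in data:
        result |= (byte & 0x7F) << shift
        if not (byte & 0x80):
            return result
        shift += 7
    raise ValueError("truncated varint")

print(encode_varint(1))    # b'\x01' — one byte, vs. the text "1"
print(encode_varint(300))  # b'\xac\x02' — two bytes, vs. the text "300"
```

A number like 300 takes two bytes on the wire instead of the three characters JSON would spend on the digits, and the saving grows once field names are removed from the payload entirely.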
There is also a CLI tool called protoc that can perform the encoding and decoding of data for us. It can also generate, for various programming languages, the classes used to instantiate the objects defined in a protobuf schema.
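Typical invocations might look like the following (assuming a schema file named `person.proto` containing a `Person` message; language-specific outputs such as Go require the corresponding protoc plugin to be installed):

```shell
# Generate classes from the schema for different languages
protoc --python_out=. person.proto
protoc --go_out=. person.proto

# Decode a binary payload into readable text for debugging
protoc --decode=Person person.proto < payload.bin
```

The generated classes give each language a native object with typed fields plus serialize/parse methods, so application code never touches the wire format directly.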
Another benefit of protobuf is that it standardises a mechanism for handling both backward and forward compatibility. Backward compatibility means a machine can still process an object delivered by another machine running an earlier schema version. Forward compatibility means it can handle an object delivered by a machine running a later version. This is achieved inherently through the concepts of default values and reserved fields built into protobuf.
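As a sketch of how this looks in a schema (the message and field names are hypothetical), suppose a field was removed in a later version:

```protobuf
syntax = "proto3";

// A message that has evolved: field 2 ("nickname") was removed in a
// later version, so its tag and name are reserved to prevent reuse.
message Profile {
  string username = 1;
  reserved 2;
  reserved "nickname";
  string bio = 4;  // added later; older readers simply skip unknown fields
}
```

Fields absent from a payload decode to their type's default value (0, empty string, false, and so on), which is how a newer reader handles an older payload; reserved tags guarantee a removed field's number is never recycled with a different meaning.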
As a reminder, protobuf doesn't handle the communication process itself; that is handled by a framework such as gRPC. However, protobuf can define a service block in the schema to describe how a service receives a request message and sends a response message.
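A service block (all names here are illustrative) is a pure contract; a framework such as gRPC generates the client and server plumbing from it:

```protobuf
syntax = "proto3";

message GetPersonRequest {
  int32 id = 1;
}

message GetPersonResponse {
  string name = 1;
}

// Describes the RPC contract only — the actual network communication
// is implemented by gRPC or another framework.
service PersonService {
  rpc GetPerson (GetPersonRequest) returns (GetPersonResponse);
}
```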