Commit 164c5d9: Documentation update (1 parent 354119a)


README.md

Lines changed: 22 additions & 4 deletions

`private` would apply a tighter accessor to only one field, while `private_message` applies to all fields in the message. But yeah, I haven't worked on that yet. I just want to write this documentation as I code. :P

# Interoperate with games backend

`protobuf-unity` and `ProtoBinaryManager` together deal with your **offline** save data. What about taking that online? Maybe it is just for backing up the save file for players (without help from e.g. iCloud), or maybe you want to be able to view, inspect, or award your player something from the server.

## JSON at client side

The point of protobuf is often to send everything over the wire with matching protobuf files waiting on the other end. But what if you are not in control of the receiving side? The key is often JSON serialization, since that is kinda the standard of interoperability. What I want you to know is that there is a class called [`Google.Protobuf.JsonFormatter`](https://developers.google.com/protocol-buffers/docs/reference/csharp/class/google/protobuf/json-formatter) already available in Google's dll.

To use it, just instantiate that class (or use `JsonFormatter.Default` for quick, no-config formatting), then call `.Format(yourProtobufMessageObject)`. It uses a bunch of reflection to make key-value pairs of C# variable names and their values, which may not be the most efficient solution, but it does the trick.
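
To illustrate the receiving side, here is a minimal sketch of a backend consuming that output, assuming a hypothetical Express endpoint (the route and field names are made up); since `JsonFormatter` emits standard proto3 JSON (lowerCamelCase field names by default), any stack can parse it:

```js
// Hypothetical Express endpoint receiving the JSON string that
// Google.Protobuf.JsonFormatter produced on the Unity side.
const express = require('express')
const app = express()

// Parse incoming JSON bodies into plain JS objects.
app.use(express.json())

app.post('/save', (req, res) => {
  // req.body is now a regular JS object; keys follow protobuf's
  // JSON mapping (lowerCamelCase by default).
  const save = req.body
  console.log('player level:', save.playerLevel) // assumed example field
  res.sendStatus(200)
})

app.listen(3000)
```
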
If you want to see what the JSON string looks like, here is one example of `.Format` output: [...]

If it is just for backup, you may not need JSON; you could just dump the binary (or its base 64) and upload the entire thing. But JSON often allows the backend to actually **do something** with it. You may think that one benefit of Protobuf is that a server could read it, so shouldn't we just upload the message instead of JSON? But that may be only for your own server which you code up yourself. For other commercial solutions you may need JSON.

For example, [Microsoft Azure PlayFab](https://playfab.com/) supports [attaching a JSON object](https://docs.microsoft.com/en-us/gaming/playfab/features/data/entities/quickstart#entity-objects) to your entity. Then, with understandable save data available in PlayFab, you are able to segment players and do live ops based on the save, e.g. find players that progressed slower in the game, or award points from the server on event completion which the player then syncs back into the local device's Protobuf. (However, attaching a generic file [is also possible](https://docs.microsoft.com/en-us/gaming/playfab/features/data/entities/quickstart#entity-files).)

## Deciphering Protobuf at server side

As opposed to producing JSON at client side and sending it to the server, how about sending the protobuf bytes to the server and deserializing them with the JS version of the generated code instead?

### Node JS example

Here's an example of how to set up Node's `Crypto` so it decrypts what C# encrypted in my code. I used this pattern in my Firebase Functions, where it spins up a Node server with a lambda code fragment, receiving the save file for safekeeping and at the same time deciphering it so the server knows the save file's content. Assuming you have already got a Node `Buffer` of your save data at the server as `content`:

```js
function decipher(saveBuffer)
{
  // ... (decipher setup elided in this diff) ...
  const finalBuffer = Buffer.concat([decrypted, final])

  // At this point you get the naked protobuf bytes without encryption.
  // Now you can obtain nicely structured class data.
  return YourGeneratedProtoClassJs.deserializeBinary(finalBuffer)
}
```
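
For context, a rough sketch of how `decipher` might be wired into an HTTPS Firebase Function, assuming the client POSTs the encrypted save as the raw request body (the function name is made up; `req.rawBody` is the `Buffer` Firebase Functions exposes for raw request bytes):

```js
// Sketch of a Firebase HTTPS function receiving the encrypted save file.
const functions = require('firebase-functions')

exports.uploadSave = functions.https.onRequest((req, res) => {
  // The raw request bytes arrive as a Node Buffer here.
  const content = req.rawBody

  // Decrypt and deserialize with the decipher() shown above.
  const playerData = decipher(content)

  // The server now understands the save file's content.
  console.log(playerData.toObject())
  res.sendStatus(200)
})
```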

### Compatibility with Google Firebase Firestore

Firestore can store JSON-like data, and the JS Firestore library can store a JS object straight into it. However, not everything is supported, as a JS object is a superset of what Firestore supports.

For one thing, it cannot store `undefined`; for another, it cannot store nested arrays. While `undefined` does not exist in JSON at all, a nested array is possible in JSON but not possible to store in Firestore.

What you get from `YourGeneratedProtoClassJs.deserializeBinary` is not a plain JS object. It is a `class` instance of a Protobuf message. There is a method `.toObject()` available to change it to a JS object, but if you take a look at what you had as a `map<A,B>`, `.toObject()` produces `[A,B][]` (an array of key-value pair arrays) instead. This may be how Protobuf really keeps your `map`. Unfortunately, as I said, a nested array can't go straight into Firestore.

After eliminating any possible `undefined` or `null`, you need to manually post-process every `map` field (including nested ones), changing `[A,B][]` into a proper JS object by looping over the pairs, using each `A` as a key and each `B` as its value, and replacing the field, as sketched below.
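
A minimal sketch of that post-processing, assuming a save whose `toObject()` output has a map field named `itemsMap` (the field name is hypothetical; protobuf's JS codegen appends `Map` to map field names):

```js
// Convert protobuf-js map entries ([key, value][]) into plain objects
// so Firestore will accept the document.

function mapPairsToObject(pairs) {
  // Turn [[k1, v1], [k2, v2], ...] into { k1: v1, k2: v2, ... }
  const obj = {}
  for (const [key, value] of pairs) {
    obj[key] = value
  }
  return obj
}

function sanitizeForFirestore(save) {
  // Drop undefined/null values, which Firestore rejects or we don't want.
  for (const key of Object.keys(save)) {
    if (save[key] === undefined || save[key] === null) {
      delete save[key]
    }
  }
  // Replace each map field's nested-array form with a plain object.
  // Repeat this for every map field, including ones nested in sub-messages.
  if (Array.isArray(save.itemsMap)) {
    save.itemsMap = mapPairsToObject(save.itemsMap)
  }
  return save
}

// Usage, continuing from decipher() above:
// const save = sanitizeForFirestore(decipher(content).toObject())
// firestore.collection('saves').doc(playerId).set(save)
```

On newer Node versions, `Object.fromEntries(pairs)` does the same pair-to-object conversion in one call.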
`repeated` seems to be `toObject()`-ed just fine. It is just a straight array.

## Bugs?

- `repeated` enum: it works serializing at C# and deserializing at C# just fine. But if you get `Unhandled error { AssertionError: Assertion failed` when trying to turn a buffer made from C# into a JS object, it may be [this bug from a year ago](https://github.com/protocolbuffers/protobuf/issues/5232). I just encountered it in protobuf 3.10.0 (October 2019). Adding `[packed=false]` does not fix the issue for me.
