Mongoid: Serialize to JSON with a string id attribute

A quick hint on how to get an 'id' attribute in Mongoid document JSON.

When you serialize Mongoid objects to JSON, you won't get an "id" attribute the way you would with ActiveRecord. Instead you'll get an "_id" attribute containing a "$oid" value:

{
  "_id": {"$oid":"52d3fdfd53656e5180020000"},
  "changed_at":"2014-02-15",
  "created_at":"2014-01-13T15:53:49.444+01:00",
  "updated_at":"2014-01-13T15:53:49.444+01:00"
}

If you would prefer something like this:

{
  "id": "52d3fdfd53656e5180020000",
  "_id": {"$oid":"52d3fdfd53656e5180020000"},
  "changed_at":"2014-02-15",
  "created_at":"2014-01-13T15:53:49.444+01:00",
  "updated_at":"2014-01-13T15:53:49.444+01:00"
}

Just put this into your initializers (for example in config/initializers/mongoid.rb):

module Mongoid
  module Document
    # Add a plain string "id" (based on "_id") to the serialized attributes
    def as_json(options={})
      attrs = super(options)
      attrs["id"] = attrs["_id"].to_s
      attrs
    end
  end
end
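With this in place, every document (Article here is just a hypothetical model) exposes the string id both through as_json and to_json:

article = Article.first
article.as_json["id"]  # => "52d3fdfd53656e5180020000"
article.to_json        # includes "id" next to the original "_id"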

Ruby, Rails + object serialization (Marshal), Mongoid and performance matters

Introduction

Sometimes we want to store our objects directly in files or in a database (not mapped through an ORM or a document mapper). We can achieve this with serialization: a process that converts a Ruby object into a format that can be saved as a byte stream. You can read more about serialization here.
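Here's a minimal example of what Marshal does (Point is just a made-up struct for illustration):

Point = Struct.new(:x, :y)

point = Point.new(1, 2)

data = Marshal.dump(point)  # a binary String - the byte stream
copy = Marshal.load(data)   # a new, equal Point instance

copy == point               # => true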

Serializing stuff with Ruby

Ruby ships with Marshal serialization and it is quite easy to use. If you use ActiveRecord, you can use this simple class to store objects in any ActiveRecord-supported database:

class PendingObject < ActiveRecord::Base

  # Iterate through all pending objects
  def self.each
    self.all.each do |el|
      yield el, el.restore
    end
  end

  # Marshal given object and store it on db
  def store(object)
    self.object = Marshal.dump(object)
    self.save!
  end

  # "Unmarshal" it and return
  def restore
    Marshal.load(self.object)
  end

end
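Usage could look more or less like this (SomeJob and its perform method are made up - they stand in for whatever you want to persist):

pending = PendingObject.new
pending.store(SomeJob.new)  # Marshal.dump + save!

PendingObject.each do |record, object|
  object.perform            # work with the restored object
  record.destroy            # and remove the row when done
end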

Of course this is just a simple example of how to use serialization. Serialized data should be stored in a binary field:

      t.binary :object
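In a full migration this could look more or less like this (table and column names are just an example):

class CreatePendingObjects < ActiveRecord::Migration
  def change
    create_table :pending_objects do |t|
      t.binary :object   # raw Marshal byte stream
      t.timestamps
    end
  end
end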

Mongo, Mongoid and its issues with serialization

Unfortunately you can't just copy-paste this ActiveRecord solution directly into Mongoid:

class PendingObject

  include Mongoid::Document
  include Mongoid::Timestamps

  field :object, :type => Binary

  # Iterate through all pending objects
  def self.each
    self.all.each do |el|
      yield el, el.restore
    end
  end

  def store(object)
    self.object = Marshal.dump(object)
    self.save!
  end

  def restore
    Marshal.load(self.object)
  end

end

It doesn't matter whether you use Binary or String in the field type declaration. Either way you'll get this as a result:

String not valid UTF-8

I can understand why this would happen with a String, but why does it happen when the field is declared as binary? It should just store whatever I put there...
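My understanding of the root cause: Marshal.dump returns raw bytes (an ASCII-8BIT string), while BSON strings have to be valid UTF-8, so the driver rejects any payload whose bytes don't form valid UTF-8. A tiny illustration:

payload = Marshal.dump("\x80\x81".b)  # an object carrying non-UTF-8 bytes

payload.encoding                                 # => #<Encoding:ASCII-8BIT>
payload.force_encoding("UTF-8").valid_encoding?  # => false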

Base64 to the rescue

In order to fix this, I've decided to run the serialized data through Base64. This has a significant impact on the size of each serialized object (30-35% more), but I can live with that (a sketch of the Base64-wrapped class is shown after the list below). I was more concerned about the performance, which is why I decided to test it. There are two cases I wanted to check:

  • Serialization
  • Serialization and deserialization (reading serialized objects)
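Before the numbers, here is a minimal sketch of the Base64-wrapped class (my simplified take; it is not necessarily the exact code used in the benchmark):

require 'base64'

class PendingObject

  include Mongoid::Document
  include Mongoid::Timestamps

  field :object, :type => String

  # Iterate through all pending objects
  def self.each
    self.all.each do |el|
      yield el, el.restore
    end
  end

  # Marshal the given object, Base64-encode it and store it in db
  def store(object)
    self.object = Base64.encode64(Marshal.dump(object))
    self.save!
  end

  # Decode and "unmarshal" it back
  def restore
    Marshal.load(Base64.decode64(self.object))
  end

end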

Here are the steps I took:

  1. Create a simple Ruby object
  2. Serialize it 100 000 times with a step every 1000 (without Base64)
  3. Serialize it 100 000 times with a step every 1000 (with Base64)
  4. Benchmark the creation of plain Ruby objects (just as a reference point)
  5. Analyze all the data

Just to be sure (and to minimize the impact of random CPU spikes), I performed each test case 10 times and took the average values.

Benchmark

The benchmark code is really simple:

  1. Code responsible for preparing the iterations
  2. DummyObject - the object that will be serialized
  3. PendingObject - the object used to store data in Mongo
  4. ResultStorer - the object used to store timing results (time taken)
  5. Benchmark - a container for all the things
  6. Loops :)

You can download source code here (benchmark.rb).
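If you just want to get a feel for the difference without downloading the file, a stripped-down measurement could look like this (DummyObject and the loop size are simplified here and are not the benchmark's actual classes):

require 'base64'
require 'benchmark'

DummyObject = Struct.new(:name, :value)

objects = Array.new(100_000) { |i| DummyObject.new("obj#{i}", i) }

plain  = Benchmark.realtime { objects.each { |o| Marshal.dump(o) } }
base64 = Benchmark.realtime { objects.each { |o| Base64.encode64(Marshal.dump(o)) } }

puts "Marshal only:     #{plain.round(2)}s"
puts "Marshal + Base64: #{base64.round(2)}s"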

Results, charts, fancy data

First, the reference point: pure object initialization (without serialization). We can see that there's no big decrease in performance, no matter how many objects we initialize. Initializing 100 000 objects takes around 0.25 seconds.

Now some more interesting data :) Object initialization compared with initialization plus serialization (one direction only, without Base64):

It is pretty clear that serialization isn't the fastest way to go. It can slow down the whole process around 10 times, but that is still about 2.5 seconds for 100 000 objects. Now let's see what happens when we add Base64 on top of it (for reference, the previous values are left on the chart as well):

It seems that the Base64 conversion slows down the whole process by 10-12% at most. That is still bearable (for 100 000 objects it's around 2.7s).

Now it is time for the most interesting part: deserialization. By "deserialization" I mean the time needed to convert a stream of bytes back into objects (serialization time is not taken into consideration here):

The results are quite predictable. Adding Base64 to the deserialization process increases the overall time required by around 12-14%. As before, it is an overhead that can be accepted - especially when you realize that even then, 100 000 objects can be deserialized in less than 2 seconds.

Let's summarize everything we have (pure initialization, serialization, serialization with Base64, deserialization, deserialization with Base64, the serialization-deserialization process and serialization-deserialization with Base64):

Conclusions

Based on these calculations and benchmarks we can see that the overall performance drop when serializing and deserializing with Base64 is around 23-26%. If you're not planning to work with a huge number of objects at the same time, the whole process will still be extremely fast and you can use it.

Of course, if you can use, for example, MySQL with a binary column, there is no need to use Base64 at all. On the other hand, if you're using MongoDB (with Mongoid) or any other database that has issues with binary data and you still want to store serialized objects in it, this is a way to go. Even when you also take the bigger size of the Base64 data into account, the total performance loss should not exceed 35%.

So: if you don't have time to look for a better solution and you're aware of this solution's disadvantages, you can use it ;)
