Last night I gave a lightning talk at the newly tweaked #metabeertalks, run by some great friends of mine; the topic was “is realtime Fashion or Fad?”.
Modern hosting approaches, aka “the cloud”, have many advantages: they scale trivially, encourage best practices in how you architect and deploy, and flex as your application changes.
The single most powerful thing, though, is that it puts a cost on every element of your application. We can debate whether it’s cheaper, more expensive, or about the same as hosting on tin, but you know what your components cost.
Your application isn’t bundled up with a load of others on a shared server, with your IT team complaining that they have to install a new one every three months, each with about a month’s lead time.
You host your application on an instance the right size for it, be it small or huge, a single machine or a fleet of twenty. You use the storage you need, when you need it, without playing that impossible game of “how much storage will we need by the time the storage system actually arrives?”.
And all this comes with transparency: set your system up with the right tags, and all the costs of an application are known.
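On AWS, for instance, that tagging might look something like this (a minimal sketch using boto3; the instance ID and tag names are just illustrations, pick whatever scheme suits your billing reports):

```python
import boto3

ec2 = boto3.client("ec2")

# Tag every resource an application owns with the same cost-allocation
# tags; the billing reports can then break costs down per application.
ec2.create_tags(
    Resources=["i-0123456789abcdef0"],  # hypothetical instance ID
    Tags=[
        {"Key": "application", "Value": "analytics-pipeline"},
        {"Key": "team", "Value": "data"},
        {"Key": "environment", "Value": "production"},
    ],
)
```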
Knowing those costs, you can start flexing: if you need 10 machines to keep up with realtime analysis, they’re yours. Or, if you don’t want to pay that, bid for some cheaper spot instances and batch the work overnight.
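Bidding for that overnight capacity on AWS might look roughly like this (again a boto3 sketch; the bid price, AMI, and instance type are made-up placeholders):

```python
import boto3

ec2 = boto3.client("ec2")

# Bid below the on-demand price; the instances only launch when the
# spot price drops under the bid, which suits overnight batch work.
ec2.request_spot_instances(
    SpotPrice="0.05",  # hypothetical maximum bid in USD per hour
    InstanceCount=10,
    LaunchSpecification={
        "ImageId": "ami-0123456789abcdef0",  # hypothetical AMI
        "InstanceType": "m3.large",
    },
)
```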
Within reason you can do anything, if you can afford it. So you can make a call about which bits of information are valuable enough to justify being realtime.
When Netflix launched House of Cards series two, you could hear them talking about the “Play Start” messages coming in. That kind of realtime information is amazingly helpful for debugging.
The deeper stats, like how many people watched and how many episodes they binged on, could probably wait a few hours for a batch run…
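To make that split concrete, here’s a toy sketch (the event types and sinks are hypothetical, nothing to do with Netflix’s actual pipeline):

```python
import json

def route_event(event, realtime_stream, batch_log):
    """Urgent operational events go to the realtime path; everything
    else is appended cheaply for the overnight batch run."""
    if event["type"] == "play_start":
        realtime_stream.append(event)  # worth paying to see right now
    else:
        batch_log.append(json.dumps(event))  # fine to process in a few hours

realtime_stream, batch_log = [], []
route_event({"type": "play_start", "title": "House of Cards"},
            realtime_stream, batch_log)
route_event({"type": "episode_finished", "episode": 3},
            realtime_stream, batch_log)
```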
My take on realtime: do it where it’s valuable, and where you can justify the cost.
Which is exactly the same for all elements of cloud hosting.