On choosing boring technologies

A reaction / agreement to Choose Boring Technology.

I remember a very early argument at Bloc that I’m sure Roshan will remember. It had to do with whether we should build our application in Ruby on Rails or with some combination of separate frontend and backend technologies.

Those of you who know me now, 6 years later, may be surprised that I was an advocate of the separate frontend and backend approach. I can’t quite recall what argument I made, but I bet it was some combination of “this is the standard approach”, “this will scale”, “this is how everyone else does it”, and “you’ve gotta choose the best tool for the job”.

I believe, without a doubt, I was 100% wrong. Roshan was right. Roshan, you were right!

My approach would’ve meant we moved slower. We would’ve spent more time thinking about our technology choice rather than our customer problems. And it turns out, we didn’t need to scale to millions of users because our business didn’t require it.

Velocity is everything. It’s everything. Especially at an early stage startup where you haven’t achieved product market fit.

Now I’m not saying you should never try new technologies. You should. But you’ve got to be explicit about what you’re optimizing for. If you’re optimizing for velocity, then you better be able to explain how that technology choice is the best approach for velocity. If you’re optimizing for something else, that’s fine, but be explicit about it, and about how it ultimately maps to your ability to move faster to solve your customer’s problems.

If you find yourself solving a rather standard problem (building a messaging system, forum software, any flavor of CRUD, authentication system, etc) and you choose the “latest and greatest” without explaining how that is going to make you faster, I believe you’re making the wrong choice.

I’m a bit surprised at how divisive this stance has become. I’m not even advocating any particular approach or tool or framework – just being explicit about why you’re choosing it and being honest with yourself about whether or not velocity is the thing you’re optimizing for. If your team is full of top-notch JavaScript engineers who know Node.js inside and out, great, that could totally be the right choice.

For me, that choice is Ruby on Rails. The amount of velocity I can squeeze out of this framework after the better part of a decade working in it is insane. I sometimes forget this, and then I dip my toes into the waters of another stack and am reminded of the fact. It’s not that these other stacks are bad, or that I dislike them, or that I think they made terrible technical choices. It’s that I’m an order of magnitude slower when I use them.

Weekly Roundup

My favorite links from this week:


Some books I recently finished:

  • The Man Who Mistook His Wife for a Hat by Oliver Sacks. I liked it a lot. It reminds me a bit of the weirder medical cases from the TV show House (without the character development). I had a hard time reading it cover to cover, so I spread it out over many weeks.
  • The Russian Revolution by Alan Moorehead. An excellent read. It begins with the ascension of Czar Nicholas and chronicles his family’s downfall, the rise of the Bolsheviks and Mensheviks, and the attempt at democracy in Russia towards the end of WW1.

I abandoned From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, … because, as I should’ve gleaned from the title, the author took what promised to be an interesting subject and made it quite boring. I couldn’t even get halfway.


The internet on Mars

Eventually the first settlements on Mars will need not only a Mars computer network (and eventually a Mars-based internet) but a way for that internet to connect and synchronize with the Earth-based internet.

I did some thinking and some research about what it might take to establish such a system. It turns out to be sorta complicated.

Radio communications don’t just weaken according to the inverse square law; once interference and noise are factored in, the falloff in usable data rate is even steeper. So we’ll have to use lasers or something much more efficient.

There’s also the last-mile problem. Until there is a constellation of satellites in Mars orbit delivering 100% coverage to the future settlements, we’re at the mercy of Mars’ rotation and orbit. The latest and greatest orbiters and rovers have pitiful data rates compared to what we have on Earth, so the best approximations should assume we’re able to deliver top-notch communications equipment.

But even if we get state-of-the-art satellites in orbit around Mars, delivering 100% radio coverage to the settlements on the surface, and assuming we have an established internet infrastructure down there, there is still a one-way communications delay of between 3 and 22 minutes between Earth and Mars, depending on where the planets are in their orbits around the sun.
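Those delay numbers follow directly from the speed of light. A quick back-of-the-envelope check (the distances here are approximate figures for a close approach and for maximum separation, not precise ephemeris values):

```python
# One-way light delay between Earth and Mars, back of the envelope.
C = 299_792_458            # speed of light, m/s
CLOSEST_M = 54.6e9         # ~54.6 million km at a close approach (approximate)
FARTHEST_M = 401e9         # ~401 million km at maximum separation (approximate)

def one_way_delay_minutes(distance_m: float) -> float:
    """Minutes for a signal to cover distance_m at the speed of light."""
    return distance_m / C / 60

print(f"closest:  {one_way_delay_minutes(CLOSEST_M):.1f} min")   # ~3 minutes
print(f"farthest: {one_way_delay_minutes(FARTHEST_M):.1f} min")  # ~22 minutes
```

And remember that’s one-way: a request-and-response round trip doubles it.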

No realtime communication is possible. No video conferencing.

I wonder about the existing protocols that run our internet, and how they will handle such a delay. TCP breaks down over high-latency, high-bandwidth connections, and this link is an extreme case. Existing space communication tech uses forward error correction instead of detect-and-retransmit protocols, both to reduce the power required to transmit radio data and to avoid having to re-send data across that delay.
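The simplest way to see what forward error correction buys you is a toy repetition code: send every bit three times and majority-vote on the receiving end. Real deep-space links use vastly more efficient codes than this, but the sketch shows the key property — the receiver repairs corruption locally instead of requesting a retransmit, which over this link would cost a round trip measured in tens of minutes:

```python
# Toy forward error correction: a 3x repetition code.
# Each bit is sent three times; the receiver majority-votes each triple,
# so any single flipped bit per triple is corrected with no retransmission.

def fec_encode(bits):
    return [b for b in bits for _ in range(3)]

def fec_decode(coded):
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

message = [1, 0, 1, 1]
sent = fec_encode(message)

# Simulate channel noise: flip the first bit of every transmitted triple.
noisy = sent[:]
for i in range(0, len(noisy), 3):
    noisy[i] ^= 1

assert fec_decode(noisy) == message  # corruption repaired, no round trip needed
```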

Take Wikipedia. If I’m living on Mars, I almost certainly want access to the world’s most comprehensive store of human knowledge. Wikipedia is huge, and growing. Even if I just look at the English version of Wikipedia, it clocks in at about 44 million pages, occupying about 100GB compressed, and 10TB uncompressed.
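To get a feel for what moving that much data means, compare transfer times at a couple of hypothetical link rates (the rates below are illustrative assumptions, not actual mission specs):

```python
# How long does it take to ship 100 GB across an interplanetary link?
# Link rates are illustrative assumptions, not actual mission specs.
DATA_BITS = 100e9 * 8  # 100 GB of compressed Wikipedia, in bits

for label, bits_per_second in [("few-Mbps radio link", 2e6),
                               ("100 Mbps optical link", 100e6)]:
    seconds = DATA_BITS / bits_per_second
    print(f"{label}: {seconds / 86400:.2f} days")
```

At radio-link rates an initial sync takes days; even incremental updates to a growing corpus add up fast.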

The only realistic way to have such access is to maintain a local mirror of the site, which means servers on racks and lots of power connected to the local grid.

So that means we’re looking at infrastructure that synchronizes behind the scenes, rather than infrastructure that allows point-to-point communication. Or at the very least a hybrid system.

And this all assumes that we’ve already placed high-tech satellites in orbit around Mars, the likelihood of which is unclear. It may be that the early days of computing on Mars will be much more limited in scope, and the communication bottleneck with Earth will be exacerbated the more people use it.

But the great thing is that the original memo, J.C.R. Licklider’s Intergalactic Computer Network, envisioned most of the internet’s protocols working out of the box, assuming the hardware functions properly. And that’s pretty cool. You’ll literally be able to take your MacBook with you and it should work (assuming we solve the grounding problem).