backend coding is about to get interesting again
The sea parted for me the day I saw what was then the "new" google maps (especially painful as I was working on the inferior yahoo maps at the time). The day before that, backend coding ruled the web stack and the frontend was merely an afterthought. The day after, and for years since, advances in frontend techniques were considered the greater differentiator.
I think that is about to change.
Frontend techniques are still hugely important, but flexible layouts, stylish css, responsive js and well-structured html are widespread and hardly differentiators anymore. The bar is raised: frontend excellence is now a prerequisite for survival rather than a path to success.
What is rarer are advanced backend techniques that leverage sophisticated algorithms, exotic hardware, massive datasets and high-performance software. The best examples we have of such stacks are modern search engines - both microsoft and google have spent billions bringing next-generation backends to bear on a single problem: indexing the web. But this is only the beginning. Speech interfaces, the internet-of-things (e.g. self-driving vehicles), image correction/synthesis and other interesting problems demand backend stacks that are incredibly advanced and, so far, very expensive to build. To google's credit, they seem to be way ahead of everyone else here, and they are applying their incredible globe-spanning realtime backend stack to more and more problems.
I'm predicting now that the rest of the web, which mostly treats the backend stack as a way to get data out of a database and onto a screen, will have to adopt some version of what google and others are exploring now if it wants to survive the next decade. Most sites still rely on the end user as the agent of automation for data-intensive tasks like finding music, categorizing photos and identifying junk data. We're seeing now that these tasks are amenable to algorithmic solutions, which open up possibilities for dealing with far more data than users could ever hope to manage directly.
Some technologies that I believe are about to become the "new normal" for backend developers:
- AI techniques/machine learning. This is the big one. Right now we're consigning this stuff to the category of "data science", but I am convinced that, on some level, anyone working with data on the backend will have to have some degree of fluency in these techniques (a toy sketch follows this list).
- managing and processing lots of data. "Big data" has become a cliche, but the problems that come with storing and processing petabytes are very real.
- realtime processing. I see the time-series as the dominant data dimension going forward. There's simply too much data being generated; time-agnostic views of it will become unmanageable (see the windowing sketch after this list).
- concurrency/parallelism. Most of the big problems on the backend will have to be solved in some concurrent/parallel/distributed fashion, and tools and techniques that are stuck in the serial world will be obsolete (see the fan-out sketch after this list).
- some understanding of systems. Many developers continue to migrate to virtualized environments that obscure the real systems underneath, but at the cutting edge I predict we will keep seeing a need for exotic systems that are programmed directly. That probably implies a detailed understanding of linux (the only backend platform that matters).
- high-performance tools designed for building big, distributed, concurrent systems. All of the duck/stringly-typed languages (perl, python, ruby, php) will go away in favor of tools that produce fast programs at scale and are amenable to writing large systems that are (more) correct. Examples of such tools are java/scala, go, rust, D and c++. Moore's law won't help the inferior tools: the future is multicore, and most of the tools that will go away can't exploit these architectures.
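To make the machine-learning bullet concrete, here is a toy sketch - not any real library's API, and the data, epochs and learning rate are all invented for illustration - of the lowest rung of that fluency: a perceptron learning a linearly separable rule.

```go
// A perceptron: the simplest trainable classifier. Toy data only.
package main

import "fmt"

// perceptron holds one weight per feature plus a bias term.
type perceptron struct {
	w    []float64
	bias float64
}

// predict returns 1 if the weighted sum of features crosses zero, else 0.
func (p *perceptron) predict(x []float64) float64 {
	sum := p.bias
	for i, xi := range x {
		sum += p.w[i] * xi
	}
	if sum >= 0 {
		return 1
	}
	return 0
}

// train runs the classic perceptron update rule over the data for a
// fixed number of epochs.
func (p *perceptron) train(xs [][]float64, ys []float64, epochs int, rate float64) {
	for e := 0; e < epochs; e++ {
		for i, x := range xs {
			err := ys[i] - p.predict(x)
			p.bias += rate * err
			for j, xj := range x {
				p.w[j] += rate * err * xj
			}
		}
	}
}

func main() {
	// Invented task: label is 1 when the first feature exceeds the second.
	xs := [][]float64{{2, 1}, {1, 3}, {5, 2}, {0, 1}, {4, 4.5}, {3, 0}}
	ys := []float64{1, 0, 1, 0, 0, 1}

	p := &perceptron{w: make([]float64, 2)}
	p.train(xs, ys, 50, 0.1)

	fmt.Println(p.predict([]float64{6, 1})) // expect 1
	fmt.Println(p.predict([]float64{1, 6})) // expect 0
}
```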
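For the realtime bullet, a minimal sketch of thinking in windows rather than in tables: bucketing timestamped events into fixed tumbling windows and aggregating each bucket. The event shape and the one-minute window are assumptions for illustration.

```go
// Tumbling-window aggregation over a stream of timestamped events.
package main

import (
	"fmt"
	"time"
)

type event struct {
	at    time.Time
	value float64
}

// windowSums buckets events into fixed, non-overlapping windows and
// sums each bucket, keyed by the window's start time.
func windowSums(events []event, window time.Duration) map[time.Time]float64 {
	sums := make(map[time.Time]float64)
	for _, e := range events {
		bucket := e.at.Truncate(window) // round down to the window start
		sums[bucket] += e.value
	}
	return sums
}

func main() {
	base := time.Date(2013, 6, 26, 12, 0, 0, 0, time.UTC)
	events := []event{
		{base, 1.0},
		{base.Add(30 * time.Second), 2.0}, // same minute as the first
		{base.Add(90 * time.Second), 4.0}, // next minute
	}
	for start, sum := range windowSums(events, time.Minute) {
		fmt.Println(start.Format(time.RFC3339), sum)
	}
}
```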
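And for the concurrency bullet, a sketch in go (one of the tools I expect to stick around) of the basic fan-out/fan-in pattern: one worker per core pulls jobs from a shared channel and the results are merged back on another. The squaring "work" is a placeholder for anything expensive.

```go
// Fan-out/fan-in: a worker pool sized to the machine's core count.
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	jobs := make(chan int)
	results := make(chan int)

	var wg sync.WaitGroup
	for w := 0; w < runtime.NumCPU(); w++ { // one worker per core
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range jobs {
				results <- j * j // placeholder for real work
			}
		}()
	}

	// Close results once every worker has drained the jobs channel.
	go func() {
		wg.Wait()
		close(results)
	}()

	// Feed the pool.
	go func() {
		for i := 1; i <= 100; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	// Fan back in.
	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println(sum) // sum of squares 1..100 = 338350
}
```

The same code runs unchanged on one core or thirty-two; that ability to soak up cores without a rewrite is exactly what the serial-world tools lack.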
last update 2013-06-26