When the iPhone was introduced in January 2007, it took the world by storm. The first device to successfully compress a mobile phone, an internet-connected computer and the 21st century’s equivalent of the Walkman into something that would fit easily in a pocket, it offered a breadth of capabilities without precedent. Notably, this was before the SDK that made the “there’s an app for that” campaign possible – an SDK, as an aside, that Steve Jobs was originally opposed to.
What the market is telling developers and their employers alike, effectively, is that it can provide a system to shepherd code from its earliest juvenile days in version control through to its adult stage in production. It is telling them that this system can be robust, automated and increasingly intelligent. And it is also telling them that they have to build and maintain that system themselves.
Fragmentation makes it impossible for vendors to natively supply every component of a fully integrated toolchain. That does not change the reality, however, that developers are forced to borrow time from writing code and redirect it toward managing the issues of highly complex, multi-factor toolchains held together in places by duct tape and baling wire. This, then, is the developer experience gap: the same market that offers developers any infrastructure primitive they could possibly want is simultaneously telling them that piecing those primitives together is their own problem.