In my last post, I mentioned that the biggest reason for producing shit code (in my opinion, anyway) is that we let junior developers run around unsupervised. That’s worth expanding on.
I don’t mean to sound like I think developers with less than a decade of experience should be kept in the barrel. I like working with bright and enthusiastic junior devs – they bring an energy and exuberance to work that more jaded and cynical types such as myself sometimes find difficult to create. Some of the best people I’ve worked with were junior devs at the time – in some cases, they still are. I flatter myself to think I was like that once myself.
The problem is not that junior devs are stupid. They’re not. They are extremely clever. Furthermore, when they hit roadblocks, they know how to use tools such as Google to find answers to their problems immediately. They’ll never have to suffer through hours or days of poring over manuals to figure out the configuration option that makes a product do what it’s meant to do: they’ll just google for a solution, plug it in and go. And, just like I did at the time, they’ll be too busy working out how to solve things to stop and think about whether they should be solving it.
“Experience is something you don’t get until just after you need it.”
― Steven Wright
The thing is, brains and talent are really good, but they don’t substitute for experience. Well, not enough anyway. Bright and talented inexperienced people will make mistakes. It’s that simple. And many of those mistakes are utterly avoidable given a bit of guidance.
Software development is one of the few professions in the world where we take bright young graduates straight out of college – or self-trained about the same age – and toss them straight into work as if they knew what they were doing. Lawyers don’t do that. Doctors don’t. I don’t think accountants do. Teachers don’t. Carpenters, electricians and plumbers don’t. Almost every single profession has built into it an apprenticeship period of sorts (though the white collar ones don’t call it that). They acknowledge that people new to an industry need to work under and learn from people who’ve been there for a while. But not IT.
There are lots of historical reasons why IT has never developed a strong culture of apprenticeship. The biggest, I think, relates to growth and half-life.
Growth is a huge part of the story. IT has grown massively over the last several decades. Every company and organisation wants more – even following the bubble burst of the early 2000s there wasn’t any real slack in demand globally (yes, some geographic areas had an excess). Even in the middle of the GFC there is still massive demand for IT workers. This leads to a pretty constant influx of newer staff – each year there are more than the year before, more or less.
The flip side, though, is that the “career half-life” of your typical developer is less than 10 years. Within about 10 years of starting a career as a developer, about half will have changed paths. They may go into management, or become business analysts, or specialise in testing. They may drop out and start a bakery. Heck – they may even decide to become a DBA for some obscure reason. By 20 years, it will be half of that again. Following this completely-believable-but-totally-made-up statistic, only one in 16 developers will stay a developer over a 40-year career.
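To be clear, that figure just falls out of the half-life arithmetic (on my admittedly made-up numbers): a rough sketch, assuming a 10-year half-life and steady exponential decay:

```python
def still_developing(years: float, half_life: float = 10.0) -> float:
    """Fraction of a starting cohort still working as developers
    after `years`, given a (made-up) career half-life."""
    return 0.5 ** (years / half_life)

for t in (10, 20, 40):
    print(f"{t} years: {still_developing(t):.4f}")
# 40 years is four half-lives: 0.5 ** 4 == 1/16.
```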
What this boils down to is that there aren’t enough senior developers around to mentor the juniors properly. This in turn contributes to the burnout rate. It sucks, but that’s the way it is.
What we really need to do is revive a craft structure. If I had the responsibility for doing this in a company, here’s how I’d do it: set up teams headed by a few craftsmasters, who work closely with journeymen (and -women) developers to do the work. And every journeyperson would be responsible for bringing up an apprentice as well. It’s a structure that’s worked for thousands of years – but it just seems like it doesn’t survive in the modern world 😦
There are places that try to do this – the Craftsmanship Academy is one such initiative. But not enough. Nor is this a new complaint – hell, I remember hearing gripes like this when I started uni 20 years ago.
[BTW, I don’t consider myself to be at the “craftsmaster” level. I rank myself at the senior journeyman stage: I’m competent, know what I’m doing, have a broad range of experience, and can train others. But there’s a long way to go yet, and there’s always a lot more to learn]
Before I go, let me make one thing absolutely clear: junior developers are fantastic assets. Every team needs a good healthy slice of junior devs – which is just as well, as they are the easiest to find. Junior devs will bring new ideas and insights that the senior devs won’t have. Just don’t overload the mix. A senior dev (the “journeyman” level mentioned above) can probably train and mentor at most three juniors – any more and that senior dev is going to be doing supervision, not training. If you shoot for a team size of six to seven, that implies two seniors (journeymen), two senior apprentices, and two junior apprentices, working for a craftsmaster-type.
[Sidenote: what’s a good team size? For a team that’s going to be mostly permanent and stick around for over a year, I like six to eight. Teams of four are more productive (per person), but don’t have much slack and have difficulty coping with turnover or people being absent. You have to allow people to take holidays, you know. Six to eight gives a bit more robustness without much extra overhead. It’s also a good number of people for pairing – four people have only six pair combinations, but six people have 15. Eight people have 28]
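Those pair counts are just “n choose 2” – the number of distinct pairs a team of n people can form, which you can check with the standard library:

```python
from math import comb

# Distinct pairs a team of n people can form: C(n, 2) = n * (n - 1) / 2.
for n in (4, 6, 8):
    print(f"{n} people: {comb(n, 2)} pair combinations")
# 4 -> 6, 6 -> 15, 8 -> 28
```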
So, to put a qualifier on the title – unsupervised junior developers considered harmful.
5 thoughts on “Junior Developers considered harmful.”
“I flatter myself to think I was like that once myself.”
Somehow I can’t picture that 🙂
Me being like that, or me flattering myself? 😉
This is a nice thought-provoking post. I would go further, though, and say none of us should run around unsupervised. It’s not just a matter of getting a good start – it’s a matter of continuous learning. Some of the worst code I’ve seen was written by developers with over ten years’ experience. Or, perhaps, one year of experience repeated ten times.
Mentoring is good, but insufficient. We need to continue talking about our code, sharing new ideas and learning from each other. Not only is there “always a lot more to learn,” but sharing our stories is how we build our culture.
Completely agree with all of that, George. I need to follow up with a “Senior Developers considered harmful” post as well.
Software development isn’t a solo sport anymore – at least it shouldn’t be.
I have a point similar to George’s. In IT and programming, the situation is – and probably always will be – constantly changing. New languages come up, new paradigms (ajax vs server-side), new methodologies. All of these challenge existing knowledge and experience. Some patterns are always applicable; others must be abandoned completely. So, as you’ve already said, “Older developers considered harmful”, as well. If their knowledge and experience mostly solidified after working on Cobol for 10 years, 30 years ago, they aren’t going to bring much to the table on a modern project, and may even train juniors in bad habits.
This is fairly unique among the world’s industries. Few companies are expected to constantly change, and it shows, because many of them aren’t able to (Sony). Craftsmen, as you mention, would become fairly adept at, say, carpentry within 5 years, and then go on to add extra experience with different wood, climates, styles, etc. A new carpenter would gain much from a 40-year veteran. Further, companies expect this as well – that an older developer will be much faster with better quality, and bring that experience to the younger developers. Conversely, they also expect that all developers on a team should be coding, not being trained. I think this goes some way to explaining the sometimes bizarre hiring choices they make.