Training a Graduate Software Engineer
Previously I’ve written about universities and how well they prepare students for industry. At the time of that post I was working in a large company with a compact digital department that mainly used Ruby and front end skills; now I’m in a giant media company with a sizeable digital operation, and the move has shown me just how tricky it is to build a curriculum that caters to industry as a whole. Overall, though, the move has changed the specifics of my views, but not the general philosophy.
If we take decent software engineering graduates, or students doing industrial placements during their degree, how would we best prepare them for industry? If people have the desire to be good software engineers, what do we need to do to equip them with the skills to be desirable in a team?
Here is what MY post-graduation curriculum would cover, to create engineers for MY specific needs:
- Test driven development (see the sketch after this list)
- Agile processes focusing on collaboration & teamwork
- Advanced version control
- Continuous delivery
- Refactoring
- Caching strategies for the web
- Building services & APIs
- Front end technology bootcamp (getting started with JS, HTML5 & CSS3 intros)
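To make the first item concrete, here is a toy red/green cycle using Node's built-in test runner. The discount rule and the `applyDiscount` function are invented for illustration; the point is the workflow, not the feature.

```js
// price.test.js — a toy TDD cycle: failing test first, then the
// simplest code that passes. Run with: node --test price.test.js
const test = require('node:test');
const assert = require('node:assert');

// Step 1 (red): write the test before the code exists.
test('applies a 10% discount to orders over $100', () => {
  assert.strictEqual(applyDiscount(200), 180);
  assert.strictEqual(applyDiscount(50), 50);
});

// Step 2 (green): the simplest implementation that passes.
function applyDiscount(total) {
  return total > 100 ? total * 0.9 : total;
}

// Step 3 (refactor): with the test as a safety net, reshape freely.
```

Trivial on its own, but the discipline of red/green/refactor is exactly the kind of thing that needs practice alongside peers with expert guidance.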
A crucial point here is that, aside from Agile processes, the main things I want covered are backed by computer science fundamentals at some level. In other words, the existing curriculum of a computer science degree is not redundant, as one of the opinion pieces mentioned in my previous post suggested.
We could teach these subjects to students after graduation, but could they be taught as part of a university course? To some degree, yes, but it is not as simple as that, as the teaching would have to fit within the structure of university courses. A course in an Australian university is usually twelve to thirteen weeks long, with a full-time student usually doing four subjects per semester. Face-to-face time for a course is usually an hour or two of lectures a week, and another hour or two of tutorial time. To develop many (not all) of these skills, students need to be around their peers with expert guidance – they’re not skills that the majority of students will just pick up by reading a book.
With only a few dozen hours of face-to-face time in total per course, it would be a real challenge to get beyond the basics if we tried to squeeze the bulk of this into a single course. If we changed the approach from a single course to a whole semester covering industrial skills, i.e. four concurrent subjects, then we might have better success. Let’s see how that might look:
- Subject 1: Being an Agile developer
- Test driven development
- Refactoring
- Advanced version control
- Continuous Delivery (covering CI, but also how version control, CI and automated testing fit together to deliver to production)
- Subject 2: Working with others
- Agile team skills - collaboration, user stories, roles, planning
- Building a team and delivery culture
- Subject 3: Supporting web applications
- Caching strategies for the web
- Building services & APIs (see the sketch after this outline)
- Subject 4: Building in the browser
- JavaScript
- HTML5 & CSS3
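To give a flavour of Subject 3, here is a minimal sketch of a JSON endpoint that combines two common web caching strategies: expiration via Cache-Control and validation via ETag/304. The /api/articles route and its payload are hypothetical, invented for illustration.

```js
// server.js — a minimal caching sketch on a JSON API (Node, no frameworks).
const http = require('node:http');
const crypto = require('node:crypto');

http.createServer((req, res) => {
  if (req.url !== '/api/articles') {
    res.writeHead(404);
    res.end();
    return;
  }

  const body = JSON.stringify({ articles: [{ id: 1, title: 'Hello' }] });
  const etag = `"${crypto.createHash('sha1').update(body).digest('hex')}"`;

  // Validation: if the client already holds this version, reply 304 with no body.
  if (req.headers['if-none-match'] === etag) {
    res.writeHead(304, { ETag: etag });
    res.end();
    return;
  }

  // Expiration + validation: serve from cache for 60s, then revalidate via ETag.
  res.writeHead(200, {
    'Content-Type': 'application/json',
    'Cache-Control': 'public, max-age=60',
    ETag: etag,
  });
  res.end(body);
}).listen(3000);
```

Even a toy like this gives students something concrete to reason about: when a CDN or browser can serve the response, and why validators matter once max-age expires.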
There, now that could work. Of course, the problem with this curriculum is that it is built for my needs as a hiring manager in a particular company context. Would it work for everyone? No, and that is the problem with insisting that universities teach these courses. Could they work as individual electives? Perhaps, but then I wouldn’t be hiring an industry-ready developer if they hadn’t covered the breadth of the curriculum outlined above.
Could a company work this curriculum into on-the-job training? Absolutely, but to be really effective it would require teaching skills that most companies don’t have the capability or bandwidth to deliver. And relying on external training partners is not likely to be achievable from a cost perspective, particularly when it can be difficult to retain talent – investments that walk out the door are an issue.
So what is the answer? I think this could be fixed by a combination of approaches. Universities should consider how they teach the practical skills of being a software practitioner, in particular version control, automated testing and refactoring, plus some exposure to a wider variety of commercially popular languages. Companies then need a great approach to building teams, and could probably deliver it at reasonably low cost and with little disruption, since building great teams should be one of their chief objectives anyway. And finally, companies can support both internal and external communities to help drive skills in areas such as web development and browser-based technology.
Obviously coordinating this level of knowledge building across institutions would be difficult, particularly as a single university sends graduates out to dozens or hundreds of different companies, but with some level of cooperation this could be a decent approach.
A final note on this subject: it must be highlighted that a professional developer needs to take a lot of responsibility for their own progression in these areas. I recognise that some students will struggle not with motivation but with exposure – they don’t know what they don’t know, and that can make it hard to learn.
I’m interested in other thoughts in this area. I have spent a small amount of time with universities this year, and it has piqued my interest further, as I think faculties are asking themselves how to stay relevant in a world with so much online education. For mine, universities are definitely still relevant, but they do need to adjust, just as companies need to come to the party.