Lead Data Scientist, BAE Systems Applied Intelligence
23 Feb 2021
Hannah Green says adaptability and mindset are both crucial when it comes to the ever-evolving field of software development
Data scientists like myself are, as you might expect, constantly scanning the horizon for the latest ground-breaking piece of tech. Like moths to a flame, we’re always drawn to the Holy Grail of something that promises speed and strength, efficiency and effectiveness.
Sometimes it doesn’t even need to be that big. Marginal gains can often have a profound and cumulative impact, but these developments are by no means limited to software. Sometimes change can come in the form of cultural approaches – such as Agile and Waterfall.
Both methodologies have their strengths and their advocates. Agile’s flexibility, for example, allows project teams to respond to customer reaction and constantly improve the product, whereas Waterfall’s sequential process minimises financial surprises, helps maintain timescales and helps deliver a clear outcome.
Personally, the vast majority of my experience is with Agile – I’m Tech Lead for the Royal Navy’s Nelson Programme’s Data Platform Development Team. I appreciate the ownership it gives me and the ability to deal well with change. But, to be honest, I am starting to think that one side effect of this approach is that it can allow the tactical solution to win more often than it would on a more traditionally managed project.
But what do I mean by “tactical” exactly? In this context, according to Harvard Business Review, the risk is ending up either too focused on process and micromanagement, or too adaptive, meaning you end up avoiding long-term goals, timelines, or cross-functional collaboration.
So perhaps I don’t actually mean tactical, I mean adaptive.
I’m thinking about those situations when you do something slightly less than ideal in the long term to deliver value NOW. For example, when a project is in its early stages and you are trying to secure the next round of funding, it’s far more valuable to show the person holding the purse strings something exciting right now than it is to have delivered a couple of foundational code blocks to production quality. It can be hard to assign value to code that doesn’t actually do anything by itself – I appreciate this is not true in all cases, but it has been in all those I have experienced.
At some point, however, this approach has to change – and that involves, more than anything, a change in mindset that can be quite a challenge.
The opposite end of the scale is the ‘do it once, do it well’ approach. In our organisation, this is the approach needed to build a warship. You can’t change a design halfway through the build, so the time is taken up front – often years – to get it nailed down so that it can then be implemented, often over the course of many years.
But these are opposite extremes, and for most projects there is a balance to be found.
In software, change is often a good thing. If you intend to use the same technologies for the next five years, by the time those five years are up you will likely be missing out on some cool new tech that could make your software so much better.
Different stages of the Software Development Lifecycle require different approaches. In alpha development, as long as a technology choice is right for the next six months, that is probably fine. When something goes live, you need it to be stable for some period – say the next five years. But the need for stability and the need to lock down technology are two different things. This means designers and data scientists have to build in the flexibility to change tools: then if you need to, you can, and if you don’t, you don’t.
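To make the idea concrete, here is a minimal sketch (entirely hypothetical – the names and the in-memory store are illustrative assumptions, not anything from the Nelson programme) of what “building in flexibility to change the tools” can look like in code: the rest of the system depends only on a small interface, so the underlying technology can be swapped without rewriting the pipeline.

```python
from typing import Protocol


class ResultStore(Protocol):
    """The small interface the rest of the code depends on.

    Swapping the backing technology later only means writing a
    new class with these two methods - the pipeline is untouched.
    """

    def save(self, key: str, value: dict) -> None: ...
    def load(self, key: str) -> dict: ...


class InMemoryStore:
    """The adaptive choice: good enough for an alpha, easy to replace."""

    def __init__(self) -> None:
        self._data: dict[str, dict] = {}

    def save(self, key: str, value: dict) -> None:
        self._data[key] = value

    def load(self, key: str) -> dict:
        return self._data[key]


def run_pipeline(store: ResultStore) -> dict:
    # The pipeline only sees the ResultStore interface, so replacing
    # InMemoryStore with a database-backed store later needs no
    # changes here.
    store.save("metrics", {"accuracy": 0.91})
    return store.load("metrics")
```

The point is not the in-memory store itself, but that the decision about which storage technology to commit to has been deferred behind an interface that costs almost nothing to maintain.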
Even if flexibility is coded in, being flexible comes down to decision making and mindset, and having the awareness to make the right decision. It requires consciously making the adaptive decision, or consciously taking the longer term view. It requires understanding when the adaptive decision will cause extra work down the line, and how much, but it also requires understanding when the long term decision will push the value too far out to be palatable to stakeholders.
Striking a balance
The ideal scenario is when the adaptive solution is a stepping stone to the longer term answer. It might only be a piece of that answer: a schema might not yet have all the fields it will ultimately need, or the testing might not be automated yet.
The important part is that it gets you a step along the path to the full solution. With a partial schema, you can begin iterating a data science algorithm. With manual testing, you can release something, even if only to Beta. This means you can accelerate showing value without spending effort that, in the longer term, may be viewed as wasted when the adaptive solution is replaced.
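As an illustration of the partial-schema stepping stone (the field names here are hypothetical, invented for the example), the fields you have today can be enough to start iterating an algorithm, and the missing fields can be added later with defaults so the early work is not thrown away:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class SensorReading:
    # The fields needed to start iterating the algorithm now.
    timestamp: float
    value: float
    # A field the full schema will ultimately need, added with a
    # default so existing code keeps working when it arrives.
    source_id: Optional[str] = None


def rolling_mean(readings: list[SensorReading], window: int = 3) -> list[float]:
    """A first-cut algorithm that depends only on the partial schema."""
    values = [r.value for r in readings]
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Because `source_id` defaults to `None`, code written against the partial schema keeps running unchanged once the field is populated – the adaptive version becomes a genuine piece of the long-term one rather than throwaway work.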
The bottom line is that rigidly sticking to one approach just doesn’t work anymore. This means that one of the most important things you can do for your career is get comfortable with change. Sticking with the status quo will only lead you to chasing waterfalls.