Fads and frameworks come and go. Frances Buontempo encourages us to find what works rather than follow fashions.
Having been on gardening leave recently, I have in fact been gardening. As you can imagine, this hasn’t left enough time to think of an editorial, so I apologise. I used some loppers to cut down a climbing hydrangea, which was making its way up a wall and over a roof. This seemed like a good idea, but there are a couple of cables going over the roof and into the wall. Fortunately, the one I cut through by mistake is for a television point we don’t use. A close call, since the other cable connects us to the internet. A near disaster I am not proud of. With hindsight, using something smaller, like secateurs, would have allowed more precise chopping. Though they wouldn’t have got through the thicker branches, I would have been able to see what I was doing more clearly. I possibly also need to go to the opticians, but that’s another story.
They say a bad worker blames their tools, but if you use a hammer to put a screw in place, things will go wrong. Perhaps someone blaming a hammer for its failure to put a screw in place is a sign of a bad worker. The tool used does affect your work, though the effects can be subtle. When I write my excuses for a lack of editorial straight into a word processor, my writing style appears to change slightly. I’m not sure why and find it hard to vocalise how, but I seem to want to type paragraphs in order and end up with slightly tortured links between ideas. If I scribble notes on paper first, I can draw arrows to remind me to move stuff around. Obviously, you can move paragraphs around in a word processor too, but then I forget what I was in the middle of. A line on a bit of paper seems less disruptive. In contrast, trying to make notes on code outside an editor means I have piles of paper dotted around, none of which make any sense afterwards. Leaving a TODO comment or, better, something which won’t compile works better for me. This doesn’t mean there is one true way for any creative process, but there are options, and some work better for some people than others. Context is everything.
I have seen a recent trend claiming that agile is the only way to succeed with software projects. In practice, this usually means a very rigid Scrum process. Though this can work, it can also devolve into what might be termed ‘dark Scrum’. Ron Jeffries coined this term, saying, “Too often, at least in software, Scrum seems to oppress people. Too often, Scrum does not deliver as rapidly, as reliably, as steadily as it should. As a result, everyone suffers. Most often, the developers suffer more than anyone.” [Jeffries16] I have seen estimates given in story points, which are not supposed to reflect the time required to complete the work, held up as promises. If a developer takes longer than a manager expects that number of story points to need, trouble ensues. If you want to estimate how many hours’ work something might take, don’t use story points [Cohn14]. Scrum is not the only project management approach, and it is sometimes not the right tool for the job. Kevlin Henney recently republished a blog post [Henney21] about the development process. He suggests that many agile shops are in fact using “waterfall projects rebadged with new terminology, more urgent time scales and increased micromanagement.” Micromanagement is almost always mismanagement, and claiming to be agile because you hold all the right ceremonies is not agility. Furthermore, as Kevlin suggests in his blog, waterfall might be appropriate sometimes. The right process for a job depends on the context.
As opinions on processes change, tooling changes over time too. When did you last use a fax machine? Faxes were the right choice before scanning and email became common; they now seem like a slightly pointless historical curio. How many SCART cables do you own? Do you still have some ‘off the shelf’ software on a CD or DVD, but a computer without a CD drive? Or, even more redundant, a floppy disc or two in a drawer somewhere? I have a nagging feeling I have a slide rule somewhere. Unlike the digital tools, a slide rule would still work, though I would need to spend a moment re-learning how to use one. Much technology becomes obsolete, and some of it really rather quickly. I tend to think it takes a generation or two for tech to fade away, but fax machines prove me wrong. I do wonder if some new tech under active research might turn out to be a short-lived fad if it ever becomes a reality, self-driving cars being one thing I have in my sights. I’ve said it before, but I want a transporter, and a replicator while you’re on the case, not a self-driving car. A functional public transport system, more people eating locally grown (or replicated) produce, thereby taking a few lorries off the road, and other ways to reduce the volume of traffic can only be good things. Since the UK seems to be suffering a lorry driver shortage at present, this is now a pressing need. “Tea, Earl Grey. Hot.” as Picard says.
To continue on the theme of fads, I shall now turn my attention to AI. For a long while, many new products proclaimed they used machine learning. More recently there seems to have been a move to claiming things are powered by AI. Much of this is an outright lie; however, some systems, such as chatbots, smart speakers and recommender systems, genuinely use elements of AI. Interestingly, anecdotal evidence suggests an increased use of deep learning recently. Specifically, many winners of Kaggle competitions [Kaggle], a website offering prizes for the analysis of many disparate datasets, have used deep learning, particularly for ‘unstructured’ problems – vision, text and sound [KDnuggets]. For those unfamiliar with this tech, deep learning is a type of neural network with many ‘hidden layers’ of neurons, giving it many more sums to do than a traditional feedforward neural network, which tends to have only one hidden layer. Inputs go to the first layer, magic (maths) happens in the hidden layers, then predictions come out at the final layer. Back to this deep learning fad. Why is it happening? I suspect a spot of TDD – tool-driven development – here. There are many free cloud-based frameworks that support deep learning, so it is easy to chuck some data into a network and see what happens without any local setup. You can even invoke the power of many GPUs without having them locally. This doesn’t make it the right tool for the job. I am not disparaging deep learning completely. It does manage some near-magic in disparate domains. I’m not convinced anyone really understands precisely how it works, though. Perhaps whatever works is the right tool for the job.
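If you want to see the shape of that in code, here is a minimal sketch in Python, using scikit-learn’s MLPClassifier on its bundled digits dataset. It is a small multi-layer network rather than an industrial-strength deep learning setup, and the three hidden layers and other settings are arbitrary choices for illustration, not a recipe.

  from sklearn.datasets import load_digits
  from sklearn.model_selection import train_test_split
  from sklearn.neural_network import MLPClassifier

  # Inputs (8x8 digit images flattened to 64 numbers) go in at the first layer.
  X, y = load_digits(return_X_y=True)
  X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

  # Three hidden layers of 64 neurons each; a 'traditional' network might have one.
  net = MLPClassifier(hidden_layer_sizes=(64, 64, 64), max_iter=500, random_state=0)
  net.fit(X_train, y_train)          # the maths (weight fitting) happens here
  print(net.score(X_test, y_test))   # predictions come out at the final layer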
I gave a talk at an online conference earlier this year, The Machine Learning BASH [BASH], in which I explored the term ‘regression’. I was asked which machine learning frameworks to use. Between ourselves, I panicked at this question. I do tend to home in on scikit-learn [scikit-learn], a Python framework providing curve fitting, classification and clustering tools, and more besides. However, I have also used JCLEC, a Java framework for evolutionary computation [Ramirez17], and played around with TensorFlow, Keras, Google’s Colab and many other frameworks. To be frank, I can’t keep up. As I start to get the hang of one thing, a new kid turns up on the block and I feel left behind. I have decided to stick with understanding the maths behind the algorithms and learning how to use the latest tool when I need to, in the full knowledge that my ability to drive the tool will become obsolete very quickly. Reading the docs for the latest version when required is better than memorising APIs and the like, which are bound to change, at least to my mind.
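To give a flavour of the curve fitting I mean, here is a minimal sketch using scikit-learn’s LinearRegression; the straight-line data is made up purely for illustration.

  import numpy as np
  from sklearn.linear_model import LinearRegression

  X = np.arange(10).reshape(-1, 1)    # one feature, ten samples
  y = 3 * X.ravel() + 2 + np.random.default_rng(0).normal(0, 0.5, 10)  # a noisy straight line

  model = LinearRegression().fit(X, y)   # fit y = a*x + b by least squares
  print(model.coef_, model.intercept_)   # roughly 3 and 2
  print(model.predict([[10]]))           # predict one step beyond the data

Nothing fancy, but quite often that is all the job needs.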
I acknowledge some people like to be at the bleeding edge, so make an effort to always upgrade to the latest version or try the newest framework or language. I do feel like I’m missing out when I haven’t tried Rust or whichever new language, tool or tech everyone is talking about, but I have learnt to just watch some things from the sidelines. There isn’t enough time to learn everything, and your brain is a finite resource, so I’m sure learning some new things makes you forget other things. Sometimes, if you need to analyse data, keep track of something or create software, the best tool for the job is one you know. It’s OK to use your favourite IDE, even if people around you are being snobby and trying to make you use Vim or another editor. It’s acceptable to draw an architecture diagram with a pencil on a piece of paper, rather than spending an hour trying to work out how to draw a text box in Lucidchart or Visio or the like. It’s fine to use a spreadsheet if you want some basic adding up or plots and don’t know R or Python very well. I might raise an eyebrow if you tried to do this in C++, but if that’s what you want to do and can get results quickly enough that way, do it. Your experience counts for something. Unfortunately, though, even if you are familiar with one toolchain, you can find yourself in a position where you cannot use what you know. I recall having to use very old versions of compilers and similar and being stumped when things went wrong. Pro tip – don’t read the latest docs for gcc or Python if you are using a much earlier version. Don’t expect things that compile with Microsoft’s compiler to compile with gcc – though, to be fair, the gap is narrowing compared to twenty years ago. Sometimes you have to use what’s to hand, even if you suspect there’s a better way.
While some businesses are stuck on old versions of tools and others are on the cutting edge, many will have coding standards dictating how to do almost anything. This ‘One True Way’ may be enforced automatically, or via gate-keeping code reviews. It is what it is; however, notice that these guidelines all tend to vary. They are written by people, and everyone has a different history. We’ve all been burnt by slightly different problems in the past, or been taught one way to arrange braces and whitespace. Aside from the layout of the code, many guidelines stray into diktats on testing, telling you to always/never use mocks, achieve 100% coverage with end-to-end tests, or ensure the unit tests run quickly. Always use parameterised tests. Never use parameterised tests. And so on. Perhaps the variety of guidelines means we’re all still trying to figure out what works. Time will tell.
Here’s the thing: many tools, processes and guidelines make sense when you look at the world one way, but if you change your perspective, different things come into focus. I’ve recently been reading a handful of physics books I found on our bookshelves. The tensions and contradictions between quantum models (at the small, subatomic scale) and classical models (at the larger scale of people and planets) have left physicists searching for a grand unified theory. We do not seem to have found this yet. Classical models and relativity see the world as smooth and predictable. Quantum models have packets, or quanta, and are probabilistic. Both models make accurate predictions, even though they seem to make conflicting assumptions about the fundamental nature of the universe. The trick is to use the right equations for the scale at which you need results.
It seems there can be such a thing as the right tool for the job, even though opinions can be divided. Maybe the best thing to do is stand firm, ensuring you are on a stable footing. I have been told your stance can make a huge difference in snooker. You might think it’s all about maths models, and angles and trig, but it turns out you need to be able to stand firmly and look where you’re aiming. Don’t get distracted by what’s going on around you. Keep your eyes on the balls. Don’t be shy about using something you are familiar with if it gets the job done, but be willing to try out new things once in a while.
References
[BASH] Machine Learning BASH: https://www.youtube.com/watch?v=CFwlCCM8ZnI
[Cohn14] Mike Cohn, ‘Don’t Equate Story Points to Hours’, posted 16 Sept 2014 on https://www.mountaingoatsoftware.com/blog/dont-equate-story-points-to-hours
[Henney21] Kevlin Henney, ‘Getting over the Waterfall’, posted 30 Aug 2021 on https://kevlinhenney.medium.com/getting-over-the-waterfall-c090c6228ca9
[Jeffries16] Ron Jeffries, ‘Dark Scrum’, posted 8 Sept 2016 on https://ronjeffries.com/articles/016-09ff/defense/
[Kaggle] Competitions: https://www.kaggle.com/competitions
[KDnuggets] ‘Lessons from 2 Million Machine Learning Models on Kaggle’ at https://www.kdnuggets.com/2015/12/harasymiv-lessons-kaggle-machine-learning.html
[Ramirez17] Aurora Ramirez and Chris Simons (2017) ‘Evolutionary Computing Frameworks for Optimisation’ in Overload 142, published December 2017 and available from https://accu.org/journals/overload/25/142/ramirez_2444
[scikit-learn] scikit-learn: Machine Learning in Python at https://scikit-learn.org/stable/
Frances Buontempo has a BA in Maths + Philosophy, an MSc in Pure Maths and a PhD technically in Chemical Engineering, but mainly programming and learning about AI and data mining. She has been a programmer since the 90s, and learnt to program by reading the manual for her Dad’s BBC model B machine.