Geeks and Repetitive Tasks

by Joey deVilla on April 24, 2012

Graph: Geeks and Repetitive Tasks

Found via Bruno Oliveira.


1 David Janes April 24, 2012 at 7:40 am

The key decision points are knowing when to write the script and estimating the slope. I’ve taken the general approach that the second time I do the same thing, it’s time to consider writing a script (or a common function, etc.). Where I’ve been caught out is overestimating the slope – if the lines don’t cross for years, you’ll probably never get the time payoff.

cf. also your post from a long time ago about “so you’re considering writing a framework”.

2 Frank April 26, 2012 at 6:27 am

You should mention that the time scale is logarithmic.

3 Kwazai April 26, 2012 at 6:29 am

I’d bet they’re more like bell curves (statistics, after all), and the slope would only apply within two standard deviations of repetitiveness before the process that needed it changes.
If you change the light bulbs in the process, for example, the non-geek’s productivity goes up… lol

4 Jamie April 26, 2012 at 8:18 am

Even if it doesn’t technically “pay off” every time, I would almost always rather do something the way that avoids the boring repetitive task. I’d rather spend six months inventing and building a machine to paint houses than spend four months on a ladder with a paintbrush in the hot sun.

Of course sometimes you don’t have six months to paint a house. In those situations, unless it’s just not possible to get the job done in time, I’d still usually rather work 12 hour days building a machine than spend the same number of 8 hour days painting.

Painting sucks. Creating and building new things is fun. Even if I have to spend more time to do the same job, I’d still rather do it that way. And of course I now have a painting machine and will paint circles around you next time.

5 Robert Poofenfloofer April 26, 2012 at 9:17 am

This is awesome!

6 Jwalant Natvarlal Soneji April 26, 2012 at 11:13 am

Nice explanation…

7 JRich April 26, 2012 at 12:16 pm

Also consider the consistency and accuracy of a known script versus the manual approach, whereby the means to the result could (and will) be recreated differently each time…

8 David Bowman April 26, 2012 at 7:15 pm

I am exactly such a geek. There are two bits missing from the graph.

1) The geek goes on holidays
The XXX department decides this is a great time to rearrange the folder structure on the shared drive. The script stops and everyone misses the automated email reports. The manual process is dusted off and reinstituted, but the script is on version 1.4 by now. Hurried emails, panicked meetings. The geek is called and, over the phone, steps the non-geek through altering some declarations at the top of the script (a sketch of that pattern follows this comment). Sweetness resumes, although conspiracy theories about the geek abound.

2) The base system is upgraded
The geek announces he needs a week to update 82 scripts. This is a disaster, and it goes to senior management to review whether the upgrade should even proceed. Everyone forgets the savings of the last 3 years. The geek quietly does it. Sweetness resumes, although conspiracy theories about the geek abound.
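
The “declarations at the top” pattern from scenario 1 might look something like this minimal sketch. The folder path, addresses, and mail host below are placeholders invented for illustration, not details from the story:

```python
# Hypothetical sketch: everything a non-programmer might need to change lives
# in plainly named constants at the top, so a folder reshuffle means editing
# one or two lines rather than touching the logic below.

from pathlib import Path
import smtplib
from email.message import EmailMessage

# --- Declarations a non-geek can safely edit --------------------------------
SHARED_DRIVE = Path(r"\\fileserver\shared\reports")   # placeholder folder
REPORT_RECIPIENTS = ["team@example.com"]               # placeholder addresses
SMTP_HOST = "mail.example.com"                         # placeholder mail host
# -----------------------------------------------------------------------------

def build_report() -> str:
    """Summarise the files currently sitting on the shared drive."""
    files = sorted(p.name for p in SHARED_DRIVE.glob("*.csv"))
    return "Files found:\n" + "\n".join(files)

def send_report(body: str) -> None:
    """Email the report to everyone on the list."""
    msg = EmailMessage()
    msg["Subject"] = "Automated report"
    msg["From"] = "reports@example.com"
    msg["To"] = ", ".join(REPORT_RECIPIENTS)
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    send_report(build_report())
```

Keeping everything a non-programmer might touch in one clearly marked block at the top is what makes the over-the-phone fix possible.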

9 swampwiz April 26, 2012 at 11:30 pm

Interesting. As an aerospace engineer, I had a supervisor who said that unless I could be sure that automating a task would be quicker than doing it “by hand”, I should do it by hand. Instead, I always took the automated route unless the task was really small. Sure enough, one of those tasks came back to me many times (with only a small change in the input). Because I had automated it, it was a breeze.

Another time, as a software developer contractor, I was asked to do a task and, thinking that it would be used repeatedly, looked to automate that as well. After a while of not having any result (because it took a while to set up the application), I got canned.

10 Tim Acheson April 27, 2012 at 5:30 am

Automation can be the key to effective time management, and to quality as well. Manual vs. automated acceptance testing of a website or web service is a classic example. The most labour-intensive tasks are the ones most prone to corner-cutting when performed manually.
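
As a rough illustration of that point, a smoke-test script along these lines can run the same acceptance checks on every deploy, with no temptation to cut corners. The URLs and expected strings here are placeholders, not a real service:

```python
# Minimal, hypothetical smoke-test sketch: fetch a few pages and fail loudly
# if the status code or expected content is wrong.

import sys
import urllib.request

CHECKS = [
    # (url, expected HTTP status, text that must appear in the body)
    ("https://example.com/", 200, "Example Domain"),
    ("https://example.com/health", 200, "ok"),
]

def run_checks() -> bool:
    all_ok = True
    for url, expected_status, expected_text in CHECKS:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
                passed = resp.status == expected_status and expected_text in body
        except OSError as exc:
            print(f"ERROR {url}: {exc}")
            passed = False
        print(f"{'PASS' if passed else 'FAIL'}  {url}")
        all_ok = all_ok and passed
    return all_ok

if __name__ == "__main__":
    sys.exit(0 if run_checks() else 1)
```

Wiring something like this into the build means the boring checks run every time, not just when someone remembers to do them.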

11 Anonymous Coward April 29, 2012 at 1:19 am

I’m not so sure about the paint analogy presented above. With every repetitive task, unless it’s a very simple and straightforward one, you need to do it a few times by hand before you understand it well enough to reliably automate it. Painting a house properly, in my experience, is such a task. If you need to do it manually 2-3 times before you understand all the corner cases (pun intended) well enough to build a machine, you’re better off doing it manually.

Also, in many cases a simplistic approach to automation can lead to undesired side effects. A recent example: in our code review and submission process, a simple script a colleague created generated so many reviewable snapshots that review became nearly impossible – the deltas were spread across too long a string of snapshots.

I’d say “geek gets annoyed” above needs to be replaced by “geek groks how to automate it properly”, and in many cases that point sits too far to the right of the diagram for automation to be reasonable – the cases with low frequency, or with higher frequency but proportionally high process complexity or variability in execution.

I once had to rename thousands of files every few hours, according to a complicated scheme and some non-trivial pattern matching. I spent a couple of days writing a dedicated tool for this, and it paid off, even though the renaming only went on for a few weeks (a sketch of that kind of tool follows this comment). Would such a tool pay off if you had to rename only a few dozen files a day, according to the same complex rules, even over a longer period of time? I’m pretty sure it wouldn’t – the manual work of starting it up and entering the right renaming rules wouldn’t pay off for a small number of files. For just a few files, you’d be better off with a few cached command lines, in terms of effort.

OTOH, I most often find myself writing small shell scripts to aid me in running tests, re-initializing a debugging environment and the like, even if these scripts most often don’t live for longer than a few days or weeks, months at most. (They do tend to get taken over by QA, though, being reused at least partially to build longer-lasting stuff).

Therefore my approach to automating stuff is the same as the one to writing software in general: be lazy. Choose the path of minimum effort – in the long run. Don’t automate what’s not worth automating.
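
For what that dedicated renaming tool might have looked like, here is a minimal, hypothetical sketch of a rule-driven batch rename. The directory, patterns, and replacements are invented for illustration; the real scheme was certainly more complicated:

```python
# Hypothetical rule-driven batch rename: each rule is a regex plus a
# replacement template, and the first rule that matches a filename wins.

import re
from pathlib import Path

TARGET_DIR = Path("incoming")  # placeholder directory

RULES = [
    # (compiled pattern, replacement template)
    (re.compile(r"^report_(\d{4})(\d{2})(\d{2})\.txt$"), r"\1-\2-\3_report.txt"),
    (re.compile(r"^IMG_(\d+)\.jpeg$"), r"photo_\1.jpg"),
]

def rename_all(directory: Path, dry_run: bool = True) -> None:
    """Print (and optionally perform) the renames implied by RULES."""
    for path in sorted(directory.iterdir()):
        if not path.is_file():
            continue
        for pattern, template in RULES:
            if pattern.match(path.name):
                new_name = pattern.sub(template, path.name)
                print(f"{path.name} -> {new_name}")
                if not dry_run:
                    path.rename(path.with_name(new_name))
                break  # first matching rule wins

if __name__ == "__main__":
    rename_all(TARGET_DIR, dry_run=True)  # flip to False once the plan looks right
```

The dry_run flag is the part that earns its keep: print the planned renames, eyeball them, and only then let the script loose on thousands of files.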

12 Andrea Zimmerman May 7, 2012 at 2:38 pm

OMG I love this graphic!!!!!

I’ll have to show this to my hubby. He made fun of me for making Excel workbooks to track info about my online gaming with conditional formats set up to draw attention to important things,

UNTIL…
he had a spreadsheet he was using to keep track of student attendance and wanted an easier way to see who had hit the limit of too many absences. THEN he became a fan of my workbooks and conditional formats.

And now the rest of the TAs use the workbook I built :)

13 Tim Wry May 13, 2012 at 1:40 am

You did not finish this graph. The non-geek loses briefly, then claims the script as his own and forgets about the now-redundant geek.

Lion eats fox. End of story.

14 Hayden November 30, 2013 at 2:52 pm

Haa! Good one! But the non-geeks will catch up as the software gets smarter. Just check out http://proteg.ee/mentor to get the idea.

15 Travis Cottreau June 4, 2014 at 5:49 pm

16 MC July 11, 2014 at 12:41 am

Then the Geek leaves the company, and all falls apart – because no-one else bothered to understand how the work was done.
They just knew it had to be done.
