tune into the voice of a spec
Sunday, December 9 2007
It seems I'd drunk more than I'd thought last night, which gave me a hangover today. It was a bad day to have a hangover, because I was now up to my neck in the construction of that flexible questionnaire-building system I've been working on. In this particular system, the answers are gathered and processed according to a complex recipe that has been spec'd out for me. An early decision in the development of this system was to make it so the recipes could be easily defined and altered. In making it this way I was actually going far beyond the spec, which had, after all, explicitly defined the recipes for me. Most people attacking the problem would have built the system exactly according to the given recipes, but I could see that doing so would be foolishness. The spec was difficult to read, what with its constant intermingling of units of different scale and occasional lapses into technical hand-waving. Rather than fretting about whether I was getting things right, I exposed most of my algorithms as pseudocode in a database column, one allowing for a string full of keywords that I could replace, on the fly, with actual values from an ongoing calculation. Once these replacements were done, the expression could be evaluated using PHP's handy eval() function.
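A minimal sketch of what I mean (the recipe string, field names, and values here are all made up for illustration): a pseudocode expression lives in a database column, its keywords get swapped for live values, and then eval() does the rest.

```php
<?php
// Hypothetical recipe string, as it might sit in a database field.
$recipe = 'SCORE_A * 0.6 + SCORE_B * 0.4';

// Values computed so far in the ongoing calculation.
$values = array('SCORE_A' => 80, 'SCORE_B' => 50);

// Replace each keyword with its current value.
// array_map preserves keys, so strtr sees keyword => value pairs.
$expression = strtr($recipe, array_map('strval', $values));

// Evaluate the finished expression with eval().
// (eval() on anything user-supplied is dangerous; the assumption
// here is that recipes are trusted, developer-authored strings.)
$result = eval('return ' . $expression . ';');

echo $result; // 68
```

The appeal of this arrangement is that changing a recipe means editing a string in the database, not redeploying code.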
I hadn't thought much about it before, but today I realized that one of my strengths as an independent worker is that I am happy to table issues that I don't understand while continuing work on the parts I do understand. (That Windows XP doesn't behave this way when copying large numbers of files is one of my major complaints about that operating system. Has this been fixed in Vista yet? My guess is no.)
More importantly, though, if I need to make a wild assumption or take a complete guess about the intention of a spec writer, I'm perfectly willing to do that. If I get it wrong I can always fix it later, although it's actually rare that I'm wrong when making such decisions. Once you tune into the voice of a spec, it can begin to speak implicitly as well as explicitly, and with a little common sense as your guide, there's no need to bother anyone about what this or that passage means, even when it's saying something like this:
In order to makes from this their street I is real this went distant after the specification, which had, finally, precisely since they were determined recipes for with. The persons of majorities that attack in the problem e'htjzan the system precisely proportionally, were whoever gave in the recipes, but 4 that I could see that they make thus it will be foolery. Specification existed trudn so that it reads, his they were constant anamjgny'ontas barriers otherwise mashtaba and the accidental faults in in order to kymatj'sej technically the hands. Than to it wears out neighbour it was taken part 4 things right, I submitted in the action big of my algorithms as beydokw'djkas in the column of base of data, one completely from the words keys that, were who 4 I could replace, in the fly, the real indicator from the running calculation.
Better than routing around uncertainty and better even than blind leaps of inference is the deconstruction of uncertainty into its smallest components, extracting out any remaining certainty as it's discovered, and putting what little uncertainty remains into well-identified boxes (for me, these are invariably database fields). Ideally the uncertainty can be reduced to its bare essence and then be exposed to the person responsible for this uncertainty in a way that makes it easy for him to impose the certainty that has been locked unarticulated within his head. That was the goal for the tool I was building today, and it's been the goal for every tool I've made since I began building crude content management systems in 1998. It's an elusive goal, but I've been getting closer and closer to it with each passing year, building ever-higher on the shoulders of all my previous work.
This brings up an important point: to continue benefiting from my earlier work, I am dependent, in large part, on the environment remaining stable enough for my old code to work as well as my new code. The transition from working mostly on Microsoft web servers to Linux servers (2001-2003) marked the roughest stage in my development as a programmer, a nadir in my programming creativity. My large VBScript ASP function libraries had to be translated line by line to work in the new C++-esque world of PHP. But once I crossed that threshold, I could count on PHP staying fundamentally the same, allowing me to grow without constantly having to learn new syntax or techniques. PHP is an open-source environment that essentially belongs, as common property, to the entire human race. Sadly, it does so with a stronger legal basis than does the information in our own genomes. So no company unilaterally controls it, and no one has the power to alter it drastically.
Creativity depends on the mastery of a medium, and that cannot happen if that medium is in constant flux. Great oil paintings didn't happen until the technology was well-established and artists knew what to expect of their paints, brushes, and canvases. The static nature of the PHP environment has a much greater resemblance to the medium of oil painting than do any of Microsoft's programming environments or user experiences. When a single corporation controls an environment, it is to their benefit to keep changing it, convincing people that newer versions are such vast improvements over the past that they're worth paying money for. The Microsoft ASP I learned as my first web technology has been supplanted by the .NET paradigm, and many of the things we subconsciously know about Windows XP have been changed in Windows Vista for no other reason than to make it look and feel new. In doing these things, Microsoft might well be milking every last IT nickel out of the economy, but it is also creating poor soil for the blossoming of creativity.