OR/MS Today - June 2006



ORacle


The Programmer's Parable

By Doug Samuelson


"I just don't get it, Rob," the frustrated OR analyst sighed. "I've been over this computer code a few dozen times. I've inserted print messages to track when key values get changed. I can see it's going through the logic pretty much as I expected, but I'm still getting weird results."

Rob, the analyst's programmer friend, responded, "Well, Paul, how do you know the results are wrong and not just unexpected features of the system you're trying to model?"

Paul explained why he was pretty sure what he was seeing was not a feature of the system. "These buyers in this market model should be making a trade once in a while, even though they are seeking an average price that often takes them out of the bidding," he said. "When something that should be rare never happens, even when you have a lot of runs, I suspect the logic isn't letting it happen at all. And these stored intermediate bids don't have enough variation in them, either. Somewhere, it's just missing certain events."

They pored over the printouts a while longer. Eventually, Rob asked, "Why is this compare statement inside this loop?"

"Ah, that's it!" Paul exclaimed. "It always looks for a price in this range before it checks the intermediate results from earlier rounds, and it should be the other way around!" A flurry of keystrokes and a few mouse clicks later, the new printout appeared to confirm Paul's diagnosis and his remedy. "Thanks, Rob!" he added quickly. "I wonder how long it would have taken me to see it without your help."

"A second pair of eyes usually does help with debugging," Rob pointed out cheerfully. "When you're reasonably good at programming, errors usually come from some logical twist you've simply overlooked. When you look again, you still have the same blind spot, until new evidence forces you to see where you went wrong. Programmers have a saying: 'The toughest bug will usually be immediately obvious to the first colleague you ask.'"

"That's not always true," Paul laughed. "I've seen plenty of counter-examples. Sometimes analysts or even whole departments get totally fried trying to fix a tough bug. There's a reason the burnout rate among programmers is so high."

"Sometimes," Rob conceded, "although more often we just shrug it off and admit that problems with code keep us employed."

They shared another laugh.

"What I'm wondering about, though," Rob observed, "is why your logic went the way it did in the first place. Did you design that piece of code from scratch or adapt it from something else?'

"You got me," Paul admitted. "I did modify another program. How did you guess?"

"If you had written that logic from scratch," Rob explained, "you wouldn't have had the comparison logic inside the trading loop. You just wouldn't have. You know in what order they should take place. But we're all trained, in some cases thoroughly indoctrinated by doctoral-level computer science seminars and research, to re-use working code whenever and wherever we can. It does cut down on a lot of the more common kinds of errors. What can happen, though, is that you have a piece of working logic — the trading loop, in this case — and you add your new logic in the wrong place. I've seen this kind of logic error enough times to recognize the symptoms."

"That's really good," Paul acknowledged. "Actually, I think there's a principle there that applies to my profession's theories about system reliability and artificial intelligence! I need to think some more about this."

"What do you mean?" Rob inquired.

"We know about re-using working program code as part of new programs, and more generally using previously tested logic as part of new logical systems," Paul recounted. "It's the same as the idea, in math proofs, of reducing the new problem to something you've already proven, or the idea of building new systems out of components that have been demonstrated to be reliable. But the trouble is, what you gain in having more reliable components and subsystems may be offset by your higher probability of overlooking, or just not seeing, where they don't fit together properly. Now that I'm thinking about it, I believe I vaguely recall that this is where a lot of the 'theorem-proving' artificial intelligence algorithms ran into trouble, too. I guess you never get something for nothing in this world, do you?"

"Well," Rob affirmed, "certainly you've shown, once again, that there is no good substitute for understanding the whole problem correctly. So now you're back to what I said before about programming: If we could make it all routine, we'd be looking for other work! Be glad it's complicated!"



Doug Samuelson joined the Homeland Security Institute in Arlington, Va., as a senior analyst in December 2005. He remains president of InfoLogix, Inc., a consulting firm in Annandale, Va. This column has not been reviewed by the Homeland Security Institute or the Department of Homeland Security and does not represent the views of anyone but the author.






