Tuesday, November 22, 2011

Hunting Eichmann by Neal Bascomb: An international thriller.

I finished reading an absolute thriller this week, 'Hunting Eichmann' by Neal Bascomb, and found it simply un-putdownable. It's about the notorious Nazi war criminal Lt. Col. Adolf Eichmann: his flight out of war-torn Germany to Argentina, his capture by Israeli secret service agents, and his trial.
Eichmann headed the department on Jewish affairs during the war and was in direct charge of deporting and plundering the Jews of Hungary, Austria and Poland, among other places, and eventually dispatching them to extermination in the concentration camps. Eichmann took his orders from Himmler but was known to be extravagantly aggressive in executing them. His infamy grew towards the end of the war, in 1944, through his publicized mass murders in Hungary. Bascomb has done great work in unraveling what Eichmann was up to while on the run just after Germany lost the war. It's surprising that he managed to linger for as many as six years in Germany itself, doing various odd jobs. Surprising because of the lack of attention from the Allied forces and their failure to apprehend senior Nazi leaders and known mass murderers such as Eichmann. Eventually, it seems that Eichmann escaped to Argentina not because of any threat of capture in Germany but out of boredom and in search of a new life in a new country. The Allies were clearly busy countering the Soviet threat and capitalizing on German technology and scientific manpower.
In Argentina, Eichmann led the life of a lower-middle-class citizen, not that of a celebrated fugitive in a favorable foreign country. Through meticulous planning and daring, more than ten Israeli agents belonging to the Mossad reached Buenos Aires. They were able to locate Eichmann's house through information provided primarily by the father of the girlfriend of Eichmann's son. Each of the Israeli agents had his own story to tell, mostly of losing family and kin in the concentration camps driven by Eichmann and others. The agents thankfully kept their cool and did not summarily assassinate Eichmann, but managed to smuggle him onto an Israeli civilian jet to Israel for a proper trial. Eichmann was hanged in Israel, and after the incineration of his corpse the ashes were scattered into the sea to avoid giving any neo-Nazis a place of worship. When Eichmann was released from the noose after the hanging, there was a loud sound - of air escaping from the lungs of the corpse. I tried to imagine what that would sound like! It is said to have given nightmares to the executioners...

About the writing
The events are historical and dramatic, but the narrative is colorful as well. Perhaps the reading seems even more gripping to me because the events are based on documented facts and the author manages to weave them into a superb storyline. It is pacy; there is never a dull moment anywhere. On several occasions, Bascomb does strain to impress upon the reader the dangers and suspense that the Israeli team faced. Sometimes he really overdoes it. Why should Vera Eichmann have a nightmare the very night before her husband is apprehended on the street? Why should a car with four men stop Eichmann a few days prior to the operation just to ask directions to the city and put him in a state of tenuous doubt? Such mentions border on the cinematic, and the attempt to incite thrill shows. Regardless, I found it eminently absorbing. The tension that builds when the Israelis are about to grab Eichmann on the street and push him into a car is something to be experienced.
Some areas are thoroughly unconvincing and jarring. How could the Mossad chief even aim to capture the Nazi doctor Josef Mengele, also hiding in Argentina, and take him along with Eichmann? That too without informing the other agents at all before Eichmann was captured? The planning, reconnaissance and effort that went into trapping Eichmann were phenomenal; how could Mengele be captured as if he were just a casual takeaway?

Banality of Evil
None of the characters in this affair are 'larger-than-life'. No matter what the 'James Bond' movies suggest, I have known that secret service agents and spies are never so; they have to make every attempt to blend in with their surroundings and not catch attention. They need to be good at evading questions, telling lies, forgery and camouflaging through make-up, stay calm even under life-threatening pressure, and be infinitely courageous. The Mossad agents who trapped Eichmann were exactly so in every degree. Further, these men had families and were otherwise settled in life. A comment on Mossad chief Isser Harel's recruitment strategy says that Mossad needed perfectly honest men to do a scoundrel's work.
Eichmann, on the other hand, was a revelation. During the war, this man had a wife, three sons and a good house. He was slightly arrogant and had a taste for an extravagant life, but his appearance was nowhere close to resembling the bloodthirsty monster that he actually was. Eichmann's mass murders were purely hate-driven. When the Israeli agents caught him, Eichmann was nearly a nervous wreck, weak and balding, eking out a living on the city outskirts. Although it's not unusual for such a weakling to be cruel, for a settled family man to send millions of families to their deaths appears to be. That's where the banality of evil strikes.

Impact
Arresting Eichmann and hanging him was a truly significant event in the context of our understanding of the terrible carnage of the war. Our collective consciousness of the Holocaust was enhanced because of the Eichmann trial. In contrast, the Nuremberg trials were really about the topmost leaders in the Nazi hierarchy; those were the public faces of the war crimes, not the real executioners. It also seems to have brought the Israeli Prime Minister David Ben-Gurion a lot of political momentum, and for Mossad, international awareness. According to Bascomb, Ben-Gurion hurried the announcement of Eichmann's capture once he knew that more than 50 Israelis were already aware of it. Such an announcement clearly endangered the Mossad operatives who were still in and around Argentina. Politicians are the same everywhere!

Israel: At its founding and in modern times.
The nation of Israel is shining today as a progressive nation in spite of limited resources (human and natural) and being surrounded by Arab enemies. Terrible events like the Holocaust preceded the founding of this country some 60 years ago, and Arab wars followed. Wars have continued. In Bascomb's words, what is admirable is the 'strong sense of purpose' that young Israelis have, to withstand all calamities, fight for survival and remain unique in their Middle Eastern neighborhood. Through ingenuity and hard work, they have even succeeded in farming in a desert through water management technologies. Somewhere I have heard about the Israelis retro-fitting an American F-16 fighter plane engine into the body of a French Mirage-2000 to gain the performance advantages of both aeroplanes.

Overall, it was happy reading.

Monday, October 24, 2011

From "How Life imitates Chess" By Garry Kasparov

It has been a thought-provoking and pleasurable three weeks with "How Life Imitates Chess" by Garry Kasparov. It is well-written, in the sense that it does not feel like a self-help book at all (and it is not one anyway). But it introduces the elements of success: approach, attitude, strategy, tactics, habits and thinking style. The famous world chess champion starts with an anecdote from his own chess encounters and then projects the concept onto a real-life situation. Points to note are -

1/ Become more self-aware. Constantly question yourself on all decisions. Avoid being on autopilot and relying on plain instinct to take decisions.

2/ Play your own game. Everyone has his own style. Play a game that suits that style and, in that way, make detailed notes on yourself much more than focusing on any opponent.

3/ Add an element of surprise. Introduce some imagination and fantasy in the game. Break the routine every once in a while. Let the mind wander and bring in any radical ideas.

4/ Become a strategist as well as a good tactician. Pay attention to the bigger picture in addition to the detailed and sometimes routine calculations.

5/ MTQ (Material, Time, Quality) is a crucial triad of the governing factors for evaluating a situation. Material could be money and physical resources. Time is universal. Quality is often understated in importance: it could mean having some advantage in strategy, knowledge, energy, ideas or skills. Although quality is desirable as an end in itself, it can be traded later for material or time. Too often, more than adequate importance is showered on material; time is routinely sacrificed for it, and quality is rarely in the picture when deciding what long-term strategy to follow.

6/ Being able to evaluate a situation is quite different from just listing possibilities. Better decision-making requires better evaluation, and better evaluation relies on a proper blend of the MTQ factors. In particular, the MTQ blend must fit one's temperament, strategy and willingness to take risks. MTQ factorization should provide at least a systematic way of beginning an evaluation. It should help take some weight off the instinctive and reactionary tendency.

7/ There can be a deadlock in some circumstances, when I cannot see a good course of action. That is a good time to introduce a radically different idea just to break the routine. Such a step can surprise others and buy me some time. Importantly, it wins some of the Quality component of MTQ because it improves the position, going from stagnant to dynamic, in such a way that I can take advantage later.

8/ Understand the rationale behind some decisions and events. Merely following a precedent does not go a long way because every situation is distinct from others and there is no sure recipe for all.

9/ SWOT (Strength, Weakness, Opportunity, Threat) analysis is the real-life counterpart of MTQ. Only if I know the current position (or state of material, time and quality) can I really decide how to plan ahead. In chess, the opening is a time for creativity but primarily for following certain tried-and-tested methods. Similarly, life's early years are spent in a grind on the beaten path of growing up, schooling, college and career; rarely does it vary by much. The middle game in chess is greatly dynamic: not only are there various ways of doing things, but each move can potentially dispatch us either on a road to victory or to total disaster. I am well into the middle game. The end game is where cold calculation and predictability rise to prominence. Anyone can get complacent and bored by the routine and predictability, which makes one vulnerable to mistakes. Each day and each moment is therefore a challenge - to be constantly critical and questioning about one's decisions, to search for better alternatives, and not to become complacent or go on autopilot.

10/ Seize the attacker's advantage, which means taking the initiative, having courage, taking calculated risks and innovating. The attacker gets positive momentum going in his favour. The attacker is under positive pressure to take action and, before that, to take a decision, whereas the defender simply waits and watches. The attacker always wants to disrupt the status quo and, in that way, is naturally complacency-proof. According to Kasparov, attack is better not because it is the only way but because it works best, and it certainly did for him.

11/ Question success, and failure too. We should try to understand why things succeed or fail. We love to find agreement and consensus because it saves us from taking hard decisions and confrontation. Which is why we are surrounded by like-minded people and those with similar habits. But it is important not to be.

12/ Intuition is not about some amateur coming up with the right answer without much thinking. That is more like luck. Intuition is all about someone with experience and skill hitting on an unconventional approach or a novel idea.

13/ Have a multidimensional personality. It is better to be good at something other than one's profession. Being a good public speaker, for instance, will instill confidence in all other areas of occupation. Richard Feynman is said to have improved as a physicist by also becoming a better drummer.

14/ When there is a crisis, it essentially means that there is a clear and present danger as well as an opportunity. It's very difficult to stir up opportunities in a still, static and silent environment. Only a crisis provides a window to create a break.

15/ In real life, as in chess, there is never much doubt about what to do when a problem is at hand. Our minds take it up and solve it in whatever manner is appropriate. But the grander question is: when there is no apparent problem, what should be done? Should a plain old routine be followed, or should there be some kind of improvisation? Too often, such situations lead to complacency. How do we detect a crisis before it forms, particularly when there is nothing visibly wrong with anything? I feel the key to a great strategy is thinking of such questions and then tackling them. Ordinary thinking only leads us to answer the questions that we see. The better minds come up with the right questions, worthy enough to be solved. This last one is my favorite...

Tuesday, September 27, 2011

Commentary on "Russia's War" by Richard Overy

I had great expectations of this book, Russia's War by Richard Overy, but I had not been able to get around to reading it even two years after I bought it. And so I have finally begun. This book recounts the history of the Second World War attack on Soviet Russia by Germany, but it is written with Russia - its people, army and polity - in the fore. It leverages some recently released archives to build a complete but not excessively detailed account. It is full of references to the monumental sacrifices borne by the Russian people and to their unimaginable bravery, grit and fight against all odds. It all begins with a background of the Communist revolution of 1917 and the bloody civil war that followed. Then it goes on to focus on Stalin's leadership and the misery heaped on the Russian people by a range of calamities, from famines, collective farming and the secret police to purges and torture.

Some of my impressions on reading this book are as follows,

About the Soviet psyche: I came closer to understanding the Soviet people. I had heard that they are very lovable and gentle people who have borne a monstrous deal of suffering, sacrifice and bloodshed. I would like to attribute all this to the Russians specifically even if I write "Soviet". The Soviet army did fight ferociously in the war and dumbfounded the Germans, who thought it was all suicidal. When Germany attacked the Soviet Union, it was widely expected to win in less than a few months, even in the minds of observers as far away as the US. The Germans had won most of Europe in less than 18 months and had tremendous momentum going for them. The Soviet army was supposed to be a primitive fighting force and the country full of 'semi-Asiatic' people; they were the proverbial underdogs. I just love to see an underdog getting into a fight against all odds and making history. Prof. Overy has made a case for the Soviets by giving them their due for forcing a win over Germany. This conflicted with my understanding hitherto that it was primarily the bitter Russian winter and a series of blunders by Hitler that made Germany lose the war. So it's not that Germany just lost the war; the Soviets did something to win it. Improvisation, military reforms and massive hard work did it for them. In recent times, reports of mathematical skills among Russians abound, and they have been among the notorious criminal hackers of the computing world.

The Soviet feeling of being underrated by Westerners: There is persistent mention in this book of the Soviets being underestimated by the Western powers. On one occasion, the Soviets were not even invited for peace talks with Germany. Above all, there is an unmistakable expression of disdain and racial/ethnic hatred that the Germans and the British reserved for the Russians. The Soviet Union suffered 80% of the casualties among all the Allied powers, and Germany had most of its defences lined up against the Soviets rather than against the British and Americans. So even as I feel that the British losses in the Second World War, more than anything else, won India its independence, it is the German defeat at the hands of the Soviets, more than anything else, that won the Allies the world war.

Numbers of people dead, vanquished: The numbers are just mind-numbing. Just how cheap a Soviet life was! The pages are full of mentions of thousands and millions of lives lost. By the time I finished, I could not even really recollect some infamous massacres, like that of the Polish officers in the Katyn forest. I am certain that in spite of the many invasions that my country, India, has faced, there has been no war or massacre or plunder that comes anywhere near half as close to what the Soviets suffered.

Stalin: This is the personality at the centre of the Russian response to German aggression. As in many other books, Stalin is a mysterious, power-hungry, unpredictable and extremely cruel being. To me, Stalin appears to be an almost legendary master at accumulating power, setting people against each other and taking credit for others' achievements. But most notable is Stalin's capability to retain supreme power even as he goes on extracting bloody sacrifices out of everyone. He vanquished ("purged" is the apt word) not only the obvious enemies of collective farming (the rich peasants, or kulaks) but also Ukrainians, minorities, intellectuals, the army, and even the Communist party itself. He arrested and exiled the wife of a serving army general. Not even the celebrated Marshal Zhukov was spared Stalin's caprice, hysterical suspicion and pathological cruelty. He let the secret police arrest, torture and kill any threatening opponents and then proceeded to purge the secret police themselves! I am reminded of Martin Amis (in "Koba the Dread"), where he says that Stalin was exceptionally harsh on Ukrainians, Jews, minorities, and also Georgians. And yet Stalin was himself a Georgian. He hated his son because his son was a Georgian, and his son was a Georgian because Stalin was a Georgian. Amis writes all this with remarkable effect.

Finally, I loved reading this book. Towards the end, Stalin dies; Beria is executed. The book has a slight documentary feel to it (it has ample tables of wartime figures and maps, but the maps are tacky). It is not quite as colourful and expressive as Martin Amis's piece (Koba the Dread). But Prof. Overy has changed some of the impressions I had about the war and has also shown how the Russians actually did it. It challenges some myths about Stalin, particularly the manner in which he bowed to several of his generals' wartime decisions in times of crisis. The end is poignant too. The final assertion of the book is that even as the war was won and the Soviet people had a momentary sweet taste of victory, the despotic oppression lasted much beyond the war.

Saturday, June 4, 2011

Development with pre-requisites and source code security.

Admittedly, I am running the risk of combining two different subjects, but they are closely linked. One is development with pre-requisite, compiled libraries; the other is source code security.
I want to write something on some of the problems arising out of having an extra-large codebase and large, geographically scattered development teams. Certainly, we are looking at a distributed version control system to help us out here. But I want to see what an individual developer needs to do to accomplish the daily bread-and-butter, i.e. to read and develop programs, build and test. This is particularly about individual source code management, ease of development and source code security.

Simple things first: how does a developer working on a small piece of that monstrous 4 million lines of code actually build and run the application on a measly desktop? The actual run-time size of the application is much less troublesome than the time to download the code every morning, because I have no idea which other developers have modified what other parts of the code and how that impacts my own code. Basically, there could be so many scattered changes coming in daily over the entire spectrum that the entire codebase on my desktop needs a refresh. Things go out of hand when you consider that each developer will then compile and build the entire code in order to test any part of the application. The daily build time for each developer is the biggest blow here.

But really, each developer should just download and build the code that is being touched by his/her team and not have to worry about all the other code, treating it like a black box. The UI development team almost never touches any code in the Application Framework; there is a clear separation of concerns among those development teams. For testing, though, one needs the entire set of executables.

I encountered this situation in my previous stint working as a CAD-PLM application developer, for code written in C++. The method employed there is what I want to share.
We need a really sturdy server with a shared (and read-only) directory that has a complete, "best-so-far" version of the executables. A build machine does a daily build of the entire source code checked in by the end of the day and places all executables in this shared directory. We then have all developers map/mount this shared directory to some local directory.
Given this, a developer only downloads the code that his/her team deals with and provides the shared directory as an 'include' and 'library' path after the local code. The compilers as well as the runtime environments happily work with the source code on the local machine and the pre-requisites on the shared path.
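As a minimal sketch of how a developer's build could wire this up (the directory layout, the share name and the use of g++ driven from a small Python script are all assumptions for illustration, not the actual setup): local sources and headers come first on the search path, and the read-only share of pre-built headers and libraries comes after them -

import subprocess
from pathlib import Path

# Hypothetical layout: the code the team actually edits, plus the
# read-only share produced by the nightly build machine.
LOCAL_SRC = Path.home() / "dev" / "ui_team"
SHARED_PREBUILT = Path("/mnt/central_build/latest")   # mapped/mounted read-only

def compile_module(cpp_file):
    # Compile one translation unit; local include paths are listed before
    # the shared pre-built ones, so local changes always win.
    cmd = [
        "g++", "-c", str(cpp_file),
        "-I", str(LOCAL_SRC / "include"),
        "-I", str(SHARED_PREBUILT / "include"),
        "-o", str(cpp_file.with_suffix(".o")),
    ]
    subprocess.run(cmd, check=True)

# Linking follows the same ordering: -L for local libraries first,
# then -L for the shared pre-built libraries.
if __name__ == "__main__":
    for src in sorted((LOCAL_SRC / "src").glob("*.cpp")):
        compile_module(src)

The only thing that changes from team to team is which sub-tree sits in LOCAL_SRC; everything else is consumed pre-built from the share.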

An intentional side-effect of this is that not all developers can see all the source code, although they have all the executables to play with. The source is neatly divided into sub-directories containing code that belongs to a logical functionality. The version control system can explicitly disallow all but some users from checking out certain such directories.
Why is hiding source sometimes desirable in today's computing world? I can easily imagine a piece of source code that is truly break-through, with patented algorithms, technologies and what not. It follows that the organization does not want just "any" developer to get his hands on that source code - to secure it against tampering as well as against source-code theft. What else but source code is the most important asset of a software development company? We have had several cases so far where a sacked, disgruntled employee went out to sell his company's intellectual property for cool bucks; there have been published reports of such incidents. After such incidents, those companies must have gone overboard in searching employee baggage on each transit for any smuggled electronic media.

All this might seem banal, but if things are not arranged in this manner to begin with, it is difficult to pull them back into shape cleanly later.

In the next few entries, I will explore a software licensing mechanism to control software usage, which will also hinge on some source code security. It all depends on some parts of the source code not being readable by just anyone. Secured source code is such a good thing.

Monday, April 25, 2011

XML file comparison

Another problem is to compare two XML files with the same structure and produce a result that spells out the differences between the two files. Again, I set out to talk about a solution more involved than the simple one we actually chose to implement.

Simple approach:
A simple approach is to start by referring to the topmost parent node in the code and trying to find it in both the XML files. If the node is found in both files, then no difference is found at the first level and we can proceed to query the next, deeper level of nodes. The actual XML tags are thus hard-coded, and the same process can be repeated. If the node is found in only one of the files, then the first difference has been found. We can note this difference and then generate some other structure to describe the difference, or "delta". We can call this the difference structure. If the XML is a representation of a relational database, then this difference structure can be translated into a set of RDBMS queries that equalise the database contents. Thus it is quite plausible that the difference structure is a linear structure rather than a tree-like XML.
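A minimal sketch of this hard-coded style, in Python with the standard ElementTree parser (the tag and attribute names Top_Parent_Node, Service_Node and attr2 are only illustrative, borrowed from the configuration example later in this post) -

import xml.etree.ElementTree as ET

def diff_simple(file_a, file_b):
    # Hard-coded comparison: the tag and attribute names are baked into the code.
    a = ET.parse(file_a).getroot()
    b = ET.parse(file_b).getroot()
    differences = []  # the linear "difference structure"

    # Level 1: the topmost parent node, referred to by its hard-coded tag name.
    if a.tag != "Top_Parent_Node" or b.tag != "Top_Parent_Node":
        differences.append(("Top_Parent_Node", "missing in one of the files"))
        return differences

    # Level 2: a hard-coded child tag, matched by an identifying attribute.
    services_a = {n.get("attr2") for n in a.findall("Service_Node")}
    services_b = {n.get("attr2") for n in b.findall("Service_Node")}
    for key in sorted(services_a ^ services_b, key=str):  # symmetric difference
        differences.append(("Service_Node", key, "present in only one file"))
    return differences

Every deeper level adds another block of the same shape, each with its own hard-coded tag - which is exactly the drawback discussed next.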

Drawbacks of the simple approach:
The logic to parse the structure is inextricably mixed up with the logic to generate the (linear) difference structure. If the XML structure changes, the parsing logic also changes, and changes in the generated delta follow. What would be ideal is for the parsing of the nodes to be independent of what the XML is about, and for only the actual difference structure (and the subsequent RDBMS queries) to depend on the actual differences in the XML. It is the changes needed to the parsing code that are the matter of contention. I think we can do better.

A better approach:
As usually happens in computer algorithms, the better approach is more complex than the simple one. As an aside, the situation is quite different when dealing with mathematical statements and proofs - the better proofs are shorter and have an "elegance" quality about them; in the long run, they prove to be more readily intuitive.
OK, here we should let the coding logic be independent of any actual XML tags. We create a multi-dimensional array of strings and "hand-write" the XML tags in it. So if the XML has a hierarchy that goes two levels deep, then the array will actually be a list of lists of strings. We just create a string structure that mirrors the XML. This looks tedious, but it is way simpler than coding that structure into something the compiler finds acceptable.
The parsing code traverses this string array and treats it as a descriptor of the XML. It does the same for both files, and if it finds a difference, it proceeds to generate an in-memory, linear difference structure. A separate piece of code translates the differences into something like a set of RDBMS queries as required, and the important fact is that this query generator is strictly separate from the parser.
If the XML structure changes (or, more likely, is extended), no change is needed to the parsing logic; just the hand-written string array should be changed to reflect the changes. Some skill is involved in coding the parsing logic, but once that is done, this approach scores over the earlier one on counts of maintainability and extensibility.
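A minimal sketch of what such a descriptor-driven comparison might look like, again in Python with ElementTree (the descriptor contents - Top_Parent_Node, Service_Node, Prop and their key attributes - are hypothetical; only this data at the top changes when the XML grows) -

import xml.etree.ElementTree as ET

# Hand-written descriptor mirroring the XML hierarchy.  Each entry is
# (tag name, identifying attribute, [child descriptors]).
DESCRIPTOR = ("Top_Parent_Node", "attr1", [
    ("Service_Node", "attr2", [
        ("Prop", "attr3", []),
    ]),
])

def diff_nodes(node_a, node_b, descriptor, path, differences):
    # Generic traversal: walks the descriptor, never names a real tag itself.
    _tag, _key, children = descriptor
    for child in children:
        child_tag, child_key, _ = child
        by_key_a = {c.get(child_key): c for c in node_a.findall(child_tag)} if node_a is not None else {}
        by_key_b = {c.get(child_key): c for c in node_b.findall(child_tag)} if node_b is not None else {}
        for k in by_key_a.keys() | by_key_b.keys():
            child_path = path + ["%s[%s=%s]" % (child_tag, child_key, k)]
            if k not in by_key_a or k not in by_key_b:
                differences.append(("/".join(child_path), "present in only one file"))
            diff_nodes(by_key_a.get(k), by_key_b.get(k), child, child_path, differences)

def diff_files(file_a, file_b):
    a = ET.parse(file_a).getroot()
    b = ET.parse(file_b).getroot()
    differences = []  # the linear difference structure
    diff_nodes(a, b, DESCRIPTOR, [a.tag], differences)
    return differences  # a separate generator can turn this into RDBMS queries

Extending the XML then means adding a tuple to DESCRIPTOR; the traversal and the query generator stay untouched.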

I see parallels to this approach elsewhere, and it is one that could be described as a "data-driven" approach (my term, my quotes). The actual code is only like a markup processor, very generic and simplified. The key point is not that compilation is avoided, but that the changes are made not in code but in an array-like structure of simple strings or values.

XML specification and duplicate tag processing

XML has long been touted as a very promising method for information exchange. Some consider it too verbose and doubt how efficient XML turns out to be if the information is voluminous. However, XML still reigns as the most widely accepted method to convey structured data in a human-readable form, for which parsers are widely available, and which is extensible.

One pattern of usage I noticed in my work product: referring to another tag in order to copy its content -

A huge XML file that carries the product control configuration of the entire application is usually edited by humans. It basically stores configuration properties for the various services that run as part of the product. What should we do if there are multiple duplicate services and they have essentially identical properties?

For example -

<Top_Parent_Node attr1="val1">
<Service_Node attr2="val21">
<Prop attr3="val3">
....
... Complex set of enclosed tags ....
....
</Prop>
</Service_Node>

<Service_Node attr2="val22"> <!-- duplicated service tag : we need this for the application -->
<Prop attr3="val3"> <!-- Forced to repeat this from the previous tag -->
....
... Complex set of enclosed tags ....
....
</Prop>
</Service_Node>
.... More such repetitions ....
</Top_Parent_Node>

The simplest way is to repeat the properties at both locations by copy-paste. We are rather good at that.
We, however, screw up miserably when it comes to propagating changes made to one set of properties to all the other identical locations.

I have a suspicion that this is a common situation that others run into as well, which makes a good case for formalising this requirement in the XML specification itself. The specification should allow a choice: either spell the tags out, or make a reference to another tag whose content is treated as copied into this tag while parsing. (A small sketch of a resolving pre-processor for this idea appears after the points below.)
For example -

<Top_Parent_Node attr1="val1">
<Service_Node attr2="val21" ?xmlref="N1" > <!-- Label this tag so it can be referred to -->
<Prop attr3="val3">
....
... Complex set of enclosed tags ....
....
</Prop>
</Service_Node>

<Service_Node attr2="val22"> <!-- duplicated service tag : we need this for the application -->
<?xmlref="N1" /> <!-- No need to repeat - referred label is treated as copied -->
</Service_Node>
.... More such repetitions ....
</Top_Parent_Node>

A few points to note:
- Only one place holds the entire spec of a node that may be duplicated.
- Any changes made in that one place are reflected in all other places that refer to it.
- The first Service_Node, which carries the complete spec, is labelled in a unique manner. This label is part of the specification, and any node can be labelled this way. Thus it need not appear in any DTDs or XSDs as an available attribute.
- Any node can refer to this label by enclosing a <?xmlref> with a label identifier. The parser should copy the entire specification within the referred node into this node.
- The referring node and the referred node need not be at the same place in the hierarchy or at the same tree depth. The parser should also deal with a referring node that appears before the referred node in the file; this keeps the XML parsing independent of ordering. If the referred node is not found, the parser should throw an exception. I can see that DOM parsers can handle this in a straightforward manner; a SAX parser, however, might need to parse to the end in search of the referred node.
- I don't quite see how to provide a partial overriding capability without unnecessarily complicating the idea and obfuscating the XML specification.
- The fact that integrity is maintained easily when the spec changes lends this idea some credence and value, even though the readability of the XML is somewhat hampered.
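Since no standard parser understands such a reference today, here is a minimal sketch of how a resolving pre-processor might behave, in Python with ElementTree. It is only an approximation of the idea: standard parsers reject the proposed '?xmlref' syntax, so the sketch assumes a plain xmlref attribute as the label and a plain <xmlref ref="..."/> child element as the reference -

import copy
import xml.etree.ElementTree as ET

def resolve_xmlrefs(tree):
    root = tree.getroot()

    # Pass 1: collect every labelled node, wherever it sits in the tree.
    labelled = {node.get("xmlref"): node
                for node in root.iter() if node.get("xmlref")}

    # Pass 2: replace each <xmlref ref="..."/> with a deep copy of the
    # labelled node's children, so a reference may appear before its label.
    for parent in list(root.iter()):
        for ref in list(parent.findall("xmlref")):
            target = labelled.get(ref.get("ref"))
            if target is None:
                raise ValueError("unresolved xmlref: %s" % ref.get("ref"))
            parent.remove(ref)
            parent.extend([copy.deepcopy(child) for child in target])
    return tree

# Usage: tree = resolve_xmlrefs(ET.parse("config.xml")), then hand the
# resolved tree to the application exactly as before.

A two-pass, DOM-style resolution like this is what makes forward references easy; a streaming SAX consumer would indeed have to hold the reference until the label finally turns up.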

When I encountered this problem at my workplace, I must say that it was solved at the application layer, i.e. a new tag was added inside the duplicates to refer to the other node. It was a simple hack that does not so much solve the problem as work around it. As you would know, this is what happens in a commercial context under time pressure.

Friday, February 4, 2011

Software development (requirements and design)

Two strictly random thoughts on two aspects of software development, requirements and design, follow. Just something to play with over the weekend -

1/ A systematic and scientific approach to software development instead of an intuitive, programmer-driven one.

An intuitive, programmer-driven approach is the typical one, heavily based on experience in solving problems of a similar nature and on observations of other programmers' practices. Programmers are well trained to follow patterns of solutions. Most of the time this goes OK, since the thought process has been reviewed by several programmers and has proven workability. However, what if a fresh look is taken in some situations where the chosen solution is based only on tried-and-tested formulas? How about deriving a solution based on a totally rational, logical and scientific approach? Can it find some gaps in understanding, or some other, hitherto overlooked determining factors? Most research in academia takes this approach and does succeed in finding better solutions than those in the industry. The industry is sometimes too focused on finding cost-effective, workable and time-bound solutions, to the detriment of doing something that will prove profitable and efficient for the long term. The hard, complex problems in software suffer in this regard much more than run-of-the-mill software, such as CRUD applications.
As a sunshine industry, software is still evolving, with new practices being thought of and proposed at regular intervals. It should be a place full of such opportunities for betterment.

2/ Allowing maximum configurability but not exposing it to users.

Customers use software built by developers, but they usually end up complaining more often about features not implemented according to what they wanted and less often about things that don't work correctly as per the specifications. If something does not work correctly as per the specifications, it's a clear bug and the developers feel obliged to make the corrections. However, the point about missed specifications, misunderstood specifications and conventions is not so easy to rectify.
Can a solution be to allow for configuring everything and anything that is feasible? Users will be quick to complain about huge configuration choices and complicated installation tasks, so the out-of-the-box configuration is chosen to be a vanilla, typically acceptable one and is not exposed to users at all. If some user needs something different from how the software behaves, we will need to customize and configure, but we will very likely find some way out without making code changes, since we have made the software as configurable as possible even where we never expected the users to change it.
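A minimal sketch of the idea (the file name, keys and values here are all hypothetical): every behaviour is keyed with a sane built-in default, and an optional, undocumented override file is consulted only when a particular customer needs something different -

import json
from pathlib import Path

# Built-in defaults: the vanilla, "typically acceptable" behaviour that
# ships out of the box.  Users never see these as installation choices.
DEFAULTS = {
    "admin_console_layout": "compact",
    "port_access": "outbound_only",
    "retry_attempts": 3,
}

# Hypothetical hidden override file, touched only by support or consultants
# when a particular customer needs different behaviour.
OVERRIDE_FILE = Path("/etc/myproduct/site_overrides.json")

def load_config():
    # Return the defaults merged with any site-specific overrides.
    config = dict(DEFAULTS)
    if OVERRIDE_FILE.exists():
        config.update(json.loads(OVERRIDE_FILE.read_text()))
    return config

# Application code reads load_config()["port_access"] and so on; a customer
# request then becomes an override entry instead of a code change.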
When I thought of this, I was really thinking about issues faced with various customers and the varied preferences each one has with respect to the same software. Some customers could complain about software components needing bi-directional access on network ports and interfering with firewalls; others crib about the GUI of the administration consoles and their layouts. Somehow, it is a bad bet for the developer to make a design choice, code something accordingly, and then face urgent situations because some customer expected otherwise. Even if we take care to implement according to some well-known standards and conventions, are we in a position to deny something to a prospective customer (and maintain the same specs) if they don't like how it behaves?