Stephan - I'm not sure what I wrote that's "incomprehensible" - my basic point is that the conceptual / thinking process is different between procedural and OO programming, and that's the main challenge for anyone making a transition between the two.
Research I've done on the topic suggests that it takes 6-18 months for a procedural developer to get to the "aha! So that's how it works!" stage, and ~3 years under the guidance of an OO expert for a developer to become sufficiently proficient to teach others.
Stefan is the name, no ph. I understand nothing of your text, and I'm an experienced OO developer. Dependency injection, IoC containers, patterns, GoF, you name it; for more than three years. But never mind, maybe others can bake a chocolate pie out of your text.
I definitely think "better off" as it enables me to
And with Progress's ABL - I can be as "OO" or "procedural" as I want to be depending on what the situation calls for. :)
TIM: And with Progress's ABL - I can be as "OO" or "procedural" as I want to be depending on what the situation calls for.
Exactly. I'm not an OO developer, I'm not a procedural developer, I'm an ABL developer. I choose the solution to fit the problem, and my toolset is both wide and deep.
> my toolset is both wide and deep
Without a trace of self-mockery! How different this is from the utterances of someone like prof. dr. Edsger W. Dijkstra.
"The competent programmer is fully aware of the strictly limited size of his own skull; therefore he approaches the programming task in full humility, and among other things he avoids clever tricks like the plague."
I'm traveling, so I'm not able to follow this exchange in real time, but as I catch up I am quite amused ... by a number of aspects. But I think one of the problems is that people are not really talking about the same things.

The point of the Lahman quote is that the vast bulk of the programming done in OO languages is not a good example of OO, and yet it is those examples which people often look at when they are deciding whether or not they like OO as a concept. This is rather like trying to decide how one likes ABL based on the examples in the manuals.

The essence of OO is not in the implementation details, but in the separation and interaction of the parts. The principles are not very different from what made for good programming in ABL *before* we had an OO option. Implementation details are just that: implementation details. These vary by language. ABL being ABL, we have some language features available to us that oo3GLs don't have. This doesn't mean the system as a whole has a different design; the overall design is independent of the language.
Consulting in Model-Based Development, Transformation, and Object-Oriented Best Practice http://www.cintegrity.com
Reminds me of the great Canadian pianist Glenn Gould (en.wikipedia.org/.../Glenn_Gould). He said of Chopin that he considered him a composer who had "fallen into his instrument" (typical for a swooner ;-): while composing, he had only the piano, with its specific possibilities, in mind. What he meant is comparable to what the composers here see and love as OO (or, as often, procedural/functional): what the ABL offers and makes possible.
Let's acknowledge the limited size of our skulls. We will never be free. :-) Amen.
Dogma is a hell of a drug :-)
Every tool has pros and cons. OO has quite a few benefits for a variety of application functions but procedural has a lower barrier to entry, especially if you have an existing procedural code base.
Just like people misuse inheritance, people abuse include files and supposedly reused procedures. You can write well-structured, functional code in either procedural or OO. To be honest, there aren't too many examples of either in most vendor packages (not just OpenEdge-based ones).
Blaming poor OO design/code on OO itself is like blaming the ABL for some of the horrors we have all witnessed over the years... like include files nested 7 layers deep or finding 10 different "standard" versions of a function.
The prior question could be: "can OO itself be blamed?"
And "yes" would not be the dumbest answer. Take, for example, the man interviewed here, www.stlport.org/.../StepanovUSA.html :
"I think that object orientedness is almost as much of a hoax as Artificial Intelligence. I have yet to see an interesting piece of code that comes from these OO people. In a sense, I am unfair to AI [....] I find OOP technically unsound. It attempts to decompose the world in terms of interfaces that vary on a single type. To deal with the real problems you need multisorted algebras - families of interfaces that span multiple types. I find OOP philosophically unsound. It claims that everything is an object. Even if it is true it is not very interesting - saying that everything is an object is saying nothing at all. I find OOP methodologically wrong. It starts with classes. It is as if mathematicians would start with axioms. You do not start with axioms - you start with proofs. Only when you have found a bunch of related proofs, can you come up with axioms. You end with axioms. The same thing is true in programming: you have to start with interesting algorithms. Only when you understand them well, can you come up with an interface that will let them work."
Maybe interesting also to read: http://jalf.dk/blog/2010/11/good-design-oop/
And on the social side of OO, I can find truth in the following (of course hardly anyone here drinking the OO ABL Kool-Aid wants to hear these obvious but inconvenient things; why would you, when you can earn money with it and never contradict your good old "friends" ;-):
"What went wrong? People rushed to use the complex stuff (see: inheritance, especially multiple) when it wasn’t necessary, and often with a poor understanding of the fundamentals. Bureaucratic entropy and requirement creep (it is rare that requirements are subtracted, even if the original stakeholders lose interest) became codified in ill-conceived software systems. Worst of all, over-complex systems became a great way for careerist engineers (and architects!) to gain “production experience” with the latest buzzwords and “design patterns”. With all the C++/Java corner-cases and OO nightmares that come up in interview questions, it’s actually quite reasonable that a number of less-skilled developers would get the idea that they need to start doing some of that stuff (so they can answer those questions!) to advance into the big leagues."
In biological anthropology we have a long established phenomenon in which a certain type of researcher will find a new specimen, seize on some element of uniqueness of that specimen, and proudly announce that it is a new species and deeply changes our understanding of human evolution. Of course, this is silly from the outset because given a small sample size and wide variation in space and time, one actually expects to find some unique element in any new specimen.
Moreover, once the dust settles, there isn't really any change in our understanding except that the finder's versions of the human phylogenetic tree will have a branch on them corresponding to the finder's fossil. Meanwhile, there is another type of researcher who is busy looking at the whole, looking for unifying patterns instead of unique spurs. To him or her the new specimen is mostly expected and, at most, provides a slight jiggle to what was known all along.
One often speaks of splitters vs lumpers, although the "my fossil is unique" is a special narcissistic form of splitting. The splitters win in the contemporary press because new, unique, change are words which drive news. The lumpers win in the end because that is all that is actually supported by the evidence.
There is a similar phenomenon in CS, in which some seek to come out against something to provide a unique, or at least controversial or newsworthy, stance, while others are looking for underlying patterns or unity. It seems to me to be a variation of the same contrast. To me, an extreme lumper, these anti-OOP proclamations are often based on railing against some detail, often a detail which may not even be good OOP to a core lumper. Moreover, their offered alternatives seem frequently to be expressions of core good OOP.
> railing against some detail, often a detail which may not even be good OOP to a core lumper. Moreover, their
> offered alternatives seem frequently to be expressions of core good OOP
That Stepanov does not rail against details; quite the contrary. Moreover, he knows what he is talking about, as far as I understand.
"saying that everything is an object is saying nothing at all" - Stepanov
It is good to know that we can blame the procedural model for all of the horrible code out there and not the actual developers :-)