The conference proper lasted three days, with two keynotes, seven
sessions of three papers each, and two panels.
I found the topic surprising for Adele Goldberg.
She accused the software community of not having kept a record of its
goals and achievements over the last decades, leaving it unable to
prove that progress has been accomplished.
Her references were Norm Fenton and Victor Basili (plus her own recent book).
She stressed, on the one hand, the economic aspects:
choose reusable assets, defined as those meaningful to the organization; and,
on the other hand, the need for scientific methods.
As an example, she warned against changing the process as a
side-effect of developing a product.
The most interesting paper was the first, proposing to handle
covariant specialization (as in O2) through multi-methods (why isn't
the obvious relationship between the two more widely recognized?).
The solution uses run-time dispatch and can be implemented by the
compiler, resulting in type-safe covariant specializations.
The problem, in my opinion, is that this kind of type safety is not
quite satisfying. It falls short of providing intuitive semantics for
the designer, i.e. it may end up binding unexpected behaviours
where an error might have been better!
In particular, it ends up treating the two arguments of a comparison
operator (the example used) asymmetrically. There are two problematic
cases with such covariant operators: those where the dynamic types of
the two arguments don't match. While only one of them is technically
type-unsafe, the other should not be deemed any more acceptable!
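To make the concern concrete, here is a minimal C++ sketch (my own,
with invented class names, not taken from the paper) of a covariant
comparison emulated through run-time dispatch on the second argument:

    #include <iostream>

    struct Point {
        int x = 0, y = 0;
        virtual ~Point() = default;
        // Base case: Point x Point comparison on coordinates only.
        virtual bool equal(const Point& other) const {
            return x == other.x && y == other.y;
        }
    };

    struct ColorPoint : Point {
        int color = 0;
        // Covariant intent: also compare colours, but only when the other
        // argument really is a ColorPoint at run time.
        bool equal(const Point& other) const override {
            if (auto* cp = dynamic_cast<const ColorPoint*>(&other))
                return Point::equal(other) && color == cp->color;
            // Mismatched dynamic types: fall back to the Point x Point case.
            // Type-safe, but the colour is silently ignored, where one
            // might have preferred an error to be reported.
            return Point::equal(other);
        }
    };

    int main() {
        Point p;
        ColorPoint c;
        c.color = 7;
        std::cout << c.equal(p) << ' ' << p.equal(c) << '\n';  // prints: 1 1
    }

The run-time dispatch keeps every call type-safe, but the
mismatched-type cases quietly bind to the less specific comparison,
which is exactly the kind of unexpected behaviour mentioned above.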
The 2nd and 3rd papers dealt with yet more new programming languages.
Of the three papers, only the first dealt with real user
experience... The system was described at a general level,
though. I grasped that it did use abstract classes as interfaces, with
object factories (see the sketch below).
All the documents produced are used for communication. They are
application-oriented and not bound to milestones.
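As a reminder of the pattern I understood (the names and details below
are my own guesses, not the presenters'): an abstract class used as an
interface, plus a factory so that client code never names concrete
classes.

    #include <memory>
    #include <string>

    // Abstract class playing the role of an interface.
    class Part {
    public:
        virtual ~Part() = default;
        virtual void configure(const std::string& options) = 0;
        virtual std::string describe() const = 0;
    };

    // Object factory: the only place where concrete classes are chosen.
    class PartFactory {
    public:
        virtual ~PartFactory() = default;
        virtual std::unique_ptr<Part> create(const std::string& kind) = 0;
    };

Clients depend only on Part and PartFactory; the concrete
implementations can be swapped behind the factory.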
The code presented in the 3rd paper dealt with streaming of complex
object bases, for GUI representation and persistent storage, and could
be of interest (?)
There were representatives of system providers (IBM, Siemens
Nixdorf) and of several European banks.
The former spoke of new solutions (FSOM, FINIS (IBM VisualAge
technology), CORBA, CORBAfacilities, Common Business Objects...), the
latter of problems experienced and to be avoided: instability of
products (releases without migration plans), consultants explaining
our own systems to us in abstract terms, expensive drawing tools,
naive use of OO...
The first paper was about profile-driven optimization of C++
programs, aiming at eliminating virtual function calls (so as to
inline them). Improvements of 18% were achieved.
This, however, only applies to fully linked applications (which can be
profiled), not to reusable components, which reduces the interest of
the technique, IMHO.
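For illustration (my own example, not the paper's transformation, and
the 95% figure below is invented), this is the kind of rewrite such
profile-driven optimization enables: when the profile shows one
dominant target at a virtual call site, the call can be guarded by a
type test and turned into a direct, inlinable call.

    #include <typeinfo>

    struct Shape {
        virtual ~Shape() = default;
        virtual double area() const = 0;
    };

    struct Circle : Shape {
        double r;
        explicit Circle(double r) : r(r) {}
        double area() const override { return 3.14159265358979 * r * r; }
    };

    double total_area(const Shape* const* shapes, int n) {
        double sum = 0.0;
        for (int i = 0; i < n; ++i) {
            const Shape* s = shapes[i];
            // Suppose the profile says s is a Circle ~95% of the time:
            if (typeid(*s) == typeid(Circle))
                sum += static_cast<const Circle*>(s)->Circle::area();  // direct, inlinable
            else
                sum += s->area();  // rare case: keep the virtual call
        }
        return sum;
    }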
The 2nd paper dealt with lifting ambiguities of method resolution
(due to multiple inheritance or multi-methods).
It advocated support for explicit disambiguation (because implicit
schemes lead to arbitrary, sometimes unwanted, and poorly
predictable choices), and offered a method to assist the designer in
predicting and resolving all the ambiguities (the method was found to
be equivalent to another one published earlier).
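A trivial C++ illustration of the kind of ambiguity involved (my own
example, not taken from the paper):

    struct Printer { void output() { /* drive the printer */ } };
    struct Scanner { void output() { /* write the scan    */ } };

    struct Copier : Printer, Scanner {
        // Without the next line, c.output() is ambiguous; an implicit rule
        // (e.g. "take the first parent") would pick one arbitrarily.
        // The explicit disambiguation records the designer's intent.
        using Printer::output;
    };

    int main() {
        Copier c;
        c.output();             // explicitly resolved to Printer::output
        c.Scanner::output();    // the other behaviour remains reachable
    }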
3rd paper: I was surprised that my feeling that the presentation was deeply flawed did not seem to be shared by everybody...
The first paper was too mathematical for me (describing classes as coalgebras, and building on this).
The second dealt with a language for building object-oriented concurrent specifications, Maude.
The third paper explored the problems related to mixing shared with replicated multiple inheritance, as in C++.
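In C++ terms (a small sketch of mine, not the paper's example), shared
inheritance corresponds to virtual base classes and replicated
inheritance to ordinary ones:

    struct B { int v = 0; };

    struct SL : virtual B {};      // shared: contributes one common B
    struct SR : virtual B {};
    struct RL : B {};              // replicated: each gets its own B
    struct RR : B {};

    struct Shared : SL, SR {};     // exactly one B subobject
    struct Repl   : RL, RR {};     // two distinct B subobjects

    int main() {
        Shared s;
        s.v = 1;                   // unambiguous: the single shared B

        Repl r;
        // r.v = 1;                // would not compile: ambiguous
        r.RL::v = 1;               // each replica must be named explicitly
        r.RR::v = 2;
    }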
The first paper introduced the concept of Parallel Operators in a Single Program Multiple Data environment using the language Eiffel.
The second sounded suspicious, dealing with object migration in virtual memory: a slightly too intelligent memory manager? There were other problems too (efficiency, hard-coding of server names, etc.).
The last paper in the session described a library of protocol
classes aiming at providing reliability in distributed systems,
through the use of group communication. A related presentation was
given in the workshop.
Representatives from universities (German and US) gave a review of their programs. Linda Northrop from CMU and Adele Goldberg shared their opinions and experiences. The audience was quite receptive.
Francois Bancilhon listed the species of the US ecosystem, only to note that a lot of them are missing in Europe. It was not so clear what advice he intended to give us: he somehow remained captive to this fascination with the US model, although he was not advocating cloning it. Nice show anyway.
Good counterpoints from Adele Goldberg (in the audience), with concern for the customers and doubts about the future (will one still sell any software within 10 years?).
1st paper: an interesting study (although a similar study was
presumably presented 10 years ago at ECOOP) on dynamic clustering
(a well-argued choice), experimented on Texas, a publicly available
persistent storage system for C++ from the University of Texas
at Austin.
The difficulty is in the management of the statistics.
2nd paper: a methodology (read graphical notation) for database design.
3rd paper: "semantic enrichment" of persistent data in a relational
database, through an external specification (special language).
Any schema evolution would have to be matched by an evolution of the
specification...
1st paper: Incremental enhancement of the BETA type system (by its authors). A minor lack of orthogonality was spotted (unnecessary restriction) and a solution to correct it was reviewed.
2nd paper: An attempt to introduce metaphors to generate polymorphic behaviours. Not very convincing, at least because it was based on examples that, it seemed, could have been handled without this new mechanism.
3rd paper: a concept close to that of Operation (my own hobby), taken from the point of view of a multi-method (with many participants), not in the context of distribution or static typing.