Re: CINT and overload resolution

From: George Heintzelman (gah@bnl.gov)
Date: Tue Mar 21 2000 - 17:36:42 MET


> Cint checks parameter matches from exact match down to user-defined
> conversion, taking all of the arguments together. In this case,
>     TPhCalKey("String","String",100)
> Cint searches in the following order:
>     TPhCalKey(char*,char*,int)
>     template<class T, class E> TPhCalKey(T,T,E)
>     TPhCalKey(char*,char*,(any integral type))
>     TPhCalKey(void*,void*,(any numerical type))
>     TPhCalKey(anyUserConv(char*),anyUserConv(char*),anyUserConv(int))
> 
> In this case, all 3 parameters matched with a user-defined conversion
> before Cint saw the true candidate. This behavior is not fully compliant
> with the standard, but it speeds up overload resolution in an interpreter
> environment. Please understand the speed advantage and bear with the
> current implementation.

Yes, I understand what CINT is doing. I'm arguing the opposite: I think
this particular deviation from the standard is unintuitive, that it leads
to subtle changes in behavior between compiled and interpreted versions
of the same code, and that fixing it is worth a small speed penalty. At
least for ROOT users, I suspect that little of the 'real work' (the bulk
of the CPU time) is spent in interpreted code. Certainly for us (Phobos),
we use scripts first to control and direct a ROOT session, and second as
a way to do fast prototyping, debugging and testing. I think this
deviation causes potential problems with both of these uses.
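
To make the kind of divergence I mean concrete, here is a small made-up
example (the class and signatures are mine, not the real TPhCalKey; the
same thing happens with constructors):

    class TFoo {
    public:
       TFoo(const char*) { }    // "A" reaches TFoo only via this user conversion
    };

    void f(TFoo, int)    { }    // (1) user conversion, then exact match
    void f(TFoo, double) { }    // (2) user conversion, then int->double conversion

    int main()
    {
       // A compiler must call (1): the candidates are tied on the first
       // argument and (1) is strictly better on the second. A scheme that
       // only asks "did all arguments match at the user-conversion stage?"
       // sees them as equally good and may well call (2) instead.
       f("A", 100);
       return 0;
    }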

Furthermore, the standard says that a genuinely ambiguous function call 
is an error and must be diagnosed. CINT's current behavior here is to 
accept an ambiguous call, pick one of the candidates essentially at 
random, and issue no diagnostic. Solving the problem of sub-resolution 
within each class of conversion would certainly fix this second 
deviation from the standard as well.
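
A minimal example of the sort of ambiguity I mean (nothing Phobos-specific
about it):

    void g(long)  { }
    void g(float) { }

    int main()
    {
       // int -> long and int -> float are both standard conversions and
       // neither is better than the other, so the standard requires a
       // compiler to reject this call as ambiguous. An interpreter that
       // simply takes some candidate matching at the "standard conversion"
       // stage calls one of the two and says nothing.
       g(1);
       return 0;
    }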

Is the speed advantage really that big a deal? Most functions (except 
constructors) will have at most a few overloads, and even for 
constructors, the cases where more complicated resolution is needed 
shouldn't be all that common. I wouldn't expect the time spent in 
overload resolution to grow by much except in pathological cases: all 
you need to do is collect the candidates that match at a given stage and 
make a single pass through them to find the best one when there is more 
than one (a rough sketch of what I mean is below). From profiling, you 
should be able to say how much time CINT currently spends in overload 
resolution versus its other tasks in typical cases; is it really 
significant?
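
Roughly the kind of pass I have in mind (just a sketch; the rank enum and
the names are mine, not anything taken from CINT):

    #include <cstddef>
    #include <vector>

    // Per-argument match quality, best first.
    enum Rank { kExact, kPromotion, kStdConversion, kUserConversion };

    struct Candidate {
       std::vector<Rank> ranks;   // one entry per argument of the call
    };

    // a beats b if it is at least as good on every argument and strictly
    // better on at least one (the standard's rule for the better function).
    // Both candidates rank the same call, so the vectors have equal length.
    bool Better(const Candidate& a, const Candidate& b)
    {
       bool strictly = false;
       for (std::size_t i = 0; i < a.ranks.size(); ++i) {
          if (a.ranks[i] > b.ranks[i]) return false;
          if (a.ranks[i] < b.ranks[i]) strictly = true;
       }
       return strictly;
    }

    // One pass over the (non-empty) list of candidates that matched at a
    // given stage: returns the index of the unique best candidate, or -1
    // for an ambiguity that should be diagnosed.
    int PickBest(const std::vector<Candidate>& matched)
    {
       std::size_t best = 0;
       for (std::size_t i = 1; i < matched.size(); ++i)
          if (Better(matched[i], matched[best])) best = i;
       // The tournament winner must beat every other candidate, or the
       // call is ambiguous.
       for (std::size_t i = 0; i < matched.size(); ++i)
          if (i != best && !Better(matched[best], matched[i])) return -1;
       return (int)best;
    }

With only a handful of overloads per name, that extra pass is a few
comparisons per call, which is why I doubt the cost would be visible.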

George


