extern int max(int a, int b) { return a > b ? a : b; }

extern int max(a, b) int a, b; { return a > b ? a : b; }
Here int a, b; is the declaration list for the parameters. The
difference between these two definitions is that the first form acts
as a prototype declaration that forces conversion of the arguments of
subsequent calls to the function, whereas the second form does not.
Solution
If the expression that denotes the called function has a type that does not include a prototype, the integer promotions are performed on each argument. …

If the expression that denotes the called function has a type that does include a prototype, the arguments are implicitly converted, as if by assignment, to the types of the corresponding parameters.
char x = 3;
char y = 7;
max(x, y);  // Equivalent to max((int)x, (int)y)
This works because x and y are promoted to int before being pushed onto the stack.
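As a complete program (a minimal sketch assembled from the pieces above; note that old-style definitions were removed in C23, so compile with -std=c17 or earlier):

#include <stdio.h>

/* Old-style definition: callers see no prototype, so only the
 * default argument promotions are applied to the arguments.   */
extern int max(a, b) int a, b; { return a > b ? a : b; }

int main(void) {
    char x = 3;
    char y = 7;
    /* char arguments are promoted to int, which is exactly what
     * the definition expects, so this call is well defined.     */
    printf("%d\n", max(x, y)); /* prints 7 */
    return 0;
}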
However, code like this will not work:
double x = 3.0;
long y = 7;
max(x, y);  // Uh-oh
Here x and y will be pushed onto the stack as a double and a long, but max() will try to read two ints, which results in undefined behavior (in practice, the raw bits simply get reinterpreted).
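If the old-style definition cannot be changed, declaring a compatible prototype before the call restores the as-if-by-assignment conversions (a sketch; the function call_site is hypothetical, and the prototype is compatible here because the promoted parameter types are both int):

int max(int a, int b);  /* prototype compatible with the old-style definition */

int call_site(void) {
    double x = 3.0;
    long y = 7;
    return max(x, y);   /* arguments now converted: effectively max(3, 7) */
}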
That is one reason not to use the second form; the only reason it is still in the standard is backward compatibility with (extremely) legacy code. If you are using GCC, you can enforce this with the -Wold-style-definition flag; I would expect other compilers to offer something similar.
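For instance, a sketch of the flag in action (the file name max.c is illustrative, and the exact diagnostic wording varies by GCC version):

/* A file containing only the old-style definition, compiled as
 *   gcc -std=c11 -Wold-style-definition -c max.c
 * produces a warning along the lines of:
 *   warning: old-style function definition [-Wold-style-definition]
 */
extern int max(a, b) int a, b; { return a > b ? a : b; }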