[Bug 357941] arith.h defines INT_MAX when it shouldn't

bugzilla at redhat.com bugzilla at redhat.com
Thu Oct 2 23:48:19 UTC 2008


Please do not reply directly to this email. All additional
comments should be made in the comments box of this bug.


https://bugzilla.redhat.com/show_bug.cgi?id=357941


John Ellson <john.ellson at comcast.net> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
               Flag|                            |needinfo?(tim at niemueller.de)




--- Comment #8 from John Ellson <john.ellson at comcast.net>  2008-10-02 19:48:17 EDT ---
I've forgotten what my conclusion was on this, but it looks as though I made
one change: limits.h is now included unconditionally by arith.h before testing
#ifndef INT_MAX. So it looks to me like this should work whether limits.h is
included again before or after arith.h.
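If I recall correctly, the pattern is roughly the following (a minimal sketch
of the guard described above, not the literal graphviz arith.h source; the
fallback value shown is just one possible choice):

    #include <limits.h>          /* included unconditionally, so the
                                    system's INT_MAX always wins when
                                    limits.h provides it */

    #ifndef INT_MAX              /* fallback only for platforms whose
                                    limits.h fails to define INT_MAX */
    #define INT_MAX ((int)(~0u >> 1))
    #endif

With that ordering, a later #include <limits.h> should be harmless: either
INT_MAX was already taken from the system header, or the fallback definition
only kicked in because the system never provided one.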

If limits.h doesn't exist on SGI systems, then I don't understand why I
haven't heard complaints about this.

Could you test with the RPMs from www.graphviz.org to see whether this problem
still exists?

-- 
Configure bugmail: https://bugzilla.redhat.com/userprefs.cgi?tab=email
------- You are receiving this mail because: -------
You are on the CC list for the bug.



