OpenCV compatible definition of int64

Unreal Engine defines the global types int64 and uint64 as signed long long and unsigned long long respectively.

So, aside from the perplexing comment in the API documentation (“32-bit signed”??), this causes problems when linking more intimately with OpenCV, as that library also defines int64 and uint64, but as int64_t and uint64_t. Those are sometimes long long and sometimes long, depending on your platform, but they are always 64 bits.
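To make the conflict concrete: on an LP64 platform (e.g. 64-bit Linux, where int64_t is long) the two typedefs have the same size but different underlying types, and that alone is enough to break the build. A minimal standalone sketch (assuming such a platform):

#include <cstdint>
#include <iostream>
#include <type_traits>

int main()
{
    // Both are 64 bits wide on an LP64 platform...
    std::cout << "sizeof(long)      = " << sizeof(long) << '\n';
    std::cout << "sizeof(long long) = " << sizeof(long long) << '\n';

    // ...but they are still distinct C++ types, so a pair of typedefs like
    //    typedef long long int64;  // Unreal-style
    //    typedef int64_t   int64;  // OpenCV-style (int64_t == long here)
    // in the same translation unit is an ill-formed typedef redefinition.
    std::cout << std::boolalpha
              << std::is_same<long, long long>::value << '\n'; // prints false
}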

Is it possible for you to change your definitions to the stdint ones, or can you give me a strong enough argument that long long is better, so I can bring that to the OpenCV guys instead?

(My current solution is terrible and involves renaming the Unreal definition, as I didn’t want to risk producing hidden bugs due to the types not being the same size.)
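For what it’s worth, the same renaming idea can be applied on the OpenCV side with the preprocessor instead of editing headers. A hypothetical sketch, and just as fragile: the macros rename OpenCV’s int64/uint64 tokens only while its headers are being parsed, so they stop colliding with Unreal’s.

// Hypothetical workaround: rename OpenCV's 64-bit typedefs while its
// headers are parsed, so they no longer clash with Unreal's int64/uint64.
#define int64  opencv_int64
#define uint64 opencv_uint64
#include <opencv2/core/core.hpp>
#undef int64
#undef uint64
// From here on, int64/uint64 refer to Unreal's typedefs again, while the
// OpenCV declarations in this translation unit use opencv_int64/opencv_uint64.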

thanks

I had the same problem.

I don’t know if my approach was the best, but I changed OpenCV directly instead of Unreal.

I had to change "opencv2/core/types_c.h".

I changed this code:

#else
   typedef int64_t int64;
   typedef uint64_t uint64;
#  define CV_BIG_INT(n)   n##LL
#  define CV_BIG_UINT(n)  n##ULL
#endif

to the following:

#else
    typedef long long int64;
    typedef unsigned long long uint64;
#  define CV_BIG_INT(n)   n##LL
#  define CV_BIG_UINT(n)  n##ULL
#endif

Now Unreal and OpenCV work.
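If you patch types_c.h like this, a couple of static_asserts somewhere after the OpenCV includes are a cheap guard against the edit (or a future OpenCV update) silently changing the widths. A defensive sketch:

#include <cstdint>
#include <opencv2/core/types_c.h>

// Fail the build immediately if the patched typedefs ever stop being 64 bits.
static_assert(sizeof(int64)  == sizeof(std::int64_t),  "int64 must stay 64-bit");
static_assert(sizeof(uint64) == sizeof(std::uint64_t), "uint64 must stay 64-bit");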
