OpenCV compatible definition of int64
Unreal Engine defines the global types int64 and uint64 as signed long long and unsigned long long respectively.
So aside from the perplexing comment in the API documentation ("32-bit signed"??), this causes problems when linking more intimately with OpenCV, as that library also defines int64 and uint64, but as int64_t and uint64_t. Those are sometimes long long and sometimes long, depending on your platform, but they are always 64 bits.
Is it possible for you to change your definitions to the stdint ones, or can you give me a strong enough argument that long long is better, so I can bring that to the OpenCV guys instead?
(My current solution is terrible and involves renaming the Unreal definition, as I didn't want to risk producing hidden bugs due to things not being the same size.)
asked Nov 30 '16 at 05:08 PM in C++ Programming
I had the same problem.
I don't know if my approach was the best, but I had to change the code below:

to the following code:

Now Unreal and OpenCV work together.
answered Jan 19 '18 at 04:50 PM