Hello, I found this situation!
const uint16_t au16dataSize = sizeof(au16data) / sizeof(au16data[0]);
Serial.println(sizeof(matrix) / au16dataSize);
On the serial monitor I got: 4
I was expecting the result to be 2 (the size of the matrix). Looking around I found that some compilers allocate more memory than needed.
Is it a compiler bug or a code bug?
uint16_t au16data[16]; has sizeof() = 16 * 2 bytes = 32 bytes.
With au16dataSize = sizeof(au16data) / sizeof(au16data[0]) you are computing the number of elements in the array (32 / 2), not its size in bytes. So that will be 16.
The matrix is 2 x 16 uint16_t elements (2 bytes each), so sizeof(matrix) is going to be 2 * 16 * 2 = 64 bytes.
So, the output should be 64 / 16 = 4.
If you want to know how many 16-element
uint16_t vectors the matrix has, the correct expression would be
sizeof(matrix) / sizeof(au16data). That would be 64 / 32 = 2.
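To make the arithmetic concrete, here is a minimal sketch assuming a 16-element au16data and a 2x16 matrix, as described above (an illustration, not necessarily the original code):

#include <Arduino.h>

uint16_t au16data[16];   // 16 elements * 2 bytes each = 32 bytes total
uint16_t matrix[2][16];  // 2 * 16 elements * 2 bytes each = 64 bytes total

// Number of elements in au16data (32 / 2 = 16), not its size in bytes
const uint16_t au16dataSize = sizeof(au16data) / sizeof(au16data[0]);

void setup() {
  Serial.begin(9600);
  Serial.println(sizeof(matrix) / au16dataSize);      // 64 / 16 -> prints 4
  Serial.println(sizeof(matrix) / sizeof(au16data));  // 64 / 32 -> prints 2
}

void loop() {}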
I thought that uint16_t was 16 bytes long instead of 32!
Thanks for your explanation!
Both are wrong – uint16_t is an unsigned, 16-bit integer, i.e. 2 bytes. Range 0 to 65535.
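If you want to verify that on your own board, a quick illustrative check:

void setup() {
  Serial.begin(9600);
  Serial.println(sizeof(uint16_t));  // prints 2: one uint16_t occupies 2 bytes

  uint16_t x = 65535;  // maximum value a uint16_t can hold
  x = x + 1;           // wraps around modulo 65536
  Serial.println(x);   // prints 0
}

void loop() {}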