Double size allocated on matrix

Hello, I found this situation!

uint16_t au16data[16];
uint16_t matrix[2][16];
const uint16_t au16dataSize (sizeof(au16data) / sizeof(au16data[0]));

Serial.print("Matrix: ");
Serial.println(sizeof(matrix) / au16dataSize);
Serial.print("Array: ");
Serial.println(au16dataSize);

On serial monitor I got:

Matrix: 4
Array: 16

I was expecting the matrix size to be 2. Looking around, I found that some compilers were allocating more memory than needed.

Is it a compiler bug or a code bug?


For uint16_t au16data[16], sizeof(au16data) = 16 * 2 bytes = 32 bytes.

With au16dataSize (sizeof(au16data) / sizeof(au16data[0])) you are computing the number of elements in the array (32 / 2), not its size in bytes. So that will be 16.

The matrix holds 2 x 16 uint16_t elements (2 bytes each), so sizeof(matrix) = 2 * 16 * 2 = 64 bytes.


sizeof(matrix) / au16dataSize should therefore be 64 / 16 = 4.

So, the output

Matrix: 4

is correct.

If you want to know how many 16-element uint16_t vectors the matrix has, the correct expression would be sizeof(matrix) / sizeof(au16data). That would be 64 / 32 = 2.


I thought that uint16_t was 16 bytes long instead of 32!

Thanks for your explanation!

Both are wrong – uint16_t is an unsigned, 16-bit integer, i.e. 2 bytes, with a range of 0 to 65535.