Using 8-bit font

I’m using a font with extended characters (0x20-0xff). This works OK, but for each compilation I get:
Unicode decode error has occurred, please remove invalid (non-ASCII or non-UTF8) characters
Of course I cannot convert the source file to Unicode or UTF-8: my characters are validly coded from 0x20 to 0xff, and converting them would stop the font from displaying them correctly.
Is there an option to make the compiler behave?

Too many open questions:

Which font is it?

What are the extended characters used for?

Which compiler is involved?

The compiler ignores the font and the displayed characters.
You must use the encoding required by the compiler (which is probably UTF-8).

Why is my post split into multiple entries? My original post is just one.

It’s a GFX font created with fontconvert using calibri.ttf as input. I use PlatformIO with VS Code.

If I use UTF-8 encoding, the compiler encodes every character above 0x7f in the text I want to send to the display as two bytes. The font is designed to take one byte as input for each character.
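For example (a minimal sketch of what I mean; the byte values are just the standard encodings of “Å”):

#include <Arduino.h>
#include <string.h>

void setup() {
    Serial.begin(115200);
    // Saved as UTF-8, "Å" is stored as the two bytes 0xC3 0x85, so strlen() reports 2;
    // saved as Windows-1252 it is the single byte 0xC5 that the GFX font expects.
    const char s[] = "Å";
    Serial.println((int)strlen(s));  // 2 with a UTF-8 source file, 1 with Windows-1252
}

void loop() {
}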

Because I have quoted the parts I have questions about.

I think I misunderstood you.

Are you talking about code with which you control a display that is connected to a microcontroller?

Or is it about the encoding in VS Code?

Or where are you using the font? On your PC?

A 1.8" TFT screen connected to an ESP32. Everything works. The encoding is Windows-1252 and must stay that way.
I just want to get rid of the annoying compiler error message about the encoding quoted in my first post.

Now things become a bit clearer (to me).

Where is Windows-1252 used, what is encoded with it, and why?

It would be very helpful if you could share a minimal project that makes the error message reproducible.

Just compile with this:
int g_lineCounter = 0;

sprintf(buf, "%d: ÅÄÖåäöÉéÜüÇç", g_lineCounter++);

No issues here…

main.cpp

#include <Arduino.h>

void setup() {
    Serial.begin(115200);
    char buf[100];
    int g_lineCounter = 0;

    sprintf(buf, "%d: ÅÄÖåäöÉéÜüÇç", g_lineCounter++);
    Serial.print(buf);
}

void loop() {
}

platformio.ini

[env:esp32dev]
platform = espressif32 @ 6.9.0
board = esp32dev
framework = arduino

Output:

Processing esp32dev (platform: espressif32 @ 6.9.0; board: esp32dev; framework: arduino)
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Verbose mode can be enabled via `-v, --verbose` option
CONFIGURATION: https://docs.platformio.org/page/boards/espressif32/esp32dev.html
PLATFORM: Espressif 32 (6.9.0) > Espressif ESP32 Dev Module
HARDWARE: ESP32 240MHz, 320KB RAM, 4MB Flash
DEBUG: Current (cmsis-dap) External (cmsis-dap, esp-bridge, esp-prog, iot-bus-jtag, jlink, minimodule, olimex-arm-usb-ocd, olimex-arm-usb-ocd-h, olimex-arm-usb-tiny-h, olimex-jtag-tiny, tumpa)
PACKAGES: 
 - framework-arduinoespressif32 @ 3.20017.241212+sha.dcc1105b 
 - tool-esptoolpy @ 1.40501.0 (4.5.1) 
 - toolchain-xtensa-esp32 @ 8.4.0+2021r2-patch5
LDF: Library Dependency Finder -> https://bit.ly/configure-pio-ldf
LDF Modes: Finder ~ chain, Compatibility ~ soft
Found 33 compatible libraries
Scanning dependencies...
No dependencies
Building in release mode
Retrieving maximum program size .pio\build\esp32dev\firmware.elf
Checking size .pio\build\esp32dev\firmware.elf
Advanced Memory Usage is available via "PlatformIO Home > Project Inspect"
RAM:   [=         ]   6.6% (used 21464 bytes from 327680 bytes)
Flash: [==        ]  20.4% (used 267245 bytes from 1310720 bytes)
=========================================================================================================================================================================================== [SUCCESS] Took 2.14 seconds ==========================================================================================================================================================================================

Encoding of main.cpp is set to UTF-8

Of course that works!
I’m sorry, but did you read my original post? I guess not. The data sent to the display cannot be UTF-8, as the font is designed to take one byte per character, not multi-byte UTF-8 sequences.

I did, but I might not have understood correctly. That’s why I’m asking you a few questions about fonts etc.
Your first post was very unclear about what you want to achieve…

I asked you for a minimal example to reproduce the error.

This is still unclear to me!?

I guess you should use the -fexec-charset and -finput-charset options: Preprocessor Options (Using the GNU Compiler Collection (GCC))

Use build_flags in PlatformIO to supply them.
Be warned though that this influences all strings in your binary, not only the ones you send to the display.
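A minimal sketch of how that could look in platformio.ini (the charset names are just an example; GCC needs an iconv implementation to do the conversion):

[env:esp32dev]
platform = espressif32
board = esp32dev
framework = arduino
build_flags =
    -finput-charset=UTF-8
    -fexec-charset=Windows-1252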

Thanks for the hint. Using
build_flags = -fexec-charset=Windows-1252 -finput-charset=Windows-1252
or either one of them on its own, I get:
cc1plus.exe: error: no iconv implementation, cannot convert from UTF-8 to Windows-1252

I see why now:
"… you will have problems with encodings that do not fit exactly in ‘wchar_t’ "

Guess I have to find another cross-compiler that accepts 8-bit encodings.

Another solution is to hex encode strings for your display.
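For example (a sketch; the byte values assume the glyph codes follow Windows-1252, where Å = 0xC5, Ä = 0xC4, Ö = 0xD6):

sprintf(buf, "%d: \xC5\xC4\xD6", g_lineCounter++);  // "ÅÄÖ" as single bytes

That keeps the source file pure ASCII, so the compiler has nothing to complain about, while the display still receives one byte per character.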

Solved in Using ICONV for ANSI character set - #7 by peakpeak-github