ANSI vs Unicode

Why is it important for software programming?

To keep it straightforward:

A byte is binary: 8 bits.

ANSI is an extension of ASCII, so when we speak of ANSI we are also speaking about ASCII.
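As a quick sketch of what "extension" means here (in Python, assuming Windows-1252 as the ANSI code page): bytes 0-127 are plain ASCII, and the ANSI code page adds characters in the 128-255 range.

```python
# ASCII covers code points 0-127; an ANSI code page such as
# Windows-1252 keeps those same bytes and adds characters 128-255.
ascii_char = "a"
extended_char = "é"  # not in ASCII, but present in Windows-1252

print(ascii_char.encode("ascii"))      # b'a'   -> within the 7-bit ASCII range
print(extended_char.encode("cp1252"))  # b'\xe9' -> one byte in the ANSI extension

try:
    extended_char.encode("ascii")
except UnicodeEncodeError:
    print("'é' is not an ASCII character")
```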

ANSI and Unicode are encodings. An encoding defines how character data is represented in binary.

Refer to

http://www.joelonsoftware.com/articles/Unicode.html

ANSI data is stored as 1 byte (8 bits) per character, while Unicode data is usually stored as 2 bytes (16 bits) per character. (That is, 16 bits are used to represent one character.)

In VB6 and VB.NET, a String variable is Unicode by default (2 bytes per character).

To illustrate this, consider how the same character is encoded under each scheme.
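The original code listing is not reproduced here, but the same idea can be sketched in Python (assuming Windows-1252 as the ANSI code page, and UTF-16LE as the 2-bytes-per-character Unicode form that .NET strings use internally):

```python
text = "a"

ansi_bytes = text.encode("cp1252")        # ANSI: 1 byte per character
unicode_bytes = text.encode("utf-16-le")  # UTF-16: 2 bytes per character

print(len(ansi_bytes))     # 1
print(len(unicode_bytes))  # 2
print(ansi_bytes)          # b'a'
print(unicode_bytes)       # b'a\x00'
```

The letter is the same, but the byte sequences on the wire differ: one byte for ANSI, two bytes (the ASCII value plus a zero byte) for UTF-16.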


So, in terms of binary, the ANSI encoding of "a" is different from the Unicode encoding of "a".

This causes a problem when communicating with a different system, such as a home alarm system. Say you write software that talks to home-alarm hardware, and you talk to the firmware over a TCP/IP socket. If the firmware expects ANSI but you send the data as Unicode, the communication will not work.

You have to convert the data to ANSI-encoded bytes before sending it over.
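A minimal sketch of that conversion in Python (the command string, host, and port are hypothetical, and Windows-1252 is assumed as the firmware's ANSI code page):

```python
import socket

message = "ARM ZONE 1"  # hypothetical command understood by the firmware

# Encode the Unicode string as ANSI bytes before it goes on the wire;
# sending message.encode("utf-16-le") instead would insert zero bytes
# that an ANSI-only firmware cannot parse.
payload = message.encode("cp1252")

print(payload)  # b'ARM ZONE 1'

# Hypothetical address of the alarm panel:
# with socket.create_connection(("192.168.1.50", 5000)) as sock:
#     sock.sendall(payload)
```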
