We are going to uncover the super cool secret of how computers understand the letters you type, like 'A', 'B', 'C', and 'D', using a special code called binary, which has only two digits: 0 and 1. When you press a key on your keyboard, it's like sending a message to your computer. But how do we turn letters, numbers, and special characters into 1s and 0s? We use a secret code called ASCII (pronounced "ask-ee"). It's like a giant chart that matches each letter, number, or symbol with a unique combination of 1s and 0s.

When it was first introduced, ASCII supported English-language text only. As 8-bit computers became common during the 1970s, vendors and standards bodies began extending the ASCII character set with 128 additional character values. Extended ASCII incorporates some non-English characters, but it is still insufficient for comprehensively encoding text in most of the world's languages. Computers nowadays primarily use Unicode, specifically the UTF-8 encoding, to represent characters. ASCII is still a part of Unicode, and UTF-8 extends it to cover a far broader range of characters, including those from many other languages as well as symbols.
ASCII stands for American Standard Code for Information Interchange. It's a set of rules that computers use to represent letters, numbers, and symbols as numbers. In ASCII, each character is assigned a unique code. For example, the ASCII code for the digit '1' is 49, for the letter 'A' it is 65, and for the letter 'a' it is 97.
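You can explore the ASCII chart yourself with a few lines of Python; here is a minimal sketch using the built-in `ord` and `chr` functions, which convert between a character and its code:

```python
# ord() looks up a character's code; chr() goes the other way.
print(ord('1'))   # 49
print(ord('A'))   # 65
print(ord('a'))   # 97
print(chr(66))    # B

# format(..., '08b') shows the same code as eight binary digits.
print(format(ord('A'), '08b'))  # 01000001
```

Notice that uppercase and lowercase versions of the same letter get different codes, which is why computers treat 'A' and 'a' as different characters.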
The Binary Code: How 'A' Becomes 01000001