Question:
How does a computer store ASCII characters?
2013-06-09 16:51:13 UTC
I have a basic understanding of binary and ASCII and general character encoding concepts, but my question is HOW are ASCII characters converted from binary, physically? I know there is a chart telling the computer that a particular set of 7 digits equals a particular letter, but if the computer only reads binary, how is it able to even store a character chart? How does it store the symbol we know as an "A" or a "T", etc., to even be able to convert a set of binary sequences to characters? At the end of the day, where is this character chart found physically on the computer, and how, at the very most basic level, can a computer store such a chart if ultimately it only interprets binary?

I have worked in online marketing for several years and am still pretty young, so I am relatively computer savvy, but I feel I am lacking the most basic fundamental understanding of how a computer itself physically works, so please also correct me if anything I said is inaccurate. I have spent several hours looking for this answer, but have only found bits and pieces. I understand the circuitry gates and electrical pulses and how these represent binary code, and I understand that there are character schemes which use binary to represent characters, and that codes and programs are written from there, but I just don't get the in-between part of how a computer can physically change binary electrical pulses into English characters....

Thank You!
Four answers:
??????
2013-06-09 17:01:26 UTC
There is no such thing as a character chart.

Characters are only drawings, pictures.

The computer just needs to know the number of the drawing; this is the character code.

And once the computer knows the number, it shows the drawing with the corresponding number. A drawing is also binary: for every pixel in the rectangle, the computer has to know whether it is black or white (1 or 0).

No matter what concept the computer handles, it always boils down to 0's and 1's.



Yes, that is right! You got it.

The binary data has to be interpreted differently depending on context: sometimes it means a pixel colour, sometimes a character or a number, or a mouse key, ...
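This answer's two ideas (a character is just a number, and its drawing is itself binary) can be sketched in a few lines of Python. The 8x8 bitmap below is a hypothetical glyph for 'A', invented for illustration; each row is one byte, and each bit is one pixel:

```python
# Hypothetical 8x8 bitmap for the letter 'A': one byte per row,
# bit 1 = lit pixel, bit 0 = dark pixel. This is made-up data,
# not a real font, but it shows how a "drawing" is stored as binary.
GLYPH_A = [
    0b00011000,
    0b00100100,
    0b01000010,
    0b01111110,  # the crossbar of the 'A'
    0b01000010,
    0b01000010,
    0b01000010,
    0b00000000,
]

code = ord('A')  # the character number: 65 in ASCII
print(code)

# Turn each bit into a visible pixel ('#' for 1, '.' for 0).
for row in GLYPH_A:
    print(''.join('#' if row & (1 << (7 - i)) else '.' for i in range(8)))
```

So both the character code (65) and the picture the screen shows are nothing but 0's and 1's; only the interpretation differs.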
roger
2013-06-10 09:55:46 UTC
It is all binary inside the computer. Your graphics processor has a chart (map) that translates the binary character code into a map of the shape of the glyph that it prints on the screen.

So if you tell the computer to print the character 'A' at some place on the screen, the processor will get the code for 'A', which is 65 in ASCII (in binary of course, since that is ALL the computer understands). It will then look it up in its font table (map... chart...) to find the shape of the character (the glyph). It then turns on the appropriate pixels on the screen to show the character.
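The code-to-glyph lookup described above can be sketched as a dictionary keyed by character code. The `FONT_TABLE` and its tiny 3x5 glyphs are hypothetical stand-ins for a real font, just to show the lookup step:

```python
# A minimal sketch of a font table: character code -> glyph bitmap.
# The glyphs here are made-up 3x5 bitmaps, one 3-bit row per entry.
FONT_TABLE = {
    65: [0b010, 0b101, 0b111, 0b101, 0b101],  # tiny 'A'
    84: [0b111, 0b010, 0b010, 0b010, 0b010],  # tiny 'T'
}

def render(ch):
    """Look up the glyph for a character and draw it with '#' pixels."""
    glyph = FONT_TABLE[ord(ch)]  # step 1: code -> shape lookup
    lines = []
    for row in glyph:            # step 2: turn bits into pixels
        lines.append(''.join('#' if row & (1 << (2 - i)) else ' '
                             for i in range(3)))
    return '\n'.join(lines)

print(render('A'))
print(render('T'))
```

A real graphics stack does the same two steps, only with much larger tables and with pixels on a screen instead of characters in a terminal.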
Michael
2013-06-09 17:24:28 UTC
The technical details of character glyphs, character encodings, and character maps differ greatly between systems.



In graphical environments the process is handled by the operating system. It maps a character represented in some encoding format (such as ASCII, UCS, UTF-8, UTF-16) through a character map (which might be selected based on the current locale) to the respective character glyph to render. This is still a simplification; the process is a bit more complicated when we throw in different font files and character sets. In other words: ASCII 65 -> offset into the character map to locate the 'A' glyph -> render the glyph at the current cursor position. The glyph may be a bitmap of the character, but most systems go through a standard such as TrueType that describes glyphs.
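The encoding step this answer mentions, before any glyph is drawn, can be seen directly in Python: the same character becomes different byte sequences under different encodings, and rendering only begins once the bytes are decoded back to character codes.

```python
# The same character stored under different encodings.
s = 'A'
print(s.encode('ascii'))     # one byte: 65
print(s.encode('utf-8'))     # also one byte, since 'A' is in the ASCII range
print(s.encode('utf-16-le')) # two bytes: 65, 0

# Outside the ASCII range, the encodings diverge more visibly.
print('é'.encode('utf-8'))   # two bytes: 0xC3 0xA9
```

Whatever the byte layout, decoding recovers the same character number, which is what the font machinery then maps to a glyph.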
2016-09-17 15:06:15 UTC
It depends


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.