Unicode
A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character, Unicode was originally designed as a 16-bit code, allowing it to represent more than 65,000 unique characters; the standard has since been extended to cover more than a million code points. This capacity is more than English and Western-European languages need, but it is necessary for other writing systems, such as Greek, Chinese, and Japanese. As the software industry has become increasingly global, Unicode (most often in its UTF-8 encoding) has largely supplanted ASCII as the standard character coding format.
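As a rough illustration (a minimal Python sketch, not part of the original definition), the snippet below prints the Unicode code point of an ASCII character, a Greek character, and a Chinese character, along with how many bytes each needs in the UTF-8 and UTF-16 encodings:

```python
# Compare an ASCII-range character with non-Latin characters:
# show each character's Unicode code point and the number of bytes
# required to store it in UTF-8 and UTF-16.
for ch in ["A", "Ω", "漢"]:
    code_point = ord(ch)                     # the character's Unicode code point
    utf8_len = len(ch.encode("utf-8"))       # bytes when encoded as UTF-8
    utf16_len = len(ch.encode("utf-16-be"))  # bytes when encoded as UTF-16 (big-endian, no BOM)
    print(f"U+{code_point:04X} {ch!r}  UTF-8: {utf8_len} byte(s)  UTF-16: {utf16_len} byte(s)")
```

Running this shows that "A" (U+0041) fits in a single UTF-8 byte, while "Ω" (U+03A9) and "漢" (U+6F22) need two and three UTF-8 bytes respectively, even though all three fit in two bytes of UTF-16.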