Unicode
A standard for representing characters as integers. Unlike ASCII, which uses 7 bits per character and can encode only 128 characters, Unicode assigns every character a unique integer code point. Early versions of the standard used 16 bits, enough for more than 65,000 characters; the modern standard defines room for over a million code points, stored using encodings such as UTF-8 and UTF-16. This is more than English and Western European languages need, but it is necessary for other languages and scripts, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character coding format.
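To make the idea concrete, the short Python sketch below (an illustration, not part of the standard itself) prints the integer code point for a few characters and shows how the same character is serialized under two common Unicode encodings. The characters chosen are examples only.

    # A minimal sketch illustrating Unicode code points vs. ASCII bytes.
    for ch in ["A", "é", "Ω", "中"]:
        cp = ord(ch)                    # integer code point assigned by Unicode
        utf8 = ch.encode("utf-8")       # variable-length encoding, 1-4 bytes
        utf16 = ch.encode("utf-16-be")  # 16-bit code units, 2 or 4 bytes
        print(f"U+{cp:04X}  {ch!r}  utf-8={utf8.hex()}  utf-16={utf16.hex()}")

    # "A" fits within ASCII's 7 bits; "Ω" and "中" have code points far
    # beyond 127, which only a Unicode encoding can represent.

Running it shows, for example, that "A" is code point U+0041 (the same value ASCII uses), while "中" is U+4E2D, well outside ASCII's range.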