
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character, Unicode assigns every character a numeric code point in a range that can hold more than one million unique characters; the first 65,536 code points make up the Basic Multilingual Plane, and code points are stored using encoding forms such as UTF-8, UTF-16 and UTF-32. This capacity is far more than English and Western-European languages require, but it is necessary for other writing systems, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character coding format.
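To make the distinction between a character's integer code point and the bytes an encoding produces concrete, here is a minimal Python sketch; the sample characters are chosen only for illustration.

```python
# Print each character's Unicode code point and its UTF-8 / UTF-16 byte forms.
# Uses only built-in functions: ord() returns the code point, str.encode()
# produces the bytes for a given encoding.

for ch in ["A", "é", "Ω", "漢"]:
    code_point = ord(ch)                  # integer code point assigned by Unicode
    utf8_bytes = ch.encode("utf-8")       # 1-4 bytes per character
    utf16_bytes = ch.encode("utf-16-be")  # 2 or 4 bytes per character
    print(f"U+{code_point:04X} {ch!r} utf-8={utf8_bytes.hex()} utf-16={utf16_bytes.hex()}")
```

Note that a plain ASCII character such as "A" keeps its 7-bit ASCII value (U+0041) and encodes to a single byte in UTF-8, which is one reason UTF-8 is widely used as a backward-compatible bridge from ASCII.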






