
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character and can therefore represent only 128 characters, Unicode assigns a unique code point to every character; the original design used 16 bits (about 65,000 characters), and the current standard defines a code space of more than one million code points, stored in practice using encodings such as UTF-8, UTF-16 or UTF-32. This is a bit of overkill for English and Western-European languages, but it is necessary for other languages and scripts, such as Greek, Chinese and Japanese. As the software industry has become increasingly global, Unicode (most often in its UTF-8 encoding) has largely supplanted ASCII as the standard character coding format.
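The idea of "characters as integers" can be seen in a short Python sketch (the sample characters below are arbitrary examples): it prints each character's Unicode code point and the number of bytes it occupies in the UTF-8 and UTF-16 encodings.

    # Show the integer (code point) Unicode assigns to each character,
    # plus its size in two common encodings.
    for ch in ["A", "é", "Ω", "中"]:
        code_point = ord(ch)                       # the character's Unicode code point
        utf8_bytes = len(ch.encode("utf-8"))       # 1-4 bytes, ASCII-compatible
        utf16_bytes = len(ch.encode("utf-16-le"))  # 2 or 4 bytes per character
        print(f"U+{code_point:04X} {ch}  UTF-8: {utf8_bytes} bytes  UTF-16: {utf16_bytes} bytes")

The plain ASCII letter fits in a single UTF-8 byte, while the Greek and Chinese characters take two and three bytes respectively, which is why UTF-8 stays backward-compatible with existing ASCII text.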






