
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character and can represent only 128 characters, Unicode assigns each character an integer code point from a much larger range: the original 16-bit design allowed for more than 65,000 unique characters, and the standard has since grown to cover more than a million code points. This is a bit of overkill for English and Western-European languages, but it is necessary for other languages, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character-coding format.
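
To make the idea concrete, here is a minimal sketch in Python (not part of the original entry; the sample characters are chosen only for illustration). Python's built-in ord() returns a character's integer Unicode code point, and str.encode() shows how that code point is stored as bytes in the widely used UTF-8 encoding:

    # Characters map to integer code points; code points outside
    # ASCII take more than one byte when stored as UTF-8.
    for ch in ("A", "é", "Ω", "中"):
        code_point = ord(ch)             # the character's integer code point
        utf8_bytes = ch.encode("utf-8")  # how that code point is stored in UTF-8
        print(f"{ch!r}  U+{code_point:04X}  {len(utf8_bytes)} byte(s): {utf8_bytes.hex(' ')}")

Running this prints, for example, 'A' as U+0041 stored in 1 byte and '中' as U+4E2D stored in 3 bytes, showing that ASCII characters keep their familiar values while other characters get larger code points.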






