
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits to encode 128 characters, Unicode assigns each character a code point in the range 0 to 0x10FFFF, a space of more than 1.1 million possible values. (Unicode's original design used 16 bits per character, enough for about 65,000 characters; the code space was later extended.) Code points are stored using encoding forms such as UTF-8, UTF-16 and UTF-32. This capacity is far more than English and Western European languages need, but it is necessary to cover other writing systems, such as Greek, Chinese and Japanese, in a single standard. As the software industry has become increasingly global, Unicode (most often in its UTF-8 encoding) has largely supplanted ASCII as the standard character coding format.
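A minimal Python sketch of the character-to-integer mapping described above (Python's built-in string type is Unicode-based; the specific sample characters are chosen only for illustration):

    # Each character maps to an integer code point.
    for ch in "A", "Ω", "中":
        cp = ord(ch)                 # character -> code point
        print(f"{ch!r} -> U+{cp:04X} ({cp})")
        assert chr(cp) == ch         # code point -> character

    # The same code points can be stored in different encoding forms.
    s = "A中"
    print(s.encode("utf-8"))         # 1 to 4 bytes per character
    print(s.encode("utf-16"))        # 2 or 4 bytes per character (plus a BOM)
    print(s.encode("utf-32"))        # 4 bytes per character (plus a BOM)

    # ASCII covers only code points 0-127, so non-ASCII text cannot be encoded.
    try:
        "中".encode("ascii")
    except UnicodeEncodeError as e:
        print("not representable in ASCII:", e.reason)

Note that "A" occupies a single byte in UTF-8 (ASCII text is valid UTF-8 unchanged), while "中" takes three bytes; this backward compatibility is a major reason UTF-8 became the dominant encoding.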






