
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character, Unicode was designed around 16-bit values, which means it can represent more than 65,000 unique characters (later revisions extended the range to over a million code points). This is far more capacity than English and Western European languages require, but it is necessary for many other languages, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character encoding format.
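
A minimal sketch in Python 3 (where str values are sequences of Unicode code points) illustrating the difference; the specific characters used here are just examples:

# Each character maps to an integer code point. ASCII covers only the
# first 128 values, so English text fits in 7 bits per character,
# while Greek and Chinese characters need larger code points.
for ch in ("A", "Ω", "中"):
    print(ch, hex(ord(ch)))        # A -> 0x41, Ω -> 0x3a9, 中 -> 0x4e2d

# Encoding the same text: one byte per character in ASCII versus
# two bytes per character in a 16-bit encoding such as UTF-16.
print("ABC".encode("ascii"))       # b'ABC' (3 bytes)
print("ABC".encode("utf-16-be"))   # b'\x00A\x00B\x00C' (6 bytes)
print("中".encode("utf-16-be"))    # two bytes: 0x4e 0x2d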






