
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character, Unicode was designed around 16-bit values, which means it can represent 65,536 unique characters (later versions of the standard extend beyond even that limit). This is more capacity than English and Western European languages need, but it is necessary for some other languages, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character coding format.
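
For example, the short Python sketch below (not part of the original entry; it uses Python's built-in ord() and chr() functions) illustrates how individual characters map to integer code points:

# ord() returns a character's integer code point; chr() is the inverse.
print(ord("A"))       # 65    -- within ASCII's 7-bit range (0-127)
print(ord("\u03a9"))  # 937   -- Greek capital omega, outside ASCII's range
print(ord("\u4e2d"))  # 20013 -- a Chinese character, requiring a 16-bit value
print(chr(20013))     # prints the Chinese character back from its code point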






