
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits and can represent only 128 characters, Unicode defines code points from U+0000 through U+10FFFF, leaving room for more than a million unique characters. Early versions of the standard used a fixed 16-bit form limited to about 65,000 characters; modern Unicode text is stored using the UTF-8, UTF-16 or UTF-32 encodings. This range is far more than English and Western European languages need, but it is necessary for other scripts, such as Greek, Chinese and Japanese. As the software industry has become increasingly global, Unicode has largely supplanted plain ASCII as the standard character coding format, with the ASCII-compatible UTF-8 encoding now dominant on the web.
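The short Python sketch below illustrates the character-to-integer mapping and shows how the same text occupies different numbers of bytes under two common encodings (the sample characters are only examples; any Unicode text behaves the same way):

```python
# Each character maps to an integer Unicode code point.
print(ord("A"))        # 65  (same value as in ASCII)
print(ord("Ω"))        # 937 (Greek capital omega, outside ASCII's 0-127 range)
print(chr(0x65E5))     # '日' (CJK character at code point U+65E5)

# The same text can be stored in different Unicode encodings.
text = "日本語"
print(text.encode("utf-8"))   # b'\xe6\x97\xa5\xe6\x9c\xac\xe8\xaa\x9e' (3 bytes per character)
print(text.encode("utf-16"))  # 2 bytes per character here, plus a leading byte-order mark
print(len(text))              # 3 characters, regardless of how they are encoded
```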






