
Unicode

A standard for representing characters as integers, called code points. Unlike ASCII, which uses 7 bits for each character, Unicode was originally designed around 16 bits per character, enough to represent more than 65,000 unique characters, and the standard has since been extended well beyond that range. Such a large character set is overkill for English and Western European languages, but it is necessary for many other writing systems, such as Greek, Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character-encoding format.
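
To make the "characters as integers" idea concrete, the short sketch below (Python, chosen only for illustration; the sample characters and encoding names are assumptions added for this example, not part of the original entry) prints the integer code point assigned to a few characters and the bytes produced by two common encodings.

```python
# A minimal sketch: each character maps to a Unicode code point (an integer),
# and an encoding such as UTF-8 or UTF-16 turns that integer into bytes.

for ch in ["A", "Ω", "漢"]:
    cp = ord(ch)                    # the character's Unicode code point
    utf8 = ch.encode("utf-8")       # variable-length byte encoding
    utf16 = ch.encode("utf-16-be")  # 16-bit code units (big-endian)
    print(f"{ch!r}: U+{cp:04X}  utf-8={utf8.hex()}  utf-16={utf16.hex()}")

# 'A' fits within ASCII's 7-bit range, while 'Ω' and '漢' need code points
# beyond ASCII and more than one byte in UTF-8.
```

Running the sketch shows why ASCII suffices for plain English text but cannot represent Greek or Chinese characters at all.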






