
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits per character and can encode only 128 characters, Unicode assigns a unique numeric code point to every character. The original design used 16 bits per character, enough for more than 65,000 unique characters; the standard has since grown to more than one million possible code points, which are stored using encoding forms such as UTF-8 and UTF-16. This range is far more than English and Western European languages need, but it is necessary for other scripts, such as Greek, and especially for languages with very large character sets, such as Chinese and Japanese. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character coding format.
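As a quick illustration of the definition above, the short sketch below (written in Python purely as an assumed example language; it is not part of the original definition) prints the Unicode code point and UTF-8 byte sequence of a few characters, showing how ASCII characters occupy the first 128 code points while Greek and Chinese characters map to much larger integers.

    # Print each character's Unicode code point and its UTF-8 byte sequence.
    for ch in ["A", "Ω", "中", "あ"]:
        print(f"{ch!r}: code point U+{ord(ch):04X} ({ord(ch)}), "
              f"UTF-8 bytes: {ch.encode('utf-8')}")

Running this shows, for example, that 'A' is code point U+0041 (65, the same value ASCII assigns it) while '中' is U+4E2D (20013) and takes three bytes in UTF-8.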