Unicode


A standard for representing characters as integers. Unlike ASCII, which uses 7 bits for each character and can represent only 128 characters, Unicode assigns every character a numeric code point. The standard was originally designed around 16-bit values, enough for more than 65,000 unique characters, and today defines a code space of more than a million code points. This is more than English and Western-European languages need, but it is necessary for other writing systems, such as Greek, Chinese and Japanese. As the software industry has become increasingly global, Unicode, most often in its UTF-8 encoding, has largely supplanted plain ASCII as the standard character coding format.
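
To sketch what "representing characters as integers" means in practice, the short Python example below (an illustration chosen for this entry, not part of the standard itself) prints the code point of a few characters and the number of bytes each needs in the common UTF-8 and UTF-16 encodings. The sample characters are arbitrary; any text would do.

# Illustrative sketch: Unicode assigns each character an integer code point,
# and encodings such as UTF-8 and UTF-16 turn that code point into bytes.
for ch in ["A", "é", "中", "😀"]:
    code_point = ord(ch)                     # the integer Unicode assigns to this character
    utf8_len = len(ch.encode("utf-8"))       # bytes needed in UTF-8 (1 to 4)
    utf16_len = len(ch.encode("utf-16-le"))  # bytes needed in UTF-16 (2 or 4)
    print(f"U+{code_point:04X}  UTF-8: {utf8_len} bytes  UTF-16: {utf16_len} bytes")

The last character, an emoji whose code point (U+1F600) lies outside the original 65,536-character range, shows why modern Unicode cannot be described as a fixed 16-bit code.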
