Unicode


Unicode is a standard for representing characters as integers, called code points. Unlike ASCII, which uses 7 bits for each character and so can represent only 128 characters, Unicode defines code points in the range U+0000 through U+10FFFF, room for more than a million unique characters; the original 16-bit design allowed for about 65,000. That range is far more than English and Western European languages require, but it is necessary for other writing systems, such as Greek, Chinese and Japanese. In practice, Unicode text is stored using an encoding form such as UTF-8, UTF-16 or UTF-32. As the software industry has become increasingly global, Unicode has largely supplanted ASCII as the standard character coding format.
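To make the idea of code points concrete, here is a minimal sketch (Python and the sample characters are chosen only for illustration; the standard itself is language-neutral). It prints the integer code point each character maps to and how that code point is serialized by two common encoding forms, and shows that only code points below 128 survive a round trip through 7-bit ASCII.

# Print each character's Unicode code point and its UTF-8 / UTF-16 byte sequences.
for ch in ["A", "é", "Ω", "漢"]:
    code_point = ord(ch)              # the integer Unicode assigns to this character
    utf8 = ch.encode("utf-8")         # 1-4 bytes per character
    utf16 = ch.encode("utf-16-be")    # 2 or 4 bytes per character
    print(f"U+{code_point:04X}  {ch}  utf-8={utf8.hex()}  utf-16={utf16.hex()}")

# Only code points below 128 are representable in ASCII:
"A".encode("ascii")       # b'A'
# "é".encode("ascii")     # would raise UnicodeEncodeError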
