
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits per character and can therefore represent only 128 characters, Unicode assigns each character a numeric code point in a space of more than a million values (early versions of the standard used 16-bit values, limited to about 65,000 characters). This is more than English and Western European languages need, but it is necessary for other scripts, such as Greek, Chinese and Japanese. In practice, code points are stored using encoding forms such as UTF-8, UTF-16 and UTF-32. Many analysts believe that as the software industry becomes increasingly global, Unicode will eventually supplant ASCII as the standard character coding format.
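As a minimal sketch of the relationship between code points and byte encodings (assuming Python 3, where a string is a sequence of Unicode code points), the following prints each character's code point along with its UTF-8 and UTF-16 byte sequences:

    for ch in ["A", "é", "中", "😀"]:
        code_point = ord(ch)                  # the character's Unicode code point
        utf8 = ch.encode("utf-8")             # variable width: 1 to 4 bytes per character
        utf16 = ch.encode("utf-16-be")        # 2 or 4 bytes (one or two 16-bit units)
        print(f"U+{code_point:04X}  utf-8: {utf8.hex()}  utf-16: {utf16.hex()}")

The ASCII letter "A" (U+0041) fits in a single UTF-8 byte, while the emoji (U+1F600) needs four UTF-8 bytes and two 16-bit units in UTF-16, which is why modern Unicode cannot be described as a fixed 16-bit code.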






