
Unicode

A standard for representing characters as integers. Unlike ASCII, which uses 7 bits per character and can encode only 128 characters, Unicode assigns each character a code point in a range that runs from U+0000 to U+10FFFF, room for more than a million characters (the original 16-bit design allowed about 65,000, and the range was later extended). This capacity is far more than English and Western European languages need, but it is necessary for other scripts, such as Greek, Chinese and Japanese. As the software industry has become increasingly global, Unicode, most commonly in its UTF-8 encoding, has largely supplanted ASCII as the standard character coding format.
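
As a quick illustration, the following Python 3 sketch (an example, not part of any particular Unicode library) shows how characters map to integer code points and how UTF-8 serializes those code points as one or more bytes, with ASCII characters staying a single byte:

# Every character maps to an integer code point.
print(ord("A"))               # 65    -- same value as in ASCII
print(ord("Ω"))               # 937   -- Greek capital omega, beyond ASCII's 7-bit range
print(ord("中"))              # 20013 -- a CJK character

# chr() is the inverse mapping, from code point back to character.
print(chr(20013))             # prints: 中

# UTF-8 encodes each code point as 1-4 bytes.
print("A".encode("utf-8"))    # b'A'            (1 byte)
print("Ω".encode("utf-8"))    # b'\xce\xa9'     (2 bytes)
print("中".encode("utf-8"))   # b'\xe4\xb8\xad' (3 bytes)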









