Unicode

A standard for representing characters as integers, known as code points. Unlike ASCII, which uses 7 bits per character and can represent only 128 characters, Unicode defines more than one million possible code points (U+0000 through U+10FFFF). Early versions of the standard used 16 bits per character, enough for about 65,000 characters, but it has since been extended well beyond that limit. This range is far more than English and Western European languages require, but it is necessary for other scripts, such as Greek, Chinese, and Japanese. In practice, code points are stored using encodings such as UTF-8, UTF-16, or UTF-32, and Unicode has largely supplanted ASCII as the standard character coding format.
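As a quick illustration, here is a minimal Python sketch (using only the built-in ord() function and str.encode()) that prints the code point assigned to a few sample characters and how many bytes each occupies in UTF-8; the characters chosen are arbitrary examples, not part of the definition above.

# Each character maps to an integer code point; UTF-8 then stores
# that code point in one to four bytes.
for ch in ["A", "é", "Ω", "中"]:
    code_point = ord(ch)          # integer Unicode code point
    utf8 = ch.encode("utf-8")     # bytes used to store it in UTF-8
    print(f"{ch}: U+{code_point:04X} ({len(utf8)} byte(s) in UTF-8)")

Running this shows that the ASCII letter A fits in a single byte, while the Greek and Chinese characters take two and three bytes respectively, which is also why plain ASCII text is already valid UTF-8.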