
legacy

In computing, the term legacy describes outdated or obsolete technology, equipment, or software that is still in use by an individual or organization. Legacy implies that the system is out of date or due for replacement; however, it may still be in good working order, so the business or individual that owns it may choose not to upgrade or replace it. Vendor or manufacturer support is typically no longer available for legacy systems and applications.
