
It’s Happening Now: Perceptual Computing is Real


It is no longer just a matter of using touch, voice, and keyboards to provide your computer with information. Going forward, how you interact with your computer is going to change. More importantly, your computer will be able to perceive what is happening around it and take that into consideration as well.

Perceptual computing is not new. It has been in the mainstream consumer market for half a decade, thanks to Microsoft and the Kinect. Perceptual computing, however, is not just a toy for gaming consoles; it is about to become a part of computing systems across the entire ecosystem.

What is Perceptual Computing?

In simple terms, perceptual computing is the ability of a computer to recognize what is going on around it. More specifically, the computer can perceive the environment and the users in that environment. The computer can then determine what needs a user might have and react to those needs without the user explicitly providing any additional information.

As mentioned, Microsoft brought the basic concepts of perceptual computing to the forefront when it released the Kinect for the Xbox 360. Soon after releasing the Kinect for the Xbox, Microsoft released a Windows version for desktop and notebook computers. Adoption of that device in the industry was minimal relative to the number of Kinects used with gaming consoles.

Other devices that support portions of perceptual computing are also available, including the Leap Motion controller. Leap Motion, however, is limited mostly to hand and finger gestures rather than the full spectrum of sights and sounds.

Intel’s RealSense Device

The entry that will disrupt the market the most, however, is coming from Intel: the RealSense device. RealSense pairs a depth sensor with cameras to provide 3D vision and an immersive experience. It can perform facial analysis, hand and finger tracking, and speech recognition, and it can be used for augmented reality, including background removal. The device can track ten fingers simultaneously and recognize gestures. For facial tracking, multiple people can be tracked at once, and the software can identify features such as the eyes, mouth, and nose.

What does all of this “percepting” allow your computer to do? With the help of developers using the SDK and a system like RealSense, computers will be able to gain insights about users that can then be applied to the applications being run. A few examples help illustrate this.

Passwords are no longer needed. When your computer is able to recognize you, there is no need for a password. You are your password. With perceptual computing, this becomes possible.
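To make that concrete, here is a minimal sketch of the “you are your password” idea in Python. It assumes a perceptual SDK can hand back a numeric feature vector for a detected face; the face_embedding(), capture_frame(), and unlock_session() helpers are hypothetical placeholders, not actual RealSense API calls.

```python
# Hypothetical sketch: face-as-password login.
# face_embedding() stands in for whatever feature vector a perceptual
# SDK (RealSense, Kinect, etc.) returns for a detected face.
import numpy as np

def is_owner(live_face: np.ndarray, enrolled_face: np.ndarray,
             threshold: float = 0.9) -> bool:
    """Return True if the live face matches the enrolled face closely enough."""
    similarity = np.dot(live_face, enrolled_face) / (
        np.linalg.norm(live_face) * np.linalg.norm(enrolled_face))
    return similarity >= threshold

# enrolled = face_embedding(capture_frame())   # stored once at setup time
# if is_owner(face_embedding(capture_frame()), enrolled):
#     unlock_session()                          # no typed password required
```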

Measuring is simple. With perceptual computing, the computing system can determine depth, which in turn allows distances to be calculated. By applying this technology to devices such as cameras and phones, you’ll be able to point at something and determine how far away it is.
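As a rough illustration of how depth turns into distance, the sketch below back-projects two depth-camera pixels into 3D points using the standard pinhole-camera formula and measures the span between them. The focal lengths and principal point are made-up intrinsics; a real depth SDK would supply the actual values.

```python
# Illustrative sketch: measuring a real-world span from two depth pixels.
import math

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters to a 3D point (pinhole model)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def span(p1, p2):
    """Euclidean distance between two 3D points, in meters."""
    return math.dist(p1, p2)

# Assumed intrinsics (fx, fy, cx, cy) for a 640x480 depth stream:
# left = deproject(300, 240, 1.20, fx=615, fy=615, cx=320, cy=240)
# right = deproject(420, 240, 1.22, fx=615, fy=615, cx=320, cy=240)
# print(f"Measured width: {span(left, right):.2f} m")
```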

In addition to measuring distance, gestures and other movements can be tracked. For example, the Microsoft Kinect can track the movement of a person’s chest, which allows things such as heart and breathing rates to be determined.
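One hedged sketch of how that could work: if a depth camera reports the distance to a person’s chest many times per second, the periodic rise and fall can be turned into a breathing rate by finding the dominant low-frequency peak. The sampling rate and frequency band below are assumptions, not values taken from the Kinect or RealSense SDKs.

```python
# Rough sketch: estimating breaths per minute from chest-depth readings.
import numpy as np

def breathing_rate_bpm(chest_depths, sample_rate_hz=30.0):
    """Estimate breathing rate from a periodic chest-motion signal."""
    signal = np.asarray(chest_depths, dtype=float)
    signal -= signal.mean()                      # remove the static offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.1) & (freqs <= 1.0)       # roughly 6-60 breaths per minute
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```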

Checking a person’s emotions is possible. By analyzing facial expressions, the computer can estimate how a person is feeling. Intel’s RealSense developer tools include a set of routines that let a developer tap into this data and apply it to an application. If an application can tell that a person is getting angry, it could adjust what is happening to try to rectify the situation; for example, it could offer to launch the help system.
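A tiny sketch of that reaction loop is shown below. The emotion scores, threshold, and helper names are illustrative assumptions only; the real RealSense SDK exposes its own routines for this data.

```python
# Hypothetical sketch: nudging an application when the user looks frustrated.
def respond_to_mood(emotion_scores: dict, anger_threshold: float = 0.7) -> str:
    """Decide what the app should do based on per-emotion confidence scores."""
    if emotion_scores.get("anger", 0.0) >= anger_threshold:
        return "offer_help"          # e.g., suggest opening the help system
    return "continue"

# action = respond_to_mood({"anger": 0.82, "joy": 0.05})
# if action == "offer_help":
#     show_help_prompt()             # hypothetical UI call
```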


Nearly all of the capabilities of perceptual computing devices are open to developers to build into applications.

How Real is Perceptual Computing?

How real is all of this? As mentioned, the Microsoft Kinect and Leap Motion devices have been around for a while. Intel’s RealSense technology is also available today; in fact, you can find it on a number of devices, including:

  • Asus N441 JQ – A 15” Ultrabook
  • Asus ROG G5771 JM – A 17” gaming laptop
  • Asus X751LD – A 17” everyday notebook
  • HP Envy 15t Touch RealSense – A full-featured laptop
  • Dell Inspiron 15 5548 – A lightweight, slim laptop
  • Acer Aspire V 17 Nitro – A 17” high-performance notebook
  • Lenovo ThinkPad Yoga 15 – A thin, lightweight convertible
  • Lenovo ThinkPad E550 – A thin, full-featured notebook
  • Lenovo B5030 – An all-in-one with a 23.8” touch display
  • Dell Inspiron 23” 7000 – An all-in-one with a 23” thin touch screen
  • HP Sprout – A “creativity station”

These are just some of the devices that support RealSense today. As you can see, with major players like HP, Asus, Dell, Acer, and Lenovo supporting the technology, it is just a matter of time before perceptual computing hardware becomes standard, in the same way a video camera is standard in most new computing devices today.

Your computer will soon perceive you. When it does, you’ll want to be nicer to your computer because it will know when you are mad…


Bradley L. Jones
Bradley L. Jones is the Director and Editor in Chief of the Developer.com Network of sites, which includes Developer.com, Codeguru, DevX, and HTMLGoodies. He is an internationally bestselling author who has written more than 20 developer-related books on a variety of topics, ranging from C++ to Windows and from C# to Web 2.0.

This article was originally published on March 26, 2015