The most useful example of non-verbal communication:
A gesture is a form of non-verbal communication made with a part of the body, used instead of or in combination with verbal communication. The language of gesture is rich in ways for individuals to express a variety of feelings and thoughts, from contempt and hostility to approval and affection.
Most people use gestures and body language in addition to words when they speak; some ethnic groups and languages use them more than others do, and the amount of such gesturing that is considered culturally acceptable varies from one location to the next.
Hand gestures, i.e., gestures performed by one or two hands, are the most numerous category of gestures because the human hand can assume a huge number of clearly discernible configurations, a fact of particular importance for sign languages.
Types of gestures:
Although some gestures, such as the ubiquitous act of pointing, differ little from one place to another, most gestures do not have invariable or universal meanings, having specific connotations only in certain cultures.
Different types of gestures are distinguished. The best-known type is the so-called emblem or quotable gesture (see the examples below). These are culture-specific gestures that can be used as replacements for words. Communities have repertoires of such gestures.
A single emblematic gesture can have very different significance in different cultural contexts, ranging from complimentary to highly offensive.
Another type comprises the gestures we use when we speak. These gestures are closely coordinated with speech: the meaningful part of the gesture is temporally synchronized (that is, occurs at the same time) with the co-expressive parts of speech.
For example, a gesture that depicts the act of throwing will be synchronous with the word 'threw' in the utterance "and then he threw the ball right into the window."
Other gestures, such as the so-called beat gestures, are used in conjunction with speech, keeping time with its rhythm and emphasizing certain words or phrases. These types of gestures are integrally connected to speech and thought processes.
|- See how your hands can help you communicate in any language -|
|BBC - 10 October 2013
Ultrasound chip offers gesture control for mobiles:
Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year.
Norwegian start-up Elliptic Labs is in talks with Asian handset manufacturers to get the chip embedded in devices.
The technology works via an ultrasound chip that uses sound waves to interpret hand movements.
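The article doesn't detail the chip's signal processing, but the underlying echo-ranging principle is simple: a transmitted ultrasound pulse reflects off the hand, and the round-trip time of the echo gives the distance. A minimal sketch with illustrative numbers (not Elliptic's actual implementation):

```python
# Echo-ranging: a hand reflects an ultrasound pulse back to the
# microphone; distance follows from the round-trip time.
SPEED_OF_SOUND = 343.0  # metres per second, in air at roughly 20 C

def distance_from_echo(round_trip_seconds):
    """Distance to the reflecting hand, in metres.

    The pulse travels to the hand and back, so the one-way
    distance is half the total path length.
    """
    return SPEED_OF_SOUND * round_trip_seconds / 2

# A hand about half a metre away returns an echo after ~2.9 ms.
print(round(distance_from_echo(0.00292), 2))  # 0.5
```

Tracking how this distance changes over repeated pulses is what lets the sensor follow a moving hand within its metre-wide field.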
The move towards gesture control has gathered pace and there are now many such products on the market.
What sets Elliptic's gesture-control system apart from others is its wide field of use, up to a metre away from the phone, which lets it identify mid-air gestures accurately.
Because it uses sound rather than sight, the sensor can recognise gestures from a 180-degree field. It also consumes less power and works in the dark.
By contrast Samsung's Galaxy S4 uses an infrared sensor that can only interpret hand movements within a very small zone.
"The user needs to learn the exact spot to gesture to instead of having a large interactive space around the device," said Erik Forsstrom, the user interface designer for Elliptic Labs.
The ultrasound system in action
Allowing users more freedom in how they gesture is vital if such products are to become mainstream, he thinks.
"With a small screen such as a phone or a tablet, the normal body language is not that precise. You need a large zone in which to gesture."
If consumers can quickly see the effects their gestures have on screen, he thinks, "it is quite likely that this is the next step within mobile".
The technology was recently shown off at Japanese tech show Ceatec.
In the demonstration, an Android smartphone was housed in a case containing the ultrasound transmitters.
But Elliptic Labs said it had formed partnerships with a number of Asian handset manufacturers that are looking at building the ultrasound chip into devices as early as next year.
Increasingly firms are experimenting with gesture control.
PrimeSense, the company that developed gesture control for Microsoft's Kinect console, has also made strides towards bringing the technology to mobile.
By shrinking down the sensor used in the Kinect, the firm showed it working with a Nexus 10 at a Google developers' conference in May.
Meanwhile Disney is testing technology that allows users to "feel" the texture of objects on a flat touchscreen.
The technique involves sending tiny vibrations through the display that let people "feel" the shallow bumps, ridges and edges of an object.
Ben Wood, analyst with research firm CCS Insight, thinks such devices could be ready for the mass market.
"Apple's success has made gestures a part of everyday life. Now consumers understand they can manipulate a screen with a gesture or a swipe everyone is racing to find innovative ways to exploit this behaviour.
"Ultrasonic is particularly interesting as you don't need to touch the screen which can be an almost magical experience.
"It is ideal if you have dirty or sweaty hands. A common example people use is flicking through a recipe when cooking. Other examples include transitioning through a slideshow of photos or flicking through music tracks or turning the page on an ebook," he said.
"The big challenge that remains is how you make users aware of the capability."
More gesture-recognition info:
Leap Motion's gesture-controlled device connects to a computer via a USB cable. The company is far from the first to try to revolutionise how we control our PCs.
There are already many other PC-compatible gesture sensors, including a version of Microsoft's Kinect for Windows. However, their size may have limited their appeal.
G-Speak: The G-Speak platform is a more expensive gesture-based system that can be used with gloves or bare-handed. It inspired the tech used in the film Minority Report.

Celluon CellMC1 Magic Cube: Celluon's Magic Cube projection keyboard lets you project a full-size laser keyboard on to any flat surface. The bluetooth gadget also lets you use your fingertip as a mouse to drag, point and click.

Myo armband: The Myo armband includes a motion tracker as well as a sensor that detects muscles' electrical activity and gestures. It is still in development.
Core UC-100: Apple co-founder Steve Wozniak created the Core UC-100 control. All 16 keys on its pad were programmable to trigger custom functions, but it proved too clunky for the masses.

Mattel Power Glove: Mattel launched the Power Glove in 1989. It was designed to work with a Nintendo games console to simulate hand movements but ultimately failed to catch on.

Space saucer keyboard: Images of this saucer-style spin on a keyboard and trackball appeared in 2009. It never made it beyond the concept stage, so we'll never know how hard it would have been to use.

Tobii: Tobii's gaze-controlled sensors allow users to control computers with their eyes. The look-and-do tech was originally developed for people with disabilities.

Zspace: Zspace is developing a system which combines special glasses and a stylus to allow users to work in what it describes as a "3D holographic-like environment".
|BBC - 5 June 2013
WiSee uses wi-fi signals to recognise body gestures:
By Leo Kelion
Researchers say they have found a way to detect and recognise human gestures based on how they affect wi-fi signals.
They suggest it could let users control home appliances with a wave of the hand while in any room of the house.
They say the WiSee system offers a "simpler, cheaper" alternative to Microsoft's camera-based Kinect and other specialist gesture sensors.
However, other experts in the field question whether the new tech can be as accurate.
Details about the project have been published by the University of Washington's computer science department ahead of the Mobicom computing conference in Miami in September. The paper is a "working draft" and has not appeared in a journal.
The researchers suggest offering an alternative to a vision-based system could make a range of home-based gesture controls practical.
"For example, using a swipe hand motion in-air, a user could control the music volume while showering, or change the song playing in the living room while cooking, or turn up the thermostat while in bed," wrote lead researcher Shyam Gollakota.
To achieve this, the researchers have experimented with the Doppler effect - the way a wave's frequency changes at the point it is observed depending on the source of the wave's movements.
The best-known example of the effect is how one hears the pitch of a train's whistle change as it approaches and then passes.
The team say a wireless router can be used to detect related changes in wi-fi signals - which are electromagnetic waves - as they reflect off a moving human body.
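The size of the effect can be estimated directly. For a wave reflected off a moving target, the shift is delta_f = 2 * v * f / c (the factor of 2 arises because the moving hand both receives and re-emits a shifted wave). A sketch using a standard 2.4 GHz wi-fi carrier; the numbers are illustrative, not taken from the WiSee paper:

```python
# Doppler shift of a wi-fi signal reflected off a moving hand.
C = 3.0e8          # speed of light, metres per second
WIFI_FREQ = 2.4e9  # 2.4 GHz wi-fi carrier frequency

def doppler_shift(hand_speed_mps):
    """Frequency shift (Hz) of the reflected carrier for a hand
    moving at the given speed: delta_f = 2 * v * f / c."""
    return 2 * hand_speed_mps * WIFI_FREQ / C

# A gesture at 0.5 m/s shifts the carrier by only about 8 Hz,
# tiny next to 2.4 GHz, which is why detecting it takes careful
# signal processing.
print(doppler_shift(0.5))  # 8.0
```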
By using a specially developed software algorithm, the computer scientists say, they were able to distinguish nine different types of movements including a pushing motion, a punch, a circular hand movement and a kick.
To demonstrate the system, the team carried out tests in an office and a two-bedroom apartment.
They say their system was able to correctly identify 846 of the 900 gestures performed - a 94% accuracy rate.
They say these included situations in which the user was in a different room to both the wi-fi transmitter and receiver, requiring the waves to pass through walls before being detected.
The researchers acknowledge the risk of such a system being triggered by unintended gestures or even the risk of a hacker seeking to take control of a target's equipment.
WiSee graphic: The researchers say their software can use a wi-fi router to detect when a human is making a gesture
To tackle this they suggest a password system that would involve the user repeating a preset sequence of gestures four times in order to ready their equipment for a command.
They say that an added benefit of this would be that it would reduce the risk of false positives - situations when the system mistakes natural variance in wi-fi signals for a gesture.
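As a rough illustration of that password idea, the unlock check reduces to matching the observed gesture stream against the preset sequence repeated the required number of times. The gesture names and repeat count below are hypothetical, not taken from the paper:

```python
# Hypothetical gesture-password check: the system only readies
# itself for commands when the user repeats the preset gesture
# sequence the required number of times.
PRESET = ["push", "pull", "circle"]  # illustrative gesture labels
REPEATS_REQUIRED = 4

def unlocked(observed_gestures):
    """True only if the observed stream is exactly the preset
    sequence repeated REPEATS_REQUIRED times."""
    return observed_gestures == PRESET * REPEATS_REQUIRED

print(unlocked(PRESET * 4))              # True
print(unlocked(PRESET * 3 + ["kick"]))   # False
```

Requiring the full repeated sequence is what drives down false positives: random signal variance is unlikely to match twelve gestures in order.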
The researchers claim their equipment can work with up to five people in the vicinity of the router so long as it is fitted with multiple antennas. But they note that the more people there are, the less accurate the system becomes.
They say their next step is to try to work out the best way to use the system to control multiple devices at once.
In the meantime they have set up a website to publicise WiSee, suggesting they want to bring it to market.
"Imagine that in the near future you could buy a wireless router which could also do gesture recognition. WiSee-enabled," they say.
"Unlike other gesture recognition systems like Kinect, Leap Motion or MYO [sensor armband], WiSee requires neither an infrastructure of cameras nor user instrumentation of devices.
"WiSee requires no change to current standards."
Dr Richard Picking - a human-computer interaction specialist at Glyndwr University in Wrexham - said the US team's research had merit, but would still need to overcome several hurdles to become a commercial product.
"There is real potential for WiSee to compete with other devices in their established markets, such as gaming and entertainment," he told the BBC.
"However, although the developers claim that it is unlikely that false commands could be triggered and that it is essentially a secure technology, there is a long way to go before people will be convinced that it will be reliable and safe enough to control household appliances."
Another activity recognition expert - Daniel Roggen from Newcastle University - agreed that the system had potential, praising its idea of reusing existing resources rather than requiring the installation of cameras and other sensors.
Kinect 2: Microsoft's Kinect 2 sensor uses a 1080p resolution camera to improve its activity recognition abilities
However, he noted that the wi-fi equipment used by the researchers was more expensive than the norm, costing about 10 times the price of Microsoft's Kinect.
"It remains to be seen if either a single such device is sufficient to cover an entire house and if the price of the equipment can be brought down," he said.
"The activities presented in the paper are also very coarse. It remains to be seen if subtler human behaviours can be picked up.
"[By contrast] Microsoft's Kinect is specifically designed for activity recognition. As such, it has an advantage over 'opportunistic sensing' approaches, and it can pick up subtler movements such as finger movements in its newest version."
|BBC news - 22 July 2013
Leap Motion: Touchless tech wants to take control
By Laura Locke, Technology reporter
The keyboard and mouse have long been the main bridge between humans and their computers.
More recently we've seen the rise of the touchscreen. But other attempts at re-imagining controls have proved vexing.
"It's one of the hardest problems in modern computer science," Michael Buckwald, chief executive and co-founder of Leap Motion, told the BBC.
But after years of development and $45m (£29m) in venture funding, his San Francisco-based start-up has come up with what it claims is the "most natural user interface possible."
It's a 3D-gesture sensing controller that allows touch-free computer interaction.
Look out: Israeli firm Primesense has been making headlines in recent days thanks to a report that it is in talks to be bought by Apple - something the 3D sensor firm says is an unfounded rumour.
Rather than trying to make consumer products of its own, the company licenses its depth-sensing tech to others.
Its sensors are used in Microsoft's original Kinect, a 3D scanner by Matterbot and iRobot's Ava - a device that guides itself through hospitals allowing doctors to use it to "visit" patients without leaving their office.
Primesense recently showed off Capri - a second-generation sensor that is 10 times smaller than the previous version and needs less power.
It has fitted the component to one of Google's Nexus tablets to stir up interest and also suggests it could be built into smartphones.
But rather than fitting the sensor to the front of devices to recognise owners' gestures, the firm suggests the best use would be on their backs to look out into the surrounding environments.
"Object recognition is something that is very easily do-able," chief executive Inon Beracha tells the BBC.
"Imagine you scan something - you would get an identification and then you could get the price for an object."
Although the sensor won't feature in the Xbox One games console's new Kinect - which is using Microsoft's own tech - Mr Beracha says to expect news of a tie-up with another big player "in the next months".
Primesense Capri sensor
Using only subtle movements of fingers and hands within a short distance of the device, virtual pointing, swiping, zooming, and painting become possible. First deliveries of the 3in (7.6cm)-long gadget begin this week.
"We're trying to do things like mould, grab, sculpt, draw, push," explains Mr Buckwald.
"These sorts of physical interactions require a lot of accuracy and a lot of responsiveness that past technologies just haven't had."
He adds that it's the only device in the world that accurately tracks hands and all 10 fingers at an "affordable" price point, and it's 200 times more precise than Microsoft's original Kinect.
It works by using three near-infrared LEDs (light emitting diodes) to illuminate the owner's hands, and then employs two CMOS (complementary metal-oxide-semiconductor) image sensors to obtain a stereoscopic view of the person's actions.
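That stereoscopic arrangement means depth is recovered from disparity: the same fingertip appears at slightly different positions in the two sensors' images, and the classic relation Z = f * B / d converts that offset into distance. A minimal sketch with illustrative numbers, since Leap Motion's actual optics and calibration are not public:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth (metres) of a point seen by two cameras mounted a
    baseline apart: Z = f * B / d, where f is the focal length in
    pixels, B the camera separation in metres, and d the pixel
    offset of the point between the two images."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 4 cm between the two
# image sensors, 140 px disparity -> the fingertip is 20 cm away.
print(round(depth_from_disparity(700, 0.04, 140), 3))  # 0.2
```

Note the inverse relationship: nearby fingers produce large disparities and are therefore measured precisely, which suits a controller designed for hands within a short distance of the device.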
Hundreds of thousands of pre-orders have poured in from around the world, and thousands of developers are working on applications, Mr Buckwald says.
Leap Motion is convinced it has a shot at making gesture controls part of the mainstream PC and Mac computing experience.
But some high-profile Silicon Valley leaders doubt Leap Motion will render the mouse and keyboard obsolete anytime soon.
They include Tom Preston-Werner, chief executive and founder of Github, a service used by developers to share code and advice.
Coders will still have a need for keyboards and computer mice for years to come, he says, adding that sticking your arm out and waving it for any length of time will be uncomfortable and tiring.
For developers who work long hours, Mr Preston-Werner prefers approaches like the forthcoming Myo armband, which wirelessly transmits electrical signals from nerves and muscles to computers and gadgets without being tethered to a USB port.
Other Silicon Valley programmers, like Ajay Juneja, aren't convinced Leap Motion's touch-free controller has entirely solved the human-computer interface problem either.
"It's a tool for hobbyists and game developers," says the founder of Speak With Me, a firm that develops natural-language voice-controlled software.
"What else am I going to use a gestural interface for?"
Of course, Leap Motion has lots of ideas.
The company already has its own app store called Airspace with 75 programmes including Corel's Painter Freestyle art software, Google Earth and other data visualisation and music composition apps. The New York Times also plans to release a gesture-controlled version of its newspaper.
Airspace: Leap Motion is launching its own app store to help developers promote compatible software
Mr Buckwald says he doesn't expect a single "killer app" to emerge. Instead he predicts there will be "a bunch of killer apps for different people".
Kwindla Kramer, chief executive of Oblong Industries - which helped inspire the gesture-controlled tech in the movie Minority Report - considers Leap Motion's controller "a step forward".
His firm makes higher-end devices ranging from $10,000 to $500,000 for industry.
Leap's "accuracy and pricing" is great, he says, but adds that the "tracking volume" - the area where the device can pick up commands - is somewhat limited.
Still, most experts believe the user interfaces of the future will accept a mash-up of different types of controls from a range of different sensors.
Meanwhile, Leap is already looking beyond the PC and says it hopes to embed its tech into smartphones, tablets, TVs, cars and even robots and fighter jets in future.
|BBC - 3 October 2013
Google buys human-gesture recognition start-up Flutter
Flutter said it was "thrilled" to be able to continue its research into gesture recognition at Google

Google has bought a start-up that develops gesture-recognition software.
Flutter, founded three years ago in San Francisco, detects simple hand signals via webcam, using them to control apps such as iTunes and Netflix.
The acquisition has caused speculation that Google will integrate the technology, used by some of its rivals, into its ranges of Chromebook laptops and Nexus handheld devices.
The search giant has not released any details of the deal.
Gesture-recognition technology is widely used in gaming consoles, such as Nintendo's Wii and Microsoft's Xbox Kinect.
It is also used in some smart TVs, and Samsung recently introduced gesture recognition for its Galaxy S4 smartphone.
The phone's Air Gesture technology allows users to scroll through web pages, accept calls, and control music by waving their hands.
An expert told the BBC that despite Google's acquisition, it remained to be seen whether gesture recognition would become a mainstream technology.
"The more interface styles we can develop the better, but whether gesture recognition becomes the norm depends on how well it can be personalised and whether people embrace it," said Richard Picking, of Glyndwr University.
"It could be particularly useful as a tool for older people, or those with disabilities," he added.
Flutter was initially funded through Y Combinator, a company that nurtures start-ups.
Kinect: Microsoft's Kinect technology uses gesture-recognition sensors for gaming
Co-founder Navneet Dalal, who used to work at Google, announced the deal on Flutter's website.
"Today, we are thrilled to announce that we will be continuing our research at Google," the statement read.
"We're excited to add their rocket fuel to our journey."
Flutter said it would continue to support its current app.