Disruptive user-interface technology available soon

Point out news stories, on the net or in mainstream media, related to polywell fusion.

Moderators: tonybarry, MSimon

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Disruptive user-interface technology available soon

Post by DeltaV »

https://leapmotion.com/

Not Kinect. For $70 you get 0.01mm resolution within an 8 ft^3 space. Uses infrared.

Check out the video on the home page, especially the point clouds of the hands.

This will be big.

Diogenes
Posts: 6967
Joined: Mon Jun 15, 2009 3:33 pm

Re: Disruptive user-interface technology available soon

Post by Diogenes »

DeltaV wrote:https://leapmotion.com/

Not Kinect. For $70 you get 0.01mm resolution within an 8 ft^3 space. Uses infrared.

Check out the video on the home page, especially the point clouds of the hands.

This will be big.

Very cool.
‘What all the wise men promised has not happened, and what all the damned fools said would happen has come to pass.’
— Lord Melbourne —

zapkitty
Posts: 267
Joined: Fri Apr 09, 2010 8:13 pm

Post by zapkitty »

Dammitall to hell...

... how are we going to get to the point of mech pilots wearing skintight data films when they can just use this gadget instead?!...

... Sorry, Shirow-san, the present has overrun the future again... *sob*...

krenshala
Posts: 914
Joined: Wed Jul 16, 2008 4:20 pm
Location: Austin, TX, NorAm, Sol III

Post by krenshala »

zapkitty wrote:Dammitall to hell...

... how are we going to get to the point of mech pilots wearing skintight data films when they can just use this gadget instead?!...

... Sorry, Shirow-san, the present has overrun the future again... *sob*...
The skinsuit is a serious contender for an actual pressure suit, so you may get your wish. :D

SheltonJ
Posts: 11
Joined: Wed Aug 29, 2012 4:14 pm

When disruptive technologies combine ....

Post by SheltonJ »

If you combine this ability to 3D-scan to 0.01 mm within an 8 cubic foot space with appropriate software and 3D printing, some amazing things become possible. The time to create a very high resolution 3D model of a physical object would drop significantly, and that, combined with 3D printing, would allow cheap replication of spare parts from an undamaged original. Damaged parts would of course require cleanup to 'remove' the damage from the scan.

Another very interesting area would be using this interface technique to give 3D modeling software a more direct manipulation style, or even 'air sculpting'.

Wow, just wow.

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

According to various reports, 'air sculpting' was their original motivation to develop this.

Skipjack
Posts: 6808
Joined: Sun Sep 28, 2008 2:29 pm

Post by Skipjack »

I saw it a while ago. Very interesting technology!

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

http://www.technologyreview.com/news/50 ... ntrol-era/
Leap’s founders won’t share exact details of their technology, but Holz says that unlike the Kinect, the Leap doesn’t project a grid of infrared points onto the world that are tracked to figure out what is moving and where (see the pattern produced by the Kinect sensor).

Despite having two cameras, the Leap does not use stereovision techniques to determine depth, says Holz. Instead, the second camera is to provide an extra source of information and prevent errors due to parts of a person’s hand obscuring itself or the other hand.

Maui
Posts: 586
Joined: Wed Apr 09, 2008 12:10 am
Location: Madison, WI

Post by Maui »

Cool, but I'm not sure I see this catching on in the long run for desktops (I guess the counter-argument is that desktops are dying, eh).

I honestly think I'd be faster and more accurate with the mouse, plus I gotta think mouse buttons are always going to be more precise and reliable than gesture recognition.

Plus, wouldn't you get tired holding your arms out in front of yourself the whole day? Hey, I know I could use more exercise, but...

I thought I heard years ago that someone was working on a pointer that tracked the direction of your gaze. Combine that with this for auxiliary functions and maybe I'm interested.

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

Maui wrote:Plus, wouldn't you get tired holding your arms out in front of yourself the whole day? Hey, I know I could use more exercise, but...

I thought I heard years ago that someone was working on a pointer that tracked the direction of your gaze. Combine that with this for auxiliary functions and maybe I'm interested.
As I read it, you can rest your hand on the desk and just move one or two fingers, if need be. The 'gains' (motion scale factors) in the software can be tuned to fit your particular style of use.

The gaze sensor might be problematic, since human eyes move in saccades, which the conscious mind is usually not aware of. Not saying a saccade filter could not be developed. Something similar might be needed with Leap for people with Parkinson's, etc.
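
Purely to illustrate the idea (this is my own rough sketch, not anything Leap or a gaze-tracker vendor actually ships, and the 30 deg/s threshold is a guess), a simple velocity-threshold saccade filter could just hold the last stable fixation point whenever the gaze angle changes too fast:

# Rough sketch of a velocity-threshold saccade filter (illustrative only).
# Samples moving faster than the threshold are treated as saccades and the
# last stable fixation point is reported instead.

SACCADE_THRESHOLD_DEG_PER_S = 30.0  # guessed value; a real tracker would tune this

def filter_saccades(samples, dt):
    """samples: list of (x_deg, y_deg) gaze angles taken every dt seconds."""
    filtered = []
    last_fixation = prev = samples[0]
    for x, y in samples:
        vx = (x - prev[0]) / dt
        vy = (y - prev[1]) / dt
        speed = (vx * vx + vy * vy) ** 0.5
        if speed < SACCADE_THRESHOLD_DEG_PER_S:
            last_fixation = (x, y)      # stable fixation: accept the new point
        filtered.append(last_fixation)  # during a saccade: hold the old point
        prev = (x, y)
    return filtered

Something similar, keyed on position jitter rather than gaze velocity, is what I had in mind for the Parkinson's case.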

DeltaV
Posts: 2245
Joined: Mon Oct 12, 2009 5:05 am

Post by DeltaV »

I'd like to know how the point cloud points beyond line-of-sight (such as the backs of the fingers) are obtained. Infrared diffraction? But, that is part of the secret sauce...

Maui
Posts: 586
Joined: Wed Apr 09, 2008 12:10 am
Location: Madison, WI

Post by Maui »

That's interesting about the saccades. I learned something today; I like that.

I guess I'll have to wait for the mind reading tech then...

paperburn1
Posts: 2484
Joined: Fri Jun 19, 2009 5:53 am
Location: Third rock from the sun.

Post by paperburn1 »

DeltaV wrote:I'd like to know how the point cloud points beyond line-of-sight (such as the backs of the fingers) are obtained. Infrared diffraction? But, that is part of the secret sauce...
I suspect
http://www.technovelgy.com/ct/Science-F ... wsNum=3823

Blankbeard
Posts: 105
Joined: Wed Nov 21, 2012 9:56 pm

Post by Blankbeard »

DeltaV wrote:The gaze sensor might be problematic, since human eyes move in saccades which the conscious mind is usually not aware of. Not saying a saccade filter could not be developed. Something similar might be needed with Leap for people with Parkinson's, etc.
OpenCV does gaze tracking.

http://hackaday.com/2012/05/30/opencv-k ... -tracking/

I wonder if that's what they use. Massively useful software: body part tracking, face ID, locating, counting, and tracking people. And a lot of it requires no more than smartphone-style hardware.
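
Just to show how little code the first stage takes, here's a minimal sketch using OpenCV's bundled Haar cascades to find faces and eyes from a webcam. To be clear, this is only the detection step a webcam gaze tracker starts from, not gaze estimation itself, and there's no reason to think it's what Leap uses:

# Minimal face + eye detection with OpenCV's stock Haar cascades.
# Requires the opencv-python package; press 'q' to quit.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + 'haarcascade_eye.xml')

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]  # look for eyes only inside the face region
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey),
                          (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
    cv2.imshow('eyes', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()

Actual gaze estimation then needs pupil localisation and a calibration step on top of that.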
