zmsoft / google-glass-api

Automatically exported from code.google.com/p/google-glass-api

Support for native applications; GDK #12

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
This is directed to the Google team.

Google Glass was hyped up to be the revolution in augmented reality (AR). Now
that the API docs are available, it's clear that Glass's AR capabilities are
quite limited.

The Mirror API lets you send the user notifications or one-time pieces of
information, and retrieve the user's recorded photos, videos, and audio. There
is no real-time processing of video, photos, audio, or sensor data, so no
interactive computer vision applications, no object recognition, no heads-up
navigation. The cards are static and can't be changed over time. Basically, the
Mirror API doesn't allow for anything that is usually considered AR.
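For illustration, the Mirror API's model boils down to a one-shot REST insert
of a static timeline card, roughly like this (a rough sketch; OAuth token
handling is omitted and the class name is just made up for the example):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class MirrorCardInsert {
    // Pushes a single static text card to the user's timeline via the
    // Mirror API. The access token is assumed to already carry the
    // glass.timeline OAuth scope; obtaining it is not shown here.
    public static void insertCard(String accessToken) throws Exception {
        URL url = new URL("https://www.googleapis.com/mirror/v1/timeline");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);

        // The card is static JSON: once inserted, it never updates itself.
        String body = "{\"text\": \"Hello from the cloud\"}";
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }
        System.out.println("Mirror API responded: " + conn.getResponseCode());
    }
}
```

Everything here runs on a server, not on Glass, which is exactly the
limitation.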

I believe, however, that Glass's hardware allows for implementing all of the
above, but that requires a more conventional approach to apps. Such apps have
to be able to run offline, on the device, to control the visual output, and to
have access to the media hardware and sensors.

So, are there any plans to extend the Google Glass API to offline applications
running on the device?

Original issue reported on code.google.com by bogdan.k...@gmail.com on 18 Apr 2013 at 5:22

GoogleCodeExporter commented 9 years ago
I get what you are saying. Instead, I would think Glass should have a Bluetooth
profile to retrieve, send, and interact with notifications from Android/iOS
(jailbreak possibly required), which is a similar idea to the Pebble watch.

Moreover, an interactive API could be created to let an Android application
push a card directly to Glass and then send data back to the phone as
pictures/text/video/more!

I think this is more efficient and a better offline solution for Glass.
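To make the idea concrete, the phone side of such a link could look something
like this (purely hypothetical sketch; none of these types exist in any
current API):

```java
// Purely hypothetical sketch of a phone-to-Glass companion API that pushes
// cards over Bluetooth and receives replies; nothing like this exists today.
public interface GlassCompanionLink {

    // Push a card directly from the phone to the Glass display.
    void pushCard(String cardId, String html);

    // Register a callback for data (pictures, text, video, ...) that Glass
    // sends back to the phone for the given card.
    void setReplyListener(ReplyListener listener);

    interface ReplyListener {
        void onReply(String cardId, byte[] payload, String mimeType);
    }
}
```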

Original comment by zhangjun...@gmail.com on 18 Apr 2013 at 6:24

GoogleCodeExporter commented 9 years ago

Original comment by mimm...@google.com on 18 Apr 2013 at 9:09

GoogleCodeExporter commented 9 years ago
Agreed - while I can understand the reasons the Mirror API is designed the way
it is (and it'll be fine as-is for a huge number of applications), I was hoping
to develop some apps that make use of the camera and processor and run 'on
board' to do real-time computer vision. (The barcode-scanning question was a
perfect example: I would hope to be able to write an app that scans the barcode
image 'on board' to get the barcode number without having to send the whole
image to a back-end server.)
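
For instance, the on-board decode step itself could be as small as handing a
camera preview frame to a barcode library such as ZXing (a sketch assuming an
NV21 preview frame and the open-source ZXing classes, not any official Glass
API):

```java
import com.google.zxing.BinaryBitmap;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.PlanarYUVLuminanceSource;
import com.google.zxing.Result;
import com.google.zxing.common.HybridBinarizer;

public class OnDeviceBarcodeScanner {
    private final MultiFormatReader reader = new MultiFormatReader();

    // Decodes one NV21 camera preview frame entirely on the device and
    // returns the barcode text, or null if no barcode was found.
    public String decodeFrame(byte[] nv21, int width, int height) {
        PlanarYUVLuminanceSource source = new PlanarYUVLuminanceSource(
                nv21, width, height, 0, 0, width, height, false);
        BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
        try {
            Result result = reader.decodeWithState(bitmap);
            return result.getText();   // e.g. the barcode number
        } catch (NotFoundException e) {
            return null;               // no barcode in this frame
        } finally {
            reader.reset();
        }
    }
}
```

The whole thing happens in memory on the device; no image ever needs to leave
Glass.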

Is that ever going to happen? Will I ever be able to write (say) OpenCV apps
to run on Google Glass in near real time? For many apps, syncing to back-end
servers is too slow (or may not be possible at all if no internet is available
at that location).

Original comment by reader....@gmail.com on 22 Apr 2013 at 1:38

GoogleCodeExporter commented 9 years ago
Issue 34 has been merged into this issue.

Original comment by mimm...@google.com on 4 May 2013 at 4:48

GoogleCodeExporter commented 9 years ago

Original comment by mimm...@google.com on 9 May 2013 at 9:35

GoogleCodeExporter commented 9 years ago
Agreed! Searching the net for Glass use cases, over 90% of the results deal
with some sort of AR, but the current cloud-based API does not allow these
kinds of apps. The main functionality is capturing some sort of media, sharing
it, and being notified about it. Of course it's all new and may take time to
evolve, but the question is whether the Glass hardware is powerful enough to
host native apps. Does Google even have any plans in this direction? Maybe, as
zhangjun suggested, interacting with the phone over Bluetooth?
Anyway, the current API seems very limited as far as AR use cases are
concerned.

Original comment by ighani.m...@googlemail.com on 19 May 2013 at 8:12

GoogleCodeExporter commented 9 years ago

Original comment by mimm...@google.com on 21 May 2013 at 8:15

GoogleCodeExporter commented 9 years ago
I just want to echo the desire to have a direct link between Glass and a
device to allow for near real-time interaction. For the app I'm developing and
want to integrate with Glass, a data connection (wifi or mobile) may not be
available, and waiting/hoping for an eventual sync to occur isn't as elegant
and user-friendly as I would prefer. If it were possible to have direct access
to Glass through the GDK and use the camera and GPS, that would be perfect.
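
For what it's worth, if the GDK simply exposes the standard Android framework,
direct GPS access would presumably be the usual LocationManager pattern (a
sketch under that assumption, not a confirmed GDK API):

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class GlassLocationTracker implements LocationListener {
    // Starts listening for GPS fixes directly on the device, assuming the
    // GDK exposes the ordinary Android location stack (not confirmed here).
    public void start(Context context) {
        LocationManager lm =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 1, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        // React to the new fix locally; no server round trip required.
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onProviderDisabled(String provider) {}
}
```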

Original comment by parkbsu on 23 May 2013 at 12:54

GoogleCodeExporter commented 9 years ago
Our novel ideas for Glass relate to the eye sensor and the
movement/orientation sensors, so we would like to have access to those data
streams in real time - it seems that part of the API is not available (yet). Is
it available but restricted (understandable, possibly, for health and safety
reasons), or is it just not ready yet? It's hard to plan software without that
access.
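
If the on-device API turns out to be the standard Android sensor framework,
reading the orientation stream in real time would look roughly like this (a
sketch under that assumption; as far as I know, the eye sensor has no public
counterpart at all):

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class HeadOrientationReader implements SensorEventListener {
    private SensorManager sensorManager;

    // Subscribes to the rotation-vector sensor, assuming the standard
    // Android SensorManager is available on Glass (not confirmed here).
    public void start(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor rotation = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        sensorManager.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.values holds the rotation vector; convert it to a rotation
        // matrix if the head's yaw/pitch/roll are needed.
        float[] matrix = new float[9];
        SensorManager.getRotationMatrixFromVector(matrix, event.values);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {}

    public void stop() {
        sensorManager.unregisterListener(this);
    }
}
```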

Original comment by hi99...@gmail.com on 22 Jun 2013 at 12:27

GoogleCodeExporter commented 9 years ago
Echoing the desire to get this available ASAP, thanks!

Original comment by jbye...@gmail.com on 4 Jul 2013 at 5:19

GoogleCodeExporter commented 9 years ago
Marking as Accepted since we've stated that the GDK will exist.

Original comment by mimm...@google.com on 22 Jul 2013 at 10:00

GoogleCodeExporter commented 9 years ago
Sorry if this is a duplicate question; is there any estimate of when the GDK
will be released?

Original comment by gajen.su...@gmail.com on 30 Aug 2013 at 3:49

GoogleCodeExporter commented 9 years ago
We published a GDK sneak peek earlier this week. You can learn all about it 
here: https://developers.google.com/glass/develop/gdk/index

Original comment by mimm...@google.com on 22 Nov 2013 at 10:00

GoogleCodeExporter commented 9 years ago
Original comment by w...@twoenterprises.com on 23 Nov 2013 at 1:26