Amazon Fire Phone Drives New Technological Innovation
By Bill Bledsoe, Senior Product Marketing Manager – AT&T Mobility
The announcement that the Amazon Fire phone was coming to AT&T was an exciting event for us at the AT&T Developer Program. We have a number of great partners whose devices do amazing things, but the Fire phone offers a unique combination of features and technologies that developers can use when building apps.
Dynamic Perspective: It’s In The Way That You Use It
The first of these is a bold step by Amazon to change the way users interact with Fire, called Dynamic Perspective. This technology lets the phone respond to how a user holds, views, and moves it. Developers can take advantage of these movements and viewing angles in their apps and games via the Dynamic Perspective SDK, which includes a set of APIs and controls for incorporating key Dynamic Perspective features such as peek, tilt, and zoom directly in an app. The result is a more immersive gaming or navigation experience, enabled by unique hardware built into the phone: four ultra-low-power cameras, infrared LEDs, a dedicated processor running real-time computer-vision algorithms, and a power-efficient graphics engine that together track a user's head movements in real time.
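To make the parallax idea behind Dynamic Perspective concrete, here is a minimal plain-Java sketch (not the actual Dynamic Perspective SDK; the method name and formula are assumptions for illustration) of how a tracked head position might be mapped to a pixel shift for a UI layer, so that "deeper" layers move more as the user's head moves:

```java
// Hypothetical illustration of the parallax idea behind Dynamic Perspective.
// Names and formula are assumptions for this sketch, not Amazon's API.
public class ParallaxSketch {

    // Map the head's horizontal offset from screen center (in millimeters)
    // to a pixel shift for a UI layer. Layers "deeper" in the scene get a
    // larger depthFactor and therefore shift more, creating the impression
    // of 3D depth as the user moves the phone or their head.
    public static int layerShiftPx(float headOffsetMm, float depthFactor, float pxPerMm) {
        return Math.round(headOffsetMm * depthFactor * pxPerMm);
    }

    public static void main(String[] args) {
        // Head moved 20 mm to the right; a deep layer (factor 0.5) at 10 px/mm.
        System.out.println(layerShiftPx(20f, 0.5f, 10f)); // prints 100
        // Head moved 10 mm to the left; a shallow layer (factor 0.2).
        System.out.println(layerShiftPx(-10f, 0.2f, 10f)); // prints -20
    }
}
```

The real SDK provides head position to the app; a per-layer mapping like this is one simple way an app could turn that position into on-screen motion.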
Developers have APIs that access this groundbreaking technology, including the HeadTracking API, the MotionGesture API, and the Dynamic Perspective UI – Euclid API. The accompanying SDK ships with a full set of sample code and best practices to get developers fully immersed in this new functionality. There's also great documentation on integrating the Amazon Unity Plugin for Head Tracking & Shortcut Gestures (for those of you developing games out there).
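Head tracking is delivered to apps through callbacks. The sketch below imitates that register-and-dispatch shape with plain Java stand-ins; the class, interface, and field names here are assumptions for illustration, not the SDK's actual types:

```java
// Plain-Java stand-ins for the callback shape of a head-tracking API.
// All names here are assumptions for illustration, not Amazon's SDK.
public class HeadTrackingSketch {

    // A tracked head position relative to the screen, in millimeters.
    static class HeadEvent {
        final float xMm, yMm, zMm;
        HeadEvent(float xMm, float yMm, float zMm) {
            this.xMm = xMm; this.yMm = yMm; this.zMm = zMm;
        }
    }

    interface HeadListener {
        void onHeadMoved(HeadEvent e);
    }

    private final java.util.List<HeadListener> listeners = new java.util.ArrayList<>();

    void register(HeadListener l) { listeners.add(l); }

    // On the phone the camera/vision pipeline would drive this; here we
    // dispatch a synthetic event so the flow can be followed end to end.
    void dispatch(HeadEvent e) {
        for (HeadListener l : listeners) l.onHeadMoved(e);
    }

    public static void main(String[] args) {
        HeadTrackingSketch tracker = new HeadTrackingSketch();
        tracker.register(e ->
            System.out.printf("head at (%.0f, %.0f, %.0f) mm%n", e.xMm, e.yMm, e.zMm));
        tracker.dispatch(new HeadEvent(12f, -5f, 320f));
    }
}
```

An app would typically register its listener when it gains focus and unregister when it loses it, to avoid processing head events in the background.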
Firefly: Interacting With The World Around You
The second unique piece of technology, Firefly, continues the theme of helping users fully experience the world around them. With Firefly technology and its SDK, developers can build apps that recognize real-world objects: QR codes and bar codes, songs, movies, artwork, e-mail addresses, phone numbers, URLs, and much more. Firefly then lets the user act on those objects: the user simply presses the dedicated Firefly button on Fire phone, and Firefly identifies the object and surfaces relevant information about it. The full-featured SDK again includes robust sample apps that show integration and capabilities, as well as how an app might be deployed within the Amazon store. Firefly also supports a plugin system whereby apps can extend the core Firefly technology in a number of different ways.
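At its heart, Firefly follows a "recognize, then act" flow: captured input is matched against a set of recognizers, and the matching category determines what the user can do next. The plain-Java sketch below illustrates that flow with simple regex recognizers; the categories and patterns are assumptions for this sketch, not the Firefly SDK's actual recognizers (which work on camera and audio input, not just text):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Pattern;

// Illustration of the "recognize, then act" idea behind Firefly using
// plain regexes; categories and patterns are assumptions for this sketch.
public class FireflySketch {

    private static final Map<String, Pattern> RECOGNIZERS = new LinkedHashMap<>();
    static {
        // Checked in insertion order; the first match wins.
        RECOGNIZERS.put("url", Pattern.compile("^https?://\\S+$"));
        RECOGNIZERS.put("email", Pattern.compile("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$"));
        RECOGNIZERS.put("phone", Pattern.compile("^\\+?[0-9][0-9()\\- ]{6,}$"));
    }

    // Return the first category whose pattern matches, or "unknown".
    public static String identify(String captured) {
        for (Map.Entry<String, Pattern> r : RECOGNIZERS.entrySet()) {
            if (r.getValue().matcher(captured).matches()) return r.getKey();
        }
        return "unknown";
    }

    public static void main(String[] args) {
        System.out.println(identify("https://developer.amazon.com")); // url
        System.out.println(identify("dev@example.com"));              // email
        System.out.println(identify("+1 212 555 0100"));              // phone
    }
}
```

A Firefly-style plugin would hook into a pipeline like this by contributing its own recognizer plus the actions offered when that recognizer fires.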
Any apps you develop will, of course, have to run on Amazon's Fire OS 3.5. However, Amazon says that most Android developers will find that their apps just work on Fire OS: you can submit your app's APK for an automated compatibility analysis that starts in as little as 90 seconds, and Amazon says it will have a complete report for you within six hours. In addition, developers who enhance an existing Kindle Fire app with the immersive new Fire phone features can qualify for a special 500,000 Amazon Coins offer through the Appstore Developer Select program and create campaigns in which customers earn those Coins when they purchase apps and games.
With a rich set of developer capabilities and the strength of both Amazon and AT&T behind it, Fire is poised to be a significant opportunity for developers, from both a technological and a monetization perspective. We get the feeling that developers will want to see how far they can take these new technologies, and we're excited to see where that takes us. We'd love to hear your perspective on these new technologies in the comments below.